The Wall Street Journal exposes Facebook’s lies about content moderation

Comet Ping Pong. Photo (cc) 2016 by DOCLVHUGO.

What could shock us about Facebook at this point? That Mark Zuckerberg and Sheryl Sandberg are getting ready to shut it down and donate all of their wealth because of their anguish over how toxic the platform has become?

No, we all know there is no bottom to Facebook. So Jeff Horwitz’s investigative report in The Wall Street Journal on Monday — revealing the extent to which celebrities and politicians are allowed to break rules the rest of us must follow — was more confirmatory than revelatory.

That’s not to say it lacks value. Seeing it all laid out in internal company documents is pretty stunning, even if the information isn’t especially surprising.


The story involves a program called XCheck, under which VIP users are given special treatment. Incredibly, there are 5.8 million people who fall into this category, so I guess you could say they’re not all that special. Horwitz explains: “Some users are ‘whitelisted’ — rendered immune from enforcement actions — while others are allowed to post rule-violating material pending Facebook employee reviews that often never come.”

And here’s the killer paragraph, quoting a 2019 internal review:

“We are not actually doing what we say we do publicly,” said the confidential review. It called the company’s actions “a breach of trust” and added: “Unlike the rest of our community, these people can violate our standards without any consequences.”

Among other things, the story reveals that Facebook has lied to the Oversight Board it set up to review its content-moderation decisions — news that should prompt the entire board to resign.

Perhaps the worst abuse documented by Horwitz involves the Brazilian soccer star Neymar:

After a woman accused Neymar of rape in 2019, he posted Facebook and Instagram videos defending himself — and showing viewers his WhatsApp correspondence with his accuser, which included her name and nude photos of her. He accused the woman of extorting him.

Facebook’s standard procedure for handling the posting of “nonconsensual intimate imagery” is simple: Delete it. But Neymar was protected by XCheck.

For more than a day, the system blocked Facebook’s moderators from removing the video. An internal review of the incident found that 56 million Facebook and Instagram users saw what Facebook described in a separate document as “revenge porn,” exposing the woman to what an employee referred to in the review as abuse from other users.

“This included the video being reposted more than 6,000 times, bullying and harassment about her character,” the review found.

As good a story as this is, there’s a weird instance of both-sides-ism near the top. Horwitz writes: “Whitelisted accounts shared inflammatory claims that Facebook’s fact checkers deemed false, including that vaccines are deadly, that Hillary Clinton had covered up ‘pedophile rings,’ and that then-President Donald Trump had called all refugees seeking asylum ‘animals,’ according to the documents.”

The pedophile claim, of course, is better known as Pizzagate, the ur-conspiracy theory promulgated by QAnon, which led to an infamous shooting incident at the Comet Ping Pong pizza restaurant in Washington in 2016. Trump, on the other hand, had this to say in 2018, according to USA Today: “We have people coming into the country or trying to come in, we’re stopping a lot of them, but we’re taking people out of the country. You wouldn’t believe how bad these people are. These aren’t people. These are animals.”

Apparently the claim about Trump was rated as false because he appeared to be referring specifically to gang members, not to “all” refugees. But that “all” is doing a lot of work.

The Journal series continues today with a look at how Instagram is having a damaging effect on the self-esteem of teenage girls — and how Facebook, which owns the service, knows about it and isn’t doing anything.

9 thoughts on “The Wall Street Journal exposes Facebook’s lies about content moderation”

  1. mfidelman

    Perhaps this gives the lie to the whole notion of content moderation.

    There was a time when we rejected “thought policing” – be it by prudes, Sen. McCarthy, the Soviet Politburo, or the Holy Office. Now, all too many call for “moderation” and “filtering” for all kinds of reasons – and then find their own materials blocked for (to them) spurious reasons. And we have plenty of instances of automated software misbehaving (e.g., child safety software keeping kids from access to health information about breast cancer).

    You’d think that we’d learned by now, and that cooler heads would prevail – but no, it’s the folks who’ve been burned before who are most loudly calling for censorship.

    Can’t we just let adults be adults, and use their own delete key & filters? (Not to say that fact-checking and peer rating/review aren’t helpful in doing so.)

      1. mfidelman

        Prosecute them for assault. Just like folks prosecute for slander and libel. No reason to turn the media into police. Or drop drones on their heads.

        Folks stormed the Capitol. It’s a GOOD thing that they posted selfies. Makes it easier to find them and prosecute. Do we really want the yahoos to go back to wearing hoods, and hiding in the shadows?

    1. Dan Kennedy

      Prosecute them for assault? Publishing nude photos of adults isn’t a crime. Even without consent, it’s no more than a civil offense; see Gawker. The question is whether you leave it up or take it down.

      Do you think private companies like Facebook are under some sort of philosophical obligation not to take down content that hurts its business model and drives away customers?

      1. mfidelman

        Posting material with the intent to harm certainly goes beyond a civil offense.

        As to “philosophical obligation” – no, they’re under no obligation. On the other hand, if one presents an environment as akin to a public gathering space, then I think they have a moral obligation to stay out of the conversation. Kind of like a bar – you have bouncers to pull people apart, and sometimes eject them, and a legal obligation to cut off drunks and/or take their keys – but a bar does NOT put a “moderator” at every table. For that matter, one doesn’t have moderators at church coffee hours. Moderation is for formal meetings.

        Personally, I kind of find it troubling when a journalist, and journalism professor, stands up for any kind of speech control. I’m with Larry Flynt on that one – Hustler was a horrible magazine, but the man took a bullet standing up for freedom to publish. (And that’s what the Internet really is, one big printing press & copy machines – we don’t police folks standing at a copy machine, either.)

  2. Dan Kennedy

    You really ought to work on your analogies. Larry Flynt published what he wanted to publish and didn’t publish what he didn’t want to publish. It’s called editing. Yes, I’m for editing. I’m for private entities exercising their First Amendment right to publish, not publish, delete, whatever.

    As for what goes beyond a civil offense, you need to bone up on Gawker’s legal tribulations.

    1. mfidelman

      Re. my analogy – I’m referring to Larry Flynt in his role as a writer/publisher, not to someone preventing him from publishing.

      Re. Facebook, et al. – they are maintaining a quasi-public forum, like a bar or cafe. The owners of a bar don’t get to exercise editorial control over what gets said at the various tables – unless folks start taking swings at each other (and even then, at some bars, it’s only a problem when they start taking swings at people at other tables).

      What goes beyond a civil complaint… that’s for the legislatures to determine. Cyberbullying is very much a crime in some jurisdictions – as it should be – right up there with verbal assault in person. As someone who hosts a number of email lists, I sure don’t want to be put in the middle as judge, jury and executioner – that just has a chilling effect on providing any kind of forum.

      1. Don Wilkinson

        A bar owner has every right to eject a patron who is making threats, using racial slurs, spouting foul language, yelling “fire” in a crowded tavern, etc. Hence the ubiquitous signs stating “we reserve the right to refuse service to anyone,” except for reasons of race, ethnicity, gender, and so on.

      2. mfidelman

        Every right, yes. But they ALSO generally don’t respond to customer demands about ejecting the folks at the next table because they don’t like the language, or the topic. My observation is that it’s somewhat ironic that all the folks who were calling for “moderation” and outright exclusion of people/topics/messages are now complaining that they’re getting bit on the ass by the procedures and bots that they insisted be put in place to protect them from the speech of other people.

        Personally, I think that folks like Facebook should stay out of the fray, not get drawn in as police. (Personally, I absolutely refuse to “moderate” email lists that I host. There’s too much pain, and potentially legal exposure, involved.)
