A Facebook moderator says she took down beheadings, child pornography, and animal abuse every day - but was 'treated like nothing'

Facebook Head of Global Policy Management Monika Bickert. Getty

  • A former Facebook moderator revealed the disturbing imagery she had to remove from the site every day.
  • In an anonymous interview with the BBC, she said she had to make quickfire takedown decisions about photos and videos containing beheadings, animal abuse, and child pornography.
  • The content reviewer said the work gave her nightmares and criticised Facebook for the lack of support it provided staff.
  • Facebook said graphic content is a "fraction" of what needs to be reviewed, and that it is committed to giving moderators the tools to do the job well.


A former Facebook moderator has revealed the horrors she was exposed to every day - and criticised the social network for not doing enough to support staff handling disturbing imagery.

The content reviewer worked in a Facebook takedown centre in Berlin and spoke to the BBC on condition of anonymity, identified only by the pseudonym Laura.

She told the BBC about the disturbing photos and videos she had just five seconds to decide whether to remove. Among the worst imagery were beheadings, animal abuse, and child pornography, Laura said, describing how she had seen a six-month-old baby being raped.

Laura suggested that the work had an impact on her mental health, describing a vivid nightmare she had during her time working for Facebook:

"I had nightmares a couple of times. I remember one for example: People jumping from a building, I don't know why. And I remember people, instead of helping the people jumping, they were just taking photos and videos... I woke up crying."

Laura was critical of the lack of support Facebook provided content reviewers, claiming that staff made regular complaints to management. "It's the most important job in Facebook and it's the worst and no-one cares about it," she said.

In a message directed at Facebook CEO Mark Zuckerberg, she added: "How are you allowing this to happen? That young people like us are having to see these things, but we were treated like nothing."

Business Insider has contacted Facebook for comment. Head of Global Policy Management Monika Bickert recognised that Facebook moderating is difficult work, but said support systems are in place for employees.

"This work is hard, but I will say that the graphic content is a small fraction of what reviewers might see. Increasingly we've been able to use technology to review and remove some of the worst content," she told the BBC.

"We're committed to giving them what they need to do this job well. If they're ever uncomfortable at work, there are counselling resources for them and they can be shifted to work on a different kind of content."

Facebook published the internal guidelines its moderators use to police the social network this week. The "Community Standards" run to 8,500 words and go into great detail about exactly what is and isn't allowed - from sexual and violent content to hate speech.

Facebook is increasingly relying on artificial intelligence to identify offending items on its site. But Zuckerberg said on Wednesday that it is "easier to build an AI system to detect a nipple than what is hate speech."
