Content moderators for tech giants like Facebook and YouTube reveal what it's like to sift through some of the most disturbing material on the internet
- People who are hired to sift through online postings that have been flagged on platforms like Facebook and YouTube say the job takes a tremendous psychological toll.
- The moderators are frequently tasked with watching disturbing video and images of child sexual abuse, violence, animal cruelty, and more.
- One former moderator described the role as "watching the content of deranged psychos ... who don't have a conscience."
A legion of temporary employees who are tasked with moderating content on platforms like Facebook and YouTube say they were unprepared for the emotional and psychological toll the job would take.
The content moderators are hired to sift through online posts, including pictures and videos that have been flagged as inappropriate. Several of those employees shared their experiences in a Wall Street Journal report published Wednesday night.

Moderators said they watched images of war victims who had been "gutted" and of "child soldiers engaged in killings." A former moderator who worked at Facebook recalled watching a video of a cat being thrown into a microwave, The Journal reported.
Shaka Tafari, who moderated content on the Whisper messaging app, said he was "alarmed" by the number of messages that contained references to rape, or included images of bestiality and animal abuse.
"I was watching the content of deranged psychos in the woods somewhere who don't have a conscience for the texture or feel of human connection," Tafari told The Journal.
Tech giants oversee thousands of content moderators - jobs that are typically staffed through temporary employment agencies and have a high turnover rate. Former employees interviewed by The Journal attributed much of that turnover to the emotional stress of the role.
Some of those people claimed they had few tools to deal with the aftereffects of a job that required them to consume some of the most depraved material on the internet. Content moderators at Facebook and Microsoft are offered various avenues for psychological counseling, The Journal reported, but some of the employees said it was not enough. Moderators typically left the job within a year.
Content moderation became a hot-button issue on Facebook in particular this year after the fallout from the 2016 US election, when it was revealed that Russia leveraged the platform to execute influence campaigns that boosted then-candidate Donald Trump and disparaged his Democratic opponent, Hillary Clinton.

Though Facebook CEO Mark Zuckerberg initially balked at the notion, the social-media platform later admitted that Russia-based operatives published some 80,000 posts on the platform over a two-year period.
The matter gained heightened urgency this week when The Washington Post reported that Russia's election-influence efforts would likely continue, as officials recently warned that digital platforms are still vulnerable to such misuse.