A reporter went undercover as a Facebook moderator and found the firm is failing to delete shocking child abuse and racism



Channel 4/Firecrest Films

The undercover reporter at work as a Facebook moderator.

  • A reporter for British broadcaster Channel 4 went undercover as a Facebook moderator at CPL Resources, a Dublin-based content moderation contractor.
  • They found that Facebook is failing to delete shocking examples of graphic violence, child abuse, and racism, including a little boy being beaten by a grown man.
  • The documentary, "Inside Facebook: Secrets of the Social Network," exposes wild inconsistencies in how moderators were trained and in how Facebook's standards were applied.
  • Facebook said it has made "mistakes," but denied accusations that it seeks to profit from extreme content.


A journalist from British broadcaster Channel 4 went undercover as a Facebook moderator and found a stream of toxic content that the company was failing to delete.

The reporter posed as an employee of CPL Resources, a Dublin-based content moderation contractor that has worked with Facebook since 2010, for the documentary "Inside Facebook: Secrets of the Social Network."


The journalist undertook CPL Resources' training, where new staff are brought up to speed with Facebook's community standards, and set to work reviewing content, including graphic violence, child abuse, and hate speech.

Moderators were given three options when reviewing a queue of material: ignore, delete, or mark as disturbing. The last option leaves the content on Facebook but restricts who can view it.


The reporter found that shocking examples of child abuse, racism, and violence were allowed to remain on Facebook. The findings also exposed wild inconsistencies in how moderators were trained and in how Facebook's standards were applied to specific pieces of content.

A video of a little boy being beaten by a grown man

During the training session, the reporter was shown an example of content that should be marked as disturbing: a video of a grown man beating a small boy.


Channel 4/Firecrest Films

The video of a grown man beating a small boy.

The video was reported to Facebook in December 2012 by the online anti-child-abuse campaigner Nicci Astin, but she was told at the time that the video did not violate Facebook's terms.

In its first two days on Facebook, the video was shared 44,000 times, and it was still up years later when Channel 4 investigated.


Richard Allan, Facebook's vice president of public policy, told Channel 4's Krishnan Guru-Murthy that the video "should have been taken down."

A week after Channel 4 brought the video to Facebook's attention, it was still online. As of Monday, Business Insider was still able to find a version of the video on the platform.

A racist meme of a girl being drowned

Facebook's community standards on hate speech say: "we do not allow hate speech on Facebook because it creates an environment of intimidation and exclusion and in some cases may promote real-world violence."

CPL Resources trainees were shown a meme of a little girl having her head held underwater with the caption "when your daughter's first crush is a little negro boy."

The reporter was told that the image was an "ignore," because "it implies a lot, but to reach the actual violation, you have to jump through a lot of hoops to get there."


Facebook told Channel 4 that the image did, in fact, violate its hate speech policy, and that it was "reviewing what went wrong to prevent it happening again."


Getty

Richard Allan, Facebook's vice president of public policy.

The undercover reporter also found that certain instances of hate speech were permitted. A comment aimed at Muslim immigrants that said "f**k off back to your own countries" was allowed to remain on the site. Had the comment been aimed solely at Muslims, rather than Muslim immigrants, it would have been deleted.

"People are debating very sensitive issues on Facebook, including issues like immigration. And that debate can be entirely legitimate," Allan said in response to the comment. When pressed about whether it constituted hate speech, he said it's "right on that line."

In a statement to Business Insider, Allan said Facebook had made mistakes. The company has reviewed training materials at contractors like CPL and provided refresher training courses for moderators.


"It's clear that some of what is shown in the program does not reflect Facebook's policies or values, and falls short of the high standards we expect," Allan said.

"We take these mistakes in some of our training processes and enforcement incredibly seriously and are grateful to the journalists who brought them to our attention. Where we know we have made mistakes, we have taken action immediately. We are providing additional training and are working to understand exactly what happened so we can rectify it."

Does Facebook profit from extreme content?

Channel 4 spoke to Roger McNamee, an early Facebook investor who has become a critic of the company over issues including the Cambridge Analytica scandal. He said Facebook stands to benefit from extreme content.

"It's the really extreme, really dangerous form of content that attracts the most highly engaged people on the platform. Facebook understood that it was desirable to have people spend more time on site if you're going to have an advertising-based business," he said.


Rob Kim/Getty

Early Facebook investor Roger McNamee.


This was backed up by a CPL Resources staff member, who told the undercover reporter that violent content is left on Facebook because "if you start censoring too much then people lose interest in the platform. It's all about making money at the end of the day."

Facebook's Allan strongly disagreed. He said: "Shocking content does not make us more money, that's just a misunderstanding of how the system works...

"There is a minority who are prepared to abuse our systems and other internet platforms to share the most offensive kind of material. But I just don't agree that that is the experience that most people want and that's not the experience we're trying to deliver."

The documentary also tackles problems with moderating images of self-harm, underage users, and far-right pages with large followings. "Inside Facebook: Secrets of the Social Network" will air on Channel 4 at 9 p.m. on Tuesday, July 17. It is produced by Firecrest Films.
