Why Facebook’s new ‘one strike’ policy isn’t enough to address its moderation problems

  • Facebook’s new ‘one strike’ policy says that users will be banned from streaming for a ‘set time period’ if they violate the platform’s community standards.
  • A Facebook spokesperson stated that the Christchurch shooter would not have been able to live stream if these regulations had been in place.
  • But the new regulations don’t address the company’s slow response during the crisis or its explanation of how the situation arose in the first place.
If you’re someone who uses Facebook Live often, you should probably go back and reread the platform’s community standards, just to make sure you understand them.

From now on, Facebook will allow only one violation per user before banning them from live streaming. The duration of the ban apparently varies depending on the violation.

Following the horrific terrorist attacks in New Zealand, we’ve been reviewing what more we can do to limit our services from being used to cause harm or spread hate.

Guy Rosen, Vice President of Integrity at Facebook

During the ‘Christchurch Call to Action’ summit in Paris yesterday, a Facebook spokesperson remarked that the Christchurch shooter would not have been able to use his Live account if the regulations had been in place, though without specifying exactly how.

The sudden change in Facebook’s moderation

Facebook’s moderation methods were seen to fail in the face of the Christchurch shooting in New Zealand, which was live-streamed on the social-networking platform and viewed at least 4,000 times.


Even though Facebook was eventually able to take down the original post, multiple copies of the video were already circulating and had spread to other video-sharing websites like YouTube.

While these are the most stringent regulations Facebook has ever introduced, the company has been under pressure from the media. The policy is also reactive rather than pre-emptive: it does nothing to stop toxic material from being live-streamed in the first place.

Facebook's many excuses

When the Christchurch shooting happened, the live stream itself was only the initial incident; Facebook’s lack of response escalated the situation into a global catastrophe. Even when the company finally addressed it, Facebook had only a ‘lack of AI training data’ to blame.

The tech giant even said at the time that it wouldn’t implement a time delay on live broadcasts because it would ‘compromise’ the Facebook Live experience.

And, unfortunately, the new regulations don’t account for that.

Facebook-owned Instagram has also introduced harsher restrictions to reduce abuse on its platform, though it hasn’t set a strict violation limit.

For now, the restrictions are limited to Facebook Live, but Guy Rosen, Vice President of Integrity at Facebook, wrote, “We plan on extending these restrictions to other areas over the coming weeks, beginning with preventing those same people from creating ads on Facebook.”


See also:
Facebook is raising the minimum wage of its contractors and content moderators after facing scrutiny over low pay and 'inhumane' working conditions

A reporter went undercover as a Facebook moderator and found the firm is failing to delete shocking child abuse and racism

Facebook endured a staggering number of scandals and controversies in 2018 - here they all are