'It's definitely going to be more violent': A former Facebook content moderator says election results could spark violence, no matter who wins
Facebook CEO Mark Zuckerberg leaving The Merrion Hotel in Dublin after a meeting with politicians to discuss regulation of social media and harmful content in April 2019. Niall Carson/PA Images via Getty Images
  • Viana Ferguson, a former Facebook content moderator, said during a panel hosted by the nonprofit the Real Facebook Oversight Board that users have become more vocal about the "violence they are willing to execute" toward people.
  • Ferguson said there would "definitely" be calls for violence after the election, regardless of the winner.
  • Facebook CEO Mark Zuckerberg will appear before Congress on Wednesday to address a law that shields social-media companies from being held liable for the content of users' posts.
  • "We've applied lessons from previous elections, hired experts, and built new teams with experience across different areas to prepare for various scenarios," a Facebook spokesperson told Business Insider.
  • Are you an insider with information to share? Email aakhtar@businessinsider using a nonwork device.

A person who was responsible for looking at some of the most hateful content on Facebook said she was sure there would be calls for violence after the US election — regardless of the winner.

Viana Ferguson, a former content moderator, joined a panel hosted by the nonprofit the Real Facebook Oversight Board to discuss rising racism and hate speech on Facebook. Ferguson, who worked as a Facebook content moderator from 2016 to 2019, said through tears that users have become more vocal about the "violence they are willing to execute" toward people.

Content moderators are typically outside contractors who review flagged content on the site to determine whether it should be removed. They review posts that can be violent, pornographic, racist, and otherwise hateful.
"There's going to be a wave of hate speech. It's definitely going to be more violent," Ferguson said during the panel on Monday. "It's going to happen; it doesn't matter who wins. Facebook needs to be prepared for that."

Read more: How Mark Zuckerberg's competitiveness and attempts to keep Facebook politically neutral turned it into a haven for misinformation and conspiracy theories that can swing elections

Facebook has announced some actions it has taken to curb calls for violence on Election Day, including removing "thousands" of groups that could incite civil unrest. Nick Clegg, Facebook's head of global affairs, declined to share specific policies in September but said the company planned to "restrict the circulation of content" that could prompt civil unrest.

"We've applied lessons from previous elections, hired experts, and built new teams with experience across different areas to prepare for various scenarios," a Facebook spokesperson told Business Insider.

Facebook CEO Mark Zuckerberg said in an interview with Axios that false information about the election could spread on Facebook and acknowledged there could be "civil unrest" after Election Day. "I just think we need to be doing everything that we can to reduce the chances of violence or civil unrest in the wake of this election," Zuckerberg said.

Read more: Facebook salaries revealed: How much the social network pays for software engineers, product managers, copywriters, and more

Another former content moderator joined Ferguson during the panel, along with a current worker who spoke anonymously, to discuss the difficulty of removing racist and hateful posts because of loopholes in Facebook's guidelines.

Facebook has come under scrutiny for its treatment of content moderators. Moderators have returned to physical workspaces during the coronavirus pandemic, even as Facebook employees work remotely. The company agreed to pay $52 million to current and former content moderators who developed mental-health conditions on the job, The Verge reported.

Zuckerberg will appear before Congress on Wednesday to address Section 230, a 1996 law that shields social-media companies from being held liable for the content of users' posts. Lawmakers say the measure disincentivizes firms from moderating hate speech.

Are you an insider with information to share? Email aakhtar@businessinsider using a nonwork device.
