Facebook reportedly killed its political moderation task force in the weeks leading up to the January 6 insurrection

Facebook cofounder and CEO Mark Zuckerberg spoke at Georgetown University in October 2019. Andrew Caballero-Reynolds/AFP via Getty Images
  • Facebook reportedly created a task force to police increasingly "toxic" political groups before the 2020 election.
  • That task force was dissolved weeks before the attempted insurrection on January 6, a new report said.

In the approximately two months between the November 3, 2020, US presidential election and the January 6, 2021, attempted insurrection at the US Capitol, Facebook "took its eye off the ball" on moderation of political groups, a former Facebook employee told the Washington Post.

"There was a lot of violating content that did appear on the platform that wouldn't otherwise have," the former employee, who worked on the Integrity team, said.

That's because, directly after the election, Facebook dissolved a critical task force responsible for moderating the service's insular (and often toxic) politics Groups, according to the Post.


As a result of that move, Facebook politics Groups were "inundated with posts attacking the legitimacy of Biden's election" in the weeks leading up to the attempted insurrection, the Post reported.

According to Facebook data compiled and analyzed by ProPublica, moderation of these politics Groups saw a stark decline during this period — primarily in December 2020.


A mob of insurrectionists stormed the US Capitol building on January 6, 2021. Brent Stirton/Getty

Representatives for Facebook's newly rebranded parent company, Meta, disputed the findings of the Post's reporting.

"The idea that we deprioritized our Civic Integrity work in any way is simply not true," Meta spokesperson Drew Pusateri told Insider. "We integrated it into a larger Central Integrity team to allow us to apply the work that this team pioneered for elections to other challenges like health-related issues for example. Their work continues to this day."

Facebook has faced repeated criticism over its moderation practices in recent years.

Foreign state actors used the world's largest social network to sow discord during the 2016 presidential election, and Facebook vowed to monitor the 2020 election more carefully. Since March 2020, the company has also drawn criticism for its handling of coronavirus disinformation.

More recently, a former Facebook product manager leaked a trove of internal documents that detailed major problems at the company and prompted several congressional hearings.


The documents showed that Facebook's own research found Instagram harmful to teen mental health, and that the company prioritized some parts of the world over others for content moderation, sometimes with major consequences. In Myanmar, for example, where Facebook was blamed in part for failing to moderate hate speech, posts contributed to real-world violence against the Muslim Rohingya population in 2018.

Got a tip? Contact Insider senior correspondent Ben Gilbert via email (bgilbert@insider.com), or Twitter DM (@realbengilbert). We can keep sources anonymous. Use a non-work device to reach out. PR pitches by email only, please.
