Facebook reportedly killed its political moderation task force in the weeks leading up to the January 6 insurrection
- That task force was dissolved weeks before the attempted insurrection on January 6, a new report said.
In the roughly two months between the November 3, 2020, US presidential election and the January 6, 2021, attempted insurrection at the US Capitol, Facebook "took its eye off the ball" on moderating its politics Groups, according to a new report from The Washington Post.
"There was a lot of violating content that did appear on the platform that wouldn't otherwise have," a former employee who worked on the company's Integrity team said.
That's because, directly after the election, Facebook dissolved a critical task force that moderated the service's insular (and often toxic) politics Groups.
As a result of that move, Facebook's politics Groups were "inundated with posts attacking the legitimacy of Biden's election" in the weeks leading up to the attempted insurrection, the Post reported.
According to Facebook data compiled and analyzed by ProPublica, moderation of these politics Groups saw a stark decline during this period — primarily in December 2020.
Representatives for Facebook's newly rebranded parent company, Meta, disputed that characterization.
"The idea that we deprioritized our Civic Integrity work in any way is simply not true," Meta spokesperson Drew Pusateri told Insider. "We integrated it into a larger Central Integrity team to allow us to apply the work that this team pioneered for elections to other challenges like health-related issues for example. Their work continues to this day."
Facebook has been repeatedly criticized over the last several years for moderation failures.
The world's largest social network was used by foreign state actors to sow discord during the 2016 presidential election, which Facebook vowed to monitor more carefully ahead of the 2020 election. And since March 2020, the company has faced repeated criticism over its moderation of coronavirus disinformation.
More recently, a trove of internal documents leaked by a former Facebook product manager detailed major issues at the company and led to several congressional hearings.
The documents showed that Facebook's own research found Instagram to be harmful to teen mental health, and that the company prioritized some parts of the world over others for content moderation, sometimes with major consequences. In Myanmar, for example, where Facebook was blamed in part for failing to moderate hate speech, posts on the platform contributed to real-world violence against the Muslim Rohingya population in 2018.