Facebook knew it had a huge problem with hateful groups 5 months ahead of the Capitol riot, internal docs show

Facebook CEO Mark Zuckerberg. REUTERS/Erin Scott/File Photo
  • Facebook staff told execs in August 2020 that 70% of top political groups were full of hate speech, misinformation, and violent threats.
  • Researchers had to fight to convince management to take action against these groups, per a Wall Street Journal report.
  • Facebook imposed some restrictions on groups, but loosened them again after the election — and before the Capitol riot.

Internal Facebook documents reviewed by the Wall Street Journal show the company knew in August 2020, five months before the US Capitol riot, that it had a big problem with groups spouting hate speech.

The documents, prepared by internal data scientists and presented to Facebook executives, said that the majority of civic groups on the platform - i.e. groups dedicated to political issues - were home to misinformation, hate speech, and threats of violence.

A presentation given by the researchers said that "70% of the top 100 most active US Civic Groups are considered non-recommendable for issues such as hate, misinfo, bullying and harassment."


The researchers found that one of the top groups in terms of engagement "aggregates the most inflammatory news stories of the day and feeds them to a vile crowd that immediately and repeatedly calls for violence."

"We need to do something to stop these conversations from happening and growing as quickly as they do," they added, suggesting a series of measures to stem the rapid growth of groups.


They also highlighted hints of foreign influence in some of the groups - one popular group they highlighted called "Trump Train 2020, Red Wave" had "possible Macedonian ties." This group grew to over 2 million members before being taken down by Facebook in September.

According to the internal documents reviewed by the Journal, as well as company sources who spoke to the paper, executives did not respond favorably to the researchers' findings. Employees began sending daily analyses to Guy Rosen, Facebook's vice president of integrity, and other top executives in an effort to force the company to take action.

Facebook implemented some changes ahead of the November 2020 presidential election, including stopping its platform from recommending civic groups to users. It loosened these restrictions in the gap between the election result and the January 6 storming of the US Capitol building by pro-Trump rioters.

The day before the riot, BuzzFeed journalist Ryan Mac found that would-be rioters were organizing in private groups, and that Facebook's algorithm was automatically suggesting similar groups.

Last week, Facebook decided to make the ban on civic-group recommendations permanent, following a letter from Democratic lawmakers calling for it to do so. The company also said that, going forward, it would require group administrators to review content posted by other group members.


"That helps us because we can then hold them accountable," Rosen told the Journal.

Facebook attracted heated political scrutiny following the Capitol riot, with some lawmakers accusing the company of fostering extremism. Facebook COO Sheryl Sandberg told reporters the rioters had "largely organized" on other platforms.