
Facebook knew it had a huge problem with hateful groups 5 months ahead of the Capitol riot, internal docs show

Isobel Asher Hamilton   

  • Facebook staff told execs in August 2020 that 70% of top political groups were full of hate speech, misinformation, and violent threats.
  • Researchers had to fight to convince management to take action against these groups, per a Wall Street Journal report.
  • Facebook imposed some restrictions on groups, but loosened them again after the election — and before the Capitol riot.

Internal Facebook documents reviewed by The Wall Street Journal show the company knew in August 2020, five months before the Capitol riot, that it had a big problem with groups spouting hate speech.

The documents were given to Facebook executives by internal data scientists. They said that the majority of top US civic groups - i.e. groups dedicated to political issues - were home to misinformation, hate speech, and threats of violence.

A presentation given by the researchers said that "70% of the top 100 most active US Civic Groups are considered non-recommendable for issues such as hate, misinfo, bullying and harassment."

The researchers found that one of the top groups in terms of engagement "aggregates the most inflammatory news stories of the day and feeds them to a vile crowd that immediately and repeatedly calls for violence."

"We need to do something to stop these conversations from happening and growing as quickly as they do," they added, suggesting a series of measures to stem the rapid growth of groups.

They also highlighted hints of foreign influence in some of the groups - one popular group they highlighted called "Trump Train 2020, Red Wave" had "possible Macedonian ties." This group grew to over 2 million members before being taken down by Facebook in September.

According to the internal documents reviewed by the Journal, as well as company sources who spoke to the paper, the researchers' findings were not well received. Employees began sending daily analyses to Guy Rosen, Facebook's vice president of integrity, and other top executives to push the company to take action.

Facebook implemented some changes ahead of the November 2020 presidential election, including stopping its platform from recommending civic groups to users. It loosened these restrictions in the gap between the election result and the January 6 storming of the US Capitol building by pro-Trump rioters.

The day before the riot, BuzzFeed journalist Ryan Mac found that would-be rioters were organizing in Facebook groups, and that Facebook's algorithm was automatically suggesting similar groups.

Facebook later said it would permanently stop recommending civic and political groups to users, following calls for it to do so. Going forward, it would also require group administrators to review content posted by other group members, it said.

"That helps us because we can then hold them accountable," Rosen told the Journal.

Facebook attracted heated political scrutiny following the Capitol riot, with some lawmakers blaming the platform for helping the rioters organize. Facebook COO Sheryl Sandberg said the rioters had "largely organized" on other platforms.
