A Facebook whistleblower said the company knows its algorithms are pushing QAnon and white nationalist content to Trump fans, but denies it

Oct 4, 2021, 20:37 IST
Business Insider
Whistleblower Frances Haugen in a 60 Minutes interview on October 3 said that Facebook had not done enough to halt the spread of misinformation. CBS News
  • A Facebook whistleblower said the site pushes far-right misinformation to Trump supporters.
  • Frances Haugen told "60 Minutes" that Facebook has data confirming the scale of the problem.
  • Facebook has rejected claims that it has done too little to limit the spread of disinformation.

A Facebook whistleblower said that the social media giant's algorithms are pushing QAnon and white nationalist content to Trump supporters.

In an interview for "60 Minutes" that aired Sunday, former Facebook product manager Frances Haugen said that the company's claims to be tackling the spread of disinformation on its platform were false.

She said that the site promotes divisive and extremist content to users who have taken no action to seek it out.

Haugen said that Facebook's claims that it only shows such content to people who seek it out are false.

She characterised Facebook's response to criticism on the issue as: "It takes two to tango."


"'You picked your friends, you picked the topics you engage with, don't just blame us it's on you," she said, paraphrasing the company's position.

That section of the interview did not appear in the final cut of the show, but was included in additional footage released as "60 Minutes Overtime" clips.

Haugen said that an internal experiment conducted by the company showed how far-right content reached users, even those who had expressed no prior interest in extremism.

"So they've taken brand new accounts, so no friends, and all they've done is follow Donald Trump, Melania Trump, Fox News, and like a local news source.

"And then all they did is click on the first ten things that Facebook showed them - where Facebook suggested a group, they joined that group," she said.


"So they're not doing any conscious action here, just one time go in - and within a week you see QAnon, and in two weeks you see things about white genocide," said Haugen.

QAnon is a sprawling conspiracy theory movement, whose supporters groundlessly believe that Trump is on a mission to purge a cabal of Satanic child abusers from the government.

The white genocide conspiracy theory has long been espoused by white-supremacist groups, who groundlessly believe that elites are deliberately seeking to destroy white people.

Haugen said that such content spreads because it "gets the highest engagement."

She said that Facebook consistently chooses to keep extremist content around because it makes money, rather than restricting it and risking losing some of its users.


In response to a request for comment, Facebook referred Insider to the statement it gave CBS, in which it denied doing too little.

Lena Pietsch, Facebook's director of policy communications, said: "If any research had identified an exact solution to these complex challenges, the tech industry, governments, and society would have solved them a long time ago.

"We have a strong track record of using our research - as well as external research and close collaboration with experts and organizations - to inform changes to our apps."

Facebook has long faced criticism for failing to restrict the flow of disinformation and conspiracy theories on its platform.

A July investigation by Insider found that so-called militia groups in the US were promoting anti-vaccination narratives in a bid to lure in new recruits.
