Facebook whistleblower Frances Haugen says Meta can't recover until Mark Zuckerberg steps down: report

Facebook whistleblower Frances Haugen testifying to a Senate committee. Matt McClain/Getty Images
  • Frances Haugen said Facebook can only recover if Mark Zuckerberg steps down.
  • In an interview with Bloomberg, she said Facebook promotes hate speech in fragile countries.

Facebook whistleblower Frances Haugen said the social media giant won't be able to recover until Mark Zuckerberg steps down as chief executive.

In an interview with Bloomberg, she spoke about what compelled her to go public after she left the company in May 2021.

Haugen took tens of thousands of pages of internal documents showing that the social media giant knew its products were damaging teenagers' mental health, fomenting ethnic violence in countries such as Ethiopia, and failing to curb misinformation ahead of the January 6, 2021, riot at the US Capitol.


She told Bloomberg that Zuckerberg "genuinely believes that Facebook is just a mirror" of reality, and that "you are unhappy because you can see it now."

"Mark has been surrounded by people since he was 19 years old who told him he was doing a great job," adding "we can demonize Zuckerberg, but it's not going to make him heal faster."


Haugen added that, unlike the leaders of most other public companies, Zuckerberg holds 56% of the voting rights. "No one but Mark Zuckerberg can control Facebook right now."

When asked by Bloomberg's Emma Barnett if Zuckerberg should go, she said: "I don't think the company can recover as long as he is the leader of it."

Haugen spoke about a change Facebook made to its algorithm in 2018 to trigger reactions from users, rather than aiming to increase dwell time alone. "Unless we can provoke a reaction from you, it's not good," she added.

Extreme content gets the most distribution in "linguistically diverse places", Haugen told Bloomberg, which meant they were using the most "dangerous version of Facebook" and leaving room for hate speech.

Last month, Insider reported that a former Facebook moderator in Kenya had accused Meta of human trafficking.


Daniel Motaung said he tried to start an employee union in response to the traumatic content moderators had to watch, and that he was fired for doing so.

"The fact that he [Zuckerberg] doubled down on the metaverse when I brought up issues around genocide," Haugen added in the interview, "instead of actually making these systems safe. I think that's a dereliction of duty."

Facebook was contacted for comment.
