Zuckerberg reportedly brushed aside internal research that showed Facebook exposed users to more and more extreme views, saying he never wanted the topic brought to him again

Facebook co-founder, chairman, and CEO Mark Zuckerberg arrives to testify before a combined Senate Judiciary and Commerce committee hearing. Chip Somodevilla/Getty Images
  • Facebook conducted internal research in 2016 and 2018 to examine whether the platform's algorithm was encouraging polarization and the proliferation of extremist content, according to The Wall Street Journal.
  • Though Facebook's experts found that the platform contributed to divisive rhetoric, senior executives dismissed proposed changes, some of which would have adversely affected user engagement, The Journal said.
  • According to The Journal, CEO Mark Zuckerberg indicated he was "losing interest" in these changes and asked employees not to bring up those topics to him again.

Facebook has long known its platform encourages extremist and polarizing content, and the company's unwillingness to make changes to address these problems goes all the way up to CEO Mark Zuckerberg, The Wall Street Journal reported.

The Journal reported that research Facebook conducted internally in 2016 and 2018 showed the platform's algorithm contributed to the proliferation of polarizing content. When Zuckerberg was presented with proposed changes to the platform to stymie the spread, the Facebook CEO asked employees not to bring him any more proposed algorithm edits that were "in the name of social good," the newspaper said.

The report on Facebook's inaction comes at a time when the platform is in the spotlight once again over its inability to stop the spread of harmful misinformation — this time, about the coronavirus pandemic. Facebook and Zuckerberg have long insisted that the platform is not to blame for users' polarizing opinions and content, but The Journal reported that experts had been informing the company's executives of the platform's impact since 2016.


Internally, Facebook formed a task force in 2017 called "Common Ground," which consisted of engineers, researchers, and employee "integrity teams" to look into how divisive content was trending on the platform, The Journal reported. According to a 2018 presentation obtained by The Journal, experts found that Facebook's algorithm — which prioritizes user engagement and time spent on the platform — encouraged polarizing content by exploiting "the human brain's attraction to divisiveness."

Despite the findings, higher-ups dismissed or watered down proposed changes to make sure it didn't appear that Facebook was trying to shape users' opinions or take a moral stance. Facebook executives were even harsher in shutting down efforts that could make it appear as if the platform had a political bias.


Facebook managers told employees in 2018 that the platform was shifting priorities "away from societal good" and toward "individual value," according to The Journal. Since then, Facebook and Zuckerberg have taken a more "hands-off" approach to content, acting only when users "specifically violate the company's rules."

In a statement to The Wall Street Journal that was also sent to Business Insider, a Facebook spokesperson said: "We've learned a lot since 2016 and are not the same company today. We've built a robust integrity team, strengthened our policies and practices to limit harmful content, and used research to understand our platform's impact on society so we continue to improve."
