A 'lobbyist who has to make the president happy' is part of Facebook's content moderation, employees complain: report

Facebook CEO Mark Zuckerberg with Facebook's Vice President of Global Public Policy Joel Kaplan. Samuel Corum/Getty Images
  • Joel Kaplan, head of the company's policy shop and a former GOP operative, oversees content moderation.
  • Employees complain that the arrangement "protects powerful constituencies," especially on the right.

Facebook employees have been sounding the alarm over the role that the company's policy shop - the team in charge of engaging with and lobbying lawmakers in Washington, DC, and elsewhere - plays in sensitive content moderation decisions, according to a new report by POLITICO.

Joel Kaplan, a former Bush administration official and Republican operative, oversees the company's lobbying and government relations team. Employees complained that his involvement in content moderation has led to the protection of "powerful constituencies," especially on the political right.

"Facebook routinely makes exceptions for powerful actors when enforcing content policy," said one data scientist at the company in a December 2020 presentation provided to POLITICO. "The standard protocol for enforcement and policy involves consulting Public Policy on any significant changes, and their input regularly protects powerful constituencies."

"Public policy typically are interested in the impact on politicians and political media, and they commonly veto launches which have significant negative impacts on politically sensitive actors," the presentation also read.

In contrast, Twitter maintains a firewall between its "trust and safety" content moderation team and its policy team, while Google's content and public policy teams answer to different higher-ups.

"When you have the head of content policy reporting to a lobbyist who has to make the president happy, that's an unhealthy dynamic," a former employee told POLITICO. "It was often that making the president happy was the top priority."

These revelations emerged as part of the "Facebook Papers," a release of internal documents provided to 17 American news organizations by whistleblower Frances Haugen, a former product manager at the social media giant. The documents include research, emails, message board threads, and presentations.

"People bring up over and over again this idea that having both those functions tied to the same group is dangerous because they have different interests," Haugen, the whistleblower, said in a virtual briefing with POLITICO and other news outlets on Oct. 15.

"There should be a firewall between the two teams," Evelyn Douek, a Harvard scholar who researches private content moderation, told POLITICO. "As long as they are representing that [political] considerations don't play into how they do content moderation, they should make that real and have an internal structure that mirrors their external representations. That is something that other platforms have done."

Kaplan's team at Facebook also reportedly intervened to shield conservative activist Charlie Kirk, the pro-Trump website Breitbart, and YouTube stars Diamond and Silk from consequences for violating Facebook's own misinformation policies, according to a 2020 document.

Facebook issued statements to both POLITICO and Insider defending the arrangement and Kaplan's role at the company.

"In these instances Public Policy is just one of many groups consulted," spokesperson Corey Chambliss told POLITICO. "And, while the perspective of Global Public Policy is key to understanding local context, no single team's opinion has more influence than the other."

"Recycling the same warmed over conspiracy theories about the influence of one person at Facebook doesn't make them true," spokesman Joe Osborne told Insider, referring to Kaplan. "The reality is big decisions at Facebook are made with input from people across different teams who have different perspectives and expertise in different areas. To suggest otherwise is inaccurate."

Osborne also reiterated that the public policy team is just one of several groups consulted as part of the content moderation process.

However, Jesse Lehrich, co-founder of the advocacy group Accountable Tech, told POLITICO that "it is a fatal flaw of Facebook as a company that their team in charge of lobbying governments clearly is empowered to intervene on product and content decisions in ways that make it impossible to do good work."
