Facebook wants to know how it's shaping the 2020 elections — researchers say it's looking too late and in the wrong places

Facebook's executives have repeatedly pledged to do better about cleaning up the platform, but the company has a long way to go. Drew Angerer/Getty Images; The Asahi Shimbun/Getty Images
  • Facebook said it will pay thousands of users to participate in an independent study of how its social media products influence the 2020 US elections.
  • The company wants to know "whether and how" social media changes people's votes, but said it doesn't expect results until at least mid-2021, long after Americans will have cast their ballots.
  • Researchers are asking why Facebook waited until two months before the election to start the study, and whether its focus might end up understating the impact of political misinformation.
  • They applauded Facebook for promising to be more transparent than it has in the past but also worried it could use the study to absolve itself of responsibility and deflect future criticism.

Facebook was first warned in late 2015 that Cambridge Analytica was misusing data illicitly harvested from millions of Americans in an attempt to sway the 2016 US elections.

It didn't pull the plug on the firm's access to user data until March 2018 after reporting from The Guardian turned the breach into a global scandal.

More than two years later, and barely two months before the deadline for voters to cast their ballots in the 2020 elections, Facebook has decided it wants to know more about how it impacts democracy, announcing last week that it would partner with 17 researchers to study the impact of Facebook and Instagram on voters' attitudes and actions.


But researchers outside of the project are conflicted. While they praised Facebook for promising to ensure more transparency and independence than it has before, they also questioned why the company waited so long and just how much this study will really bring to light.

"Isn't this a little bit too late?" Fadi Quran, a campaign director with nonprofit research group Avaaz, told Business Insider.


"Facebook has known now for a long time that there's election interference, that malicious actors are using the platform to influence voters," he said. "Why is this only happening now at such a late stage?"

Facebook said it doesn't "expect to publish any findings until mid-2021 at the earliest." The company did not reply to a request for comment on this story.

Since the company is leaving it to the research team to decide which questions to ask and to draw its own conclusions (a good thing), we don't yet know much about what the researchers hope to learn. In its initial announcement, Facebook said it's curious about "whether social media makes us more polarized as a society, or if it largely reflects the divisions that already exist; if it helps people to become better informed about politics, or less; or if it affects people's attitudes towards government and democracy, including whether and how they vote."

Facebook executives have reportedly known the answer to that first question (that the company's algorithms do help polarize and radicalize people) and have knowingly shut down efforts to fix the issue, or even to research it further.

But even setting that aside, researchers say they've already identified some potential shortcomings in the study.


"A lot of the focus of this work is very much about how honest players are using these systems," Laura Edelson, a researcher who studies political ads and misinformation at New York University, told Business Insider.

"Where I'm concerned is that they're almost exclusively not looking at the ways that things are going wrong, and that's where I wish this was going further," she added.

Quran echoed that assessment, saying: "One big thing that they're going to miss by not looking more deeply at these malicious actors, and just by the design, is the scale of content that's been created by these actors and that's influencing public opinion."

A long list of research and media reports has documented Facebook's struggles to effectively keep political misinformation off its platform, to say nothing of misleading health claims, which, despite Facebook's more aggressive approach, still racked up four times as many views as posts from sites pushing accurate information, according to Avaaz.

But political misinformation is much more nuanced and constantly evolving, and even in what seem to be clear-cut cases, Facebook has, according to reports, at times incorrectly enforced its own policies or bent over backward to avoid possible political backlash.


Quran and Edelson both worried that Facebook's election study may not capture the full impact of aspects of the platform like its algorithms, billions of fake accounts, or private groups.

"You find what you go and you look for," Edelson said. "The great problem of elections on Facebook is not how the honest actors are working within the system."

Quran also said, though it's too early to say for sure, that because Facebook is asking users directly within its apps to join the study, sometimes in exchange for payment, it risks inadvertently screening out people who are distrustful of the company to begin with.

"We're already seeing posts on different groups that share disinformation telling people: 'Don't participate in the study, this is a Facebook conspiracy'" to spy on users or keep Republicans off the platform ahead of the election, he said. "What this could lead to, potentially, is that the people most impacted by disinformation are not even part of the study."

In a best-case scenario, Edelson said the researchers could learn valuable information about how our existing understanding of elections maps onto the digital world. Quran said the study could even serve as an "information ecosystem impact assessment," similar to environmental impact studies, that would help Facebook understand how changes it could make might impact the democratic process.


But both were skeptical that Facebook would make major changes based on this study or the 2020 elections more broadly. And Quran warned that, despite Facebook's efforts to make the study independent, people shouldn't treat its findings as definitive or allow it to become a "stamp of approval."

It took Facebook nearly four years from when it learned about Cambridge Analytica to identify the tens of thousands of apps that were also misusing data. And though it just published the results of its first independent civil rights audit, the company has made few commitments to implement any of the auditors' recommendations.
