Facebook is now actively demoting false election posts and limiting livestreams in an attempt to crack down on growing misinformation about votes

Left to right: US President Donald Trump, Facebook CEO Mark Zuckerberg, and Democratic presidential candidate Joe Biden. Chip Somodevilla, Abdulhamid Hosbas/Anadolu Agency, Drew Angerer, all via Getty Images
  • Facebook will now actively demote posts containing election misinformation on both Facebook and Instagram, it told Forbes on Thursday.
  • It will also limit the spread of Facebook Live videos related to the election.
  • This marks a more aggressive stance for the platform, which previously focused on labeling posts that were potentially false without limiting their spread.
  • As counting continues for the 2020 presidential election, Facebook is "seeing more reports of inaccurate claims about the election," it told Forbes.

Facebook has moved from labeling posts that contain misinformation about the 2020 presidential election to actively demoting them and limiting their spread on the platform, it said Thursday.

It will now demote content related to election misinformation on Facebook and Instagram — including "debunked claims about voting" — and limit the distribution of election-related livestreams on Facebook, it said in a statement to Forbes.

These posts are therefore less likely to appear in people's news feeds.


This marks Facebook taking a more active role in curbing the spread of misinformation. Up until now, the platform had focused on adding labels to posts rather than limiting their reach. Because labels alone don't restrict sharing, "false information [could] still quickly spread" on Facebook, one expert told Business Insider.

Facebook also said that "when people try to share a post that features an informational election label, they will see a message encouraging them to visit the Voting Information Center for reliable election information."


As counting continues, Facebook is "seeing more reports of inaccurate claims about the election," it told Forbes.

"While many of these claims have low engagement on our platform, we are taking additional temporary steps ... to keep this content from reaching more people," it said.

Facebook also plans to add more "friction" before people can share posts related to the election, sources close to the matter told the New York Times Thursday. This may include "an additional click or two," they said. It is unclear whether this is related to the measures Facebook mentioned to Forbes.

Facebook did not immediately respond to a request for comment.



The pop-up message encouraging users to visit Facebook's voting information page will appear on election posts that are labeled. These labels include:

  • "The US has laws, procedures, and established institutions to ensure the integrity of our elections."
  • "Differences between final results and initial vote counts are due to it taking several days after polls closed to ensure all votes are counted."
  • "Voting by mail has a long history of trustworthiness in the US. Voter fraud is extremely rare across voting methods."
  • "Election officials follow strict rules when it comes to ballot counting, handling and reporting."
  • "The winner of the 2020 US Presidential Election has not been projected."

Facebook's announcement came just hours after it shut down a viral Facebook group that spread conspiracy theories about voter fraud and included "worrying calls for violence." In just two days, the "Stop the Steal" group, which accused Democrats of trying to steal the presidential election and spread theories about voter fraud, had amassed 365,000 members.

As well as removing the group, Facebook blocked the #StopTheSteal hashtag as well as #Sharpiegate.

Misinformation has festered on Facebook during election week

Calls to violence have reportedly been spiking on the site.

Facebook uses an internal tool that tracks "violence and incitement trends" to measure how posts, dangerous search terms, and hashtags could lead to real-world violence, BuzzFeed News reported Thursday. That metric spiked 45% during the week of the election.


Facebook has also tightened its rules against voter suppression and intimidation, cracked down on disinformation networks, and spun up its election operations center, which has been actively working with law enforcement to spot foreign influence efforts.

Facebook has also rolled out a system for flagging posts for review by moderators before they go viral.

The site also temporarily turned off political group recommendations – though groups such as "Stop the Steal" were still able to go viral.

Facebook also limited how many chats people can forward a message to on Messenger to "reduc[e] the risk of misinformation and harmful content going viral," and after Election Day, it suspended political advertising for an indefinite period.
