Facebook may restrict accounts that repeatedly spread misinformation

Facebook promises to curb misinformation with its new initiatives that will flag offenders. Facebook
  • Facebook is taking action against people who repeatedly share misinformation.
  • With the help of independent fact-checkers, the platform will cut the reach of posts containing false information.
Facebook has promised to take stricter action against users who frequently share misinformation and fake news. The US-based social media giant is introducing penalties to tackle offenders by restricting their accounts.

Misinformation on social media is a long-standing problem, and these companies have launched several initiatives in the past to curb its spread. The wave of misinformation around Covid-19, however, appears to have jolted them into treating the issue with greater urgency.

"Whether it's false or misleading content about COVID-19 and vaccines, climate change, elections, or other topics, we're making sure fewer people see misinformation on our apps," Facebook said in a blog post.

"Starting today, we will reduce the distribution of all posts in News Feed from an individual's Facebook account if they repeatedly share content that has been rated by one of our fact-checking partners," it added.

This means Facebook will cut the reach of posts shared by a person who, according to the company's fact-checking partners, has a history of sharing misinformation. Flagged posts from that individual will appear lower in users' News Feeds and reach fewer people.


This is how the pop-up will look when you're following a page rated by fact-checkers. Facebook

The social media giant is also rolling out improved alerts that notify users when they engage with content that has been rated by a fact-checking partner. The update is meant to make it easier for anyone to recognise that the information could be unreliable. Facebook also says it wants to show users a helpful pop-up before they hit like on a page that has repeatedly shared content rated by fact-checkers.

"We want to give people more information before they like a Page that has repeatedly shared content that fact-checkers have rated, so you'll see a pop up if you go to like one of these Pages," said Facebook.

Redesigned notifications when people share fact-checked content. Facebook

The notification will include the fact-checkers' article debunking the claims made in the shared post, along with a prompt to share that corrective article with their connections. It will also warn users that sharing false information may cause their posts to be ranked lower in News Feed, so they are seen by fewer people.

SEE ALSO:
WhatsApp moves the Delhi High Court against India’s new IT laws concerning traceability of users
Here’s what India’s new IT rules want — and why WhatsApp, Facebook, Twitter and others are reluctant to comply