WhatsApp is touting steps taken to cut the viral spread of coronavirus misinformation, but experts question whether it's done enough

A woman holds a sign with the image of presidential candidate Jair Bolsonaro that reads "He lies in WhatsApp," during a protest in 2018. Reuters/Nacho Doce
  • WhatsApp, the Facebook-owned encrypted messaging platform, said last month it had seen a 70% drop in "frequently forwarded" messages on its app after enacting limits on use of the feature in response to the spread of coronavirus misinformation.
  • But this isn't the first time WhatsApp has dealt with viral misinformation that could have deadly consequences: Dozens of people in India were killed in 2018 after false rumors accusing them of child kidnapping spread on the platform.
  • However, WhatsApp's handling of misinformation extends beyond forwarded messages. Disinformation experts say that since WhatsApp is used as a way to communicate intimately with those you trust, users often won't verify the information they're sent.
  • Even as Facebook lauds WhatsApp's success in limiting the spread of viral rumors, coronavirus-related misinformation continues to proliferate across Facebook-owned platforms.

After years of battling misinformation on the social media frontlines, WhatsApp has a wealth of hard-earned experience to help it defuse coronavirus hoaxes and half-truths. But the Facebook-owned messaging app's numerous measures still fall short of what the challenge demands, according to several misinformation experts who spoke with Business Insider.

WhatsApp told Business Insider last week that the encrypted messaging app saw a 70% drop in "frequently forwarded" messages — messages that have already been forwarded at least five times — after it enacted limits in April on users' ability to pass such messages along en masse. However, the restriction, designed to contain the sharing of coronavirus misinformation, fails to get at the root of how the platform can be weaponized to spread false rumors, the experts said.

The features that make WhatsApp popular throughout the world are also what make the app fertile ground for the coronavirus "infodemic." And while parent company Facebook has deployed a battery of defenses across its three major platforms — Facebook, Instagram, and WhatsApp — the methods don't always account for the nuances of each platform and its users.


"Unlike a Facebook feed where you scroll past legacy Facebook friends, you're more likely to read every single message [on WhatsApp]. It's filled with people you trust," said Aimee Rinehart, the deputy director of disinformation-fighting nonprofit First Draft.

On Facebook and Instagram, for instance, the company removes posts and notifies users when they have interacted with coronavirus misinformation. But WhatsApp puts the onus on users to fact-check what they're being told.


And because of the way WhatsApp is designed — making users feel safe using the platform, while also enabling them to easily share information — the app is especially vulnerable to the spread of false information and rumors.

"It can be a very potent way to communicate and coordinate, whether that's for political messaging or your family barbecue. It's made to share information," Rinehart said.

With 2 billion users, WhatsApp is an information-sharing platform that appeals to health authorities and malefactors

In the last couple of months, hoaxers have used WhatsApp to widely spread false claims about the coronavirus. Business Insider reported Wednesday that users in the United Kingdom were spreading rumors about when the country's lockdown order would be lifted, including a bogus report that the country would reopen in August. The platform has also been home to viral messages sharing false COVID-19 treatments and faulty disease-prevention advice, as well as a spoofed audio clip claiming the UK government was planning to cook a lasagna inside a soccer stadium.

FILE PHOTO: The WhatsApp app logo is seen on a smartphone in this illustration. Thomson Reuters

WhatsApp's vast reach has made it an important platform for health authorities and fact checkers to disseminate information. Both the World Health Organization and the UK's National Health Service, as well as The Poynter Institute's International Fact-Checking Network, have partnered with WhatsApp to integrate chatbots into the platform that users can message to get up-to-date information and dispel false rumors.


With more than two billion users, the Facebook-owned WhatsApp is one of the most popular messaging apps in the world, and is especially dominant in countries across Asia and Latin America. In Brazil, more than 90% of mobile internet users use the messaging app. More than 400 million people in India — of an estimated 450 million smartphone users, according to research firm Counterpoint — use WhatsApp.

The app is particularly popular due to its end-to-end encryption, which "ensures only you and the person you're communicating with can read what's sent." However, this feature also makes it difficult for WhatsApp to police messages sent on the platform — such as those used to share child pornography — or to trace the dissemination of false information.

In countries where WhatsApp is particularly dominant, the platform has become a source of widespread disinformation campaigns. Ahead of Brazil's presidential election in 2018, thousands of viral messages containing false information were spread on WhatsApp, largely by supporters of Jair Bolsonaro, the far-right candidate who went on to win.

In India, dozens of people were killed in 2018 after rumors circulated on WhatsApp about "child-lifters" kidnapping kids across the country. Even as authorities tried to dispel rumors, edited videos purporting to show kidnappings went viral on the platform for months, and mobs gathered to attack men and women they assumed to be responsible.

Users in developing countries can't afford to leave Facebook and fact-check info on the web

WhatsApp introduced the message-forwarding feature to its app in 2016 and has since lauded it as a way to pass on helpful information, share memes, and organize mass movements online. Not long after reports of mob lynchings in India garnered national attention, WhatsApp rolled out a series of features to crack down on the spread of misinformation: In 2018 the platform began labeling messages that had been forwarded, then in early 2019 it limited how many chats users could forward a message to at once. The moves resulted in a 25% decrease in message forwards, according to WhatsApp.


India's Rapid Action Force (RAF) personnel pose for pictures inside their base camp in New Delhi. Adnan Abidi/Reuters

However, users devoted to weaponizing the platform have found ways to bypass these limitations, according to Kanishk Karan, a disinformation researcher with the Atlantic Council's Digital Forensic Research Lab. Although Karan commends the step WhatsApp took last month as a "game changer" in further limiting the number of forwarded messages, he told Business Insider it won't be effective in "fully" wiping disinformation from the platform.

"We've seen during our research that forwarding messages without confirming the veracity of the information is somewhat of a habit in India," Karan told Business Insider. "The app cannot fully eradicate false and misleading information on its platform without severely compromising the end-to-end encryption feature, which is what makes it popular amongst Indian users to begin with."

Further complicating users' incentives to fact-check information sent to them is Facebook's zero-rating initiative, says Rinehart, the deputy director at First Draft. Through Facebook Zero, mobile phone users on certain carriers in some countries — including Brazil, the Philippines, and several African countries — can access Facebook's family of platforms without it counting toward their data plans. Rinehart told Business Insider this leads users to rely on Facebook-branded platforms for all their news and communication, and removes the incentive to look elsewhere for verification.

"If they see a piece of misinformation, they're less inclined to go online and look for the truth," Rinehart said. "It's a huge boom for people who can't afford a cell-plan otherwise, but it's also a hindrance for people to see the truth outside of the information on these platforms. It's created a scourge of problematic content."


In response to questions, a WhatsApp spokesperson pointed Business Insider toward several blog posts, as well as WhatsApp's Coronavirus Information Hub.
