Facebook says it doesn't read WhatsApp messages, but an investigation found it actually does

  • Facebook-owned messaging service WhatsApp touts privacy as its core feature.
  • Facebook has repeatedly said it doesn't read messages sent between users.
  • A new investigation found that Facebook employs vast teams of WhatsApp moderators who read messages.

Facebook-owned WhatsApp, a messaging service used by billions of people around the world, isn't as private as Facebook says it is.

The service touts privacy as its core feature, and Facebook says it can't read messages sent between users. But Facebook is reportedly paying teams of contractors around the world to read through and moderate users' supposedly private WhatsApp messages.

As a new ProPublica investigation highlights, Facebook employs "more than 1,000 contract workers filling floors of office buildings in Austin, Texas, Dublin and Singapore, where they examine millions of pieces of users' content."


Those contractors, whose existence Facebook acknowledges, reportedly spend their days sifting through content flagged by WhatsApp users and by the service's own algorithms.

A Facebook representative told Insider that it allows users to report abuse, and those reports are then reviewed by contractors. When a user reports abuse, WhatsApp moderators are sent "the most recent messages sent to you by the reported user or group," according to WhatsApp's FAQ.

Facebook is still unable to listen to personal calls or read messages sent through WhatsApp, according to Facebook, due to the service's use of encryption.


Facebook owns and operates Facebook, WhatsApp, Facebook Messenger, Instagram, and Oculus VR. Hakan Nural/Anadolu Agency via Getty Images

WhatsApp is built on "end-to-end" encryption, which means messages are scrambled before being sent and only unscrambled when they're received by the intended user. But when a user reports abuse, unencrypted versions of the messages are sent to WhatsApp's moderation contractors, ProPublica reports.

"Every day WhatsApp protects over 100 billion messages with end-to-end encryption to help people communicate safely. We've built our service in a manner that limits the data we collect while providing us the ability to prevent spam, investigate threats, and ban those engaged in the worst kind of abuse," a WhatsApp spokesperson at Facebook said in a statement sent to Insider. "We value our trust and safety team who work tirelessly to provide over two billion users with the ability to communicate privately."

Facebook has owned WhatsApp since 2014, when the social media giant purchased the then-nascent messaging app for $19 billion.


Update: Following publication, a WhatsApp spokesperson at Facebook sent another statement specifically addressing the investigation. "WhatsApp provides a way for people to report spam or abuse, which includes sharing the most recent messages in a chat," the statement said. "This feature is important for preventing the worst abuse on the internet. We strongly disagree with the notion that accepting reports a user chooses to send us is incompatible with end-to-end encryption."

Got a tip? Contact Insider senior correspondent Ben Gilbert via email (bgilbert@insider.com), or Twitter DM (@realbengilbert). We can keep sources anonymous. Use a non-work device to reach out. PR pitches by email only, please.
