A 16-year-old girl is suing Snap, Google, and Apple after a Marine sexually exploited her on Snapchat starting at age 12

The Snapchat logo seen on an iPhone in March 2022. Thomas Trutschel/Photothek via Getty Images
  • A 16-year-old girl has filed a class action lawsuit against Snap, Apple, and Google.
  • From 2018 until last year, the girl was groomed by an active-duty Marine who convinced her to send nude photos and videos.

A 16-year-old girl and her mother filed a class-action lawsuit in a California federal court this week against Snap, Apple, and Google, claiming the platforms fail to protect teen users from "egregious harm" and the spread of Child Sexual Abuse Materials (CSAM).

In the lawsuit, lawyers for the girl, who is identified as L.W., argue that Snap, the parent company of Snapchat, takes a reactive approach to protecting teens from abuse, requiring children to report their own abuse after it has occurred.

"The claims alleged in this case are not against the adult perpetrator – they are against three major technology companies who enable him and others to commit these crimes," the lawyers wrote in the suit, which Insider viewed.


Her lawyers argued that the "tools and policies" of Snap, Apple, and Google are designed to increase their wealth rather than protect the minors who use their products and apps.

In L.W.'s case, an adult man, identified in the lawsuit as B.P., began requesting nude photos from her when she was 12 years old. He saved her Snapchat photos and videos and shared them with others, said the lawsuit, filed Monday in California.


B.P., who was an active-duty Marine, acknowledged he used Snapchat because he knew his chats would disappear, the lawsuit states. He was convicted in a military court last year on charges relating to child pornography and sexual abuse following a criminal investigation, according to The Washington Post, which first reported on the case.

The lawsuit alleges that Snapchat is a "safe haven" for child sexual abuse, claiming the app's automatically disappearing messages feature opens the door "for exploitation and predatory behavior."

A Snapchat spokesperson told Insider it could not comment on active litigation but called the situation "tragic."

"Nothing is more important to us than the safety of our community. We employ the latest technologies and develop our own tools to help us find and remove content that exploits or abuses minors," the Snapchat spokesperson said, adding that it reports all detected CSAM to the National Center for Missing and Exploited Children.

The abuse and grooming at the center of the lawsuit happened over the course of more than two years, while the victim was between the ages of 12 and 16. The man also targeted "many other children," according to the lawsuit.


B.P. first contacted L.W. via Instagram in September 2018 when she was 12 years old. The two then connected using Snapchat, where the man asked the girl to send nude photographs.

When she refused, he sent the victim a nude photograph of himself, according to the lawsuit. He "manipulated and coerced" L.W. to continue sending nude photographs through April 15, 2021.

If she refused to send a photo, the man would "ridicule and berate" her, according to her lawyers. He also attempted to get her to meet him at a hotel or an Airbnb, invitations she refused, the lawsuit said. L.W. told her mother about the grooming and abuse in May last year, according to the lawsuit.

"Due to the physical and psychological harms, L.W. was assessed at a teen suicide outpatient program, and even an emergency room after a suicide attempt," the lawyers said. "She sought care from a personal therapist, psychiatrist, and was prescribed antidepressants."

Google and Apple were sued for allowing an app called Chitter

Apple and Google were named in the suit for allowing the app Chitter on their respective marketplaces. B.P. used the app to circulate photos and videos of the victim, according to the lawsuit. The lawyers said the companies were "enabling, recommending, and steering users to Chitter," and were profiting from in-app purchases.


Chitter allows two random users to connect and share messages, photos, and videos anonymously. The app is not named as a defendant in the suit. The lawyers argued the application had gained a reputation for attracting users who want to spread CSAM.

Chitter did not return Insider's request for comment sent Friday.

Apple and Google removed the app from their respective marketplaces this week after being contacted for comment by The Washington Post.

A spokesperson for Apple told Insider on Friday that it removed the app because of repeated violations of guidelines related to "proper moderation of all user-generated content." A Google spokesperson told Insider it was "deeply committed to fighting online child sexual exploitation and abuse" across its products.

Snapchat was also sued last month by the family of a 17-year-old boy who died by suicide in 2015. His parents sued Snap and Meta, Facebook's parent company, in a Wisconsin court on April 11, arguing the companies were "aware of the addictive nature of their products and failed to protect minors in the name of more clicks and additional revenue."


US politicians have publicly criticized big tech companies for their policies and actions relating to protecting young users, especially after former Facebook employee Frances Haugen last year shared internal Facebook data with the media and Congress showing that its products were harmful to teenagers.
