Meta launched an investigation after a woman said she was groped by a stranger in the metaverse

Horizon Worlds opened to users 18 and above on December 9. SOPA Images/Contributor/Getty Images
  • A beta tester of Meta's Horizon Worlds said she was groped on the platform, per The Verge.
  • An internal Meta investigation concluded that the victim hadn't enabled safety features.

A woman has said she was groped by a stranger while working as a beta tester on Meta's Horizon Worlds platform.

The Verge first reported the news.

Horizon Worlds enables users to socialize and play games with up to 20 other avatars. In a post in the beta testers' official Facebook group, the unnamed woman said that her avatar was groped by a stranger and that other users present had supported the behavior, The Verge reported.

"Sexual harassment is no joke on the regular internet, but being in VR adds another layer that makes the event more intense," the woman wrote. She said that incident occurred in the Plaza, one of the main areas that people gather, per The Verge.

On December 9, Meta opened Horizon Worlds to users aged 18 and above in North America. The platform is thought to closely resemble Mark Zuckerberg's vision of the metaverse.


Following the woman's post, Meta launched an internal investigation into the incident, which, according to MIT Technology Review, occurred on November 26. The woman wrote the post on December 1, the publication reported.

It added that Meta concluded that at the time of the incident, the victim hadn't turned on a safety feature designed to prevent harassment on the platform.

Vivek Sharma, Meta's VP of Horizon, called the incident "absolutely unfortunate" but told The Verge that it was good feedback because he wanted to make the Safe Zone blocking tool "trivially easy and findable."

Safe Zone can be activated by users if they feel threatened on the platform. The feature is designed to stop other users from interacting with them.

Kristina Milian, a spokesperson for Meta, told MIT Technology Review that the company would continue to improve its user interface to understand how its tools are used and allow users to report issues easily and reliably.


Meta did not immediately reply to Insider's request for comment.

Meta, which rebranded from Facebook in October 2021, has faced repeated allegations that it fails to keep users safe on its platforms.

A trove of documents released by whistleblower Frances Haugen shed light on the company's internal practices.

Facebook's internal research showed Instagram had an adverse effect on teenage mental health, according to one document. Company leaders also repeatedly dismissed concerns over Meta's business practices.

In November, the Financial Times revealed a separate leaked memo in which Meta executive Andrew Bosworth said it was "practically impossible" to moderate the behavior of metaverse users on a meaningful scale.
