
A sexual assault in the metaverse has investigators questioning the future of virtual crime prosecution

Katherine Tangalakis-Lippert   

  • UK authorities are investigating the claim of a simulated gang rape of a teenager's VR avatar.
  • The teen told the police she was using a headset to play a VR video game when male players attacked her avatar.

A teenager's claim that her avatar was gang-raped in an immersive virtual-reality game is being investigated by UK authorities, who are said to be considering the novel question of whether such an act in the metaverse can be criminally charged.

The girl, who was identified only as being under the age of 16, was wearing a VR headset to play online when several male players attacked and "gang-raped" her digital avatar, British police sources told The Daily Mail.

Though she was not physically injured, the outlet reported the girl was deeply distraught after the incident, and a senior police officer familiar with the case told The Daily Mail she experienced trauma similar to a real-life assault.

Donna Jones, the chair of the Association of Police and Crime Commissioners, confirmed to the BBC that the incident was first reported to authorities in 2023, triggering a police investigation, though the BBC could not verify which force launched the probe.

No matter how the British police ultimately decide to handle the incident, law enforcement and safety researchers say concerns over sexual harassment and violence in the metaverse need to be addressed as virtual- and augmented-reality technology becomes more convincing.

VR goggles cover users' peripheral vision to create an immersive experience. Depending on their settings, players may also feel vibrations through their handheld controllers in response to in-game stimuli.

While users who find their characters in precarious in-game situations involving other player-controlled characters may not be under direct physical threat, researchers say the immersive nature of a VR experience can heighten the emotional response to the content presented through goggles or sensations registered through haptic suits. These touch-sensitive full-body suits vibrate in response to virtual stimuli, reacting when users' characters bump into a wall or receive a punch, for example.

In-game actions with a psychological toll

"The boosters of this technology cannot have it both ways," Katherine Cross, who researches online harassment at the University of Washington, told Business Insider. "They cannot tout the realism of these virtual worlds and then deny or downplay that ugly things that happen in them have some of the unfortunate downstream effects of real behavior. If it's real enough to be marketable in a unique way, it's real enough for there to be social consequences and psychological consequences when something goes wrong."

Cross said the core of VR technology relies on tricking a user's brain at a fundamental level into thinking that they're physically experiencing the things on screen by mimicking sensations experienced in the real world, like walking through space or swimming. That trickery of the brain is why, she says, users can sometimes feel a bit disoriented for a few seconds once they take the headset off and realize they're still standing in their living room or on a convention show floor.

"And what that means is that if something potentially traumatic happens in that space, you may immediately, or almost immediately, consciously realize it's just a game and this isn't really happening, but there's that moment where your lizard brain sort of has to catch up," Cross said. "So it's not unreasonable to think that that could lead to trauma."

Even as safety researchers and law enforcement raise the potential real-world ramifications of VR attacks and harassment, debate rages on in online forums like Reddit about the impact of virtual sexual assault, with some users suggesting that claims of being traumatized by virtual attacks minimize the experiences of "actual rape victims."

On Instagram, in response to a New York Post article about the incident, users joked the assailants who harassed the girl online should be sent to "virtual jail." Others quipped they were waiting for justice after their character in "Call of Duty," a first-person shooter game, was killed.

"I know it is easy to dismiss this as being not real, but the whole point of these virtual environments is they are incredibly immersive," UK Home Secretary James Cleverly told LBC about the incident. "And we're talking about a child here, and a child has gone through sexual trauma. It will have had a very significant psychological effect and we should be very, very careful about being dismissive of this."

He added: "It's also worth realizing that somebody who is willing to put a child through a trauma like that digitally may well be someone that could go on to do terrible things in the physical realm."

Sexual harassment on 'Horizon Worlds'

Claims like the British girl's are not unheard of, with multiple accounts of virtual sexual harassment stemming from Meta's "Horizon Worlds" VR game — though it remains unclear if the incident took place inside Meta's game or another VR world.

In 2022, a metaverse researcher who was studying user behavior on "Horizon Worlds" wrote that her avatar was raped roughly an hour into her first session.

"One part of my brain was like wtf is happening, the other part was like this isn't a real body, and another part was like, this is important research," the researcher said in her report of the incident, adding that the users who assaulted her avatar had requested she disable her 4-foot safety bubble before initiating the attack.

Months earlier, in 2021, another metaverse researcher named Nina Jane Patel said in a post on Medium that three to four male-looking avatars had gang-raped her avatar within 60 seconds after she joined "Horizon Worlds," calling the incident a surreal nightmare.

"The girl involved is very brave," Patel told BI. "Bringing this to the attention of the police would have been no easy task, and she is breaking ground with her actions. While we don't know where it will lead, it is a step in the right direction."

In the summer of 2022, following early reports of sexual harassment and simulated assault on the platform, The Verge reported that Meta had expanded the acceptable types of content in "Horizon Worlds" to include "mature" content for users over the age of 18, including depictions of alcohol, tobacco, and marijuana use as well as "near nudity, depictions of people in implied or suggestive positions, or an environment focused on activities that are overly suggestive."

However, "nudity, depictions of people in explicit positions, or content or worlds that are sexually provocative or implied" remains prohibited in public spaces, according to Meta's policy on mature content on the site. In-game avatars are depicted from the torso up and, therefore, do not have legs or genitals that can be seen while playing. However, users can simulate sex with provocative positions of their avatars.

Representatives for Meta did not respond to a request for comment from Business Insider. A spokesperson for the tech giant told Metro: "The kind of behavior described has no place on our platform, which is why for all users, we have an automatic protection called personal boundary, which keeps people you don't know a few feet away from you."

Defining the new frontier of cybercrime

While this isn't the first sexual attack reported in virtual reality, it is thought to be the first time UK authorities are investigating whether such an attack can be charged as a crime.

Patel told BI that there's a need for specific legislation that addresses the unique nature of offenses in the metaverse, including defining and criminalizing acts of grooming, bullying, and harassment in virtual environments. She also advocates for creating stringent age verification systems, privacy controls, and parental supervision tools tailored to the metaverse's immersive experience without stifling the innovation and freedom that make the alternate reality world compelling.

"Protecting children in the Metaverse requires a multifaceted approach: psychologically informed safeguards to prevent trauma, robust legal frameworks to define and prosecute offenses, and international cooperation to effectively enforce these laws," Patel told BI. "This is a critical area that demands immediate attention to ensure the Metaverse is a safe and positive space for young users."

Cross, however, isn't sure that one-size-fits-all legislation is the correct answer. She says laws that criminalize behavior in the metaverse treat a symptom rather than the cause of the problem, and that government enforcers "may not be able to provide the relief that people seek or deserve."

"I think that, ultimately, it is on the platform holders to engage with the public more openly on these issues," Cross told BI. "And establish a serious package of reforms that they can bring forward to empower users to give them organizing tools, not just private moderation tools, but the ability to effectively police their own communities and work hand in hand with an expanded moderation and trust and safety team."

Cross added that she believes a more effective arena of legislation would require large companies to have sufficiently staffed trust and safety teams working to address issues of virtual harassment, putting the onus back on the businesses rather than individuals to ensure the safety of online and virtual reality platforms.

Though existing laws prohibit cybercrime, including fraud, harassment, and the distribution of child-sexual-abuse material online, an investigator familiar with the UK case told The Daily Mail they were unsure whether the teen's allegation could be prosecuted under current law, saying "current legislation is not set up for this."

"We are beginning to think about what is a crime in the metaverse and how we police it," Graeme Biggar, the director of the UK's National Crime Agency, told The Evening Standard.

He added: "It's not dominating our thinking because there is plenty of real-world crime for us to be getting on with, but if you are in the metaverse wearing a haptic suit where you can sense what is happening to [you] and then you are sexually assaulted, raped, or murdered, even if you are not wearing a haptic suit, is that OK?"
