AI girlfriends will only break your heart, privacy experts warn

Replika brought back its erotic roleplay feature on Friday. Getty Images
  • A survey of the burgeoning AI romance app space revealed serious privacy failings.
  • The chatbots foster "toxicity" and relentlessly pry user data, a Mozilla Foundation study found.

There's a potentially dangerous reality looming beneath the veneer of AI romance, according to a new Valentine's Day-themed study, which concluded that the chatbots can be a privacy nightmare.

Internet nonprofit The Mozilla Foundation took stock of the burgeoning landscape, reviewing 11 chatbots and concluding that all were untrustworthy — falling within the worst category of products it reviews for privacy.

"Although they are marketed as something that will enhance your mental health and well-being," researcher Misha Rykov wrote of romantic chatbots in the report, "they specialize in delivering dependency, loneliness, and toxicity, all while prying as much data as possible from you."


According to its survey of the space, 73% of the apps don't share how they manage security vulnerabilities, 45% allow weak passwords, and all but one (Eva AI Chat Bot & Soulmate) share or sell personal data.

Furthermore, the privacy policy for CrushOn.AI states it can collect information on users' sexual health, prescription meds, and gender-affirming care, per the Mozilla Foundation.


Some apps feature chatbots whose character descriptions include violence or underage abuse, while others warn users that the bots could be unsafe or hostile.

The Mozilla Foundation noted that in the past, apps had encouraged dangerous behavior, including suicide (Chai AI) and an assassination attempt on the late Queen Elizabeth II (Replika).

Chai AI and CrushOn.AI didn't respond to Business Insider's request for comment. A representative for Replika told BI: "Replika has never sold user data and does not, and has never, supported advertising either. The only use of user data is to improve conversations."

An EVA AI spokesperson told BI that it was reviewing its password policies to better protect users, and that it works to keep "meticulous control" of its language models.

EVA said it prohibits discussion of an array of topics including pedophilia, suicide, zoophilia, political and religious opinions, sexual and racial discrimination, and many more.


For those who find the prospect of AI romance impossible to resist, the Mozilla Foundation urges several precautions, including not saying anything you wouldn't want a colleague or family member to read, using a strong password, opting out of AI training, and limiting the app's access to other mobile features such as your location, microphone, and camera.

"You shouldn't have to pay for cool new technologies with your safety or your privacy," the report concluded.
