Helpline workers for the National Eating Disorder Association say they are being replaced by AI

Britney Nguyen   

  • An eating disorder nonprofit is shutting down its helpline and reportedly firing staffers.
  • The National Eating Disorders Association is planning to transition to an AI chatbot.

The largest nonprofit organization supporting people with eating disorders is firing the human staff and volunteers who run its telephone helpline, NPR reported. The helpline will be shut down and the organization will transition to an AI chatbot named Tessa, a spokesperson for the National Eating Disorders Association (NEDA) confirmed in a statement to Insider.

"We are adding Tessa as a new opportunity and ending the Helpline program, but bear in mind these two services are NOT comparable," the statement said. "It is a completely different program offering and was borne out of the need to adapt to the changing needs and expectations of our community."

Staff who are part of the Helpline Associates United at NEDA said in a Twitter statement that they were told that they would be fired and replaced with a chatbot on June 1. The staff members won federal recognition for their union on March 17, according to the statement, and wrote that two weeks after the election to form a union, they were told they would lose their jobs.

The NEDA helpline had six paid staffers, NPR reported, and was also staffed by up to 200 volunteers at any given time.

When Insider called the helpline, an automated voice message said it was no longer accepting calls. People can still text NEDA to 741741 to reach a volunteer at the Crisis Text Line.

Tessa is a chatbot that is focused on mental health and eating disorder prevention, according to its website. It originally launched in 2021, a NEDA spokesperson said.

"Please note that Tessa does not replace therapy nor the NEDA Helpline, but is always available to provide additional support when needed," Tessa's website says. "Tessa is not equipped to provide crisis support, but she will provide crisis resources when prompted."

The helpline, which launched in 1999, served 69,718 individuals and families last year, the NEDA spokesperson said.

But there were gaps in the service: many people reached out on weekends and after hours, and callers often waited days for a response, the spokesperson said.

Lauren Smolar, the vice president of mission and education at NEDA, told NPR that so many people calling the helpline with a crisis could mean more legal liability.

"Our volunteers are volunteers," Smolar said. "They're not professionals. They don't have crisis training. And we really can't accept that kind of responsibility. We really need them to go to those services who are appropriate."

Ellen Fitzsimmons-Craft, a professor of psychiatry at Washington University's medical school, told NPR that NEDA paid her team to create Tessa years ago and that it's "a tool in its current form that's going to help you learn and use some strategies to address your disordered eating and your body image."

Fitzsimmons-Craft told NPR that small studies done by her team with Tessa showed people who used the chatbot did better than people who had to wait on NEDA's waitlist to receive services.

But members of the helpline's union say the move is retaliation for their organizing and that a human-operated helpline is crucial for people who are struggling.

"Some of us have personally recovered from eating disorders and bring that invaluable experience to our work," helpline staffer Abbie Harper wrote in a blog post. "All of us came to this job because of our passion for eating disorders and mental health advocacy and our desire to make a difference."

Marzyeh Ghassemi, a professor at MIT who studies machine learning and health, told NPR that using a chatbot over humans could be harmful and said her research shows people who need help want to be understood.

"If I'm disclosing to you that I have an eating disorder, I'm not sure how I can get through lunch tomorrow — I don't think most of the people who would be disclosing that would want to get a generic link," Ghassemi told NPR. "Click here for tips on how to rethink food."

"We're not quitting. We're not striking. We will continue to show up every day to support our community until June 1," the helpline union said in a statement shared with Insider. "A chat bot is no substitute for human empathy, and we believe this decision will cause irreparable harm to the eating disorders community."
