Eating Disorder Helpline Pulls AI Chatbot After It Gives Users ‘Harmful’ Advice

Chatbots are most often used for low-level customer service and sales task automation, but researchers have been trying to make them perform more sophisticated tasks such as therapy. (Tero Vesalainen/Shutterstock)
Katabella Roberts

An eating disorder association took down its artificial intelligence (AI) chatbot less than a week before it was set to replace its human-run helpline after discovering that it was giving “harmful” advice to users.

The National Eating Disorders Association (NEDA), a nonprofit that supports individuals and families affected by eating disorders, said in a May 31 Instagram post that it had pulled its chatbot, named Tessa, after discovering that it “may have given information that was harmful and unrelated to the program.”

“We are investigating this immediately and have taken down that program until further notice for a complete investigation,” NEDA said.

The decision to scrap the chatbot came after NEDA officials announced that the nonprofit would end its human-staffed helpline, which had operated for nearly 20 years, on June 1 and replace the staff with the AI-powered chatbot.

That announcement came just four days after NEDA’s staff decided to unionize, following calls for more adequate staffing and ongoing training, as well as an “equitable, dignified, and psychologically safe workplace,” according to a May 4 blog post by Abbie Harper, a hotline associate and member of the Helpline Associates United union.

‘Union Busting, Plain and Simple’

“NEDA claims this was a long-anticipated change and that AI can better serve those with eating disorders. But do not be fooled—this isn’t really about a chatbot,” Harper wrote.

Harper said in her post that she and her three colleagues tried to get the company to make meaningful changes in the workplace, and when that failed, they organized a union and requested voluntary recognition from the company around Thanksgiving last year.

When NEDA refused to recognize the union, the employees filed for an election with the National Labor Relations Board and won the election on March 17, Harper said.

But four days after the election results were certified, the employees were told they would all be let go and replaced by a chatbot, she said.

“This is about union busting, plain and simple,” she added.

However, NEDA officials told NPR that the decision to scrap the hotline, which is run by both paid staffers and volunteers, had nothing to do with the unionization and was instead due to an increasing number of calls that led to longer wait times.

NEDA Vice President Lauren Smolar told NPR that the large number of crisis calls was also creating more legal liability for the organization and that the situation was becoming “unsustainable.”

“And that’s, frankly, unacceptable in 2023 for people to have to wait a week or more to receive the information that they need, the specialized treatment options that they need,” Smolar said.

“Our volunteers are volunteers. They’re not professionals. They don’t have crisis training. And we really can’t accept that kind of responsibility,” Smolar continued. “We really need them to go to those services that are appropriate.”

Chatbot Suggests Regularly Weighing, Measuring

Issues with the Tessa chatbot were initially highlighted by body positivity activist Sharon Maxwell, according to reports.

Maxwell said on Instagram that she had tested the chatbot several times, asking it multiple questions about weight loss. The bot responded with advice on how to “sustainably lose weight,” recommending that she aim to lose 1–2 pounds per week and weigh and measure herself weekly.

According to Maxwell, the chatbot gave her the advice despite her telling it that she had previously dealt with an eating disorder.

“Every single thing Tessa suggested were things that led to the development of my eating disorder,” Maxwell said. “This robot causes harm.”

Sarah Chase, vice president of communications and marketing at NEDA, initially appeared to deny Maxwell’s claims but retracted her comment after viewing screenshots of the interaction with the chatbot, acknowledging that the activist’s account was “correct.”

NEDA said on its website that it partnered with California-based software company X2AI on the Tessa “wellness” chatbot.

The bot works like a “coach or therapist” that “makes you feel better, by chatting about your feelings,” according to its makers, who say studies showed that chatting with the bot led to a 28 percent reduction in symptoms of depression and an 18 percent reduction in anxiety among users.

It’s unclear how NEDA will staff the helpline going forward.

A spokesperson for NEDA told The Epoch Times in an emailed statement that the Helpline was never intended to provide ongoing support for individuals.

“The intention was to always connect them with care options to support that,” the spokesperson said.

“The Tessa chatbot was taken down over the weekend after it came to our attention that it provided ‘off-script’ language. This was not how the chatbot was programmed, and X2AI/Cass’ arrangement was to run the Body Positive program with zero opportunity for generative programming,” the spokesperson said.

“We now know that over the weekend the chatbot was hacked and somehow was able to go off the pre-approved programmed responses. We will not be putting Tessa back on our website until we are confident this is not a possibility again,” they added.

Katabella Roberts is a news writer for The Epoch Times, focusing primarily on the United States, world, and business news.