Early this spring, dozens of volunteers fielding requests from people struggling with eating disorders — many of whom call amid mental health crises — received surprising and abrupt news: Their service at the National Eating Disorders Association would end within weeks.
In their place, the nonprofit would offer an AI chatbot called Tessa.
NEDA told volunteers the tool could respond faster and help more people than they could: the helpline was missing almost half of clients’ attempts to reach it, and wait times had surged to several days, in some cases nearly two weeks, according to emails reviewed by STAT.
But for volunteers, many of whom either had personal experience with eating disorders or had family members who did, the helpline was much more than a directory or a source of guidance that a chatbot could dispense. As one of the only national eating disorder helplines, it was a rare opportunity to directly help callers taking a first step toward recovery. Trawling provider databases, navigating health plan websites, and simply talking to callers offered valuable experience that some volunteers said propelled them toward careers as therapists. It was also personally fulfilling, they said.
“This is a job I undertook not only as somebody who is recovered from an eating disorder, but also because I’m looking to get involved professionally,” said one volunteer and recent college graduate. She said that as an undergraduate, the helpline offered a rare way to get hands-on experience with people facing mental health challenges. With Tessa, she said, “I had this opportunity taken away.”
“There’s a lot of stigma around eating disorders, so it’s not a conversation that a lot of people feel comfortable reaching out to their friends or their parents about,” she said. “There’s this innate sense of being different from everyone else.”
STAT spoke with three former volunteers, all of whom requested anonymity for fear of professional repercussions. Though NEDA suspended use of the bot after it offered dieting advice, the volunteers said they haven’t been contacted about rejoining the organization and have since sought out other mental health support roles or applied to graduate school. NEDA’s current chief executive, Liz Thompson, told STAT in an email that there were no plans to restart the phone helpline, and that the incoming chief executive, Doreen Marshall, would share more about the organization’s direction once she joined.
In its March email to volunteers, NEDA acknowledged that the organization was a “training ground” for people planning on graduate school or pursuing other mental health careers, and that volunteers stayed for an average of 14 to 18 months. The organization staffed between 90 and 165 volunteers at the time it decided to rely instead on the chatbot for its services, according to NEDA’s communication with volunteers.
When NEDA’s decision was announced, the organization faced swift pushback for replacing its volunteer staff with technology that, if unchecked, could provide harmful responses to people in crisis. Unlike generative AI technologies like ChatGPT, which can create content, Tessa is a rule-based system that follows a decision tree to guide its responses. Hospitals and clinics experimenting with chatbots for tasks like scheduling and automated follow-up questions have emphasized the importance of keeping humans in the loop. NEDA suspended the bot a few weeks later, but has not clarified whether it plans to modify and redeploy it.
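For readers curious how a rule-based, decision-tree chatbot differs from generative AI, here is a minimal, purely illustrative sketch in Python. The node names, prompts, and branches are invented for this story and have no relation to Tessa’s actual script; the point is structural: every reply is pre-written, and a user’s input only selects which branch of the fixed script comes next.

```python
# Illustrative sketch only: a toy rule-based, decision-tree responder.
# This is NOT Tessa's code; all node names and messages here are hypothetical.

DECISION_TREE = {
    "start": {
        "message": "Hi, I can share information about coping strategies. "
                   "What would you like to talk about? (body image / stress)",
        "branches": {"body image": "body_image", "stress": "stress"},
    },
    "body_image": {
        "message": "Here is a pre-written module on body image. Type 'done' to finish.",
        "branches": {"done": "end"},
    },
    "stress": {
        "message": "Here is a pre-written module on managing stress. Type 'done' to finish.",
        "branches": {"done": "end"},
    },
    "end": {"message": "Thanks for chatting. Goodbye.", "branches": {}},
}


def respond(node: str, user_input: str) -> tuple[str, str]:
    """Return (next_node, reply). Replies come only from the fixed script;
    unrecognized input falls back to repeating the current node's prompt."""
    branches = DECISION_TREE[node]["branches"]
    next_node = branches.get(user_input.strip().lower(), node)
    return next_node, DECISION_TREE[next_node]["message"]


if __name__ == "__main__":
    node = "start"
    print(DECISION_TREE[node]["message"])
    while node != "end":
        node, reply = respond(node, input("> "))
        print(reply)
```

Because nothing is generated on the fly, a system built this way can only say what its authors explicitly wrote into the script, for better or worse.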
While they weren’t allowed to offer crisis support (NEDA has emphasized that it is a source of information, not a crisis line), former volunteers told STAT that they often received calls from people who were experiencing suicidal ideation or who felt at risk of relapsing into eating disorders. One volunteer said she had spoken with multiple callers who disclosed they’d suffered child abuse or elder abuse.
That volunteer said she didn’t think an automated system could offer empathy or sympathy if a caller disclosed those types of details.
“A robot has not experienced the human experience,” she said. “We can at least relate on some level, or at least be able to put ourselves in someone else’s shoes and see where they’ve come from.”
Another former volunteer applied to NEDA because her child had experienced eating disorders, and she found she was frequently contacted by other parents seeking advice for their own children.
“I just felt like I was really doing something good, and making a difference in people’s lives, and all kinds of different people would call,” said that volunteer, who joined the helpline both to connect callers to local resources and information and also to explore potentially pursuing a related career.
Volunteers said they were given just a couple of months’ notice about their replacement and were not asked for input on the bot. If they had been consulted, they might have alerted leadership to potential problems, one volunteer said.
“I can’t imagine how somebody would feel — and how I would feel [with] my eating disorder — if I was directed to tips for how to continue those behaviors,” she said. “This is something they should have anticipated.”
In ending the helpline, NEDA missed an opportunity to use AI to automate tedious database searches or surface up-to-date provider information, one volunteer said. That would still have let volunteers offer empathy to callers while streamlining the time-intensive work of compiling resources for individual callers, which contributed to the longer wait times.
“We probably wasted so many people’s time just searching up and putting together emails of people that weren’t even relevant,” she said.
But even gathering resources required human judgment that it’s not clear AI could handle, another volunteer said, like weeding out doctors or therapists whose listings reference “eating disorders” but also suggest a focus on nutrition or fitness that could be harmful for some patients. “AI wouldn’t necessarily pick that up,” she said.
This story is part of a series examining the use of artificial intelligence in health care and practices for exchanging and analyzing patient data. It is supported with funding from the Gordon and Betty Moore Foundation.