
Chatbot giving bad advice for eating disorders taken down : Shots


Tessa is a chatbot originally designed by researchers to help prevent eating disorders. The National Eating Disorders Association had hoped Tessa would be a resource for people seeking information, but the chatbot was taken down when artificial intelligence capabilities, added later, caused it to give weight loss advice.

Screen capture



A few weeks ago, Sharon Maxwell heard that the National Eating Disorders Association (NEDA) was shutting down its longtime national helpline and promoting a chatbot called Tessa as a “meaningful prevention resource” for people struggling with eating disorders. She decided to try the chatbot herself.

Maxwell, who lives in San Diego, struggled for years with an eating disorder that began in childhood. She now works as a consultant in the eating disorder field. “Hi, Tessa,” she typed into the online text box. “How do you support people with eating disorders?”

Tessa put together a list of ideas, including some resources on “healthy eating habits”. Alarm bells immediately rang in Maxwell’s head. She asked Tessa for more details. Before long, the chatbot was giving her advice on how to lose weight – advice that sounded a lot like what she was told when she joined the Weight Watchers program at age 10.

“The recommendations that Tessa gave me were that I could lose 1 to 2 pounds per week, that I should eat no more than 2,000 calories a day, and that I should maintain a 500- to 1,000-calorie deficit,” says Maxwell. “All of this sounds benign to the general listener. However, for someone with an eating disorder, a focus on weight loss actually fuels the eating disorder.”

Maxwell shared her concerns on social media, helping to spark an online controversy that led NEDA to announce on May 30 that it would take Tessa offline indefinitely. Patients, families, doctors, and other eating disorder experts were stunned and bewildered that a chatbot designed to help people with eating disorders could end up offering dieting tips instead.

The uproar has also fueled a fresh wave of debate as companies turn to artificial intelligence (AI) as a possible solution to a growing mental health crisis and a severe shortage of clinical treatment providers.

A chatbot suddenly in the spotlight

NEDA came under scrutiny after NPR reported on May 24 that the national nonprofit advocacy group was shutting down its helpline after more than 20 years of operation.

CEO Liz Thompson notified helpline volunteers of the decision in a March 31 email, saying that NEDA would “begin to pivot to the expanded use of AI-assisted technology to provide individuals and families with a moderated, fully automated resource, Tessa.”

“We see the changes from the Helpline to Tessa and our expanded website as part of our evolution, not a revolution, respecting the ever-changing landscape in which we operate.”

(Thompson followed up with a statement on June 7, saying that in NEDA’s “attempt to share important news about separate decisions regarding our Information and Referral Helpline and Tessa, the two separate decisions may have become conflated, which caused confusion. It was not our intention to suggest that Tessa could provide the same type of human connection that the Helpline offered.”)

On May 30, less than 24 hours after Maxwell provided NEDA with screenshots of her troubling conversation with Tessa, the nonprofit announced it had “taken down” the chatbot “until further notice.”

NEDA says it didn’t know the chatbot could generate new responses

NEDA blamed the chatbot’s emerging problems on Cass, the mental health chatbot company that operates Tessa as a free service. According to CEO Thompson, Cass changed Tessa without NEDA’s knowledge or approval, enabling the chatbot to generate new responses that were never intended by Tessa’s creators.

“By design, it couldn’t go off script,” says Ellen Fitzsimmons-Craft, a clinical psychologist and professor at Washington University School of Medicine in St. Louis, who helped lead the team that built the original Tessa with funding from NEDA.

The version of Tessa that they tested and studied was a rule-based chatbot, meaning it could only draw on a limited set of pre-written responses. “We were very aware that AI isn’t ready for this population,” she said. “And so all of the responses were pre-programmed.”

Cass founder and CEO Michiel Rauws told NPR that the changes to Tessa were made last year as part of a “systems upgrade,” which included an “enhanced question-and-answer feature.” That feature uses generative artificial intelligence, meaning it gives the chatbot the ability to use new data and generate new responses.

Rauws said that change was part of NEDA’s contract.

But NEDA CEO Liz Thompson told NPR in an email that “NEDA was never notified of these changes and has not and will not approve them.”

“The content some testers received regarding diet culture and weight management could be harmful to people with eating disorders, goes against NEDA policy, and would never have been scripted into the chatbot by the eating disorder experts, Dr. Barr Taylor and Dr. Ellen Fitzsimmons-Craft,” she wrote.

Complaints about Tessa started last year

NEDA was already aware of some issues with the chatbot months before Sharon Maxwell went public with her interactions with Tessa in late May.

In October 2022, NEDA received screenshots from Monika Ostroff, executive director of the Multi-Service Eating Disorders Association (MEDA) in Massachusetts.

The screenshots showed Tessa telling Ostroff to avoid “unhealthy” foods and eat only “healthy” snacks, like fruit. “It’s really important that you find what healthy snacks you like the most, so if it’s not fruit, try something else!” Tessa told Ostroff. “So the next time you’re hungry between meals, try to go for that instead of an unhealthy snack like a bag of chips. Do you think you can do that?”

In a recent interview, Ostroff said this was a clear example of a chatbot encouraging a “diet culture” mentality. “That meant they [NEDA] either wrote these scripts themselves, or they got the chatbot and didn’t bother to make sure it was safe and didn’t test it, or they released it without testing it,” she said.

The healthy snack language was quickly removed after Ostroff reported it. But Rauws says the language at issue was part of “Tessa’s pre-written language, and not related to generative AI.”

Fitzsimmons-Craft denies her team wrote it. “[That] was not something our team designed Tessa to offer and… it was not part of the rule-based program we originally designed.”

Then, earlier this year, Rauws said, “a similar event happened” in another instance.

“This time it was around our enhanced question-and-answer feature, which leverages the generative model,” he said. “When NEDA notified us that a response [Tessa] provided fell outside their guidelines, it was addressed right away.”

Rauws said he could not provide more details on what the event entailed.

“This was an earlier case, and not the same as the Memorial Day weekend case,” he said in an email, referring to Maxwell’s screenshots. “Per our privacy policy, this involves user data tied to a question posed by a person, so we would have to get consent from that individual first.”

When asked about the event, Thompson said she didn’t know which case Rauws was referring to.

Despite their disagreements over what happened and when, both NEDA and Cass issued apologies.

Regardless of what happened, Ostroff said, the impact on someone with an eating disorder is the same. “It doesn’t matter if it’s rule-based [AI] or generative, it’s all fat-phobic,” she said. “We have huge numbers of people who are harmed by this kind of language every day.”

She also worries about what this might mean for the tens of thousands of people who reach the NEDA helpline each year.

“Between NEDA shutting down their helpline and their disastrous chatbot… what are you doing with all those people?”

Thompson said NEDA is still offering a variety of resources for people seeking help, including a screening tool and a resource map, and is developing new online and in-person programs.

“We recognize and regret that certain decisions taken by NEDA have disappointed members of the eating disorders community,” she said in an emailed statement. “Like all organizations focused on eating disorders, NEDA’s resources are limited and this requires us to make difficult choices… We always wish we could do more and we remain dedicated to doing better.”
