Eating Disorder Helpline Disables Chatbot After Inappropriate Advice

A national organization recently decided to use artificial intelligence to staff its helpline for people with eating disorders. But the experiment was abruptly stopped when the A.I. chatbot started giving out information that may have been harmful.

The plan was to support the human staff of the National Eating Disorder Association's hotline by adding an automated chatbot powered by artificial intelligence.

Janice Bremis, executive director of San Jose's Eating Disorder Resource Center, said she was skeptical of the idea of replacing an empathetic human with a bot.

"When people talk to me on the line, they always say 'Gosh, I feel that you understand' and they'll just be relieved!" she said.

Earlier this week, the National Eating Disorder Association and researchers on the project said a bug led to the bot straying from the rules. The A.I. was able to generate its own responses, some of which were inappropriate.

"Some of these screenshots that have gone out there about the chatbot recommending dieting or restricting a certain number of calories. Of course, were never part of the chatbot that we developed or evaluated," said Dr. Ellen Fitzsimmons-Craft with Washington University School of Medicine.

The bot program has been pulled for now. But both sides agree that, with funding hard to come by, programs to help those with eating disorders need support.

"I do think that we've shown that we can develop this tool with guardrails to have reallyhelpful conversations with people," Fitzsimmons-Craft said.

Bremis agrees that, in time, technology can be part of the solution.

"We can't judge because they're trying something new and it's a good thing to experiment. That's what science and health is all about," she said.
