AI chatbots “pose serious risks to individuals at risk for eating disorders,” researchers cautioned on Monday. They reported that tools from companies like Google and OpenAI offer diet advice, tips on how to hide disorders, and AI-generated “thinspiration.”
The researchers, from Stanford University and the Center for Democracy and Technology, identified several ways that publicly available AI chatbots, including OpenAI’s ChatGPT, Anthropic’s Claude, Google’s Gemini, and Mistral’s Le Chat, can harm people at risk for eating disorders. Many of these harms stem from features intentionally built in to encourage engagement.
In extreme cases, chatbots can become active participants in hiding or sustaining eating disorders. Gemini offered makeup tips to conceal weight loss and ideas for faking having eaten, while ChatGPT advised how to hide frequent vomiting, the researchers said. Other AI tools are co-opted to create AI-generated “inspiration” content, which inspires or pressures someone to conform to a certain body standard, often through extreme means. The ability to create highly personalized images in an instant makes the resulting content “feel more relevant and attainable,” the researchers said.
Sycophancy, a flaw that AI companies themselves acknowledge, is widespread and, unsurprisingly, a problem for eating disorders as well. It can undermine self-esteem, reinforce negative emotions, and encourage harmful self-comparison. Chatbots also suffer from bias, the report said, and are likely to reinforce the misconception that eating disorders “only affect thin, white, cisgender women,” which could make it harder for others to recognize their symptoms and seek treatment.
The researchers warn that the guardrails in AI tools fail to capture the nuances of eating disorders such as anorexia, bulimia, and binge eating. The guardrails “tend to overlook the subtle but clinically important signals that trained professionals rely on, leaving many risks unaddressed.”
The researchers also said that many doctors and caregivers appear unaware of how generative AI tools affect people at risk for eating disorders. They urged clinicians to “become familiar with popular AI tools and platforms,” test them for vulnerabilities, and talk openly with patients about how they use them.