Parents say ChatGPT got their son killed over bad advice about party drugs


The family of a 19-year-old college student has sued OpenAI over allegations that his conversations with ChatGPT led to an accidental overdose. In the lawsuit, filed Tuesday, Sam Nelson's parents allege that ChatGPT "encouraged" the teen to "consume a range of substances that any licensed medical professional would have considered lethal," leading to his death.

Although ChatGPT initially "shut down" conversations about drug and alcohol use, the launch of GPT-4o in April 2024 changed the chatbot's behavior, according to the lawsuit. After the update, ChatGPT "began to engage and advise Sam on the safe use of medications, even providing specific dosage information on how much of the substance Sam should take," the lawsuit claims. Nelson's parents allege that ChatGPT gave their son advice on how to "safely combine" various substances in the months before his death, including prescription pills, alcohol, over-the-counter medications, and other drugs.

In one case, ChatGPT allegedly provided Nelson with recommendations on how to "optimize" his trip in order to "rest, introspect, and have fun" while drinking cough syrup. It also suggested creating a psychedelic playlist to "tune" his trip to achieve "maximum dissociation from the body," the lawsuit claims. ChatGPT later allegedly affirmed Nelson's plans to increase his dose of cough syrup the next time he took it. "You learn from experience, limit risk, and improve your approach," ChatGPT said.

On May 31, 2025, the day of Nelson's death, his parents claim that ChatGPT "actively coached" their son to combine kratom – which can boost energy or act as a sedative depending on the dose – with the anti-anxiety drug Xanax. ChatGPT "specifically suggested that taking a 0.25 to 0.5 mg dose of Xanax would be one of its best steps at this time to alleviate nausea caused by kratom," the lawsuit alleges. Nelson died after consuming a combination of alcohol, Xanax, and kratom. SFGate first covered Nelson's story in January.

"These interactions occurred on a previous version of ChatGPT that is no longer available. ChatGPT is not a substitute for medical or mental health care, and we have continued to enhance how it responds in sensitive and acute situations with input from mental health experts," OpenAI spokesperson Drew Pusateri said in an email statement to The Verge. "The safeguards in today's ChatGPT are designed to identify distress, safely handle malicious requests, and direct users to real-world help. This is ongoing work, and we continue to improve it in close consultation with clinicians."

Nelson's parents are suing OpenAI for wrongful death and "unauthorized medical practice." They are seeking damages, as well as a temporary shutdown of OpenAI's recently launched ChatGPT Health feature, which allows users to link their medical records to the chatbot.
