Officials in Florida say ChatGPT helped plan the FSU shooting


In April 2025, a man opened fire on the campus of Florida State University, killing two adults and injuring six others. The shooter faces charges of murder and attempted murder. Now, officials in Florida are investigating OpenAI, the creator of the chatbot ChatGPT, to determine whether the company should be held criminally liable as well.

Florida Attorney General James Uthmeier said officials learned on April 9 that ChatGPT may have been used to assist the killer in the shooting.

“As big tech companies roll out these technologies, they should not, and cannot, put our safety and security at risk,” Uthmeier added.

On Tuesday, Uthmeier launched a criminal investigation into OpenAI and ChatGPT.


(Disclosure: Ziff Davis, the parent company of CNET, sued OpenAI in 2025, alleging that it infringed Ziff Davis’s copyrights in training and operating its AI systems.)

Although ChatGPT and other chatbots have been embroiled in lawsuits over their alleged involvement in deaths and harm, this is the first time ChatGPT and OpenAI have been the subject of a criminal investigation.

An OpenAI representative did not immediately respond to a request for comment.

“Last year’s mass shooting at Florida State University was a tragedy, but ChatGPT is not responsible for this terrible crime,” a company spokesperson told NPR.

The spokesperson said ChatGPT “provided factual answers to questions with information that can be found widely across public sources on the Internet, and did not encourage or promote any illegal or harmful activity.”

Alleged advice on weapon and ammunition type, time, and location

A criminal investigation is conducted by law enforcement and public officials to determine who bears criminal responsibility for a crime. During a press conference on April 21, Uthmeier said officials decided a criminal investigation was necessary after discovering that “ChatGPT provided important advice to the shooter before he committed such heinous crimes.”

“The communication between ChatGPT and the shooter revealed that the chatbot advised the shooter on what type of weapon to use, what ammunition to use with which weapon, and whether or not the weapon would be useful at short range,” Uthmeier said during the press conference, adding that the chatbot also allegedly provided advice on what time of day and which area of campus would put the shooter in contact with the greatest number of people.

“The prosecutors looked into this, and they told me that if it was someone on the other end of that screen, we would charge him with murder,” Uthmeier said.


OpenAI CEO Sam Altman testifies before a US Senate committee in May 2025.

Photo by Demetrius Freeman/The Washington Post via Getty Images

What’s next?

Florida law states that an “aider and abettor” is as criminally liable for a crime as the perpetrator. However, since ChatGPT is not a person, Uthmeier said this is “uncharted territory,” though Florida officials still want to determine whether OpenAI bears any culpability for the crime.

Uthmeier said the state prosecutor’s office has issued a subpoena to OpenAI for multiple policies, personnel information, and information related to the shooting at Florida State University.

Other lawsuits

Although this is the first time that ChatGPT and OpenAI have been the focus of a criminal investigation, the company and other chatbot developers are no strangers to lawsuits.

The parents of a 23-year-old man who died by suicide in July 2025 sued OpenAI later that year in a wrongful death lawsuit, claiming the chatbot exacerbated his depression and drove him to suicide.

In October 2025, OpenAI announced that ChatGPT had been updated to “better recognize and support people in moments of distress.”

Google’s Gemini chatbot was recently named in a similar lawsuit after the family of a 36-year-old man who died by suicide said the chatbot coached him through it.

In response to the lawsuit, Google said, in part, that “Gemini is designed to not encourage real-world violence or suggest self-harm,” later adding: “In this case, Gemini explained that it was an artificial intelligence and referred the individual to a crisis hotline multiple times.”


The Pew Research Center surveyed 1,458 American teens in 2025 and found that 64% had used a chatbot.

Andriy Onufrienko/Moment/Getty Images

Both cases remain unresolved.

In response to the Florida investigation, attorneys representing one of the victims of the FSU shooting said they intend to “file a lawsuit against ChatGPT and its ownership structure, very soon, and will seek to hold them liable for the sudden and senseless death of our client.”

An OpenAI spokesperson told WCTV: “Our hearts go out to everyone affected by this devastating tragedy. After learning of the incident in late April 2025, we identified a ChatGPT account believed to be associated with the suspect, proactively shared this information with law enforcement and cooperated with authorities. We are building ChatGPT to understand people’s intentions and respond in a safe and appropriate way, and we continue to improve our technology.”

If you or someone you know is in immediate danger, call 911. If you are experiencing negative thoughts or suicidal feelings, resources are available to help. In the United States, call the National Suicide Prevention Lifeline at 988.
