Lawsuit against OpenAI details ChatGPT's alleged role in FSU shooting: "They planned this shooting together"
The family of one of the victims in last year's deadly mass shooting at Florida State University accused ChatGPT developer OpenAI of enabling the suspect leading up to the attack.
The suspect, 21-year-old Phoenix Ikner, has pleaded not guilty to murder and attempted murder charges in the 2025 shooting, which is expected to go to trial later this year. Florida's attorney general has also opened a criminal investigation into OpenAI over the shooting.
Two people, Tiru Chabba and Robert Morales, were killed and five others were seriously injured in the shooting on FSU's main campus in Tallahassee. Chabba's family filed the lawsuit against OpenAI and the suspect in federal court Sunday.
According to the lawsuit, ChatGPT allegedly helped the suspect plan the shooting over a period of months, including suggesting which weapons to use, where on campus he should go and when the most people would be at risk.
"Ikner had multiple lengthy conversations with ChatGPT about his interests in Hitler, Nazis, fascism, national socialism, Christian nationalism and worse. They talked about multiple mass shootings and they planned this shooting together," attorney Bakari Sellers, who's representing Chabba's widow, Vandana Joshi, said in a statement. "Not once did anyone flag that as concerning. No one called the police or a psychiatrist or even Ikner's family because, to do so, would violate OpenAI's business model."
OpenAI spokesperson Drew Pusateri told CBS News in a statement Monday that the company has been cooperating with authorities in the wake of the shooting.
"Last year's mass shooting at Florida State University was a tragedy, but ChatGPT is not responsible for this terrible crime," Pusateri said. He also said, "In this case, ChatGPT provided factual responses to questions with information that could be found broadly across public sources on the internet, and it did not encourage or promote illegal or harmful activity."
Pusateri noted that ChatGPT is used by millions of people for legitimate purposes.
"We work continuously to strengthen our safeguards to detect harmful intent, limit misuse, and respond appropriately when safety risks arise," Pusateri said.
The FSU shooting isn't the first deadly attack to involve ChatGPT.
The suspect in last month's killings of two University of South Florida graduate students allegedly asked the chatbot how to dispose of a body before the students disappeared.
In another case, several families whose loved ones were killed in a mass shooting in Canada sued OpenAI and CEO Sam Altman, alleging the company knew the shooter was planning an attack but didn't warn authorities.
Altman apologized to the community of Tumbler Ridge, British Columbia, for not alerting law enforcement about the gunman's account, which was banned months before the shooting after it was flagged for potentially using the chatbot for violent activities.