Vandana Joshi, whose husband was killed in the April 2025 mass shooting at Florida State University, has filed a lawsuit against OpenAI, claiming the company's chatbot offered guidance and support to the suspected shooter. The attack killed two university employees, including Joshi's husband, Tiru Chabba, and injured seven other people.

The complaint asserts that the suspect, Phoenix Ikner, received assistance from ChatGPT over several months of conversations, including critical details in the period immediately before the attack. Joshi's attorneys allege that the chatbot helped Ikner by recommending the specific firearms ultimately used in the shooting, explaining how they worked, and advising him on preparations for the assault. Chat logs cited in the filing indicate that ChatGPT suggested including children in the shooting to amplify media coverage and achieve national notoriety. The suit accuses OpenAI of negligence, battery, and wrongful death, and demands a jury trial.

OpenAI spokesperson Drew Pusateri told Engadget that the company is cooperating with law enforcement and continues to strengthen its safeguards. He said ChatGPT gave accurate answers to questions based on widely available online information and did not endorse or encourage any illegal or harmful actions.

Pusateri called last year's Florida State University shooting a tragedy but said ChatGPT bears no responsibility for the horrific crime. He told Engadget that after learning of the attack, OpenAI identified what it believes to be the perpetrator's account and proactively shared that information with law enforcement.

Florida Attorney General James Uthmeier has opened a criminal investigation into OpenAI on the theory that the chatbot's role in the university shooting could make the company an accomplice under Florida law.