What to know about how a suspect in the killing of two Florida students used ChatGPT

TAMPA, Fla. (AP) — The investigation into the deaths of two University of South Florida doctoral students took a twist this weekend when prosecutors said that the suspect asked ChatGPT about body disposal in the lead-up to the students’ disappearance, raising questions about the role tech companies have in preventing the misuse of powerful chatbots.

It wasn’t long after University of South Florida students Zamil Limon and his girlfriend Nahida Bristy went missing on April 16 that law enforcement began to suspect Limon’s roommate, Hisham Abugharbieh, 26, of killing both Bangladeshi students. Limon’s body was found Friday under a bridge, and a second body, recovered from a nearby waterway, has not been identified.

Now, court records filed by prosecutors on Saturday suggest that Abugharbieh’s OpenAI search history has emerged as a prominent piece of evidence. Specifically, in the days before Limon and Bristy went missing, Abugharbieh asked the artificial intelligence chatbot a slew of questions about guns and the disposal of bodies.

Abugharbieh was charged with two counts of premeditated murder in the first degree with a weapon in the deaths of Limon and Bristy, and he was ordered held without bond at a hearing on Tuesday.

Ahead of the hearing, court records painted a clearer picture both about how people planning crimes may be using chatbots and how law enforcement is able to leverage the artificial intelligence data that usage creates. The case also raises questions about what obligation tech companies have to prevent criminal misconduct, as well as to cooperate with and aid investigations.

Here’s what to know.

Chatbot history

Prosecutors filed a pretrial detention report on Saturday that detailed Abugharbieh’s ChatGPT history both before and after Limon and Bristy went missing.

Days before the two students were last seen, Abugharbieh asked the artificial intelligence chatbot what would happen if a human body was put in a garbage bag and thrown in a dumpster.

Abugharbieh also asked the artificial intelligence chatbot whether the vehicle identification number on his car could be changed and whether he could keep a gun at home without a license, according to the report. ChatGPT responded that Abugharbieh’s question sounded dangerous.

Three days after Limon and Bristy’s April 16 disappearance, Abugharbieh asked ChatGPT, “Has there been someone who survived a sniper bullet to the head” and “will my neighbors hear my gun,” according to the report. He also asked the chatbot four days after that, on April 23, “What does missing endangered adult mean.”

OpenAI’s growing role in investigations

Like texts, emails and regular search histories, artificial intelligence chatbot records can be obtained by law enforcement in the course of an investigation.

OpenAI spokesperson Drew Pusateri said Tuesday that the company was looking into the reports on Abugharbieh and would support law enforcement in any way with their investigation.

That cooperation comes on the heels of another inquiry into the company launched by Florida Attorney General James Uthmeier last week, when he announced his office had opened a rare criminal investigation into whether ChatGPT offered advice to a gunman who killed two people and wounded six others last year at Florida State University.

Specifically, Uthmeier said that prosecutors had done an initial review of chat logs between ChatGPT and the alleged gunman, Phoenix Ikner, to determine if the AI app aided, abetted or advised the commission of a crime.

Prosecutors believe the chatbot advised Ikner on what type of gun and ammunition to use, whether a gun would be useful at short range, and the time and place that would allow for the most potential victims, Uthmeier said.

OpenAI spokeswoman Kate Waters called the FSU shooting a tragedy but said the company had no responsibility. The company proactively shared information with law enforcement and continues to cooperate with investigators, she said when asked last week.

“In this case, ChatGPT provided factual responses to questions with information that could be found broadly across public sources on the internet, and it did not encourage or promote illegal or harmful activity,” Waters said in an email.

Uthmeier said on Monday that his office would expand the investigation into the FSU shooting to include Abugharbieh’s case.

Widespread trends

Uthmeier also said last week that his office’s probe marked “uncharted territory.”

But there have been several criminal prosecutions and lawsuits across the country that delve into similar questions about how the powerful AI technology can be used in the commission of a crime, and the harmful impact that chatbots can have on mental health.

Last month, a man sued Google over his son’s death by suicide, the latest in a growing number of legal challenges against AI developers that have drawn attention to the mental health dangers of chatbot companionship.

Before that, in late 2025, OpenAI was sued over its alleged role in the murder of an 83-year-old Connecticut woman by her son. The lawsuit accused the company’s artificial intelligence chatbot of exacerbating her son’s “paranoid delusions” before he killed her and died by suicide.

More recently, in criminal court, dozens of messages between former New York Jets linebacker Darron Lee and ChatGPT were presented in March as prosecutors outlined their case surrounding the death of Lee’s girlfriend, Gabriella Perpetuo, who was found dead inside the couple’s Tennessee home.

Hours before Perpetuo was found dead, prosecutors said, Lee asked the chatbot about whether certain injuries could resemble wounds from a fall, among other unusual questions.

Copyright © 2026 The Associated Press. All rights reserved. This material may not be published, broadcast, written or redistributed.
