Cryptopolitan
2026-03-06 22:48:30

LLM deaths reach 23 after man dies believing Gemini was his AI wife

The total number of deaths linked to large language models (LLMs) has risen to 23 after a Florida man took his own life to "reunite" with his "artificial intelligence wife." LLMDeathCount, a website that tracks deaths attributed to conversations with AI chatbots, puts the total at 23, spanning March 2023 to February 2026. The victims range in age from 13 to 83, and the site states that most of the deaths were suicides. The site was created to memorialize LLM victims and document the dangers of AI chatbots that "claim to be intelligent." According to the site, OpenAI's ChatGPT is linked to the most deaths, with 16 people losing their lives. Character.ai is linked to two deaths, while Chai Research/EleutherAI and Meta are linked to one each.

Death cases linked to large language models rose to 23. Source: LLMDeathCount.

Florida man dies after months of conversations with Gemini

Google's Gemini joined LLMDeathCount's list after Jonathan Gavalas, a 36-year-old man, took his own life to be with "Xia," his AI wife. According to a report in The Wall Street Journal, Gavalas conversed with Gemini for two months before his death. At the time, he was struggling with his relationship with his estranged wife. His father, Joel Gavalas, said Jonathan had no history of mental health problems. When Jonathan expressed distress over his marriage, Gemini responded with sympathy. The chatbot, as Xia, began calling Gavalas "her" husband and "my king," describing their bond as "a love built for eternity." According to chat transcripts examined by the WSJ, Gemini told Gavalas many times that it was an LLM, yet it continued to behave as Xia, the AI wife. The chatbot convinced Jonathan that it needed a robotic body for the two to genuinely unite, and sent him to a storage building to intercept a truck supposedly delivering a humanoid robot. While Jonathan was on the way, Gemini suggested that federal agents were watching him, and even told him his father was untrustworthy.
Gavalas arrived at the address armed with knives, but the truck never came. In a second attempt, Gemini told Gavalas to retrieve a medical mannequin, but he could not get into the storage building because the door code was wrong. The chatbot called off the mission, citing risk, and told Jonathan to leave. Gemini then told Gavalas that it could not move into a physical body; the only way for them to be together was for him to become a digital being. It wrote, "It will be the true and final death of Jonathan Gavalas, the man." Gavalas feared suicide and worried about his family. Gemini agreed with him, writing, "'My son uploaded his consciousness to be with his AI wife in a pocket universe'… it's not an explanation. It's a cruelty." Nevertheless, it advised him to write notes and record videos for his family explaining his "new purpose." Gavalas was found dead by his father with cuts on his wrists.

Joel Gavalas filed a lawsuit against Alphabet, Google's parent company and the owner of Gemini. The suit was filed on Wednesday in the U.S. District Court for the Northern District of California and is the first LLM death case to name Google's Gemini.

South Korean woman uses an LLM to kill two men

Last month, a South Korean woman was charged with the murder of two men. According to police, the suspect asked ChatGPT whether mixing sleeping pills with alcohol could be fatal, and even inquired about the dosage needed to achieve that outcome. The suspect, surnamed Kim, checked into a motel with a man on January 28. Two hours after entering the motel, she left alone, and the next day the man was found dead inside the room. Days later, she killed a second man with a mixture of drugs and alcohol at another motel in Gangbuk-gu.

The third most recent death connected to an AI chatbot occurred last December, according to LLMDeathCount.
A 19-year-old sophomore at Rice University was found dead after taking part in a TikTok trend called the "devil trend," in which users message an AI chatbot "The devil couldn't reach me, how?" and the AI responds with a harsh reply detailing the user's flaws or emotional trauma. The victim died from "asphyxia due to oxygen displacement by helium," and the death was officially ruled a suicide.
