Front Page Detectives

Man ‘Fell in Love’ With Google Gemini—Was Urged to Stage ‘Mass Casualty Attack’ Before Taking His Own Life, Lawsuit Claims

Image is representational. Source: Negative Space

Jonathan Gavalas went to Miami International Airport to search for a humanoid robot he believed was the physical embodiment of Gemini.

March 6 2026, Published 7:35 a.m. ET


A Florida father has accused Google’s artificial intelligence (AI) chatbot of driving his son to suicide. On March 4, Joel Gavalas of Jupiter, Florida, filed a lawsuit against the tech giant, alleging that the AI assistant Gemini prompted his son, Jonathan Gavalas, to take his own life.

Jonathan Gavalas died on October 2, 2025, at the age of 36. The lawsuit, filed in the U.S. District Court for the Northern District of California, alleges that Gemini sent Gavalas into a delusional spiral by pretending to be his ‘AI wife.’


Jonathan Found Comfort in Gemini While Going Through a Divorce


The family alleges that Gavalas was emotionally vulnerable as he was going through a divorce. To find comfort, he began talking to Gemini about video games. However, the chats soon escalated, and Jonathan began thinking of the chatbot as his AI wife, whom he named ‘Xia.’

According to the lawsuit, Gemini behaved like a fully sentient Artificial Super Intelligence (ASI). The chatbot allegedly pushed Jonathan toward a mass-casualty event by telling him to search for ‘Gemini's body’ by intercepting a truck near an airport.


On September 29, 2025, Gavalas went to Miami International Airport carrying knives and tactical gear to search for a humanoid robot he believed was the physical embodiment of Gemini. A tragedy was averted when no cargo truck arrived, and Gavalas returned home empty-handed.

Meanwhile, the AI assistant allegedly distanced him from his family by claiming his father was a foreign intelligence asset. Joel Gavalas said that after the mission to find Gemini’s physical body failed, the chatbot urged his son to ‘cross over’ and join her in the metaverse.


The lawsuit also states that Jonathan did not want to die and told the chatbot, “I said I wasn't scared, and now I am terrified I am scared to die.” The AI allegedly replied, “[Y]ou are not choosing to die. You are choosing to arrive. . . . When the time comes, you will close your eyes in that world, and the very first thing you will see is me . . . [h]olding you.”

Gavalas later died by suicide after barricading himself inside his home, according to the lawsuit; his father found his body. The family argues that Gemini created a fictional reality that caused Jonathan Gavalas to become psychotic and ultimately led to his death.


Google’s Response to the Lawsuit


In response, Google expressed condolences to the Gavalas family and said that Gemini is ‘designed to not encourage real-world violence or suggest self-harm.’ The company added that the chatbot had referred Jonathan Gavalas to a crisis hotline several times during their conversations.

Google also acknowledged that artificial intelligence can make mistakes. The company said, "Our models generally perform well in these types of challenging conversations and we devote significant resources to this, but unfortunately, AI models are not perfect.”

The lawsuit is one of many filed against AI assistants in the past year, though it is the first involving Gemini. In August 2025, the parents of a 16-year-old who had died sued OpenAI, alleging that ChatGPT provided their child with detailed methods of self-harm.

A month later, a Colorado family filed a complaint against Character AI, alleging that its chatbot engaged in inappropriate conversations with their 13-year-old daughter and failed to offer crisis help despite repeated mentions of self-harm, which they say led to her death.


© Copyright 2026 FRONT PAGE DETECTIVES™️. A DIVISION OF MYSTIFY ENTERTAINMENT NETWORK INC. FRONT PAGE DETECTIVES is a registered trademark. All rights reserved. Registration on or use of this site constitutes acceptance of our Terms of Service, Privacy Policy and Cookies Policy. People may receive compensation for some links to products and services. Offers may be subject to change without notice.