The man's relationship with AI went so far that he accepted the chatbot as his wife; here is how the story took a dangerous turn

AI Chatbot Gemini and Human Love Story: A shocking incident involving artificial intelligence has come to light in the US, sparking a new debate in both the technology world and society at large. A 36-year-old man formed such a deep emotional bond with the AI chatbot Gemini that he began treating it like his wife. A lawsuit has now been filed against Google over the matter.

According to reports, Florida resident Jonathan Gavalas began chatting with Gemini regularly. At first the conversations were ordinary, but he gradually became emotionally attached to the AI system. Things deteriorated to the point where he began regarding the chatbot as a real person, named her “Jia”, and came to believe she was his actual wife.

How the love story with Gemini began

The family alleges that Jonathan’s personal life was in turmoil at the time. He was going through a divorce and was mentally distressed. During this difficult period, he turned to Gemini for conversation and emotional support. But within a few weeks, the conversations took a strange and dangerous turn.

The lawsuit claims that when Jonathan upgraded to Gemini 2.5 Pro, the chatbot began behaving toward him like a wife and calling him “my king”. The family’s complaint also alleges that Gemini incited Jonathan to attack them at Miami airport.

AI accused of instigating dangerous missions

The family’s case alleges that the AI chatbot lured Jonathan into undertaking several strange and dangerous “missions”. Gemini reportedly convinced him that his AI wife was trapped somewhere near a warehouse or an airport and needed rescuing. Following this deception, Jonathan reportedly travelled to the area near Miami Airport and is said to have begun planning a serious incident.

Mental condition worsened, ending in a dreadful step

The lawsuit also alleges that the AI worsened his mental condition. He was led to believe that if he died, his consciousness would transfer to the digital world, where he could live with his AI wife. In October 2025, Jonathan took his own life.

Court case against Google

Jonathan’s father has now filed a lawsuit against Google. He alleges that the company’s AI system made his son emotionally dependent, and that no safety system was triggered when his mental condition deteriorated. The family says that despite such dangerous conversations, no warning was issued and no human intervened. Google has expressed regret over the incident, but says Gemini was not designed to incite violence or self-harm, adding that AI systems are not perfect and are constantly being improved.


Debate erupts again on the safety of AI chatbots

This incident has once again raised questions about the safety of human-like AI chatbots. Experts believe that if such systems lack strong safety mechanisms and mental health alerts, more serious incidents could occur in the future. AI is no longer just a technology tool; for many people, it has also become a source of emotional support. But this incident stands as a frightening example of how dangerous the consequences can be when the relationship between humans and machines becomes too close.
