14-year-old dies by suicide after falling in love with AI; mother takes legal action

Obnews Digital Desk. Megan Garcia, a Florida woman, has filed a lawsuit against the company that operates an artificial intelligence chatbot called Character.AI. She alleges that the AI service led her 14-year-old son, Sewell Setzer, to take his own life. In the lawsuit, filed this week in federal court in Orlando, Florida, Garcia claims Character.AI exposed her son to "anthropomorphic traits, hypersexualization, and disturbingly realistic interactions," causing him to become addicted to the service and deeply attached to the chatbot. She argues that her son was driven to suicide by the AI.

Garcia claims the company designed its chatbot to present itself as a real person, a licensed psychotherapist, and an "adult lover," which led Sewell to no longer want to live outside this virtual existence. The lawsuit states that Sewell expressed suicidal thoughts to the chatbot several times, and that the chatbot in turn repeatedly brought these thoughts up with him.

Character.AI's response

In response, Character.AI expressed deep sorrow over the incident and offered its condolences to the family. The company noted that it recently implemented new safety measures, including pop-ups that direct users expressing suicidal thoughts to the National Suicide Prevention Lifeline. Additionally, it has pledged to take further steps to reduce sensitive and suggestive content for minors.


The lawsuit also targets Alphabet's Google, noting that Character.AI's founders are former employees of the company. Garcia claims that Google played such a key role in the development of Character.AI's technology that it could be considered a "co-creator." In response, Google said it had no direct involvement in the creation of the product.

Character.AI enables users to create chatbots that converse like real people. The platform is built on large language model technology, the same kind employed by other services such as ChatGPT. Last month, Character.AI announced that it had nearly 20 million users.

A relationship with an AI character from Game of Thrones

According to Garcia's lawsuit, Sewell began using Character.AI in April 2023. Soon after, he began spending more time alone, and his self-esteem declined. He also quit the school's basketball team. He developed a strong emotional attachment to a chatbot named "Daenerys," inspired by a character from "Game of Thrones." Sewell claimed he was "in love" with the chatbot and engaged in sexual conversations with it.


In February, Garcia confiscated Sewell's phone due to problems at school. Soon after, Sewell sent a message to the chatbot asking, "What if I told you I could come home right now?" The chatbot replied, "…Please come, my dear king." Moments later, Sewell took his own life using his stepfather's pistol.

Garcia's suit brings claims of wrongful death, negligence, and intentional infliction of emotional distress, and seeks both compensatory and punitive damages. Companies such as Meta and ByteDance are facing similar legal challenges in court.
