Mother of Survivor Maya Gebala Files Suit Against OpenAI
The mother of a survivor of the Tumbler Ridge mass shooting has filed a lawsuit against OpenAI, the company behind the chatbot ChatGPT. She claims the company failed to alert police after warning signs appeared in the attacker's conversations with the AI system.
The lawsuit was filed on March 9 at the Supreme Court of British Columbia by Cia Edmonds on behalf of her daughters: Maya Gebala, 12, who survived the shooting but was left with life-changing injuries, and her younger sister, Dahlia.
The attacker, 17-year-old Jesse Van Rootselaar, created a ChatGPT account during the summer of 2025 and, in a series of conversations, described various scenarios involving gun violence. According to the lawsuit, 12 different monitoring staff expressed serious concerns after analyzing her messages, which indicated a possible and immediate threat to others. The staff recommended that Canadian police be notified.
The lawsuit states that this recommendation was sent to OpenAI's leadership. However, the claim alleges that leaders rejected the request to inform law enforcement and instead shut down the user's account.
OpenAI Facing Lawsuit Over Fatal Tumbler Ridge School Shooting
According to the lawsuit, that action did not stop the attacker. Van Rootselaar later opened another ChatGPT account. The claim says she continued discussing violent scenarios through that second account. The filing also alleges that she received mental-health advice and informal counseling through the chatbot during those conversations.
The tragedy occurred on Feb. 10, 2026. The lawsuit states that Van Rootselaar first killed her mother and her half-brother at their home in Tumbler Ridge, British Columbia. After the killings, she walked to Tumbler Ridge Secondary School.
Once at the school, she opened fire. Five students and one teacher were killed during the attack. After the shooting, the attacker died by suicide.
Maya Gebala was one of the students seriously injured during the shooting. According to the lawsuit, she was trying to lock the library door to protect others when she was shot. She suffered gunshot wounds to the head and remains in hospital.
Doctors say Maya has a severe brain injury and paralysis on the right side of her body. Her recovery will likely take years, and her long-term condition remains uncertain.
The lawsuit says the family wants answers about how the attack developed and whether it could have been prevented. It seeks compensation for the injuries, trauma, and other losses caused by the shooting.
A report by The Wall Street Journal earlier described some of the allegations involving the attacker’s use of ChatGPT. The article said the legal claim raises questions about how AI companies respond when users discuss violent actions.
After the report, Sam Altman, the CEO of OpenAI, met with Canada's federal AI minister, Evan Solomon. The discussion centered on potential safety changes and the role of artificial intelligence systems in society.
The Tumbler Ridge Tragedy and the Push for AI Accountability
Altman was also scheduled to meet with British Columbia Premier David Eby. After the meeting, Eby said Altman promised to apologize to the victims of the Tumbler Ridge tragedy. As of March 9, that apology had yet to be made.
The lawyers representing Edmonds and her daughters said the purpose of the lawsuit is to uncover the full story of the incident.
A planned inquest into the shooting will examine the shooter's medical records, school history, and devices. Investigators will try to gain a better understanding of the circumstances that led to the violence and whether any warning signs were missed.
OpenAI has 35 days to respond to the lawsuit in court. The case may raise new legal questions for AI companies about how to respond when users discuss threats of violence.