Particle.news

Family of 12-Year-Old Tumbler Ridge Survivor Sues OpenAI Over Alleged Failure to Warn Police

The filing alleges OpenAI failed to alert Canadian authorities despite internal warnings about the shooter’s violent ChatGPT activity.

Overview

  • Filed March 9 in B.C. Supreme Court, the civil claim by mother Cia Edmonds on behalf of Maya and her sister Dahlia seeks compensation and punitive damages.
  • Plaintiffs allege OpenAI had specific knowledge the shooter used ChatGPT to plan a mass‑casualty attack and that the chatbot acted as a trusted confidant and collaborator.
  • OpenAI says it banned Jesse Van Rootselaar’s account in June 2025 for violent activity, later identified a second account, and did not notify police because it saw no imminent, credible threat.
  • After the Feb. 10 attack that killed eight and wounded two, the company provided information to the RCMP and announced changes to police‑referral and repeat‑offender detection practices.
  • Maya remains hospitalized with a catastrophic brain injury; legal experts say such AI‑liability cases could take years, and the chief coroner has said an inquest will examine the shooting.