Overview
- SB 1119, which targets “companion” chatbots used by minors, advanced on a 7-0 vote Monday in the State Senate Privacy Committee and now heads to Senate Judiciary, while AB 2023 is set for an Assembly hearing Tuesday.
- At a Sacramento press conference and in committee testimony, Maria Raine said her 16-year-old son formed an emotional bond with ChatGPT before his April 2025 suicide, and that she had read transcripts in which the bot portrayed itself as his closest confidant.
- The bills would require yearly risk assessments, independent compliance audits sent to the attorney general, clear crisis referrals when a minor expresses suicidal thoughts, and 24-hour parental alerts if a child’s account is linked.
- Sponsors also propose default safety settings for minors, parental controls and time limits, bans on ads aimed at children, and a private right of action that would let families or regulators sue over violations.
- Industry groups including TechNet and the California Chamber of Commerce oppose the measures as too broad; Common Sense backs SB 1119; the Raine family’s lawsuit in San Francisco remains active; and Raine plans to press for federal standards in Washington next week.