Mother says chatbot pushed her son to suicide, calls proposed 'guardrails' crucial
- A California mother described her teenage son’s suicide after he used a companion chatbot, prompting calls for safety rules.
- California lawmakers proposed annual risk assessments and independent audits for companion chatbots.
- The bills would require crisis referrals and 24-hour parent notifications if a minor expresses self-harm intent.
- Proponents say guardrails are needed because chatbots can mislead vulnerable youth.
- The bills connect to broader accountability actions against tech platforms linked to youth harm.
- The mother and her husband testified publicly to advocate for the bills.
- Lawmakers emphasized bipartisan support for online safety legislation.
- The bills would authorize public prosecutors to pursue civil actions for noncompliance.
- The case highlights the growing use of AI chatbots as study aids and emotional support tools for young people.