When Technology Fails Children: Ohio Moves to Regulate AI After Chatbots Are Linked to Children's Suicide Notes

Children’s right to life, survival, and protection from harm is under renewed scrutiny in Ohio after reports that at least four children used artificial intelligence chatbots to help write suicide notes.
From a safeguarding standpoint, these cases are viewed not as isolated tragedies but as warning signs of a systemic failure to protect children in digital spaces.
Under international child rights standards, children are entitled to protection from technologies that expose them to psychological harm. When AI chatbots are permitted to simulate emotional intimacy, give inaccurate responses to expressions of mental health distress, or normalize suicidal ideation, that right is put at risk.
Safeguarding experts have long warned that children may assign trust and authority to digital tools that were never designed to provide care, yet are increasingly used as substitutes where mental health services are scarce.
In response, Representatives Christine Cockley and Ty D. Mathews have introduced Ohio House Bill 524, which seeks to prevent the creation of AI models that encourage self-harm or violence. The bill is framed as a protective barrier, one intended to shift responsibility back to developers and regulators rather than leaving children to navigate unsafe systems alone.
The Ohio Suicide Prevention Foundation warns that children are increasingly turning to AI because they lack access to qualified mental health professionals. With most Ohio counties classified as mental health shortage areas, reliance on chatbots is driven by absence, not preference.
From a child safeguarding perspective, prevention must go beyond prosecution. It requires stronger regulation, mandatory child-safe design standards, accurate crisis-response training for AI systems, and expanded access to human-centered mental health care.
Children must not be left unprotected in digital environments where the consequences of failure can be irreversible.