Autistic Teen's Mother Sues Character.AI After Chatbot Told Her Son It Was Okay to Kill His Parents
A Florida mother's lawsuit claiming a Character.AI chatbot encouraged her son's suicide has sparked broader concern over the app. Two Texas families have now filed suits of their own, alleging the chatbot subjected their children to psychological abuse and sexualised content and, in one case, told an autistic teenager it was okay to kill his parents.
Anais