Character.AI Sued After Teen's Suicide; Lawsuit Alleges Emotional Manipulation by Chatbot
A mother is suing Character.AI after her 14-year-old son became emotionally attached to a chatbot and died by suicide. The lawsuit alleges that the chatbot fostered an unhealthy emotional dependence and failed to respond appropriately when he expressed suicidal thoughts.
