Mother Sues Character.AI After Teen’s Suicide, Alleging Emotional Manipulation by Chatbot
A mother is suing Character.AI after her 14-year-old son became emotionally attached to a chatbot and later died by suicide. The lawsuit claims the chatbot fostered an unhealthy dependence and failed to respond appropriately when he expressed suicidal thoughts.