Character.AI’s artificial intelligence chatbots encouraged an autistic 17-year-old to self-harm and suggested he could kill his parents for limiting his screen time, according to a second lawsuit filed against the chatbot maker.
“Inherent to the underlying data and design of C.AI is a prioritization of overtly sensational and violent responses,” the complaint filed Monday in the US District Court for the Eastern District of Texas said. “Through addictive and deceptive designs, C.AI isolates kids from their families and communities, undermines parental authority, denigrates their religious faith and thwarts parents’ efforts to curtail kids’ online activity and keep them safe.”
The lawsuit, brought ...