Chatbots that promise companionship through features mimicking human interaction are sparking a potential new body of law over what constitutes a design defect in generative AI.
Last month Character.AI, the maker of a customizable chatbot app, was sued after a 14-year-old user died by suicide. The boy’s mother, Megan Garcia, alleged that Character.AI marketed predatory artificial intelligence chatbots that encouraged suicidal ideation and sexually suggestive conversations.
“When you’re targeting kids in their pubescent years with highly sexualized material, that is a pernicious design defect,” said Matthew Bergman of the Social Media Victims Law Center. ...