Chatbot User’s Death Spurs Legal Query: What’s an AI ‘Product’?

Nov. 26, 2024, 9:45 AM UTC

Chatbots promising companionship with features that mimic human interaction are sparking a potential new body of law over what constitutes a design defect when it comes to generative AI.

Last month Character.AI, the maker of a customizable chatbot app, was sued after a 14-year-old user died by suicide. The boy’s mother, Megan Garcia, alleged that Character.AI marketed predatory artificial intelligence chatbots that encouraged suicidal ideation and sexually suggestive conversations.

“When you’re targeting kids in their pubescent years with highly sexualized material, that is a pernicious design defect,” said Matthew Bergman of the Social Media Victims Law Center. ...
