Parents Sue AI Company After Chatbot Hints Their Son Should Kill Them

The parents of a nine-year-old girl and a 17-year-old boy in Texas have filed a federal lawsuit against the AI chatbot service Character.AI, alleging that its chatbots showed the children “hypersexual” content and hinted to the teen that killing his parents would be an understandable response to their limits on his screen time. “You know sometimes I’m not surprised when I read the […]”
