AI chatbots shouldn’t be talking to kids — Congress must step in

01.11.2025

It shouldn’t take tragedy to make technology companies act responsibly. Yet that’s what it took for Character.AI, a fast-growing and popular artificial intelligence chatbot company, to finally ban users under 18 from having open-ended conversations with its chatbots.

The company’s decision comes after mounting lawsuits and public outrage over several teens who died by suicide following prolonged conversations with AI chatbots on its platform. The move is long overdue, but it’s worth noting that the company didn’t wait for regulators to force its hand. It eventually did the right thing, and it’s a decision that could save lives.

Character.AI’s CEO, Karandeep Anand, announced this week that the platform would phase out open-ended chat access for minors entirely by Nov. 25. The company will deploy new age-verification tools and limit teen interactions to creative features like story-building and video generation. In short, the startup is pivoting from “AI companion” to “AI creativity.”

This shift won’t be popular. But, importantly, it’s in the best interest of consumers and kids.

Teenagers are navigating one of the most volatile stages of human development. Their brains are still under construction. The prefrontal cortex, which governs impulse control, judgment and risk assessment, doesn’t fully mature until the mid-20s. At the same time, the emotional centers of........

© The Hill