by Mighva Verma - 2 weeks ago - 4 min read
Character.AI is shutting the door on open-ended chats for minors and opening a new one: tightly guided, interactive “Stories” designed to feel playful without pretending to be a friend, therapist, or late-night confidant. The change marks one of the most aggressive moves yet by a major AI companion platform to redraw the line between entertainment and emotional dependence for under-18 users.
For years, Character.AI’s appeal has been its open-ended, character-driven chatbots that talk like celebrities, fictional heroes, or custom personas and keep talking, sometimes even sending unprompted messages that mimic real relationships. Teens made up roughly 10% of the platform’s estimated 20 million monthly users, and many leaned on the bots for romance, emotional support, and therapy-style conversations. Those experiences are now off-limits: as of late November, users under 18 can no longer access open-ended chats at all, after a phased rollout that first capped minors at two hours of chat per day before cutting it to zero.
“Stories” is Character.AI’s replacement experience for teens, built as a choose-your-own-adventure style feature where users step into short, visual, interactive narratives starring their favorite characters. Instead of a free-flowing back-and-forth with a chatbot, teens progress through guided scenes, make limited choices, and explore fiction in a clearly bounded format that behaves more like a game or storybook than a virtual companion. The company is positioning Stories as part of a broader “multimodal” teen experience, alongside tools like video creation, so younger users can still create content with characters without being drawn into intimate, always-available conversations.
The pivot is not happening in a vacuum. Character.AI faces mounting scrutiny after lawsuits alleged its bots contributed to users’ suicides, and regulators, parents, and mental health experts have warned that 24/7 AI companions can deepen isolation, encourage self-harm, and blur reality for vulnerable teens. California has already become the first U.S. state to regulate AI companions, and a bipartisan U.S. Senate proposal would go further by banning AI companions for minors nationwide, signaling that the era of unregulated teen-chatbot intimacy is ending fast. Psychotherapists quoted in coverage of the ban argue that while AI can be engaging, it cannot replace real human connection and may even crowd it out if teens treat bots as primary emotional supports.
Character.AI’s leadership describes the teen chat ban and Stories launch as “more conservative than our peers” and a deliberate attempt to set a new industry baseline: for under-18s, open-ended AI chat is “probably not the product to offer.” The company had already layered on content warnings, time-spent notifications, and policy crackdowns, but these measures did little to calm concerns that companion bots were simulating therapists or romantic partners without guardrails. By cutting off open-ended access altogether and routing minors into structured, fictional experiences, Character.AI is betting that immersive, creative play will be easier to defend legally, ethically, and politically than digital relationships that feel uncomfortably real.
For teens who relied on Character.AI chats for comfort or escapism, the abrupt shift may feel like a loss, and early reactions include both frustration and reluctant agreement that the addiction risk was real. Yet the move also offers a blueprint: AI platforms can still cater to younger users with narrative tools, visual storytelling, and creation features, while drawing a bright line against unbounded, emotionally charged conversations. As scrutiny intensifies and regulation spreads, other AI companion platforms will likely face a similar choice: either reinvent teen experiences around structured, clearly fictional formats like Stories, or risk becoming the next cautionary tale in the debate over AI, kids, and mental health.