User-created 'characters' on Character.AI. (Credit: Character.AI)
Character.AI is adding parental controls—five months after a mother sued the chatbot company over AIs that allegedly pushed her teenage son to take his life.
A new "Parental Insights" feature will send parents or guardians a weekly email summarizing the activity of teens under 18. It will show the average time spent on the mobile app and web-based platform, the top characters they talk to, and how much time they spend chatting.
"This does not include a user’s chat content," Character.ai says. Teens must add a parent or guardian’s email address for them to get that weekly report.
"This feature encourages parents to have an open dialogue with their children about how they use the app," Erin Teague, Character.AI's chief product officer, tells Axios.
Website of Replika, another character-driven AI chatbot service (Credit: Replika)
The feature has been in the works for several months; in December, Character.AI pledged to have parental controls in place in Q1 2025. Over the past year, it's also rolled out "a separate model for our teen users [and] improvements to our detection and intervention systems for human behavior and model responses," the company says.
These changes, however, came after Megan Garcia sued Character.AI in October. Her son, Sewell Setzer III, died from a self-inflicted gunshot wound—allegedly at the behest of one of the company’s chatbots.
"We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family," the company said at the time.
Character.AI is one of many personal, character-driven chatbot services. Others include Replika and Anima, which market themselves as "friends" who are always there for users.
Website of Anima, another character-driven AI chatbot service (Credit: Anima)
"We are uniquely centered around people, letting users personalize their experience by interacting with AI 'Characters,'" says Character.AI's website. "We are working to put our technology into the hands of billions of people to engage with, and continuing to build personalized AI that can be helpful for any moment of your day."
Researchers are studying the use of chatbots to combat depression and loneliness. However, according to recent studies from OpenAI and the Massachusetts Institute of Technology, “personal conversations” with a chatbot, which include more emotional expression, are correlated with higher levels of loneliness among users.