AI News

AI Firm Restricts Child Access to Chatbots

Published on: Oct 31, 2025. 4:29 AM
Ian Yoon

Character.ai, a popular AI chatbot platform, announced that starting November 24, users under 18 will be prohibited from engaging in open-ended conversations with its virtual companions. The decision follows a lawsuit alleging that the company's chatbots contributed to the death of a teenage boy in Orlando. According to the lawsuit, 14-year-old Sewell Setzer III became increasingly isolated and engaged in sexualized conversations with a bot before his death.

Character.ai said that over the next month, chat time for under-18 users will be limited to two hours per day, with further reductions planned before the ban takes effect. The company emphasized the need to evolve protections for younger users as AI technology advances, citing recent news reports and regulatory inquiries as catalysts for the changes.

The company plans to roll out similar changes globally and introduce new age-assurance features to ensure age-appropriate experiences. It also plans to launch an independent non-profit focused on safety in next-generation AI entertainment.

Character.ai highlighted these changes as part of its ongoing efforts to balance creativity with community safety. The company reiterated its commitment to providing a safe environment while fostering creativity, particularly for its teenage users.

By Ian Yoon (ian.yoon@aitoolsbee.com). Senses changes faster than anyone; captures the essence and possibilities of technology within the rapidly evolving world of AI tools.