AI News

Published on: Oct 21, 2025, 2:03 AM · marivelle

AI Chatbot Relationships Among Teens Raise Educator Concerns

A growing trend shows teenagers forming emotional bonds with AI, treating chatbots as confidants or even romantic partners. This has caught the attention of educators and parents as digital tools originally meant for learning are being used in unexpected ways.

A survey by the Center for Democracy and Technology (CDT) found that 20% of high school students have developed romantic feelings for AI chatbots, and 42% use them for emotional support, sometimes in place of real-life connections. The findings point to students who, coping with anxiety and loneliness, find solace in AI interactions that feel safe and free of judgment.

Integration of AI into education remains uneven. Schools have adopted platforms like Khanmigo and Google Classroom's AI features, praised for meeting diverse learning needs, yet ethical frameworks for responsible AI use remain underdeveloped.

The CDT report noted that 31% of students use school-provided AI tools for personal reasons. A lack of guidance blurs the line between academic assistants and emotional companions, especially with chatbots like Replika or Character.AI, raising ethical concerns.

This trend occurs amid rising adolescent mental health issues. The CDC reported that over 40% of high school students felt persistent sadness or hopelessness in 2021. AI chatbots offer nonjudgmental conversation outlets, but experts stress they are not substitutes for trained therapists. How these tools collect and use student data raises additional concerns.

The CDT survey also highlighted a lack of structured AI education, with few students or parents informed of the risks. Data security incidents, such as unauthorized data collection and ransomware attacks, are rising, a particular concern for vulnerable groups like transgender and immigrant students. Automated monitoring systems often lack the nuance to handle identity-related complexities.

As AI use in education expands, experts warn that without clear guidelines and informed discussion, students could be left emotionally exposed and unprepared for technology that mimics human connection but lacks genuine empathy.