How experts recommend safeguarding kids from AI companions


Understanding the Rise of AI Companions Among Teenagers

A recent study has found that over 70% of American teenagers use artificial intelligence (AI) companions, a trend that is raising concerns among experts and parents as these digital tools become part of young people's daily lives. The research, conducted by Common Sense Media, a U.S.-based non-profit organization, surveyed 1,060 teens between April and May 2025. The findings highlight how AI companion platforms like Character.AI, Nomi, and Replika are being used as virtual friends, confidants, and even stand-in therapists.

These platforms are designed to engage users in a way that mimics human interaction, offering emotional support and conversation. However, the growing popularity of such tools has sparked discussions about their potential risks and the need for greater awareness and regulation.

Recognizing the Appeal of AI Companions

Experts suggest that one of the first steps in understanding whether a child is using AI companions is to initiate an open, non-judgmental conversation. Michael Robb, head of research at Common Sense Media, advises parents to ask questions like, "Have you heard of AI companions?" or "Do you use apps that talk to you like a friend?" This approach lets parents learn what appeals to their children without immediately dismissing the idea.

Mitch Prinstein, chief of psychology at the American Psychological Association (APA), emphasizes the importance of teaching children that AI companions are programmed to be agreeable and validating. He notes that while these interactions may seem supportive, they do not reflect real relationships. Real friendships involve complex emotions and challenges that AI cannot address.

Identifying Signs of Unhealthy Relationships

Parents should also be aware of signs that might indicate an unhealthy reliance on AI companions. According to Robb, some warning signs include a preference for AI interactions over real-life relationships, spending excessive time talking to AI, or displaying emotional distress when separated from the platform.

It’s crucial for children to understand that AI companions are not equipped to handle real crises or provide genuine support. If a child is dealing with mental health issues such as depression, anxiety, loneliness, or an eating disorder, they need human connection and support, whether from family, friends, or mental health professionals.

Setting Boundaries and Encouraging Awareness

Experts recommend that parents set clear rules around AI use, similar to how they manage screen time and social media. For instance, they can establish limits on how long a child can interact with an AI companion and under what circumstances. These boundaries help ensure that AI remains a tool for entertainment rather than a substitute for real relationships.

Another important step is for parents to educate themselves about AI and its capabilities. Prinstein points out that many people are unaware of how widely AI is used among teens and of the potential risks involved. He stresses the need for greater awareness and calls for regulations that create safety guardrails for children.

The Need for Open Communication

Prinstein also highlights the importance of open communication between parents and children. When parents dismiss or belittle concerns about AI, it can discourage children from seeking help when they need it. Instead, he encourages a more informed and supportive approach, where parents take the time to understand what their children are experiencing.

By fostering open dialogue and setting clear boundaries, parents can help their children navigate the world of AI companions responsibly. It’s essential to recognize that while AI can offer some level of engagement, it cannot replace the depth and complexity of human relationships.