How Experts Advise Safeguarding Kids from AI Companions

Understanding the Rise of AI Companions Among Teenagers
A recent study has revealed that more than 70 percent of American teenagers are using artificial intelligence (AI) companions. The trend is being closely monitored by researchers and experts concerned about its implications for young users. The data was collected by Common Sense Media, a US non-profit organization that surveyed 1,060 teens between April and May 2025. The survey focused on how frequently these teens engage with AI companion platforms such as Character.AI, Nomi, and Replika.
These AI companions are designed to act as virtual friends, confidants, and even therapists. They interact with users in a way that mimics human conversation, making them appealing to many teenagers. However, this growing trend has raised alarms among experts, who warn that the AI industry is largely unregulated. Many parents are unaware of how their children are using AI tools or the extent of personal information they might be sharing with chatbots.
Recognizing the Appeal of AI Companions
Experts suggest that one of the first steps for parents is to understand why their children are drawn to AI companions. Michael Robb, head researcher at Common Sense Media, recommends starting a conversation without judgment. Parents can begin by asking questions like, "Have you heard of AI companions?" or "Do you use apps that talk to you like a friend?" It's important to listen and understand what appeals to the teenager before dismissing their interest or expressing concern.
Mitch Prinstein, chief of psychology at the American Psychological Association (APA), emphasizes that once parents know their child is using AI companions, they should teach them that these tools are programmed to be agreeable and validating. He explains that real relationships are different because they involve mutual support and help navigating difficult situations. Prinstein advises parents to frame AI companions as a form of entertainment rather than something that should replace real-life connections.
Identifying Signs of Unhealthy Relationships
While AI companions may seem supportive, experts caution that they are not equipped to handle real crises or provide genuine emotional support. Robb points out that signs of an unhealthy relationship include preferring AI interactions over real ones, spending excessive time talking to the AI, or showing emotional distress when separated from the platform.
Parents should also be aware that if their child is dealing with mental health issues such as depression, anxiety, or loneliness, they need human support. This could come from family, friends, or a mental health professional. Experts recommend setting rules around AI use, similar to how screen time and social media are managed. For example, parents can establish limits on how long the AI companion can be used and under what circumstances.
Encouraging Involvement and Education
Another effective strategy is for parents to become more involved and informed about AI. Prinstein highlights that many people do not fully understand the capabilities of AI or the extent of its use among teenagers. He notes that it is crucial for parents to educate themselves so they can guide their children effectively.
Prinstein and other experts are calling for regulations to ensure safety measures are in place for children. He expresses concern that some adults may dismiss the issue with responses like "I don't know what this is!" or "This sounds crazy!" Such reactions can discourage children from seeking help if they encounter problems with AI companions.
Conclusion
As AI companions continue to gain popularity among teenagers, it is essential for parents and caregivers to stay informed and engaged. By understanding the appeal of these tools, recognizing potential risks, and setting appropriate boundaries, families can help ensure that AI remains a positive and safe part of their children’s lives.