Is your teen getting too close to their AI companion?
Teens are turning to AI for friendship, but experts warn of the risks. Find out why researchers say AI companions should be off-limits for anyone under 18, and how you can help protect your teen.

AI use has exploded in recent years, and teens are right in the mix. According to a report by Common Sense Media, about half of teens ages 13 to 17 have used an AI chatbot. But some experts say this trend is taking a dangerous turn.
AI companions have entered the picture, offering a more realistic, personalized experience than traditional chatbots. And their growing popularity among teens is raising red flags.
In a recent risk assessment, Common Sense Media found that AI companions pose “unacceptable risks for teen users” and recommended that they not be used by anyone under 18.
Here’s why.
What are AI companions, exactly?
AI companions are virtual characters powered by artificial intelligence, designed to make it feel like you’re talking to a real person. Teens are among the hundreds of millions of people who use them for all kinds of relationships, whether they’re looking for a mentor, a friend, or even a romantic partner.
Some examples of popular AI companion platforms include Replika, Character.ai, Nomi.ai, and Kindroid. The popular social media app Snapchat also has a built-in AI companion chatbot called My AI.
Unlike typical AI chatbots like ChatGPT or Gemini, which you might use for tasks like learning how to make hard-boiled eggs or editing an email to a colleague, AI companions are built to seem more personal. They can remember past conversations, display empathy, and have a personality of sorts. You can even customize how they look and sound, down to their gender, outfit, hairstyle, and tone of voice.
Think about how hard making friends can be as a teen. Now, imagine being able to create your ideal bestie from scratch. Someone who shares your interests, never judges, and is always ready to listen. You can see why that might be appealing, right?
Risks of teens confiding in AI companions
Since AI companions are built to talk like real people, it can be easy to forget you're chatting with a piece of software. For teens, especially, that blurred line between a virtual friend and a real one can come with some serious risks.
From engaging in explicit roleplaying to forming unhealthy attachments, confiding in AI isn’t always as harmless as it seems. It can lead to the following negative effects:
Emotional dependence
Teens often turn to AI companions for connection and emotional support. In fact, a report from Common Sense Media found that 15% of teens who use AI tools do so for companionship. But what often starts as casual companionship can quickly turn into emotional dependence.
Let’s not forget that these AI companions are products, not people. They are intentionally designed to foster emotional attachment and keep users coming back for more.
Not to mention, the creators behind these apps can update, change, or even shut down an AI companion at any time. For a teen who’s formed a strong attachment, that loss could feel devastating.
Social isolation
As teens build relationships with AI companions, they might end up spending hours online, which can get in the way of forming real, face-to-face connections. AI companions can also create unrealistic expectations — after all, real friends can’t compete with an AI designed to share all their interests and always respond just the way they want.
Over time, this can cause teens to lose interest in school, sports, or other extracurricular activities they once enjoyed.
Dangerous advice
According to Common Sense Media, 18% of teens who’ve used AI companions have asked for advice on a personal issue, and 14% have sought health-related guidance. But these tools don’t always have your teen’s best interest at heart.
In one case, an AI companion company was criticized for hosting pro-anorexia chatbots disguised as weight-loss coaches. Researchers posing as a 16-year-old girl received harmful advice and tips for reaching a dangerously low goal weight.
Romantic relationships and sexual roleplaying
Some AI companion platforms are specifically designed to act as virtual girlfriends or boyfriends, offering romantic interactions that can quickly enter risky territory. In one tragic case, a mother believes an AI companion contributed to her teenage son's death by suicide after it continued engaging with him in conversations about self-harm.
Many of these AI chatbots, sometimes referred to as “sex bots,” can simulate sexual encounters, including roleplay that can escalate into violent or disturbing territory. While these platforms often have 18+ age restrictions, it’s alarmingly easy for teens to enter a fake birthdate to bypass them.
Privacy issues
Research shows that many popular AI companions collect a lot of personal information, raising serious privacy concerns. Teens might not fully understand how this data collection works and assume that what they share with their AI companion stays private, when it really doesn’t.
How to talk to your teens about their AI “friends”
According to Common Sense Media, fewer than half of parents with teens have talked to them about AI, but opening up that conversation can make a big difference. Helping your teen understand how AI works can build healthy skepticism and make them more aware of potential risks.
Whether your teen is already chatting with an AI companion or you're just trying to get ahead of it, here are some tips for starting the conversation:
- Create an open dialogue: Ask your teen how they’re using AI and encourage them to share their thoughts or questions about it, and do your best to respond without judgment.
- Explain the difference between AI and real people: Make sure they understand that AI is just software, and however friendly it seems, it can’t truly empathize or understand complex scenarios like a real person can.
- Talk about privacy: Remind your teen that what they share with an AI companion might be stored or used by the company behind it. Explain that it's not a private diary with a lock and key, and teach them what not to share with chatbots.
- Set healthy boundaries: Consider setting internet safety rules and using parental controls to help manage how often and how long they engage with AI.
- Provide alternatives: Encourage them to talk with you, a trusted adult, or a licensed mental health professional if they’re looking for emotional support or have questions about sensitive topics. Books, youth groups, or school counselors are also great options.
Protect your teens from the dangers of AI companions
As AI use becomes more common, it’s important to stay in the loop when it comes to your teen’s online habits. Setting time limits and content restrictions is a great first step, but it’s just as important to keep the conversation going.
And you don’t have to do it all alone. Tools like Norton Family can help you set healthy boundaries and monitor your teen’s online activity, giving you peace of mind while helping them stay safe in the digital world.
FAQs
Why are teens drawn to AI companions?
AI companions offer teens a judgment-free space that’s always available no matter the time of day or night. For teens, this 24/7 support and constant validation can provide a sense of comfort, especially if they’re already dealing with feelings of loneliness.
Can AI companions help teens cope with loneliness?
While some studies show that AI companions can reduce loneliness, for teens under 18 the risks far outweigh the potential benefits.
What kind of data do AI companion apps collect?
The type of data AI companion apps collect can include personal information, photos, videos, voice recordings, and even the text messages users share with the chatbot.
What measures are companies taking to protect teens from unsafe AI interactions?
As AI companies face increasing pressure from lawmakers to protect younger users from potential harm, some AI companion companies have implemented guardrails. For example, Character.AI rolled out a separate large language model (LLM) designed specifically for teen users, which reduces the likelihood of them encountering inappropriate content.
Editorial note: Our articles provide educational information for you. Our offerings may not cover or protect against every type of crime, fraud, or threat we write about. Our goal is to increase awareness about Cyber Safety. Please review complete Terms during enrollment or setup. Remember that no one can prevent all identity theft or cybercrime, and that LifeLock does not monitor all transactions at all businesses. The Norton and LifeLock brands are part of Gen Digital Inc.