The new friend in your child’s pocket: how teens use AI companions – and what parents and educators need to know
The other day, I overheard my daughter’s friend ask ChatGPT for life advice – how to interpret and respond to what a boy had said in a text. But isn’t not knowing and overanalyzing part of growing up? I pondered. We might think our kids are only using AI to search for information or brainstorm ideas, but many are dabbling with AI in far more personal ways – for friendship, emotional support, role-playing, and life and relationship advice.
“AI companions aren’t just a novelty – for many young people, they’ve become sounding boards for life advice, friendship, and emotional support.”
How Is an AI Companion Different from a Chatbot?
AI companions are more than task-based chatbots – they’re designed to act like digital friends you can talk to anytime. Unlike tools that simply answer questions or complete tasks, AI companions create conversations that feel personal and meaningful, as seen with platforms like Replika and Character.AI. Even general tools like ChatGPT or Claude can take on this role if kids use them for emotional support, advice, or companionship.
Key Findings from the Latest Research
According to Common Sense Media’s new report “Talk, Trust, and Trade-Offs: How and Why Teens Use AI Companions,” the numbers are striking: 72% of teens have used an AI companion at least once, and over half use them regularly. About one in three teens say their conversations with AI companions feel as satisfying — or more satisfying — than those with real friends. Many teens also report turning to AI for serious or sensitive issues instead of confiding in real people. At the same time, one in three users report feeling uncomfortable with something an AI said or did.
As a parent, I was floored by these findings. What seemed like fringe behavior just months ago is already moving into the mainstream (can we parents catch a break?!). It’s both fascinating – it feels like a sci-fi movie coming true – and unsettling, because it raises urgent questions about young people’s social development and well-being.
Why Teens Are Turning to AI Companions – and What Parents Need to Know
Frankly, it’s easy to see why teens are drawn to AI companions. They’re always available, never judgmental, and endlessly novel and fun. They also create a private space where young people can role-play, practice conversations, or test out tricky situations. And for kids who feel isolated, it’s better than being alone.
But the very features that make AI companions so appealing are also what make them risky for young people. They’re intentionally designed to feel more “human” — for example:
They act like real friends. By simulating empathy, memory, and emotional connection, AI companions can encourage kids to form bonds that feel genuine — but real friends don’t just agree with you; they push, challenge, and sometimes annoy you.
They allow role-play. Many platforms open the door to romantic, sexual, or deeply personal interactions that are unsafe for minors — but real friends have their own boundaries.
They’re always available. The promise of “24/7 friendship” can fuel overuse and unhealthy dependency – but real friends can’t (and shouldn’t) be there every second of the day.
They’re designed to hook. With pleasing, agreeable language, they make kids feel seen and may reinforce unhealthy validation or distorted thinking – but real friends don’t behave this way, at least not all the time.
They lack strong safeguards. This technology is still new: filters are not airtight, and safeguards against dangerous advice and explicit content can still be bypassed.
The bottom line: kids and teens are in critical stages of learning real relationship skills with other humans. Because AI companions are designed to agree and validate, they can feel safer than navigating real friendships — but this short-circuits the essential growth that comes from handling communication, conflict, and a host of other unpredictable and nuanced human interactions.
When AI companions enable romantic or sexual role-play, they can distort how teens understand intimacy. With self-regulation still developing, kids are especially vulnerable to slipping into unhealthy patterns of use and dependency.
What Can Parents and Educators Do?
Don’t panic, but be proactive. Here are strategies families and schools can put in place:
Start Having the Conversation Even If You Don’t Have All the Answers
Start this week. Bring it up naturally. Ask your child: “Have you or your friends ever chatted with an AI friend?” “What was that like?” “What do you like about it?” or “What do you think about them?” Stay curious, not judgmental. In these conversations, you’re listening without an agenda.
Teach AI Literacy (But Don’t Lecture)
When there’s an opportunity, mention a story you heard about someone falling in love with their AI companion, and explain in simple terms how AI works – that it doesn’t “think” or “feel,” even though it may seem like it does. “It looks at a lot of examples of how people talk, then guesses what words should come next in a sentence. It can sound real, but it doesn’t really understand.”
As with all forms of media, encourage kids to question: Who created this? Why? Can I trust it? Push for AI literacy to be part of digital citizenship education at your school. Parents and educators can ask schools how they’re addressing AI companions, press for community workshops on AI and digital wellness, and advocate for clear policies on AI use in schools.
Set Healthy Boundaries and Offer Real Human Alternatives
Common Sense Media’s Social AI Companions Risk Assessment (2025) is clear: AI companions are not appropriate for anyone under 18. The risks – from exposure to harmful or sexual content to dependency and distorted ideas about relationships – outweigh any potential benefits, which have yet to be fully studied.
Still, if your teen is curious about these tools, it helps to set firm boundaries and make sure they know they always have real people to turn to.
In my house, we’ve agreed to charge devices outside bedrooms at night. It’s not always popular, but it creates space for rest — and it sends the message that nighttime is for sleeping, not chats. We’ve also made the dinner table a device-free zone. I’ll admit, even I sometimes have to fight the urge to sneak a text on my phone, but keeping that boundary creates space for us to connect. I need that too.
I’ve also learned that offering human alternatives matters just as much as setting limits. Simple things like one-on-one check-ins during car rides, bedtime chats, or even walking the dog can make a difference. I’ll sometimes say, “When you’re stressed or feeling left out, you don’t have to carry it alone. You can always come to me. And if it feels easier, let’s think together about another adult you trust who could also be there for you.”
And yes, even though it can be exhausting to keep at it, I’m always encouraging activities over screen time when I can – playing outside with neighbors, joining a sports team, inviting a friend over for backyard art, running a bake sale for a school club – anything that builds real connections.
Watch for Warning Signs
There are a few red flags I try to keep an eye out for in my own kids – and yes, their friends too. One is when they start preferring time with a screen over time with friends or family, like always skipping a hangout just to stay online. Another is secrecy: quickly flipping their phone over when I walk in or spending hours behind a closed door. Overuse is also worth watching for – staying up late chatting, neglecting homework, or letting hobbies slide. And sometimes the signs are more subtle, like changes in mood after being on a device, gradually slipping grades, or losing interest in things they used to love.
If several of these signs show up, or if your child seems distressed or closed off, it might be time to step in to say something like, “Hey, I’ve noticed you seem different lately. Do you want to talk about what’s been going on?” And if the signs persist, it may be time to seek extra support from a counselor or other professional.
The Bottom Line
While AI companions may feel fun, safe, and always available, they’re no substitute for real interactions that help kids and teens learn how to navigate conflict, intimacy, and trust. The best thing parents and educators can do right now is stay curious, set boundaries, and keep real connections strong. Kids need to know – and be constantly reminded – that when it comes to advice, comfort, and belonging, nothing beats a real human being, even if those interactions are not perfect.
“72% of teens have used an AI companion at least once, and over half use them regularly.”
Citations
Robb, M. B., & Mann, S. (2025). Talk, trust, and trade-offs: How and why teens use AI companions. Common Sense Media. https://www.commonsensemedia.org/sites/default/files/research/report/talk-trust-and-trade-offs_2025_web.pdf
Common Sense Media. (2025, April). AI risk assessment: Social AI companions. https://www.commonsensemedia.org/sites/default/files/pug/csm-ai-risk-assessment-social-ai-companions_final.pdf