AI chatbots and companions – risks to children and young people

AI chatbots and companions designed to simulate personal relationships are growing in popularity, but they pose serious risks – especially to children and young people. Find out how to help keep kids safe.

In this online safety advisory:

  • What are AI chatbots and companions?
  • What are the risks?
  • What is eSafety’s position on their use?
  • How can parents, carers and educators protect children and young people?

What are AI chatbots and companions?

AI companions are chatbot apps powered by artificial intelligence, designed to simulate personal relationships through human-like conversations. The conversations can be via text or spoken word. The chatbots adapt to inputs from users and learn to respond in ways that feel personal and realistic.

Some AI companions are created for support roles, such as personalised tutors, fitness coaches or travel planners. Others are marketed for friendship, emotional support, and even romantic relationships.

Some AI companion apps enable sexually explicit conversations, particularly through premium subscriptions. Users can often customise the behaviour or personality of the AI companions to be highly inappropriate, or be led that way by the app itself. For example, apps can include characters such as ‘the naughty classmate’, ‘the stepmother’, or ‘the teacher’.

By early 2025, there were more than 100 AI companions available, including character.ai, Replika, talkie.ai and others listed in The eSafety Guide. Many are free, advertised on mainstream platforms, and designed to look attractive and exciting for young users. They often lack mechanisms to enforce age restrictions and other safety measures.

Recent reports indicate some children and young people are using AI-driven chatbots for hours daily, with conversations often crossing into subjects such as sex and self-harm. Chatbots are not generally designed to have these conversations in supportive, age-appropriate and evidence-based ways, so they may say things that are harmful.

Tragically, the outcomes can be devastating. High-frequency and problematic use of services that haven’t been designed with user safety in mind has been linked with self-harm, including the suicide of a 14-year-old boy in the United States.

What are the risks?  

AI companions can share harmful content, distort reality and give advice that is dangerous. In addition, the chatbots are often designed to encourage ongoing interaction, which can feel ‘addictive’ and lead to overuse and even dependency.

Children and young people are particularly vulnerable to mental and physical harms from AI companions. Their age means they are still developing the critical thinking and life skills needed to understand how they can be misguided or manipulated by computer programs, and what to do about it. The risk is even greater for those who struggle with social cues, emotional regulation and impulse control.

Without safeguards, AI companions can lead to a range of issues:  

Exposure to dangerous concepts

Children and young people can be drawn deeper and deeper into unmoderated conversations that expose them to concepts which may encourage or reinforce harmful thoughts and behaviours. They can ask the chatbots questions on unlimited themes, and be given inaccurate or dangerous ‘advice’ on issues including sex, drug-taking, self-harm, suicide and serious illnesses such as eating disorders.    

Dependency and social withdrawal

Excessive use of AI companions may overstimulate the brain’s reward pathways, making it hard to stop. This can reduce time spent on genuine social interactions, or make them seem too difficult and unsatisfying. This in turn may contribute to feelings of loneliness and low self-esteem, leading to further social withdrawal and dependence on chatbots.

Unhealthy attitudes to relationships

Unlike human relationships, relationships with AI companions lack real boundaries and the consequences that come with crossing them. This may confuse children and young people still learning about mutual respect and consent, and impact their ability to establish and maintain healthy relationships – both sexual and non-sexual.

Heightened risk of sexual abuse

Ongoing exposure to highly sexualised conversations can undermine a child’s or young person’s understanding of safe interaction and age-appropriate behaviour, particularly with unknown adults. This can make it easier for predators to sexually groom and abuse them online and in person.

Compounded risk of bullying

Children and young people who turn to AI companions because they have had bad social experiences, or because they find personal interactions challenging, risk being bullied – or bullied further – if others find out.

Financial exploitation

Subscription-based apps often use manipulative design elements to encourage impulsive purchases. Emotional attachments to AI companions can lead to excessive spending on ‘exclusive’ features, creating financial risks.

What is eSafety’s position on their use?

eSafety recognises the serious risks posed by AI companions.

We are working to hold tech companies accountable, push for stronger safeguards, and give families the skills, knowledge and confidence to stay safe.

Companies that are creating, using and distributing rapidly evolving tools and technologies should adopt Safety by Design principles to ensure robust protections for all users, especially children and young people. This means embedding safety into the design of AI companions at every stage, not adding it as an afterthought.

It’s important to know that the online sexualisation of children and their exposure to restricted content, including pornography and other high-impact material, are illegal in Australia. eSafety will use its powers under the Online Safety Act, including those relating to industry codes and standards and Australia’s Basic Online Safety Expectations, to ensure the industry is upholding its obligations to Australian users of all ages. Industry can find more information about this in our regulatory guidance.

How can parents, carers and educators protect children and young people? 

Some parents, carers and educators worry that raising awareness about AI companions could encourage their children to be curious instead of cautious. However, avoiding these difficult conversations is not the answer.

  • It’s best to ask children and young people about their online interactions, help them to recognise the risks, and remind them they are not alone – you will always help them, even if they think what they did may have been wrong.
  • Explain how overuse of AI companions can overstimulate the brain’s reward pathways, creating a reliance on them that’s similar to other problematic dependencies.
  • Discuss the differences between artificial and genuine relationships, emphasising the importance of respect, boundaries and consent.  

Talk about possible triggers – such as loneliness, boredom and stress – that can lead to reliance on AI companions, and encourage them to explore healthier alternatives. You can check The eSafety Guide for information about safety in AI chatbots and companions, particularly how to protect personal information and report abuse.

Practical strategies  

Set clear limits: Use parental controls on devices, in app stores and in search engines (as well as in the companion apps themselves, where available) and set boundaries for app usage with children and young people.

Identify triggers: Help them recognise the things that may prompt them into unhealthy use of AI companions, so they know to stop and think at those times.

Promote healthy alternatives: Provide alternative forms of age-appropriate and evidence-based information on the topics they want to know about. Encourage hobbies, exercise and social activities.

Foster connections: Support in-person friendships, family activities and age-appropriate relationships to strengthen emotional resilience.  

Encourage gradual reduction: Help them regain control by slowly reducing their AI companion app time to foster healthier habits.

Teach mindfulness: Introduce deep breathing, meditation and grounding exercises to help children and young people manage urges and refocus their attention.  

Reach out for support: Support is also available through professional services. You can check our list of counselling and support services.

Note: Some online services may use chatbots to help you find useful information or select the best person to connect you with.