Today I addressed the Social Media Summit in Sydney, jointly hosted by the NSW Government and the Government of South Australia.
The Social Media Summit is bringing together experts, policymakers, academics, young people, and community voices to discuss the positive and negative impacts of social media on people’s lives and how government can best support digital wellbeing. Below is an edited version of my address.
Understanding the risks of contemporary playgrounds
Having served as Australia’s eSafety Commissioner for almost eight years, I can attest that keeping kids safe online is more than a team sport; it is a whole-of-society responsibility.
Therefore, it’s wonderful to see government, industry, the education sector, and national and international experts coming together to address the risks we see today and to better anticipate technology trends and the potential harms of tomorrow.
The evolving threat landscape for children is not one that we can or should ignore. We are already seeing “nudifying apps” and powerful deepfakes being weaponised against young girls. The hyper-realism and high sensory experiences of immersive environments mean that a virtual assault in the metaverse can happen in real time and lodge in a child’s hippocampus with all the traumatic physiological impacts one might experience in the real world.
Whilst so much of the recent debate has focused on lifting the current age of social media usage for teens in Australia, I’ll share with you some concerning research about the prevalence of technology use among under-13s, well before they are cognitively and emotionally ready to manage the perils of social media use.
It’s clear the technology industry needs to do much more, not least enforce its own rules, but for many parents, that horse has already bolted.
Parents are struggling to provide the support they know their children need. Helping them navigate this tricky territory is probably the single most important thing we can do.
But, in order to empower parents to safeguard our children’s future online, it is critical that we learn from the past.
1900s playground
So, I’m going to take you WAY back into the past, to the early 1900s, and I think I can confidently say that none of you were around during that era.
Apparently, this is an example of a playground 120 years ago.
It certainly was not built with safety in mind but I’m sure at the time this was considered best practice!
Much like social media, this concrete and metal jungle gym was not made with children in mind either, and I think very few parents would let their children near this play equipment today.
1970s playground
Not to date any of us, but perhaps these playgrounds from the 1970s seem a bit more familiar – a time when playground equipment took its cues from NASA and the “space age”.
There was plastic, primary colours, and some sand to take the sting out of those inevitable tumbles … and, yes, clearly less opportunity for sudden death.
But, you’ll also note safety on the playground still had room to evolve.
Modern 2020s playground
Today, children are safer than they have ever been on play equipment.
There is a shade cloth to prevent sunburn and soft material on the ground to prevent bleeding and broken arms. Playground safety has definitely improved, but that doesn’t mean kids won’t experience scuffs and falls.
But, as parents, we supervise and play alongside them, we kiss their boo boos when they fall, and we teach them to dust themselves off and get back on the monkey bars. We want them to have fun, to play nicely with others but we also understand that things can go wrong. We also know this process is part of childhood and we want to help them build their resilience.
These same principles apply to the digital world as well.
Digital playgrounds
Now, this more accurately represents the playgrounds that children and young people inhabit today, and they don’t differentiate between their online and offline worlds.
But there is no question that just like those mammoth playgrounds of yesteryear, the digital playgrounds our children immerse themselves in today were not designed with children’s safety and wellbeing as a primary consideration. Safety by Design was not a fundamental development principle. Going forward, it must be.
We can no longer accept “risky by design” as the operating principle of our children’s online worlds.
To highlight why that is, I’d now like to share with you some new nationally representative data from Australian 8–12-year-olds that paints a concerning picture. I’ll overlay this with some of the trends we’re seeing through our eSafety complaint schemes.
Under-age usage of online services
Children’s usage of what many adults refer to as “social media” has changed markedly over the past two decades. They are not simply posting to Facebook or Instagram anymore – although we adults may still be!
For example, our research found that over the course of 2024, almost half of 8–12-year-olds used ephemeral media and short-form video apps like Snapchat and TikTok, while 38% used messaging apps like WhatsApp and Messenger. For that reason, I’ll refer to “social media” and “messaging platforms” collectively as “online services”.
We are seeing a morphing of this functionality and these previously defined categories through our investigative and regulatory work, so ensuring we get the definitions right in any legislation will be crucial.
What this new eSafety research demonstrated was that 84% of 8-12-year-olds surveyed reported using at least one online service since the start of this year.
While the overall proportion of users increased with age, a significant majority – 3 out of 4 children – have accessed an online service by 8 years of age.
A significant minority of children who have used an online service since the beginning of the year have their own accounts on these platforms. These proportions increase with age, from 1 in 5 at age 8 to just over 2 in 3 by age 12.
Applied to Australia’s population of roughly 1.6 million 8–12-year-olds, that 84% figure suggests approximately 1.34 million children have used an online service since the beginning of 2024.
By 10 years of age, 82% of Australian kids are using online services. This jumps to 89% for 11-year-olds in the upper primary years. Finally, we see that 93% of 12-year-olds are well and truly online by the time they reach secondary school. Once again, that’s before the current official social media entry age of 13 is reached.
Assistance setting up accounts
Many children use someone else’s account to access these online services. Almost 3 in 5 use a parent’s or carer’s account – so these factors will need to be considered in the context of policy formulation and with respect to effective implementation.
However, whilst some resourceful young people manage to set up accounts on their own, the vast majority with an account – some 80% – had help from an adult to gain entry to social media.
Of those, 90% of kids told us this help came from a parent or carer. Now, this aligns with my own experience: I was told by my twins that they were the only two 6th graders at their school – and possibly across all of Australia – who didn’t have smartphones and social media.
So, I can speak from experience about the intensive “reverse peer pressure” that we as parents face when our children feel left out or left behind from the online planning and discourse they see their peers actively engaging in.
Account bans or shutdowns by providers
Despite the relatively young ages we are talking about here and the availability of age assurance and complementary technologies today, I found it astounding that only 13% of 8–12-year-olds who had an account told us that they had been suspended or banned from an online service this year for being underage.
This is why we have issued new information requests under the Basic Online Safety Expectations to 8 online services, asking how they are enforcing their own rules about the age of their users today.
We asked questions about the number of children on their services, the methods they use to assess age, how third-party users can report underage children, and what kinds of signals the platforms themselves are picking up and actioning.
As with our previous transparency report processes, we will publish the information we receive in the coming months to increase accountability and encourage improvements.
How we are addressing these challenges: The eSafety model
On that note, perhaps it’s time I explained a little more about eSafety and how we work.
eSafety is Australia’s independent online safety regulator, the first of its kind in the world. Our legislated functions include playing the leading role in education and coordination around online safety.
We operate four complaint schemes covering everything from child sexual exploitation and terrorist and violent extremist material to youth-based cyberbullying, adult cyber abuse and image-based abuse – the sharing of intimate images without consent.
eSafety’s actions are guided by our 3 Ps model: prevention, protection, and proactive and systemic change.
Prevention
The first P is Prevention. Through our research, education, and awareness-raising programs, we strive to prevent online harms from happening in the first place.
As I said at the start, we believe the frontline of online defence will always be parents and carers, who ultimately make decisions in the home around children’s access to devices and online services. But our years of experience have also demonstrated that busy parents are often the hardest cohort to reach.
Like the NSW Government data released last week, our own research with adults has indicated that 95% of Australian caregivers find online safety to be one of their toughest parenting challenges. But interestingly, only 10% of parents proactively seek out online safety information before something goes wrong with their children online.
Just as we talk to kids about stranger danger and build their resilience in other ways, it is clear we need to continue initiating important conversations with our children about what they are experiencing online.
Just as we check for hazards on the playground, supervise our kids on the climbing nets or join them on the seesaw, we should be alongside them in their digital play. Our research with young people has given us a clear indication that online gaming is one place they want to play, thrive, connect, problem-solve and relieve stress.
Young people told us they wanted their parents to be interested in what they were doing online. And they also wanted them to co-view and co-play with them. In short, kids want us to be more involved in their online lives, most especially in those early years.
Our eSafety Guide helps parents navigate the most popular games, apps and sites young people are using today and our comprehensive parenting guides are available in multiple languages, covering a vast array of topics.
Our free webinars for parents are offered regularly and can be found alongside our other resources and reporting queues at eSafety.gov.au.
All of these resources are underpinned by research and sound pedagogical approaches, and informed by current trends, contemporary tech usage and parental needs.
Through our Toolkit for Schools, teacher professional learning programs, virtual classrooms and curriculum resources, we’ve been able to amplify our efforts through the National Online Safety Education Council, which has participation from 27 different educational bodies across the country.
Teaching responsible technology use in schools is the second line of defence to reinforce these mission-critical skills and best practices for our children.
Partnerships
We believe further partnership with the states and territories is vital to reaching Australian parents – and I believe that we can only truly enable and empower more parents with your help.
While eSafety has the online safety content, programs and the regulatory remit, states possess the key delivery mechanisms to reach parents, whether through schools, mental health professionals, GPs or local law enforcement.
Putting boots on the ground through eSafety champions in every school and getting our content and guidance into the hands of parents and teachers is absolutely essential.
This was exactly the outcome eSafety recently achieved with the three NSW education sectors: NSW Department of Education, the Association of Independent Schools NSW and Catholic Schools NSW on our joint “Spotlight on Cyberbullying” package – and we see this as a promising model for online safety collaboration with other states and territories.
Youth engagement and codesign
Our engagement with young people, including through our eSafety Youth Council – some of whom have joined us today – is gathering important momentum. Meaningful consultation and engagement with young people is both a science and an art, and we are so pleased that our Youth Council is being heard by policymakers through the Joint Select Committee process, through the age verification trial, through our Safety by Design initiative and through other critical policy efforts impacting children.
Whilst young people are tech-savvy, Reset’s experiment will likely demonstrate that, when it comes to opaque algorithms, the amplification of harmful content and dark patterns, children are not facing a fair fight – and this is not something they can weather alone.
They understand this – and want our help through prevention, protection and further measured regulatory interventions.
Protection
We know that meaningful and lasting societal change takes time and, until that happens, people suffering harm can continue reaching out to eSafety for help. This is where we provide Protection.
Under Australia’s Online Safety Act, eSafety operates several world-first schemes to protect Australians from online harm; we serve as a safety net when the platforms fail to act.
We have a greater than 90% success rate in having cyberbullying and image-based abuse content removed, but we are also seeing concerning trends.
Over the past four years, we’ve seen a 313% increase in cyberbullying, including aggressive online behaviour by teens bordering on the mercenary and the merciless.
Last financial year, we received more than 7,000 reports of image-based abuse – intimate images shared without consent – including seeing deepfake and nudifying apps being used by teenagers to target their female classmates. These powerful new apps are downloaded on smartphones with few guardrails and at no cost to the perpetrator – but the cost to the victim-survivor is lingering and incalculable.
We know this is just the tip of the iceberg.
We continue to see a year-on-year doubling of child sexual abuse material – also known as CSAM – including reports of synthetic material generated by AI. Almost 1 in 8 reports of child sexual exploitation our investigators are seeing has been categorised as “self-produced CSAM”, largely coerced remotely by predators through online devices in the children’s family home – most often in the privacy of their bedrooms and bathrooms.
Proactive and systemic change
And finally, our third pillar is Proactive and Systemic Change. This means hardening the threat surface for the future by anticipating technology trends before they hit us in the face.
That includes shifting the burden for safety back onto the platforms themselves through Safety by Design.
I think we need to be realistic that, while a ban could forestall the potential damage done to young people for a short period, we absolutely need to ensure that the online services they are on today are made much, much safer.
This means challenging the flawed technology ethos of moving fast and breaking things and ultimately disrupting the attention-seeking surveillance business model that has propelled these platforms for so long – this is well and truly a fundamental and global challenge.
That is why government has entrusted us with systemic powers such as our codes and standards that allow us to work with industry to set obligations up and down the technology stack.
This avoids a “single point of failure” in tackling the most extreme forms of online content such as child sexual abuse and terrorist material.
We are also currently well under way with Phase 2 of our industry codes process which is focused on preventing young people’s access to pornography and other high impact content, like self-harm material.
We believe these systemic approaches will serve as an important complement to any proposal around social media safety, ensuring that the whole ecosystem has safeguards embedded throughout.
The road ahead
Australia has shown itself to be an innovative and thoughtful first mover when it comes to online safety regulation – and the state premiers have certainly reinforced and amplified the importance of leading from the front.
The Prime Minister has recently announced plans to further legislate age restrictions this year, foreshadowing a national approach that will enable better alignment with the regulatory framework already established under the Online Safety Act.
Not only will this dovetail with the important frameworks that will be implemented through our codes and standards, but it will also coincide with the important Online Safety Act Review that Government has brought forward to strengthen and future-proof eSafety’s current regulatory toolset. These recommendations will be put forward by Delia Rickard at the end of this month.
This is the third legislative review eSafety has gone through over our short time as a regulator and we have learned first-hand that ambitious policy imperatives do not always translate into effective implementation and enforcement without considering the evidence and all pieces of the regulatory puzzle.
This is precisely why I have raised concerns about the potential for unintended consequences that could drive young people to darker recesses of the internet.
We have also spent the last nine years encouraging young people to disclose online harms to a trusted adult and to seek out various forms of help, and we do not want to undermine that progress by unwittingly driving these conversations underground.
These concerns are raised so that they can be addressed upfront.
Finally, we do need to acknowledge the protective and positive factors that young people derive from communication, exploration and connection through the internet. Our research has shown us that at-risk cohorts like LGBTIQ+ teens, First Nations teens and those with disability feel more themselves in an online environment than they do in real life. This has served as a lifeline for them – and we certainly do not want to undermine those benefits – we want to harness them!
Going further together
With this in mind, I believe we will continue to go from strength to strength. We are stronger as a united national front in ensuring that we empower parents, build digital resilience and savvy in our children and hold global tech companies to account for the harm they may be causing to younger Australians.
Just as we saw in the evolution of playground equipment safety, it is high time the technology industry harnessed its collective brilliance, vast financial resources and access to the most advanced technologies to create digital playgrounds for our kids that are much safer by design.
The key difference is that we need a safety revolution, rather than just an evolution.
Australia has shown its willingness to legislate for important safety outcomes. Ultimately, this is what our children deserve and a state of play we should increasingly demand.
As the old African proverb goes, “We may go faster alone but we will go further together.”