A fairer fight: protecting childhood and adolescence in a digital world

This speech was delivered virtually by eSafety Commissioner Julie Inman Grant at the Global Age Assurance Summit in Amsterdam, the Netherlands.

Protecting children from harms

I want to thank you so much for inviting me to speak to you all at this incredibly important global summit.

I have been working on age verification issues in some way, shape or form since 2008 and I truly believe we are now at that inflection point where both the technology and the ecosystem have evolved sufficiently to have a significant positive impact on the online lives of young people everywhere.

This comes at an opportune time, when a groundswell of government interest is converging across all corners of the globe.
We are seeing more governments arrive at the same conclusion: that it is high time we leveraged effective, privacy-preserving age assurance technologies, and complementary measures, as a means of protecting children from exposure to a range of content and conduct they are not ready to see or experience.

The relationship between social media and young people’s mental health is one of the most important conversations of our time. It naturally generates much debate and emotion. Therefore, it is important we ground these conversations in the evidence and the principle of the best interests of young people from the start.

We all know that social media was not created with children in mind, nor with safety as a superordinate goal, which is why we have had to retrofit age assurance and other safety interventions after some of the damage has already been done.

There is no question that social media offers benefits and opportunities for young people including social connection and belonging – and these are digital rights we want to preserve.

But, we all know there is a darker side. This includes algorithmic manipulation employed by social media platforms, predatory design features such as endless scroll, and exposure to graphic and increasingly violent online pornography. It also means ready access to sites that promote terrorism and violent extremism, or other seriously harmful content, including material promoting disordered eating, suicidal ideation and severe forms of gendered violence.

While the research is still evolving, there is also growing concern that unregulated access to online platforms – especially those designed to encourage stickiness and prolonged engagement – may interfere with healthy cognitive and social development, particularly when it displaces sleep, physical activity or offline interaction.

Who we are and what we do

Before I explore these issues further – and eSafety’s role in relation to the new scheme for age-restricted access to social media – I think it’s important to quickly explain who we are and what we do.

And this should start from the premise that multiple interventions will be required to keep young people safer online. No single provision will achieve this; rather, it is a holistic approach that will make the most meaningful difference in young people’s online lives.

eSafety is Australia’s online safety regulator and it has been my great privilege to lead this organisation for the past eight years. Our overarching mission is simple and clear: to ensure all Australians have safer, more positive experiences online.
We accomplish this through what we call the three Ps.

These include: protection, through our unique and world-leading complaints schemes, so that we can remediate harm quickly when the platforms fail to act; prevention, through our online safety resources, education and research; and proactive change, through systemic powers and the promotion of Safety by Design[https://www.esafety.gov.au/industry/safety-by-design].

Underpinning these three core Ps is a fourth - partnerships - that allows us to share best practice with fellow regulators around the globe and promote international regulatory coherence.

Australia’s journey

And in that same vein, we continue to work with our partners internationally on these issues and we’re closely watching developments occurring on age assurance in other parts of the world, through the Global Online Safety Regulators Network[https://www.esafety.gov.au/about-us/consultation-cooperation/international-engagement/the-global-online-safety-regulators-network], and bilaterally.

But I think it’s fair to say that Australia has led the world in terms of its early commitment to online safety regulation and to making the bold and decisive decision we are discussing today.

That is because, for too long, improvements in online safety have been incremental, so our Parliament opted late last year to try something much more monumental.

Australia’s introduction of a new law requiring certain age-restricted social media platforms to prevent children under the age of 16 from having accounts is indeed a world first.

But while it might generate international headlines, we are not referring to this as a ‘social media ban’. The technical name is the Social Media Minimum Age Bill, and I believe the nuance is important here.

This is not going to be the Great Australian Firewall; we are not cutting the Coral Cable, nor will every child’s social media account magically disappear simultaneously.

What we are talking about is more accurately described as a social media delay – a delay which gives us vital time to protect young people’s health and wellbeing and equip them with the digital literacy and skills they require to engage online safely. This also includes empowering and enabling parents to better engage in their children’s online lives.

This delay helps us continue delivering our digital and algorithmic literacy programs to hone their critical reasoning skills and build their digital resilience.

This imposed respite from social media is designed to shield them from the harmful and predatory design features that are so often opaque, keeping them tethered to their technology and pushed down content sinkholes they have no power to understand or resist. In short, this is not a fair fight.

The road to implementation

As Australia’s online safety regulator, it will be eSafety’s job to implement and enforce this new law, which will take effect by the end of this year.

While there is still a lot that needs to be considered, a key principle is the recognition that children have important digital rights – the rights to participation, privacy and dignity, to be free from online violence, and to have their best interests protected.

Our implementation of this legislation is not designed to cut kids off from their digital lifelines or inhibit their ability to connect, communicate, create and explore.

Far from it. We’re actually trying to help, support and strengthen younger cohorts in tackling the darker, unseen forces all of us are struggling with.

There are the endless notifications through the night, peer pressure dialled up to the digital max, and relentless algorithmic manipulation designed to ensnare the attention of developing adolescent minds.

As things stand, the technology deck is stacked against them and our children need a break, particularly as their brains and skills around impulse control are still developing.

Goodness knows, I find it hard enough as an adult to curb the endless doomscrolling on social media and news feeds of late. How can we expect children to effectively self-regulate in the face of powerful features designed to maximise their attention?

This law is just one part of our holistic approach, which includes research and education alongside regulatory enforcement, rebalancing the burden of responsibility from ordinary citizens – children and their parents and carers – back onto the industry that profits from them.

So, how will this work?

There is no question that this is one of the most complex and novel pieces of legislation eSafety has ever implemented and no one is pretending that it will be easy. But, there are key milestones which, upon completion, will progressively bring us closer to our goal.

There are a range of interdependencies built into the legislation that elevate the level of complexity, but I’ll give you a quick indication of where we are headed.

One of the first steps will involve Australia’s Minister for Communications making rules on which platforms are included and excluded from the legislation.

eSafety will be asked to provide independent safety advice on any rules proposed by the Minister following Australia’s upcoming election.

We will also soon have the results of the government’s age assurance trial which, as many of you will know, is being managed by our summit hosts, the Age Check Certification Scheme[https://accscheme.com/?srsltid=AfmBOopxguR5l-vAGzBGiEOsIjm3Hc2g8VxD3mOZF1TiJa7bAI6ionNa]. The trial is currently looking at the effectiveness of a range of technologies and their suitability in the Australian context, as our colleagues from the Department of Infrastructure, Transport, Regional Development, Communications and the Arts explained at the summit yesterday.

It will then be up to eSafety to develop guidance on the “reasonable steps” service providers can take to prevent under 16-year-olds from having accounts on certain social media services. 

To inform this guidance, eSafety will soon commence a broad and meaningful consultation to ensure we hear the views and expertise of domestic and international stakeholders. We will be talking to industry, academics, advocates, rights groups and most importantly, children and young people themselves.

The guidance will also build on the findings of the trial, our consultations across a broad range of sectors, as well as the considerable work and consultation eSafety has already undertaken in this space.

This includes our Age Verification Roadmap[https://www.esafety.gov.au/about-us/consultation-cooperation/age-verification] and Background Report, commenced in 2021 and released in 2023, and a paper on updates in the field of Age Assurance[https://www.esafety.gov.au/industry/tech-trends-and-challenges/age-assurance], released in 2024. It also includes our research and regulatory insights, including through our complaints schemes from the public.

As is often said in Australian parlance, “this is not our first rodeo.”

Our research and regulatory insights – underage users on social media services

eSafety recently released a first-of-its-kind report called “Behind the Screen[https://www.esafety.gov.au/research/children-and-social-media]”, which combined the results of a national survey looking at the social media use of Australian children aged 8-15 with information provided directly to eSafety by social media platforms, through transparency reports, about how they enforce their own age limits.

What this eSafety research demonstrated was that 84% of the 8-12-year-olds surveyed reported using at least one online service since the start of 2024. While the proportion of users increased with age, a significant majority – 3 out of 4 children – had accessed an online service by 8 years of age.

Considering Australia’s population of roughly 1.6 million 8-12-year-olds, this suggests that approximately 1.34 million children have used an online service with a minimum age of 13 since the beginning of 2024.
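For transparency, the arithmetic behind that estimate is simply the survey proportion applied to the population figure – a back-of-envelope projection which assumes the surveyed 84% generalises across the whole cohort:

0.84 × 1,600,000 ≈ 1,344,000, or roughly 1.34 million children.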

Given the relatively young ages we are talking about here, and the availability of age assurance and complementary technologies today, I found it astounding that only 13% of the 8-12-year-olds who had an account told us that they had been suspended or banned from an online service last year for being underage.

This means most technology companies don’t really have an accurate picture of the number of children on their platforms, or their actual ages. Possibly because they haven’t been required to…

Behind the screen: Industry insights and potential reasonable steps

And this becomes important when these services start touting new safety features designed to protect younger users.

As there’s nothing stopping a 13-year-old from entering a false birth date, there’s also nothing stopping them from gaining access to an unrestricted adult account devoid of these default safety features.

In addition to a lack of age assurance at sign-up, the report found a wide range of good practices, as well as gaps and inconsistencies, in the accessibility of reporting mechanisms that allow parents and others to alert services to underage accounts, and in the sophistication of proactive measures to detect those accounts.

eSafety will be carefully considering these findings as we begin formulating guidance on the reasonable steps that providers of age-restricted services can take – which could include, but may not be limited to, the following questions (I sketch below, after this list, how such layered measures might fit together):

  • What age assurance measures will be put into place at the account creation phase to keep kids from signing up to age restricted services in the first place?
  • What reasonable steps can we expect services to take to prevent and address these age assurance measures being circumvented?
  • Where underage users do create an account – how do these companies plan to provide improved and easy to complete reporting mechanisms so that parents and other users can alert the service to underage users?
  • What proactive detection measures for underage accounts – both new and existing – will be deployed, and what will suspension of those accounts look like until the user meets the minimum age?
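To make those questions concrete, here is a minimal sketch in Python of how a service might layer these measures. Every name, threshold and signal here – Account, allow_signup, the two-report trigger, the age-estimation input – is invented for illustration; this is not eSafety guidance, nor any platform’s actual system.

```python
# Purely illustrative sketch – invented names and thresholds, not eSafety
# guidance or any platform's real system.
from dataclasses import dataclass, field
from datetime import date

MINIMUM_AGE = 16  # the threshold set by the new Australian law

@dataclass
class Account:
    user_id: str
    claimed_birth_date: date          # self-declared at sign-up, easily falsified
    verified_age: int | None = None   # set only if an age assurance check ran
    suspended: bool = False
    reports: list[str] = field(default_factory=list)

def age_in_years(birth_date: date, today: date) -> int:
    """Whole-years age implied by a (claimed) birth date."""
    had_birthday = (today.month, today.day) >= (birth_date.month, birth_date.day)
    return today.year - birth_date.year - (0 if had_birthday else 1)

def allow_signup(account: Account, today: date) -> bool:
    """Layer 1: block account creation when available signals say under 16."""
    if account.verified_age is not None:
        return account.verified_age >= MINIMUM_AGE
    return age_in_years(account.claimed_birth_date, today) >= MINIMUM_AGE

def report_underage(account: Account, reason: str) -> None:
    """Layer 2: an easy reporting path for parents and other users."""
    account.reports.append(reason)

def proactive_review(account: Account, estimated_age: int) -> None:
    """Layer 3: proactive detection (e.g. an age-estimation signal), with
    suspension - not deletion - until the user reaches the minimum age."""
    if estimated_age < MINIMUM_AGE or len(account.reports) >= 2:
        account.suspended = True

# A 13-year-old enters an adult birth date: layer 1 alone is fooled.
kid = Account("u1", claimed_birth_date=date(2000, 1, 1))
print(allow_signup(kid, date(2025, 4, 1)))   # True: self-declaration fails

# The later layers catch what sign-up checks miss.
report_underage(kid, "parent report: user is 13")
proactive_review(kid, estimated_age=13)
print(kid.suspended)                         # True
```

The takeaway is simply that no single layer suffices: a self-declared birth date is the weakest link, which is why reporting and proactive detection need to work alongside sign-up checks.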

But it’s important to be clear here: there will be no penalties for underage children who gain access to an age-restricted social media platform, or for their parents or carers.

The responsibility lies exclusively with the service providers, as it should, because to date they have simply not done enough to effectively enforce their own age limits.

In terms of our approach to enforcement, we have established an industry supervision team to ensure that all captured services clearly understand what this regulatory guidance means for them and how they can achieve compliance. It is worth noting again that there are significant penalties, and that these will target systemic failures rather than individual transgressions.

Will this work?

The legislation was designed so that covered service providers do more and do better.

Much public commentary has centred around how age limits for social media are unworkable, impossible to implement and that resourceful kids will find ways to circumvent any restrictions put in place.

As a parent, I will concede that kids are indeed resourceful and the really determined may find ways around some of these barriers.

But what we are attempting to do here is create some friction in a system to protect children where previously there was close to none. And in doing so, we can also provide some much-needed support for parents and carers struggling with these issues.

It’s a constant challenge for parents and carers who have to juggle the urge to deny access to services they fear are harmful with the anxiety of leaving their kids socially excluded. I can speak from experience that this sometimes feels like effective “reverse peer pressure” from the kids!

From my perspective, the fact that some children might find ways around these new age limits is not a reason to do nothing.
We anticipate these workarounds and build in interventions to prevent children from seeking out the darkest recesses of the web, and we further empower their parents to deploy parental controls, understand VPNs and better engage in their children’s online lives.

We also think about how the online abuse we deal with every day through our complaints schemes – including our cyberbullying and image-based abuse schemes – is likely to migrate to messaging and gaming services. This may mean we need different regulatory tools, and that messaging and online gaming providers will also need to up their safety protections in this area.

And what about the idea that it’s just too hard?

Well, as the attendees of this summit know well, there are existing examples of major social media companies already assessing the age of their users successfully with current privacy-preserving technology, or actively using AI-driven tools to target users with marketing of deadly precision. I believe they can ultimately deliver similar precision in rooting out underage users on their platforms.

The fact that our age assurance technical trial is testing more than 50 technologies speaks volumes about the diversity of options, and this testing does not include the proprietary tools the world’s wealthiest and most powerful tech corporations are developing and already using.

We know it will be challenging – but I don’t believe the challenge is insurmountable, nor will we let the perfect stand in the way of the good. And the good might be simple: kids speaking face to face more often, reading more books, kicking the footie with their friends, playing online games with their parents. The possibilities for beneficial impact are numerous.

No silver bullet – eSafety’s holistic approach

But as important as age limits will be in helping delay children’s exposure to harmful design features and in granting us precious time to arm them with the skills they need to navigate social media in the future, they are in no way a silver bullet that will solve all our problems.

Which is why eSafety continues to take a holistic approach to protecting, supporting and empowering Australian children online.
We remain committed to working with teachers, parents, carers and children and young people through our Youth Council, to not only ensure they are well informed about risks, but also well-equipped to thrive online.

And this means building on our current digital learning arsenal at esafety.gov.au by developing further digital literacy and resilience resources, and providing access to meaningful, co-designed educational content and resources. Our “Adolescents and Algorithms” webinars have already proven to be one of our most popular teacher training offerings ever.

Our goal is to prepare children so that, whether on the services they will continue to be able to access or on those they will gain access to at 16, they are equipped with the skills to have a safe, valuable and enjoyable experience.

And of course, our protective powers will still be there to provide assistance if things do go wrong online, whether through cyberbullying or deepfaked image-based abuse.

Our enforceable industry codes and Safety by Design

While prevention and protective regulatory efforts are vital, we also have important systemic regulatory powers that complete eSafety’s multi-faceted approach.

And these systemic powers will also work in concert with the new social media age limits.

The online industry has been tasked with developing enforceable codes dealing with children’s access to harmful content. These codes are intended to apply measures up and down the technology stack – including age assurance protections – to prevent and address children’s access to high impact content like pornography.

Our own research shows that while the average age at which Australian children first encounter pornography is 13, more than a third of these children first see this content at a younger age, and often by accident. These encounters informed the title of our research report, “Accidental, Unsolicited and In Your Face”.

In order to protect children from exposure, the codes cover everything from social media services, app stores and search engines, to device manufacturers and internet service providers.

We have received industry-drafted codes and are assessing whether they provide appropriate community safeguards to be registered.

If I determine they don’t go far enough, I have the power to write the rules for industry and move to mandatory standards – and I believe that every sector of the technology industry bears some responsibility for safety to prevent there being a single point of failure.

We also continue to encourage the tech industry to take a Safety by Design approach, making their platforms safer from the beginning by putting safety at the forefront of the design and development process. This is an initiative eSafety kicked off in 2018. Safety by Design has gone from strength to strength, and has long recognised that innovation and safety are not, in fact, mutually exclusive but are instead inseparably linked.

Theory of change: Evidence, evaluation and impact

It’s clear that eSafety and the social media platforms will have a lot of important work to do before these social media age-limits take effect later this year.

I am pleased to announce that the first step in our planning for the evaluation of the new regulations was the launch of an expressions-of-interest process for a Lead Academic Partner and an Academic Advisory Group.

This will ensure we are gathering vital evidence, evaluating the efficacy of our interventions and assessing their impact in tandem with world-renowned academics and researchers. This is important to the integrity of this regulatory intervention, but will also be valuable to other governments seeking to learn from Australia’s experience.

We are deeply committed to conducting a thorough evaluation to understand the impacts of the legislation, both intended and unintended, on children, young people, and their caregivers. Our commitment extends to gathering evidence on the implementation to inform necessary adaptations and improvements.

To that end, I am very pleased to announce today that the Social Media Lab[https://sml.stanford.edu/] at Stanford University, led by Professor Jeff Hancock, will be our lead academic partner. Stanford will be joined by a distinguished group of Australian and international academics with deep expertise across the key domains of the evaluation.

We will be announcing the other Australian and global academics in due course, but a central component of this work will be harnessing this collaboration with academic experts to enhance the rigour, quality and objectivity of the evaluation.

What gives me great encouragement is that, while the road to get there may prove a little bumpy, I know there is deep community support in Australia for more effective measures to protect children from harmful content, as well as from features designed to make social media addictive.

Australia is pleased to be taking a national approach to setting age restrictions for social media services, but I’m confident we won’t be the last, as we continue working with other jurisdictions with similar aims.

While we feel the eyes of the world are upon us right now, we are also watching with great interest what is happening internationally in this space, particularly in the EU, the UK, the US states and across the Asia-Pacific.

We may not have the answers today – or tomorrow – but we will approach this challenge with the goal of making the online world a little bit safer – and a little bit better – for both young people and their parents.

In true Aussie spirit, we are having a go – not because it is easy, but because it matters. It is a bold move, for sure, but every big change starts with someone willing to take that first crack.

Wish us luck – and we look forward to reporting back next year and to working with you to achieve the best outcomes! Thanks for listening and for all that you do!