Social Media Minimum Age compliance report

As Australia’s eSafety Commissioner, my mission is simple but urgent: to make the online world safer for all Australians, but especially children. That is what the public expect of us. And it is what they deserve.

After a decade of building trust and delivering impact, we are now on the cusp of significant changes on several fronts – but crossing the line was never going to be easy.

Today, I released eSafety’s first report on compliance with Australia’s Social Media Minimum Age, which details our significant concerns about the compliance of Facebook, Instagram, Snapchat, TikTok and YouTube.

As a result of those concerns, I announced today that we are moving from a compliance monitoring stance to an enforcement stance. eSafety has a range of enforcement powers available, including civil penalties of up to $49.5 million, but enforcement action demands sufficient evidence, which takes time to gather if it is to hold up in a court of law.

To put this in perspective, the investigation leading to last week’s historic victory by the New Mexico Attorney General against Meta – which found the company liable for misleading consumers about the safety of its platforms and endangering children – was initiated in 2023.

Our concurrent regulatory investigations into multiple companies are even more complex. We have notified the platforms about specific issues and expectations for improvement, and warned them we are currently investigating potential non-compliance.

Any cultural change like this, which pushes against the powerful interests and revenue potential of entrenched industry players – whether car manufacturers, Big Tobacco or Big Tech – will face resistance. Those players will push back while we continue to push ahead.  

Important as it is, the Social Media Minimum Age is just one part of the broader online safety framework Australia is building.

That includes eSafety’s extensive education and outreach and our direct work with educators, police and the broader community.  

Then there are our existing complaint schemes, which offer support for Australians experiencing cyberbullying, image-based abuse or adult cyber abuse, as well as a place to report illegal material. Support, information and resources are available to everyone through eSafety.gov.au.

Australia’s expanding online safety framework also includes our newer systemic powers to compel transparency and enforce industry codes and standards. The latter address child sexual abuse and other violent extremist material proliferating online, and protect children against exposure to pornography, high-impact violence, suicide and self-harm material.

In addition, the Government is developing a Digital Duty of Care that will place the onus on technology companies themselves to proactively keep Australians safe and better prevent online harms.

Alongside the Social Media Minimum Age, these new additions to our online safety framework do something profound. They shift responsibility – squarely and unambiguously – back on to the technology industry, creating legal obligations not just to respond to harm, but to prevent it.

That approach is rooted in a global initiative eSafety has been leading for the past eight years called Safety by Design. Fundamentally, it all comes back to this: safety should not be retrofitted. It should be engineered from the start.

We see that principle in our modern cities and towns, where planners design out risk before a single brick is laid.  

Playgrounds are built with fencing, shade cloth, soft-fall surfaces and certified equipment. Water sources are placed away from sewage to prevent cross-contamination and streets are built to slow cars, not just respond to crashes.  

Buildings must meet rigorous codes because we understand they will stand for generations, shaping how people live, move, and feel.  

That’s the philosophy behind Safety by Design – the idea that safety must be baked into the online world, not bolted on.

We are making progress towards our goal but, in doing so, we are unwinding decades of entrenched industry practice. That cannot happen overnight.

Think about auto manufacturing. Cars weren’t always safe – I still remember riding, without a seatbelt, on the front bench seat of my family’s wood-panelled station wagon. It took massive global public pressure from regulators, policymakers and reformers to force the necessary safety changes, using credible fatality data that demonstrated safety interventions would save lives.

Today our cars have not only seatbelts but also airbags, crumple zones and anti-lock brakes – features we now almost take for granted because safety became a design principle, not an optional extra.

Today, carmakers compete based on safety standards. And millions of lives have been saved because of these interventions.  

Until the trillion-dollar social media companies meaningfully do the same by investing in safer spaces – or until our laws are potent enough to force them – what choice do we have but to give under-16s a reprieve from the harmful and deceptive design features engineered to entrance them?

We must build a culture across the tech ecosystem where safety is instinctive and customary, not optional. Where responsibility is assumed, not avoided.

Where every designer, every engineer, every executive understands that the decisions they make have real-world consequences for real people, especially children.

In short, so that tech companies not only have a duty of care, but the duty TO care.

A version of this op-ed was originally published across News Corp mastheads.