The May Budget marks the start of a new era for eSafety, providing baseline funding certainty to take us into the future.
This funding allows us to plan for the ongoing delivery of compassionate citizen service and education, to meet the ever-growing volume of complaints, and to better navigate an unpredictable operating environment.
We are also taking significant regulatory action and driving systemic change to make the online world as safe as possible for all Australians.
This important injection of predictable funding comes at a time when life online has never held so much promise or posed such potential peril.
Our role has evolved markedly from our early days as the Children’s eSafety Commissioner, with new functions and responsibilities layered on after significant events such as the Christchurch atrocity and the COVID-19 pandemic, which supercharged both the frequency and gravity of the online harms that all digital first responders are now contending with.
What we expected would be a pandemic-era peak in online malfeasance has become much worse than just a “COVID hangover.” We see evidence of this every day through our complaint-based schemes, and of particular concern to us is what we are seeing through our illegal content and image-based abuse schemes.
In the first three months of 2023 alone, we’ve seen a 285% increase in reports under our Online Content Scheme. Most of these reports concern child sexual exploitation material, including coerced CSEM that is generated by children themselves, often in the bedrooms or bathrooms of their own homes.
Over the same period, reports to our Image-Based Abuse scheme, which deals with the non-consensual sharing of intimate images and videos, have more than doubled. The vast majority of these reports involve sexual extortion, a harm perpetrated by overseas criminal syndicates that has almost tripled during the first three months of this year. Young Australian men aged 18 to 24 feature in 90% of these reports to my office.
Australia’s Online Safety Act commenced in January last year. It not only strengthens our powers to remediate harms to individuals, but also grants us powerful new tools to tackle systemic failures within industry at scale.
The Act includes a set of Basic Online Safety Expectations, a world-first transparency tool that is finally allowing us to lift the lid on what companies are, and are not, doing to tackle child sexual exploitation and abuse.
In August last year, we put tech giants like Apple, Meta, Snap and Microsoft under the microscope, asking them some tough questions. Their answers revealed where they are not doing nearly enough to protect children on their services, or to respond to reports of child sexual abuse being livestreamed, hosted or shared on their platforms.
And earlier this year we issued another set of legal notices to Twitter, TikTok, Twitch, Discord and Google, extending the questions beyond CSEM to cover actions they are taking to counter sexual extortion and the amplification of harmful content through their algorithms.
We have now received responses from all of the companies and will be providing further insights through a public report in the near future.
In the coming weeks, I will be making a determination on the first phase of the draft industry codes to address illegal content, including child sexual exploitation material and terrorist and violent extremist material.
If I conclude that any of the codes submitted to us by industry do not provide appropriate community safeguards, I have the power to move to a binding industry standard.
These codes or standards will effectively set a new global benchmark for what is expected of companies within the online ecosystem when it comes to their obligations around the most grievous forms of illegal content.
This is another important way that Australia is setting the global pace in online safety regulation, championing systemic reforms, and working hard to markedly lift online safety standards and practices across the board.
But as you all know, this is a whole-of-society challenge, so we must also continue to strengthen our partnerships and empower parents, educators, young people and the rest of the community with the information and skills they need to navigate the online world safely.
And of course, we must also keep an eye on the horizon for new threats. The race to colonise the metaverse, combined with astonishing advances in generative AI, has dominated the headlines. While these technologies promise great benefits, we cannot allow safety considerations to be forgotten in industry’s rush to conquer these new technological frontiers.
Safety by Design is an initiative we have long championed at eSafety, and I am happy to say it is now gaining greater traction within parliaments, multi-stakeholder bodies like the OECD and G7, and boardrooms around the globe. It is vitally important that we continue our mission to encourage, and perhaps compel, industry to make user safety a key consideration rather than an afterthought, and to build effective guardrails upfront.
In fact, Safety by Design will also be a key component of the Australia-United States Joint Council on Combatting Online Child Sexual Exploitation, announced by Prime Minister Albanese and President Biden last week.
As our new baseline funding indicates, eSafety performs a critical ongoing function and we remain keenly focused on discharging our responsibilities under the Online Safety Act.
And we are confident that through this Budget process we are now in a more stable and sustainable position to deliver and meet the expectations of the Australian Government and the wider community.