Thank you Chair for the opportunity to update the Committee on some important recent developments involving the eSafety Commissioner.
Like any regulator, we have faced challenges as we continue to test our powers and the legislation under which those powers are provided, but we have also chalked up some globally significant wins.
Last month’s Federal Court ruling was incredibly important: it confirmed that X Corp was obliged to respond to our transparency notice seeking information about the measures the company has in place to address the proliferation of child sexual exploitation and abuse material on its X platform.
The Court rejected X Corp’s argument that it should not have to respond to our regulatory actions because eSafety’s notice had been sent to the company’s former entity - Twitter - in February 2023.
Had X Corp’s argument been accepted by the Court, it would have set a concerning precedent that a merger between two foreign companies, and a quick rebrand, is all that is needed to avoid regulatory obligations in Australia.
Further demonstrating the power of transparency, I was pleased to see Apple introducing a feature that allows Australian children to easily report unwanted nude images directly to the company.
This comes almost two years after eSafety’s transparency reporting first highlighted the lack of such basic safety measures on the company’s platforms.
It’s no coincidence that Apple is introducing this feature in Australia first before rolling it out worldwide, and we should all feel proud of the world-leading role we are playing in this space to drive greater accountability.
We’ve now sent transparency notices to 30 major services asking important questions about how they are tackling a range of online harms, and in July we sent our first periodic notices requiring tech giants like Apple, Meta, Google and Microsoft to report to us every six months on how they are tackling child abuse material on their services.
We believe regular reporting will keep the pressure on companies to make meaningful safety improvements – and we need to see a significant safety lift from these tech giants.
You would have also seen the Administrative Appeals Tribunal make orders last month to resolve proceedings brought by X Corp in relation to an eSafety notice requiring the company to remove material depicting a declared terrorist attack by a 16-year-old boy on a religious leader in Wakeley, NSW in April.
eSafety made the decision to issue the notice in accordance with the provisions set out for us under Australia’s Online Safety Act, and because of the potential for such viral footage to inspire copycat attacks and to be used as a tool for online radicalisation, especially targeting vulnerable young people.
As ASIO director-general Mike Burgess said at the South Australian Social Media Summit last month, "all of Australia's most recent cases of alleged terrorism or events that are still being investigated as potential acts of terrorism were allegedly perpetrated by young people ... including one as young as 14.” He continued, "the internet was a factor in every single one of those incidents..."
It became clear to us that some of these complex issues could only be resolved through significant legislative changes. Therefore, we felt it more appropriate to await the Australian Government’s consideration through the formal review of Australia’s statutory online safety framework.
As you would have seen through Minister Rowland’s press release on November 1, “the Government brought forward the commencement of the Review by one year to 2024, to ensure the current framework is fit-for-purpose and the eSafety Commissioner has the necessary tools to keep Australians safe.”
But despite both parties agreeing to the AAT’s orders, X Corp’s Global Government Affairs arm, and subsequently sections of the Australian media, sought to characterise this development as somehow being an admission of error by the regulator.
This is not only misleading but factually incorrect.
eSafety will continue holding tech companies to account without fear or favour, ensuring they comply with the laws of this country while prioritising the safety and wellbeing of all Australians, particularly children.
This echoes the theme I emphasised at the last Estimates – we should not continue to allow primary school children ready online access to material containing bombings, bludgeonings and beheadings through their smartphones. School officials have confirmed that sharing of gratuitously violent content amongst peers is happening today in Australian school yards.
Indeed, the protection of children online is now the active subject of national debate, from kitchen tables to the floor of the Australian Parliament.
The Prime Minister has already announced a Commonwealth-led approach to age restrictions on digital platforms, while we have also seen a South Australian bill proposing to limit social media access for children under the age of 14.
We welcome this important national discussion and have long supported the use of age assurance as part of a suite of measures to protect children from harmful content. As such, we support an evidence-based, nationally cohesive approach that also considers children’s vulnerabilities as well as their fundamental rights.
But a key missing piece in this debate has always been reliable data on just how many kids are on social media and messaging apps to begin with and whether or not major online services are ‘age assurance-ready’.
This is why in September, eSafety asked some of the world’s most popular social media and messaging services to tell us how many children are signed up and how they enforce their own age limits. We expect to publish their answers early next year.
While much of this debate has been about raising age limits, we're also focusing on making the online spaces young people are using today safer by design. This includes engaging deeply with young people and providing them with a voice in the national debates that will impact them.
This is also why eSafety’s research with young Australians surfaced the shocking reality that 84 per cent of 8-12 year-olds reported using at least one online service since the start of the year. Members of our Youth Advisory Council have contributed to policy discussions around age verification and safety by design, and just last week testified before the Joint Select Committee on Social Media and Australian Society.
Additionally in July, eSafety notified key members of the online industry that they had six months to come up with enforceable codes that will protect children from exposure to graphic pornography and other high-impact content including themes of suicide.
The draft codes are currently out for public consultation and by the end of this year industry associations will be required to submit final drafts to my office for registration.
If the codes don’t meet appropriate community safeguards, I will consider moving to mandatory standards, where I will set the rules. But I firmly believe in a layered approach that requires each sector of the tech industry to take responsibility, so that there is no single point of failure.
We already have six enforceable codes addressing the worst-of-the-worst online content, including child sexual abuse material and pro-terror content. Two standards covering the proliferation and distribution of this content on messaging and cloud-based file-hosting services are set to come into force by Christmas.
But one of our most important core functions is to provide compassionate service and support to individual Australians who find themselves in the midst of an online crisis or targeted by serious online abuse.
We act as an important safety net, bridging the divide between the tech behemoths and ordinary citizens who feel they have nowhere else to turn. And today, these schemes have never been more important.
In the past financial year, we received 2,693 reports of serious online abuse directed at a child through our cyberbullying scheme – a 37% increase on the previous year. Importantly, more than 4,600 visitors to our site clicked through to the Kids Helpline for extra support.
Our Adult Cyber Abuse Scheme handled 3,113 reports, and our Image-Based Abuse Scheme, which deals with the non-consensual sharing of intimate images and videos, received 7,270 reports. It’s worth noting that we successfully facilitated the removal of 98% of reported content from 947 locations across 191 platforms.
And our Online Content Scheme, which deals with the most harmful content, like child sexual abuse material, received 13,824 complaints containing about 33,910 URLs.
It’s important to remember that these statistics represent so much more than numbers – behind each of these reports is an Australian in distress, and eSafety is there to rapidly remediate these harms and provide wrap-around support.
And while our protective schemes are important, so is prevention. Our education and training programs continue to equip children and young people with practical skills and confidence to stay safe and positive online.
Almost 1.4 million people – including 1.2 million school students, 31,000 educators and 28,000 parents – participated in our training courses in the year to June 2024.
So, it really has been a watershed year for eSafety where we’ve learned a great deal by testing our powers and maturing as a regulator.
This year marks the first time in our nine-year existence where we have appeared in both the Federal Court and the Administrative Appeals Tribunal to progress our regulatory actions.
We’ve also completed the first phase of our enforceable codes and standards, designed to put the onus back on industry to tackle the highest harms, like child sexual abuse and pro-terror material. The final standards will come into force next month.
And our phase 2 codes aimed at protecting children from exposure to pornography and other high-impact material are well underway, with early industry drafts already out for public consultation.
And of course, we’ve continued to use our transparency powers to great effect, forcing the tech industry to face some uncomfortable truths about a range of harms that may be playing out on their services.
The actions Australia is taking to make the online world a safer place for everyone are truly world-leading, and eSafety will continue to use these powers to promote the safety of our citizens and hold industry to account.
Thank you, and I’d be happy to answer any questions that you may have.