In the first 12 months of the Online Safety Act, the eSafety Commissioner exercised expanded powers to hold platforms and perpetrators accountable and to protect Australians from online harms, which continued to grow rapidly.
On the first anniversary of the Act’s commencement, eSafety can disclose it issued its first End User Notices, compelling recipients to remove serious cyberbullying material targeting another child.
“Cyberbullying complaints have continued their post-pandemic surge since the Act came into force, increasing by over 69 per cent compared to the previous calendar year,” eSafety Commissioner Julie Inman Grant said.
End User Notices are a legal instrument available to eSafety, used at its discretion in more serious cyberbullying cases, such as those involving direct threats of harm.
Non-compliance may result in enforcement action against the recipient.
“I urge everyone to be mindful of online safety ahead of the return to school, when we typically see reports of cyberbullying spike. If you are a target, or need more information or support, please contact us at eSafety.gov.au,” Ms Inman Grant said.
“Information is also available for concerned parents and school leaders seeking to protect their students while delivering a strong deterrent message.”
In the 12 months since the Online Safety Act commenced on 23 January 2022, eSafety investigators have probed more than 1,680 cyberbullying complaints and made more than 500 informal requests for online platforms to remove content.
“We are seeing the tenor and tone of this youth-driven cyberbullying content escalating to concerning levels,” Ms Inman Grant said.
“As well as strong action where necessary, eSafety aims to provide compassionate, wraparound support, including referrals to mental health services. More than 4,565 people clicked through from the eSafety website to the Kids Helpline website during the period.”
The Act halves the time online service providers have to respond to an eSafety removal notice – down from 48 hours to 24 – and provides a range of other new and enhanced powers.
On the regulatory front, eSafety moved quickly to utilise new provisions that hold industry to account and compel greater transparency.
“Our first legal notices issued last year under the Act’s Basic Online Safety Expectations revealed some of the world’s biggest technology companies – Apple, Meta, WhatsApp, Microsoft, Skype, Snap and Omegle – have much more work to do tackling child sexual exploitation on their platforms,” Ms Inman Grant said.
“But this is just the start. We’ll be issuing more notices to other companies this year and I am currently considering draft Industry Codes, which propose to regulate the way industry handles illegal and seriously harmful content on their platforms, devices and services.
“At the same time, we’ll continue collaborative activity with industry aimed at lifting standards through our Safety by Design initiative. Not only will this help companies better comply with the Act, it will also encourage development of fundamental safety features that better protect their users.
“Understanding and anticipating technology trends and challenges, such as the recommender engines or algorithms we analysed in our latest position statement, will also be a priority for 2023.
“We have an opportunity to positively shape the technology landscape to reduce the potential for harms related to the metaverse, generative AI and quantum environments, all of which are looming in our near future. This will help eSafety be a nimble, anticipatory regulator when these technologies reach full maturity and saturation.”
The commencement of the Act also marked the formal beginning of eSafety’s Adult Cyber Abuse scheme, under which eSafety issued six formal Removal Notices to online service providers. In every case, the material, intended to cause serious harm to the target, was either removed or geo-blocked from Australia.
eSafety continued to work with online platforms informally to remove abusive material under the scheme, investigating over 2,400 complaints and making 450 informal requests for removal.
Investigators maintained their strong success rate in addressing image-based abuse and illegal and restricted content.
Since the commencement of the Act, eSafety has completed more than 16,000 investigations into child sexual exploitation material, referring 99 per cent of it to the INHOPE international network for rapid removal action.