eSafety stands ready
eSafety will work with government, industry, youth and the broader community to implement the Online Safety Amendment (Social Media Minimum Age) Act 2024, which has now received Royal Assent.
The provisions relating to age restrictions for certain social media services will come into effect no later than 12 months after the Act’s commencement.
While there will be much work to do in this short period of time, eSafety is prepared to hit the ground running, building upon our considerable work to date on these issues.
This includes eSafety’s development of the Restricted Access System Declaration under the Online Safety Act 2021 and our extensive work developing the Age Verification Roadmap – provided to Government in March 2023 – together with our subsequent Age Assurance issues paper, released in July 2024.
Throughout this process, eSafety’s approach has been informed by research, evidence, deep consultation and careful consideration of the best interests of children.
eSafety is mindful of the need to balance the imperatives of safety with privacy, children’s rights to expression online and a range of other fundamental human rights. This also includes ensuring young people help inform the implementation process, just as the eSafety Youth Council – convened and supported by eSafety – provided youth input into the policy process.
Beyond the Act
eSafety has a track record of translating complex and novel online safety legislation into effective regulation, as demonstrated through the range of our systemic powers, including industry codes and standards and our transparency powers under the Basic Online Safety Expectations.
In the coming months, we will be establishing the fundamental building blocks needed to effectively implement and enforce the legislation. Importantly, this process will complement eSafety’s existing holistic strategy to ensure platforms and services are more effectively deploying Safety by Design, whilst lifting safety practices and processes for all Australians.
eSafety will continue to use its world-leading regulatory powers under the Online Safety Act 2021 to compel greater transparency and encourage meaningful accountability from online services, particularly when it comes to the safety and wellbeing of children and young people.
For example, under the new Basic Online Safety Expectations Determination, which reinforces platforms’ responsibilities to act in the best interests of the child, eSafety recently requested vital information from the providers of eight online services used by young Australians to understand the steps they are currently taking to assess the age of users and keep younger users safe. This information will provide a valuable point-in-time view of the methods platforms already use to enforce their own minimum-age terms of service. After further consultation, key findings from this work will be made public in early 2025.
In addition, eSafety continues to work with industry and other stakeholders on elements of the Online Safety Act 2021 that are still being implemented, including industry’s development of codes to protect children from accessing harmful material such as pornography and other high-impact content covering themes of suicide, self-harm and serious illness, including eating disorders.
Public consultation on the industry draft codes has now been finalised and industry associations are expected to submit the final version to eSafety soon.
The eSafety Commissioner will then consider whether the codes provide appropriate community safeguards and are fit to be registered. If the codes do not provide appropriate safeguards or do not deliver the proposed “layered safety approach” across all eight industry sectors, the Commissioner may decide to move to mandatory industry standards. Such a safety ecosystem approach across the technology industry helps avoid creating a single point of failure.
Once in force, these codes and/or standards will sit alongside the first phase of industry codes and standards, which dealt with illegal content, including child sexual abuse material and pro-terror content, and which will be fully operational across all eight industry sectors from 22 December this year.
These world-first standards will require the online industry to tackle the worst-of-the-worst online content, including on end-to-end encrypted messaging services and through consumer-facing AI applications, like some nudifying apps targeting children. The Online Safety Amendment (Social Media Minimum Age) Act includes significantly increased civil penalties for breaches of these codes and standards. eSafety is continuing to work with industry to ensure it is prepared for these changes.
Completion of this first phase of industry codes and standards not only marks a significant stride forward in the online protection of children but may also have global impact on how some of the largest and wealthiest companies in human history tackle the most harmful material online.
Complementing these advances in online safety regulation are our prevention and education efforts, which have always been foundational pillars of eSafety’s work and remain so. This work is critical in keeping young Australians safer on the platforms they are using today, encouraging help-seeking behaviour, and preparing them for the technology challenges they will face in the future.
These efforts will include further interventions to enable and empower parents and carers, as ongoing conversations and engagement in children’s online lives will remain a longer-term need, even with further social media restrictions in place.
The proposed introduction of a duty of care, as recommended in the statutory review of the Online Safety Act 2021 and announced by the Australian Government, would also further strengthen online safety.
These are all interlocking measures designed to arm our citizens with the educational guidance and tools they need to keep their families safer online, whilst placing the burden for safety back on the platforms themselves to ensure the digital services Australians are using today – and tomorrow – are safer by design.
eSafety will continue to be transparent as we work towards the implementation of the Online Safety Amendment (Social Media Minimum Age) Act 2024.