Social media age restrictions
The Australian Government is protecting young Australians at a critical stage of their development, through world-first social media age restrictions.
The Online Safety Amendment (Social Media Minimum Age) Act 2024 introduces a mandatory minimum age of 16 for accounts on certain social media platforms, forming one part of a broader strategy to create safer digital spaces for everyone.
The change aims to strengthen existing measures for protecting young users, especially where there are particular risks associated with accessing potentially harmful social media content and features such as persistent notifications and alerts that have been found to have a negative impact on sleep, stress levels and attention.
The onus is on the applicable service providers to introduce systems and processes that ensure people under the minimum age cannot create or keep a social media account. This means there will be no penalties for age-restricted users who gain access to an age-restricted social media platform, or for their parents or carers.
The age restriction requirements will take effect by December 2025. Details about how they will operate, how and when they will be enforced, which services will be affected and other relevant information will be developed throughout 2025 and provided on this website. The information will help young people, parents, carers, educators and the online industry understand and prepare for the change.
Key measures under the new law
Mandatory minimum age
Service providers will have to take reasonable steps to prevent those under 16 years old from having accounts on certain social media services.
New penalties
Service providers that don’t comply with the new age restrictions will face civil penalties. Penalties for breaches of industry codes or standards have also been increased, to ensure providers take their responsibilities under the Online Safety Act seriously.
Key milestones
November 2024: Social media minimum age Bill introduced to Parliament. A short public consultation was held – read eSafety’s submission.
December 2024: Bill passed and received Royal Assent, becoming an Act. Platforms required to take reasonable steps to prevent age-restricted users having accounts, effective within 12 months on a day specified by the Minister.
Before the requirements take effect (and ongoing):
- The Minister will make legislative rules specifying services that are or aren’t covered – the eSafety Commissioner will provide advice through a separate, independent assessment.
- The Minister may make legislative rules specifying kinds of information that must not be collected (in addition to the information restrictions already set out in the Act) – the eSafety Commissioner and the Information Commissioner would make independent recommendations.
- eSafety will formulate guidelines for reasonable steps to prevent age-restricted users having accounts. This will include deep consultation.
After the requirements take effect: eSafety may obtain information from service providers about compliance, and may enforce compliance.
What does it mean for young people?
Most social media services currently have a minimum age requirement of 13 for account holders. However, current measures to enforce the minimum age are often absent or ineffective. There are also community concerns that young people who are 13 to 15 years old are too young to be exposed to the types of harms and deceptive design features that can be experienced on social media.
The new law puts the onus on social media platforms, not parents or young people, to take reasonable steps to ensure fundamental protections for under 16s are in place.
eSafety is mindful of the need to balance the safety of young people with a broader range of digital rights, including those that enable online exploration and expression and reinforce privacy protections (fuller analysis of this issue is available in our age verification background report). With this in mind, we will consult with young people – and their parents, carers and educators – about the implementation of the law, its impacts and the best way to support them through the change.
This work also includes building a robust evidence base to measure the impacts of the restrictions when the law is reviewed after two years.
How parents and carers can help
eSafety understands the change in the law may make some young people feel upset, worried or angry. Some may become more secretive about their social media use and less likely to ask for help from a trusted adult if things start to go wrong.
It’s important that parents and carers help young people by talking openly about the age restrictions, finding out how they currently use social media and how that may be impacted by the new law, as well as encouraging them to seek help if they need further support for their health and wellbeing.
- See a full list of counselling and support services.
- Find eSafety tips for parents and carers on how to start hard-to-have conversations, as well as information about parental controls and managing time online.
- Explore eSafety advice and resources co-designed with young people – eSafety will continue to develop and update this guidance to ensure that when young Australians reach 16 they will be prepared for safer social media use.
- Helpful advice about discussing the social media age restrictions is also provided by headspace (Australia’s National Youth Mental Health Foundation) at Information for family about the social media ban.
Creating a holistic regulatory environment
The social media age restrictions are part of the Australian Government’s wider strategy to protect Australians online.
eSafety implements and enforces a number of interconnected measures designed to keep online service providers accountable, encourage greater transparency, prevent online risks and limit the impacts of online harm.
Proactive and systemic change
The Australian Government’s Basic Online Safety Expectations set expectations for how certain online services should protect users. Providers are expected to take reasonable steps to:
- ensure people can use their services safely
- consider the best interests of the child in the design and operation of services likely to be accessed by children
- prevent children’s access to age-inappropriate restricted material such as pornography.
These reasonable steps include implementing appropriate age assurance mechanisms.
eSafety has powers to request or require information from providers about how they are meeting the Expectations. In September 2024, eSafety requested information from some of the world’s most popular services to find out how many Australian children are on their platforms and what age assurance measures the services use to enforce their own age restrictions. A report on the responses will be released in early 2025.
Phase 2 industry codes are being developed by industry associations to prevent children and young people under 18 from accessing or being exposed to age-inappropriate material such as pornography, across many different types of online services. These codes must provide appropriate community safeguards before the Commissioner can register them. As set out in eSafety’s position paper, this will likely include age assurance and a variety of complementary measures. Codes are required to be submitted to eSafety for consideration by 28 February 2025.
Social media services are also regulated under the Phase 1 Industry Codes and Standards designed to protect Australians from the most seriously harmful online content, including child sexual exploitation material and pro-terror material. Some codes include specific measures for online services which permit children under 16 to hold accounts, and these will still apply to services that are not included in the mandatory social media age restrictions.
In addition, the Safety by Design initiative encourages services to embed user safety into their design from the start. By integrating safety features into technology, Safety by Design will complement the new social media restrictions and reinforce a proactive approach to user protection.
Meanwhile, the proposed introduction of a Digital Duty of Care, as recommended in the statutory review of the Online Safety Act 2021 and announced by the Australian Government, would place the onus on all digital platforms to proactively keep Australians safe and better prevent online harms.
The social media age restrictions will intersect with each of these activities, so eSafety will take a coordinated approach to ensure they work together effectively.
Prevention
The social media age restrictions will not eliminate every risk that children face online, and the risks will not simply disappear when a child turns 16. So eSafety will continue to help people of all ages understand how to use online services and platforms as safely as possible and get help if they are harmed.
We raise community awareness about online safety issues, preventing risks and dealing with harms through our training and education programs. We also provide a wide range of web pages and resources, including information and advice written specifically for young people, parents and carers, and educators.
Protection
When prevention measures fail, eSafety provides a safety net for Australians harmed by serious online abuse or exposed to illegal and restricted content.
We investigate complaints and work with online service providers to stop, remove and limit the impacts of cyberbullying of children, adult cyber abuse, image-based abuse (sharing of intimate images and videos without the consent of the person shown), as well as illegal and restricted online content.
Building the foundations for change
eSafety’s approach to implementing the social media age restrictions will be based on research, evidence, deep consultation and careful consideration of the best interests of children.
It will be informed by guiding principles developed through stakeholder consultation on eSafety’s Age verification roadmap:
- Take a proportionate approach based on risk and harm.
- Respect and promote human rights.
- Propose a holistic response that recognises the roles of different stakeholders and supports those most at risk.
- Ensure any technical measures minimise data and preserve privacy.
- Consider the broader domestic and international regulatory context.
- Consider what is feasible now and anticipate future environments.
Find out more about eSafety’s consultation on age verification and read further analysis of opportunities and risks associated with various age assessment methods in our age assurance trends and challenges issues paper.
eSafety will also build on the consultation undertaken for development of the Restricted Access System that requires certain online service providers to limit access to R18+ content. Find out more about the Restricted Access System Declaration and eSafety’s consultation.
In addition, eSafety will take into account the results of the Australian Government’s Age Assurance Technology Trial, which is reviewing age verification, estimation and inference technologies. These technologies will be considered as options to prevent access to online pornography by children and young people under the age of 18, and age-limit access to social media platforms for those under 16 years of age. The results are expected in June 2025. For more information on the technology trial, visit ageassurance.com.au.
More information
eSafety stands ready: Read eSafety’s statement, issued when the Online Safety Amendment received Royal Assent.
Australian Government fact sheet: Read information on the new legislation to enforce a minimum age for access to social media, and the age assurance technology trial.
Youth perspective: Read eSafety’s research into young people’s attitudes towards online pornography and age assurance.
Last updated: 20/12/2024