
Frequently asked questions about access to online porn and other adult content

The Age-Restricted Material Codes aim to help protect Australian children from a range of age-inappropriate content that can be harmful to their health and wellbeing. This content includes pornography and material that shows high-impact violence or encourages disordered eating and self-harm.

These frequently asked questions (FAQs) will help you understand how the codes work, including what they mean for searching or viewing adult content and the age checks required.


Learn more about how the industry codes and standards help to protect Australians from illegal and restricted online content.


About the codes

Through the Online Safety Act, Australia’s Parliament has required the online industry to provide Australians, especially children, with greater protections from online harms.

In June and September 2025, eSafety registered industry-developed Age-Restricted Material Codes with the key goal of making sure that Australians who are under 18 aren’t accidentally or unintentionally exposed to potentially harmful age-inappropriate material. Although they were drafted by the industry, the codes are mandatory (not voluntary) and enforceable under the law.

While the focus is on stronger protections for children, the codes also require service providers to give all Australians information, tools and options to allow them to limit their own exposure to age-restricted content.

The age restrictions mean that adults will maintain the right to access lawful content they wish to see, while also having better control of their online experiences.

The codes cover eight sections of the online industry:

  • App distribution services – also called app stores.
  • Designated internet services – a broad category covering online services (mainly websites) that provide entertainment, education or information. It also covers generative AI services and AI model distribution platforms.
  • Equipment providers – including operating software providers. This typically relates to phones, tablets, laptops and other portable devices that allow direct interaction between people and let you search the internet.
  • Hosting services – covers the servers and infrastructure that make websites or online services accessible on the internet.
  • Internet service providers – including phone and home broadband services.
  • Relevant electronic services – covers email, messaging or online chat (including dating services), as well as services for playing online games together.
  • Search engines – such as Google and Bing.
  • Social media services – including both those covered under the social media age restrictions for under-16s and those which are excluded.

For search engine services, internet carriage services, and hosting services, the relevant Age-Restricted Material Codes took effect on 27 December 2025.

For designated internet services, relevant electronic services, social media services, app distribution services, and equipment providers, the relevant Age-Restricted Material Codes took effect on 9 March 2026.

Some requirements under the Age-Restricted Material Codes have a staged commencement. These include certain age-check requirements, such as those for logged-in users of search engine services (which must be in place before 27 June 2026) and for users seeking to download apps rated R18+ from app distribution services (before 9 September 2026).

The internet was not designed with children in mind, even though children and adults both rely on the internet and digital devices for study, work, connection, relaxation and self-expression. eSafety research shows children (under-18s) are regularly exposed to age-inappropriate content that can be harmful, including pornography, high-impact violence, disordered eating and deliberate self-harm.

For example, our report ‘Accidental, unsolicited and in your face’ found around 10% of children have accidentally stumbled across online pornography by the age of 10. This climbs to almost 30% by age 13. Two in five young people also said their first exposure to pornography happened when they were searching for something else, such as visiting a gaming site or checking their social media feed.

In addition to pornography, many children report encountering other potentially harmful age-inappropriate content online. For example, eSafety research indicated that 44% of children aged 10 to 17 years had seen content encouraging unhealthy eating or exercise habits, while 32% had seen sexual images or videos online (with 23% seeing this material in the 12 months prior to the 2025 survey). 

The Age-Restricted Material Codes were not written by eSafety. They were drafted by peak bodies representing the online industry. The drafting process included a public consultation on the draft codes in October and November 2024.

As required under Australia’s Online Safety Act, the eSafety Commissioner consulted on the drafts and assessed them to ensure they provided appropriate community safeguards against age-inappropriate content for children.

The Commissioner found they did provide the safeguards and so registered the codes, making them enforceable under the law.

The Age-Restricted Material Codes primarily focus on preventing children (under-18s) from accessing potentially harmful age-inappropriate content, while giving all Australian users tools to help them avoid age-restricted material if they don’t want to see it.

The Social Media Minimum Age law is focused on preventing children under 16 from having accounts on age-restricted social media platforms. You can find more information about the social media age restrictions on the eSafety website.

When the Australian Parliament passed the Online Safety Act with bipartisan support in 2021, it didn’t give itself the power to review industry-drafted codes.

As provided for under the Act, the codes were written by industry and accepted by eSafety. This is a form of industry co-regulation, which is common in Australia (along with self-regulation). Both often occur without Parliament reviewing the text of every code – there are examples in the workplace health and safety sector, telecommunications, and traditional media industries.

What the Act does require is for eSafety to ask the online industry to draft codes that will provide Australians with greater protections from online harms. In cases where the Commissioner is not satisfied draft codes provide appropriate safeguards, eSafety can draft a standard. Any such standard is then subject to a period of Parliamentary scrutiny and the potential for disallowance. That did not occur in this case.

Accessing online content

The Age-Restricted Material Codes apply to types of material that are ‘age-restricted’ under the National Classification Scheme, set up by the Australian Government in 1995. The National Classification Scheme is not administered by eSafety. You can read more about the Scheme on the Australian Classification website.

Viewing pornography has long been legal in Australia but access to it is meant to be restricted to adults.

With the spread of pornography online, making sure children couldn’t access it became increasingly difficult. So when the Australian Parliament passed the Online Safety Act in 2021, it included requirements for the online industry to reduce the risk of children being exposed to material they’re not ready to see. This included requiring industry to develop codes for age-restricted material that create enforceable obligations, which are now coming into effect.

Age restrictions for pornography and other potentially harmful high-impact material are a lot like restrictions for cigarettes and alcohol, which also come with the possibility of age checks. Age checks relating to certain products have long been used to help the community protect children from things that might endanger their immediate safety or harm their long-term health and development.

eSafety uses ‘pornography’ to mean clearly sexual content showing or describing people who are 18 or older, which online service providers are meant to prevent Australians under 18 from accessing. The sexual elements can be real, acted or fake. They can be in images or videos, or in some cases in written content. 

Where clearly sexual content involves children (under-18s), shows extreme violence or non-consensual acts, or where it is shared online without the consent of the people shown, eSafety calls this ‘illegal content’ not ‘pornography’. This is because it’s not meant to be available online to Australians.

Age checks will be required on certain ‘high-risk’ services and platforms that provide adult content and are significant gateways to potentially harmful age-inappropriate content, including:

  • pornography sites
  • app stores, if you want to download apps rated R18+ (including simulated gambling apps)
  • social media services that allow online pornography or self-harm material
  • artificial intelligence (AI) chatbots or generative AI services that are capable of generating sexually explicit, self-harm or high-impact violence material
  • online games rated R18+.

For those services and platforms that have a ‘low’ or ‘medium’ risk, there’s minimal change (or no change) to how you access or use them. Instead, these services are required to help stop children from being shown material they’re not ready to see by offering better tools, options and information to make their experiences safer. This takes the onus off parents and carers and places responsibility where it should be – on the companies themselves.

In many cases, adults should also be offered the opportunity to opt in to new protections against material they don’t want to see.

For services or platforms required to do age checks under the Age-Restricted Material Codes, the method must meet the definition of ‘appropriate age assurance’, as set out in the Head Terms of the Codes, and further explored in eSafety’s regulatory guidance.

Methods used by service providers, or by companies that conduct age assurance on their behalf, can include:

  • matching of photo identification
  • facial age estimation
  • credit card checks
  • digital identity wallets or systems
  • confirmation of a user’s age by their parent
  • use of AI (artificial intelligence) technology to estimate age based on relevant data inputs.

Whatever the chosen method, it must minimise the collection of personal information and comply with privacy principles and Australian law, including the Privacy Act.

The Codes’ Head Terms say that age checks must be ‘technically accurate, robust, reliable and fair’. In our regulatory guidance, eSafety also says that for age checks to be fair, service providers should offer users more than one age-check method, and in any event should not rely on Government-issued ID as the sole method of checking a user’s age.
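As an illustration of those fairness and data-minimisation principles, here is a minimal Python sketch of how a provider might offer several age-check methods and keep only the over-18 outcome. It is a hypothetical sketch, not an implementation prescribed by the codes; every name in it is invented.

```python
# A minimal sketch, assuming a provider that satisfies the 'fair'
# requirement by offering several methods, and data minimisation by
# keeping only the outcome. All names here are hypothetical; the codes
# do not prescribe any particular implementation.
from dataclasses import dataclass
from enum import Enum


class Method(Enum):
    FACIAL_AGE_ESTIMATION = "facial age estimation"
    PHOTO_ID_MATCH = "photo ID matching"
    CREDIT_CARD_CHECK = "credit card check"
    DIGITAL_ID_WALLET = "digital identity wallet"
    PARENTAL_CONFIRMATION = "parental confirmation"


@dataclass
class AgeCheckResult:
    # Data minimisation: store only the over-18 outcome and the method,
    # never the date of birth, ID document or face image behind it.
    is_over_18: bool
    method: Method


def run_age_check(preferred: Method, fallbacks: list[Method]) -> AgeCheckResult:
    """Try the user's preferred method first, then fall back to others,
    so Government-issued ID is never the sole route."""
    for method in [preferred, *fallbacks]:
        outcome = _attempt(method)  # hypothetical vendor integration
        if outcome is not None:
            return AgeCheckResult(is_over_18=outcome, method=method)
    raise RuntimeError("no age assurance method could be completed")


def _attempt(method: Method) -> bool | None:
    # Placeholder: a real provider would call a third-party age assurance
    # service here and discard the raw inputs once the check completes.
    return True if method is Method.FACIAL_AGE_ESTIMATION else None
```

The key design point in the sketch is that the stored result is a yes/no outcome and the method used, never the underlying personal information.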

How often you’ll need to complete an age check depends on the service and, in particular, whether it’s ‘high risk’.

Some services may allow one-off verification that applies across future sessions, or across connected services. Others may require session-by-session or service-by-service confirmation. Providers must balance effectiveness with usability and privacy.

Industry bodies that drafted the Age-Restricted Material Codes said this was the best approach to implementing age assurance measures at this time.

As part of implementing appropriate age assurance, the codes require service providers to take reasonable steps to prevent workarounds. As part of this, eSafety says that service providers are expected to detect whether a user is using a virtual private network (VPN).

In our regulatory guidance, eSafety sets out a number of ways in which service providers can do this, including (but not limited to):

  • using device telemetry, behavioural signals, and other persistent identifiers to detect irregular activity
  • integrating VPN detection services and IP intelligence APIs to flag and restrict high-risk IP ranges
  • implementing systems to detect and investigate suspicious IP switching.

Many service providers already use behavioural signals and device identifiers for the purpose of delivering advertising to users.
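As a rough illustration, the sketch below shows two of the signals listed above: checking a source address against a feed of known VPN ranges, and flagging sessions whose IP changes unusually often. The ranges, window and threshold are invented for the example; real providers typically rely on commercial IP intelligence APIs combined with device and behavioural signals.

```python
# A minimal sketch, assuming a hypothetical feed of known VPN address
# ranges. The ranges, window and threshold below are invented.
import ipaddress
import time
from collections import defaultdict

# Hypothetical feed of ranges attributed to VPN/proxy providers.
KNOWN_VPN_RANGES = [ipaddress.ip_network("198.51.100.0/24")]

# Source IPs recently seen per session, for spotting rapid IP switching.
_session_ips: dict[str, list[tuple[float, str]]] = defaultdict(list)


def looks_like_vpn(ip: str) -> bool:
    """Flag a source address that falls inside a known VPN range."""
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in KNOWN_VPN_RANGES)


def suspicious_ip_switching(session_id: str, ip: str,
                            window_s: float = 600.0, max_ips: int = 3) -> bool:
    """Flag a session whose source IP changes unusually often
    within a short window."""
    now = time.monotonic()
    history = _session_ips[session_id]
    history.append((now, ip))
    recent = {seen_ip for ts, seen_ip in history if now - ts <= window_s}
    return len(recent) > max_ips
```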

Your privacy

The Age-Restricted Material Codes specify that any measures introduced by service providers must comply with privacy principles and all Australian laws, including the Privacy Act.

This includes the Children’s Online Privacy Code currently in development and expected by 10 December 2026. The Office of the Australian Information Commissioner has been developing the Children’s Online Privacy Code since December 2024, in parallel to the development of the Age-Restricted Material Codes.

The Australian Government’s independent Age Assurance Technology Trial also found age assurance technology can be both effective and safeguard privacy.

The Age-Restricted Material Codes do not require any individual data to be shared with eSafety.

All service providers subject to the codes must continue to comply with any applicable privacy laws, including the Privacy Act and its privacy principles. 

Under the codes, there are a number of ways platforms or services can check you're 18 or older.

Different platforms and services can use different methods, so long as they are consistent with the principles of ‘appropriate age assurance’. Service providers must also comply with any applicable privacy laws, including the Privacy Act and its privacy principles.

In our regulatory guidance, eSafety also says that in order for age checks to be fair, service providers should not rely on Government-issued ID as the sole method of checking a user’s age. 

Depending on the service or platform, you may be able to report any concerns about a privacy breach directly to them.

The Office of the Australian Information Commissioner is the Australian privacy regulator. You can learn more about making a privacy complaint on the Office of the Australian Information Commissioner website.

The only time you would need to prove you are 18 or older to use a messaging service is if the service has the sole or main purpose of sharing sexual content or sexual activity. Many major messaging services now offer tools that can automatically blur sexual content if you want that protection in place.

You’ll still be able to use a search engine without logging in.

However, if you’re not logged into an account and your search returns pornographic or high-impact violence images, these will be blurred by default.

If you enter a search relating to suicide or self-harm, any material encouraging this will be downranked, while reliable health information and support services will be promoted.
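To make this concrete, here is a toy Python sketch of the blur-by-default and downranking behaviour just described. It is purely illustrative: real search engines use their own classifiers and ranking systems, and every field, weight and function name here is invented.

```python
# Toy sketch of blur-by-default plus downranking/promotion for
# self-harm-related queries. All fields and weights are invented.
from dataclasses import dataclass


@dataclass
class Result:
    url: str
    score: float
    is_explicit: bool = False          # pornography / high-impact violence
    encourages_self_harm: bool = False
    is_support_service: bool = False   # reliable health or crisis support
    blurred: bool = False


def apply_safety_defaults(results: list[Result], adult_opted_out: bool,
                          self_harm_query: bool) -> list[Result]:
    """adult_opted_out is True only for a logged-in, verified adult
    who has chosen to switch the blurring default off."""
    for r in results:
        if r.is_explicit and not adult_opted_out:
            r.blurred = True           # blurred by default for everyone else
        if self_harm_query:
            if r.encourages_self_harm:
                r.score *= 0.1         # downrank material encouraging self-harm
            if r.is_support_service:
                r.score += 100.0       # promote support services to the top
    return sorted(results, key=lambda r: r.score, reverse=True)
```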

The codes in practice

The codes bring into the online world protections that have long existed in our community offline for material that should be limited to adults.

Other countries are also implementing age assurance measures across a range of online platforms and services to protect children from exposure to pornography and other potentially harmful age-inappropriate content.

Social media services that allow online pornography and/or self-harm material must ensure a user is 18 or older before allowing access to this material.

Services that do not allow online pornography, high-impact violence or self-harm material according to their own terms of service must detect and remove this material, as well as continuing to improve their detection systems over time.

Services must provide better options for all Australians to reduce their risk of exposure to online pornography, self-harm material and high-impact violence.

Services with AI companion chatbot features must also assess and address the risk of children generating sexually explicit material, self-harm material and high-impact violence – either through age checks or strong safety guardrails, depending on what is appropriate.
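As a rough sketch of that ‘age checks or strong safety guardrails’ choice, the example below gates a hypothetical generative model behind a content classifier: restricted requests are refused unless the user is verified as 18 or older. The classifier and model calls are placeholders, not any provider’s actual system.

```python
# Toy gate in front of a generative model. classify() and generate()
# are hypothetical placeholders for a real classifier and model.
RESTRICTED = {"sexually_explicit", "self_harm", "high_impact_violence"}


def handle_prompt(prompt: str, user_verified_18_plus: bool) -> str:
    labels = classify(prompt)  # hypothetical content classifier, below
    if labels & RESTRICTED and not user_verified_18_plus:
        # Guardrail path: refuse rather than generate for an unverified user.
        return "This request can't be completed without age verification."
    return generate(prompt)   # hypothetical model call, below


def classify(prompt: str) -> set[str]:
    # Placeholder keyword classifier; a real service would use a trained model.
    keywords = {"explicit": "sexually_explicit", "self-harm": "self_harm"}
    return {label for word, label in keywords.items() if word in prompt.lower()}


def generate(prompt: str) -> str:
    # Placeholder standing in for a generative model response.
    return f"[model response to: {prompt}]"
```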

First, report the content directly to the service where it was shown. This is often the fastest way to make sure it is dealt with.

You can also make a complaint to eSafety if you suspect a breach of the Age-Restricted Material Codes. The information you provide will help us to identify potential non-compliance and enforce the codes.

Ongoing failure to comply with the codes may result in the service provider facing civil penalties of up to $49.5 million per breach. The obligation to comply with the codes is on industry, and there are no penalties for children who access age-restricted materials.

Please note that although eSafety welcomes information about potential codes breaches, we can’t resolve individual cases or disputes between online service providers and users in relation to the codes. 

eSafety’s Age-Restricted Material Codes target systemic safety failures by online service providers, rather than focusing on individual incidents where children may access age-restricted materials. We will take enforcement action where a service demonstrates systemic non-compliance with the codes.

In 2026, eSafety’s first regulatory priorities under the Age-Restricted Material Codes are:

  • focusing on AI chatbots that can generate sexually explicit and other harmful materials, ensuring they have appropriate safeguards for children
  • ensuring the largest, highest-reach and highest-risk online pornography providers are complying with their obligations to prevent children from accessing their services
  • making sure large gatekeeper services like search engines and app stores are enforcing their own Terms of Service to provide safe experiences to users.

The Age-Restricted Material Codes work alongside eSafety’s other regulatory powers to protect Australians from online harm, including enforcement of the Unlawful Material Codes and Standards when breaches are detected. For example, eSafety recently issued Formal Warnings to OmeTV for systemic non-compliance with these safety obligations.

How will the codes affect me on different online services?


App stores

App stores must take appropriate steps to prevent users who are under 18 from purchasing or downloading apps rated R18+ and ensure apps are appropriately rated. 

If the app store doesn’t already know someone’s age, the user may be asked to confirm it through age assurance.

AI companion chatbots

AI companion chatbots capable of generating sexually explicit, high-impact violence or self-harm material must confirm someone is 18 or older before giving them access to that material.

This check may happen either when a person logs in to the service or at the point of access or generation of that material.

Messaging

There are no age checks required for widely used general messaging services or those attached to social media platforms, such as Facebook Messenger.

Users may be asked to verify their age on adult messaging services that specialise in distributing pornography or self-harm material.

Online gaming

Users will have to complete some form of age assurance to access online games classified R18+ by the Australian Classification Board. For all other games, no age checks are required.

Pornography sites

Users will be asked to confirm their age when accessing age-restricted material on pornography websites and services. 

Clicking a button that says ‘I am 18 years or older’ is no longer sufficient. 

Search engines

For users who are not logged into an account (for example, a Google account), search results containing pornography and high-impact violence will be blurred by default. Logged-in children (under-18s) will still have these same safety defaults in place, and logged-in adults can opt in to these safety defaults, or choose to unblur material if they wish.

For anyone entering searches related to self-harm or suicide, the first result returned will be a referral to appropriate mental health support services.

Social media

Social media services that allow pornography or self-harm material must ensure users are 18+ before giving them access to that material. This may involve age assurance when someone logs in, or at the point of access to that material.

If someone is using a service that – according to its own terms of service – doesn’t allow pornography, high-impact violence or self-harm material, the service now has to take proactive steps to detect and remove that material. If it doesn’t, it may be subject to civil penalties.
