Key Highlights
- Pornography websites and other online platforms must now block under-18s in Australia.
- New rules expand child safety measures beyond the December 10 ban on under-16s joining social media.
- Non-compliance can result in penalties of up to Aus$49.5 million (US$35 million) per breach.
- Age verification methods beyond a simple 'I am 18 years or older' click are now mandatory.
- The regulations cover porn sites, search engines, app stores, gaming providers, and generative AI systems.
Australia's online regulator has issued a stern warning to pornography websites, mandating that they block individuals under the age of 18 starting Monday, March 10, 2026. This move is part of sweeping new restrictions aimed at enhancing child protection online. Some websites had already begun implementing partial measures ahead of the official deadline, barring non-members and refusing new registrations on Friday.
These new regulations significantly expand Australia's existing online child safety framework. This follows the government's groundbreaking decision on December 10, which prohibited children under 16 from joining social media platforms. The current crackdown specifically targets children's access to "age-inappropriate content," which includes pornography, graphic violence, and content related to suicide and eating disorders.
The stringent rules apply to a broad range of online services. This includes pornography websites, search engines, app stores, gaming providers, and even generative artificial intelligence (AI) systems, such as chatbots. eSafety Commissioner Julie Inman Grant emphasized the seriousness of the enforcement, stating, "Make no mistake, where we see failures or foot-dragging, we will hold companies to account."
Companies found to be in breach of these regulations face substantial penalties, with potential fines reaching up to Aus$49.5 million (US$35 million) for each violation. The eSafety Commissioner's office clarified that users attempting to access age-restricted material on pornography websites and services will be required to confirm their age. Crucially, a simple click of a button stating 'I am 18 years or older' will no longer be considered sufficient verification, aligning with similar international efforts.
New Safeguards for Children Online
Commissioner Inman Grant highlighted that society has long recognized the necessity of age barriers to shield children from harm. "We don't allow children to walk into bars or bottle shops, adult stores or casinos, but when it comes to online spaces where they are spending a lot of their time, there are no such safeguards," she stated. "But that changes for Australian kids."
Under the new directives, the industry is obligated to implement consistent standards across all of its services to prevent children from inadvertently encountering harmful content. AI companion chatbots capable of generating sexually explicit, violent, or self-harm material must now verify users' ages. Similarly, app stores and online gaming platforms are mandated to restrict under-18s' access to adult-only content.
For search engine users who are not logged in, such as those using Google without an account, results containing pornography and high-impact violence will be blurred by default. For searches related to suicide or eating disorders, the initial results will be links to appropriate mental health support services.
The regulator has committed to actively monitoring and assessing compliance with these rules, vowing to take enforcement action against persistent non-compliance. Commissioner Inman Grant concluded, "No piece of regulation will eliminate all risks and harms all at once, but these codes create meaningful protections for children across the tech ecosystem. The government's commitment to implementing a digital duty of care will also further strengthen protections in the future."