Search engines will soon start filtering adult content under new eSafety rules
Search engines in Australia will soon have to blur pornographic and violent images in some cases, to reduce the chance that children accidentally encounter this content.
This is one of several rules outlined in a new online safety code covering internet search engines that comes into force on December 27. Here's what you need to know.
In 2022, Australia's online safety regulator, eSafety, surveyed more than 1000 Australians aged 16 to 18.
The research found that one in three were under age 13 when they were first exposed to pornography. This exposure was "frequent, accidental, unavoidable and unwelcome," with content described by young people as "disturbing" and "in your face".
The eSafety Commissioner, Julie Inman Grant, has said "a high proportion" of accidental exposure is through search engines, which are "the primary gateway to harmful content".
The new code was co-developed by the Digital Industry Group Inc, an industry association representing tech companies including Google, Meta and Microsoft, and the Communications Alliance, the peak body of the Australian telecommunications industry.
The code was announced in July 2025, but has been in development since July 2024. A single breach could result in fines of up to $49.5 million.
The code requires providers of internet search engine services, such as Google and Microsoft (which owns Bing), to "implement appropriate age assurance measures for account holders" in Australia by June 27, 2026.
Age checks will identify whether search engine account holders are