Australia’s new age restrictions for social media

by Jacqueline Monk

From 10 December 2025, Australia will enforce a world-first age restriction on social media use. The new law applies not only to domestic platforms but also to international platforms serving Australian users. Covered social media providers must take reasonable steps to ensure individuals under the age of 16 do not hold an account. Parental consent is not an exception to this restriction.

Age-restricted platforms

Under the revised framework, a service qualifies as an “age-restricted social media platform” if it meets all of the following criteria:

  • Its sole or significant purpose is to enable online social interaction between two or more end-users;
  • The platform enables end-users to connect with or interact with others;
  • The platform allows end-users to publish content on the service; and
  • Content on the platform is accessible to, or distributed to, end-users in Australia.

Whether enabling online social interaction is a sole or significant purpose of a service will depend on the platform's specific features and how those features shape user engagement, behaviour, and experience. For instance, the eSafety Commissioner has indicated that YouTube Kids will not be classified as an age-restricted social media platform because it lacks certain interactive features found on other platforms.

While platforms such as Facebook, Instagram, Snapchat, TikTok, X, and YouTube are expected to fall within this definition, the list is not exhaustive. Social media operators are responsible for assessing their compliance obligations and implementing suitable measures accordingly.

Excluded services

Platforms whose main function is messaging or online gaming are explicitly excluded from the new regime. Similarly, platforms focused primarily on business interactions are not subject to these restrictions. 

For example, the eSafety Commissioner has preliminarily determined WhatsApp qualifies as an excluded service because its principal offering is communication by messaging, email, or voice/video calling.

Obligations

Providers must take reasonable measures to prevent users under the age of 16 from holding accounts on age-restricted social media platforms. Although the legislation does not dictate specific age assurance methods, guidance from the eSafety Commissioner indicates providers should address risks of circumvention by minors. Providers are obligated to identify and deactivate accounts belonging to users under 16 years of age, conducting this process “with kindness, care, and clear communication” as stipulated in regulatory guidance.

Self-declaration is insufficient for age verification purposes. Providers are encouraged to reference relevant international standards when developing their age assurance procedures – for example, the IEEE 2089-2021 Standard for an Age Appropriate Digital Services Framework Based on the 5Rights Principles for Children.

Penalties

There are no penalties imposed on minors who access restricted platforms. However, platforms that fail to enforce reasonable access controls may incur substantial penalties, including fines up to AUD 49.5 million for corporations.

Privacy

To verify user age, regulated platforms will need to collect personal information, or engage a third party to do so on their behalf. Such data must not be used for any purpose other than age verification without the explicit consent of the individual concerned. Non-compliance constitutes an interference with privacy under the Privacy Act 1988 (Cth) and may attract enforcement action and penalties from the Office of the Australian Information Commissioner.

Consent for alternate uses of age verification data must meet an “unambiguous” standard – a stricter requirement than typically applied under the Privacy Act. Consequently, consent cannot be obtained via pre-selected settings or opt-outs. Platforms must ensure robust data management protocols to guarantee age verification data remains segregated and is not inadvertently repurposed for analytics, targeted advertising, or similar activities.

Next steps

Service providers should:

  • Evaluate whether they fall under the classification of an “age-restricted social media platform”;
  • Ensure the adequacy and legal compliance of their age assurance processes;
  • Audit existing accounts and remove any held by users under the age of 16;
  • Implement stringent data protection policies to restrict age verification data usage solely for verification; and
  • Where third-party providers are engaged for age verification, incorporate protective contractual provisions and indemnities regarding potential regulatory liabilities or losses.

Jacqueline Monk is a senior commercial lawyer at Walter Baden, advising on compliance matters in the Australian market, including privacy, modern slavery reporting, and data breaches. Drawing on her M&A experience, she also supports the corporate team with due diligence and with assessing commercial risks and opportunities in target businesses.
