From 10 December 2025, age-restricted social media platforms must take reasonable steps to prevent children under 16 from having accounts. This follows amendments made to the Online Safety Act 2021 in late 2024 to introduce a social media minimum age (SMMA) framework (Part 4A of the Act).

This will create significant changes in how children, young people and their families engage with social media. The legislation aims to address growing concerns about the impact of social media on the mental health and wellbeing of children and young people.

Schools will need to consider the implications of the social media delay and how it may affect their policies, planning and support for students in their care.

Why are under-16s being ‘banned’ from social media? 

The eSafety Commissioner’s position is that it is not a ban but a delay to having accounts. The eSafety Commissioner’s factsheet outlines:

Age-restricted platforms won’t be allowed to let under-16s create or keep an account. That’s because being logged into an account increases the likelihood that they’ll be exposed to pressures and risks that can be hard to deal with. These come from social media platform design features that encourage them to spend more time on screens, while also serving up content that can harm their health and wellbeing.

For example, the pressure to view disappearing content and respond to a stream of notifications and alerts has been linked to harms to health, including reduced sleep and attention and increased stress levels.

While most platforms currently have a minimum age of 13 for account holders, delaying account access until 16 will give young people more time to develop important skills and maturity. It’s breathing space to build digital literacy, critical reasoning, impulse control and greater resilience. 

It also means there’s extra time to teach under-16s about online risks and the impacts of harms, as well as how to stay safer online and seek help when they need it. This will give young people a better chance to prevent and deal with issues once they turn 16 and can have full social media access.

What does the legislation mean?

The legislation prohibits children under 16 years of age from creating or holding accounts with major social media platforms such as Instagram, Snapchat, Facebook, TikTok and YouTube. Social media companies that do not enforce these age limits may face significant penalties.

Importantly, there are no penalties for children who may gain access to an age-restricted social media platform, or for their parents or carers.

Legislative rules

On 29 July 2025, the Minister for Communications made the Online Safety (Age‐Restricted Social Media Platforms) Rules 2025 (the Rules), which exclude certain services from the minimum age obligation.

Under the Rules, the following services will be excluded from the minimum age obligation:

  • Messaging, email, voice calling or video calling
  • Online games
  • Services that primarily function to enable the sharing of information about products or services
  • Professional networking and professional development services
  • Education and health services

These exclusions strike a balance between protecting young people from the harms associated with social media use and allowing ongoing access to services that are essential for communication, education and health. 

The Rules also maintain access to services that are currently known to pose fewer risks of online harm, particularly harms arising from addiction, problematic use, unhealthy social comparison and exposure to content that is inappropriate for children.

Impacts and considerations for schools

AISNSW has developed the following factsheet to support schools in responding to the new social media age restrictions for students under 16. It outlines key considerations for student wellbeing, communication strategies and policy adjustments to help schools navigate the changes.

Additional resources

eSafety Commissioner