Australia's Under-16 Social Media Ban: Compliance Gaps and Parental Challenges
Australia's pioneering ban on social media for users under 16 faces significant compliance challenges, as revealed by Guardian Australia. Despite the removal of nearly 5 million accounts, parents report difficulties in getting platforms like Snapchat to act on reports of underage users. The case of a Tasmanian mother whose 14-year-old son's account remained active because his self-declared age was 25 highlights systemic issues with age verification. eSafety officials express concern about platform compliance and are investigating regulatory breaches while parents bear the burden of enforcement.
Australia's world-first social media ban for users under 16, implemented in December 2025, represents a landmark attempt to protect children online. However, three months into enforcement, significant compliance gaps have emerged that threaten the policy's effectiveness. As reported by Guardian Australia, parents attempting to enforce the ban face resistance from social media platforms, raising questions about the viability of current age verification systems and the practical burden placed on families.

The Compliance Challenge: Platform Resistance to Parental Reports
The core issue centers on how social media platforms handle parental reports of underage users. According to eSafety commissioner Julie Inman Grant, parents were specifically directed to report their teenagers' accounts directly to platforms when the ban was implemented. This approach was intended to help platforms identify where their age detection systems had failed and improve their classifiers. However, recent cases demonstrate that this reporting mechanism is not functioning as intended.
In one documented case, a Tasmanian mother identified as Amanda reported her 14-year-old son's Snapchat account through the platform's official channels. Despite providing clear evidence that the account holder was under 16, Snapchat refused to take action because the teenager had self-declared his age as 25 when creating the account. The company stated that without behavioral signals indicating an underage user, they could not verify the violation and expressed concern that acting on parental reports without verification could lead to malicious false reports against legitimate users.
Technical Limitations of Current Age Verification Systems
The Australian social media ban relies on a combination of age verification methods, each with significant limitations. Self-declared age, the most common method, is easily circumvented by children entering false birthdates. Facial age estimation technology, while more sophisticated, raises privacy concerns and may not be applied consistently across platforms. The "waterfall" approach recommended by eSafety—where platforms layer multiple verification methods throughout the user journey—appears to be inconsistently implemented.
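The waterfall approach can be pictured as a cascade of checks, cheapest and least intrusive first, escalating only when a signal is inconclusive. The sketch below is purely illustrative: the `AgeSignal` shape, confidence thresholds, and method names are assumptions for explanation, not any platform's actual implementation.

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class AgeSignal:
    estimated_age: Optional[int]  # None means the method produced no estimate
    confidence: float             # 0.0 (no trust) to 1.0 (high trust)

def waterfall_age_check(methods: list[Callable[[], AgeSignal]],
                        min_age: int = 16,
                        min_confidence: float = 0.8) -> str:
    """Run verification methods in order of increasing cost/intrusiveness.

    Returns "allow", "block", or "escalate" (no method was conclusive,
    e.g. the platform might then request ID documents or manual review).
    """
    for method in methods:
        signal = method()
        if signal.estimated_age is None or signal.confidence < min_confidence:
            continue  # inconclusive: fall through to the next, stronger method
        return "allow" if signal.estimated_age >= min_age else "block"
    return "escalate"

# Hypothetical signals for the case described above: a self-declared age of 25
# (low trust, since it is easily falsified) and a facial estimate of 14.
self_declared = lambda: AgeSignal(estimated_age=25, confidence=0.3)
facial_estimate = lambda: AgeSignal(estimated_age=14, confidence=0.9)

print(waterfall_age_check([self_declared, facial_estimate]))  # "block"
```

The key design point is that a low-confidence self-declaration never settles the question on its own: it is overridden by the next method in the cascade, which is what a report-triggered re-check is meant to achieve.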

Snapchat has publicly acknowledged these technical challenges, stating that better solutions should be implemented at operating system, device, or app store levels rather than relying on individual platforms. This position highlights the fundamental tension between regulatory expectations and technical feasibility in age verification. The company's spokesperson noted that while they assess every report of potential underage users, they only disable accounts found to be non-compliant after investigation.
Regulatory Response and Enforcement Actions
eSafety has expressed serious concerns about platform compliance with the social media ban. A spokesperson stated that the regulator has received multiple reports from parents unable to get their teenagers' accounts removed and is actively engaging with industry about compliance expectations. According to regulatory guidance, platforms should provide accessible reporting pathways and implement additional age verification when reports are received, rather than relying solely on initial age inference signals.
The regulator is currently progressing with investigations into potential non-compliance, though specific platforms under investigation have not been named. This regulatory pressure comes alongside a News Corp survey revealing that 70% of nearly 300 teenagers aged 10-16 had not been removed from social media when the ban took effect, suggesting widespread non-compliance across the industry.
Parental Burden and Practical Implications
The implementation challenges have shifted a significant enforcement burden onto parents. Amanda's experience demonstrates the frustration families face when attempting to comply with the ban. Despite government promises that the policy would make parenting easier, the reality involves navigating complex reporting systems, providing identification documentation, and facing platform resistance. This burden falls disproportionately on families without the technical knowledge or persistence to challenge platform decisions.

The case also raises important questions about privacy and identification requirements. After Guardian Australia's inquiries, Snapchat contacted Amanda to request ID documentation for her son, ultimately resulting in account removal. This process highlights the tension between effective age verification and privacy protection, particularly regarding children's personal information.
Future Directions and Policy Considerations
The ongoing challenges with Australia's social media ban point to several necessary developments. First, platforms must improve their response to legitimate parental reports, potentially through more sophisticated age verification technologies. Second, regulatory guidance needs clearer enforcement mechanisms and consequences for non-compliance. Third, the government may need to reconsider where age verification responsibility lies—whether at platform, device, or app store level.
eSafety has initiated a longitudinal study tracking over 4,000 teenagers and parents to measure the ban's effectiveness in coming years. This research will provide crucial data on whether the policy achieves its intended protective outcomes or creates unintended consequences, such as isolating children with disabilities or driving young users to less regulated platforms.
As Australia continues to pioneer digital safety regulation, the implementation challenges of the under-16 social media ban offer important lessons for other nations considering similar measures. The balance between protection, privacy, and practicality remains delicate, requiring ongoing collaboration between regulators, platforms, and families to develop workable solutions that genuinely protect young people online.
