Reflections on eSafety's Regulatory Guidance for the Online Safety Amendment (Social Media Minimum Age) Bill 2024
- William Cook

- Oct 24
- 4 min read
The recently released regulatory guidance from eSafety represents a significant step in implementing Australia's Social Media Minimum Age legislation. Having reviewed the guidance alongside the Age Assurance Technology Trial findings, I think the framework addresses many of the concerns raised by young people and industry stakeholders. That said, several areas need deeper consideration as we move towards implementation.

Comprehensive Engagement with Stakeholder Concerns
The guidance demonstrates genuine effort to incorporate diverse perspectives. eSafety's consultation with over 345 people representing more than 160 organisations - including direct engagement with children and young people - reflects a commitment to inclusive policy development. The principles-based approach, rather than prescriptive mandates, acknowledges that platforms and their user populations vary considerably.
The incorporation of findings from the Age Assurance Technology Trial strengthens the guidance considerably. The trial's conclusion that age assurance can be done in Australia "privately, efficiently, and effectively" provides empirical grounding for regulatory expectations. Critically, the guidance recognises that "there is no one-size-fits-all solution" - a realistic acknowledgment of implementation complexity that the trial reinforced across its assessment of 48 vendors and over 60 distinct technologies.
The Critical Need for Human-Led Appeals
Whilst the guidance mandates review mechanisms, it doesn't sufficiently emphasise human-led appeal processes. Age assurance systems, however sophisticated, produce errors. Age estimation methods show reduced accuracy for certain demographics, age inference relies on probabilistic conclusions, and even verification systems can fail due to document quality or technical issues.
Young people legitimately entitled to access these platforms must not be left cycling endlessly through automated systems. The guidance mentions "human in the loop or human oversight" for review processes but lacks specificity about response timeframes, escalation pathways, and staff training requirements. Without robust human intervention, compliant users - particularly those from marginalised backgrounds or with accessibility needs - risk indefinite exclusion.
Platform providers should be required to specify maximum response times for human review, provide clear explanations for adverse determinations, and ensure reviewers receive training in cultural competency, disability awareness, and trauma-informed communication. These safeguards would transform review from a tick-box exercise into a meaningful protection for end-users' rights.
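To make this concrete, the sketch below shows one way threshold-based routing to human review might work. It's purely illustrative: the thresholds, field names, and three-day review deadline are my own assumptions, not anything the guidance or the legislation prescribes.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Illustrative values only - the guidance prescribes no specific
# thresholds or deadlines; these are assumptions for the sketch.
ESTIMATED_AGE_THRESHOLD = 18.0             # buffer above the minimum age of 16
CONFIDENCE_THRESHOLD = 0.90
HUMAN_REVIEW_DEADLINE = timedelta(days=3)  # hypothetical maximum response time


@dataclass
class AgeEstimate:
    estimated_age: float  # output of a probabilistic age-estimation model
    confidence: float     # model confidence, in [0, 1]


@dataclass
class Decision:
    outcome: str                        # "granted" or "human_review"
    reason: str                         # explanation owed to the user
    review_due: datetime | None = None  # deadline, if escalated


def route_access_request(estimate: AgeEstimate) -> Decision:
    """Grant clear-cut cases automatically; escalate uncertain ones to a
    trained human reviewer instead of declining them outright."""
    if (estimate.estimated_age >= ESTIMATED_AGE_THRESHOLD
            and estimate.confidence >= CONFIDENCE_THRESHOLD):
        return Decision("granted", "Estimated age comfortably above minimum.")
    # Borderline or low-confidence results go to a human with a hard
    # deadline, not back through the same automated loop.
    return Decision(
        "human_review",
        "Automated estimate inconclusive; escalated to a trained reviewer.",
        review_due=datetime.now() + HUMAN_REVIEW_DEADLINE,
    )
```

The point of the design is the middle path: an inconclusive automated result triggers escalation with a deadline, rather than either silent denial or another pass through the same automated check.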
Contextualising Age Assurance Within Child Protection Goals
The guidance risks reducing compliance to a technical exercise focused narrowly on age-gating mechanics. Whilst determining who may access a platform is necessary, the legislation's fundamental purpose extends well beyond access control. We're addressing documented harms: algorithmic manipulation, exposure to harmful content, predatory behaviour, and platform design features that negatively impact mental health and development.
The guidance would benefit from explicitly framing age assurance as one component of comprehensive safety architecture. Providers should demonstrate how age-gating integrates with content moderation, algorithmic transparency, design choices prioritising wellbeing, and proactive harm detection. Without this contextualisation, compliance becomes performative - platforms may satisfy technical requirements whilst failing to address underlying risks.
This framing matters particularly for us as young people. Understanding that these measures exist not to arbitrarily restrict our autonomy, but to mitigate specific, evidence-based harms, helps build buy-in and reduces resistance. The guidance should articulate this rationale explicitly, supporting providers in communicating purpose rather than merely process.
Addressing Vagueness in the Guidance
Several sections employ language that lacks operational precision. Terms like "reasonable steps," "appropriate measures," and "sufficient confidence" appear throughout without concrete benchmarks or worked examples. Whilst principles-based regulation appropriately allows flexibility, excessive vagueness creates uncertainty for providers and enforcement challenges for regulators.
The document would benefit from case studies, tiered recommendations based on platform characteristics, and worked examples demonstrating compliance scenarios. The guidance does include one case study about a fictitious platform called ChatterTrail, which is helpful, but more examples across different platform types and risk profiles would support consistent implementation. Greater specificity about risk assessment frameworks would be particularly useful.
Privacy Protections and Age Verification Data Handling
Perhaps the most underdeveloped aspect concerns privacy protections in age verification processes. Age assurance inherently involves collecting, processing, and potentially storing sensitive personal information. Young people are particularly vulnerable to privacy harms, yet the guidance provides insufficient detail on data handling, retention, security, and user rights.
The guidance states that providers must comply with Privacy Act obligations, but this cross-reference alone isn't adequate. Providers require explicit direction on minimum data collection principles, maximum retention periods, encryption and security requirements, prohibitions on secondary use or commercialisation of verification data, and mandatory transparency about collection practices.
Different verification methods present distinct privacy risks. Document-based verification exposes identity information; biometric analysis raises surveillance concerns; third-party services create additional data-sharing vulnerabilities. Providers should be required to conduct privacy impact assessments for their chosen methods and demonstrate Privacy Act compliance alongside SMMA compliance.
The intersection of age verification and privacy becomes especially fraught given that the legislation aims to protect young people, yet the verification process itself may expose us to new privacy risks. The trial found some providers were "over-anticipating the eventual needs of regulators", retaining personal information in case it was later sought for investigations. Clearer direction on appropriate retention practices would help providers avoid this pitfall whilst ensuring protective measures don't inadvertently harm those they're designed to protect.
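For illustration, the sketch below shows what data minimisation and bounded retention could look like in practice. The record fields and the 30-day window are my own assumptions for the example - neither the guidance nor the Privacy Act specifies them.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical retention window; the guidance sets no specific figure.
MAX_RETENTION = timedelta(days=30)


@dataclass
class VerificationRecord:
    """The only data retained after verification: the outcome, the
    method used, and a timestamp. No document images, no date of birth,
    no biometric templates."""
    over_minimum_age: bool
    method: str            # e.g. "document", "estimation", "inference"
    verified_at: datetime


def minimise(raw_result: dict) -> VerificationRecord:
    """Discard the raw verification payload (document scans, facial
    images, exact estimated age) and keep only the age-gate outcome."""
    return VerificationRecord(
        over_minimum_age=raw_result["estimated_age"] >= 16,
        method=raw_result["method"],
        verified_at=datetime.now(),
    )


def purge_expired(records: list[VerificationRecord]) -> list[VerificationRecord]:
    """Delete records past the retention window - a guard against the
    'over-anticipating regulators' pitfall the trial identified."""
    cutoff = datetime.now() - MAX_RETENTION
    return [r for r in records if r.verified_at >= cutoff]
```

The design choice worth noting is that deletion is the default, and nothing in the retained record could reconstruct a user's identity or exact age if it leaked.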
Moving Forward
eSafety's regulatory guidance represents substantive effort to operationalise challenging legislation. The acknowledgment of diverse stakeholder perspectives, incorporation of trial findings, and principles-based approach demonstrate thoughtful policy development. However, strengthening provisions around human appeals, explicitly connecting age verification to broader protective goals, providing concrete implementation examples, and comprehensively addressing privacy considerations would significantly enhance the framework.
As a young person, I want regulation that genuinely protects us without treating us as problems requiring management. This demands viewing us as stakeholders with agency, ensuring our voices shape implementation, and recognising that our safety and rights aren't competing interests; they must advance together. The guidance makes progress towards these goals, but meaningful gaps remain. Closing these gaps through ongoing refinement and responsive adjustment will determine whether this regulatory framework achieves its protective intent whilst respecting the rights and dignity of young Australians.
Bibliography
Age Check Certification Scheme (ACCS), Age Assurance Technology Trial – Report (Department of Infrastructure, Transport, Regional Development, Communications, Sport and the Arts, 2025).
eSafety Commissioner, Social Media Minimum Age Regulatory Guidance (2025).
Explanatory Memorandum, Online Safety Amendment (Social Media Minimum Age) Bill 2024 (Cth).
Online Safety Act 2021 (Cth).
Privacy Act 1988 (Cth).