Snapchat has blocked or disabled 415,000 Australian accounts belonging to users under the age of 16, the company announced this week, as enforcement of Australia’s world-first social media ban for minors continues to reshape the digital landscape. The sweeping legislation, which took effect on December 10, mandates that platforms prevent children younger than 16 from holding accounts, with companies facing fines of up to A$49.5 million (US$34 million) if they fail to take reasonable steps to enforce the restrictions.
The disclosure comes as Australia’s eSafety Commissioner reported that major technology companies have collectively blocked approximately 4.7 million accounts since the law’s implementation, delivering what regulators have called “significant outcomes” for child online safety. Snapchat’s figures, current through the end of January, illustrate the massive scale of enforcement required under the new framework that applies to platforms including Meta, TikTok, and YouTube.
Age Verification Technology Faces Accuracy Limits
Despite the substantial account removals, Snapchat warned that significant gaps remain in the enforcement mechanism. The company noted that current age estimation technology is accurate only to within a margin of two to three years, creating practical challenges for platforms attempting to verify user ages reliably.
“In practice, this means some young people under 16 may be able to bypass protections, potentially leaving them with reduced safeguards, while others over 16 may incorrectly lose access,” the company stated in an online announcement. Snapchat emphasized that it continues to lock additional accounts daily as its detection systems identify underage users.
The accuracy limitations of existing age verification systems have emerged as a central concern for platforms trying to meet the compliance requirements while minimizing unintended disruption for legitimate users.
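To make the trade-off concrete, the following is a minimal, purely hypothetical sketch (in Python) of how a two-to-three-year estimation margin collides with a hard 16-year cutoff; it is not Snapchat’s actual system, and the cutoff, margin value, and function names are illustrative assumptions only:

```python
# Hypothetical illustration only -- not any platform's real code.
# Shows why an age estimate with a +/- MARGIN-year error band forces
# a three-way decision around a hard legal cutoff.

CUTOFF = 16   # minimum age under the Australian law
MARGIN = 3    # assumed worst-case estimation error, in years

def gate_account(estimated_age: float) -> str:
    """Classify an account from an age estimate with a known error band."""
    if estimated_age + MARGIN < CUTOFF:
        return "block"    # even the most generous estimate is under 16
    if estimated_age - MARGIN >= CUTOFF:
        return "allow"    # even the harshest estimate is 16 or over
    return "verify"       # ambiguous band: the estimate alone cannot decide

if __name__ == "__main__":
    for age in (12, 14.5, 16, 18, 19):
        print(f"estimated age {age:>4}: {gate_account(age)}")
```

Under these assumptions, every estimate between 13 and 19 lands in the ambiguous band, which is the gap Snapchat describes: some under-16 users slip through while some adults are wrongly locked out.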
Calls for App Store-Level Verification
Snapchat has joined Meta in urging Australian authorities to implement additional safeguards by requiring app stores to verify user ages before allowing downloads. The company argued that a centralized verification system at the app-store level would provide more consistent protection and raise the barrier to circumventing the law.
The proposal represents a shift in how platforms view responsibility for age verification, suggesting that distributing the verification burden across the technology ecosystem could improve enforcement outcomes. By mandating age checks when users initially download applications, proponents argue that underage users would face additional obstacles before ever reaching platform-level registration systems.
Platform Questions Scope of Ban
While acknowledging Australia’s objectives and expressing support for protecting people online, Snapchat explicitly stated that it does not believe its platform should fall under the social media ban legislation. The company emphasized that Snapchat functions primarily as a messaging application used by young people to maintain connections with close friends and family rather than as a traditional social media platform.
“We do not believe that cutting teens off from these relationships makes them safer, happier, or otherwise better off,” the company said, articulating its position that an outright ban represents the wrong approach to addressing youth online safety concerns. The stance underscores ongoing tensions between regulatory frameworks designed to protect minors and platform perspectives on the social value of their services.
The debate reflects broader questions about how policymakers and technology companies can balance child protection with the legitimate communication needs of teenagers in an increasingly digital society.
Continued Enforcement Expected
With fines reaching nearly A$50 million for non-compliance, platforms face substantial financial pressure to demonstrate good-faith enforcement efforts. The 4.7 million accounts blocked across all major platforms suggest that technology companies are investing heavily in compliance mechanisms, though questions about effectiveness and unintended consequences remain active points of discussion.
Snapchat’s disclosure of specific enforcement numbers provides regulators and the public with greater transparency about implementation progress, even as the company continues advocating for alternative approaches to achieving child safety objectives.
