Australia’s new minimum-age social media law is already forcing major platforms to shut down or limit access for millions of accounts identified as belonging to children, according to early figures shared with the government.
Officials say more than 4.7 million under-16 accounts have been “deactivated, removed or restricted” since the law took effect on December 10, with compliance reports coming from 10 major social media platforms covered by the ban.
Australia’s government has described the early results as proof the policy can work. Regulators and ministers have also stressed that enforcement is still in its early stages, with more work needed to stop children from signing up again or getting around age checks.
Millions of accounts targeted in first month
The figures, provided to the Australian government by the 10 platforms covered by the ban, represent the first public look at how large the crackdown has been since the law took effect in December. In its update, the Albanese government said the “world leading” minimum-age law was working, with the accounts deactivated, removed or restricted within days of the law coming into effect on December 10.
Prime Minister Anthony Albanese said the government had acted to help keep children safe online and called it “encouraging” that social media companies were making “meaningful effort” to comply with the new rules.
Communications Minister Anika Wells described the 4.7 million figure as a “huge achievement,” while also saying the system was not expected to be perfect right away.
Which platforms are covered
Under Australia’s law, platforms covered by the ban include Facebook, Instagram, Kick, Reddit, Snapchat, Threads, TikTok, X, YouTube and Twitch.
Messaging services such as WhatsApp and Facebook Messenger are exempt from the ban, according to officials.
Companies that fail to take “reasonable steps” to remove under-16 accounts can face fines of up to 49.5 million Australian dollars (about $33.2 million), officials said.
The government did not provide a platform-by-platform breakdown of how many accounts were removed, restricted, or deactivated.
How age checks can work
Officials said platforms can meet the law’s requirements using several methods, including requesting copies of identification documents.
They can also use a third party to apply age estimation technology to a user’s face, according to the details provided on enforcement options.
Another option described by officials is to make inferences using data already available, such as how long an account has been held.
Meta, which owns Facebook, Instagram and Threads, said that by the day after the ban came into effect it had removed nearly 550,000 accounts believed to belong to users under 16.
Regulator monitoring and public response
Australia’s eSafety Commissioner, Julie Inman Grant, said the reported figure of 4.7 million accounts being “deactivated or restricted” was an encouraging sign.
Inman Grant also said the 10 biggest companies covered by the ban had complied and reported removal figures to the regulator on time.
Government messaging has emphasized ongoing oversight, with eSafety to continue monitoring platforms to ensure they are meeting their obligations.
The government also said the figures were “preliminary” and came from a first tranche of information provided to the eSafety Commissioner.
The eSafety Commissioner said companies were expected to shift from removing accounts to preventing children from creating new accounts or otherwise bypassing the prohibition.
The government said Australians were actively seeking information about the changes, noting the eSafety website recorded more than one million visits since the launch of an education campaign on the social media minimum age.
Debate over privacy, safety, and enforcement
Officials said the new law was enacted amid fears about the effects of harmful online environments on young people, and it has sparked debate in Australia over technology use, privacy, child safety and mental health.
Inman Grant said about 2.5 million Australians are aged between 8 and 15, and cited past estimates suggesting 84% of 8- to 12-year-olds had social media accounts.
She said the total number of accounts across the 10 platforms was not known, making it difficult to gauge what share of children’s accounts the enforcement has actually reached.
The law has been popular with parents and child-safety campaigners, while privacy advocates and some groups representing teenagers have opposed it.
Some teenagers have said they were able to fool age-assessment technology or were helped by parents or older siblings to get around the rules.
Opposition lawmakers have suggested young people may be bypassing the ban or shifting to other apps, while Inman Grant said her office saw a spike in alternative-app downloads when the ban began but not a spike in usage.
What could come next
Inman Grant said the regulator planned to introduce “world-leading AI companion and chatbot restrictions” in March, without providing details.
Albanese said the move to keep children off major social media platforms was “working” despite earlier skepticism and described the outcome as a source of national pride.
Wells said that while it is early, each deactivated account could mean more time for a young person to build community and identity offline.
The government and regulator have both signaled the next phase will focus less on one-time account removal and more on stopping children from signing up again and testing where companies may still fall short.
