Australia’s Teen Ban: 5 Giants Now Under Investigation
On Dec. 10, 2025, Australia became the first country to impose comprehensive age restrictions on social media, banning under-16s from major platforms. The ban, originally passed as the Online Safety Amendment (Social Media Minimum Age) Bill 2024 on Nov. 29, 2024, is now at the center of an investigation into five social media platforms over failures to enforce the law.
Australia’s Warning To Teens
When Wednesday morning arrived, the day the bill took effect, teens woke up wondering whether their social media accounts would vanish from the online world as they knew it. Ten platforms were required to block under-16s from signing up and to demand proof of age: Facebook, Threads, Reddit, X, Instagram, Snapchat, TikTok, Twitch, YouTube and Kick.
In his formal address on the bill, Prime Minister Anthony Albanese declared victory, calling the ban Australia’s greatest reform and one set to lead the globe to action. Some youths nonetheless managed to stick around and skirt the government’s attempt to curb their social media use, commenting on the Prime Minister’s TikTok post about the ban, “It didn’t work bud.”
Communications Minister Anika Wells pushed back on the idea that teens could easily evade the new rules.
“Just because they might have avoided it today doesn’t mean they will be able to avoid it in a week’s time or a month’s time. The social media platforms have to go back and routinely check under-16 accounts,” Wells said, according to The Sydney Morning Herald.
Communications Minister Starts Investigation
According to The Guardian, five social media platforms are now facing investigation over what the eSafety Commissioner has flagged as major gaps, citing potential non-compliance with the law. The first compliance report, released Tuesday, showed that platforms had taken steps leading to the closure of nearly 5 million under-16 social media accounts. While the regulator views this as progress, it says significant problems remain four months after the law took effect.
Among the concerns cited are extreme safety gaps: platforms allowing self-identified minors repeated attempts at age verification, failing to restrict those minors from creating new accounts, and providing inadequate systems for reporting underage users. eSafety Commissioner Julie Inman Grant says the regulator is moving to an enforcement posture, with platforms facing civil penalties of up to $50 million for non-compliance.
Enforcement will require evidence that platforms failed to implement proper processes to keep minors from holding or creating accounts. Wells noted that platforms have given users limitless opportunities to pass facial age checks, including letting children scan older siblings to clear the screening. The report also found cases in which platforms had neglected to ask existing users to verify their age, which explains how some children kept their accounts: they were never prompted to complete the check.
The 5 Social Media Platforms
The five platforms facing scrutiny are Facebook, Instagram, TikTok, Snapchat and YouTube. Despite the loopholes, the ban appears to be having an effect. In an eSafety poll of 900 parents, approximately 31% reported that their children were still using social media accounts after the law passed, down from 49% before the ban was enforced.
This information matters because many countries are beginning to weigh similar laws, and Australia is being heralded as a trailblazer in an internet era where others seek to build what they see as a safer online environment. Norway is seeking to raise its minimum age to 15 through an amendment to the Personal Data Act. The United Kingdom’s Online Safety Act created age assurance, regulating risky features rather than banning access. France is advancing what it calls a “Digital Majority,” and under the Miller Bill a testing phase of age-verification tokens is being explored to let users prove their age without presenting ID.
The United States has no federal ban, but 2025 saw some states implement strict laws of their own. Florida (HB 3) and Utah (SB 152 & HB 311) have both created a “digital bedtime,” along with required parental consent for those under 18. California (SB 976) seeks to require full age verification in an effort to combat addictive feeds.
These examples offer only a snapshot of the wider international push to regulate minors’ online activity.
Australia Becomes a Test Case for Global Regulators
With millions of accounts already removed and more investigations underway, Australia’s rollout is being closely watched by governments weighing their own age‑verification laws. As more countries consider how far regulators should go in reshaping the online experiences of minors, Australia’s enforcement push may become a test case for whether sweeping age restrictions can actually work — and whether platforms can be held to account when they don’t.
