Australia’s biggest social media platforms, Facebook, Instagram, Snapchat, TikTok, and YouTube, are under the microscope after a new report from the eSafety Commissioner revealed “major gaps” in compliance with the nation’s under-16 social media ban.
The legislation, which came into effect on December 10, requires all Australians under 16 to be blocked from accessing platforms including TikTok, Snapchat, Instagram, Facebook, Threads, X, YouTube, Reddit, Twitch, and Kick. The responsibility for enforcing the law falls on the platforms themselves.
In a statement to B&T this afternoon, a Meta spokesperson confirmed the company is “committed to complying with Australia’s social media ban and working constructively with eSafety and the government.”
“We’ve also been clear that accurately determining age online is a challenge for the whole industry, particularly at the age‑16 boundary, where the Government’s own Age Assurance Technology Trial noted ‘natural error margins’. The most effective, privacy‑protective and consistent approach is to require robust age verification and parental approval at the app store and operating system level before a teen can download an app or create an account. That’s how you protect young people not just on major platforms, but across the other more than two million apps available, many of which may have weaker safeguards,” the spokesperson said.
“In the meantime, we’ll keep investing in enforcement to detect and remove under‑16 accounts and support parents, while advocating for a system that’s workable in practice and delivers better safety outcomes for young people.”
Dr Rob Nicholls, Senior Research Associate at the University of Sydney, told B&T “the report shows ambition alone is not enough.”
“The compliance gaps identified are not accidental. The platforms engineered workarounds into their own age assurance systems, failed to close reporting pathways and allowed repeated attempts to game facial recognition. These gaps reflect the rational commercial behaviour of platforms operating under a law that still leaves substantial discretion in their hands,” he said in a statement.
“The eSafety Commissioner has the right enforcement tools, including civil penalties of up to $49.5 million, but enforcement action against five major platforms simultaneously will test the regulator’s resources and resolve in equal measure.”
Speaking on Nine’s Today Show this morning, eSafety Commissioner Julie Inman Grant stressed she has “significant concerns” about current compliance levels.
“While social media platforms have taken some initial action, I am concerned through our compliance monitoring that some may not be doing enough to comply with Australian law,” Inman Grant said. She added that eSafety is moving toward an “enforcement stance” that could see tech giants hit with fines of up to $49.5 million for non-compliance.
The report, set to be released on Tuesday, notes that while platforms made some early progress in the first three months of the ban, they have largely failed to maintain momentum.
Some are still pushing underage users toward age-assurance tools even after they declare themselves under 16, and none offer a truly “accessible or effective” system for reporting suspected underage accounts.
“Any enforcement action requires sufficient evidence, which takes time to gather,” Inman Grant said.
“The evidence must establish the platform has not taken reasonable steps to prevent children aged under 16 from having an account. That means more than simply demonstrating some children do still have accounts. Rather, the evidence must show the platform has not implemented appropriate systems and processes.
“Durable, generational change takes time – but these platforms have the capability to comply today and we certainly expect companies operating in Australia to comply with our safety laws.
“They can choose to do so or face escalating consequences, including profound reputational erosion with governments and consumers globally.”

