YouTube has criticised Australia’s new laws banning under-16s from social media platforms, claiming the changes will strip away safety features that parents currently use to supervise young users, drawing a scathing response from Communications Minister Anika Wells.
Rachel Lord, public policy senior manager for Google and YouTube Australia, said parents would “lose their ability to supervise their teen or tween’s account”, including managing content settings and blocking specific channels.
Lord claimed in a blog post that without accounts, children would still be able to watch videos, but without any of the personalised safety features the platform currently offers.
Lord also argued the law would backfire. “This law will not fulfil its promise to make kids safer online, and will, in fact, make Australian kids less safe on YouTube,” she said.
She described the legislation as “rushed regulation that misunderstands our platform and the way young Australians use it”.
Wells rejected YouTube’s criticism, calling it “outright weird” for the company to raise concerns about the safety of its own platform.
“If YouTube is reminding us all that it is not safe… that’s a problem that YouTube needs to fix,” Wells told the National Press Club on Wednesday.
Under the new law, accounts belonging to users under the age of 16 will be deactivated from 10 December. These accounts will be logged out automatically and will no longer be able to upload videos or post comments. YouTube Kids will remain unaffected.
Lord said the new legislation had “failed to allow for adequate consultation and consideration of the real complexities of online safety regulation.” Reports have suggested Google may consider challenging YouTube’s inclusion in the ban, though the company has not publicly commented on this.
One week before the ban is set to begin, Wells acknowledged that some early challenges were expected. “Regulation, and cultural change, takes time. Takes patience,” she said. She argued that today children face constant exposure to online content, saying technology keeps young people connected to “a dopamine drip” through tailored content recommendations and notifications.
Platforms including Instagram, Facebook, Snapchat, TikTok, X, Twitch, Threads, Reddit, and Kick must delete all under-16 accounts and block new accounts or face fines of up to $49.5 million.
Meta’s Australian MD, Will Easton, said the law was “poorly developed” and would “force teens to less regulated parts of the internet or to apps that do not have safety features like those offered by Meta”.
“We’re not against regulation. For example, we’ve spent years developing safer offerings such as Teen Accounts, a reimagined experience for teens on Instagram, Facebook, and Messenger that limits who can contact them, the content that they see, and time spent online. And we’ve long advocated for a better approach to require app stores to verify age and obtain parents’ approval whenever teens under 16 download apps, and for social media platforms to use that age signal to provide age-appropriate experiences,” Easton posted to LinkedIn.
“Instead, with the law we have, the government and regulator are going to be constantly playing whack-a-mole. As soon as these additional apps are included in the regulation, where will teens go next? Age verification and parental approval at the App Store level is the only sensible approach,” he added.

