As the Australian Government prepares to include YouTube in its under-16s social media ban, child expert and counsellor Jacintha Field has urged policymakers to tread carefully, warning that blanket restrictions risk severing trust, stifling open conversations, and isolating vulnerable young users.
B&T sat down with Field to unpack the real-world implications of the ban, the emotional needs driving children’s online behaviour, and why connection, not control, should be at the heart of Australia’s digital safety strategy.
“While I don’t usually believe in blanket bans, I agree that YouTube can be one of the worst culprits,” Field said. “There is content that is not just inappropriate but dangerous: subtle grooming, adult themes masked in animation and manipulative messaging that bypasses our filters as parents and educators.”
Yet, she argues, the issue is not the presence of risk alone; it’s the absence of connection. “When we respond to their curiosity with bans, we don’t eliminate the need; we push it underground. This can lead to feelings of shame, secrecy, risky behaviours and disconnection.”
Field, who has spent years counselling children and families through her private practice turned online platform, Happy Souls Kids, has seen firsthand how bans can backfire. “I’ve sat with children who feel guilty for wanting to go online. They’re not misbehaving; they’re trying to connect, fit in and belong. When we meet that need with restrictions instead of understanding, we unintentionally teach them that the desire itself is wrong.”
From her work with families, Field said kids are already finding workarounds. “One told me, ‘I just used my mum’s birthday,’ and another said, ‘I added a second face to my dad’s phone.’ My own son did this, and I didn’t even know it was possible.”
The result, she said, is a migration to darker corners of the internet, where no adult is present. “No filters are protecting them, and no real-world tools are being provided, just the algorithm, one that’s far more powerful than we give it credit for.”
Field believes platforms like YouTube do play a critical role in learning and development when used intentionally. “My son has taught himself how to 3D print, explore engineering concepts, and build things I never would have imagined. For self-motivated and curious kids, it’s a treasure trove, offering access to knowledge and experiences that aren’t typically found in a school curriculum.”
But the value, she cautioned, isn’t automatic. “We need to watch YouTube with our kids. Instead of monitoring them from a distance, we should sit beside them. Ask questions and talk about what they’re watching to make it a shared experience. When we don’t engage, someone else might influence them, and that someone may not have their best interests at heart,” she explained.
She also questioned why more wasn’t done to trial the policy first. “Why wasn’t this tested in schools? No pilot. No longitudinal research. No mental health review. We’re rolling out a national policy based on theory, not lived experience. And the children who are already most vulnerable will pay the price.”
Field also challenged the tech industry’s priorities: “How can I walk into Bunnings to buy a shovel and be served an ad for one on Instagram five minutes later, but the same algorithm can’t be used to protect our children from bullying, exploitation, or harm?”
In place of bans, Field advocates for education, presence, and co-engagement. “This is not just about screen time; it’s about relational safety. It’s about kids knowing their truths are welcome, even when those truths are messy.”
When her own son wanted to play Roblox, a platform she considers unsafe due to unfiltered chat and inappropriate content, she chose not to ban it outright. “Instead of simply saying ‘no,’ I invited him into the conversation… Once he recognised the potential dangers and reflected on how it made him feel, he chose to delete the app himself.”
For Field, this approach, rooted in trust and emotional safety, is far more effective than blunt restriction. “Kids don’t just need limits. They need leadership. Modelling. A parent who says, ‘Let’s figure this out together,’ instead of, ‘You’re not allowed.’”
She also believes that policy must reflect the nuances of each platform. “YouTube is often search-based and long-form. TikTok and Snapchat are reactive and social. Treating them as the same reveals a lack of nuance or, worse, a lack of care.”
Ultimately, Field called for more support for families, not just more rules. “We need emotionally intelligent, child-led platforms, not ones driven by ad revenue. And we need tools for parents: support, guidance, and language for challenging conversations.”
That’s why she built Happy Souls Kids, a mental health platform designed to meet children before crisis hits. “Because while bans and waitlists dominate the headlines, there’s a generation of kids quietly needing support now.”
To policymakers, her message is clear: “Please stop treating this as a PR issue. It’s a mental health issue. We don’t need bans. We need accountability. From tech giants. From policymakers. And yes, from ourselves. Because kids don’t just need less screen time, they need more connection. And that starts with us.”