Social media companies will need to lift their game if they are to adhere to Australia’s new age restrictions, according to a report by eSafety, Australia’s online safety regulator. It found that Australian children are easily circumventing inadequate and poorly enforced minimum age rules employed by YouTube, Facebook, Instagram, TikTok, Snap, Reddit, Discord and Twitch.
Most of these platforms are only asking kids to self-declare their age at sign-up with very little, if any, scrutiny.
Eight in ten Australian children aged 8-12 used one or more social media services in 2024, the most popular being YouTube (68 per cent of children surveyed), TikTok (31 per cent) and Snapchat (19 per cent).
More than half (54 per cent) of children under the legal age limit accessed these services via their parent’s or carer’s accounts, while 36 per cent had their own account. The majority of account owners had help setting up their accounts from parents or carers.
The report also reveals mixed results in how platforms were enforcing age limits, including through proactive tools and reporting systems to detect children under the age of 13 already on their services.
eSafety Commissioner Julie Inman Grant said the report shows very clearly that significant work remains for any social media platform relying on truthful self-declaration to determine age, with enforcement of the Government’s minimum age legislation, which will lift the legal age from 13 to 16, on the horizon.
“Social media services not only need to make it harder for underage users to sign up to their services in the first place, but also make sure that users who are old enough to be on the service, but are not yet adults, have strong safety measures in place by default,” Inman Grant said.
“Few have any really stringent measures in place to determine age accurately at the point of sign-up, so there’s nothing stopping a 14-year-old, for instance, from entering a false age or date of birth and setting up an unrestricted adult account that doesn’t carry those extra safety features.
“And this likely means the platforms are unaware of the true numbers of users identified as children and teens on their services. Some platforms also make it very difficult to report underage users who are on their platforms today.”
The report also indicates that social media companies are underestimating the number of monthly active users under 18, in particular the 13-15-year-old cohort.
“Even with the likely underestimation of the true numbers, we are still talking about a lot of kids. For instance, Snapchat says of its 8.3 million monthly active users in Australia almost 440,000 are aged 13-15, Instagram with around 19 million users says around 350,000 are in this age group, YouTube with well over 25 million users said 325,000 were aged 13-15, while TikTok with close to 10 million users reported around 200,000 were in this early teen cohort,” Inman Grant said.
“Our survey also found 95 per cent of teens aged 13-15 reported using at least one of the eight social media services since January 2024, so we can expect the actual numbers to be much higher.”
The most popular platforms for kids aged under 13 were identified as YouTube, TikTok, Snapchat and Instagram, with YouTube the only platform that allows access to users under this age when attached to a family account with parental supervision.
Responses received from YouTube, Facebook, Instagram, TikTok, Snap, Reddit, Discord and Twitch cover the period between January and July 2024 and reveal that setting up an account as a child under 13 was a relatively simple process, with many services requiring only a self-declaration of age at sign-up.
“We’ll be consulting with industry and other stakeholders this year about what reasonable steps platforms should be expected to take to give effect to the minimum age requirements, and this report will be one key input to that process,” Inman Grant added.
“This report shows that there will be a tremendous amount of work to be done between now and December.”
In response to the report, a TikTok spokesperson said: “As a platform, the safety of our more than 9.5 million users is our highest priority and we are pleased that eSafety has recognised the best practice work we do to keep young people safe.
“Since the start of 2023, our industry-leading, proactive age-detection tools have resulted in the removal of more than one million Australian users suspected of being under the age of 13.
“This report again shines a spotlight on the Government’s decision to give an exclusive carve out to the most popular social media platform for young Australians from the under 16 ban. Australian parents and guardians have a right to know what evidence, if any, supports the Government’s decision, so they can have confidence their children are safe on any exempted platforms.”