The final report stops short of calling for a social media ban for under-16s but urges the Government to continue exploring the feasibility of introducing one.
An Australian parliamentary inquiry has called for users to have greater control over social media algorithms, allowing them to alter the algorithms or switch them off while using the platforms.
The Joint Select Committee on Social Media and Australian Society, which has been exploring the impact of social media, has also recommended in its final report that social media companies exercise a duty of care for the wellbeing of Australian users, and open up their hoods so that researchers can assess the harm algorithms are causing.
The committee fell short of recommending the government raise the minimum age of users to 16, although it encourages further testing of age assurance technology. It also urges lawmakers to co-design any changes with young people and to pour greater funding into digital media literacy so that Australians can identify misinformation and harmful content online.
“This report puts Big Tech on notice—social media companies are not immune from the need to have a social licence to operate,” committee chair Sharon Claydon MP said in the report.
“The report supports protecting Australians through a statutory duty of care by digital platforms, education support and digital competency, greater protections of personal information, independent research, data gathering and reporting, and giving users greater control over what they see on social media.”
The final report follows a second interim report that handed down 11 recommendations, including calling on the government to establish a Digital Affairs Ministry that would be an overarching co-ordinator of the various regulatory bodies that cover privacy and consumer protection, competition, online safety and scams.
It also recommended the government explore alternative funding mechanisms for publishers to supplement the News Media Bargaining Code, as well as a digital platform levy, and to lower the barriers to entry for smaller publishers and regional media to receive funding. The second report also called for greater transparency around recommender systems and social media algorithm changes.
Last week, the Albanese government said it would legislate a Digital Duty of Care to place the onus on digital platforms to proactively keep Australians safe and better prevent online harms.
“Online services operating in Australia can do more to proactively protect their users – which is why our Government is developing a Digital Duty of Care,” communications minister Michelle Rowland said at the time. “The duty of care will put the onus on industry to prevent online harms at a systemic level, instead of individuals having to ‘enter at their own risk’.”
Algorithm ‘safety by design’
For the final report, the parliamentary inquiry heard several submissions about how algorithms and recommender systems operate and are used to curate the online experience of social media users. This includes how they can promote harmful behaviours and inappropriate content that can impair the mental health of users.
According to the eSafety Commissioner’s submission, algorithms as a technical process are not inherently harmful or harmless, but “they directly affect the content users are shown…and can reduce or exacerbate online harms. This includes the potential to amplify harmful and extreme content, especially where they are optimised for user engagement.”
Dr Timothy Graham, an associate professor in digital media at Queensland University of Technology, has studied the impact of algorithms, including during the Voice referendum campaign.
He told the inquiry: “Baked into the architectural and algorithmic design of these platforms is the mandate to increase user engagement and attention. The dilemma facing us today is that platforms are designed for quantity over quality. They prioritise and recommend content that elicits strong reactions and gets user attention rather than content that is high quality or factually sound. Research shows that content expressing fear, outrage and division just gets more clicks, shares and comments.”
ReachOut proposed that the platforms be designed for safety rather than engagement.
“We seek policies that compel social media platforms to step up and design their products for safety, not engagement, with greater transparency and user control. This could look like putting an end to sticky features like infinite scroll, mandating safety features and increased social media literacy programs for young people, increasing algorithm transparency and giving users more control over their algorithms so that they are in charge of the content they see.”
The committee said it is “strongly of the view that platforms, particularly Meta, are actively choosing not to provide the same levels of protections for users, transparency, and accountability mechanisms in Australia that they do in other jurisdictions. And in doing so, they are rapidly diluting their social licence to operate in Australia”.
To raise or not to raise?
The federal government and opposition have adopted a policy to ban under-16s from social media pending legislation to be introduced into parliament before the end of the year.
For the final report, the inquiry explored the harm social media causes children and the complexity of raising the age of users to 16, and whether the responsibility for verification should reside with individual apps or more broadly.
A significant number of contributors to the inquiry favoured children being prevented from accessing any social media platform.
“Children lack the developmental maturity to navigate the complexities and risks posed by social media. Platforms are designed to exploit their natural vulnerabilities, making them particularly susceptible to harmful influences,” the Australian Parents Council said.
“Just as we protect children from alcohol, gambling and driving until they reach an appropriate age, so too must we protect them from the harmful effects of social media until they are developmentally ready.”
The eSafety Commissioner, Julie Inman Grant, told the committee that verifying users’ ages is difficult but not insurmountable, and that platforms are capable of ascertaining the age of their users if required.
“I will say that age verification is very, very, very difficult, but it can be done. I don’t believe it is insurmountable. I think that if we asked the major tech platforms right now, ‘How many under-13s do you have?’—and we plan to, thanks to the new BOSE (Basic Online Safety Expectations) determination—or, ‘How many under-16s do you have?’ they may not know today, because there’s probably a degree of wilful blindness because they haven’t had to look at these numbers.”
She added that more thought needs to be given to the entire ecosystem on age: “If you’re not thinking about the role of the app stores, the phone manufacturers, the search engines and all of the other players in this space, it’s going to be challenging to come up with an efficacious solution.”
Meta: ‘Age verification should move beyond apps’
Meta said that it supports age verification at the app store or operating system level. “That level within the ecosystem is the gatekeeping level for the entire ecosystem, and it provides a way to ensure the least privacy-intrusive approach to collecting that information and then utilising that across the ecosystem,” Meta’s global head of safety, Antigone Davis, said.
“The challenges with doing age verification at an individual app level, for example, are that it creates additional privacy intrusions and it will move people from one place that asks for age to other apps that don’t ask for age.”
Not all submissions were in favour of raising the age, and some warned that it creates a false sense of security when the bigger issue is that social media platforms should be safe by design irrespective of age.
“We advocate for a safety-by-design approach, which focuses on anticipating and preventing threats before they occur through the literal design of the platforms—proactivity over reactivity,” said Dr Rhys Farthing, director of research at Reset.Tech Australia.
“Responsibility for these platforms should not solely rest with the users but also with the creators. It is pertinent to hold big tech accountable for the negative impacts of their platforms by legislating these safety features into law to ensure that our protection is foundational rather than an afterthought.”
Final report recommendations
Recommendation 1
The committee recommends that the Australian Government consider options for greater enforceability of Australian laws for social media platforms, including amending regulation and legislation, to effectively bring digital platforms under Australian jurisdiction.
Recommendation 2
The committee recommends that the Australian Government introduce a single and overarching statutory duty of care onto digital platforms for the wellbeing of their Australian users, and require digital platforms to implement diligent risk assessments and risk mitigation plans to make their systems and processes safe for all Australians.
Recommendation 3
The committee recommends that the Australian Government introduce legislative provisions to enable effective, mandatory data access for independent researchers and public interest organisations, and an auditing process by appropriate regulators.
Recommendation 4
The committee recommends that the Australian Government, as part of its regulatory framework, ensures that social media platforms introduce measures that allow users greater control over what user-generated content and paid content they see by having the ability to alter, reset, or turn off their personal algorithms and recommender systems.
Recommendation 5
The committee recommends that the Australian Government prioritise proposals from the Privacy Act review relating to greater protections for the personal information of Australians and children, including as part of its suite of ongoing privacy reforms, such as the Children’s Online Privacy Code.
Recommendation 6
The committee recommends that any features of the Australian Government’s regulatory framework that will affect young people be co-designed with young people.
Recommendation 7
The committee recommends that the Australian Government support research and data gathering regarding the impact of social media on health and wellbeing to build upon the evidence base for policy development.
Recommendation 8
The committee recommends that one of the roles of the previously recommended Digital Affairs Ministry should be to develop, coordinate and manage funding allocated for education to enhance digital competency and online safety skills.
Recommendation 9
The committee recommends that the Australian Government reports to both Houses of Parliament the results of its age assurance trial.
Recommendation 10
The committee recommends that industry be required to incorporate safety by design principles in all current and future platform technology.
Recommendation 11
The committee recommends that the Australian Government introduce legislative provisions requiring social media platforms to have a transparent complaints mechanism that incorporates a right of appeal process for complainants that is robust and fair.
Recommendation 12
The committee recommends that the Australian Government ensures adequate resourcing for the Office of the eSafety Commissioner to discharge its evolving functions.