Meta has launched a new platform, “Take It Down,” with the National Center for Missing & Exploited Children (NCMEC) to prevent young people’s intimate images from being shared online.
The new platform, which NCMEC built with financial support from Meta, lets people visit TakeItDown.NCMEC.org and submit a hash value of an intimate photo or video, rather than the image itself, which participating apps then use to proactively find and remove copies.
A hash value is a numerical fingerprint derived from an image; it contains no copy of the image itself, but it can be used to identify duplicates of that image wherever they appear.
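To illustrate the principle, the minimal sketch below (hypothetical Python, not the platform’s actual implementation) computes a cryptographic fingerprint of an image file on the user’s own device, so only the resulting digest, never the image, would be transmitted. In practice, matching systems generally rely on perceptual hashes, such as Meta’s open-sourced PDQ algorithm, so that resized or re-encoded copies still match; the one-way nature of the fingerprint is the same either way.

```python
import hashlib

def hash_image(path: str) -> str:
    """Return a SHA-256 fingerprint of an image file.

    The digest is derived from the file's bytes and cannot be
    reversed to reconstruct the image itself.
    """
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

# Hypothetical usage: only the 64-character hex digest leaves the device.
print(hash_image("photo.jpg"))
```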
While the platform is primarily designed to stop images of children being shared online, anyone can submit hashes: parents or trusted adults can do so on behalf of a young person, and adults can submit hashes of images taken of them when they were under 18.
“We’re proud to be working with NCMEC to launch ‘Take It Down’, a dedicated safety tool for young people, including in Australia,” said Josh Machin, head of public policy, Meta Australia.
“This world-first platform will offer support to young Australians to prevent the unwanted spread of their intimate images online, which we know can be extremely distressing for victims. We’ve taken feedback from experts, including Australia’s eSafety Commissioner, safety organisations, victims and law enforcement to develop this platform and dedicated resources for young people. Working collectively, we can help to combat this issue for young people online.”
Naturally, Meta prohibits content that exploits young people on its platforms, including intimate images and sextortion activity. The company has also been releasing a raft of new features on both Instagram and Facebook that make privacy settings more straightforward.
On Instagram, for example, adults will no longer be able to see teen accounts when scrolling through the list of people who have liked a post or when viewing an account’s Followers or Following lists. The app will also notify teenagers to review adult followers, and it encourages teens to review and restrict their privacy settings.
In September, however, the company was hit with a €405 million (around AU$637 million) fine after the Irish Data Protection Commission found that teen users’ email addresses and phone numbers had been publicly displayed and that Instagram made all new accounts, including children’s, public by default.
In May last year, the EU ruled that Meta and other big tech companies must search for and remove child sexual abuse material or risk fines of up to six per cent of annual global revenue. In Meta’s case, that would amount to almost US$7 billion (around AU$10 billion).
Meta said it now has more than 30 tools in place to support the safety of teens and families across its apps, such as supervision tools for parents and age-verification technology to help ensure teens have age-appropriate experiences.