Apple has revealed it will analyse images on select devices in an effort to protect children using iPhones and other iOS products.
Earlier this month, Apple detailed its Expanded Protections for Children, which includes new technologies to warn children and parents about explicit content.
On Messages, Apple will now use on-device machine learning to identify sensitive content, while on-device image matching will be used for iCloud Photos to detect Child Sexual Abuse Material (CSAM).
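Apple’s published design for the iCloud Photos check matches a hash of each image, computed on the device, against a database of known CSAM hashes, and only flags an account once the number of matches crosses a threshold. The Swift sketch below illustrates that threshold-matching idea only; ImageHash, OnDeviceMatcher and the sample values are hypothetical stand-ins, not Apple’s actual NeuralHash system or API.

```swift
import Foundation

// Hypothetical stand-in for a perceptual image hash. Apple's real
// system (NeuralHash) derives a hash that stays stable under
// resizing and re-encoding; here a hash is just an opaque value.
struct ImageHash: Hashable {
    let value: UInt64
}

// Device-side matcher: holds the database of known hashes and counts
// matches, reporting only once a threshold is crossed.
struct OnDeviceMatcher {
    private let knownHashes: Set<ImageHash>
    private let threshold: Int
    private var matchCount = 0

    init(knownHashes: Set<ImageHash>, threshold: Int) {
        self.knownHashes = knownHashes
        self.threshold = threshold
    }

    // Evaluate one image before upload. Returns true only once the
    // running match count reaches the threshold.
    mutating func evaluate(_ hash: ImageHash) -> Bool {
        if knownHashes.contains(hash) {
            matchCount += 1
        }
        return matchCount >= threshold
    }
}

// Illustrative values only: two known hashes, a threshold of two.
let database: Set<ImageHash> = [ImageHash(value: 0xDEAD), ImageHash(value: 0xBEEF)]
var matcher = OnDeviceMatcher(knownHashes: database, threshold: 2)

for upload in [ImageHash(value: 0xDEAD), ImageHash(value: 0x1234), ImageHash(value: 0xBEEF)] {
    if matcher.evaluate(upload) {
        print("Match threshold reached; account flagged for review.")
    }
}
```

The sketch leaves out the cryptographic layer: in Apple’s stated design the matching is blinded, so the device never learns whether an individual image matched, and Apple can only read the results once the threshold is met.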
There will also be new resources added to Siri and Search to help children and parents navigate unsafe situations.
“At Apple, our goal is to create technology that empowers people and enriches their lives — while helping them stay safe,” the company said.
“We want to protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material (CSAM).”
The new features are available only for accounts set up as families in iCloud, where a parent or guardian must opt in to turn on the controls for children under the age of 12.
And while Apple has made it clear that these controls will apply only to these select accounts, public concern remains about their scope, particularly the notion of ‘scanning’ images.
Greg Nojeim, co-director of the Center for Democracy & Technology’s Security & Surveillance Project, said the new technologies could undermine security on devices.
“Apple is replacing its industry-standard end-to-end encrypted messaging system with an infrastructure for surveillance and censorship, which will be vulnerable to abuse and scope-creep not only in the US, but around the world,” he said.
“Apple should abandon these changes and restore its users’ faith in the security and integrity of their data on Apple devices and services.”
Nojeim added that such technologies create a ‘backdoor’, in that they break the end-to-end encryption that users rely on in Messages.
Apple has previously refused to create such a ‘backdoor’ for law enforcement agencies seeking access to encrypted messages.
However, Apple has gone on the record to say the changes do not break end-to-end encryption in Messages.
“This doesn’t change the privacy assurances of Messages, and Apple never gains access to communications as a result of this feature. Any user of Messages, including those with communication safety enabled, retains control over what is sent and to whom,” Apple said.
“If the feature is enabled for the child account, the device will evaluate images in Messages and present an intervention if the image is determined to be sexually explicit.”
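In code terms, the described Messages behaviour amounts to a local check that gates whether an image is shown directly or an intervention appears first. Below is a minimal Swift sketch of that flow, assuming a hypothetical on-device classifier isSexuallyExplicit(_:), since Apple has not published its model or API.

```swift
import Foundation

enum MessageImageAction {
    case display          // show the image normally
    case warnBeforeView   // blur the image and present an intervention
}

struct ChildAccountSettings {
    let communicationSafetyEnabled: Bool
}

// Hypothetical local classifier; in the real feature this would be
// an on-device machine-learning model evaluating the image.
// Nothing is sent off the device.
func isSexuallyExplicit(_ imageData: Data) -> Bool {
    // Placeholder for illustration only.
    return false
}

// Decide how to present an incoming image on a child account.
func handleIncomingImage(_ imageData: Data,
                         settings: ChildAccountSettings) -> MessageImageAction {
    guard settings.communicationSafetyEnabled else { return .display }
    return isSexuallyExplicit(imageData) ? .warnBeforeView : .display
}
```

Because the check runs entirely on the device and only controls presentation, the message contents themselves remain end-to-end encrypted in transit, which is the basis of Apple’s claim above.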