Apple’s Photo-Scanning Technology Causes Privacy Storm


Apple has revealed it will be analysing images on select devices, in an effort to protect children using iPhones and other iOS products.

Earlier this month, Apple detailed its Expanded Protections for Children, which includes new technologies to warn children and parents about explicit content.

In Messages, Apple will now use on-device machine learning to identify sensitive content, while iCloud Photos will use on-device image matching to detect known Child Sexual Abuse Material (CSAM).
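The iCloud Photos mechanism is essentially membership-testing against a database of hashes of known images. Apple's actual system uses a perceptual hash (NeuralHash) combined with cryptographic threshold techniques so no single match is ever revealed; the toy sketch below only illustrates the general idea of on-device matching, using an ordinary cryptographic hash and a hypothetical known-hash set in place of the real components.

```python
import hashlib

# Hypothetical database of digests of known images (illustration only;
# the real system ships blinded perceptual hashes, not SHA-256 digests).
KNOWN_HASHES = {
    hashlib.sha256(b"known-image-bytes").hexdigest(),
}

def matches_known_image(image_bytes: bytes) -> bool:
    """Return True if the image's digest appears in the known-hash set."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_HASHES

print(matches_known_image(b"known-image-bytes"))  # True
print(matches_known_image(b"unrelated-photo"))    # False
```

Note that a cryptographic hash like SHA-256 only matches byte-identical files; a perceptual hash is designed to also match visually similar copies (resized, recompressed), which is why Apple uses one.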

There will also be new resources added to Siri and Search to help children and parents navigate unsafe situations.

“At Apple, our goal is to create technology that empowers people and enriches their lives — while helping them stay safe,” the company said.

“We want to protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material (CSAM).”

The new features are only available for accounts set up as families in iCloud, where a parent/guardian must opt in to turn the controls on for children under the age of 12.

And while Apple has made it clear that these controls will only be used on these select accounts, there has still been public concern about their scope, particularly the notion of 'scanning' images.

Greg Nojeim, co-director of the Center for Democracy & Technology's Security & Surveillance Project, said the new technologies could undermine security on devices.

“Apple is replacing its industry-standard end-to-end encrypted messaging system with an infrastructure for surveillance and censorship, which will be vulnerable to abuse and scope-creep not only in the US, but around the world,” he said.

“Apple should abandon these changes and restore its users’ faith in the security and integrity of their data on Apple devices and services.”

Nojeim added that such technologies amount to a 'backdoor', in that they undermine the end-to-end encryption that many people rely on in Messages.

Apple has previously refused to create such a ‘back door’ for law enforcement agencies looking to access encrypted messages.

However, Apple has gone on the record to say the changes do not break end-to-end encryption in Messages.

“This doesn’t change the privacy assurances of Messages, and Apple never gains access to communications as a result of this feature. Any user of Messages, including those with communication safety enabled, retains control over what is sent and to whom,” Apple said.

“If the feature is enabled for the child account, the device will evaluate images in Messages and present an intervention if the image is determined to be sexually explicit.”



