
US: Apple delays rollout of iPhone child abuse scanning tool

September 3, 2021

The feature had been intended to scan for images of child sexual abuse. But it quickly drew concern over potential misuse as a "backdoor" for hacking and surveillance.

https://p.dw.com/p/3ztiq
An Apple iPhone 12
Apple has built its brand on ensuring user privacy. Image: STRF/STAR MAX/picture alliance

Apple on Friday announced an indefinite delay of plans to scan iPhones in the US for images of child sex abuse, following an outcry over potential exploitation of the tool for unlawful surveillance and hacking. 

What was Apple's photo scanning plan? 

The tool, introduced last month, would have scanned files to identify images of child sex abuse before they were uploaded to the company's iCloud storage service.

Apple had also planned to introduce a separate function that would have scanned users' encrypted messages for sexually explicit content.

Dubbed "NeuralHash," the system was designed to catch images of child sex abuse that have either been edited or are similar to ones known to law enforcement.

Apple said access to the flagged images would have been limited to the National Center for Missing and Exploited Children.
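NeuralHash is a form of perceptual hashing: unlike cryptographic hashes, a perceptual hash changes little when an image is lightly edited, so near-duplicates of a known image can still be matched. The sketch below illustrates the general idea with a toy "average hash" on an 8x8 grayscale image; it is not Apple's algorithm (NeuralHash uses a neural network), and all names and thresholds here are illustrative assumptions.

```python
# Toy perceptual-hash matching, for illustration only.
# This is NOT NeuralHash; it is a simple "average hash" stand-in.

def average_hash(pixels):
    """Hash an 8x8 grayscale image: one bit per pixel, set if pixel > mean."""
    mean = sum(pixels) / len(pixels)
    return [1 if p > mean else 0 for p in pixels]

def hamming(h1, h2):
    """Number of differing bits between two hashes."""
    return sum(a != b for a, b in zip(h1, h2))

# A toy 8x8 "known" image: a simple brightness gradient.
known = [(r * 8 + c) * 4 for r in range(8) for c in range(8)]
# A lightly edited copy (small brightness tweak) should still match.
edited = [p + 3 for p in known]
# An unrelated pixel pattern should not.
other = [(r * 37 + c * 91) % 256 for r in range(8) for c in range(8)]

MATCH_THRESHOLD = 5  # illustrative: max differing bits to call it a match

print(hamming(average_hash(known), average_hash(edited)))  # small: still matches
print(hamming(average_hash(known), average_hash(other)))   # large: no match
```

The edited copy lands within the threshold while the unrelated image does not, which is the property that lets such a system catch images that "have either been edited or are similar to ones known to law enforcement."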


Why did Apple change track?

Apple announced the postponement Friday in an update posted above its original photo scanning plans.

"Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features," the update said.

Matthew Green, a top cryptography researcher at Johns Hopkins University who had criticized the plan, told the AP news agency that he supported the delay. 

"You need to build support before you launch something like this," Green said. "This was a big escalation from scanning almost nothing to scanning private files."

Green had been among the experts last month who warned that the NeuralHash scanning system could be used for nefarious purposes.

For example, innocent people could be framed after being sent seemingly innocuous images engineered to trigger matches for known child abuse imagery. Green said such images would be enough to fool the system and alert law enforcement.
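The attack Green describes exploits a basic property of perceptual hashes: two visually unrelated images can produce identical hashes. The sketch below demonstrates such a collision with a toy "average hash" (a stand-in for NeuralHash, purely as an assumption for illustration): the two images differ at every pixel, yet hash identically.

```python
# Toy demonstration (NOT Apple's system) of a perceptual-hash collision:
# two different images that produce the exact same hash, the property an
# attacker could exploit to make an innocuous image match a flagged one.

def average_hash(pixels):
    """Hash an 8x8 grayscale image: one bit per pixel, set if pixel > mean."""
    mean = sum(pixels) / len(pixels)
    return [1 if p > mean else 0 for p in pixels]

# A "flagged" image: dark top half, bright bottom half.
flagged = [10] * 32 + [200] * 32
# A differently shaded image: every pixel value differs from "flagged",
# but each pixel sits on the same side of its own image's mean,
# so the two hashes collide exactly.
crafted = [80] * 32 + [120] * 32

print("hashes collide:", average_hash(flagged) == average_hash(crafted))
```

A real attack on NeuralHash would be harder than this toy case, but researchers warned the same principle applies: matching is done on hashes, not on the images themselves.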


A 'misunderstanding'? 

Apple has built its brand on ensuring personal privacy and has traditionally rejected government demands for access to user data.

"We can see that it's been widely misunderstood," said Craig Federighi, Apple's senior vice president of software engineering, when the idea was unveiled last month and met almost instant criticism.

"We wanted to be able to spot such photos in the cloud without looking at people's photos," he said.

 wmr/fb (AP, AFP)