
Apple rejects privacy claims over child abuse scanning tool

August 14, 2021

The iPhone and iPad manufacturer said its plan to check the photos of US users for evidence of child abuse had been widely "misunderstood". CEO Tim Cook has yet to comment publicly on the privacy row.

The latest iPad Pro on show at an Apple store
The changes only affect users in the United States. Image: Wang Gang/Costfoto/picture alliance

US tech giant Apple has rejected privacy concerns over its new child protection features that will automatically scan images for evidence of sexual abuse.

Last week, the smartphone maker unveiled new features for iPhones and iPads in the United States that will check photos uploaded to its cloud storage or sent via its messaging platform.


Encryption experts and privacy campaigners argued the tool could be exploited for other purposes, potentially opening a door to mass surveillance.

But a top executive used an interview with the leading US business newspaper, The Wall Street Journal, to play down claims that Apple’s new technology amounts to an invasion of privacy.

What did Apple say?

"We can see that it's been widely misunderstood," said Craig Federighi, the company’s senior vice president of software engineering.

"We wanted to be able to spot such photos in the cloud without looking at people's photos," he said.

Craig Federighi, Apple's software chief, gives a presentation at the Apple Worldwide Developers Conference
Craig Federighi, Apple's software chief, has downplayed privacy concerns linked to the new scanning tool. Image: Marcio Jose Sanchez/AP Photo/picture alliance

The 52-year-old said the California-based corporation wanted to "offer this kind of capability... in a way that is much, much more private than anything that's been done in this area before."

Critics claimed that the move would weaken encryption, creating a potential privacy loophole that could be exploited by hackers or governments. 

Apple insists the new features will not make any of its systems and devices less secure or confidential.

The tool, known as "NeuralHash", will scan images before they are uploaded to iCloud.

If it finds a match, the image will be reviewed by a human.

If the images are confirmed to show the sexual abuse of children, the user’s account will be disabled. The National Center for Missing and Exploited Children, a US child protection non-profit, will then be notified.
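To make the process described above more concrete, here is a minimal, hypothetical sketch of hash-based matching against a database of known images. It is not Apple's actual NeuralHash system: the function and variable names are invented for illustration, and an ordinary cryptographic hash stands in for the neural, perceptual hash Apple describes.

```python
import hashlib
from typing import Set


def image_hash(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash. A real system would use a hash that
    tolerates resizing and re-encoding; SHA-256 is used here only to keep
    the sketch self-contained."""
    return hashlib.sha256(image_bytes).hexdigest()


def should_flag_for_review(image_bytes: bytes, known_hashes: Set[str]) -> bool:
    """True if the image's hash appears in the list of known material.
    In the scheme described above, a match leads to human review rather
    than any automatic action against the user."""
    return image_hash(image_bytes) in known_hashes


# Usage: compare a photo against a (hypothetical) hash list before upload.
sample_photo = b"example image bytes"
known_hashes = {image_hash(b"previously catalogued image bytes")}

if should_flag_for_review(sample_photo, known_hashes):
    print("Match: queue image for human review")
else:
    print("No match: image is not flagged")
```

The key design point, and the basis of Apple's privacy argument, is that the device compares hashes against a fixed database rather than inspecting photo content in the cloud; the critics' concern is about who controls what goes into that database.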

What are the privacy concerns?

Many fear that this technology could be repurposed to spy on users for reasons unrelated to child abuse.

"What happens when the Chinese government says, 'Here is a list of files that we want you to scan for'," said Matthew Green, a top cryptography researcher at Johns Hopkins University.

"Does Apple say no? I hope they say no, but their technology won't say no."

Tim Cook presents the iPhone 12 at a press conference in 2020
Apple's chief executive Tim Cook has not commented on the claims made by privacy campaigners. Image: Apple Inc./Brooks Kraft/AFP

A petition calling on Apple to "reconsider its technology rollout" attracted more than 7,000 signatures.

The signatories included ex-National Security Agency contractor Edward Snowden, who leaked information revealing the US government's mass surveillance program.

In a statement on its website, Apple said it would "continue to refuse" any requests from regulators that "degrade the privacy of users."

The company's chief executive, Tim Cook, has yet to comment publicly on the row.

jf/ (AFP, AP)