Apple rejects privacy concerns over child abuse scanning tool | News | DW | 14.08.2021




The iPhone and iPad manufacturer said its plan to check US users' photos for evidence of child abuse had been widely "misunderstood". CEO Tim Cook has yet to comment publicly on the privacy row.

The latest iPad Pro on show at an Apple store

The changes only affect users in the United States

US tech giant Apple has rejected privacy concerns over its new child protection features that will automatically scan images for evidence of sexual abuse.

Last week, the smartphone maker unveiled a new feature for iPhones and iPads in the United States that will check photos uploaded to its cloud storage or sent via its messaging platform.

Encryption experts and privacy campaigners argued the tool could be exploited for other purposes, potentially opening a door to mass surveillance.

But a top executive used an interview with the leading US business newspaper, The Wall Street Journal, to play down claims that Apple’s new technology amounts to an invasion of privacy.

What did Apple say?

"We can see that it's been widely misunderstood," said Craig Federighi, the company’s senior vice president of software engineering.

"We wanted to be able to spot such photos in the cloud without looking at people's photos," he said.

Craig Federighi, Apple's software chief, gives a presentation at the Apple Worldwide Developers Conference

Craig Federighi, Apple's software chief, has downplayed privacy concerns linked to the new scanning tool.

The 52-year-old said the California-based corporation wanted to "offer this kind of capability... in a way that is much, much more private than anything that's been done in this area before."

Critics claimed that the move would weaken encryption, creating a potential privacy loophole that could be exploited by hackers or governments. 

Apple insists the new features will not make any of its systems and devices less secure or confidential.

The tool, known as "NeuralHash", will scan images before they are uploaded to iCloud.

If it finds a match, the image will be reviewed by a human.

If the images are confirmed to show the sexual abuse of children, the user’s account will be disabled. The National Center for Missing and Exploited Children, a US child protection non-profit, will then be notified.
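The flow the article describes — hash each photo before upload, compare against a database of known material, and escalate matches to a human reviewer — can be sketched roughly as follows. This is an illustrative sketch only, not Apple's implementation: the function and variable names are invented, and a cryptographic hash stands in for NeuralHash, which is a perceptual hash designed so that visually similar images produce matching values.

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    # Hypothetical stand-in for a perceptual hash such as NeuralHash.
    # A cryptographic hash is used here only to keep the sketch
    # self-contained; it matches exact bytes, not similar images.
    return hashlib.sha256(image_bytes).hexdigest()

# Hashes of known abuse imagery, as supplied by a child-protection
# organization (illustrative placeholder values, not real data).
KNOWN_HASHES = {image_hash(b"known-image-1"), image_hash(b"known-image-2")}

def scan_before_upload(photos: list[bytes]) -> list[int]:
    """Return the indices of photos whose hash matches the known set.

    Matches would be sent for human review; per the article, an account
    is disabled only after a reviewer confirms the material.
    """
    return [i for i, p in enumerate(photos) if image_hash(p) in KNOWN_HASHES]

flagged = scan_before_upload([b"holiday-photo", b"known-image-1"])
print(flagged)  # only the second photo matches the known set
```

The key design point the article turns on is that only hashes of known images are compared on the device, so the system never "looks at" photo content in the way a classifier would; the privacy debate concerns who controls the list of known hashes.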

What are the privacy concerns?

Many fear that this technology could be repurposed to spy on users for reasons unrelated to child abuse.

"What happens when the Chinese government says, 'Here is a list of files that we want you to scan for'," said Matthew Green, a top cryptography researcher at Johns Hopkins University.

"Does Apple say no? I hope they say no, but their technology won't say no."

Tim Cook presents the iPhone 12 at a press conference in 2020

Apple's chief executive Tim Cook has not commented on the claims made by privacy campaigners.

A petition calling on Apple to "reconsider its technology rollout" attracted more than 7,000 signatures.

The signatories included ex-National Security Agency contractor Edward Snowden, who leaked information revealing the US government's mass surveillance program.

In a statement on its website, Apple said it would "continue to refuse" any requests from regulators that "degrade the privacy of users."

The company's chief executive, Tim Cook, has yet to comment publicly on the row.

jf/ (AFP, AP)