Apple Announces Limits to Child Sex Abuse Image-Scanning System After Privacy Backlash

A phone in a file photograph. Loic Venance/AFP via Getty Images
Tom Ozimek

Apple on Aug. 13 provided new details of how its planned child sexual abuse material (CSAM) detection system would work, outlining a range of privacy-preserving limits following backlash that the software would introduce a backdoor that threatens user privacy protections.

The company addressed concerns triggered by the planned CSAM feature, slated for release in an update for U.S. users later this year, in a 14-page document (pdf) outlining safeguards it says it will implement to prevent the system on Apple devices from erroneously flagging files as child pornography or from being exploited for malicious surveillance of users.

“The system is designed so that a user need not trust Apple, any other single entity, or even any set of possibly-colluding entities from the same sovereign jurisdiction (that is, under the control of the same government) to be confident that the system is functioning as advertised,” the company said in the document.

Apple’s reference to “possibly-colluding entities” appears to address concerns raised by some that the system could be abused—for instance, by authoritarian regimes—to falsely incriminate political opponents.

“This is achieved through several interlocking mechanisms, including the intrinsic auditability of a single software image distributed worldwide for execution on-device, a requirement that any perceptual image hashes included in the on-device encrypted CSAM database are provided independently by two or more child safety organizations from separate sovereign jurisdictions, and lastly, a human review process to prevent any errant reports,” Apple stated.
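In rough terms, the “two or more independent sources” rule can be sketched as follows: a perceptual hash would only be eligible for the on-device database if child safety organizations in at least two separate jurisdictions submitted it independently. The Python sketch below is purely illustrative; the function, jurisdiction labels, and hash values are hypothetical and are not drawn from Apple’s document.

# Illustrative sketch (not Apple's code) of the "two or more independent
# sources" rule: a hash qualifies for the on-device database only if child
# safety organizations in at least two separate sovereign jurisdictions
# submitted it independently.
def eligible_hashes(submissions: dict[str, set[str]]) -> set[str]:
    """Return the subset of submitted perceptual hashes vouched for by
    two or more jurisdictions."""
    counts: dict[str, int] = {}
    for hashes in submissions.values():
        for h in hashes:
            counts[h] = counts.get(h, 0) + 1
    return {h for h, n in counts.items() if n >= 2}

# Hypothetical example: only hashes submitted by both organizations qualify.
example = {
    "jurisdiction_a": {"a1b2", "c3d4", "e5f6"},
    "jurisdiction_b": {"c3d4", "e5f6", "9f9f"},
}
print(eligible_hashes(example))  # {'c3d4', 'e5f6'} (set order may vary)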

People walk past an Apple retail store in New York on July 13, 2021. Angela Weiss/AFP via Getty Images

Besides ensuring that the database of child sexual abuse imagery that Apple will check against is not controlled by a single entity or government, Apple’s other technical protections against erroneous flagging include choosing a high match threshold so that “the possibility of any given account being flagged incorrectly is lower than one in one trillion.” At the same time, Apple said the system would prevent privacy violations by never learning any information about iCloud-stored images that don’t have a positive match to the CSAM database.
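The general effect of a match threshold can be illustrated with a back-of-the-envelope calculation: even if individual innocent photos occasionally collide with the hash database, the chance that dozens of them do so within a single account falls off steeply. The per-image false-match rate and the threshold of 30 in this sketch are assumptions chosen for illustration, not figures from Apple’s document.

# Back-of-the-envelope sketch of why a high match threshold drives down
# the chance of an account being flagged incorrectly. The per-image
# false-match rate and the threshold of 30 are illustrative assumptions.
from math import exp, fsum, lgamma, log

def log_binom_pmf(n: int, k: int, p: float) -> float:
    # Log of the binomial probability of exactly k false matches among n photos.
    return (lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)
            + k * log(p) + (n - k) * log(1 - p))

def account_false_flag_probability(n_images: int, per_image_rate: float,
                                   threshold: int) -> float:
    # Probability that `threshold` or more innocent photos each falsely match;
    # the terms shrink quickly, so a few hundred past the threshold suffice.
    upper = min(n_images, threshold + 500)
    return fsum(exp(log_binom_pmf(n_images, k, per_image_rate))
                for k in range(threshold, upper + 1))

# Example: 50,000 photos, a one-in-a-million false match per image, and a
# threshold of 30 yields a probability far below one in a trillion.
print(account_false_flag_probability(50_000, 1e-6, 30))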

Another guardrail is a secondary threshold that must be met by comparing visual derivatives of suspect images against a “second, independent perceptual hash” for an account to be flagged, a process that would then be followed by a human review before final confirmation.
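That layered design can be sketched in simplified form: an account only reaches human review if enough images match the on-device database and their visual derivatives also clear the second, independent hash check. The Python below is a rough illustration under those assumptions; the stand-in hash functions and threshold are hypothetical and are not Apple’s perceptual hashing.

# Rough illustration (not Apple's implementation) of the layered checks the
# article describes. Real perceptual hashes tolerate small image changes;
# ordinary cryptographic digests are used here only as stand-ins.
import hashlib

def primary_hash(image: bytes) -> str:
    # Stand-in for the on-device perceptual hash.
    return hashlib.sha256(image).hexdigest()

def secondary_hash(image: bytes) -> str:
    # Stand-in for the second, independently derived perceptual hash.
    return hashlib.blake2b(image).hexdigest()

def needs_human_review(visual_derivatives: list[bytes],
                       primary_db: set[str],
                       secondary_db: set[str],
                       threshold: int) -> bool:
    # First gate: enough matches against the on-device database.
    primary_matches = [img for img in visual_derivatives
                       if primary_hash(img) in primary_db]
    if len(primary_matches) < threshold:
        return False
    # Second gate: the same images must also clear the independent hash
    # check before reviewers ever see anything.
    confirmed = [img for img in primary_matches
                 if secondary_hash(img) in secondary_db]
    return len(confirmed) >= threshold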

“The reviewers are instructed to confirm that the visual derivatives are CSAM. In that case, the reviewers disable the offending account and report the user to the child safety organization that works with law enforcement to handle the case further,” Apple stated.

Other security and privacy requirements meant to protect against abuse of the system include ensuring that neither the CSAM database nor the matching software can be altered “surreptitiously,” and that it must not be possible to target specific user accounts with different encrypted CSAM databases or with different software performing the matching.
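The kind of audit those requirements imply can be sketched simply: if a single encrypted database and a single matching binary ship to every device, anyone can compare the fingerprint of what is installed locally against one globally published value. The file path and published digest in this sketch are invented for illustration and are not part of Apple’s published design.

# Hypothetical sketch of a single-database audit: every device should carry
# a database whose digest matches one globally published value, so no
# account can be singled out with a different database or matcher.
# The path and published digest below are invented for illustration.
import hashlib
from pathlib import Path

PUBLISHED_DATABASE_DIGEST = "replace-with-the-globally-published-value"

def local_database_digest(database_path: Path) -> str:
    # Fingerprint of the encrypted database actually installed on this device.
    return hashlib.sha256(database_path.read_bytes()).hexdigest()

def matches_published_build(database_path: Path) -> bool:
    return local_database_digest(database_path) == PUBLISHED_DATABASE_DIGEST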

Apple insists the guardrails would work in tandem to safeguard against both human and technical error, as well as adversarial attacks against its perceptual algorithm, which could potentially cause non-CSAM images to exceed the match threshold and result in false positives.

Since news of Apple’s plan first broke on Aug. 11, thousands have signed an open letter criticizing the CSAM system, saying the “proposal introduces a backdoor that threatens to undermine fundamental privacy protections for all users of Apple products.”

“Experts around the world sounded the alarm on how Apple’s proposed measures could turn every iPhone into a device that is continuously scanning all photos and messages that pass through it in order to report any objectionable content to law enforcement, setting a precedent where our personal devices become a radical new tool for invasive surveillance, with little oversight to prevent eventual abuse and unreasonable expansion of the scope of surveillance,” said the letter, which cited a number of security and privacy experts detailing their concerns.

One of the experts cited in the letter, Johns Hopkins University professor and cryptographer Matthew Green, had a mixed reaction to Apple’s announcement of the CSAM system.

“This sort of tool can be a boon for finding child pornography in people’s phones,” Green wrote on Twitter. “But imagine what it could do in the hands of an authoritarian government?”

“If you believe Apple won’t allow these tools to be misused [crossed fingers emoji] there’s still a lot to be concerned about,” Green added, noting that such “systems rely on a database of ‘problematic media hashes’ that you, as a consumer, can’t review.”

Green told The Associated Press that he is concerned Apple could be pressured by authoritarian governments to scan for other types of information.

Apple has strongly pushed back against the concern that its system could be exploited by governments or other entities to erode user privacy.

Craig Federighi, Apple’s senior vice president of software engineering, told The Wall Street Journal in an interview that privacy would be protected by “multiple levels of auditability.”

“We, who consider ourselves absolutely leading on privacy, see what we are doing here as an advancement of the state of the art in privacy, as enabling a more private world,” Federighi told the outlet.

The announcement comes as Apple and other tech companies face pressure from governments and other entities to act more decisively to combat child pornography.

Tom Ozimek is a senior reporter for The Epoch Times. He has a broad background in journalism, deposit insurance, marketing and communications, and adult education.