Apple Halts Plan to Scan Child Sex Abuse Images in iPhones

People walk past an Apple retail store on July 13, 2021 in New York City. Angela Weiss/AFP via Getty Images
Harry Lee

Apple has delayed its plan to scan iPhones and iPads for child sexual abuse images, a measure it said would “protect children,” after facing fierce backlash.

“Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” Apple announced on its website Friday.

On Aug. 5, Apple announced an “ambitious” plan for “Expanded Protections for Children.” The Cupertino-based tech giant said it would introduce new child safety features in three areas; the feature designed to detect child sexual abuse material on iPhones and iPads proved the most controversial.

Apple’s program would use breakthrough cryptography and artificial intelligence to find abuse material stored in iCloud Photos. Illegal images would be reported to the National Center for Missing and Exploited Children, the tech giant said at the time.
A girl uses an iPad at an Apple store in central London in this file photo. Reuters/Luke MacGregor
On Aug. 14, Apple released more details about how the child sexual abuse material detection system would work, saying “the possibility of any given account being flagged incorrectly is lower than one in one trillion.”

The database of child sexual abuse imagery that Apple would check against is not controlled by a single entity or government, Apple added.
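
At a high level, such a scan reduces each photo to a fingerprint (a hash) and compares it against a database of fingerprints of known abuse images, so the scanner never inspects the photos themselves. The Python sketch below is a minimal illustration of that matching step, not Apple's implementation: Apple's “NeuralHash” is a perceptual hash designed to also match near-duplicate images, whereas this example uses a plain SHA-256 digest that matches only byte-identical files. The directory name and the sample digest are hypothetical placeholders.

```python
import hashlib
from pathlib import Path

# Hypothetical database of fingerprints of known abuse images.
# Placeholder entry: the SHA-256 digest of an empty file.
KNOWN_BAD_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}


def file_hash(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()


def scan_library(photo_dir: Path) -> list[Path]:
    """Return the photos whose fingerprint appears in the database."""
    return [
        photo
        for photo in sorted(photo_dir.glob("*.jpg"))
        if file_hash(photo) in KNOWN_BAD_HASHES
    ]


if __name__ == "__main__":
    # "icloud_photos" is a hypothetical local directory name.
    for match in scan_library(Path("icloud_photos")):
        print(f"flagged: {match}")
```

Apple's published protocol layers additional cryptography on top of this idea, including private set intersection and threshold secret sharing, so that matches are revealed to Apple only after an account crosses a threshold of flagged images.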

Apple’s plan almost immediately drew intense criticism from security experts, human rights groups, civil liberties advocates, and others, who warned it would threaten privacy, security, and freedom, and open the door to more government surveillance.

“This sort of tool can be a boon for finding child pornography in people’s phones,” Johns Hopkins University professor and cryptographer Matthew Green, an outspoken critic of Apple, wrote on Twitter when the plan was introduced. “But imagine what it could do in the hands of an authoritarian government?”

Green supported the delay of the program.

Apple has reportedly made compromises with the Chinese communist regime to keep doing business in China.

For example, Apple moved its Chinese customers’ data to a China-based data center operated by Chinese authorities, as the regime required. In 2018, Apple placed the digital keys for that data in China, The New York Times reported in May.

Independent security experts and Apple engineers said that would make it nearly impossible to stop the Chinese regime from accessing the data.

However, Apple said it had “never compromised the security of our users or their data in China or anywhere we operate.” Apple also said the encryption technology it uses in China is its most advanced and has never been deployed in any other country.

The Epoch Times has contacted Apple for comment.

Jack Phillips and Tom Ozimek contributed to this report.