The backlash against Apple's plan to scan iPhone photos for child exploitation images was swift, loud, and, it appears, successful.
Apple announced Friday that, in response to public outcry and criticism from privacy advocates, it will delay its previously announced system for scanning iPhone users' photos for digital fingerprints that indicate the presence of Child Sexual Abuse Material (CSAM).
"Previously, we announced plans to feature features to help protect children against predators who use communication instruments to recruit and exploit them. To help limit the spreading of Child Sexual Abuse Material," reads a September 3 update to the original press release. We have decided to spend additional time in the coming months collecting feedback from customers, advocates groups, researchers, etc. before releasing these critical child safety features.
Announced in August as part of iOS 15, the feature would check photos stored in an iPhone's photo library, before they are uploaded to iCloud, against a database of digital fingerprints of known CSAM images. If the automated system found a match, the content would be sent to a human reviewer and, if confirmed, reported to child protection authorities.
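To make the flow concrete, here is a minimal, purely illustrative Swift sketch of a "check the fingerprint before upload" step. It is not Apple's implementation: the real system reportedly uses perceptual hashing and cryptographic matching protocols, while this sketch substitutes a plain SHA-256 digest, an in-memory set of placeholder fingerprints, and a hypothetical `shouldFlagForReview` helper.

```swift
import Foundation
import CryptoKit

// Placeholder database of known fingerprints (hex-encoded digests).
// In the described system this would be a vendor-supplied CSAM hash list,
// not values hardcoded on the device.
let knownFingerprints: Set<String> = [
    "placeholder-fingerprint-1",
    "placeholder-fingerprint-2",
]

/// Computes a fingerprint for a photo's raw bytes.
/// A real deployment would use a perceptual hash that survives resizing
/// and re-encoding; SHA-256 here only illustrates the matching step.
func fingerprint(of photoData: Data) -> String {
    let digest = SHA256.hash(data: photoData)
    return digest.map { String(format: "%02x", $0) }.joined()
}

/// Runs before a photo is uploaded to the cloud. A match is escalated
/// to human review rather than acted on automatically.
func shouldFlagForReview(_ photoData: Data) -> Bool {
    knownFingerprints.contains(fingerprint(of: photoData))
}
```

The essential shape is the part critics objected to: the comparison happens on the device, before the photo ever leaves it, with a match merely queuing the content for human review.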
Experts and users alike were alarmed precisely because the scanning would happen on the device. Not only did many find it creepy that Apple could inspect photos users had never uploaded to the cloud, but critics also called it hypocritical for a company that markets itself on privacy. The Electronic Frontier Foundation warned that the feature amounted to a "backdoor" into a user's device, one that law enforcement and other government agencies could pressure Apple to widen.
"Even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor," the EFF wrote.
Experts who had criticized the move were generally pleased with Apple's decision to step back and gather more feedback.
Others said Apple should go further to protect privacy: the digital rights group Fight for the Future suggested the company focus instead on strengthening encryption.
Apple stumbled through the rollout of this feature, letting privacy concerns overshadow its intended purpose. Better luck next time, folks.