Apple announced several new features in August intended to stop the spread of child sexual abuse material (CSAM). The backlash, from privacy advocates to Edward Snowden, was nearly instantaneous, aimed largely at Apple's decision not only to scan iCloud photos for CSAM but also to check for matches on your iPhone or iPad itself. After weeks of protest, Apple has finally relented. At least, for the moment.
In a statement released Friday, the company said it had announced plans for features to protect children from predators who use communication tools to recruit and exploit them, and to limit the spread of Child Sexual Abuse Material. It now says it will take additional time to gather feedback from customers, advocacy groups, researchers, and others before releasing these critical child safety features.
Apple did not offer any further guidance about what form those changes might take or how the features would operate. Privacy advocates and security researchers are cautiously optimistic about the pause.
Alex Stamos, former chief security officer at Facebook and cofounder of the cybersecurity consultancy Krebs Stamos Group, believes this is a smart move by Apple. The problem is complex, and there are many trade-offs the company will need to weigh before arriving at a solution.
CSAM scanners work by generating cryptographic hashes of known abusive images, a kind of digital signature, and then combing through huge quantities of data for matches. Many companies already do some form of this, including Apple for iCloud Mail. The company planned to extend that scanning to iCloud Photos, but it also proposed an additional step: checking those hashes on your device if you have an iCloud account.
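As a rough illustration of the hash-matching approach, here is a minimal sketch in Python. It uses exact SHA-256 digests for simplicity, whereas Apple's proposed NeuralHash is a perceptual hash designed to survive resizing and re-encoding; the hash set, file names, and directory layout below are hypothetical placeholders, not real data or Apple's implementation.

```python
import hashlib
from pathlib import Path

# Hypothetical set of digests of known abusive images, standing in for the
# hash list a clearinghouse such as NCMEC would supply. Placeholder value only.
KNOWN_BAD_HASHES = {
    "0" * 64,  # dummy digest, not a real entry
}


def sha256_of_file(path: Path) -> str:
    """Compute the SHA-256 digest of a file's raw bytes."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()


def find_matches(photo_dir: Path) -> list[Path]:
    """Return the photos whose digests appear in the known-bad set."""
    return [
        photo
        for photo in sorted(photo_dir.glob("*.jpg"))
        if sha256_of_file(photo) in KNOWN_BAD_HASHES
    ]
```

The matching itself is unremarkable; the controversy turned on where it runs, on a company's servers, as with iCloud Mail, or on the user's own device.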
There is no safe way to accomplish what they propose.
The introduction of the ability to compare images on your phone against a set of known CSAM hashes, provided by the National Center for Missing and Exploited Children, immediately raised concerns that the tool could be put to other uses. Riana Pfefferkorn, a research scholar at the Stanford Internet Observatory, says Apple would have made available on every phone a CSAM-scanning feature that governments could, and would, subvert to make Apple search for other material as well.
Apple has in the past refused multiple requests from the United States government to build a tool that would let law enforcement unlock and decrypt iOS devices. But the company has also made concessions to China and other countries, where customer data is stored on state-owned servers. At a time when encryption is under broader attack from legislators all over the globe, the introduction of the CSAM tool was especially fraught.
Apple evidently finds this politically difficult, says Johns Hopkins University cryptographer Matthew Green, which he believes shows how untenable its stance that it will never accept government pressure really is. If the company feels it has to scan, he argues, it should scan unencrypted files on its servers, the standard practice at companies like Facebook, which scan for CSAM, terrorist content, and other prohibited types of material. Green also suggests that Apple make iCloud storage end-to-end encrypted, so that it cannot view those images even if it wanted to.
Apple's plans were also controversial on technical grounds. Hashing algorithms can generate false positives, incorrectly identifying images as matches when they are not. These errors, known as collisions, are especially alarming in the context of CSAM. Shortly after Apple's announcement, researchers found collisions in the iOS NeuralHash algorithm Apple intended to use. Apple said at the time that the version of NeuralHash being studied was not exactly the one that would ship with the scheme, and that the system was accurate. Collisions may not have a significant impact in practice, says Paul Walsh, founder and CEO of the security firm MetaCert, because Apple's system requires 30 matching hashes before sounding any alarm, after which human reviewers can tell what is CSAM and what is a false positive.
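To make the role of that threshold concrete, here is a minimal sketch, again in Python, of how a match count might gate escalation to human review. The 30-match figure comes from Apple's public description; everything else, including the class and method names, is an illustrative assumption, since the real system relies on cryptographic techniques such as threshold secret sharing rather than a simple counter.

```python
from dataclasses import dataclass

# Illustrative constant: Apple said roughly 30 matches would be required
# before anything is surfaced for human review.
MATCH_THRESHOLD = 30


@dataclass
class AccountScanState:
    """Tracks which of an account's photos matched the known-hash set."""
    matched_photos: list[str]

    def needs_human_review(self) -> bool:
        # A handful of accidental collisions stays below the threshold and is
        # never escalated; only a large cluster of matches triggers review.
        return len(self.matched_photos) >= MATCH_THRESHOLD


if __name__ == "__main__":
    # Two stray collisions on an innocent account: no alarm.
    innocent = AccountScanState(matched_photos=["IMG_0042.jpg", "IMG_0097.jpg"])
    print(innocent.needs_human_review())  # False

    # Thirty or more matches: escalate so humans can separate CSAM
    # from false positives.
    flagged = AccountScanState(matched_photos=[f"IMG_{i:04}.jpg" for i in range(30)])
    print(flagged.needs_human_review())  # True
```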