Apple today confirmed its plans to scan iPhones for images of child abuse, shortly after reports of the plan surfaced. Details were provided in a news release and a technical summary.

Apple said its method of detecting known child sexual abuse material (CSAM) is designed to protect user privacy. Instead of scanning images in cloud storage, the system performs on-device matching against a database of known CSAM image hashes provided by NCMEC (the National Center for Missing and Exploited Children) and other child-safety organizations. Apple transforms that database into an unreadable set of hashes that is securely stored on users' devices.

Apple gave more detail on the CSAM detection process in the technical summary, saying its threshold is "set to provide an extremely high level of accuracy and ensures less than a one-in-one-trillion chance per year of incorrectly flagging a particular account."

Apple said the changes will arrive "later this year" in updates to iOS 15, iPadOS 15, and watchOS 8. Apple will also deploy software that analyzes images in the Messages app to "warn parents and children when they receive or send sexually explicit photographs."

Apple accused of creating surveillance infrastructure

Despite Apple's assurances, security experts and privacy advocates criticized the plan.

Greg Nojeim, co-director of the Center for Democracy & Technology's Security & Surveillance Project, said that Apple is replacing its industry-standard encrypted messaging system with an infrastructure for surveillance and censorship, one that will be vulnerable to abuse and scope creep not only in the US but around the world. Apple, he argued, should reverse these changes and restore users' trust in the security of the data stored on Apple devices and services.

For years, Apple has held firm against US government pressure to add a "backdoor" to its encryption systems, arguing that doing so would compromise security for all users, and security experts have praised the company for that stance. But by deploying software that scans on devices and shares selected results with authorities, Apple is coming dangerously close to acting as a tool for government surveillance, Johns Hopkins University cryptography professor Matthew Green said on Twitter.

The client-side scanning Apple announced today, Green wrote, could "be a key component in adding surveillance to encrypted message systems," and the ability to add scanning systems like this to E2E [end-to-end encrypted] messaging systems has been a priority for law enforcement around the globe.

Message scanning and Siri interventions

In addition to scanning devices for images that match the CSAM database, Apple said it will update the Messages app to "add new tools to warn children and parents when they receive or send sexually explicit photographs."

"Messages uses on-device machine learning to analyze attachments and determine whether a photo contains sexually explicit material," Apple said, adding that the feature is designed so that Apple does not get access to the messages.

When Messages flags an image, the photo will be blurred, and the child will be warned, presented with helpful resources, and reassured that it is okay not to view it. Parents can be notified if their child views a flagged image. Similar protections apply when a child attempts to send a sexually explicit photo: the child will be warned before the photo is sent, and, Apple said, the parents can receive a message if the child chooses to send it anyway.
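Apple has not published the Messages client code or the classifier behind this feature, so the following is only a minimal sketch of the decision flow described above. The names (SafetyDecision, handle_incoming_photo, parent_notifications_enabled) are invented for illustration, and the on-device classifier is reduced to a boolean input.

```python
# Hypothetical sketch of the Messages communication-safety flow described in
# the article; none of these names come from Apple's code.
from dataclasses import dataclass, field
from typing import List

@dataclass
class SafetyDecision:
    display: str                               # "normal" or "blurred"
    warnings: List[str] = field(default_factory=list)
    notify_parent_if_viewed: bool = False

def handle_incoming_photo(classified_explicit: bool,
                          parent_notifications_enabled: bool) -> SafetyDecision:
    """Decide how Messages would present an incoming photo attachment.

    `classified_explicit` stands in for the verdict of the on-device
    machine-learning classifier, which is not public.
    """
    if not classified_explicit:
        return SafetyDecision(display="normal")
    decision = SafetyDecision(display="blurred")
    decision.warnings.append("This photo may be sensitive. It's okay not to view it.")
    # Parental notification is described as an optional family setting; the
    # exact conditions are an assumption here.
    if parent_notifications_enabled:
        decision.notify_parent_if_viewed = True
    return decision

# Example: a flagged photo on an account where a parent enabled notifications.
print(handle_incoming_photo(classified_explicit=True,
                            parent_notifications_enabled=True))
```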
Apple said it will also update Siri and Search to provide parents and children with expanded information and help if they encounter unsafe situations. Siri and Search will "intervene" in queries related to CSAM and explain to users why the topic is harmful and problematic.

The Center for Democracy & Technology called the photo-scanning in Messages a "backdoor," writing that Apple's scanning mechanism for images in Messages is not an alternative to a backdoor: it is one. Client-side scanning on one "end" of a communication breaks the security of the transmission, the group said, and informing a third party (the parent) about the contents of the communication undermines its privacy. Organizations around the world have cautioned against client-side scanning because it could be used as a way for governments and companies to police the content of private communications.

Apple's technology for analyzing images

Apple's technical summary on CSAM detection includes a few privacy promises in its introduction. It states that Apple learns nothing about images that do not match the CSAM database and that "Apple cannot access metadata or visual derivatives of matched CSAM photos until a threshold number of matches has been reached for an iCloud Photos account."

Apple's hashing technology is called NeuralHash. It "analyzes an individual image and converts it into a unique number that is specific to that image," and, Apple explained, only images that are nearly identical yield the same number; copies of an image that differ in quality or size will still produce the same NeuralHash value. (A simplified stand-in for this kind of perceptual hashing appears in the first sketch below.)

Before an iPhone or any other Apple device uploads an image to iCloud Photos, the "device creates a cryptographic safety voucher that encodes the match result. It also encrypts the NeuralHash of the image and its visual derivative. This voucher is uploaded to iCloud Photos along with the image."

Using "threshold secret sharing," Apple's system "ensures that the contents of the safety vouchers cannot be interpreted unless the iCloud Photos account crosses a threshold of known CSAM content," according to the document. Only when that threshold is exceeded does the cryptographic technology allow Apple to interpret the safety vouchers associated with matching CSAM images. (A textbook version of threshold secret sharing is sketched further below.)

While citing the one-in-one-trillion probability of a false positive, Apple said it also "manually examines all reports to NCMEC to ensure accuracy in reporting." Users who believe their account has been mistakenly flagged can appeal to have it reinstated.

Blinded CSAM database stored on user devices

Apple explained how user devices will store a "blinded" database that lets each device determine when a photo matches an entry in the CSAM database:

Apple first receives the NeuralHashes corresponding to known CSAM from the child-safety organizations. These NeuralHashes then go through a series of transformations, including a final blinding step powered by elliptic curve cryptography. The blinding uses a server-side blinding secret known only to Apple. The blinded CSAM hashes are placed in a hash table, where the position in the table is purely a function of the NeuralHash of the CSAM image. This blinded database is securely stored on users' devices, and the properties of elliptic curve cryptography ensure that no device can infer anything about the underlying CSAM image hashes from it.

An iPhone or other device will then analyze user photos, compute a NeuralHash, and look up the entry in the blinded hash table.
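NeuralHash itself is not described in enough detail in the article to reproduce. As a rough stand-in for the property described above, namely that nearly identical images map to the same number even after resizing or recompression, here is a classic "average hash" over a grid of brightness values. It is illustrative only: none of it is Apple's algorithm, and the toy "photos" are just arrays of numbers.

```python
# Illustrative "average hash"; a stand-in for the perceptual-hash idea, not
# NeuralHash itself.

def downscale(pixels, size=8):
    """Nearest-neighbour downscale of a 2D grid of brightness values."""
    h, w = len(pixels), len(pixels[0])
    return [[pixels[r * h // size][c * w // size] for c in range(size)]
            for r in range(size)]

def average_hash(pixels) -> int:
    """64-bit hash: each bit records whether a cell is brighter than the mean."""
    grid = downscale(pixels)
    flat = [v for row in grid for v in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for v in flat:
        bits = (bits << 1) | (1 if v > mean else 0)
    return bits

# A toy 16x16 "photo" and the same photo upscaled 2x (standing in for a resized
# or re-encoded copy of the same image): both produce the same hash value.
photo = [[(r * 16 + c) % 251 for c in range(16)] for r in range(16)]
bigger = [[photo[r // 2][c // 2] for c in range(32)] for r in range(32)]
assert average_hash(photo) == average_hash(bigger)
print(hex(average_hash(photo)))
```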
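Apple's document names the mechanism "threshold secret sharing," but the article does not spell out the construction. A textbook Shamir t-of-n scheme captures the idea: think of each matching image as contributing one share of a per-account secret; below the threshold the shares reveal nothing, and once the threshold is reached the secret, and with it the voucher contents, becomes recoverable. The sketch below is that standard scheme, not Apple's implementation.

```python
# Textbook Shamir t-of-n secret sharing over a prime field; an illustration of
# the threshold idea, not Apple's construction.
import secrets

P = 2**127 - 1  # prime modulus, large enough for a 128-bit-ish secret

def make_shares(secret: int, threshold: int, count: int):
    """Split `secret` into `count` shares; any `threshold` of them recover it."""
    coeffs = [secret] + [secrets.randbelow(P) for _ in range(threshold - 1)]
    def f(x):  # evaluate the random polynomial at x
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, count + 1)]

def recover(shares):
    """Lagrange interpolation at x = 0 over the prime field."""
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, P - 2, P)) % P
    return total

# Think of the secret as the per-account key protecting voucher contents, and
# of each share as coming from one matching image.
account_key = secrets.randbelow(P)
shares = make_shares(account_key, threshold=5, count=8)

assert recover(shares[:5]) == account_key   # at the threshold: key recovered
assert recover(shares[:4]) != account_key   # below it: interpolation yields an
                                            # unrelated value, revealing nothing
```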
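Apple's one-in-one-trillion figure is quoted above without the parameters behind it. The sketch below shows only the shape of that argument: if each photo has a small, independent chance of a false match, the probability that an account crosses a match threshold is a binomial tail that collapses rapidly as the threshold rises. The photo count, per-photo rate, and threshold used here are invented for illustration and are not Apple's numbers.

```python
# Illustrative only: invented parameters showing how a match threshold drives
# down the chance of an account being flagged by accident.
from math import comb

def prob_falsely_flagged(n_photos: int, p_false_match: float, threshold: int) -> float:
    """P(at least `threshold` false matches among n independent photos)."""
    # For tiny p the terms decay so fast that a short partial sum suffices.
    upper = min(n_photos, threshold + 200)
    return sum(comb(n_photos, k) * p_false_match**k
               * (1 - p_false_match)**(n_photos - k)
               for k in range(threshold, upper + 1))

# Hypothetical numbers: 20,000 uploads a year and a one-in-a-million per-photo
# false-match rate.
print(prob_falsely_flagged(20_000, 1e-6, 1))   # ~2%: one false match is plausible
print(prob_falsely_flagged(20_000, 1e-6, 10))  # ~3e-24: ten false matches is
                                               # astronomically unlikely
```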
The device also uses the blinded hash to derive an encryption key. Combined with the other steps, this ensures that only images matching the CSAM database can be decrypted. Apple wrote:

If the user's image hash matches an entry in the known CSAM hash list, then the NeuralHash of the user image transforms exactly to the blinded hash, since it has gone through the same series of transformations performed at database setup. Based on this property, the server can use the cryptographic header (derived from the NeuralHash) and, using the server-side secret, compute the derived encryption key and decrypt the associated payload data. If the user image does not match, the steps above will not yield the correct derived encryption key, and the server will be unable to decrypt the associated payload data; the server therefore learns nothing about non-matching images. The device does not learn the outcome of the match, because that requires knowledge of the server-side blinding secret. Finally, the client uploads the image to the server along with the voucher containing the encrypted payload data and the cryptographic header.

As mentioned earlier, the technical summary lays out these steps in more detail, and Apple has also published a deeper explanation of "private set intersection," the cryptographic technique used to determine whether a photo matches the CSAM database without revealing the result.
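Putting the pieces together, here is a sketch of the blinded-lookup and derived-key flow described above. Apple's construction uses elliptic-curve operations on NeuralHash values; to stay dependency-free, this stand-in substitutes modular exponentiation for the curve and SHA-256 for NeuralHash, and it checks the derived key with an HMAC rather than encrypting a payload. It mirrors the structure only: the server can derive the key for images whose hash is in the database, and the device never learns the outcome.

```python
# Hypothetical end-to-end sketch; modular exponentiation and SHA-256 stand in
# for Apple's elliptic-curve blinding and NeuralHash. Not secure parameters.
import hashlib, hmac, secrets

PRIME = 2**127 - 1      # toy prime modulus standing in for the elliptic curve
GEN = 5                 # base used to map hashes into the group

def hash_to_group(image_hash: bytes) -> int:
    """Map a hash value into the group (stand-in for hashing to a curve point)."""
    e = int.from_bytes(hashlib.sha256(image_hash).digest(), "big")
    return pow(GEN, e, PRIME)

def derive_key(shared: int) -> bytes:
    return hashlib.sha256(str(shared).encode()).digest()

def table_slot(image_hash: bytes) -> int:
    """Table position is purely a function of the hash, as the summary says."""
    return int.from_bytes(hashlib.sha256(image_hash).digest()[:4], "big")

# --- Server setup: blind the known hashes with a secret only the server knows
# --- and build the table that ships to devices.
server_secret = secrets.randbelow(PRIME - 2) + 1
known_hashes = [b"known-image-1", b"known-image-2"]          # stand-in values
blinded_table = {table_slot(h): pow(hash_to_group(h), server_secret, PRIME)
                 for h in known_hashes}

# --- Device side: build a voucher without learning whether the photo matched.
def make_voucher(image_hash: bytes, payload: bytes):
    # In the real table every slot is occupied, so a non-matching photo simply
    # picks up an entry for some other hash; a dummy entry models that here.
    entry = blinded_table.get(table_slot(image_hash), hash_to_group(b"dummy"))
    r = secrets.randbelow(PRIME - 2) + 1
    header = pow(hash_to_group(image_hash), r, PRIME)   # "cryptographic header"
    key = derive_key(pow(entry, r, PRIME))              # device-side derived key
    tag = hmac.new(key, payload, hashlib.sha256).digest()
    return header, payload, tag   # the payload would also be encrypted under key

# --- Server side: it can re-derive the key only if the image hash matched.
def server_can_open(header: int, payload: bytes, tag: bytes) -> bool:
    key = derive_key(pow(header, server_secret, PRIME))
    return hmac.compare_digest(tag, hmac.new(key, payload, hashlib.sha256).digest())

assert server_can_open(*make_voucher(b"known-image-1", b"visual derivative"))
assert not server_can_open(*make_voucher(b"vacation photo", b"visual derivative"))
```

The one-sided outcome in the sketch, where the server can open vouchers only for matching images and the device learns nothing either way, is the "private set intersection" property the summary refers to.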