For years, tech companies have been caught between two competing impulses: protecting users' privacy and detecting the worst abuse on their platforms. Now Apple has unveiled a new cryptographic system that aims to detect child abuse images stored on iCloud without, in theory, introducing new privacy invasions. It has also driven a wedge between cryptography and privacy experts, dividing those who see it as an innovative solution from those who see it as a dangerous surrender to government surveillance.

Apple today introduced a new set of technological measures spanning iMessage, iCloud, Search, and Siri. A new opt-in option for family iCloud accounts will use machine learning to detect nudity in images sent via iMessage; the feature can block those images from being sent or received, display warnings, and in some cases alert parents that a child viewed or sent them. Search and Siri will now warn users who appear to be searching for or viewing child sexual abuse material, also known as CSAM.

The most technically sophisticated, and most controversial, of the new features is a system coming to iPhones, iPads, and Macs that will scan images uploaded to iCloud in the US for known child sexual abuse images. The feature uses a cryptographic process, split between the user's device and Apple's servers, to detect those images and report them to the National Center for Missing and Exploited Children (NCMEC), and ultimately to US law enforcement.

Apple argues that none of the new CSAM features are privacy-invasive. Even the iCloud detection mechanism, the company says, will use clever cryptography to prevent Apple's scanning mechanism from accessing any visible images that aren't CSAM. Apple developed the system in collaboration with Stanford University cryptographer Dan Boneh, and its announcement includes endorsements from several other well-known cryptography experts.

"The Apple PSI system provides a great balance between privacy and utility, and will help identify CSAM content while keeping false positives low and user privacy high," Benny Pinkas, a cryptographer at Israel's Bar-Ilan University, said in a statement to WIRED.

Children's safety organizations immediately applauded Apple's moves, arguing that they strike a balance that "brings me a step closer towards justice for survivors of traumatic moments disseminated online," as Julie Cordua, CEO of Thorn, said in a statement to WIRED.

Other cloud storage providers, such as Dropbox and Microsoft, already perform similar detection on images uploaded to their servers. But some privacy critics argue that by adding image analysis to user devices, Apple has also taken a step toward a troubling new form of surveillance, weakening its historically strong privacy stance under pressure from law enforcement.

"I am not advocating child abuse. However, the idea that your device is continuously scanning and monitoring you based on some criteria for objectionable material and conditionally reporting it to the authorities is a slippery slope," says Nadim Kobeissi, a cryptographer and founder of the Paris-based cryptography software firm Symbolic Software. "If this continues, I will definitely switch to an Android smartphone." The pressure on Apple will come from governments everywhere, he adds, from the US and the UK to India and China: "I am terrified of what that will look like."

Apple's new system is not a straightforward scan of user images, either on the company's devices or on its iCloud servers. Instead, it is a clever and complex new form of image analysis designed to prevent Apple from seeing any photos unless they are already determined to be part of a collection of known CSAM uploaded to iCloud by a user. The system takes a "hash" of every image a user uploads to iCloud, converting the files into strings of characters uniquely derived from those images, and then compares those hashes against hashes of a large collection of known CSAM images provided by NCMEC to find any matches. Apple is also using a new form of hashing it calls NeuralHash, which the company says can match images despite alterations like cropping or colorization.
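To make the matching idea concrete, here is a minimal sketch of how perceptual hashing and comparison can work in general. NeuralHash itself is proprietary and far more robust; the simple "average hash" below, along with the file name, the example hash value, and the match threshold, are illustrative stand-ins, not Apple's implementation.

```python
from PIL import Image

def average_hash(path: str) -> int:
    """Reduce an image to a 64-bit perceptual hash.

    Shrinking to 8x8 grayscale and thresholding each pixel against the
    mean keeps the hash stable across resizing and recompression, a much
    weaker version of the robustness Apple claims for NeuralHash.
    """
    pixels = list(Image.open(path).convert("L").resize((8, 8)).getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Count the bits on which two hashes differ."""
    return bin(a ^ b).count("1")

# Hypothetical values: a made-up known hash, a placeholder file path,
# and a small tolerance for near-duplicate images.
known_hashes = {0x9F3B00C4D2E17A55}
candidate = average_hash("upload.jpg")
if any(hamming(candidate, h) <= 4 for h in known_hashes):
    print("Possible match; Apple's system would hand this to its crypto protocol.")
```

A plaintext comparison like this is exactly what Apple's design avoids shipping to devices, which is where the blinding step described next comes in.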
To prevent anyone from gaming the system, Apple does not download the NCMEC hashes directly to users' devices. Instead, it uses cryptographic tricks to convert them into a "blind" database that is downloaded to the user's phone or computer, containing seemingly meaningless strings of characters derived from those hashes. That blinding prevents any user from obtaining the hashes or using them to circumvent the system's detection.
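As a rough illustration of what "blinding" means here, the sketch below transforms each hash with a server-held secret exponent in a toy discrete-log group. Apple's actual protocol is an elliptic-curve private set intersection scheme; the group, the key handling, and the table format below are simplifications for illustration, not Apple's construction.

```python
import hashlib
import secrets

# Toy discrete-log group: 2**127 - 1 is prime but far too small for real
# security; Apple's protocol uses elliptic curves instead.
P = 2**127 - 1

def hash_to_group(image_hash: bytes) -> int:
    """Map an image hash into the group (a simplified hash-to-group step)."""
    return int.from_bytes(hashlib.sha256(image_hash).digest(), "big") % P

# Server-side secret exponent. Undoing the blinding below without this
# key would require solving the discrete logarithm problem.
server_key = secrets.randbelow(P - 2) + 2

def blind(element: int) -> int:
    """Exponentiate by the server's secret key, hiding the original value."""
    return pow(element, server_key, P)

# The table shipped to devices contains only blinded values, so neither
# the device nor its owner can read off the underlying NCMEC hashes.
ncmec_hashes = [b"hash-of-known-image-1", b"hash-of-known-image-2"]  # placeholders
blinded_table = {blind(hash_to_group(h)) for h in ncmec_hashes}
```

Note that a device holding only blinded_table can neither recover the NCMEC hashes nor blind its own candidate hashes, since it lacks server_key; in Apple's full protocol, additional rounds of cryptography allow a match to be established without ever giving the device that key.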