Apple says it will begin scanning iCloud Photos for child abuse images – TechCrunch

Apple says it will start scanning iCloud Photos for known child abuse imagery, but security and privacy experts are already pushing back on the feature.

Apple will soon roll out technology that lets it detect and report child sexual abuse material (CSAM) to law enforcement in a way it says will preserve user privacy. Apple told TechCrunch that CSAM detection is one of several new features aimed at protecting children, alongside filters to block potentially sexually explicit images from being sent or received through a child's iMessage account, and a feature that intervenes when a user tries to search Siri or Search for CSAM-related terms.

Most cloud services, including Dropbox, Google, and Microsoft, already scan user files for illegal content. Apple, however, has long resisted scanning users' files in the cloud, instead giving users the option to encrypt their data before it reaches Apple's iCloud servers.

Apple said its new CSAM detection technology, NeuralHash, works on a user's device instead, and can identify if a user uploads known child abuse imagery to iCloud without decrypting the images until a threshold is met and a sequence of checks to verify the content is cleared.

News of Apple's effort leaked Wednesday when Matthew Green, a cryptography professor at Johns Hopkins University, revealed the existence of the new technology in a series of tweets. The news was met with some resistance from security experts and privacy advocates, but also from users accustomed to Apple's approach to security and privacy, which differs from that of most other companies.

Apple is trying to calm fears by baking in privacy through multiple layers of encryption, fashioned so that multiple steps are required before anything reaches Apple's final manual review.

NeuralHash will arrive in iOS 15 and macOS Monterey, due out in the coming months. It works by converting the photos on a user's iPhone or Mac into a unique string of letters and numbers, known as a hash.
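Apple has not published NeuralHash's internals, but the general idea of a perceptual hash can be sketched with a much simpler, purely illustrative "average hash": reduce the image to a grid of brightness values and record which pixels are brighter than average. Small edits tend to leave the fingerprint unchanged.

```python
# Toy "average hash" -- nothing like Apple's NeuralHash internally,
# but it illustrates reducing an image to a short fingerprint that
# survives small edits.

def average_hash(pixels):
    """pixels: 2-D list of grayscale values (0-255).
    Returns a hex-string fingerprint."""
    flat = [p for row in pixels for p in row]
    avg = sum(flat) / len(flat)
    # One bit per pixel: 1 if brighter than average, else 0.
    bits = "".join("1" if p > avg else "0" for p in flat)
    return f"{int(bits, 2):0{len(flat) // 4}x}"

original = [[10, 200], [10, 200]]    # toy 2x2 "image"
brightened = [[12, 202], [12, 202]]  # small global edit
print(average_hash(original) == average_hash(brightened))  # True
```

A conventional cryptographic hash would change completely under that edit; the whole point of a perceptual hash is that visually similar images collide on purpose.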
Any small change to an image, such as cropping or editing, would ordinarily alter the hash and prevent matching. Apple says NeuralHash tries to ensure that identical and visually similar images, such as cropped or edited versions, resolve to the same hash.

Before an image is uploaded to iCloud Photos, its hash is matched on the device against a database of known hashes of child abuse imagery, provided by child protection organizations like the National Center for Missing & Exploited Children (NCMEC) and others. NeuralHash uses a cryptographic technique called private set intersection to detect a match without revealing what the image is or alerting the user.

The results are uploaded to Apple but cannot be read on their own. Apple uses another cryptographic principle called threshold secret sharing, which allows it to decrypt the contents only if a user crosses a threshold of known child abuse imagery in their iCloud Photos. Apple would not say what that threshold is, but said that, for example, if a secret is split into a thousand pieces and the threshold is ten images of child abuse content, the secret can be reconstructed from any of those ten images.

Once the threshold is crossed, Apple can decrypt the matching images, manually verify the contents, disable the user's account, and report the imagery to NCMEC, which then passes it on to law enforcement. Apple argues this approach is more privacy-conscious than blanket scanning of files in the cloud, because NeuralHash searches only for known child abuse imagery. Apple said the chance of a false positive is about one in a trillion, and that an appeals process is in place if an account is flagged by mistake.

Apple has published technical details about NeuralHash on its website, which have been reviewed by cryptography experts. But despite broad support for efforts to combat child sexual abuse, there is still an element of surveillance that many would be uncomfortable handing over to an algorithm.
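The threshold mechanism the article describes resembles textbook threshold secret sharing, best known as Shamir's scheme. This toy sketch is not Apple's actual construction (which ties shares to per-image "safety vouchers"), but it demonstrates the property described above: a secret split into a thousand pieces with a threshold of ten can be recovered from any ten of them, while fewer reveal nothing.

```python
import random

# Toy Shamir secret sharing over a prime field: the secret is the
# constant term of a random degree (threshold - 1) polynomial, and
# each share is one point on that polynomial.

PRIME = 2**61 - 1  # field modulus (a Mersenne prime)

def make_shares(secret, threshold, n):
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, n + 1)]

def recover(shares):
    # Lagrange interpolation at x = 0 recovers the constant term.
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        # pow(den, PRIME - 2, PRIME) is the modular inverse of den.
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret

shares = make_shares(secret=42, threshold=10, n=1000)
print(recover(shares[:10]))  # any 10 of the 1,000 shares suffice: 42
```

With only nine shares the interpolation yields an essentially random field element, which is why, in Apple's framing, nothing is decryptable until the threshold is crossed.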
Security experts have called for more public discussion before Apple rolls the technology out.

The big question is why now. Apple said its privacy-preserving CSAM detection did not exist until now. But companies like Apple have also faced considerable pressure from the U.S. government and its allies to weaken or backdoor the encryption that protects their users' data, so that law enforcement can investigate serious crimes. Tech companies have refused efforts to backdoor their systems, but have faced resistance when trying to shut out government access further. Although data stored in iCloud is encrypted in a way that Apple can still access, Reuters reported that Apple abandoned plans to fully encrypt users' phone backups to iCloud over concerns it could harm investigations.

The announcement of Apple's new CSAM detection tool also prompted concern that the technology could be abused to flood victims with child abuse imagery, which could get their accounts flagged and shut down. Apple downplayed those concerns, saying a manual review would look for evidence of possible misuse.

Apple said NeuralHash will roll out in the U.S. first, but would not say when, or if, it would become available internationally. Facebook and other companies recently had to switch off their child abuse detection tools in the European Union after the practice was inadvertently banned there. Apple said the feature is technically optional, in that users don't have to use iCloud Photos, but will be mandatory for those who do. After all, your device may be yours, but Apple's cloud is not.