WhatsApp lead and other tech experts fire back at Apple's Child Safety plan

Will Cathcart, the head of WhatsApp, says the messaging app won't adopt Apple's newly announced Child Safety measures, which are meant to stop the spread of child abuse imagery. In a Twitter thread, he explained his belief that Apple has built software that can scan all the private photos on your phone, and said that Apple has taken the wrong path in trying to improve its response to child sexual abuse material, or CSAM.

Apple announced on Thursday a plan to take hashes of images uploaded to iCloud and compare them against a database containing hashes of known CSAM images (a rough sketch of this matching flow appears below). Apple says this design lets it protect user data and run the analysis on-device, while still allowing it to report users to the authorities if they are found to be sharing child abuse imagery. Another prong of Apple's Child Safety strategy can notify parents if their child, when under 13 years old, sends or views photos containing sexually explicit content. Apple acknowledged in an internal memo that people would be worried about the implications of these systems.

"I read the information Apple put out yesterday and I'm concerned. I think this is the wrong approach and a setback for people's privacy all over the world. People have asked if we'll adopt this system for WhatsApp. The answer is no." Will Cathcart (@wcathcart), August 6, 2021

Cathcart finds Apple's approach very concerning, saying that the different countries where iPhones are sold will have different definitions of what kinds of images are acceptable. He also points to WhatsApp's own system for fighting child exploitation, which partly relies on user reports and preserves encryption like Apple's; Cathcart says it led the company to report more than 400,000 cases to the National Center for Missing & Exploited Children in 2020. (Apple is also working with the center for its CSAM detection.)

WhatsApp's owner, Facebook, has its own reasons to attack Apple over privacy concerns. Apple's ad-tracking changes in iOS 14.5 sparked a fight between the two companies, with Facebook buying newspaper ads criticizing the changes as harmful to small businesses. Apple retorted that the change only requires that users be given the choice of whether to be tracked.

But it's not just WhatsApp that has criticized Apple's Child Safety measures. Edward Snowden, the Electronic Frontier Foundation, professors, and others have also raised concerns. Here are some of the reactions to Apple's new policy.

Matthew Green, an associate professor at Johns Hopkins University, pushed back on the feature before it was publicly announced, tweeting about Apple's plans and how the hashing system could be abused by governments and malicious actors.

"These tools will allow Apple to scan your iPhone photos for photos that match a specific perceptual hash, and report them to Apple servers if too many appear." Matthew Green (@matthew_d_green), August 5, 2021

The EFF released a statement blasting Apple's plan, describing how the Child Safety measures could be abused by governments and how they decrease user privacy.

"Apple's filtering of iMessage and iCloud is not a slippery slope to backdoors that suppress speech and make our communications less secure. We're already there: this is a fully-built system just waiting for external pressure to make the slightest change. https://t.co/f2nv062t2n" EFF (@EFF), August 5, 2021
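To make the mechanism Green and the EFF are describing more concrete, here is a minimal sketch of threshold-based perceptual-hash matching. It is an illustration only, not Apple's actual system: Apple uses its own NeuralHash function plus cryptographic protections (such as private set intersection) that are not modeled here, and every name and value in the snippet (KNOWN_HASHES, MATCH_THRESHOLD, MAX_HAMMING_DISTANCE, should_flag_account, and so on) is invented for the example.

    # Toy illustration of hash matching against a fixed database with a
    # reporting threshold. NOT Apple's system: NeuralHash and the private
    # set intersection protocol are not modeled. All values are invented.

    KNOWN_HASHES = {0x9F3A6C01DEADBEEF, 0x0123456789ABCDEF}  # stand-in database
    MAX_HAMMING_DISTANCE = 4   # perceptual hashes tolerate small image edits
    MATCH_THRESHOLD = 10       # invented; real systems report only past a threshold

    def hamming_distance(a: int, b: int) -> int:
        # Number of differing bits between two 64-bit hashes.
        return bin(a ^ b).count("1")

    def is_match(photo_hash: int) -> bool:
        # A photo "matches" if it is near any hash in the known database.
        return any(hamming_distance(photo_hash, known) <= MAX_HAMMING_DISTANCE
                   for known in KNOWN_HASHES)

    def should_flag_account(photo_hashes: list[int]) -> bool:
        # Flag only once matches cross the threshold, mirroring the flow
        # Green describes: too few matches and nothing is reported.
        return sum(1 for h in photo_hashes if is_match(h)) >= MATCH_THRESHOLD

Even this toy version makes the critics' core point visible: the code flags whatever KNOWN_HASHES contains, with no notion of what the images actually depict, so the system's scope is set entirely by whoever controls the database. That is exactly the pressure point the reactions below keep returning to.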
Kendra Albert, an instructor at Harvard's Cyberlaw Clinic, wrote a thread on the potential dangers to queer children and Apple's initial lack of clarity around the age ranges for the parental notification feature.

"The idea that parents are safe people for teens to have conversations about sex or sexting with is admirable, but in many cases, not true. And as far as I can tell, this stuff doesn't just apply to kids under 13." Kendra Albert (@KendraSerra), August 5, 2021

"EFF reports that the iMessage nudity notifications will not go to parents if the kid is 13-17, but that is not anywhere in the Apple documentation that I can find. https://t.co/Ma1BdyqZfW" Kendra Albert (@KendraSerra), August 6, 2021

Edward Snowden retweeted the Financial Times article on the system and offered his own characterization of what Apple is doing.

"Apple plans to modify iPhones to constantly scan for contraband. 'It is an absolutely appalling idea, because it is going to lead to distributed bulk surveillance of our phones and laptops,' said Ross Anderson, professor of security engineering. https://t.co/rS92HR3pUZ" Edward Snowden (@Snowden), August 5, 2021

Politician and game developer Brianna Wu called the system the worst idea in Apple history.

"This is the worst idea in Apple history, and I don't say that lightly. It destroys their credibility on privacy. It will be abused by governments. It will get gay kids killed and disowned. This is the worst idea ever. https://t.co/M2EIn2jUK2" Brianna Wu (@BriannaWu), August 5, 2021

The pseudonymous security commentator SwiftOnSecurity stressed that the system only matches known images.

"Let me be clear: Apple's scanning does not detect photos of child abuse. It detects a list of known banned images added to a database, which are initially child abuse imagery found circulating elsewhere. What images are added over time is arbitrary. It doesn't know what a child is." SwiftOnSecurity (@SwiftOnSecurity), August 5, 2021

Security researcher Matt Blaze also tweeted about concerns that the technology could be abused by overreaching governments trying to block content beyond CSAM.

"In other words, not only does the policy have to be exceptionally robust, so does the implementation." matt blaze (@mattblaze), August 6, 2021

Epic CEO Tim Sweeney also criticized Apple, arguing that the company already uploads everyone's data to iCloud by default.

"It's atrocious how Apple vacuums up everybody's data into iCloud by default, hides the 15+ separate options to turn it off in Settings underneath your name, and forces you to have an unwanted email account. Apple would NEVER allow a third party to ship an app like this." Tim Sweeney (@TimSweeneyEpic), August 6, 2021

Sweeney added that he would share more detailed thoughts later.

Not every reaction has been critical, however. Ashton Kutcher, who has done advocacy work to end child sex trafficking since 2011, called Apple's work a significant step forward in efforts to eliminate CSAM.