Apple's controversial new child protection features, explained

Apple has built its reputation on privacy. It has promoted encrypted messaging across its ecosystem, pushed to limit how mobile apps can collect data, and fought law enforcement agencies seeking user records. For the past week, though, Apple has been fighting claims that its upcoming iOS and iPadOS releases will compromise user privacy.

The debate stems from an announcement Apple made on Thursday. The premise is simple: Apple wants to do more to stop child sexual abuse, and it is taking new steps to find and prevent it. Critics say Apple's strategy could weaken users' control over their own phones, leaving them reliant on Apple's promise not to abuse its power. And Apple's response has shown just how complicated, and sometimes confusing, the conversation really is.

What announcements did Apple make last week?

Apple announced three new features that will arrive later this year. All of them are aimed at curbing child sexual abuse, but they target different apps with different feature sets.

The first change affects Apple's Search app and Siri. If users search for topics related to child sexual abuse, Apple will direct them to resources for reporting it or getting help with an attraction to it. The change arrives later this year on iOS 15, watchOS 8, and iPadOS 15.

The other two updates, however, have generated far more backlash. One adds a parental control option to Messages: it hides sexually explicit images for users under 18 and can send parents an alert if a child 12 or under views or sends such pictures.

The last feature scans iCloud Photos images to find child sexual abuse material, or CSAM, and reports it to Apple moderators, who can pass it on to the National Center for Missing and Exploited Children, or NCMEC. Apple says this feature was designed to find illegal content while protecting user privacy. Critics say that same design amounts to a security backdoor.

What is Apple doing with Messages?

Apple is introducing a Messages feature meant to protect children from inappropriate images. If parents opt in, devices with users under 18 will scan incoming and outgoing pictures with an image classifier trained on pornography, looking for sexually explicit content. (Apple says the classifier is not technically limited to nudity, but that a nudity filter is a fair description.) If it detects this content, Messages hides the image in question and asks the user whether they really want to view or send it.

For accounts set up as children in an iCloud family on iOS 15 and iPadOS 15, there is an additional option: if a user 12 or under taps through that warning, Messages can notify their parents. Children will see a caption warning that their parents will be notified, and the parents will not see the actual message. The system does not report anything to Apple moderators or any other party.

The images are detected on-device, and parents are notified only when children confirm that they want to see or send adult content, not when they merely receive it. Even so, Kendra Albert, an instructor at the Harvard Cyberlaw Clinic, has raised concerns about the notifications, arguing that they could encourage parents to surveil queer and transgender children.
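For readers who think in code, here is a rough, purely illustrative sketch of the Messages flow described above. It is not Apple's implementation: the classifier is a stub, and the account fields and function names are invented for this article. The only things it is meant to capture are which conditions hide an image or alert a parent, and that nothing is sent to Apple on any path.

```python
# Purely illustrative sketch of the Messages flow described above -- not Apple's
# code. The classifier is a stub and the field names are invented; the point is
# which conditions hide an image or alert a parent, and that nothing is ever
# sent to Apple or any other outside party.

from dataclasses import dataclass


@dataclass
class Account:
    age: int
    in_icloud_family: bool
    parental_alerts_enabled: bool  # the opt-in parental control


def classifier_says_explicit(image_bytes: bytes) -> bool:
    """Stand-in for the on-device image classifier; the real model is not public."""
    return image_bytes.startswith(b"EXPLICIT")  # toy rule for demonstration only


def handle_image(account: Account, image_bytes: bytes,
                 taps_through_warning: bool) -> dict:
    outcome = {"image_hidden": False, "parent_notified": False, "sent_to_apple": False}

    if not classifier_says_explicit(image_bytes):
        return outcome  # ordinary images pass through untouched

    outcome["image_hidden"] = True  # the image sits behind a warning screen

    if (taps_through_warning
            and account.age <= 12
            and account.in_icloud_family
            and account.parental_alerts_enabled):
        outcome["parent_notified"] = True  # parents get an alert, not the image

    return outcome  # note: "sent_to_apple" stays False on every path


if __name__ == "__main__":
    child = Account(age=11, in_icloud_family=True, parental_alerts_enabled=True)
    print(handle_image(child, b"EXPLICIT...", taps_through_warning=True))
```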
What is Apple's new iCloud Photos scanning system?

The iCloud Photos scanning system is focused on finding child sexual abuse images, which are illegal to possess. If you are a US-based iOS or iPadOS user and you sync your pictures with iCloud Photos, your device will check those pictures against a list of known CSAM. If it finds enough matches, it will alert Apple's moderators and reveal the details of the matches. If a moderator confirms the presence of CSAM, Apple will disable the account and report the images to legal authorities.

Is CSAM scanning a novel idea?

Not at all. Facebook, Reddit, and many other companies scan users' files against hash libraries, often using a Microsoft-built tool called PhotoDNA, and they are legally required to report CSAM to the National Center for Missing and Exploited Children, a nonprofit that works alongside law enforcement.

Apple has limited its own efforts until now, though. The company has previously said it uses image matching technology to find child exploitation, but in a call with reporters, it said it had never scanned iCloud Photos data. (It confirmed that it already scans iCloud Mail but offered no further detail about scanning other Apple services.)

What makes Apple's new scanning system different from other companies'?

A typical CSAM scan runs remotely and examines files that are stored on a server. Apple's system, by contrast, checks for matches locally on your iPhone or iPad.

It works like this. When iCloud Photos is enabled on a device, the device uses a tool called NeuralHash to break pictures down into hashes: strings of numbers that identify the unique characteristics of an image but cannot be reconstructed to reveal it. It then compares those hashes against a list supplied by NCMEC, which compiles millions of hashes corresponding to known CSAM content. (Again, as noted above, the list contains hashes, not the actual pictures or videos.)

If Apple's system finds a match, your phone generates a safety voucher that is uploaded to iCloud Photos. Each voucher indicates that a match exists, but it does not alert any moderators on its own, and it encrypts the details so an Apple employee cannot look at it and see which photo matched. If your account generates a certain number of vouchers, however, they are all decrypted and flagged for Apple's human moderators, who can then review the photos and see whether they contain CSAM.

Apple emphasizes that it is looking only at photos synced with iCloud, not ones stored solely on your device. It has told reporters that disabling iCloud Photos completely deactivates every part of the scanning system, including local hash generation. Apple privacy chief Erik Neuenschwander told TechCrunch that if users are not using iCloud Photos, NeuralHash will not run and will not generate any vouchers.

Apple has used on-device processing to bolster its privacy credentials before: iOS performs a lot of AI analysis on the device rather than sending your data to cloud servers, which gives third parties fewer chances to get at it. But the line between local and remote here is hugely contentious, and following the backlash, Apple has spent the past few days drawing extremely subtle distinctions between the two.
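For a more concrete picture of the matching-and-threshold process described above, here is a heavily simplified sketch. It is not NeuralHash and not Apple's protocol: the hash function, the threshold value, and the voucher structure are illustrative stand-ins, and in the real design the match result is itself hidden inside an encrypted voucher until the threshold is crossed.

```python
# Heavily simplified sketch of "hash matching plus a reporting threshold" --
# not NeuralHash and not Apple's protocol. The hash function, threshold, and
# voucher structure are stand-ins.

import hashlib

# Hypothetical list of hashes of known CSAM (in reality supplied by NCMEC as
# opaque hash values, never as the images themselves).
KNOWN_BAD_HASHES = {
    "placeholder-hash-value-1",
    "placeholder-hash-value-2",
}

MATCH_THRESHOLD = 3  # illustrative; Apple has not published its exact number


def image_hash(image_bytes: bytes) -> str:
    """Stand-in hash. A real system uses a perceptual hash so resized or
    recompressed copies of an image still map to the same value; SHA-1 is used
    here only to keep the sketch runnable."""
    return hashlib.sha1(image_bytes).hexdigest()


def generate_vouchers(uploaded_images: list[bytes]) -> list[dict]:
    """One 'safety voucher' per uploaded image, marking whether it matched.
    In the real design the match flag would be encrypted, not stored in the clear."""
    return [{"matched": image_hash(img) in KNOWN_BAD_HASHES}
            for img in uploaded_images]


def should_flag_for_human_review(vouchers: list[dict]) -> bool:
    """Moderators only get involved once the number of matches crosses the threshold."""
    return sum(v["matched"] for v in vouchers) >= MATCH_THRESHOLD
```

The key difference from a typical server-side scan is simply where this logic runs: on the device, before anything reaches Apple's servers.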
Why are some people so upset by these changes?

Before getting to the criticism, it is worth noting that Apple has received praise for these updates from some privacy and security experts, including the prominent cryptographers Mihir Bellare, David Forsyth, and Dan Boneh. In an endorsement published by Apple, Forsyth wrote that the system will substantially increase the likelihood that people who have access to such material are found, while harmless users should see little to no loss of privacy.

But other experts and advocacy groups have come out against the changes. They argue that the Messages and iCloud updates create surveillance systems that work directly from your phone or tablet, that they could provide a blueprint for breaking secure end-to-end encryption, and that even though their scope is limited for now, they could open the door to more troubling invasions of privacy.

An August 6th open letter lays out the complaints in detail. Its description of what is happening runs roughly as follows: although child exploitation is a serious problem and efforts to combat it are well-intentioned, Apple's proposal introduces a backdoor that threatens to undermine fundamental privacy protections for all Apple users. The technology monitors photos saved to or shared from users' iPhones, iPads, and Macs. One system alerts the authorities if a certain number of objectionable photos is detected in iCloud storage. Another notifies a child's parents if iMessage is used to send or receive photos that a machine learning algorithm considers to contain nudity. Because both checks are performed on the user's device, they have the potential to bypass any end-to-end encryption that would otherwise safeguard the user's privacy.

Apple has disputed these characterizations, particularly the term "backdoor" and the description of its system as monitoring photos saved on users' devices. But as explained below, it is asking users to place a lot of trust in Apple at a time when the company faces government pressure around the world.

What is end-to-end encryption?

End-to-end encryption, or E2EE, makes data unreadable to anyone except the sender and the recipient; in other words, not even the company running the app can see it. Less secure systems can still be encrypted, but the company may hold keys to the data, which lets it scan files or grant access to law enforcement. Apple's iMessage uses E2EE; iCloud Photos, like many cloud storage services, does not.

E2EE can be extremely effective, but it does not necessarily stop anyone from seeing data on the phone itself. That leaves the door open for specific kinds of surveillance, including a system Apple is now accused of adding: client-side scanning.

What is client-side scanning?

The Electronic Frontier Foundation has a detailed description of client-side scanning. In short, it involves analyzing files or messages inside an app before they are sent in encrypted form, often to check for objectionable content, and in the process it bypasses the protections of E2EE by targeting the device itself. Erica Portnoy, a senior staff technologist at the EFF, has compared these systems to someone looking over your shoulder while you send a secure message on your phone.

Is Apple doing client-side scanning?

Apple vehemently denies it. In a frequently asked questions document, it says Messages is still end-to-end encrypted, that no details about specific message content are released to anyone, including parents, and that Apple never gains access to communications as a result of the Messages feature.

Apple likewise rejects the framing that it is scanning photos on your device for CSAM. By design, its FAQ says, the feature applies only to photos the user chooses to upload to iCloud Photos; the system does not work for users who have iCloud Photos disabled, and it does not touch your private on-device photo library.

Apple acknowledges that iCloud Photos is not end-to-end encrypted, so it could simply run these scans on its servers, as many other companies do. But it argues its system is actually more private than that. Apple says the odds of incorrectly flagging a given account are about one in one trillion per year, so the vast majority of users will never come near the reporting threshold, and because the scan happens locally, the system reveals nothing about non-matching photos to anyone, which would not necessarily be true if it scanned everything on its servers.
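To make the distinction these sections keep circling, where the scanning happens relative to the encryption, more concrete, here is a toy sketch. Nothing in it is real cryptography or a real scanner; the XOR "encryption" and the keyword check are placeholders invented for illustration. The point is the order of operations: a client-side check runs on the plaintext, on the device, before end-to-end encryption is applied, which is why it can see content the server never could.

```python
# Toy sketch of how client-side scanning relates to end-to-end encryption.
# The XOR "encryption" and the keyword check are placeholders, not real
# cryptography or a real classifier; the point is the order of operations.

SHARED_KEY = b"not-a-real-key"  # with E2EE, only sender and recipient hold this


def toy_encrypt(plaintext: bytes, key: bytes = SHARED_KEY) -> bytes:
    """Placeholder for end-to-end encryption."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(plaintext))


def toy_decrypt(ciphertext: bytes, key: bytes = SHARED_KEY) -> bytes:
    return toy_encrypt(ciphertext, key)  # XOR is its own inverse


def client_side_scan(plaintext: bytes) -> bool:
    """Stub standing in for any on-device check (a classifier, a hash match)."""
    return b"flagged-content" in plaintext


def send(plaintext: bytes) -> bytes:
    # 1. The scan runs on the device, on the *unencrypted* content.
    if client_side_scan(plaintext):
        print("on-device action taken (warn the user, generate a voucher, ...)")
    # 2. Only then is the message encrypted for transport.
    return toy_encrypt(plaintext)


def server_relay(ciphertext: bytes) -> bytes:
    # With E2EE, the server only ever sees ciphertext it cannot read.
    return ciphertext


def receive(ciphertext: bytes) -> bytes:
    return toy_decrypt(ciphertext)


if __name__ == "__main__":
    message = b"hello from one phone to another"
    assert receive(server_relay(send(message))) == message
```

Whether a check placed at that first step "bypasses" end-to-end encryption or merely sits alongside it is, in essence, the disagreement between Apple and its critics.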
Are Apple's arguments convincing?

Not to many of its critics. As Stratechery's Ben Thompson writes, the issue is not whether Apple sends notifications only to parents or limits its searches to certain categories of content; it is that the company is searching through data before it ever leaves your phone.

Instead of adding CSAM scanning to iCloud Photos in the cloud that it owns and operates, Thompson argues, Apple is compromising the phone that you and I own and operate. Yes, you can turn off iCloud Photos to disable Apple's scanning, but that is a policy decision; the capability to reach into a user's phone now exists, and there is nothing an iPhone user can do to get rid of it.

CSAM is abhorrent, and it is illegal. But as the open letter to Apple notes, many countries have pushed to compromise encryption in the name of fighting terrorism, misinformation, and other objectionable content. Now that Apple has set this precedent, it will almost certainly face calls to expand it. And if Apple later rolls out end-to-end encryption for iCloud, something it has reportedly considered but never implemented, it has laid out a possible roadmap for getting around E2EE's protections.

Apple says it will not let anyone abuse its systems, and it points to a long list of safeguards: parents cannot enable alerts for older teenagers in Messages, iCloud's safety vouchers are encrypted, moderators are alerted only after a threshold of matches, and its searches are US-only and strictly limited to NCMEC's databases. Its CSAM detection capability, Apple says, is built solely to detect known CSAM images stored in iCloud Photos that have been identified and verified by NCMEC and other child safety organizations; it has refused government-mandated changes that compromise users' privacy before and says it will keep refusing them; and because the technology is limited to detecting CSAM stored in iCloud, it will not accede to any government's request to expand it.

The trouble is that Apple has the power to change these safeguards. Half the problem, Portnoy argues, is that the system is so easy to change. Apple has stood firm in some clashes with governments; it famously defied a Federal Bureau of Investigation demand for data from a mass shooter's iPhone. But it has acceded to other requests, such as storing Chinese users' iCloud data locally, even as it maintains that doing so has not compromised user security.

Alex Stamos of the Stanford Internet Observatory has also questioned how well Apple worked with the wider community of encryption and safety experts, saying the company declined to take part in a series of discussions about safety, privacy, and encryption and then jumped into the balancing debate and pushed everyone to the extremes with no public consultation or debate.

What are the risks and benefits of Apple's new features?

As usual, it's complicated, and it depends partly on whether you see this change as a limited exception or an opening door.

Apple has legitimate reasons to step up its child protection efforts. In late 2019, The New York Times reported on an epidemic of online child sexual abuse and criticized American tech companies for failing to address the spread of CSAM. In a later article, NCMEC singled out Apple for its low reporting rates compared with peers like Facebook, which was due in part to the fact that Apple didn't scan iCloud files.

Meanwhile, internal Apple documents have acknowledged that iMessage has a sexual predator problem. In documents revealed during the recent Epic v. Apple trial, an Apple department head listed child predator grooming as an active threat to the platform.
Grooming often involves sending children sexually explicit images, or asking them to send such images, which is precisely what Apple's new Messages feature aims to disrupt.

At the same time, Apple itself has called privacy a human right, and phones are increasingly intimate devices full of sensitive information. With its Messages and iCloud changes, Apple has demonstrated two ways to search or analyze content directly on the hardware rather than after you have sent data to a third party, even when it is analyzing data you have consented to share, such as iCloud photos.

Apple has acknowledged some of the objections to its updates, but so far it has given no indication that it plans to modify or abandon them. On Friday, an internal memo acknowledged misunderstandings about the announcement but praised it, saying the new features deliver tools to protect children while maintaining Apple's deep commitment to user privacy, that the company knows some people have misunderstandings and more than a few are worried about the implications, and that it will continue to explain and detail the features so people understand what it has built.