Apple Introducing New Child Safety Features, Including Scanning Users' Photo Libraries for Known Sexual Abuse Material

Apple has previewed new child safety features that will come to its platforms with software updates later this year. Apple said the features will be available only in the United States at launch and will expand to other regions over time.

Communication Safety

The new Communication Safety feature will be available in the Messages app on iPhone, iPad, and Mac. It is designed to warn children and their parents when sexually explicit photos are received or sent. Apple said the Messages app will use on-device machine learning to analyze image attachments, and if a photo is determined to be sexually explicit, it will be automatically blurred and the child will be warned.

If a child attempts to view a flagged photo in the Messages app, they will be warned that the photo may contain private body parts and could be hurtful. Depending on the child's age, parents will also be notified if their child proceeds to view a sensitive photo or sends a sexually explicit photo to another person after being warned.

Apple said Communication Safety will arrive in updates to iOS 15, iPadOS 15, and macOS Monterey later this year for accounts set up as families in iCloud. Apple emphasized that iMessage conversations will remain protected with end-to-end encryption, so private communications stay unreadable to Apple.

Scanning Photos for Child Sexual Abuse Material (CSAM)

Second, starting this year with iOS 15 and iPadOS 15, Apple will be able to detect known Child Sexual Abuse Material (CSAM) images stored in iCloud Photos, enabling it to report these instances to the National Center for Missing and Exploited Children (NCMEC), a non-profit organization that works in collaboration with U.S. law enforcement agencies.

Apple says its method of detecting known CSAM is designed with user privacy in mind. Rather than scanning images in the cloud, the system performs on-device matching against a database of known CSAM image hashes provided by the NCMEC and other child safety organizations. Apple said it transforms this database into an unreadable set of hashes that is securely stored on users' devices.

The hashing technology, called NeuralHash, analyzes an image and converts it to a unique number specific to that image, according to Apple. As explained in Apple's new white paper "Expanded Protections for Children," the main purpose of the hash is to ensure that identical and visually similar images produce the same hash, while images that differ from one another produce different hashes. For example, an image that has been slightly cropped, resized, or converted from color to black and white is treated as identical to its original and has the same hash.
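Apple has not published the internals of NeuralHash, but the general behavior of a perceptual hash can be illustrated with a much simpler, conventional scheme. The sketch below is an ordinary "average hash" over an 8x8 grayscale thumbnail, not Apple's algorithm; the function names and the thumbnail size are assumptions made purely for illustration. It shows the property described above: identical or near-identical images yield the same compact fingerprint, and a Hamming distance between fingerprints indicates how much two images differ.

```swift
// A minimal sketch of a conventional perceptual "average hash" -- NOT Apple's
// NeuralHash. It only illustrates the property described above: identical and
// visually similar images produce the same (or nearly the same) hash, while
// different images produce different hashes.
//
// Input: an 8x8 grayscale thumbnail with brightness values in 0...1.
func averageHash(thumbnail: [[Double]]) -> UInt64 {
    precondition(thumbnail.count == 8 && thumbnail.allSatisfy { $0.count == 8 },
                 "expects an 8x8 grayscale thumbnail")
    let pixels = thumbnail.flatMap { $0 }
    let mean = pixels.reduce(0, +) / Double(pixels.count)
    var hash: UInt64 = 0
    // Set bit i when pixel i is brighter than the thumbnail's mean brightness.
    for (i, value) in pixels.enumerated() where value > mean {
        hash |= (1 as UInt64) << i
    }
    return hash
}

// Number of differing bits between two hashes: 0 means the thumbnails hash
// identically; small values suggest visually similar images.
func hammingDistance(_ a: UInt64, _ b: UInt64) -> Int {
    (a ^ b).nonzeroBitCount
}
```

In a system like the one Apple describes, such fingerprints would be compared against the unreadable on-device set of known hashes rather than between raw images.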
Apple said that before an image is stored in iCloud Photos, an on-device matching process is performed against the unreadable set of known CSAM hashes. If there is a match, the device creates a cryptographic safety voucher, which is uploaded to iCloud Photos along with the image. Once an undisclosed threshold of matches is exceeded, Apple can interpret the contents of the vouchers for the CSAM matches, manually review each report to confirm there is a match, disable the user's iCloud account, and send a report to the NCMEC. Apple does not disclose the exact threshold, but says it ensures that accounts are not flagged incorrectly. (A simplified sketch of this threshold flow appears at the end of this article.)

Apple said its method of detecting known CSAM provides "significant privacy advantages" over existing techniques: the system can identify known CSAM stored in iCloud Photos accounts while protecting user privacy; users cannot learn anything about the set of known CSAM images used for matching, which prevents malicious use of the database's contents; the system is extremely accurate, with an error rate of less than one in one trillion accounts per year; and it is significantly more privacy-preserving than cloud-based scanning, because it only reports users who have a collection of known CSAM stored in iCloud Photos.

The underlying technology behind Apple's system is complex, and the company has published a technical summary that provides more information.

In a statement, the National Center for Missing & Exploited Children said: "Apple's expanded protection for children is a game-changer. These new safety measures offer lifesaving potential for children who are being enticed online and whose horrific images are being circulated. The National Center for Missing & Exploited Children knows this crime cannot be stopped unless we remain steadfast in our commitment to protecting children. Technology partners like Apple are essential in enabling us to achieve this goal. The reality is that privacy and child protection can coexist. We applaud Apple and look forward to working with them to make the world a safer place for children."

Siri and Search: Expanded CSAM Guidance

Apple also said it will expand guidance in Siri and Search across its devices, providing additional resources to help parents and children stay safe online and get help in unsafe situations. For example, users who ask Siri how to report child exploitation or CSAM will be pointed to resources for where and how to file a report.

Apple said the updates to Siri and Search are coming later this year in iOS 15, iPadOS 15, watchOS 8, and macOS Monterey.
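To make the threshold mechanism from the CSAM-detection section above more concrete, here is a deliberately simplified sketch. Apple's actual design relies on cryptographic techniques (its technical summary describes private set intersection and threshold secret sharing) so that individual matches are not revealed to anyone before the threshold is crossed; none of that cryptography is modeled here. Every name in this sketch (SafetyVoucher, makeVoucher, AccountVoucherQueue) and the threshold value are hypothetical, chosen only to show the bookkeeping idea.

```swift
import Foundation

// Deliberately simplified model of the per-account match threshold described
// in the article. In Apple's real design the match result is cryptographically
// hidden from the server until the threshold is exceeded; here it is a plain
// boolean purely for illustration.

struct SafetyVoucher {
    let imageID: UUID
    let matchesKnownHash: Bool   // hidden from the server in the real system
}

// Device side: compare an image's perceptual hash against the on-device,
// unreadable set of known-CSAM hashes and attach a voucher to the upload.
func makeVoucher(imageHash: UInt64, knownHashes: Set<UInt64>) -> SafetyVoucher {
    SafetyVoucher(imageID: UUID(), matchesKnownHash: knownHashes.contains(imageHash))
}

// Server side: vouchers accumulate per account; only when the number of
// matches reaches the threshold could human review (and, if confirmed,
// a report to NCMEC) begin. The real threshold value is undisclosed.
struct AccountVoucherQueue {
    let threshold: Int
    private(set) var vouchers: [SafetyVoucher] = []

    init(threshold: Int) {
        self.threshold = threshold
    }

    mutating func receive(_ voucher: SafetyVoucher) -> Bool {
        vouchers.append(voucher)
        let matchCount = vouchers.filter { $0.matchesKnownHash }.count
        return matchCount >= threshold
    }
}
```

A caller would feed one voucher per uploaded image into the queue and act only when receive(_:) first returns true.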