Apple will scan photos stored on iPhones and iCloud for child abuse imagery

Updated August 5, 2021, 3:21 PM ET: Apple has revealed more information about the Financial Times report and introduced new tools in iMessage that warn children about explicit images. The new features will arrive in the upcoming iOS 15, iPadOS 15, watchOS 8, and macOS Monterey updates. Apple's website has more information. Our original article follows below.

According to the Financial Times, Apple will scan images stored on iPhones and iCloud in search of child abuse imagery. The new system could aid law enforcement in criminal investigations, but it may also open the door to increased government and legal demands for user data.

The Financial Times reported that the system, called neuralMatch, will alert a team of human reviewers if it detects illegal imagery; if the material can be verified, the reviewers would then contact law enforcement. neuralMatch was trained using 200,000 images from the National Center for Missing & Exploited Children and will be available in the United States first. Photos will be hashed and compared against a database of known images of child sexual abuse.

It will first be used in the United States

People briefed on the plans said that every photo uploaded to iCloud in the US will receive a safety voucher indicating whether it is suspect or not. Once a certain number of photos have been marked as suspect, Apple will enable all the suspect photos to be decrypted and, if they appear illegal, pass them on to the appropriate authorities.

Matthew Green, a cryptography professor at Johns Hopkins University, raised concerns about the system on Twitter on Wednesday night. Green said that this sort of tool can be a boon for finding child pornography in people's phones, but asked people to imagine what it could do in the hands of an authoritarian government.

He said that even if Apple doesn't allow these tools to be misused [crossed fingers emoji], there is still a lot to be concerned about: these systems rely on a database of problematic media hashes that you, as a consumer, can't review.

Apple already checks iCloud files against known child abuse imagery, just like every other major cloud provider. But the system described here would go further, allowing central access to local storage. It would also be trivial to extend the system to crimes other than child abuse, a particular concern given Apple's extensive business in China.

Apple informed some US academics about the plans this week, according to two security researchers briefed on an earlier Apple meeting, and the company could share more information about the system this week.

Apple has long touted the privacy protections built into its devices, and it famously resisted the FBI's request to create a backdoor in iOS to access an iPhone used by one of the attackers in the 2015 San Bernardino attack. The company did not comment on the Financial Times report.
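To make the hash-matching step the report describes more concrete, here is a minimal sketch. It is not Apple's implementation: the reported system is said to use a perceptual hash of the image content, while this sketch uses an ordinary SHA-256 digest (which only matches byte-for-byte identical files), and the names KNOWN_HASHES, hash_photo, and is_suspect are invented for illustration.

```python
import hashlib
from pathlib import Path

# Hypothetical database of hex digests of known abuse imagery.
# In the reported system the reference data comes from the National Center
# for Missing & Exploited Children; here it is just an illustrative set.
KNOWN_HASHES = set()  # e.g. populated from a vetted source at startup

def hash_photo(path: Path) -> str:
    """Hash a photo's raw bytes.

    A perceptual hash, as reportedly used, would also match resized or
    re-encoded copies of an image; SHA-256 here only matches exact copies,
    which keeps the sketch simple.
    """
    return hashlib.sha256(path.read_bytes()).hexdigest()

def is_suspect(path: Path) -> bool:
    """Return True if the photo's hash appears in the known-hash database."""
    return hash_photo(path) in KNOWN_HASHES
```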
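The safety-voucher and threshold mechanism described in the report can be sketched in a similarly rough way. Again, this is a hypothetical illustration, not Apple's design: the names SafetyVoucher, AccountState, and FLAG_THRESHOLD are assumptions, and the plain bookkeeping below stands in for whatever cryptographic machinery the real system would use.

```python
from dataclasses import dataclass, field
from typing import List

FLAG_THRESHOLD = 10  # hypothetical; the report only says "a certain number"

@dataclass
class SafetyVoucher:
    photo_id: str
    suspect: bool  # outcome of comparing the photo's hash to the known database

@dataclass
class AccountState:
    vouchers: List[SafetyVoucher] = field(default_factory=list)

    def add_upload(self, voucher: SafetyVoucher) -> bool:
        """Record a voucher for an uploaded photo and report whether the
        account has now crossed the review threshold."""
        self.vouchers.append(voucher)
        suspect_count = sum(1 for v in self.vouchers if v.suspect)
        return suspect_count >= FLAG_THRESHOLD
```

Only once add_upload returns True would the flagged photos go to human review; per the report, the suspect photos would only be decrypted at that point, which this sketch does not model.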