Apple reportedly plans to begin scanning iPhones in the US for child abuse images

Apple is planning to update its iPhone software so it can scan for images of child sexual abuse. According to the Financial Times, Apple has been briefing security experts on its neuralMatch system, which would scan photos stored on US users' iPhones and uploaded to its iCloud backup system. If it detects illegal imagery, the system will alert a team of human reviewers, who would contact law enforcement once the images are verified. According to the report, neuralMatch was developed using data from the National Center for Missing and Exploited Children (NCMEC), and it will initially only be able to detect images on iPhones in the United States.

The move would mark something of a change of heart for Apple, which has historically stood up for privacy and pushed back against law enforcement. The company clashed with the FBI in 2016 after it refused to unlock the iPhone of one of the San Bernardino attackers. CEO Tim Cook said at the time that the government's request was alarming and could have far-reaching consequences, potentially opening the door to broader government surveillance. (The FBI eventually turned to an outside security firm to unlock the phone.)

Security researchers are now raising similar concerns. While there is broad support for fighting child abuse, researchers the FT spoke with worry that authoritarian regimes could abuse such a system to spy on their citizens: a system built to detect one type of imagery could be extended to other kinds of content, such as terrorism-related material or anything a government considers anti-government.

At the same time, Apple and other companies are under increasing pressure to cooperate with law enforcement. As the report notes, social media platforms and cloud storage providers such as iCloud already have systems for detecting child sexual abuse imagery; extending those efforts to images stored on a device, however, would be a major shift for Apple.

Apple declined to comment to the FT, but it could share more about its plans as soon as this week.

Update 8/5 at 4PM ET: Apple has confirmed that it will begin testing a system that can detect images of child abuse stored in iCloud Photos in the United States. The company says its method for detecting known CSAM is designed with user privacy in mind: instead of scanning images in the cloud, the system performs on-device matching against a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms that database into an unreadable set of hashes that is securely stored on users' devices.

The feature will roll out at a later date alongside several other child safety features, including new parental controls that can detect explicit images in children's messages.
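To illustrate the general idea of on-device hash matching, here is a minimal Swift sketch. It is not Apple's implementation: the real system reportedly uses perceptual hashes (which match visually similar images, not exact bytes) and a cryptographic protocol to keep the database unreadable on the device, whereas this example substitutes a plain SHA-256 digest and an in-memory set of hypothetical hash strings.

```swift
import Foundation
import CryptoKit

// Hypothetical placeholder for the on-device database of known image hashes.
// In the described system, these would be derived from NCMEC's database and
// stored in an unreadable (blinded) form rather than as plain hex strings.
let knownHashes: Set<String> = [
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b"
]

// Returns true if the image's digest appears in the known-hash set.
// SHA-256 stands in for a perceptual hash purely for illustration.
func matchesKnownDatabase(imageData: Data) -> Bool {
    let digest = SHA256.hash(data: imageData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownHashes.contains(hex)
}

// Example usage with arbitrary bytes standing in for photo data.
let samplePhoto = Data([0x01, 0x02, 0x03])
print(matchesKnownDatabase(imageData: samplePhoto)) // false
```

The key design point Apple emphasizes is that the matching happens on the device itself, against a transformed copy of the hash database, rather than by scanning users' photos in the cloud.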