Apple Employees Internally Raising Concerns Over CSAM Detection Plans

According to Reuters, Apple employees have now joined the chorus of people raising concerns about Apple's plan to scan iPhone users' photo libraries for CSAM (child sexual abuse material). An unspecified number of employees have reportedly used internal Slack channels to voice concerns about CSAM detection, worrying that Apple could be pressured into using the technology for censorship by scanning for content other than CSAM. Some employees are also questioning whether the plan puts Apple's industry-leading privacy reputation at risk.

Workers who requested anonymity told Reuters that an internal Slack channel has been flooded with more than 800 messages about the plan, which was announced a week ago. Those who followed the lengthy thread said many posts expressed concern that the feature could be exploited by repressive governments looking for other material to censor or to use as grounds for arrests. Past security changes at Apple have also prompted concern among employees, but workers found the length and volume of the current debate surprising. Some posters worried that Apple is undermining its reputation for privacy protection.

According to the report, Apple employees working in user-security roles are not believed to have been involved in the internal protest.

Apple has been under fire ever since it announced its CSAM detection plans last week. The features are expected to ship with iOS 15 and iPadOS 15 in the fall. The main concern is that the technology could be exploited by oppressive regimes and governments in the future.

Apple strongly rejects the notion that the on-device CSAM detection technology could be used for other purposes, and it has stated in an FAQ that it will refuse to comply with any such government request:

Could governments force Apple to add non-CSAM images to its hash list?

Apple will not comply with such requests. Apple's CSAM detection technology is designed solely to identify known CSAM images in iCloud Photos that have been identified and verified by NCMEC and other child safety organizations. We have repeatedly refused requests to implement government-mandated changes that compromise the privacy of users, and we will refuse them in the future. This technology can only detect CSAM stored in iCloud, and we will not accept any government request to expand it. Apple also conducts a human review before submitting a report to NCMEC; if the system flags photos that do not match known CSAM images, the account will not be disabled and no report will be filed with NCMEC.

(A simplified sketch of this hash-matching and human-review flow appears at the end of this article.)

An open letter criticizing Apple and asking the company to immediately halt its plan to deploy CSAM detection had gathered more than 7,000 signatures as of this writing. WhatsApp's chief has also weighed in on the debate.
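For readers curious what the hash-list matching described above looks like in practice, here is a minimal sketch. It is not Apple's implementation: Apple's published design uses a perceptual hash (NeuralHash) together with cryptographic techniques such as private set intersection and threshold secret sharing, none of which appear below. The HashListMatcher type, the SHA-256 stand-in for a perceptual hash, and the threshold value are all invented for illustration; the code only shows the general shape of matching images against a known-hash list and gating any action behind a threshold and human review.

```swift
import Foundation
import CryptoKit

// Toy illustration only. Apple's real system uses a perceptual hash
// (NeuralHash) plus cryptographic protocols; this sketch substitutes a
// plain SHA-256 lookup and made-up values just to show the basic shape:
// compare per-image hashes against a list of known hashes, and only
// surface an account for human review once a match threshold is met.
struct HashListMatcher {
    /// Hashes of known images supplied by child-safety organizations
    /// (placeholder data in this example).
    let knownHashes: Set<String>

    /// Minimum number of matches before human review is triggered
    /// (the value used below is assumed, not Apple's).
    let reviewThreshold: Int

    /// Stand-in for a perceptual hash: a hex-encoded SHA-256 digest.
    func digest(of imageData: Data) -> String {
        SHA256.hash(data: imageData)
            .map { String(format: "%02x", Int($0)) }
            .joined()
    }

    /// Counts images whose digests appear on the known list and reports
    /// whether that count reaches the review threshold. A human reviewer
    /// would still need to confirm the matches before any report is made.
    func needsHumanReview(for images: [Data]) -> Bool {
        let matchCount = images.filter { knownHashes.contains(digest(of: $0)) }.count
        return matchCount >= reviewThreshold
    }
}

// Example usage with made-up data.
let matcher = HashListMatcher(knownHashes: [], reviewThreshold: 30)
let queuedPhotos: [Data] = []  // photos queued for iCloud upload
if matcher.needsHumanReview(for: queuedPhotos) {
    print("Threshold met: queue for human review (no automatic report).")
} else {
    print("Below threshold: no action taken.")
}
```

One design point this sketch cannot capture: in the protocol Apple has described, the matching is performed with cryptographic blinding, so the device does not learn the contents of the hash list and Apple learns nothing about photos that do not match, whereas the plain lookup above exposes both.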