Apple Says It Won't Let Governments Co-Opt CSAM Detection Tools

Apple has defended controversial new tools it plans to launch to help identify and report child sex abuse material (CSAM) on its platforms, after the plans drew a wave of criticism.

The company revealed the updates last week; they will arrive with the iOS 15 and iPadOS 15 releases. The first tool scans photos uploaded to iCloud for signs of known CSAM. The second scans iMessages sent to or from child accounts, with the aim of stopping minors from sending or receiving explicit images. Here's a detailed overview of both features and our concerns.

Apple had barely announced its plans last Wednesday before it was met with a passionate outcry from civil liberties organizations, which characterized the proposed changes as well-intentioned but also as a slippery slope toward a dangerous erosion of personal privacy.

On Monday, Apple responded to many of those concerns, specifically denying that its scanning tools could ever be repurposed to search users' phones or computers for material other than CSAM. Critics fear that a government (ours, or another's) could pressure Apple to add to or alter the features to make them a more effective tool for law enforcement.

In an uncommon instance of a company making a firm promise not to do something, Apple stated that it would not expand its scanning capabilities. According to Apple:

Apple will not comply with any government demands. Apple's CSAM detection capability was created to identify known CSAM images in iCloud Photos that have been identified and verified by NCMEC and other child safety organizations. We have repeatedly refused requests to implement government-mandated changes that compromise the privacy of users, and we will continue to refuse them in the future.

During a question-and-answer session with reporters on Monday, Apple also clarified that the features are only being launched in the United States for now. Some have raised concerns that foreign governments could corrupt or subvert the tools and use them for surveillance; Apple said it would carefully evaluate each country before releasing the tools abroad, to make sure there is no potential for abuse.

Understandably, this whole thing has been confusing for a lot of people, and there are still questions about how the tools will actually work and what they mean for privacy and device autonomy. Here are a few points Apple has recently clarified.

Strangely, CSAM detection requires iCloud Photos to be turned on for it to work. There has been some confusion on this point, but Apple only searches content that is already being shared with its cloud system. Critics have pointed out that this makes it extremely easy for abusers to evade Apple's informal dragnet: all they would need to do to hide CSAM content on their phones is opt out of iCloud. Apple said Monday that it still believes the system will be effective.

Apple is not loading a database of child porn onto your phone. The company also clarified Monday that it will not be downloading actual CSAM to your device. Instead, it uses a database of hashes, digital fingerprints of specific, known images of child abuse represented as numerical code. That code is loaded into the phone's operating system, allowing photos uploaded to iCloud to be automatically compared against the hashes in the database. If an image doesn't match, Apple ignores it.
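To make the idea of hash matching concrete, here is a minimal sketch in Swift of what comparing an image against a set of known fingerprints could look like. It is only an illustration, not Apple's implementation: the real system reportedly uses a perceptual "NeuralHash" and a cryptographic matching protocol rather than the plain SHA-256 comparison shown here, and the function names and placeholder hash below are hypothetical.

```swift
import Foundation
import CryptoKit

// Hypothetical database of known-image fingerprints (hex-encoded hashes).
// The single placeholder entry is simply the SHA-256 of empty data.
let knownHashes: Set<String> = [
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
]

// Fingerprint an image and check it against the known-hash database.
// Photos whose fingerprints are not in the set are simply ignored.
func matchesKnownImage(_ imageData: Data) -> Bool {
    let digest = SHA256.hash(data: imageData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownHashes.contains(hex)
}

print(matchesKnownImage(Data()))  // true, because of the placeholder entry
```

The key point the sketch illustrates is that only fingerprints, never the images themselves, need to be stored on the device.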
iCloud will be scanning more than just new photos. In addition to photos uploaded via iCloud going forward, Apple plans to scan all photos already stored on its cloud servers, a point the company confirmed during Monday's conference call with reporters.

Apple says the iMessage update does not share any information with Apple or with law enforcement. The feature alerts parents if their child sends or receives a message that Apple's algorithm has deemed sexually explicit, but the company says it does not give Apple access to the communications and does not share information with Apple, NCMEC, or law enforcement. According to the company, the feature is only available for accounts set up as families in iCloud.

Even with all these assurances, security experts and privacy advocates are not impressed. Matthew Green, a well-known security expert, posed a hypothetical scenario on Monday that proved controversial enough to spark a Twitter dispute between Edward Snowden, ex-Facebook security chief Alex Stamos, and others in the replies.

Suffice it to say that many people still have questions, and we're all in a pretty confusing, messy place here. It's hard to fault Apple's mission, but the sheer power of the technology it is deploying has caused alarm, to say the least.