Apple Addresses CSAM Detection Concerns, Will Consider Expanding System on Per-Country Basis

Apple announced this week that, starting later this year, it will be able to detect known child sexual abuse material (CSAM) images stored in iCloud Photos, allowing it to report instances to the National Center for Missing & Exploited Children (NCMEC), a non-profit organization that works with law enforcement agencies across the United States.

Some security researchers and others are concerned that governments could force Apple to add images other than CSAM to its hash list for nefarious purposes. Edward Snowden, a prominent whistleblower, said that regardless of Apple's intentions, the company is extending mass surveillance to the entire globe with this move, and that a system that can scan for CSAM today can scan for anything tomorrow. Apple's plans were also criticized by the Electronic Frontier Foundation, which said that even a thoroughly documented, carefully planned, and narrowly scoped backdoor is still a backdoor.

Apple provided additional commentary today to address these concerns.

At launch, Apple's CSAM detection system will be available only in the United States. To address the possibility of some governments attempting to abuse the system, Apple has confirmed to MacRumors that it will evaluate any global expansion plans on a country-by-country basis following a legal assessment. Apple did not give a timeline for expanding the system beyond the United States.

Apple also discussed the possibility of a region of the world attempting to corrupt a safety organization in an effort to abuse the system. Apple noted that the system's first layer of protection is an undisclosed threshold of matches that must be reached before an account is flagged for inappropriate imagery. Even if that threshold is exceeded, Apple said, its manual review process acts as an additional barrier: if the review confirms there is no known CSAM imagery, the flagged user is not reported to NCMEC or any other law enforcement agency, and the system continues to work exactly as intended. (A simplified sketch of this thresholding logic appears at the end of this article.)

Apple also highlighted other supporters of the system, some of whom have praised the company's efforts to combat child abuse. "We support Apple's continued evolution in child online safety," said Stephen Balkam, CEO of the Family Online Safety Institute, adding that tech companies must continue to improve their safety tools to address new risks and actual harms, given the difficulties parents face when protecting their children online.

Apple acknowledged that there is no magic-bullet answer to the risk of the system being abused, but the company said it will continue to use the system solely for CSAM imagery detection.
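
To make the threshold-then-review flow described above a little more concrete, here is a minimal, hypothetical sketch in Swift. It is not Apple's implementation: the real matching mechanism and threshold value are not public, and the type name, property names, and threshold used below are invented purely for illustration. The sketch only captures the idea that nothing becomes eligible for manual review, let alone a report to NCMEC, until an undisclosed number of hash-list matches is exceeded.

```swift
// Hypothetical illustration only: a plain counter standing in for the
// threshold-before-flagging idea described in the article. All names and
// values here are invented for this sketch.
struct MatchThresholdGate {
    /// Number of hash-list matches that must be exceeded before an account
    /// becomes eligible for manual review (the real threshold is undisclosed).
    let reviewThreshold: Int
    private(set) var matchCount = 0

    init(reviewThreshold: Int) {
        self.reviewThreshold = reviewThreshold
    }

    /// Record whether an uploaded image matched the known-CSAM hash list.
    mutating func record(matched: Bool) {
        if matched { matchCount += 1 }
    }

    /// Only after the threshold is exceeded does manual review come into play;
    /// a report would follow only if that review confirms known CSAM imagery.
    var eligibleForManualReview: Bool {
        matchCount > reviewThreshold
    }
}

// Usage: with an assumed threshold of 5, a handful of matches never
// surfaces the account for review.
var gate = MatchThresholdGate(reviewThreshold: 5)
for matched in [true, false, true, true] {
    gate.record(matched: matched)
}
print(gate.eligibleForManualReview)  // false
```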