Apple has delayed plans to roll out its child sexual abuse material (CSAM) detection technology, which it announced in chaotic fashion last month, citing feedback from policy groups and customers.
As you may recall, the majority of that feedback has been negative. The Electronic Frontier Foundation said this week that it had gathered more than 25,000 signatures from consumers, and nearly 100 rights and policy groups, including the American Civil Liberties Union, called on Apple to halt its plans to roll out the technology.
In a statement Friday morning, Apple told TechCrunch:
Last month, we announced plans to release features to protect children against predators who use communication tools to recruit and abuse them, and to limit the spread of Child Sexual Abuse Material. We have taken additional time to gather feedback from advocacy groups, researchers, and other stakeholders before we release these critical child safety features.
Apple's NeuralHash technology is designed to identify known CSAM on a user's device without Apple having to possess the image or know its contents. Because photos stored in iCloud can be encrypted end to end so that Apple cannot access them, NeuralHash scans for known CSAM on the user's device, which Apple claims is more privacy-friendly than the blanket scanning that cloud providers perform.
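To give a rough sense of how this kind of matching works (not Apple's actual implementation, which computes NeuralHash values with a neural network), here is a minimal sketch using a simple average hash and a Hamming-distance check. The hash function, the match threshold, and the database of known hashes are all hypothetical stand-ins.

```python
# Minimal sketch of on-device perceptual-hash matching.
# NOTE: NeuralHash is a neural-network-based perceptual hash; the simple
# average hash below is only an illustrative stand-in, and the database,
# threshold, and function names are hypothetical.
from PIL import Image  # Pillow


def average_hash(path: str, hash_size: int = 8) -> int:
    """Compute a simple 64-bit perceptual hash of an image."""
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    avg = sum(pixels) / len(pixels)
    bits = 0
    for pixel in pixels:
        bits = (bits << 1) | (1 if pixel > avg else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")


def matches_known_hashes(photo_path: str, known_hashes: set[int],
                         max_distance: int = 0) -> bool:
    """Return True if the photo's hash matches any hash in the known set."""
    h = average_hash(photo_path)
    return any(hamming_distance(h, known) <= max_distance
               for known in known_hashes)


# Hypothetical usage: a set of known hashes ships to the device, and only
# photos whose hashes match are ever flagged for further review.
known = {0x1234567890ABCDEF}  # placeholder value, not a real CSAM hash
print(matches_known_hashes("photo.jpg", known))
```

The privacy argument rests on the fact that only compact hashes, never the photos themselves or a description of their contents, are compared against the known list.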
Privacy advocates and security experts expressed concern that highly resourced actors, such as governments, could abuse the system to implicate innocent victims, or manipulate it to detect other material that authoritarian states consider objectionable.
Within a few weeks of the announcement, researchers said they were able to create NeuralHash hash collisions, tricking the system into believing that two completely different images were the same.
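A collision simply means that two visually unrelated images produce the same hash, so any matcher built on that hash cannot tell them apart. As an illustrative check only, using the open-source imagehash package's perceptual hash as a stand-in for NeuralHash and hypothetical file names, verifying a claimed collision might look like this:

```python
# Sketch of a collision check: two different images that nonetheless hash
# to the same value are indistinguishable to a hash-based matcher.
# Uses the open-source `imagehash` package as a stand-in for NeuralHash;
# the file names are hypothetical.
from PIL import Image
import imagehash

hash_a = imagehash.phash(Image.open("dog.png"))
hash_b = imagehash.phash(Image.open("noise.png"))

# A Hamming distance of 0 between unrelated images is a collision:
# the matching system would treat them as the same picture.
print("distance:", hash_a - hash_b)
print("collision:", hash_a - hash_b == 0)
```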
iOS 15 will be released in the coming weeks.