Apple has announced that, after receiving negative feedback, it has delayed the rollout of the Child Safety Features it announced last month.
These features include scanning users' iCloud Photos libraries for Child Sexual Abuse Material (CSAM), Communication Safety to warn children and their parents when sexually explicit photos are received or sent, and expanded CSAM guidance in Siri and Search.
Apple said the delay was prompted by feedback on the plans from customers, advocacy groups, non-profit organizations, researchers, and others. The company released the following statement regarding its decision:
Last month, we announced plans to release features to protect children against predators using communication tools to recruit and abuse them and to limit the spread of Child Sexual Abuse Material. We have taken additional time to gather feedback from advocacy groups, researchers, and other stakeholders before we release these critical child safety features.
The announcement of the features drew criticism from many people and organizations, including security researchers, privacy whistleblower Edward Snowden, the Electronic Frontier Foundation (EFF), Facebook's former security chief, politicians, policy groups, university researchers, and even some Apple employees. Since then, Apple has attempted to clear up misunderstandings and reassure users by publishing detailed information, FAQs, and new documents, and by giving interviews.
The suite of Child Safety Features was originally slated to launch in the United States in updates to iOS 15, watchOS 8, and macOS Monterey. It is not clear when Apple now plans to roll out these "critically important" features, but the company appears determined to release them.