Over a year has passed since Apple announced plans for three new child safety features: a system to detect Child Sexual Abuse Material (CSAM) images stored in iCloud Photos, an option to blur sexually explicit photos in the Messages app, and child exploitation resources for Siri. In all that time, Apple has said nothing further about the CSAM detection feature.

CSAM detection was originally slated to arrive in an update to iOS 15 and iPadOS 15 by the end of 2021, but Apple put the feature on hold based on feedback from customers, advocacy groups, researchers, and others.

In September 2021, Apple posted the following update to the child safety page on its website:

Previously we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them and to help limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.

Apple removed the above update and all other references to its CSAM detection plans from its Child Safety page in December 2021, and it is unclear whether the company has commented on the plans publicly since then.

We have asked Apple whether the feature is still in the works, but the company did not immediately respond to a request for comment.

The Messages app feature that blurs sexually explicit photos did move forward, expanding to Australia, Canada, New Zealand, and the UK in May 2022.

According to Apple, the CSAM detection system was designed with user privacy in mind. The system would perform on-device matching against a database of known CSAM image hashes provided by child safety organizations, which Apple would transform into an unreadable set of hashes stored securely on users' devices.

Those hashes would come from organizations such as the National Center for Missing & Exploited Children (NCMEC), a non-profit that works with U.S. law enforcement agencies to find missing and exploited children. Apple also said a match threshold would prevent accounts from being wrongly flagged, and that flagged accounts would undergo manual review.
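
To illustrate the threshold idea only, here is a deliberately simplified sketch in Swift that counts how many of an account's photo hashes appear in a known-hash set and marks the account for manual review once a threshold is crossed. The type and function names, the plain set lookup, and the placeholder threshold of 30 are our own assumptions; Apple's described design relies on NeuralHash, private set intersection, and threshold secret sharing, none of which is reproduced here.

    import Foundation

    // Deliberately simplified sketch of threshold-based matching. All names,
    // the plain Set lookup, and the placeholder threshold of 30 are assumptions
    // for illustration; Apple's described design uses NeuralHash, private set
    // intersection, and threshold secret sharing, none of which appears here.

    struct KnownHashDatabase {
        // Known image hashes, stored on the device as opaque blobs.
        let hashes: Set<Data>

        func contains(_ imageHash: Data) -> Bool {
            hashes.contains(imageHash)
        }
    }

    struct AccountEvaluation {
        let matchCount: Int
        let needsManualReview: Bool
    }

    // Counts how many of an account's photo hashes appear in the known-hash
    // database and only marks the account for manual review once the match
    // count reaches the threshold, so isolated matches never surface an account.
    func evaluate(photoHashes: [Data],
                  against database: KnownHashDatabase,
                  threshold: Int = 30) -> AccountEvaluation {
        let matchCount = photoHashes.filter { database.contains($0) }.count
        return AccountEvaluation(matchCount: matchCount,
                                 needsManualReview: matchCount >= threshold)
    }

In this simplified model, the threshold is what keeps a handful of coincidental matches from ever surfacing an account for human review, which reflects the role Apple described for it.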

Apple's plans were criticized by a wide range of people, including security researchers, politicians, policy groups, and even some Apple employees.

Critics argued that governments or law enforcement agencies could pressure Apple to turn the system into a broader surveillance tool, and that an attacker could plant CSAM imagery in another person's account to get it wrongly flagged.

Due to the political nature of the discussion on this topic, the discussion thread is located in our Political News forum, and posting is limited to forum members with at least 100 posts.