Apple's controversial plan to try to curb child sexual abuse imagery

Apple's announcement of changes to its devices intended to curb child abuse and detect child sexual abuse material (CSAM) drew backlash.
It first announced an update to Search and the Siri voice assistant coming to iOS 15, watchOS 8, and iPadOS 15. When users search for topics related to child sexual abuse, Apple will redirect them to resources for reporting it or for getting help with an attraction to such content.

Apple's two other CSAM measures have drawn more criticism. One update adds a parental control option to Messages: sexually explicit images will be obscured for any user under 18, and if a child 12 or younger views or sends such an image, the feature can alert their parents.

The most controversial plan involves scanning on-device photos for known CSAM before they are uploaded to iCloud. Matches are reported to Apple's moderators, who can then forward the material to the National Center for Missing and Exploited Children. Apple says the feature protects users' privacy while still allowing the company to find illegal material. However, critics and privacy advocates argue that it amounts to a security backdoor, an apparent contradiction of Apple's long-professed commitment to privacy.
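For a rough sense of what "matching against known images" means, the sketch below checks a photo's hash against a set of known-image hashes. It is only an illustration under simplified assumptions: Apple's actual system uses a perceptual hash (NeuralHash) combined with cryptographic matching and a reporting threshold, not the plain SHA-256 lookup and placeholder `knownHashes` set shown here.

```swift
import Foundation
import CryptoKit

// Hypothetical database of known-image hashes (hex strings), standing in for
// the hash list of known CSAM that Apple describes. Entries here are placeholders.
let knownHashes: Set<String> = [
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
]

// Returns true if the photo's hash appears in the known-hash set.
// NOTE: this uses an exact cryptographic hash purely for illustration;
// Apple's described approach relies on perceptual hashing so that visually
// identical images match even after minor edits, plus privacy-preserving
// cryptography and a match threshold before anything is reported.
func matchesKnownImage(photoData: Data) -> Bool {
    let digest = SHA256.hash(data: photoData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownHashes.contains(hex)
}

// Example: check a photo before it would be uploaded.
let photo = Data("example image bytes".utf8)
print(matchesKnownImage(photoData: photo)) // false for this placeholder data
```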

Follow our storystream to keep up to date with the latest news on Apple's CSAM protection plans; we will update it whenever there is a new development. Our explainer is a good place to start.