Apple has a feature called Communication Safety in Messages that automatically blurs nudity sent to children over the company's messaging service. Users in the UK, Canada, New Zealand, and Australia will now be able to use the feature in the Messages apps on all of Apple's platforms. The feature is coming to the UK, but the exact timing is not certain.
Because scanning happens on-device, the end-to-end encryption of messages is not impacted. The feature is integrated with Apple's existing Family Sharing system, and instructions on how to enable it can be found here.
The opt-in feature scans incoming and outgoing pictures for sexually explicit material to protect children. If such material is detected, the image is blurred, and the child is shown guidance for finding help, along with the reassurance that they are not alone and can always get help from someone they trust.
Messaging a trusted adult about a flagged image is optional
As with its initial release in the US, children will have the option of messaging an adult they trust about a flagged photo. When Apple first announced the feature, it suggested that this notification would happen automatically. Critics pointed out that the approach risked exposing queer kids to abuse.
If users search for topics relating to child sexual abuse in Siri, Spotlight, or Safari search, Apple will point them to safety resources.
In August of last year, Apple announced a third initiative that involved scanning photos for child sexual abuse material (CSAM) before they are uploaded to a user's iCloud Photos account. Privacy advocates argued that the feature risked introducing a back door that would undermine the security of Apple's users, and the company said it would delay the rollout of all three features while it addressed their concerns. Apple has yet to give an update on when the CSAM detection feature will be available.