Apple reveals new efforts to fight child abuse imagery

Apple clarified key details of the project in a briefing Thursday afternoon. New versions of iOS and iPadOS, rolling out in the US this fall, will include new cryptography applications to limit the spread of CSAM (child sexual abuse material) online. The project is also described on an Apple child safety page.

The most controversial and intrusive component is the system that scans images on the device before they are backed up to iCloud. According to Apple's description, scanning applies only to files being backed up to iCloud, and Apple receives data about a match only if the cryptographic vouchers uploaded to iCloud alongside a given account's images cross a threshold of matches against known CSAM.

Apple has used hash systems for years to scan for child abuse imagery sent over email, in line with similar systems at Gmail and other cloud email providers. Today's announcement extends those scans to user images stored in iCloud Photos, even if the images are never sent to or shared with anyone else.

In a PDF provided along with the briefing, Apple justified its image-scanning efforts by describing several restrictions included to protect privacy:

- Apple does not learn anything about images that do not match the known CSAM database.
- Apple cannot access metadata or visual derivatives of matched CSAM images until a threshold number of matches has been reached for an iCloud Photos account.
- The risk of the system incorrectly flagging an account is extremely low, and Apple manually reviews all reports made to NCMEC to verify accuracy.
- Users cannot access or view the database of known CSAM images.
- Users cannot identify which images the system has flagged as CSAM.

The new details address privacy concerns raised earlier in the week, and they also describe a number of security measures intended to protect against such privacy risks.
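The threshold idea described above can be sketched in a simplified form. The snippet below is a hypothetical illustration only, using a plain cryptographic hash and a match counter; Apple's actual system uses a perceptual hash (NeuralHash) and cryptographic techniques such as threshold secret sharing, so the names and logic here are assumptions for illustration, not Apple's implementation.

```python
import hashlib

def image_hash(data: bytes) -> str:
    # Stand-in for a perceptual hash. A real system tolerates small
    # edits to an image; a cryptographic hash like SHA-256 does not.
    return hashlib.sha256(data).hexdigest()

def count_matches(images, known_hashes):
    # Count how many of the account's images match the known database.
    return sum(1 for img in images if image_hash(img) in known_hashes)

def should_flag(images, known_hashes, threshold=3):
    # Nothing about the account is surfaced for review until the
    # number of matches reaches the threshold; a single match (or a
    # single false positive) reveals nothing on its own.
    return count_matches(images, known_hashes) >= threshold
```

For example, an account with only one matching image stays below a threshold of three and is not flagged, while an account with three matches would be surfaced for review.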
The threshold system ensures that no single error can generate an alert on its own, allowing Apple to target an error rate of just one falsely flagged account per trillion users each year. The hashing system applies only to material flagged by the National Center for Missing and Exploited Children (NCMEC) and to images uploaded to iCloud Photos. Apple and NCMEC also review any alerts before notifying law enforcement, which provides an additional safeguard against the system being used to detect non-CSAM content.

Apple commissioned technical assessments from three independent cryptographers (PDFs 1, 2, and 3), who found the system mathematically sound. In the judgment of Professor David Forsyth of the University of Illinois, who wrote one of the assessments, the system will increase the chances that people who own or traffic in these images are caught, which should help protect children; given the accuracy of the matching system and the threshold, he found it very unlikely that any pictures that are not already known CSAM will be revealed.

Apple said other child safety groups could be added to the program as hash sources, but the company did not commit to making its list of partners publicly available. That is likely to heighten concerns about how the system could be used by the Chinese government, which has long sought greater access to iPhone data.

Apple is also adding two further protections for children who own iPhones. The Messages app will perform an on-device scan of attachments on children's accounts to detect potentially sexually explicit content. Once such content is detected, it is blurred and a warning pops up. Parents can now enable a setting on their family iCloud accounts that warns the child that, if they view (incoming) or send (outgoing) the image, a message will be sent to their parents.

Apple is also updating Siri and the Search app to respond to queries regarding child abuse imagery.
With the update, the apps will inform users that interest in this topic is harmful and problematic, and will point them to resources from Apple's partners for help with the issue.