Apple today shared a document that provides a more detailed overview of its child safety features, including design principles and security and privacy requirements.

Apple's plan to detect known Child Sexual Abuse Material (CSAM) images stored in iCloud Photos has drawn concern from security researchers, the Electronic Frontier Foundation, and others over the possibility that the system could be misused by governments as a tool for mass surveillance.

The document aims to address these concerns and reiterates details that surfaced earlier in an interview with Apple's software engineering chief Craig Federighi, including that Apple expects to set a match threshold of 30 known CSAM images before an iCloud account is flagged for manual review by the company.

Apple also said that the on-device database of known CSAM image hashes contains only entries that were independently submitted by two or more child safety organizations operating in separate sovereign jurisdictions, and thus not under the control of the same government.

According to Apple, users do not need to trust Apple or any other single entity to ensure the system is working as advertised. This is possible through several interlocking mechanisms, including the intrinsic auditability of a single software image distributed for execution on-device, the requirement that the perceptual image hashes included in the on-device encrypted CSAM database are provided independently by two or more child safety organizations from separate sovereign jurisdictions, and, finally, a human review process that prevents erroneous reports.

Apple said it will publish a support document on its website containing a root hash of the encrypted CSAM hash database included with each version of every Apple operating system that supports the feature. Users will also be able to inspect the root hash of the encrypted database on their device and compare it against the expected root hash in the support document, although no timeframe was provided for when this will be possible.

In a memo obtained by Bloomberg's Mark Gurman, Apple also said it will have an independent auditor review the system. The memo noted that Apple retail employees may be getting questions from customers about the child safety features, and it linked to a FAQ that Apple shared earlier this week as a resource employees can use to answer those questions and offer customers more clarity and transparency.
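To make the 30-image threshold described above concrete, here is a minimal sketch in Swift. The names and structure are hypothetical; in Apple's actual design the threshold is enforced cryptographically, so the server cannot inspect individual match results below it, and a plain counter merely stands in for that behavior.

```swift
import Foundation

// Hypothetical sketch of the 30-image match threshold. In Apple's real
// system the server cannot decrypt individual match vouchers until the
// threshold is exceeded; a simple counter stands in for that here.
struct MatchThresholdTracker {
    static let reviewThreshold = 30

    private(set) var matchedImageCount = 0

    // Called once per image that matches the known-CSAM hash database.
    mutating func recordMatch() {
        matchedImageCount += 1
    }

    // The account is surfaced for manual human review only at or above
    // the threshold, never for a handful of matches.
    var shouldFlagForHumanReview: Bool {
        matchedImageCount >= Self.reviewThreshold
    }
}
```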
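The two-organization requirement can likewise be illustrated with a short sketch. The types and function below are hypothetical, since Apple has not published the code that constructs the database; the sketch only expresses the stated rule that a hash is included when it is independently submitted by organizations in at least two different sovereign jurisdictions.

```swift
import Foundation

// Hypothetical types: Apple has not published how the database is built.
// Each child safety organization submits a set of perceptual image hashes
// and is tagged with the sovereign jurisdiction it operates under.
struct HashSubmission {
    let organization: String
    let jurisdiction: String
    let hashes: Set<String>  // hex-encoded perceptual hashes
}

// Include a hash only if it was independently submitted by organizations
// in at least two different sovereign jurisdictions, per the requirement
// described in Apple's document.
func buildOnDeviceDatabase(from submissions: [HashSubmission]) -> Set<String> {
    var jurisdictionsPerHash: [String: Set<String>] = [:]
    for submission in submissions {
        for hash in submission.hashes {
            jurisdictionsPerHash[hash, default: []].insert(submission.jurisdiction)
        }
    }
    return Set(jurisdictionsPerHash.filter { $0.value.count >= 2 }.keys)
}
```

Under this rule, a hash submitted by only a single organization, or only by organizations answering to the same government, would never make it into the on-device database.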
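Finally, the root hash comparison Apple describes could, in principle, look something like the sketch below. Apple has not specified the exact hashing scheme or where the database lives on disk, so the use of SHA-256 over the database contents and both function inputs are assumptions for illustration.

```swift
import Foundation
import CryptoKit

// Sketch of the check Apple describes: compare the root hash of the
// encrypted CSAM database shipped on-device against the root hash Apple
// publishes in its support document. The real scheme is unspecified;
// SHA-256 over the database file is an assumption, as are both inputs.
func databaseMatchesPublishedRootHash(databasePath: String,
                                      publishedRootHashHex: String) throws -> Bool {
    let databaseData = try Data(contentsOf: URL(fileURLWithPath: databasePath))
    let digest = SHA256.hash(data: databaseData)
    let computedHex = digest.map { String(format: "%02x", $0) }.joined()
    return computedHex == publishedRootHashHex.lowercased()
}
```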