Apple has released an FAQ to address concerns that its new anti-child-abuse measures could become surveillance tools for authoritarian governments. The company says the technology can only detect child sexual abuse material (CSAM) stored in iCloud, and that it will not accept any government request to expand it.

Apple announced the new child-protection tools last Thursday. The first feature, called communication safety, uses on-device machine learning in the Messages app to identify and blur sexually explicit images sent to or by children; parents can be notified if a child aged 12 or younger views or sends such an image. The second scans images uploaded to iCloud and compares them against known CSAM, and Apple will notify the authorities if CSAM is detected.

The plans were met with swift opposition from campaigners and digital privacy groups, who argue that they introduce a backdoor into Apple's software. These groups point out that once such a backdoor is in place, it could be expanded to scan for content other than child sexual abuse material: an authoritarian government could use it to search for material related to political dissent, or an anti-LGBT regime could use it to crack down on sexual expression.

The Electronic Frontier Foundation said that even a thoroughly documented, carefully planned, and narrowly scoped backdoor is still a backdoor. This kind of mission creep has already been demonstrated in practice: one of the original technologies built to scan and hash child sexual abuse imagery has since been repurposed to create a database of terrorist content that companies can access and contribute to in order to ban such material.

Apple counters that safeguards are in place to prevent its system from being used to detect anything other than sexual abuse imagery. Its list of prohibited image hashes is provided by the National Center for Missing and Exploited Children (NCMEC) and other child safety organisations, and the company says the list is identical across all iPhones and iPads, so individual users cannot be targeted.

Apple also says it would refuse any government request to add images other than CSAM to the list. The company states that it has faced demands to build and deploy government-mandated changes that compromise user privacy in the past, has consistently refused them, and will continue to reject them.

It is worth noting that, despite these assurances, Apple has made concessions to governments in the past in order to keep operating in their countries. It sells iPhones without FaceTime in countries that do not allow encrypted phone calls, and in China it has removed thousands of apps from the App Store and moved user data onto servers run by a state-owned telecom.

The FAQ also fails to fully address concerns about the feature that scans Messages for explicit material. The company says the feature does not share any information with Apple or with law enforcement, but it does not explain how it ensures that the tool stays focused on sexually explicit images.

According to the EFF, all it would take to widen the backdoor Apple is building is an expansion of the machine learning parameters to look for additional types of content, or a tweak of the configuration flags to scan not just children's accounts but anyone's. The EFF also notes that machine learning systems frequently misclassify this kind of content, pointing to Tumblr's attempt to purge sexual content from its platform as a prime example.
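Apple has not published the code behind its matching system, but the design described above, comparing a hash of each uploaded image against a fixed, device-wide list of known CSAM hashes, can be illustrated with a minimal sketch. The snippet below is only a toy illustration, not Apple's implementation: it uses an ordinary SHA-256 digest as a stand-in for the perceptual hashing such systems actually rely on, and a hypothetical KNOWN_HASHES set in place of the NCMEC-supplied database.

```python
import hashlib
from pathlib import Path

# Hypothetical blocklist of known-CSAM image hashes. In the system Apple
# describes, the list is supplied by NCMEC and other child-safety groups
# and is identical on every device; it appears here as a plain set of
# SHA-256 hex digests purely for illustration.
KNOWN_HASHES = {
    "0" * 64,  # placeholder entry; real entries would be actual digests
}

def image_hash(path: Path) -> str:
    """Hash the raw file bytes. A real matching system would use a
    perceptual hash so that resized or re-encoded copies still match."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def flag_if_known(path: Path) -> bool:
    """Return True if the image's hash appears in the fixed blocklist."""
    return image_hash(path) in KNOWN_HASHES

if __name__ == "__main__":
    # Check every JPEG in a hypothetical upload directory against the list.
    for photo in Path("uploads").glob("*.jpg"):
        if flag_if_known(photo):
            print(f"Match against blocklist: {photo}")
```

Nothing in a design like this constrains what the blocklist contains; the matching code is indifferent to whether the hashes describe CSAM or any other category of image, which is precisely the expansion that critics of the plan are worried about.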