Apple Removes All References to Controversial CSAM Scanning Feature From Its Child Safety Webpage

References to CSAM detection have been removed from Apple's Child Safety webpage, suggesting that the company's controversial plan to detect child sexual abuse images on iPhones and iPads may be in jeopardy.

In August, Apple announced a planned suite of new child safety features, including scanning users' iCloud Photos libraries for Child Sexual Abuse Material (CSAM), Communication Safety to warn children and their parents when receiving or sending sexually explicit photos, and expanded CSAM guidance in Siri and Search.

The features were criticized by a wide range of individuals and organizations, including security researchers, privacy whistleblower Edward Snowden, the Electronic Frontier Foundation (EFF), Facebook's former security chief, politicians, policy groups, university researchers, and even some Apple employees.

The majority of criticism was leveled at Apple's planned on-device CSAM detection, which was lambasted by researchers for relying on dangerous technology that bordered on surveillance, and derided for being ineffective at identifying images of child sexual abuse.

Apple initially attempted to allay concerns by releasing detailed information, sharing an FAQ, various new documents, interviews with company executives, and more.

Despite Apple's efforts, the controversy didn't go away. The company decided to delay the rollout of CSAM detection following the torrent of criticism it clearly hadn't anticipated, but it did go ahead with the Communication Safety features for Messages, which went live earlier this week with the release of iOS 15.2.

Apple said it decided to delay the release of the child safety features because of feedback from customers, advocacy groups, researchers and others.

A statement to that effect was added to Apple's Child Safety page, but it has now been removed, along with all mentions of CSAM detection, raising the possibility that Apple has kicked the plan into the long grass or abandoned it altogether. We will update this article if we hear back from Apple.