Woodward says there is only one logical solution: client-side scanning, where content is examined when it is decrypted on the user's device for them to view or read. Apple announced last year that it would introduce scanning on people's phones to check for known CSAM. The move prompted anger from civil rights groups and led to Apple pausing its plans a month after announcing them. Apple did not comment for this story.
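To make the mechanism concrete, here is a minimal sketch of that flow in Python. All names are illustrative, and the plain SHA-256 lookup stands in for the perceptual hashing and privacy-preserving matching a real deployment, such as Apple's proposed system, would use.

```python
import hashlib

# Fingerprints of known CSAM, shipped to the device by the provider
# (hypothetical; real hash lists come from bodies such as NCMEC).
KNOWN_FINGERPRINTS: set[str] = set()

def fingerprint(content: bytes) -> str:
    """Hash the content; real systems use perceptual hashes, not SHA-256."""
    return hashlib.sha256(content).hexdigest()

def scan_on_device(decrypted_content: bytes) -> bool:
    """Check content locally, after decryption, before it is displayed.

    Because the check runs on the device, messages stay end-to-end
    encrypted in transit; only a match would ever be reported.
    """
    return fingerprint(decrypted_content) in KNOWN_FINGERPRINTS
```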
Tech companies have been detecting CSAM on their platforms for years. US law requires companies to report any CSAM they find to the National Center for Missing and Exploited Children (NCMEC), a US-based nonprofit. More than 29 million reports, containing 39 million images and 44 million videos, were made to NCMEC last year. Under the EU's proposed rules, the new EU Centre would receive CSAM reports from tech companies.
Many companies, however, are not detecting child sexual abuse material at all.
Tech companies find CSAM online in different ways, and as they get better at detecting and reporting abuse, the amount of CSAM found keeps increasing. Some content is hunted down using artificial intelligence. Known abuse content can also be spotted when it is uploaded to the web again, using systems that assign each image a unique fingerprint: more than 200 companies use Microsoft's PhotoDNA hashing system to find and remove copies of known files. But when end-to-end encryption is in place, these systems have no access to the messages and files people are sending.
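PhotoDNA's actual algorithm is proprietary, but the general fingerprinting idea can be sketched: derive a compact hash that survives small edits such as resizing or recompression, then compare hashes by Hamming distance rather than exact equality. The average-hash below is a simplified stand-in, assuming the image has already been reduced to a small grayscale grid.

```python
def average_hash(pixels: list[list[int]]) -> int:
    """Fingerprint a small grayscale grid (values 0-255): one bit per
    pixel, set if the pixel is brighter than the image's mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (p >= mean)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Count differing bits; a small distance means near-identical images."""
    return bin(a ^ b).count("1")

# Hypothetical database of fingerprints of known abuse imagery.
KNOWN_HASHES = {0b1011001011100001}

def matches_known_content(pixels: list[list[int]], threshold: int = 3) -> bool:
    h = average_hash(pixels)
    return any(hamming_distance(h, k) <= threshold for k in KNOWN_HASHES)
```

Matching on distance rather than equality is what lets such systems catch re-uploads that have been cropped or recompressed, but it is exactly this step that requires access to the unencrypted file.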
Diego Naranjo says the obligations to detect the solicitation of children would mean conversations have to be read 24/7. To comply, companies would have to offer a less secure service for everyone.
The discussions around protecting children online, and how this can be done alongside end-to-end encryption, are complex and technical, and entangled with the horrors of the crimes committed against vulnerable young people. Research published in 2020 by Unicef, the UN's children's fund, stressed the need to protect people's privacy. And for years, law enforcement agencies around the world have pushed for ways to weaken encryption.
Tech companies and researchers are instead focusing on safety tools that can work alongside end-to-end encryption. These rely on metadata, the who, how, what, and why of messages rather than their content, to analyze people's behavior and potentially spot criminality. A recent report from Business for Social Responsibility (BSR) found that end-to-end encryption is a positive force for human rights, and it made 45 recommendations for how safety and encryption can go together. When the report was published in April, Lindsey Andersen, BSR's associate director for human rights, told WIRED: "Contrary to popular belief, there actually is a lot that can be done even without access to messages."
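As a rough illustration of what metadata-only analysis can look like, the sketch below flags an account that initiates contact with many unrelated minor accounts. The event fields and the threshold are assumptions made for the example, not any company's actual system.

```python
from dataclasses import dataclass

@dataclass
class MessageEvent:
    sender: str
    recipient: str
    timestamp: float          # seconds since epoch
    recipient_is_minor: bool  # from account-level signals, not content

def suspicious_contact_pattern(events: list[MessageEvent],
                               sender: str,
                               min_minors_contacted: int = 5) -> bool:
    """Flag a sender who contacts many distinct minor accounts.

    Operates purely on metadata, so it keeps working even when the
    message bodies themselves are end-to-end encrypted.
    """
    contacted_minors = {e.recipient for e in events
                        if e.sender == sender and e.recipient_is_minor}
    return len(contacted_minors) >= min_minors_contacted
```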