Apple plans to install software on American iPhones that scans for child abuse imagery, according to people briefed on the plans, raising concern among security researchers that the company could open the door to surveillance of millions of people's personal devices.

Apple presented its proposed system, known as neuralMatch, to some US academics this week, according to two security researchers briefed on the virtual meeting. They said the plans could be made public sooner than expected.

If the automated system detects illegal imagery, it will alert a team of human reviewers, who would then contact law enforcement if the material is verified. The initial rollout will be in the US only. Apple declined to comment.

Apple's proposal is an attempt to reconcile its promise to protect customer privacy with ongoing demands from governments, law enforcement agencies and child safety campaigners for greater assistance in criminal investigations, including into child pornography.

Tensions between tech companies such as Apple and Facebook and law enforcement have only increased since Apple went to court against the FBI in 2016 over access to an iPhone belonging to a terror suspect, following a shooting in San Bernardino, California.

While security researchers support efforts to combat child abuse, they worry that Apple could be handing governments around the globe a means of accessing their citizens' personal data, potentially far beyond what was intended.

"This is a terrible idea. It will lead to mass surveillance of our phones and laptops," said Ross Anderson, professor of security engineering at the University of Cambridge.

Although the system is currently designed to detect child sexual abuse imagery, researchers believe it could be adapted to scan for other targeted imagery or text, such as terrorist beheadings or anti-government signs at demonstrations. Apple's example could also increase pressure on other tech companies to adopt similar techniques.

"This will break the dam. Governments will demand it from everyone," said Matthew Green, a security professor at Johns Hopkins University and the first to tweet about the issue.

Alec Muffett, a security researcher and privacy activist who previously worked at Facebook and Deliveroo, said Apple's move was "tectonic" and a big and regressive step for individual privacy. Apple is walking back privacy to enable 1984, he said.

Social networking sites and cloud-based photo storage systems already scan images for child abuse, but that process becomes more complex when it reaches data stored on a personal device.

Apple's system is less invasive in that the screening is done on the phone, and a notification is sent back to those searching only if there is a match, according to Alan Woodward, a computer security professor at the University of Surrey. If you do choose to go down this route, he said, this decentralized approach could be about the best available.

Apple's neuralMatch algorithm will scan photos that are stored on a US user's iPhone and have also been uploaded to its iCloud backup system. Users' photos will be converted into strings of numbers through a process known as hashing, and those numbers will then be compared against ones derived from a database of known child sexual abuse images.
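The reporting does not name the hashing scheme, but the matching step it describes reduces to computing a numeric fingerprint for each photo and testing it for membership in a set of known fingerprints. Below is a minimal sketch in Python, using SHA-256 as a stand-in for whatever image-hashing algorithm Apple actually uses; the function names and the database contents are illustrative assumptions, not Apple's.

```python
import hashlib
from pathlib import Path

# Hypothetical on-device database of fingerprints for known abuse
# images. Real systems use perceptual hashes that survive resizing and
# re-encoding; an exact cryptographic hash is a simplification here.
KNOWN_HASHES: set[str] = set()  # entries would ship with the device

def fingerprint(photo: Path) -> str:
    """Convert a photo into a fixed-length string of numbers (a hash)."""
    return hashlib.sha256(photo.read_bytes()).hexdigest()

def is_suspect(photo: Path) -> bool:
    """Test the photo's fingerprint for membership in the database."""
    return fingerprint(photo) in KNOWN_HASHES
```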
The system has been trained on 200,000 sex abuse images collected by the US non-profit National Center for Missing & Exploited Children.

According to people briefed on the plans, every photo uploaded to iCloud in the US will be given a "safety voucher" indicating whether it is suspect or not. Once a certain number of photos are marked as suspect, Apple will enable all the suspect photos to be decrypted and, if apparently illegal, passed on to the relevant authorities.
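The voucher-and-threshold flow described above can be sketched as follows. This is an illustrative simplification: the reporting says only that escalation happens once "a certain number" of photos are marked as suspect, so the threshold value and the SafetyVoucher and Account structures here are assumptions.

```python
from dataclasses import dataclass, field

REVIEW_THRESHOLD = 10  # illustrative; the real number is not public

@dataclass
class SafetyVoucher:
    photo_id: str
    suspect: bool  # result of the on-device hash match

@dataclass
class Account:
    vouchers: list[SafetyVoucher] = field(default_factory=list)

    def upload(self, voucher: SafetyVoucher) -> None:
        """Attach a safety voucher to each photo uploaded to iCloud."""
        self.vouchers.append(voucher)

    def needs_human_review(self) -> bool:
        """Escalate only once enough photos are marked as suspect."""
        return sum(v.suspect for v in self.vouchers) >= REVIEW_THRESHOLD
```

In the design as reported, vouchers below the threshold would remain cryptographically sealed even to Apple; a plain counter like this captures the triggering logic but not that guarantee.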