Apple defends its new anti-child abuse tech against privacy concerns

Some security experts believe this week's announcement could pave the way for Apple to finally make iCloud fully encrypted. If Apple can still identify child abuse material and pass it along to law enforcement even when iCloud is encrypted, its executives may feel less pressure against encrypting. That would not relieve all of the pressure: most governments that want Apple to do more about child abuse also want more action on content related to terrorism and other crimes. But child abuse is a real and serious problem that big tech companies have so far largely failed to address.

Apple's approach preserves privacy better than any other he is aware of, according to David Forsyth, chair of computer science at the University of Illinois Urbana-Champaign, who reviewed Apple's system. In his assessment, the system will increase the chances that people who own or traffic in CSAM are found, which should help protect children, while harmless users should experience little to no loss of privacy: visual derivatives are revealed only if an image matches known CSAM photos, and given the accuracy of the matching system and the threshold, it is very unlikely that any picture not matching known CSAM photos will ever be revealed. (A simplified sketch of this kind of threshold matching appears later in this section.)

What about WhatsApp?

Every tech company confronts the terrible reality of child abuse material on its platform, but none has approached the problem the way Apple has.

WhatsApp, like iMessage, is an end-to-end encrypted messaging platform with billions of users, and like any platform of that size it is vulnerable to abuse.

"I have read the information Apple released yesterday and I'm concerned," WhatsApp head Will Cathcart tweeted on Friday, calling it a bad approach that would set back privacy for people around the world. Many people have asked whether WhatsApp will adopt this system; Cathcart said it will not.

WhatsApp instead relies on reporting tools that let users flag abusive content to the company. Despite those limitations, WhatsApp reported more than 400,000 cases to NCMEC last year.

Cathcart argued in his tweets that Apple has built and operates a surveillance system that could be used to scan private content for anything Apple or a government decides it wants to control, and that different countries will have different standards for what is acceptable. Will this system be used in China? What content will be considered illegal there? How will Apple handle requests from governments around the world to scan for other content?

In its briefing with journalists, Apple stressed that the new scanning technology is launching only in the United States for now. The company argued that its track record of protecting privacy should earn it trust here as well, that the new systems cannot easily be co-opted by government action, and, repeatedly, that users can opt out of iCloud backups altogether.

Although iMessage is one of the most widely used messaging platforms on the planet, it has long been criticized for lacking the reporting capabilities that are now standard across social media, and Apple has historically reported only a small fraction of the cases to NCMEC that Facebook and other companies do. Rather than adopting the same solution, Apple has built something entirely different, and the result worries privacy hawks while striking others as a welcome and radical change.

"Apple's expanded protection for children is a game changer," said John Clark, president of NCMEC, adding that privacy and child protection can co-exist.
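Forsyth's point hinges on a simple mechanism: an image is only ever disclosed if it matches a list of known CSAM hashes, and even then only after the number of matches on an account crosses a threshold. The sketch below is a loose, hypothetical illustration of that idea in Python; it is not Apple's implementation, and the hash function (an ordinary cryptographic hash standing in for a perceptual image hash), the threshold value, and the example hash list are all made up for illustration.

```python
# Hypothetical sketch of threshold-based matching, NOT Apple's implementation.
# The hash function, threshold, and "known" hash list below are stand-ins.
import hashlib

KNOWN_HASHES = {
    hashlib.sha256(b"example-known-image-1").hexdigest(),
    hashlib.sha256(b"example-known-image-2").hexdigest(),
}
MATCH_THRESHOLD = 30  # made-up number; the point is that one or two matches reveal nothing


def image_hash(image_bytes: bytes) -> str:
    """Stand-in for a perceptual image hash; a real system would use one that
    tolerates resizing and re-encoding, which SHA-256 does not."""
    return hashlib.sha256(image_bytes).hexdigest()


def count_matches(images: list[bytes]) -> int:
    """Count how many of an account's images match the known-hash list."""
    return sum(1 for img in images if image_hash(img) in KNOWN_HASHES)


def should_flag(images: list[bytes]) -> bool:
    """Nothing is surfaced for review unless the match count crosses the threshold."""
    return count_matches(images) >= MATCH_THRESHOLD
```

The privacy claim Forsyth makes maps onto the last function: images that match nothing on the list contribute nothing, and an account is flagged only once enough independent matches accumulate.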
High stakes

An optimist would argue that enabling full encryption of iCloud accounts while still detecting child abuse material is both an anti-abuse win and a privacy win, and perhaps even a smart political move that blunts the anti-encryption rhetoric coming from American, European, Indian, and Chinese officials.

Realists worry about what comes next from the world's most powerful nations. Apple is almost certain to get calls from capital cities as government officials begin to consider the surveillance potential of this scanning technology. Political pressure is one thing; regulation and authoritarian control are another. That threat is neither new nor unique to this system, but Apple, a company that has made quiet yet profitable compromises with China, still has much to prove to users about its ability to resist oppressive governments.

All of this can be true. What happens next will define Apple's new technology: if governments use this feature to expand surveillance, the company will clearly have failed to live up to its privacy promises.