Apple announced Thursday that it will report images of child exploitation in the United States to law enforcement. The new system detects images known as Child Sexual Abuse Material (CSAM) using a process called hashing, in which each image is converted into a unique number that corresponds to that image.

Apple began testing the system Thursday, but most U.S. iPhone owners won't be part of it until an iOS 15 update later this year, the company said.

The move brings Apple in line with other cloud services that already scan user files for content that violates their terms of service, including child exploitation images.

Apple argues that its system is more private than other methods of eliminating illegal images of child sexual abuse because the sophisticated cryptography running on Apple's servers and on users' devices compares hashes rather than the actual images.

Still, many privacy-conscious users remain wary of software that notifies governments about the contents of a device or the cloud, and the announcement may be a red flag for them, especially because Apple has vigorously defended device encryption and operates in countries with fewer speech protections than the U.S.

Law enforcement officials around the world have also pressured Apple to weaken its encryption for iMessage and other services such as iCloud to allow investigations into child exploitation and terrorism. Thursday's announcement lets Apple address some of those demands without giving up its engineering principles around user privacy.
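For readers curious about the hashing idea described above, the sketch below shows the basic concept in Python. It is a simplified illustration under stated assumptions, not Apple's implementation: Apple's matching reportedly relies on a perceptual hashing scheme rather than the plain cryptographic hash used here, and the hash database in this example is a hypothetical placeholder.

```python
# Simplified sketch of the "image -> unique number" idea (not Apple's actual code).
# Apple's system reportedly uses a perceptual hash so visually similar images map to
# the same value; the cryptographic hash below is only an illustration and changes
# completely if even one byte of the file changes.
import hashlib
from pathlib import Path

def image_hash(path: str) -> str:
    """Convert an image file into a fixed-length number (hex digest of its bytes)."""
    data = Path(path).read_bytes()
    return hashlib.sha256(data).hexdigest()

# A detection system compares each computed hash against a database of known hashes.
KNOWN_ILLEGAL_HASHES: set[str] = set()  # hypothetical placeholder for that database

def matches_known_image(path: str) -> bool:
    """Flag a file only if its hash matches one already in the database."""
    return image_hash(path) in KNOWN_ILLEGAL_HASHES
```

The point of the design is that only these numbers, not the images themselves, ever need to be compared.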