Apple's plan to introduce new features aimed at combating Child Sexual Abuse Material (CSAM) on its platforms has caused a good deal of controversy. The company is attempting to pioneer a solution to a problem that has stymied law enforcement and technology companies alike in recent years: the large-scale, ongoing crisis of CSAM proliferation across major internet platforms. Tech firms have reported finding as many as 45 million photos and videos of child sex abuse material, a staggeringly large number.

Critics fear that Apple's new features, which involve algorithmic scanning of users' devices and messages, amount to a privacy violation. Worse, they worry the tools could eventually be repurposed to hunt for other kinds of material, ushering in new forms of widespread surveillance and serving as a workaround for encrypted communications, one of privacy's last, best hopes.

To understand these concerns, it helps to look at the details of the changes. First, the company will roll out a new tool that scans photos uploaded from Apple devices to iCloud for signs of child sex abuse material. According to a technical paper published by Apple, the feature uses a neural matching function called NeuralHash to determine whether images on a user's iPhone match known hashes, or unique digital fingerprints, of CSAM. It does this by comparing images shared to iCloud against a large database of CSAM imagery compiled and maintained by the National Center for Missing and Exploited Children (NCMEC). If enough matching images are detected, they are flagged for review by human operators, who can then alert NCMEC (which would presumably tip off the FBI).

Some people have worried about the photos on their phones of their kids running naked through the sprinkler or splashing in the bathtub. According to Apple, that isn't a concern: the company has stressed that it learns nothing about images that don't match the known CSAM database, so it isn't simply rummaging through your photo library and deciding what it finds objectionable.
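To make the matching step a bit more concrete, here is a minimal sketch of the general pattern described above: fingerprint each uploaded image, compare the fingerprint against a database of known CSAM hashes, and flag an account for human review only once the number of matches crosses a threshold. It is purely illustrative, not Apple's actual implementation: the SHA-256 stand-in, the empty hash set, and the threshold value are all hypothetical, whereas NeuralHash is a perceptual hash designed so that visually similar images map to the same fingerprint.

```python
import hashlib
from typing import Iterable, Set

# Hypothetical stand-in for a perceptual hash such as NeuralHash.
# A real perceptual hash fingerprints image *content*, so visually similar
# images produce the same value; here we simply hash the raw bytes.
def image_fingerprint(image_bytes: bytes) -> str:
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical database of known CSAM fingerprints (e.g., supplied by NCMEC).
KNOWN_CSAM_HASHES: Set[str] = set()

# Hypothetical threshold: only flag an account once this many uploads match.
MATCH_THRESHOLD = 30

def should_flag_for_review(uploads: Iterable[bytes]) -> bool:
    """Return True if the batch of uploads should be escalated to human review."""
    matches = sum(1 for img in uploads
                  if image_fingerprint(img) in KNOWN_CSAM_HASHES)
    # Non-matching images contribute nothing; only the count of matches
    # against the known database determines the outcome.
    return matches >= MATCH_THRESHOLD
```

In the real system, per Apple, the comparison is also arranged so that the company learns nothing about images that do not match the database; the sketch only captures the count-and-flag logic that critics worry could just as easily be run against a different list of hashes.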
Apple is also introducing a new iMessage feature intended to warn children, and their parents, when they are sending or receiving sexually explicit images. The notification tells the child that they are about to view a sexually explicit image and assures them that it is okay not to look at it (the image stays blurred until the user consents to view it). If the child is under 13, a notification can also be sent to the parent.

Civil liberties advocates have not exactly welcomed news of these updates, which will begin rolling out later this year with the releases of iOS 15 and iPadOS 15. While their concerns vary, they all stem from the worry that a powerful new technology could pose serious privacy risks.

Concerns about the iMessage update center on encryption, the protection it provides, and the way the update effectively sidesteps that protection. Encryption protects the contents of a user's messages by scrambling them into unreadable ciphertext before they are sent, so that anyone who intercepts a message cannot read it. Apple's new feature will scan the messages of child accounts for sexually explicit material. That doesn't mean Apple can read a child's texts outright; it is only looking for images it deems inappropriate. Still, critics argue the shift sets a dangerous precedent.

In a statement Thursday, the Center for Democracy and Technology said the iMessage update chips away at the privacy provided by Apple's encryption. The mechanism that allows Apple to scan images in iMessages, the group argued, is not an alternative to a backdoor; it is a backdoor. Client-side scanning on one end of the communication breaks the security of the transmission, and informing a third party (the parent) about the content of the communication undermines its privacy.

Privacy advocates were similarly troubled by the plan to scan uploaded iCloud photos. Jennifer Granick, surveillance and cybersecurity counsel for the ACLU's Speech, Privacy, and Technology Project, told Gizmodo via email that she was concerned about the potential implications of the photo scans. However altruistic its motives, she said, Apple has built an infrastructure that could enable widespread surveillance of the conversations and information we keep on our phones. Depending on which hashes the company chooses, or is forced, to include in its matching database, the CSAM scanning capability could be used for censorship or for the identification and reporting of perfectly legal content, and it is susceptible to abuse by autocrats abroad, overzealous officials at home, or the company itself.

Even Edward Snowden weighed in.

The objection, in other words, is not to Apple's goal of combating CSAM but to the tools it is using to do so, which critics fear put the company on a slippery slope. The privacy-focused Electronic Frontier Foundation noted in an article Thursday that the scanning tools could be retooled to let Apple's algorithms hunt for other types of images or text, which would effectively amount to a workaround for encrypted communications, a technology designed to protect private conversations. According to the EFF, all it would take to widen the narrow backdoor Apple is building is an expansion of the machine learning parameters to look for additional types of content, or a tweak of the configuration flags to scan the accounts of adults as well as children. In the group's view, this is not a slippery slope; it is a fully built system just waiting for external pressure to make the change.

These concerns become especially relevant when it comes to the features' rollout in other countries, with some critics warning that Apple's tools could be abused and subverted by corrupt foreign governments. Apple responded by confirming to MacRumors that it plans to expand the features on a country-by-country basis; according to the outlet, the company will conduct a legal evaluation before distributing them in any given country.

India McKinney, the EFF's director of federal affairs, raised a further concern in a phone call with Gizmodo on Friday: there is currently no way to independently verify that the tools are working as advertised. Outside groups like hers, she said, cannot look under the hood to determine whether the system is accurate, whether it is doing what it is supposed to do, or how many false positives it produces. And once Apple starts pushing the system onto phones, who is to say the company won't respond to government pressure to start including other material, such as terrorist content?

A group of security experts and privacy advocates has written an open letter asking Apple to reconsider the new features; as of Sunday it had gathered more than 5,000 signatures. It is not yet clear whether any of this will affect the tech giant's plans.
In an internal memo, Apple software vice president Sebastien Marineau-Mes acknowledged that some people have misunderstood the rollout and that more than a few are worried about its implications. Meanwhile, a note from NCMEC shared internally with Apple employees referred to the program's critics as the voices of the minority and praised Apple for its efforts.