Google Scans Gmail and Drive For Cartoons of Child Sexual Abuse

A Forbes associate editor is reporting on a recently revealed search warrant.

In the case, Google was asked to provide information on the person who owned the illegal cartoons, which are potentially illegal to own in the U.S. Google's machine-learning tools flagged the files as showing signs of abuse, and the company passed what it found, along with the internet addresses used to access the images, to the National Center for Missing and Exploited Children (NCMEC), which brought in the DHS Homeland Security Investigations unit. Investigators identified the suspect using the internet addresses Google provided and the emails sent to and from him. The suspect may be an artist: Forbes isn't publishing his name, but the man identified in the warrant had won several small Midwest art contests, and one artwork from the 1990s had been mentioned in a major West Coast newspaper.
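The article doesn't detail how Google's scanning works, but such pipelines are generally described as combining machine-learning classifiers for new material with hash matching against databases of already-known images. A minimal sketch of the hash-matching half, using entirely hypothetical names and a plain cryptographic hash for illustration, might look like:

```python
import hashlib

# Hypothetical set of hashes of known flagged images. Production systems
# use perceptual hashes that survive resizing and re-encoding; a plain
# cryptographic hash only catches byte-identical copies and is used here
# purely to keep the sketch self-contained.
KNOWN_HASHES = {
    # sha256 hex digest of the bytes b"example-flagged-file"
    hashlib.sha256(b"example-flagged-file").hexdigest(),
}

def file_hash(data: bytes) -> str:
    """Return the SHA-256 hex digest of a file's raw bytes."""
    return hashlib.sha256(data).hexdigest()

def is_known_match(data: bytes, known: set[str] = KNOWN_HASHES) -> bool:
    """True if the file's hash appears in the set of known hashes."""
    return file_hash(data) in known
```

A real deployment would layer a classifier on top of this lookup to catch previously unseen material; the set membership test shown here only identifies exact copies of files already in the database.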

In the past few years, Google's transparency reports have shown how often it reports issues to NCMEC, and the figures show a disturbing trend. In the first six months of 2021, it found more than 3.4 million pieces of potentially illegal content, up from 365,000 reports in the last six months of 2020 and 180,000 reports in the first six months of 2020.


Google can find illegal content on services like Drive and Gmail because they are not end-to-end encrypted, and since it has no plans to introduce that protection, law enforcement can continue to rely on the company to warn them when abuse happens on its servers. The Mountain View, California-based business will have to keep wrestling with that balance: do the majority of users want their accounts scanned so Google can help catch child abusers, or do they want the stronger privacy of end-to-end encryption? The same goes for all of its competitors.
