Mark noticed something wasn't quite right with his child: his son's penis was hurting him. Mark, a stay-at-home dad in San Francisco, grabbed his phone and took pictures to document the problem so he could keep an eye on it.

It was a Friday night in February. His wife called an advice nurse at their health care provider to schedule an emergency video consultation for the next morning, a Saturday, because there was a flu outbreak going around. The nurse asked them to send photos so the doctor could review them ahead of the appointment.

Mark's wife sent a few close-ups of their son's groin area to the health care provider. Mark's hand was visible in one of the pictures. Mark and his wife gave little thought to the tech giants that made this exchange possible, or to what those companies might make of the images.

The doctor diagnosed the issue and prescribed antibiotics after seeing the photos. But the episode left Mark with a much bigger problem, one that would cost him more than a decade of contacts, emails and photos, and make him the target of a police investigation. Mark, who asked to be identified only by his first name to protect his reputation, had been caught in a net designed to catch people exchanging child sexual abuse material.

Because technology companies capture so much data, they have been pressured to act as sentinels over what passes through their servers. Child advocates say that cooperation is essential to fighting the spread of sexual abuse imagery. But it can entail looking into private archives, such as digital photo albums, and it can cast innocent behavior in a sinister light.

Jon Callas, a technologist at the Electronic Frontier Foundation, a digital civil liberties organization, said there could be many more cases like these.

Given the toxic nature of the accusations, Mr. Callas speculated, most people who are wrongly flagged would not speak out.

Mark knew that the companies were watching. But, he said, he hadn't done anything wrong.

The police agreed. Google did not.

Mark, who is in his 40s, set up a Gmail account in the late aughts and came to rely on Google. He synced appointments with his wife on Google Calendar. His phone backed up his photos and videos to Google's cloud. He even had a phone plan with Google Fi.

Two days after taking the photos of his son, Mark received a notification on his phone that his account had been disabled because of harmful content. A link to learn more led to a list of possible reasons.

Mark was confused at first, but then he remembered his son's infection. Google, he realized, probably thought the photos were child porn.

Mark had worked as a software engineer on an automated tool for taking down problematic video content. He knew that such systems often have a human in the loop to make sure computers don't make mistakes, and he assumed his case would be cleared up as soon as he explained it to a person.

Mark, a software engineer who is currently a stay-at-home dad, assumed he would get his account back once he explained what happened. He didn't. Credit...Aaron Wojack for The New York Times

He filled out a form requesting a review of Google's decision. At the same time, he discovered the domino effect of the rejection. He lost emails, contact information for friends and former colleagues, and documentation of his son's first years of life, and his Google Fi account was shut down, costing him his phone number. Without access to his old phone number and email address, he couldn't get the security codes he needed to sign in to other internet accounts.

The more eggs you have in one basket, Mark said, the more likely the basket is to break.

In a statement, Google said: "Child sexual abuse material is abhorrent and we're committed to preventing the spread of it on our platforms."

A few days after Mark appealed, Google responded that it would not reinstate the account.

Mark didn't know it, but Google's review team had also flagged a video he had made, and the San Francisco Police Department had already begun to investigate him.

The day after Mark's troubles began, the same scenario was playing out in Texas. While reporting Mark's story, I stumbled upon an online post about a toddler in Houston who had an infection. Cassio, the boy's father, who also asked to be identified only by his first name, used his phone to take pictures, which were backed up automatically to the cloud. He then sent them to his wife through Google's chat service.

Cassio was in the middle of buying a house when his email account was disabled. The broker became suspicious of him until his real estate agent vouched for him.

It was a real headache, Cassio said.

Technology giants flag millions of images a year of children being exploited or sexually abused. In a single year, Google filed over 600,000 reports of child abuse material and disabled the accounts of over 270,000 users as a result. Mark's and Cassio's experiences were drops in that bucket.

The first tool to seriously disrupt the vast online exchange of so-called child pornography was PhotoDNA, a database of known images of abuse converted into unique digital codes, or hashes, which could be used to quickly comb through large numbers of images to detect matches even if a photo had been altered in small ways. After Microsoft released PhotoDNA in 2009, Facebook and other tech companies used it to root out users circulating illegal imagery.
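PhotoDNA's own algorithm is proprietary, but the general idea — reduce each known image to a compact code and compare new uploads against that list while tolerating small alterations — can be sketched with an open-source perceptual hash. The sketch below uses the `imagehash` library and placeholder file names purely for illustration; it is not PhotoDNA itself.

```python
# Illustrative sketch of hash-based matching; file names are placeholders.
import imagehash
from PIL import Image

# Hashes of previously identified images (the "known" database).
known_hashes = [imagehash.phash(Image.open(p)) for p in ("known_1.jpg", "known_2.jpg")]

def matches_known_image(path, max_distance=5):
    """Return True if the image's perceptual hash is within a small Hamming
    distance of any known hash, i.e. likely the same picture even after
    resizing or minor edits."""
    candidate = imagehash.phash(Image.open(path))
    return any(candidate - known <= max_distance for known in known_hashes)

print(matches_known_image("new_upload.jpg"))
```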

The president of the National Center for Missing and Exploited Children said that it was a great tool.

A bigger breakthrough came in the form of an artificially intelligent classifier that could recognize never-before-seen exploitative images of children. That meant finding not just known images but also images of unknown victims who could potentially be rescued by the authorities. Google made the technology available to other companies, including Facebook.

When the photos were uploaded to Google's servers, this technology flagged them. Jon Callas of the E.F.F. criticized the scanning, saying a family photo album should be a private sphere. A Google spokeswoman said the company scans only when a user takes an "affirmative action," which includes backing up photos to the company's cloud.

This is precisely the nightmare everyone is concerned about, he said: they're going to scan my family album, and I'm going to get into trouble.

After the artificial intelligence flagged the photos, a human content moderator would have reviewed them. When Google makes such a discovery, it locks the user's account, searches for other exploitative material and makes a report to the CyberTipline at the National Center for Missing and Exploited Children.

The organization received about 80,000 reports a day last year. Most involve images that have already been reported and are still circulating online. Fallon McNulty, who manages the CyberTipline, has a staff of 40 analysts who focus on new victims so that those cases can be prioritized.

If a CyberTipline report includes exploitative material that hasn't been seen before, analysts escalate it: the child in the image may not yet have been identified or safeguarded, and may not be out of harm's way.

Reporting these images to the police was, Ms. McNulty said, an example of the system working as it should.

CyberTipline staff members add new abusive images to the hashed database that is shared with technology companies for scanning. When Mark's wife learned this, she deleted the photos Mark had taken of their son from her phone, afraid that Apple would flag her account. Apple had announced plans to scan iCloud for sexually abusive depictions of children, but the rollout was delayed after privacy groups objected.

The CyberTipline reported over 4,000 potential new child victims to the authorities. Among them were the sons of Mark and Cassio.

A police investigator was unable to get in touch with Mark because his Google Fi phone number no longer worked. Credit...Aaron Wojack for The New York Times

The San Francisco Police Department sent Mark an envelope containing a letter informing him that he had been investigated, along with copies of the search warrants that had been served. An investigator had asked for everything in Mark's account: his internet searches, his location history, his messages, and any document, photo and video he'd stored with the company.

He took the photos of his son a week before the search took place.

The investigator, Nicholas Hillard, told Mark that the case was closed. Mr. Hillard had tried to get in touch with Mark, but his phone number and email address no longer worked.

Mr. Hillard determined that the incident did not meet the elements of a crime. The police, who had access to the information Google had gathered on Mark, concluded that it was not child abuse.

Mark asked whether Mr. Hillard could tell Google that he was innocent so that he could get his account back.

Mark said Mr. Hillard told him he would have to take it up with Google himself; there was nothing the investigator could do.

Mark appealed to Google again, providing the police report, but it was not enough to get his case heard. After getting a notice two months ago that his account was being permanently deleted, Mark spoke with a lawyer about whether he could file a lawsuit.

He decided it probably was not worth the cost, he said.

Kate Klonick, a law professor at St. John's University who studies content moderation, said it can be difficult to account for things that are invisible in a photo, like the behavior of the people sharing an image or the intentions of the person taking it. Scanning billions of images is bound to produce false positives, but most people would probably consider the trade-off worthwhile, given the benefit of identifying abused children.
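To see why scale alone guarantees some mistakes, consider a back-of-the-envelope calculation. The numbers below are illustrative assumptions, not figures from Google or from the reporting:

```python
# Hypothetical numbers chosen only to illustrate the scale argument.
images_scanned = 2_000_000_000        # assume two billion photos scanned
false_positive_rate = 1 / 1_000_000   # assume one error per million images

expected_false_flags = images_scanned * false_positive_rate
print(f"Expected wrongly flagged images: {expected_false_flags:,.0f}")  # 2,000
```

Even a vanishingly small error rate, applied at the scale these companies operate, produces thousands of wrongly flagged images.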

It would be problematic if this were just a case of content moderation, Ms. Klonick said, but it is more dangerous because it also results in someone being reported to law enforcement.

It could have been worse, she said, with a parent potentially losing custody of a child; it was easy to imagine how such a case could get out of hand.

Cassio was also investigated by the police. A detective from the Houston Police Department asked him to come into the station.

After showing the detective his communications with the doctor, Cassio was quickly cleared. But he was unable to get his account back, even though he was a paying customer of the internet giant. He now uses a Hotmail address for email, which people mock him for, and he makes multiple backups of his data.

Mark was frustrated at Google's refusal to reinstate his account after he explained what had happened. Credit...Aaron Wojack for The New York Times

Not all pictures of naked children are exploitative. Carissa Byrne Hessick, a law professor at the University of North Carolina who writes about child pornography crimes, said that defining what constitutes sexually abusive imagery can be complicated.

Ms. Hessick said she agreed with the police that medical images did not qualify. There is no abuse of the child, she said; the picture is taken for nonsexual reasons.

In machine learning, a computer program is trained by being fed correct and incorrect examples until it can distinguish between the two. To avoid flagging photos of babies in the bath or children running unclothed through sprinklers, the A.I. for recognizing abuse was trained both with images of potentially illegal material found in user accounts in the past and with images that were not indicative of abuse.
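That is the standard supervised-learning recipe: show a model labeled examples of both categories until it can tell them apart. A minimal, generic sketch of the recipe on synthetic data follows; it is not Google's classifier, and scikit-learn, the random feature vectors and the logistic-regression model are stand-ins chosen purely for illustration.

```python
# Generic supervised training on synthetic "image features".
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Two synthetic clusters standing in for "material flagged in the past"
# and "benign images" used as negative examples.
positives = rng.normal(loc=1.0, scale=1.0, size=(500, 16))
negatives = rng.normal(loc=-1.0, scale=1.0, size=(500, 16))

X = np.vstack([positives, negatives])
y = np.array([1] * 500 + [0] * 500)   # 1 = send to human review, 0 = ignore

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Fit the model on the labeled examples, then check held-out accuracy.
model = LogisticRegression().fit(X_train, y_train)
print("held-out accuracy:", round(model.score(X_test, y_test), 3))
```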

The decision to flag Mark's photos was understandable: they were explicit photos of a child's genitalia. But context matters: they were taken by a parent worried about a sick child.

Claire Lilley, Google's head of child safety operations, acknowledged that it is sometimes necessary for parents to take photos of their children to get a diagnosis. She said the company consulted doctors to make sure its human reviewers understood possible conditions that might appear in photographs.

Dr. Suzanne Haney, chair of the American Academy of Pediatrics' Council on Child Abuse and Neglect, advised parents against taking pictures of their children's genitals, even when directed to by a doctor.

It is not a good idea for a child's genitalia to be photographed, she said, and if parents have to, they should avoid uploading the pictures to the cloud.

Most physicians are probably unaware of the risks in asking parents to take these photos, she said.

The doctor applauded Google's efforts to fight abuse. It is a horrible problem, she said, and it has unfortunately gotten tied up with parents trying to do right by their children.

A customer support representative told Cassio that sending the pictures to his wife using Google's chat service violated its terms of service, which warn against using Hangouts in any way that exploits children. There is a zero-tolerance policy against such content, the representative said.

As for Mark, Ms. Lilley said that reviewers had not detected a rash or redness in the photos he took, and that the subsequent review of his account turned up a video of a young child lying in bed with an unclothed woman.

Mark said it sounded like a private moment he would have been inspired to capture, not realizing it would ever be seen or judged by anyone else.

He could picture it, he said: waking up one morning to a beautiful day with his wife and son and wanting to record the moment. If they had been wearing sleepwear, Mark said, this all could have been avoided.

Even after the police cleared the two men of any wrongdoing, Google stood by its decisions.

Ms. Hessick said the cooperation technology companies provide to law enforcement to root out child sexual abuse is important, but she believes the process should allow for correction when it makes mistakes.

She speculated that it was simply easier for the company to deny people the use of its services; otherwise, Google would have to answer more difficult questions about what is appropriate to photograph and what is not.

Mark still wants his information back. The police have the contents of his account preserved on a thumb drive, and Mark is trying to get a copy. The police department has said it is interested in helping him.

Grant contributed reporting. Susan Beachy contributed research.