Illustration by Alex Castro / The Verge

Tech companies are legally required to report any child sexual abuse material (CSAM) found on their platforms to the National Center for Missing and Exploited Children (NCMEC). Content flagged as possible CSAM is reviewed by the companies themselves, which determine whether it should be reported to NCMEC.

According to a new report from The New York Times, Facebook has a policy that could mean it is underreporting child sexual abuse content. When content moderators can't determine the age of a person in a photo or video suspected to be CSAM, a training document instructs them to err on the side of assuming the person is an adult.

A California Law Review article from August described the policy as it applies to Facebook content reviewers:

Interviewees also described a policy called “bumping up,” which each of them personally disagreed with. The policy applies when a content moderator is unable to readily determine whether the subject in a suspected CSAM photo is a minor (“B”) or an adult (“C”). In such situations, content moderators are instructed to assume the subject is an adult, thereby allowing more images to go unreported to NCMEC.

The New York Times reported the company's reasoning for the policy:

Antigone Davis, head of safety for Meta, confirmed the policy in an interview and said it stemmed from privacy concerns for those who post sexual imagery of adults. “The sexual abuse of children online is abhorrent,” Ms. Davis said, emphasizing that Meta employs a multilayered, rigorous review process that flags far more images than any other tech company. She said the consequences of erroneously flagging child sexual abuse could be “life-changing” for users.

When contacted for comment, Facebook pointed to the quotes Davis gave The New York Times and did not immediately respond to a further request for comment.