TikTok logo | Illustration by Alex Castro / The Verge

A Forbes report raises questions about how TikTok handles child sexual abuse material.

Employees of a third-party moderation firm that works with TikTok and other companies say they were asked to review a spreadsheet called the DRR, or Daily Required Reading, which laid out TikTok's moderation standards. The spreadsheet allegedly contained hundreds of images of children who were nude or being abused, and hundreds of people reportedly had access to it, both inside and outside the office.

TikTok said its training materials have strict access controls and do not include visual examples of CSAM, and that it aims to minimize moderators' exposure to this kind of content, which it says has no place on its platform. The company did not, however, confirm that all of its third-party vendors meet that standard.

Moderators on all platforms deal with child abuse material — but within strict limits

Forbes says the employees tell a different story, and a legally dicey one. Moderators across the industry routinely deal with CSAM posted to social platforms, but child abuse imagery is unlawful in the US and must be handled carefully: companies are supposed to report it to the National Center for Missing and Exploited Children (NCMEC), then preserve it for 90 days while minimizing the number of people who see it.

The allegations here describe practices that go far past that limit. Employees say they were shown graphic photos and videos as examples of what to tag on TikTok, while access to that content was handled loosely. One employee contacted the FBI to ask whether the practice itself constituted a crime involving CSAM.

The full Forbes report is well worth a read. It details a situation in which moderators were unable to keep up with TikTok's rapid growth and were told to view crimes against children for reasons they felt didn't add up. It's a strange situation even by the complicated standards of debates over child safety online.

Update: A statement from TikTok has been added.