
New York City's Commission on Human Rights (CCHR) has settled discrimination allegations with Tumblr over the platform's adult content ban. The settlement requires Tumblr to revise its user appeals process, train its human moderators on diversity and inclusion issues, review thousands of old cases, and hire an expert to look for bias in its moderation algorithms.

The settlement did not involve a formal legal complaint, but it marks one of the first times regulators have reached an agreement to change a social network's moderation policies. The CCHR began an investigation in December of last year, after Tumblr banned explicit sexual content and nudity and enforced the rules with an automated system.

“If someone is doing business in New York City, we have the authority to investigate”

The CCHR took interest after reports that the ban would disproportionately affect LGBTQ users. New York City's Human Rights Law protects against discrimination based on gender identity and sexual orientation.

The settlement gives Tumblr 180 days to hire an expert with experience in sexual orientation and gender identity issues and expertise in image classification, who will review the site's moderation to determine whether it disproportionately flags LGBTQ content. As part of a broader review, Tumblr will reexamine 3,000 old cases in which a user successfully appealed a takedown, looking for patterns that could indicate bias.

The CCHR credits the deal to cooperation from Automattic, the WordPress.com parent company that now owns Tumblr. Tumblr had already revised its original automated filtering system before the settlement, and following a larger community exodus after the ban, it has tried to reconcile with the LGBTQ users who left.

Rodriguez, of the CCHR, thinks the Tumblr settlement could be the beginning of a larger regulatory movement.

Social media bias cases have rarely succeeded in court

Bias allegations against social media platforms have rarely succeeded in court, and today's settlement seems driven partly by Automattic's desire to restore trust with the LGBTQ community. The company is small and has fewer legal resources than the tech giants. The strength of the case is difficult to evaluate because the CCHR didn't release details about its evidence. Larger platforms have faced accusations of discrimination without regulatory action; YouTube, in particular, has defeated two lawsuits from Black and LGBTQ video creators who alleged algorithmic discrimination.

Rodriguez says the CCHR's city-level rules don't require proof of a specific intent to discriminate. But Section 230 of the Communications Decency Act preempts conflicting state and municipal laws, and a CCHR lawsuit would have to stand up to that scrutiny.

The larger issue of algorithmic race and gender bias has become an increasing priority for regulators, particularly where it could affect people's access to housing and employment. Even without legal complaints, some companies have reviewed their moderation methods under public scrutiny, sometimes making troubling discoveries along the way.