Meta's Oversight Board has released an in-depth report on the company's cross-check system, calling on Meta to make the program more transparent and to beef up its resources.

Cross-check is a special moderation queue Meta maintains for high-profile public figures, including former president Donald Trump. The report focuses on Meta's failure to make clear when accounts are protected by special cross-check status, as well as cases where rule-breaking material was left up for long periods while awaiting review. Meta also didn't keep track of moderation statistics that might assess the accuracy of the program's results.

"While Meta told the board that cross-check aims to advance Meta's human rights commitments, we found that the program is more structured to satisfy business concerns," the report says. Meta is a business, the board acknowledges, but by providing extra protection to certain users selected largely according to business interests, cross-check allows content that would otherwise be removed quickly to remain up for a longer period, potentially causing harm.

The Wall Street Journal first disclosed details about cross-check more than a year ago. Meta then asked the Oversight Board to evaluate the program, but the board complained that Meta had failed to give it important information. Today's announcement follows a review that included thousands of pages of internal documents, four briefings from the company, and a request for answers to 74 questions. The resulting document includes diagrams, statistics, and statements from Meta.

Alan Rusbridger, a member of the Oversight Board, says the review exposed something more systemic inside the company. There are a lot of people at Meta who believe in the values of free speech and journalism, he says, but the program they had created wasn't serving those values. It was protecting a small group of people who were not aware they were on the list.

Cross-check is designed to prevent inappropriate takedowns of posts from a subset of users by sending those decisions through a set of human reviews instead of the normal moderation process. As Rusbridger notes, its members include journalists reporting from conflict zones, but the list also covers publishers, entertainers, companies, and charitable organizations.

The program favors under-enforcing the company's rules to avoid a "perception of censorship" or a bad experience for people who bring significant money and users to Facebook. It can take more than five days to reach a decision on a piece of content, according to Meta. In one case where the decision was delayed even further, a piece of content remained in the queue for over seven months.

The Oversight Board has often criticized Meta for removing posts, particularly political or artistic ones. But here, it was concerned that Meta was letting its business partnerships overshadow real harm. When a Brazilian soccer player posted nude pictures of a woman who had accused him of rape, the removal was delayed because of cross-check. The board notes that Meta later signed a streaming deal with the same player.

Part of the problem is scale: because of the sheer volume of content on social media, ordinary users don't get the same level of moderation. In October of 2021, Meta said it was performing 100 million enforcement actions on content every day. Many of these decisions are automated or given very cursory human review, since that volume would be impossible to handle with a purely human-powered moderation system. But the board doesn't know whether Meta tracks or tries to analyze the accuracy of the cross-check system compared with ordinary moderation. If it did, the results could show whether Meta was under-enforcing its policies for high-profile users, or whether a lot of ordinary users' content was being wrongly flagged.

The board made 32 recommendations in total. Meta isn't bound to adopt them but must respond within 60 days. One recommendation is to hide posts that are marked as high-severity violations while a review is underway. To improve moderation, the board wants Meta to adopt a queue for this content that is separate from the one for Meta's business partners. It also wants Meta to set out clear, public criteria for who is included on cross-check lists, and, in some cases, such as state actors and business partners, to publicly mark that status.

Some of these recommendations, like the public marking of accounts, are policy decisions that wouldn't require significant extra resources. But Rusbridger acknowledges that others, like eliminating the backlog for cross-check, would require a substantial expansion of Meta's moderation force. And the company laid off around 13 percent of its workforce just last month.

Rusbridger is hopeful that Meta will still prioritize moderation even as it tightens its belt. He hopes that Meta will be able to hold its nerve, and that it will realize cutting these "soft" areas is not a good idea in the long term.