With an image of himself on a screen in the background, Facebook co-founder and CEO Mark Zuckerberg testifies before the House Financial Services Committee in the Rayburn House Office Building on Capitol Hill October 23, 2019 in Washington, DC.

An Oversight Board report found that the special-track content review program for high-profile users and businesses may cause harm to the public and may not protect fair and safe speech.

The board's recommendations come at a time when rival network Twitter is grappling with moderation issues of its own, and they underscore concern over whether ordinary users are treated fairly on Facebook.

The Oversight Board, established at the direction of CEO Mark Zuckerberg, supported the banning of Donald Trump in the wake of the insurrection.

The XCheck program was first reported by The Wall Street Journal in September of 2021.

In a 57-page report, the Board excoriated what it found to be a program that promoted an unequal system, and it noted that Meta never established how effective the special-track program was compared with its standard moderation processes.

The report found that potentially offensive content could stay on the site for hours if the user who posted it was part of the special program.

Meta said that it also has a system, outside of the cross-check program, that blocks some enforcement actions.

That system provides exceptions for certain users from a pre-selected list of content policy violations. Meta processes about a thousand of these technical corrections.

For most users, content moderation on social media sites is straightforward: potentially problematic content is flagged, either by automated processes or when a human reports questionable content, and a decision on the content is then made by an outsourced contractor.

For a select few, the cross-check program activated a more human-driven review process.

The first step was a review by a dedicated team of Meta employees and contractors with a degree of regional expertise, an opportunity the general public did not enjoy.

In Afghanistan and Syria, the average review time for reported content was 17 days, in part because Meta has struggled to hire language experts around the world.

The content was then reviewed by a panel of Meta executives.

If the company faced significant legal, safety or regulatory risk, the most senior Meta executives would be involved.

A decision to fast-track a content review was made when there was a degree of urgency or potential consequences for the company.

The content review process for the general public was changed in the wake of the Journal's initial reporting.

After initial detection and review, content is now sorted by an automated process.

If content requires a deeper examination, Meta employees or contractors can escalate it to the Early Response Team, the highest level of review available to the general public.
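For readers who find a concrete illustration helpful, the following is a minimal sketch of the two review tracks as described above. The tier names, fields, and routing rules here are assumptions made for clarity only; they are not Meta's actual systems or code.

```python
# Hypothetical illustration only: tier names and routing rules are assumptions
# based on the article's description, not Meta's actual implementation.
from dataclasses import dataclass

STANDARD_TIERS = [
    "automated_sorting",      # initial detection and automatic triage
    "outsourced_contractor",  # standard review for most users
    "early_response_team",    # highest level available to the general public
]

CROSS_CHECK_TIERS = [
    "regional_specialist_team",  # Meta employees/contractors with regional expertise
    "executive_panel",           # panel of Meta executives
    "senior_leadership",         # involved for legal, safety or regulatory risk
]

@dataclass
class FlaggedPost:
    post_id: str
    author_on_cross_check_list: bool
    needs_deeper_examination: bool = False
    poses_company_risk: bool = False

def review_path(post: FlaggedPost) -> list[str]:
    """Return the sequence of review tiers a flagged post would pass through."""
    if post.author_on_cross_check_list:
        path = [CROSS_CHECK_TIERS[0]]
        if post.needs_deeper_examination:
            path.append(CROSS_CHECK_TIERS[1])
        if post.poses_company_risk:
            path.append(CROSS_CHECK_TIERS[2])
        return path
    path = [STANDARD_TIERS[0], STANDARD_TIERS[1]]
    if post.needs_deeper_examination:
        path.append(STANDARD_TIERS[2])
    return path

# Example: an ordinary user's post versus a cross-checked user's post.
print(review_path(FlaggedPost("p1", author_on_cross_check_list=False)))
print(review_path(FlaggedPost("p2", author_on_cross_check_list=True,
                              needs_deeper_examination=True)))
```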

The report made numerous recommendations regarding the cross-check program. The first was to divide Meta's content review system into two separate streams, in part to fulfill Meta's human rights responsibilities.

Others included barring Meta's government relations and public policy teams from moderating content and establishing a clear set of public criteria for inclusion on cross-check or successor lists.

A representative from Meta didn't reply to a request for comment.
