Facebook's moderation systems are in need of repair.

Meta's Oversight Board has reversed two of Facebook's decisions to remove content from its platform. Together, the cases expose flaws in both halves of the platform's moderation protocol: automated takedowns of content and human removal of news content.

The first case from the Oversight Board concerns a Facebook user who posted a cartoon depicting police brutality by Colombia's National Police. The post was taken down 16 months later, when the company's automated systems matched the cartoon against an image in its Media Matching Service bank.

The image did not violate Facebook's rules and should never have been added to the Media Matching Service bank in the first place.

The Oversight Board said this user was not the only one affected: 215 people appealed the removal of posts containing the image, and 98% of those appeals to Meta succeeded. Yet the cartoon remained in the bank, continuing to trigger automated detections and removals. Only when the Oversight Board decided to take up the case did Meta remove the image from the bank.

In the second case, Meta wrongly removed a news post about the Taliban. In January 2022, an India-based newspaper reported the Taliban's announcement that it would re-open schools for women and girls. Meta determined that the post violated its policy, construing the report as praise for the Taliban.

Because of the removal, the Indian newspaper's access to certain Facebook features was limited. The newspaper tried to appeal the decision, but the appeal was never reviewed because there weren't enough reviewers who could speak Arabic.

Meta reversed its decision once the Oversight Board decided to take the case. The Board found that simply reporting the news does not violate Facebook's policies.

The Oversight Board used both cases as an opportunity to recommend changes to Facebook's moderation systems, automated and human-reviewed alike.

The Oversight Board was formed to adjudicate Meta's moderation decisions. In January 2021, the organization released decisions on its first cases. One of those early rulings called for the restoration of a post that Muslim activist groups had deemed hate speech. The Board's most notable case so far has been its decision to uphold Meta's suspension of Donald Trump, who was removed from the platform following the riot at the Capitol building.

The Oversight Board also forced Meta to set a time limit on Trump's suspension. Meta said it would consider allowing Trump back on its platforms in January 2023. That may sound far off, but it is just a few months away. Don't be surprised to see Trump's name on another Oversight Board case when he returns to Facebook.