Facebook Is Everywhere; Its Moderation Is Nowhere Close

Facebook launched Arabic support in 2009, and it was a huge success. The service was lauded for its role in the Arab Spring mass protests. Last year, Arabic was the third most common language on Facebook, and people in North Africa and the Middle East spent more time on the platform than users in any other region.
Two internal studies from last year found that Facebook is far less effective at understanding and policing Arabic content. One, a detailed account of Facebook's handling of Arabic, shows that both human and automated reviewers struggle to understand the varied dialects spoken across the Middle East and North Africa. The result: in a region plagued by political instability, Facebook wrongly censors benign posts while failing to catch content that promotes terrorism, exposing Arabic speakers and others to hateful speech.

The study makes clear that Arabic is less a single language than a family of languages, many of which are mutually unintelligible.

The documents about Facebook's Arabic foibles are part of a trove of internal material, known collectively as The Facebook Papers, that shows the company failing or neglecting to manage its platform in places far from its California headquarters, regions where the vast majority of its users live. Many of these markets are in economically poor parts of the world and are beset by the political violence and ethnic tensions that social media often magnifies.

The documents were disclosed to the Securities and Exchange Commission and provided in redacted form by Frances Haugen's legal counsel. A consortium of news organizations, including WIRED, reviewed the redacted versions.

Although the collection offers only a partial view of Facebook's social network, it reveals enough to illustrate the enormous challenge created by the company's success. A Harvard student site that rated the looks of women has evolved into a global platform with nearly 3 billion users posting in over 100 languages. Curating such a service perfectly is difficult, and the company's protections for users in poorer countries seem especially uneven. Facebook users who speak languages such as Pashto, Armenian, and Arabic are effectively second-class citizens of the world's largest social network.

Some of Facebook's failures are technical. Because the platform's scale makes it impossible for humans to review every post, the company relies on artificial intelligence to help police problematic content, and computer scientists say machine learning algorithms still struggle with the subtleties of language. Other flaws stem from Facebook's decisions about where and how to invest.

Facebook says that nearly two-thirds of the people who use it do so in a language other than English, and that it moderates content in the same way globally. A company spokesperson said 15,000 people review content in more than 70 languages and that Facebook has published its Community Standards in 47 languages. Facebook allows users to post in more than 110 languages.


A December 2020 memo about combating hate speech and violence in Afghanistan warned that users cannot easily report inappropriate content because Facebook has not translated its community standards into Pashto or Dari. The online forms for reporting hate speech were only partially translated into those languages, with many words presented in English. The memo also states that Facebook's translation of the phrase "hate speech" into Pashto, a language also spoken widely in Pakistan, is inaccurate.

A Facebook spokesperson said the company's goal is to reduce the prevalence of hate speech, and that by Facebook's own figures that prevalence has fallen globally on average since mid-2020. The spokesperson described the work as the largest effort by any major consumer tech company to remove hate speech, adding that while Facebook still has a lot to do, it is determined to get it right.