Facebook's international misinformation problem is even bigger than it is in the U.S.

Internal Facebook documents show that misinformation on the social media platform extends far beyond the United States.
The documents, collected by whistleblower Frances Haugen and known as The Facebook Papers, show Facebook's lack of cultural awareness and resources in countries such as India, Myanmar and Sri Lanka. That gap has allowed hate speech and extremist political sentiment to spread, which could be tied to violence and influence national elections.

Facebook does not monitor the social pulse of its platform outside the United States as closely, but The New York Times reports that the company is aware its platform can affect politics in these countries. In India, internal researchers conducted field studies and ran tests on Facebook's algorithm; a test account's News Feed quickly filled with hate speech, misinformation and celebrations of violence. This content came both from legitimate users and from uncensored bots.

"Following this test user News Feed Ive seen more images about dead people in three weeks than Ive ever seen in my entire lifetime," wrote an internal Facebook researcher.

These documents also reveal that Facebook takes a lopsided approach to fighting misinformation: the company devotes 87 percent of its budget for combating misinformation to the U.S. and the remaining 13 percent to the rest of the globe.

With so few resources available, the measures taken in countries like India, which has its own unique radical politics and nationalisms, are often ineffective. This seems like a bizarre misallocation when you consider that India is Facebook's largest market, with 340 million users across its platforms.

Facebook users can easily upload graphic posts promoting anti-Muslim or anti-Pakistan rhetoric in any of India's 22 official languages. According to the Times, Facebook's AI is trained in only five of those languages, with human reviewers covering "some other" languages.

The company's documents show that much of the harmful content in Hindi and Bengali, two of India's most widely spoken languages, is not flagged due to insufficient data. Facebook's failure to devote more resources to India means it cannot consistently combat dangerous bots or violent groups, even as misinformation campaigns ramp up during national elections.

These issues are most severe in India, but similar resource problems plague countries such as Myanmar, where Facebook's efforts to curb harmful rhetoric were not enough and may have contributed to a coup. Although the company took measures to restrict misinformation posted by the military during Myanmar's elections, it did not maintain them afterward. Facebook rolled back its security measures, and the military executed a coup three months later.

Although Facebook acknowledges its role in foreign political violence and says it is working to correct it, these documents reveal that it often does too little, too late. If the company wants to operate ethically on a global scale, it must show its largest markets the same cultural sensitivity and dedication of resources needed to keep its users safe. Facebook must reexamine its approach to misinformation around the world, not just in the United States, where the company is based.

Katie Harbath, a former director of public policy at Facebook, told the Times that her former employer needs to find a solution that can be applied globally. Harbath acknowledged a problem with resourcing, "But the solution is not just throwing more money at it."