
Today's topic is the highest-profile conflict to date between Meta and its Oversight Board, the independent organization the company established to help it navigate the most difficult questions of policy and content moderation.

Since its creation, the board has faced criticism that it primarily serves a public-relations function for the company formerly known as Facebook. The board relies on funding from Meta, has a contractual relationship with the company governing its use of user data, and its founding members were hand-picked by the company.

The fact that the board and Meta have rarely been in open conflict has fed the perception that it is mostly a PR project. The company has implemented 14 of the 18 policy recommendations the board has made to it. And while the board often rules against Facebook's content moderation decisions, none of those reversals have generated much controversy. The more credible the board is, the more blame it can shoulder for unpopular calls.

That is what made this week's statements so noteworthy.

After Russia invaded Ukraine, Meta asked the board for an advisory opinion on how to moderate content during wartime. The conflict raised a number of difficult questions, including under what circumstances users can post photos of dead bodies, or videos of prisoners of war criticizing the conflict.

In the most prominent content moderation question of the invasion to date, Meta decided to temporarily allow calls for violence against Russian soldiers and others.

The decision raised questions about the balance between free expression and user safety, and Meta asked the board to weigh in. Then it changed its mind.

From the company's post:

Late last month, Meta withdrew a policy advisory opinion (PAO) request related to Russia’s invasion of Ukraine that had previously been referred to the Oversight Board. This decision was not made lightly — the PAO was withdrawn due to ongoing safety and security concerns.

While the PAO has been withdrawn, we stand by our efforts related to the Russian invasion of Ukraine and believe we are taking the right steps to protect speech and balance the ongoing security concerns on the ground.

The board said in a statement that it was disappointed by the move:

While the Board understands these concerns, we believe the request raises important issues and are disappointed by the company’s decision to withdraw it. The Board also notes the withdrawal of this request does not diminish Meta’s responsibility to carefully consider the ongoing content moderation issues which have arisen from this war, which the Board continues to follow. Indeed, the importance for the company to defend freedom of expression and human rights has only increased.

I spent a day talking with people familiar with the matter who could tell me what happened. Here's what I learned.

One of the most disturbing trends of the past year has been the way authoritarian governments have used intimidation to force platforms to do their bidding. An app that enabled anti-Putin forces to organize before an election was removed from both Apple's and Google's stores after Russian agents threatened the companies' employees with jail time or worse.

Life for those employees and their families has only become more difficult since Putin invaded. The combination of sanctions from the United States and Europe has forced many platforms to withdraw services from Russia.

In the wake of Meta's decision to allow calls for violence against the invaders, Russia said the company had engaged in "extremist" activities. That puts hundreds of Meta employees at risk of being jailed. While the company has successfully removed its employees from the country, the extremism designation could mean they will never be allowed to return as long as they work at Meta. And families of employees still in Russia could be subject to persecution.

There are precedents for both outcomes under Russia's laws.

What does all this have to do with the Oversight Board?

Meta asked the board for a broad opinion about its approach to wartime moderation, and the board has shown a willingness to make sweeping policy recommendations even on narrower cases submitted by users. After making the request, the company's legal and security teams became concerned that whatever the board said could be used against employees or their families in Russia, either now or in the future.

The Oversight Board is distinct from Meta. But many people in the West refuse to recognize that distinction, and company lawyers worried that Russia wouldn't, either.


All of this is compounded by the fact that tech platforms have gotten little to no support to date from either the United States or the European Union in their struggles to keep key communication services up and running in Russia and Ukraine. I don't know what Western democracies could do to reduce companies' fear of how Russia might treat their employees and families. But over the past year, discussions with executives at several big tech companies have made it clear that they all feel on their own.

The news still represents a blow to the Oversight Board's already fragile credibility, and arguably reduces its value to Facebook. The company created an independent body to advise it on policy matters, asked that body for advice that would not even be binding, and then decided the request itself was too dangerous to pursue. If the board only handles the easy questions, why bother with it at all?

Neither Meta nor the board would comment beyond their statements. To the company's credit, it has stood up to Russia in some important ways, including standing by its decision to allow Ukrainians to call for Putin's death. Meta did not simply roll over for Russia.

Still, at a crucial moment, Facebook executives failed to properly understand risk and public perception. Russia had been threatening platform employees for months. The danger to employees and their families existed before Facebook ever asked its board for an opinion. Realizing that only weeks later was, well, quite an oversight.

I'm on record as saying that the Oversight Board has changed Facebook for the better. And tech companies have few good options when authoritarian governments threaten their employees; the Russia case was arguably a no-win situation.

But that doesn't mean the episode won't damage the board. Critics worried that if the stakes ever got high enough, Facebook would simply make the big decisions itself. When Putin invaded his neighbor, those critics were proven correct.