
Meta has decided to reopen the question of what it should do about misinformation related to COVID-19.

Meta has tried to remove false claims about the disease from its platforms, and it has been criticized for not doing a good job. Asked last year about the role Facebook played in spreading misinformation about the disease, President Biden accused the platform of "killing people," though he walked the statement back a day later.

Biden's remark voiced a fear that the platform's huge user base and algorithmic recommendations often combine to help fringe conspiracy theories reach mainstream audiences, promoting vaccine hesitancy, resistance to wearing masks, and other public health harms.

That fear remains live: average daily COVID deaths are up 34 percent over the past two weeks. There are also fears of a surge in cases of long COVID, a condition that experts say has already been a mass disabling event. According to the US Centers for Disease Control and Prevention, an estimated 1 in 13 American adults have long COVID symptoms.

Against that backdrop, Meta is considering relaxing some of the restrictions it has placed on COVID-related misinformation, including its policy of removing posts that make false claims about vaccines, masks, and related subjects. The company has asked the Oversight Board, which it funds, for an advisory opinion on how to proceed.

Nick Clegg, Meta's president of global affairs, explained the decision in a blog post:

In many countries, where vaccination rates are relatively high, life is increasingly returning to normal. But this isn’t the case everywhere and the course of the pandemic will continue to vary significantly around the globe — especially in countries with low vaccination rates and less developed healthcare systems. It is important that any policy Meta implements be appropriate for the full range of circumstances countries find themselves in.

Meta is fundamentally committed to free expression and we believe our apps are an important way for people to make their voices heard. But some misinformation can lead to an imminent risk of physical harm, and we have a responsibility not to let this content proliferate. The policies in our Community Standards seek to protect free expression while preventing this dangerous content. But resolving the inherent tensions between free expression and safety isn’t easy, especially when confronted with unprecedented and fast-moving challenges, as we have been in the pandemic. That’s why we are seeking the advice of the Oversight Board in this case. Its guidance will also help us respond to future public health emergencies.

Meta's enforcement of its health misinformation policies has drawn criticism, but some of the steps it took clearly improved the platform. The company's stricter policies, which now mandate the removal of 80 distinct false claims about the disease and its vaccines, have led to more than 25 million posts being taken down.

At the same time, the platform has sometimes overreached. Meta once banned discussion of the possibility that COVID-19 leaked from a Chinese lab, a decision it made amid rising violence against Asian people, fearing that conspiracy theories about the disease's origins could be used to justify further attacks. It later reversed the ban.

Meta now allows people to speculate about the origin of the virus, a question on which no consensus has been reached. The company probably should not have taken a stance on the issue in the first place; it could instead have used its existing hate-speech policies to moderate racist posts.

I generally favor an interventionist approach when it comes to conspiracy theories on social networks: given the harm done by adherents to QAnon, Boogaloo, and other extremist movements, I see real value in platforms reducing their reach and even removing them entirely.

On some questions, though, platform intervention may do more harm than good. Banning the lab-leak hypothesis gave it the appearance of forbidden knowledge, when acknowledging the reality — that it is unlikely, but an open question — may have been just dull enough to prevent it from catching fire in the internet's fever swamps.

That history helps explain why Meta is asking the board for a second opinion on health misinformation. The company assumes there will be future pandemics, each bringing policy issues of its own, and it wants to act more thoughtfully the next time around. Because the Oversight Board can take months to deliver an opinion, Meta wanted the process to start now.

Clegg said the company wanted a check on its own power, which is why it signed a new three-year, $150 million operating deal with the board.

Removal is Meta's most severe sanction, he told me, and the company has never applied it on this scale before. That, he said, is why it made sense to refer the question to the Oversight Board.


Weighing in on policy questions like this one is among the board's core duties, along with hearing appeals from users who believe their posts were wrongly taken down. When the board takes those cases, its decisions are binding.

The board also issues advisory opinions on how Meta should change its policies, sometimes attached to its decisions in individual cases and sometimes on their own. Those opinions are not binding, but Meta has adopted most of the changes the board has proposed.

Still, plenty of people write the board off. Since it began hearing cases in 2020, critics have accused it of being little more than a public-relations function for the company.

But Meta and other social platforms have a profound need for even this kind of rudimentary justice system. The board received more than a million appeals in its first year. Before it existed, users had no recourse when Facebook made a mistake, and no way to appeal the company's calls on tough questions about speech.

A system in which these cases are heard by an expert panel is superior to one in which they are heard by no one, even if it still leaves a lot to be desired.

So what happens now?

It is possible that Meta's policy teams want to relax restrictions on COVID-related speech but also want the cover that a decision from the Oversight Board would give them. The board is stocked with free-speech advocates, and when it has ruled against Meta, it has generally been in the name of restoring posts it believed were wrongly removed.


If the board gives Meta the go-ahead to relax its policies, the company can expect significant backlash from left-leaning politicians and journalists. In that scenario, Meta would likely turn to softer measures to limit the spread of misinformation, such as adding fact-checks and reducing the distribution of false posts in feeds. And any anti-vaccine content left standing on Meta's platforms could lead to new harms.

There is also a chance the board won't take the bait, and will instead affirm that removing health misinformation remains a necessary step, at least for now. The board is still new and largely unknown to the general public, and I wonder how much appetite its members have to stand up for this kind of speech.

Whatever the board decides, Clegg suggested Meta will be cautious about any changes, saying the company wants to be careful about how it removes posts.

He told me the removal sanction should be used sparingly and the bar set high: private-sector companies should not be taking content down unless it is clearly tied to real-world harm.