On the morning of June 30, 2022, two large luxury buses pulled up to a grand hotel where the members of the Oversight Board—the august gaggle set up two years earlier by Facebook, the company now called Meta—were milling on the driveway. The members had already spent plenty of time together on video calls and email; now the buses carried all 23 of them to Meta's headquarters.

The group walked across the complex to an outdoor gathering space called the Bowl, where Sheryl Sandberg, Meta's outgoing COO, greeted the crowd in the midday heat. Nick Clegg, the company's president of global affairs, spoke next, heaping praise on the board. As he was taking questions from the members, the big screens in the Bowl lit up with a familiar face.

The sweaty visitors peered at Mark Zuckerberg, who had never before met with all of the current members. A fair guess put him at his Hawaiian island retreat, where he had spent much of the previous year. The board had done a good job so far, he told them. Free expression had always been part of his company's mission, he said, even though some people use their voices to put others in danger—and Meta shouldn't be making so many of those decisions alone. He finished his talk by assuring the members that he was committed to the board for the long term.

A few weeks later, Meta announced it would give the board an additional $150 million on top of what it had originally committed. By then the board had received over two million appeals, ruled on just 28 of them, and issued more than 100 recommendations to Meta. Among its judgments: upholding the removal of a former US president from Facebook.

Critics dismiss the Oversight Board as an exercise in corporate ass-covering performed by a bunch of Meta's puppets. When the company doesn't want to make a controversial call, it can push the decision onto the board. Even the name sows confusion: Emi Palmor, a former director general of Israel's Justice Ministry, says she is frequently approached in the supermarket by people seeking tech support for Meta's apps. She'd like to kill whoever named the board—who that was, she says, remains a mystery.

Since it started hearing moderation cases in the fall of 2020, the board has gained grudging respect from human rights organizations and content-moderation wonks. A law professor who follows the board closely says people expected a total debacle; instead, it has brought a measure of accountability to the social networking giant. Meta, for its part, proclaims victory—"I'm absolutely delighted, thrilled, thrilled with the progress"—and says the board's approach to cases is exactly what it hoped for.

The truth is more complicated—and it makes some board members nervous. If the company being overseen thinks the oversight is going great, how tough can that oversight really be? Suzanne Nossel, CEO of the free-expression group PEN America, thinks it is too early to judge. The members, she says, are just beginning to figure out how to do this work.

What the board has figured out is that it has an opportunity to change the way internet companies treat their users.

After more than 20 years of social media, the way platforms patrol their corridors can seem arbitrary and self-serving. Life-changing decisions are made by imperfect algorithms and armies of undertrained moderators. Users file millions of appeals each month, digging through help pages and often giving up. The policies meant to balance free expression and safety were drawn up by companies whose first instincts are to grow and make money. As one of the board's cochairs puts it, the platform was not designed with integrity in mind; it was designed for reach.

Few people want governments hashing out rulings on individual posts, yet people increasingly expect online speech to come with some rights. The Oversight Board is a chance to stem some of the chaos and secure those liberties. But its members keep bumping up against the edges of what Meta will allow them to do.

Illustration: Deena So'Oteh

The Oversight Board began on a bike ride. In January 2018, Noah Feldman, a professor at Harvard Law School, was crashing at a friend's house in the Bay Area. Riding in the foothills, he got to thinking about Facebook, his host's employer: no matter what the company decided about a piece of content, someone would be mad at it. The company might benefit from a separation of powers. Feldman suggested that Facebook create its own version of the Supreme Court—an independent body that would consider the biggest complaints about the company's content decisions.

Zuckerberg had already been thinking about "governance" as a way to signal that he wasn't the dictator of the world's expression, and he embraced the idea. When I visited Facebook's rooftop garden that June, he shared his vision of an independent body that could make binding decisions. Its members wouldn't report to him, he said, though the company still had to figure out the mechanism for appointing them—and they weren't going to be part of the Facebook team. He understood that he would have to fight off the notion that the overseers were his cronies.

Facebook typically kick-starts new initiatives by relying on longtime lieutenants, but this one it entrusted to governance nerds. Heather Moore, a lawyer who had worked in the US Attorney's Office in Newark, New Jersey, headed the team; she and her colleagues saw the project as a chance to help people on the platform. Also central was Brent Harris, who now heads a governance group at Meta.

Facebook went about setting up the board with the plodding deliberateness of a 19th-century government railway bureaucracy. Buy-in was not universal. Monika Bickert, the head of global content policy, was skeptical that the company would get much benefit—after all, her rules would be the ones the board questioned. But the team kept moving forward, staging a series of workshops and soliciting suggestions from outsiders on how the board should function and who should fill its seats.

By 2020, the board had been established as an independent trust, funded by a $130 million grant from Facebook. The company would pay as many as 40 board members six-figure salaries for 15 hours of work per week, with a full-time staff supporting the effort. The ground rules were set out in a lengthy charter. The board would arbitrate disagreements over individual posts—at first, only cases where a user wanted to contest a decision to remove their post. The power to rule on posts left up would come later. A case selection committee would assign the cases the board took on to five-person panels, whose draft decisions the full board would approve. With that, the board would have ruled on individual posts.

But there was more. The board's case rulings could include sweeping policy recommendations; Meta wouldn't be bound by them, but it would have to explain itself if it turned the suggestions down. And through a "policy advisory opinion," Meta could ask the board to review an unpopular or unsettled policy decision—advice the company was likewise free to accept or reject.

In May 2020, the company announced it had recruited a distinguished collection of lawyers, journalists, and human rights activists to the board, among them a former prime minister of Denmark, a Pulitzer Prize–winning former newspaper editor, and a Nobel Peace Prize laureate. What the members all shared was a resolve to be seen as their own people.

From the start, critics called the Oversight Board a sham. Jessica González, co-CEO of Free Press, a group opposed to corporate control of media, belongs to a motley collection of company detractors styling itself the Real Facebook Oversight Board. The actual board, she says, is a PR stunt that gives Facebook cover for not investing in the integrity of its systems and not doing enough to keep people safe.

The board ruled on its first cases in January 2021. In one, from the previous October, a Brazilian user had promoted a breast cancer awareness campaign by posting an image with several examples of post-surgery breasts. Because the image showed nipples, Facebook's automated systems took the post down under its nudity rules. After the board accepted the case, the company took another look; its policy standards team restored the post, finding that this instance of nudity was within the rules. The issue now moot, Facebook told the board to drop the case.

The members refused. Their insistence was a message: the real work lay in interrogating company policies. They were there to change something.

When they wrote up their decision, the board members exposed how this seemingly trivial mistake opened a window onto a deeper failure: the company was relying on technology that didn't even pick up the Portuguese words for "breast cancer." Removing the post, the board said, raised human rights concerns—any restriction on freedom of expression must serve a legitimate aim. And when a user appeals a decision like this one, it recommended, the appeal should get review by a human moderator. Even though Facebook had already restored the content, the board members asserted their authority. They were saying, in effect: we want to talk about the algorithms.

The company never adopted that recommendation—when a robot blocks a person's speech, the appeal may never reach human eyes. But the board had imagined a world in which social media platforms would have to treat their users like human beings, with rights the members intend to make real.

Illustration: Deena So'Oteh

The board had issued only a few rulings when the suspension of President Donald Trump landed in its lap.

On his social media accounts, Trump had blessed the violent protests of January 6, 2021, and Facebook and Instagram suspended him from both platforms indefinitely. His supporters cried censorship; anti-Trumpers were angry that the ban wasn't permanent. Soon after a new US president was inaugurated, Facebook told the board members to figure it out. Requesting the board's opinion was an easy call. Imagine if the company hadn't: people would have said, "You have created an oversight board, and you won't even let it decide what to do with the former president of the most powerful democracy on the planet."

The moment was perilous for the board. Pro- and anti-Trump watchers alike were ready to pounce on any mistake, and a clumsy move could have ruined the experiment. After months of deliberation, the board backed the decision to remove the former president from the platforms—but it excoriated Facebook for making decisions on the fly, providing no time frame for the ex-president's restoration, and operating without clear standards for suspending users. The company's policies, it said, needed to be far more explicit. Helle Thorning-Schmidt, the former prime minister of Denmark, is one of the board's four cochairs.

Facebook's lack of transparency became one of the board's obsessions. The board also got good at choosing complaints with the most potential for impact, says Nicolas Suzor, a board member and law professor from Australia. Suzor sits on the selection committee, which sifts through thousands of appeals to find cases that fit the issues the board wants to address.

One case the committee picked out was called "Öcalan's Isolation." Abdullah Öcalan is a founder of the Kurdistan Workers' Party (PKK), a group Facebook has designated a "dangerous entity"; he is being held on a Turkish prison island. In early 2021, a user in the US posted a picture of Öcalan with the words "y'all ready for this conversation" and urged people to discuss the conditions of the prisoner's confinement. Facebook took the post down under its ban on posts supporting dangerous individuals—but the post offered no such support.

The board wanted to address exactly this kind of issue, says Julie Owono, a board member and executive director of the digital rights organization Internet Sans Frontières: here was a post about a prisoner whose isolation has been recognized as a violation of his human rights.

As researchers inside the company dug up background on the case, they found that the issue of Öcalan's imprisonment had come up before. Facebook had once carved out an internal exception allowing users who advocated humane treatment for him—but weren't themselves PKK supporters—to post about his situation. That exception was never made public, though, and it was eventually forgotten inside the company, which kept taking down posts about the conditions of his confinement. Facebook wasn't following its own rules, and the connection came to light, Owono says, only because the board took the case.

The board also pushed the company to fix its imperious attitude toward complaints. Users were often never told why their posts had been taken down—behavior the board considers an affront to users' rights. The cochairs say they knew this was a problem even before joining the board; they soon realized how big a problem it was. In six of the board's first 20 rulings, it recommended that when the company removes a person's content, it tell the user what rule they broke.

Harris, for his part, acts as if the board's continued pounding on this topic were the best thing since targeted ads. When I point out that the board has criticized the company for not explaining its decisions to users and for applying them arbitrarily, he agrees "a thousand percent." This summer, Meta revealed it was creating a customer-service group to explain its actions.

It took many decisions for the board to make its point, but Meta is now more open with its users about what they have done wrong.

The battle persuaded the board that its purpose was less to fix individual decisions than to make Meta own up to the monster it has created. Fittingly, the text on the page where users lodge their complaints doesn't say "Get your post restored" or "Fix this bad decision." In giant letters, the call to action reads, "Appeal to shape the future of Facebook and Instagram."

Even as it racks up points, though, the board has limited leverage. Harris, the lawyer who helped set up the board, remains its closest contact within Meta—but unlike an actual regulator, the board can be ignored as Meta pleases. The company claims to have fully implemented 19 of the board's recommendations through the end of last year, yet in some cases it has simply asserted, without explanation, that a recommendation "is work Meta already does." Other recommendations it declines outright.

The board doesn't have a police force, Owono concedes. "It doesn't prevent us from holding the company accountable to its users." A board committee is now looking at ways to make recommendations more difficult to ignore.

By this year, two themes had emerged in the relationship between Meta and its Oversight Board. One: the board's decisions were having a positive effect. Even Meta's content policy head, Bickert—once cited to me as a powerful internal detractor of the effort—now asks herself what the board would think. Two: some board members felt that Meta was placing obstacles in their path, forcing them to work within cramped limits.

One point of contention is the board's growth. The idea was that the company would help choose the first group of members and then step aside. Instead, Meta employees remain deeply involved in hiring, and the board sits far short of the 40 members allowed in its charter. According to the law professor who keeps an eye on the board, it is difficult to find the right kind of people.

Meta's influence was hard to miss when the board invited Renée DiResta to interview for a seat. "It would be an opportunity to shape the direction of something that I think has real potential," says DiResta, research manager of the Stanford Internet Observatory, of why she was interested in becoming a member. She went through round after round of interviews in early 2022. On paper she made obvious sense: the Oversight Board has no experts on algorithms, which are her specialty. But she had also been critical of Meta's failure to deal with harmful misinformation on its platforms.

DiResta's application was rejected in March. "They said they were going in a different direction," she says. The direction turned out to be more of the same: like the first 20 members, the board's newer recruits are mostly lawyers or journalists. According to a person familiar with the process, it was Meta's reservations that put the kibosh on her nomination. Harris says only that the company weighs in on who may or may not be more effective as a board member; it is not uncommon for one side or the other to withhold its endorsement, and only candidates who earn consensus get hired. That arrangement helps explain why the board has a hard time filling its vacancies—and a truly independent board, critics note, would never entertain a Meta veto at all.

Around the time DiResta was rejected, board members were seething over another dispute with Meta. They wanted access to CrowdTangle, a Meta-owned tool for analyzing the reach and impact of social media posts—one that outside researchers and media organizations already use. Access seemed like a no-brainer for a body adjudicating cases, but the board asked for it for months and didn't get it. Someone at Meta didn't want the board to have it.

The issue came up in Zuckerberg's meeting with the board in March 2022. After he promised to break the logjam, the board finally got the tool. "We had to fight to get it," says Michael McConnell, one of the board's cochairs. "But we did it."

Another incident roiled the waters. When Russian troops invaded Ukraine, Meta's platforms were overwhelmed with dangerous content. Posts promoting violence, such as "death to the Russian invaders," were in clear violation of Meta's policies—but banning them might suggest the company was rooting for those invaders. In March, Meta said it would temporarily allow such violent speech and asked the board for a policy advisory opinion. The board, eager to weigh in on the human rights dilemma, accepted, and made appointments to brief reporters on the new case.

Just before the board was to announce the case, Meta withdrew the request, explaining that an investigation might endanger some Meta employees. In private meetings with the company, the board blasted that explanation. As Stephen Neal, the chair of the Oversight Board Trust, pointedly noted, if safety were the real concern, it would have been obvious before Meta asked for the policy advisory opinion.

Neal didn't deny that the board's foes inside Meta might have wanted to keep it away from a hot-button issue. Later, the board took on a case that let it address some of the same questions Meta had withdrawn: a Russian-language post that showed a dead body lying on the ground alongside a quote from a famous Soviet poem urging, in essence, that killing the fascist was the right thing to do.

Others have noticed the mixed feelings inside Meta. McConnell says there are many people in the company for whom the board is more of an irritation than anything else.

Board members are accomplished people, not bomb throwers; declaring war on Meta isn't their style. "Nobody has ever tried to do something like this before," says Alan Rusbridger, a board member and former editor of The Guardian. "But I think there is a pattern of having to drag them kicking and screaming to give us the information we are looking for."

Worse than no information is wrong information. In at least one instance, that is what Meta gave the board.

Researchers from Meta had mentioned to the board a program called Cross Check, which gave special treatment to accounts belonging to politicians and other high-profile users. The company described it as a limited program involving only a small number of decisions. Some board members asked Meta to compare the error rates in its Cross Check decisions with those on ordinary posts and accounts; they wanted to make sure the program wasn't a get-out-of-jail-free card for the powerful.

Meta said the task wasn't doable—a favorite excuse when the company wants to bounce the board's suggestions. It simply pointed the board to a previous statement: "We remove content from Facebook no matter who posts it."

Then, in September 2021, The Wall Street Journal published leaked internal documents showing that even Meta's own employees condemned Cross Check for letting powerful people circumvent the company's rules. Among the examples: Trump's post during the Black Lives Matter protests declaring, "When the looting starts, the shooting starts," and a soccer player's posting of nude photos of a woman who had accused him of rape. Facebook's own researchers were dismayed that the program exposed users to false information. "We are not actually doing what we say we do publicly," one internal paper admitted.

Meta had been caught out: it had drastically understated the scope of the Cross Check system to its own oversight board. Frances Haugen, the former employee who leaked the papers, says she found it disrespectful that Facebook lied to the board.

Meta admitted that it should not have told the board that Cross Check applied only to a small number of decisions. If the company couldn't be trusted to provide accurate information, the entire exercise might fall apart. Suzanne Nossel, the PEN America CEO, worried that such deceptions could undermine the board's ability to carry out its work.

As it had in the Trump decision, Meta asked the board for its opinion on the program. The board formed a committee to study Cross Check, meeting at first online. In April, the committee convened in New York, taking over several meeting rooms at a Midtown law firm. When I sat in on the deliberations, it was the first time a journalist had been allowed inside an official Oversight Board session—though I had to agree not to attach members' names to quotes. It shouldn't be the last time, because the glimpse I got showed how these semi-outsiders are straining to change the company that brought them together.

Some 15 people gathered around tables arranged as if for a UN summit, each equipped with an iPod Touch for listening to live translations. Once the conversation started, it grew heated, and some members abandoned their home tongues to address the others directly in English.

I watched an hour of a long session in which the members evaluated the program from a human rights standpoint. To them, Cross Check embodied inequality, in stark contrast to Meta's claim that "we remove content from Facebook no matter who posts it, when it violates our standards." One member derided it as a privileged post club.

Meta had argued that there were defensible reasons to give special treatment to well-known accounts—it let employees assess whether a seemingly improper post might actually be excusable. The members zeroed in instead on the program's secrecy; as one cochair put it, the burden was on Meta to explain why Cross Check should be private.

The members debated whether the program should be made public—whether, say, the privileged posters should be labeled as such. After listening to all the back and forth, one member objected to the entire concept of the program: the policies, she said, should be the same for everyone.

The deliberations kept colliding with the basic problem of content moderation at scale: Is Meta allowed to favor certain customers when its platforms are entwined with how people express themselves around the world? "Is being on Facebook a basic human right?" one member cried out in exasperation.

Meta still wasn't sharing critical facts about the program—such as whether Cross Check ever subjected anyone to extra scrutiny, or existed solely to wave through questionable content. The board couldn't get an answer. After the session, members and staffers met with Meta officials to pry the information loose. Rusbridger told me the members pressed so hard that the officials came away bruised, feeling the board had behaved badly. In the end, he says, the board got some of what it wanted, but not all of it.

Frustrations notwithstanding, the members hope to move the board into a more consequential spot. In recent months Meta has accepted more of its recommendations, and the board may try to take on more cases. Neal thinks it could double or triple the number it handles. Even so: If the board were to do 100 cases a year, he asks, is that enough to make a difference in where platform content is going? Bigger impact would require thinking about a larger organization—starting with filling all of the board's open slots.

The board could also begin to critique the workings of Meta's algorithms. Even though the company's code falls outside the board's mandate, some of its recommendations already implicate it. "Even if we don't talk about the algorithm," says Palmor, the Israeli lawyer, "we take into account the way content is spread." The natural next step would be acquiring more expertise in how those programs work—the kind of expertise that hiring Renée DiResta would have provided.

So far, the big issues the board weighs in policy advisory opinions have all originated within Meta. The members would like to set some of that agenda themselves. If she had her way, Tawakkol Karman, the Nobel Peace Prize laureate, would demand action on Meta's high volume of bogus accounts. They breed misinformation, hatred, and conflict, she says, and fake accounts are recruited to attack real ones; the phenomenon has become a tool of oppression. Is the board planning to address the issue? They're working on it, she says.

The board is also looking at how else it could use its power. The organization is studying the European Union's Digital Services Act, which will impose a breathtaking suite of rules on digital platforms, including social media, and which contains a provision for mandatory appeals processes. If the board joined that effort, some members dream, it could become a global force in content policy, with influence over other companies.

Other companies aren't exactly beating down the doors to get a piece of the Oversight Board. Elon Musk, the new owner of Twitter, announced on his own account that he was setting up a content moderation council, but he hasn't taken the board up on its services—and the board's decisions, of course, don't cover his company. Palmor thinks the board is making a difference, though she stops short of claiming it couldn't make more of one.

Some on the board seem almost intoxicated by the idea of a wider writ: if Meta's competitors had to answer to its rulings too, that would be a victory.

Thomas Hughes, who runs the board's operations, says the organization isn't seeking to become the board for the whole industry. "We want to understand how we might interact with companies setting up different types of councils or bodies to talk about standards," he says, "and how we might interrelate with other companies." Still, the creation of a company whose sins spring from a mania for growth now has its own vision of getting big fast.

This article appears in the December/January issue.

Let us know what you think about this article. Submit a letter to the editor at mail@wired.com.