Meta, the parent company of Facebook, designated the abortion rights group Jane's Revenge as a terrorist organization the day after the Supreme Court overturned Roe v. Wade. The decision threatens free expression around abortion rights at a critical time, according to experts.

The brief internal bulletin from Meta Platforms Inc., which owns Instagram and Facebook, was titled "Emergency Micro Policy Update" and was filed to the company's internal Dangerous Individuals and Organizations rulebook. According to the memo, Jane's Revenge is a far-left group that claimed responsibility for an attack on an anti-abortion group's office in Madison, Wisconsin, in May 2022, along with a number of similar attacks. The company says the Dangerous Individuals and Organizations list is reserved for the world's most dangerous and violent entities, including hate groups and drug traffickers.

Although The Intercept published a snapshot of the entire secret Dangerous Individuals and Organizations list last year, Meta does not reveal or explain additions to the list publicly. Civil society groups have criticized the policy for its bias toward the U.S. government's own designations and for the deletion of political speech. In the first three months of the year, Meta restored nearly half a million posts in the terrorism category after determining that they had been wrongly removed.

Discussion of Jane's Revenge was already subject to Tier 1 restrictions, Meta's most stringent, due to another previously unreported internal speech rule. After the office of Wisconsin Family Action, an anti-abortion group, was vandalized in the wake of the leaked Supreme Court decision, Meta the next day banned its roughly 2 billion users from praising, supporting, or representing the vandals. The more recent use of the terror label suggests a more permanent policy position.

Mary Pat Dwyer, academic program director at Georgetown Law School, said the designation is difficult to square with Meta's placement of the Oath Keepers and Three Percenters in Tier 3, which carries far fewer restrictions. There is little transparency into when and how these decisions are made, she said, which has a huge impact on people's ability to discuss current events and important political issues.

The empty Wisconsin office was damaged by a small fire and defaced with graffiti. The vandalism was quickly designated a "Violating Violent Event," a kind of ad hoc speech restriction that Meta issues to its content moderation staff to limit discussion across its platforms in response to breaking news and various international crises.

The May 11 internal memo states that the vandalism is a "Violating Violent Event" and that content supporting or representing the event should be removed from the platform under the Dangerous Individuals and Organizations policy framework, which restricts speech about violent actors such as neo-Nazis and drug traffickers. The memo noted that the office of a conservative political organization that lobbies against abortion rights was damaged by fire, that Jane's Revenge took responsibility for the attack, and that the event had zero victims.

The Wisconsin Family Action designation is notable not only for the relatively low severity of the attack itself, but also because it marks a rare instance of Facebook limiting speech around abortion. Striking as well is the company's choice to censor abortion rights action, even destructive action, given that throughout the long history of the American abortion debate, the overwhelming majority of violence has been conducted by those trying to prevent access to the procedure. The National Abortion Federation reported that assaults directed at abortion clinic staff and patients increased last year. The Army of God Christian terrorist cell and one of its affiliates, the notorious bomber Eric Rudolph, are both on the company's dangerous individuals and organizations list. Though little is known about Jane's Revenge, right-wing politicians have begun demanding that the property damage be treated as domestic terrorism.

The company has not similarly censored discussion of anti-abortion violence. On New Year's Eve, a clinic in Tennessee that had been shot at earlier in the year was burned down. According to multiple sources familiar with Facebook's content moderation policies, who spoke on the condition of anonymity because they are not permitted to speak to the press, the New Year's Eve torching was never designated a "Violating Violent Event." While anti-abortion advocates are still barred from inciting further violence against clinics, Meta users have far more latitude to discuss or even praise the actions of those on that side.

The Wisconsin-specific update and more recent terror label, even if intended only to curb future real-world violence from either side of the abortion debate, could end up stifling legitimate political speech. The company's general-purpose "Community Standards" rulebook already places a blanket prohibition on explicit calls for violence; only explicitly flagged people, groups, and events are subject to Meta's far more stringent bans on "praise, support, and representation." Under those bans, speech that stops well short of the red line of violence is still subject to deletion. Facebook frequently cites the ban on praise, support, and representation as a reason to remove posts that protest Israeli state violence against Palestinians.

Because Jane's Revenge is poorly understood, controversial, and subject to intense debate, billions of people are now limited in what they can say about the perpetrators, their motives, or their methods. Anything that could be construed as praise is at risk of deletion. The public description of the "praise, support, and representation" standard prohibits any post "legitimizing the cause of a designated entity by making claims that their violent or criminal conduct is legally, morally, or otherwise justified or acceptable."

“There are legitimate concerns that this might shut down debate.”

The company's internal overview of the "praise" standard was obtained and published by The Intercept. While those internal rules permit academic debate and informative, educational discourse about a violent entity or event, what meets the threshold for "academic debate" or "informative discourse" is left to the judgment of Facebook's thousands of workers.

The policy threatens discussion and debate of abortion rights protests at a moment when such speech is especially consequential. Jillian York, the Electronic Frontier Foundation's director for international freedom of expression, said that when Facebook bans certain types of harmful speech, it often catches other commentary in its moderation net; efforts to ban terrorist content, for example, can sweep up counterspeech against terrorism. It isn't hard to imagine that an attempt to ban vandals who attacked an anti-abortion group could likewise be used to suppress legitimate speech against that group.

One of Facebook's most controversial and problematic practices is this ad hoc censorship of "violating events" through the Dangerous Individuals and Organizations framework. The combination of the company's increasing reliance on automated content screening and the personal judgment calls of low-paid contractors produces erratic, faulty results. York said there are legitimate concerns that the designation might shut down debate.

“Ukrainians get to say violent shit, Palestinians don’t. White supremacists do, pro-choice people don’t.”

The lack of transparency in the censorship policy, and in Facebook's enforcement of these speech restrictions, threatens political discussion and debate around both abortion and the reproductive rights movement. Even those who condemn the methods of Jane's Revenge have an interest in talking about them, and in discussing the underlying cause the group claims to be fighting for, whether or not they agree with it.

It's significant that free expression around relatively minor acts of violence would not only be censored in the first place but also subjected to the same limits Facebook uses for Al Qaeda and the Third Reich. The act of vandalizing a building was quickly added to a list intended to be reserved for the most serious incidents, such as hate crimes, gun massacres, and terrorist attacks. The decision to censor free discussion of Jane's Revenge, responsible for a failed firebombing and a series of threatening graffiti incidents, makes Facebook's failure to limit discussion of the Tennessee clinic attack even more puzzling. It fits a long history of Facebook decisions that put a finger on the scales of political discourse in ways that can seem ideological or arbitrary. The issue is the company's constant picking and choosing of "winners": Ukrainians get to say violent things while Palestinians don't; white supremacists do while pro-choice people don't.

Meta confirmed that Jane's Revenge has been designated a terror group and said it will remove content that praises, supports, or represents the organization. The company did not explain why Jane's Revenge was flagged but not other actors who have committed violence to advance their stance on abortion, pointing only to its process for determining which people and groups are restricted. Users can appeal deletions if they believe they were made in error.

It's hard to assess the merits of a decision made in secret, with the details of the rules hidden from the billions of people who use Meta's platforms. According to York, Meta should immediately implement the Santa Clara Principles, a content moderation charter that mandates, among many other things, "clear and precise rules and policies relating to when action will be taken with respect to users' content or accounts."

Without its full rules and their justifications made public, Meta leaves billions posting in the dark. The company has always claimed that it takes no sides and deletes speech only in the name of safety, a claim the public generally has to take as an article of faith. For a platform that insists it is neutral and doesn't have its finger on the scale, it's incumbent on Meta to be more forthcoming; it needs to be able to defend its decisions.