Illustration by Alex Castro / The Verge

Discord announced on Friday that it is changing its policies and community guidelines to tackle health misinformation, off-platform behavior, and hate speech. It is the company's first big policy update in nearly two years and is designed to target groups or individuals that participate in organized violence, spread harmful anti-vaccination material, or harass other Discord users with hate speech.

After two years of a public health crisis, Discord is making its policies clearer. Under new community guidelines that go into effect on March 28th, users are not allowed to share false or misleading information that is likely to cause physical or societal harm.

That doesn't mean all anti-vaxxers will be removed from the platform, according to Clint Smith, Discord's chief legal officer.

Not all anti-vaccination content will disappear

Smith says that if someone posts that holding crystals against your chest for five minutes will improve your lung capacity, that's not something Discord will take action against.

The context of messages, the likelihood of direct harm, a poster's intent to convince or deceive others, and whether a group has a history of repeated violations of Discord policies are some of the factors that will be taken into account when deciding when to take action. Users of the service will be able to report this type of content by right-clicking on any message and hitting the report button.

Users who engage in harmful behavior outside the platform can also be punished. Relevant behavior includes membership in violent organizations, making credible threats of violence toward people, organizations, events, or locations, and belonging to groups that sexualize children in any way.

That doesn't mean a user charged with drug possession or implicated in cheating on a school exam will be barred from the platform.

A new public server tag in the top-left of Discord servers.
Image: Discord

Research published by the Institute for Strategic Dialogue last year shows that far-right groups have exploded on platforms like Discord, where they can organize violence or participate in harassment. Discord has long shut down these types of groups, but new ones appear regularly, and even social protest groups organizing on platforms like Facebook can spill over into real-world violence. The company is throwing data science and machine learning at the problem, but it also relies on outside experts to report extremists.

Smith says that they are investing a lot in building relationships with experts and trusted reporters.

Smith says the company has consulted with academics, groups, and journalists. Because the platform doesn't have in-house staff covering every area of expertise, it invests in paid consultants with specific knowledge.

Discord's approach to moderation sets it apart from platforms like Facebook: it relies on individual servers to moderate themselves and escalate content to company moderators when necessary. Smith likes to compare the service to a local pub that plays host to a wide range of groups. You wouldn't expect the landlord of a pub to police every one of those conversations, but you would expect them to eject groups based on reports. That structure makes it harder for misinformation to spread broadly across the service, but it's ideal for groups that want to organize in private without obvious oversight and gather in numbers they simply couldn't offline.

Discord has been trying to remove bad actors for years

The new tag in the client will make it easier to tell private and public servers apart. The distinction also marks a line between platform moderation and privacy.

Smith says you could imagine the company doing more machine learning from a safety perspective, but it won't be doing that on private servers.

Discord is also expanding its list of protected attributes to include caste, gender identity, age, serious illness, and more. The terms of service, privacy policy, and community guidelines now use plain English rather than the legal language that is difficult for users to comprehend. Terms like NSFW have been dropped to make the documents easier to follow for users in countries where they aren't commonly used.

Most of the changes reflect how the world has changed over the past few years and how Discord has evolved to target a much broader market. Flush with funding, the service continues to attract a diverse mix of communities.

Smith jokes that in 2020 they were a band of Californians serving a gaming-oriented audience.