The information war is picking up as the war in Ukraine continues.

Bad actors and those being tricked are spreading fake news, manipulated media, and all sorts of propaganda. What are the tech companies doing to stop bad information from spreading?

In order to get a complete picture of what is being done to stop misinformation, Mashable reached out to several major social media platforms.

Meta

Russia has been involved in a number of disinformation campaigns in the past. Has Mark Zuckerberg's company learned its lessons from the attempts to sway elections? What is Meta doing this time?

Meta has taken some major steps to stop the spread of false information on its platforms. It has blocked Russia Today and Sputnik in the EU and cut off its revenue share with these outlets so they cannot monetize their content in regions where they have not yet been banned. Meta will also continue to label state-run media, as it did previously.


On Tuesday, Meta announced it would take further action against Russian state-run media by demoting its content across its platforms and ensuring it won't be recommended to users.

Facebook's lock profile tool gives people in Ukraine easy access to additional security and privacy measures. When a user turns on this feature, only their friends on the platform can share or download their photos.

Meta also said it had disrupted two disinformation campaigns. One tried to control the narrative by creating fake accounts and pages, using AI-generated profile pictures to hide the fact that the personas behind them did not exist. A hacking group from Belarus was behind the other campaign. Both networks spread anti-Ukrainian propaganda, and Meta removed dozens of accounts.

Meta's messaging platform, WhatsApp, has shared best practices with its users on how to secure their accounts and take advantage of certain privacy features, such as Disappearing Mode.

Twitter

As on Facebook, misinformation campaigns regularly take hold on Twitter. During the Russian invasion of Ukraine, however, the social network seems to be taking its role more seriously than usual.

Users may have noticed Twitter's official accounts sharing information on best practices for securing their accounts. The company says it is monitoring vulnerable users in order to stop account takeovers.

Twitter also has policies covering manipulated or synthetic media, and entire accounts can be suspended depending on the severity of the violation. The company has already removed manipulated content from the platform, such as a clip purported to be from Ukraine that was actually footage from a video game.


Twitter already labels Russian state-run media accounts on the platform. As of Monday, it also began adding warning labels to all links to Russian state-run media websites.

The label reads, "This Tweet links to a Russia state-affiliated media website." Twitter will also reduce the reach of these tweets.

Along with those changes, a number of people who work for Russian state-run media outlets have begun reporting that their personal accounts promoting their work have also received the warning label.

Twitter has also suspended more than a dozen accounts for violating its platform manipulation and spam policy, which covers the use of fake accounts to spread content and artificially inflate engagement.

"Our investigation is ongoing; however, our initial findings indicate that the accounts and links originated in Russia and were attempting to disrupt the public conversation around the conflict in Ukraine," the company said.

According to NBC News, these accounts were sharing links from a new propaganda outlet called Ukraine Today.

Even before Russian troops entered the country, Twitter had suspended advertising in Ukraine and Russia and paused recommendations from accounts users do not follow. The company says it took these actions to reduce the spread of abusive content.

The company says keeping people safe is its top priority and points to its longstanding efforts to improve the safety of its service.

The company also made clear that at least some of its existing policies wouldn't be paused because of the conflict. When the Ukrainian National Guard posted an Islamophobic video featuring a neo-Nazi battalion, Twitter hid the clip behind a warning label, per its hate speech policies.

YouTube

Russian state-run media is very popular on YouTube. Over the years, Russia Today has found success on the platform: more than 4.5 million people subscribe to RT's main channel, and its YouTube channels have received more than 10 billion views.

That kind of audience can make for a pretty lucrative revenue stream. At least, it could until this weekend, when YouTube demonetized all Russian state-run media.

"In light of extraordinary circumstances in Ukraine, we are taking a number of actions," the company said in a statement.

The platform will limit recommendations of content from these channels, according to the statement, and has revoked their monetization. Access to these and a number of other channels is also being restricted for users in Ukraine.

A number of low-subscriber channels that were part of a Russian influence operation have been removed from YouTube.

Snapchat

The social messaging app has largely avoided becoming a hub for problematic content thanks to the way it is designed. The company says it will remove any misinformation it comes across on its platform.

The app has been designed to make it hard for misinformation to spread, the company says: unlike traditional social platforms, it doesn't feature an open, unvetted newsfeed, and content on the public parts of the app is curated. If the company finds misinformation, it says it removes it immediately.

TikTok

TikTok is no longer just online challenges and dance crazes. Current events may be the best measure of how much the short-form video app has expanded beyond the teen content it was originally known for.

TikTok has become a platform for the latest news as well as updates from people on the ground. Unfortunately, the young platform has also become a major outlet for misinformation and propaganda.

Videos on the platform portray conflicts from years earlier, and from completely different parts of the world, as if they were happening in Ukraine now. There are also scams on TikTok: scammers are impersonating Ukrainians sharing their wartime experiences in order to raise money.

Critics are scratching their heads at the timing of a new feature TikTok just announced: support for video uploads of up to 10 minutes long. The platform was already struggling to handle misinformation before Russia invaded Ukraine, back when it was only dealing with 3-minute videos.

The company said it has taken action against users who act in bad faith and will remove content that breaks its rules.

TikTok says it has increased resources to respond to emerging trends and remove violative content, including harmful misinformation and promotion of violence.

TikTok also has partnerships with organizations such as the National Association for Media Literacy Education.

LinkedIn

While most people think of LinkedIn as a business networking tool, the social network has seen its fair share of fake news and misinformation.

The Microsoft-owned platform says its safety teams are closely monitoring conversations on the service, and its global editorial team is working to ensure news and updates come from trusted sources. LinkedIn's Professional Community Policies prohibit misinformation, false content, and manipulated media.

As policies change, this post will continue to be updated.