Bad information about the Russian invasion of Ukraine has found large audiences on platforms designed to promote content that gets engagement.
A 2016 video circulating on TikTok created the false impression that Russian soldiers were parachuting into Ukraine. A mistranslated statement made it sound as if fighting near Chernobyl had disturbed a nuclear waste site, when the original only warned that fighting in the area might do so.
As people face the firehose of breaking news and interact with viral posts about a terrible event, they can end up amplifying harmful propaganda and misinformation. This guide is for anyone who wants to avoid helping bad actors.
We published some of this advice before the Black Lives Matter protests in 2020 and the US election later that year; it has been updated with specific considerations for news coming out of Ukraine.
It is tempting to think that what you do online doesn't matter, but it does: sharing questionable information with even a small group of friends and family can lead to its wider dissemination.
While an urgent news story is developing, well-meaning people may quote, share, or duet with a social media post in order to challenge or condemn it. But even though Facebook has introduced new rules, moderation tactics, and fact-checking, interacting with misinformation signals to the platform that you find it interesting. If you see a post you think is wrong, flag it for review by the platform where you saw it instead.
A digital literacy expert has developed a method for evaluating online information that he calls SIFT: stop, investigate the source, find better coverage, and trace claims to their original context.
He says there is a human impulse to want to be the first person in your group to share a story. That impulse is a daily hazard for journalists, but it applies to everyone during times of information overload.
According to one researcher, if you want to do something to help, you should be following people from Ukraine who are telling their own stories.