A study has shown that TikTok misinformation about COVID is easily spread to children as soon as they sign up.
Although TikTok does not allow users younger than 13, children under 13 can still sign up.
Public criticism continues to be levelled at social media companies for their impact on young users.
It's no secret that social media algorithms can unintentionally aid the dissemination of COVID misinformation to millions. But the more important problem is who this content is directed at.
Popular social media app TikTok has been feeding misinformation to children as soon as they sign up. Children as young as nine were targeted with false information, even though they did not search for or follow such content.
Media rating agency NewsGuard found that COVID misinformation reached eight out of nine study participants within 35 minutes of their first access to the platform. Two-thirds of participants also saw incorrect information about COVID vaccines, including content promoting homeopathic remedies.
Alex Cadier, co-author of the Guardian's report and UK managing editor at NewsGuard, stated that "TikTok has failed to stop the spread of dangerous health misinformation via their app." "Despite claims that they are taking steps to combat misinformation, the app still allows anti-vaccine content and health hoaxes to spread relatively unimpeded," he said.
NewsGuard conducted the study between August and September. It asked children aged nine to seventeen from diverse cultural backgrounds to create TikTok accounts. Although the app prohibits users under 13, three of the youngest participants were able to create accounts without any outside assistance. According to Statista, as of March 2021, 25% of TikTok's active users in the US were aged between 10 and 19.
Insider was told that TikTok has a poor record of removing misinformation videos, with videos containing vaccine misinformation remaining on TikTok for months. Katrine Wallace, an epidemiologist at the University of Illinois School of Public Health, said that TikTok's failure to remove misinformation videos lets them circulate widely: the more viral these videos become, the more people see them, and because of the nature of the algorithm, some of those viewers will inevitably be children.
TikTok's community guidelines prohibit "false and misleading" content about COVID-19 and its vaccines. The company says it has teams that identify and remove misinformation and that it evaluates all COVID-related content on a case-by-case basis.
It also stated that the app promotes an "age-appropriate user experience," discouraging and removing accounts belonging to underage users, and restricts LIVE and direct-messaging features for younger teens. In September, Douyin, the Chinese version of TikTok, announced that it would limit younger users' time on the app to 40 minutes per day.
TikTok did not respond to a request for comment regarding the NewsGuard report.
TikTok is not the only platform under scrutiny: Twitter, Instagram, and Facebook have also come under fire recently. Increased transparency from these companies has revealed more about the effects of social media on society, especially on younger generations. This week, a Facebook whistleblower revealed how its platforms can psychologically harm teenage users. Meanwhile, high-profile social media influencers continue to spread COVID misinformation, increasing the amount of harmful content directed at younger viewers.