According to YouTube Chief Product Officer Neal Mohan, the platform has removed 1 million videos for dangerous COVID-19 misinformation.
Mohan shared the statistic in a blog post explaining how the company handles misinformation on its platform. He wrote that misinformation has moved from the margins to the mainstream and is no longer confined to the closed-off worlds of 9/11 truthers or Holocaust deniers.
The YouTube executive also argued that bad content makes up only a small fraction of the platform's billions of videos, with roughly 0.16 to 0.18 percent of all views going to content that violates its policies. Mohan added that YouTube removes nearly 10 million videos every quarter, most of which never reach 10 views.
Facebook recently made a similar argument about content on its platform. Last week, the company published a report stating that memes and non-political posts were its most popular content. Facebook has been criticized for its handling of COVID-19 and vaccine misinformation, and has defended itself by arguing that such misinformation doesn't represent what most users see.
YouTube and Facebook have faced particular scrutiny over their health misinformation policies during the pandemic. With billions of users each, even a small fraction of that content can reach millions of people, yet neither platform has disclosed details about how misinformation spreads or how many people encounter it. Mohan said the company's strategy involves more than removing misinformation: YouTube is also working to raise information from trusted sources and reduce the spread of harmful content.
Editor's Note: This article originally appeared on Engadget.