People were hungry for reliable information as COVID-19 spread. A global network of volunteers consolidated information from scientists, journalists and medical professionals and made it accessible for everyday people.
Dr. Alaa Najjar is a medical doctor and Wikipedia volunteer who spends breaks during his emergency room shifts addressing COVID-19 misinformation on the Arabic version of the site. Dr. Netha Hussain, a clinical neuroscientist and doctor in Sweden, spent her downtime editing articles about COVID-19 vaccines in English and Malayalam.
Thanks to Najjar, Hussain, and more than 280,000 other volunteers, Wikipedia became one of the most trusted sources for up-to-date, comprehensive knowledge about COVID-19, with nearly 7,000 articles in 188 languages. This reach, and this capacity to support knowledge-sharing on a global scale, is possible only because of laws that enable Wikipedia's collaborative, volunteer-led model to thrive.
As the European Parliament considers new regulations such as the Digital Services Act (DSA), it must protect citizens' ability to collaborate in service of the public interest.
Lawmakers are right to try to stop the spread of content that causes physical or psychological harm and is illegal in many countries. We welcome several of the proposed elements, including requirements for greater transparency about how platforms' content moderation works.
The current draft also includes requirements for how terms of service must be enforced. These measures may seem necessary to curb the power of social media, prevent the spread of illegal content, and ensure the safety of online spaces. But what happens to projects like Wikipedia? Some of the proposed requirements could stifle digital platforms that operate very differently from large commercial ones.
Big Tech platforms work differently than collaborative websites, where all of the articles created by volunteers are free to read. Commercial platforms maximize profits and time on site through incentives that leverage detailed user profiles to target people with the content most likely to influence them. Deploying ever more algorithms to moderate content produces errors of both over- and under-enforcement: computer programs often confuse artwork and satire with illegal content, while failing to grasp the human judgment and context necessary to enforce platforms' actual rules.
The Wikimedia Foundation and its country-based affiliates, like Wikimedia Deutschland, support the independence of volunteers in deciding what information belongs on the internet. The online encyclopedia's open editing model is built on the belief that people themselves should decide what information stays on the site.
This model ensures that the people who know and care about a topic enforce the rules on any given article's page, and all conversations between editors on the platform are publicly accessible. It is not a perfect system, but it has worked to make Wikipedia a global source of neutral, verified information.
If we were forced to operate more like a commercial platform with a top-down power structure, it would subvert the DSA's public interest intentions by leaving our communities out of important decisions about content.
The internet is about to change. In Europe and around the world, civic space is under attack. All of us need to think carefully about how new rules will foster, not hinder, an online environment that allows for new forms of culture, science, participation and knowledge.
Lawmakers can work with public interest communities to develop standards and principles that are more inclusive, more effective, and more binding. They should not impose rules designed only for the most powerful internet platforms.
We all need a better internet. Lawmakers should work with Wikimedia to design regulations that empower citizens to improve it.