Billions of people around the world turn to social media platforms to connect with others, get information and make sense of the world. Every interaction that takes place on these platforms generates data, yet the companies that collect that data share very little of it with outsiders.

Several of the most important platforms are controlled by a small number of people, even though social media has become one of our most important public forums for speech. Meta, the parent company of both Facebook and Instagram, is effectively controlled by a single person, Mark Zuckerberg, who holds a majority of its voting power. Twitter will soon be under the control of a single person as well, now that its board has accepted Elon Musk's offer to take the company private. All of these companies have a history of sharing very little data about their platforms with researchers, preventing us from understanding the impacts of social media on individuals and society. We fear that this lock on data will only tighten given the concentrated ownership of the three most powerful social media platforms.

It is time to require more transparency from social media companies.

In 2020, social media was used to spread false and misleading claims about the U.S. presidential election and to mobilize groups that participated in the January 6 Capitol insurrection. During the COVID-19 pandemic, we have watched health misinformation spread online. Social media companies promised to remove Russian propaganda about the war in Ukraine, but they have failed to do so. Social media has become a conduit for the spread of false information. We don't know what the next crisis will be, but we do know that false claims about it will circulate on these platforms.

Social media companies often decline to release data or publish research when the findings might reflect poorly on them. Lawmakers and regulators should require social media companies to share data with independent researchers so we can understand what is happening on the platforms. In particular, we need access to data on the structures of social media, such as recommendation algorithms and moderation systems, so we can better analyze how they shape the spread of information and affect user behavior.

Platforms have assured legislators that they are taking steps to counter misinformation and disinformation. Are these efforts effective? To answer that question, we would need access to the data. With better data, we could have a substantive discussion about which interventions are most effective and most consistent with our values. Without it, we run the risk of creating new laws and regulations that fail to address harms, or that make the problems worse.

Some of us have consulted with lawmakers in the United States and Europe about potential legislative reforms. The conversation around transparency and accountability for social media companies has grown deeper and more substantive, moving from vague generalities to specific proposals. But the debate still lacks important context: lawmakers and regulators have asked us to better explain why researchers need access to data, what research it would enable and how that research would help the public and inform regulation of social media platforms.

If social media companies began to share more of the data they gather about how their services function and how users interact with their systems, here are some of the questions researchers could answer. We believe research like this would help platforms develop better, safer systems and inform regulators who seek to hold platforms accountable for their promises to the public.

  • Research suggests that misinformation is often more engaging than other types of content. Why is this the case? What features of misinformation are most associated with heightened user engagement and virality? Researchers have proposed that novelty and emotionality are key factors, but we need more research to know if this is the case. A better understanding of why misinformation is so engaging will help platforms improve their algorithms and recommend misinformation less often.
  • Research shows that the delivery optimization techniques social media companies use to maximize ad revenue, and even the ad delivery algorithms themselves, can be discriminatory. Are some groups of users significantly more likely than others to see potentially harmful ads, such as consumer scams? Are others less likely to see useful ads, such as job postings? How can ad networks improve their delivery and optimization to be less discriminatory?
  • Social media companies attempt to combat misinformation by labeling content of questionable provenance, hoping to push users towards more accurate information. Results from survey experiments show that the effects of labels on beliefs and behavior are mixed. We need to learn more about whether labels are effective when individuals encounter them on platforms. Do labels reduce the spread of misinformation or attract attention to posts that users might otherwise ignore? Do people start to ignore labels as they become more familiar?
  • Internal studies at Twitter show that Twitter’s algorithms amplify right-leaning politicians and political news sources more than left-leaning accounts in six of seven countries studied. Do other algorithms used by other social media platforms show systemic political bias as well?
  • Because of the central role they now play in public discourse, platforms have a great deal of power over who can speak. Minority groups sometimes feel their views are silenced online as a consequence of platform moderation decisions. Do decisions about what content is allowed on a platform affect some groups disproportionately? Are platforms allowing some users to silence others through the misuse of moderation tools or through systemic harassment designed to silence certain viewpoints?

Social media companies should welcome independent researchers as partners in measuring online harms. Some companies have been helpful, but we can't depend on the goodwill of a few firms whose policies might change at the whim of a new owner; we hope Musk's Twitter will be at least as forthcoming as the company has been in the past. In a fast-changing information environment, we should not legislate by anecdote. Lawmakers need to ensure that researchers have access to the data required to keep users safe.