Musk's crusade has one thing in common with many others: Nobody asked for it.
On Tuesday, Musk made an offer to buy Twitter and take the public company private, with the stated aim of increasing freedom of speech on the platform. Musk, it seems, has done his own research on the question of free speech on the social network.
In an interview, Musk said he thinks it's unfair that people are suspended or banned from the service for violating its rules, adding that, if in doubt, the speech should be allowed to exist.
Research suggests that most Americans, on both sides of the aisle, disagree with him. Eighty percent of Americans think social media companies should take action to reduce the spread of misinformation, according to a new working paper. Democrats and Republicans both hold that view, though not to the same degree.
The data suggests that Musk's views are not representative, according to one of the study's co-authors.
The survey asked Americans for their opinions on social media moderation: whether platforms should moderate misinformation in general, but also how they should handle one specific case. It turns out that even for a relatively partisan piece of misinformation, there is still bipartisan support for cleaning up social media.
A concern that often comes up in discussions of enforcement against misinformation is that people fundamentally disagree about what misinformation is. But there is less partisan disagreement over what qualifies than you might think.
The public believes social media companies should be held responsible for the spread of misinformation on their platforms, and a survey found rising support for the idea that the government should intervene to reduce the spread of misinformation online. The new MIT paper asked what action social media companies themselves should take, and found bipartisan support for moderation.
Other researchers view the study's findings as sound, even though the paper has not yet been published or peer reviewed. Jonathan Nagler, co-director of NYU's Center for Social Media and Politics, called it a good study, though he viewed its sample of several thousand people, rather than tens of thousands, as a limitation.
Most of the public thinks getting misinformation off platforms is a good idea.