Community Notes are meant to be a crowd-sourced means of moderation, but Twitter is still facing big questions about site moderation going forward.

According to a set of posts from a company account, the Community Notes feature has been updated. Community Notes are scored by site users, who vote on whether or not a note is helpful. The company says the update will identify more low-quality notes, and contributors who repeatedly submit low-quality notes will be restricted.

The feature gives the social networking site a crowd-sourced verification tool. In his own post, the site's VP of Product said that a note's quality would be determined by users, not by the site itself.

The platform is trying to demonstrate its moderation ability and trustworthiness after weeks of chaos caused by Musk's acquisition, during which advertisers have paused their spending on the platform.

Advertisers don't want their brands appearing alongside harmful or controversial content, and would-be ad partners have been scared off by Musk's vision of an unfettered "free speech" platform. Half of the top 100 advertisers have left since Musk took over, despite his assurances that he wouldn't turn the site into a "hellscape."

Musk took to the site to post some fake news himself in an attempt to demonstrate the value of the Community Notes feature. The social media CEO posted a doctored headline attributed to CNN, then replied "@CommunityNotes F TW!" when a Community Note appeared on his post indicating that the screencap wasn't real.

Screenshot of the Tweet, via Twitter

Moderation of the platform has been a concern since Musk took over. Immediately after the acquisition, the site's existing moderation team was prevented from doing its job, and many of its members were later laid off. Moderation is central to keeping the social media site usable, attractive, and profitable; even when the platform had a full moderation staff, it still struggled to manage abuse.

Community Notes, formerly known as Birdwatch, was launched to help combat such difficulties, not necessarily to replace the site's other layers of moderation entirely. How well those other layers are performing is unclear: the site failed to flag and remove videos of the Christchurch mosque shooting in New Zealand.