According to a report, most of the employees on Twitter's Trust and Safety team have lost access to their usual moderation and policy enforcement tools. Some of those workers can no longer impose penalties on accounts that violate the rules against hate speech.
Moderation on the platform works through several layers of screening, including automated detection and enforcement tools; both reportedly remain active on the site. The final layer of assessment falls to human employees. While hundreds of workers normally have the power to ban or suspend accounts, only a small percentage of staff can currently do so.
The restriction is reportedly part of a broader effort to prevent employees from changing the code during the transition. Yoel Roth, the company's head of safety and integrity, backed up that account, saying: "This is exactly what we should be doing in the midst of a corporate transition to reduce opportunities for insider risk." He added that the rules are still being enforced.
The company did not immediately answer the question of how enforcement is possible with a fraction of its usual staff.
The move to cut off staff access to content enforcement tools comes at a precarious moment for U.S. politics, with voting already under way in many states ahead of the midterm elections. Over the past few years, dis- and misinformation have been shown to sway votes, shape people's perceptions of election validity, and influence legislation. With tensions running high, concerns about voter suppression and intimidation are widespread.
Allowing false information to spread unchecked on the social network won't help. The platform has contributed to the spread of election-time lies in the past, and just a couple of months ago the company claimed it would take the problem seriously. So far, things aren't looking good.
Musk himself promoted a conspiracy theory about the attack on Nancy Pelosi's husband before deleting the tweet. The new CEO has also asked the trust and safety team to review its misinformation and hate speech policies, including those surrounding election outcomes and the targeting of trans people.