Twitter logo displayed on a cracked phone screen is seen through broken glass

The EU's Digital Services Act requires social networks to moderate content. Twitter claims that none of its policies have changed, even though the company has stopped enforcing its rules against false information.

"As we carry out this work, we want to assure you of a few things, first of all, none of our policies have changed," the post said.

The claim came about a week after Twitter reversed how it handles false information: as we reported yesterday, the company is no longer enforcing its COVID-19 misleading information policy.

A charitable interpretation of the "none of our policies have changed" claim might hold that it is technically accurate. But the reversal of bans on Donald Trump and other controversial users shows a different approach to rules enforcement under Musk's ownership.

"Our approach to policy enforcement will rely more heavily on de-amplification of violative content: freedom of speech, but not freedom of reach," the post said.

EU warnings

Musk was warned about complying with European Commission rules during a video meeting with EU Internal Market Commissioner Thierry Breton.

The EU will conduct a "stress test" at Twitter's headquarters in early 2023 to assess the company's compliance with EU rules, according to a report.

According to the Financial Times, "Breton told Musk that he must adhere to a list of rules, including abandoning an 'arbitrary' approach to reinstating banned users."

The Financial Times attributed its reporting to people with knowledge of the conversation, while the quote came from a readout of the meeting. The EU's Digital Services Act is a new law that sets a global standard for how Big Tech must police content on the Internet; if Twitter violates it, the company could face fines of up to 6 percent of global turnover.

Musk has called the Digital Services Act "very sensible." The EU wants him to provide clear criteria for which users are at risk of being banned, but Musk has instead reinstated banned accounts, including Trump's, after running Twitter polls.

Twitter: Moderation team “strong and well-resourced”

There is concern that a lack of staff will prevent proper content moderation. Twitter's post said the Trust and Safety team works hard to keep the platform safe from abusive behavior and other rule violations, that automated detection plays an important role in eliminating abuse, and that the team remains "strong and well-resourced."

But the team that polices child sexual abuse material was reportedly gutted by layoffs and resignations.

Yoel Roth, Twitter's former head of trust and safety, said yesterday that mass layoffs and resignations have hurt the company's ability to block harmful content. He does not think there are enough people left at the company to keep pace with malicious campaigns.

The COVID-19 policy change, meanwhile, has been criticized as bad and damaging.