Robert Mardini, the director general of the International Committee of the Red Cross, said that the organization has its own trends-analysis unit that uses software to monitor online sources, which can help keep workers safe.

Not everything on social media can be believed. Emergency responders use the platform while trying to sort real posts from fake ones, and experts say this is where a company's moderation capacity can be crucial. Military campaigns can include online operations that try to use the platform to spread weaponized lies.

Humanitarian organizations can also be hurt by misinformation. Mardini said false rumors about the ICRC's work can endanger its staff's safety.

A special moderation policy was put in place to curb misinformation about the conflict between Russia and Ukraine. According to Nathaniel Raymond, co-leader of the Humanitarian Research Lab at Yale's School of Public Health, the policy has been less consistently enforced since Musk took over as CEO. He believes bots have increased, and the information space appears to have changed. Musk's takeover has also called into question the ability to preserve evidence of potential war crimes posted to the platform. Raymond says that before, they knew who to talk to; now they don't know who to ask. We don't know what will happen.

Some users who paid for a verification check mark used their new status to impersonate major brands, including Coca-Cola and Eli Lilly, which led to the new verification plan being put on hold. A professor at Cleveland State University who studies how local governments use social media says that emergency responders and people on the front lines of a disaster need to be able to quickly determine whether an account is legitimate. They are making life-and-death decisions, he says.

The company recently fired its communications team, so WIRED asked whether the special moderation policy for Ukraine still exists. According to a company post published Wednesday, no policies have changed, and the platform will rely more on automation to moderate abuse. But keeping up with shifts in problematic content requires constant upkeep from human workers.

Emergency managers are likely to stay on the social media site; they are conservative and not likely to change their practices overnight. FEMA's public affairs director did not directly answer questions about whether the agency was considering changing its approach to social media, but she said that social media plays a crucial role in emergency management and will continue to do so for the agency. It could be dangerous for agencies to leave the platform, because people have been trained to expect emergency updates on it.

People who work in emergency management are wondering what role the platform should play in crisis response. Can any other service fill the same role, serving as a source of distraction and entertainment but also of reliable information during an ongoing disaster?

Leysia Palen, a professor at the University of Colorado Boulder who studies crisis response, has found that the platform's community has become less effective at organically spreading high-quality information. Still, she says, it was better than not having anything at all.

Emergency managers are preparing for the worst. If the platform became too toxic or spammy, they could turn their accounts into one-way communication tools, a way to give out directions instead of gathering information, or leave the platform entirely at some point in the future. "We have a plan B," says Joseph Riser, a public information officer with the Emergency Management Department.