Removing child exploitation is the top priority, according to the company's new owner and CEO. But two people with knowledge of the matter, who requested anonymity, said that only one staff member remains on a key team dedicated to removing child sexual abuse content from the site. WIRED identified four Singapore-based employees specializing in child safety who announced on social media that they had left their jobs.

Researchers say the importance of in-house child safety experts cannot be overstated. The team that enforces the platform's ban on child sexual abuse material is based in Singapore and now has only one full-time employee, yet the Asia-Pacific region it covers is home to half the world's population.

Japan alone is one of the platform's busiest markets, with a user base second in size only to that of the US. The Singapore office has been hit by layoffs and resignations since Musk took over the business: in the past month, Twitter laid off half of its workforce and then asked remaining staff either to accept a severance package or to commit to long hours at high intensity.

The impact of these layoffs and resignations on Twitter's ability to tackle CSAM is very worrying, according to a CSAM researcher at the University of São Paulo in Brazil. It would be naive, she argues, to think the platform will feel no impact when the people who worked on child safety are laid off or allowed to resign. Twitter did not reply to a request for comment.

Child safety experts do not fight CSAM on their own. They get help from organizations such as the UK's Internet Watch Foundation and the US-based National Center for Missing & Exploited Children, which scour the internet to identify CSAM being shared across platforms. According to the IWF, the data it sends to tech companies can be used to remove content automatically, without the need for human moderation, making the blocking process as efficient as possible.

But external organizations focus on the end product and don't have access to a platform's internal data, the São Paulo researcher says. Internal dashboards are important for analyzing that data and helping the people who write detection code, and only someone inside the platform can see them. According to Arda Gerkens, who runs the Dutch foundation EOKM, the automated tools used to find child abuse imagery struggle to distinguish between consenting adults and nonconsenting children. That is why human staff are so important, she says: the technology is not yet good enough on its own.

Twitter suspended more than half a million accounts for CSAM, a 31 percent increase compared to the previous six months. Dyson and Forbes suspended their advertising campaigns after their ads reportedly appeared alongside child abuse content. According to an internal report, Twitter cannot accurately detect CSAM. Those concerns were only made worse when Musk asked his followers to reply in the comments if they saw any issues that needed addressing.

That question should not be a thread on the social networking site; it should be put to the child safety team he laid off. What is happening here is the exact opposite.