[Image: A sign outside Twitter's concrete headquarters building reads @twitter, with the blue Twitter bird above it.]

Advertisers say they don't want their ads associated with a platform that can't police itself for non-consensual sexual material, after reports showed Twitter has struggled to deal with a rash of accounts peddling child sexual abuse material.

According to a report, around 30 major brands discovered their ads were appearing next to the profile pages of accounts that were selling and soliciting child sexual abuse material. Several of those advertisers, including Ecolab, Dyson, and Mazda, have since pulled their ads from the platform.

According to a report from Business Insider, Twitter told advertisers on Wednesday that their ads had been running on these profiles, after the findings were shared with the micro-blogging site. In emails seen by Insider, the company said it had banned the accounts that violated its rules and was looking into how many people the ads may have reached.

"We are working closely with our clients and partners to investigate the situation and take the appropriate steps to prevent this from happening in the future," the company said, adding that all of the profiles found peddling child sexual abuse content have been suspended.

According to its bi-annual transparency report, the company suspended close to 600,000 accounts and moderated another 600,000 for child sexual exploitation over the course of a year.

Just how big a child sex abuse problem does the platform have? It is not like Pornhub, a site that has been accused of actively profiting off of child abuse, and which was pushed into a concerted effort against child sex abuse material after credit card companies stopped supporting it.

According to Ghost Data, Twitter failed to take down 70% of the accounts the group identified as dealing in child sexual abuse material during the 20-day study period. One account openly solicited sexual content from users aged 13 and over. After advertising their content on the platform, the accounts directed buyers to file-sharing services such as Mega and Dropbox to complete their transactions.

According to recent reports, the social media company has known about such accounts for a long time. Earlier this year, Twitter was reportedly considering its own version of OnlyFans, which would have allowed creators to sell paid subscriptions for adult content. The initiative was shelved after a dedicated internal team reported that the site fails to police child sexual exploitation and non-consensual nudity at scale.

Employees have been warning the company about its child porn problem for over a year, according to the report, and cautioned that launching this kind of adult-content operation could cost the company advertising dollars.

Twitter is not the only site that has been cited in lawsuits for failing to police sexual content.

That matters for the company's bottom line: according to BusinessofApps data, almost all of Twitter's revenue for the year came from advertising.