Donald Findlater, director of the Stop It Now helpline, says the chatbot is a more direct and engaging way to reach users. After the chatbot appeared more than 170,000 times, 158 people clicked through to the helpline. Those people, he says, have taken an important step and overcome a lot of obstacles to do so. Stopping people at the very beginning of their journey is itself a measure of success, and the figures show that people are using the chatbot and going on to use support services.

Reports have documented how women and girls had videos of themselves uploaded to PornHub without their consent. In December 2020, PornHub removed more than 10 million videos from its website, and CSAM found on the site was removed last year.

PornHub says it has "zero tolerance" for illegal material and that the chatbot is intended to educate users that they will not find such content on the site. The company volunteered to take part in the project and is not being paid to do so. The system will run on PornHub's UK website for a year before being evaluated by external academics.

John Perrino, who is not connected to the project, says recent years have seen a rise in new tools built around safety by design. He describes the chatbot as an interesting collaboration, sitting at the intersection of policy and public perception, that helps users and points them toward healthy resources. He says he has never seen a tool quite like it before.

There is evidence that technical interventions can make a difference in redirecting people away from child sexual abuse material and reducing the number of searches for CSAM online. The Lucy Faithfull Foundation, for instance, has worked with one internet giant to introduce warnings when people search for terms that could be linked to CSAM, and the number of such searches fell sharply as a result.

Searches also decreased when search engines put measures in place against terms related to child sexual abuse, and one set of advertisements directing people looking for CSAM to helplines in Germany received more than 20 million impressions over a three-year period. The study found, however, that the warnings had only a limited impact.

The people involved with the chatbot do not claim it is the only way to stop people from finding child sexual abuse material online; there is no magic bullet for child sexual abuse on the internet, and the chatbot is deployed in one specific area. If the system proves successful, they say, it could be rolled out to other websites.

Findlater says there are other places they will be looking at as well. If that happened, the system would have to be rebuilt and evaluated for the specific website it sits on: the search terms people use on PornHub are not the same as those used in a general web search, and one set of warnings cannot simply be transferred to another context.