Researchers Create Narc Neural Network to Help Cops Predict New Designer Drugs

Canadian researchers have created an artificial intelligence that can imagine what designer drugs will look like before they are even invented.

The University of British Columbia announced this week that its researchers had fed a machine learning algorithm a database of known psychoactive substances, with the goal of predicting new designer drugs before the clandestine chemists who make them and the dealers who sell them can bring them to market.
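The announcement doesn't spell out what kind of model the team used, but a common way to "predict" new molecules from a database of known ones is to train a character-level generative network on SMILES strings of existing compounds and then sample plausible new structures from it. Here is a minimal sketch along those lines, assuming PyTorch; the toy dataset, vocabulary, and hyperparameters below are placeholders for illustration, not details from the UBC project:

```python
# Sketch: a character-level LSTM trained on SMILES strings of known substances,
# then sampled to propose new candidate molecules. Illustrative only; the toy
# corpus and settings here are NOT from the UBC work.
import torch
import torch.nn as nn

# Toy corpus standing in for a database of known psychoactive substances.
smiles = ["CC(CC1=CC=CC=C1)NC", "CCN(CC)C(=O)C1=CC=CC=C1"]
chars = sorted(set("".join(smiles)) | {"^", "$"})  # ^ = start token, $ = end token
stoi = {c: i for i, c in enumerate(chars)}

class SmilesLSTM(nn.Module):
    def __init__(self, vocab, hidden=128):
        super().__init__()
        self.embed = nn.Embedding(vocab, 64)
        self.lstm = nn.LSTM(64, hidden, batch_first=True)
        self.head = nn.Linear(hidden, vocab)

    def forward(self, x, state=None):
        h, state = self.lstm(self.embed(x), state)
        return self.head(h), state

model = SmilesLSTM(len(chars))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Train the network to predict the next character of each SMILES string.
for epoch in range(200):
    for s in smiles:
        seq = torch.tensor([[stoi[c] for c in "^" + s + "$"]])
        logits, _ = model(seq[:, :-1])
        loss = loss_fn(logits.squeeze(0), seq[0, 1:])
        opt.zero_grad()
        loss.backward()
        opt.step()

# Sample a new candidate string one character at a time.
model.eval()
with torch.no_grad():
    x, state, out = torch.tensor([[stoi["^"]]]), None, ""
    for _ in range(80):
        logits, state = model(x, state)
        nxt = torch.multinomial(torch.softmax(logits[0, -1], dim=0), 1).item()
        if chars[nxt] == "$":
            break
        out += chars[nxt]
        x = torch.tensor([[nxt]])
print("candidate SMILES:", out)  # a real pipeline would still validate this, e.g. with RDKit
```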

The press release accompanying the project also makes it clear that the researchers apparently did not realize the movie "Minority Report" was a cautionary tale.

One of the senior authors of a paper on the neural network, published in the journal Nature Machine Intelligence, joined the ranks of those who completely misread cautionary speculative fiction while promoting this dangerous new artificial intelligence:

"The 2002 sci-fi movie 'Minority Report,' where foreknowledge about criminal activities about to take place helped reduce crime in a future world, is a good example of how we can predict what designer drugs are likely to emerge on the market before they actually appear. Our software gives law enforcement agencies and public health programs a head start on the drug trade, and lets them know what to look out for."

Police agencies in Europe and the US are already adopting the technology, despite the many examples of machine learning in policing going badly wrong. The US Drug Enforcement Administration, the European Monitoring Centre for Drugs and Drug Addiction, and even the United Nations Office on Drugs and Crime have all used it.

It is worth noting that the war on drugs has been a huge failure at its stated goals of eradicating the harms of drug addiction and bringing to justice the bad guys who steal and kill in the service of the drug trade.

And considering how the film, based on the Philip K. Dick story that popularized the term "pre-crime," actually ends, we are probably screwed.
