Google Refuses to Pull Website Responsible for Scores of Suicides

If you or someone you know is struggling, a list of resources is available from the Suicide Prevention Resource Center.

There is a site on the internet where users goad each other into taking their own lives, and according to Google, there is nothing it can do to remove it from its search results.

We won't be naming the site, but it was recently the subject of a deep dive in The New York Times. The story raises difficult questions about ethics and censorship, and especially about the way Google has allowed the site to remain a prominent hit in its search results.

The site is "pro-choice" in the sense that it gives users access to information about how to die by suicide, along with a community that will help them do so without judging them or trying to stop them.

Over the last two years the site has been up, at least 45 of its users have died by suicide, and likely many more. Many of them learned how on the site, and received support from fellow users when they wavered in their decision to end their lives.

The site was started by two men living thousands of miles apart, in Alabama and Uruguay, who were inspired to create it after a similar online forum was shut down. The Times investigation revealed the identities of both operators.

There is a raging debate about assisted suicide, the practice in which dying people can access treatments to end their lives.

The site highlighted by the Times, though, should not be part of that debate. The Times did not include physician-assisted deaths in its count, and for many of the site's users, the decision to end their lives appears to have been a gross miscalculation rather than an informed medical choice.

Regardless of one's personal beliefs about suicide and euthanasia, the idea of giving a group of unwell people information about and support for ending their own lives is disturbing.

This cursed suicide site is the latest in a long string of problematic online material that provides people with information about all kinds of horrible things, from pro-anorexia blogs and ineffective COVID-19 treatments to forums for white nationalists and self-described "involuntary celibates."

Tech companies like Facebook and Google have frequently been called on to censor or remove harmful online content, and they usually comply. Facebook is often quick to pull down harmful material, and Google has made similar efforts in the past.

Content that violates copyright is taken down by the internet giant. Under its "COVID-19 medical misinformation policy," it prohibits content promoting the horse dewormer ivermectin and the malaria treatment hydroxychloroquine as coronavirus drugs or preventatives. It has also taken down sites run by white supremacists and neo-Nazis, though only after significant public backlash.

Against that backdrop, it feels odd that the company insists this site should remain prominent in its search results. It's difficult to comprehend why the search giant would allow the site to appear at all when the danger it presents is so apparent.

The issue is deeply painful and challenging, a Google spokesperson said, and the company continues to focus on how its products can help people in vulnerable situations. People who search for information about suicide on Google, she said, will see features that point them to help and support.

She added that Google has specialized ranking systems designed to prioritize the highest quality results available for self-harm queries, that the company gives people open access to information, and that it is guided by local law when it comes to the important and complex questions of what information people should be able to find online.

The First Amendment protects free speech from government interference, but tech companies are not governments, and they can take down whatever content they want.

"Don't be evil" used to be the slogan of the company. The phrase was removed from the company's code of conduct.

The New York Times has the full story on the site and the deaths connected to it.

To be clear, this site has nothing to do with the assisted suicide chamber recently approved by authorities in Switzerland.
