On March 30, the young man accused of the mass shooting at a Tops grocery store in Buffalo browsed a series of racist and antisemitic websites. He listened to a lecture on the decline of the American middle class on a video-sharing site. He found a lurid video of a car driving through Black neighborhoods.

Over the week that followed, he lurked in chat rooms on 4chan but also read articles on race in HuffPost and Medium. He watched local television news, searched for documents on extremist websites and looked up gun instructions on YouTube.

The young man, who was indicted by a grand jury last week, has been portrayed by the authorities and some media outlets as a troubled outcast who acted alone when he killed 10 Black people in the grocery store. But he was also part of online communities where he and others shared racist and violent content.

Many of the disturbing ideas that fuel such atrocities are no longer confined to a few tricky-to-find dark corners of the web. Many outlets host bigoted content in the name of free speech, and the inability of online services to contain violent content threatens to draw more people toward hateful postings.

Images and text from the young man's extensive writings have circulated online for years and have penetrated some of the world's most popular sites. His path to radicalization is illustrated in these documents, which show the limits of companies' efforts to moderate posts, images and videos that promote extremism and violence. When enough of that content is left up, users can find even more extreme websites only a click or two away.

Eric K. Ward, the executive director of the Western States Center and a senior fellow at the Southern Poverty Law Center, said the problem is that once a person starts looking for this material, it starts to rain down on them.

The Buffalo attack has renewed focus on the role that social media and other websites continue to play in acts of violent extremism, with criticism coming from the public as well as government officials.

"The fact that this execution of innocent human beings could be broadcast on social media platforms and not taken down within a second says to me that there is a responsibility out there," said Gov. Kathy Hochul of New York. The state's attorney general, Letitia James, announced that she had begun an investigation into the role the platforms played.

Facebook's rules and policies prohibit hate speech. A spokeswoman said the platform detected more than 96 percent of content tied to hate organizations before it was reported, but declined to comment further on the matter. Some of the social media posts that The New York Times identified through reverse image searches were deleted, and some of the accounts that shared the images were suspended.

The man charged in the killings detailed his attack on Discord, a chat app that emerged from the video game world in 2015, and streamed it live on Twitch, a website owned by Amazon. The company took down his video within two minutes, but many of the sources of misinformation he cited remain online.

His paper trail provides a chilling glimpse into how he prepared the deadly assault online and how he found inspiration in fellow racists and in previous attacks that he largely mimicked. Together, that content formed a twisted and racist view of reality, one he took up as an alternative to mainstream views.

"The only way to prevent a shooter like me is to prevent them from learning the truth," he wrote.

His writings map the websites that motivated him, and the online life he led is reflected in the information he cobbled together there.

The young man's radicalization began not long after the start of the Covid-19 pandemic, when he was confined to his home like millions of other Americans. Having previously gotten his news mostly from Reddit, he joined 4chan, where he followed boards on guns and the outdoors before finding one devoted to politics.

Although he frequented fringe sites like 4chan, he also spent considerable time on mainstream ones, like YouTube, where he found graphic scenes from police cameras and videos offering gun tips and tricks. The day before the attack, he watched still more videos about mass shootings and police officers engaging in gunfights.

Links posted in the accused shooter's online diary show that he spent months researching guns and equipment on YouTube and on fringe websites.

Those fringe websites include an anonymous message board, unmoderated or lightly moderated video hosts, and sites trafficking in racism and antisemitism.

The Times reviewed the videos that appeared in the diary; three had been taken down because they linked to websites in violation of YouTube's firearms policy.

As he recorded his activity in the diary, the shooter watched many videos about guns and shooting practice; as the attack neared, he turned to videos of mass shootings and altercations with the police.

At the center of the shooting was the "great replacement" theory: the false conviction that an international Jewish conspiracy intends to replace white voters with immigrants who will take over political power in America.

The great replacement theory has roots in the Russian antisemitic hoax called The Protocols of the Elders of Zion.

It reappeared more recently in the works of two French novelists who, four decades apart, imagined waves of immigrants taking power in France. The term "grand remplacement" was popularized by the French writer Renaud Camus in a 2011 book.

According to the documents he posted, the accused gunman, Payton Gendron, seemed not to have read any of those works; instead, he attributed the idea of a great replacement to the online writings of the gunman who killed 51 people at two mosques in Christchurch, New Zealand, in 2019.

After the Christchurch attack, New Zealand's prime minister, Jacinda Ardern, spearheaded an international pact in which governments and major tech companies committed to eliminating terrorist and extremist content online. The Trump administration declined to sign the agreement, citing free-speech concerns.

Mr. Gendron's experience online shows how the writings and video clips associated with the New Zealand shooting continue to inspire other acts of racially motivated violence; he referred to both in his own documents.

The Anti-Defamation League warned last year that the "great replacement" and other white supremacist beliefs had moved from the fringes to the mainstream.

Most people do not know the narrative of the great replacement theory, according to Mr. Ward of the Southern Poverty Law Center, yet people are starting to repeat it as if they know what they are talking about. That, he said, is frightening.

The algorithms of social media platforms, designed to show users posts they will read, watch and click on, can accelerate the spread of misinformation and other harmful content.

Media Matters for America, a liberal-leaning nonprofit, said last month that its researchers had found at least 50 ads on Facebook over the last two years promoting aspects of the "great replacement" and related themes. The ads ran even though the company had said it would bar white nationalist and white supremacist content from Facebook and Instagram, and many of them came from candidates for political office.

On right-wing sites, 907 posts on the same themes generated more than 1.5 million engagements, far more than the posts intended to debunk them received.

Even while Mr. Gendron was still at the scene of the crime, his video reappeared on 4chan; it has since spread to other fringe platforms like Gab.

Angelo Carusone, the president of Media Matters for America, said that the advent of social media had brought angry people together with one another.

They are not isolated anymore, he said.