Facebook Papers: Facebook Watched Trump Posts Ignite Hate

(COLUMBUS, Ohio) Reports of hateful and violent Facebook posts began to flood in after President Donald Trump posted a warning via social media on May 28 that looters in Minneapolis would be shot.
Three days had passed since Derek Chauvin, a Minneapolis police officer, knelt on George Floyd's neck for more than eight minutes, until the 46-year-old Black man lost consciousness and showed no signs of life. The video, taken by a bystander, was viewed millions of times online. Protests took over Minnesota's largest city and soon spread to other cities across America.

An internal Facebook analysis revealed that reports of hate speech and violence on the platform rose sharply soon after Trump posted about Floyd's death.

"These THUGS are dishonoring the memory of George Floyd, and I won't let that happen," Trump wrote on May 28, at 9:53 a.m., via his Facebook and Twitter accounts. "Any difficulty and we will assume control but, when the looting starts, the shooting starts!"

The former president has since been barred from both Twitter and Facebook.

The leaked documents from Facebook provide a firsthand account of how Trump's social media posts stoked further anger in an already divided country, with reports of hate speech surging across the platform. Facebook's internal automated controls, meant to catch posts that violate its rules, predicted with almost 90% certainty that Trump's message broke the company's rules against inciting violence.

Yet the tech giant took no action on Trump's message.

The next day, protests, some of which turned violent, engulfed nearly every major U.S. city.

In this June 3, 2020, file photo, a protester stares at a National Guard soldier as protests continue over George Floyd's death, near Washington, D.C. (Alex Brandon/AP)

"People will not say Facebook caused it. However, Facebook was definitely the megaphone," said Lanier Holt, a communication professor at Ohio State University, explaining why people are looking back at the role Facebook played. "They can't escape the fact that they have exacerbated the problem, I think."

Twitter, by contrast, responded quickly to Trump's tweet, placing a warning notice over it and prohibiting users from sharing it further.

Facebook's internal discussions were revealed in disclosures made to the Securities and Exchange Commission and provided to Congress in redacted form by the legal counsel of Frances Haugen, a former Facebook employee turned whistleblower. The redacted versions received by Congress were obtained by a consortium of news organizations, including The Associated Press.

The Wall Street Journal previously reported that Trump was among many high-profile users, including celebrities and politicians, exempted from some or all of the company's normal enforcement policies.

The documents show that, in the days immediately after Floyd's death, reports of hate speech and violence were mostly confined to the Minneapolis area.

According to an internal memo dated June 5, 2020, the situation escalated across the country after Trump's May 28 post.

Internal analysis shows that violence reports on Facebook increased fivefold and hate speech complaints tripled in the days after Trump's post, while reports of false news on the platform doubled. Trump's message generated a significant number of violent and hateful comments, many of which Facebook removed. Some called for shooting "these thugs"; others read "f--- the white."

By June 2, "we can clearly see that the entire country was basically on fire," a Facebook employee wrote in the June 5 memo about the rise in hate speech and violence reports.

Facebook says it is impossible to determine how many of the hate speech reports were driven by Trump's post itself and how many by the broader controversy over Floyd's death.

A Facebook spokesperson said the spike in user reports resulted not from a single Donald Trump post but from a pivotal moment in the history of the racial justice movement, adding that Facebook often reflects what is happening in society and that the only way to prevent spikes in user reports at such critical moments would be to not allow those moments to be discussed on the platform at all, something the company would never do.

The internal findings, however, raise questions about public statements Zuckerberg made last year as he defended his decision to leave Trump's post untouched.

Zuckerberg said the company examined Trump's statement closely and determined it did not violate its rules. He said he left the post up because it served as a warning about Trump's plan to deploy troops.

"While I know many people are upset that we've left the President's posts up, our position is that we should enable as much expression as possible unless it will cause imminent risk of specific harms or dangers spelled out in clear policies," Zuckerberg wrote on his Facebook page the night of May 29, as protests broke out across the country.

But Facebook's own automated enforcement controls had determined that the post likely did violate the rules.

The June 5 analysis stated that Facebook's violence and incitement classifier was almost 90% certain that Trump's post violated Facebook's policy.

That assessment contradicts what Zuckerberg told civil rights leaders last year in an effort to calm their concerns that Trump's post was a specific threat to Black people protesting Floyd's death, said Rashad Robinson, president of Color of Change, a civil rights advocacy group. In the weeks that followed Trump's post, the group led a boycott of Facebook.

"Let me be clear: I had a direct argument with Zuckerberg days after that post, in which he gaslighted me and pushed back on any notion that it violated their rules," Robinson said in an interview with the AP last week.

A Facebook spokesperson said that the company's internal controls do not always accurately predict when a post violates its rules, and that human review, as was conducted on Trump's post, is more accurate.

To curb the former president's ability to stir hateful reactions on the platform, Facebook employees suggested limiting the number of reshares on similar posts.

Throughout his presidency, Trump used his Facebook account, which has more than 32 million followers, to rally his supporters. In the days leading up to the deadly Jan. 6 siege in Washington, he used it to promote false claims of widespread voter fraud, prompting hundreds of his supporters to storm the Capitol and demand that the results of a fair election be overturned.

Not long after the Capitol riot, Facebook suspended Trump's account until at least 2023.

Jennifer Mercieca, a Texas A&M University professor who has studied the former president's rhetoric, says there's a reason Facebook waited so long to take action.

Facebook benefited from the outrage Trump generated and his ability to attract attention, Mercieca said. "They wanted Trump to keep going."
