Gentle internet users, be careful whom you trust.
Facebook's Wednesday report on "Widely Viewed Content" shows that the most widely viewed links in users' News Feeds during the second quarter of 2021 related to innocuous topics like football, hemp, and charitable donations. But you might not want to take Facebook's word for it.
According to a Friday report from The New York Times, the social media company apparently buried a similar report it had prepared covering the first three months of 2021. The reason seems obvious: Facebook wasn't thrilled with the results.
The Times, which reviewed the earlier report and internal emails about it, found that Q1's most popular link was an article whose headline connected the death of a Florida physician to the COVID-19 vaccine. The Centers for Disease Control and Prevention maintains that all COVID vaccines approved for emergency use in the U.S. are "safe and effective."
The same report found that the Facebook page of a far-right website that peddled conspiracy theories and misinformation was the 19th most-visited page on the platform during those months. That alone suggests President Joe Biden had a point in July, when he said platforms like Facebook were "killing people" with COVID misinformation.
Executives, including Facebook chief marketing officer Alex Schultz, intervened to stop the report's release. Schultz initially supported publishing it but later changed his mind, and the report was shelved.
Andy Stone, a Facebook spokesperson, offered this explanation to the Times: "We thought about making the report public sooner, but we knew the attention it might garner, exactly like we saw this week. There were fixes to our system that we wanted to make."
The story doesn't elaborate further on Facebook's thinking. Mashable reached out with questions to clarify what those "fixes" were, and what company leaders think the decision to keep an unflattering report secret says about Facebook's professed commitment to transparency. The company has not yet responded.
Even the Q2 report has been widely criticized for not accurately portraying what users actually see on Facebook. The Washington Post reported that it was part of a larger push by Facebook to discredit or block independent research into harmful content on its platform while offering its own data and statistics instead.
One such instance: Facebook pulled the plug on NYU's Ad Observatory project, an effort to monitor and analyze how politicians spend their ad dollars on Facebook. The project collected data on ad placement through a browser extension, and every participant had to opt in.
Facebook first threatened to shut the project down in the weeks leading up to the 2020 U.S. election and eventually followed through in mid-2021. Many critics noted that Facebook relied on weak and easily undermined arguments to justify its decision.
In a story on the Q2 report, a former Facebook employee likened it to ExxonMobil publishing its own climate change study: an attempt to counter media coverage and independent research that tell a different story.
Facebook, for its part, is very proud of the report, casting it as a reflection of the company's effort to give the world an honest view of what users see on the platform. Guy Rosen, the company's vice president of integrity, said in a statement provided for the Post story: "This is yet another step in a long journey we have undertaken to be, far and away, the most transparent website on the internet."
That's a striking claim in the context of the new Times report, which makes clear that Facebook tried to conceal an earlier, more damaging report from the public and then adjusted its processes, in ways that aren't yet apparent, to make sure future reports don't end up in the same place.
It's neither surprising nor unusual for a company, private or public, to put its own interests first. But it's hard to take Facebook's lofty claims about transparency and public service seriously when it's doing the exact opposite.