A week ago, The Wall Street Journal began publishing a series of stories about Facebook based on internal company research. The Facebook Files offer a dizzying collection of details about the problems the company faces.
The stories reveal an opaque, separate system of governance for elite users known as XCheck. They show that Instagram harms a significant percentage of the teenage girls who use it. And they expose a huge gap between what Facebook invests in moderating content in the United States and what it invests everywhere else.
The stories have attracted public attention, and members of Congress have launched an investigation. As other outlets add their own reporting, the scrutiny keeps growing.
According to MIT Technology Review, despite Facebook's substantial investment in security, propaganda from Eastern European troll farms was reaching 140 million people a month by October 2019. Seventy-five percent of those users had never followed the pages involved; they saw the content because Facebook's recommendation engine pushed it to them. ProPublica investigated Facebook Marketplace and discovered thousands of fake accounts involved in various scams. And The New York Times reported that Facebook has sought to improve its reputation by inserting pro-Facebook stories into the News Feed; it is not known whether that effort will continue.
Most Facebook scandals flare up and then fade away. What sets this one apart is that it is built on research conducted by Facebook's own employees.
Facebook was last under this much public scrutiny in 2018, when the Cambridge Analytica data privacy scandal rocked the company. That was a bizarre scandal for many reasons, not least because most of its details had been known for years. It became an international sensation because of the notion that political operatives had used Facebook's huge trove of demographic data to sway Americans into voting for Donald Trump.
Facebook ads convince people to buy things, but not, supposedly, to change their political views.
Nearly everyone now agrees that Cambridge Analytica's claims about psychographic targeting were a marketing gimmick. But the broader idea, that Facebook and other social networks are slowly reshaping entire societies through their data collection, advertising practices, ranking algorithms, and engagement metrics, has largely held. Facebook's effectiveness at getting people to buy things is what makes it such a great advertising business. Yet the company would have us believe that it is far less effective at getting people to change their political views.
The company has never been able to resolve the disconnect.
Still, it invested $13 billion in safety and security and put 40,000 people to work policing the network. It disrupted networks of fake accounts. It grew more comfortable inserting authoritative information about COVID-19 and climate change into the News Feed. And the 2020 US presidential election came and went with Facebook barely a factor in the story.
But basic questions lingered. How exactly was the network being policed? Were different countries policed equitably? What does consuming a personalized feed like this every day do to a person? Or to a country's politics?
There is always a risk of technological determinism here: of assuming that Facebook's algorithms are more powerful than they are, or that they operate in a vacuum. Research I have highlighted in this column suggests that other forces, such as Fox News, can often do more to shift a person's politics.
Still, for all sorts of reasons, we would benefit from being able to isolate the effects that Facebook, YouTube, TikTok, or Twitter have on the wider world. Because the platforms keep their data private, we spend a lot of time arguing about questions we know very little about. Our discussions of Facebook are based on our perceptions of it rather than on evidence, and Facebook and the rest of the world end up talking past each other.
To its credit, Facebook did investigate some of these questions itself. Questions such as: what is Instagram doing to teenage girls?
In doing so, Facebook planted the seeds of the present moment. The most pressing questions in the latest reporting are the same ones raised by Cambridge Analytica: what is this social network doing to us? But this time we have actual data to examine, data that Facebook itself produced.
Facebook employees bristle when I ask them about this. They'll tell me that reporters have believed the worst for years, and that the recent stories simply confirm those biases. They'll counter that just because one company researcher claims something doesn't make it true. And they'll ask: why isn't anyone demanding internal research from YouTube, Twitter, or TikTok?
This may explain the dismissive tone of the company's response to all this reporting: a blog post from Nick Clegg, a joke from the CEO, a return to grousing about the mainstream media.
For me, however, the last week felt like a turning point.
Most Facebook researchers who have spoken publicly about the company have said that their work was often stymied or ignored by their superiors. Now we have seen evidence, in the company's own research, that it has acted recklessly.
Sometimes that recklessness was accidental: Facebook seems to have been genuinely surprised to discover that Instagram appears to be driving increases in anxiety and depression among teenage girls.
Sometimes the company acted recklessly with full knowledge of what it was doing, as when it spent vastly more to remove misleading content in the United States than it did everywhere else in the world.
Even in the United States, it may have under-invested in safety and security. As Samidh Chakrabarti, who ran Facebook's civic integrity team until this year, has noted, the much-ballyhooed $13 billion investment represents roughly four percent of the company's revenue.
Despite all this, Facebook continues to thrive. Daily users are up seven percent year over year. Profits are rising. The post-pandemic advertising market is so strong that even digital-ad also-rans like Pinterest and Twitter are having a record year. Facebook's hardware business is slowly becoming a success, and could pave the way to the metaverse.
Yet the question remains: what is this social network doing to us? Neither the company nor the rest of the world seems to fully understand the answer. And the company's reputation continues to slide.
If you ran the company, one natural response to this situation would be to do less research: no more damning studies, no more damning headlines. Is Congress going to hold a hearing? Who cares. Pass a law? Not this year.
This week, Facebook signaled its intent when it made it more difficult for users to share their News Feed data with external research projects.
What if Facebook invested more in open research, not less?
But what if it did the opposite? What if it spent significantly more on research, and publicly pressured its peers to join it? What if it published its findings regularly and submitted to audits of its data? What if it made it dramatically easier for qualified, independent researchers to examine the platform?
That would be unprecedented in American business history, but then Facebook is an unprecedented thing in the world. The company cannot rebuild trust with the wider world through tweetstorms and blog posts. It could do so by helping us understand social media's true effects on politics and human behavior.
That, however, is not the path it has chosen. Instead, the company is pursuing a different kind of research: what happens if we show people good news about Facebook? According to one report, one test showed a user a story about how the social network had helped her find her horse. Perhaps that will move the needle.
But that test is not a joke. Behind it lies a real idea: that perception can be reshaped over time by the narratives you promote, and that the News Feed can shift public opinion, depending on who is running it.
This suspicion, that the News Feed can drive such changes, underlies much of the company's own research and many of the fears about its impact, even though Facebook's PR machine has constantly downplayed the possibility.
Now the company is ready to test that proposition. It will keep telling the public that it is not nearly as powerful as its apostate researchers claim. And with Project Amplify, Facebook will try to find out whether they are right.
This column was published in conjunction with Platformer, a daily newsletter about Big Tech and democracy.