This piece is part of the effort to make the Facebook Papers available to the public. The full directory of documents can be found here.
After Donald Trump's 2016 victory, News Feed was blamed for filling its users' heads with false information, and scrutiny intensified over the company's handling of state-affiliated trolls and other malicious groups. Within days of the results, Mark Zuckerberg told a tech conference crowd that it was "a pretty crazy idea" that fake news on Facebook had influenced the election.
Sources inside Facebook told Gizmodo that political interference had been a major concern at the company for the better part of a year. The current and former employees said that high-level discussions over its approach to false news and other activity aimed at manipulating voters had been held frequently. One source with direct knowledge of those discussions recalled a potential update that employees believed would reduce the flow of fake or hoax news stories. Many decisions around the election were caught up in the fear of upsetting conservatives, a source said.
Facebook officials refused to confirm or deny the existence of the update, or to acknowledge that a disproportionate amount of misinformation came from one side of the political spectrum, saying only that News Feed changes were not made based on their impact on any one political party.
The leaked statements of Facebook's own employees, however, show that accusations of political bias influenced its decision making.
The company's reluctance to take action against confirmed sources of misinformation is one of the threads running through the documents Gizmodo is publishing today. Employees who appear in the papers attribute decisions like the killed News Feed update to fear that the company would be portrayed as favoring certain publishers, ones its own users had rated as more informative. Accusations of liberal bias by Republican leaders hung over debates about whether to improve, or even correct, failures in how News Feed prioritized journalism and political content. In two of the papers, those accusations weigh heavily on proposed changes meant to minimize the lies being promoted into people's feeds.
Facebook's decision to kill one News Feed update was briefly described in an internal post. The company had obtained data on the trustworthiness of news sources, but decided against reducing the flow of low-quality news in order to avoid being perceived as anti-conservative.
A Facebook spokesman declined to comment on the discrepancy between the company's prior claims and the once-confidential testimony of its own employees.
In the same document, the author estimated that the company took action against approximately 2% of the hate speech on the platform. When it comes to decisions that affect a wide range of content, the author writes, policy concerns become significantly greater.
The documents remain relevant in the political climate of 2022. Republican leaders in several states are pursuing new laws around content moderation after social media companies' attempts to minimize the spread of election-related hoaxes. Laws written in Texas and Florida were framed as attempts to protect users from being punished for holding unpopular political opinions. A paper by researchers at MIT and Yale found that while Republican users were more likely to face suspension on the social networking site, they had also posted misinformation at a higher rate than their Democratic counterparts.
The August 2020 document details the public relations department's influence over a policy, already in effect, aimed at limiting political content more broadly across News Feed. The author reveals that employees had begun expressing displeasure internally over the way the policy was imposed.
The News Feed's own policy team and the PR department shot down a fix proposed by the news team. The author of the document said that the concerns about fixing the problem were not related to the accusations of bias.
In the months leading up to the 2020 election, Facebook began to reduce the amount of political content on the platform, and some of the changes were never announced. A press release from September 2020 described banning political ads in the week before Election Day and adding labels to posts, but made no mention of restricting political content on the platform. That restriction wasn't disclosed until a week before the election, seemingly a consequence of Mark Zuckerberg being grilled under oath.
Behind the quieter efforts to restrict political content were concerns about users quitting the platform over constant partisan chatter. A confidential report published by Gizmodo this year warned that many users had begun to associate Facebook with feelings of exhaustion, discouragement, stress, and anger; it quoted one user as saying that Facebook had harmed many of their friends.
The Facebook Papers include tens of thousands of pages describing how Facebook's moderation systems work. Some of the documents are so specific that security experts have warned against making them public, fearing bad actors will learn secrets to avoid detection. Hundreds of journalists obtained the records after they were given to Congress by Frances Haugen, a Facebook product manager turned whistleblower. In October 2021, Haugen testified before Congress about Facebook's harms. Our first installment shared 28 files related to the 2020 election and the Jan. 6 attack on the U.S. Capitol; our second contained 37. Gizmodo has collaborated with a group of experts to review, redact, and publish the documents.
There is no evidence of a conspiracy aimed at censoring conservative voices. If anything, the company's biggest accusers on Capitol Hill appear to have achieved the opposite of what they claimed to seek.
How much of the news feed is good or bad for the world?
This document presents the results of the first-ever Good for the World survey, which asked a wide swath of Facebook users what content they think benefits or harms the world. According to Facebook, users were mostly aligned on what was bad: someone wearing a fancy hat, a clip from a boxing match, and a stock picture of a bridge captioned "Would you walk across for $15 million?", the last one lazy engagement bait.
Political parties respond to the change.
Politicians in the EU feel pigeonholed into posting more inflammatory, clickbaity content in response to the company's ongoing reliance on MSI (meaningful social interactions) ranking.
Demoting via integrity signals is not enough.
The company relies on integrity signals to address political harms in people's feeds, but by the best estimates the author has seen, hate speech is harder to quantify than nudity or graphic violence, and misinformation is often not caught until after it has gotten a lot of distribution.
The consequences of turning off video autoplay are surprising.
One researcher experimented with turning off auto-play videos in News Feed. The researcher notes that the numbers suggest much of the video watch time in users' feeds is due to autoplay, and that while turning off autoplay on News Feed globally might be good for user wellbeing in the abstract, it would significantly reduce watch time.
Query non-feed VPVs as a real-time signal.
A document on ways of tracking user eyeballs on in-feed and off-feed content. Facebook measures an impression when a piece of content is fully within a person's line of sight in News Feed; the document states that most content gets its eyeballs from News Feed.
Dogfooding the hard-news power user experience.
The News Feed recommendations experience is designed to push hard political news and trusted publishers to the top.
More informative links in News Feed.
An internal announcement about an upcoming change intended to reduce low-quality links in News Feed.
Goal report for April 30, 2019.
A document listing goals and progress from members of the News Feed team. Feed ranking changes are starting to create more user sessions, but global sessions continue to trend behind forecasts.
Personalization and distribution.
The integrity team announces various experiments on News Feed, including a new system to demote content with a high predicted rate of "angry" reactions and a high likelihood that users will perceive it as bad for the world, as well as upranking climate science experts in search results.
Integrity Audit: Context, Update and Next Steps.
Feed Enforcement Team Mission and Principles.
There are two documents detailing work on an integrity audit.
A summary post covering recent goaling and goal metrics.
A post from early 2021 announcing goal changes for News Feed metrics.
News Feed UXR
Various studies and reports about News Feed conducted during the first quarter of 2021.
News Feed is becoming less valuable to people.
A study on whether News Feed has become more or less valuable to users over time. According to the writer, Feed satisfaction is down, and the number of posts from people important to users has declined.
What happens if we remove ranked feed?
Researchers switched some users' feeds from ranked to chronological. As it turns out, the chronological feed resulted in users seeing more content, as measured by VPVs, and ad revenue jumped in turn. The writer theorizes that frustrated users scroll further through their feeds and see more ads as a result: less compelling content on your News Feed means fewer reasons to click away from it.
Friends rank feeds.
An experiment called "The News Feed Game," in which 10 Facebook users were separated into two rooms and tasked with ordering the top ten posts from their friends. They're pretty darn good at it!
Feed composition and integrity.
An experiment that changed people's News Feeds to discourage them from sharing violating content.