In February, Facebook announced an experiment: it would show less political content to a subset of users in a few countries, including the US, and then ask them about the experience. Aastha Gupta, a Facebook product manager, said the goal was to preserve people's ability to find and interact with political content on Facebook while respecting each person's appetite for it.
On Tuesday morning, the company gave an update: survey results suggest that users appreciate seeing political content less often in their feeds, so Facebook will extend the experiment to more countries and is teasing further expansions in the coming months. It's an interesting move for a company perpetually criticized for its impact on politics; after all, the experiment was announced just one month after Donald Trump supporters stormed the US Capitol, an episode that some, including elected officials, blamed in part on Facebook. The change could have major ripple effects for the media outlets and political groups that have grown accustomed to relying on Facebook for distribution.
However, the most important part of Facebook's announcement has nothing whatsoever to do with politics.
The basic premise of any AI-driven social media platform, whether Facebook, Instagram, Twitter, TikTok, or YouTube, is that you don't have to tell it what you want to see. Simply by watching what you like, share, comment on, or linger over, the algorithm learns what interests you and serves up more of the same.
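To make that loop concrete, here is a minimal sketch of an engagement-first ranker. The signals, weights, and class names are illustrative assumptions, not Facebook's actual system, which uses far richer features and constantly retuned models.

```python
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    # Predicted probabilities that this user takes each action,
    # as estimated by a model trained on their past behavior.
    p_like: float
    p_comment: float
    p_share: float
    expected_dwell_seconds: float

# Illustrative weights -- real systems tune these constantly;
# these numbers are assumptions made for the sketch.
WEIGHTS = {"like": 1.0, "comment": 4.0, "share": 8.0, "dwell": 0.05}

def engagement_score(post: Post) -> float:
    """Score a post purely by how much interaction it is predicted to get."""
    return (
        WEIGHTS["like"] * post.p_like
        + WEIGHTS["comment"] * post.p_comment
        + WEIGHTS["share"] * post.p_share
        + WEIGHTS["dwell"] * post.expected_dwell_seconds
    )

def rank_feed(candidates: list[Post]) -> list[Post]:
    """An engagement-first feed: highest predicted engagement goes on top."""
    return sorted(candidates, key=engagement_score, reverse=True)

if __name__ == "__main__":
    feed = rank_feed([
        Post("calm-news", p_like=0.20, p_comment=0.02, p_share=0.01,
             expected_dwell_seconds=10),
        Post("outrage-bait", p_like=0.15, p_comment=0.12, p_share=0.08,
             expected_dwell_seconds=25),
    ])
    for post in feed:
        print(post.post_id, round(engagement_score(post), 2))
```

Note that in this toy feed, the post that provokes comments and shares outranks the one people merely "like" more often; nobody asked for that ordering, it emerges entirely from predicted behavior.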
This design feature gives social media companies a handy defense against criticism: if certain content is winning out on a platform, that's because it's what users like. And if you don't like that, perhaps your problem is with the users, not the platform.
Yet optimizing for engagement is at the core of many criticisms of social media platforms. An algorithm too focused on engagement might steer users toward content that is highly engaging but of low social value. It might feed them a diet of ever more extreme posts, which can make them even more engaged, a dynamic sketched in miniature below. And it might encourage the viral spread of false or harmful material, because the system selects first for what will trigger engagement rather than for what ought to be seen. The list of ills attributed to engagement-first design helps explain why, during a March congressional hearing, neither Mark Zuckerberg, Jack Dorsey, nor Sundar Pichai would admit that the platforms they run work that way at all. Zuckerberg insisted that Facebook's true goal is "meaningful social interactions"; engagement, he said, is merely a sign that if Facebook delivers that value, people will naturally use its services more.
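That feedback ratchet can be shown in miniature. In the toy loop below, posts sit on a 0-to-1 "extremity" scale, the chance of engagement rises with extremity (the critics' premise, and one Zuckerberg himself acknowledged in 2018, as described below), and the ranker nudges its estimate of the user's taste toward whatever drew engagement. All of the numbers and the update rule are assumptions chosen for illustration.

```python
# Toy model of the engagement feedback loop critics describe: the feed
# shows slightly edgier content than its current estimate of the user's
# taste, then updates that estimate whenever the user engages.

def engagement_probability(extremity: float) -> float:
    # Assumed premise: engagement rises as content grows more extreme.
    return 0.2 + 0.7 * extremity

taste_estimate = 0.3   # the ranker starts with a fairly tame profile
LEARNING_RATE = 0.5

for step in range(6):
    shown = min(1.0, taste_estimate + 0.2)    # probe slightly edgier content
    if engagement_probability(shown) > 0.5:   # the user "engages"
        taste_estimate += LEARNING_RATE * (shown - taste_estimate)
    print(f"step {step}: showed {shown:.2f}, taste estimate now {taste_estimate:.2f}")
```

Each round of engagement drags the estimate further toward the extreme end, and nothing in the loop ever pulls it back.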
In a different context, however, Zuckerberg has acknowledged that the matter might not be so simple. In a 2018 post explaining why Facebook suppresses "borderline" posts that push right up against the platform's rules without breaking them, he wrote that no matter where the lines for allowed content are drawn, people will on average engage more with content that comes closer to those lines. But that observation seems to have been confined to how Facebook enforces its policies on prohibited content, rather than prompting a rethink of its ranking algorithm.
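Zuckerberg's remedy in that post was to invert the natural curve: apply a distribution penalty that grows as content approaches the line. A toy version of such a demotion might look like this; the classifier score and the quadratic penalty shape are assumptions made for illustration, not Facebook's actual formula.

```python
def demotion_multiplier(borderline_score: float) -> float:
    """Shrink a post's distribution as it nears the policy line.

    borderline_score is assumed to be a classifier's estimate in [0, 1]
    of how close the post comes to violating a rule (1.0 = at the line).
    The quadratic shape barely touches tame content but sharply
    penalizes near-violations.
    """
    assert 0.0 <= borderline_score <= 1.0
    return 1.0 - borderline_score ** 2

def adjusted_score(engagement: float, borderline_score: float) -> float:
    """Counteract the 'more engagement near the line' curve with a penalty."""
    return engagement * demotion_multiplier(borderline_score)

# A near-the-line post loses despite higher raw engagement:
print(adjusted_score(engagement=5.0, borderline_score=0.10))  # ~4.95
print(adjusted_score(engagement=8.0, borderline_score=0.95))  # ~0.78
```

The point of the inverted curve is that raw engagement no longer wins automatically near the line: the edgier post draws more interaction but ends up with less distribution. It is a targeted patch at the policy boundary, not a change to the engagement objective itself.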