Facebook whistleblower Frances Haugen testifies before the Senate

Frances Haugen, the whistleblower who leaked controversial internal Facebook documents to The Wall Street Journal, revealed her identity on Sunday night. On Tuesday, she testified before the Senate Committee on Commerce, Science, and Transportation.
Haugen's testimony followed a hearing in which Antigone Davis, Facebook's global head of safety, was questioned about the company's impact on children and teens. Davis stuck to Facebook's talking points, frustrating senators by declining to answer direct questions. Haugen, who worked as a product manager on Facebook's civic misinformation team, was far more forthcoming.

An algorithm specialist, Haugen has served as a product manager at companies including Google, Pinterest, and Yelp. At Facebook, she worked on issues related to democracy and misinformation.

Haugen said that, having worked on four different types of social networks, she understands how complex and nuanced these problems are. But the choices being made inside Facebook, she argued, are disastrous for children, for public safety, and for democracy, and the company must be pressed to change.

The algorithm

Haugen said that Facebook's current algorithm, which rewards posts that generate meaningful social interactions (MSIs), is dangerous. Rolled out in 2018, this news feed algorithm prioritizes interactions, such as comments and likes, from the people Facebook thinks you are closest to, like friends and family.
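
To make the idea concrete, here is a minimal, hypothetical sketch of interaction-weighted feed scoring in Python. The weights, field names, and Post type are illustrative assumptions, not Facebook's actual implementation.

    from dataclasses import dataclass

    @dataclass
    class Post:
        author_closeness: float  # 0.0-1.0: how close the model thinks you are to the author
        likes: int
        comments: int
        reshares: int
        timestamp: float  # Unix epoch seconds

    def msi_score(post: Post) -> float:
        """Weight deeper interactions (comments, reshares) more heavily than
        lighter ones (likes), then boost posts from close connections."""
        interaction_value = 1.0 * post.likes + 5.0 * post.comments + 30.0 * post.reshares
        return interaction_value * (1.0 + post.author_closeness)

    def rank_feed(posts: list[Post]) -> list[Post]:
        # Highest predicted "meaningful social interaction" value first.
        return sorted(posts, key=msi_score, reverse=True)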

However, the documents Haugen leaked show that Facebook's own data scientists were concerned about the system's harmful side effects on important slices of public content, such as news and politics.

Facebook also uses engagement-based ranking, in which an AI surfaces the content it predicts each individual user is most likely to engage with. Because content that provokes strong reactions draws more engagement, it gets ranked higher, which can amplify misinformation, toxicity, and violent content. Haugen said she believes chronological ranking would help mitigate these negative effects.
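
For contrast, the chronological ranking Haugen favors would order the same posts purely by recency, with no engagement prediction at all (reusing the hypothetical Post type from the sketch above):

    def rank_feed_chronological(posts: list[Post]) -> list[Post]:
        # Newest first; predicted engagement plays no role.
        return sorted(posts, key=lambda p: p.timestamp, reverse=True)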

Haugen noted at the hearing that she has spent her entire career working on systems like engagement-based ranking, and that in speaking out she is essentially condemning ten years of her own work.

Haugen revealed on 60 Minutes that she had been part of a civic integrity team that Facebook dissolved after the 2020 election. Facebook had put safeguards in place to curb misinformation ahead of the 2020 U.S. presidential election, but turned them off once the election was over. It reinstated them only after the January 6 attack on the U.S. Capitol.

Facebook changed those safety defaults in the run-up to the election because it knew they were dangerous, Haugen said, and it reverted to the original defaults afterward because it wanted that growth back. She called the reversal deeply problematic.

Haugen said Facebook has presented a false choice: it can either keep using its volatile algorithms to sustain rapid growth, or prioritize user safety and decline. But she believes that adopting more safety measures, with oversight from government agencies, academics, and researchers, could actually benefit Facebook's bottom line.

Haugen said she wants to see Facebook move away from the short-termism it is run under today, managed by metrics rather than people. With appropriate oversight and some of these constraints, she argued, Facebook could actually be more profitable five to ten years down the road, because it would be less toxic and fewer people would quit.

Establishing government oversight

Asked what she would do if she were in CEO Mark Zuckerberg's shoes, Haugen said she would establish policies for sharing information with oversight bodies, including Congress; work with academics to make sure they have the data they need to conduct research on the platform; and implement the kind of soft interventions that were put in place to protect the 2020 election. She suggested, for example, requiring users to click on a link before resharing it, since other companies, such as Twitter, have found that this kind of friction reduces misinformation.
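
As a rough sketch of that click-before-share friction, the function below gates a reshare on whether the user has opened the link in the current session. The names and session tracking are hypothetical illustrations, not any platform's real API.

    # Hypothetical soft intervention: nudge users to open a link before resharing it.

    def on_reshare_attempt(url: str, clicked_links: set[str]) -> str:
        """Return the UI action to take when a user tries to reshare a link."""
        if url in clicked_links:
            return "share"  # the user has opened the link; allow the reshare
        return "prompt_open_first"  # ask the user to read before sharing

    # Example: a reshare attempt before and after clicking through.
    session_clicks: set[str] = set()
    assert on_reshare_attempt("https://example.com/story", session_clicks) == "prompt_open_first"
    session_clicks.add("https://example.com/story")
    assert on_reshare_attempt("https://example.com/story", session_clicks) == "share"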

Haugen said that as currently structured, Facebook cannot stop the spread of vaccine misinformation, because it relies too heavily on AI systems that, by Facebook's own admission, will likely never catch more than 10% to 20% of offending content.

Later in the hearing, Haugen told the committee that she supports reforming Section 230, the part of the United States Communications Decency Act that shields social media platforms from liability for what their users post. She believes decisions about algorithms should be exempted from Section 230's protections, so that companies could face legal consequences if their algorithms are found to cause harm.

Companies have less control over user-generated content, Haugen noted, but they have full control over their algorithms; Facebook should not get a pass on choices that prioritize growth, virality, and reactiveness over public safety.

Senator John Hickenlooper (D-CO) asked how Facebook's bottom line would be affected if the algorithm promoted safety instead. Haugen acknowledged that user engagement would likely fall, since people would spend less time on the platform, which would mean less advertising revenue for Facebook. But she believes the platform would still be profitable if it took the steps she suggested to improve user safety.

International security

According to the documents Haugen leaked, Facebook employees have reported instances of the platform being used to facilitate violence internationally.

Employees raised concerns that armed groups in Ethiopia were using Facebook to coordinate attacks on ethnic minorities. Because Facebook's moderation policies depend heavily on artificial intelligence, its AI must be able to function in every language and dialect that its 2.9 billion monthly active users speak. According to the WSJ, Facebook's AI systems don't cover most of these languages. Haugen noted that although only about 9% of Facebook's users are English speakers, 87% of the company's misinformation spending is devoted to English-language content.

Haugen said that Facebook appears to invest most in the users who make it the most money, even though the danger is not evenly distributed based on profitability. She also said that Facebook's chronic understaffing of its counter-espionage and information operations teams makes the company a national security threat, a concern she said she is communicating to other parts of Congress.

Facebook's future

Members of the Senate committee indicated that they are motivated to take action against Facebook, which is also currently facing an antitrust lawsuit.

Haugen said, however, that she opposes breaking up Facebook. If Instagram and Facebook were split apart, she argued, most advertising dollars would flow to Instagram, while Facebook would remain a Frankenstein endangering lives around the world, only now without the money to fund fixing it.

Critics argue that yesterday's six-hour Facebook outage, though unrelated to today's hearing, demonstrated the downside of one company controlling so many services, particularly when platforms like WhatsApp are integral to communication overseas.

In the meantime, legislators are working on bills to improve safety on social media platforms used by minors. Last week, Sen. Ed Markey (D-MA) announced that he and Sen. Richard Blumenthal (D-CT) would reintroduce the Kids Internet Design and Safety (KIDS) Act, which seeks new protections for internet users under 16. Today, Sen. John Thune (R-SD) brought up a bipartisan bill he introduced in 2019 with three other committee members, the Filter Bubble Transparency Act, which would increase transparency by giving users the option to view content that has not been curated by a secret algorithm.

Senator Blumenthal suggested that Haugen return for another hearing to discuss her concerns about Facebook as a national security threat. Although Facebook's top executives have spoken out against Haugen, policymakers seemed moved by her testimony.