Facebook's vaccine stance is part of a familiar pattern, says author and NYTimes journalist | TechCrunch

Facebook today announced that last month it removed hundreds of accounts from its Instagram and Facebook platforms that were linked to a Russian anti-vaccination disinformation campaign. According to the company, one campaign saw memes and comments posted by a now-banned network claiming that the AstraZeneca COVID-19 vaccine would turn people into chimpanzees. In May, another network pushed content based on an allegedly hacked and leaked AstraZeneca document that called into question the safety of the Pfizer vaccine, says Facebook.

The company publishes such reports to remind the public that it is focused on finding and eliminating deceptive campaigns around the world. However, a New York Times investigation into Facebook's relationship with the Biden administration shows that Facebook continues to fall short in tackling misinformation, particularly vaccine misinformation.

We spoke about that reported disconnect with Sheera Frenkel, a cybersecurity reporter for The New York Times and co-author, with New York Times national correspondent Cecilia Kang, of "An Ugly Truth: Inside Facebook's Battle for Domination," published in June. The conversation has been lightly edited for length.

TC: The big story about Facebook right now centers on its shutting down the accounts of NYU researchers who had been using tools of their own to study advertising on the network, which the company says violated its rules. Many people believe those objections aren't valid, and a number of Democratic senators have sent a letter to the company asking about its decision to ban the academics. How does this situation fit with your understanding of how Facebook operates?

SF: It was striking to me how it fit a pattern we show in our book of Facebook taking what looks like a very ad hoc approach to many of its problems. It was shocking that Facebook took this action against NYU, because there are plenty of other companies and commercial businesses that use data in a similar way.

The NYU academics were very transparent about how they collected data; they were upfront about what they were doing and shared it with journalists, who in turn informed Facebook. That Facebook chose to take action against them just as they were about to publish research that could have been harmful to Facebook is a singular instance, and it really reveals the problem with how Facebook handles data about its users.

TC: Is your sense that the Senate and Congress might demand greater accountability for recent industry indiscretions, such as those around January 6? Usually Facebook apologizes for a public scandal . . . then nothing changes.

SF: I spoke with one lawmaker after the book was published who said it would be one thing if they apologized once and we then saw significant changes at the company. What these apologies show us instead is that they think they can make a few superficial changes and apologize, without ever getting to the root cause of the problem.

You brought up January 6, and that is something Congress is aware of. I do think lawmakers are going beyond what they normally do . . . They are taking a step back and asking: How did Facebook allow these groups to fester on its platform for months ahead of January 6? How were these groups amplified by Facebook's own algorithms? How did its piecemeal approach of removing certain groups and not others make stop-the-steal possible?
That's because they haven't taken the time to understand Facebook's machinery as a whole.

TC: I'm still a little skeptical about how useful these investigations will be if Facebook won't share more of its data.

SF: When the White House asked Facebook for prevalence data on COVID misinformation, Facebook couldn't provide it because it didn't have it. Facebook hadn't given its data scientists the mandate or the resources to track the prevalence of COVID misinformation when they wanted to, and that was over a year ago, at the beginning of the pandemic. Legislators can pressure Facebook to follow through on that request and give it deadlines for when they need to see the data.

TC: Do you think there is a reporting problem within Facebook, or are these open information loops by design? In your book you describe the Russian activity on Facebook leading up to the 2016 election, which Alex Stamos, the company's former chief security officer, uncovered, but you weren't able to pin down why Stamos' findings weren't presented to Mark Zuckerberg, Sheryl Sandberg and other members of the team earlier.

SF: That is something we really wanted to get to the bottom of while reporting this book. Is it possible that Mark Zuckerberg and Sheryl Sandberg did not know everything there was to know about Russia? Were they kept out of the loop? In my view, only Mark Zuckerberg and Sheryl Sandberg can answer that question.

What I can tell you is that Alex Stamos went to company leadership about a week after the 2016 election and said there had been Russian election interference. We don't yet know the extent of it, but we know there was something, and we're going to investigate it. Even after being told this shocking news, Mark Zuckerberg and the rest of the top brass didn't request weekly or even daily meetings to keep themselves informed about the security team's progress. Obviously, as the CEO of a company, he has a lot to do. But you would think that if your security staff came to you and said, "Hey, something unprecedented happened on our platform, with potential harm to democracy that we didn't expect or foresee," the head of the company would say, "Hey, this is a very important priority for me and I'm going to ask for regular updates." Instead, they can say months later, "Well, we didn't know; we weren't kept up to date on everything."

TC: Industry watchers remain very interested in where regulation may be headed. Which areas are you watching most closely?

SF: There are two things I'll find fascinating to watch over the next six months to a year. One is COVID misinformation. It's the biggest problem Facebook has because it has been on the platform for almost a decade and has deep roots in every part of Facebook. It's also homegrown: it's Americans spreading misinformation to other Americans. It challenges all of Facebook's principles on free speech, and it raises the question of what it means for a platform to host free speech. I'm curious to see how they deal with the fact that their algorithms continue to push people into anti-vaccine groups and to promote the accounts on the platform that are spreading incorrect information about COVID.

Second, we're entering a year with a lot of important elections in other countries, and some of those leaders are following Donald Trump's lead. After Trump was banned, I'm curious to see how Facebook handles leaders in other countries who are testing the waters in much the same way Trump did.