Senator Dan Sullivan (R-Alaska) referred to China's recent decision to limit how much time teens can spend playing video games, noting that the government there had effectively told teenagers to take a break. He was addressing Antigone Davis, Facebook's head of global safety: did she believe the United States government should do something similar?
It was a striking question. A law like China's would be almost unimaginable in the US, yet Sullivan looked almost elated as he posed it. Imagine a country that actually regulates its technology sector.
It is easy to see the point of Sullivan's remark, because we don't live in that country. Since the peak of the Cambridge Analytica scandal, Congress has repeatedly called Facebook executives to testify, yet in those three and a half years it has passed zero legislation governing the conduct of social media platforms. Instead it tends, with some exceptions, to simply browbeat companies into fixing things themselves.
Thursday's hearing was prompted in part by a Wall Street Journal series based on leaked internal research. One story was headlined "Facebook Knows Instagram Is Toxic for Teen Girls, Company Documents Show." The affair was framed as a mix of Watergate (what did Facebook know, and when did it know it?) and the corporate exposés of yesteryear. Senator Richard Blumenthal (D-Connecticut), in his opening remarks, accused the company of concealing its research and lying about it, claiming that Facebook had copied Big Tobacco's playbook.
The research on teen mental health wasn't the surprising part. Facebook holds a vast amount of data about its policy enforcement and recommendation algorithms that outside researchers cannot access, and the documents the Journal obtained appear to contain exactly that kind of information. One article revealed that Facebook's XCheck system gives high-profile users white-glove treatment, allowing them to get away with flagrant violations of its policies. Another reported that internal researchers found changes to the News Feed algorithm had inadvertently promoted misinformation, toxicity, and violent content, and that Mark Zuckerberg was unwilling to fix the problem. Yet another laid out horrifying details about Facebook's insufficient investment in platform safety outside the US, a decision potentially affecting 90 percent of its nearly 3 billion users.
That is the kind of internal research that offers genuinely new insight into Facebook's impact on the world. Not so the teen mental health research. The documents, which the Journal published the night before the hearing, are not based on data only Facebook has access to. The company simply asked teens how they felt its product affected them, something anyone can do and many have. Nor are the findings especially surprising. The Journal's headline rests on troubling statistics, such as that one-third of teenage girls with body image problems said Instagram made those issues worse. But the top-line finding of one study was that most teens believe Instagram improves their mental health, and subjective accounts of people's experiences can be unreliable either way. Many of the teens surveyed already knew Instagram was bad for them. As WIRED's Robbie Gonzalez noted in 2018, even big-picture correlations between Instagram use and mental health outcomes don't prove causality.