If a search term or phrase itself promotes or encourages self-harm, results are hidden on the photo-sharing site. For other search terms that aren't inherently violating, Pinterest shows a message of support. The company wouldn't say how many terms have been blocked.

Meta, meanwhile, is balancing concerns about child safety with young people's freedom of expression. Of the content Molly saw, two posts would have violated the company's policies. Meta's head of health and well-being policy said last week that it is important to give people that voice if they are struggling with suicidal thoughts. The Russell family's lawyer asked whether she agreed that the content seen by Molly, and reviewed by the court, was not safe.

There are major differences between the two platforms, according to researchers. Samuel Woolley is the program director of the propaganda research lab at the University of Texas at Austin. Running up against free speech, he says, is more of a concern for Facebook and Instagram.

Pinterest hasn't always operated this way. According to the inquest testimony, the company's earlier guidance was to use lighter content moderation when in doubt. That changed after the 2016 US presidential election and Molly's death, when the platform began banning entire topics that didn't fit with its mission, such as vaccines and conspiracy theories.

That is in stark contrast to other social media sites. According to Woolley, Meta's platforms are guided by a desire to exist as infrastructural information tools, akin to the telephone or AT&T, rather than as social media companies. Mark Zuckerberg argued in 2020 that Facebook shouldn't be the arbiter of truth.

There were also differences in how transparent the two platforms were. Varney says that Pinterest provided material about Molly's activity on Pinterest in one go, including not just the pins that Molly saved but also those she clicked on and scrolled over. Meta, she says, did not give the court that level of detail. In the six months before she died, Molly was recommended 30 accounts with names that referred to sad or depressing themes; Meta redacted the names of those accounts, citing the privacy of its users.

Varney agrees that the platforms have improved since Molly's death. The level of graphic material, she says, is not what it was five years ago, and Meta did not prohibit graphic images of self-harm and suicide until 2019.