WSJ’s deep dive into eating disorder rabbit holes on TikTok explains a sudden policy change


A new report from the Wall Street Journal details the personal experiences of young girls who were sent down rabbit holes of extreme weight loss challenges, purging techniques, and deadly diets on TikTok. The WSJ ran an experiment to see whether TikTok's video recommendation system could push users toward harmful content, and its findings may explain a sudden change TikTok made to that system.

The WSJ created over 100 accounts that browsed the app with little human intervention; 12 of them were bots registered as 13-year-olds that lingered on videos about weight loss, alcohol, and gambling. A chart included in the report shows that as soon as one of the bots stopped watching gambling-related videos, the recommendation algorithm adjusted, quickly increasing the number of weight loss videos the bot was shown.


By the end of the experiment, the WSJ found that of the 255,000 videos the bots watched, 32,700 contained a description or metadata matching a list of hundreds of keywords related to weight loss. Another 11,615 had text descriptions with keywords related to eating disorders.

A number of these videos used altered spellings of those keywords to avoid being flagged by TikTok. After the WSJ alerted the platform to a sample of 2,960 eating disorder-related videos, it says it's unclear whether it was TikTok or the videos' creators who took them down.

On the same day the WSJ published its report, TikTok announced that it's working on new ways to stop these dangerous rabbit holes from forming. It's possible TikTok rolled out the update ahead of the report's publication because the WSJ had contacted the company for a statement about its upcoming story.

TikTok acknowledges in its post that watching videos about extreme dieting and fitness isn't always healthy. It says it's working on a way to recognize when its recommendation system is unintentionally serving up videos that may not violate TikTok's policies but could be harmful if consumed in excess. The platform is also testing a tool that will let users stop certain videos from showing up on their For You page.

Jamie Favazza, a spokesperson for TikTok, said in a statement that the experiment doesn't reflect the experience most people have on the platform, and that while TikTok doesn't allow content that glorifies eating disorders, it does permit educational or recovery-oriented content because such videos can help people see that there is hope. The National Eating Disorder Association Helpline is also accessible through the TikTok app.

If this situation sounds familiar, that's because it has played out before. After the leak of the Facebook Papers, a collection of revealing internal Facebook documents, the social network's photo-sharing service, Instagram, moved quickly to patch the holes in its sinking ship.

According to the documents, Facebook had conducted its own research into Instagram's impact on teens and found that the app could cause mental health problems and worsen young girls' body image. About a month later, Instagram announced that it would introduce a feature to nudge teens away from potentially harmful content, along with a "Take a Break" feature that prompts users to close the app once they've spent 10, 20, or 30 minutes on the platform.