TikTok considers monetization and creator retention in judging its recommendation algorithm


TikTok's video algorithm is designed to keep users watching and coming back, according to a leaked copy of an internal TikTok document reviewed by The New York Times. The document also reveals two considerations that may not be obvious to viewers: when building a video feed, the app weighs retaining creators and ensuring they make money.

According to the Times, TikTok has four main objectives: user value, long-term user value, creator value, and platform value. One way this plays out is in prioritizing a variety of content over a single topic.

The document states that if a user likes a certain kind of video but the app keeps pushing the same kind to them, they would quickly get bored and close the app. Instead, the recommendation system might surface something new.
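As a rough illustration of that kind of diversification (a generic technique, not TikTok's confirmed method), a recommender can occasionally surface a video from outside a user's dominant topic instead of always serving the top-ranked one:

```python
import random

# Minimal sketch of topic diversification. Assumes a hypothetical
# recommender that normally serves the highest-scored video; with some
# probability it swaps in a video from outside the user's dominant
# topic, to avoid the boredom problem the document describes.
def pick_next_video(ranked_videos, dominant_topic, explore_rate=0.2):
    """ranked_videos: list of (video_id, topic), sorted by predicted score."""
    if random.random() < explore_rate:
        # Try to surface something outside the user's usual topic.
        for video_id, topic in ranked_videos:
            if topic != dominant_topic:
                return video_id
    # Otherwise fall back to the top-ranked video.
    return ranked_videos[0][0]
```

The `explore_rate` here is an invented parameter; the leaked document doesn't describe how often TikTok breaks from a user's established interests.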

The person who leaked the document was reportedly worried about the kinds of videos the app pushes users toward.

The document presents a simplified version of TikTok's formula for predicting what people will and won't like. It breaks down into likes, comments, watch time on a video, and whether a video was played. Some variables in the equation are not spelled out, but it appears that TikTok weights the different interaction types so that some count for more than others.
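Though the exact weights aren't published, the shape of such an equation is simple enough to sketch. The following is a minimal illustration, assuming hypothetical signal names and weight values that are not from the document:

```python
# Illustrative sketch of a weighted video score in the shape the
# document describes: predicted engagement signals multiplied by
# per-signal weights and summed. All names and numbers here are
# assumptions, not figures from the leaked document.

def predicted_score(p_like, p_comment, expected_playtime, p_play, weights):
    """Combine predicted engagement signals into a single ranking score."""
    return (weights["like"] * p_like
            + weights["comment"] * p_comment
            + weights["playtime"] * expected_playtime
            + weights["play"] * p_play)

# Hypothetical weights; the document reportedly doesn't spell these out.
weights = {"like": 1.0, "comment": 2.0, "playtime": 0.5, "play": 0.25}

score = predicted_score(p_like=0.9, p_comment=0.1,
                        expected_playtime=12.0, p_play=0.95,
                        weights=weights)
```

If the weights really do differ per interaction type, a comment could move a video's score far more than a like, which would explain why some lightly liked but heavily discussed videos spread widely.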

A flow chart from the document, recreated by the Times, shows that TikTok puts a lot of emphasis on creators when judging the value of its For You feed. It shows TikTok weighing "creation quality," which is judged by publish rate, creator retention, and creator monetization. It is not clear exactly how TikTok judges creator retention and monetization, but keeping creators on the platform appears to be a real consideration in what the For You feed shows. TikTok spokesperson Jamie Favazza told The Verge that creators making money isn't an input to the algorithm but an outcome of TikTok optimizing for user satisfaction.
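As a rough sketch of how such signals might feed a single "creation quality" figure: the three signal names below come from the recreated flow chart, but the linear combination, normalization, and weights are purely illustrative assumptions.

```python
from dataclasses import dataclass

# Sketch of combining the three "creation quality" signals named in the
# Times' recreated flow chart: publish rate, creator retention, and
# creator monetization. How TikTok actually judges or combines these is
# not described in the document; this weighted sum is an assumption.
@dataclass
class CreatorSignals:
    publish_rate: float   # e.g., videos published per week, normalized to [0, 1]
    retention: float      # e.g., estimated likelihood the creator keeps posting
    monetization: float   # e.g., normalized creator-earnings signal

def creation_quality(s, w_publish=0.4, w_retention=0.4, w_monetization=0.2):
    return (w_publish * s.publish_rate
            + w_retention * s.retention
            + w_monetization * s.monetization)

quality = creation_quality(CreatorSignals(0.7, 0.9, 0.3))
```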

TikTok hasn't been completely opaque about this in the past. The company gave The Verge a look inside its transparency and accountability center last year, where it spoke to concerns about issues like filter bubbles.

The report's insights don't make up for concerns about the app driving users toward problematic content, though. The document was leaked by a TikTok employee who was concerned about the app pushing users toward content related to self-harm. Reporters have previously seen the app present user-generated content promoting eating disorders and discussing or showing self-harm. Because the app is so finely tailored to keep users tuning in with content similar to videos they have already watched, it is easy to see how quickly the feed could become problematic if not properly moderated.

Favazza said TikTok considers a range of engagement signals when determining what to show. She said the company continues to invest in new ways to personalize content preferences, automatically skips videos that aren't relevant or age-appropriate, and removes content that violates its Community Guidelines.

The leak provides a fascinating insight into how TikTok decides what to show you and why it works the way it does, and the details are laid out in terms that are easy to understand. It suggests that major social platforms could give the public a sense of why they're seeing what they're seeing by breaking down their complicated algorithms into clear objectives.