TikTok is rolling out new moderation tools intended to make the short-form video platform safer for younger users. The first version of its system for restricting certain types of content from being viewed by teens is due to launch in the near future.
TikTok says some content in the app may reflect personal experiences or real-world events intended for older audiences. Content Levels is designed to classify such content and assign it a maturity score that will keep it from being seen by users under the age of 17.
Initially, TikTok says its Trust and Safety managers will assign scores to videos that are gaining popularity or have been reported by users, and the system will be expanded over time to offer filters for the whole community. Eventually, much like the rating systems for movies, TV shows, and video games, it will allow creators to classify their own content.
A separate feature will soon give users greater control over what appears in their For You and Following feeds: they will be able to designate specific words or phrases they don't want to see.
The feature could also be used to stop TikTok's algorithm from surfacing topics a user simply isn't interested in. Someone going vegan, for example, could block dairy or meat recipes from appearing in their feed.
The new moderation features follow a congressional inquiry into whether TikTok promotes harmful eating disorder content to younger users, as well as lawsuits filed against TikTok by parents whose children died after attempting dangerous challenges on the platform.