TikTok is developing new formats to classify and restrict its content by age

TikTok is working on new ways to classify its content and restrict it to certain audiences based on its themes and characteristics, with the aim of making it harder for minors to view videos intended for adult users.

The app seeks to filter and organize videos based on “the maturity of the content” and the subject areas users are most comfortable with, according to TikTok’s Head of Issue Policy, Tracy Elizabeth, in statements reported by Engadget.

The executive indicated that, once this classification system is implemented, content identified as containing adult themes could be restricted by users’ age, to prevent teenagers from accessing harmful material.

Conversely, for content with “less mature” or more general themes not specifically aimed at adults, users will be able to decide for themselves whether to view or skip it.

Elizabeth noted that the goal is to let users choose “the category with which they feel most comfortable” and, although she would only say the system is “in the innovation phase”, it could resemble the rating schemes used for movies or video games.

These clarifications follow a recent statement from the platform detailing the policy updates TikTok is making to promote safety and well-being on its service.

On the point raised by Elizabeth, the platform said that when it finds content that may not be appropriate for all audiences, it does everything possible to remove it from the recommendation system.

“Transparency with our community is important to us and these updates clarify or expand the types of behavior and content that we will remove from our platform or that will not appear in the ‘For You’ recommendations feed,” says Cormac Keenan, the platform’s head of Trust and Safety and author of the statement.

These updates, which will roll out in the coming weeks, include strengthening its policies on misinformation and dangerous challenges to help prevent such content from spreading further on the platform.

Although this aspect was part of the revision of its usage policies, for which it developed a new resource for its Safety Center in November, Keenan stressed that it will now be highlighted in a section separate from the suicide and self-harm category so that the community “can become familiar with these rules”.

In addition, he encouraged the platform’s teenage community to help keep it safe by evaluating online challenges in four steps: ‘Stop’ (take a break before continuing to use TikTok), ‘Think’ (consider whether a challenge is safe, harmful or real), ‘Decide’ (choose whether or not to attempt it based on the risk involved) and ‘Act’ (report harmful or misleading challenges to the platform and help stop their spread).

To encourage this communication with the platform, TikTok will launch next week a series of videos on the ‘Discover’ page asking users to follow these four guidelines, accompanied by the hashtag #SaferTogether.

With the update to its user protection policies, TikTok will also broaden its focus on eating disorders. In addition to removing content that promotes eating disorders, it will begin removing the promotion of unhealthy practices, with help from subject matter experts, researchers and physicians.

“Our goal is to recognize more symptoms, such as excessive exercise or intermittent fasting, that are often not recognized as signs of a potential problem,” the statement said.

With this new revision of its policies, TikTok also clarified the different types of hateful ideologies prohibited on the platform, such as ‘deadnaming’ (referring to a transgender person by the name they were given at birth), ‘misgendering’ (using words that do not correspond to a person’s gender identity) and misogyny, as well as content that supports conversion therapy programs.