
TikTok algorithm makes room for personal choice; age restrictions; less depressing content

The TikTok algorithm has been at once the key to the video streaming app's success and the biggest criticism leveled against it. But the company is now offering users the ability to filter out topics they don't want to see.

The company is also introducing new automated moderation tools, including one that (finally!) applies age restrictions to videos not suitable for children, and another that aims to address the “rabbit hole” problem of users being shown a succession of depressing or other potentially harmful videos …

The TikTok algorithm

TikTok differs from conventional video streaming apps like YouTube in that its algorithm has much more control over what you see. Instead of choosing the videos you want to watch, you just pick some initial interests, and from there the algorithm takes over.

TikTok determines your tastes using a range of signals, including which videos you watch all the way through, which you like and share, and which creators you follow.
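
To make that concrete, here is a minimal sketch of how signal-weighted interest scoring of this kind could work. The signal names, weights, and data structures are illustrative assumptions; TikTok has not published its actual model.

```python
# Hypothetical sketch of signal-weighted interest scoring.
# All signal names and weights are illustrative assumptions,
# not TikTok's actual implementation.

# Assumed relative weights: watching a video to the end (or rewatching it)
# counts for much more than a like or a share.
SIGNAL_WEIGHTS = {
    "watched_to_end": 3.0,
    "rewatch": 4.0,
    "like": 1.0,
    "share": 2.0,
    "follow_creator": 2.5,
}

def update_interest_profile(profile: dict[str, float],
                            video_topics: list[str],
                            signal: str) -> None:
    """Bump each of the video's topics by the weight of the observed signal."""
    weight = SIGNAL_WEIGHTS.get(signal, 0.0)
    for topic in video_topics:
        profile[topic] = profile.get(topic, 0.0) + weight

profile: dict[str, float] = {}
update_interest_profile(profile, ["#mentalhealth", "#depression"], "rewatch")
update_interest_profile(profile, ["#diy"], "like")

# The highest-scoring topics dominate what the feed serves next; this is
# how the "silo" effect described below can emerge.
top_topics = sorted(profile, key=profile.get, reverse=True)
```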

This has proved an extremely successful approach for the company, measured by both app downloads and usage, but has also been heavily criticized. One key criticism has been that it quickly places users into “silos,” where they only ever see a tiny subset of content.

A study conducted last year showed that this can be actively dangerous.

One bot was programmed with sadness and depression as “interests.” Less than three minutes into using TikTok, at its 15th video, [bot] kentucky_96 pauses on this [sad video about losing people from your life]. Kentucky_96 watches the 35-second video twice. Here TikTok gets its first inkling that perhaps the new user is feeling down lately […]

The user instead pauses on one about mental health, then quickly swipes past videos about missing an ex, advice about moving on, and how to hold a lover’s interest. But kentucky_96 lingers over this video containing the hashtag #depression, and these videos about suffering from anxiety.

Some 224 videos into the bot's overall journey, or about 36 minutes of total watch time, TikTok's understanding of kentucky_96 takes shape. Videos about depression and mental health struggles outnumber those about relationships and breakups. From here on, kentucky_96's feed is a deluge of depressive content. 93% of videos shown to the account are about sadness or depression.

TikTok also appears to be extremely poor at filtering out specifically dangerous content, like a “blackout challenge” said to be responsible for the deaths of seven children.

Keyword filters

For the first time, TikTok is offering users the chance to filter out certain types of content by blacklisting specific words and hashtags.

Viewers can [already] use our “not interested” feature to automatically skip videos from a creator or that use the same audio. To further empower viewers with ways to customize their viewing experience, we’re rolling out a tool people can use to automatically filter out videos with words or hashtags they don’t want to see from their For You or Following feeds – whether because you’ve just finished a home project and no longer want DIY tutorials or if you want to see fewer dairy or meat recipes as you move to more plant-based meals. This feature will be available to everyone in the coming weeks.
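
As a rough illustration, the announced filter amounts to dropping any candidate video whose caption or hashtags match a user's blocklist. The field names and matching rules below are assumptions for the sake of the sketch, not TikTok's API.

```python
# Minimal sketch of a word/hashtag blocklist filter.
# The video fields and matching rules are illustrative assumptions.

def passes_filter(video: dict, blocked_terms: set[str]) -> bool:
    """Return False if any blocked word or hashtag appears on the video."""
    words = set(video.get("caption", "").lower().split())
    tags = {tag.lower().lstrip("#") for tag in video.get("hashtags", [])}
    blocked = {term.lower().lstrip("#") for term in blocked_terms}
    return blocked.isdisjoint(words | tags)

candidate_videos = [
    {"caption": "easy weeknight pasta", "hashtags": ["#recipe", "#dairy"]},
    {"caption": "floating shelf build, part 3", "hashtags": ["#diy"]},
]

# A user done with home projects and cutting back on dairy, per TikTok's example:
blocked_terms = {"#diy", "dairy"}
feed = [v for v in candidate_videos if passes_filter(v, blocked_terms)]
# feed is now empty: both sample videos matched a blocked term.
```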

Age-restricted videos

TikTok is also finally introducing age restrictions on videos not appropriate for children. Previously, the app warned younger users that a video might not be suitable but still let them watch it; now it blocks them outright.

In the coming weeks, we’ll begin to introduce an early version to help prevent content with overtly mature themes from reaching audiences between ages 13-17. When we detect that a video contains mature or complex themes, for example, fictional scenes that may be too frightening or intense for younger audiences, a maturity score will be allocated to the video to help prevent those under 18 from viewing it across the TikTok experience.
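
In other words, each video gets a score from an automated classifier, and under-18 accounts are gated on a threshold. Here is a minimal sketch under those assumptions; the threshold value and the toy classifier are placeholders, since TikTok has not published how its moderation model scores videos.

```python
# Hypothetical sketch of a maturity-score gate.
# The threshold and the toy classifier are assumptions for illustration.
MATURE_THRESHOLD = 0.7

def classify_maturity(video: dict) -> float:
    """Stand-in for an automated moderation model; returns a 0.0-1.0 score."""
    mature_tags = {"#horror", "#gore", "#violence"}
    hits = sum(1 for tag in video.get("hashtags", []) if tag.lower() in mature_tags)
    return min(1.0, 0.5 * hits)

def visible_to(video: dict, viewer_age: int) -> bool:
    """Adults see everything; minors only see videos below the threshold."""
    if viewer_age >= 18:
        return True
    return classify_maturity(video) < MATURE_THRESHOLD

scary_clip = {"hashtags": ["#horror", "#gore"]}
assert visible_to(scary_clip, 25)       # adults are unaffected
assert not visible_to(scary_clip, 15)   # score 1.0 >= 0.7, blocked for minors
```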

TikTok algorithm will reduce potentially harmful content

The TikTok algorithm is also being trained to address the rabbit-hole problem of a stream of potentially harmful content.

Last year we began testing ways to avoid recommending a series of similar content on topics that may be fine as a single video but potentially problematic if viewed repeatedly, such as topics related to dieting, extreme fitness, sadness, and other well-being topics. We’ve also been testing ways to recognize if our system may inadvertently be recommending a narrower range of content to a viewer.

As a result of our tests and iteration in the US, we’ve improved the viewing experience so viewers now see fewer videos about these topics at a time.
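
One simple way to implement the kind of dispersal TikTok describes is to rank candidates by relevance as usual, but cap how often a sensitive topic can repeat within a session. The topic labels, scores, and cap below are illustrative assumptions, not TikTok's disclosed method.

```python
# Sketch of topic dispersal: serve videos in relevance order, but skip
# videos on sensitive topics once a per-session cap is hit.
from collections import Counter

MAX_PER_TOPIC = 2  # assumed per-session cap for sensitive topics
SENSITIVE_TOPICS = {"dieting", "extreme_fitness", "sadness"}

def disperse(candidates: list[dict]) -> list[dict]:
    """Build a feed in relevance order, capping repeats of sensitive topics."""
    shown: Counter = Counter()
    feed = []
    for video in sorted(candidates, key=lambda v: v["score"], reverse=True):
        topic = video["topic"]
        if topic in SENSITIVE_TOPICS and shown[topic] >= MAX_PER_TOPIC:
            continue  # enough of this topic for one session
        shown[topic] += 1
        feed.append(video)
    return feed
```

Under a scheme like this, an account like kentucky_96 above would still see the sad videos it lingers on, just no longer as 93% of its feed.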

Photo: Florian Schmetz/Unsplash


