TikTok’s much-vaunted video algorithm is designed around two things: getting users to stick around and getting users to come back. That’s according to a report in The New York Times, which reviewed a leaked copy of an internal TikTok document summarizing how the system works. The report offers a rare look into one of the most discussed algorithms in tech right now, and it reveals some considerations — like retaining creators and ensuring they make money — that may not be obvious choices when building a video feed meant to keep viewers tuned in.
To keep users watching and coming back, TikTok considers four main objectives, according to the Times: user value, long-term user value, creator value, and platform value. One way that plays out is the algorithm prioritizing a diversity of content rather than overwhelming users with one single topic they might love.
“If a user likes a certain kind of video, but the app continues to push the same kind to him, he would quickly get bored and close the app,” the document reads, according to the Times. To avoid that, the app might show a “forced recommendation” to present something new.
The document presents what is supposed to be a simplified version of TikTok's formula for predicting what people will like and deciding what to play next. It roughly breaks down to a combination of likes, comments, watch time on a video, and whether a video was played, according to the report. Some variables in the equation aren't spelled out, but my read is that TikTok likely weights the different interactions so that some count for more than others.
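Stripped to its shape, that description reads like a weighted sum of predicted interactions. Here's a minimal Python sketch of what such a score might look like; the weights, function names, and example numbers are my own illustrative assumptions, not values from the leaked document:

```python
# A hypothetical weighted engagement score along the lines the document
# describes: likes, comments, watch time, and whether a video is played,
# each multiplied by a weight. The weights below are made up.

def predicted_score(p_like, p_comment, expected_playtime, p_play,
                    w_like=1.0, w_comment=2.0, w_playtime=0.5, w_play=0.25):
    """Combine predicted interactions into one ranking score.

    p_like, p_comment, p_play: predicted probabilities (0-1) that a given
    user will like, comment on, or play the video.
    expected_playtime: predicted watch time in seconds.
    The w_* weights decide how much each signal counts; TikTok's actual
    weights are not spelled out in the document.
    """
    return (w_like * p_like
            + w_comment * p_comment
            + w_playtime * expected_playtime
            + w_play * p_play)

# Candidate videos would then be ranked by score, highest first.
candidates = {
    "video_a": predicted_score(0.9, 0.05, 12.0, 0.95),
    "video_b": predicted_score(0.4, 0.30, 30.0, 0.99),
}
ranked = sorted(candidates, key=candidates.get, reverse=True)
```

Even in this toy form, you can see how watch time can dominate the score once it's weighted in, which squares with the report's framing that the system is built above all to keep people watching.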
TikTok also puts a real focus on creators when judging the value of its For You feed, according to a flow chart from the document that was recreated by the Times. It shows TikTok considering “creation quality,” which is judged by publish rate, creator retention, and creator monetization. There isn’t further detail on how TikTok judges creator retention and monetization, but it would appear to indicate that whether creators are successful is a real consideration when determining the “quality of videos” in the For You feed. That said, whether creators make money isn’t an input to the algorithm, TikTok spokesperson Jamie Favazza tells The Verge. Instead, it’s an outcome of TikTok’s optimization for user satisfaction.
For its part, TikTok has not been entirely opaque about all of this in the past. In blog posts, the company has detailed the basics of how its feed works — comments and what accounts you follow can impact recommendations — and it gave The Verge a look inside its “Transparency and Accountability Center” last year, which spoke to the company’s concerns about issues like filter bubbles.
Today’s report shouldn’t dispel concerns about filter bubbles or the app driving users toward problematic content. In fact, the Times says the document was leaked by a TikTok employee who was concerned about the app leading to self-harm. In the past, reporters have spotted the app presenting user-generated content promoting eating disorders and discussing or showing self-harm. Because the app is so finely tuned to keep users hooked on content similar to videos they’ve already watched, it’s easy to see how quickly the feed could become problematic if not properly moderated.
Favazza says TikTok considers “a range of engagement signals” when determining what to show people. “We continue to invest in new ways to customize content preferences, automatically skip videos that aren’t relevant or age-appropriate, and remove violations of our Community Guidelines,” she said.
Mostly, though, the leak provides fascinating insight into one of the most-discussed black boxes on the internet: how TikTok decides what to show you, video after video. The details presented here, like the recommendation formula, are “highly simplified,” the document says. But it suggests that the complicated ways major social platforms craft their algorithms could be broken down into clear objectives and shared with the public, providing at least some sense of why we’re seeing whatever it is we see.