TikTok users often express awe or dismay at the seemingly uncanny accuracy of the app’s recommendation algorithm. The Wall Street Journal published a video today that dives into how TikTok customizes your feed.
WSJ investigators ran an experiment in which they created bot accounts with assigned interests. The bots “watched” videos on TikTok, pausing on or replaying any that featured images or hashtags relevant to those interests. The WSJ team reviewed the results with Guillaume Chaslot, an algorithm expert who previously worked at YouTube.
The findings line up with TikTok’s explanations of how its recommendations work. TikTok has previously said the For You feed is personalized based on the kinds of videos you interact with, how you interact with them, details about the videos themselves, and account settings like language and location.
If you hesitate on a weird video that caught you off guard, the algorithm has no way to differentiate that pause from genuine interest in content you actually like and want to see more of. That’s how some people end up with For You recommendations that don’t seem to reflect their interests at all.
Though humans have more diverse tastes than bots, the experiment demonstrates how quickly a user can be pushed toward the far reaches of TikTok, including potentially harmful content. According to the WSJ, TikTok identified the interests of some of the bots in as little as 40 minutes. One bot fell into a rabbit hole of depressive videos, while another ended up watching videos about election conspiracies. As Will Oremus points out on Twitter, though, algorithmic rabbit holes can also lead people to positive content.
The video has a lot of details and visualizations, so it’s a good way to wrap your head around the “magic” of how TikTok works. Watch it above or on the WSJ’s site, though be warned that it includes clips from TikToks referencing depression, suicide, and eating disorders.