TikTok removed over 340,000 videos in the US for breaking the platform’s rules on election misinformation, manipulation, or disinformation, according to a Wednesday transparency report covering content from the second half of last year.
A few months before the 2020 presidential election, TikTok announced that it would partner with fact-checking organizations as part of a broader effort to combat election and COVID-19 misinformation. At the time of the announcement, TikTok was under immense pressure from the Trump administration and lawmakers over its alleged ties to the Chinese government. Microsoft and Oracle were among the companies bidding to acquire TikTok from ByteDance, its Chinese parent company. A final deal between Oracle and TikTok is still on hold.
“TikTok is a diverse, global community fueled by creative expression. We work to maintain an environment where everyone feels safe to create, find community, and be entertained,” Michael Beckerman, TikTok’s vice president and head of US public policy, said in a statement Wednesday. “We are committed to being transparent about how we keep our platform safe, because it helps build trust and understanding with our community.”
Beyond the videos deleted for breaking election rules, TikTok announced Wednesday that over 441,000 videos were removed from the platform’s recommendation algorithm, or the For You page, for spreading misinformation. Shortly before the 2020 election, TikTok rolled out an election guide powered by BallotReady, a voting information tool. That guide was available on the Discover page, as well as on election-related videos, hashtags, and political accounts, and it was visited 18 million times.
TikTok also removed 1.75 million accounts used “for automation” during the 2020 US election. “While it’s not known if any of the accounts were used specifically to amplify election related content, it was important to remove this set of accounts to protect the platform at this critical time,” the transparency report reads.