YouTube will temporarily restrict channels that post videos containing widespread misinformation about 2020 election results — including the channel of President Donald Trump.
Instead of simply removing videos that spread misinformation, YouTube will now issue a strike to any channel that posts videos with false claims about the election, temporarily preventing it from uploading new videos. This includes Trump’s channel. Channels that earn one strike are restricted from posting for one week.
In December, YouTube issued a new policy prohibiting any content designed to spread misinformation about the outcome of the 2020 election by promoting false theories about fraudulent votes or other unverified claims. Since then, the company has removed thousands of videos, according to a company spokesperson.
Prior to yesterday’s events, a grace period was in place allowing YouTube to remove videos without applying strikes to channels. This often happens when YouTube institutes new policies, and it’s designed as an effort to be fair to creators while they adjust to the new rules. The grace period was set to expire on January 21st, following Inauguration Day, according to YouTube.
YouTube’s teams made the decision following “the extraordinary events that transpired yesterday, and given that the election results have been certified,” according to a YouTube spokesperson. The company removed one of Trump’s videos that addressed a mob attack on the Capitol and also contained widespread misinformation about the election results. Additionally, any channel that receives three strikes within a 90-day period will be terminated in accordance with YouTube policies.
This means that channels that continue spreading misinformation about the election results will have strikes applied. For example, OANN, a right-wing news organization that has amplified the president’s false claims that he won the election, could receive a strike on top of having videos deleted if it continues uploading similar content.
Over the last 24 hours, several companies have taken stricter stances against the president’s content. Twitter suspended Trump’s account for 12 hours, threatening to ban the president outright if he posted another tweet violating the company’s policies or inciting violence. Facebook initially prevented Trump from posting to Facebook and Instagram for 24 hours, before CEO Mark Zuckerberg announced even stricter measures on Thursday morning: his accounts are suspended indefinitely, and for at least the next two weeks.
Update January 7th, 3:07pm ET: The story was updated to include details about how long the suspension lasts.