Logan Paul's video in which he filmed the body of a suicide victim continues to live on YouTube, even though the company has attempted to purge complete re-uploads from its site. A search this morning surfaced multiple re-uploads, including a few complete videos and many stitched-together reaction posts that incorporate clips from the original. Most of these users have blacked out, skipped over, or censored the victim's body, which seems to have allowed their uploads to avoid being taken down.
YouTube does remove full copies of the video when it notices them. A couple of re-uploads trended earlier today, one of which showed the body. That full re-upload was taken down, although the user, Skevarox, uploaded it again. This time, it included a censored screen and accompanying text.
Although YouTube may want to erase the Paul nightmare from its site, YouTubers know the company's rules and can subvert the system. With the body censored, YouTube either has to find a new violation of its guidelines or treat re-uploads as a copyright issue — but Paul doesn't seem to care if people repost his work.
Paul could have filed a Content ID claim to get his video removed from other channels. Every new upload to YouTube is checked against the Content ID database, and if a video matches, the rights holder can decide to take it down (or monetize it).
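Content ID's actual fingerprinting is proprietary and works on perceptual audio and video features, but the matching idea it describes can be sketched in a few lines. The toy below (all names hypothetical, and raw byte hashing standing in for real media fingerprints) hashes overlapping windows of a stream and flags an upload when enough of its windows appear in the reference, which is roughly why straight re-uploads get caught while heavily edited clips can slip through.

```python
# Toy illustration of fingerprint matching, loosely analogous to how a
# system like Content ID flags re-uploads. Real Content ID fingerprints
# media perceptually; hashing raw byte windows here is only a stand-in.
import hashlib


def fingerprint(data: bytes, window: int = 8) -> set:
    """Hash every overlapping window of the stream into a set of marks."""
    return {
        hashlib.sha256(data[i:i + window]).hexdigest()
        for i in range(len(data) - window + 1)
    }


def likely_match(original: bytes, upload: bytes, threshold: float = 0.5) -> bool:
    """Flag the upload if enough of its windows also occur in the original."""
    ref, probe = fingerprint(original), fingerprint(upload)
    if not probe:
        return False
    return len(ref & probe) / len(probe) >= threshold


source = b"original video byte stream, imagine frames here"
full_copy = source                       # straight re-upload
edited = b"REACTION! " + source[10:]     # trimmed, re-cut version
unrelated = b"completely different footage altogether"

print(likely_match(source, full_copy))   # exact copy is flagged
print(likely_match(source, edited))      # partial copy can still be flagged
print(likely_match(source, unrelated))   # unrelated content is not
```

The threshold is the interesting knob: set it high and edited or censored versions sail through, set it low and reaction videos quoting short clips get swept up, which mirrors the tension the article describes.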
It’s possible that Paul doesn’t have access to Content ID or that tweaked versions of the video are avoiding detection. But so far, YouTube clearly isn’t treating this as an ownership problem: takedowns of full re-uploads that show the body cite a violation of YouTube's policies, not copyright.
YouTubers are also making extensive use of clips from the video in their own reaction posts. Taking those down would raise fair use questions, though: if copyright claims came up, YouTubers could argue they’re commenting on the material, much like a news clip does.
So, if Paul doesn't care what happens to his video after deleting it from his own channel, the decision is left to YouTube. Which videos violate its policies, specifically its commitment to removing "violent or graphic content"?
YouTube issued a statement yesterday about the incident and said: "YouTube prohibits violent or gory content posted in a shocking, sensational, or disrespectful manner. If a video is graphic, it can only remain on the site when supported by appropriate educational or documentary information and in some cases it will be age-gated."
YouTubers are aware of how this works. They know how to stay within the company's policies and still profit from a newly popular search term. The same situation plays out with every controversial YouTube upload. Re-uploads of PewDiePie's video, in which he paid people to hold up a sign reading "Death to All Jews," are still everywhere, for example; those videos simply censor the word "Jew."
The content ecosystem is well established at this point. Once a controversial video is online, enterprising YouTubers capture it, remove whatever part actually violated YouTube's policies, and keep the video, and the conversation around it, alive. There's little YouTube can do to change that.