The response to the deadly mass shooting in Christchurch, New Zealand

On Friday, a man released a 73-page racist manifesto and then killed 50 people at two mosques in Christchurch, New Zealand, livestreaming the attack in a video that subsequently spread across various social media platforms.

The attack was designed to go viral, laced with references meant to quickly gain attention online. Since then, various platforms have taken steps to limit the spread of the alleged shooter’s video, and the attack has raised questions about the role social media plays in spreading hateful messages.

Follow along for all of the updates in the aftermath of the attack.

  • Mar 30, 2019

    Andrew Liptak

    Facebook is ‘exploring restrictions’ for live video after Christchurch attack

    facebook stock art
    Illustration by Alex Castro / The Verge

    Earlier today, the New Zealand Herald published a letter from Facebook COO Sheryl Sandberg addressing the deadly terror attack in Christchurch two weeks ago. In it, she lays out three steps the company is taking, including that it’s “exploring restrictions” for live video.

    She described the attack as “an act of pure evil,” said the company is “committed to reviewing what happened,” and noted that it is working with the country’s authorities. In the aftermath of the attack, Facebook says it removed 1.5 million videos of the attack around the world, with 1.2 million blocked “at upload.” Sandberg writes that while Facebook moved quickly to remove the video and the perpetrator’s account, the company could do more. We’ve reached out to Facebook for clarification on the letter, and will update this post if we hear back.

    Read Article >
  • Mar 27, 2019

    Makena Kelly

    Facebook moves to ban white nationalist and separatist content on its platform

    Illustration by James Bareham / The Verge

    In a dramatic shift in policy, Facebook announced today that it is banning white nationalism and separatism on its platform. The decision, which was first reported by Motherboard, comes just under two weeks after a white supremacist killed 50 people in Christchurch, New Zealand.

    In a blog post, Facebook said that any “praise, support, and representation of white nationalism and separatism” would be banned when the new policy goes into effect next week. “It’s clear that these concepts are deeply linked to organized hate groups and have no place on our services,” the post read.

    Read Article >
  • Mar 21, 2019

    Casey Newton

    Tech platforms should fight Islamophobia the way they fought ISIS

    First Burials Begin For Victims Of Christchurch Mosque Attacks
    Mourners attend the funeral of a victim of the Christchurch terrorist attack at Memorial Park Cemetery on March 20th in Christchurch, New Zealand.
    Photo by Carl Court/Getty Images

    After last week’s horrific terrorist attack in New Zealand, early commentary focused on how the shootings at two Christchurch mosques seemed to be purpose-built for spreading on social media. “A mass shooting of, and for, the internet,” Kevin Roose called it in the New York Times.

    As Roose notes, the alleged killer promoted the attack on Twitter and 8chan, and broadcast it live on Facebook. Facebook took down the original video, but not before it could be copied and widely shared. Over the next 24 hours, users would attempt to upload it to Facebook another 1.5 million times; Facebook removed 1.2 million of those copies at the time of upload. The same thing was happening simultaneously on YouTube, but the company would not share any numbers that might describe the scale of its challenge.

    Read Article >
  • Mar 19, 2019

    Jon Porter

    Facebook says the Christchurch attack live stream was viewed by fewer than 200 people

    Illustration by Alex Castro / The Verge

    Facebook says footage of the Christchurch shooting was viewed fewer than 200 times during its live broadcast, and about 4,000 times in total before it was removed. The company also says it didn’t receive any reports from users about the live stream until 12 minutes after the video had ended.

    Over the 24 hours following the initial broadcast, people attempted to re-upload the video 1.5 million times. The disparity highlights the specific challenges facing the social network, especially when an attack like the one in Christchurch is designed to go viral. Videos of the attack were cross-posted across various social networks, and links to the live stream and a manifesto were posted on 8chan prior to the attack.

    Read Article >
  • Mar 18, 2019

    Makena Kelly

    New Zealand ISPs are blocking sites that do not remove Christchurch shooting video

    Ethernet / Internet (stock)

    Following the Christchurch shooting last week, internet service providers in New Zealand are blocking access to websites that do not respond to, or refuse to comply with, requests to remove reuploads of the shooter’s original live stream.

    According to Bleeping Computer, sites like 4chan, 8chan, LiveLeak, and the file-sharing site Mega have all been blocked by ISPs including Vodafone, Spark, and Vocus. The ISPs appear to be blocking access at the DNS level to sites that do not respond to the takedown requests, but it’s unclear how effective the blocks will be. Like most web-level blocks, the restrictions are easy to circumvent with a VPN or alternative DNS settings, as the sketch after this entry illustrates.

    Read Article >
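    For readers curious what a block like this looks like in practice, here is a minimal sketch using the third-party dnspython library (an assumption on our part; the ISPs haven’t described their tooling, and the hostname and resolver addresses below are purely illustrative). It shows why pointing a device at a different DNS server sidesteps a DNS-level block: only a resolver that participates in the block refuses to answer.

    ```python
    # Minimal sketch: why a DNS-level block is easy to sidestep.
    # Requires the third-party "dnspython" package (pip install dnspython).
    # The hostname and resolver addresses are illustrative placeholders.
    import dns.resolver

    BLOCKED_HOST = "example-blocked-site.com"  # hypothetical hostname
    ISP_RESOLVER = "203.0.113.53"              # placeholder ISP DNS server
    PUBLIC_RESOLVER = "1.1.1.1"                # Cloudflare's public resolver

    def lookup(host: str, nameserver: str) -> list[str]:
        """Resolve `host` through a specific DNS server and return its A records."""
        resolver = dns.resolver.Resolver(configure=False)
        resolver.nameservers = [nameserver]
        try:
            return [record.to_text() for record in resolver.resolve(host, "A")]
        except dns.resolver.NXDOMAIN:
            # A blocking resolver can simply claim the domain doesn't exist.
            return []

    # A resolver participating in the block returns nothing (or a sinkhole
    # address), while a public resolver still returns the real records.
    print("ISP resolver:   ", lookup(BLOCKED_HOST, ISP_RESOLVER))
    print("Public resolver:", lookup(BLOCKED_HOST, PUBLIC_RESOLVER))
    ```

    The same logic explains the VPN workaround: a VPN tunnels DNS queries and traffic past the ISP’s resolver entirely, so the block never comes into play.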
  • Mar 18, 2019

    Colin Lecher

    YouTube took down an ‘unprecedented volume’ of videos after New Zealand shooting

    Illustration by Alex Castro / The Verge

    YouTube dealt with an “unprecedented volume” of videos after last week’s mass shooting in New Zealand, as the platform struggled to remove videos with the footage, YouTube’s chief product officer told The Washington Post.

    Friday’s killings at two mosques in Christchurch were recorded and posted to social media channels around the world as part of a plan that was seemingly designed to spread the footage online. As the footage made its way around the internet, it was reuploaded repeatedly. Yesterday, Facebook said it removed 1.5 million videos of the attack in the first 24 hours after the shooting.

    Read Article >
  • Mar 17, 2019

    Andrew Liptak

    Facebook says that it removed 1.5 million videos of the New Zealand mass shooting

    Illustration by Alex Castro / The Verge

    In the first 24 hours after the deadly mass shooting in New Zealand, Facebook says it removed 1.5 million videos of the attack that were uploaded to its platform, 1.2 million of which were blocked “at upload.”

    The company made the announcement in a tweet, following up on a prior statement that it had been alerted by authorities and had removed the alleged shooter’s Facebook and Instagram accounts. Facebook spokeswoman Mia Garlick says that the company is also “removing all edited versions of the video that do not show graphic content.”

    Read Article >
  • Mar 16, 2019

    Andrew Liptak

    Valve takes down user tributes memorializing the New Zealand shooting suspect

    Illustration by Alex Castro / The Verge

    Valve removed more than 100 tributes to the alleged white supremacist responsible for the mass shootings at two mosques in Christchurch, New Zealand, from Steam profiles, Kotaku reports. In the days after the attack, the site found that numerous users had updated their profiles to include the name or image of the shooter, and in one case, a GIF of the attack.

    Kotaku says that it initially saw 66 profiles paying tribute to the shooter in the aftermath of the attack, which claimed 50 lives; the number later surpassed 100. After Kotaku contacted Valve for comment, the tributes were removed, although users continued to praise the shooter. Kotaku also says that “hundreds of pages continue to nod towards past mass shooters including perpetrators of massacres in Charleston, Isla Vista and Parkland and of the 2011 mass killing in Norway,” using the killers’ names or images. We have reached out to Valve for comment, and will update this post if we hear back.

    Read Article >
  • Mar 16, 2019

    Sean Hollister

    Sky New Zealand yanks Sky Australia after Christchurch footage sparks outrage

    Aftermath Of Mosque Terror Attack Felt In Christchurch
    Photo by Fiona Goodall/Getty Images

    When news broke of the mass shooting in Christchurch, New Zealand, that left at least 49 dead Thursday evening, I started watching Twitter. Two surprising themes stood out: people urging their social media followers to protect themselves from accidentally watching an extremely graphic 17-minute video of the attack, and, separately, New Zealand and Australian locals condemning Sky News Australia for intentionally sharing parts of that same footage with its viewers.

    Now, broadcaster Sky New Zealand has taken the drastic step of pulling Sky News Australia from its broadcast lineup. “We stand in support of our fellow New Zealanders and have made the decision to remove Sky News Australia from our platform until we are confident that the distressing footage from yesterday’s events will not be shared,” Sky NZ wrote in a since-deleted tweet, archived by BuzzFeed.

    Read Article >
  • Mar 15, 2019

    Adi Robertson

    Questions about policing online hate are much bigger than Facebook and YouTube

    Police Raid Property Connected To Christchurch Mosque Terror Attack
    Photo by Dianne Manson/Getty Images

    In the wake of a hate-fueled mass shooting in Christchurch, New Zealand, major web platforms have scrambled to take down a 17-minute video of the attack. Sites like YouTube have applied imperfect technical solutions, trying to draw a line between newsworthy and unacceptable uses of the footage.

    But Facebook, Google, and Twitter aren’t the only places weighing how to handle violent extremism. And traditional moderation doesn’t affect the smaller sites where people are still either promoting the video or praising the shooter. In some ways, these sites pose a tougher problem — and their fate cuts much closer to fundamental questions about how to police the web. After all, for years, people have lauded the internet’s ability to connect people, share information, and route around censorship. With the Christchurch shooting, we’re seeing that phenomenon at its darkest.

    Read Article >
  • Mar 15, 2019

    Cameron Faulkner

    How to turn off autoplay videos on Facebook, Twitter, Reddit, and more

    Photo by Amelia Holowaty Krales / The Verge

    You’ve probably been caught off guard by videos that play automatically on Facebook, Twitter, and other services across the internet. They begin playing as soon as you load a page, or (if they’re more deviously implemented) once you start scrolling through a page, to catch your attention.

    Automatic video play is a feature that, while nice to have when it’s surfacing content that’s related to your interests, can be pretty annoying. Autoplay videos can be harmful, too, exposing you to violent, offensive, or otherwise unwanted content that you shouldn’t have to see by default. Several browsers, like Google Chrome and Firefox, now have built-in measures to curb autoplay videos, but for the most part, turning them off is still a very manual process.

    Read Article >
  • Mar 15, 2019

    Bijan Stephen

    Reddit bans r/watchpeopledie in the wake of the New Zealand mosque massacres

    Illustration by Alex Castro / The Verge

    Reddit has banned r/watchpeopledie, an infamous subreddit that hosted videos of people dying gruesomely. The ban comes after the subreddit re-hosted videos of the recent mosque massacres in New Zealand. According to the new landing page, the subreddit was banned for violating Reddit’s content policy about glorifying or encouraging violence.

    A Reddit spokesperson provided this statement: “We are very clear in our site terms of service that posting content that incites or glorifies violence will get users and communities banned from Reddit. Subreddits that fail to adhere to those site-wide rules will be banned.”

    Read Article >
  • Mar 15, 2019

    Julia Alexander

    Why can’t YouTube automatically catch re-uploads of disturbing footage?

    Multiple Fatalities Following Christchurch Mosque Shootings
    Kai Schwoerer/Getty Images

    After a man used Facebook to live stream his attack on two New Zealand mosques last night, the video quickly spread to YouTube. Moderators fought back, trying to take the horrific footage down, but new uploads of the video kept appearing. It led many observers to wonder: since YouTube has a tool for automatically identifying copyrighted content, why can’t it automatically identify this video and wipe it out?

    Exact re-uploads of the video will be banned by YouTube, but videos that contain clips of the footage have to be sent to human moderators for review, The Verge has learned. Part of that is to ensure that news videos that use a portion of the footage in their segments aren’t removed in the process; the sketch after this entry illustrates why simple exact matching isn’t enough on its own.

    Read Article >
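    YouTube hasn’t published the details of its matching systems, so the sketch below is only an analogy, not the company’s actual pipeline: it uses plain SHA-256 file hashing and hypothetical filenames to show why a byte-for-byte fingerprint flags an identical copy but misses anything trimmed or re-encoded, which is exactly the kind of copy that ends up in front of human moderators.

    ```python
    # Minimal sketch: exact-match fingerprinting vs. edited copies.
    # SHA-256 hashing is an illustrative stand-in, not YouTube's system;
    # the filenames are hypothetical.
    import hashlib
    from pathlib import Path

    def fingerprint(path: Path) -> str:
        """Return the SHA-256 hex digest of a file's raw bytes."""
        digest = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                digest.update(chunk)
        return digest.hexdigest()

    # Hashes of known uploads of the footage.
    blocked_hashes = {fingerprint(Path("original_upload.mp4"))}

    def should_auto_block(upload: Path) -> bool:
        """An exact re-upload matches a known hash; an edited clip does not."""
        return fingerprint(upload) in blocked_hashes

    print(should_auto_block(Path("exact_reupload.mp4")))  # True only if byte-identical
    # Trimming, cropping, or re-encoding changes every byte, so the hash no
    # longer matches and the clip falls through to review.
    print(should_auto_block(Path("trimmed_clip.mp4")))    # False
    ```

    Matching systems in practice typically rely on perceptual fingerprints that tolerate some editing, but even those degrade as clips are cropped, overlaid, or embedded in commentary, which is why human review remains part of the process.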
  • Mar 15, 2019

    Elizabeth Lopatto

    The mass shooting in New Zealand was designed to spread on social media

    Illustration by Alex Castro / The Verge

    The horrific shooting at two mosques in Christchurch, New Zealand, was designed from the start to get attention, leveraging social media to make sure as many people as possible would hear about the deaths and the hate underpinning them. Officials have reported a “significant” number of people are dead from attacks at two mosques. Several people have been arrested so far. New Zealand police have told people to avoid mosques and told mosques to “shut their doors.”

    A 17-minute video that seemed to show the shooting was posted to Facebook, YouTube, Twitter, and Instagram. A post on 8chan, a message board, included links to a manifesto and a Facebook page where the poster — an alleged shooter — said a live stream of the attack would be broadcast. Facebook has removed the page and the video, but the video had already traveled.

    Read Article >