
Mass Shooting Live Streams: From Buffalo to New Zealand

    Many of the sites tried to remove the videos as they were being uploaded but were overwhelmed. Facebook said it removed 1.5 million videos in the 24 hours following the incident, although many evaded detection. On Reddit, a post featuring the video was viewed more than a million times before it was taken down. Google said the video spread faster than footage of any tragedy it had seen before, according to the New Zealand government’s report.

    In the days that followed, some people began discussing ways to bypass the platforms’ automated systems and keep the Christchurch video online. On Telegram on March 16, 2019, members of a group linked to white supremacy discussed ways to manipulate the video so it would not be taken down, according to discussions reviewed by The Times.

    “Just change the opening,” one user wrote. “Speed it up with 2x and the [expletive] can not find.”
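    Such low-tech edits can work because automated matching systems often compare perceptual fingerprints of known footage against new uploads. The sketch below is a toy illustration of that idea, not any platform’s actual system: every name and parameter in it (the per-frame average hash, the match function, the bit threshold) is an assumption made for illustration. It fingerprints a synthetic clip frame by frame and shows how the fingerprints stop lining up once the clip is sped up or its opening is trimmed.

```python
import numpy as np

def average_hash(frame: np.ndarray, hash_size: int = 8) -> np.ndarray:
    """Toy perceptual hash: block-average a grayscale frame down to
    hash_size x hash_size, then threshold each cell against the mean.
    Returns a flat array of 64 bits."""
    h, w = frame.shape
    bh, bw = h // hash_size, w // hash_size
    cropped = frame[: bh * hash_size, : bw * hash_size]
    blocks = cropped.reshape(hash_size, bh, hash_size, bw).mean(axis=(1, 3))
    return (blocks > blocks.mean()).astype(np.uint8).ravel()

def fingerprint(frames):
    """One hash per frame; a crude stand-in for a video fingerprint."""
    return [average_hash(f) for f in frames]

def match_fraction(fp_a, fp_b, max_bit_diff=8):
    """Fraction of position-aligned frame pairs whose hashes are within
    max_bit_diff Hamming distance of each other."""
    n = min(len(fp_a), len(fp_b))
    hits = sum(int(np.count_nonzero(a != b)) <= max_bit_diff
               for a, b in zip(fp_a[:n], fp_b[:n]))
    return hits / n

# Synthetic 100-frame "video": a gradient scene that pans one column per frame.
x = np.linspace(0.0, 1.0, 64)
scene = np.add.outer(x, x)
original = [np.roll(scene, shift=t, axis=1) for t in range(100)]

fp_original = fingerprint(original)
fp_speedup = fingerprint(original[::2])   # crude 2x speed-up: drop every other frame
fp_trimmed = fingerprint(original[10:])   # "just change the opening": cut the start

print(match_fraction(fp_original, fp_original))  # 1.0: an exact re-upload matches
print(match_fraction(fp_original, fp_speedup))   # much lower: frames no longer align
print(match_fraction(fp_original, fp_trimmed))   # much lower: offset by 10 frames
```

    Production systems are considerably more robust than this sketch, but the cat-and-mouse dynamic the Telegram users describe is the same: small transformations of the file can push its fingerprint outside a matcher’s tolerance.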

    Within days, clips of the shooting were posted on 4chan, a fringe online message board. A 24-second clip of the killings was also uploaded to Rumble in July 2019, according to a review by The Times.

    In the months that followed, the New Zealand government identified more than 800 variants of the original video. Officials asked Facebook, Twitter, Reddit and other sites to devote more resources to their removal, according to the government report.

    New copies of the video, or links to it, surfaced online whenever the Christchurch shooting returned to the headlines or an anniversary of the attack arrived. In March 2020, about a year after the shooting, nearly a dozen tweets with links to variants of the video appeared on Twitter. More videos surfaced when the gunman was sentenced to life in prison in August 2020.

    Other groups also pressed the tech companies to take the video down. Tech Against Terrorism, a United Nations-backed initiative that develops technology to detect extremist content, sent 59 alerts about Christchurch content to tech companies and file-hosting services from December 2020 to November 2021, said Adam Hadley, the group’s founder and director. That represented about 51 percent of the right-wing terrorist content the group sought to have removed online, he said.