By Daniel Brennan
As I’m sure you’re aware, just a few weeks ago in Christchurch, New Zealand, one of the worst mass shootings in recent memory occurred when a far-right gunman stormed two mosques in the city, murdering 50 Muslims and injuring a further 50.
One of the defining features of this horrible tragedy is the role social media and online culture played, not only in radicalising the shooter, but also in spreading the footage of the shooting itself, which the shooter had livestreamed on Facebook.
For several hours following the incident, social media sites struggled to stop the shooting footage from spreading, something I discovered first-hand when the footage appeared on my own Twitter timeline moments after it happened.
But, as always seems to be the case, YouTube was by far the worst of the big social media sites at ensuring that the footage of the shooting was not shared far and wide. People even claimed that some of the channels that uploaded the footage had ads running on the videos for the several hours they were up, before YouTube eventually posted a half-hearted tweet stating it was working on removing the footage. To its credit, the video cannot be found on YouTube now.
But the role YouTube played in facilitating the radicalisation of the shooter cannot and should not be ignored. The shooter posted his disgusting, nearly 70-page “manifesto” on a site called 8chan, a website blacklisted by Google because it has facilitated child pornography in the past, and which is now an unmoderated “wild west” infested by the worst of the alt-right movement.
The shooter very notably shouted “subscribe to PewDiePie” before opening fire, and while many of his defenders claimed it was simply a meme, it points to where his radicalisation process began. You don’t simply become a member of a far-right, white-nationalist, child-pornography-facilitating website that cannot be found on Google, and commit a race-based mass murder, without a starting point.
Obviously, you cannot blame one person for the actions of a deranged mass shooter, but there has to be at least a modicum of accountability for the online personalities who push such dangerous narratives.
PewDiePie is one of the most well-known YouTube stars, having amassed over 90 million subscribers, and he has been involved in several controversies. Examples include paying people on the website Fiverr to dance while holding a sign reading “death to all Jews”, and saying the n-word on stream. The fallout contributed to the “adpocalypse”, in which many YouTube creators lost 60% or more of their income as advertisers pulled out over his antics, costing many people what had become their full-time jobs.
In fact, it was only just after the events in Christchurch that PewDiePie unfollowed several prominent far-right figures on Twitter, most notably Jordan Peterson, Count Dankula, Lauren Southern, JonTron, InfoWars “reporter” Paul Joseph Watson, white supremacist Stefan Molyneux and many, many more.
In the past, PewDiePie has featured prominent right-wing thinkers on his channel, most notably the American commentator Ben Shapiro, who has been famously anti-Muslim, among other things. In fact, Shapiro may have already played a part in inspiring a previous shooting: police investigating a 2017 mosque shooting in Canada, in which six people were killed, found that the shooter’s favourite Twitter account by far was that of the infamous Shapiro, someone who once claimed that there are over 600 million radicalised Muslims on the planet, a completely false, racist claim that you can find in one of his many YouTube videos.
But YouTube has no problem not only hosting videos like his, but promoting them as well. I often find that my recommended videos on the site feature prominent figures like Shapiro, Peterson and many of the others listed above. These videos are often filled with sexist, anti-trans and racist views that YouTube permits under the loose banner of “freedom of speech”, when really their videos are promoted so heavily because of how popular they have become. YouTube can make money by running advertisements on videos with millions upon millions of views, and it often promotes these dangerous figures to vulnerable demographics like young males, just like the Christchurch shooter.
YouTube isn’t alone in allowing these figures to flourish for profit; Twitter and Facebook participate in this too. But this shooter, in my eyes, clearly began his path to radicalisation because of YouTube’s willful failure to shut down dangerous far-right influencers before they could ever gain a meaningful following. YouTube and other corporations don’t really care about incidents like what happened in Christchurch, because if they did, it probably wouldn’t have happened in the first place.