The way in which people consume news is rapidly changing. Whereas before, you’d be likely to turn to the BBC or a newspaper for your daily updates, perhaps now you’re finding quick, easy-to-access news updates as you scroll your Facebook feed during your lunch break. Whilst on the one hand this has made news consumption quicker and easier than ever before, it comes with significant downsides. Whereas news outlets like the BBC are stringently held to rules and regulations designed to curtail ‘fake news’, social media platforms are largely not held accountable to the same degree. The result? Social media apps like Parler are able to facilitate the circulation of controversial and often hateful content packaged as factually accurate news. The app’s removal from AWS, following the identification of 98 calls to violence, has raised questions about the extent to which unchecked content should be allowed to exist on the internet, and who ought to be held accountable. To what extent can hateful content be termed ‘free speech’, and to what extent should it be curtailed?
What is Parler?
Parler is an app popular amongst far-right American voters and political figures. Ted Cruz has amassed a following of 4.9 million on the app, whilst Fox News host Sean Hannity has acquired approximately 7 million. The app serves as a digital platform for users who find themselves blocked from mainstream social media platforms, such as Facebook and Twitter, after sharing problematic, hateful or violent content. The app launched in 2018 and billed itself as an alternative to the more rigorously monitored mainstream social media platforms, taking a markedly more relaxed approach to content moderation. In doing so, Parler has facilitated the development of political conspiracies and controversial far-right calls to action.
Indeed, the app has been regarded by some as a tool used by far-right activists to stoke anger, hatred and frustration in response to anti-Trump rhetoric and events. Parler was key in the organisation of the attack on the Capitol building earlier this year, following Trump’s electoral defeat by the current President, Joe Biden. In the run-up to the march on the Capitol, the platform was overrun with unchecked death threats, celebrations of violence and cries encouraging ‘patriots’ to join the attack on the government building.
Amazon’s removal of Parler from its hosting service has been a source of great controversy. The app’s CEO, John Matze, has since issued a statement attacking the hosting service, declaring that by silencing Parler’s users, Amazon has ‘attempt[ed] to completely remove free speech off the internet.’ Yet the app’s removal has alternatively been hailed by others as an effective way to limit the insidious spread of anti-establishment rhetoric and curtail domestic far-right terrorism. Questions have since been raised over the role of conglomerate tech companies in limiting the circulation of controversial content via their products. Amazon is currently under no legal obligation to check how its services are being used. Now we must begin to wonder: should it be?
So what happens now?
Following its removal from Amazon Web Services, Parler has entered into a contract with the hosting company Epik. American news provider CNN has noted that Epik is a hosting service that caters to, and is supportive of, far-right sites such as Gab and 8chan. Moreover, since its re-establishment, Parler has sued Amazon for removing the app from its servers. In response, Amazon has detailed some of the unchecked graphic and violent threats made by users against members of the BAME community and Trump’s political rivals.