Why Parler Got Kicked Offline After The US Capitol Riot

Parler has been in the news a lot this past week.  

The ‘free speech’ social media platform was connected to some of the organising activity behind the US Capitol siege, and as a result it’s been kicked offline.

So, what do we need to know about Parler? And what does this story say about the way that social media apps, and big tech in general, need to be regulated?  

How Did We Get Here? 

After Trump supporters violently rioted in the US Capitol building, a lot of eyes turned to the places on the internet where the attempted coup was organised.

Parler was identified as one of the main hotbeds of the right-wing extremist and conspiracy rhetoric that led to the insurrection in DC.  

It’s since been booted from both the App Store and its Amazon hosting, and it’s been scrambling to find a new home.

As of Sunday, Parler appears to have found a new host in Epik, a company that reportedly also supports Gab and 8chan.

What Exactly Is Parler?  

Parler was founded in 2018 as a kind of alternative social media platform with a “non-biased, free speech” approach to moderation. A bunch of researchers and journalists who scrolled through the app before it was taken down basically described it as ‘clunky Twitter’.  

It quickly became a place where right-wing pundits and political figures set up accounts. But up until this year it was a pretty niche place with only a couple of million users and not a huge amount of activity.  

Audrey Courty: “[Parler] were quite small relative to these other social media giants, but what led to this influx in users was the outcome of the US election. Because there were crackdowns on Facebook and Twitter … against conspiracy theories and what they alleged as misinformation, a lot of the people being blocked or affected by these crackdowns moved to Parler to escape these moderations.”

That’s Audrey Courty, a digital communication and online extremism expert.

Parler is now cooperating with the US Department of Justice and the FBI to hand over records of people who allegedly had a hand in organising the US Capitol riot on the platform.

Should Social Media Be Regulated? 

But the wider story about Parler really brings up a lot of questions about how social media apps in general should be regulated so that extremism isn’t allowed to proliferate in the first place. 

The private companies that run these platforms are basically allowed to just make their own choices about what content is and isn’t tolerated.  

Even though extremism exists out in the real world too, social media has proven again and again to be a crucial tool for heightening extreme beliefs, and it can play an integral part in the organising of violent acts.

Who Is Responsible?  

Audrey told me that governments really need to speed up their action here.   

AC: “We should be regulating this externally, creating bodies that regulate this the same way [we] regulate telecommunications and other forms of communication technologies. There should be a body focused on social media. And that’s progressively happening, but there’s still a long way to go, especially in countries like Australia and the US. They’re really trailing behind Europe at the moment.

“You can’t just allow [social media companies] to make these calls on their own, especially because the people making these calls are a very specific demographic. We’re talking about Silicon Valley, which really consists mostly of white males in their 20s or 30s making calls that impact the rest of the world.”

This kind of external regulation seems more and more necessary as the public keeps learning about the massive influence these social media platforms wield, and the way these environments are engineered to polarise users.

The Takeaway 

Parler is in a weird offline limbo at the moment, and it’s still unclear whether it’ll be able to get back up and running on its new server.

But this conversation goes way beyond just one app. It brings up a lot of questions about our failure to regulate influential online spaces, and why private companies are allowed to make moderation decisions for themselves at a time when they can clearly cause so much damage.