This post by Veronica Fanzio is part of our series “Global Digital Cultures in times of COVID-19”, written by students of the research master Media Studies at the University of Amsterdam.
How did the president of the United States get banned from Facebook, Twitter, and YouTube almost overnight? The answer resides, at least in good part, in the outlandish ‘Capitol riot’ that took place on the 6th of January. Trump’s ban from social media is the most resonant of the actions against misinformation that many social networks and various institutions are taking in an attempt to counteract the proliferation of fake news and conspiracies. It was preceded by what has been called an ‘infodemic’ during the pandemic.
In early 2020, the World Health Organization (WHO) coined the term “infodemic”, referring to the overwhelming amount of disinformation and conspiracy theories circulating about the COVID-19 pandemic. This so-called fake news spreads especially through social networks and online fora, jumping from the deep web (the parts of the Web not indexed by search engines) to mainstream platforms such as Facebook and YouTube.
In the fight against disinformation, fact-checkers are at the forefront: usually independent, they have been described as a new democratic institution with the shared goal of promoting and ensuring a truthful public discourse. Facebook has notably relied on third-party fact-checkers since 2016, but nonetheless keeps receiving criticism over how it responds to flagged content. And on Twitter, flagged tweets tend to keep circulating regardless of warnings: internal research shows that 74% of users see tweets only after they have already been flagged, suggesting that the platform’s reaction is too slow to prevent disinformation from circulating. Alongside the efforts of fact-checking outlets, there are projects such as the Berkman Klein Center’s Disinformation Program, where experts and students join forces to educate citizens on the risks of disinformation and on the counter-actions that can be taken. The WHO itself also organizes webinars on the infodemic, but despite these institutional efforts, disinformation circulates widely, affecting people’s beliefs and behaviour. A dramatic example is the Iranian case of alcohol poisoning, in which over 700 people died after drinking methanol because messages on social media claimed it was a cure for the coronavirus.
Platforms’ role and reaction
Public figures have more or less blatantly endorsed misinformation and even conspiracies, using misleading content as a political strategy. Not only has this led to an increase in disinformation, it has also triggered violence, of which the Capitol riot is the biggest case to date. The fear of ever more severe assaults is one of the central reasons that pushed major platform companies to deplatform Trump, together with other purveyors of conspiracy theories and fake news. This is not, however, the first time Big Tech has taken the lead in content moderation. A huge deplatforming action was already taken by YouTube in 2019: labelled ‘The Great Purge’, the operation was intended to remove channels and videos whose content reflected alt-right and conspiracist views. Looking at its dynamics can help us understand the recent deplatformization of Trump supporters and QAnon conspiracists.
Research conducted by the OILab centre in Amsterdam examines how many 4chan users – the so-called anons – use YouTube, employing it as a propagandist repository in accordance with a strong cult-like in-group vs. out-group dynamic. The connection with 4chan is relevant because its community is involved in the emergence and spread of conspiracies and ideologies that circulated during the pandemic. The aforementioned QAnon conspiracy is the most striking example. The theory was born on 4chan and then spread through Facebook and Twitter in 2020. It revolves around the claim that a satanist pedophile cabal is plotting against Trump.
YouTube was well aware of the connection between conspiracies and 4chan. Hence, it removed channels and alt-right and nationalist content intensively shared on 4chan. Nonetheless, anons displayed adaptive and responsive behaviour, archiving the deleted videos on other websites and moving the discussion elsewhere. This can be seen in this great visualisation.
This exodus of content is similar to the migration of users from Instagram and Facebook when those platforms banned or censored conspiracists’ profiles in the past. The deplatformed users moved to less moderated platforms such as Parler. Like many other platforms of its kind, Parler contributes to nourishing a space on the web that hosts politically alternative content, often labelled alt-tech (alt-right technology). This type of platform aims to accommodate uncensored speech. It comes as little surprise that it is a space for extremist, white supremacist, racist content, a fact that users of these alternative web spaces readily admit. As a user of one such unregulated alt-tech platform, Gab, puts it: “Are there racist, anti-Semitic, misogynistic, homophobes on Gab? You betcha! And, that’s how you know you’re not being treated as a child by pasty, paternalistic nerds.”
Although the mass migration to Parler is recent, I use the past tense because, once again, major tech corporations stepped in: Google and Apple decided to make the app unavailable in their app stores, and Amazon kicked Parler off its cloud service.
The deplatforming of radical content and users was driven by efforts to prevent further violence. The decisive factor was the Capitol riot we started this post with: it soon became clear that supporters of the attack were moving to Parler.
The deplatformization of Trump and his most fervent supporters is a historic event that brings to the fore questions about platforms’ responsibility as socio-cultural mediators.
Big Tech companies are becoming the gatekeepers of public speech. Only very recently have they acknowledged the liabilities and obligations that derive from this role. It remains to be seen whether these companies can meet institutional and public demands and expectations in these turbulent times.
Veronica Fanzio is a Research Master’s student in Media Studies at the University of Amsterdam, with a background in Intercultural Communication. Her main interests are disinformation and conspiracy theories, environmental media, data colonialism and activism. She aims to always look at bottom-up and alternative practices that employ media to influence their sociocultural context. (Twitter: @VFanzio)