This post was written by Anton Hansen
Introduction
Following Silicon Valley narratives, platforms have emerged as harbingers of unity and of waves of democratic movements. The Umbrella Movement and the Black Lives Matter protests, for example, have often been cited as showing how digital platforms play crucial roles in democratic movements (Agur & Frisch, 2019). Opposing this narrative, scholars and activist groups argue that social media companies, despite connecting the world, also reinforce colonial power structures (Couldry & Mejias, 2019).
This blog post unpacks the colonization of the self through digital landscapes, viewed through the lens of Couldry and Mejias (2019) as well as decolonial scholars such as Said (1978). I examine how data colonizes human life from two perspectives: the ongoing colonial practices directed at content moderators, and those directed at the end users of digital technologies. Following this, I discuss how data sovereignty and algorithmic transparency might counter such practices.
Photo above: Facebook advertisement in India – “Make every day eventful”, for example by organizing a hateful anti-Muslim mob (https://time.com/6112549/facebook-india-islamophobia-love-jihad/).
Digital platforms as colonizing architecture
To better understand the dominance of global tech giants in our daily digital interactions, we must place it in the broader framework of digital platform colonialism. In their analysis, Couldry and Mejias (2019) describe a new form of dominance that extends far beyond mere commercial transactions.
Couldry and Mejias draw disturbing parallels with historical colonization, which was marked by the seizure of land, bodies, and resources for capital gain. Data colonization, in their view, represents a new frontier in which human life is extracted and quantified for value. They argue that most companies in the digital economy, particularly the ‘big five’ tech giants (Amazon, Apple, Facebook, Google, and Microsoft), quantify social interaction for profit. The primary purpose of their data collection and analytics is not to enhance user experience but to create predictable consumer patterns and integrate individuals into their vast data networks.

However, Couldry and Mejias’s perspective diverges from that of some of their contemporaries. They reject the notion that this trend is a purely modern capitalist innovation. Instead, they contend that it continues the capitalist modus operandi that historically turned human activity into labor for profit. In this new era, life is not just lived but transformed into data for economic exploitation.
They are careful to distinguish between the explicit violence of traditional colonialism and the more subtle, yet still pervasive, nature of data colonization. While the latter may not manifest through overt violence, its impact, through surveillance, commodification, and marginalization, remains profoundly damaging. At the heart of Couldry and Mejias’s argument is a critique of the ideologies that underpin data colonization. These ideologies, which misleadingly portray data as a natural resource ripe for extraction in the service of community building and democracy, need to be vigorously contested. Resistance involves more than avoiding data extraction tools or creating alternative platforms; it requires dismantling the ideologies that legitimize such invasive practices.

Looking forward, Couldry and Mejias advocate a future that values human connection, solidarity, and authentic experience above the ceaseless collection and analysis of data, a future in which life is not reduced to a data point. While some might view these arguments as dystopian, Couldry and Mejias acknowledge that traditional colonialism was, of course, far more brutal. Their point, however, is that the underlying practices and strategies remain rooted in the ideology of colonialism.
Going beyond the average digital media user and Couldry and Mejias’s analysis, brutal colonizing practices remain. A case in point is the situation of content moderators in Kenya. There, local workers are integrated into the digital economy not as participants with agency but as laborers performing tasks such as content moderation under strenuous conditions and for minimal pay. This dynamic is a modern iteration of colonial exploitation, in which the labor of local populations is harnessed for the benefit of a distant elite, echoing the oppressive structures of the past. It also shows how widely colonial practices span, from the extraction of personal data to the extraction of labor.
Resources, labor and profits
“If data is the new oil, then developing countries and Least Developed Countries (LDCs) are the new oil fields” (Goel, 2021).
This idea, astutely noted by LSE graduate Dhwani Goel in 2021, describes a digital landscape still ensnared by colonial structures. These structures, while mostly invisible, perpetuate a cycle of exploitation that favors the erstwhile imperial powers: the United States and European countries such as Germany and the UK.
The metaphorical oil fields of the digital age, former colonies such as India, Bolivia, or Congo, have unwittingly morphed into massive reservoirs of digital data that is meticulously harvested and shipped off to Silicon Valley for processing and, ultimately, profiteering (Goel, 2021). This dynamic does not end with data. From coltan miners in Congo and content moderators in Kenya to lithium miners in Bolivia, a stark panorama of exploitation unfolds in which the fruits of labor are reaped by companies firmly rooted in China, the US, and Europe. Labor on digital platforms is thus as segregated as it was in the times of colonialism.
An example of this neo-colonial exploitation is the story of Daniel Motaung, who found himself traumatized after working as a content moderator for Meta, cleaning Instagram and Facebook timelines for as little as $1.50 per day. Having to look at beheadings, child sexual exploitation, and other horrific content for many hours a day, he decided to organize his co-workers into a labor union for better pay and working conditions. As a consequence, he was fired, and Meta even tried to silence Motaung with a gag order, claiming that talking to the press could bias the case.
Activism poster by https://peoplevsbig.tech/stop-facebook-from-silencing-whistleblower-daniel-motaung
Such stories are not isolated; parallels are evident in Indonesia and the Philippines, where workers, ensnared in psychologically damaging tasks, are deprived of the necessary support services and social safety nets and often spiral into unseen trauma and untreated PTSD. The software engineers of Silicon Valley, making more than $200,000 a year, will of course never have to do such dirty work. “Moderation is not an ancillary aspect of what platforms do. It is essential, constitutional, definitional. Not only can platforms not survive without moderation, but they are also not platforms without it” (Gillespie, 2018, p. 21). This necessary task, the backbone of the platform, is outsourced to a former colony, reinforcing post-colonial structures in which a white upper class grows rich by exploiting the cheap labor of African workers.
Values, Domination, and Exclusion
To add another theoretical lens to this analysis, Edward Said’s (1978) paradigm-shifting work Orientalism offers a critical framework that can clarify such colonizing practices. The narratives in which the West perceives the Global South as an ‘underdeveloped’ and ‘uncivilized’ entity can be recognized in the strategies employed by the ‘big five’ and their Chinese counterparts. There is an altruistic guise, an agenda propelled by the mission to ‘connect’ and ‘uplift’, which goes hand in hand with the exploitation of, for example, content moderators. Beneath it lurks a pursuit of unrestrained capital accumulation through emerging markets in the Global South.
This observation becomes even more visible when we place Couldry and Mejias’s colonization of life within the Saidian framework. Both frameworks lament the West’s reductionist view of the East or the Global South, seen as landscapes ripe for extraction and influence rather than as equal partners in a shared global narrative. While Said critiques the West’s entrenched tradition of perceiving the ‘Orient’ as an exotic, mysterious, and ‘backward’ land in need of ‘civilizing’, Couldry and Mejias offer a corollary in the digital realm. Today’s tech giants are, in a sense, the new ‘Orientalists’. Their mission to ‘connect’ and ‘uplift’ can be likened to the colonizers’ ‘White Man’s Burden’, the notion that it is the duty of the West to improve the conditions of its ‘inferior’ colonial subjects. This modern-day colonization, however, is twofold. It takes place in former colonies such as Kenya, and it goes deeper still by colonizing human life through the extraction of data as a valuable resource. The objective remains to tap into untouched markets, extract valuable resources (both data and monetary), and strengthen one’s global dominance.
Following this, a connection can be drawn between the Orientalist painting tradition and the way digital platforms present themselves. Just as Orientalist paintings often presented an exoticized, distorted, and frequently eroticized version of the East to Western audiences, digital platforms present a similarly distorted image of their services. Marketing them as contributions to a more egalitarian society obscures the colonial practices in their production chains. Well-known examples include coltan mining in Congo and the failure to moderate content in minority languages.
Going beyond this, platforms like Netflix also shape and present narratives of the Global South in ways that are palatable and profitable to Western consumers. This is not just about representation; it’s about commodifying cultures, stories, and identities. The allure of such content might lead to increased viewership and subscribers, but at what cost? The nuanced, multifaceted realities of these regions are at risk of being drowned in a sea of stereotypes.
In essence, much as Said’s concept of Orientalism illuminated the West’s problematic gaze upon the East, Couldry and Mejias’s (2019) theories unmask the digital age’s new form of Orientalism, in which tech giants are the artists and cyberspace is their canvas.
Combining these narratives, global tech companies view human beings in former colonial territories, but also in countries like the US, as opportune landscapes for experimentation and, above all, for amassing significant revenues.
This domination of cyberspace under the banner of “fostering global connections” is not just economic; it is also cultural and psychological, as the values and mechanisms of the tech giants redefine how we see personal data and to what extent it may be extracted. It follows that addressing data colonialism demands a multifaceted strategy.
It is important to understand and resist the pervasive collection and commodification of our data and to challenge the ideologies that facilitate it. It is about redefining the narrative surrounding data from a mere ‘resource’ to a reflection of individual and collective identities. Additionally, we must critically assess the unchecked surveillance carried out by tech giants in alliance with state agencies, recognizing that our historical experiences of exploitation and colonialism find new echoes in the digital age.
Data Sovereignty and Algorithmic Transparency as possible strategies against data colonialism
As outlined in this blog post, the problem of digital colonialism is twofold: on the one hand, the capitalist exploitation of workers persists, as the example of content moderators in Kenya has shown; on the other hand, the consumers of digital products are colonized as well. Improving labor rights in countries like Kenya and holding tech companies accountable for the minerals they process, and for the whole production chain, is a major step forward. In the name of simple equality, laborers in all parts of the production chain should enjoy labor protections as robust as those enjoyed by the programmers of Silicon Valley.
Going beyond this and focusing on the arguments made by Couldry and Mejias, the individual’s informed consent is a cornerstone of returning agency over personal data to consumers. Data sovereignty thus emerges not just as a technical necessity but as a foundational principle of a decolonial digital approach. Data sovereignty goes beyond merely having control over one’s data: it means recognizing data as an extension of oneself, of one’s digital identity, and ensuring it is not exploited. The GDPR, while a commendable effort, is only a starting point. Real data sovereignty would mean that users have clear, unequivocal rights over their data, almost akin to the rights one has over personal property.
Platforms would therefore have to radically reimagine their structures and functionalities. Users should gain a clear understanding of how their data is used, not have it hidden behind convoluted terms of service.
Adding to this, endowing users with tools that give them granular control over their data could enable them to purposefully choose which data to reveal. It should also lie in the user’s power to decide who may access their data, what happens with it, and for what purposes, as sketched below. This vision demands not just technical innovation but a foundational shift in how platforms perceive and respect user data.
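To make the idea of granular, purpose-bound consent more concrete, the sketch below shows one way such a tool could be modeled in code. It is a minimal illustration under my own assumptions, not an existing platform feature: the ConsentRecord class, the data categories (“location”, “contacts”) and the purposes (“service_delivery”, “ad_targeting”) are hypothetical names chosen for this example.

```python
# Hypothetical sketch of purpose-bound, granular consent.
# Class, category, and purpose names are illustrative assumptions,
# not an existing platform API.
from dataclasses import dataclass, field

@dataclass
class ConsentRecord:
    """Stores which data categories a user shares, and for which purposes."""
    user_id: str
    # e.g. {"location": {"service_delivery"}, "contacts": set()}
    allowed: dict[str, set[str]] = field(default_factory=dict)

    def grant(self, category: str, purpose: str) -> None:
        # The user explicitly allows one category for one purpose.
        self.allowed.setdefault(category, set()).add(purpose)

    def revoke(self, category: str, purpose: str) -> None:
        # Consent can be withdrawn at any time, per category and purpose.
        self.allowed.get(category, set()).discard(purpose)

    def permits(self, category: str, purpose: str) -> bool:
        # Default-deny: any use not explicitly granted is refused.
        return purpose in self.allowed.get(category, set())

# Usage: share location data for service delivery, but not for ad targeting.
consent = ConsentRecord(user_id="u123")
consent.grant("location", "service_delivery")
print(consent.permits("location", "service_delivery"))  # True
print(consent.permits("location", "ad_targeting"))      # False
```

The design choice worth noting is the default-deny rule: any use of a data category that the user has not explicitly granted for a specific purpose is refused, which inverts the opt-out logic most platforms rely on today.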
In essence, data sovereignty, as highlighted by Couldry and Mejias, is a cornerstone of the larger project of digital decolonization. In their vision of more just platforms, platforms would shift from being extractive entities to becoming custodians that prioritize user agency and respect in the digital realm.
Building on the premise of data colonialism, algorithms themselves can be identified as contributing to such colonizing practices. Gillespie’s (2018) Custodians of the Internet outlines how these algorithms, designed and deployed by dominant platforms, guide a significant portion of our digital experiences.
More than just sorting and suggesting, they are powerful forces that curate content aligned with certain preferences, push some narratives to the forefront, and relegate others to obscurity. How these algorithms function, and what their underlying formulas are, remains mostly hidden from users.
Gillespie’s work provides a foundational understanding that algorithms are not neutral lines of code; read through the lens of data colonialism, they form part of the backbone of colonial digital practices.
Because they bear the weight of decisions made by human designers, who are themselves influenced by social, political, and economic considerations, algorithms embody strategic choices that reflect power dynamics reminiscent of historic colonial gatekeeping. The digital territories they govern are vast, stretching from social media feeds to news recommendations, and they shape not just individual perceptions but also societal values and cultural narratives.
This algorithmic hegemony needs to be confronted head-on, and transparency is the first step. If policymakers, politicians, and consumers gain greater insight into the colonizing practices of big tech algorithms, they can take more informed steps toward fairer digital media. Adding to this, users could be given the ability to opt out of, challenge, or adjust certain algorithmic determinations, allowing them a more direct role in shaping their digital worldviews, as the sketch below illustrates.
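To illustrate what opting out of an algorithmic determination could look like in practice, the sketch below contrasts an engagement-ranked feed with a plain chronological one and attaches a human-readable explanation to each item. It is a deliberately simplified assumption of my own: the Post fields, the single engagement score, and the explain function are hypothetical, and no real platform exposes its ranking in this form.

```python
# Hypothetical sketch of a user-adjustable feed. The data model and the
# "chronological" opt-out are illustrative assumptions, not a real API.
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    timestamp: float          # seconds since epoch
    engagement_score: float   # platform-predicted engagement, 0..1

def rank_feed(posts: list[Post], use_engagement_ranking: bool) -> list[Post]:
    """Order the feed. Users who opt out get a plain chronological timeline."""
    if not use_engagement_ranking:
        # Opt-out path: newest first, no engagement optimization applied.
        return sorted(posts, key=lambda p: p.timestamp, reverse=True)
    # Default path: rank by the platform's predicted engagement.
    return sorted(posts, key=lambda p: p.engagement_score, reverse=True)

def explain(post: Post, use_engagement_ranking: bool) -> str:
    """Return a plain-language reason a post appears where it does."""
    if use_engagement_ranking:
        return f"Post {post.post_id}: shown because predicted engagement is {post.engagement_score:.2f}."
    return f"Post {post.post_id}: shown in chronological order."
```

Even a minimal toggle of this kind would shift a small piece of curatorial power from the platform back to the user, which is precisely the agency the transparency argument calls for.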
To conclude this thought, a truly decolonized digital space operates transparently and champions user agency, ensuring that the digital realm mirrors the rich tapestry of diverse voices and perspectives, rather than a monolithic narrative curated by a few.
Academic references
Agur, C., & Frisch, N. (2019). Digital Disobedience and the Limits of Persuasion: Social Media Activism in Hong Kong’s 2014 Umbrella Movement. Social Media + Society, 5(1). https://doi.org/10.1177/2056305119827002
Couldry, N., & Mejias, U. (2019). The Costs of Connection: How Data Is Colonizing Human Life and Appropriating It for Capitalism. Stanford University Press.
Gillespie, T. (2018). Custodians of the Internet: Platforms, Content Moderation, and the Hidden Decisions That Shape Social Media. Yale University Press.
Poell, T., Nieborg, D., & Duffy, B. E. (2021). Platforms and Cultural Production (1st ed.). Wiley.
Said, E. W. (1978). Orientalism. Pantheon Books.