This post was written by Dieuwertje Luitse, Research Master student in Media Studies (New Media & Digital Culture) at the University of Amsterdam
The COVID-19 pandemic has accelerated the global deployment of data-driven healthcare technologies (e.g., contact tracing apps) to mitigate the spread of the virus (Ada Lovelace Institute, 2021; Chiusi, 2020). However, as a growing number of studies have shown, such techno-solutionist approaches to controlling the pandemic have caused severe harm to vulnerable communities worldwide (e.g., Barman and Rathi, 2021; Leslie et al., 2021). Major problems include, but are not limited to, the denial of healthcare access and treatment to people who lack good-quality mobile connectivity, as well as the structural underrepresentation of particular social groups in COVID-19 health data and biometrics. While these issues have triggered important critiques and discussions around the ethics of health technologies and data justice, less attention has been devoted to the concept of refusal in relation to healthcare technologies and COVID-19. According to MIT research scientist Chelsea Barabas (2020), refusal can offer a useful analytical framework to interrogate and resist power and inequality in a (datafied) tech-driven society. I aim to critically examine the concept of refusal and discuss its related practices in the context of the widespread introduction of healthcare technologies in response to COVID-19. What forms of refusal can be identified? What values currently guide practices of refusal in response to the pandemic? And what can we learn from refusal as an analytical practice in post-pandemic futures?
Refusal as the Informed Practice for Action
To refuse something is to say no to it: to turn down the opportunity to create and deploy technologies if they are likely to harm particular societal groups. Refusal, however, does not merely refer to active negation. Professor of African American Studies Ruha Benjamin (2016) argues instead that refusal is an informed practice that “is seeded with a vision of what can and should be, and not only a critique of what is”. In a similar vein, Jenna Burrell (2020), co-director of the Algorithmic Fairness and Opacity Working Group (AFOG), has defined refusal as “critique that is headed to action.” “To stop certain tech trajectories,” Burrell (2020) argues, “is to open up new spaces, for new possibilities.” In other words, refusal entails not just the rejection of tech-first approaches to solving different ‘problems’ in society (Crawford, 2021); it also sets the conditions of possibility for action, forcing researchers and policymakers to think through transformative ways of designing, deploying, and researching data-driven technologies.
The notion of refusal was the central topic of an exploratory conference organized by AFOG in October 2020. Over the course of three days, participants approached the concept from a wide range of perspectives, including the history of tech refusal, refusal within feminist and Indigenous scholarship, and technical opposition by social movements and corporate firms. For example, a group of US-based feminist scholars discussed the Feminist Data Manifest-No (Cifor et al., 2019). Inspired by a note from Ruha Benjamin, the Manifest-No presents a co-written set of 32 declarations, refusals, and commitments for feminist data studies, which refuse existing data practices, including the ways data is generated, collected, analyzed, and used in society (Dencik et al., 2019). Together, the texts commit to the transformation of technological inquiry. During the conference panel, one of the Manifest-No’s authors, Anna Lauren Hoffmann (2020), emphasized that refusal is not just a theoretical idea but an analytical practice that can be characterized in three ways. First, in order to refuse (or “to decline what’s on offer”), one must understand what is being refused, on what terms, and why. Second, Hoffmann (2020) argues that refusal operates as a reminder of systematic oppression against a variety of societal groups, and that axes of oppression do not operate independently of one another (see also D’Ignazio and Klein, 2019). Last, and in line with Benjamin’s (2016) statement above, refusal includes the redefinition of existing structures and ideas in order to “forge a mode of acting as if a different world is possible” (Hoffmann, 2020).
Tech Refusal in Times of COVID-19
The rapid adoption of data-driven technologies in response to COVID-19 has prompted different groups and organizations across the globe to engage in practices of refusal, resisting pandemic-related politics and, in some cases, providing alternatives. The introduction of multiple health monitoring technologies in India, for example, is increasingly met with critiques demanding that the government provide economic as well as social security for precarious workers, such as those in the Indian gig economy, which today comprises some 6 million workers (Vashist and Krishnakumar, 2020). In an effort to act on some of these issues, law scholar Siddharth de Souza (2020) explores the possibility of developing “participatory methods of contracting to involve workers in the decision-making process, drawing from research on ‘proactive law’”. Involving (often precarious) gig workers in decision-making practices, de Souza (2020) argues, would allow workers and platforms to identify and determine common goals and values. De Souza also expects platforms to focus on the root causes of particular problems (e.g., surveillance and security) in order to eliminate them. To ensure the legitimacy of such approaches, it is particularly important to consider perspectives of gender and race and to distinguish between the local and national contexts within which workers’ labor is carried out (de Souza, 2020).
Meanwhile, Indigenous communities in Brazil have started to develop their own community data infrastructure. Their goal is to monitor the spread of the pandemic on Indigenous lands and to inform local mitigation efforts, in response both to the general lack of protection and support from the Brazilian government and to widespread misinformation. Stephanie Russo Carroll and colleagues (2021) explain that the Kuikuro Indigenous Association of Upper Xingu (Brazil) began enhancing its existing data infrastructure, collaboration, and governance activities for Indigenous territories to develop its own COVID-19 monitoring system. This has proven useful in controlling local outbreaks, as it is nested within other local mitigation techniques (Carroll et al., 2021). Following Marisa Duarte (2020), a scholar of technological sovereignty and data justice for Indigenous Peoples, such activities can be understood as practices of refusal: they not only reflect resistance toward Brazil’s lack of pandemic response but also commit to locally developed data-sharing structures that address the needs of Indigenous Peoples and protect their worlds.
While these examples illustrate different pandemic-related issues in different contexts, both highlight the importance of embedding the development of technologies in the knowledge, experiences, and needs of the peoples subjected to them. Whether the aim is to reframe or redirect particular capitalist logics of the gig economy or to resist the neglect of (health) support for Indigenous Peoples, practices of tech refusal provide ways of engaging with such issues by offering alternatives to dominant “modes of acting” (Hoffmann, 2020) in particular situations. In this way, as professor Sarah Wright (2018) has argued, refusal includes an “insistence on the ongoing presence and flourishing of [all] beings and the worlds they nurture”.
Refusal: Values and Practices for a Post-Pandemic World
As an analytical tool, refusal has the potential to help researchers and policymakers across different disciplines and policy areas examine and resist the power and societal inequality reproduced by data-driven technologies (Barabas, 2020) during the pandemic and beyond. From this perspective, ‘to refuse’ requires a commitment not just to reject existing structures but also to present alternatives. Such alternative actions should address and do justice to the knowledge, experiences, and needs of all stakeholders, particularly the most vulnerable groups that are often excluded from important technological decision-making processes and subjected to harmful data practices. By refocusing the terms of engagement in tech decision-making processes and design, refusal could help reshape the conditions under which technologies are developed. It can also enable reflection on whether some technological approaches to ‘problem-solving’ put forward by researchers and policymakers should be developed in the first place.
Dieuwertje Luitse is a Research Master student in Media Studies (New Media & Digital Culture) at the University of Amsterdam, with a professional background in graphic design and media arts. Her research interests mainly focus on the political economy of platforms and the (historical) development of Artificial Intelligence systems in relation to their social and political implications.
Ada Lovelace Institute. (2021). The data divide. Ada Lovelace Institute [Report]. Retrieved from https://www.adalovelaceinstitute.org/wp-content/uploads/2021/03/The-data-divide_25March_final-1.pdf.
Barabas, C. (2020, October 13). To Build a Better Future, Designers Need to Start Saying ‘No’. OneZero. Retrieved from https://onezero.medium.com/refusal-a-beginning-that-starts-with-an-end-2b055bfc14be.
Barman, A. and Rathi, A. (2021, June 2). Atmanirbhar Bharat Meets Digital India: An Evaluation of COVID-19 Relief for Migrants. Migrant Workers Solidarity Network & The Centre for Internet and Society. Retrieved from https://cis-india.org/raw/
Benjamin, R. (2016). Informed Refusal: Toward a Justice-based Bioethics. Science, Technology, & Human Values, 41(6), 967–990. https://doi.org/10.1177/0162243916656059
Burrell, J. (2020, October 14). What has been refused? A history of refusal [Conference Session]. The Refusal Conference, Algorithmic Fairness and Opacity Working Group, UC Berkeley, Berkeley, CA. Retrieved from https://www.youtube.com/watch?v
Carroll, S.R., Akee, R., Chung, P., Cormack, D., Kukutai, T., Lovett, R., Suina, M. and Rowe, R.K. (2021). Indigenous Peoples’ Data During COVID-19: From External to Internal. Frontiers in Sociology, 6. https://doi.org/10.3389/fsoc.2021.617895
Chiusi, F. (2020). Introduction. In: Automated decision-making systems in the COVID-19 pandemic: A European perspective. AlgorithmWatch and Bertelsmann Stiftung. Retrieved from https://algorithmwatch.org/en/wp-content/uploads/2020/08/ADM-systems-in-the-Covid-19-pandemic-Report-by-AW-BSt-Sept-2020.pdf.
Cifor, M., Garcia, P., Cowan, T.L., Rault, J., Sutherland, T., Chan, A., Rode, J., Hoffmann, A.L., Salehi, N., Nakamura, L. (2019). Feminist Data Manifest-No. Retrieved from: https://www.manifestno.com/.
Crawford, K. (2021). Atlas of AI: Power, politics, and the planetary costs of Artificial Intelligence. New Haven, CT: Yale University Press.
D’Ignazio, C. and Klein, L.F. (2019). Data feminism. Cambridge, MA: MIT Press.
De Souza, S. (2020, October 23). Proactive contracting for platform work: Making the design of terms and conditions more participatory. Feminist Approaches to Labour Collectives (FemLab.co) [Blog]. Retrieved from https://femlab.co/2020/10/23/proactive-contracting-for-platform-work-making-the-design-of-terms-and-conditions-more-participatory/.
Dencik, L., Hintz, A., Redden, J. and Treré, E. (2019) Exploring data justice: Conceptions, applications and directions. Information, Communication & Society, 22(7), 873-881, https://doi.org/10.1080/1369118X.2019.1606268
Duarte, M. (2020) Indigenous Scholarship on Refusal and The Prospects for Remaking Tech [Conference Session]. The Refusal Conference, Algorithmic Fairness and Opacity Working Group, UC Berkeley, Berkeley, CA. Retrieved from https://www.youtube.com/watch?v=SnC2KJWgFzY&t=5275s
Hoffmann, A.L. (2020). Feminist Data Manifest-No [Conference Session]. The Refusal Conference, Algorithmic Fairness and Opacity Working Group, UC Berkeley, Berkeley, CA. Retrieved from https://youtu.be/SnC2KJWgFzY?t=3754
O’Neill, P.H., Ryan-Mosley, T., Johnson B. (2020, May 7). A flood of coronavirus apps are tracking us. Now it’s time to keep track of them. MIT Technology Review. Retrieved from https://www.technologyreview.com/2020/05/07/
Leslie, D., Mazumder, A., Peppin, A., Wolters, M.K. and Hagerty, A. (2021). Does “AI” stand for augmenting inequality in the era of covid-19 healthcare? BMJ, 372(304). Retrieved from https://www.bmj.com/content/372/bmj.n304.
Vashist, T. and Krishnakumar, S. (2020). COVID-19 and the New Normal in India’s Gig Economy. DATACTIVE [Blog]. Retrieved from https://data-activism.net/2021/03/bigdatasur-covid-covid-19-and-the-new-normal-in-indias-gig-economy/
Wright, S. (2018). When dialogue means refusal. Dialogues in Human Geography, 8(2), 128–132. https://doi.org/10.1177/2043820618780570