This post was written by Ursula Daxecker and Stefania Milan as part of the GDC-funded project “Political Microtargeting on Social Media in Diverse Democracies”.
Digital platforms and mobile apps such as WhatsApp and Facebook have become important political tools in elections in the Global South. People increasingly receive political information online rather than through face-to-face interactions or traditional media—but personalization algorithms have the side effect of producing homophily and polarization (the so-called “filter bubble”). Recent elections in non-Western democracies such as Brazil, India, Nigeria, Pakistan, and the Philippines have exposed potentially worrisome consequences. In diverse societies, religious, ethnic, and political cleavages are often mobilized during elections, and parties increasingly exploit these cleavages when microtargeting voters through WhatsApp or Facebook. The ruling party in India, for example, uses in-depth demographic profiles to target voters based on caste or religion, spreading messages among like-minded individuals. This microtargeting often relies on misinformation and hateful rhetoric that exacerbates existing divisions and prejudice, ultimately harming democratic public discourse. In India, messages exploiting distrust and hatred of Muslims are especially common. The spread of misinformation through social media can amplify biases against marginalized groups, increase polarization, or even lead to hostilities and violence.
In our GDC-funded project, we explore the consequences of political microtargeting and misinformation, in particular hostile messages, for people’s beliefs and democratic attitudes in India. We are especially interested in how the salience of political misinformation and citizens’ partisan and group attachments affect beliefs and attitudes. We also want to know how platform users perceive the role of technology and algorithms in shaping their political views, and what they expect from corporations. Studying the consequences of political microtargeting is urgent because political misinformation spread via social media plays an increasingly important role in elections across the world. However, almost all studies thus far have focused on Western contexts, particularly the US. Our study shifts the focus to India, the most populous democracy in the world and the biggest (and fastest-growing) market for WhatsApp and Facebook. Fake news spreads widely on these platforms, and both have come under pressure for failing to crack down on hate speech and rumors by politicians linked to the ruling party. In recent years, concerns about democratic decline in India have intensified, and the potential role of fake news in undermining democratic attitudes requires urgent attention.
Ursula Daxecker is a political scientist at the University of Amsterdam. She specializes in elections and violence, including through the ERC-funded project “Elections, Violence, and Parties”. She can be reached at u.daxecker@uva.nl or @uedaxecker.
Stefania Milan is a data sociologist at the University of Amsterdam. Her research explores the interplay between digital technologies, activism, and governance, including within the ERC-funded DATACTIVE project (https://data-activism.net). She can be reached at s.milan@uva.nl or @annliffey.