The Digital Border - Oxford University

Transcript of an interview with Dhaksh about the digital border, conducted by Dr Anthea Vogl, Associate Co-Director of Border Criminologies at the Oxford University Faculty of Law.

Anthea: Please introduce yourself...

Dhaksh: Hello everyone, Vanakam. My name is Dhaksh. I’m joining you today from the land of the Gadigal people of the Eora Nation. I would like to pay my respects to their Elders past and present. This always was and always will be Aboriginal land. I am a queer genderfluid Tamil person who was born in Illankai (known as Sri Lanka) and my ancestors are from the East and North of the island. I am a recovering civil engineer who has spent the last decade working as an activist and campaigner in the non-profit sector. I’m a PhD candidate at the School of Regulation and Global Governance at ANU focussed on digital surveillance in India and Australia.

Anthea: In your research, advocacy and activism, you work with the concept of the digital border or technology-enabled border policing. Can you explain the basics of the concept and what theoretical conceptions of borders/policing/surveillance or state power you see it as in conversation with?

Dhaksh: So if the borders are, to use the words of Harsha Walia, carceral regimes at the nexus of the local and the global, that operate as sites of containment and social control; then, the digital border is the digitisation and automation of these regimes.

Perhaps I’ll first talk a bit about my understanding of digital borders through my experience as an activist, and then describe it theoretically.

I first began to think about digital surveillance as a corporate campaigner focused on tech companies that were part of a growing border and surveillance industry: companies such as IBM, Amazon, Google, Microsoft and Palantir, which had scaled up public-private partnerships with governments to deploy biometrics, motion detectors, drones, surveillance towers and large-scale databases (lie detectors, visa processing systems) at international borders. In examining some of these developments, it occurred to me that the Uighurs in Xinjiang, China, the Palestinians living under Israeli settler-colonial occupation, and the Kashmiris seeking freedom from the Hindu-nationalist Indian government are not only all living under regimes of violent social control, but that these regimes are all digitally enabled. In fact, this is increasingly true of all imprisoned and oppressed populations the world over, including here in this colony.

To get into the theory, scholars have documented how in Australia and globally, surveillance, or data collection, has been used by colonial powers to classify people and control newly acquired territories (Lyon, 2001; Zureik, 2013). Today the borders that “contain, channel, and sort” populations and persons according to racial and other social hierarchies have become virtual (Lyon, 2002; Lyon, 2005).

Many scholars refer to Pötzsch’s (2015) concept of the “iBorder” to describe the socio-technical assemblage (which includes data, algorithms and code) involved in sorting, categorising, and filtering individuals who are forcibly displaced or immobilised.

This is possible because big data is ubiquitous, and can include data traces collected in everyday activities, including financial transactions and social media use.

Key to the concept of the iBorder is that the border is not bound to a physical location. It is not about the spectacle at the border; instead, the datafied border is an entity attached to the physical self. The border is a method of governance.

In this way, data augments borders significantly; both the management of physical international borders and the dispersal of borders across and within societies.

It allows for the verification and identification of people with greater speed and accuracy, and at lower cost, than previously possible. This automation of border policing further exacerbates the criminalisation of already marginalised persons and groups.

Anthea: How have ideas of digital borders evolved and developed over time?

Dhaksh: Activists and scholars have become clearer on the “internalisation” of borders within a nation state (Latonero and Kift, 2018) and the “externalisation” of borders outside of the nation state.

In terms of externalisation, the use of digital surveillance technologies such as cameras, drones, integrated surveillance systems, and GIS-based risk analysis methods in the lands and waters surrounding particular nations has changed how people experience border crossing attempts.

In terms of internalisation, this refers to the ubiquity of digital surveillance in every facet of life.

What got me thinking about this was an interaction with a racist real estate agent who threatened to blacklist me on a tenancy database, which would have meant I could not get a rental property in NSW for three years. I started looking into some of the tenancy database companies and saw that they shared data (including visa status) with various government agencies, and even with companies overseas (for example, employee background-check companies).

In the Indian context, the clearest example of border internalisation is Aadhaar, the Indian government’s digital ID system, the world’s largest such system. It is mandatory for accessing state benefits and welfare subsidies, and for filing taxes. When registering for Aadhaar, people are required to share fingerprints and iris scans, in addition to their name, date of birth, gender, address and a facial photograph. Within a few years of being set up in 2009, it became a primary linking database, connecting bank accounts, mobile phones, income tax returns, payment apps and email IDs.

We can see how the internalisation and externalisation of borders, and the digital technology used to automate them, work together, in the name of efficiency, optimisation and risk management, to sort individuals and groups according to their perceived levels of dangerousness, criminality or worth (e.g. customer, credit and crime profiling), at greater scale and speed than historically possible (Feeley and Simon 1994; Graham 2005).

This idea of internalisation is also really helpful as it brings concepts of digital borders into conversation with digital surveillance in the social welfare system (such as the work of Eubanks 2017) and the criminal justice system (including work of scholars like Simone Browne 2015; Ruha Benjamin 2019).

Anthea: Can you talk a little about the focus of your own research and where you have applied or theorised the idea of technology-enabled borders/ border technologies within it: I am wondering if and why you find the framing useful, and/or any critique of the idea.

Dhaksh: Yes for sure. To answer your question I’ll first speak to how useful I find it, and how my thinking about it has shifted, and then speak to my research focus.

I do find the concept useful, but I would augment it in two main ways.

Firstly, given the ubiquity of digital borders, I find it helpful to pair the concept with work by surveillance studies scholars (Lyon 2005; Kitchin 2014; Zureik 2013) who theorise how borders (made up of data, code and algorithms) function to sort people spatially or geographically according to social hierarchies. Some are allowed into zones of privilege, access and opportunity, whilst others are relegated to zones of precarity, debility and death. Your displacement or containment depends on your positionality and identity within power structures such as white supremacy, caste/Brahmanical supremacy, heteronormativity, and socio-economic class.

This is not just about the maintenance of existing social orders but also about solidifying these into the future, through predictive policing.

Predictive policing allows computer systems to make predictions about the future using algorithms that have been trained on big data through machine learning. This can result in the consolidation of past and present social hierarchies into the future (Mertia 2020).

In this way, systems of surveillance and digital surveillance are instrumental in creating and reinforcing socio-spatial hierarchies, both in the present and into the future (Gandy 1993; Lyon 2003). This means borders, as state-corporate socio-technical assemblages, are not only sorting and ordering our neighbourhoods in geographically unequal ways in the present, but also shaping the ‘spatio-social production of the future’ (Jeffrey and Dyson 2021: 642; Leszczynski 2016).

If there is another concept I would pair with the digital border, it’s prefigurative politics (Jeffrey and Dyson 2021: 642; Leszczynski 2016).

Prefigurative politics is about building alternative possible future worlds through ‘organising (that) reflects the society we wish to live in’; ensuring ‘that the methods we practise, institutions we create, and relationships we facilitate within our movements and communities align with our ideals’ (Walia 2013: 9). Put simply, it is about practising the future we want to create in the present.

This can include imagining and performing governance practices that reflect an alternative legal reality. For example, Davina Cooper (2023) is currently undertaking a prefigurative project around ‘decertification’, which imagines the possibility of taking sex off birth certificates. This is not just a narrow legal project but part of the work necessary to imagine a future beyond the gender binary.

I guess as an activist I went deep into seeing the ‘dark side’ of prefiguration: for instance, how surveillance tech is first experimented on those at the margins and then expanded out to broader populations. We saw this with Pegasus spyware, developed by the Israeli cyber-arms company NSO Group. And interestingly, what we now know as Aadhaar actually began in 1999 as an identity card project for citizens living in border states. It is concerning to think about how Xinjiang, where China has arguably created ‘the world’s largest open-air digital prison’, provides a pre-emptive glimpse of what digital technology combined with authoritarianism might have in store (Polyakova and Meserole 2019).

I’m feeling a sense of urgency about seeking out and nurturing alternative digital futures.

Hence I have shifted my research to focus on the digitally enabled surveillance architectures in South India, and how smart cities contribute to deepening autocratisation and reinforcing inequalities, both in the present, and through the unequal spatio-social production of the future.

We’ve seen Narendra Modi’s BJP government, criticised as a brazen Hindu-nationalist project, make digital surveillance pervasive across every facet of Indian life. The Modi government’s ‘Smart Cities Mission’ is an ambitious plan to transform 100 existing Indian cities into smart cities. This is part of a broader thrust towards ‘e-governance’ under the banner of the ‘Digital India’ programme, which includes the world’s largest biometric ID system (known as ‘Aadhaar’), the forced adoption of digital payment systems, and a new facial recognition system that some suggest could become the world’s largest (the Automated Face Recognition System).

My study seeks to understand how this corporate and state-driven narrowing of possible urban futures can be countered through alternative imaginings of the future city.

Anthea: Your research and writing are very much grounded in your work, your personal experience, expertise and activism. Has there been a fieldwork or advocacy experience, moment or project that has had a particular impact on you, or that has been memorable as you continue to undertake work in this field?

Dhaksh: I recently undertook some pre-fieldwork in Chennai, Tamil Nadu, India. During this, I spent a lot of time with queer and trans community and civil society organisations.

During COVID, in November 2020, the Ministry of Social Justice and Empowerment (MSJE) implemented a National Portal for Transgender Persons, intended to provide trans persons with a comprehensive platform for acquiring a transgender certificate and identity card (TGID) based on their self-perceived identity. The ID card and certificate are mandatory for accessing any of the social welfare schemes for trans people, including skill training, scholarships, shelter homes and medical insurance.

To apply for a TGID, trans people must upload a self-attested affidavit declaring their place of residence and self-perceived identity, along with any one of the following ID documents: passport, bank passbook, birth certificate, PAN card, Aadhaar card, MGNREGA card or caste certificate. Procuring any identification document in one’s preferred name and gender requires existing identification documents in one’s given name and assigned gender; however, trans people often run away from home without such documents (Brindaalakshmi 2021).

There have been numerous challenges associated with the TGID. Many trans persons do not have the financial resources to apply (it costs Rs 300-400 to get an affidavit made). Digital processes are also not the most effective way to collect data on trans people, given that an estimated no more than 10-15% of transgender persons use any digital device (Centre for Internet and Society 2020).

The forced digitisation of social welfare, implemented without consideration for the gap in access to the internet and digital devices in India, inevitably further marginalises gender and sexual minorities (GSM). Accessing welfare becomes synonymous with being under constant state surveillance, requiring the disclosure of details such as gender identity, sexual orientation and marital status, among other personal data (Brindaalakshmi 2022).

However, since welfare is necessary for the everyday struggle of staying alive, particularly in a pandemic, activists focused on trans rights must fight for better access, inclusion and data protection within state surveillance systems. When it comes to resisting oppressive digital surveillance, activists are also balancing multiple objectives in the broader fight for trans rights, including access to education and employment, access to shelter, access to alternative family structures and protection from police brutality amongst numerous other needs (Banu 2020).

Academics and activists based in the West need to be careful about bringing assumptions about which battles are the priority for the people most impacted by these systems on the ground, and we need to develop theoretical frameworks that reflect the nuance and complexity of local contexts.

References

Benjamin, R. (2019) Race After Technology: Abolitionist Tools for the New Jim Code. Cambridge: Polity Press.

Brindaalakshmi, B.K. (2021) Trans lives under surveillance. Available at: https://in.boell.org/en/2021/11/30/trans-lives-under-surveillance

Brindaalakshmi, B.K. (2022) Rights of gender and sexual minorities in the age of datafication. Available at: https://in.boell.org/en/gender_data

Banu, G. (2020) Writ Petition Summary: Grace Banu: Challenges to Transgender Persons Act. Supreme Court Observer. Available at: https://www.scobserver.in/reports/swati-bidhan-baruah-union-of-india-challenges-to-transgender-persons-act-writ-petition-summary-grace-banu/

Browne, S. (2015) Dark Matters: On the Surveillance of Blackness. Duke University Press.

Cooper, D. (2023) Prefigurative Law Reform: Creating a New Research Methodology of Radical Change. Critical Legal Thinking. Available at: https://criticallegalthinking.com/2023/03/03/prefigurative-law-reform-creating-a-new-research-methodology-of-radical-change/.

Eubanks, V. (2017) Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor. New York: St. Martin’s Press.

Feeley, M. & Simon, J. (1994) Actuarial Justice: The Emerging New Criminal Law. In The Futures of Criminology, edited by D. Nelken. London: Sage.

Graham, S. D. N. (2005) Software-sorted geographies. Progress in Human Geography, 29(5): 562–580.

Jeffrey, C. & Dyson, J. (2021) Geographies of the Future: Prefigurative Politics. Progress in Human Geography 45(4): 641–58.

Kitchin, R. (2014) The Real-Time City? Big Data and Smart Urbanism. GeoJournal 79:1–14.

Latonero, M. & Kift, P. (2018) On digital passages and borders: Refugees and the new infrastructure for movement and control. Social Media + Society (20 March).

Lean, T (ed) (2023) The Incarceration Issue. Archer Magazine 18. Available at: https://archermagazine.com.au/2023/03/archer-magazine-18-the-incarceration-issue/

Leszczynski, A. (2016) Speculative futures: Cities, data, and governance beyond smart urbanism. Environment and Planning A 48(9): 1691–1708.

Lyon, D. (2003) Surveillance as Social Sorting. London: Routledge.

Lyon, D. (2007) Surveillance Studies: An Overview. Cambridge: Polity Press.

Polyakova, A. & Meserole, C. (2019) Policy brief: Exporting digital authoritarianism: The Russian and Chinese models. The Brookings Institute. Available at: https://www.brookings.edu/wp-content/uploads/2019/08/FP_20190827_digital_authoritarianism_polyakova_meserole.pdf

Pötzsch, H. (2015) The emergence of iBorder: Bordering bodies, networks, and machine. Environment and Planning D: Society and Space 33(1): 101–118.

Mertia, S. (ed.) (2020) Lives of Data: Essays on Computational Cultures from India. Institute of Network Cultures: Theory on Demand #39.

Metcalfe, P. & Dencik, L. (2019). The politics of big borders: Data (in)justice and the governance of refugees. First Monday, 24(4). https://doi.org/10.5210/fm.v24i4.9934

Swaminathan, M. & Basu, A. (2020) Surveillance and Data Protection: Threats to Privacy and Digital Security. The Centre for Internet and Society. Available at: https://cis-india.org/internet-governance/blog/india-digital-freedoms-5-surveillance

Walia, H. (2013) Undoing Border Imperialism. Institute for Anarchist Studies.

Zureik, E. & Hindle, K. (2004) Governance, Security and Technology: the Case of Biometrics. Studies in Political Economy, 73(1): 113-137.