Technological Refusal When Our Bodies Are at Stake
As someone who writes and thinks extensively about technological refusal – the contestation of sociotechnical systems – in marginalised communities in the US, I should not have been surprised by an issue that arose in my own backyard this past year. But I was.
As it turns out, Ocado, a company that the UK considers one of its finest, most innovative firms (a ‘Microsoft of retail’), plans to operate a depot directly behind the primary school attended by my children. The company intends to fulfil thousands of online grocery orders at this depot, meaning hundreds of delivery trucks would traverse the area daily.
Local residents worry about the impacts of noise, air, and light pollution not only on the children who attend this school, but also on everyone living in the surrounding area. In late 2020, thanks to the pleas of local residents, the local council revoked the tech company’s planning licence. Ocado is determined to move ahead, however. After a court affirmed the legality of the licence revocation, the company is moving to appeal the decision. However hard Ocado pushes, it faces fierce opposition from outraged residents.
A stance of adversarialism and contestation feels all the more urgent these days. Like countless examples that Our Data Bodies – the project I co-founded and co-lead – has surfaced, this case exemplifies the bully tactics and coercive logics of tech companies: let us do what we want, or be damned.
Our Data Bodies has talked to marginalised people in Charlotte, Detroit, and Los Angeles about their experiences with data collection, data-driven systems, and the people and institutions that manage them. And in the case of the Ocado depot, I see a familiar fighting response: Hell no… we deserve better than you.
If I understand the responsible data community correctly, it urges that data collection, data analysis, and data-driven decisions be carried out in ways that recognise context and history, including histories of oppression or injustice. Responsible data advocates connect with data-for-good efforts; stress ethics, sustainability, and accountability; and emphasise justice and equity at each stage of the life cycle of data-driven systems.
Our Data Bodies recognises and engages in this kind of work: just this past fall, Tawana Petty and Tamika Lewis, both co-founding members, contributed to A Toolkit Centering Racial Equity throughout Data Integration. Work of this nature will remain important and necessary for challenging systemic and institutionalised oppression.
On top of this, the current political and economic climate demands attention to countering the power of sovereign technologies. Companies that make optimisation technologies – technological systems that constantly suck up data about us and adjust or optimise services on that basis – have us in the palm of their hand. They get us hooked on a service and then act with impunity, becoming the sovereign power over our daily habits.
During the Covid-19 pandemic, thanks to increased demand for virtual services, the power of these companies has skyrocketed. Tech CEOs and tech companies have taken advantage of a health crisis to drive demand, deploy technological infrastructures, and lock in users.
Many of these users have been people ensconcing themselves at home waiting for their online grocery deliveries or remote health diagnoses – consumers and citizens. But perhaps even more significantly, users have also included state institutions pressured to offer online services, including automated government services, on short notice. As Julia Glidden, Corporate Vice President of Worldwide Public Sector at Microsoft, said, “We can do virtually any interaction with the government in our pajamas… and what we can do now is do it at scale.” Indeed, in Europe and the United States – the two places I know best – education, health, policing, and other basic public services now rely on a combination of digital self-service, online service delivery, predictive analytics, and remote diagnostics.
From citizens and consumers to governments, users fully depend on tech companies to keep them afloat (those who live beyond the reach of tech infrastructure are dependent too – though instead of using tech to stay afloat, they are simply left behind).
As dependence on tech infrastructure deepens, state and private actors are normalising surveillance – making it seem inevitable, unavoidable, common sense. Citizens and consumers are encouraged to adopt or accept real-time automated services, like ‘smart’ doorbells at home and facial recognition cameras on the street. The increased demand for these services drives more surveillance, with tech suppliers able to fine-tune what user data they monitor and process.
Similarly, in the workplace, employers now routinely monitor worker productivity, whether employees are based at home or on-site. Health-status monitoring is the most common of these practices, though performance management is also widespread.
Low-wage workers at Amazon, for example, can live in a constant state of anxiety, with some subjected to a pack-rate target of 60 to 90 boxes per hour. High(er)-wage workers experience workplace surveillance in different ways, but arguably with as much anxiety, as employers install monitoring software designed to measure productivity or, in some cases, to detect employee actions for which the company could be legally liable.
At stake is no less than our bodies – where we can move, how we can move, who we can move with. Take iris scan technology, for example. Tech evangelists in the humanitarian field justify it as a means of deterring fraud and waste in humanitarian assistance. For the past decade, the push towards scanning the irises of refugees has grown alongside the idea that humanitarian assistance needs to be optimised. In 2020, UNHCR reported wide-scale implementation of iris scanning in refugee programmes in Bangladesh, Ethiopia, Zambia, and Malawi, and smaller-scale or pilot programmes in Costa Rica, Greece, Burundi, Iran, and Rwanda. Today, iris scanning of refugees is expanding to food assistance.
These systems’ focus on optimisation is a familiar story in a neoliberal state – and certainly one on display in the earlier research we did with Our Data Bodies. They trade empathy for efficiency, and privacy and dignity for access to basic needs. By limiting daily interactions to a set of participating providers, they narrow the choices and opportunities available for self-development, let alone self-determination.
While iris scan technology may be a long way off in mainstream consumer, citizen, or employee settings, the infrastructural logic it embodies travels easily to other contexts. Given how central technology has become to the post-Covid state, it may be only a matter of time before ‘stimulus’ payments or universal basic income is mediated through a locked-in system architecture.
But the supremacy of these systems is not a done deal, and there are chinks in the armour of sovereign tech. I see hope in the continued work of Our Data Bodies, kindred groups, and inspiring movements – and, in my own backyard, in the court decision against Ocado. Let’s hope the future brings more successes like these.