Are we automating inequality?

Ellen Coughlan, Social Justice theme team


Over a long winter break, I was finally able to tackle that most daunting task: the unread books that sulk in their stack next to my bedside table. Though I could write a few hundred words praising Ferdia Lennon’s debut or musing with wonder about Colson Whitehead, I wanted to write about Virginia Eubanks’ book, Automating Inequality: How High-Tech Tools Profile, Police and Punish the Poor. The depth of my reading backlog is somewhat exposed here, because this book was published in 2018, but really it’s becoming more frighteningly relevant with each year that passes as technology accelerates.


The book describes three examples from the United States that demonstrate how systems can further marginalise people experiencing poverty, but its lessons apply across the world and are relevant to the data science tools being developed and implemented in healthcare. Recent research from the Health Foundation found that people who live in more deprived places are less likely to think that technology will improve the quality of healthcare, which should give us all pause for thought.


Eubanks’ approach of considering each of these cases from the perspective of the people impacted by the technology imbues the book with a visceral humanity and, often, tragedy. It reminds the reader that the design of these tools must place individuals most at risk of marginalisation front and centre, and it provides a rallying cry: tools should contribute to the dismantling of structural inequality rather than simply record its creeping offence. There are a number of takeaways from the book for anyone concerned with equity, justice and data-driven tools. I’ll choose just a couple.

Two ceramic-like hands grip and pull on delicate threads that emerge from a "woven circuit board." The contrast between the rigid, heavy material of the hands and the soft, fragile threads creates a visual paradox, symbolising the insertion of human touch into the mechanised world. The image evokes a sense of personified anonymity, questioning whose histories and labours are being revealed or concealed when the threads of technology are pulled.
Hanna Barakat + AIxDESIGN & Archival Images of AI / Better Images of AI / Woven Circuit / CC-BY 4.0

Designing the human dimension of care

The importance of participatory design, and of measures to promote monitoring, accountability and human oversight, is clear throughout the book. In the case studies that Eubanks describes there is some degree of transparency, an attempt at participation and an effort to understand the experience of the people the tools would impact. That wasn’t enough. Thoughtful, thorough participatory design, threaded through the development, implementation and monitoring of digital tools, is essential to avoid systems that remove the important elements of human intervention. The Strategy Unit recently published a report on Digital Downsides, highlighting the relational trade-offs that we encounter and must grapple with when these tools are used in healthcare, including a shift to transactional care that emphasises ‘simple’ needs over complex conditions and diminishes the human dimension of care.

This decline of human interaction is a red flag glowing throughout the chapters; human interaction provides “accountability for reasonableness” and the opportunity for intervention should a patient feel that they’ve encountered a poor outcome or a decision they don’t understand. Eubanks raises the alarm on systems that push the burden of investigating and resolving suspected errors onto the very people the tools affect. The EU AI Act has been criticised for neglecting to afford individuals the right to participate: the requirement to engage with individuals impacted by AI in Fundamental Rights Impact Assessments was removed before the Act was adopted. Here at DSxHE, our Participatory Research theme has held a number of webinars on good practice; join them on Slack and watch them in action here.


Structural inequalities and quality data

In one case, Eubanks describes a system that predicts the risk of harm to children in different homes. The dataset on which the system was trained consisted almost entirely of families living in poverty; parents were concerned that the system would confuse “parenting while poor” with “poor parenting” and incorrectly flag risks, while the operators were concerned that it would fail to flag risks to children living in more affluent households. These issues did not begin with the tool itself; their roots lie in the structural inequality and bias that exist in society and are reflected in the systems that collect data. The hazard of automation is the amplification of inequity, unless explicit action is taken to design tools that counter these biases and to give users knowledge and guidance on bias. While the political outlook on tackling this issue seemed dim this week, as the US revoked a 2023 executive order addressing the risk of bias in AI tools, the data science community’s efforts have gathered momentum. STANDING Together, a coalition of people from 58 countries, published recommendations last month on addressing bias and transparency in health datasets.
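To make that mechanism concrete, here is a minimal toy sketch in Python. The data, feature names and numbers are entirely invented for illustration; this is not the model Eubanks describes. In the sketch, the underlying rate of harm is identical across income groups, but harm in low-income households is far more likely to be noticed and recorded by public services, so a model trained on the recorded labels learns that a marker of poverty predicts risk:

```python
# Toy sketch with invented data -- not the real screening model.
# Underlying harm rates are identical across income groups, but harm is
# far more likely to be *recorded* for families visible to public services.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n = 20_000

low_income = rng.random(n) < 0.5        # group membership
true_harm = rng.random(n) < 0.05        # same base rate in both groups

# Biased observation: harm is recorded 80% of the time for low-income
# families (high contact with services) vs 10% for affluent families.
p_recorded = np.where(low_income, 0.8, 0.1)
recorded_harm = true_harm & (rng.random(n) < p_recorded)

# Hypothetical feature: count of contacts with means-tested services --
# in effect a proxy for poverty, not for risk.
service_contacts = rng.poisson(np.where(low_income, 6.0, 1.0))

model = LogisticRegression()
model.fit(service_contacts.reshape(-1, 1), recorded_harm)

# Two families with identical (unobserved) true risk get different scores.
print("risk score, high service contact:", model.predict_proba([[7]])[0, 1])
print("risk score, low service contact: ", model.predict_proba([[1]])[0, 1])
```

The model never sees income directly, yet families with identical underlying risk receive very different scores: the bias enters through what the data collection system observes, which is exactly the dynamic the parents feared.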

A collage that merges circuit board patterns with textile motifs in a grid-like background of alternating black, grey, and white. Two hand-drawn arms are on each side of the image, positioned as if gently pulling on thin, white strings that cross the image diagonally. The hands appear soft and somewhat translucent, contrasting with the rigid lines of the circuit board patterns behind them. The strings are woven through both the hands and the background, symbolising the connection between traditional weaving and modern technology. The overall colour palette features muted earth tones, including browns, beiges, and greys, creating a sense of both history and continuity between the natural and technological worlds.
Hanna Barakat + AIxDESIGN & Archival Images of AI / Better Images of AI / Textiles and Tech 2 / CC-BY 4.0
 

As a data science community we must challenge inequity by devoting resources to meaningful participatory design, understanding structural inequality, welcoming curiosity and humility, and learning from one another. For those looking to make a resolution for the new year, Eubanks’ oath of non-harm for an age of big data would be a good starting point, and we’re discussing it here on our Social Justice channel:


“I swear to fulfil, to the best of my ability, the following covenant:

I will respect all people for their integrity and wisdom, understanding that they are experts in their own lives, and will gladly share with them all the benefits of my knowledge. 

I will use my skills and resources to create bridges for human potential, not barriers. I will create tools that remove obstacles between resources and the people who need them.

I will not use my technical knowledge to compound the disadvantages created by historic patterns of racism, classism, able-ism, sexism, homophobia, xenophobia, transphobia, religious intolerance, and other forms of oppression. 

I will design with history in mind. To ignore a four-century long pattern of punishing the poor is to be complicit in the “unintended” but terribly predictable consequences that arise when equity and good intentions are assumed as initial conditions. 

I will integrate systems for the needs of people, not data. I will choose system integration as a mechanism to attain human needs, not to facilitate ubiquitous surveillance. 

I will not collect data for data’s sake, nor keep it because I can. 

When informed consent and design convenience come into conflict, informed consent will always prevail. I will design no data-based system that overturns an established legal right of the poor. 

I will remember that the technologies I design are not aimed at data points, probabilities, or patterns but at human beings.”

 

Virginia Eubanks (2018). Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor. New York: Picador, St Martin’s Press.



