
Virginia Eubanks

Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor

Nonfiction | Book | Adult | Published in 2018


Summary and Study Guide

Overview

Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor (2018) by political scientist Virginia Eubanks chronicles the history of the welfare support system in the US, focusing on its widening inadequacies amid the rise of automation and artificial intelligence technologies. While proponents believe these techniques to be revolutionary solutions, Eubanks argues that despite promising to “shake up hidebound bureaucracies” and “increase transparency,” the new regime of data analytics “is simply an expansion and continuation of moralistic and punitive poverty management strategies” present since the 1820s (37). She supports this argument with case studies of automated public assistance systems across America.

Critically lauded on publication, Automating Inequality won the 2019 Lillian Smith Book Award and the 2018 McGannon Center Book Prize, and was shortlisted for the Goddard Riverside Stephan Russo Book Prize for Social Justice.

Plot Summary

Eubanks uses the history of the poorhouse—an institution that punished and contained the impoverished in the 19th and early 20th centuries—as a metaphor for modern automated data systems. She argues that the entrapments of the modern welfare state are a digital version of these rickety structures, profiting from poverty in much the same manner.

The first case study is the automated welfare eligibility system in Indiana. Eubanks connects its failures with overarching societal mechanisms that use discriminatory profiling on the impoverished.

The second case study is the coordinated entry housing system in Los Angeles, where an algorithm scores applicants’ neediness, hiding the results from recipients while sharing them with the police officers who pursue them. Profiling turns into policing, eroding the illusion of impartial aid.

The final case study is the predictive risk model for child abuse from Allegheny County, Pennsylvania. Based on a flawed premise and misrepresentation of variables, this system indirectly punished the poor for accessing public services.

Notably, all three case studies offer extensive evidence of racial profiling, segregation, and discrimination at the heart of supposedly impartial algorithms, revealing a compounding problem: administrative leadership’s refusal to correct course and redistribute resources as the welfare gap in American society expands. Automated technologies not only perform inadequately; by profiling so extensively, they also police and punish the poor simply for their social conditions.

Eubanks acknowledges technology’s potential to aid labor movements, but stresses that the most impactful change (aside from a Universal Basic Income) would be to tell better stories with our datasets.

