Weapons of Math Destruction

by

Cathy O’Neil


Weapons of Math Destruction: Chapter 5: Civilian Casualties Summary & Analysis

In 2013, after the recession forced the city of Reading, Pennsylvania to make cuts to its police force (despite persistent crime), police chief William Heim invested in crime prediction software. The software, called PredPol (short for “predictive policing”), was made by a Big Data start-up based in California. It promised to use historical crime data to show, hour by hour, where and when crimes were likely to occur. By patrolling these hotspots, police could potentially cut down on crime—and a year later, burglaries in vulnerable areas were down by over 20 percent.
This passage continues to illustrate how technological advancements promise to make many parts of contemporary life more efficient. But thus far in the book, O’Neil has given several other examples of new technologies that have removed human influence from their processes and prioritized optimization over fairness and thorough vetting. It’s likely that this will be the case for PredPol as well.
Themes: Humanity vs. Technology; Discrimination in Algorithms; Fairness vs. Efficiency
Predictive policing software operates much like baseball statistics modeling—and because it targets geography, it’s supposedly free of the racism and biases that are embedded in the recidivism models the court system uses. And yet by allowing police to home in on “nuisance” crimes—vagrancy, panhandling, and small-scale drug use—the software makes officers more likely to target vulnerable and impoverished areas. When officers over-police an area, that policing creates new data, which in turn justifies more policing—and so the software can create dangerous feedback loops.
Since predictive policing software drives officers to certain geographic hotspots, police may claim that their work isn’t motivated by profiling. But at the same time, heavily policing an area makes it more likely to show up in the data as a place where crime is being caught—so the software will keep redirecting police to these same neighborhoods (which are often low-income and/or predominantly non-white).
Themes: Humanity vs. Technology; Discrimination in Algorithms
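To make the feedback loop concrete, here is a minimal, purely illustrative sketch in Python. It is not PredPol’s actual model, and every number and name in it is hypothetical. Two neighborhoods share the same true rate of nuisance crime, but one begins with slightly more recorded crime; because patrols follow the recorded data and crimes only enter the data where officers are present, the early gap never corrects itself.

```python
# Illustrative sketch of the feedback loop described above (NOT PredPol's
# actual model). Both neighborhoods have the same underlying crime rate, but
# patrols are allocated according to previously *recorded* crime, and crimes
# are only recorded where officers patrol, so the initial gap locks in.
import random

random.seed(0)

TRUE_RATE = 0.3                 # identical underlying rate in both areas (hypothetical)
PATROL_HOURS = 100              # patrol-hours to allocate each week (hypothetical)
recorded = {"A": 5, "B": 3}     # small historical imbalance in recorded crime

for week in range(1, 11):
    total_recorded = sum(recorded.values())
    for hood in ("A", "B"):
        # Software-style allocation: patrol where the data says the crime is.
        hours = round(PATROL_HOURS * recorded[hood] / total_recorded)
        # A crime only enters the dataset if an officer is there to record it.
        new_crimes = sum(1 for _ in range(hours) if random.random() < TRUE_RATE)
        recorded[hood] += new_crimes
    share_a = recorded["A"] / sum(recorded.values())
    print(f"week {week:2d}: share of recorded crime in neighborhood A = {share_a:.0%}")
```

Week after week, neighborhood A keeps roughly the same inflated share of recorded crime even though nothing about the two areas actually differs: the data the patrols generate keeps justifying the patrols.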
Human-run programs like stop and frisk (a program in which NYPD officers were given the go-ahead to stop, search, and frisk anyone who seemed suspicious, anywhere, at any time) have been shown to create even more friction and danger in vulnerable communities. Mathematical models now dominate law enforcement. And because of theories that link nonviolent crimes to a proliferation of violent crimes, police chiefs tend to believe that even data on nuisance crimes is useful for creating “better data” that can be used to focus more heavily on violent crime.
Stop and frisk was a program that encouraged racism and other forms of discrimination, since officers were free to stop and search people based on their own subjective judgments and biases. Predictive policing promised to solve this problem by taking human prejudice out of the equation, ideally making policing unbiased and fair.
Themes: Humanity vs. Technology; Discrimination in Algorithms; Fairness vs. Efficiency
Predictive policing software, however, doesn’t have the capacity to predict white-collar crime—even though the crimes carried out by the rich in the early 2000s arguably created some of the most widespread devastation in the U.S.’s recent history. Police forces across the country are using predictive policing data to reinforce zero-tolerance policies for violent crimes and nuisance crimes alike. And because those forces focus almost exclusively on the poor, it’s clear that they do have a choice in where they direct their attention. PredPol is, according to O’Neil, essentially a “do-it-yourself WMD”: its inner workings are hidden from the public, it creates dangerous feedback loops, and it’s growing in scale.
Certain crimes, like tax evasion or money laundering, aren’t the kinds of crimes that predictive policing software is looking out for. So, the software—like the human police who use it—is biased toward nuisance crimes committed in low-income or minority neighborhoods. Thus, predictive policing software is, at its core, biased against people of color and working-class people—even though it claims to be working against racism and classism.
Themes: Humanity vs. Technology; Discrimination in Algorithms; Fairness vs. Efficiency
While attending a data “hackathon” in New York in the spring of 2011, O’Neil and the New York Civil Liberties Union worked to break out important data on the NYPD’s controversial and harmful stop and frisk program. Data was driving police to stop, search, and frisk more and more people—mostly Black and Latino youths, only 0.1 percent of whom were actually linked in any way to a violent crime. Stop and frisk itself, O’Neil writes, isn’t a WMD—but it uses calculations to excuse thousands of invasive stop and frisk instances in vulnerable neighborhoods. Stop and frisk, while run by humans, did create terrible feedback loops, punishing Black and Latino men disproportionately for petty crimes and misdemeanors (like drinking in public) that white people were rarely punished for.
Here, O’Neil recalls an instance when she and her colleagues worked together to try to expose the data behind stop and frisk—in a sense, they were restoring a human hand to an automated algorithm. Reviewing how and why data encourages police to stop and frisk people in certain areas is essential in order to make sure that data isn’t contributing to discrimination (like racial profiling).
Themes: Humanity vs. Technology; Discrimination in Algorithms; Fairness vs. Efficiency; Data, Transparency, and U.S. Democracy
Non-white people who live in poorer neighborhoods with minimal access to good schools and job opportunities are more likely to be highly policed. So, WMDs like predictive policing and recidivism models used for sentencing guidelines are inherently racially biased and logically flawed. Even though these models claim to have lots of data about how likely a given person from a certain neighborhood is to commit more crimes upon release from prison, they don’t take into account the human factor. Thus, these WMDs only justify the systems that already exist—they don’t actually gather the data needed to question or improve them.
O’Neil suggests that data-driven programs like stop and frisk and PredPol can worsen the police’s unequal treatment of white people versus Black people and other minorities. While these WMDs advertise themselves as fair, they’re marred by bias against racial minorities and low-income people.
Themes: Humanity vs. Technology; Discrimination in Algorithms; Fairness vs. Efficiency; Data, Transparency, and U.S. Democracy
O’Neil suggests that data scientists working for the justice system should actually learn what goes on inside prisons and how those experiences affect prisoners’ behavior. Solitary confinement, rape, and malnutrition are all huge problems in the prison system. A serious data scientist, O’Neil suggests, would look at how things like more sunlight, more sports, better food, and literacy or educational programs might impact recidivism rates. Yet private prisons make up a $5 billion industry that thrives only when its institutions are at capacity. So, rather than analyzing what goes on inside prisons, these companies purposefully work to keep prisons mysterious spaces.
Rather than just looking at what factors make someone likely to commit a crime, O’Neil suggests that people should start looking at what actually makes life fairer, easier, and more equitable for disadvantaged people. Without transparency and a sense of humanity, any data-driven approach to crime and punishment will only contribute to greater inequality.
Themes: Humanity vs. Technology; Discrimination in Algorithms; Fairness vs. Efficiency; Data, Transparency, and U.S. Democracy
Stop and frisk, O’Neil suggests, will soon be a thing of the past. Facial recognition software is evolving every day, and soon, data-driven approaches to spotting potential lawbreakers will breed even more destructive WMDs. Already, police departments around the country are employing technology experts to develop WMDs that attempt to determine which people are most likely to commit crimes.
Technology is only going to get more sophisticated as time goes on—yet O’Neil implies that a serious reckoning with the social consequences of using biased data might never come to pass. So, as technology gets more advanced, it threatens to deepen social divides even more greatly.
Themes: Humanity vs. Technology; Discrimination in Algorithms; Fairness vs. Efficiency
In 2013, a 22-year-old Chicago man who lived in a high-crime, low-income neighborhood received a knock on his door from the Chicago PD. Officers told him that the department had its eyes on him, since he was associated with people who’d been caught up in the criminal justice system. Rather than trying to get to know the people who lived in neighborhoods where crime was an issue, the police were deepening divisions by essentially spotlighting innocent people.
This anecdote illustrates how data-driven surveillance can encourage discrimination. Innocent people can find themselves in the crosshairs of the police simply because an aggregation of data indicates that they’re similar to people who have committed crimes in the past. WMDs are directly creating painful incidents like this one.
Themes: Humanity vs. Technology; Discrimination in Algorithms; Fairness vs. Efficiency
It’s simpler, O’Neil asserts, to gather data and build models that assume people are all the same than to pioneer programs that help make the justice system fairer (though perhaps less efficient). Poor people and people of color all over the country, she says, are being caught in “digital dragnets”—and in the meantime, the affluent and white people who don’t trigger the algorithm get to live in blissful ignorance.
The algorithms that try to streamline criminal justice are doing serious and perhaps irreparable damage to U.S. society by dividing people along racial and class lines.
Themes: Humanity vs. Technology; Discrimination in Algorithms