Weapons of Math Destruction

by Cathy O’Neil

Weapons of Math Destruction: Conclusion Summary & Analysis

WMDs cause destruction and chaos throughout society: in public schools, colleges, courts, workplaces, voting booths, and more. But it’s too late to disarm these weapons one by one—they all feed into one another. Data encourages companies to send people predatory ads. It also encourages police to go into vulnerable neighborhoods, and then it influences the courts to give the people whom police arrest longer prison sentences. All this data tells other WMDs that these people are high risks—so they’re blocked from jobs and watch helplessly as their interest and insurance rates ratchet up. All the while, WMDs keep the wealthy and comfortable in silos of their own, ignorant of others’ suffering. WMDs are part of a “silent war.” 
Here, O’Neil walks readers through the interconnected nature of the processes behind applying to schools and jobs and seeking credit and insurance. In doing so, she’s illustrating the reason that WMDs are so dangerous: because the data used to create them now feeds multiple systems at once. It’s hard to disengage from WMDs because they’re virtually everywhere.  
Themes: Humanity vs. Technology; Discrimination in Algorithms; Fairness vs. Efficiency; Data, Transparency, and U.S. Democracy
Corporations can right wrongs through changes in policy. For instance, even though President Bill Clinton signed the Defense of Marriage Act into law in 1996, IBM promised a week later to extend benefits to its employees' same-sex partners. It did so not necessarily out of morality, but because other tech giants were already doing so, and it didn't want to lose employees to competitors. So, in a bid to attract a growing talent pool of LGBT workers, IBM corrected an unfairness.
Here, O'Neil shows how a simple change in policy at a crucial moment had a ripple effect. Even though IBM made this particular move to maximize its efficiency and competitiveness in the market rather than purely in the name of fairness, the example suggests that there's still room for major corporations to make big changes in the name of equity and objectivity.
Themes: Humanity vs. Technology; Fairness vs. Efficiency
In that scenario, everyone won, but companies aren't always so incentivized to dismantle their WMDs. Many WMD victims are the most voiceless and disenfranchised: the poor, the incarcerated, the vulnerable. WMDs begin by operating on these easy targets. But it won't be long, O'Neil predicts, before these models evolve and spread, targeting the middle and upper classes in their search for new opportunities.
Themes: Humanity vs. Technology; Discrimination in Algorithms; Fairness vs. Efficiency; Data, Transparency, and U.S. Democracy
The main difference between the WMDs of the present and the prejudiced human errors of the past is simple: humans can evolve, learn, and adapt. But automated systems are stuck in time—engineers have to change them as society progresses. So essentially, “Big Data processes codify the past” rather than inventing the future. Only humans have the “moral imagination” needed to create a better world. O’Neil asserts that humanity is in the throes of a new kind of industrial revolution—and it is urgent that we learn from the mistakes of the last one, which exploited workers and endangered lives in the name of profit.
Themes: Humanity vs. Technology; Fairness vs. Efficiency; Data, Transparency, and U.S. Democracy
We need to regulate the mathematical models that increasingly run our lives, and we must start with the modelers themselves. Just as doctors swear the Hippocratic Oath before obtaining their medical licenses, O'Neil suggests, data scientists need to abide by moral codes and strictures that prevent them from doing harm to others. Regulating WMDs would be difficult and deeply involved, but O'Neil argues that even if it comes at a cost to efficiency, we must start to "impose human values" on WMDs and "get a grip on our techno-utopia."
Themes: Humanity vs. Technology; Fairness vs. Efficiency; Data, Transparency, and U.S. Democracy
In order to disarm WMDs, we must admit that they can't do everything. We must measure their impact by auditing their hidden algorithms and studying their biases and shortcomings. Unfair systems, like the value-added model used to score teachers, must be scrapped entirely. Rather than letting negative feedback loops slip through the cracks, analysts must figure out how WMDs can create positive feedback loops that change lives and benefit society. While some algorithms, like Amazon's and Netflix's, can be trusted to sort the kinds of entertainment people enjoy, recidivism models and other algorithms used in the justice system must be held to unimpeachable standards, even if that means revising them and changing their inputs altogether.
Themes: Humanity vs. Technology; Fairness vs. Efficiency; Data, Transparency, and U.S. Democracy
Not all potential WMDs are nefarious. But the point is that we need analysts and auditors to maintain the systems that govern our lives and make them more transparent. Internal audits alone aren't enough, O'Neil states, because companies that examine their own algorithms can't be held accountable. Outside input is needed to make sure that companies like Google and Facebook stay in line. And regulation and transparency are needed in peer-to-peer lending, healthcare and health insurance, and credit-score models.
Themes: Humanity vs. Technology; Data, Transparency, and U.S. Democracy
In 2013, O'Neil began working as an intern at New York City's Housing and Human Services departments. She wanted to build the opposite of a WMD: a model that would help stop houseless people from getting pushed back into shelters and help them find stable housing. She and her team found that families who received Section 8 affordable housing vouchers were less likely to return to shelters. But the city government was trying to move families away from Section 8 and toward a new program, Advantage, which limited subsidies to a few years. Public officials were not happy to hear about O'Neil's team's research.
Themes: Humanity vs. Technology; Discrimination in Algorithms; Fairness vs. Efficiency; Data, Transparency, and U.S. Democracy
But Big Data, O'Neil asserts, should be disruptive when it comes to things that actually matter, like human rights. There are so many mathematical models out there today that could be used to do good, but instead, they often wind up being abused. Yet there's hope in the form of supply-chain models that seek out sweatshops and other places where slave labor is used to build products, and predictive models that try to pinpoint homes where children are more likely to suffer abuse.
Themes: Humanity vs. Technology; Discrimination in Algorithms; Fairness vs. Efficiency
O’Neil hopes the WMDs that are around today will soon become relics of the past. She hopes that we can learn from our present moment—the early days of a “new revolution”—and learn to bring transparency, fairness, and accountability to the age of Big Data.
Themes: Humanity vs. Technology; Discrimination in Algorithms; Data, Transparency, and U.S. Democracy