Weapons of Math Destruction

by Cathy O’Neil


Data, Transparency, and U.S. Democracy Theme Analysis


One of the hallmark qualities of a WMD—a “weapon of math destruction,” or a destructive mathematical algorithm—is, in author Cathy O’Neil’s view, the fact that it’s “opaque.” In other words, the systems that govern it (and sometimes its overall purpose) are kept secret or shrouded in mystery. Things like FICO credit scores and the baseball statistics used in game wagers are transparent: anyone can access them. But tech companies like Google and Facebook use decidedly opaque methods to gather data from their users, while companies like Sense Networks are transparent about how they gather data but completely mum as to how they use it. This tendency toward secrecy, O’Neil asserts, poses a threat not just to individual citizens’ privacy, but to the fabric of American society as a whole. Data collection plays an increasingly important role in U.S. civic life, influencing political polling, campaign advertising, public services like housing assistance, and voting. Its misuse, O’Neil argues, therefore threatens the transparency and legitimacy of public institutions and democratic processes, from federal aid programs to major elections to data-backed public policy decisions.

Big tech companies have set a precedent for gathering, interpreting, and using data through hidden methods. Targeted ads are now a part of daily life: based on how people navigate the internet, shop and bank online, and more, they’re shown advertisements designed to draw out even more of their data and personal information. Large companies can then analyze or sell off that data and steer consumers toward goods and services based on their browsing history. But companies like Facebook aren’t just using user data to sell products—they’re using it to determine what kinds of news people see, how they interpret it, and how it affects them. In the run-up to the 2016 U.S. presidential election, Facebook came under fire for manipulating its users’ news feeds to gather data on how they’d interact with different kinds of news. Facebook’s priority was gathering and analyzing data about user behavior—not ensuring that its users got accurate, fairly reported news. By tampering with people’s feeds, Facebook’s data scientists contributed to a dangerous rise in misinformation—especially misinformation about the turbulent political climate and “fake news” about each presidential candidate.

Now, political campaigns and government institutions in the U.S. are adopting big tech’s methods of gathering and using data manipulatively. In 2011, Rayid Ghani—a data scientist for Barack Obama’s reelection campaign—used software he’d pioneered as an analyst at the consulting firm Accenture to gather data on swing voters (people who aren’t firm supporters of any one candidate or political party). He then used the software to identify millions of Americans who fit the profiles of those swing voters and targeted them all with political advertisements. This process, called “microtargeting,” is useful to campaigns—but it’s also an invasion of personal privacy. On its own, targeting potential voters with ads isn’t unprecedented or even necessarily harmful. But by keeping tabs on users’ “likes” and using them to infer and rank people’s personality traits, political operations like the Obama campaign (and, later, Hillary Clinton’s 2016 presidential campaign) essentially turned the voting public into a financial market.

City and state governments, too, increasingly rely on data to determine how they function. In 2013, O’Neil worked for a time with New York City’s Housing and Human Services departments, building a model to help move houseless people out of shelters and into homes. But the city didn’t want to spend money on Section 8 housing vouchers, even though O’Neil’s data showed that the vouchers helped disadvantaged people. City officials ignored her research and instead poured their resources into a new program that significantly limited housing subsidies. In this case, the city government ignored how data could help people, focusing only on how microtargeting ads for a predatory program could keep budgets low.

When government institutions start acting like private tech companies, O’Neil asserts, American democracy comes under threat. Tactics like microtargeting and reckless mass data-gathering “infect our civic life.” Flawed data now dictates how politicians campaign and how newscasters report on political events. Hillary Clinton’s 2016 presidential campaign is a good example: according to most polls and mainstream media outlets in the weeks leading up to the election, she was the predicted winner. Yet she lost to Donald Trump, a reminder that data-driven predictions can’t always be trusted. And when lobbyists and interest groups can target American citizens with misinformation—dangerously false or misleading claims that can sway how people vote—the integrity of democracy is threatened. By shaping people’s political opinions with flawed data, groups that should be pursuing transparency and democracy instead favor profits and fast, easy solutions.

When people don’t have all the information, they can’t make good decisions or faithfully perform their civic duties. Though algorithms, microtargeting, and misinformation are here to stay, O’Neil suggests that “the [U.S.] government […] has a powerful regulatory role to play.” In Europe, companies must obtain users’ approval before collecting their data—and O’Neil suggests that similar measures in the U.S. could help protect transparency and user autonomy. Government policy change and corporate accountability are necessary, O’Neil states, to protect the public from the ongoing spread of misinformation and from increasingly sneaky tactics that invade citizens’ privacy. When the public is shut out of decisions about how their information is gathered, interpreted, and used, O’Neil suggests, their freedom is profoundly threatened.


Data, Transparency, and U.S. Democracy Quotes in Weapons of Math Destruction

Below you will find the important quotes in Weapons of Math Destruction related to the theme of Data, Transparency, and U.S. Democracy.
Chapter 5: Civilian Casualties Quotes

These types of low-level crimes populate their models with more and more dots, and the models send the cops back to the same neighborhood.

This creates a pernicious feedback loop. The policing itself spawns new data, which justifies more policing. And our prisons fill up with hundreds of thousands of people found guilty of victimless crimes. Most of them come from impoverished neighborhoods, and most are black or Hispanic. So even if a model is color blind, the result of it is anything but. In our largely segregated cities, geography is a highly effective proxy for race.

Related Characters: Cathy O’Neil (speaker)
Related Symbols: Weapons of Math Destruction
Page Number: 87

Police make choices about where they direct their attention. Today they focus almost exclusively on the poor. […] And now data scientists are stitching this status quo of the social order into models, like PredPol, that hold ever-greater sway over our lives.

The result is that while PredPol delivers a perfectly useful and even high-minded software tool, it is also a do-it-yourself WMD. In this sense, PredPol, even with the best of intentions, empowers police departments to zero in on the poor, stopping more of them, arresting a portion of those, and sending a subgroup to prison. […]

The result is that we criminalize poverty, believing all the while that our tools are not only scientific but fair.

Related Characters: Cathy O’Neil (speaker)
Related Symbols: Weapons of Math Destruction
Page Number: 91

While looking at WMDs, we’re often faced with a choice between fairness and efficacy. Our legal traditions lean strongly toward fairness. The Constitution, for example, presumes innocence and is engineered to value it. […]

WMDs, by contrast, tend to favor efficiency. By their very nature, they feed on data that can be measured and counted. But fairness is squishy and hard to quantify. It is a concept.

Related Characters: Cathy O’Neil (speaker)
Related Symbols: Weapons of Math Destruction
Page Number: 94-95
Chapter 10: The Targeted Citizen Quotes

[Publicly held tech corporations’] profits are tightly linked to government policies. The government regulates them, or chooses not to, approves or blocks their mergers and acquisitions, and sets their tax policies (often turning a blind eye to the billions parked in offshore tax havens). This is why tech companies, like the rest of corporate America, inundate Washington with lobbyists and quietly pour hundreds of millions of dollars in contributions into the political system. Now they’re gaining the wherewithal to fine-tune our political behavior—and with it the shape of American government—just by tweaking their algorithms.

Related Characters: Cathy O’Neil (speaker)
Related Symbols: Weapons of Math Destruction
Page Number: 181

Successful microtargeting, in part, explains why in 2015 more than 43 percent of Republicans, according to a survey, still believed the lie that President Obama is a Muslim. And 20 percent of Americans believed he was born outside the United States and, consequently, an illegitimate president. (Democrats may well spread their own disinformation in microtargeting, but nothing that has surfaced matches the scale of the anti-Obama campaigns.)

Related Characters: Cathy O’Neil (speaker)
Related Symbols: Weapons of Math Destruction
Page Number: 194
Conclusion Quotes

Data is not going away. […] Predictive models are, increasingly, the tools we will be relying on to run our institutions, deploy our resources, and manage our lives. But as I’ve tried to show throughout this book, these models are constructed not just from data but from the choices we make about which data to pay attention to—and which to leave out. Those choices are not just about logistics, profits, and efficiency. They are fundamentally moral.

If we back away from them and treat mathematical models as a neutral and inevitable force […] we abdicate our responsibility. And the result, as we’ve seen, is WMDs that treat us like machine parts […] and feast on inequities. We must come together to police these WMDs, to tame and disarm them.

Related Characters: Cathy O’Neil (speaker)
Related Symbols: Weapons of Math Destruction
Page Number: 218