Weapons of Math Destruction
by Cathy O’Neil


Weapons of Math Destruction: Afterword Summary & Analysis

In 2016, millions of people were shocked when news outlets’ forecasting models failed to accurately predict the results of the contentious U.S. presidential election. Because presidential and midterm election models receive real-world feedback only once every few years, there’s a lot of room for conditions to change between cycles, and a great deal did change between 2012 and 2016. The rise of populist politics, growing media skepticism, and people’s reluctance to contribute data to polls meant that the predictive models’ results were badly skewed by bias and misinformation. The race between Hillary Clinton and Donald Trump was far closer than the polls made it appear.
Faulty polling data, O’Neil shows here, likely had a big impact on a major U.S. election. People trusted the polls’ predictive power, and they voted (or stayed home) based on what the polls signaled. Polling is directly tied to civic life and democracy in the U.S., so the algorithms and models that govern it need to be thoroughly vetted and held to high standards of accuracy and transparency. The sketch after this entry shows how even a small response bias can skew a poll’s topline.
Themes: Humanity vs. Technology; Data, Transparency, and U.S. Democracy
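To see how nonresponse alone can skew a model’s inputs, here is a minimal sketch in Python. Every number in it is invented for illustration; none comes from O’Neil or from real 2016 polling. The only assumption is that one candidate’s supporters answer pollsters slightly less often than the other’s.

```python
import random

random.seed(0)

# Invented numbers: candidate A actually holds 48% support, but A's
# supporters are slightly less willing to answer a pollster than B's.
TRUE_SUPPORT_A = 0.48
RESPONSE_RATE = {"A": 0.07, "B": 0.08}

def run_poll(n_contacted=200_000):
    """Simulate contacting voters; only those who respond are counted."""
    responses = []
    for _ in range(n_contacted):
        choice = "A" if random.random() < TRUE_SUPPORT_A else "B"
        if random.random() < RESPONSE_RATE[choice]:
            responses.append(choice)
    return responses.count("A") / len(responses)

print(f"true support for A:   {TRUE_SUPPORT_A:.1%}")
print(f"polled support for A: {run_poll():.1%}")  # roughly 44-45%
```

In this toy setup, a one-percentage-point gap in response rates shifts the topline by three to four points, more than enough to make a close race look comfortable, and any model trained on those toplines inherits the skew.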
While political polls are influential and somewhat opaque, they’re not necessarily destructive, so they don’t quite qualify as WMDs. But because people gave polls so much power in the 2016 election only to watch them miss badly, O’Neil is hopeful that they’ll be given less and less power in politics as time goes on.
Polls can create the illusion of predicting the future. But because they’re not always accurate, the weight they carry in our modern political system is potentially dangerous. Polling should become more transparent, and the data pollsters gather should be vetted more thoroughly.
Themes: Humanity vs. Technology; Data, Transparency, and U.S. Democracy
Polling wasn’t the only algorithmic failure of the 2016 election season. Facebook’s “Trending Topics” algorithm, which was meant to eliminate news bias, instead performed erratically and flooded the site with “fake news” and other kinds of misinformation. O’Neil suggests that malfunctioning algorithms like this one don’t necessarily need to be banned or dismantled forever, but there must be algorithmic accountability, beginning with the developers themselves. Journalists, meanwhile, are increasingly working to help people understand how algorithms perpetuate bias or create discrimination, hoping to make them easier to see through and understand.
When WMDs begin to spiral out of control, it’s up to humanity to call out their destructive nature. If tech companies won’t make their algorithms transparent, then everyday people and arbiters of truth, like journalists, need to encourage transparency in other ways.
Themes: Humanity vs. Technology; Fairness vs. Efficiency; Data, Transparency, and U.S. Democracy
O’Neil isn’t sure whether there will ever be a simple, widely recognized definition of what makes an algorithm fair, but she’s grateful that people are finally discussing what that definition might look like. If people continue to set standards for algorithmic accountability, she suggests, both technology and contemporary ethics will improve.
Here, O’Neil is essentially saying that the desire for technological progress doesn’t outweigh the need for fairness and transparency. Even though WMDs operate in many different sectors of contemporary life, there’s still hope that putting a human hand back into tech will make reform possible. One reason fairness resists a simple definition is that common mathematical fairness criteria can conflict with one another, as the sketch after this entry illustrates.
Themes: Humanity vs. Technology; Fairness vs. Efficiency
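A minimal sketch of that conflict, using invented confusion-matrix counts for a hypothetical lending model (an illustrative assumption, not an example from the book): the classifier below satisfies “equal opportunity” while violating “demographic parity,” so whether it is “fair” depends on which criterion you pick.

```python
# Invented outcome counts for a hypothetical lending model, by group:
# tp/fp/fn/tn = true/false positives and negatives.
group_a = {"tp": 40, "fp": 10, "fn": 10, "tn": 40}
group_b = {"tp": 16, "fp": 24, "fn": 4,  "tn": 56}

def approval_rate(g):
    """Demographic parity: is the share approved equal across groups?"""
    return (g["tp"] + g["fp"]) / sum(g.values())

def true_positive_rate(g):
    """Equal opportunity: among qualified applicants, are approvals equal?"""
    return g["tp"] / (g["tp"] + g["fn"])

for name, g in [("A", group_a), ("B", group_b)]:
    print(f"group {name}: approval rate {approval_rate(g):.0%}, "
          f"TPR {true_positive_rate(g):.0%}")
# Both groups see an 80% TPR (equal opportunity holds), but approval
# rates are 50% vs. 40% (demographic parity fails).
```

Which criterion should win is an ethical question rather than a mathematical one, which is why O’Neil welcomes the public debate.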
By weighing harms instead of squabbling over fairness, O’Neil suggests, we can dismantle WMDs slowly but surely. Every algorithm, for example one that seeks to determine which households are most likely to be hotspots of child abuse, carries harms from both false positives and false negatives. Determining which is the greater harm is difficult but necessary work, and doing that work will help ensure that algorithms function the way they’re supposed to.
By making sure that we see and understand the data used to create WMDs, we can stop faulty or irresponsible algorithms from ruining lives. Technology can be useful and transformative, but unless humans examine its potential for confusion and harm from every angle, faulty technology will keep taking over too many aspects of modern life. The sketch after this entry shows how making harm weights explicit turns this trade-off into concrete arithmetic.
Themes: Humanity vs. Technology; Fairness vs. Efficiency; Data, Transparency, and U.S. Democracy
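The harm-weighing O’Neil describes can be sketched in a few lines of Python. Everything below is hypothetical, the error counts and the harm weights alike; the point is only that once the weights are stated explicitly, comparing thresholds becomes simple arithmetic, and the hard work moves to debating the weights themselves.

```python
def total_harm(false_pos, false_neg, harm_per_fp, harm_per_fn):
    """Aggregate harm under explicit, debatable harm weights."""
    return false_pos * harm_per_fp + false_neg * harm_per_fn

# Two hypothetical operating points for a risk model: a sensitive
# threshold flags more families (more false positives, fewer misses),
# a strict threshold flags fewer. All counts are invented.
thresholds = {
    "sensitive": {"false_pos": 300, "false_neg": 20},
    "strict":    {"false_pos": 80,  "false_neg": 60},
}

# The hard, human judgment: how bad is a wrongful investigation (FP)
# relative to a missed child at risk (FN)? These weights are assumptions.
HARM_FP, HARM_FN = 1.0, 10.0

for name, counts in thresholds.items():
    harm = total_harm(counts["false_pos"], counts["false_neg"],
                      HARM_FP, HARM_FN)
    print(f"{name}: total harm = {harm:.0f}")
# With a miss weighted 10x a wrongful flag, the sensitive threshold wins
# (500 vs. 680); set HARM_FN = 2.0 and the strict one wins (340 vs. 200).
```

Flipping the weights flips the verdict, which is exactly why this work is difficult but necessary: the arithmetic is trivial, but choosing the weights is a moral judgment.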
Artificial intelligence algorithms cannot distinguish between truth and lies, so asking Google “who won the popular vote” in the 2016 election won’t always yield accurate results; instead, the results might be contaminated by conspiracy theories. Data scientists, then, need to work to ensure that the data these algorithms use represents the world as it actually is.
Because the very concepts of objectivity and truth are at stake, we must interrogate how data is collected and processed. This is a difficult task, to be sure, but one that O’Neil suggests is worth the effort to prevent the proliferation of WMDs.
Themes: Humanity vs. Technology; Discrimination in Algorithms; Data, Transparency, and U.S. Democracy
“Truth” can look different from different points of view, and even mathematical proofs can contain mistakes. But Big Data has a duty to cut through the noise, not add to it. Big tech companies exert enormous control over contemporary society because they control people’s data, and as long as that data remains privately owned and used, the algorithms built on it can’t be trusted.
When tech companies, the algorithms they use, and the kinds of data they collect aren’t regulated or vetted, society actually becomes less efficient. Humanity needs to step in and thoroughly examine the role technology plays in our lives. Otherwise, we will likely be victimized by error-ridden algorithms that deepen social divides without even delivering on their promises of efficiency and fairness.
Themes: Humanity vs. Technology; Discrimination in Algorithms; Fairness vs. Efficiency; Data, Transparency, and U.S. Democracy
Algorithms aren’t going anywhere—if anything, they’re only going to become more common as time goes on. In light of that fact, O’Neil argues, it’s time to hold algorithms accountable in the long term by making sure that they are legal, fair, factual, and capable of change. We must focus our efforts on improving how algorithms work, O’Neil warns, because “we can’t afford to do otherwise.”
WMDs are a threat to personal privacy, protections for minority groups, and indeed democracy as we know it. By failing to regulate how companies collect and use data about us, humanity is setting a dangerous precedent of living at the mercy of new technologies that seek to maximize profit and efficiency at any cost.
Themes: Humanity vs. Technology; Discrimination in Algorithms; Fairness vs. Efficiency; Data, Transparency, and U.S. Democracy