Weapons of Math Destruction

by

Cathy O’Neil


Weapons of Math Destruction: Chapter 2: Shell Shocked Summary & Analysis

Summary: Quants who work at hedge funds zoom in on tiny patterns, train algorithms to predict recurring errors and price swings, and then place bets on those patterns repeating. The smallest pattern can make millions for the first investor who recognizes it, and it will keep raking in money until it stops recurring or the rest of the market catches on.
Analysis: Quants (quantitative analysts) who work at hedge funds essentially bet on world markets with the help of complex algorithms. Technology has streamlined the financial markets and made it easier than ever to make a profit.
Themes: Humanity vs. Technology; Fairness vs. Efficiency
Summary: While working at Shaw, O’Neil loved the “treasure hunt” of finding market inefficiencies. At Shaw, her smarts were translating into money, and lots of it. But out of the 50 quants on O’Neil’s “futures group” team, she was the only woman. She was also siloed from many of her coworkers, so that if someone left for another firm, they wouldn’t take other quants’ trade secrets with them. The work was exhausting and sometimes frightening, as when huge sums of money were needed frantically and immediately. But something deeper began to gnaw at O’Neil: the numbers she was playing with all day weren’t just abstract figures. They represented people’s livelihoods, retirement funds, and mortgages.
Analysis: O’Neil had a moral investment in determining whether the work she and her team at Shaw were doing was legitimate and fair. They were often profiting off of uncertainty and chaos, and they were gambling with people’s entire financial futures. Shaw’s technology was efficient, but it wasn’t always fair, which is true of all the WMDs that O’Neil discusses in the book.
Themes: Humanity vs. Technology; Fairness vs. Efficiency; Data, Transparency, and U.S. Democracy
Summary: In July 2007, interbank interest rates spiked. Even though lots of people had been able to secure mortgages during the housing boom of the previous several years, banks were now realizing that there was some “dangerous junk” in their portfolios. Shaw could see that many companies and world markets would suffer, but as a hedge fund, it didn’t plunge into those risky markets. Instead, it stood on the sidelines and bet on them. Hedge funds are less like baseball fans who cheer when their team wins and more like gamblers who bet on movements associated with the game. So O’Neil and her team, while nervous about what was to come, felt more or less safe. Even though the market was beginning to grow unstable, Shaw was “on top of the world.”
Analysis: The early days of the 2007 global financial crisis were an uncertain time, and investment firms like Shaw actually profited from the volatility and uncertainty. By contrasting the “dangerous junk” in the banks’ portfolios with Shaw’s position “on top of the world,” O’Neil implies that hedge funds had an unfair advantage during this time. Even though the company was skilled and efficient at using data to make money, its processes weren’t rooted in equity and justice, so its technology wasn’t aimed at promoting those things either.
Themes: Humanity vs. Technology; Fairness vs. Efficiency
Summary: But even Shaw started to get nervous as the market continued to rumble. Mortgage-backed securities—previously “boring financial instruments” that could actually offset risk through quantity—became bigger liabilities. Since the 1980s, investment bankers had been buying up and packaging mortgages into securities, a kind of bond, by the thousands. The mortgages were like the “little pieces of meat of varying quality” that make up a sausage, and the securities were the spices. No one worried about them because they were essentially marked as safe. But mortgage companies had been lending money to people for homes they couldn’t afford, collecting the fees, and then unloading the securities into the market. The deals that the banks were offering weren’t just unsustainable—they were predatory.
Analysis: Here, O’Neil explains the inner workings of the banks that led to the devastating financial crisis that began in 2007. The banks were creating false data through hidden and unfair methods, jeopardizing people’s financial stability (and, of course, fairness) in the process.
Themes: Humanity vs. Technology; Fairness vs. Efficiency
Summary: These subprime mortgages weren’t WMDs themselves; they were financial instruments, not models. But when banks turned the mortgages into securities and sold them, they relied on flawed mathematical models to do so. So the risk model attached to the mortgage-backed securities was a WMD. None of the companies’ mathematicians were updating their data and continuously rebalancing the risk. The numbers these companies did have had been given to them by people committing wide-scale fraud. The risk ratings were kept hidden from the public, the risk models created a feedback loop by falsely rating defective products, and the fraud was happening on an enormous scale. So the risk model behind the securities had all the components of a lethal WMD.
Analysis: WMDs can spin out of control when there’s no transparency, accountability, or regulation. It’s dangerous for a model to operate without being regularly updated, especially models like the ones that were keeping the American (and global) economy afloat. Because the risk models seemed efficient in terms of making a profit and keeping things calm on the surface, they weren’t regulated or examined, which allowed them to become WMDs.
Themes: Humanity vs. Technology; Fairness vs. Efficiency; Data, Transparency, and U.S. Democracy
Summary: The algorithms that had created the market and analyzed the risk in the securities turned out to be useless. Disaster hit the economy, and the human suffering it created was finally on display. In the financial sector, everyone—including the quants at O’Neil’s firm—began to wonder what would happen next. But by 2009, it was clear that the industry hadn’t really learned anything or changed—there were just a few more hoops to jump through.
Analysis: In the throes of the financial crisis, only humans could sift through the mortgages and assess the true values of the loans. Technology was supposed to be able to run itself—but in the end, a human hand was needed to mitigate the disaster that had struck. Yet as the crisis dissipated, no one seemed motivated to investigate how to prevent something similar from happening again.
Themes: Humanity vs. Technology; Fairness vs. Efficiency
Summary: O’Neil had become disillusioned with the world of finance; people were wielding formulas recklessly and inappropriately. She left Shaw in 2009, planning to work on fixing WMDs from the inside out by joining a group that provided risk analysis for banks. The longer she worked as a risk analyst, though, the more she got the sense that she and her colleagues were seen as “party poopers” or threats, even after the cataclysmic crash that the country had just been through.
Analysis: Here, O’Neil shows that even after switching roles, she still didn’t see anyone express a desire to take accountability for how algorithms were destabilizing the world. Furthermore, no one was taking the initiative to investigate how to stop them from growing further out of control.
Themes: Humanity vs. Technology
Summary: In 2011, O’Neil switched roles yet again. She joined a web start-up as a data scientist, where she built models to anticipate the behavior of the users who visited travel websites and to try to distinguish casual window shoppers from motivated buyers. As she adjusted to her new job, she found lots of parallels between finance and Big Data—the biggest of which was that in both fields, money and self-worth were inextricably interwoven. People believed that they were successful in finance because they deserved to be, rather than because they were simply lucky. This itself was an example of a feedback loop. “Money,” O’Neil writes, “vindicates all doubts.”
Analysis: This passage underscores how tech companies and financial institutions alike prioritize efficiency—which often translates to the most profits for the least amount of hassle and work—over fairness, justice, or empathy. This focus on effectiveness over everything leads to the proliferation of WMDs throughout many industries.
Themes: Humanity vs. Technology; Fairness vs. Efficiency
Summary: O’Neil continued to grow disillusioned with how her new industry sought to reduce people to data trails and turn them into more “effective” voters, workers, and consumers. She could see a new kind of dystopia growing around her, with inequality rising unchecked as data was used to control and manipulate more and more people. Eventually, she quit her job to devote more time to investigating how algorithms were destroying lives.
Analysis: O’Neil resisted society’s increasing emphasis on efficiency and effectiveness over fairness and justice. Humans were being turned into data points, and ever more parts of their lives were being altered and streamlined. WMDs, in O’Neil’s estimation, had begun to take over the world.
Themes: Humanity vs. Technology; Discrimination in Algorithms; Fairness vs. Efficiency; Data, Transparency, and U.S. Democracy