In previous decades, local bankers controlled the money in any given town. People would suit up and pay a visit to their banker if they needed a new car, a mortgage, or a loan. The bankers were people and neighbors—they knew about a person's background and family in addition to having the numbers on their application form. In other words, the banker's judgment was human and thus biased. For millions of people, the human angle of banking was a liability: if they were poor, Black, or female, they might have trouble convincing a banker to give them a loan. When Earl Isaac and Bill Fair developed the FICO model to evaluate the risk of an individual defaulting on a loan, things seemed to be looking up—with an algorithm doing the work, there'd be no bias in the credit process.
Securing credit used to be a much more human process—and thus a much more biased one. Technology has evolved to make the process both fairer and more efficient. Because of FICO scoring, a person’s credit history speaks for itself—and for many low-income people, non-white people, and women, that’s an important step toward a more just world. But because of the tension between efficiency and fairness that O’Neil has described in previous chapters, it’s clear that even a more objective process isn’t necessarily a perfectly equitable one.
While FICO scores were relatively transparent, fair, and backed by consistently updated data, the use of scoring has changed significantly over the years. “E-scores” now aggregate everything from zip codes to internet behavior to purchase history to create arbitrary, unregulated, and unfair WMDs. Companies like Neustar and Capital One score credit-seekers lightning-fast using metrics like location and internet history to determine who’s a worthy borrower. These e-scores create destructive rather than data-backed feedback loops. It’s not clear what metrics they use to determine who will get a loan, and they’re becoming more and more popular. By prioritizing efficiency over justice and transparency, they’re becoming predatory and unfair.
FICO scores aren’t WMDs—but e-scores, which are unregulated and widespread nowadays, certainly are. Their inner workings are mysterious, so while they deliver results faster, there’s no telling whether the data they’re using to decide people’s futures is sound or not.
E-scores are taking society several steps backward from the fairness and transparency of the FICO scoring system. They're not looking at individuals; they're rating people in relation to a "blizzard of proxies." So, while e-scores don't do things like withhold credit from a Black borrower based on race, they use things like zip codes—which can quite often be indicators of a person's race or class—to assess how similar people have behaved in the past, rather than how the person actually seeking credit has behaved. There's no feedback that corrects these e-scoring systems when they make an error and arbitrarily place someone in the wrong category. And because the inner workings of e-scoring systems are hidden, no one can examine how they work in order to challenge or improve them.
Over time, creditworthiness has become a stand-in for virtues like a good work ethic and dependability, while bad credit signals “sins” that have nothing to do with being able to pay bills. Human resource management software now screens potential hires based on their credit reports, creating dangerous poverty cycles and feedback loops. “Framing debt as a moral issue,” O’Neil suggests, is a huge mistake.
The systems that crunch numbers and run data about people's lives aren't perfect—no-fly lists are rife with errors that keep ethnic and religious minorities from traveling safely and easily, while wealthy white people can pay for "trusted traveler" status and bypass security altogether. Credit report errors can make borrowing difficult or impossible even for people whose actual credit is excellent. And scoring algorithms often mix up common names, meaning that having the same name as someone with a criminal history or poor credit can be a liability.
When an Arkansas woman named Catherine Taylor tried to secure federal housing assistance, she learned that her background check was full of mistakes and blended identities—she had many felonies on her report, some of which were tied to the alias of a woman named Chantel Taylor. Luckily, a housing authority employee helped Catherine find the errors and clean up the mess—but Catherine’s case is evidence of a larger problem with how these systems are built.
As algorithms become more automatic, more errors pile up in people's consumer profiles. These errors corrupt predictive models and give WMDs even more fuel. As computers become better able to learn from spoken language and images, these errors will only multiply, creating countless racist, unjust decisions made by faulty algorithms. These automatic systems now need a human hand to sift through their mistakes. Big Data needs to slow down and allow humans to play a greater role in sorting sensitive information, but the tech world is instead doubling down on predictive credit models.
Facebook has recently patented a new type of credit rating based on social networks. For example, a young, white college graduate who spent five years volunteering in Africa might come home with no credit—but his connections on Facebook are successful and moneyed, and so he's able to get a loan. But a hardworking Black or Latino housecleaner from a poor neighborhood, whose social networks might reflect "friends" who are unemployed or incarcerated, will have a harder time securing financial help.
Meanwhile, credit card companies like American Express have come under fire for revoking or lowering credit for customers who shopped at certain kinds of establishments. This plays into the idea that someone spending their money at Saks Fifth Avenue is more likely to pay off their card each month than someone frequenting Walmart.
Companies like ZestFinance, a start-up that calculates risk and offers payday loans at a discount, buy up data about their customers in order to inform how big of a loan they get, and what their interest rate will be. People are trading privacy for discounts—and if Big Data algorithms find something as minor as a spelling error in a mountain of data about an individual, it could affect their credit score.
“Peer-to-peer” lenders like Lending Club, which hoped to become a “new kind of bank” when it launched in 2007, used a combination of credit reports and data to run their operations. These companies can analyze any data they choose to and develop their own e-scores and risk correlations without explaining the methodologies behind them. O’Neil suggests that compared to the systems in place today, the prejudiced loan officers and bankers of long ago don’t look nearly as bad as they used to. At least borrowers, she writes, could “appeal to [their] humanity.”