MAEC Libraries


Weapons of math destruction : how big data increases inequality and threatens democracy / Cathy O'Neil

By: O'Neil, Cathy
Series: Society Technology
Publication details: [S. l.] : Penguin Books, 2017
Description: x, 259 p. ; 20 cm
ISBN: 978-0-141-98541-1
Subject(s): New information and communication technologies | Data processing | Social inequality
Abstract: We live in the age of the algorithm. Increasingly, the decisions that affect our lives—where we go to school, whether we get a car loan, how much we pay for health insurance—are being made not by humans, but by mathematical models. In theory, this should lead to greater fairness: everyone is judged according to the same rules, and bias is eliminated. But as Cathy O'Neil reveals in this urgent and necessary book, the opposite is true. The models being used today are opaque, unregulated, and uncontestable, even when they're wrong. Most troubling, they reinforce discrimination: if a poor student can't get a loan because a lending model deems him too risky (by virtue of his zip code), he's then cut off from the kind of education that could pull him out of poverty, and a vicious spiral ensues. Models are propping up the lucky and punishing the downtrodden, creating a "toxic cocktail for democracy." Welcome to the dark side of Big Data.
Holdings
Item type: Monografías
Current library: Biblioteca de la Escuela Diplomática (Depósito)
Call number: 21414
Status: Available
Barcode: 2061371


Government of Spain
© Ministerio de Asuntos Exteriores y de Cooperación

Powered by Koha