Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy



Customers say

Customers find the book an enlightening and informative introduction to big data. They describe it as a worthwhile read that is accessible to non-technical readers, with a realistic and interesting perspective. Readers describe it as engaging, frightening, and troubling. However, some feel the author’s worldview and biases bleed through.

AI-generated from the text of customer reviews

This Post Has 8 Comments

  1. Insightful & enjoyable read
    I really enjoy books like this that share the logic (algorithms) behind the success of many popular marketing and business models. Unfortunately, US government programs are a prime example (and a useful one) of how these weapons of math destruction play out. Great read.

  2. Very clear, but over-reliant on government solutions instead of more choices for consumers (competition!)
    I was excited to read this book as soon as I heard Cathy O’Neil, the author, interviewed on EconTalk.

    O’Neil’s hypothesis is that algorithms and machine learning can be useful, but they can also be destructive if they are (1) opaque, (2) scalable and (3) damaging. Put differently, an algorithm that determines whether you should be hired or fired, given a loan or able to retire on your savings is a WMD if it is opaque to users, “beneficiaries” and the public, has an impact on a large group of people at once, and “makes decisions” that have large social, financial or legal impacts. WMDs can leave thousands in jail or bankrupt pensions, often without warning or remorse.

    As examples of non-WMDs, consider bitcoin/blockchain (the code and transactions are published), algorithms developed by a teacher (small scale), and Amazon’s “recommended” lists, which are not damaging (because customers can decide to buy or not). As examples of WMDs (many of which are explained in the book), consider Facebook’s “newsfeed” algorithm, which is opaque (based on their internal advertising model), scaled (1.9 billion disenfranchised zombies) and damaging (echo chamber, anyone?).

    I took numerous notes while reading this book, which I think everyone interested in the rising power of “big data” (or big brother) or bureaucratic processes should read, but I will only highlight a few:

    * Models are imperfect, and dangerous if they are given too much “authority” (as I’ve said).
    * Good systems use feedback to improve in transparent ways (they are anti-WMDs).
    * WMDs punish the poor because the rich can afford “custom” systems that are additionally mediated by professionals (lawyers, accountants, teachers).
    * Models are more dangerous the more removed their data are from the topic of interest, e.g., models of “teacher effectiveness” based on “student grades” (or, worse, alumni salaries).
    * “Models are opinions embedded in mathematics” (what I said), which means that those weak in math will suffer more. That matters when “American adults… are literally the worst [at solving digital problems] in the developed world.”
    * It is easy for a “neutral” variable (e.g., postal code) to reproduce a biased variable (e.g., race); see the sketch after this comment.
    * Wall Street is excellent at scaling up a bad idea, leading to huge financial losses (and taxpayer bailouts). It was not an accident that Wall Street “messed up.” They knew that profits were private but losses were social.
    * Many for-profit colleges use online advertisements to attract (and rip off) the most vulnerable, leaving them in debt and/or taxpayers with the bill. Sad.
    * A good program (for education or crime prevention) also relies on qualitative factors that are hard to code into algorithms. Ignore those and you’re likely to get a biased WMD. I just saw a documentary on urbanism that asked “what do the poor want: hot water or a bathtub?” They wanted a bathtub because they had never had one and could not afford to heat water. #checkyourbias
    * At some points in this book, I disagreed with O’Neil’s preference for justice over efficiency. She does not want to allow employers to look at job applicants’ credit histories because “hardworking people might lose jobs.” Yes, that’s true, but I can see why employers are willing to lose a few good people to avoid a lot of bad people, especially if they have lots of remaining (good-credit) applicants. Should this happen at the government level? Perhaps not, but I don’t see why a hotel chain cannot do this: the scale is too small to be a WMD.
    * I did, OTOH, notice that peer-to-peer lending might be biased against lenders like me (I use Lending Club, which sucks) who rely on their “public credit models,” as it seems that these models are badly calibrated, leaving retail suckers like me to lose money while institutional investors are given preferential access.
    * O’Neil’s worries about injustice go a little too far in her counterexample of the “safe driver who needs to drive through a dangerous neighborhood at 2 am” as not deserving to face higher insurance prices, etc. I agree that this person may deserve a break, but the solution to this “unfair pricing” is not a ban on such price discrimination but an increase in competition, which has a way of separating safe and unsafe drivers (it’s called a “separating equilibrium” in economics). Her fear of injustice makes me think that she’s perhaps missing the point. High driving-insurance rates are not a blow against human rights, even if they capture an imperfect measure of risk, because driving itself is not a human right. Yes, I know it’s tough to live without a car in many parts of the US, but people suffering in those circumstances need to think bigger about maybe moving to a better place.
    * Worried about bias in advertisements? Just ban all of them.
    * O’Neil occasionally makes some false claims, e.g., that US employers offered health insurance as a perk to attract scarce workers during WWII. That was mainly because of a government-ordered wage freeze that incentivised firms to offer “more money” via perks. In any case, it would be good to look at how other countries run their health systems (I love the Dutch system) before blaming all US failures on WMDs.
    * I’m sympathetic to concerns about the lies and distortions that Facebook and other social media spread (with the help of WMDs), but I’ve gotta give Trump credit for blowing up all the careful attempts to corral, control and manipulate what people see or think (but maybe he had a better way to manipulate). Trump has shown that people are willing to ignore facts to the point where it might take a real WMD blowing up in their neighborhood to take them off autopilot.
    * When it comes to political manipulation, I worry less about WMDs than about the total lack of competition due to gerrymandering. In the 2016 election, 97 percent of representatives were re-elected to the House.
    * Yes, I agree that humans are better at finding and using nuances, but those will be overshadowed as long as there’s a profit (or election) to win.

    Can we push back on these problems? Yes, if we realize how our phones are tracking us, how your GPA is not your career, or how “the old boys’ network” actually produced a useful mix of perspectives.

    * Businesses will be especially quick to temper their enthusiasm when they notice that WMDs are not nearly so clever. What worries me more are politicians or bureaucrats who believe a salesman pitching a WMD that will save them time but harm citizens. That’s how we got dumb no-fly lists and other assorted government failures.
    * Although I do not put as much faith in “government regulation” as a solution to this problem as I put in competition, I agree with O’Neil that consumers should own their data and that companies should only get access to it on an opt-in basis, but that model will be broken for as long as the EULA requires that you give up lots of data in exchange for access to the “free” platform. Yes, Facebook is handy, but do you want Facebook listening to your phone all the time?

    Bottom Line: I give this book FOUR STARS for its well-written, enlightening exposé of WMDs. I would have preferred less emphasis on bureaucratic solutions and more on market, competition, and property-rights solutions.
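The proxy-variable point in the comment above can be made concrete with a minimal, hypothetical simulation; the zip codes, historical default rates, and the 80% overlap below are invented for illustration and are not data from the book or the review.

    # Minimal sketch: a "neutral" feature (zip code) reproduces a protected
    # attribute it correlates with, even though the attribute is never used.
    # All numbers here are invented for illustration.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 10_000

    group = rng.integers(0, 2, n)                                # protected attribute (never fed to the model)
    zip_code = np.where(rng.random(n) < 0.8, group, 1 - group)   # zip code matches group 80% of the time

    historical_default = {0: 0.05, 1: 0.20}                      # assumed past default rate per zip code
    score = np.array([historical_default[int(z)] for z in zip_code])
    denied = score > 0.10                                        # decision rule that only ever sees the zip code

    # Although group was never used, denial rates differ sharply by group.
    for g in (0, 1):
        print(f"group {g}: denial rate = {denied[group == g].mean():.2%}")

The decision rule never touches the protected attribute, yet one group is denied roughly four times as often as the other, which is the sense in which a “neutral” variable can smuggle bias back in.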

  3. Must read, especially for students of engineering and computer science
    This is a thoughtful and very approachable introduction to, and review of, the societal and personal consequences of data mining, data science, and machine learning practices, which at times seem extraordinarily successful. While others have broached this subject, Professor O’Neil is the first to deal with it in the call-to-action manner it deserves. This is a book you should definitely read this year, especially if you are a parent. It should be required reading for anyone who practices in the field, before they begin work.

    I have a few quibbles about the book’s observations, based on its very occasional leaps of logic and some quick interpretations of history. For example, while I wholeheartedly deplore the pervasive use of e-scores and a financing system which confounds absence of information with higher risk (that is, fails to posit and apply proper Bayesian priors; see the sketch after this comment), the sentence “But framing debt as a moral issue is a mistake,” while correct, ignores the widespread practice of debtors’ courts and prisons in the history of the United States. This is really not something new, only a new form. Perhaps it is more pervasive.

    For a few of the cases used to illustrate WMDs, there are other social changes which exacerbate matters, rather than abused algorithms being the cause. For instance, the idea of individual home ownership was not such a Big Deal in the past, especially for people without substantial means. These less fortunate individuals resigned themselves to renting their entire lives. Having a society and a group of banks push home ownership onto people who can barely afford it sets them up for financial hardship and the loss of home and credit.

    What will be interesting to see is where the movement to fix these serious problems goes. Protests are good and necessary but, eventually, engagement with the developers of actual or potential WMDs is required. An Amazon review is not the place to write more on this, nor to give some of my ideas. Accordingly, I have written a full review at my blog (see the image) for that purpose.

    My primary recommendation is a plea for rigorous testing of anything which could become a WMD. It is apparent these systems touch the lives of many people. Just as in the case of transportation systems, it seems to me that we as a society have every right to demand that these systems be tested, beyond the narrow goals of the companies who are building them. This will result in fewer being built, but, as Dr O’Neil has described, building fewer bad systems can only be a good thing.
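The Bayesian-prior idea mentioned in the comment above can be sketched with a toy Beta-Binomial model: under an assumed population base rate, an applicant with no credit history is scored at that base rate rather than being penalized for missing data. The 5% base rate and the prior strength of 20 are invented numbers, not values from the book or the review.

    # Toy sketch of a Beta-Binomial prior for default risk.
    # prior_rate and prior_strength are assumed, illustrative values.
    def posterior_default_rate(defaults: int, loans: int,
                               prior_rate: float = 0.05,
                               prior_strength: float = 20.0) -> float:
        """Posterior mean default probability under a Beta prior."""
        a = prior_rate * prior_strength           # prior pseudo-counts of defaults
        b = (1 - prior_rate) * prior_strength     # prior pseudo-counts of repayments
        return (a + defaults) / (a + b + loans)

    print(posterior_default_rate(0, 0))    # no credit history -> 0.05, the base rate, not a penalty
    print(posterior_default_rate(0, 30))   # 30 clean loans -> estimate drifts below the base rate
    print(posterior_default_rate(5, 30))   # genuine adverse evidence is what raises the estimate

With a prior in place, “no data” is treated as “average risk until evidence says otherwise,” which is the distinction the reviewer draws between absence of information and actual high risk.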

  4. The book as a whole is extremely interesting. The division of the chapters was very well chosen, given the countless areas in which the automation of systems can be extremely disruptive. What is already unequal becomes even more so, selection processes are biased, and moral (and perhaps legal) standards are completely corrupted.

  5. This is an essay that reviews the role of algorithms and statistics in predicting the modern world, with the aim of drawing our attention to the subject so that, with a human mind, we confront the biases and misuses these tools can have if we abdicate the responsibility of reviewing and updating them.

  6. It’s a hard sell to tell someone a book about math and algorithms is interesting, but this one really is. It’s a complex issue but an easy read. The author breaks down the ways algorithms are impacting our social media, our ability to find work, to get loans, etc. It is not a conspiracy theory but a serious book that shows the real-life problems that arise when programmers who aren’t subject-matter experts build programs, and we all think they are working fine until we realize an important issue was not contemplated, or we implement them and blindly follow their lead without understanding that they have no flexibility for real life.
