Technically Wrong
Buying groceries, tracking our health, finding a date: whatever we want to do, odds are that we can now do it online. But few of us ask why all these digital products are designed the way they are. It’s time we change that. Many of the services we rely on are full of oversights, biases, and downright ethical nightmares: Chatbots that harass women. Signup forms that fail anyone who’s not straight. Social media sites that send peppy messages about dead relatives. Algorithms that put more black people behind bars. Sara Wachter-Boettcher takes an unflinching look at the values, processes, and assumptions that lead to these and other problems. Technically Wrong demystifies the tech industry, leaving those of us on the other side of the screen better prepared to make informed choices about the services we use—and to demand more from the companies behind them.

Technically Wrong Details

Title: Technically Wrong
Author: Sara Wachter-Boettcher
Language: English
Release: Oct 10th, 2017
Publisher: W. W. Norton & Company
ISBN-13: 9780393634631
Rating:
Genre: Nonfiction, Science, Technology, Feminism, Business, Design

Technically Wrong Review

  • John Norman
    Well . . . This is another one of those funny books that is sort of a “5” and sort of a “3.” The book broadly claims that the tech industry builds interfaces and products that are (not necessarily intentionally) biased, and that the main driver is the homogeneity of tech company investors and employees. There is no doubt in my mind that this is true, and on that basis, I’d recommend this to anyone in or outside of tech. We product builders and designers are doing a crap job of acknowledging the incredibly broad types of people and styles of interaction out there. Because of tech’s homogeneity, there’s so much stuff that just isn’t thought about critically (e.g., image analysis software not being able to analyze non-white faces). But as I’ll get into in a moment, I would very strongly recommend this to historians of technology as a little guide to problems that deserve significantly more research. The author’s a web consultant, but I think we need to bring out the scholars. There’s good stuff here about geography and the 2010s -- more reports of personal experiences would make the story even more valuable. (I keep thinking back to another book I reviewed: Turco’s The Conversational Firm, which shows how far we can get with ethnographic strategies.)

    There are some arguments here that are very dear to my heart. For example, on p. 137 and in chapter 3, the author notes how engineers and product designers will focus on the main experience flow and minimize the importance of “edge cases.” For example, say 80% of the users are young, and only 20% are old (perhaps needing bigger fonts). Well, the company is going to focus on where the money is: so font-changing features may be downplayed. The author rightly stresses harm and consequences: even though the 20% might not be where the money is, the negative consequences of not helping them out with a useful UI can cause a lot of damage. One area I have been concerned about is privacy and security in healthcare. Say a login code is sent to an email address, but that email might go to a shared account. For the most part, this is probably not troubling: the user “opted in,” supplying that email. But should we work harder to ensure that only the individual can access that account? What if it’s a shared account and medical details about domestic violence make it to that address? Again, say the patient has signed a consent to allow that message to go via email to a particular address. Should that minority example make us very concerned to protect the “minority” user pattern? I think so. The book does a good job walking the reader through this.

    But I have some concerns:

    * Geography: Time and time again, the examples lean towards west coast companies: Uber, Facebook, Twitter, etc. There are some exceptions. But I’d like to know: if the California tech culture is so bad, are there other places that are better?

    * Timespan: Is this a particularly bad moment? Wachter-Boettcher provides the appalling facts around the decline of women computer science majors (37% in 1984, 18% in 2014). “I can’t pretend to know the precise reason for this shift” (p. 182). Me neither. But this book is so anchored in the present, it begs the question of how we would assess, say, the tech culture of the 80s. I bet it was better. But was it? Just as an example, back in the day, Ann Wollrath was the lead author on the original RMI article. Big stuff. What was the culture? It would mean a lot if Wollrath told us that it was the same back then. Then we might understand the core problem as a broader ill.

    * Intentions: There are some good anecdotes here about how female voices are used for Siri, Alexa, and Google Maps (etc.) (pp. 36-38). Right. But what conclusion should we draw? “Women are expected to be more helpful than men . . . The more we rely on digital tools in everyday life, the more we bolster the message that women are society’s ‘helpers’” (p. 38). I get this. But then the author says: “Did the designers intend this? Probably not.” I protest! Go out and interview the designers! What were their reasons? Apple, in particular, thinks hard about this stuff. What were the factors going into a female Siri, and how did they outweigh providing other Siris (male; accented; whatever)? I want to know. The book makes an insinuation, but I think there’s a real research task to be performed. Bring out the ethnographers.

    In large part, the book is driven by articles in the tech media. The next step is to get out there and start quoting people on their individual experiences, in order to test some claims (e.g., is the problem peculiarly tech in California in the 2010s? Or is it men in tech? More geography would help. Or even a side-effect of the investment structure and capitalism, as seems implicit in chapter 9?) -- and, in particular, to figure out where people are doing it right, and why. [The one positive example given in the book is Slack, but I’m not going to give much quarter there: Slack was produced by advertising and exploring the corporate customer’s desire to control discourse in the company, not by inclusiveness.]
  • linhtalinhtinh
    A good and short read. Plenty of examples, but mostly the famous ones on the internet - the author's alignment with the truly marginalized is limited, mostly with female/gays/transgender/nonwhites but still the educated, unlike O'Neil in Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy, who places her heart with the poor, the abused, whose stories may not be heard at all, buried deep, powerless. The problems aren't less worthy to discuss, though. The sexist and racist culture is so embedded, the privileges so taken for granted, the arrogance and the belief that tech people are the coolest and smartest and above everyone else so fierce. That needs to change.
  • Philipp
    Recommended reading on the current (very current) state of the tech industry. Overlaps a little bit with and cites Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy, but focuses more on programmer and designer choices, assumptions, and hidden biases instead of algorithms. First I'd thought of recommending it only to programmers - there's a bunch of stuff on personas and other design techniques that are not of interest to 'regular' humans - but then it branches out and goes into the role of the tech industry in daily life, fake news, concerted online harassment, and all the other acrid smoke from the garbage fire that is the modern WWW.
  • Kathy Reid
    A must read for anyone who designs digital experiences and doesn't want to be an inadvertent dude-bro.

    Against a backdrop of increasingly ubiquitous technology, with every online interaction forcing us to expose parts of ourselves, Sara Wachter-Boettcher weaves a challenging narrative with ease. With ease, but not easily. Many of the topics covered are confronting, holding a lens to our internalised "blind spots, biases and outright ethical blunders". As Wachter-Boettcher is at pains to highlight, all of this is not intentional - but the result of a lack of critical evaluation, thought and reflection on the consequences of seemingly minor technical design and development decisions. Over time, these compound to create systemic barriers to technology use and employment - feelings of dissonance for ethnic and gender minorities, increased frustration for those whose characteristics don't fit the personas the product was designed for, the invisibility of role models of diverse races and genders - and reinforcement that technology is the domain of rich, white, young men.

    The examples that frame the narrative are disarming in their simplicity. The high school graduand whose Latino/Caucasian hyphenated surname doesn't fit into the form field. The person of mixed racial heritage who can't understand which one box to check on a form. The person who's gender non-conforming and who doesn't fit into the binary polarisation of 'Male' or 'Female'. Beware, these are not edge cases! The most powerful take-away for me personally from this text is that in design practice, edge cases are not the minority. They exist to make us recognise the diversity of the user base that we design for. Think "stress cases", not "edge cases". If your design doesn't cater for stress cases, it's not a good design.

    While we may have technical coding standards and best practices that help our technical outputs be of high quality, as an industry and as a professional discipline we have a long way to go in doing the same for user experience outputs. There are a finite number of ways to write a syntactically correct PHP function. Give me 100 form designers, and I will give you 100 different forms that provide 100 user experiences. And at least some of those 100 users will be left without "delight" - a nebulous buzzword for rating the success (or otherwise) of digital experiences.

    Wachter-Boettcher takes precise aim at another seemingly innocuous technical detail - application defaults - exposing their (at best) benign, and, at times, malignant utilisation to manipulate users into freely submitting their personal data. It is designing not for delight, but for deception. "Default settings can be helpful or deceptive, thoughtful or frustrating. But they're never neutral." Here the clarion call for action is not aimed at technology developers themselves, but at users, urging us to be more careful, more critical, and more vocal about how applications interact with us.

    Artificial intelligence and big data do not escape scrutiny. Wachter-Boettcher illustrates how algorithms can be inequitable - targeting or ignoring whole cohorts of people, depending on the (unquestioned) assumptions built into machine learning models. Big data is retrospective, but not necessarily predictive. Just because a dataset showed a pattern in the past does not mean that that pattern will hold true in the future. Yet governments, corporations and other large institutions are basing major policies and practice areas on algorithms that remain opaque. And while responsibility for decision making might be delegated to machines, accountability for how those decisions are made cannot be.

    The parting thought of this book is that good intentions aren't enough. The implications and cascading consequences of seemingly minor design and development decisions need to be thought through, critically evaluated, and handled with grace, dignity and maturity. That will be delightful!
  • Jill
    I won this book in a giveaway. I work in the tech sector and was interested in this book because I am leading a digital transformation effort at my job and wanted to make sure I didn't fall into any of these traps. The book was not what I was thinking it was, but boy were my eyes opened. I have worked in tech for 35 years. I'm a woman and experienced the discrimination the book describes early in my career developing software for a utility. While I was raising my kids, I taught computers in college part-time, then returned to the workforce when they were driving. I thought my days of discrimination were behind me, but just last year it happened again. I was being groomed for a position to take over for my boss, the IT Director, when he retired. When he announced his retirement date, I was expecting the promotion, but I didn't get it. Even though my boss was progressive, the good ol' boy network of the company chose otherwise, and now I report to someone who not only has never managed IT but has never worked in it. So I am training my boss. Toxic!

    I didn't realize that software meant for the general public had such a narrow view of "normal". This book opened my eyes tremendously. I am ashamed of my industry. This should be required reading for anyone studying in the tech field in college. I have forwarded this title to the college at which I taught.
  • Rachel
    I want to qualify my rating of this book: If you haven’t previously thought about sexism, racism, or other forms of discrimination in the tech industry, this is a five-star recommendation. However, as someone who regularly reads about this topic and pays attention to tech news, I encountered very little new information in this book. It was also a bit disappointing to see so much focus on recent big news stories (e.g. the Google Photos categorization fail, Uber sexism and spying, Facebook year in review) rather than a wider range of companies and more in-depth looks at what went wrong, how it happened, and how companies are or could be doing things differently. So I wasn’t blown away by the book, but it holds valuable information for some folks and I just might be the wrong audience.
  • Amy
    This was a very thoughtful exploration of how bias is built into the tech products we use every day, and how that bias subsequently shapes and reinforces behaviors offline. Wachter-Boettcher explores not just how technology is built, but also how the organizations that build it perpetuate particular cultural norms that just don't work for many of the people they supposedly serve. As someone who works in technology as a behavior change designer, I'll return to this book for reflection in the future. This is also a book I can see myself giving to others who either want or need to think about the many ways tech consumes, reiterates, and reinforces harmful biases.
  • Parker
    This is a good solid introduction to a really important issue. Given the nature of the subject matter, a lot of the most striking anecdotes in here were covered by the tech press and so were widely circulated within the community of people observing this kind of thing closely. But even as somebody who pays a lot of attention to the problems described in this book, a few stories were new to me. Certainly, if this is not an area you are already pouring hours of each day into, there will be a lot of new and compelling stories for you. In any case, this book is entertaining, readable, and persuasive.
  • Emily Finke
    This book doesn't really cover anything new, if you've been following conversations about bias in technology in recent years. However, that really isn't a mark against it, since it's trying to be an introduction to the topic rather than an expansive deep dive. It's a really great primer on the topic, and I'll be recommending it to people who aren't necessarily conversant on inequality in technology, but are curious about where to start. I can't think of any other book that would suit that purpose quite as well.
  • Rachel Moyes
    Some parts of it dragged, but overall, it was terrifying. I thought it made especially interesting points about the necessity of training algorithms with unbiased training data so as not to perpetuate past injustices, the myth of the "tech industry" monoculture, and the way free speech on the internet can quickly turn into hate speech. When you get a bunch of the same types of people making choices, it's easy for them to devalue or overlook the experience of others. I think that is the joy and difficulty of the twenty-first century. We are moving into a society that acknowledges all life experiences, that doesn't say, "Well, you're not in the average, so you can be ignored." That means things are more complicated, and it's teaching all of us to be less selfish, to have more empathy, and to be forced to consider and value those different from us.
  • Elizabeth Grace
    Wachter-Boettcher's book is a relatively thorough introduction to the many sins of the majority white, majority male Silicon Valley. Some of her anecdotes were so cringe-worthy, I felt a little guilty for reading them, like I was driving too slowly past a car accident, gawking. The chapter on Northpointe and their recidivism prediction software, COMPAS, was incredibly alarming. The racism of algorithms is a timely topic, and covered in greater depth elsewhere; what this book does is integrate it with the other abuses of the industry in design and advertising. I agree with another reviewer that a critique of capitalism could have been more explicit. It would help explain how a company as morally bankrupt as Uber came to be in the first place, and why it persists.
  • Kylie
    This is one of those books that I hope gets made into mandatory reading in STEM courses. It does a good job of highlighting a lot of the recent problems with the current state of "tech" and the dangerous place it's in right now. It was kind of weird to read something talking about a bunch of internet drama that I remember watching unfold in real time. Also nice to learn more of the factors leading up to the incidents. Overall there wasn't much in this book that I hadn't heard about before, but it's well presented and analyzed here. So I would highly recommend it for anyone who doesn't know how many problems are caused by the inherent biases in technology development.
  • Katie
    Must read for anyone who creates tech products - any product, really. Wachter-Boettcher tells story after story of how tech is only as inclusive, useful, and fair as the ideas behind it. "Because, no matter how much tech companies talk about algorithms like they’re nothing but advanced math, they always reflect the values of their creators: the programmers and product teams working in tech. And as we’ve seen time and again, the values that tech culture holds aren’t neutral." I would love to see more concrete and specific solutions, but the stories are sticky enough to help me keep inclusivity, privacy, and other needs at the front of my mind when designing and using products. My personal biggest takeaways: beware demographic data in personas, look for stress cases instead of edge cases, don't trust Facebook, and read the fine print.
  • Ashley
    Some of the claims this book makes are overly broad, but taken as a whole, this book explains how technology is built on practices of discrimination. It’s not just the homogeneity of Silicon Valley—it’s that cognitive biases and discriminatory ideologies are built into the very programming of the tech we use to run our lives. And that’s a huge problem because technology shapes our cultural landscape. The author doesn’t say much that a savvy observer hasn’t noticed on her own, but she brings it all together in a cohesive, eminently readable package. I think anyone interested in the unconscious ways technology influences our lives would love this book.
  • Katie Kovalcin
    This book is a must read for anyone who uses technology in their daily lives. Sara's writing is so approachable and demystifies tech with examples of how biases in applications affect all of us. It was refreshing to read such an honest critique of the tech-focused world we live in. I couldn't put it down, I read it in one sitting!
  • Holly Dowell
    A really important look into the biases built into the tech that permeates our lives. Insightful anecdotes and important points about the need for ethics and diversity in the tech industry. As someone in that space already, most of it wasn’t new, but I’m glad the book exists! I hope the right people read it and change can come about in the companies that (literally) run our lives.
  • Bastian Greshake Tzovaras
    Technology is now the energy field that surrounds us and penetrates us; it binds our planet together. But the tech industry is failing all of us in myriads of ways. This book gives a great summary of the problems of technology and how they came to be. Recommended: for everyone who reads this on a screen and not in print.
  • Nicole
    Insightful and highly readable. None of the examples used will come as a surprise to anyone who’s been following tech stories for the last few years, but the author draws a neat through line that brings the problems into sharper focus.
  • Katie
    This isn't the first book I've read about how big data and the algorithms behind the technology that works in every facet of our lives can work against us. What was new for me about this book was the discussion of the culture of tech companies, which is often sexist and racist. Things like this make me want to just unplug from everything and not look back.
  • Laura
    I feel like pretty much everyone should read this book. Will definitely think about some aspects of technology differently having done so.
  • Rahul Phatak
    Pretty good look with examples/data at how/why the current SV companies bias their apps towards 'young Caucasian males' and what companies should do to fix this.
  • Robbin
    This is one of the most important books a responsible designer could read. Technology has a lot of responsibility to take, and there are ways to make that happen, but we need more people in higher positions to read content like this book.
  • Karen
    This book is a must-read for anyone interested in how tech culture affects all of our lives.
  • Sophia Ramos
    CD, this is a fascinating read that I borrowed from my friend Grace. It looks into the ways our technology has been created with less than the majority audience in mind. It's not written in academic jargon, which I appreciate—Wachter-Boettcher clearly wanted her text to be understood by the people it's affecting, rather than writing some peer-reviewed equivalent of a burn book to her colleagues. Definitely give this a shot if you want to understand more about how our technology works against us, and what actions we can take to fix it.
  • Lance Eaton
    Wachter-Boettcher's book on understanding the exclusionary power and privilege of technology is a must-read for anyone who works in technology or with technology (which, yes, means the vast majority of us). She moves through a variety of technologies, platforms, and systems to show how, while useful, technology also privileges certain groups of people and excludes others, and that if technology is going to be truly meaningful and transformative, it needs to be inclusive. She does this by looking at different technologies and raising questions around edge cases (people who do not fit the mold that tech designers assume will fit into their technology, or whom designers were not prepared for), intentional design made to rush users rather than engage them, and how companies have histories of abusing or not protecting the information they gather on users. It's not a call to be anti-tech but a call to be tech-conscious, tech-inclusive and tech-responsible, which is always appreciated. Her best work is done when she illustrates how simple steps in processes and technologies reveal innate and problematic assumptions on the side of the designers, such as when name inputs restrict the letter count (what if you have a particularly long name?) or when Google search results illustrate problematic results (that typically represent racial assumptions baked into or derived from other people's use). These help the common reader understand where these problems arise for those who may not have encountered them, or help them realize that they have indeed encountered such issues but did not know such things were conscious design choices. In total, it's worth checking out, as many of us can benefit from thinking about the inclusivity of technology.
  • kat
    Concise and motivating (if depressing). Should be required reading for everyone in tech.
  • Richard
    Even though I have some criticisms of this book, I’m compelled to give it 5 stars. The core message is too important.

    I agree with about 80% of what the author says and with about 60% of her reasoning. My takeaway of her central message is this: a lot of tech is badly designed. There are many reasons for this, among them the narrow perspective, experience, worldview, what have you, of the people who design the products. Too many of the software engineers, programmers, architects, designers, etc., don’t know what they don’t know, and that’s a bad place to be. The author focuses on issues of gender, lifestyle, and race. I would add age and related physical abilities. For years, I’ve beefed about badly placed or sized controls, too small labels, inscrutable jargon, and many other design shortcomings of the world of products we use. Twenty- and thirty-somethings seem to primarily design products for themselves, forgetting that, not knowing that, or not caring that there are millions of consumers out there who have fat fingers, poor eyesight, poor flexibility, are short of stature, and don’t operate with the same assumptions and knowledge background that people in the tech industry have. If you are a person who participates in product creation, this book is for you. You can learn a lot.

    Now for some criticisms and other observations.

    First, the author is a lib. I’m a conservative. I could have done without the occasional political asides. They didn’t add any value to the book and didn’t have anything to do with the author’s message. That said, except for the final short chapter, such asides were infrequent and mild. I bring this up because the author preaches sensitivity to different situations in people’s lives, but then makes the book a LITTLE off-putting for those who may not share her political views. For too many libs, inclusion seems to apply in one direction only. Minor point, however.

    The author relates an experience with a medical questionnaire form that asked a particular question about sexual history. I thought the author’s take on the possible outcomes of a response to that question was very insightful, not to mention very personal (and brave to even put that in the book). However, the one thing she didn’t cover was that people are free in many cases to just not answer the question. The choices on the form were Yes and No. On a paper form, just leaving the answer blank is also an option. Don’t buy into the mindset that your choice is limited to one of the options presented. Remember the saying, “Not to decide is to decide”? Same thing applies here. Not to answer is to answer. Let the questioner figure out what that means.

    The author complains that defaults strengthen stereotypes or biases. I agree with her reasoning on that, but here she kind of ignores the real world. For example, she argues that the default feminine voice of Alexa, Siri, and similar apps reinforces the stereotypical (and she would say biased) image that an assistant is a female role, and that role is subordinate. She’s probably right. But, in the real world, what company is going to debut a new product of the same type that doesn’t meet customer expectations, in this case, a Siri-type product that doesn’t initially present with a female voice? It might be at least a little risky in terms of market acceptance. Or it might not. I don’t know. But then, it isn’t my money on the line putting out such a product. Wachter-Boettcher then goes on to say that people tend not to change the default settings even when that option is available to them (which is true). But this is where she and I part ways. She kind of puts groups of people in the victim bucket (the victim of the microaggression of encountering a sexist default setting in this case), where I put them in the bucket of they really don’t care one way or the other. Or at least they don’t care enough to expend the time and effort of changing the default. I do agree that defaults should be easy to change, and better set-up wizards are always welcome. We just have a somewhat different take on how important some of this stuff is.

    Her perspective on “edge” cases is good. I like how she redefines them as “stress” cases. However, in every product I do think there is a cut-off point where the designer doesn’t have to account for unusual situations. She probably agrees with this, too, but I think our thresholds might be different, with hers being lower, and again maybe skewed toward her political leanings, as I imagine mine probably are, too.

    I liked the chapter that dealt with COMPAS, the Correctional Offender Management software. Early on, the author was complaining that COMPAS was about 60% accurate for both white and black arrestees, but that since there were proportionally more black subjects, black people were unduly affected. I thought, wait a minute, it seems to me that COMPAS is equally accurate (or inaccurate) for both populations. No injustice there. Later in the same chapter, when she pointed out that COMPAS predicts rearrests, not re-offenses, I finally got it. COMPAS predictions are self-reinforcing because the algorithm is in part self-referential. Good point. (I should say I liked the chapter in a way. I liked the analysis. I didn't like that COMPAS was having the effect that it did, or does.)

    The final point I’ll make in my review is this: products are designed to do a certain thing or certain sets of things (let’s call that X). People find them useful and tend to use them in novel or unpredictable ways (let’s call that Y). If the product then doesn’t do Y as well as people would like, whose problem is that? I think it’s primarily the users’ problem: they’re using the product in a way it wasn’t intended or designed to support. Now, it’s certainly in the company’s best interest to meet the market need and incorporate Y into the supported feature set. But sometimes that doesn’t or can’t happen for any of a number of reasons. I think the author thinks it’s the company’s problem and they have an obligation to “fix” the product so that it supports Y. An illustrative case of this could be eHarmony (which isn't in the book), where the developers designed a product for heterosexual users and gave no thought to homosexual users. In this case the courts forced them to accommodate the gay customers even though the company professed no expertise in same-sex matching. Was it the fault of eHarmony to put out a product that didn’t accommodate both straight and gay users? Or was it the fault of the gay users for insisting on using a product that didn’t take their needs into account, especially considering there were other sites that did cater to their needs? This case has societal values wrapped up into it, but I think it’s illustrative nonetheless.

    There is a lot more in the book than I have covered here. Again, I recommend the book for its insight and its ability to provoke thought.
  • Douglas Lord
    This scathing critique of the tech industry and its techniques is both informative and hair-raising. Wachter-Boettcher winningly posits that from top (industry giants like Facebook) to bottom (smaller, niche companies), services rely on finely crafted promises of ease, interconnectedness, and service to humanity. In reality, these are for-profit businesses. As these companies become more and more ubiquitous they act as quasi-public utilities—sans the governmental oversight and controls; Google’s summer 2017 anti-diversity uproar, Facebook’s September 2017 revelations about Russian ads, and the Equifax breach-and-coverup revealed in that same month (all of which occurred after this book was written) lend much credence to W-B’s well-written criticism. Indeed, W-B convincingly shows Silicon Valley bigwigs as a hegemony that “…routinely excludes anyone who’s not young, white, and male.” The inevitability of embedded tech, of it becoming “…more fundamental to the way we understand and interact with our communities and governments,” writes W-B, must be balanced with an absence of “…biased, alienating, or harmful” aspects in its creation. For every “there’s an app for that,” whether you are tracking your health, dating, or banking online, there’s a design flaw that humiliates, belittles, and undermines real human beings (e.g., a smart scale that scolds a toddler for gaining weight, a binary choice for sexual orientation, Native American names being deemed unacceptable on social media). VERDICT Provocative, passionate, impossible to ignore. Find reviews of books for men at Books for Dudes, the online reader's advisory column for men from Library Journal. Copyright Library Journal.
  • Marcus Kazmierczak
    A really good book covering the biases in technology, algorithms, and problems with Silicon Valley. The first half seemed more practical and applicable, covering form biases and design decisions you might not think about. The second half was good, but centered on news stories of tech firms behaving badly, most of which were already familiar and a bit less actionable.
  • Tim Kadlec
    Originally published at https://timkadlec.com/read/2018/techn...

    Last year I read "Fifty Inventions That Shaped The Modern Economy" by Tim Harford. My favorite question that he raised repeatedly was about who benefits from what we build, and more importantly, who loses. So whenever a new technology emerges, we should ask: who will win and who will lose out as a result?

    Sara's book is all about this exact idea. She looks at the technology landscape asking this question, and the answer she gets isn't a good one. We're building technology for people like us, and most of the time in this community, that means building for young, white males. And we're doing this without thinking about the consequences. But when we start looking at them together, a clear pattern emerges: an industry that is willing to invest plenty of resources in chasing "delight" and "disruption", but that hasn't stopped to think about who's being served by its products, and who's being left behind, alienated, or insulted.

    This book is an uncomfortable read, and it should be. It's painful to hold up the mirror and see just how badly we're falling short. But it's so important that we do. Technology drives so much of our day to day lives, and its reach is only expanding. It's not a hobby, it's not a niche thing—it's something that impacts everyone around the world every single day.

    I love that Sara very early points the finger at us, the people building technology, and then she never lets it waver. She doesn't let us hide behind the code or the math in the algorithms we build. Her focus is on the human aspect, as it should be. We're the ones who need to work to ensure that we're considering different viewpoints and testing our work through these different lenses.

    The book also builds very nicely from chapter to chapter. She progresses from seemingly basic considerations—like form fields—in the early chapters to complex algorithms in the later ones. Throughout, there are numerous examples of situations where people were left out by the decisions that we made on their behalf, whether or not we realized it. She also does a good job of zeroing in on some core beliefs in our field that contribute to the mess we're in: how the idea of a separate "technology industry" lets us avoid the checks and balances of established fields, how our obsession with engagement drives us to make the wrong decisions for the people using our products, and how the focus on collecting and selling people's data counters inclusivity.

    Sara isn't anti-technology. She just recognizes the importance technology has taken on, and the power of the decisions we make. Every form field, every default setting, every push notification affects people. Every detail can add to the culture we want—can make people a little safer, a little calmer, a little more hopeful.

    My own love of technology is because of this reach she describes. It's so incredible that what we build can be used by people all around the world, in various different walks of life. I want it to work for everyone. Taking the time to read Sara's book is a good way for anyone to get started in making that a reality.