Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech review

Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech by Sara Wachter-Boettcher

My rating: 3 of 5 stars


I have mixed feelings about this book. If you’re designing consumer products in the tech industry, you should definitely read it. Or, at the other end, if you use social networks and the like but haven’t thought about how much influence design decisions can have, you’d probably get a lot out of it. But I’m in the squishy middle: I’ve heard a lot about this sort of thing, but it doesn’t apply to me directly.

Which is not to say I didn’t get anything out of this book. Wachter-Boettcher’s thesis is that the tech industry has convinced itself that it’s a meritocracy of the best and brightest, which means (so the thinking goes) that tech companies:
– don’t make products that are biased; their products are just based on algorithms (as if algorithms can’t be biased!)
– aren’t sexist or racist; they just hire the best people for the job

Sadly, both of these beliefs are entirely wrong, but based on what I read, they are indeed widely held in the industry.

Odds and ends from the book:
– Wachter-Boettcher makes the point that a lot of design teams default to catering to the “average” user and treat everyone else as “edge cases”. But people aren’t really “average”; she cites a 1950s study of Air Force fighter pilots that measured ten body dimensions (shoulders, chest, waist, and so on) and defined “average” as the middle 30 percent for each. Not a single pilot fell in the average range on all ten measurements. (There’s a quick simulation of this after the list.)
– The book is full of examples of companies just not thinking about these “edge cases”. One of the interesting ones is about names. The author has five names (including a hyphenated last name), so she’s had plenty of experience with systems that can’t handle her name. She describes this as being like a microaggression, which makes sense to me. (I feel the same way whenever I see forms for the kids that ask for a mother and father…) This spins off into a discussion of Facebook’s policy that you have to use your real name. There are a lot of reasons people don’t want to use their real names – political refugees, victims of stalking, drag queens, etc. Facebook did eventually bend on the policy, but it was still much easier for someone to report you for using a fake name than for you to respond to the report. Wachter-Boettcher also talks about how Facebook’s messaging around this became more user-friendly, going from “Your Name Wasn’t Approved” to “Help Us Confirm Your Name”.
– There’s a discussion about racism on Nextdoor, the social network for your physical neighbors. This was an infamous problem; people would often report suspicious activity whenever they saw a non-white person, sigh. The book says that Nextdoor banned racial profiling, but it also spent a long time redesigning the form people use to report suspicious activity so that it emphasizes clothing, hair, and so on instead of race, and it added rules so that you can’t describe someone by race alone. Nextdoor claims that all of these changes together reduced racial profiling by 75%. The cost is that a lot more people abandon the new form without submitting it, and most of the time that kind of reduced “engagement” would be considered a terrible thing.
– Wachter-Boettcher talks about how in 2012 Google started letting you see what it thinks your interests, age, and gender are. (You can still see it here if you haven’t turned off ad personalization.) People realized that Google thought women who were interested in tech stuff (including the author!) were men. This isn’t a big surprise, because Google’s algorithm was trained on what it saw in the past – that more men than women are interested in tech stuff. But it has a cascading effect: the algorithm comes to think it’s even more likely that people interested in tech stuff are men! (There’s a toy sketch of this feedback loop after the list.)
– There’s a discussion of ProPublica’s investigation into COMPAS, an algorithm used to predict recidivism in people convicted of crimes. ProPublica found that the algorithm is biased against black defendants: those who did not go on to reoffend were flagged as high risk at roughly twice the rate of white defendants who did not reoffend. (The last sketch after the list shows the kind of false-positive-rate comparison ProPublica ran.)
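
Here’s the quick simulation I promised for the pilot-study point. It’s my own rough sketch, not anything from the book: draw a made-up population, call the middle 30 percent of each of ten dimensions “average”, and count how many people are average on all ten at once. The assumption that the dimensions are independent normals is mine; real body measurements are correlated, but the punchline comes out the same.

```python
import numpy as np

rng = np.random.default_rng(0)
n_people, n_dims = 4_000, 10

# Made-up population: each row is a person, each column a body dimension.
measurements = rng.normal(size=(n_people, n_dims))

# "Average" on a dimension = within the middle 30% (35th to 65th percentile).
lo = np.percentile(measurements, 35, axis=0)
hi = np.percentile(measurements, 65, axis=0)

in_band = (measurements >= lo) & (measurements <= hi)
average_on_all = int(in_band.all(axis=1).sum())

print(f"'Average' on any one dimension: about 30% of {n_people} people")
print(f"'Average' on all {n_dims} dimensions at once: {average_on_all} people")
# With independent dimensions the expected count is 4000 * 0.3**10 (about 0.02),
# so this almost always prints 0 -- the same shape as the Air Force result.
```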
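
And here’s the toy sketch of the ad feedback loop. It’s my own construction, nothing like Google’s actual system: in this little world men and women are equally interested in tech, but the model starts with a historically skewed belief, hands out tech-ad impressions according to that belief, and then re-learns the belief from the clicks it collects. The “boost” exponent is just an arbitrary stand-in for ranking effects.

```python
def next_belief(belief, boost=2.0):
    """One round of the loop: impressions follow the current belief, clicks follow
    impressions (true interest is equal for everyone), and the belief is then
    re-estimated from the clicks. boost is an arbitrary stand-in for ranking effects."""
    men_impressions = belief ** boost
    women_impressions = (1 - belief) ** boost
    return men_impressions / (men_impressions + women_impressions)

# Start from a historically skewed belief, even though in this toy world
# men and women are equally interested in tech.
belief = 0.60
for round_number in range(1, 7):
    belief = next_belief(belief)
    print(f"round {round_number}: model now thinks {belief:.0%} of tech-interested users are men")
# The belief climbs from 60% toward nearly 100%: the skewed belief shapes which
# data gets collected, and that data reinforces the belief.
```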
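
Finally, a sketch of the kind of check ProPublica ran on COMPAS, with made-up records rather than the real data: for each group, compute how often people who did not reoffend were nonetheless labeled high risk (the false positive rate).

```python
# Each record: (group, labeled_high_risk, reoffended). Made-up numbers,
# chosen only to keep the arithmetic easy to follow.
records = [
    ("A", True,  False), ("A", True,  True),  ("A", False, False),
    ("A", True,  False), ("A", False, False), ("A", True,  True),
    ("B", False, False), ("B", True,  True),  ("B", False, False),
    ("B", False, False), ("B", True,  False), ("B", False, True),
]

def false_positive_rate(group):
    """Among people in the group who did NOT reoffend, the share labeled high risk."""
    non_reoffenders = [r for r in records if r[0] == group and not r[2]]
    flagged = sum(1 for r in non_reoffenders if r[1])
    return flagged / len(non_reoffenders)

for group in ("A", "B"):
    print(f"group {group}: false positive rate = {false_positive_rate(group):.0%}")
# Prints 50% for group A and 25% for group B. A score can be similarly accurate
# overall for both groups and still flag one group's non-reoffenders as high
# risk far more often -- that asymmetry is what ProPublica reported.
```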


View all my reviews
