Sara Wachter-Boettcher studies, writes and talks about technology design.
She’s the author of “Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech.”
“To The Best Of Our Knowledge” producer Shannon Henry Kleiber talked with her about why digital products are designed the way they are and how those choices are made.
Wachter-Boettcher warns that failing to account for the unintended ways technology shapes our lives can cause pain that might be avoided if we thought differently about how we design digital platforms and apps.
This transcript has been edited for clarity and length.
Design Choices Impact Our Everyday Lives (Without Us Even Noticing)
Sara Wachter-Boettcher: I think that you can really see bias at play in so many different places in the digital tools that you use every day. You can definitely see it in search engines. If you do a Google image search for a CEO, you are going to get almost nothing but images of white men; and if you do a Google search for anything related to, for example, black girls, you will get some exceptionally racist results.
You go to fill out a form and it has dropdown menus asking what your gender is or what your title is, and none of those options actually fits you. If facial recognition systems aren’t built with an inclusive enough set of images, they are going to work less well for people with darker skin than they do for people with lighter skin.
What has happened is that you have a tech industry that underinvested in understanding these unintended consequences of its design decisions for so long that the problems have now gotten really big. And the industry is in a pretty messy state.
When Trying To Create Delight Results In Pain
SWB: Until Apple updated the virtual assistant in 2016, if you asked Siri for help during certain kinds of crises, anything related to things like sexual assault or domestic violence, Siri had no idea what to do. Siri didn’t understand the nature of the query at all.
One of the problems there is that we need to be thinking about both the intended use cases and the unintended ones. Unintended use cases mean the stress or crisis moments, but also things like: are people using this product to hurt someone or harm someone? How could this be used in a way that is violent or unethical?
When Siri didn’t know what to do, one of its most common responses was to crack a joke. But that’s not very funny if you’re asking for help because you’ve been sexually assaulted.
I think that in the tech industry, this comes from a design culture that’s very focused on things like delight. But what about the times when delight is not going to happen, when delight is not an appropriate response?
Dear Algorithm: Popular Doesn’t Always Mean Positive
SWB: Eric Meyer is somebody I’ve known for a long time. When his daughter was diagnosed with aggressive brain cancer, he documented every aspect of her illness on his blog. Thousands and thousands of people followed his story. She died on her sixth birthday. It was a tragic, awful, terrible year for him.
He gets to the end of that year and he logs on to Facebook. It’s Christmas Eve, and he’s expecting well-wishes from friends and family, expecting to see what people are up to. Instead, he’s confronted with a picture of his daughter, surrounded by text that says, “Eric, here’s what your year looked like.” It’s an advertisement for a feature called “The Year in Review,” which shows the most popular content you posted all year. The idea is, let’s package it up in an album, show it to you and then try to get you to share it with your friends. People engage with it and leave comments on it, and every time they engage with it, that’s more advertising revenue for Facebook.
Nobody had stopped and thought, “What if the most popular content somebody posted this year wasn’t the most positive?”
Here was the worst day of his life, and they’d decided to surround it with balloons and streamers and say, “Hey, don’t you want to see this again?”
Emotions, Feelings Behind Technology
SWB: The tech industry has spent a long time hiring for technical skills alone. But we’re not just designing technology; we are fundamentally shaping people’s interactions with each other and their relationships with things like their employment and their money. These are really big and intimate things that people have a huge emotional connection to. So we need to invest in understanding those things. That kind of understanding is just as valuable as being able to program.