This study path asks learners to investigate the design processes behind frequently used technologies and to analyze those processes for points where a different approach could lead to better, less biased outcomes.
By Zoe Chao, Penn State University, User Experience Librarian
Technology is evolving at an unprecedented pace, and our daily lives are ever more intertwined with its products. More often than not, we use a product or a service without much thought about the rationale behind its design, its available functions, or its distribution. Certainly, how a product is made needs to align with its target audience. However, when the benefits of technology do not apply to everyone equally, we should ask why one group of people is favored over others, what decision-making processes contribute to such outcomes, and how we can improve the situation.
The learner will:
- Be more aware of and able to articulate potential bias in products and services
- Learn about the concepts of an inclusive design process
Reflection and Discussion Prompt
Joy Buolamwini, an MIT graduate student, had to borrow her roommate’s face to use AI-powered facial detection software for her project because the software was unable to detect her own dark-skinned face. Her roommate was white. 1
What is the issue here?
A design process normally includes identifying problems, developing ideas, creating solutions, and evaluating the results, and then the cycle repeats. Had someone like Ms. Buolamwini participated at any stage of that process, the failure to detect dark-skinned faces would have been discovered. Having people with diverse backgrounds participate in a design process helps us see “the imbalance of knowledge and power where it exists” 2 and work toward a product that is more empathetic and inclusive.
Though most of us are not involved in designing high-tech products, we may still have conveniently made assumptions based on our own experiences when designing systems to deliver our services and products.
- What other assumptions might be baked into AI-powered facial recognition software, and how might they affect people?
- Have you experienced similar issues, where technology worked poorly for you or did not work at all? If so, what incorrect assumptions were being made about you?
Reflection and Discussion Prompt
Imagine this real-life scenario. It was late and 20 degrees outside. You arrived at the bus stop early just to make sure you wouldn’t miss the hourly bus home. Finally, you saw the bus approaching, but it unexpectedly made an early turn and drove away before reaching your stop. Upset and cold, you called customer service to report the situation. You got a courteous response saying that the bus had to make a detour due to road construction. But there was no sign of road construction and no notice about the detour as far as you could see. When you protested, the representative said, “You would’ve seen the notice if you had downloaded our mobile app or followed us on Twitter or Facebook.”
Based on the scenario, reflect on the following questions:
- What do you think of the bus alert system? What assumptions did the bus company have when they designed the alert system?
- Think about a service or product provided by your institution: can it be used by everyone without problems?
- Can you list the assumptions (or stereotypes) you or your institution may have made that might result in such bias in a service or product?
Read the following case study by Scott Young: Participation, Design, Empathy, Justice: The User Experience with Underrepresented Populations (UXUP) Project.
For each assumption you listed in the two discussion/reflection questions above, can you think of an activity (or action) in the design process that would help mitigate bias?
Create a written design audit. First, find a new example of algorithmic bias in the news. Research and learn more about the design process behind that particular technology. Next, acting as a design manager, pinpoint steps in that design process where you, as the manager, could either ask your team questions or make specific design proposals that would help address bias.
The inherent bias in our technology 3, a playlist of TED Talks that includes the following:
- “How I’m fighting bias in algorithms” by Joy Buolamwini
- “The moral bias behind your search results” by Andreas Ekström
- “Beware online ‘filter bubbles’” by Eli Pariser
- “Machine intelligence makes human morals more important” by Zeynep Tufekci
Examples of Bias in Technology
Inclusive Design Process
Participation, Design, Empathy, Justice: The User Experience with Underrepresented Populations (UXUP) Project, a case study focusing on participatory design with Native youth, by Scott Young.