This study path asks learners to investigate the design processes behind frequently used technologies and to identify where a different approach could lead to better, less biased outcomes.
By Zoe Chao, User Experience Librarian, Penn State University
Abstract
Technology is evolving at an unprecedented pace, and our daily lives are ever more intertwined with its products. More often than not, we use a product or service without much thought about the rationale behind its design, its available functions, or its distribution. Certainly, how a product is made needs to align with its target audience. However, when the benefits of technology do not apply to everyone equally, we should ask why one group of people is favored over others, what decision-making processes contribute to such outcomes, and how we can improve the situation.
Learning Objectives
The learner will:
- Be more aware of and able to articulate potential bias in products and services
- Learn about the concepts of an inclusive design process
Activity
Reflection and Discussion Prompt
Joy Buolamwini, an MIT grad student, had to borrow her roommate’s face to use AI-powered facial detection software for her project because the software was unable to detect her dark-skinned face. Her roommate was white. 1
What is the issue here?
A design process typically cycles through identifying problems, developing ideas, creating solutions, and evaluating the results. Had someone like Ms. Buolamwini participated at any of these stages, the failure to detect dark-skinned faces would have been discovered. Having people with diverse backgrounds participate in a design process helps us see “the imbalance of knowledge and power where it exists” 2 and work toward a product that is more empathetic and inclusive.
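To make the evaluation stage concrete, the sketch below is a hypothetical illustration (not the workflow of any real product) of how disaggregating test results by demographic group can surface the kind of disparity Ms. Buolamwini encountered. The group labels and detection results are invented for demonstration purposes only.

```python
# A minimal sketch of an evaluation-stage bias audit (illustrative only).
# The data below is invented: each record pairs a self-reported skin-tone
# group with whether a hypothetical face detector found a face in that image.
from collections import defaultdict

# Hypothetical evaluation results: (skin_tone_group, was_face_detected)
results = [
    ("lighter", True), ("lighter", True), ("lighter", True), ("lighter", False),
    ("darker", True), ("darker", False), ("darker", False), ("darker", False),
]

# Tally detections per group so disparities become visible
# rather than averaged away in an overall accuracy number.
totals = defaultdict(int)
detected = defaultdict(int)
for group, hit in results:
    totals[group] += 1
    if hit:
        detected[group] += 1

for group in totals:
    rate = detected[group] / totals[group]
    print(f"{group}: detection rate {rate:.0%} ({detected[group]}/{totals[group]})")

# A large gap between groups (here 75% vs. 25%) is the kind of signal that an
# evaluation stage with diverse testers and disaggregated metrics can surface.
```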
Though most of us are not involved in designing high-tech products, we may still make convenient assumptions based on our own experiences when designing the systems through which we provide our services and products.
Discussion/Reflection Questions
- What other assumptions might be baked into AI-powered facial recognition software, and how might they affect people?
- Have you experienced similar issues with a technology that worked poorly for you or not at all? If so, what incorrect assumptions were being made about you?
Activity
Reflection and Discussion Prompt
Imagine this real-life scenario. It was late and 20 degrees outside. You arrived at the bus stop early just to make sure you wouldn’t miss the hourly bus home. Finally, you saw the bus approaching, but it unexpectedly made an early turn and drove away before reaching your stop. Upset and cold, you called customer service to report the situation. You got a courteous response explaining that the bus had to make a detour due to road construction. But there was no sign of road construction and no notice about the detour as far as you could see. When you protested, the representative said, “You would’ve seen the notice if you had downloaded our mobile app or followed us on Twitter or Facebook.”
Discussion/Reflection Questions
Based on the scenario, reflect on the following questions:
- What do you think of the bus alert system? What assumptions did the bus company have when they designed the alert system?
- Think about a service or product provided by your institution. Can it be used by everyone without problems?
- Can you list the assumptions (or stereotypes) you or your institution may have made that might result in such bias in a service or product?
Assignment
Read the following case study by Scott Young: Participation, Design, Empathy, Justice: The User Experience with Underrepresented Populations (UXUP) Project.
For each assumption you listed in the two discussion/reflection questions above, can you think of an activity (or action) in the design process that would help mitigate bias?
Assessment
Write a report based on a new example of algorithmic bias in the news: research the design process behind that technology and identify points in that process where you could address bias.
Resources
Videos
The inherent bias in our technology,3 a playlist of TED Talks that includes the following:
- “How I’m fighting bias in algorithms” by Joy Buolamwini
- “The moral bias behind your search results” by Andreas Ekström
- “Beware online ‘filter bubbles’” by Eli Pariser
- “Machine intelligence makes human morals more important” by Zeynep Tufekci
Overview
Acemoglu, D. (2007). Equilibrium bias of technology. Econometrica, 75(5), 1371-1409.
Hankerson, D., Marshall, A. R., Booker, J., El Mimouni, H., Walker, I., & Rode, J. A. (2016, May). Does technology have race? In Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems (pp. 473-486). ACM.
Hudson, L. (2017, July 20). Technology is Biased Too. How Do We Fix It? FiveThirtyEight. Retrieved from https://fivethirtyeight.com/features/technology-is-biased-too-how-do-we-fix-it/
Kilbourne, W., & Weeks, S. (1997). A socio-economic perspective on gender bias in technology. The Journal of Socio-Economics, 26(3), 243-260.
Examples of Bias in Technology
Lohr, S. (2018, February 9). Facial Recognition Is Accurate, if You’re a White Guy. The New York Times. Retrieved from https://www.nytimes.com/2018/02/09/technology/facial-recognition-race-artificial-intelligence.html
Noble, S. U. (2018). Algorithms of oppression: How search engines reinforce racism. NYU Press.
Smith, D. (2013, January 25). ‘Racism’ of early colour photography explored in art exhibition. The Guardian. Retrieved from https://www.theguardian.com/artanddesign/2013/jan/25/racism-colour-photography-exhibition
Snow, J. (2018, February 26). Bias already exists in search engine results, and it’s only going to get worse. MIT Technology Review. Retrieved from https://www.technologyreview.com/s/610275/meet-the-woman-who-searches-out-search-engines-bias-against-women-and-minorities/
Inclusive Design Process
Barton, A. C., Tan, E., & Greenberg, D. (2016). The makerspace movement: Sites of possibilities for equitable opportunities to engage underrepresented youth in STEM. Teachers College Record, 119(6).
Manzini, E., & Rizzo, F. (2011). Small projects/large changes: Participatory design as an open participated process. CoDesign, 7(3-4), 199-215.
Massimi, M., Baecker, R. M., & Wu, M. (2007, October). Using participatory activities with seniors to critique, build, and evaluate mobile phones. In Proceedings of the 9th international ACM SIGACCESS conference on Computers and accessibility (pp. 155-162). ACM.
Young, S., & Brownotter, C. (2018). Toward a More Just Library: Participatory Design with Native American Students. Weave: Journal of Library User Experience, 1(9). Retrieved from https://quod.lib.umich.edu/w/weave/12535642.0001.901?view=text;rgn=main
Yost, F. (2018). Participatory Approaches to Building and Improving Learning Ecosystems: The Case Study of the Library for Food Sovereignty. Weave: Journal of Library User Experience, 1(9). Retrieved from https://quod.lib.umich.edu/cgi/t/text/idx/w/weave/12535642.0001.902?view=text;rgn=main
Case Studies
Participation, Design, Empathy, Justice: The User Experience with Underrepresented Populations (UXUP) Project, a case study focusing on participatory design with Native youth, by Scott Young.