This article is the first in a new series about diversity, equity, and inclusion efforts in Washtenaw County's tech sector. Support for this series is provided by Ann Arbor SPARK.
Bias is present in every decision-making process, but in the tech world, programmers' biases can have particularly harmful effects on the people their products are intended to serve.
Meg Green (they/them), an Ann Arbor-based senior user experience researcher for
Rocket Homes, has explored this topic at length in their personal research. Green's job focuses primarily on research, but they also work closely with developers and designers, and they regularly discuss with developers the challenges that biased data can create.
Green gives an example of how data can be biased in the home-buying process.
“Clients who are purchasing or selling homes want to easily find a home in an area that they will enjoy living in, and there are public things about a neighborhood that can be easily searchable," Green says. "The most commonly asked question is about crime statistics.”
Green acknowledges that the more police there are in an area, the more crime gets reported, which can skew both the data and how it is analyzed.
Meg Green.
“Anyone can access information that is in the public domain regarding the city they wish to live in, but the way the information is used for data can perpetuate a problem that already exists,” says Green.
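A small simulation makes that point concrete. The sketch below is purely illustrative (the neighborhoods, rates, and numbers are invented, not drawn from Green's research): two neighborhoods have the same underlying rate of incidents, but the more heavily policed one records far more of them.

# Illustrative sketch: two neighborhoods with the same true rate of
# incidents, but different levels of police presence, produce very
# different *reported* crime statistics.
import random

random.seed(0)

TRUE_INCIDENT_RATE = 0.05      # same underlying rate in both neighborhoods
REPORT_RATE = {"A": 0.9,       # heavily policed: most incidents get recorded
               "B": 0.3}       # lightly policed: most incidents go unrecorded
RESIDENTS = 10_000

for neighborhood, report_rate in REPORT_RATE.items():
    incidents = sum(random.random() < TRUE_INCIDENT_RATE for _ in range(RESIDENTS))
    reported = sum(random.random() < report_rate for _ in range(incidents))
    print(f"Neighborhood {neighborhood}: {incidents} incidents, "
          f"{reported} reported ({reported / RESIDENTS:.1%} 'crime rate')")

A home-search tool that ranks neighborhoods on the reported statistics would rate these two places very differently, even though the underlying reality is identical.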
Bias can also play a powerful role in artificial intelligence (AI), and Green and others in the industry are reevaluating those effects. For example, Green points to an AI product called Correctional Offender Management Profiling for Alternative Sanctions (COMPAS), which uses demographic information to estimate how likely a defendant is to reoffend. Assessments by COMPAS and similar programs are sometimes used to determine the length of jail or probation sentences, despite evidence that they are biased against Black offenders.
“When it comes from a computer, people are more likely to trust the data,” Green says. “... The data is just statistics, but if you put biased data in, you will get biased data out."
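Green's "biased data in, biased data out" point can be shown in a few lines of code. In this hypothetical sketch (the groups and decisions are invented for illustration), a model that simply learns historical approval rates will reproduce whatever bias those past decisions contained.

# Minimal "bias in, bias out" sketch: the model learns from biased
# historical decisions and passes the bias straight through.
historical_decisions = [
    # (group, approved) -- group B was approved less often for similar profiles
    ("A", True), ("A", True), ("A", True), ("A", False),
    ("B", True), ("B", False), ("B", False), ("B", False),
]

# "Training": learn the approval rate per group from the biased history.
approval_rate = {}
for group in {g for g, _ in historical_decisions}:
    outcomes = [approved for g, approved in historical_decisions if g == group]
    approval_rate[group] = sum(outcomes) / len(outcomes)

# "Prediction": the model recommends approval only where the biased
# history approved often -- nothing in the statistics corrects for bias.
for group, rate in sorted(approval_rate.items()):
    print(f"Group {group}: learned approval rate {rate:.0%} "
          f"-> model recommends {'approve' if rate >= 0.5 else 'deny'}")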
Bias can also come into play in machine learning algorithms, which are used to teach AI how language works. The ways we categorize people, and the terminology we use, have changed over time, and context matters enormously in language. Unfortunately, AI can pick up bias around both gender and race.
"The word 'doctor' and ‘nurse’ in English are gender-neutral, but when it’s translated into German, Google Translate uses the masculine term for 'doctor,' and feminine term for 'nurse,'" Green says. "AI attempts to assume the context for determining whether to use the masculine or feminine word in languages that use gender signifiers."
Algorithms can also create negative associations with certain racial or gender identifiers such as "Black," "woman," or "gay." For example, a
Google search for "Black girls" used to return primarily results for pornography. And
including the word "transgender" in video titles has resulted in YouTubers receiving lower ad revenue on their videos.
“Being gay or being Black or being a trans woman does not mean these things are negative and that you don’t want to read this information,” Green says. “Anything about being bisexual and gay is pornographic and not acceptable for children, according to some biased data found with AI.”
Green suggests that counteracting these biases can be as simple as having software developers accompany user experience researchers when they interview the target users of their software.
“They can empathize with people and see their situations,” Green says. “The developers are more empathic and can now talk to their team about the situations. It is about building that empathy and helping people relate to the bias that is being entered."
Bias in software can have harmful consequences, and Green says it's up to programmers to reverse them.
“A machine algorithm is not going to learn unless we teach them better and help reprogram the data,” says Green.
For additional information about AI and gender roles, watch Green's video presentation, "Gender and Artificial Intelligence."
Monica Hickson is a freelance writer currently based in Ypsilanti. She joined Concentrate as a news writer in 2020 and is the author of a book, "The COVID Diaries." You may reach her at monica_alexis@yahoo.com.
All photos by Doug Coombe.