August 26, 2023
NOTE: A version of this story appears in our 2023 South Dakota Festival of Books guide, produced by South Dakota Magazine.
Meredith Broussard’s first book, Artificial Unintelligence: How Computers Misunderstand the World, made the case that our eagerness to incorporate technology into every aspect of our lives was producing poorly designed systems that complicated tasks as often as they solved them. She continues her case against technochauvinism — the idea that computational solutions are superior to all others — in More Than a Glitch: Confronting Race, Gender, and Ability Bias in Tech.
Broussard is a data journalist and former computer scientist who teaches at the Arthur L. Carter Journalism Institute of New York University. More Than a Glitch explores what she calls “the intersection of technology and social issues.” She argues that neutrality in technology is a myth and that we need to root out the algorithms that fail certain segments of the population. “The problems in our technological systems are actually problems with human society,” Broussard says. “Technology is not flawless, even though people like to imagine that technology is objective or neutral or unbiased. It’s not. The systems we build are socio-technical systems.”
She explains that machine learning works by feeding data into a computer and then asking it to create a model showing mathematical patterns. Those models can then be used to make decisions and predictions. Mathematical fairness reigns, while social fairness suffers.
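The loop Broussard describes — historical data goes in, a model of its mathematical patterns comes out, and predictions simply replay those patterns — can be sketched in a few lines. Everything below is hypothetical and invented for illustration; the "model" is just an approval rate learned per neighborhood.

```python
from collections import defaultdict

# Hypothetical past lending decisions: (neighborhood, approved?)
history = [
    ("north", True), ("north", True), ("north", True), ("north", False),
    ("south", False), ("south", False), ("south", True), ("south", False),
]

def train(records):
    """Learn the approval rate per neighborhood -- the 'pattern' in the data."""
    counts = defaultdict(lambda: [0, 0])  # neighborhood -> [approvals, total]
    for neighborhood, approved in records:
        counts[neighborhood][0] += int(approved)
        counts[neighborhood][1] += 1
    return {n: approvals / total for n, (approvals, total) in counts.items()}

def predict(model, neighborhood):
    """Approve when the historical approval rate exceeds 50 percent."""
    return model[neighborhood] > 0.5

model = train(history)
print(predict(model, "north"))  # True: past approvals beget future approvals
print(predict(model, "south"))  # False: past denials are replayed as policy
```

The toy model is mathematically consistent with its training data — which is exactly the problem when that data encodes past discrimination.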
A startling example comes from The Markup, a news organization focused on algorithmic accountability reporting. Its investigation of automated mortgage approval algorithms revealed that borrowers of color were 40 to 80 percent more likely to be denied than their white counterparts. “The reason for this becomes really clear when we think about how machine learning systems are made,” she says. “The automated mortgage approval model is fed with data about who has gotten mortgages in the past, and in the past there has been financial discrimination against borrowers of color. The model is picking up on those mathematical patterns. We could use more math to put a thumb on the scale and make the system less biased. It’s just that nobody’s really doing that, or doing it well enough.”
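One crude form of the "thumb on the scale" Broussard mentions is adjusting a model's decision threshold per group so an inherited bias is not simply replayed. The scores and groups below are hypothetical, invented for illustration; group "b" scores run lower only because, in this sketch, past discrimination shaped the data behind them.

```python
# Hypothetical model scores for applicants in two groups.
scores = {
    "a": [0.9, 0.8, 0.7, 0.4],
    "b": [0.6, 0.5, 0.45, 0.2],
}

def approval_rate(group_scores, threshold):
    """Fraction of applicants at or above the cutoff."""
    return sum(s >= threshold for s in group_scores) / len(group_scores)

# One blanket cutoff treats unequal histories as if they were neutral facts.
uniform = 0.7
print(approval_rate(scores["a"], uniform))  # 0.75
print(approval_rate(scores["b"], uniform))  # 0.0

# Per-group thresholds chosen to equalize approval rates -- one simple
# mathematical correction of the kind Broussard alludes to.
thresholds = {"a": 0.7, "b": 0.45}
for group in scores:
    print(group, approval_rate(scores[group], thresholds[group]))  # 0.75 for both
```

Whether equalized approval rates are the right fairness target is itself a policy question, which is part of Broussard's point: the correction is social before it is mathematical.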
The book explores similar stories in realms from education to medicine to policing. Broussard covers facial recognition software that works more accurately on people with light skin, and predictive policing, which uses computer models to forecast who is likely to commit a crime based on past criminal activity. Broussard cites incidents in which innocent people were unjustly targeted by each technological “advancement.”
So how do we fix it? Broussard remains optimistic that as more people gain computational literacy and feel empowered to examine and criticize artificial intelligence systems, we will eventually build technology that brings us toward a better world. “Right now, we are building AI systems that replicate the world as it is with all of its flaws and biases,” she says. “I would rather we build technology that helps us get to the world as it should be.”
Learn more about humanities programming in South Dakota by signing up for SDHC e-Updates!