State leaders on the Connecticut Advisory Committee to the U.S. Commission on Civil Rights started their Thursday morning with a lesson on how artificial intelligence and algorithms can discriminate against minority groups even when they are designed to counteract that very discrimination.
Dr. Nicol Turner Lee, a Senior Fellow at the Brookings Institution, focuses her research on digital privacy and the application of civil rights in an increasingly connected world. On Thursday, she was the first to address the Advisory Committee on these subjects, part of an effort to raise awareness of the biases built into automated systems that govern decisions about employment and finances.
During her presentation, Dr. Turner Lee cited research finding that people with ethnic-sounding names are served ads for higher-interest credit cards and loans. She also voiced her concern that, as companies rely more and more on aggregated personal data, such as shopping habits or the type of device used, decisions that affect a person’s employment or financial standing will increasingly rest on a machine’s best guess rather than an individual assessment.
“This is a discrimination that is very different from what my parents and other parents have experienced during the ’60s in the Jim Crow South,” explained Dr. Turner Lee. “Back then, you could see the ‘Whites Only’ and ‘Coloreds Only’ signs. Today, the internet doesn’t share or compare or place you in the same virtual platform to determine whether you are seeing this ad or that ad based on your inferential attributes.”
So what can we do about it? Dr. Turner Lee offered a few suggestions beyond simple awareness of the problem. For one, she said it is important for legislatures to revisit the civil rights laws currently on the books and determine how they apply to the digital economy.
Additionally, since these algorithms are usually the product of private industry, she recommended that state leaders establish accountability strategies to create a more responsive ecosystem, requiring civil rights impact statements and assessments from companies whose systems make financial decisions about citizens. These reviews would help ensure that automated systems avoid bias where possible.
Dr. Turner Lee was quick to point out that these algorithms and automated processes serve an important purpose: they increase efficiency and allow companies to operate faster and at greater scale. But, she said, they are still coded by individuals, and within systems that have proven over time to carry their own internal biases.
She offered the example of home appraisals, which can be biased against people of color in part because of a history of redlining that devalued neighborhoods populated by Black homeowners. Over decades, that intentional discrimination has a cumulative effect, depressing property values within certain zip codes. A computer cannot easily take these historical factors into account.
The key to counteracting these systemic biases, then, is human oversight: ensuring that the people who create the algorithms and automation check one another’s biases. “It’s very, very important that we start out with the presumption that these systems start out flawed, and we need the human agency and the accountability strategies to make sure that as people are resting and relying on these systems that they’re not necessarily getting the same outcomes and goals,” said Dr. Turner Lee. “We want this technology to be better than us.”