How data bias is impacting the future of Artificial Intelligence (and what you should know) — digitalundivided

Image via MIT Press

Want to crack into the world of AI, but need help figuring out where to start? Enter our More than a Glitch: Confronting Race, Gender, and Ability Bias in Tech book giveaway! Open through June 16th at 8 pm EST.

You’ve probably been hearing a lot about Artificial Intelligence (“AI”). Many have clutched their pearls when they found out AI is being used to write wedding vows, while others are still digesting the thought of robots brewing beer. Conversely, AI gives us reasons to celebrate, such as its use to address climate change or detect the next epidemic. However you feel, you probably have some strong reactions.

“We’re at a particular phase in a hype cycle around AI. It is similar to the early aughts’ hype cycle around social media,” says NYU professor Meredith Broussard, an artificial intelligence journalist and author of the book More than a Glitch: Confronting Race, Gender, and Ability Bias in Tech. “We have the opportunity to make smarter decisions this time. How are these AI systems working? And what are the harms that people are experiencing at the hands of AI systems?”


For many of us outside the technology sector, wrapping our minds around how artificial intelligence works, let alone knowing how it impacts our lives or what questions to ask about it, seems daunting. But according to Broussard, it is essential that the public understands and challenges advances in AI, because those advances are already shaping our everyday lives, especially where computational bias is concerned.

“When AI algorithms discriminate, it’s often treated like a glitch, not a bug. In computing, there are glitches and there are bugs. Bugs are substantial. They are significant problems in the code that deserve to be addressed. But a glitch is just a blip: It’s not very important. It’s just the code acting weird. It’ll just take a minor update to fix.

So, for example, when Google Images labeled images of Black men as gorillas, it was treated as a glitch, like, ‘Oh, ha-ha, we definitely could not have ever predicted that would happen.’ But yet this is what happens over and over and over again.”

What Broussard argues in More than a Glitch: Confronting Race, Gender, and Ability Bias in Tech is that we must treat racism, sexism, and ableism in code as significant issues: bugs, not glitches. Jump into this fascinating interview to understand everything you need to know about how AI works, the future it could hold for all of us, and why you should challenge it.


digitalundivided: First of all, congratulations. What an incredible read. I’m not going to lie; my background is not in coding or anything tech or mathematically related. So, I was very apprehensive about this book, wondering how much I could understand. But you do such a beautiful job of very clearly and simply describing all of these intricate concepts in an easily digestible way.

Meredith Broussard: One of the reasons I wrote the book is that many people feel intimidated by technology. That’s something that technologists sometimes intentionally do to create more mystique around the field of computer science. But my ideal vision is a more inclusive one, where people are empowered to understand the technology and push back when algorithms make unfair or unjust decisions.

digitalundivided: That’s a great starting point for our interview for those in our community, especially those who may not be familiar with AI, algorithms, or the state of technological development. How does it all work?

Meredith Broussard: What I would love for people to take away from the book is the knowledge that computer systems do not spring fully formed from the head of Zeus. These are things that are made by human beings. They’re fallible, just like human beings are fallible.

People should understand the basics of AI, and the super basics of AI is that AI is math. It’s nothing more. It’s nothing less. It’s not Hollywood. It’s just math. The way you make an AI system, or a machine learning system, is you take a whole bunch of data, plug it into the computer, tell the computer to make a model and show the mathematical patterns in the data. You can then use that model to make new decisions and predictions or generate new material like sentences, paragraphs, and images.

That’s what’s happening with generative AI. But all of these machine learning systems are constructed the same way. When we talk about patterns, the patterns in the data are the patterns in the real world. So, if you’re training a computer model on data about the real world, you’re going to encode all of the biases of the real world into that mathematical model.
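Broussard’s description of training can be sketched in a few lines of code. The toy “hiring” dataset and the helper functions below are invented for illustration, not drawn from the book: a model that simply learns the statistical pattern in biased historical data will reproduce that bias in its predictions.

```python
from collections import defaultdict

# Hypothetical historical records: (group, hired?). The pattern in the
# data reflects a biased past process, not anyone's merit.
history = [
    ("A", True), ("A", True), ("A", True), ("A", False),
    ("B", False), ("B", False), ("B", False), ("B", True),
]

def train(records):
    """'Learn' the mathematical pattern in the data: hire rate per group."""
    counts = defaultdict(lambda: [0, 0])  # group -> [hired, total]
    for group, hired in records:
        counts[group][0] += hired
        counts[group][1] += 1
    return {g: hired / total for g, (hired, total) in counts.items()}

def predict(model, group):
    """Predict 'hire' when the learned rate for the group exceeds 50%."""
    return model[group] > 0.5

model = train(history)
print(model)                # the bias in the data is now in the model
print(predict(model, "A"))  # True
print(predict(model, "B"))  # False
```

Real machine-learning systems use far richer models than a per-group rate, but the mechanism is the same: the model is a mathematical summary of its training data, so whatever patterns, fair or unfair, are in the data end up in the predictions.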

digitalundivided: Talk about some examples of technological bugs you put forward in the book. One that shook me was automated soap dispensers. I always thought soap dispensers didn’t work for me, that I was somehow doing it wrong. But in your book, you explain that automated soap dispensers don’t respond well to melanated skin. Can you discuss how these algorithmic bugs — or biases — are encoded into standard technology or situations?

Meredith Broussard: Sure. I mean, it’s shocking. When you see the video of the racist soap dispenser, it’s just such a clear example of technology discriminating. What happens in the video is that there are two men in a washroom. The light-skinned man puts his hand under the soap dispenser, and it works. The man with dark skin puts his hand under the soap dispenser, and it doesn’t work. But then the man with dark skin gets a white paper towel, puts it under the soap dispenser, and the soap emerges. It’s so apparent that the soap dispenser is racist.

I don’t think the people who made the racist soap dispenser got up in the morning and said, “I’m going to build a soap dispenser that oppresses people.” I don’t think there was malice involved. It was probably unconscious bias. Perhaps the soap dispenser was developed by people with lighter skin who tested it on themselves and their friends and family and said, “Oh, it worked on us. It must work for everybody.” Right?

We all have unconscious biases. We’re all working on it and trying to become better people. But we can’t see our unconscious bias because it’s unconscious. And then, an inescapable fact is that we embed our preferences in the technologies that we create. A straightforward fix would be to have more diverse teams of people creating technology. But inside Silicon Valley, diversity has never been a priority.

digitalundivided: Can you tell us more about what Silicon Valley looks like and how its composition impacts the development of technology?

Meredith Broussard: Despite some apparent efforts, the needle has barely moved on diversity in Silicon Valley throughout its history. There are many causes for this. One of them is that technology firms have never prioritized diversity. If they had made it a priority, it would have happened.

Now, it’s not because there’s no talent out there. It’s a legacy issue. Computer science as a discipline is a descendant of mathematics. All of the earliest computer scientists were mathematicians and physicists, fields that still have a gender problem in 2023. Looking at elite academic math departments, you do not see many women in full professor roles. Math has never really reckoned with its diversity issues, in part because mathematicians were always held up as being above, you know, “petty human concerns,” dwelling in this elite mathematical realm.

As a descendant of mathematics, computer science inherited that lofty “above-it-allness.” That’s one of the things we see in tech companies. Ignoring diversity comes from a sense that it doesn’t matter because tech companies dwell in this lofty technical realm.

digitalundivided: What are the consequences of that lofty sentiment within the tech space?

Meredith Broussard: Technochauvinism is something that I write about in this book. It’s the idea that computational solutions are superior to all other solutions. I argue that we should instead use the right tool for the task. Sometimes, the right tool for the task is a computer. Sometimes it’s something as simple as a book in the hands of a child sitting on a parent’s lap. One is not inherently better than the other.

digitalundivided: The racist soap dispenser is one everyday example of technochauvinism in algorithms and technologically advanced products. What examples should we expect to see as AI advances within other industries?

Meredith Broussard: One of the stories I tell in the book is of Robert McDaniel. McDaniel is a Chicago man identified by an AI-based predictive policing program. The police arrived at his door one day and said, “Our algorithm says that the person living at this address is at risk of being involved in a shooting. We don’t know if that means you’re going to be shot or you’re going to be a shooter. But our algorithm says whoever lives here is at risk.” McDaniel said, “No, thank you. I am not interested in your algorithm. Please go.”

The police kept coming back. They kept coming back and offering intervention methods. They offered to get him into different programs around gun safety, around employment. McDaniel kept saying, “No, thank you.”

Eventually, the police came around so often, with their cars parked outside his house, that he got a reputation as a police informer. Then he was shot for being a snitch.

Now, this is not what anybody wanted. Right? Had the police just heard this citizen saying, “No, thank you. Leave me alone,” he would not have been shot.

digitalundivided: What agency does the everyday person have in developing technologies?

Meredith Broussard: We are currently in a complicated situation. It took us 30 years to get into this mess. I don’t think there’s a quick, easy way to escape it. I also don’t think that individual effort is the only thing that will help. We need people to make personal changes and companies to make changes. We need changes at the policy level. We need all of those things to happen.

One step is for people to empower themselves around AI and computation to understand more about what’s happening inside computing machines.


Order your copy of More than a Glitch: Confronting Race, Gender, and Ability Bias in Tech, now!
