Author opens up about AI technology’s use in her own breast cancer diagnosis

ABC News

(NEW YORK) — Artificial intelligence has been a hot topic in recent years, with a growing number of industries adopting a technology that’s predicted to revolutionize certain sectors.

The technology fuels excitement, but there are also concerns about potentially negative impacts on people and society.

In her new book, More Than a Glitch: Confronting Race, Gender and Ability Bias in Tech, AI researcher and data journalism professor Meredith Broussard dives into the impact of some of AI’s deep-rooted problems.

Broussard spoke to ABC News about discovering how AI was used during her own diagnostic testing for breast cancer and some of the potential downsides of the technology being used in this capacity.

LINSEY DAVIS: So in the book you talk about being diagnosed with breast cancer and you learn at some point that AI had actually read your exam. What was your reaction to that?

MEREDITH BROUSSARD: I thought it was so strange, because I was depending on my doctors for care in this crisis moment of my life. And I thought, what did this AI find? How is it being used? And because I’m an AI researcher, I thought, who made this particular AI, because I know that there’s a lot of bias in artificial intelligence systems.

So I didn’t do anything with this knowledge right away, but then after I had recovered, I went back to it. I took an open-source AI and ran my films through it in order to write about the state of the art in AI-based cancer detection.

DAVIS: Do you think that this is something we’re going to increasingly see in the medical world?

BROUSSARD: I think one of the things that medical researchers really want to do is heal more people, and diagnose people earlier and more accurately, and the real hope is that AI will help with that. Is it going to happen any time soon? No. Could it happen eventually? Maybe.

DAVIS: What are the potential risks of AI being used in this capacity?

BROUSSARD: One of the things that people often don’t realize is that when you make an AI system, a machine learning system, for something like breast cancer diagnosis, you actually have to tune it to have a higher rate of false positives or false negatives.

So a false positive would mean that it says, “Oh, you might have cancer,” when you don’t actually have cancer, and it sends you for more testing. And a false negative would mean that it says, “Nope, no cancer here,” but you actually do have cancer. The cost of a false negative is much higher in medicine, so these systems are tuned to have more false positives than false negatives. That means the system is quite often going to say, “Oh, yeah, I think there might be a problem here,” and people are going to get referred for more testing, which, as many people know, can mean weeks or even months of waiting for additional tests and being on the edge of your seat while you worry.
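To make that trade-off concrete, here is a minimal, purely illustrative Python sketch with made-up labels and scores; it is not the system used in Broussard’s care, just a demonstration of how lowering a classifier’s decision threshold trades false negatives for false positives.

```python
# Illustrative only: a toy decision threshold, not any real diagnostic system.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: labels mark whether "cancer" is present (1) or not (0);
# scores stand in for a model's confidence, higher on average for true positives.
labels = rng.integers(0, 2, size=1000)
scores = np.clip(0.5 * labels + rng.normal(0.3, 0.2, size=1000), 0.0, 1.0)

def error_counts(threshold):
    """Count false positives and false negatives at a given decision threshold."""
    flagged = scores >= threshold
    false_positives = int(np.sum(flagged & (labels == 0)))
    false_negatives = int(np.sum(~flagged & (labels == 1)))
    return false_positives, false_negatives

# Lowering the threshold flags more cases for follow-up testing:
# more false positives, fewer missed cancers (false negatives).
for t in (0.7, 0.5, 0.3):
    fp, fn = error_counts(t)
    print(f"threshold={t:.1f}  false positives={fp}  false negatives={fn}")
```

Real diagnostic tools weigh these costs far more carefully, but the direction of the trade-off is the same: accepting more false alarms is the price of missing fewer cancers.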

DAVIS: Well, people at home are going to wonder: Did you have a false positive?

BROUSSARD: So, I had really great medical care throughout my cancer experience. And I just, I’m so grateful to everybody who took care of me. One of the interesting things about the AI that diagnosed me is that it’s not actually used for diagnosis. It is used by the doctors at this particular hospital after they have already given their own diagnosis, so it’s like a backup tool. So the doctor, the radiologist, will enter in what they think the diagnosis is, and then they’ll get access to the AI’s results. And they can either ignore the AI’s results or they can use it to say, “OK, well, maybe I’ll go back and look at that area of concern again.”

So it’s still in the hands of doctors. Nobody needs to worry that AIs are out there, you know, diagnosing people with cancer. And we’re definitely not in the situation that a lot of people imagine, where it would be like a box that you kind of walk up to, and it scans you and tells you, “Yes, you have cancer. No, you don’t have cancer.” Like, that’s not a realistic scenario.

DAVIS: Not today.

BROUSSARD: Not today. I mean, I hope it’s not ever like that, because I do not want bad news medically. I want that from a doctor. I don’t want it from a machine.

DAVIS: Meredith, we thank you so much. Of course, this is a conversation that’s going to continue in the months and years ahead. We want to let our viewers know that her book, More Than a Glitch: Confronting Race, Gender and Ability Bias in Tech, is now available wherever books are sold.

Copyright © 2023, ABC Audio. All rights reserved.