November 13, 2020

Conrad Tucker visits IPS for Professional Journeys discussion

By Bill Brink

A few minutes into Conrad Tucker’s presentation, it was quiz time. The Arthur Hamerschlag Career Development Professor in Carnegie Mellon University’s Mechanical Engineering department clicked to a slide with the heading “What is this?” Beneath it was an apple.

Or so the audience at Tucker’s presentation, part of the Institute for Politics and Strategy’s Professional Journeys series, thought. It is, Tucker told them, a picture of an apple. He did it again, this time with a banana, except the banana was a plantain.

The next slide featured two pictures of the same dark-skinned woman. The computer could identify her face in the photo on the left, but not in the dimmer lighting of the photo on the right.

“Imagine if you were from this demographic,” Tucker said, “and you were expecting the AI system to detect who you are.”

These biases are among the issues Tucker studies in his work, which focuses on using machine learning to increase the efficiency of engineering design and to improve STEM education. He also studies the implications of AI and machine learning for healthcare, systems design, and cybersecurity, and he discussed the ramifications of potential bias in facial-recognition software by putting it in a human context.

“Forget AI, let’s just look at everyone here,” he said. “Pre-COVID, if I walked into a gas station with a mask on, even if it were winter, would there be an awareness or bias that’s created? There are certain groups that were very concerned when these mask mandates started to go on that they may be incorrectly classified and labeled in another way.”

Tucker brings a varied background to Carnegie Mellon, where he began working last year. He was born in Sierra Leone and spent parts of his childhood in Australia and Nigeria before moving to the United States. He grew up fascinated by airplanes and wanted to be a pilot; one of the presents he received after earning his PhD was a flight lesson.


He used this as an example of how AI and machine learning can be tailored to the student: While a project featuring an airplane might work for him, it won't work for everyone. That wide range of examples, he said, needs to extend to STEM faculty in order to attract students from underrepresented demographics.

“I think it’s really expanding the diversity of the faculty or whoever is leading these efforts who may provide different perspectives that resonate with students who may not necessarily be exposed to the technology,” he said.

When asked which demographics should be prioritized in the study of AI and machine learning, Tucker hearkened back to his example of the dark-skinned woman, suggesting that researchers start with groups already at risk of being adversely affected. He also outlined the risks and rewards of expanded use of AI worldwide.

“Many people are calling AI and machine learning the new gold rush,” he said. “There’s good reason to believe this will indeed impact geopolitics. The challenge is going to be, is it a race to the bottom in terms of privacy and policy, or do you uphold your morals and principles, but then potentially get left behind in terms of a competitive advantage?”

Tucker left the attendees with a plea to be passionate. He pointed over his shoulder at his desk, where a flight instruction manual sat upright on a shelf. He had planned to resume flight instruction before the pandemic, and though that will have to wait, he has not lost his passion.