Carnegie Mellon University
October 25, 2017

"Frankenstein"'s 200th Anniversary Inspires Discussions on Technology, AI

By Jennie Dorris

Shannon Riffe
  • University Libraries
  • 412-268-7260

Carnegie Mellon University faculty and students are exploring the influence of technology on modern life through a 200-year-old lens: Mary Shelley's "Frankenstein." Shelley's novel, which explores the drive to create and the ethics of responsibility, provides a current metaphor for examining the fast-paced development of artificial intelligence.

To celebrate the 200th anniversary of "Frankenstein," University Libraries, which houses a rare first-edition copy of the novel, created "Frankenstein 200: Perils and Potential," a series of events in which CMU experts are discussing the innovative possibilities and the hazards of technology. The book and a special exhibit will be on display from late May 2018 through the summer.

"Where does all this innovation lead? It's not about accidentally slipping up and having commercially driven science create a genetically modified dinosaur. It's how we change our habits, how automated bots on the web alter us, alter our voting, and alter our perception of each other," said Rikk Mulligan, digital scholarship strategist with CMU Libraries. "It's programmed by humans for now, but there's the possibility of artificial intelligence growing so sophisticated it could stretch itself into designs we can't even comprehend anymore."

Other scholars agree that it is not the typical science fiction model of a robot-turned-evil that catches their attention.

"One of the key pieces of Frankenstein's story was...the human element," said Barry Luokkala, teaching professor of physics in the Mellon College of Science. After bringing his creation to life, Frankenstein flees to his bedroom. The creature, wanting a relationship with his creator, finds Frankenstein, parts the curtains of his bed and smiles at him, before Frankenstein flees the building altogether. "The human element and bias [in data and artificial intelligence] is something that has to be recognized."

Jeffrey Bigham, an associate professor of human-computer interaction in the School of Computer Science, recently encountered software bias while working with colleagues to build a system that could label images on social media so people with visual impairments could know the images' content. Along the way, the researchers found unexpected errors.

"One example was a picture of Hillary Clinton at a rally and the description that the algorithm made was that it was 'a man doing a trick on a skateboard,'" Bigham said.

The error had a simpler cause than they expected: the creators of the system were skateboarding fans, and skateboards appeared frequently in their initial data. This biased the system toward sports, so it saw things that were not there.

A skateboard-level mistake may seem harmless, but the implications are dangerous, Bigham said. Bias can easily move from sports to gender or race. In a recent case, biased software predicted the likelihood that someone in Wisconsin would be rearrested, and race turned out to be a very significant factor. This leads to a question: Should algorithms represent the world as it is or the world as it should be?

"Descriptively, it was probably right. If you are African-American in the United States you are more likely to be rearrested than if you are white. But we might also plausibly think that is due to structural racism," said David Danks, head of the Department of Philosophy in the Dietrich College of Humanities and Social Sciences.

Mulligan, Luokkala, Danks and Bigham participated in a panel discussion titled "Creation and Consequence" and were joined by Molly Steenson, a professor of design in the College of Fine Arts. They discussed their roles as academics studying technological developments, as well as opportunities to influence how artificial intelligence is developed ethically.

"It's often done working with companies directly. Working with the educational systems to teach the next generation of designers and the next generation of technologists to be aware of the human dimension. It means going out to policymakers and making sure they understand," Danks said.

"Frankenstein 200: Perils and Potential" events continue with a short film competition for students, with a screening of selected entries on Dec. 6. On Feb. 8, a roundtable will feature CMU graduate students discussing the perils and promise of their research.