February 19, 2020
Army War College International Fellows visit Carnegie Mellon
By Bill Brink
A cartoon featuring a caricature of a stereotypical American flashed on the screen in Tepper Quad’s Simmons Auditorium Tuesday. The man was fat, slovenly and hooked up to an oxygen tank. He rode a motorized grocery cart with an American flag on the back and a machine gun mounted on the front. Above and below him, text: “Syrians are being attacked? We must help them by attacking them too.”
The cartoon, shown during a presentation about the role of social networks in defense and intelligence to visiting fellows from the Army War College, did not represent the latest and greatest in the American arsenal. It represented part of a misinformation campaign, the type of cyber warfare that has increased — and worked — in recent years.
“Traditional information warfare, all of the Cold War, was often viewed as simply involving four different kinds of campaigns: Dismiss, distort, dismay and distract,” said Professor Kathleen Carley, the director of the Center for Computational Analysis of Social and Organizational Systems and an Institute for Politics and Strategy executive committee member. “As of 2018, NATO was still using these four. What we have found, in fact, is that that’s not sufficient enough, that many campaigns are actually carried out by trying to make people happy, not just sad; by using these new types of information maneuvers to actually create polarization or mass hysteria; and there’s new technologies, new resources out there, tactics, techniques and procedures.”
The Institute for Strategic Analysis and the CyLab Security and Privacy Institute collaborated to produce a day of presentations for the Army War College’s International Fellows, a group of eighty or so officers from militaries around the world who spend 10 months on the Carlisle, Pennsylvania, campus taking courses and completing a research project.
“Both current and future leaders recognized that it is through university partnerships where services are enabled to develop concepts and to acquire capabilities,” said Colin Clarke, an assistant teaching professor in the Institute for Politics and Strategy who also works with the Institute for Strategic Analysis. “University partnerships provide the military, government and intelligence agencies with the theories and technology to meet and overcome seemingly intractable security challenges. These relationships often produce operational and tactical solutions, and I know that the network and relationships that are forged here today will go on to serve you all in the future.”
The fellows heard from Professor Carley about efforts to trace terror groups in order to anticipate the method of their next attack and to gauge how eager a country is to obtain a nuclear weapon. Bryan Parno, an associate professor in the Computer Science Department and the Electrical and Computer Engineering Department, detailed the latest advances in verified encryption systems. Known as Project Everest, the project attempts to verify the correctness of code implementing HTTPS — Hypertext Transfer Protocol Secure, the prefix you look for when submitting your credit card information.
“We can promise you that it is going to run correctly, that it is not going to leak secrets,” Professor Parno said.
The fellows toured the CyLab Biometrics Center for a demonstration of Carnegie Mellon’s advances in facial recognition software. They gathered around Dipan Pal, a PhD student working in the lab. Pal stood in front of a monitor displaying a blurry picture of Dzhokhar Tsarnaev, one of the brothers who bombed the Boston Marathon in 2013.
The image was of no use to facial recognition software, but within a day, crowdsourcing turned up a picture that worked. Carnegie Mellon’s software pins sixty thousand points on the face — “The reason you do it,” Pal said, “is because now you understand the face” — and the digital rendering of Tsarnaev’s face ranked in the top twenty among a million mugshots in a database.
“The haystack gets smaller,” Pal said.
Pal then fired up a long-distance iris identification machine. From across the room, the machine read his eyes and matched them with a picture of him from five years ago. Then he did it again wearing a Guy Fawkes mask; again the device identified him.
The day concluded with presentations on creating national Computer Emergency Response Teams (CERTs) and on how decision and behavioral science factor into policy.
“We are the school of strategic land power, but we have people from the Army, Navy, Air Force, Marines, civilians; now that we have a Space Force, we do have some space officers at the War College,” said Colonel Brian R. Foster, the director of the International Fellows Program at the Army War College. “It’s definitely relevant now. Cyber, the information, the fake news, that’s a really big topic these days.”