In September 2020, ISA hosted a virtual three-day course for the United States Special Operations Command (SOCOM) on data-driven decision-making with artificial intelligence and data science within the Department of Defense (DoD). The goal of the course was to discuss and debate these topics in order to identify points of leverage and knowledge gaps within the DoD. More than 300 participants from SOCOM and across the DoD joined the presentations from around the world, including Afghanistan, South Korea, and SOCOM headquarters in Florida.
IPS Communications specialist Bill Brink was allowed to attend the short course and write about what he saw and heard:
A man entered the frame and turned to face the camera. The CyLab and Institute for Software Research computers correctly identified the subject, known to their facial-recognition software as Mahmood. Mahmood put on glasses, and again the computers ID’d him.
Martial Hebert, the Dean of Carnegie Mellon University’s School of Computer Science, played the next video. “Just a small corruption in principle,” he said, “can be used to perturb the system.”
Mahmood donned another pair of glasses, these much brighter in color. This time, the computer identified him as Ariel, a woman. Further iterations ID’d him as Reese Witherspoon and Russell Crowe. Dr. Hebert was illustrating the vulnerabilities of artificial intelligence and machine learning: an adversary doesn’t need to overwhelm an entire system to disable it, only a small portion of it.
“It’s very important to understand truth in advertisement,” he said.
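The glasses demonstration is an instance of what researchers call an adversarial example: a small, deliberately crafted change to an input that flips a model’s output. As a rough illustration of the underlying idea, and not the method the CMU researchers actually used, here is a minimal fast-gradient-sign sketch in PyTorch, where model, image, and true_label are hypothetical placeholders:

```python
import torch
import torch.nn.functional as F

def fgsm_perturb(model, image, true_label, epsilon=0.03):
    """Illustrative fast-gradient-sign attack: nudge each pixel a tiny
    amount in the direction that increases the model's loss, so that a
    barely visible change can alter the predicted identity."""
    image = image.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(image), true_label)
    loss.backward()
    # epsilon bounds how far any pixel moves; even small values can flip labels
    return (image + epsilon * image.grad.sign()).clamp(0, 1).detach()
```

The adversarial glasses work the same way in spirit, except the perturbation is confined to the pixels the glasses cover, and that alone is enough to redirect the classifier.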
Dr. Hebert presented these examples during an executive short course that Carnegie Mellon’s Institute for Strategic Analysis (ISA), an initiative within the Institute for Politics and Strategy, held on September 21-23 with the United States Special Operations Command, or SOCOM. During the three-day event, titled “Data-driven Decision-making with Artificial Intelligence and Data Science in the Department of Defense,” more than three hundred SOCOM members learned from Carnegie Mellon faculty and from SOCOM and Department of Defense officials about the latest innovations in artificial intelligence and machine learning, as well as the importance of data and of creating a culture conducive to embracing these technologies.
“The cultural aspect of it really is about, how do we educate the teams that we have, ensure that we can get people to align to a common set of goals, a common set of instructions, on how do we move on this path forward so that we think collectively about the right ways to move forward?” said Thomas Kenney, SOCOM’s Chief Data Officer. “But also, part of it too is, how do we select the right leaders who are thinking this way?”
ISA has engaged with the defense, diplomatic, and intelligence communities since its founding in 2013, providing information on the science that underlies policy. It has held several short courses, including sessions at the US Army War College, educating military and defense personnel on the latest in cyber warfare, artificial intelligence, machine learning, and robotics.
Established in 1987 and headquartered at MacDill Air Force Base in Tampa, Florida, SOCOM organizes, trains, and equips the US military’s special operations forces, such as the Navy SEALs and Army Rangers. Ten Carnegie Mellon faculty members participated in the three-day event.
Dr. David Danks, professor of philosophy and psychology and head of Carnegie Mellon’s Department of Philosophy, contextualized the challenges inherent in the ethics of artificial intelligence with the example of self-driving cars. Do you program them to obey the speed limit, or to keep up with traffic, which has been shown to reduce accidents? Follow the rules, or do what’s best for the passenger?
“We need to move away from thinking that everything is either ethically obligatory or ethically forbidden,” he said. “The reality is, in most cases in our lives, the vast majority of things, we have to make a choice and there’s a range of things that are ethically permissible.”
Jon Fox, who served as the chief information officer for an Army unit during his twenty years on active duty and now leads Sam’s Club’s enterprise architecture team, grounded that notion in reality. Sam’s Club can track everyone who buys, for example, a pressure cooker: when, where, and even what car they get into afterward, information that can be used in legal matters or to advertise future products to that consumer. The question becomes whether to use that information, and if so, how.
“We have that ability,” Fox said. “Our culture continuously battles on, how do we turn that into a capability.”
Those types of decisions become easier when the workforce is both up to speed on the technology and equipped to use it effectively.
“SOCOM has it a little easier than most. They make video games about you guys,” said David Spirk, the Department of Defense’s Chief Data Officer. “They make a lot of great movies about you. The concept of special forces is a great motivator and gets a lot of people to initially pay attention, but what I tell you is, to make sure that the opportunities are advertised well in advance of a USAJobs posting. You have to generate that interest, that enthusiasm, and then have a real opportunity using the tools that the commercial world uses today. We’ve got to not be afraid of getting on traditional social media, as well as employment-focused social media platforms and push the job announcements out there.”
Once you gather the workforce, you need to gather the data. Advancements in artificial intelligence, machine learning, and robotics require as much data as possible for the computers to process.
“Can a human do it?” asked Dr. Nathan VanHoudnos, a Senior Machine Learning Research Scientist at Carnegie Mellon’s Software Engineering Institute. “If yes, a machine can be taught to do it if we gather enough data.”
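Dr. VanHoudnos’ rule of thumb describes supervised learning: collect enough labeled examples of a task humans can already perform, and a model can learn to imitate it. A minimal sketch, using scikit-learn’s bundled handwritten-digit data purely as a stand-in task:

```python
from sklearn.datasets import load_digits
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Reading handwritten digits is a task humans can do; with enough
# labeled examples, a model can learn to do it too.
X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
print(f"Held-out accuracy: {model.score(X_test, y_test):.2f}")
```

The pattern scales with data, which is why so much of the conversation kept returning to gathering it.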
After Mahmood’s glasses turned him into Ariel, Dr. Hebert showed a video of a drone flying through a dense forest. The models used in the drone’s vision system are learned rather than programmed. The trick is to teach the system to recognize when the data isn’t good enough, the way humans slow down while driving when dense fog impairs their vision.
“In real situations, this is always going to happen,” Dr. Hebert said. “The system is going to encounter a situation at the edge of where it’s been built, and it needs to recognize that.”
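One common way to build in that kind of self-awareness, sketched here as a hypothetical rather than the approach used in the drone Dr. Hebert showed, is to threshold the model’s predictive confidence and fall back to cautious behavior when it drops:

```python
import torch
import torch.nn.functional as F

CONFIDENCE_FLOOR = 0.8  # assumed threshold; tuned per system in practice

def perceive_or_slow_down(model, frame):
    """Run perception on a single frame (a batch of one), but fall back
    to cautious behavior when low confidence suggests the input lies
    outside the model's training envelope."""
    probs = F.softmax(model(frame), dim=-1)
    confidence, prediction = probs.max(dim=-1)
    if confidence.item() < CONFIDENCE_FLOOR:
        return "slow_down_and_reassess"  # the fog-driving analogy
    return prediction.item()
```

Softmax confidence is only a crude proxy for true uncertainty, which is one reason recognizing the edge of a system’s competence remains an active research problem.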
Dr. Matthew Travers, a Systems Scientist at CMU’s Robotics Institute, further illustrated Dr. Hebert’s point. Travers and a team of researchers competed in the DARPA Subterranean Challenge, in which teams from around the world built robots and drones to map subsurface environments like tunnels and caves. He showed a video of drones taking off from the backs of the robots and mapping the “underground environment,” a never-commissioned nuclear power facility. During the competition, the drones and robots had to find, classify, and localize as many artifacts as they could within one hour, and they had to do it on their own.
“Everything that you’re looking at here is autonomous navigation and search,” Travers said.
When the event concluded, ISA had deepened its relationship with the defense and military communities, and those communities left with a better understanding of the latest technology and how it could impact their systems. They had what their computers need: more data.