Carnegie Mellon Goes to the Dark Side

Imagine dedicating your career to studying something you can’t touch, or even see. The only way you know that it exists is by the mysterious push and pull it exerts on stars and galaxies millions of light years away.

A group of researchers at MCS’s newly formed Bruce and Astrid McWilliams Center for Cosmology chose to do just that, devoting their life’s work to the study of dark matter and dark energy.

Dark matter and dark energy make up more than 95 percent of the universe, yet scientists have no clue what they are. But they have some guesses about what the dark part of the universe does. Dark matter is thought to play a role in the formation and clustering of galaxies, while dark energy is responsible for the accelerating expansion of the universe. The idea of dark matter and dark energy is relatively new in the world of science. Dark matter was first proposed in the 1930s but wasn’t widely accepted until the early 1970s; dark energy was proposed in the 1970s but wasn’t named until 1998. Before these theories took hold, many thought that the universe was made up entirely of the elements found on the periodic table, the same elements that make up everything we see around us.

While physicists can successfully explain the nature of ordinary matter using the Standard Model of Particle Physics, they came to realize that this theory didn’t account for the majority of matter in our galaxy and beyond. Stars were orbiting too fast to be explained by the amount of visible matter in space, and the universe’s expansion was speeding up rather than slowing down; some other unknown ingredients had to be at play. Researchers came to believe those ingredients were dark matter and dark energy. To find out what exactly these elusive dark components are, researchers need to understand how the universe began and how it has evolved over time to become what it is today.
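
One classic hint is easy to sketch numerically. The short Python snippet below (with rough, illustrative numbers, not figures from the article) compares the orbital speed a star should have if a galaxy’s visible matter were all there is against the roughly flat speeds of about 220 km/s that astronomers actually measure far from the center; the widening gap at large radii is the kind of discrepancy dark matter was invoked to explain.

```python
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30     # solar mass, kg
KPC = 3.086e19       # one kiloparsec in meters

# Hypothetical, round numbers for a Milky-Way-like galaxy (illustration only):
M_visible = 9e10 * M_SUN        # rough mass of stars and gas
radii_kpc = [10, 20, 40, 80]    # distances beyond most of the visible matter

for r_kpc in radii_kpc:
    r = r_kpc * KPC
    # Circular speed expected if the visible mass (treated as enclosed within
    # this radius) were the only source of gravity: v = sqrt(G * M / r)
    v_expected = math.sqrt(G * M_visible / r) / 1000.0   # km/s
    print(f"r = {r_kpc:>2} kpc: expected ~{v_expected:3.0f} km/s from visible "
          f"matter alone; observed rotation curves stay near ~220 km/s")
```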

“We are in the most revolutionary time in physics since the development of the Standard Model of Particle Physics in the 1960s and early 1970s,” said Fred Gilman, dean of the Mellon College of Science and Buhl Professor of Theoretical Physics. “The questions we want answers to are just as revolutionary. At the same time, the tools that we have give us the power to answer them. Using machine learning to help us analyze huge data sets, supercomputing and programs to simulate the universe, and possibly directly observing dark matter give us the ability to do things that were previously impossible.” Gilman hopes that cosmology researchers will be able to pull from the data a fundamental understanding of the nature of dark matter and how it shaped the universe.

Tiziana Di Matteo and Rupert Croft, associate professors of physics, are harnessing the power of supercomputers to recreate how galaxies are born and how they develop over time. They work with machines within the physics department and at the Pittsburgh Supercomputing Center. A computer cluster nicknamed “Ferrari” is dedicated solely to McWilliams Center researchers, and a supercomputer cluster funded through a donation from the Moore Foundation is shared with the School of Computer Science.

Croft crafts computer simulations that start with the conditions thought to be present at the beginning of the universe, the Big Bang, and then apply the laws of physics along with proposed algorithms that model the gravitational pull of dark matter and other cosmological phenomena, including cooling gas and exploding stars. Di Matteo uses simulations to better understand the physics of black holes. She was the first to incorporate black hole physics into such simulations, work that produced the most detailed and accurate recreation of the evolution of the universe to date.
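
As a rough illustration of the kind of calculation such simulations are built on, here is a toy N-body sketch in Python: particles start spread out almost uniformly and gravity alone pulls them into clumps. It is only a sketch; it leaves out everything that makes the real simulations hard (the expansion of space, dark energy, gas cooling, star formation, black holes), and its units, particle count and step sizes are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
n, G, soft, dt = 200, 1.0, 0.05, 0.01   # toy values, not physical units

pos = rng.uniform(0.0, 1.0, size=(n, 3))   # nearly uniform initial conditions
vel = np.zeros((n, 3))
mass = np.full(n, 1.0 / n)

def accelerations(pos):
    """Softened pairwise gravitational accelerations (O(n^2) direct sum)."""
    diff = pos[None, :, :] - pos[:, None, :]        # vectors from i toward j
    dist2 = (diff ** 2).sum(-1) + soft ** 2         # softened squared distance
    inv_r3 = dist2 ** -1.5
    np.fill_diagonal(inv_r3, 0.0)                   # no self-force
    return G * (diff * inv_r3[:, :, None] * mass[None, :, None]).sum(axis=1)

# Leapfrog (kick-drift-kick) time stepping; print occasional "snapshots".
for step in range(500):
    vel += 0.5 * dt * accelerations(pos)
    pos += dt * vel
    vel += 0.5 * dt * accelerations(pos)
    if step % 100 == 0:
        print(f"step {step}: spread of particle positions {pos.std():.3f}")
```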

Simulations such as Croft and Di Matteo’s provide snapshots of the development of the universe in frames of half a million years each. Strung together, the frames create a movie of cosmic evolution over the past 14 billion years. At the beginning of the simulations, matter is evenly dispersed, but over time it begins to clump, with superclusters aggregating at intersections to form what is known as the “cosmic web.” It is thought that this clumping is shaped by dark matter and dark energy.

“Knowing why there is structure in the universe and not just empty space is key in understanding dark matter. Without dark matter, our galaxy wouldn’t have formed and we wouldn’t be here,” said Croft. “Without dark energy, we might still be here, but things would look a whole lot different.”

Thanks to the power of the supercomputers available to the Carnegie Mellon researchers, the images created in the simulations are of such high resolution, precision and detail that the researchers can zoom in on a particular event, like the formation of a black hole, to see what happens both during the formation and in its aftermath.

“While a biologist can recreate an environment in a Petri dish, or a chemist in a test tube, cosmologists can’t make another universe in the lab,” said Croft. “Computer simulations give us the opportunity to test our theories that otherwise couldn’t be tested.” In fact, Croft believes that the Carnegie Mellon simulations could be used to invent new tests for dark matter and dark energy.

Furthermore, the computer simulations can help experimental astrophysicists plan their telescopic observing strategies to best see the types of phenomena important to the history of the universe.

Indeed, it is a great big sky, billions of light years deep. Physics Professor Jeff Peterson hopes to see as far back as possible, into the “Dark Ages” of the universe, a time when there were no stars and no light. He hopes to be able to see “The Enlightenment,” when the first stars turned on. This event has been simulated, but never measured.

To do this, Peterson is gathering data from the 21 cm (centimeter) band of radiation given off by neutral hydrogen, using a prototype radio telescope he built on the old LTV Coke Works site along the Monongahela River. Radio telescopes use antennas to gather information from the invisible radio-frequency part of the electromagnetic spectrum, as opposed to optical telescopes, which gather information from light in the visible part of the spectrum. Kevin Bandura, a doctoral student and McWilliams Fellow working with Peterson, has developed a prototype interferometer that detects these radio emissions.

At 21 cm, you can see the glow of neutral gas from the Dark Ages, stretched to longer wavelengths by the expansion of the universe by the time it reaches Earth. When stars begin to glow, they ionize the neutral hydrogen, turning off its glow. The telescope, which according to Peterson looks like a “hammock for giants,” focuses radio waves from the universe onto a line of antennas that then feed the data to computers.
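
The arithmetic behind that observation is simple: the 21 cm line is emitted at about 1,420 MHz, and the expansion of the universe stretches it, so radiation from earlier and earlier epochs arrives at lower frequencies and longer wavelengths. A minimal sketch, with illustrative redshift values that are not figures from Peterson’s experiment:

```python
# Rest-frame properties of the neutral-hydrogen (21 cm) line.
REST_FREQ_MHZ = 1420.4
REST_WAVELENGTH_CM = 21.1

# Illustrative redshifts spanning roughly the era of the first stars back
# into the Dark Ages (not values tied to any particular experiment).
for z in [6, 10, 20, 50]:
    observed_mhz = REST_FREQ_MHZ / (1 + z)          # frequency drops with z
    observed_cm = REST_WAVELENGTH_CM * (1 + z)      # wavelength stretches
    print(f"z = {z:>2}: observed near {observed_mhz:6.1f} MHz "
          f"(wavelength about {observed_cm:4.0f} cm)")
```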

Peterson hopes to build a much larger version of this telescope that will allow him to look at radio waves from across the universe, an essential next step because little neutral hydrogen is left in the universe today.

Carnegie Mellon will also use data from optical telescopes to find evidence of dark energy and dark matter. Carnegie Mellon has joined more than 23 universities, national laboratories and corporations in constructing the world’s most powerful survey telescope, the Large Synoptic Survey Telescope (LSST). Images from the LSST will be used to trace billions of remote galaxies and measure the distortions in their shapes produced by concentrations of dark matter, providing multiple tests of dark energy.
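
A toy sketch of why surveying billions of galaxies matters for that measurement: each galaxy’s intrinsic shape is essentially random, so averaging over enormous numbers of them lets a tiny coherent distortion imprinted by intervening dark matter emerge from the noise. The shear and scatter values below are invented for illustration and are not LSST numbers.

```python
import numpy as np

rng = np.random.default_rng(1)
true_shear = 0.02          # hypothetical coherent distortion from dark matter
intrinsic_scatter = 0.3    # typical random spread of galaxy shape measurements

for n_galaxies in [100, 10_000, 1_000_000]:
    # Each measured shape = coherent lensing signal + random intrinsic shape.
    observed = true_shear + rng.normal(0.0, intrinsic_scatter, n_galaxies)
    estimate = observed.mean()
    uncertainty = intrinsic_scatter / np.sqrt(n_galaxies)
    print(f"{n_galaxies:>9} galaxies: shear estimate {estimate:+.4f} "
          f"+/- {uncertainty:.4f} (true value {true_shear:+.4f})")
```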

Telescopes like the one Peterson hopes to build and the LSST accumulate enormous sets of astrophysical data. To attempt to define dark matter and dark energy, researchers need to be able to pick out subtle, non-linear signals from these noisy masses of information. Carnegie Mellon has assembled a team of astrostatisticians focused on developing statistical techniques for analyzing this data. These researchers, from the College of Humanities and Social Sciences, have used statistical methods to analyze cosmic microwave background radiation, estimate the dark energy equation of state, analyze galaxy spectra and detect galaxy clusters.
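
As a minimal illustration of that statistical problem (and not of the group’s actual methods), the sketch below buries a weak, smooth feature in noise and recovers it with a simple kernel smoother; real analyses of survey data use far more sophisticated techniques.

```python
import numpy as np

rng = np.random.default_rng(2)
x = np.linspace(0.0, 1.0, 2000)
signal = 0.3 * np.exp(-((x - 0.6) / 0.05) ** 2)   # weak, non-linear feature
data = signal + rng.normal(0.0, 1.0, x.size)      # feature buried in noise

def kernel_smooth(x, y, bandwidth=0.03):
    """Nadaraya-Watson estimate: a Gaussian-weighted local average."""
    weights = np.exp(-0.5 * ((x[:, None] - x[None, :]) / bandwidth) ** 2)
    return (weights * y[None, :]).sum(axis=1) / weights.sum(axis=1)

smoothed = kernel_smooth(x, data)
peak = x[np.argmax(smoothed)]
print(f"smoothed estimate peaks near x = {peak:.2f} (true feature at x = 0.60)")
```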

To study and understand dark matter and dark energy, researchers must sift through billions of years of history and a universe’s worth of data. While the McWilliams Center is housed in MCS’s Physics Department, it depends on contributions from across the university, including from researchers in computer science and statistics, as well as from international collaborations.

Complementing the work of the statisticians, researchers in the School of Computer Science will develop scalable algorithms, data mining and machine learning techniques for analyzing and gaining knowledge from massive amounts of cosmological data, and will create tools for running cutting-edge numerical simulations relevant to cosmology.

With these tools, and this large multi-disciplinary team, the McWilliams Center hopes to shed light on the dark part of the universe, making this elusive matter to which they have dedicated their careers definable and real.