Research - The McWilliams Center for Cosmology - Carnegie Mellon University

The McWilliams Center YouTube channel showcases some recent research by members of the center.

Research Directions

Research directions of the Center include theoretical astrophysics, with emphasis on computation and simulation; experimental astrophysics, with emphasis on the dark components of the universe and on data mining; and particle physics, especially as related to the search for and theoretical understanding of dark matter particles at the LHC.


     Astrostatistics

     Astrostatistics is concerned with developing statistical techniques for
     the analysis of astrophysical data.  Carnegie Mellon has a unique,
     established group of researchers in astrostatistics who have tackled a
     wide range of astrophysical problems. Recent research topics include:
     analysis of the Cosmic Microwave Background, estimating the dark energy
     equation of state, analysis of galaxy spectra, detecting galaxy clusters
     via the Sunyaev-Zeldovich effect, identifying filaments, and estimating
     density functions with truncated data.  A common theme in this work is
     the goal of detecting subtle, nonlinear signals in noisy,
     high-dimensional data.  The group plans to be deeply involved in future
     large surveys such as the LSST.
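The theme of detecting subtle signals in noisy data can be illustrated with one of the simplest nonparametric tools in the astrostatistics kit, a kernel density estimator. The sketch below is a minimal stdlib-only illustration with an invented sample, not Center code:

```python
import math

def gaussian_kde(data, bandwidth):
    """Return a kernel density estimate built from a 1-D sample.

    Each data point contributes a Gaussian bump of width `bandwidth`;
    the estimate at x is the average of those bumps, so the result
    integrates to one like a proper density.
    """
    norm = 1.0 / (bandwidth * math.sqrt(2.0 * math.pi))

    def density(x):
        return sum(
            norm * math.exp(-0.5 * ((x - d) / bandwidth) ** 2)
            for d in data
        ) / len(data)

    return density

# Toy "observed" sample clustered near zero (illustrative only).
sample = [-0.9, -0.4, -0.1, 0.0, 0.2, 0.3, 0.7, 1.1]
f = gaussian_kde(sample, bandwidth=0.5)
```

In practice the hard statistical questions are choosing the bandwidth, correcting for truncation and selection effects, and scaling this idea to high-dimensional survey data.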

     Computer Science

      As petascale computing becomes a mainstay in many fields of scientific
      research, Computer Science researchers at Carnegie Mellon aim to
      develop the software, architectures, and community expertise to use these
      machines optimally. Emerging advances in multiscale modeling, simulation,
      machine learning, data mining, and visualization can be
      exploited at the petascale for future scientific discovery. In particular,
      research focuses on developing data-intensive scalable computing (DISC)
      architectures and algorithms for managing and serving scientific data and computations
      to potentially very large user bases; developing scalable algorithms, data
      mining, and machine learning techniques for analyzing and gaining
      knowledge from massive amounts of data such as those to be gathered
      by the LSST; and developing the tools for doing cutting-edge
      numerical simulations relevant to cosmology.

     Experimental Astrophysics

     From the study of the earliest energy emission in the universe - the
     Cosmic Microwave Background Radiation - to the evolution of galaxies and
     the formation of large-scale structure, Center researchers are part of
     the worldwide scientific effort to determine the basic cosmological
     parameters, investigate the nature of dark matter and dark energy, and
     describe and understand the evolution of the universe.  Many of these
     parameters are expected to be tied down using data from current and
     planned ground-based and space-based observatories.  Carnegie Mellon has
     joined the collaboration building the Large Synoptic Survey Telescope,
     which will be the premier ground-based survey telescope in the next
     decade.  The analyses of these data sets are very challenging and will
     require both the development of highly sophisticated simulations and the
     application of the latest tools in data mining, statistics, and computer
     science.  Carnegie Mellon is also working on partnerships to build the
     Cylinder Radio Telescope to explore the universe, and especially the
     nature of dark energy, using the 21 cm radiation from neutral hydrogen.
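The 21 cm technique rests on simple redshift arithmetic: the hydrogen hyperfine line is emitted at about 1420.4 MHz, and a radio telescope observes it at that frequency divided by (1 + z). A quick sketch of the standard textbook relations (not Center code):

```python
# Rest-frame frequency of the neutral-hydrogen hyperfine (21 cm) line.
F_21CM_MHZ = 1420.405751768

def observed_frequency_mhz(z):
    """Frequency at which the 21 cm line emitted at redshift z is observed."""
    return F_21CM_MHZ / (1.0 + z)

def observed_wavelength_cm(z):
    """Observed wavelength of the redshifted 21 cm line."""
    c_cm_per_s = 2.99792458e10  # speed of light in cm/s
    return c_cm_per_s / (observed_frequency_mhz(z) * 1e6)
```

For example, gas at z = 2.5 shows up near 406 MHz, i.e. at a wavelength of roughly 74 cm, which is why such surveys operate at low radio frequencies.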

    Theoretical Astrophysics

    Theoretical astrophysics research carried out at the Center focuses on the
    formation of structure in the universe and the role played by dark matter
    and dark energy. Large-scale cosmological simulations are used as a tool
    to investigate the formation of galaxies and the growth and evolution of
    their associated super-massive black holes. The material in between
    galaxies is also an active area of study, as it contains the gas from
    which future stars will form. As we look back in time to the so-called
    "dark ages" before the first stars formed, all these topics converge, and
    important roles are played by the first black holes, earliest galaxies and
    intergalactic gas in the re-ionization of the universe. This epoch is just
    beyond the current observational frontier, and theoretical predictions are
    being made for what will be seen, work made possible by the development of
    petascale simulation algorithms and physical modeling at the McWilliams
    Center.  The Center's dedicated computer cluster, Ferrari, and the Moore
    supercomputer shared with Computer Science will be important facilities in
    carrying out this research.

    Theoretical Particle Physics

     The LHC (Large Hadron Collider) will produce collisions of protons at
     energies never before reached. The products of these collisions could
     very well include the dark matter particles that compose twenty-three
     percent of the mass-energy in the universe. Indeed, there are compelling
     arguments that the energies at which the LHC operates are exactly in the
     window to see Weakly Interacting Massive Particles (WIMPs).  The
     remarkably successful Standard Model (SM) of particle physics, however,
     does not include WIMPs or any other realistic dark matter
     candidate. With data soon to come from the LHC, theorists in the Center
     will be part of the world-wide challenge to extend the SM in a way which
     is consistent with both the mathematics of quantum field theory and the
     bounds arising from laboratory experiments and cosmological
     observations. In particular, the discovery of a dark matter candidate
     would allow us to study its properties in a laboratory setting and,
     together with theoretical insights, to develop an underlying theory that
     encompasses both the SM and the new physics that includes dark matter.


     Experimental Particle Physics

    Carnegie Mellon is a member of the international collaboration that built and operates the Compact Muon Solenoid (CMS), one of the two major detectors at the Large Hadron Collider (LHC).  Carnegie Mellon physicists constructed the state-of-the-art electronics, consisting of 150,000 channels, for the end-cap muon detectors of CMS.  Although searches at total proton-proton collision energies up to 8 TeV have so far come up empty, a prime experimental activity at the LHC as it operates at higher collision energies will be searches for the production of the particles that make up some or all of the dark matter observed in the cosmos. For example, a widely studied candidate for the dark matter particle is the neutralino, provided it is the lightest and most stable of a set of new supersymmetric partners to the particles of the Standard Model. Experimental particle physicists in the Center will be searching the CMS data for evidence of the neutralino or other possible dark matter particles, and then studying their properties if they are found. Astroparticle experiments looking for either indirect evidence of the annihilation of dark matter particles in the cosmos or direct evidence of dark matter particles interacting with ordinary matter in cryogenic experiments deep underground are possible areas for future involvement of physicists in the McWilliams Center.

Research Collaborations

      Sloan Digital Sky Survey III

            Carnegie Mellon University joined the Sloan Digital Sky Survey (SDSS-III) as a full institutional member in 2011. Cosmologists in the McWilliams Center have focused on the BOSS (Baryon Oscillation Spectroscopic Survey) portion of SDSS-III.  Its main mission is to measure the distance scale of the Universe and thus constrain cosmic geometry and models for dark energy. The survey reached its observing goals ahead of schedule, measuring redshifts for 1.5 million Luminous Red Galaxies and collecting 160,000 high-redshift quasar spectra (the latter is 50 times the number of quasar spectra in the largest previous survey, SDSS-II). These data were used to map the baryon oscillation scale, which represents a "standard ruler." The enormous number of galaxies at distances up to 9 billion light-years away (redshift z = 0.8) has enabled the most precise measurements ever made on these scales: a one percent measure of the Universe. CMU scientists led the investigation of the many subtle effects that needed to be taken into account to make the BOSS measurement possible.

     The quasar spectra were used to measure the baryon oscillation scale from the intergalactic Lyman-alpha forest of hydrogen lines.  This measurement is at a distance corresponding to a redshift of z = 2.5, which is farther than this technique has ever been used before, and is the first such measurement made in the epoch before the Universe started to accelerate. Both techniques show that the expansion history of the Universe is consistent with the cosmological-constant-dominated cold dark matter paradigm. Alongside these constraints, the survey represents a rich source of data that have greatly improved our knowledge and understanding of the formation of galaxies, quasars, the dark matter distribution, and the intergalactic medium.

     Sloan Digital Sky Survey IV

      Carnegie Mellon University joined the fourth round of the Sloan Digital Sky Survey (SDSS-IV) as a full institutional member in 2013. The survey will start in 2014 and run through 2020.  All the observations will be carried out using the 2.5 meter SDSS telescope at Apache Point, New Mexico. 7500 square degrees of the sky are being surveyed over a four-year period, resulting in a catalog that will contain 0.6 million galaxy spectra at redshifts z from 0.6 to 1.0, and 750,000 quasar spectra from z = 1 to 3.5.

Cosmologists in the McWilliams Center are focusing on the eBOSS (Extended Baryon Oscillation Spectroscopic Survey) and MaNGA (Mapping Nearby Galaxies at APO) experiments.

eBOSS will achieve the best ever measurements of cosmic expansion to a distance of 12 billion light-years. It will provide the first measurements across the critical epoch between 6.5 and 11 billion light-years, the predicted "onset time" for dark energy.     Using observational data to map the scale of baryon acoustic oscillations, which represents a "standard ruler," the universe's scale will be measured to order 1% at redshifts z = 0.7 and z = 0.9 (using galaxies), z = 1.5 (using quasar spectra) and z = 2.5  (using the forest of Lyman alpha absorption lines seen in the quasar spectra).  eBOSS will also include a massive sample of variable stars selected from time-domain imaging surveys, follow-up on a unique sample of X-ray sources, and create the largest existing sample of accreting supermassive black holes.
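As a rough cross-check of the distance figures quoted above, the comoving distance to a given redshift follows from textbook flat-LCDM integration. The parameter values below (H0 = 70 km/s/Mpc, Omega_m = 0.3) are illustrative assumptions, not survey-fitted numbers:

```python
import math

C_KM_S = 299792.458  # speed of light, km/s

def comoving_distance_mpc(z, h0=70.0, omega_m=0.3):
    """Line-of-sight comoving distance in Mpc for a flat LCDM cosmology.

    Integrates dz' / E(z') from 0 to z by the trapezoidal rule, where
    E(z) = sqrt(Omega_m (1+z)^3 + Omega_Lambda), then scales by the
    Hubble distance c / H0.
    """
    omega_l = 1.0 - omega_m
    n = 10000
    dz = z / n
    integral = 0.0
    for i in range(n + 1):
        zi = i * dz
        e = math.sqrt(omega_m * (1.0 + zi) ** 3 + omega_l)
        weight = 0.5 if i in (0, n) else 1.0
        integral += weight / e
    return (C_KM_S / h0) * integral * dz

def mpc_to_billion_light_years(d_mpc):
    # 1 Mpc is about 3.2616 million light-years.
    return d_mpc * 3.2616e6 / 1e9
```

With these assumed parameters, z = 0.7 corresponds to a comoving distance of roughly 2500 Mpc, around 8 billion light-years, consistent in spirit with the survey distances quoted in the text.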

MaNGA will uncover the internal structure and formation history of 10,000 galaxies and characterize the diversity of their evolutionary histories. Its sample of galaxies observed with resolved spectroscopy will be more than ten times larger than any previous one. With many spectra across the extent of each galaxy to trace its assembly history and dark matter content, MaNGA will provide a uniquely rich legacy data set.

     Dark Energy Spectroscopic Instrument

        Carnegie Mellon has been a participant in the Dark Energy Spectroscopic Instrument (DESI) since the days of one of its predecessors, BigBOSS.  DESI is a ground-based dark energy experiment to study baryon acoustic oscillations (BAO) and the growth of structure with a wide-area galaxy and quasar spectroscopic survey.  The survey is planned to begin in 2018 using the four-meter Mayall Telescope at Kitt Peak National Observatory and a robotically actuated, fiber-fed spectrograph capable of taking 5000 simultaneous spectra over a wavelength range from 340 nm to 1060 nm. Using data from imaging surveys that are already underway, spectroscopic targets will be selected that trace the underlying dark matter distribution. The DESI survey should deliver 20 million galaxy spectra that will yield measurements of BAO and the expansion history of the universe at the sub-percent level of accuracy, as well as measurements of the clustering of dark matter at small scales and redshift-space distortions up to z = 1.7.  This would be at least an order-of-magnitude improvement over SDSS-III, both in the number of galaxies mapped and in the comoving volume of the Universe probed.

     The Large Synoptic Survey Telescope

    Carnegie Mellon joined the Large Synoptic Survey Telescope (LSST) collaboration early in 2008 as a participating institution.  The LSST will be a new kind of telescope that combines a very wide field of view, rapid movement between images, and the ability to observe very faint objects.  Located on a mountain ridge on Cerro Pachon in Chile, the LSST will take more than 800 panoramic pictures each night and cover the sky twice each week.  The 3.2-gigapixel camera will acquire about 20 terabytes of data each night. Over the ten-year survey in six broad photometric bands starting in the early 2020s, each region of the sky will be covered roughly 2000 times.
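A back-of-envelope estimate shows what 20 terabytes per night implies for the full survey. The usable-night fraction below is an assumption for illustration, not an LSST figure:

```python
nightly_tb = 20.0            # terabytes per night, from the text
usable_fraction = 0.8        # assumed fraction of nights with observing
survey_years = 10

nights = 365 * usable_fraction * survey_years
total_pb = nightly_tb * nights / 1000.0  # 1 PB = 1000 TB
# Roughly 60 PB of raw image data over the decade-long survey,
# which is why the Center's data-intensive computing work matters.
```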

The top-priority ground-based project of the 2010 National Academies' Decadal Survey of Astronomy, the LSST will be constructed with funding from the U.S. National Science Foundation (NSF) and Department of Energy, plus private funds that have largely been used to construct the primary and tertiary mirrors.  After passing the NSF's Preliminary Design Review (PDR) in 2011 and Final Design Review (FDR) in 2013, the LSST project officially began construction in July 2014.


     Euclid

Euclid is an upcoming (2020s) space-based cosmology survey led by the European Space Agency (ESA) but with recent involvement from NASA in the form of an agreement that allows 40 US scientists to participate as full members of the Euclid survey.  Shirley Ho and Rachel Mandelbaum are among the 40 in that group.  The Euclid mission will carry out both imaging and spectroscopic surveys, with the goal of measuring the growth of cosmological structure with time using a wide-field weak lensing survey, as well as constraining geometry using baryonic acoustic oscillations (BAO). The ultimate goal is to constrain dark energy very precisely, at the level of a Stage-IV dark energy experiment.  Because of the unique capabilities of a space-based telescope - in particular, the high resolution of the imaging and the ability to easily measure at infrared wavelengths - the Euclid mission will also have interesting synergies with some of the ground-based dark energy surveys that will be taking place around that time, such as the LSST survey.


     The HyperSuprimeCam Survey

The HyperSuprimeCam (HSC) survey is a project to carry out a wide-field imaging survey at the 8.2 m Subaru telescope, based on a collaboration between the Japanese and Taiwanese astronomical communities and Princeton University.  Rachel Mandelbaum is also a member of this collaboration.  HSC is a new instrument that was commissioned at the Subaru telescope in late 2012, and the plan is to use it starting in spring 2014 to carry out a three-layer survey.  The wide layer of the survey will focus on weak gravitational lensing, which can be used to map out the statistical growth of cosmological structure with time during the past ~7 billion years.  The deep and ultradeep layers will cover smaller volumes but look further into the past to study various galaxy populations and AGN.  The project is expected to take 300 nights over a five-year period, and will provide a bridge to Stage-IV surveys such as the LSST project.