
Mapping the Universe with HPC

Professor Gus Evrard demonstrates the Bernoulli Principle

Anyone who has ever imagined being in deep outer space will soon have a map to aid that visual journey.  Professor of Physics and Astronomy Gus Evrard and his colleagues are using high-performance computing (HPC) in their work to generate galaxy catalogs covering most of the visible universe.  “We live in a finite universe,” says Evrard.  “There are a countable number of galaxies like our Milky Way – about 100 billion – and we have the technology to find all of them in the sky.  They’re not distributed uniformly, but instead cluster to form local, connected structures – the great peaks and connecting ridges of an immense cosmic terrain.”

To a non-physicist the whole observable universe may sound like infinity, but for cosmologists like Evrard it’s starting to feel small.  “We live in an age of great exploration, an extension of the spirit of the great European mariners to the largest scales possible.  In our lifetimes, we will complete the cartography of the entire universe.  But these maps have a time dimension, since more distant structures are seen earlier, so we get to do what archeologists do, piecing together the evidence to learn how these mountains formed.  Soon we’ll have a complete narrative of all of this; right now it’s patchy and incomplete, but the capability to fill in the gaps is there.”

Cosmologists have determined that 74% of the universe is in the form of dark energy.  The leading speculation is that dark energy is associated with the vacuum, the energy cost of space itself. The Dark Energy Survey (DES) will be the first survey to enable stringent tests of whether or not the cost of space varies over cosmic time.
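As background (standard cosmology, not anything specific to the DES analysis pipeline), the question of whether the cost of space varies over cosmic time is usually phrased through the dark-energy equation of state w.  The sketch below assumes a flat universe containing only matter and dark energy, with radiation neglected, and uses a common two-parameter form for w(a); a constant vacuum energy corresponds to w₀ = −1, wₐ = 0.

```latex
% Illustrative assumptions: flat universe, radiation neglected.
% Dark energy with equation of state w(a) enters the expansion rate H(a)
% as a function of the scale factor a:
\[
  H^2(a) = H_0^2 \left[ \Omega_m\, a^{-3}
    + \Omega_{\mathrm{DE}} \exp\!\left( 3 \int_a^1 \frac{1 + w(a')}{a'}\, da' \right) \right]
\]
% A common two-parameter form of the equation of state; w_0 = -1, w_a = 0
% recovers a constant vacuum energy, i.e. a cost of space that does not
% change over cosmic time:
\[
  w(a) = w_0 + w_a\,(1 - a)
\]
```

Surveys like DES constrain these parameters by comparing the observed clustering and abundance of galaxies against the expansion histories that different (w₀, wₐ) values predict.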

Synthetic sky image from the Dark Energy Survey

Computational modeling was just becoming available in the late 1980s, when astrophysicists were focusing on how galaxies form.  Since then, computing power has exploded.  Evrard recalls that his first simulation in 1988 contained 4096 particles, whereas his team’s simulations follow tens of billions of particles today.  He and his colleagues use XSEDE (formerly TeraGrid) supercomputers to perform large dynamical simulations, and the U-M shared cluster, Flux, for post-processing and other calculations. “We need HPC to interpret the data,” says Evrard. “Without computation, we wouldn’t be able to do our science.  The large amount of data we’ll get has enormous discovery potential, but unlocking its secrets requires very precise expectations for what will be observed in a variety of model scenarios.  Sophisticated simulations offer such expectations.”
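To give a concrete, if drastically simplified, sense of what “following particles” means in such simulations, the sketch below advances a self-gravitating particle set by one leapfrog step using direct pairwise summation.  It is an illustrative toy only: the production codes Evrard’s group runs use tree or particle-mesh gravity solvers, comoving coordinates, and billions of particles, and the units, softening, and function names here are hypothetical.

```python
import numpy as np

# Illustrative toy: one step of a direct-summation gravitational N-body
# integration with leapfrog (kick-drift-kick). Units and names are
# hypothetical; production cosmological codes use tree or particle-mesh
# gravity and follow billions of particles.

G = 1.0  # gravitational constant in arbitrary code units (assumption)

def accelerations(pos, mass, softening=1e-2):
    """Pairwise gravitational accelerations with Plummer softening."""
    diff = pos[None, :, :] - pos[:, None, :]          # r_j - r_i for every pair
    dist2 = np.sum(diff**2, axis=-1) + softening**2   # softened squared distances
    inv_d3 = dist2 ** -1.5
    np.fill_diagonal(inv_d3, 0.0)                     # exclude self-interaction
    return G * np.sum(diff * (mass[None, :, None] * inv_d3[:, :, None]), axis=1)

def leapfrog_step(pos, vel, mass, dt):
    """Advance positions and velocities by one kick-drift-kick step."""
    vel_half = vel + 0.5 * dt * accelerations(pos, mass)
    pos_new = pos + dt * vel_half
    vel_new = vel_half + 0.5 * dt * accelerations(pos_new, mass)
    return pos_new, vel_new

# A small toy configuration; real runs are many orders of magnitude larger.
rng = np.random.default_rng(0)
n = 1024
pos = rng.standard_normal((n, 3))
vel = np.zeros((n, 3))
mass = np.full(n, 1.0 / n)
pos, vel = leapfrog_step(pos, vel, mass, dt=0.01)
```

The direct pairwise sum here scales as N², which is exactly why large cosmological runs rely on approximate gravity solvers and HPC resources of the kind described above.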

Evrard says that there is broad appeal in understanding where we came from, and scientists have established a compelling creation story known as inflationary Big Bang cosmology.  “There is an enormity of scale in what we do, but the entire universe that we can observe once fit within a single atom long ago.”  Evrard sees this as reassuring, in that science has illuminated a shared cosmic womb.  In the same way that astronauts lifted the view of humanity above the Earth, cosmologists reach for a view much farther up.  Says Evrard, “Our job is to look out and report back — stay tuned.”