Astrophysicists reveal largest-ever suite of universe simulations

A snapshot of one of the AbacusSummit simulations, measuring 10 billion light years across. Credit: The AbacusSummit Team
A new suite of cosmological simulations, collectively totaling nearly 60 trillion particles, is the largest ever released.

Its creators believe the suite, dubbed AbacusSummit, will help extract secrets of the universe from upcoming surveys of the cosmos. AbacusSummit was presented in several papers published October 25, 2021, in Monthly Notices of the Royal Astronomical Society.

Developed by researchers at the Flatiron Institute's Center for Computational Astrophysics (CCA) and the Center for Astrophysics | Harvard & Smithsonian, AbacusSummit is a collection of more than 160 simulations that model how gravity moves particles around in a box-shaped universe. Such simulations, known as N-body models, capture the behavior of dark matter, which makes up the majority of the universe's material and interacts only through gravity.

"This suite is so big that it probably has more particles than all the other N-body simulations that have ever been run combined, though that's a hard statement to be certain of," says Lehman Garrison, a research fellow at the CCA.

The AbacusSummit suite comprises hundreds of simulations of how gravity has shaped the distribution of dark matter throughout the universe. Here, a snapshot of one of the simulations is shown at three zoom levels: 10 billion light-years across, 1.2 billion light-years across, and 100 million light-years across. The simulation replicates the large-scale structures of our universe, such as the cosmic web and colossal clusters of galaxies. Credit: The AbacusSummit Team; layout and design by Lucy Reading-Ikkanda/Simons Foundation

Garrison developed the AbacusSummit simulations along with graduate student Nina Maksimova and astronomy professor Daniel Eisenstein. The simulations ran on the Summit supercomputer at the U.S. Department of Energy's Oak Ridge Leadership Computing Facility in Tennessee.

AbacusSummit will soon prove its usefulness: over the next few years, a number of surveys will produce maps of the cosmos in unprecedented detail. These include the Dark Energy Spectroscopic Instrument (DESI), the Nancy Grace Roman Space Telescope, and the Euclid spacecraft. One goal of these big-budget missions is to improve estimates of the cosmic and astrophysical parameters that determine how the universe behaves and how it looks.

Scientists will make those improved estimates by comparing the new observations with computer simulations of the universe that use different values for the various parameters. Next-generation surveys of that precision, Garrison says, demand simulations of matching scale and accuracy.

"Galaxy surveys are delivering tremendously detailed maps of the universe, and we need simulations that cover the many possible universes we might live in," he says. "AbacusSummit is the first suite with the breadth and fidelity to compare with these incredible observations."

Abacus uses parallel computer processing to dramatically speed up its calculations of how particles move due to their gravitational attraction. A sequential processing approach (top) computes the gravitational tug between each pair of particles one at a time. Parallel processing (bottom) instead divides the work across multiple computing cores, enabling many particle interactions to be calculated simultaneously. Credit: Lucy Reading-Ikkanda/Simons Foundation
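The difference between the two approaches in the diagram can be sketched in a few lines of NumPy: a serial double loop versus one batched array computation that evaluates every pairwise pull at once. This is purely an illustration of the idea, not the Abacus code itself.

```python
import numpy as np

G = 1.0  # gravitational constant in simulation units

def forces_sequential(pos, mass):
    """Compute the pull on each particle pair by pair, one at a time."""
    acc = np.zeros_like(pos)
    for i in range(len(pos)):
        for j in range(len(pos)):
            if i != j:
                d = pos[j] - pos[i]
                acc[i] += G * mass[j] * d / (d @ d) ** 1.5
    return acc

def forces_batched(pos, mass):
    """Evaluate all pairwise interactions at once with array operations,
    the same kind of work a GPU spreads across many cores in parallel."""
    d = pos[None, :, :] - pos[:, None, :]   # displacement for every pair
    r2 = (d ** 2).sum(axis=-1)
    np.fill_diagonal(r2, np.inf)            # no self-interaction
    return (G * mass[None, :, None] * d / r2[:, :, None] ** 1.5).sum(axis=1)

rng = np.random.default_rng(0)
pos = rng.normal(size=(50, 3))
mass = rng.uniform(0.5, 2.0, size=50)
```

Both functions return the same accelerations; the batched form simply exposes that the pairwise terms are independent of one another, which is exactly what parallel hardware exploits.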

It was a daunting task. N-body calculations, which attempt to compute the movements of gravitationally interacting objects such as planets, have been a foremost challenge in physics since the days of Isaac Newton. The calculations are difficult because every object interacts with every other object, no matter how far apart they are, so the number of interactions grows rapidly as more objects are added.

The N-body problem has no general solution when three or more massive bodies are involved; the calculations can only ever be approximations. A common approach is to freeze time, calculate the total force acting on each object, then nudge each object according to its net force. Time is then moved slightly forward, and the process repeats.
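The freeze-nudge-repeat cycle described above can be sketched as a simple integrator. The leapfrog ("kick-drift-kick") scheme and the two-body setup below are one common choice, shown only for illustration; they are not drawn from AbacusSummit.

```python
import numpy as np

G = 1.0  # gravitational constant in simulation units

def net_accelerations(pos, mass):
    """Freeze time: total gravitational pull on every body from all the others."""
    d = pos[None, :, :] - pos[:, None, :]
    r2 = (d ** 2).sum(axis=-1)
    np.fill_diagonal(r2, np.inf)   # ignore self-interaction
    return (G * mass[None, :, None] * d / r2[:, :, None] ** 1.5).sum(axis=1)

def step(pos, vel, mass, dt):
    """Nudge each body by its net force, then move time slightly forward."""
    vel = vel + 0.5 * dt * net_accelerations(pos, mass)   # half kick
    pos = pos + dt * vel                                  # drift
    vel = vel + 0.5 * dt * net_accelerations(pos, mass)   # half kick
    return pos, vel

# Two equal bodies set on a circular orbit; repeated small steps keep it stable.
pos = np.array([[-0.5, 0.0, 0.0], [0.5, 0.0, 0.0]])
v = np.sqrt(0.5)   # circular-orbit speed for G = 1, unit masses, separation 1
vel = np.array([[0.0, -v, 0.0], [0.0, v, 0.0]])
mass = np.array([1.0, 1.0])
for _ in range(1000):
    pos, vel = step(pos, vel, mass, dt=0.01)
```

After a thousand small steps the two bodies are still close to their original separation, which is what makes this freeze-and-nudge approximation usable over long stretches of simulated time.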

AbacusSummit took this approach and handled its colossal number of particles by combining clever code, a new numerical method, and lots of computing power. The Summit supercomputer was the world's fastest at the time the calculations ran.

The team designed its codebase, called Abacus, to take full advantage of Summit's parallel processing power, whereby many calculations run simultaneously. In particular, Summit boasts many graphics processing units, or GPUs, which excel at parallel processing.

A snapshot of one of the AbacusSummit simulations, measuring 1.2 billion light-years across. Credit: The AbacusSummit Team

Parallel processing is essential for N-body calculations at this scale because a full simulation requires far too much memory to store on a single machine. Abacus therefore can't simply hand copies of the simulation to different supercomputer nodes to work on. Instead, the code divides each simulation into a grid of cells. An initial computation provides a fair approximation of the influence of distant particles at any given point in the simulation; distant particles play a much smaller role than nearby ones. Abacus then groups nearby cells and splits them off so that the computer can work on each group independently, combining the approximation of distant particles with precise calculations of nearby particles.
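One way to picture this near/far split is a toy version in Python: every grid cell is summarized up front by its total mass and center of mass (standing in for the precomputed far-field approximation), while particles in the same or adjacent cells are handled exactly. This is only a sketch of the general idea under those simplifying assumptions, not the actual Abacus algorithm or its numerical method.

```python
import numpy as np

G, BOX, NCELL = 1.0, 1.0, 4   # unit box divided into a 4 x 4 x 4 grid of cells

def exact_pull(p, pos, mass):
    """Precise pairwise pull of the listed particles on the point p."""
    d = pos - p
    r2 = (d ** 2).sum(axis=-1)
    r2[r2 == 0] = np.inf   # skip p itself
    return (G * mass[:, None] * d / r2[:, None] ** 1.5).sum(axis=0)

def grid_accelerations(pos, mass):
    idx = np.floor(pos / BOX * NCELL).astype(int).clip(0, NCELL - 1)
    # Summarize every cell up front by total mass and center of mass.
    tot = np.zeros((NCELL,) * 3)
    com = np.zeros((NCELL,) * 3 + (3,))
    for p, m, (i, j, k) in zip(pos, mass, idx):
        tot[i, j, k] += m
        com[i, j, k] += m * p
    com[tot > 0] /= tot[tot > 0][:, None]

    acc = np.zeros_like(pos)
    for n, ((i, j, k), p) in enumerate(zip(idx, pos)):
        for c in np.ndindex(NCELL, NCELL, NCELL):
            if tot[c] == 0:
                continue
            if max(abs(c[0] - i), abs(c[1] - j), abs(c[2] - k)) <= 1:
                # Nearby cell: compute its particles' pull exactly.
                sel = (idx == c).all(axis=1)
                acc[n] += exact_pull(p, pos[sel], mass[sel])
            else:
                # Distant cell: treat it as a single point mass.
                d = com[c] - p
                acc[n] += G * tot[c] * d / (d @ d) ** 1.5
    return acc

rng = np.random.default_rng(1)
pos = rng.uniform(0.0, 1.0, size=(60, 3))
mass = rng.uniform(0.5, 2.0, size=60)
```

Because the exact work for each group of nearby cells is independent of every other group, those groups are precisely the units a parallel machine can farm out to different nodes.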

The researchers found that Abacus' approach offers a significant advantage over other N-body codebases, which divide their simulations based on the irregular distribution of particles in ways that hamper very large runs. Abacus' uniform grid divisions, they report, make better use of parallel processing, and the grid also allows much of the distant-particle approximation to be computed before the simulation even starts.

Abacus' design allows it to update about 70 million particles per second per node of the Summit supercomputer; each particle represents a clump of dark matter with 3 billion times the mass of the sun. The code can even analyze a simulation as it runs, looking for patches of dark matter that could indicate bright star-forming galaxies.

"Our vision was to create this code to deliver the simulations needed for this particular new brand of galaxy survey," Garrison says. "We wrote the code to run the simulations much faster and much more accurately than ever before."

A snapshot from one of the AbacusSummit simulations, measuring 100 million light-years across. Credit: The AbacusSummit Team

Eisenstein is a member of the Dark Energy Spectroscopic Instrument collaboration, which recently began its survey to map an unprecedented fraction of the universe. He says he is excited to use Abacus in the future.

"Cosmology is leaping forward thanks to the multidisciplinary fusion of spectacular observations and state-of-the-art computing," he says. "The coming decade promises to be a marvelous age in our study of the historical sweep of the universe."

Additional co-creators of Abacus include Sihan Yuan of Stanford University, Philip Pinto of the University of Arizona, Sownak Bose of Durham University, and Center for Astrophysics researchers Boryana Hadzhiyska and Thomas Satterthwaite. The simulations ran on Summit under an Advanced Scientific Computing Research Leadership Computing Challenge allocation.


Simons Foundation