The Coma Cluster is a dense collection of nearly 1,000 galaxies in the nearby universe. Estimating the mass of something so large and dense is difficult, because various sources of error can bias the initial measurement higher or lower. A team led by Carnegie Mellon University physicists has developed a deep-learning method that accurately estimates the mass of the Coma Cluster and mitigates those sources of error.
People have been estimating the mass of the Coma Cluster for a long time, said Matthew Ho, a fifth-year graduate student in the Department of Physics' McWilliams Center for Cosmology. "By showing that our machine-learning methods are consistent with previous mass estimates, we are building trust in these new, very powerful methods," he said.
Machine-learning methods are good at finding patterns in complex data, but they have only gained a foothold in the field over the last decade, and because the inner workings of a machine-learning model are hard to interpret, a lingering question is whether such models can be trusted to do what they are designed to do. The research was published in Nature Astronomy.
To calculate the mass of the Coma Cluster, the astronomer Fritz Zwicky and others used a dynamical mass measurement: they studied the motion of objects in and around the cluster and then used their understanding of gravity to estimate the cluster's mass. This approach is susceptible to a variety of errors. Matter is distributed throughout the universe in a vast web, and clusters are constantly colliding and merging with one another, so material along the line of sight can appear to be part of the galaxy cluster and skew the mass measurement. According to Ho, machine learning-based methods offer an innovative, data-driven approach to quantifying and accounting for these errors.
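The article does not go into the math, but as a rough sketch of what a traditional dynamical estimate involves (and not the deep-learning method discussed here), a simple virial calculation combines the spread of member-galaxy velocities with a characteristic cluster radius. The prefactor, the radius definition, and the numbers below are illustrative assumptions only:

```python
# Hypothetical back-of-the-envelope virial mass estimate (illustration only;
# this is NOT the deep-learning method described in the article).

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30   # one solar mass, kg
MPC = 3.086e22     # one megaparsec, m

def virial_mass_msun(sigma_km_s, radius_mpc, prefactor=5.0):
    """Crude cluster mass (in solar masses) from the line-of-sight velocity
    dispersion of member galaxies and a characteristic radius.
    The prefactor depends on the assumed mass profile and convention."""
    sigma = sigma_km_s * 1e3      # km/s -> m/s
    radius = radius_mpc * MPC     # Mpc -> m
    return prefactor * sigma**2 * radius / G / M_SUN

# Numbers of roughly this order are often quoted for Coma (illustrative values).
print(f"{virial_mass_msun(1000.0, 1.5):.1e} solar masses")  # ~1e15 solar masses
```

The point of the sketch is that the answer depends entirely on which galaxies are counted as cluster members, which is exactly the interloper problem described above.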
Ho said that their deep-learning method learns from real data which measurements are useful and which are not, and that these data-driven methods make the predictions better.
"Standard machine learning approaches tend to yield results without any uncertainties, which is one of the major drawbacks of them," said Associate Professor of physics. We can quantify the uncertainty in our results by using our method.
The method developed by Ho and his colleagues uses a machine-learning tool called a convolutional neural network. The researchers fed their model data from simulations of the universe, and it learned from the observable characteristics of thousands of simulated galaxy clusters. Ho then applied the trained model to a real system, the Coma Cluster, whose true mass is not known. The resulting estimate is consistent with most of the mass estimates made since the 1980s, and this is the first time this specific machine-learning methodology has been applied to an observational system.
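The article does not describe the architecture, but a minimal sketch of this kind of setup might look like the following, assuming (purely for illustration) that each cluster's observed galaxies are binned into a small image and that uncertainty is captured by having the network predict a variance alongside the log-mass; none of these choices are taken from the paper:

```python
# Minimal, hypothetical sketch of a CNN that regresses a cluster's log-mass
# (and an uncertainty) from an image-like summary of its observed galaxies.
# This illustrates the general technique, not the authors' code.
import torch
import torch.nn as nn

class ClusterMassCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Flatten(),
        )
        # Two outputs per cluster: predicted log10(mass) and its log-variance.
        self.head = nn.Linear(32 * 12 * 12, 2)

    def forward(self, x):
        out = self.head(self.features(x))
        return out[:, 0], out[:, 1]   # mean, log-variance

model = ClusterMassCNN()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.GaussianNLLLoss()  # penalizes both error and overconfidence

# Toy training step: random tensors stand in for simulated clusters with
# known masses (a real pipeline would use thousands of simulated clusters).
images = torch.randn(8, 1, 48, 48)
true_logmass = 14.5 + 0.5 * torch.randn(8)

optimizer.zero_grad()
mean, log_var = model(images)
loss = loss_fn(mean, true_logmass, log_var.exp())
loss.backward()
optimizer.step()
```

Training on many simulated clusters with known masses and a Gaussian negative log-likelihood loss is one common way to get both a prediction and an uncertainty, echoing the point about uncertainties in the quote above.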
Validating the model's predictions on the Coma Cluster is an important step toward building trust in machine-learning models, the researchers said. A more thorough check of the method is currently underway, and its promising results will help the team apply the method to more data.
When large-scale surveys such as the Dark Energy Spectroscopic Instrument, the Vera C. Rubin Observatory and Euclid start releasing huge amounts of data, models such as these are going to be critical.
"We're going to have a big data flow soon, and it's not possible for humans to parse that on their own," Ho said. "If we're going to process this huge data flow from these new surveys, we need models that are efficient and robust, so that we don't have to worry about those errors. That is exactly what we're trying to do: use machine learning to improve our analyses and make them faster."
More information: Matthew Ho et al., The dynamical mass of the Coma cluster from deep learning, Nature Astronomy (2022). DOI: 10.1038/s41550-022-01711-1