University of Michigan

01/20/2025 | Press release

Getting the most out of cosmic maps

Study: How Much Information Can Be Extracted from Galaxy Clustering at the Field Level? (DOI: 10.1103/PhysRevLett.133.221006)

Research led by the University of Michigan could help put cosmology on the inside track to reaching the full potential of telescopes and other instruments studying some of the universe's largest looming questions.

The project showcased how a new computational method gleans more information than its predecessors from maps showing how galaxies are clustered and threaded throughout the universe.

Scientists are currently using tools like DESI, the Dark Energy Spectroscopic Instrument, to generate these maps and dig deeper into the nature of dark energy, dark matter and other cosmic mysteries.

This is a two-dimensional slice of a 3D map of the universe generated by the Dark Energy Spectroscopic Instrument. In the inset, the web-like structure of galaxies is visible. New research from the University of Michigan and the Max Planck Institute for Astrophysics shows a new method of analyzing such maps outperforms the standard methods. Image credit: Claire Lamman/DESI collaboration (Custom colormap package by cmastro)

The dark side of cosmology

Even as DESI makes headlines now, scientists know they will need more advanced tools to find the answers they seek. Some are developing the next generation of instruments like DESI. Minh Nguyen and his colleagues, however, are focusing on optimizing our understanding of the data we're getting now, and in the future.

"As we move to bigger and better telescopes, we might also be throwing away more information," said Nguyen, who helped lead the work as a Leinweber Research Fellow in the U-M Department of Physics. "While we're collecting more data, we can also try to get more out of the data."

Teaming up with colleagues at the Max Planck Institute for Astrophysics, or MPA, Nguyen worked with a computational framework dubbed LEFTfield to upgrade how scientists analyze the large-scale structure of the cosmos.

"In the early universe, the structure was Gaussian-like the static you would see on old TV sets," Nguyen said. "But because of the interplay between dark energy and dark matter, the large-scale structure of the universe today isn't Gaussian anymore. It's more like a spider web."

Dark energy drives the expansion of the universe, but researchers can't directly observe it, hence the "dark" part of its name. The universe's matter works against that expansion with its attractive force of gravity.

Most of the universe is composed of dark energy and dark matter that scientists can't directly probe, according to the prevailing cosmological model. Studying cosmic maps can provide greater understanding of these mysterious entities. Image credit: Jessie Muir

That matter comes in two distinct varieties: the regular matter that we can observe and interact with, and the dark matter that we can't, again, hence the "dark" part.

Adding to the intrigue is the fact that the overwhelming majority of the universe's mass and energy balance is tied up in these mysterious dark entities. Studying maps of the universe can thus open new windows to probe the dark energy and dark matter largely responsible for its weblike structure.

With LEFTfield, Nguyen and his colleagues showed they can extract even more information from existing cosmic maps. They published their study in the journal Physical Review Letters, and the work also earned a 2024 Buchalter Cosmology Prize.

To get that extra information, the team didn't bolster the existing standard methods, which have been very valuable. Rather, they took a fundamentally different approach.

Out of LEFTfield

The key difference is in how LEFTfield sees data compared with standard approaches.

"With a standard analysis, you basically cannot work with the data as is. People have to compress it," Nguyen said. "That reduces the complexity of the analysis and makes it easier to make theoretical predictions, but the trade-off is you lose some information."

For the standard analysis, researchers use computational models that move through the galaxies, grouping them into pairs or triplets, to make statistical measurements and calculations more efficient.
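To make that compression concrete, here is a minimal sketch in Python with made-up positions and bin choices; it is not the DESI or LEFTfield pipeline, just the spirit of a pair-based analysis, where an entire 3D map is boiled down to a handful of summary numbers.

```python
import numpy as np

# Minimal sketch with made-up numbers: compress a mock galaxy catalog into
# counts of pairs per separation bin, the spirit of the standard two-point
# analysis. This is not the DESI or LEFTfield pipeline.
rng = np.random.default_rng(0)
galaxies = rng.uniform(0.0, 1000.0, size=(2000, 3))   # mock positions in Mpc/h

bins = np.linspace(0.0, 150.0, 31)                     # separation bins in Mpc/h
pair_counts = np.zeros(len(bins) - 1)

for i in range(len(galaxies) - 1):
    # distances from galaxy i to every galaxy after it (each pair counted once)
    d = np.linalg.norm(galaxies[i + 1:] - galaxies[i], axis=1)
    hist, _ = np.histogram(d, bins=bins)
    pair_counts += hist

# The full 3D map is now summarized by ~30 numbers, one per separation bin.
print(pair_counts.astype(int))
```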

This works very well for the more Gaussian features of the universe, Nguyen said. But he and his colleagues saw an opportunity to push our understanding of the non-Gaussian universe further by retaining the information that the standard methods discard through compression.

The new approach, which is also called field-level inference, treats cosmic maps as 3D grids. Each constituent cube, or voxel (the 3D counterpart of a pixel), then becomes a working element of data, containing uncompressed information about the distribution and density of galaxies inside.
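As a rough sketch of what that grid looks like in practice, the short Python example below, with an invented box size, grid size and mock catalog rather than the actual LEFTfield code, bins a galaxy catalog into voxels and converts the counts into a density-contrast field.

```python
import numpy as np

# Minimal sketch with invented box size, grid size, and galaxy positions:
# turn a mock catalog into a voxelized density-contrast field, the kind of
# object field-level inference works with directly.
rng = np.random.default_rng(1)
box_size = 1000.0                              # Mpc/h, hypothetical
n_grid = 64                                    # voxels per side
galaxies = rng.uniform(0.0, box_size, size=(100_000, 3))

# Assign each galaxy to a voxel and count how many land in each one
idx = np.floor(galaxies / box_size * n_grid).astype(int)
idx = np.clip(idx, 0, n_grid - 1)
counts = np.zeros((n_grid, n_grid, n_grid))
np.add.at(counts, (idx[:, 0], idx[:, 1], idx[:, 2]), 1)

# Density contrast in every voxel: delta = n / <n> - 1. Field-level inference
# compares this whole grid against theoretical predictions, voxel by voxel,
# instead of first compressing it into summary statistics.
delta = counts / counts.mean() - 1.0
print(delta.shape, float(delta.min()), float(delta.max()))
```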

In standard methods for analyzing cosmic maps, galaxies are grouped into pairs or triplets for calculations, which compresses information. In the new method, studied by the University of Michigan and the Max Planck Institute for Astrophysics, the map is instead divided into a 3D grid, represented in two dimensions on the right side of the figure. This lets researchers analyze the map as-is, without compressing or losing data. Image credit: The Millennium Simulation Project/MPA

This preserves the fidelity of the data in a way that's inaccessible to the standard methods, Nguyen said.

"I love the idea of field-level inference because it is, in principle, the actual thing we want to do," said Shaun Hotchkiss, host of the online seminar series, Cosmology Talks. The series recently featured Nguyen and his co-author Beatriz Tucci, a doctoral student at MPA.

"If we've measured the density field, why compress the information inside of it?" Hotchkiss said. "Of course, field-level inference is therefore more difficult to do, but this hasn't stopped Bea and Minh, and shouldn't stop the community."

To benchmark the performance of LEFTfield, the team calculated a cosmological parameter called sigma-8, which essentially measures the clumpiness of the universe, Nguyen explained.
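For a loose sense of what "clumpiness" means here, the sketch below uses an invented density field and simple block averaging in place of the standard spherical top-hat filter; it estimates the scatter of density fluctuations on 8 megaparsec scales, which is roughly what sigma-8 quantifies.

```python
import numpy as np

# Rough illustration only: sigma-8 is the r.m.s. fluctuation of the matter
# density once it is smoothed on 8 Mpc/h scales. The box size, grid, and the
# random stand-in field below are invented, and real measurements use a
# spherical top-hat filter rather than the block averaging shown here.
rng = np.random.default_rng(2)
box_size = 512.0                               # Mpc/h, hypothetical
n_grid = 128                                   # 4 Mpc/h voxels
delta = rng.normal(0.0, 1.0, size=(n_grid, n_grid, n_grid))  # stand-in density contrast

block = int(round(8.0 / (box_size / n_grid)))  # voxels spanning 8 Mpc/h (here, 2)
m = n_grid // block
# Average the density contrast over 8 Mpc/h blocks, then take the scatter
smoothed = delta.reshape(m, block, m, block, m, block).mean(axis=(1, 3, 5))
print(f"clumpiness on 8 Mpc/h scales: {smoothed.std():.3f}")
```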

Compared with standard approaches, the team's LEFTfield method could improve the sigma-8 determination by a factor of 3.5 to 5.2.

"That's like going from DESI to the successor of DESI," Nguyen said. "Typically, moving between two generations of surveys would take 10 to 20 years."

Before making that leap forward, though, there is still work to do. A vital hurdle to clear will be integrating LEFTfield with specific instruments and making sure it accounts for how noise and the idiosyncrasies of each instrument affect the data as it comes in, Nguyen said.

Still, he believes the approach will prove to be a powerful asset.

"It really opens the fast track to get insights into dark energy, dark matter and general relativity-the theory that this is all based on," Nguyen said.

The research team also included Fabian Schmidt, a cosmologist and group leader at MPA, along with staff scientist Martin Reinecke and Andrija Kostić, who worked on the project first as a Ph.D. student and then as a postdoctoral researcher.

Nguyen recently completed his fellowship at U-M and is now a research fellow at the Kavli Institute for the Physics and Mathematics of the Universe in Tokyo.