Simulating the universe seems like a gargantuan task, but exascale computers may give us a chance to do just that. As the National Academy of Sciences notes, the future of computing may lie in exascale machines capable of one quintillion operations per second. The US Department of Energy's Exascale Computing Project intends to use that power to test our understanding of the reality we see around us. The idea sounds a bit sci-fi to most people, but it appears to be sound. By running the Hardware/Hybrid Accelerated Cosmology Code (HACC) and the adaptive mesh refinement cosmology code Nyx on GPU-accelerated hardware, the hope is that we can simulate reality, watch the underlying forces of nature that shape the world around us in action, and trace the universe back to its starting point.
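To get a feel for the scale involved, here is a back-of-the-envelope sketch, not ExaSky's actual method or numbers: even at one quintillion operations per second, a naive pairwise gravity calculation becomes hopeless long before you reach survey-scale particle counts. The per-pair operation cost below is an assumed round figure.

```python
# Back-of-the-envelope: what does one quintillion (1e18) operations per second
# buy you? Assume a naive O(N^2) direct N-body gravity calculation running at
# perfect efficiency; both assumptions are unrealistic and purely illustrative,
# but the scaling shows why exascale hardware and smart algorithms both matter.
EXAFLOP = 1e18      # operations per second at exascale
OPS_PER_PAIR = 20   # assumed rough cost of one pairwise force evaluation

for n_particles in (1e9, 1e11, 1e13):
    ops_per_step = OPS_PER_PAIR * n_particles**2
    seconds = ops_per_step / EXAFLOP
    print(f"{n_particles:.0e} particles: {seconds:,.0f} s per timestep")
```

Even brute force at exascale stalls beyond roughly a billion particles per timestep, which is why production cosmology codes rely on far smarter force solvers than direct summation.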
ExaSky and Potential New Physics
The team tasked with investigating the nature of the universe on these exascale computers is known as ExaSky. Their mandate is to figure out exactly what is causing the universe to expand at the rate we observe. The expansion itself is nothing new: Live Science reminds us that Edwin Hubble discovered the expansion of the universe way back in 1929. What Hubble (and generations of scientists since him) could not pin down was the force driving this expansion. Ask any scientist in the field today and they'll probably tell you that "dark energy" causes it, but the truth is that no one knows what this "dark energy" is.
Cosmology has advanced far beyond its early days, but this particular point of contention remains. ExaSky may finally have the hardware to attack the problem and give us an honest answer to the question of what dark energy really is. As with most measurements, there are discrepancies between different observations of the cosmos. These discrepancies may simply reflect problems in our measurement methods; the more exciting possibility is that they are doorways to revolutionary discoveries in physics. According to Scientific American, the disparity between different measurements of how fast the universe is expanding is known as the Hubble tension. But what exactly can the Hubble tension tell us about the underlying structure of the universe? If new physics is lurking under the surface, it could shift our understanding of general relativity at large distances and even change how we view the Standard Model of particle physics.
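To make the Hubble tension concrete, here is a minimal sketch of how the disagreement is usually quantified. The two H0 values below are representative published figures (an early-universe fit from Planck CMB data and a late-universe Cepheid-ladder measurement from the SH0ES team), and the simple Gaussian error combination is an illustration, not the full statistical treatment:

```python
import math

# Two headline measurements of the Hubble constant H0 (km/s/Mpc).
# Representative published figures, quoted here for illustration:
# Planck 2018 (early-universe CMB fit) and SH0ES (late-universe Cepheid ladder).
h0_planck, err_planck = 67.4, 0.5
h0_shoes,  err_shoes  = 73.0, 1.0

# Treating the errors as independent Gaussians, the tension in sigma is the
# difference divided by the combined uncertainty.
diff = h0_shoes - h0_planck
combined_err = math.sqrt(err_planck**2 + err_shoes**2)
print(f"Delta H0 = {diff:.1f} km/s/Mpc, tension ~ {diff / combined_err:.1f} sigma")
# -> roughly a 5 sigma disagreement, hard to dismiss as a statistical fluke
```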
Looking Back in Time
Simulations on an exascale computer would let us examine many of our assumptions about the universe's birth and early life. The energy density of the universe increases as one goes back in time. Using this basic premise, scientists can probe the fundamental properties of particles, such as the mass of the neutrino. But ExaSky isn't limited to the formation of the first particles in the universe. It might even pin down some of the most elusive things physics has wondered about for years: the nature of dark matter and the primordial fluctuations imprinted on the background radiation left over from the universe's formation. That information could answer long-standing questions about the nature of the universe and its composition.
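As a minimal sketch of the "denser in the past" premise, the snippet below applies the standard scaling relations of textbook cosmology: looking back in redshift z, matter density grows as (1+z)^3 and radiation as (1+z)^4, while dark energy stays constant. The present-day density fractions are round, Planck-like values chosen for illustration, not ExaSky's simulation parameters:

```python
# How energy density climbs as we look back in time, in standard cosmology:
# matter dilutes as (1+z)^3, radiation as (1+z)^4, dark energy stays constant.
# Present-day density fractions below are round, Planck-like values chosen
# for illustration only.
OMEGA_M, OMEGA_R, OMEGA_L = 0.31, 9e-5, 0.69

def density_fractions(z):
    """Relative energy densities (in units of today's critical density) at redshift z."""
    a = 1.0 + z
    return OMEGA_M * a**3, OMEGA_R * a**4, OMEGA_L

for z in (0, 10, 1100, 3400):
    m, r, l = density_fractions(z)
    print(f"z={z:>5}: matter={m:.3g}, radiation={r:.3g}, dark energy={l:.3g}")
# Around z ~ 3400 radiation overtakes matter: the matter-radiation equality
# epoch, one of the anchors for probes of neutrino mass and dark matter.
```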
Tying Simulation to Observed Data
Building a simulation is easy enough if it never has to link back to any real-world data. The real challenge is ensuring that the simulation behaves as expected when confronted with that data. ExaSky wants a picture of the universe that ties back to what we observe today. For this to become a reality, the team needs strict guidelines for their observational data. Currently, ExaSky's team is preparing their system to compare models against real-world survey data. This process will take some time, as it requires the system to "learn" through challenges. The end result is supposed to be a small number of immense cosmological simulations focused on a few critical scientific questions. Because of the sheer size of the computation required, exascale-level computing is a must if the project is to deliver any results.
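The snippet below is a toy version of that model-versus-survey comparison; every number in it is invented for illustration, and ExaSky's actual pipeline is far more sophisticated. It takes a hypothetical binned clustering statistic from a simulation and scores it against survey measurements with a simple chi-squared test:

```python
import numpy as np

# A toy model-vs-survey comparison: score a binned summary statistic from a
# simulation against survey measurements using chi-squared. All values are
# made up for illustration; this is not ExaSky's pipeline.
sim_prediction  = np.array([1.02, 0.95, 0.81, 0.64])  # simulated statistic per bin
survey_measured = np.array([1.05, 0.93, 0.85, 0.60])  # observed values per bin
survey_error    = np.array([0.04, 0.03, 0.05, 0.04])  # 1-sigma uncertainties

chi2 = np.sum(((sim_prediction - survey_measured) / survey_error) ** 2)
dof = len(survey_measured)
print(f"chi2 = {chi2:.2f} for {dof} bins")
# A chi2 far above the number of bins would flag a simulation whose physics
# (or parameters) fails to reproduce the survey, driving the next refinement.
```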
How Far Along Are They?
The ExaSky team has made good progress in the time they've been at this project. Initially, the team ran HACC and Nyx on older GPUs, but the newest iteration has been upgraded to work with GPUs from Intel, Nvidia, and AMD. Getting there meant porting the original GPU code across vendors, using a portability translation layer for the AMD GPUs, while the Intel version was written as close as possible to the hardware, with little abstraction. GPUs are a strong fit for this project because kernels that keep their working data in fast register memory can approach peak floating-point performance. Unfortunately, if a kernel needs more registers than the hardware provides, the excess data spills to slower memory, resulting in a significant performance slowdown.
The team's latest advancements include incorporating gas physics and subgrid models into the code. They've also added advanced software to analyze the data generated by the simulation. As new systems become available, ExaSky will have hardware that can handle their code and offer insight into what still needs improvement. Until then, the team intends to keep adding new physics interactions to the model, bringing it ever closer to a simulation of reality.