
Science

Astrophysicists Reveal Largest-Ever Suite of Universe Simulations – How Gravity Shaped the Distribution of Dark Matter – SciTechDaily

To understand how the universe formed, astronomers have created AbacusSummit, more than 160 simulations of how gravity may have shaped the distribution of dark matter.

Collectively clocking in at nearly 60 trillion particles, a newly released set of cosmological simulations is by far the biggest ever produced.

The simulation suite, dubbed AbacusSummit, will be instrumental for extracting secrets of the universe from upcoming surveys of the cosmos, its creators predict. They present AbacusSummit in several recently published papers in the Monthly Notices of the Royal Astronomical Society.

AbacusSummit is the product of researchers at the Flatiron Institute’s Center for Computational Astrophysics (CCA) in New York City and the Center for Astrophysics | Harvard & Smithsonian. Made up of more than 160 simulations, it models how particles in the universe move about due to their gravitational attraction. Such models, known as N-body simulations, capture the behavior of dark matter, a mysterious and invisible substance that makes up 27 percent of the universe and interacts only via gravity.

The AbacusSummit suite comprises hundreds of simulations of how gravity shaped the distribution of dark matter throughout the universe. Here, a snapshot of one of the simulations is shown at a zoom scale of 1.2 billion light-years across. The simulation replicates the large-scale structures of our universe, such as the cosmic web and colossal clusters of galaxies. Credit: The AbacusSummit Team; layout and design by Lucy Reading-Ikkanda

“This suite is so big that it probably has more particles than all the other N-body simulations that have ever been run combined — though that’s a hard statement to be certain of,” says Lehman Garrison, lead author of one of the new papers and a CCA research fellow.

Garrison led the development of the AbacusSummit simulations along with graduate student Nina Maksimova and professor of astronomy Daniel Eisenstein, both of the Center for Astrophysics. The simulations ran on the U.S. Department of Energy’s Summit supercomputer at the Oak Ridge Leadership Computing Facility in Tennessee.

Several space surveys will produce maps of the cosmos with unprecedented detail in the coming years. These include the Dark Energy Spectroscopic Instrument (DESI), the Nancy Grace Roman Space Telescope, the Vera C. Rubin Observatory and the Euclid spacecraft. One of the goals of these big-budget missions is to improve estimations of the cosmic and astrophysical parameters that determine how the universe behaves and how it looks.

Scientists will make those improved estimations by comparing the new observations to computer simulations of the universe with different values for the various parameters — such as the nature of the dark energy pulling the universe apart.
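The logic of that comparison can be sketched in a few lines. The snippet below is a hedged illustration rather than any survey’s actual pipeline: given an observed summary statistic and simulated predictions of it for different parameter sets, pick the parameter set whose prediction fits best. The function names and the chi-square figure of merit are assumptions made for the example.

```python
import numpy as np

# Illustrative sketch only: none of these names come from DESI, Rubin or
# Euclid pipelines. sim_predictions[k] holds a simulated summary statistic
# (say, a galaxy-clustering measurement) for the parameter set sim_params[k].

def best_fit_parameters(observed, errors, sim_predictions, sim_params):
    """Return the parameter set whose simulated prediction best matches the
    observed statistic, judged by a simple chi-square distance."""
    chi2 = (((sim_predictions - observed) / errors) ** 2).sum(axis=1)
    return sim_params[np.argmin(chi2)]
```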

Abacus leverages parallel computer processing to drastically speed up its calculations of how particles move about due to their gravitational attraction. A sequential processing approach (top) computes the gravitational tug between each pair of particles one by one. Parallel processing (bottom) instead divides the work across multiple computing cores, enabling the calculation of multiple particle interactions simultaneously. Credit: Lucy Reading-Ikkanda/Simons Foundation

“The coming generation of cosmological surveys will map the universe in great detail and explore a wide range of cosmological questions,” says Eisenstein, a co-author on the new MNRAS papers. “But leveraging this opportunity requires a new generation of ambitious numerical simulations. We believe that AbacusSummit will be a bold step for the synergy between computation and experiment.”

The decade-long project was daunting. N-body calculations — which attempt to compute the movements of objects, like planets, interacting gravitationally — have been a foremost challenge in the field of physics since the days of Isaac Newton. The trickiness comes from each object interacting with every other object, no matter how far away; with N objects, the number of pairwise interactions grows as N squared.

There is no exact general solution to the N-body problem for three or more massive bodies; the available calculations are approximations. A common approach is to freeze time, calculate the total force acting on each object, then nudge each one based on the net force it experiences. Time is then moved forward slightly, and the process repeats.
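That freeze-nudge-advance loop is compact enough to write down. Below is a minimal direct-summation sketch in Python, a toy version of the general scheme rather than the Abacus code itself; the gravitational constant, softening length and time step are arbitrary choices for the example.

```python
import numpy as np

G = 1.0            # gravitational constant in code units (arbitrary for the example)
SOFTENING = 1e-3   # softening length to avoid infinite forces at zero separation

def accelerations(pos, mass):
    """Freeze time: net gravitational acceleration on every particle."""
    diff = pos[None, :, :] - pos[:, None, :]        # diff[i, j] = pos[j] - pos[i]
    dist2 = (diff ** 2).sum(-1) + SOFTENING ** 2    # softened squared distances
    inv_d3 = dist2 ** -1.5
    np.fill_diagonal(inv_d3, 0.0)                   # no self-interaction
    return G * (diff * inv_d3[..., None] * mass[None, :, None]).sum(axis=1)

def step(pos, vel, mass, dt):
    """Nudge each particle by its net force, then move time forward slightly."""
    vel = vel + accelerations(pos, mass) * dt
    pos = pos + vel * dt
    return pos, vel

# Example: evolve 1,000 random particles through one small time step.
rng = np.random.default_rng(0)
pos = rng.random((1000, 3))
vel = np.zeros((1000, 3))
mass = np.ones(1000)
pos, vel = step(pos, vel, mass, dt=1e-3)
```

Because every particle pulls on every other, the cost of the acceleration step grows quadratically with particle count, which is precisely the scaling problem that codes like Abacus are engineered to tame.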

Using that approach, AbacusSummit handled colossal numbers of particles thanks to clever code, a new numerical method and lots of computing power. The Summit supercomputer was the world’s fastest at the time the team ran the calculations; it is still the fastest computer in the U.S.

The team designed the codebase for AbacusSummit — called Abacus — to take full advantage of Summit’s parallel processing power, whereby multiple calculations can run simultaneously. In particular, Summit boasts lots of graphical processing units, or GPUs, that excel at parallel processing.

Running N-body calculations using parallel processing requires careful algorithm design because an entire simulation requires a substantial amount of memory to store. That means Abacus can’t just make copies of the simulation for different nodes of the supercomputer to work on. The code instead divides each simulation into a grid. An initial calculation provides a fair approximation of the effects of distant particles at any given point in the simulation (which play a much smaller role than nearby particles). Abacus then groups nearby cells and splits them off so that the computer can work on each group independently, combining the approximation of distant particles with precise calculations of nearby particles.
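A rough schematic of that near/far split follows. Everything in it is simplified for illustration: the uniform grid, the neighbour rule, and above all the far field, which is collapsed here to a single centre-of-mass clump, whereas Abacus’s real far-field solver uses a far more accurate multipole method.

```python
import numpy as np

NGRID = 8    # grid cells per side (illustrative choice)
BOX = 1.0    # simulation box size in code units

def assign_cells(pos):
    """Map each particle position to an integer cell index on a uniform grid."""
    return np.minimum((pos / BOX * NGRID).astype(int), NGRID - 1)

def force_on_particle(i, pos, mass, cells):
    """Exact pairwise force from nearby cells plus a crude stand-in for the rest."""
    near = np.all(np.abs(cells - cells[i]) <= 1, axis=1)  # home cell and neighbours
    near[i] = False                                       # skip self-interaction

    def pull(src_pos, src_mass):
        d = src_pos - pos[i]
        r2 = (d ** 2).sum(-1) + 1e-6                      # softened squared distance
        return (src_mass[:, None] * d / r2[:, None] ** 1.5).sum(axis=0)

    total = pull(pos[near], mass[near])                   # precise near field
    far = ~near
    far[i] = False                                        # exclude the particle itself
    if far.any():
        # Far field: lump all distant particles into one clump at their centre
        # of mass -- deliberately crude next to Abacus's multipole expansion.
        com = np.average(pos[far], axis=0, weights=mass[far])
        total = total + pull(com[None, :], np.array([mass[far].sum()]))
    return total

# Example wiring: cells = assign_cells(pos); f = force_on_particle(0, pos, mass, cells)
```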

“The Abacus algorithm is well matched to the capabilities of modern supercomputers, as it provides a very regular pattern of computation for the massive parallelism of GPU co-processors,” Maksimova says.

Thanks to its design, Abacus achieved very high speeds, updating 70 million particles per second per node of the Summit supercomputer, while also performing analysis of the simulations as they ran. Each particle represents a clump of dark matter with 3 billion times the mass of the sun.

“Our vision was to create this code to deliver the simulations that are needed for this particular new brand of galaxy survey,” says Garrison. “We wrote the code to do the simulations much faster and much more accurate than ever before.”

Eisenstein, who is a member of the DESI collaboration — which recently began its survey to map an unprecedented fraction of the universe — says he is eager to use Abacus in the future.

“Cosmology is leaping forward because of the multidisciplinary fusion of spectacular observations and state-of-the-art computing,” he says. “The coming decade promises to be a marvelous age in our study of the historical sweep of the universe.”

Reference: “AbacusSummit: a massive set of high-accuracy, high-resolution N-body simulations” by Nina A. Maksimova, Lehman H. Garrison, Daniel J. Eisenstein, Boryana Hadzhiyska, Sownak Bose and Thomas P. Satterthwaite, 7 September 2021, Monthly Notices of the Royal Astronomical Society.
DOI: 10.1093/mnras/stab2484

Additional co-creators of Abacus and AbacusSummit include Sihan Yuan of Stanford University, Philip Pinto of the University of Arizona, Sownak Bose of Durham University in England and Center for Astrophysics researchers Boryana Hadzhiyska, Thomas Satterthwaite and Douglas Ferrer. The simulations ran on the Summit supercomputer under an Advanced Scientific Computing Research Leadership Computing Challenge allocation.


News

Here’s how Helene and other storms dumped a whopping 40 trillion gallons of rain on the South


More than 40 trillion gallons of rain drenched the Southeast United States in the last week from Hurricane Helene and a run-of-the-mill rainstorm that sloshed in ahead of it — an unheard-of amount of water that has stunned experts.

That’s enough to fill the Dallas Cowboys’ stadium 51,000 times, or Lake Tahoe just once. If it were concentrated just on the state of North Carolina, that much water would be 3.5 feet deep (more than 1 meter). It’s enough to fill more than 60 million Olympic-size swimming pools.
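Those equivalences survive a quick back-of-the-envelope check; the sketch below uses standard rounded figures for North Carolina’s area and an Olympic pool’s volume, which are not from the article.

```python
GALLON_M3 = 0.003785            # cubic metres per US gallon
volume_m3 = 40e12 * GALLON_M3   # 40 trillion gallons is about 1.51e11 cubic metres

NC_AREA_M2 = 139_390e6          # North Carolina, ~53,800 square miles, in square metres
POOL_M3 = 2_500                 # nominal Olympic-size swimming pool volume

print(volume_m3 / NC_AREA_M2)   # ~1.09 metres deep, i.e. about 3.5 feet
print(volume_m3 / POOL_M3)      # ~6.1e7, i.e. more than 60 million pools
```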

“That’s an astronomical amount of precipitation,” said Ed Clark, head of the National Oceanic and Atmospheric Administration’s National Water Center in Tuscaloosa, Alabama. “I have not seen something in my 25 years of working at the weather service that is this geographically large of an extent and the sheer volume of water that fell from the sky.”

The flood damage from the rain is apocalyptic, meteorologists said. More than 100 people are dead, according to officials.

Private meteorologist Ryan Maue, a former NOAA chief scientist, calculated the amount of rain using precipitation measurements made in 2.5-mile-by-2.5-mile grids by satellites and ground observations. He came up with 40 trillion gallons through Sunday for the eastern United States, with 20 trillion gallons of that hitting just Georgia, Tennessee, the Carolinas and Florida from Hurricane Helene.
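The bookkeeping Maue describes amounts to summing rain depth times cell area across the grid. Here is a minimal sketch of that calculation, assuming a hypothetical 2-D array of storm-total rainfall depths on the 2.5-mile grid mentioned above:

```python
import numpy as np

CELL_AREA_M2 = (2.5 * 1609.34) ** 2   # area of one 2.5-mile-square grid cell
M3_PER_GALLON = 0.003785

def total_rain_gallons(rain_depth_m: np.ndarray) -> float:
    """Total storm rainfall volume: sum of depth x cell area over the grid.
    rain_depth_m is a hypothetical 2-D array of rainfall depths in metres."""
    volume_m3 = float((rain_depth_m * CELL_AREA_M2).sum())
    return volume_m3 / M3_PER_GALLON
```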

Clark did the calculations independently and said the 40-trillion-gallon figure (151 trillion liters) is about right and, if anything, conservative. Maue said maybe 1 to 2 trillion more gallons of rain had fallen since his calculations, much of it in Virginia.

Clark, whose work largely concerns shrinking western water supplies, said that to put the rainfall in perspective, it’s more than twice the combined amount of water stored in two key Colorado River basin reservoirs: Lake Powell and Lake Mead.

Several meteorologists said this was a combination of two, maybe three storm systems. Before Helene struck, rain had fallen heavily for days because a low pressure system had “cut off” from the jet stream — which moves weather systems along west to east — and stalled over the Southeast. That funneled plenty of moisture from the Gulf of Mexico. And a storm that fell just short of named status parked along North Carolina’s Atlantic coast, dumping as much as 20 inches of rain, said North Carolina state climatologist Kathie Dello.

Then add Helene, one of the largest storms of the last couple of decades and one that held plenty of rain because it was young and moved fast before it hit the Appalachians, said University at Albany hurricane expert Kristen Corbosiero.

“It was not just a perfect storm, but it was a combination of multiple storms that led to the enormous amount of rain,” Maue said. “That collected at high elevation, we’re talking 3,000 to 6,000 feet. And when you drop trillions of gallons on a mountain, that has to go down.”

The fact that these storms hit the mountains made everything worse, and not just because of runoff. The interaction between the mountains and the storm systems wrings more moisture out of the air, Clark, Maue and Corbosiero said.

North Carolina weather officials said their highest measured storm total was 31.33 inches, in the tiny town of Busick. Mount Mitchell also got more than 2 feet of rain.

Before 2017’s Hurricane Harvey, “I said to our colleagues, you know, I never thought in my career that we would measure rainfall in feet,” Clark said. “And after Harvey, Florence, the more isolated events in eastern Kentucky, portions of South Dakota. We’re seeing events year in and year out where we are measuring rainfall in feet.”

Storms are getting wetter as the climate changes, said Corbosiero and Dello. A basic law of physics says the air holds nearly 4% more moisture for every degree Fahrenheit of warming (7% for every degree Celsius), and the world has warmed more than 2 degrees Fahrenheit (1.2 degrees Celsius) since pre-industrial times.
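That law is the Clausius-Clapeyron relation, and applying it to the article’s numbers takes two lines (an illustration using the rounded 7 percent-per-degree-Celsius figure quoted above):

```python
warming_c = 1.2                          # warming since pre-industrial times, in deg C
extra_capacity = 1.07 ** warming_c - 1   # ~7% more moisture per deg C, compounded
print(f"~{extra_capacity:.1%} more moisture-holding capacity")  # roughly 8.5%
```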

Corbosiero said meteorologists are vigorously debating how much of Helene is due to worsening climate change and how much is random.

For Dello, the “fingerprints of climate change” were clear.

“We’ve seen tropical storm impacts in western North Carolina. But these storms are wetter and these storms are warmer. And there would have been a time when a tropical storm would have been heading toward North Carolina and would have caused some rain and some damage, but not apocalyptic destruction.”


Science

‘Big Sam’: Paleontologists unearth giant skull of Pachyrhinosaurus in Alberta


It’s a dinosaur that roamed Alberta’s badlands more than 70 million years ago, sporting a big, bumpy, bony head the size of a baby elephant.

On Wednesday, paleontologists near Grande Prairie pulled its 272-kilogram skull from the ground.

They call it “Big Sam.”

The adult Pachyrhinosaurus is the second plant-eating dinosaur to be unearthed from a dense bonebed belonging to a herd that died together on the edge of a valley that now sits 450 kilometres northwest of Edmonton.

It didn’t die alone.

“We have hundreds of juvenile bones in the bonebed, so we know that there are many babies and some adults among all of the big adults,” Emily Bamforth, a paleontologist with the nearby Philip J. Currie Dinosaur Museum, said in an interview on the way to the dig site.

She described the horned Pachyrhinosaurus as “the smaller, older cousin of the triceratops.”

“This species of dinosaur is endemic to the Grande Prairie area, so it’s found here and nowhere else in the world. They are … kind of about the size of an Indian elephant and a rhino,” she added.

The head alone, she said, is about the size of a baby elephant.

The discovery was a long time coming.

The bonebed was first discovered by a high school teacher out for a walk about 50 years ago. It took the teacher a decade to get anyone from southern Alberta to come to take a look.

“At the time, sort of in the ’70s and ’80s, paleontology in northern Alberta was virtually unknown,” said Bamforth.

When paleontologists eventually got to the site, Bamforth said, they learned “it’s actually one of the densest dinosaur bonebeds in North America.”

“It contains about 100 to 300 bones per square metre,” she said.

Paleontologists have been at the site sporadically ever since, combing through bones belonging to turtles, dinosaurs and lizards. Sixteen years ago, they discovered a large skull of an approximately 30-year-old Pachyrhinosaurus, which is now at the museum.

About a year ago, they found the second adult: Big Sam.

Bamforth said both dinosaurs are believed to have been the elders in the herd.

“Their distinguishing feature is that, instead of having a horn on their nose like a triceratops, they had this big, bony bump called a boss. And they have big, bony bumps over their eyes as well,” she said.

“It makes them look a little strange. It’s the one dinosaur that if you find it, it’s the only possible thing it can be.”

The genders of the two adults are unknown.

Bamforth said the extraction was difficult because Big Sam was intertwined in a cluster of about 300 other bones.

The skull was found upside down, “as if the animal was lying on its back,” but was well preserved, she said.

She said the excavation process involved putting plaster on the skull and wooden planks around it for stability. From there, it was lifted out — very carefully — with a crane, and was to be shipped on a trolley to the museum for study.

“I have extracted skulls in the past. This is probably the biggest one I’ve ever done though,” said Bamforth.

“It’s pretty exciting.”

This report by The Canadian Press was first published Sept. 25, 2024.


News

The ancient jar smashed by a 4-year-old is back on display at an Israeli museum after repair


TEL AVIV, Israel (AP) — A rare Bronze Age jar accidentally smashed by a 4-year-old visiting a museum was back on display Wednesday after restoration experts were able to carefully piece the artifact back together.

Last month, a family from northern Israel was visiting the museum when their youngest son tipped over the jar, which smashed into pieces.

Alex Geller, the boy’s father, said his son — the youngest of three — is exceptionally curious, and that the moment he heard the crash, “please let that not be my child” was the first thought that raced through his head.

The jar has been on display at the Hecht Museum in Haifa for 35 years. It was one of the few containers of its size from that period that was still complete when it was discovered.

The Bronze Age jar is one of many artifacts exhibited out in the open, part of the Hecht Museum’s vision of letting visitors explore history without glass barriers, said Inbal Rivlin, the director of the museum, which is associated with Haifa University in northern Israel.

It was likely used to hold wine or oil, and dates back to between 2200 and 1500 B.C.

Rivlin and the museum decided to turn the moment, which captured international attention, into a teaching moment, inviting the Geller family back for a special visit and hands-on activity to illustrate the restoration process.

Rivlin added that the incident provided a welcome distraction from the ongoing war in Gaza. “Well, he’s just a kid. So I think that somehow it touches the heart of the people in Israel and around the world,” said Rivlin.

Roee Shafir, a restoration expert at the museum, said the repairs would be fairly simple, as the pieces were from a single, complete jar. Archaeologists often face the more daunting task of sifting through piles of shards from multiple objects and trying to piece them together.

Experts used 3D technology, high-resolution videos and special glue to painstakingly reconstruct the large jar.

Less than two weeks after it broke, the jar went back on display at the museum. The gluing process left small hairline cracks, and a few pieces are missing, but the jar’s impressive size remains.

The only noticeable difference in the exhibit was a new sign reading “please don’t touch.”

