
Science

Astrophysicists Reveal Largest-Ever Suite of Universe Simulations – How Gravity Shaped the Distribution of Dark Matter – SciTechDaily


To understand how the universe formed, astronomers have created AbacusSummit, more than 160 simulations of how gravity may have shaped the distribution of dark matter.

Collectively clocking in at nearly 60 trillion particles, a newly released set of cosmological simulations is by far the biggest ever produced.

The simulation suite, dubbed AbacusSummit, will be instrumental for extracting secrets of the universe from upcoming surveys of the cosmos, its creators predict. They present AbacusSummit in several recently published papers in the Monthly Notices of the Royal Astronomical Society.

AbacusSummit is the product of researchers at the Flatiron Institute’s Center for Computational Astrophysics (CCA) in New York City and the Center for Astrophysics | Harvard & Smithsonian. Made up of more than 160 simulations, it models how particles in the universe move about due to their gravitational attraction. Such models, known as N-body simulations, capture the behavior of dark matter, a mysterious and invisible substance that makes up about 27 percent of the universe and interacts only via gravity.

How Gravity Shaped the Distribution of Dark Matter

The AbacusSummit suite comprises more than 160 simulations of how gravity shaped the distribution of dark matter throughout the universe. Here, a snapshot of one of the simulations is shown at a zoom scale of 1.2 billion light-years across. The simulation replicates the large-scale structures of our universe, such as the cosmic web and colossal clusters of galaxies. Credit: The AbacusSummit Team; layout and design by Lucy Reading-Ikkanda

“This suite is so big that it probably has more particles than all the other N-body simulations that have ever been run combined — though that’s a hard statement to be certain of,” says Lehman Garrison, lead author of one of the new papers and a CCA research fellow.

Garrison led the development of the AbacusSummit simulations along with graduate student Nina Maksimova and professor of astronomy Daniel Eisenstein, both of the Center for Astrophysics. The simulations ran on the U.S. Department of Energy’s Summit supercomputer at the Oak Ridge Leadership Computing Facility in Tennessee.

Several space surveys will produce maps of the cosmos with unprecedented detail in the coming years. These include the Dark Energy Spectroscopic Instrument (DESI), the Nancy Grace Roman Space Telescope, the Vera C. Rubin Observatory and the Euclid spacecraft. One of the goals of these big-budget missions is to improve estimations of the cosmic and astrophysical parameters that determine how the universe behaves and how it looks.

Scientists will make those improved estimations by comparing the new observations to computer simulations of the universe with different values for the various parameters — such as the nature of the dark energy pulling the universe apart.
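
To make that comparison concrete, here is a minimal, purely illustrative Python sketch: a summary statistic measured from a survey is compared against predictions generated for a grid of trial parameter values, and the best match is kept. The toy power-spectrum function, the noise level, and the parameter grid are all invented for illustration and are not part of the AbacusSummit pipeline.

```python
# Illustrative sketch (not the AbacusSummit pipeline): infer a cosmological
# parameter by comparing an "observed" summary statistic against predictions
# for a grid of trial parameter values. All names and numbers are assumptions.
import numpy as np

rng = np.random.default_rng(0)

def simulated_power_spectrum(w_dark_energy, k):
    """Stand-in for a suite of simulations: a toy clustering statistic
    whose shape depends on a dark-energy parameter w."""
    return np.exp(-k) * (1.0 + 0.3 * (w_dark_energy + 1.0) * k)

k_bins = np.linspace(0.1, 1.0, 20)
sigma = 0.01
truth = simulated_power_spectrum(-0.9, k_bins)
observed = truth + rng.normal(scale=sigma, size=k_bins.size)  # mock survey data

# One "simulation" per candidate parameter value; keep the best chi-square fit.
w_grid = np.linspace(-1.2, -0.6, 61)
chi2 = [np.sum(((observed - simulated_power_spectrum(w, k_bins)) / sigma) ** 2)
        for w in w_grid]
best_fit = w_grid[int(np.argmin(chi2))]
print(f"best-fit dark-energy parameter w ~ {best_fit:.2f}")
```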

AbacusSummit Leverages Parallel Computer Processing

Abacus leverages parallel computer processing to drastically speed up its calculations of how particles move about due to their gravitational attraction. A sequential processing approach (top) computes the gravitational tug between each pair of particles one by one. Parallel processing (bottom) instead divides the work across multiple computing cores, enabling the calculation of multiple particle interactions simultaneously. Credit: Lucy Reading-Ikkanda/Simons Foundation

“The coming generation of cosmological surveys will map the universe in great detail and explore a wide range of cosmological questions,” says Eisenstein, a co-author on the new MNRAS papers. “But leveraging this opportunity requires a new generation of ambitious numerical simulations. We believe that AbacusSummit will be a bold step for the synergy between computation and experiment.”

The decade-long project was daunting. N-body calculations — which attempt to compute the movements of objects, like planets, interacting gravitationally — have been a foremost challenge in the field of physics since the days of Isaac Newton. The trickiness comes from each object interacting with every other object, no matter how far away. That means that as you add more objects, the number of pairwise interactions grows rapidly, roughly as the square of the number of objects.

There is no general solution to the N-body problem for three or more massive bodies. The calculations available are simply approximations. A common approach is to freeze time, calculate the total force acting on each object, then nudge each one based on the net force it experiences. Time is then moved forward slightly, and the process repeats.
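
A bare-bones version of that freeze-and-nudge loop looks something like the Python sketch below. It is a toy direct-summation integrator, not the Abacus code; note that the force calculation visits every pair of particles, which is exactly why the cost climbs so quickly as particles are added.

```python
# Toy sketch of the "freeze time, compute forces, nudge, repeat" approach
# described above. Not the Abacus code: every pair of particles is visited,
# so the work grows roughly as N^2.
import numpy as np

G = 1.0  # gravitational constant in code units

def gravitational_accelerations(positions, masses, softening=1e-3):
    """Net acceleration on every particle from all the others."""
    acc = np.zeros_like(positions)
    n = len(positions)
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            dr = positions[j] - positions[i]
            dist = np.sqrt(np.dot(dr, dr) + softening**2)
            acc[i] += G * masses[j] * dr / dist**3
    return acc

def step(positions, velocities, masses, dt):
    """Freeze time, compute the net force on each particle, nudge, advance."""
    acc = gravitational_accelerations(positions, masses)
    velocities = velocities + acc * dt
    positions = positions + velocities * dt
    return positions, velocities

# Tiny example: 100 random particles evolved for 10 small time steps.
rng = np.random.default_rng(1)
pos = rng.uniform(-1, 1, size=(100, 3))
vel = np.zeros((100, 3))
m = np.full(100, 1.0 / 100)
for _ in range(10):
    pos, vel = step(pos, vel, m, dt=1e-3)
```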

Using that approach, AbacusSummit handled colossal numbers of particles thanks to clever code, a new numerical method and lots of computing power. The Summit supercomputer was the world’s fastest at the time the team ran the calculations; it is still the fastest computer in the U.S.

The team designed the codebase for AbacusSummit — called Abacus — to take full advantage of Summit’s parallel processing power, whereby multiple calculations can run simultaneously. In particular, Summit boasts lots of graphical processing units, or GPUs, that excel at parallel processing.

Running N-body calculations using parallel processing requires careful algorithm design because an entire simulation requires a substantial amount of memory to store. That means Abacus can’t just make copies of the simulation for different nodes of the supercomputer to work on. The code instead divides each simulation into a grid. An initial calculation provides a fair approximation of the effects of distant particles at any given point in the simulation (which play a much smaller role than nearby particles). Abacus then groups nearby cells and splits them off so that the computer can work on each group independently, combining the approximation of distant particles with precise calculations of nearby particles.
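
The sketch below illustrates that near/far split in one dimension: particles in the same or adjacent grid cells are summed exactly, while each distant cell is approximated by a single point mass at its center of mass. Abacus’s actual method (multipole expansions, GPU batching) is far more sophisticated; the cell count, softening and thresholds here are illustrative assumptions only.

```python
# Conceptual 1D sketch of a grid-based near/far force split, in the spirit of
# the decomposition described above (not Abacus's actual algorithm).
import numpy as np

G = 1.0

def cell_index(position, box_size, n_cells):
    """Map a position in [0, box_size) to a grid cell index."""
    return min(int(position / box_size * n_cells), n_cells - 1)

def forces_near_far(positions, masses, box_size=1.0, n_cells=8, softening=1e-3):
    """Exact sum over particles in the same or adjacent cells; center-of-mass
    approximation for every cell farther away."""
    cells = [cell_index(x, box_size, n_cells) for x in positions]
    cell_mass = np.zeros(n_cells)
    cell_com = np.zeros(n_cells)
    for x, m, c in zip(positions, masses, cells):
        cell_mass[c] += m
        cell_com[c] += m * x
    occupied = cell_mass > 0
    cell_com[occupied] /= cell_mass[occupied]

    acc = np.zeros(len(positions))
    for i, (xi, ci) in enumerate(zip(positions, cells)):
        for c in range(n_cells):
            if abs(c - ci) <= 1:
                # Near field: exact pairwise sum within neighboring cells.
                for j, (xj, cj) in enumerate(zip(positions, cells)):
                    if cj == c and j != i:
                        dx = xj - xi
                        acc[i] += G * masses[j] * dx / (abs(dx)**3 + softening)
            elif cell_mass[c] > 0:
                # Far field: treat the whole cell as one point mass.
                dx = cell_com[c] - xi
                acc[i] += G * cell_mass[c] * dx / (abs(dx)**3 + softening)
    return acc

rng = np.random.default_rng(2)
x = rng.uniform(0, 1, size=200)
m = np.full(200, 1.0 / 200)
print(forces_near_far(x, m)[:5])
```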

“The Abacus algorithm is well matched to the capabilities of modern supercomputers, as it provides a very regular pattern of computation for the massive parallelism of GPU co-processors,” Maksimova says.

Thanks to its design, Abacus achieved very high speeds, updating 70 million particles per second per node of the Summit supercomputer, while also performing analysis of the simulations as they ran. Each particle represents a clump of dark matter with 3 billion times the mass of the sun.

“Our vision was to create this code to deliver the simulations that are needed for this particular new brand of galaxy survey,” says Garrison. “We wrote the code to do the simulations much faster and much more accurate than ever before.”

Eisenstein, who is a member of the DESI collaboration — which recently began its survey to map an unprecedented fraction of the universe — says he is eager to use Abacus in the future.

“Cosmology is leaping forward because of the multidisciplinary fusion of spectacular observations and state-of-the-art computing,” he says. “The coming decade promises to be a marvelous age in our study of the historical sweep of the universe.”

Reference: “AbacusSummit: a massive set of high-accuracy, high-resolution N-body simulations” by Nina A. Maksimova, Lehman H. Garrison, Daniel J. Eisenstein, Boryana Hadzhiyska, Sownak Bose and Thomas P. Satterthwaite, 7 September 2021, Monthly Notices of the Royal Astronomical Society.
DOI: 10.1093/mnras/stab2484

Additional co-creators of Abacus and AbacusSummit include Sihan Yuan of Stanford University, Philip Pinto of the University of Arizona, Sownak Bose of Durham University in England and Center for Astrophysics researchers Boryana Hadzhiyska, Thomas Satterthwaite and Douglas Ferrer. The simulations ran on the Summit supercomputer under an Advanced Scientific Computing Research Leadership Computing Challenge allocation.


Science

A new approach to flagship space telescopes – The Space Review


The astrophysics decadal survey recommended a scaled-down version of a space telescope concept called LUVOIR as the first in a line of flagship space observatories to be developed over the next few decades. (credit: NASA/GSFC)

For much of this year, the biggest puzzle for astrophysicists had nothing to do with dark matter, dark energy, or discrepancies in the value of the Hubble Constant. Instead, the question at the top of their minds was: when was Astro2020 coming out?

Astro2020 was the shorthand for the latest astrophysics decadal survey, the once-a-decade review of the state of the field and recommendations for both ground- and space-based projects to pursue to answer the top scientific questions. The final report by the decadal survey’s steering committee, once expected in late 2020 as the name suggests, had slipped to some time in 2021 because of the pandemic, which forced a shift from in-person to virtual meetings just as work on the survey was going into high gear.


The committee itself kept quiet about its work, providing little specific guidance about when to expect the final report. At a meeting of NASA’s Astrophysics Advisory Committee in October, Paul Hertz, director of the agency’s astrophysics division, presented the results of an office pool from earlier in the year predicting when the report would be released. All but two thought the report would have already been released by the mid-October meeting of that committee; those two predicted it would be released the week of Thanksgiving.

Fortunately, they and the rest of the astrophysics community did not have to wait until last week’s holiday to get their hands on the report. The document, released November 4, provided astronomers with a long-awaited roadmap for not just the next decade but arguably through the middle of the century, endorsing a set of observatories that can peer back into the distant early universe and also look for habitable worlds close to home.

While the decadal survey makes a series of recommendations for smaller missions and ground-based telescopes, what gets the most attention is its recommendation for a large strategic, or flagship, space mission. That recommendation is just that—NASA isn’t bound to accept it—yet the agency has adopted the top-ranked flagship missions of previous decadals. That includes the one picked in the previous decadal in 2010, which became the Wide-Field Infrared Survey Telescope (WFIRST), renamed by NASA to the Nancy Grace Roman Space Telescope last year.

NASA, in preparation for Astro2020, commissioned detailed studies of four proposed flagship observatories, operating from far infrared to X-ray wavelengths (see “Selecting the next great space observatory”, The Space Review, January 21, 2019). These studies offered detailed technical, scientific, and budgetary information for the concepts, which were effectively finalists for being the next flagship mission—although the decadal survey was not under any obligation to pick one.

And, in the end, they didn’t pick one of the four. Instead, the recommended flagship mission was something of a compromise between two of the concepts. One, the Habitable Exoplanet Observatory, or HabEx, proposed a space telescope between 3.2 and 4 meters across optimized to search for potentially habitable exoplanets. The other, the Large Ultraviolet Optical Infrared Surveyor, or LUVOIR, proposed a large space telescope between 8 and 15 meters across for use in a wide range of astrophysics, from exoplanet studies to cosmology.

What the decadal recommended was a telescope six meters across capable of observations in ultraviolet, visible, and infrared wavelengths: similar to LUVOIR but scaled down to a size between the smaller version of LUVOIR and HabEx.

The decision to pick a concept between LUVOIR and HabEx was driven by science and budgets: big enough to meet key science objectives like characterizing exoplanets, but also small enough to fit into a reasonable cost and schedule. “We thought that six meters provides assurance of enough target planets, but it’s also a big enough gain in capability over Hubble to really enable general astrophysics,” said Robert Kennicutt, an astronomer at the University of Arizona and Texas A&M University who was one of the two co-chairs of the decadal survey committee.


That telescope—not given a name by the decadal survey—will still be expensive and take a long time to build. The decadal’s estimates, which included independent cost and schedule analyses, projected the telescope would cost $11 billion to build, in line with the James Webb Space Telescope when accounting for inflation, and be ready for launch in the first half of the 2040s. But the original LUVOIR concept would have cost $17 billion and would not have been ready until the 2050s, according to those same analyses. HabEx, the decadal survey concluded, would have been cheaper but too small to meet many of those science goals.

That selection of a flagship mission was, alone, not that different from past decadal surveys. Even that compromise pick is not unprecedented, as the previous decadal’s recommendation of what would become Roman emerged from combining several concepts. What was different, though, was the realization that, after the delays and cost overruns suffered by past flagships, notably the James Webb Space Telescope, NASA needed a different approach to developing such missions.

“We realized that all of these are visionary ideas but they require timelines that are pan-decadal, even multi-generational,” said Fiona Harrison of Caltech, the other co-chair of the steering committee, referring to the four flagship concepts studied for the decadal. “We really think a different approach needs to be taken.”

The decadal survey further recommended that this space telescope be just the first mission to emerge from a new Great Observatories Mission and Technology Maturation Program at NASA. That program would mature technologies for a series of flagship missions in a coherent fashion.

“The survey committee expects that this process will result in decreased cost and risk and enable more frequent launches of flagship missions, even if it does require significantly more upfront investment prior to a decadal recommendation regarding implementation,” the committee concluded in the report.

Specifically, it recommended that, five years after starting work on the large space telescope that was the report’s top priority, NASA begin studies of two other flagship missions, a far infrared space telescope and an X-ray observatory, at estimated costs of $3–5 billion each. Both are similar to the other two flagship mission concepts studied by NASA for this decadal survey, the Origins Space Telescope and Lynx X-ray Observatory.

Setting up studies of those future mission concepts, without committing to them, allows NASA to adapt as technologies or science goals change, another member of the decadal survey steering committee noted. “If the progress appears to be stalled or delayed, then we can rapidly onramp another one of the compelling, exciting ideas,” said Keivan Stassun of Vanderbilt University. “We can be phasing in multiple great ideas.”


The idea that it takes a long time to develop flagship space telescopes is not new: the first studies of JWST, originally called the Next Generation Space Telescope, predate the launch of the Hubble Space Telescope more than three decades ago, and that spacecraft is only now about to launch. But the survey’s proposal recognizes that the problems experienced by JWST and, to a lesser extent, Roman, require a different approach to managing such complex, expensive missions.

It also reflects the realization that some of the questions that astrophysics is seeking to answer can’t be easily fit into decade-long timeframes. “We were tasked and encouraged by the funding agencies, including NASA, to really think big, bold, ambitious, and long-term,” Stassun said. “We took that to mean that we should not be thinking only about that which can be accomplished in a ten-year period.”

NASA’s Hertz had, in fact, urged the decadal survey to be bold on many occasions before and during its deliberations. “I asked the decadal survey to be ambitious, and I believe they are certainly ambitious,” he said at a November 8 meeting of the Committee on Astronomy and Astrophysics of the National Academies’ Space Studies Board.

NASA is only starting to review the overall recommendations of the decadal, he said. That includes not just its analysis of flagship missions but endorsement of a new medium-class line of “probe” missions, with a cost of $1.5 billion per mission and flying once a decade. Such missions would be analogous to the New Frontiers line of planetary science missions that fall between planetary science flagships and smaller Discovery class missions.

The delay in completing the decadal means it won’t have an impact on NASA’s next budget proposal for fiscal year 2023, which is already in active development for release in early 2022. Hertz said he’ll provide some initial comment on the decadal at a town hall meeting during the American Astronomical Society meeting in early January, particularly any recommendations that can be accommodated in the fiscal year 2022 budget. A complete, formal response will come later next year after a series of town hall meetings.

Those plans will depend on budgets. The first views Congress has on the decadal, including its flagship mission plans, will come Wednesday when the House Science Committee’s space subcommittee holds a hearing on the report.

Hertz was optimistic in general about the state of NASA’s astrophysics programs, citing the impending launch of JWST and Roman passing its critical design review. “I’m really excited. This is a great time for astrophysics,” he said. Astronomers hope the decadal’s recommendations, if implemented, can make it a great few decades for the field.



Science

AI Just Designed The World’s First Living Robots That Can Make Babies – Forbes


Xenobots are the world’s first AI-designed biological robots that can self-repair and self-replicate.

The year was 1948 when Hungarian-American mathematician John von Neumann proposed the idea of an autonomous robot capable of using raw materials to reproduce itself. Today, von Neumann’s vision is finally realized with one major twist: the self-replicating robot isn’t made of aluminum, plastics, spur gears or sprockets. The parent robot and its babies, a new lineage of organism called Xenobots, are entirely biological. “It was exciting to see that we could [make] this Von Neumann machine, but using cells instead of robot parts,” says Sam Kriegman, a computer scientist at the Wyss Institute at Harvard and co-author of the Xenobots paper published today in PNAS.

“People have philosophized about this forever,” says Joshua Bongard, senior author and computer scientist at the University of Vermont. “But now you can actually do experiments to create biological machines, or machines that make biology, which in turn make machines.” 

It’s okay to be confused. The researchers liberally refer to Xenobots as “machines” even though Xenobots don’t contain a single mechanical component. Science may be moving faster than our framework for talking and even thinking about this new category of machine life. “I think it challenges us to see that there may not be a clear dividing line between machine and organism,” says Bongard.

Artificial Intelligence

The self-replicating Xenobot was first “conceived” by an artificial intelligence (AI) program working on UVM’s supercomputer. The AI ran an evolutionary algorithm capable of testing billions of biological body shapes in a simulation. The goal was to discover which configuration of cells is capable of self-replication. The AI rendered a winning design: a cluster of cells shaped like Pac-Man from the 1980s arcade game.
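
For readers unfamiliar with the technique, the Python sketch below shows the general shape of such an evolutionary search: keep a population of candidate body plans, score each one, retain the best, and refill the population with mutated copies. The grid-based “body,” the placeholder fitness function and all parameters here are invented for illustration; the actual UVM pipeline evolves 3D cell configurations inside a physics simulator.

```python
# Bare-bones evolutionary loop, illustrating the general idea only
# (not the UVM pipeline). The "body" is a small binary grid and the
# fitness score is a made-up placeholder.
import numpy as np

rng = np.random.default_rng(0)
GRID = (5, 5)          # toy body plan: which voxels contain cells
POP_SIZE = 50
GENERATIONS = 100

def random_body():
    return rng.integers(0, 2, size=GRID)

def fitness(body):
    """Placeholder score: reward mostly-filled bodies with a notch at one
    corner (a crude stand-in for 'has a mouth-like opening'). A real
    fitness function would simulate the body's behavior."""
    return body.sum() - 3 * body[2, 4]

def mutate(body, rate=0.05):
    flips = rng.random(GRID) < rate
    return np.where(flips, 1 - body, body)

population = [random_body() for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    scored = sorted(population, key=fitness, reverse=True)
    parents = scored[: POP_SIZE // 2]          # keep the top half
    children = [mutate(p) for p in parents]    # mutated copies refill the pool
    population = parents + children

best = max(population, key=fitness)
print("best toy body plan:\n", best)
```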

Biologist Douglas Blackiston took the AI’s blueprint and used microcautery electrodes and surgical forceps to hand-sculpt the Xenobots, creatures made up of clusters of 4,000-5,000 frog cells swimming in a petri dish. Random frog cells added to the dish give the parent Xenobots raw material to make babies inside their Pac-Man-shaped mouths. The Xenobabies grow into parent Xenobots. By adding frog cells, self-replication continues generation after generation.

Biological Intelligence

Sculpting a bespoke shape out of stem cells is the “programming” that instructs cell clusters to develop a certain way. Shaping a cluster of frog cells in this specific configuration programs them to become a new self-replicating life form. “This is an AI designing life, or designing a robot, whatever you want to call it,” says Blackiston. “These are things that are not under the purview of [natural] selection.” 

New Definitions For Intelligence

Robots made of traditional robot parts that perform quite well in controlled environments often fail in the real world. “Once you move through the world, it’s unpredictable, things are messy,” says Kriegman, who was delighted by the possibility of using robotics materials that have biological intelligence built in. “Doug came up with the notion of building robots out of biological stuff,” says Kriegman. “You get this intelligence for free. And we were off to the races.”

When asked if Xenobots are intelligent, Blackiston has reservations. Of the two computer scientists and two biologists on the research team, Blackiston is more comfortable calling Xenobots programmed engineered organisms, with intelligence happening at the design and programming level but not in the actual Xenobot. “My opinion is that they’re not intelligent,” says Blackiston. He does agree with the rest of the team, though, that their work challenges scientific definitions. “[Definitions] are being driven into extinction because of these technologies,” says Bongard. “Xenobots are a product of AI and AI itself is helping to drive to extinction our standard definitions of intelligence.”

Intelligent Design

Definitions aside, Blackiston thinks society will have to grapple with many of the applications for, and implications of, this new technology—like the question of artificial intelligence designing replacement parts for humans. “What if an AI tinkers around and figures out it can design a better heart than the one that evolution has given us?” asks Blackiston, who thinks it’s possible AI could give us blueprints to create organs superior to our current models. “I think we’re going to see these questions popping up all over the medical and environmental space in the next 10-15 years.”

Video: Blackiston surgically shaping the first generation of AI-designed Xenobots.

Video: Xenobots collecting frog cells and shaping them into Xenobabies.


Science

Roy Physical Therapy Clinic Outlines Laser Therapy Treatment Phases – Digital Journal


Rock Run Physical Therapy is a leading physical therapy office. In a recent update, the facility outlined in detail its laser therapy treatment phases.

Rock Run Physical Therapy, in a website post, has shared its laser therapy treatment phases. Laser therapy is used to relieve pain, accelerate healing, and decrease inflammation. When the light source is placed against the skin, the photons penetrate several centimeters of tissue and are absorbed by the mitochondria, the energy-producing part of the cell.

This energy fuels many positive physiological responses resulting in the restoration of normal cell morphology and function. The Roy physical therapy laser has been successfully used to treat a broad range of medical conditions. Some include musculoskeletal problems, arthritis, sports injuries, post-surgical wounds, diabetic ulcers, and dermatological disorders.

Rock Run Physical Therapy works through the following phases when delivering laser therapy. The first is inflammation. Often, the injured area gets red and puffy and might even feel a little warm to the touch. This is the body’s natural response to any injury. Unfortunately, the swelling can be painful.

Inflammation is just blood flowing into that area to help heal the damaged tissue. Increased blood flow creates less space for movement in that area, which results in pain. So, the primary role of laser therapy is to reduce this pain.

The next phase is repair. This is where laser therapy starts to come into play. The body increases blood flow to an injured area to bring in cells that repair the damaged tissue. Laser physical therapy and the photobiomodulation (PBM) process speed up healing by giving energy to the cells responsible for repairing tissue, decreasing the amount of time someone spends in the inflammation stage.

The final phase of the healing process is remodeling. This means training the repaired tissue to perform all the tasks it handled before the injury. Laser therapy is still beneficial here because not all of the tissue is restored by this point. The most helpful thing someone can do at this stage of healing is to attend physical therapy. With the help of a therapist, they can regain normal movement and improve strength in the repaired tissue so that they are far less likely to injure that body part again.

For more information, visit https://rockruntherapy.com/

About the Company:

Rock Run offers award-winning physical therapy in Northern Utah. The team is committed to providing the best physical therapy experience with the highest quality of care for optimal results. They also pride themselves on providing the community with the highest quality of physical therapy and rehab care, delivered efficiently thanks to hands-on techniques that facilitate pain relief and functional recovery. Therapeutic exercise and home programs also help patients get back to where they want to be.

Media Contact
Company Name: Rock Run Physical Therapy
Contact Person: Brandon Hepner
Email: Send Email
Phone: (801) 985-2700
Address: 5991 S 3500 W #300
City: Roy
State: UT
Zip code: 84067
Country: United States
Website: https://rockruntherapy.com/
