Peer deeper into the heart of the atom than any microscope allows and scientists hypothesize that you will find a rich world of particles popping in and out of the vacuum, decaying into other particles, and adding to the weirdness of the visible world. These subatomic particles are governed by the quantum nature of the Universe and find tangible, physical form in experimental results.
Some subatomic particles were first discovered over a century ago with relatively simple experiments. More recently, however, the endeavor to understand these particles has spawned the largest, most ambitious and complex experiments in the world, including those at particle physics laboratories such as the European Organization for Nuclear Research (CERN) in Europe, Fermilab in Illinois, and the High Energy Accelerator Research Organization (KEK) in Japan.
These experiments have a mission both to expand our understanding of the Universe, characterized most succinctly in the Standard Model of particle physics, and to look beyond the Standard Model for as-yet-unknown physics.
“The Standard Model explains so much of what we observe in elementary particle and nuclear physics, but it leaves many questions unanswered,” said Steven Gottlieb, distinguished professor of Physics at Indiana University. “We are trying to unravel the mystery of what lies beyond the Standard Model.”
Ever since the beginning of the study of particle physics, experimental and theoretical approaches have complemented each other in the attempt to understand nature. In the past four to five decades, advanced computing has become an important part of both approaches. Great progress has been made in understanding the behavior of the zoo of subatomic particles, including bosons (especially the long sought and recently discovered Higgs boson), various flavors of quarks, gluons, muons, neutrinos and many states made from combinations of quarks or anti-quarks bound together.
Quantum field theory is the theoretical framework from which the Standard Model of particle physics is constructed. It combines classical field theory, special relativity and quantum mechanics, developed with contributions from Einstein, Dirac, Fermi, Feynman, and others. Within the Standard Model, quantum chromodynamics, or QCD, is the theory of the strong interaction between quarks and gluons, the fundamental particles that make up some of the larger composite particles such as the proton, neutron and pion.
Peering Through The Lattice
Carleton DeTar and Steven Gottlieb are two of the leading contemporary scholars of QCD research and practitioners of an approach known as lattice QCD. Lattice QCD represents continuous space as a discrete set of spacetime points (called the lattice). It uses supercomputers to study the interactions of quarks, and importantly, to determine more precisely several parameters of the Standard Model, thereby reducing the uncertainties in its predictions. It’s a slow and resource-intensive approach, but it has proven to have wide applicability, giving insight into parts of the theory inaccessible by other means, in particular the explicit forces acting between quarks and antiquarks.
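To get a rough feel for why the method is so resource-hungry, consider a back-of-the-envelope sketch (in Python) that simply counts what must be stored for one snapshot of the gluon field, in which every link between neighboring lattice sites carries a 3x3 complex matrix. The 64³ × 128 lattice size below is an illustrative assumption, not a figure quoted by the collaborations.

```python
# Rough, illustrative estimate of the storage for one lattice gauge-field
# "configuration." The 64^3 x 128 lattice size is a made-up example, not a
# size quoted by the MILC or Fermilab Lattice collaborations.
nx, ny, nz, nt = 64, 64, 64, 128        # spatial and temporal lattice points
sites = nx * ny * nz * nt               # total spacetime points on the lattice
links = sites * 4                       # one gluon-field link per site per direction
complex_entries = links * 3 * 3         # each link is a 3x3 complex SU(3) matrix
bytes_needed = complex_entries * 16     # double-precision complex = 16 bytes

print(f"lattice sites:  {sites:,}")
print(f"storage needed: {bytes_needed / 1e9:.0f} GB per gluon-field snapshot")
```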
DeTar and Gottlieb are part of the MIMD Lattice Computation (MILC) Collaboration and work closely with the Fermilab Lattice Collaboration on the vast majority of their projects. They also work with the High Precision QCD (HPQCD) Collaboration on the study of the muon anomalous magnetic moment. As part of these efforts, they use the fastest supercomputers in the world.
Since 2019, they have used Frontera at the Texas Advanced Computing Center (TACC)—the fastest academic supercomputer in the world and the 9th fastest overall—to propel their work. They are among the largest users of that resource, which is funded by the National Science Foundation. For the lattice calculations, the team also uses Summit at Oak Ridge National Laboratory (the #2 fastest supercomputer in the world), Cori at the National Energy Research Scientific Computing Center (#20), and Stampede2 at TACC (#25).
The efforts of the lattice QCD community over decades have brought greater accuracy to particle predictions through a combination of faster computers and improved algorithms and methodologies.
“We can do calculations and make predictions with high precision for how strong interactions work,” said DeTar, professor of Physics and Astronomy at the University of Utah. “When I started as a graduate student in the late 1960s, some of our best estimates were within 20 percent of experimental results. Now we can get answers with sub-percent accuracy.”
In particle physics, physical experiment and theory travel in tandem, informing each other, but sometimes producing different results. These differences suggest areas of further exploration or improvement.
“There are some tensions in these tests,” said Gottlieb. “The tensions are not large enough to say that there is a problem here—the usual requirement is at least five standard deviations. But it means either you make the theory and experiment more precise and find that the agreement is better; or you do it and you find out, ‘Wait a minute, what was a three sigma tension is now a five standard deviation tension, and maybe we really have evidence for new physics.’”
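As a rough illustration of the ‘sigma’ language in that quote, a tension is usually quoted as the difference between theory and experiment divided by their combined uncertainty. The numbers in the sketch below are arbitrary placeholders, chosen only to show how shrinking both error bars can turn a roughly three-sigma tension into a five-sigma one; they are not the actual muon measurements.

```python
from math import hypot

def tension(value_a, sigma_a, value_b, sigma_b):
    """Discrepancy in standard deviations, combining both uncertainties in quadrature."""
    return abs(value_a - value_b) / hypot(sigma_a, sigma_b)

# Placeholder numbers in arbitrary units, chosen only to illustrate the idea:
print(f"{tension(100.0, 0.7, 103.0, 0.7):.1f} sigma")  # ~3.0 sigma with looser errors
print(f"{tension(100.0, 0.4, 103.0, 0.4):.1f} sigma")  # ~5.3 sigma with tighter errors
```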
DeTar calls these small discrepancies between theory and experiment ‘tantalizing.’ “They might be telling us something.”
Over the last several years, DeTar, Gottlieb and their collaborators have followed the paths of quarks and antiquarks with ever-greater resolution as they move through a background cloud of gluons and virtual quark-antiquark pairs, as prescribed precisely by QCD. The results of the calculation are used to determine physically meaningful quantities such as particle masses and decays.
One of the current state-of-the-art approaches that is applied by the researchers uses the so-called highly improved staggered quark (HISQ) formalism to simulate interactions of quarks with gluons. On Frontera, DeTar and Gottlieb are currently simulating at a lattice spacing of 0.06 femtometers (1 femtometer = 10⁻¹⁵ meters), but they are quickly approaching their ultimate goal of 0.03 femtometers, a distance where the lattice spacing is smaller than the wavelength of the heaviest quark, consequently removing a significant source of uncertainty from these calculations.
Each doubling of resolution, however, requires about two orders of magnitude more computing power, putting a 0.03-femtometer lattice spacing firmly in the quickly approaching ‘exascale’ regime.
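To see roughly where that factor comes from: the number of lattice sites alone grows as the fourth power of the inverse spacing, so halving the spacing multiplies the site count by 16, and in practice the cost grows faster still because finer lattices need more solver iterations and more statistics. The effective exponent of about 6.6 in the sketch below is an assumption chosen to reproduce the ‘two orders of magnitude per doubling’ figure above, not a number supplied by the researchers.

```python
# Illustrative cost scaling for halving the lattice spacing from 0.06 fm to 0.03 fm.
a_coarse, a_fine = 0.06, 0.03              # lattice spacings in femtometers

site_factor = (a_coarse / a_fine) ** 4     # 4D volume: sites alone grow 16x
assumed_exponent = 6.6                     # assumed effective cost ~ a^(-6.6)
cost_factor = (a_coarse / a_fine) ** assumed_exponent

print(f"lattice sites grow by about {site_factor:.0f}x")
print(f"assumed total cost grows by about {cost_factor:.0f}x")  # roughly 100x
```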
“The cost of the calculations keeps rising as you make the lattice spacing smaller,” DeTar said. “For smaller lattice spacing, we’re thinking of future Department of Energy machines and the Leadership Class Computing Facility [TACC’s future system in planning]. But we can make do with extrapolations now.”
The Anomalous Magnetic Moment Of The Muon And Other Outstanding Mysteries
Among the phenomena that DeTar and Gottlieb are tackling is the anomalous magnetic moment of the muon (essentially a heavy electron) – which, in quantum field theory, arises from a weak cloud of elementary particles that surrounds the muon. The same sort of cloud affects particle decays. Theorists believe yet-undiscovered elementary particles could potentially be in that cloud.
A large international collaboration called the Muon g-2 Theory Initiative recently reviewed the present status of the Standard Model calculation of the muon’s anomalous magnetic moment. Their review appeared in Physics Reports in December 2020. DeTar, Gottlieb and several of their Fermilab Lattice, HPQCD and MILC collaborators are among the coauthors. They find a 3.7 standard deviation difference between experiment and theory.
While some parts of the theoretical contributions can be calculated with extreme accuracy, the hadronic contributions (those involving hadrons, the subatomic particles composed of two or three quarks that participate in strong interactions) are the most difficult to calculate and are responsible for almost all of the theoretical uncertainty. Lattice QCD is one of two ways to calculate these contributions.
“The experimental uncertainty will soon be reduced by up to a factor of four by the new experiment currently running at Fermilab, and also by the future J-PARC experiment,” they wrote. “This and the prospects to further reduce the theoretical uncertainty in the near future… make this quantity one of the most promising places to look for evidence of new physics.”
Gottlieb, DeTar and collaborators have calculated the hadronic contribution to the anomalous magnetic moment with a precision of 2.2 percent. “This gives us confidence that our short-term goal of achieving a precision of 1 percent on the hadronic contribution to the muon anomalous magnetic moment is now a realistic one,” Gottlieb said. They hope to achieve a precision of 0.5 percent a few years later.
Other ‘tantalizing’ hints of new physics involve measurements of the decay of B mesons. There, various experimental methods arrive at different results. “The decay properties and mixings of the D and B mesons are critical to a more accurate determination of several of the least well-known parameters of the Standard Model,” Gottlieb said. “Our work is improving the determinations of the masses of the up, down, strange, charm and bottom quarks and how they mix under weak decays.” The mixing is described by the so-called CKM mixing matrix for which Kobayashi and Maskawa won the 2008 Nobel Prize in Physics.
The answers DeTar and Gottlieb seek are the most fundamental in science: What is matter made of? And where did it come from?
“The Universe is very connected in many ways,” said DeTar. “We want to understand how the Universe began. The current understanding is that it began with the Big Bang. And the processes that were important in the earliest instance of the Universe involve the same interactions that we’re working with here. So, the mysteries we’re trying to solve in the microcosm may very well provide answers to the mysteries on the cosmological scale as well.”
More information:
T. Aoyama et al, The anomalous magnetic moment of the muon in the Standard Model, Physics Reports (2020). DOI: 10.1016/j.physrep.2020.07.006
More than 40 trillion gallons of rain drenched the Southeast United States in the last week from Hurricane Helene and a run-of-the-mill rainstorm that sloshed in ahead of it — an unheard-of amount of water that has stunned experts.
That’s enough to fill the Dallas Cowboys’ stadium 51,000 times, or Lake Tahoe just once. If it were concentrated just on the state of North Carolina, that much water would be 3.5 feet deep (more than 1 meter). It’s enough to fill more than 60 million Olympic-size swimming pools.
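Those comparisons are straightforward unit conversions, and a rough check is sketched below. The reference figures (North Carolina’s area, Lake Tahoe’s volume, a nominal 2,500-cubic-meter Olympic pool, and a roughly 3-million-cubic-meter stadium) are approximate, commonly cited values rather than official numbers.

```python
# Back-of-the-envelope check of the 40-trillion-gallon comparisons.
# All reference areas and volumes are approximate, commonly cited figures.
GALLON_M3 = 0.00378541             # one US gallon in cubic meters

rain_m3 = 40e12 * GALLON_M3        # roughly 1.5e11 m^3, about 151 cubic kilometers

nc_area_m2 = 139_390 * 1e6         # North Carolina, ~139,390 km^2
pool_m3 = 2_500                    # nominal Olympic pool, 50 m x 25 m x 2 m
tahoe_m3 = 150e9                   # Lake Tahoe, ~150 km^3
stadium_m3 = 3.0e6                 # Dallas Cowboys' stadium, ~104 million cubic feet

print(f"depth over North Carolina: {rain_m3 / nc_area_m2 * 3.28084:.1f} feet")  # ~3.6
print(f"Olympic pools filled:      {rain_m3 / pool_m3 / 1e6:.0f} million")       # ~61
print(f"Lake Tahoes filled:        {rain_m3 / tahoe_m3:.1f}")                    # ~1.0
print(f"stadium fills:             {rain_m3 / stadium_m3:,.0f}")                 # ~50,000
```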
“That’s an astronomical amount of precipitation,” said Ed Clark, head of the National Oceanic and Atmospheric Administration’s National Water Center in Tuscaloosa, Alabama. “I have not seen something in my 25 years of working at the weather service that is this geographically large of an extent and the sheer volume of water that fell from the sky.”
The flood damage from the rain is apocalyptic, meteorologists said. More than 100 people are dead, according to officials.
Private meteorologist Ryan Maue, a former NOAA chief scientist, calculated the amount of rain using precipitation measurements from satellites and ground observations on 2.5-mile-by-2.5-mile grid cells. He came up with 40 trillion gallons through Sunday for the eastern United States, with 20 trillion gallons of that hitting just Georgia, Tennessee, the Carolinas and Florida from Hurricane Helene.
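The underlying bookkeeping is simple: each grid cell’s rain depth times its area gives a volume, and the cell volumes add up to the regional total. The sketch below is a toy version of that calculation with invented depths; it is not Maue’s actual dataset or method, and real analyses also handle varying cell areas and quality control.

```python
# Toy version of turning gridded rainfall into a total water volume.
# The grid and its rain depths are invented placeholders, not NOAA or Maue data.
MILE_M = 1609.34
CELL_AREA_M2 = (2.5 * MILE_M) ** 2      # one 2.5-mile-by-2.5-mile grid cell
M3_TO_GALLONS = 264.172

rain_depth_inches = [                   # hypothetical storm-total depths per cell
    [11.0, 14.5, 20.3],
    [8.2, 31.3, 17.6],
]

total_m3 = sum(
    depth * 0.0254 * CELL_AREA_M2       # inches -> meters, then depth x cell area
    for row in rain_depth_inches
    for depth in row
)
print(f"total over 6 cells: {total_m3 * M3_TO_GALLONS / 1e9:.1f} billion gallons")
```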
Clark did the calculations independently and said the 40 trillion gallon figure (151 trillion liters) is about right and, if anything, conservative. Maue said maybe 1 to 2 trillion more gallons of rain had fallen, much of it in Virginia, since his calculations.
Clark, who devotes much of his work to issues of shrinking western water supplies, said that to put the amount of rain in perspective, it’s more than twice the combined amount of water stored by two key Colorado River basin reservoirs: Lake Powell and Lake Mead.
Several meteorologists said this was a combination of two, maybe three storm systems. Before Helene struck, rain had fallen heavily for days because a low pressure system had “cut off” from the jet stream — which moves weather systems along west to east — and stalled over the Southeast. That funneled plenty of moisture from the warm Gulf of Mexico. And a storm that fell just short of named status parked along North Carolina’s Atlantic coast, dumping as much as 20 inches of rain, said North Carolina state climatologist Kathie Dello.
Then add Helene, one of the largest storms in the last couple of decades and one that held plenty of rain because it was young and moved fast before it hit the Appalachians, said University at Albany hurricane expert Kristen Corbosiero.
“It was not just a perfect storm, but it was a combination of multiple storms that led to the enormous amount of rain,” Maue said. “That collected at high elevation, we’re talking 3,000 to 6,000 feet. And when you drop trillions of gallons on a mountain, that has to go down.”
The fact that these storms hit the mountains made everything worse, and not just because of runoff. The interaction between the mountains and the storm systems wrings more moisture out of the air, Clark, Maue and Corbosiero said.
North Carolina weather officials said their top measurement total was 31.33 inches in the tiny town of Busick. Mount Mitchell also got more than 2 feet of rainfall.
Before 2017’s Hurricane Harvey, “I said to our colleagues, you know, I never thought in my career that we would measure rainfall in feet,” Clark said. “And after Harvey, Florence, the more isolated events in eastern Kentucky, portions of South Dakota. We’re seeing events year in and year out where we are measuring rainfall in feet.”
Storms are getting wetter as the climate changes, said Corbosiero and Dello. A basic law of physics says the air holds nearly 4% more moisture for every degree Fahrenheit warmer (7% for every degree Celsius), and the world has warmed more than 2 degrees Fahrenheit (1.2 degrees Celsius) since pre-industrial times.
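Taken at face value, that rule of thumb implies the atmosphere can now hold roughly 8 percent more moisture than in pre-industrial times; the small sketch below spells out the arithmetic, using only the article’s own figures.

```python
# Rough arithmetic behind the "storms are getting wetter" point, using the
# article's own rule of thumb (~7% more moisture capacity per degree Celsius).
rate_per_deg_c = 0.07          # moisture-capacity increase per degree Celsius
warming_c = 1.2                # approximate warming since pre-industrial times

linear = rate_per_deg_c * warming_c                  # simple linear estimate
compounded = (1 + rate_per_deg_c) ** warming_c - 1   # compounding version

print(f"linear:     ~{linear * 100:.0f}% more moisture capacity")      # ~8%
print(f"compounded: ~{compounded * 100:.0f}% more moisture capacity")  # ~8%
```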
Corbosiero said meteorologists are vigorously debating how much of Helene is due to worsening climate change and how much is random.
For Dello, the “fingerprints of climate change” were clear.
“We’ve seen tropical storm impacts in western North Carolina. But these storms are wetter and these storms are warmer. And there would have been a time when a tropical storm would have been heading toward North Carolina and would have caused some rain and some damage, but not apocalyptic destruction.”
It’s a dinosaur that roamed Alberta’s badlands more than 70 million years ago, sporting a big, bumpy, bony head the size of a baby elephant.
On Wednesday, paleontologists near Grande Prairie pulled its 272-kilogram skull from the ground.
They call it “Big Sam.”
The adult Pachyrhinosaurus is the second plant-eating dinosaur to be unearthed from a dense bonebed belonging to a herd that died together on the edge of a valley that now sits 450 kilometres northwest of Edmonton.
It didn’t die alone.
“We have hundreds of juvenile bones in the bonebed, so we know that there are many babies and some adults among all of the big adults,” Emily Bamforth, a paleontologist with the nearby Philip J. Currie Dinosaur Museum, said in an interview on the way to the dig site.
She described the horned Pachyrhinosaurus as “the smaller, older cousin of the triceratops.”
“This species of dinosaur is endemic to the Grande Prairie area, so it’s found here and nowhere else in the world. They are … kind of about the size of an Indian elephant and a rhino,” she added.
The head alone, she said, is about the size of a baby elephant.
The discovery was a long time coming.
The bonebed was first discovered by a high school teacher out for a walk about 50 years ago. It took the teacher a decade to get anyone from southern Alberta to come to take a look.
“At the time, sort of in the ’70s and ’80s, paleontology in northern Alberta was virtually unknown,” said Bamforth.
When paleontologists eventually got to the site, Bamforth said, they learned “it’s actually one of the densest dinosaur bonebeds in North America.”
“It contains about 100 to 300 bones per square metre,” she said.
Paleontologists have been at the site sporadically ever since, combing through bones belonging to turtles, dinosaurs and lizards. Sixteen years ago, they discovered a large skull of an approximately 30-year-old Pachyrhinosaurus, which is now at the museum.
About a year ago, they found the second adult: Big Sam.
Bamforth said both dinosaurs are believed to have been the elders in the herd.
“Their distinguishing feature is that, instead of having a horn on their nose like a triceratops, they had this big, bony bump called a boss. And they have big, bony bumps over their eyes as well,” she said.
“It makes them look a little strange. It’s the one dinosaur that if you find it, it’s the only possible thing it can be.”
The genders of the two adults are unknown.
Bamforth said the extraction was difficult because Big Sam was intertwined in a cluster of about 300 other bones.
The skull was found upside down, “as if the animal was lying on its back,” but was well preserved, she said.
She said the excavation process involved putting plaster on the skull and wooden planks around it for stability. From there, it was lifted out — very carefully — with a crane, and was to be shipped on a trolley to the museum for study.
“I have extracted skulls in the past. This is probably the biggest one I’ve ever done though,” said Bamforth.
“It’s pretty exciting.”
This report by The Canadian Press was first published Sept. 25, 2024.
TEL AVIV, Israel (AP) — A rare Bronze Age jar accidentally smashed by a 4-year-old visiting a museum was back on display Wednesday after restoration experts were able to carefully piece the artifact back together.
Last month, a family from northern Israel was visiting the museum when their youngest son tipped over the jar, which smashed into pieces.
Alex Geller, the boy’s father, said his son — the youngest of three — is exceptionally curious, and that the moment he heard the crash, “please let that not be my child” was the first thought that raced through his head.
The jar has been on display at the Hecht Museum in Haifa for 35 years. It was one of the only containers of its size and from that period still complete when it was discovered.
The Bronze Age jar is one of many artifacts exhibited out in the open, part of the Hecht Museum’s vision of letting visitors explore history without glass barriers, said Inbal Rivlin, the director of the museum, which is associated with Haifa University in northern Israel.
It was likely used to hold wine or oil, and dates back to between 2200 and 1500 B.C.
Rivlin and the museum decided to turn the moment, which captured international attention, into a teaching moment, inviting the Geller family back for a special visit and hands-on activity to illustrate the restoration process.
Rivlin added that the incident provided a welcome distraction from the ongoing war in Gaza. “Well, he’s just a kid. So I think that somehow it touches the heart of the people in Israel and around the world,” said Rivlin.
Roee Shafir, a restoration expert at the museum, said the repairs would be fairly simple, as the pieces were from a single, complete jar. Archaeologists often face the more daunting task of sifting through piles of shards from multiple objects and trying to piece them together.
Experts used 3D technology, high-resolution videos, and special glue to painstakingly reconstruct the large jar.
Less than two weeks after it broke, the jar went back on display at the museum. The gluing process left small hairline cracks, and a few pieces are missing, but the jar’s impressive size remains.
The only noticeable difference in the exhibit was a new sign reading “please don’t touch.”