
Science

Ancient Quasars Help Confirm Quantum Entanglement

The quasar dates back to less than one billion years after the Big Bang. Image: NASA/ESA/G. Bacon, STScI

New results are among the strongest evidence yet for “spooky action at a distance.”

Last year, physicists at MIT, the University of Vienna, and elsewhere provided strong support for quantum entanglement, the seemingly far-out idea that two particles, no matter how distant from each other in space and time, can be inextricably linked, in a way that defies the rules of classical physics.

Take, for instance, two particles sitting on opposite edges of the universe. If they are truly entangled, then according to the theory of quantum mechanics their physical properties should be related in such a way that any measurement made on one particle should instantly convey information about any future measurement outcome of the other particle — correlations that Einstein skeptically saw as “spooky action at a distance.”

In the 1960s, the physicist John Bell calculated a theoretical limit beyond which such correlations must have a quantum, rather than a classical, explanation.
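
In the form most often used in such experiments, the CHSH version of Bell’s inequality, each side measures with one of two settings (a or a′ on one side, b or b′ on the other), and the resulting correlations are combined into a single statistic. A standard textbook statement, offered here for orientation rather than as the exact statistic the team reports, is

\[
S = E(a,b) - E(a,b') + E(a',b) + E(a',b'), \qquad |S| \le 2 \ \text{for any classical theory},
\]

while quantum mechanics allows values as large as \(2\sqrt{2} \approx 2.83\). An experiment that reliably finds \(|S| > 2\) therefore rules out classical explanations, up to the loopholes discussed below.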

But what if such correlations were the result not of quantum entanglement, but of some other hidden, classical explanation? Such “what-ifs” are known to physicists as loopholes in tests of Bell’s inequality. The most stubborn of these is the “freedom-of-choice” loophole: the possibility that some hidden, classical variable may influence the measurement that an experimenter chooses to perform on an entangled particle, making the outcome look quantumly correlated when in fact it isn’t.


Last February, the MIT team and their colleagues significantly constrained the freedom-of-choice loophole, by using 600-year-old starlight to decide what properties of two entangled photons to measure. Their experiment proved that, if a classical mechanism caused the correlations they observed, it would have to have been set in motion more than 600 years ago, before the stars’ light was first emitted and long before the actual experiment was even conceived.

Now, in a paper published today in Physical Review Letters, the same team has vastly extended the case for quantum entanglement and further restricted the options for the freedom-of-choice loophole. The researchers used distant quasars, one of which emitted its light 7.8 billion years ago and the other 12.2 billion years ago, to determine the measurements to be made on pairs of entangled photons. They found correlations among more than 30,000 pairs of photons, to a degree that far exceeded the limit that Bell originally calculated for a classically based mechanism.

“If some conspiracy is happening to simulate quantum mechanics by a mechanism that is actually classical, that mechanism would have had to begin its operations — somehow knowing exactly when, where, and how this experiment was going to be done — at least 7.8 billion years ago. That seems incredibly implausible, so we have very strong evidence that quantum mechanics is the right explanation,” says co-author Alan Guth, the Victor F. Weisskopf Professor of Physics at MIT.

“The Earth is about 4.5 billion years old, so any alternative mechanism — different from quantum mechanics — that might have produced our results by exploiting this loophole would’ve had to be in place long before even there was a planet Earth, let alone an MIT,” adds David Kaiser, the Germeshausen Professor of the History of Science and professor of physics at MIT. “So we’ve pushed any alternative explanations back to very early in cosmic history.”

Guth and Kaiser’s co-authors include Anton Zeilinger and members of his group at the Austrian Academy of Sciences and the University of Vienna, as well as physicists at Harvey Mudd College and the University of California at San Diego.

A decision, made billions of years ago

In 2014, Kaiser and two members of the current team, Jason Gallicchio and Andrew Friedman, proposed an experiment to produce entangled photons on Earth — a process that is fairly standard in studies of quantum mechanics. They planned to send the two members of each entangled pair in opposite directions, toward light detectors that would each measure the photon using a polarizer. Researchers would measure the polarization, or orientation, of each incoming photon’s electric field by setting the polarizer at various angles and observing whether the photon passed through — an outcome for each photon that researchers could compare to determine whether the particles showed the hallmark correlations predicted by quantum mechanics.

The team added a unique step to the proposed experiment, which was to use light from ancient, distant astronomical sources, such as stars and quasars, to determine the angle at which to set each respective polarizer. As each entangled photon was in flight, heading toward its detector at the speed of light, researchers would use a telescope located at each detector site to measure the wavelength of a quasar’s incoming light. If that light was redder than some reference wavelength, the polarizer would tilt at a certain angle to make a specific measurement of the incoming entangled photon — a measurement choice that was determined by the quasar. If the quasar’s light was bluer than the reference wavelength, the polarizer would tilt at a different angle, performing a different measurement of the entangled photon.
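
A minimal sketch of that decision logic, with a hypothetical reference wavelength and hypothetical polarizer angles (the experiment's actual values are not reproduced here):

```python
# Illustrative sketch of the "cosmic" setting choice. The reference
# wavelength and the two polarizer angles below are placeholder values,
# not the ones used in the experiment.

REFERENCE_WAVELENGTH_NM = 700.0   # hypothetical red/blue dividing line

ANGLE_IF_RED_DEG = 0.0            # hypothetical polarizer setting
ANGLE_IF_BLUE_DEG = 45.0          # hypothetical polarizer setting


def choose_polarizer_angle(quasar_photon_wavelength_nm: float) -> float:
    """Pick a measurement setting from the color of an incoming quasar photon.

    Redder than the reference selects one angle; bluer selects the other.
    The setting is thus fixed by light emitted billions of years ago,
    not by anything in the experimenters' recent past.
    """
    if quasar_photon_wavelength_nm > REFERENCE_WAVELENGTH_NM:
        return ANGLE_IF_RED_DEG
    return ANGLE_IF_BLUE_DEG


# Example: a 650 nm quasar photon counts as "blue" here and selects 45 degrees.
print(choose_polarizer_angle(650.0))  # -> 45.0
```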

In their previous experiment, the team used small backyard telescopes to measure the light from stars as close as 600 light years away. In their new study, the researchers used much larger, more powerful telescopes to catch the incoming light from even more ancient, distant astrophysical sources: quasars whose light has been traveling toward the Earth for at least 7.8 billion years — objects that are incredibly far away and yet are so luminous that their light can be observed from Earth.

Tricky timing

On Jan. 11, 2018, “the clock had just ticked past midnight local time,” as Kaiser recalls, when about a dozen members of the team gathered on a mountaintop in the Canary Islands and began collecting data from two large, 4-meter-wide telescopes: the William Herschel Telescope and the Telescopio Nazionale Galileo, both situated on the same mountain and separated by about a kilometer.

One telescope focused on a particular quasar, while the other telescope looked at another quasar in a different patch of the night sky. Meanwhile, researchers at a station located between the two telescopes created pairs of entangled photons and beamed particles from each pair in opposite directions toward each telescope.

In the fraction of a second before each entangled photon reached its detector, the instrumentation determined whether a single photon arriving from the quasar was more red or blue, a measurement that then automatically adjusted the angle of a polarizer that ultimately received and detected the incoming entangled photon.

“The timing is very tricky,” Kaiser says. “Everything has to happen within very tight windows, updating every microsecond or so.”

Demystifying a mirage

The researchers ran their experiment twice, with two different pairs of quasars, each run lasting around 15 minutes. The two runs yielded measurements of 17,663 and 12,420 pairs of entangled photons, respectively. Within hours of closing the telescope domes and looking through preliminary data, the team could tell there were strong correlations among the photon pairs, beyond the limit that Bell calculated, indicating that the photons were correlated in a quantum-mechanical manner.

Guth led a more detailed analysis to calculate the chance, however slight, that a classical mechanism might have produced the correlations the team observed.

He calculated that, for the best of the two runs, the probability that a mechanism based on classical physics could have achieved the observed correlation was about 10 to the minus 20 — that is, about one part in one hundred billion billion, “outrageously small,” Guth says. For comparison, researchers have estimated the probability that the discovery of the Higgs boson was just a chance fluke to be about one in a billion.
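
To put those two numbers side by side:

\[
p_{\text{classical}} \approx 10^{-20} = \frac{1}{100{,}000{,}000{,}000{,}000{,}000{,}000},
\]

which is about \(10^{11}\), a hundred billion, times smaller than the roughly one-in-a-billion (\(10^{-9}\)) probability quoted for the Higgs discovery being a fluke.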

“We certainly made it unbelievably implausible that a local realistic theory could be underlying the physics of the universe,” Guth says.

And yet, there is still a small opening for the freedom-of-choice loophole. To narrow it even further, the team is entertaining ideas of looking even further back in time, to use sources such as cosmic microwave background photons, relic radiation emitted in the early universe shortly after the Big Bang, though such experiments would present a host of new technical challenges.

“It is fun to think about new types of experiments we can design in the future, but for now, we are very pleased that we were able to address this particular loophole so dramatically. Our experiment with quasars puts extremely tight constraints on various alternatives to quantum mechanics. As strange as quantum mechanics may seem, it continues to match every experimental test we can devise,” Kaiser says.

This research was supported in part by the Austrian Academy of Sciences, the Austrian Science Fund, the U.S. National Science Foundation, and the U.S. Department of Energy.

Publication: Dominik Rauch, et al., “Cosmic Bell Test Using Random Measurement Settings from High-Redshift Quasars,” Physical Review Letters, 2018; doi:10.1103/PhysRevLett.121.080403


Science

From birds to fish, humans reshaping evolutionary history of species everywhere: paper


A theoretical biologist says the activities and presence of human beings have become one of the largest drivers of evolutionary change everywhere on the planet.


A white-winged crossbill sits in a tree near the Highwood Pass on Sept. 4, 2018. A new paper based on research from around the globe concludes that human beings have radically reshaped evolution for everything from birds to fish to plants. Image: Mike Drew/Postmedia

VANCOUVER — Swallows are evolving smaller, more manoeuvrable wings to help them dodge buildings and vehicles.

Some fish are growing mouths that are smaller and harder to hook.

Large animals from caribou to tuna are disappearing.

Meanwhile, it’s boom time for anything not too fussy about where it lives or what it eats.

“It’s a reshaping of the tree of life,” said Sarah Otto, a University of British Columbia researcher, whose paper was published Wednesday in the London-based journal Proceedings of the Royal Society.

Otto, a much-awarded and highly regarded theoretical biologist, says the activities and presence of human beings have become one of the largest drivers of evolutionary change everywhere on the planet.

“Human impacts on the world are not just local,” she said. “They are changing the course of evolutionary history for all species on the planet, and that’s a remarkable concept to ponder.”

Earth scientists have long discussed the idea of the Anthropocene — a period of Earth’s history defined by geological markers of human impact. Otto, after reviewing dozens of research papers, concludes the planet’s biology is becoming similarly marked as plants and animals respond to human pressure.

Her paper is replete with examples, from bird species slowly forgetting to migrate to mosquito populations adapted specifically to underground subway tunnels.

Backyard bird feeders are behind changes in the beak shape and strength of house finches. Different mammals are becoming nocturnal as a way to avoid human conflict. Introduced species change the ground rules for native plants and animals.

It’s a mistake to think evolution requires millennia, said Otto.

“Evolution happens really fast if the selection regimes are strong. We can see sometimes in plant populations evolutionary change in the course of years.”

If the changes come too fast for evolution to keep up, there’s always extinction.

Rates of species loss are now estimated to be 1,000 times higher than they were before human domination. More than one in five of all plant and animal species are considered at risk.

Extinctions have always happened. But Otto said they’re happening at such a pace and in response to such similar pressures that they are reducing the ability of evolution to respond to change.

“We’re losing the ability for evolution to bounce back.”

Forcing species into a human-formed box reduces variability, leaving evolution less to work with in response to future changes. And wiping species out removes them forever.

“If we’re eliminating the large-bodied mammals, even if humans went extinct on the planet, we’re not going to see an immediate return of ecosystems to have the right balance of small, medium and large species,” Otto said.

“We’re cutting off options. We’re cutting off options both within species by eliminating variability, and we’re also cutting off options at the tree of life level by cutting off species.”

Species that are doing well are generalists — crows, coyotes, dandelions.

“The ones that can both tolerate and thrive in human-altered environments,” said Otto. “The pigeons and the rats.”

The biggest single human-caused evolutionary pressure, Otto said, is climate change.

“The No. 1 thing we have to do is tackle climate change. If we don’t do that, we’re going to lose a lot more species.”

— By Bob Weber in Edmonton. Follow @row1960 on Twitter


Science

Nations gather to weigh the meaning of a kilogram



Carlos Sanchez, an expert in metrology at the National Research Council of Canada, works on the Kibble balance. Image: Handout

If things go as planned, by the end of this week, the world will have a new definition of the kilogram.

The change will not require any adjustments to bathroom scales, or alter the heft of a bag of potatoes. But the milestone will serve to make our universal unit of mass a lot more universal.

In short, instead of basing it on a lump of precious metal locked in a vault in France, scientists have decided to recast the kilogram as something truly immutable, tied by a mathematical umbilical cord to the fundamental constants of nature that have endured since the Big Bang banged.


For those who play in the rarefied world of high-precision metrology − the science of measurement − it just doesn’t get any better.

“I’ve always said this is the best time to be the chief metrologist for your country,” said Alan Steele, who plays that role for Canada as director-general of the National Research Council’s Metrology Research Centre in Ottawa.

Dr. Steele is set to cast Canada’s vote to adopt the new kilogram definition on Friday.

The kilogram was originally devised as part of the metric system, a byproduct of the French Revolution that sought to break with traditional measures such as the pound − an arbitrary quantity that harks back to the Roman libra − in favour of more scientifically derived units.

Initially, the kilogram was defined as the mass of a litre of water at the freezing point. But by 1889, the countries that were then part of the General Conference on Weights and Measures agreed that the value of the unit of mass had to be pegged to something that could be measured far more precisely.

They settled on a reference weight machined out of a platinum-iridium alloy that has served as the world’s prototype kilogram ever since.

Canada was not among the original signatories, having joined the conference in 1907. But this time around, Canada has played a key role in supplanting the prototype with a new definition that no longer depends on a physical artifact.


“It’s something we’re really proud of,” Dr. Steele said. “Metrology is about credibility and demonstrating you’re as good as you say you are.”

What Canada turns out to be good at is measuring an exceedingly small number, known as Planck’s constant, which serves as the fundamental increment of action in the universe. In theory, the constant emerges whenever you divide the energy of a particle of light by its frequency. In practice, it’s not so easy to measure. Right now, Canada holds the record for setting the value of the tiny constant with the least amount of uncertainty − about 9.1 parts per billion.
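
That relationship is the Planck–Einstein relation: for a photon of frequency \(\nu\) carrying energy \(E\),

\[
E = h\nu \quad\Longrightarrow\quad h = \frac{E}{\nu} \approx 6.626\,070\,15 \times 10^{-34}\ \mathrm{J\,s},
\]

where the numerical value shown is the one ultimately fixed as exact by the redefinition.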

The device that made it possible is the Kibble balance, a supercharged version of the standard laboratory scale. But instead of comparing the masses of two objects, the Kibble balance very precisely sets the mass of one object against the magnetic force generated by an electric current flowing through a coil of wire. The ingenious design bridges the mechanical realm with the electromagnetic, and thereby allows the kilogram to be bound firmly to Planck’s constant for all time.
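
In outline (a standard textbook description rather than the NRC's full procedure), the balance runs in two modes. In weighing mode, the magnetic force on the current-carrying coil balances the weight of the test mass; in velocity mode, the same coil is moved through the field and the induced voltage is recorded. Combining the two eliminates the hard-to-measure field-and-geometry factor \(BL\):

\[
mg = BL\,I \ \ \text{(weighing)}, \qquad U = BL\,v \ \ \text{(velocity)} \quad\Longrightarrow\quad m = \frac{U I}{g\,v}.
\]

Because the voltage and current are in turn realized with the Josephson and quantum Hall effects, both of which depend on Planck's constant, the mass is tied directly to \(h\).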

This has long been a goal of scientists who serve as the arbiters of measurement. The metre and the second have been defined by physical constants, such as the speed of light, for decades. But because of the difficulty of measuring Planck’s constant, the kilogram has been a holdout until now.

“It was only in the last two years or so that it became clear that we could vote for the redefinition,” said Michael Stock, director of physical metrology for the International Bureau of Weights and Measures in Sèvres, France. The effort, he said, is not driven by the needs of today but by the possibilities that more precise units of measurement open up for scientists in years to come.

Canada first got into the kilogram game in 2009 when it took over a Kibble balance from Britain and, in Dr. Steele’s words, “went to town and made almost every aspect of the experiment better and better.”


The work ultimately allowed Canada’s blue-chip measurement to be combined with those from a handful of other labs to set the value of Planck’s constant and, by extension, the kilogram.

Carlos Sanchez, who was part of the team that conducted the work at the NRC, said that the challenge was not just about being precise but about relentlessly beating down more than a dozen sources of uncertainty in the equipment for several years to get the cleanest possible result.

“It takes patience,” Dr. Sanchez said. “You have to have a plan, otherwise you can spend your life doing experiments that lead nowhere.”

Happily, the NRC’s experiments have led to the Palace of Versailles, where representatives from 60 countries have gathered to officially approve the new definition, not only for the kilogram but also for units of electric current (the ampere), temperature (the kelvin) and particulate quantity (the mole). A unanimous vote is expected, after which the global edifice of measurement, perhaps humanity’s greatest tribute to objective reality, will stand on firmer ground − all with nothing apparently having changed.

Which is precisely the point, Dr. Steele said. At the end of it all, “you want the kilogram to still weigh a kilogram.”


Science

Researchers walk back major ocean warming result


Scripps Pier after sunset in La Jolla, California. Image via Hayne Palmour IV/San Diego Union-Tribune/Los Angeles Times (http://www.latimes.com/science/sciencenow/la-sci-sn-oceans-heat-error-20181114-story.html).

This is good news. It is less certain today that Earth’s oceans are 60% warmer than we thought (although they may still be that warm). As reported in the Los Angeles Times today (November 14, 2018), researchers with UC San Diego’s Scripps Institution of Oceanography and Princeton University have had to walk back a widely reported scientific result – based on a paper published in Nature last month – that showed Earth’s oceans heating up dramatically faster than previously thought as a result of climate change.

The October 31 paper in Nature stated the oceans had warmed 60% more than outlined by the United Nations’ Intergovernmental Panel on Climate Change (IPCC). On November 6, mathematician Nic Lewis posted his criticisms of the paper at Judith Curry’s blog. Both Lewis and Curry are critics of the scientific consensus that global warming is ongoing and human-caused.

In his November 6 blog post, Lewis pointed out flaws in the October 31 paper. The authors of the October 31 paper now say they’ve redone their calculations and – although they find the ocean is still likely warmer than the estimate used by the IPCC – they agree that they “muffed” the range of probability. They can no longer support the earlier statement of a heat increase 60% greater than indicated. They now say the range of uncertainty is much larger – between 10% and 70% – as other studies have already found.

A correction has been submitted to Nature.

The Los Angeles Times reported that one of the co-authors of the paper – Ralph Keeling at the Scripps Institution of Oceanography – “took full blame” and thanked Lewis for alerting him to the mistake. Keeling told the Los Angeles Times:

When we were confronted with his insight it became immediately clear there was an issue there. We’re grateful to have it be pointed out quickly so that we could correct it quickly.

In the meantime, the Twitter-verse today has done the expected in a situation like this, where a widely reported and dramatic climate result has had to be walked back, with many commenters seizing on the error.

But cooler heads on Twitter and elsewhere in the media are also weighing in, pointing out – as has been necessary to point out time and again – that science is not a “body of facts.” Science is a process. Part of the reason scientists publish is so that other scientists can find errors in their work, so that the errors can be corrected.

All scientists know this. The Los Angeles Times explained it this way:

While papers are peer-reviewed before they’re published, new findings must always be reproduced before gaining widespread acceptance throughout the scientific community …

The Times quoted Gerald Meehl, a climate scientist at the National Center for Atmospheric Research in Boulder, Colorado, as saying:

This is how the process works. Every paper that comes out is not bulletproof or infallible. If it doesn’t stand up under scrutiny, you review the findings.

Bottom line: An error has been found in the October 31, 2018, paper published in Nature that reported an increase in ocean warming 60% greater than that estimated by the IPCC. The authors have acknowledged the error, and a correction has been submitted to Nature.

October 31 paper in Nature: Quantification of ocean heat uptake from changes in atmospheric O2 and CO2 composition

November 6 blog post by Nic Lewis: A major problem with the Resplandy et al. ocean heat uptake paper

Deborah Byrd

