
Science

The next generation of NASA’s space telescopes, and their impact on the next century of observational astronomy


Astronomers have turned their eye towards the future following the US National Academies’ latest decadal survey of astronomy and astrophysics, which recommended a new generation of space telescopes. Keith Cooper explores their prospects, and the lessons learned from the troubled development of the James Webb Space Telescope



Compare and contrast The Pillars of Creation as seen by the Hubble Space Telescope and the James Webb Space Telescope (JWST). On the left is Hubble’s iconic view, taken in visible light in 2014. On the right is the JWST’s new near-infrared view, released in October 2022. (Courtesy: NASA, ESA, CSA, STScI)

Christmas Day 2021 was a happy occasion for most astronomers around the world, as it marked the launch of the much-delayed James Webb Space Telescope (JWST). However, the fanfare surrounding its unfurling in space over the following month, as well as the subsequent jubilation over its first images, has masked a troubling problem in observational astronomy: much of the rest of NASA’s fleet of orbiting observatories is ageing. The Hubble Space Telescope has been working since 1990, while the Chandra X-ray Observatory was launched nearly a decade later. Meanwhile, their infrared compatriot, the Spitzer Space Telescope, launched in 2003, is no longer operating, having been shut down in 2020.

That’s why astronomers are worried that should something happen to one or more of these increasingly rickety telescopes, they could be cut off from whole swathes of the electromagnetic spectrum. With the shutdown of Spitzer, the far-infrared (160 μm) is already out of reach as the JWST only ventures into the mid-infrared at 26 μm. Similarly, the JWST is not optimized for observing visible or ultraviolet wavelengths like Hubble does. Sure, the forthcoming Nancy Grace Roman Space Telescope – formerly the Wide Field InfraRed Survey Telescope (WFIRST) – is an optical and near-infrared telescope, but its field of view is much wider than Hubble’s, meaning it is not geared for close-up, detailed work; nor does it have Hubble’s ultraviolet coverage.

Great observatories

To ensure our view of the universe across the spectrum remains bright, US astronomers are currently picking and choosing the next cohort of space telescopes. The prime recommendation of the latest astronomical decadal survey from the US National Academies of Sciences, Engineering and Medicine – the 614-page report Pathways to Discovery in Astronomy and Astrophysics for the 2020s (Astro2020) – is for plans to be put in place for a new generation of “great observatories” to begin launching in the 2040s. This echoes the era when Chandra, Hubble, Spitzer and the Compton Gamma-Ray Observatory (which operated between 1991 and 2000 and was succeeded in 2008 by the Fermi Gamma-ray Space Telescope) were being developed – a quartet heralded as the original “great observatories”.

 

Working alongside each other to study the universe, these telescopes have spearheaded NASA’s astrophysics research for decades. The reuse of this phrase “great observatories” in the new decadal survey is deliberate, says the survey’s co-chair, Fiona Harrison of the California Institute of Technology. “It’s to get across the point that panchromatic observations, from X-rays to infrared, are really essential for modern astrophysics,” she says. “A lot of the success of the [original] great observatories is that they were developed and launched one after the other, with overlapping observations.”

Building a successful space telescope is a long process, typically taking 25 years from the start of development to launch. Concept work for Hubble began in the 1960s, while plans for the JWST first came together in 1995, after the Hubble Deep Field images showed that the first galaxies are within reach of a larger telescope. The next generation of such space-based probes therefore won’t launch until the 2040s, at the earliest. But they will include the survey’s number one recommendation: a flagship mission to replace Hubble, drawing inspiration from two concepts – the Habitable Exoplanet Observatory (HabEx) and the Large UltraViolet, Optical and InfraRed (LUVOIR) telescope. Also on the drawing board are an X-ray mission and a telescope that can observe in the far-infrared.

Table of NASA mission timescales and costs

But given the precarious health of our current crop of space telescopes, and knowing the new missions won’t launch for another 20 years, shouldn’t astronomers have started planning for new great observatories years ago? “For sure,” says Steven Kahn of Stanford University, who chaired one of the panels in the decadal survey looking at future space telescopes. He cites the Constellation-X observatory – an X-ray space probe that was recommended as a follow-up to Chandra in the 2000 decadal survey, but never came to fruition because of the drawn-out development of the JWST, which sucked up all the astrophysics budget. “The JWST basically dominated the great observatory programme at NASA for two-and-a-half decades,” explains Kahn. “As a result, there wasn’t room to do a follow-on X-ray mission, or the kind of pioneering far-infrared mission that we’re envisaging.”

Winner takes it all 

Indeed, the JWST’s development saw many issues, including huge overruns in cost and development time, which almost saw the project cancelled. The memory of these mistakes looms large over the new decadal survey, influencing some of the recommendations made to restore balance to astrophysics in the US. But it wasn’t always like this. Kahn laments how, prior to the 2000 survey, just getting on the list of recommendations in a decadal survey was enough to virtually guarantee that your project or mission would happen. But in the modern era of $10bn telescopes, “you have to be number one or you’re not going to get it done” says Kahn. “The problem is that in this winner-takes-all environment, everybody wants to throw all the bells and whistles they can onto a project because if you think you’re only going to get one shot at a big mission in the next 50 years, you want to make it count.”

It’s this way of thinking that can lead to the problems the JWST both faced and caused. The more complex a mission design becomes, the more instruments and capabilities you want it to have to make it worthwhile – which means it grows more expensive and takes longer to develop. “All of which gets us back into this vicious cycle of winner takes all,” continues Kahn.

Harrison agrees, emphasizing that this new decadal survey is an attempt to try and change US astronomy’s approach. “For a decadal survey to say, this is the number-one thing, we need to do it no matter what, at whatever cost it ends up being, is not a responsible approach,” she says. In an attempt to counter this, the recent survey makes a number of new proposals. Among them is the idea that missions should be designed in tune with specific science priorities, rather than allowing the mission concept to run away with itself, with all the “bells and whistles”, to quote Kahn.

Artist concepts of Lynx and Origins

For example, one of the key science questions that Kahn’s panel looked at was the way in which active supermassive black holes in distant, dusty galaxies influence star formation. The accretion of matter onto such black holes would be detectable to a high-angular-resolution X-ray telescope, while a far-infrared spectroscopic mission would be able to peer through the dust and probe specific spectral lines related to star formation and feedback from black-hole winds. The hope is that the two missions could be launched within a few years of one another, and operate in unison. However, what shape those missions will take is still up in the air.

Prior to the decadal survey there were two mission concepts on the table: the Lynx X-ray Observatory, and the Origins Space Telescope, which would have operated at mid- to far-infrared wavelengths with a mirror between 6 and 9 m in diameter. Each was estimated to cost about $5bn, but the decadal survey concluded that these costs were underestimates and that the missions’ science capabilities didn’t quite fit the requirements that the panel was looking for.

Flagship missions

And here enters one of the decadal survey’s other innovations – namely, a new class of space telescope referred to as “probe-class”, with budgets of a few billion dollars. “We have to acknowledge that if things were all going to be as expensive as JWST, it would be difficult to have all the great observatories operating at the same time,” says Marcia Rieke of the University of Arizona, who led the second panel on space telescopes, focusing on the optical and near-infrared regime. “The best way instead might be to have one flagship mission, and then have the other parts of the electromagnetic spectrum covered by probe missions.”

Indeed, any possible X-ray and far-infrared probe-class missions could also be joined by a probe-class ultraviolet telescope. Improvements in mirror coatings and detectors over the last few decades mean that a 1.5 m telescope could actually be more sensitive than Hubble at ultraviolet wavelengths. “That would provide some robustness against Hubble out-and-out failing,” says Rieke.

Timeline of missions recommended in NASA's decadal survey

To help develop these future space telescopes, whether they proceed as $10bn behemoths or go forward as more modest (but still ambitious) probe missions, the decadal survey recommends that NASA create a new Great Observatories Mission and Technology Maturation Programme. It would not just develop the technology, but also “mature the mission concepts”, says Harrison. For its part, NASA is already holding workshops as part of this new programme and has produced a draft call for probe missions.

If the X-ray and far-infrared missions – nicknamed “Fire” and “Smoke” for now – are to be probe-class, then the flagship great observatory will be the long-awaited direct replacement for the Hubble Space Telescope. The concept that leads the way is LUVOIR, and two versions of the telescope have been proposed: either a wildly ambitious 15 m telescope, or an 8 m telescope, the latter of which would still be the largest space telescope ever launched.

Other Earths

For cost and practicality reasons, the decadal survey recommended that the 15 m version fall by the wayside, and that the final design meld the best parts of both LUVOIR and HabEx. The key science goal of this telescope, explains Rieke, is that it has to be able to detect Earth-mass planets in the habitable zone of stars. To that end, Rieke’s panel engaged in a discussion with the exoplanet community about how many potentially habitable planets could be detected as a function of the size of the telescope.

Artist's concept of LUVOIR

“As a group, you ask: what are the key science goals? What level of sensitivity is needed? What’s the smallest telescope that will do the job?” says Rieke. The answer she got back was that a 6–8 m-aperture telescope is about as small as you dare go if you want to find potentially habitable exoplanets.

Success isn’t just about the size of the telescope though; its instruments have to be up to scratch too. Successfully imaging Earth-sized planets close to their stars will require the telescope to carry a coronagraph. Exoplanets the size of Earth normally cannot be imaged because the glare of their star is too overpowering. A coronagraph blocks the light of the star, making it easier to see any planets in attendance. They have been a staple of studies of the Sun for decades – their name comes from blocking the Sun’s disc so astronomers can see the solar corona. But devising a coronagraph that can precisely block the bright light of a star – essentially a point source – while revealing planets just milliarcseconds away, at a contrast between the star’s glare and the planets’ light of one part in 10¹⁰, is “quite a step beyond anything that we’ve done before”, says Rieke.
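
A rough sense of the scale of that challenge can be sketched with a back-of-envelope calculation. The numbers below are illustrative assumptions, not from the survey: an Earth twin orbiting a Sun-like star 10 parsecs away.

```python
import math

# Illustrative assumptions (not from the survey): an Earth twin
# orbiting a Sun-like star 10 parsecs from us.
AU_M = 1.495978707e11   # astronomical unit, metres
PC_M = 3.0857e16        # parsec, metres
R_EARTH_M = 6.371e6     # Earth radius, metres

# Angular separation of planet and star (small-angle approximation).
separation_rad = AU_M / (10 * PC_M)
separation_mas = separation_rad * (180 / math.pi) * 3600 * 1000

# Reflected-light contrast ~ (albedo / 4) * (R_planet / a)**2,
# a rough geometric estimate with an Earth-like albedo of 0.3.
contrast = (0.3 / 4) * (R_EARTH_M / AU_M) ** 2

print(f"separation ≈ {separation_mas:.0f} milliarcseconds")
print(f"planet/star contrast ≈ {contrast:.1e}")
```

The result – a separation of roughly 100 milliarcseconds at a contrast near 10⁻¹⁰ – is exactly the regime Rieke describes, and it shrinks further for more distant stars.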

Beyond space, telescopes on the ground

Artist's concept of the completed Giant Magellan Telescope

Not all of the decadal survey’s recommendations are related to giant telescopes in space. Indeed, some of them are giant telescopes firmly rooted on Earth. For example, the controversial Thirty Meter Telescope, to be built on Mauna Kea in Hawaii despite the protests of some native Hawaiians, continues to move forward. So too does the Giant Magellan Telescope, which is under construction in Chile and will feature seven 8.4 m mirrors to give an effective diameter of 24.5 m.

The survey also recommends that the Next Generation Very Large Array – 244 radio dishes of 18 m diameter and 19 dishes of 6 m diameter spread across the US south-west – should start being built by the end of the decade. It will replace the ageing Very Large Array in New Mexico and the Very Long Baseline Array of dishes across the US. Upgrades to the Laser Interferometer Gravitational-Wave Observatory (LIGO) and plans for an eventual successor are also recommended.

Meanwhile, cosmologists will be heartened to hear that the survey also calls for a new ground-based observatory, dubbed the CMB Stage 4 observatory, to detect polarization in the cosmic microwave background radiation to search for evidence of primordial gravitational waves that resulted from cosmic inflation in the earliest moments of the universe.

Finally, back in space, the highest priority for medium-scale missions is a fast-response time-domain and multimessenger programme to replace NASA’s Swift spacecraft and detect supernovae, gamma-ray bursts, kilonovae and various other kinds of astronomical transients. Crucially, the missions in this new programme need to be able to work with and support the ground-based observations of LIGO, the Cherenkov Telescope Array and the IceCube neutrino detector, for which a “Generation 2” detector has also been recommended.

Sufficiently funded?

The general response to the decadal survey’s recommendations has been mostly positive, with NASA, the National Optical-Infrared Astronomy Research Laboratory (NOIRLab) and the National Radio Astronomy Observatory (NRAO) all giving it their seal of approval. The next step, says Harrison, is to convince politicians to part with the funds that will be needed to make the great observatories possible.


“Certainly a focus now for myself and Robert Kennicutt [Harrison’s fellow co-chair from the University of Arizona and Texas A&M University] is to try and articulate to Congress the excitement of the compelling projects recommended by the survey,” she says. “It was a positive response from NASA, and it wants to make the recommendations happen, but the budget has to be there.”

Should that money be forthcoming, then Rieke estimates the funding required to mature the technology for the optical telescope to be about half a billion dollars. “We would then be poised, near the end of this decade, to have all the technology ducks sitting in a row and we’ll be able to enter the construction phase,” she says.

The timescales involved are phenomenal. If Hubble and Chandra are anything to go by, the next-generation telescopes launched in the 2040s could still be operational in the 2070s or beyond. The decadal survey’s recommendations are therefore not just important for the next 10 years of astronomy, but for their impact on much of this century. There was therefore tremendous pressure on the survey to have got it right.

“That’s where it’s important to pick ambitious goals,” says Rieke. “You have to identify something that’s so important that everyone agrees, and is enough of a step forward that something else isn’t going to overtake you while you’re doing it.” History will judge whether this decadal survey got its key decisions correct, but from today’s perspective, the future of astrophysics promises to be an exciting one.


Physicists Create ‘the Smallest, Crummiest Wormhole You Can Imagine’ – The New York Times



In an experiment that ticks most of the mystery boxes in modern physics, a group of researchers announced on Wednesday that they had simulated a pair of black holes in a quantum computer and sent a message between them through a shortcut in space-time called a wormhole.

Physicists described the achievement as another small step in the effort to understand the relation between gravity, which shapes the universe, and quantum mechanics, which governs the subatomic realm of particles.

“This is important because what we have here in its construct and structure is a baby wormhole,” said Maria Spiropulu, a physicist at the California Institute of Technology and the leader of a consortium called Quantum Communication Channels for Fundamental Physics, which conducted the research. “And we hope that we can make adult wormholes and toddler wormholes step-by-step.”


In their report, published Wednesday in Nature, the researchers described the result in measured words: “This work is a successful attempt at observing traversable wormhole dynamics in an experimental setting.”

The wormhole that Dr. Spiropulu and her colleagues created and exploited is not a tunnel through real physical space but rather through an “emergent” two-dimensional space. The “black holes” were not real ones that could swallow the computer but lines of code in a quantum computer. Strictly speaking, the results apply only to a simplified “toy model” of a universe — in particular, one that is akin to a hologram, with quantum fields on the edge of space-time determining what happens within, sort of in the way that the label on a soup can describes the contents.

To be clear: The results of this experiment do not offer the prospect anytime soon, if ever, of a cosmic subway through which to roam the galaxy like Jodie Foster in the movie “Contact” or Matthew McConaughey in “Interstellar.”

“I guess the key question, which is perhaps hard to answer, is: Do we say from the simulation it’s a real black hole?” Daniel Jafferis, a physics professor at Harvard, said. “I kind of like the term ‘emergent black hole.’”

He added: “We are just using the quantum computer to find out what it would look and feel like if you were in this gravitational situation.” He and Alexander Zlokapa, a doctoral student at the Massachusetts Institute of Technology, are the lead authors of the paper.

Physicists reacted to the paper with interest and caution, expressing concern that the public and media would mistakenly think that actual physical wormholes had been created.

“The most important thing I’d want New York Times readers to understand is this,” Scott Aaronson, a quantum computing expert at the University of Texas in Austin, wrote in an email. “If this experiment has brought a wormhole into actual physical existence, then a strong case could be made that you, too, bring a wormhole into actual physical existence every time you sketch one with pen and paper.”

Daniel Harlow, a physicist at M.I.T. who was not involved in the experiment, noted that the experiment was based on a model of quantum gravity that was so simple, and unrealistic, that it could just as well have been studied using a pencil and paper.

“So I’d say that this doesn’t teach us anything about quantum gravity that we didn’t already know,” Dr. Harlow wrote in an email. “On the other hand, I think it is exciting as a technical achievement, because if we can’t even do this (and until now we couldn’t), then simulating more interesting quantum gravity theories would CERTAINLY be off the table.” Developing computers big enough to do so might take 10 or 15 years, he added.

Leonard Susskind, a physicist at Stanford University who was not part of the team, agreed. “They’re learning that they could do this experiment,” he said, adding: “The really interesting thing here is the possibility of analyzing purely quantum phenomena using general relativity, and who knows where that’s going to go.”

Albert Einstein at the Carnegie Institute of Technology, now known as Carnegie Mellon University, in Pittsburgh in 1934.Pictorial Press Ltd., via Alamy

Wormholes entered the physics lexicon in 1935 as one of the weirder predictions of Albert Einstein’s general theory of relativity, which describes how matter and energy warp space to create what we call gravity. That year Einstein and his colleague, Nathan Rosen, showed in a paper that shortcuts through space-time, connecting pairs of black holes, could exist. The physicist John Wheeler later called these connectors “wormholes.”

Originally it seemed that wormholes were effectively useless; theory held that they would slam shut the instant anything entered them. They have never been observed outside of science fiction.

A month earlier that same year, in 1935, Einstein, Rosen and Boris Podolsky made another breakthrough, one they thought would discredit the chancy nature of quantum mechanics. They pointed out that quantum rules permitted what Einstein called “spooky action at a distance.” Measuring one of a pair of particles would determine the results of measuring the other particle, even if the two were light-years apart. Einstein thought this prediction was absurd, but physicists now call it “entanglement” and use it every day in the lab.

Until a few years ago, such quantum tricks weren’t thought to have anything to do with gravity. As a result, physicists were left with no theory of “quantum gravity” to explain what happened when the realms of inner space and outer space collided, as in the Big Bang or inside black holes.

But in 2013 Juan Maldacena, a theoretical physicist at the Institute for Advanced Study in Princeton, and Dr. Susskind proposed that these two phenomena — spooky action and wormholes — were actually two sides of the same coin, each described in a different but complementary mathematical language.

Those spooky, entangled particles, by this logic, were connected by equally mysterious wormholes. Quantum mechanics could be enlisted to study gravity, and vice versa. The equations that describe quantum phenomena turned out to have analogues in the Einsteinian equations for gravity.

“It’s mostly a matter of taste which description you use because they give exactly the same answer,” Dr. Jafferis said. “And that was an incredible discovery.”

In a quantum computer, physicists use a circuit of operations called gates to open a shortcut in an imaginary space between qubits representing two black holes and send messages between them.Andrew Mueller/INQNET

The recent wormhole experiment sought to employ the mathematics of general relativity to examine an aspect of quantum magic, known as quantum teleportation, to see if some new aspect of physics or gravity might be revealed.

In quantum teleportation, physicists use a set of quantum manipulations to send information between two particles — inches or miles apart — that are entangled in a pair, without the physicists knowing what the message is. The technology is expected to be the heart of a next-generation, unhackable “quantum internet.”

Physicists like to compare the teleportation process to two cups of tea. Drop a cube of sugar into one teacup, and it promptly dissolves — then, after a tick of the quantum clock, the cube reappears intact in the other teacup.

The experiment became conceivable after a pair of papers by Dr. Susskind and, independently, by Dr. Jafferis, Ping Gao of M.I.T., and Aron Wall, a theoretical physicist at the University of Cambridge. They suggested a way that wormholes could be made traversable, after all. What was needed, Dr. Gao and his collaborators said, was a small dose of negative energy at the exit end of the wormhole to prop open the hatch long enough for information to escape.

In classical physics, there is no such thing as negative energy. But in quantum theory, energy can be negative, generating an antigravitational effect. For example, so-called virtual particles, which flit in and out of existence using energy borrowed from empty space, can fall into a black hole, carrying a debt to nature in the form of energy that the black hole must then pay back. This slow leak, Stephen Hawking calculated in 1974, causes the black hole to lose energy and shrink.

When Dr. Spiropulu proposed trying to recreate this wormhole magic on a quantum computer, her colleagues and sponsors at the Department of Energy “thought I was completely nuts,” she recalled. “But Jafferis said, Let’s do it.”

One clue that the researchers were actually recording “wormholelike” behavior was that signals emerged from the other end of the wormhole in the order that they went in.Andrew Mueller/INQNET

In ordinary computers, including the phone in your pocket, the currency of calculation is bits, which can be ones or zeros. Quantum computers run on qubits, which can be 0 or 1 or anywhere in between until they are measured or observed. This makes quantum computers super powerful for certain kinds of tasks, like factoring large numbers and (maybe one day) cracking cryptographic codes. In essence, a quantum computer runs all the possible variations of the program simultaneously to arrive at a solution.

“We make uncertainty an ally and embrace it,” Dr. Spiropulu said.

To reach their full potential, quantum computers will need thousands of working qubits and a million more “error correction” qubits. Google hopes to reach such a goal by the end of the decade, according to Hartmut Neven, head of the company’s Quantum Artificial Intelligence lab in Venice, Calif., who is also on Dr. Spiropulu’s team.

The Caltech physicist and Nobel laureate Richard Feynman once predicted that the ultimate use of this quantum power might be to investigate quantum physics itself, as in the wormhole experiment.

“I’m excited to see that researchers can live out Feynman’s dream,” Dr. Neven said.

The wormhole experiment was carried out on a version of Google’s Sycamore 2 computer, which has 72 qubits. Of these, the team used only nine to limit the amount of interference and noise in the system. Two were reference qubits, which played the roles of input and output in the experiment.

The seven other qubits held the two copies of code describing a “sparsified” version of an already simple model of a holographic universe called SYK, named after its three creators: Subir Sachdev of Harvard, Jinwu Ye of Mississippi State University and Alexei Kitaev of Caltech. Both SYK models were packed into the same seven qubits. In the experiment these SYK systems played the role of two black holes, one by scrambling the message into nonsense — the quantum equivalent of swallowing it — and then the other by popping it back out.

“Into this we throw a qubit,” said Joseph Lykken, a physicist at the Fermi National Accelerator Laboratory and a member of the team, referring to the input message – the quantum analog of a series of ones and zeros. This qubit interacted with the first copy of the SYK system; its meaning was scrambled into random noise and it disappeared.

Then, in a tick of the quantum clock, the two SYK systems were connected and a shock of negative energy went from the first system to the second one, briefly propping open the latter.

The signal then reappeared in its original unscrambled form — in the ninth and last qubit, attached to the second SYK system, which represented the other end of the wormhole.

One clue that the researchers were actually recording “wormholelike” behavior, Dr. Lykken said, was that signals emerged from the other end of the wormhole in the order that they went in.

In a Nature article accompanying Dr. Jafferis’s paper, Dr. Susskind and Adam Brown, a physicist at Stanford, noted that the results might shed light on some still-mysterious aspects of ordinary quantum mechanics. For instance, after the sugar cube dissolves in the first teacup, why does it reappear in the other cup in its original form?

“The surprise is not that the message made it across in some form, but that it made it across unscrambled,” the two authors wrote.

The easiest explanation, they added, is that the message went through a wormhole, albeit a “really short” one, Dr. Lykken said in an interview. In quantum mechanics, the shortest conceivable length in nature is 10⁻³³ centimeters, the so-called Planck length. Dr. Lykken calculated that their wormhole was maybe only three Planck lengths long.
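
That Planck-length figure of about 10⁻³³ centimeters follows directly from the fundamental constants; a quick check:

```python
import math

# The Planck length, l_P = sqrt(hbar * G / c**3), computed from
# standard values of the fundamental constants.
HBAR = 1.054571817e-34   # reduced Planck constant, J s
G = 6.67430e-11          # gravitational constant, m^3 kg^-1 s^-2
C = 2.99792458e8         # speed of light, m/s

planck_length_m = math.sqrt(HBAR * G / C**3)
planck_length_cm = planck_length_m * 100

print(f"Planck length ≈ {planck_length_cm:.2e} cm")
```

The value comes out near 1.6 × 10⁻³³ cm, so a three-Planck-length wormhole is shorter than anything that could ever be probed directly.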

“It’s the smallest, crummiest wormhole you can imagine making,” he said. “But that’s really cool because now we’re clearly doing quantum gravity.”


Distant black hole is caught in the act of annihilating a star – Edmonton Journal




WASHINGTON — Astronomers have detected an act of extreme violence more than halfway across the known universe as a black hole shreds a star that wandered too close to this celestial savage. But this was no ordinary instance of a ravenous black hole.


It was one of only four examples – and the first since 2011 – of a black hole observed in the act of tearing apart a passing star in what is called a tidal disruption event and then launching luminous jets of high-energy particles in opposite directions into space, researchers said. And it was both the furthest and brightest such event on record.



Astronomers described the event in studies published on Wednesday in the journals Nature and Nature Astronomy.

The culprit appears to be a supermassive black hole, believed to be hundreds of millions of times the mass of our sun, located roughly 8.5 billion light years away from Earth. A light year is the distance light travels in a year, 5.9 trillion miles (9.5 trillion km).
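
Those light-year figures are easy to verify from the speed of light and the length of a year:

```python
# A quick check of the light-year conversions quoted in the article,
# using the speed of light times one Julian year.
C_KM_PER_S = 299_792.458            # speed of light, km/s (exact)
JULIAN_YEAR_S = 365.25 * 24 * 3600  # seconds in a Julian year

light_year_km = C_KM_PER_S * JULIAN_YEAR_S
light_year_miles = light_year_km / 1.609344   # km per mile (exact)

print(f"1 light year ≈ {light_year_km:.2e} km")
print(f"             ≈ {light_year_miles:.2e} miles")
```

The result rounds to the article’s 9.5 trillion km and 5.9 trillion miles.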

“We think that the star was similar to our sun, perhaps more massive but of a common kind,” said astronomer Igor Andreoni of the University of Maryland and NASA’s Goddard Space Flight Center, lead author of one of the studies.


The event was detected in February through the Zwicky Transient Facility astronomical survey using a camera attached to a telescope at the Palomar Observatory in California. The distance was calculated using the European Southern Observatory’s Very Large Telescope in Chile.

“When a star dangerously approaches a black hole – no worries, this will not happen to the sun – it is violently ripped apart by the black hole’s gravitational tidal forces – similar to how the moon pulls tides on Earth but with greater strength,” said University of Minnesota astronomer and study co-author Michael Coughlin.

“Then, pieces of the star are captured into a swiftly spinning disk orbiting the black hole. Finally, the black hole consumes what remains of the doomed star in the disk. In some very rare cases, which we estimated to be 100 times rarer, powerful jets of material are launched in opposite directions when the tidal disruption event occurs,” Coughlin added.
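
The physics Coughlin describes can be sketched with a standard back-of-envelope estimate. The numbers below are assumptions for illustration, not from the papers: the tidal disruption radius is roughly r_t ≈ R_star × (M_bh / M_star)^(1/3), and a star is shredded outside the horizon only if r_t exceeds the black hole’s Schwarzschild radius.

```python
# Back-of-envelope sketch (assumed numbers, not from the papers):
# tidal disruption radius r_t ≈ R_star * (M_bh / M_star)**(1/3)
# versus the Schwarzschild radius r_s = 2 * G * M_bh / c**2.
G = 6.67430e-11      # gravitational constant, m^3 kg^-1 s^-2
C = 2.99792458e8     # speed of light, m/s
M_SUN = 1.989e30     # solar mass, kg
R_SUN = 6.957e8      # solar radius, m

# Assume a Sun-like star and a 10^8 solar-mass black hole, the lower
# end of the "hundreds of millions" range quoted in the article.
m_bh = 1e8 * M_SUN

r_tidal = R_SUN * (m_bh / M_SUN) ** (1 / 3)
r_schwarz = 2 * G * m_bh / C**2

print(f"tidal radius         ≈ {r_tidal:.2e} m")
print(f"Schwarzschild radius ≈ {r_schwarz:.2e} m")
```

With these numbers the tidal radius sits only marginally outside the horizon; for non-spinning black holes much heavier than about 10⁸ solar masses, a Sun-like star would cross the horizon before being visibly disrupted at all.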


Andreoni and Coughlin said the black hole was likely spinning rapidly, which might help explain how the two powerful jets were launched into space at almost the speed of light.

Massachusetts Institute of Technology astronomer Dheeraj Pasham, lead author of the other study, said the researchers were able to observe the event very early on – within a week of the black hole starting to consume the doomed star.

While researchers detect tidal disruption events about twice per month, ones that produce jets are extremely rare. One of the jets emanating from this black hole seems to be pointing toward Earth, making it appear brighter than if it were heading in another direction – an effect called “Doppler boosting” that is similar to the enhanced sound of a passing police siren.

The supermassive black hole is believed to reside at the center of a galaxy – much as the Milky Way and most galaxies have one of these at their core. But the tidal disruption event was so bright that it obscured the light of the galaxy’s stars.

“At its peak, the source appeared brighter than 1,000 trillion suns,” Pasham said.

(Reporting by Will Dunham, Editing by Rosalba O’Brien)


Mars very visible in night sky as it moves closer to Earth – Ashcroft Cache Creek Journal


By Gary Boyle

Some three billion years ago, Mars is believed to have been a water world just like Earth. It possessed great oceans and was most likely on its way to forming life in one form or another. Water is made up of hydrogen, the most common element in the universe, and oxygen, the third most common. Water is extremely important to the development and sustenance of life as we know it.

Because Mars is only about half the diameter of Earth, it lost its internal heat faster and its molten core eventually stopped churning. Like Earth’s core, which generates the magnetic field around our planet, Mars’ core once produced a protective magnetic field; when that internal dynamo shut down, the solar wind gradually ate away at the atmosphere, and the red planet lost its water.

Early telescopic observations were made by the Italian astronomer Giovanni Schiaparelli in 1877, when Mars was in opposition, residing 56 million kilometres away, and he reported seeing canali (channels) on Mars. The word was mistranslated into English as “canals”, giving the impression of a possible civilization. Since then, the red planet has been the focus of the search for ancient life, and has also been an inspiration for science fiction writers and movie makers.

By the 2030s or 2040s, humans are expected to land on this fascinating world, looking for signs of life that might once have existed, even at the microbial level. After all, life is life. But Mars is in the news for another reason: it is currently a very visible object in the night sky.

Appearing as a bright-orange object rising in the northeast sky about forty-five minutes after the sun sets in the west, Mars is nicely placed amongst the bright winter constellations of Orion the Hunter, Taurus the Bull, etc. If you are still not sure where to look, any smartphone astronomy app will guide you.

So why is it so bright? Earth orbits the Sun in 365 days, whereas Mars takes 687 days. Just like a runner on the inside lane of a race track, Earth catches up with, and overtakes, slower Mars every 26 months. This upcoming opposition will occur on Dec. 8, at a separation of only 82 million kilometres. In the weeks after opposition, the distance between the two planets increases and Mars will slowly fade. Roughly every seventh opposition is an especially close one, as in 2003 and 2020. The next opposition occurs on Jan. 15, 2025.
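For readers who want to check the arithmetic, the 26-month figure follows directly from the two orbital periods. A quick back-of-the-envelope sketch in Python, using the round figures quoted above:

```python
# Synodic period: how often Earth "laps" Mars on the inside track.
# 1/S = 1/T_earth - 1/T_mars (the difference in the two angular rates)
T_EARTH = 365.25  # days per orbit
T_MARS = 687.0    # days per orbit

synodic_days = 1.0 / (1.0 / T_EARTH - 1.0 / T_MARS)
print(f"{synodic_days:.0f} days, or about {synodic_days / 30.44:.1f} months")
```

The result is about 780 days – roughly the 26 months between one Mars opposition and the next.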

Be sure to look at Mars the night before, on Dec. 7, as the Full Cold Moon will cover Mars for a little less than one hour. All of Canada, as well as much of the US except for Alaska and the southeastern states, will see this amazing sight. Throughout its 27.3-day orbit around the Earth, the moon moves its own width every hour. Over the course of a month it covers stars as seen through a telescope and, in rare events, bright planets. This should be a fantastic photo opportunity, as the disappearance and later reappearance should be quite evident.
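That “width every hour” figure is easy to verify from the moon’s orbital period and its apparent size in the sky. A minimal sketch, assuming the 27.3-day sidereal orbit and an average angular diameter of about half a degree:

```python
# The moon travels a full 360 degrees around the sky once per orbit
SIDEREAL_MONTH_HOURS = 27.3 * 24       # hours per orbit
MOON_ANGULAR_DIAMETER = 0.52           # degrees, on average

degrees_per_hour = 360.0 / SIDEREAL_MONTH_HOURS
print(f"{degrees_per_hour:.2f} degrees per hour, "
      f"vs. a moon width of about {MOON_ANGULAR_DIAMETER} degrees")
```

At roughly 0.55 degrees per hour against the stars, the moon does indeed shift by about one of its own diameters each hour – which is why an occultation like this one lasts a little under an hour.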

Clear skies!

Known as “The Backyard Astronomer”, Gary Boyle is an astronomy educator, guest speaker, and monthly columnist for the Royal Astronomical Society of Canada, as well as a STEM educator. In recognition of his public outreach in astronomy, the International Astronomical Union has honoured him with the naming of Asteroid (22406) Garyboyle. Visit his website at www.wondersofastronomy.com.
