NASA’s next great eye in the sky, the golden-mirrored James Webb Space Telescope, passed a key review this week, bringing it one step closer to launching in November and observing new parts of the cosmos for scientists here on Earth.
That’s good news for the United States’ space agency, which has spent the last several weeks trying to troubleshoot issues with its current window on the universe, the Hubble Space Telescope.
The storied telescope that has revolutionised our understanding of the cosmos for more than three decades is experiencing a technical glitch. According to NASA, the Hubble Space Telescope’s payload computer, which operates the spacecraft’s scientific instruments, went down suddenly on June 13.
As a result, the instruments on board meant to snap pictures and collect data are not currently functioning. The agency’s best and brightest have been working diligently to get the ageing telescope back online and have run a barrage of tests but still can’t seem to figure out what went wrong.
“It’s just the difficulty of trying to fix something orbiting 400 miles [about 644 kilometres] over your head instead of in your laboratory,” Paul Hertz, the director of astrophysics for NASA, told Al Jazeera.
“If this computer were in the lab, it would be really quick to diagnose it,” he explained. “All we can do is send a command, see what data comes out of the computer, and then send that data down and try to analyse it.”
When Hubble launched on April 24, 1990, scientists were excited to peer into the vast expanse of space with a new set of “eyes”, but they had no idea how much one telescope would change our understanding of the universe.
The telescope has looked into the far reaches of space, spying the most distant galaxy ever observed — one that formed just 400 million years after the big bang.
Hubble has also produced stunning galactic snapshots like the Hubble Ultra Deep Field.
Captured in one single photograph are hundreds of thousands of ancient galaxies that formed long before the Earth even existed — each galaxy a vast and thriving stellar hub, where hundreds of billions of stars were born, lived their lives, and died.
The light from these galaxies has taken billions of years to reach Hubble’s sensors, making the telescope a time machine of sorts: it shows us these galaxies as they were billions of years ago.
Hubble has also spied on our cosmic neighbours, discovering some of the moons around Pluto.
Its observations showed us that almost every galaxy has a supermassive black hole at its centre, and Hubble has also helped scientists create a vast three-dimensional map of an elusive, invisible form of matter that accounts for most of the matter in the universe.
Called dark matter, the enigmatic substance can’t be seen directly. Scientists only know it exists by measuring its effects on ordinary matter, and it was those measurements, made with Hubble’s suite of scientific instruments, that made the 3D map possible.
What went wrong
Scientists have been planning for Hubble’s inevitable demise for quite some time. Over the past 31 years, the telescope has seen its fair share of turmoil.
Shortly after it launched, NASA discovered that something wasn’t quite right: Hubble’s primary mirror was flawed. Fortunately, the problem could be fixed, as the telescope is the only one in NASA’s history that was designed to be serviced by astronauts.
Over its lifetime (and the course of the agency’s shuttle programme), groups of NASA astronauts have repaired and upgraded Hubble and its instruments five different times.
When the space shuttle retired in 2011, it meant that Hubble would be on its own. If the telescope were in trouble, ground controllers would need to troubleshoot remotely.
For a decade, that approach proved effective. Then came June 13.
Just after 4pm EDT (20:00 GMT), an issue with the observatory’s payload computer popped up, putting the telescope and its scientific instruments into safe mode.
Hubble has two payload computers on board — the main computer and a backup for redundancy. These computers, NASA Standard Spacecraft Computer-1 (NSSC-1) units, were installed during one of the telescope’s servicing missions in 2009; however, they were built in the 1980s.
They’re part of the Science Instrument Command and Data Handling (SI C&DH) unit, a module on the Hubble Space Telescope that communicates with the telescope’s science instruments and formats data for transmission to the ground. It also contains four memory modules (one primary and three backups).
The current unit is a replacement that was installed by astronauts on shuttle mission STS-125 in May 2009 after the original unit failed in 2008.
When the main computer went down in June, NASA tried to activate its backup, but both computers are experiencing the same glitch, which suggests the real issue is in another part of the telescope.
Currently, the team is looking at the various components of the SI C&DH, including the power regulator and the data formatting unit. If one of those pieces is the problem, then engineers may have to perform a more complicated series of commands to switch to backups of those parts.
NASA says it’s going to take some time to sort out the issue and switch over to the backup systems if necessary. That’s because turning on those backups is a riskier manoeuvre than anything the team has tried so far.
The operations team will need several days to see how the backup computer performs before it can resume normal operations. The backup hasn’t been used since its installation in 2009, but according to NASA, it was “thoroughly tested on the ground prior to installation on the spacecraft”.
Part of the trouble with Hubble is that the observatory was designed to be serviced directly. Without a space shuttle, there’s just no way to do so.
“The biggest difference between past issues and this one is there’s no way to replace parts now,” John Grunsfeld, a former NASA astronaut, told Al Jazeera.
But, he added, “The team working on Hubble are masters of engineering. I’m confident they will succeed.”
Looking to the future
The James Webb Space Telescope, scheduled to launch in November, is expected to expand upon Hubble’s legacy. The massive telescope, essentially a giant piece of space origami, will unfold its shiny golden mirrors and peer even further into the universe than Hubble ever could. Its infrared sensors will let scientists study stellar nurseries, the heart of galaxies and much more.
“#Webb moves a big step closer to launch!!! 🚀🛰️🔭 Webb has just successfully passed its ‘Final Mission Analysis Review’, moving it closer to seeing farther!” — ESA Webb Telescope (@ESA_Webb), July 1, 2021
Hubble has shown us that nearly all galaxies have supermassive black holes at their centres; the most luminous of these, actively feeding on surrounding matter, are called quasars. These incredibly bright objects can tell us a lot about galaxy evolution, as the jets and winds produced by a quasar help to shape its host galaxy.
Previous observations have shown that there is a correlation between the masses of supermassive black holes and the masses of their galaxies, meaning that quasars could help regulate star formation in their host galaxy.
“We see black holes at a time when the universe was only 800 million years old that are almost as massive as the biggest we see today, so they evolved extremely early,” Chris Willott of the Canadian Space Agency told Al Jazeera.
“By studying their galaxies, we can see what the impact of such extreme black holes is on the early formation of stars in these galaxies.”
Through Hubble’s eyes, scientists cannot detect individual stars in the galaxies with these ultra-bright quasars, but with Webb, scientists hope they will be able to see not only individual stars, but also the gas from which these stars form.
That means the Webb telescope has the potential to truly revolutionise our understanding of galaxy formation and evolution, the same way that Hubble did for our knowledge of the universe over the past three decades.
NASA’s Europa Clipper will fly on SpaceX’s Falcon Heavy – The Verge
The Europa Clipper got the green light from NASA in 2015. It will fly by the moon 45 times, giving researchers a tantalizing look at a world believed to harbor an ocean beneath its icy crust. The Clipper is equipped with instruments that will help scientists figure out whether the moon could support life.
For years, the Clipper was legally obligated to launch on NASA’s long-delayed Space Launch System (SLS). But with the SLS perpetually delayed and over budget, NASA has urged Congress to consider allowing the Europa Clipper to fly commercial. Switching to another vehicle could save up to $1 billion, NASA’s inspector general said in 2019.
NASA got permission to consider commercial alternatives to the SLS in the 2021 budget, and started officially looking for a commercial alternative soon after.
The SLS has powerful allies in Congress, who have kept the costly program alive for years, even as it blew past budgets and deadlines. The first flight of the SLS was originally supposed to happen in 2017. That mission — launching an uncrewed trip around the Moon — has since been pushed to November 2021, and keeping to that new schedule remains “highly unlikely” according to NASA’s Office of Inspector General, a watchdog agency.
SpaceX first launched its Falcon Heavy rocket in 2018, and started flying satellites in 2019. Earlier this year, NASA selected the rocket as the ride to space for two parts of a planned space station orbiting the Moon.
Researchers Develop Genome Techniques to Analyze Adaptation of Cattle – AZoCleantech
Jared Decker, a fourth-generation cattle farmer, has long seen cattle suffer health and productivity problems when they are moved from a region where their herd spent generations to a place with a different climate, grass or elevation.
As a researcher at the University of Missouri, Decker is exploring whether science can solve this problem, serving a dual purpose: improving cattle welfare and sealing a leak in a nearly $50 billion industry in the United States.
When I joined MU in 2013, I moved cattle from a family farm in New Mexico to my farm here in Missouri. New Mexico is hot and dry, and Missouri is also hot but has much more humidity. The cattle certainly didn’t do as well as they did in New Mexico, and that spurred me to think about how we could give farmers more information about what their animals need to thrive.
Jared Decker, Associate Professor and Wurdack Chair, Animal Genetics, College of Agriculture, Food and Natural Resources
The study was published in the journal PLOS Genetics on July 23rd, 2021.
Decker and his research team have found evidence that cattle are losing key environmental adaptations, a loss the researchers attribute to the lack of genetic information available to farmers.
After analyzing genetic material dating back to the 1960s, the team identified specific DNA variations associated with those adaptations that could someday be used to develop DNA tests for cattle. Such tests could help farmers judge how well their animals are adapted to one environment or another.
We can see that, for example, historically cows in Colorado are likely to have adaptations that ease the stress on their hearts at high altitudes. But if you bring in bulls or semen from a different environment, the frequency of those beneficial adaptations is going to decrease. Over generations, that cow herd will lose advantages that would have been very useful to a farmer in Colorado.
Jared Decker, Associate Professor and Wurdack Chair, Animal Genetics, College of Agriculture, Food and Natural Resources, University of Missouri
The research team included then-doctoral student Troy Rowan, who examined 60 years’ worth of bovine DNA data from tests of cryo-preserved semen produced by cattle breed associations. They observed that, over time, genes related to higher fertility and productivity became more common as a result of careful selection by farmers, while many genes related to environmental adaptations declined.
According to Decker, farmers are not to blame, as no affordable methods currently exist to determine how well cattle suit a specific environment. The study proposes easy-to-use cattle DNA tests focused on the particular adaptations it identified.
Those adaptations include resistance to vasoconstriction, the narrowing of blood vessels that occurs at high elevation and puts excessive stress on the heart; resistance to grass toxins that can likewise trigger vasoconstriction; and tolerance for high temperature or humidity. All of these traits tend to decline over generations when cattle are moved away from the environments that shaped them.
Sometimes, natural and artificial selection are moving in the same direction, and other times there is a tug of war between them. Efficiency and productivity have vastly improved in the last 60 years, but environmental stressors are never going to go away. Farmers need to know more about the genetic makeup of their herd, not only for the short-term success of their farm, but for the success of future generations.
Jared Decker, Associate Professor and Wurdack Chair, Animal Genetics, College of Agriculture, Food and Natural Resources
The first widely adopted genetic test for cattle was developed at the University of Missouri in 2007, and Decker and Rowan hope to build on that work. Both researchers grew up on farms, and both want to use their research to help farmers balance America’s farming traditions with the need for environmentally sound business practices.
“As a society, we must produce food more sustainably and be good environmental stewards. Making sure a cow’s genetics match their environment makes life better for cattle and helps farmers run efficient and productive operations. It’s a win-win,” concluded Decker.
Rowan, T. N., et al. (2021) Powerful detection of polygenic selection and evidence of environmental adaptation in US beef cattle. PLOS Genetics. doi.org/10.1371/journal.pgen.1009652.
'Eye of Sauron' volcano and other deep-sea structures discovered in underwater 'Mordor' – Livescience.com
Researchers exploring the Indian Ocean have discovered the remains of a collapsed underwater volcano with an uncanny resemblance to the all-seeing “Eye of Sauron” from J.R.R. Tolkien’s famous fantasy series “The Lord of the Rings,” as well as two other seafloor structures named after places in Tolkien’s Middle-earth.
The eye is actually an oval-shaped depression measuring 3.9 miles (6.2 kilometers) long by 3 miles (4.8 km) wide. Called a caldera, this giant divot is left over from the ancient collapse of a deep-sea volcano. The caldera is surrounded by a 984-foot-tall (300 meters) rim, giving the impression of eyelids, and an equally tall cone-shaped peak at the center, which looks like a pupil, according to The Conversation. The unusual structure is located 174 miles (280 km) southeast of Christmas Island (an Australian external territory off mainland Australia) at a depth of 10,170 feet (3,100 m).
A team of researchers discovered the structure while onboard the ocean research vessel Investigator, owned by Australia’s Commonwealth Scientific and Industrial Research Organisation (CSIRO), on the 12th day of an expedition to Australia’s Indian Ocean Territories. The researchers used multibeam sonar to create 3D maps of the caldera and the surrounding seafloor.
Like other calderas, this one formed when the peak of the original volcano collapsed, according to the researchers.
“The molten magma at the base of the volcano shifts upwards, leaving empty chambers [below],” chief scientist Tim O’Hara, senior curator at Museums Victoria in Australia, wrote in The Conversation. “The thin, solid crust on the surface of the dome then collapses, creating a large, crater-like structure.”
The area surrounding the volcanic crater is also home to two other noteworthy structures.
“Our volcanic ‘eye’ was not alone,” O’Hara wrote. “Further mapping to the south revealed a smaller sea mountain covered in numerous volcanic cones, and further still to the south was a larger, flat-topped seamount.”
Continuing the connection to Tolkien’s fantasy epic, the researchers named the cone-covered mountain Barad-dûr, after Sauron’s main stronghold, and the seamount Ered Lithui, after the Ash Mountains, both of which are found alongside the Eye of Sauron in the evil realm of Mordor.
The Ered Lithui seamount is part of a cluster of seamounts thought to date back about 100 million years, O’Hara wrote. The Ered Lithui seamount was once above the water’s surface, giving it its flat top, and it has gradually sunk to around 1.6 miles (2.6 km) below sea level.
Over millions of years, sand and sinking detritus — particulate matter, including plankton, excrement and other organic matter — have coated the seamount in a thick layer of sediment around 328 feet (100 m) deep. However, the caldera remains relatively uncovered, suggesting it may be significantly younger, O’Hara said.
“This sedimentation rate should have smothered and partially hidden the caldera,” O’Hara wrote. It also “looks surprisingly intact for a structure that should be 100 million years old.”
This freshness suggests that the volcano was created, and subsequently collapsed, after the seamount began sinking into the ocean.
“It is possible that volcanoes have continued to sprout long after the original foundation,” O’Hara wrote. “Our restless Earth is never still.”
Originally published on Live Science.