Technology moves fast, and scientists are already developing the next generation of computing: quantum computing. Over the past year, I’ve had the opportunity to speak several times with Bob Sutor, VP of IBM’s Quantum Ecosystem Development, about what is happening in the field. While quantum computing isn’t a household term, Sutor shared that it isn’t new. Quantum computing’s roots extend back to the early 1900s and the birth of quantum mechanics.
Quantum computing uses quantum mechanics concepts such as superposition, entanglement and interference. Yet, these terms are confusing for individuals not deeply rooted in physics. Rather than try to explain each of these, I suggest watching this video where WIRED challenged Dr. Talia Gershon, Director of Hybrid Cloud Infrastructure Research at IBM Research, to explain quantum computing to 5 different people. Gershon was previously IBM’s Senior Manager of Quantum Research. The video is brilliant.
Sutor also provided a simplified way of describing it when he said we can think of computing as being in two camps, which he called classical and quantum. The classical camp covers what we have today, such as processors, servers on the internet, mainframes, and high-performance computing. But why was quantum developed, and why do we care about it?
Sutor described how certain problems simply couldn’t be solved with a classical computer. Similar to what Gershon said in the video, traditional computing can run out of capacity to solve the problem. In other cases, classical computing takes too long to finish the computation. Hence, IBM and others are working on a different type of computer that removes those constraints.
Why don’t we have it today?
The move to quantum requires constructing different hardware, software, physical enclosures and even a new programming model. A quantum computer uses qubits, which are quantum bits. It’s an extension of the idea of zeros or ones, but with quantum, we have more than a binary choice of a zero or a one. One way to describe this is a game of heads or tails. When a coin lands, it’s either heads or tails. However, while it is spinning before it lands, it is neither heads nor tails. It’s both. This same principle applies to a qubit in computing, which can be in a state of zero, one, or a superposition that represents both. This flexible state provides extra dimensions in which to compute.
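The coin analogy can be made a little more concrete. A sketch in plain Python (this only simulates the amplitude bookkeeping of a single qubit, not a real quantum device; the function name is my own):

```python
import math

# A qubit's state is a pair of complex amplitudes (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1. Measuring it yields 0 with probability
# |alpha|^2 and 1 with probability |beta|^2.

def measurement_probabilities(alpha, beta):
    """Return (P(0), P(1)) for the qubit state alpha|0> + beta|1>."""
    return abs(alpha) ** 2, abs(beta) ** 2

# Classical-bit-like states: all the probability sits on one outcome.
print(measurement_probabilities(1, 0))  # behaves like a classical 0
print(measurement_probabilities(0, 1))  # behaves like a classical 1

# Equal superposition -- the "spinning coin": a 50/50 chance of 0 or 1.
s = 1 / math.sqrt(2)
print(measurement_probabilities(s, s))
```

Any pair of amplitudes on that unit circle is a valid state, which is the "extra dimension" a classical bit lacks.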
If we add the concept of entanglement (intertwining qubits so that their behavior is correlated), the number of states the system can represent doubles every time we add another qubit: from two to four to eight to 16, and so on. This exponential growth makes quantum computing well suited to certain complex problems.
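Both ideas can be sketched in a few lines of plain Python (again simulating the amplitude bookkeeping, not real hardware):

```python
import math

# An n-qubit register is described by 2**n amplitudes, so each added
# qubit doubles the size of the state the machine computes with.
for n in range(1, 5):
    print(n, "qubits ->", 2 ** n, "amplitudes")

# A Bell state entangles two qubits. Its four amplitudes cover
# |00>, |01>, |10>, |11>; only |00> and |11> carry weight, so the two
# measured bits always agree even though each looks random on its own.
bell = [1 / math.sqrt(2), 0.0, 0.0, 1 / math.sqrt(2)]
probs = [abs(a) ** 2 for a in bell]
print(probs)
```

Fifty qubits already require 2**50 amplitudes, roughly a quadrillion, which is why classical simulation runs out of room so quickly.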
But there are challenges
In the ideal world, you’d create a qubit, add additional qubits, and apply operations (instruction sets) to the qubits to achieve the desired outcome. However, it’s not quite that simple. As Scientific American put it, “Quantum computers are exceedingly difficult to engineer, build and program. As a result, they are crippled by errors in the form of noise, faults and loss of quantum coherence, which is crucial to their operation and yet falls apart before any nontrivial program has a chance to run to completion.”
Qubits are sensitive to heat and any outside interference. The qubits must be kept in a cold chamber because heat creates computing errors. For example, the IBM quantum computer sits in a chamber where the temperature is 0.015 Kelvin. As a comparison, outer space has an average temperature of 2.7 Kelvin.
Noise can cause errors that affect the computation. Some of these errors come from small manufacturing defects, while others occur if the system applies too much energy to a qubit. The physical design of the system has to shield the qubits from outside noise, and beyond that you have to perform what’s called error mitigation to reduce what remains. IBM published a paper in Nature describing a clever approach that uses noise itself to help eliminate noise.
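One widely discussed mitigation trick along these lines is zero-noise extrapolation: deliberately rerun the same circuit at amplified noise levels, then extrapolate the measured value back to the zero-noise limit. A toy sketch in plain Python, with made-up readings (not IBM's actual protocol):

```python
def extrapolate_to_zero_noise(points):
    """Linear least-squares fit of (noise_scale, value) pairs.

    Fits value = a + b * scale and returns a, the estimated value
    at noise scale zero.
    """
    n = len(points)
    sx = sum(s for s, _ in points)
    sy = sum(v for _, v in points)
    sxx = sum(s * s for s, _ in points)
    sxy = sum(s * v for s, v in points)
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    a = (sy - b * sx) / n
    return a

# Hypothetical readings: a true value of 1.0, degraded linearly as the
# noise is amplified from 1x to 2x.
readings = [(1.0, 0.80), (1.5, 0.70), (2.0, 0.60)]
print(extrapolate_to_zero_noise(readings))  # recovers 1.0
```

The machine can never run at zero noise, but by measuring how results degrade as noise increases, you can estimate what the noiseless answer would have been.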
It’s about quality, not quantity
Like the processor wars of old, it’s easy to get caught up in the discussion of which vendor’s system has more qubits than another’s. The question in quantum should not be how many qubits a computer supports, but how many high-quality qubits it supports. A good quantum system starts at the device level. Scientists and engineers build the qubits for a device, attempt to minimize the errors in each qubit, and optimize how each is connected to the others. For example, your television isn’t going to have a clear picture if you use a poor-quality HDMI cable. It’s the same with quantum. Each step is essential to ensure the highest quality.
Since quantum computing differs from classical computing, qubits require new metrics for measuring quality. Sutor said IBM uses a metric called Quantum Volume (QV) to measure a quantum computer’s power. The QV method quantifies the largest random circuit of equal width and depth that the system can implement with high performance. Quantum computing systems with high-fidelity operations, high connectivity, large calibrated gate sets, and circuit-rewriting software toolchains should have higher quantum volumes. Setting the technical terms aside, this shifts the dialogue from merely stating the number of qubits to talking about the number of stable qubits (coherence) that can interact (connectivity) with each other as a system. Sutor said IBM has been able to double this power year over year since 2017.
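IBM's full QV protocol involves random model circuits and statistical pass criteria, but the headline arithmetic is simple: find the largest n for which the machine reliably runs circuits that are n qubits wide and n layers deep, then report 2 to that power. A minimal sketch, where the `passes_test` callback is a hypothetical stand-in for the real benchmark runs:

```python
def quantum_volume(passes_test):
    """Return 2**n for the largest n where width-n, depth-n circuits
    pass the benchmark (assumes passing sizes are contiguous)."""
    n = 0
    while passes_test(n + 1):
        n += 1
    return 2 ** n

# Hypothetical device that handles square circuits up to 6 qubits
# wide and 6 layers deep.
print(quantum_volume(lambda n: n <= 6))  # 2**6 = 64
```

This is why "doubling the power every year" means raising the achievable square-circuit size by one, not doubling the raw qubit count.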
Given the nascent state of the industry, it’s not surprising to hear that there are differing views on the best way to measure a quantum computer’s performance. While I can’t comment on the best way to do this, vendors must provide a framework for helping buyers understand the attributes of performance and the different styles of measuring them. Quantum Volume is starting to be used by others, such as Honeywell, in stating the performance of their systems.
Is quantum computing right around the corner?
Quantum computing is perceived as a technology that is very far away. And, indeed, quantum computing isn’t around the corner. However, tremendous progress occurs every six months. Just last week, IBM announced that by combining a series of new software and hardware techniques to improve overall performance, it had upgraded one of its 27-qubit client-deployed systems to achieve a Quantum Volume of 64.
Real businesses are investing today
Companies, such as IBM, Google, Intel and others, are spending this time creating better systems and better software. However, this doesn’t happen if they’re working in a research vacuum. Real progress only happens when technology vendors work with clients to solve actual problems. IBM offers the Quantum Experience, but the IBM Q Network is the commercial version of the program where companies have access to IBM’s latest technologies and support for business strategy engagements.
For example, Daimler is working with IBM to research how quantum algorithms for chemistry and materials science will support Daimler’s long-term goal of designing new batteries. IBM and Exxon are looking at how quantum computing can improve predictive environmental modeling and help Exxon discover new materials for more efficient carbon capture. Meanwhile, in finance, JPMorgan Chase and IBM are researching methodologies for financial modeling and risk management.
Quantum advantage versus quantum supremacy
As you read more about quantum computing, you’ll inevitably run into the concept of “quantum supremacy,” the idea that quantum computers are powerful enough to complete calculations that classical supercomputers can’t perform in any practical amount of time. IBM’s Sutor talks about establishing a quantum advantage as being more significant. Quantum advantage is the point where quantum computers perform specific computing tasks more efficiently (hundreds to thousands of times faster) or at a lower cost than using classical computing alone. It’s not for every type of problem, but it will excel at specific things. He believes this is achievable within this decade.
Sutor noted that quantum computing is not as simple as manufacturing qubits. He stated there’s also science, experimental physics and engineering to refine. For example, scientists must improve quantum circuits to achieve quantum advantage. Sutor expects that in three to five years, we will see early instances of quantum advantage. However, this vision of quantum advantage requires hardware, software, and algorithms all to come together.
Advice to organizations
Quantum is exciting because it forces people to move beyond the old notions of how to accomplish various computing tasks. Putting aside the old allows breakthrough thinking, which is essential for companies to take their digital journey to the next level. Like any other IT project, you need a quantum computing champion to explore, experiment, and evangelize the technology within your organization.
Quantum computing isn’t something you can pick up in a weekend or at the grocery store. It’s an emerging technology that is arguably five or more years away from widespread commercial use. However, an organization should begin its quantum computing education today and start to define projects that can leverage cloud-resident quantum technology.
Organizations that do this have the opportunity to create breakthroughs in areas such as materials science and artificial intelligence. Many are already on this educational journey: at the recent Global Summer School, 4,000 people attended workshops and labs to learn about IBM’s open source Qiskit development platform.
Individuals can take advantage of the quantum wave as well.
The evolution of quantum computing also provides an opportunity for people to learn new computing skills and enter a market at its inception. Since the fundamentals of quantum differ from classical computing, newcomers don’t need deep prior experience, and veterans have few old habits to unlearn.
It’s an exciting time in computing. We have real problems, such as creating vaccines, that quantum computing can help us develop faster and better. Researchers and technology vendors are supplying training and free access to computing resources that cost billions of dollars to design. Anyone with the desire and aptitude to learn a new skill has the opportunity to participate in the next wave of computing jobs regardless of race, age, or other affiliations. The world is what you make it and individuals can use quantum computing to make it better.
New study suggests icebergs may not have been solely responsible for sinking the Titanic – haveeruonline
- A new paper suggests solar flare activity may have contributed to the sinking of the Titanic.
- The sun emits huge solar storms that can even cut off power on Earth.
- The wrong kind of solar flare may have hindered the Titanic’s navigation and radio, affecting both its course and its distress communications.
Just when we think we know everything there is to know about the Titanic—unsinkable ship, giant iceberg, “I’m the king of the world,” etc.—along comes fascinating new research that raises big questions about what really transpired on the fateful night of April 14, 1912. Did a weather fluke from space actually cause the Titanic to sink?
The new study’s key finding is that the northern hemisphere was in the grips of a “moderate to severe” magnetic storm that night, which could have altered the Titanic’s navigational readings, affecting both its planned course and the information the crew shared about their location during SOS signals.
The idea is fairly simple. The sun is dotted with sunspots, powered by its internal nuclear furnace burning at millions of degrees. These, in turn, produce massive explosions larger than the Earth itself: solar flares.
“In a matter of just a few minutes they heat material to many millions of degrees and release as much energy as a billion megatons of TNT,” NASA explains. These flares are often caused by magnetic changes or crashes, and their explosions cause magnetic ripples through the solar system.
It makes intuitive sense that the hottest object in the solar system churns and reacts strongly to a changing magnetic field. One of the reasons Earth is a successful habitat for life is its protective magnetic field, which deflects huge amounts of solar radiation and cosmic wind. Without it, that radiation would scour us off the surface, leaving a bald, lifeless planet like Mars.
This magnetic field also shifts and changes over time, especially as the magnetic poles move around Earth’s surface. Both animals and humans have learned to rely on the magnetic poles, in the form of manmade devices like compasses as well as animals’ sense for migration and navigation. Compasses, like clocks, must be adjusted to the correct units—like accounting for magnetic north as it moves around in a normal way.
It’s here that we rejoin the Titanic. Paper author Mila Zinkova has published four previous papers about the Titanic in the journal RMetS Weather, exploring a theory that mirages or other visual distortions played a part in the sinking. Now, Zinkova is using weather and space data to explore a different theory.
If a solar flare is severe enough, marked on that historic night by the telltale Aurora Borealis, it can skew the Earth’s magnetic field and wreak havoc with magnetic instruments like compasses. Even today, solar flares interfere with the electrical grid and space traffic, and truly precious file backups may be kept in protective Faraday cages.
Zinkova posits that the impact on compasses affected the coordinates reported in distress signals. “The Titanic’s Fourth Officer Joseph Boxhall worked out the ship’s SOS position. Boxhall’s position was around 13 nautical miles (24 km) off their real position,” Zinkova writes.
But the rescue ship Carpathia likely had the same wrong information. “The compasses of the Carpathia could have been under the influence of the geomagnetic storm for 5.5 hours, before and after she received the Titanic’s SOS, and until she reached the lifeboats,” Zinkova continues. “Therefore, a possible combined compass error could have been one of the factors that contributed to the successful rescue of the Titanic survivors.”
This also points to how localized the solar flare phenomenon was. Ships in a certain radius received scrambled radio calls or missed them altogether. Back on land or even outside of the affected radius, everything seemed normal except when trying to contact or be contacted by the Titanic and other ships near it.
Northern municipalities put support behind satellite internet service – ElliotLakeToday.com
The Federation of Northern Ontario Municipalities (FONOM) has cast its attention skyward in joining other Northern Ontario stakeholders calling for better access to high-speed internet.
At its recent board meeting, the municipal advocacy group passed a resolution indicating its support for Starlink, a satellite internet service being developed by SpaceX, the company founded by U.S.-based innovator Elon Musk.
“We know today our citizens require greater connectivity than 50/10 megabits per second,” said FONOM president Danny Whalen in a Sept. 16 news release.
“FONOM believes that the Starlink program is our best option.”
According to a report released by Blue Sky Net earlier this year, the average download speed of participants in a study of northern internet users was just below 9 megabits per second (Mbps) and the average upload speed was just above 5 Mbps.
But for the average user that relies on fast internet speeds for business, education and more, download speeds of 50 Mbps and upload speeds of 10 Mbps are required as the bare minimum to participate in those activities.
In 2018, the federal government’s Canadian Radio-television and Telecommunications Commission (CRTC) set a new target to have 90 per cent of Canadian households with services that deliver download speeds of 50 Mbps and upload speeds of 10 Mbps by 2021. But that goal is still far from being achieved.
Starlink is aiming to deliver high-speed broadband internet, via satellite, to locations where access has been unreliable, expensive, or completely unavailable.
It’s targeting the northern U.S. and Canada for its initial release in 2020, and has plans to reach “near global coverage” in 2021.
The FONOM resolution calls on the CRTC to provide Starlink with a basic international telecommunications licence, which would allow the company to conduct international telecommunications activities.
FONOM said it would also seek support from its partners for the Starlink program.
FONOM, which represents 100 communities in northeastern Ontario, works to better municipal government in Northern Ontario and improve legislation respecting local government in the North.
Venus is a Russian planet — say the Russians – CTV News
This week, Dmitry Rogozin, head of the Russian space corporation Roscosmos, revealed that the country plans to send its own mission to Venus in addition to “Venera-D,” the planned joint mission with the US, the Russian state news agency TASS reported.
Rogozin was addressing reporters at the HeliRussia 2020 exhibition, an international expo of the helicopter industry in Moscow.
“Resuming Venus exploration is on our agenda,” he told reporters Tuesday.
“We think that Venus is a Russian planet, so we shouldn’t lag behind,” he said.
“Projects of Venus missions are included in the united government program of Russia’s space exploration for 2021-2030.”
The statement came the day after scientists revealed that a gas on Earth called phosphine had also been detected in the atmosphere of Venus.
Venus is similar in size to Earth and is our closest planetary neighbour, but it spins backward compared to most other planets.
The study authored by Cardiff University professor Jane Greaves and her colleagues was published Monday in the journal Nature Astronomy.
The discovery of phosphine on Venus elevates it to an area of interest worth exploring in our solar system alongside the ranks of Mars and “water world” moons like Enceladus and Europa, said MIT astrophysicist and study co-author Sara Seager.
“Our hoped-for impact in the planetary science community is to stimulate more research on Venus itself, research on the possibilities of life in Venus’ atmosphere, and even space missions focused to find signs of life or even life itself in the Venusian atmosphere,” Seager said.
According to the European Space Agency, the Russians do have significant experience when it comes to Venus.
Its website states: “Between 1967-1984 Venusian studies carried out in Russia were at the forefront of international research into this planet.
“Since then, Russia has still preserved its unique expertise in designing and developing landing craft for Venus and continues to define scientific tasks for those craft.”