
Erasure Error Correction Key To Quantum Computers

 

Overview of a fault-tolerant, erasure-converted neutral atom quantum computer. (a) Schematic of a neutral atom quantum computer: a plane of atoms sits beneath a microscope objective used to image fluorescence and to project trapping and control fields. (b) Individual 171Yb atoms serve as the physical qubits, with the qubit states encoded in the metastable 6s6p 3P0 F = 1/2 level (subspace Q). Two-qubit gates are carried out via the Rydberg state |r⟩, reached by a single-photon transition (λ = 302 nm) with Rabi frequency Ω. Decays from |r⟩, with total rate Γ = Γ_B + Γ_R + Γ_Q, are the most common faults during gates. Most decays are either blackbody (BBR) transitions to neighbouring Rydberg states (Γ_B/Γ ≈ 0.61) or radiative decays to the ground state 6s2 1S0 (Γ_R/Γ ≈ 0.34), with only a small fraction (Γ_Q/Γ ≈ 0.05) returning to the qubit subspace. These events can be detected and converted into erasure errors at the end of a gate by observing fluorescence from ground-state atoms (subspace R), or by autoionizing any remaining Rydberg population and observing the fluorescence on the Yb+ transition (subspace B). (c) A section of the XZZX surface code studied in this work, showing data qubits, ancilla qubits, and stabiliser operations in the order indicated by the arrows, together with a quantum circuit that uses ancilla A1 and interleaved erasure-conversion steps to implement a stabiliser measurement on data qubits D1 through D4. Erasure detection is applied after every gate, and erased atoms are replaced as needed using a movable optical tweezer drawn from a reservoir. Although only the atom found to have left the subspace must be replaced, replacing both also guards against undetected leakage on the second atom. Credit: Nature Communications (2022). DOI: 10.1038/s41467-022-32094-6


Why is "erasure" essential to creating useful quantum computers?

Researchers have discovered a new technique for correcting the errors that creep into quantum computers' calculations, potentially removing a major roadblock to a powerful new domain of computing.

Error correction is a well-established discipline in traditional computers: every cellphone, for example, must check and correct the data it transmits and receives across clogged airwaves. 

By exploiting very ephemeral properties of subatomic particles, quantum computers have the incredible potential to tackle certain difficult problems that are intractable for traditional computers. 

Because those properties are so fleeting, even peeking into the computation to look for problems can bring the whole system crashing down.

A multidisciplinary team led by Jeff Thompson, an associate professor of electrical and computer engineering at Princeton, together with collaborators Yue Wu and Shruti Puri of Yale University and Shimon Kolkowitz of the University of Wisconsin-Madison, demonstrated how to significantly increase a quantum computer's tolerance for faults while decreasing the amount of redundant information needed. 


The new method raises the tolerable error rate from 1% to 4%, making error correction workable for the quantum computers now under development.


The operations you want to perform on a quantum computer are noisy, Thompson explained, which means that computations are subject to a wide range of failure modes.


An error in a traditional computer can be as simple as a memory bit mistakenly flipping from a 1 to a 0, or as messy as many wireless routers interfering with one another. 

A popular strategy for addressing these problems is to build in redundancy, so that each piece of data is checked against duplicate copies. 

However, that strategy requires more data and creates more opportunities for mistakes, so it only works when the great majority of the information is already correct. 

Otherwise, comparing wrong data to wrong data only entrenches the error.

According to Thompson, redundancy is a terrible strategy if your baseline error rate is too high, and getting below that threshold is the biggest obstacle.

Rather than concentrating solely on reducing the number of errors, Thompson's team made the errors more visible. 

The researchers studied the physical sources of error in great detail and designed their system so that the most common source of error effectively erases the damaged data rather than merely corrupting it. 

According to Thompson, this behavior is an example of a particular kind of fault known as an "erasure error," which is inherently easier to filter out than corrupted data that still looks like all the other data.


In a traditional computer, if a packet of supposedly identical bits arrives as 11001, it can be risky to presume that the slightly more numerous 1s are correct and the 0s are wrong. 

The case is much stronger, however, if the packet arrives as 11XX1, where the corrupted bits are plainly marked.

Erasure errors are far easier to fix because you know where they are, Thompson said. "They can't participate in the majority vote. That is a significant benefit."
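
To see why flagged erasures are easier to handle, consider a minimal Python sketch of the classical analogy (my illustration, not code from the study) that majority-votes a five-bit repetition code in both situations:

from collections import Counter

def decode_with_errors(bits):
    """Majority vote when corrupted bits look like ordinary data ('11001')."""
    counts = Counter(bits)
    return max(counts, key=counts.get)

def decode_with_erasures(bits):
    """Majority vote when corrupted bits are flagged as 'X' ('11XX1')."""
    counts = Counter(b for b in bits if b != 'X')
    return max(counts, key=counts.get)

print(decode_with_errors('11001'))    # '1' wins only 3 votes to 2: a fragile call
print(decode_with_erasures('11XX1'))  # '1' wins 3 votes to 0: erased bits cannot vote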

Erasure errors are well understood in conventional computing, but researchers had not previously considered designing quantum computers to convert their errors into erasures, according to Thompson.

In fact, their system can tolerate an error rate of 4.1%, which Thompson said is well within the realm of possibility for current quantum computers. 

By contrast, the most advanced error correction in prior systems required error rates below 1%, a level beyond the reach of any existing quantum system with a large number of qubits.

The team's ability to produce erasure errors turned out to be an unexpected payoff of a choice Thompson made years earlier. 

His work examines "neutral atom qubits," in which a single atom is used to store a "qubit" of quantum information. 

His group pioneered the use of the element ytterbium for this purpose. Unlike most other neutral atom qubits, which have only one electron in their outermost shell, ytterbium has two, Thompson noted.

"I see it as a Swiss army knife, and this ytterbium is the larger, fatter Swiss army knife," Thompson remarked. "That extra little bit of complexity you get from having two electrons gives you a lot of new tools."

One use for those extra tools turned out to be eliminating errors. 

The group proposed boosting ytterbium's electrons from the stable "ground state" to excited levels known as "metastable states," which can be long-lived under the right conditions but are fundamentally fragile. 

Counterintuitively, the researchers proposed encoding the quantum information in these fragile states.

It is as if the electrons are walking a tightrope, Thompson said, and the system is designed so that the same factors that cause errors also knock the electrons off the tightrope.

As an added bonus, once the electrons fall to the ground state they scatter light very visibly, so illuminating a collection of ytterbium qubits makes only the faulty ones light up. 

Those that light up can then be written off as errors.

The advance required combining knowledge of quantum error-correction theory with quantum computing hardware, drawing on the research team's multidisciplinary makeup and close collaboration.

Although the physics of this configuration is specific to Thompson's ytterbium atoms, he said the idea of designing qubits to produce erasure errors could be a desirable goal in the many other systems under development around the world, and the group is continuing to pursue it.


According to Thompson, other groups have already begun designing their systems to convert errors into erasures. 

"We view this research as setting out a type of architecture that might be utilized in many various ways," Thompson said. "We already have a lot of interest in discovering adaptations for this task," said the researcher.

As a next step, Thompson's team is now working to demonstrate the conversion of errors into erasures in a small working quantum computer combining several tens of qubits.

The article was published in Nature Communications on August 9 and is titled "Erasure conversion for fault-tolerant quantum computing in alkaline earth Rydberg atom arrays."


~ Jai Krishna Ponnappan

Find Jai on Twitter | LinkedIn | Instagram


You may also want to read more about Quantum Computing here.


References And Further Reading:


  • Hilder, J., Pijn, D., Onishchenko, O., Stahl, A., Orth, M., Lekitsch, B., Rodriguez-Blanco, A., Müller, M., Schmidt-Kaler, F., and Poschinger, U.G., 2022. Fault-tolerant parity readout on a shuttling-based trapped-ion quantum computer. Physical Review X, 12(1), p. 011032.
  • Nakazato, T., Reyes, R., Imaike, N., Matsuda, K., Tsurumoto, K., Sekiguchi, Y., and Kosaka, H., 2022. Quantum error correction of spin quantum memories in diamond under a zero magnetic field. Communications Physics, 5(1), pp. 1–7.
  • Krinner, S., Lacroix, N., Remm, A., Di Paolo, A., Genois, E., Leroux, C., Hellings, C., Lazar, S., Swiadek, F., Herrmann, J., and Norris, G.J., 2022. Realizing repeated quantum error correction in a distance-three surface code. Nature, 605(7911), pp. 669–674.
  • Ajagekar, A. and You, F., 2022. New frontiers of quantum computing in chemical engineering. Korean Journal of Chemical Engineering, pp. 1–10.



Artificial Intelligence - Quantum AI.

 



Artificial intelligence and quantum computing are natural partners, according to Johannes Otterbach, a physicist at Rigetti Computing in Berkeley, California, because both technologies are essentially statistical.

Airbus, Atos, Baidu, b|eit, Cambridge Quantum Computing, Elyah, Hewlett-Packard (HP), IBM, Microsoft Research QuArC, QC Ware, Quantum Benchmark Inc., R QUANTECH, Rahko, and Zapata Computing are among the organizations that have moved into this area.

Bits are used to encode and modify data in traditional general-purpose computer systems.

Bits may only be in one of two states: 0 or 1.

Quantum computers use the actions of subatomic particles like electrons and photons to process data.

Two of the most essential phenomena quantum computers exploit are superposition, in which particles reside in all conceivable states at the same time, and entanglement, the pairing and linking of particles such that they cannot be described independently of one another's state, even across great distances.

Such entanglement was famously dismissed by Albert Einstein as "spooky action at a distance."

Quantum computers use quantum registers, which are made up of a number of quantum bits or qubits, to store data.

While a simple explanation is elusive, a qubit can be understood as residing in a weighted combination of two states at once, yielding many possible states.

Each qubit added to the system doubles the size of the state space the system can represent.

More than one quadrillion classical bits might be processed by a quantum computer with just fifty entangled qubits.

Sixty qubits could, in principle, hold all of the data humanity produces in a single year.

Three hundred qubits might compactly encapsulate a quantity of data comparable to the observable universe's classical information content.
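
These figures follow from simple doubling: the state of n entangled qubits is described by 2^n complex amplitudes. A few lines of Python make the arithmetic explicit:

# The state of n entangled qubits is described by 2**n complex amplitudes,
# which is why each added qubit doubles the size of the state space.
for n in (1, 2, 50, 300):
    print(f"{n} qubits -> {2**n:.3e} amplitudes")

# 50 qubits -> about 1.1e+15 (a quadrillion) amplitudes;
# 300 qubits -> about 2.0e+90, on the scale of estimates of the
# observable universe's classical information content.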

Quantum computers can operate in parallel on large quantities of distinct computations, collections of data, or operations.

A working artificially intelligent quantum computer that could monitor and manage all of a city's traffic in real time would make true autonomous transportation possible.

By comparing all of the photographs to the reference photo at the same time, quantum artificial intelligence may rapidly match a single face to a library of billions of photos.

Our understanding of processing, programming, and complexity has radically changed with the development of quantum computing.

A series of quantum state transformations is followed by a measurement in most quantum algorithms.

The notion of quantum computing goes back to the 1980s, when physicists such as Yuri Manin, Richard Feynman, and David Deutsch realized that by using so-called quantum gates, a concept taken from linear algebra, researchers would be able to manipulate information.

They hypothesized that by combining many kinds of quantum gates into circuits, qubits could be steered through superpositions and entanglements into quantum algorithms whose outcomes could then be measured.
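
As a concrete illustration of gates as linear algebra, the following numpy snippet (a sketch of the standard textbook example, not taken from this article) entangles two qubits with a Hadamard gate followed by a CNOT:

import numpy as np

# Hadamard gate: puts a single qubit into an equal superposition of |0> and |1>.
H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)

# CNOT gate: flips the second qubit when the first qubit is |1>.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

state = np.array([1, 0, 0, 0], dtype=complex)  # both qubits start in |0>
state = np.kron(H, np.eye(2)) @ state          # apply H to the first qubit
state = CNOT @ state                           # entangle the pair

print(state.round(3))  # amplitudes ~0.707 on |00> and |11>: the Bell state (|00> + |11>)/sqrt(2)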

Some quantum mechanical processes could not be efficiently replicated on conventional computers, which presented a problem to these early researchers.

They believed that quantum technology, perhaps embodied in a universal quantum Turing machine, would enable such quantum simulations.

In 1993, Umesh Vazirani and Ethan Bernstein of the University of California, Berkeley, argued that quantum computing would one day be able to solve certain problems faster than traditional digital computers, violating the extended Church-Turing thesis.

In computational complexity theory, Vazirani and Bernstein defined the class of bounded-error quantum polynomial time (BQP) decision problems.

These are problems that a quantum computer can solve in polynomial time with an error probability of at most one-third on any instance.
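
The "bounded error" matters because it can be driven arbitrarily low: rerun the algorithm and take a majority vote. A quick numpy simulation (illustrative, assuming a per-run error of exactly one-third) shows the error shrinking:

import numpy as np

rng = np.random.default_rng(seed=1)
single_run_error = 1 / 3  # the worst case allowed for a BQP algorithm

for repeats in (1, 5, 15, 45):
    # Simulate 100,000 majority votes over `repeats` independent runs.
    wrong = rng.random((100_000, repeats)) < single_run_error
    majority_wrong = wrong.sum(axis=1) > repeats / 2
    print(f"{repeats:2d} runs -> majority-vote error {majority_wrong.mean():.4f}")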

The frequently proposed threshold for Quantum Supremacy is fifty qubits, the point at which quantum computers would be able to tackle problems that would be impossible to solve on conventional machines.

Although no one believes quantum computers will be capable of solving all NP-hard problems, quantum AI researchers expect the machines to solve certain kinds of NP-intermediate problems.

Creating quantum algorithms that do useful work has proved to be a difficult task.

In 1994, AT&T Laboratories' Peter Shor devised a polynomial time quantum algorithm that beat conventional methods in factoring big numbers, possibly allowing for the speedy breakage of current kinds of public key encryption.

Since then, intelligence services have been stockpiling encrypted material passed across networks in the hopes that quantum computers would be able to decipher it.

Another technique devised by Shor's AT&T Labs colleague Lov Grover allows for quick searches of unsorted datasets.
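
Grover's algorithm can be demonstrated with a toy state-vector simulation. The sketch below (my illustration, assuming a single marked entry in a 16-item unsorted list) needs only about (pi/4)*sqrt(N) iterations, versus roughly N/2 classical probes:

import numpy as np

N, marked = 16, 11
state = np.full(N, 1 / np.sqrt(N))  # uniform superposition over all entries

for _ in range(int(round(np.pi / 4 * np.sqrt(N)))):  # 3 iterations for N = 16
    state[marked] *= -1               # oracle: flip the marked amplitude's sign
    state = 2 * state.mean() - state  # diffusion: reflect about the mean

print((state ** 2)[marked])  # ~0.96: the marked item dominates the measurement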

Quantum neural networks are similar to conventional neural networks in that they label input, identify patterns, and learn from experience using layers of millions or billions of linked neurons.

Large matrices and vectors produced by neural networks can be processed exponentially quicker by quantum computers than by classical computers.

In 2008, Aram Harrow of MIT, Avinatan Hassidim, and Seth Lloyd provided the critical algorithmic insight for fast quantum matrix inversion, a building block for rapid classification.
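
The exponential advantage hinges on amplitude encoding: a classical vector with 2^n entries can be stored in the amplitudes of only n qubits (efficiently loading the data this way is itself a significant practical caveat). A minimal numpy illustration:

import numpy as np

data = np.array([3.0, 1.0, 4.0, 1.0, 5.0, 9.0, 2.0, 6.0])  # 8 = 2**3 numbers
state = data / np.linalg.norm(data)  # normalize into a valid 3-qubit state

print(int(np.log2(len(data))))  # 3 qubits suffice for all 8 values
print(np.sum(state ** 2))       # 1.0: measurement probabilities sum to one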

Michael Hartmann, a visiting researcher at Google AI Quantum and Associate Professor of Photonics and Quantum Sciences at Heriot-Watt University, is working on a quantum neural network computer.

Hartmann's Neuromorphic Quantum Computing (Quromorphic) Project employs superconducting electrical circuits as hardware.

Hartmann's artificial neural network computers are inspired by the brain's neuronal organization.

Artificial neural networks are usually implemented in software, with each artificial neuron programmed and connected into a larger network of neurons.

Hardware that incorporates artificial neural networks is also possible.

Hartmann estimates that a workable quantum computing artificial intelligence might take 10 years to develop.

D-Wave, based in Vancouver, British Columbia, was the first company to produce quantum computers in commercial quantities.

In 2011, D-Wave started producing annealing quantum computers.

Annealing processors are special-purpose devices for a restricted set of problems with multiple local minima in a discrete search space, such as combinatorial optimization problems.
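
To see the kind of problem an annealer targets, here is a tiny QUBO (quadratic unconstrained binary optimization) instance solved by brute force; an annealer relaxes physically toward low-energy configurations of the same landscape, which pays off only when the search space is far too large to enumerate. The matrix below is an illustrative example of my own:

import itertools
import numpy as np

# Minimize the energy x^T Q x over binary vectors x.
Q = np.array([[-3,  2,  0,  1],
              [ 0, -2,  2,  0],
              [ 0,  0, -1,  2],
              [ 0,  0,  0, -2]])

def energy(x):
    return np.array(x) @ Q @ np.array(x)

best = min(itertools.product([0, 1], repeat=4), key=energy)
print(best, energy(best))  # (0, 1, 0, 1) with energy -4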

The D-Wave computer is not polynomially equivalent to a universal quantum computer, so it cannot run Shor's algorithm.

Lockheed Martin, the University of Southern California, Google, NASA, and the Los Alamos National Laboratory are among the company's clients.

Universal quantum computers are being pursued by Google, Intel, Rigetti, and IBM.

Each has built quantum processors on the scale of fifty qubits.

In 2018, the Google AI Quantum lab, led by Hartmut Neven, announced the introduction of their newest 72-qubit Bristlecone processor.

Intel debuted its 49-qubit Tangle Lake processor in 2018.

The Aspen-1 processor from Rigetti Computing has sixteen qubits.

The IBM Q Experience quantum computing facility is located at the Thomas J. Watson Research Center in Yorktown Heights, New York.

To create quantum commercial applications, IBM is collaborating with a number of corporations, including Honda, JPMorgan Chase, and Samsung.

The public is also welcome to submit experiments to be processed on the company's quantum computers.

Quantum AI research is also highly funded by government organizations and universities.

The NASA Quantum Artificial Intelligence Laboratory (QuAIL) has a D-Wave 2000Q quantum computer with 2,048 qubits that it wants to use to tackle NP-hard problems in data processing, anomaly detection and decision-making, air traffic management, and mission planning and coordination.

The NASA team has chosen to concentrate on the most difficult machine learning challenges, such as generative models in unsupervised learning, in order to illustrate the technology's full potential.

In order to maximize the value of D-Wave resources and skills, NASA researchers have opted to focus on hybrid quantum-classical techniques.

Many laboratories across the globe are investigating completely quantum machine learning.

Quantum Learning Theory proposes that quantum algorithms might be utilized to address machine learning problems, hence improving traditional machine learning techniques.

In quantum learning theory, classical binary data sets are fed into a quantum computer for processing.

The NIST Joint Quantum Institute and the University of Maryland's Joint Center for Quantum Information and Computer Science are also bridging the gap between machine learning and quantum computing.

The NIST-UMD partnership hosts workshops bringing together professionals in mathematics, computer science, and physics to apply artificial intelligence algorithms to quantum system control.

Engineers are also encouraged to employ quantum computing to boost the performance of machine learning algorithms as part of the alliance.

The Quantum Algorithm Zoo, a collection of all known quantum algorithms, is likewise housed at NIST.

Scott Aaronson is the director of the University of Texas at Austin's Quantum Information Center.

The department of computer science, the department of electrical and computer engineering, the department of physics, and the Advanced Research Laboratories have collaborated to create the center.

The University of Toronto has a quantum machine learning start-up incubator.

Peter Wittek is the head of the Quantum Machine Learning Program of the Creative Destruction Lab, which houses the QML incubator.

Materials discovery, optimization, and logistics, reinforcement and unsupervised machine learning, chemical engineering, genomics and drug discovery, systems design, finance, and security are all areas where the University of Toronto incubator is fostering innovation.

In December 2018, President Donald Trump signed the National Quantum Initiative Act into law.

The legislation establishes a partnership of the National Institute of Standards and Technology (NIST), the National Science Foundation (NSF), and the Department of Energy (DOE) for quantum information science research, commercial development, and education.

The statute anticipates the NSF and DOE establishing many competitively awarded research centers as a result of the endeavor.

Because of the difficulty of operating quantum processing units (QPUs), which must be maintained in a vacuum at temperatures near absolute zero, no quantum computer has yet outperformed a state-of-the-art classical computer on a challenging task.

Such isolation is required because quantum computations are highly susceptible to external environmental disturbances.

Qubits are delicate; a typical quantum bit can only exhibit coherence for ninety microseconds before degrading and becoming unreliable.
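
Some quick arithmetic shows why that matters. The coherence time below is the figure quoted above; the gate time is an assumed, illustrative value, not a number from this text:

coherence_time_s = 90e-6  # ~90 microseconds of coherence (figure quoted above)
gate_time_s = 100e-9      # hypothetical 100-nanosecond gate, for illustration only

print(int(coherence_time_s / gate_time_s))  # ~900 gates before the qubit degrades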

Communicating inputs and outputs and collecting measurements from an isolated quantum processor without introducing thermal noise is a severe technical difficulty that has yet to be fully solved.

Because measurement of a quantum system is inherently probabilistic, the results are not totally dependable in a classical sense.

Only one of the quantum parallel threads can be accessed, at random, for results.

All other threads are destroyed during the measurement procedure.
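
That probabilistic readout can be mimicked classically: a measurement returns one outcome with probability equal to the squared magnitude of its amplitude, and the rest of the state is lost. A short numpy sketch with an arbitrary example state:

import numpy as np

rng = np.random.default_rng(seed=7)

amplitudes = np.array([0.1, 0.3, 0.2, 0.4, 0.5, 0.1, 0.2, 0.6], dtype=complex)
amplitudes /= np.linalg.norm(amplitudes)  # normalize to a valid 3-qubit state

# Born rule: outcome i occurs with probability |amplitude_i|**2.
probabilities = np.abs(amplitudes) ** 2
outcome = int(rng.choice(len(amplitudes), p=probabilities))
print(f"measured |{outcome:03b}>")  # one 3-bit result; the other "threads" are gone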

It is hoped that connecting quantum processors to error-correcting artificial intelligence algorithms will lower these computers' fault rates.

Many machine intelligence applications, such as deep learning and probabilistic programming, rely on sampling from high-dimensional probability distributions.

Quantum sampling methods have the potential to make calculations on otherwise intractable issues quicker and more efficient.

Shor's algorithm manipulates the quantum state in such a way that shared properties of the output values, such as the periodicity of a function, can be measured.

Grover's search algorithm manipulates the quantum state with an amplification technique that increases the probability that the desired output will be read out.

Quantum computers would also be able to execute many AI algorithms at the same time.

Quantum computing simulations have recently been used by scientists to examine the beginnings of biological life.

Unai Alvarez-Rodriguez of the University of the Basque Country in Spain built so-called artificial quantum living forms using IBM's QX superconducting quantum computer.


~ Jai Krishna Ponnappan

Find Jai on Twitter | LinkedIn | Instagram


You may also want to read more about Artificial Intelligence here.



See also: 


General and Narrow AI.


References & Further Reading:


Aaronson, Scott. 2013. Quantum Computing Since Democritus. Cambridge, UK: Cambridge University Press.

Biamonte, Jacob, Peter Wittek, Nicola Pancotti, Patrick Rebentrost, Nathan Wiebe, and Seth Lloyd. 2018. “Quantum Machine Learning.” https://arxiv.org/pdf/1611.09347.pdf.

Perdomo-Ortiz, Alejandro, Marcello Benedetti, John Realpe-Gómez, and Rupak Biswas. 2018. “Opportunities and Challenges for Quantum-Assisted Machine Learning in Near-Term Quantum Computers.” Quantum Science and Technology 3: 1–13.

Schuld, Maria, Ilya Sinayskiy, and Francesco Petruccione. 2015. “An Introduction to Quantum Machine Learning.” Contemporary Physics 56, no. 2: 172–85.

Wittek, Peter. 2014. Quantum Machine Learning: What Quantum Computing Means to Data Mining. Cambridge, MA: Academic Press.




Quantum Computing - Discovery Of Unexpected Features In Ta2NiSe5, A Complicated Quantum Material.

 




A recent research discloses previously unknown features in Ta2NiSe5, a complicated quantum material. 

These results, which were made possible by a new approach pioneered at Penn, have implications for the development of future quantum devices and applications. 

This study, which was published in Science Advances, was directed by professor Ritesh Agarwal and carried out by graduate student Harshvardhan Jog, in conjunction with Penn's Eugene Mele and Luminita Harnagea of the Indian Institute of Science Education and Research. (The research paper is referenced below.)



While progress has been made in the area of quantum information science in recent years, quantum computers are still in their infancy. 


  • One problem is that present platforms can employ only a minimal number of "qubits," the units that execute operations in a quantum computer, because they are not built to let many qubits "speak" to one another. 
  • To meet this challenge, materials must be efficient at quantum entanglement, which occurs when the states of qubits remain linked regardless of their distance from one another, and at coherence, the ability of a system to sustain that entanglement. 



Jog investigated Ta2NiSe5, a material system with strong electronic correlation, which makes it promising for quantum devices. 




Strong electronic correlation refers to the relationship between a material's atomic structure and its electronic characteristics, as well as to strong interactions between electrons. 

To investigate Ta2NiSe5, Jog adapted an Agarwal lab technique known as the circular photogalvanic effect, in which light is made to carry an electric field and can be used to examine various material characteristics. 

This approach, which has been developed and refined over many years, has revealed information about materials such as silicon and Weyl semimetals in ways that are not achievable with traditional physics and materials science research. 

But, as Agarwal points out, this method had only ever been applied to materials lacking inversion symmetry, whereas Ta2NiSe5 possesses it. 

Jog "wanted to see if this technique could be used to study materials with inversion symmetry that, from a conventional sense, should not be producing this response," says Agarwal. 

After working with Harnagea to obtain high-quality samples of Ta2NiSe5, Jog and Agarwal applied a modified version of the circular photogalvanic effect and were startled to observe that a signal was produced. 

After further checks to ensure that this was not a mistake or an experimental artifact, they collaborated with Mele to develop a theory that could explain these surprising findings. 





The difficulty in developing a theory, according to Mele, was that the presumed symmetry of Ta2NiSe5 did not match the experimental data. 




They were able to explain these results only after discovering a prior theoretical work showing that the material's symmetry was lower than expected. 

"We recognized that if there was a low-temperature phase when the system spontaneously shears, that would do it," Mele adds. 

By combining their expertise in experiment and theory, the researchers established that this material has a broken symmetry, a finding that was critical to the project's success and that has major implications for the use of this and other materials in future devices. 

That is because symmetry is essential for classifying phases of matter and, ultimately, for determining their downstream properties. 


These findings may also be used to uncover and describe comparable features in other kinds of materials. 




We now have a technology that can detect even the most minute symmetry breaks in crystalline materials. 


Symmetries must be taken into account in order to understand any complicated material, since they have enormous ramifications.

While there is still a "long way to go" before Ta2NiSe5 can be used in quantum devices, the researchers are already working to better understand this phenomenon. 

In the lab, Jog and Agarwal are interested in searching for possible topological properties at other energy levels within Ta2NiSe5, as well as applying the circular photogalvanic technique to related systems to see whether they behave similarly. 

Mele is investigating how widespread this phenomenon is across material systems and generating recommendations for new materials for experimentalists to examine. 

"What we're seeing here is a reaction that shouldn't happen but does in this situation," Mele adds. 




"Expanding the area of structures available to you, where you may activate these effects that are ostensibly disallowed, is critical. It's not the first time something has occurred in spectroscopy, but it's always intriguing when it occurs." 


This work not only introduces the research community to a new tool for studying complex crystals, but it also sheds light on the types of materials that can provide two key features, entanglement and macroscopic coherence, which are critical for future quantum applications ranging from medical diagnostics to low-power electronics and sensors. 

"The long-term aim, and one of the most important goals in condensed matter physics," adds Agarwal, "is to be able to comprehend these highly entangled states of matter because these materials can conduct a lot of intricate modeling." "

It's possible that if we can figure out how to comprehend these systems, they'll become natural platforms for large-scale quantum simulation."



References:


Harshvardhan Jog et al., Exchange coupling-mediated broken symmetries in Ta2NiSe5 revealed from quadrupolar circular photogalvanic effect, Science Advances (2022). DOI: 10.1126/sciadv.abl9020

Jog, H., Harnagea, L., Mele, E., and Agarwal, R., 2021. Circular photogalvanic effect in inversion symmetric crystal: the case of ferro-rotational order in Ta2NiSe5. Bulletin of the American Physical Society, 66.



~ Jai Krishna Ponnappan


You may also want to read more about Quantum Computing here.




