
Erasure Error Correction Key To Quantum Computers

 

Overview of a fault-tolerant, erasure-converted neutral atom quantum computer. a A schematic of a neutral atom quantum computer: a plane of atoms beneath a microscope objective that is used to image fluorescence and to project trapping and control fields. b Individual 171Yb atoms are used as the physical qubits, with the qubit states encoded in the metastable 6s6p 3P0 F = 1/2 level (subspace Q). Two-qubit gates are conducted via the Rydberg state |r⟩, which is accessible by a single-photon transition (λ = 302 nm) with Rabi frequency Ω. The most common faults during gates are decays from |r⟩, with a total rate Γ = ΓB + ΓR + ΓQ. Most decays are either blackbody (BBR) transitions to neighbouring Rydberg states (ΓB/Γ ≈ 0.61) or radiative decays to the ground state 6s2 1S0 (ΓR/Γ ≈ 0.34), with only a small fraction (ΓQ/Γ ≈ 0.05) returning to the qubit subspace. These events can be detected and converted into erasure errors at the end of a gate by observing fluorescence from ground-state atoms (subspace R), or by autoionizing any remaining Rydberg population and observing fluorescence on the Yb+ transition (subspace B). c A section of the XZZX surface code studied in this work, showing data qubits, ancilla qubits, and the stabiliser operations in the order indicated by the arrows. d A quantum circuit using ancilla A1, with interleaved erasure-conversion steps, implementing the measurement of a stabiliser on data qubits D1 through D4. Erasure detection is applied after every gate and, when necessary, erased atoms are replaced using a movable optical tweezer drawn from a reservoir. Although only the atom detected to have left the subspace has to be replaced, replacing both also guards against the risk of unnoticed leakage on the second atom. Nature Communications (2022). DOI: 10.1038/s41467-022-32094-6


Why is "erasure" essential to creating useful quantum computers?

Researchers have uncovered a new technique for correcting errors in quantum computations, potentially eliminating a major roadblock to a powerful new computing domain.

Error correction in traditional computers is a well-established discipline: every cellphone checks for and fixes errors in order to transmit and receive data across clogged airwaves. 

By exploiting the very ephemeral characteristics of subatomic particles, quantum computers have the incredible potential to tackle certain difficult problems that are intractable for traditional computers. 

These states are so fleeting that even peeking into the computation to look for errors can bring the whole system crashing down.

A multidisciplinary team led by Jeff Thompson, an associate professor of electrical and computer engineering at Princeton, with collaborators Yue Wu and Shruti Puri of Yale University and Shimon Kolkowitz of the University of Wisconsin-Madison, demonstrated how to significantly increase a quantum computer's tolerance for faults and decrease the amount of redundant information needed. 


The new method quadruples the tolerable error rate, from 1% to 4%, and makes error correction workable for quantum computers now under development.


The operations you want to perform on a quantum computer are noisy, according to Thompson, meaning that computations are subject to a wide variety of failure modes.


An error in a traditional computer can be as simple as a memory bit mistakenly flipping from a 1 to a 0, or as messy as many wireless routers interfering with one another. 

A popular strategy for addressing these problems is to build in some redundancy, so that each piece of data is examined against duplicate copies. 

However, that strategy requires more data and creates more opportunities for mistakes, so it only works when the great majority of the information is already correct. 

Otherwise, comparing wrong data with wrong data only deepens the inaccuracy.

According to Thompson, redundancy is a poor strategy if your baseline error rate is too high, and getting below that threshold is the biggest obstacle.

Rather than concentrating solely on reducing the number of errors, Thompson's team essentially made the errors more visible. 

The researchers studied the physical sources of error in great detail and designed their system so that the most frequent source of error effectively erases the damaged data rather than merely corrupting it. 

According to Thompson, this behavior is an example of a specific kind of fault known as an "erasure error," which is inherently easier to filter out than corrupted data that still looks like all the other data.


In a traditional computer, if a packet of supposedly identical information arrives as 11001, it might be risky to presume that the slightly more common 1s are correct and the 0s are wrong. 

However, the argument is stronger if the information appears as 11XX1, where the damaged bits are obvious.

Erasure errors are much easier to correct because you know where they are, Thompson explained. "They can be excluded from the majority vote. That is a significant benefit."
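
To make that intuition concrete, here is a minimal sketch (illustrative only, not code from the paper) of how a decoder can exploit erasures: bits flagged as erased are simply dropped before the majority vote.

    # Minimal sketch: majority-vote decoding of a repetition code,
    # where 'X' marks a bit known to be erased (flagged as bad).
    def majority_decode(bits):
        votes = [b for b in bits if b != 'X']   # erasures are excluded
        return '1' if votes.count('1') > len(votes) / 2 else '0'

    print(majority_decode("11001"))  # silent errors: '1' wins only 3-2
    print(majority_decode("11XX1"))  # erasures known: '1' wins 3-0, unanimous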

Erasure errors are well known in conventional computing, but researchers had not previously thought of trying to build quantum computers that convert their errors into erasures, according to Thompson.

In fact, their design could tolerate an error rate of 4.1%, which Thompson said is well within the range of possibility for current quantum computers. 

By contrast, the most advanced error correction in prior systems, according to Thompson, could only tolerate errors below 1%, which is beyond the capability of any existing quantum system with a large number of qubits.

The team's ability to produce erasure errors turned out to be an unexpected benefit of a choice Thompson made years ago. 

His work examines "neutral atom qubits," in which a single atom is used to store a "qubit" of quantum information. 

His group pioneered this use of the element ytterbium. Unlike most other neutral atom qubits, which have only one electron in their outermost shell, ytterbium has two, according to Thompson.

As an analogy, Thompson remarked, "I see it as a Swiss army knife, and this ytterbium is the larger, fatter Swiss army knife." "You get a lot of new tools from that additional little bit of complexity you get from having two electrons."

Converting errors turned out to be one application for those additional tools. 

The group proposed boosting ytterbium's electrons from their stable "ground state" to excited levels known as "metastable states," which can be long-lived under the right circumstances but are fundamentally fragile. 

Counterintuitively, the researchers proposed encoding the quantum information in these states.

It is as if the electrons are walking a tightrope, Thompson said, and the system is engineered so that the same factors that cause errors also make the electrons fall off the tightrope.

As a bonus, once they drop to the ground state the electrons scatter light very visibly, so a collection of ytterbium qubits can be illuminated and only the faulty ones light up. 

Those that light up can be counted as errors and discounted.

This advance required combining insights from the theory of quantum error correction with the hardware of quantum computing, drawing on the multidisciplinary character of the research team and its close collaboration.

Although the physics of this configuration is specific to Thompson's ytterbium atoms, he said the idea of designing qubits to produce erasure errors could be a desirable goal in other systems, of which there are many being developed around the world, and the group is continuing to work on it.


According to Thompson, other groups have already started designing their systems to convert errors into erasures. 

"We view this research as setting out a type of architecture that might be utilized in many various ways," Thompson said. "We already have a lot of interest in discovering adaptations for this task," said the researcher.

As a next step, Thompson's team is now working to demonstrate the conversion of errors into erasures in a small working quantum computer that combines several tens of qubits.

The article was published in Nature Communications on August 9 and is titled "Erasure conversion for fault-tolerant quantum computing in alkaline earth Rydberg atom arrays."


~ Jai Krishna Ponnappan

Find Jai on Twitter | LinkedIn | Instagram


You may also want to read more about Quantum Computing here.


References And Further Reading:


  • Hilder, J., Pijn, D., Onishchenko, O., Stahl, A., Orth, M., Lekitsch, B., Rodriguez-Blanco, A., Müller, M., Schmidt-Kaler, F. and Poschinger, U.G., 2022. Fault-tolerant parity readout on a shuttling-based trapped-ion quantum computer. Physical Review X, 12(1), p.011032.
  • Nakazato, T., Reyes, R., Imaike, N., Matsuda, K., Tsurumoto, K., Sekiguchi, Y. and Kosaka, H., 2022. Quantum error correction of spin quantum memories in diamond under a zero magnetic field. Communications Physics, 5(1), pp.1-7.
  • Krinner, S., Lacroix, N., Remm, A., Di Paolo, A., Genois, E., Leroux, C., Hellings, C., Lazar, S., Swiadek, F., Herrmann, J. and Norris, G.J., 2022. Realizing repeated quantum error correction in a distance-three surface code. Nature, 605(7911), pp.669-674.
  • Ajagekar, A. and You, F., 2022. New frontiers of quantum computing in chemical engineering. Korean Journal of Chemical Engineering, pp.1-10.



Quantum Computing - Discovery Of Unexpected Features In Ta2NiSe5, A Complicated Quantum Material.

 




A recent study reveals previously unknown properties of Ta2NiSe5, a complex quantum material. 

These results, made possible by a new approach pioneered at Penn, have implications for the development of future quantum devices and applications. 

The study, published in Science Advances, was led by Professor Ritesh Agarwal and conducted by graduate student Harshvardhan Jog in collaboration with Penn's Eugene Mele and Luminita Harnagea of the Indian Institute of Science Education and Research. (The research paper is referenced below.)



While progress has been made in the area of quantum information science in recent years, quantum computers are still in their infancy. 


  • One problem is that present platforms can employ only a small number of "qubits," the units that carry out operations in a quantum computer, because they are not built to let many qubits "speak" to one another. 
  • To meet this challenge, materials must be efficient at quantum entanglement, which occurs when the states of qubits remain linked regardless of their distance from one another, as well as at coherence, that is, when a system can sustain this entanglement. 



Jog investigated Ta2NiSe5, a material system with strong electronic correlation, which makes it suitable for quantum devices. 




Strong electronic correlation refers to the link between a material's atomic structure and its electronic properties, as well as to strong interactions between electrons. 

To investigate Ta2NiSe5, Jog modified a technique from the Agarwal lab known as the circular photogalvanic effect, in which light carrying an electric field is used to probe various material properties. 

This approach, which has been developed and refined over many years, has revealed information about materials such as silicon and Weyl semimetals in ways that are not achievable with traditional physics and materials science research. 

But, as Agarwal points out, this method had only been used in materials without inversion symmetry, whereas Ta2NiSe5 has inversion symmetry. 

Jog "wanted to see if this technique could be used to study materials with inversion symmetry that, from a conventional sense, should not be producing this response," says Agarwal. 

After connecting with Harnagea to obtain high-quality samples of Ta2NiSe5, Jog and Agarwal applied a modified version of the circular photogalvanic effect and were surprised to see a signal appear. 

After further checks to make sure this was not a mistake or an experimental artifact, they worked with Mele to develop a theory that could help explain these surprising results. 





The difficulty in creating a theory, according to Mele, was that what was anticipated about the symmetry of Ta2NiSe5 did not match the experimental data. 




They were then able to explain these results after finding an earlier theoretical work indicating that the symmetry of the material was lower than had been assumed. 

"We recognized that if there was a low-temperature phase when the system spontaneously shears, that would do it," Mele adds. 

By combining their expertise in experiment and theory, the researchers established that this material has a broken symmetry, a finding that was critical to the project's success and that has major implications for the use of this and other materials in future devices. 

This is because symmetry is essential for classifying phases of matter and, ultimately, for determining their downstream properties. 


These findings may also be used to uncover and describe comparable features in other kinds of materials. 




We now have a technology that can detect even the most minute symmetry breaks in crystalline materials. 


Symmetries have enormous ramifications, and they must be taken into account in order to understand any complex material.

While there is still a "long way to go" before Ta2NiSe5 can be used in quantum devices, the researchers are already working to better understand this phenomenon. 

In the lab, Jog and Agarwal are interested in searching for possible topological properties in additional energy levels within Ta2NiSe5, as well as in using the circular photogalvanic technique to examine other correlated systems for similar properties. 

Mele is investigating how widespread this phenomenon is across different material systems and developing recommendations for new materials for experimentalists to study. 

"What we're seeing here is a reaction that shouldn't happen but does in this situation," Mele adds. 




"Expanding the area of structures available to you, where you may activate these effects that are ostensibly disallowed, is critical. It's not the first time something has occurred in spectroscopy, but it's always intriguing when it occurs." 


This work not only introduces the research community to a new tool for studying complex crystals, but it also sheds light on the types of materials that can provide two key features, entanglement and macroscopic coherence, which are critical for future quantum applications ranging from medical diagnostics to low-power electronics and sensors. 

"The long-term aim, and one of the most important goals in condensed matter physics," adds Agarwal, "is to be able to comprehend these highly entangled states of matter because these materials can conduct a lot of intricate modeling." "

It's possible that if we can figure out how to comprehend these systems, they'll become natural platforms for large-scale quantum simulation."



References:


Harshvardhan Jog et al, Exchange coupling–mediated broken symmetries in Ta2NiSe5 revealed from quadrupolar circular photogalvanic effect, Science Advances (2022). DOI: 10.1126/sciadv.abl9020

Jog, H., Harnagea, L., Mele, E. and Agarwal, R., 2021. Circular photogalvanic effect in inversion symmetric crystal: the case of ferro-rotational order in Ta2NiSe5. Bulletin of the American Physical Society, 66.



~ Jai Krishna Ponnappan


You may also want to read more about Quantum Computing here.





State Of An Emerging Quantum Computing Technology Ecosystem And Areas Of Business Applications.






    Quantum Computing Hardware.


    The ecosystem's hardware is a major barrier. The problem is both technical and structural in nature. 


    • The first issue is growing the number of qubits in a quantum computer while maintaining a high degree of qubit quality. 
    • Hardware has a high barrier to entry because it requires a rare mix of capital, competence in experimental and theoretical quantum physics, and deep domain knowledge of the possible implementation routes. 

    Several quantum-computing hardware platforms are presently in the works. 



    The realization of completely error-corrected, fault-tolerant quantum computing will be the most significant milestone, since a quantum computer cannot give precise, mathematically accurate outputs without it. 



    • Experts disagree over whether quantum computers can provide substantial commercial value before they are fully fault tolerant. 
    • Many argue, however, that a lack of fault tolerance does not render quantum-computing systems unworkable. 



    When will fault tolerance arrive, that is, when will viable fault-tolerant quantum-computing systems be produced? 


    Most hardware companies are cautious about publishing their development road maps, although a handful have done so openly. 

    Five manufacturers have said that they will have fault-tolerant quantum-computing hardware by 2030. 

    If this timeframe holds true, the industry will most likely have established a distinct quantum advantage for many applications by then. 




    Quantum Computing Software.


    The number of software-focused start-ups is growing faster than any other part of the quantum-computing value chain. 


    • Software players today provide bespoke services and aim to offer turnkey services as the industry matures. 
    • As quantum-computing software develops, organizations will be able to upgrade their software tools and eventually adopt fully quantum tools. 
    • Quantum computing, in the meantime, requires a new programming paradigm, as well as a new software stack; a minimal sketch of this programming model appears after this list. 
    • The bigger industry players often distribute their software-development kits for free in order to foster developer communities around their products. 

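    To make the "new programming paradigm" point concrete, here is a minimal, self-contained sketch of the gate-level programming model these software stacks expose. It is plain Python with NumPy, not any vendor's actual SDK, and it simulates the canonical first program: preparing an entangled Bell pair.

        import numpy as np

        # Single-qubit gates as matrices.
        H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard
        I = np.eye(2)

        # CNOT on two qubits (control = qubit 0), basis order |00>,|01>,|10>,|11>.
        CNOT = np.array([[1, 0, 0, 0],
                         [0, 1, 0, 0],
                         [0, 0, 0, 1],
                         [0, 0, 1, 0]])

        # Start in |00>, apply H to qubit 0, then CNOT: a program is just
        # a sequence of linear operations applied to the quantum state.
        state = np.array([1, 0, 0, 0], dtype=complex)
        state = np.kron(H, I) @ state
        state = CNOT @ state

        print(np.round(state.real, 3))  # [0.707 0. 0. 0.707]: the Bell state
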


    Quantum Computing Cloud-Based Services. 


    In the end, cloud-based quantum-computing services may become the most valuable part of the ecosystem, and those who operate them may reap substantial rewards. 


    • Most cloud-service providers now give access to quantum computers on their platforms, allowing prospective customers to try out the technology. 
    • Because personal or mobile quantum computing is unlikely this decade, early users will have to rely on the cloud to get a taste of the technology before the wider ecosystem matures. 



    Ecosystem of Quantum Computing.




    The foundations for a quantum-computing business have started to take shape. 

    According to our analysis, the value at stake for quantum-computing businesses is close to $80 billion (not to be confused with the value that quantum-computing use cases could generate). 



    Private And Public Funding For Quantum Computing




    Because quantum computing is still a relatively young field, the bulk of funding for fundamental research currently comes from governments. 

    Private financing, on the other hand, is fast expanding. 


    Investments in quantum computing start-ups have topped $1.7 billion in 2021 alone, more than double the amount raised in 2020. 

    • As quantum-computing commercialization gathers steam, I anticipate that private funding will increase dramatically. 
    • If leaders prepare now, a blossoming quantum-computing ecosystem and developing commercial use cases promise to produce enormous value for many industries. 



    Rapid advances in quantum computing serve as potent reminders that the technology is fast approaching commercial viability. 


    • For example, a Japanese research institute recently revealed a breakthrough in entangling qubits (quantum's fundamental unit of information, equivalent to bits in conventional computers) that might enhance error correction in quantum systems and pave the way for large-scale quantum computers. 
    • In addition, an Australian business has created software that has been demonstrated to boost the performance of any quantum-computing hardware in trials. 
    • Investment funds are flowing in, and quantum-computing start-ups are proliferating as advances accelerate. 
    • Quantum computing is still being developed by major technological firms, with Alibaba, Amazon, IBM, Google, and Microsoft having already introduced commercial quantum-computing cloud services. 


    Of course, all of this effort does not always equate to commercial success. 



    While quantum computing has the potential to help organizations tackle challenges that are beyond the reach and speed of traditional high-performance computers, application cases are still mostly experimental and conceptual. 


    • Indeed, academics are still debating the field's most fundamental questions (for more on these unresolved questions, see the sidebar "Quantum Computing Debates"). 
    • Nonetheless, the activity suggests that CIOs and other executives who have been keeping an eye on quantum-computing developments can no longer remain mere spectators. 
    • Leaders should begin to plan their quantum-computing strategy, especially in industries, such as pharmaceuticals, that might profit from commercial quantum computing early on. 
    • Change might arrive as early as 2030, as some firms anticipate that practical quantum technologies will be available by then. 


    I conducted extensive research and interviewed experts from around the world about quantum hardware, software, and applications; the emerging quantum-computing ecosystem; possible business use cases; and the most important drivers of the quantum-computing market to help leaders get started planning. 


    ~ Jai Krishna Ponnappan



    You may also want to read more about Quantum Computing here.






    Quantum Computing's Future Outlook.

     



    Corporate executives from all sectors should plan for quantum computing's development. 


    I predict that quantum-computing use cases will have a hybrid operating model that is a mix of quantum and traditional high-performance computing until about 2030. 


    • Quantum-inspired algorithms, for example, may improve traditional high-performance computers. 
    • In order to develop quantum hardware and enable greater, and more complex, use cases beyond 2030, intensive continuing research by private enterprises and governmental organizations will be required. 
    • The route to commercialization of the technology will be determined by six important factors: finance, accessibility, standards, industry consortia, talent, and digital infrastructure. 


    Organizations outside the quantum-computing industry should take five tangible measures to prepare for quantum computing's maturation: 


    • Keep up with industry advances and actively screen quantum-computing use cases, either with an in-house team of quantum-computing specialists or by partnering with industry organizations and joining a quantum-computing consortium. 
    • Recognize the most important risks, disruptions, and opportunities in their respective businesses. 
    • Consider partnering with or investing in quantum-computing players (mainly software) to make knowledge and expertise more accessible. 
    • Consider hiring quantum-computing experts in-house. Even a small team of up to three specialists may be sufficient to assist a company in exploring prospective use cases and screening potential quantum computing strategic investments. 
    • Build a digital infrastructure that can handle the fundamental operational needs of quantum computing, store important data in digital databases, and configure traditional computing processes to be quantum-ready whenever more powerful quantum hardware becomes available. 



    Every industry's leaders have a once-in-a-lifetime chance to keep on top of a generation-defining technology. 

    The reward might be strategic insights and increased company value.



    ~ Jai Krishna Ponnappan


    You may also want to read more about Quantum Computing here.





    Quantum Computing - Areas Of Application.

     




    Quantum simulation, quantum linear algebra for AI and machine learning, quantum optimization and search, and quantum factorization are the four most well-known application cases. 


    I go through them in detail here, as well as issues leaders should consider when evaluating prospective use cases. 

    I concentrate on prospective applications in a few areas that, according to studies, might profit the most in the near term from the technology: pharmaceuticals, chemicals, automotive, and finance. 

    The total value at stake for these sectors might be between $300 billion and $700 billion (a conservative estimate). 




    Chemicals


    Chemicals may benefit from quantum computing for R&D, manufacturing, and supply-chain optimization. 


    • Consider how quantum computing may be utilized to enhance catalyst designs in the manufacturing process. 
    • New and improved catalysts, for example, could allow existing production processes to save energy—a single catalyst can increase efficiency by up to 15%—and innovative catalysts could allow for the replacement of petrochemicals with more sustainable feedstocks or the breakdown of carbon for CO2 usage. 

    A realistic 5 to 10% efficiency boost in the chemicals sector, which spends $800 billion on production each year (half of which depends on catalysis), would result in a $20 billion to $40 billion gain in value. 
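
    As a back-of-envelope check of that arithmetic, here is a sketch using only the figures quoted above:

        # Value-at-stake estimate for chemicals, using the figures above.
        production_spend = 800e9   # annual production spend, USD
        catalysis_share = 0.5      # half of production depends on catalysis
        for gain in (0.05, 0.10):  # realistic 5-10% efficiency boost
            value = production_spend * catalysis_share * gain
            print(f"{gain:.0%} efficiency gain -> ${value / 1e9:.0f} billion")
        # prints $20 billion and $40 billion, matching the range above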





    Pharmaceuticals


    Quantum computing has the potential to improve the biopharmaceutical industry's research and development of molecular structures, as well as to provide value in manufacturing and further down the value chain. 


    • New medications, for example, cost an average of $2 billion and take more than 10 years to reach the market after discovery. 
    • Quantum computing has the potential to make R&D faster, more targeted, and more precise by reducing the reliance on trial and error in target identification, drug design, and toxicity assessment. 
    • A shorter R&D timeline could help deliver medications to the right patients sooner and more efficiently; in other words, it would enhance the quality of life of more people. 
    • Quantum computing might also improve production, logistics, and the supply chain. 


    While it's difficult to predict how much revenue or patient impact such advancements will have, in a $1.5 trillion industry with average EBIT margins of 16 percent (by our calculations), even a 1 to 5% revenue increase would result in $15 billion to $75 billion in additional revenue and $2 billion to $12 billion in EBIT. 
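
    The same kind of back-of-envelope sketch reproduces the pharmaceutical figures quoted above:

        # Revenue and EBIT uplift estimate for pharmaceuticals.
        industry_revenue = 1.5e12    # USD
        ebit_margin = 0.16           # average EBIT margin
        for uplift in (0.01, 0.05):  # 1-5% revenue increase
            revenue_gain = industry_revenue * uplift
            ebit_gain = revenue_gain * ebit_margin
            print(f"{uplift:.0%}: revenue +${revenue_gain/1e9:.0f}B, EBIT +${ebit_gain/1e9:.1f}B")
        # prints +$15B/+$2.4B and +$75B/+$12.0B, consistent with the text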




    Finance


    Quantum-computing applications in banking remain some way off, and the benefits of any near-term applications are speculative. 


    • However, I believe that portfolio and risk management are the most promising applications of quantum computing in finance. 
    • Quantum-optimized loan portfolios that concentrate on collateral, for example, might let lenders enhance their offerings by lowering interest rates and freeing up capital. 


    Although it is too early—and complicated—to evaluate the value potential of quantum computing–enhanced collateral management, the worldwide loan industry is estimated to be $6.9 trillion in 2021, implying that quantum optimization might have a substantial influence.



    Automobiles


    Quantum computing can help the automotive sector with R&D, product design, supply-chain management, manufacturing, mobility, and traffic management. 


    • By improving features such as route planning in complicated multirobot processes (the path a robot travels to perform a job), such as welding, gluing, and painting, the technology might, for example, reduce manufacturing process–related costs and cut cycle times. 


    Even a 2% to 5% increase in efficiency might provide $10 billion to $25 billion in annual value in an industry that spends $500 billion on manufacturing expenditures. 



    ~ Jai Krishna Ponnappan


    You may also want to read more about Quantum Computing here.





    Quantum Computing - What Is Quantum Chromodynamics (QCD)?







    Quantum Chromodynamics (QCD) is a physics theory that explains interactions mediated by the strong force, one of the four basic forces of nature. 


    It was developed as an analogue of Quantum Electrodynamics (QED), which describes interactions due to the electromagnetic force carried by photons. 



    Quantum chromodynamics (QCD) is the theory of the strong interaction between quarks, mediated by gluons; quarks are the basic particles that make up composite hadrons such as the proton, neutron, and pion. 

    QCD is a type of quantum field theory: a non-abelian gauge theory with the symmetry group SU(3). 




    The color attribute is the QCD equivalent of electric charge. 




    Gluons are the theory's force carriers, exactly as photons are in quantum electrodynamics for the electromagnetic force. 

    The theory is an essential part of the Standard Model of particle physics. 

    Over the years, a considerable amount of experimental data supporting QCD has accumulated. 



    How does the QCD scale work? 


    In quantum chromodynamics (QCD), the quantity Λ (Lambda) is known as the QCD scale. 

    The quoted value holds for three "active" quark flavours, that is, when the energy-momentum involved in the process permits only the up, down, and strange quarks, but not the heavier quarks, to be produced. 

    This corresponds to energies below about 1.275 GeV, the mass of the charm quark. 
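
    To illustrate how the QCD scale enters, here is a sketch of the textbook one-loop running coupling; the value Λ ≈ 0.33 GeV used below is an assumed, representative three-flavour value, not one taken from this article:

        import numpy as np

        # One-loop running of the strong coupling:
        #   alpha_s(Q^2) = 12*pi / ((33 - 2*n_f) * ln(Q^2 / Lambda^2))
        # with n_f active quark flavours and Lambda the QCD scale.
        def alpha_s(Q_GeV, Lambda_GeV=0.33, n_f=3):
            return 12 * np.pi / ((33 - 2 * n_f) * np.log(Q_GeV**2 / Lambda_GeV**2))

        # The coupling weakens as the energy grows (asymptotic freedom)
        # and blows up as Q approaches Lambda, where perturbation theory
        # fails (n_f = 3 is kept fixed here for simplicity).
        for Q in (0.5, 1.0, 1.275, 10.0):
            print(f"alpha_s({Q} GeV) = {alpha_s(Q):.3f}")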



    Who was the first to discover quantum chromodynamics? 



    One of the founders of quantum chromodynamics, Harald Fritzsch, recalls some of the background to the theory's development 40 years ago. 



    What is the Quantum Electrodynamics (QED) Theory? 


    Quantum electrodynamics (QED) is the quantum field theory of charged particles' interactions with electromagnetic fields. 

    It mathematically defines not just light's interactions with matter, but also the interactions of charged particles with one another. 

    Albert Einstein's theory of special relativity is integrated into each of QED's equations, making it a relativistic theory. 

    Because atoms and molecules are mainly electromagnetic in nature, all of atomic physics may be thought of as a test bed for the theory. 

    Some of the most precise tests of QED have been experiments on the behavior of subatomic particles known as muons. 

    The magnetic moment of the muon has been found to agree with theory to nine significant digits. 

    With such precision, QED is one of the most successful physics theories ever devised. 



    Recent Developments In The Investigation Of QCD


    A new collection of papers edited by Diogo Boito, Instituto de Fisica de Sao Carlos, Universidade de Sao Paulo, Brazil, and Irinel Caprini, Horia Hulubei National Institute for Physics and Nuclear Engineering, Bucharest, Romania, and published in The European Physical Journal Special Topics brings together recent developments in the investigation of QCD. 


    In a special introduction to the collection, the editors explain that the divergence of perturbation expansions in the mathematical description of a system can have important physical consequences, because the strong force described by QCD, carried by gluons between quarks, which form the fundamental building blocks of matter, has a much stronger coupling than the electromagnetic force. 


    The editors point out that, owing to developments in so-called higher-order loop computations, this has become more significant in recent high-precision QCD calculations. 


    "The fact that perturbative expansions in QCD are divergent greatly influences the renormalization scheme and scale dependency of the truncated expansions," write Boito and Caprini, "which provides a major source of uncertainty in the theoretical predictions of the standard model."

    "One of the primary problems for precision QCD to meet the needs of future accelerator facilities is to understand and tame this behavior.


    A cadre of specialists in the field discuss these and other themes pertaining to QCD, such as the mathematical theory of resurgence and the presence of infrared (IR) and ultraviolet (UV) renormalons, in the special issue. 

    These issues are approached from a range of perspectives, including a more basic viewpoint or phenomenological approach, and in the context of related quantum field theories.



    ~ Jai Krishna Ponnappan


    You may also want to read more about Quantum Computing here.



    Further Reading


    Diogo Boito et al, Renormalons and hyperasymptotics in QCD, The European Physical Journal Special Topics (2021). DOI: 10.1140/epjs/s11734-021-00276-w


    Quantum Computing Error Correction - Improving Encoding Redundancy Exponentially Drops Net Error Rate.




    Researchers at QuTech, a joint venture between TU Delft and TNO, have achieved a quantum error correction milestone. 

    They've combined high-fidelity operations on encoded quantum data with a scalable data stabilization approach. 

    The results are published in the December edition of Nature Physics. 


    Physical quantum bits, or qubits, are prone to errors, caused by quantum decoherence, crosstalk, and imperfect calibration, among other things. 



    Fortunately, quantum error correction theory suggests that it is possible to compute while simultaneously safeguarding quantum data from such defects. 


    "An error corrected quantum computer will be distinguished from today's noisy intermediate-scale quantum (NISQ) processors by two characteristics," explains QuTech's Prof Leonardo DiCarlo. 


    • "To begin, it will handle quantum data stored in logical rather than physical qubits (each logical qubit consisting of many physical qubits). 

    • Second, quantum parity checks will be interspersed with computing stages to discover and fix defects in physical qubits, protecting the encoded information as it is processed." 


    According to theory, if the rate of physical faults is below a threshold, and the circuits for logical operations and stabilization are fault tolerant, the logical error rate can be suppressed exponentially. 

    The essential principle is that as redundancy is increased and more qubits are used to encode the data, the net error rate decreases; a toy illustration follows below. 
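
    As a toy illustration of that principle (a classical sketch, not the surface-code scheme used in the paper), encoding one bit into d noisy copies and decoding by majority vote shows the net error rate falling rapidly with redundancy whenever the physical error rate is below threshold:

        import random

        # Encode one logical bit into d copies; flip each copy independently
        # with probability p; decode by majority vote.
        def logical_error_rate(p, d, trials=200_000):
            errors = 0
            for _ in range(trials):
                flips = sum(random.random() < p for _ in range(d))
                if flips > d // 2:   # majority of copies corrupted
                    errors += 1
            return errors / trials

        random.seed(1)
        for d in (1, 3, 5, 7):
            print(f"d={d}: logical error rate ~ {logical_error_rate(0.04, d):.5f}")
        # for a 4% physical error rate, the logical error rate drops by
        # roughly an order of magnitude with each increase in d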


    Researchers from TU Delft and TNO have recently achieved a crucial milestone toward this aim, producing a logical qubit made up of seven physical qubits (superconducting transmons). 


    "We demonstrate that the encoded data may be used to perform all calculation operations. 

    A important milestone in quantum error correction is the combination of high-fidelity logical operations with a scalable approach for repetitive stabilization " Prof. Barbara Terhal, also of QuTech, agrees. 


    Jorge Marques, the first author and Ph.D. candidate, goes on to say, 


    "Researchers have encoded and stabilized till now. We've now shown that we can also calculate. 

    This is what a fault-tolerant computer must finally do: handle data while also protecting it from faults. 

    We do three sorts of logical-qubit operations: initializing it in any state, changing it using gates, and measuring it. We demonstrate that all operations may be performed directly on encoded data. 

    We find that fault-tolerant versions perform better than non-fault-tolerant variants for each category.



    Fault-tolerant processes are essential for preventing physical-qubit faults from becoming logical-qubit errors. 

    DiCarlo underlines the work's interdisciplinary nature: it is a collaboration between his experimental physics group, Barbara Terhal's theoretical physics group, and colleagues at TNO and external partners working on electronics. 


    IARPA and Intel Corporation are the primary funders of the project.


    "Our ultimate aim is to demonstrate that as we improve encoding redundancy, the net error rate drops exponentially," DiCarlo says. 

    "Our present concentration is on 17 physical qubits, and we'll move on to 49 in the near future. 

    Our quantum computer's architecture was built from the ground up to allow for this scalability."


    ~ Jai Krishna Ponnappan


    You may also want to read more about Quantum Computing here.



    Further Reading:


    J. F. Marques et al, Logical-qubit operations in an error-detecting surface code, Nature Physics (2021). DOI: 10.1038/s41567-021-01423-9


    Abstract:

    "Future fault-tolerant quantum computers will require storing and processing quantum data in logical qubits. 
    Here we realize a suite of logical operations on a distance-2 surface code qubit built from seven physical qubits and stabilized using repeated error-detection cycles. 
    Logical operations include initialization into arbitrary states, measurement in the cardinal bases of the Bloch sphere and a universal set of single-qubit gates. 
    For each type of operation, we observe higher performance for fault-tolerant variants over non-fault-tolerant variants, and quantify the difference. 
    In particular, we demonstrate process tomography of logical gates, using the notion of a logical Pauli transfer matrix. 
    This integration of high-fidelity logical operations with a scalable scheme for repeated stabilization is a milestone on the road to quantum error correction with higher-distance superconducting surface codes."


