Quantum Computing Hype Cycle



    Context: Quantum computing has been classified as an emerging technology since 2005.





    Because quantum computing has sat on the up-slope of the Gartner Hype Cycle for more than ten years, it is arguably the most costly and hardest-to-comprehend of the new technologies. 



    At the heart of quantum computing is the idea that theoretical computing techniques cannot be isolated from the physics that governs computing devices.





    Quantum physics, in particular, introduces a new paradigm for computer science, fundamentally changing our understanding of information processing and what we previously believed to be the upper limits of computing.



    If quantum mechanics governs nature, we should be able to simulate it using quantum computers. 

    This executive summary depicts the next generation of computing.




     

    Quantum Computing On The Hype Cycle.


    Since Gartner first placed quantum computing on its hype cycle, pundits have predicted that it will take over and permanently change the world. 

    Although it's safe to argue that quantum computers might spell the end of traditional cryptography, the truth will most likely be less dramatic. 

    This has obvious ramifications for technologies like blockchain, which are expected to power future financial systems. 

    While the Bitcoin system, for example, is expected to keep traditional mining computers busy until 2140, a quantum computer could potentially mine every remaining token almost instantly using brute-force decoding. 
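The 2140 figure follows from Bitcoin's halving schedule: the block subsidy starts at 50 BTC and is cut in half, in integer satoshis, every 210,000 blocks, with a new block arriving roughly every ten minutes. A rough back-of-the-envelope sketch, assuming those protocol constants:

```python
# Count halvings until the integer subsidy reaches zero, then convert the
# resulting block height to an approximate calendar year.
HALVING_INTERVAL = 210_000           # blocks between halvings
MINUTES_PER_BLOCK = 10               # average block time
GENESIS_YEAR = 2009

subsidy = 50 * 100_000_000           # initial reward in satoshis
halvings = 0
while subsidy > 0:
    subsidy //= 2                    # integer halving, as in the protocol
    halvings += 1

last_subsidy_block = halvings * HALVING_INTERVAL
years = last_subsidy_block * MINUTES_PER_BLOCK / (60 * 24 * 365.25)
print(last_subsidy_block, GENESIS_YEAR + int(years))  # ~block 6,930,000, ~2140
```

This is only an estimate: actual block times drift, and the protocol's difficulty adjustment keeps them near, not exactly at, ten minutes.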



    More powerful digital ledger technologies based on quantum cryptography might level the playing field. 




    All of this assumes that quantum computing will become widely accessible and inexpensive. As things stand, this seems feasible. 

    Serious computer companies such as IBM, Honeywell, Google, and Microsoft, as well as younger specialty startups, are all working on putting quantum computing in the cloud right now and welcoming participation from the entire computing community. 

    To assist novice users, introductory packs and development kits are provided. 

    These are significant steps forward that will very probably accelerate progress as users develop more diversified and demanding workloads and find out how to handle them with quantum technology. 

    The predicted democratizing impact of universal cloud access is also significant: it should bring more individuals, from a wider diversity of backgrounds, into contact with quantum computing to comprehend, utilize, and influence its continued development. 




    Although it has arrived, quantum computing is still in its infancy. 


    • Commercial cloud services might enable inexpensive access in the future, much as scientific and banking institutions today hire cloud AI applications to do complicated tasks billed by the amount of compute used. 
    • To diagnose genetic problems in newborns, for example, hospitals are using genome-sequencing applications hosted on AI accelerators in hyperscale data centers. The procedure is inexpensive, and the findings are available in minutes, allowing physicians to intervene quickly and possibly save lives. 
    • Quantum computing as a service has the potential to improve healthcare and a variety of other sectors, including materials science. 
    • Simulating a coffee molecule, for example, is very challenging with a traditional computer, requiring more than 100 years of processing time. The work can be completed in seconds by a quantum computer. 
    • Climate analysis, transit planning, biology, financial services, encryption, and codebreaking are some of the other areas that might benefit. 
    • Quantum computing, for all its potential, isn't going to replace traditional computing or turn the world on its head. 
    • Quantum bits (qubits) can hold exponentially more information than traditional binary bits because they can exist in a superposition of the states 0 and 1, whereas a binary bit can only be in one state at a time. 
    • Quantum computers, on the other hand, are only suitable for specific kinds of algorithms, since the state of a qubit when measured is determined by chance. Other workloads are best handled by traditional computers. 
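The superposition and probabilistic-measurement points above can be made concrete with a toy state-vector simulation. The `measure` helper below is hypothetical illustration code, not a real quantum API; a minimal sketch in plain Python:

```python
import math
import random

# A qubit's state is a pair of amplitudes (a, b) with |a|^2 + |b|^2 = 1.
# Measurement yields outcome i with probability |amplitude_i|^2 -- the
# "determined by chance" point above.
def measure(amplitudes):
    """Collapse a state vector to one basis state, chosen by |amplitude|^2."""
    r = random.random()
    cumulative = 0.0
    for index, amp in enumerate(amplitudes):
        cumulative += abs(amp) ** 2
        if r < cumulative:
            return index
    return len(amplitudes) - 1

# An equal superposition of 0 and 1: "both states at once".
plus = [1 / math.sqrt(2), 1 / math.sqrt(2)]

# n qubits require 2**n amplitudes to describe -- the exponential capacity,
# and why simulating molecules on classical hardware blows up.
n = 30
print(2 ** n)  # over a billion amplitudes for just 30 qubits

# Repeated measurements of the same state give random 0/1 outcomes,
# roughly half each.
counts = {0: 0, 1: 0}
for _ in range(10_000):
    counts[measure(plus)] += 1
print(counts)
```

The same exponential bookkeeping explains the coffee-molecule bullet: a full classical simulation must track every amplitude, while the quantum device simply is the state.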





    Quantum computing will take more than a decade to reach the Plateau of Productivity.




    Because of the massive efficiency it delivers at scale, quantum computing has caught the attention of technological leaders. 

    However, it will take years to mature for most applications, even if it makes limited progress in highly specialized sectors like materials science and cryptography in the near future. 


    Quantum approaches, on the other hand, are gaining traction with specific AI tools, as seen in recent advancements in natural language processing that could potentially break open the "black box" of today's neural networks. 




    • The toolkit, known as lambeq, is a conventional Python library hosted on GitHub. 
    • It coincides with the arrival at Cambridge Quantum of well-known AI and NLP researchers, and provides an opportunity for hands-on QNLP experience. 
    • The lambeq software is designed to turn sentences into quantum circuits, providing a fresh perspective on text mining, language translation, and bioinformatics corpora. It is named after the late semantics scholar Joachim Lambek. 
    • According to Bob Coecke, principal scientist at Cambridge Quantum, NLP may give explainability not feasible in today's "bag of words" neural techniques run on conventional computers. 





    These patterns, as shown in diagrams, resemble sentences parsed on elementary-school blackboards. 

    Coecke said that current NLP approaches "don't have the capacity to assemble things together to discover a meaning." 


    "What we want to do is introduce compositionality in the traditional sense, which means using the same compositional framework. We want to reintroduce logic." 

    Honeywell announced earlier this year that it would merge its own quantum computing operations with Cambridge Quantum to form an independent company to pursue cybersecurity, drug discovery, optimization, material science, and other applications, including AI, as part of its efforts to expand quantum infrastructure. 

    Honeywell claimed the new operation will cost between $270 million and $300 million to build. 


    Cambridge Quantum said that it will stay autonomous while collaborating with a variety of quantum computing companies, including IBM. 

    In an e-mail conversation, Cambridge Quantum founder and CEO Ilyas Khan said that the lambeq work is part of a larger AI project that is the company's longest-term initiative. 

    "In terms of timetables, we may be pleasantly surprised, but we feel that NLP is at the core of AI in general, and thus something that will truly come to the fore as quantum computers scale," he added. 

    In Cambridge Quantum's opinion, the most advanced application areas are cybersecurity and quantum chemistry. 





    What kind of quantum hardware timetable can we expect in the future? 




    • There is well-informed agreement not only on the hardware roadmap but also on the software roadmap (Honeywell and IBM are among the credible corporate players in this regard). 
    • Quantum computing is not a general-purpose technology; we cannot utilize quantum computing to solve all of our existing business challenges.
    • According to Gartner's Hype Cycle for Computing Infrastructure for 2021, quantum computing will take more than ten years to reach the Plateau of Productivity. 
    • That is where the analytics company expects IT users to get the most out of a given technology. 
    • Quantum computing's current position on Gartner's Peak of Inflated Expectations — a categorization for emerging technologies that are deemed overhyped — is the same as it was in 2020.


    ~ Jai Krishna Ponnappan
