
What Is A QPU?






    What is a Quantum Processing Unit (QPU)? 



    Despite its widespread use, the phrase "quantum computer" may be misleading. 



    It conjures up images of an entirely new and alien kind of computer, one that will replace all current computing with a futuristic alternative.




    • At the time of writing, this is a widespread, and major, misunderstanding.
    • The power of quantum computers comes from their capacity to significantly expand the range of problems that are computationally tractable, not from being a conventional-computer killer.
    • There are important computational problems that a quantum computer can readily solve but that would be effectively impossible on any conventional computing device we could ever hope to construct.





    But, importantly, these sorts of speedups have only been observed for a handful of problems, and although more are expected to be discovered, it is very unlikely that performing all computations on a quantum computer would ever make sense.



    For the vast majority of activities that use your laptop's clock cycles, a quantum computer is no better. 



    In other words, a quantum computer is actually a co-processor from the standpoint of the programmer. 


    • Computers have long used a variety of coprocessors, each with its own specialty, such as floating-point arithmetic, signal processing, and real-time graphics.
    • With this in mind, we'll refer to the device on which our code samples run as a QPU (Quantum Processing Unit).

    This, we believe, emphasizes the critical context in which quantum computing should be considered. 



    A quantum processing unit (QPU), sometimes known as a quantum chip, is a physical (fabricated) device with a network of linked qubits. 


    • It's the cornerstone of a complete quantum computer, which also comprises the QPU's housing environment, control circuits, and a slew of other components.




    Programming for a QPU











    As with other coprocessors such as the GPU (Graphics Processing Unit), programming a QPU entails writing code that will mainly run on a regular computer's CPU (Central Processing Unit).


    • The CPU sends the QPU coprocessor instructions only to launch tasks suited to the QPU's capabilities.
    • Fortunately (and excitingly), a few prototype QPUs are already available and may be accessed through the cloud as of this writing.
    • Furthermore, conventional computing hardware can be used to emulate the behavior of a small QPU, as sketched below.
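    A minimal sketch of what such emulation looks like, using plain Python with NumPy (the state and gate handling here is illustrative, not tied to any vendor's SDK): the quantum state is stored as a complex vector, and a gate is applied as a matrix-vector product.

```python
import numpy as np

# A single qubit's state is a length-2 complex vector; |0> = [1, 0].
state = np.array([1.0, 0.0], dtype=complex)

# The Hadamard gate maps a basis state to an equal superposition.
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

state = H @ state  # applying a gate = multiplying the state vector

# Born rule: measurement probabilities are the squared amplitude magnitudes.
print(np.abs(state) ** 2)  # [0.5, 0.5] -> equal odds of reading 0 or 1
```

    Real simulators work the same way in principle; the catch is that the state vector doubles in size with every added qubit, which is why emulating larger QPU programs quickly becomes impractical.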







    Although emulating larger QPU programs is impractical, emulation is a handy way to learn how to operate a real QPU using smaller code snippets.


    • Even when more complex QPUs become available, the fundamental QPU code examples will remain both useful and instructive. 
    • There are a plethora of QPU simulators, libraries, and systems to choose from.




    Quantum Processing Units (QPUs) Make Quantum Computing Possible.



    A quantum processing unit (QPU) is a physical or virtual processor with a large number of linked qubits that may be used to calculate quantum algorithms. 


    • A quantum computer or quantum simulator would not be complete without it.
    • Quantum devices are still in their infancy, and not all of them can run every Q# program.
    • As a result, when writing programs for different targets, you must keep certain constraints in mind.
    • Quantum mechanics, the physics of matter and energy at the atomic scale, is used to create this computer architecture.



    Quantum computing is a world apart from traditional computing ("classical computing"). 


    • It can address only a limited class of problems, all of which must be formulated mathematically and expressed as equations.
    • Quantum computation imitates nature at the atomic level, and one of its most promising applications is the investigation of molecular interactions in order to unravel nature's secrets.



    The first quantum computers were demonstrated in 1998 at Oxford University and IBM's Almaden Research Center.


    • There were around a hundred functional quantum computers across the globe by 2020. 
    • Due to the exorbitant expense of creating and maintaining quantum computers, quantum computing will most likely be delivered as a cloud service rather than as hardware for enterprises to purchase. We'll have to wait and see. 




    Quantum coprocessor and quantum cloud are two terms for the same thing. 



    Because data grows at such a rapid rate, even the fastest supercomputers face a slew of problems.


    • Consider the classic traveling salesman problem, which entails determining the most cost-effective round trip between locations.
    • The first stage is to enumerate all feasible routes, which yields a 63-digit number if the journey involves 50 cities (verified in the snippet below).
    • Whereas traditional computers may take days or even months to tackle such problems, quantum computers are projected to respond in seconds or minutes.
    • Analogies such as the rice-and-chessboard legend illustrate the exponential growth behind these numbers, and ideas such as quantum teleportation and binary values recur throughout discussions of quantum supremacy.
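    The 63-digit claim above is easy to check: fixing the starting city leaves 49! orderings of the remaining cities, and dividing by 2 counts a tour and its reverse as the same route. A quick sketch:

```python
import math

# Distinct round trips through 50 cities: fix the start city (49! orderings),
# then count each tour once regardless of direction (divide by 2).
routes = math.factorial(49) // 2
print(routes)            # ~3.04 * 10**62
print(len(str(routes)))  # 63 -> indeed a 63-digit number
```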



    Superposition and Entanglement of Qubits. 



    Quantum computing relies on the "qubit," or quantum bit, which may be physically realized in a variety of ways, for example using the spin states of electrons.


    • The condition that permits a qubit to be in several states at the same time is known as quantum superposition.
    • Entanglement is a property that correlates one particle with another even across a long distance.
    • The two major kinds of quantum computer designs are the gate model and quantum annealing.

     



    Gate Model QC

     


    "Quality Control Model" : 

    Gate-model quantum computers have gates that are similar in principle to classical logic gates but significantly different in logic and design.


    • Google, IBM, Intel, and Rigetti are among the companies working on gate-model machines, each with its own qubit architecture.
    • Microwave pulses are used to control the qubits in the quantum device.
    • The QC chip handles the digital-to-analog and analog-to-digital conversion.



    IBM's Q Experience on the Cloud


    • In 2016, IBM released a cloud-based 5-qubit gate model quantum computer to enable scientists to experiment with gate model programming. 
    • A collection of instructional resources is available as part of the IBM Q Experience.


    Superconducting materials


    • Superconducting materials, like those employed in the D-Wave computer, must be kept at temperatures near absolute zero; with the coverings removed, the quantum chip sits at the bottom of the apparatus.
    • Intel's Tangle Lake gate-model quantum processor, featuring a novel design of single-electron transistors linked together, was introduced in 2018.
    • At CES 2018, Intel CEO Brian Krzanich demonstrated the processor.



    D-Wave Systems


    D-Wave Systems in Canada is the only company that provides a "quantum annealing" computer. 


    • D-Wave computers are massive, cryogenically cooled machines with up to 2,000 qubits that are used for optimization tasks including scheduling, financial analysis, and medical research.
    • To solve a problem, annealing is used to find the best path or the most efficient combination of parameters.



    D-Wave's newest quantum annealing processor has 5,000 qubits.


    • A cooling mechanism is required, just as for gate-model quantum computers.
    • Liquid nitrogen and liquid helium stages, from top to bottom, take the temperature all the way down to minus 459 degrees Fahrenheit, just above absolute zero.



    Algorithms for Quantum Computing. 


    Because new algorithms shape the design of the next generation of quantum architectures, algorithms for addressing real-world problems must be devised first.


    • Both the gate model and the annealing processes have challenges to overcome. 
    • However, experts anticipate that quantum computing will become commonplace in the near future. 


    State of Quantum Computing


    Quantum computers are projected to eventually factor large numbers and break cryptographic secrets in a couple of seconds. 


    • It is just a matter of time, according to scientists, until this becomes a reality.
    • When it does, it will have grave consequences, since every encrypted transaction, as well as every current cryptocurrency system, will be exposed to attackers.
    • Quantum-safe approaches, on the other hand, are already being developed; quantum-secure encryption is one example.


    The United States, Canada, Germany, France, the United Kingdom, the Netherlands, Russia, China, South Korea, and Japan are the nations that are studying and investing in quantum computing as of 2020. 


    The field of quantum computing is still in its infancy. 

    The eight-ton UNIVAC I of the 1950s evolved into a chip within a few decades, which begs the question of what quantum computers will look like in 50 years.




    ~ Jai Krishna Ponnappan


    You may also want to read more about Quantum Computing here.





    Quantum Computers A Step Closer To Reality



    Engineers make a significant advancement in the design of quantum computers. 



    A significant roadblock to quantum computers becoming a reality has been overcome thanks to quantum engineers from UNSW Sydney. 


    • They developed a novel method that they claim will allow them to control millions of spin qubits, the fundamental units of information in a silicon quantum processor.
    • Until now, quantum computer engineers and scientists have only been able to demonstrate control of a few qubits in proof-of-concept models of quantum processors.
    • However, the team has discovered what they call "the missing jigsaw piece" in quantum computer architecture, which should allow them to control the millions of qubits required for very complex computations, according to their new study, published today in Science Advances.
    • Dr. Jarryd Pla, a professor at UNSW's School of Electrical Engineering and Telecommunications, says his research group wanted to solve a problem that had plagued quantum computer scientists for decades: how to control millions of qubits without taking up valuable space with additional wiring, which consumes more electricity and generates more heat.



    "Controlling electron spin qubits depended on our providing microwave magnetic fields by sending a current through a wire directly near the qubit up to this point," Dr. Pla explains. 


    • "If we want to scale up to the millions of qubits that a quantum computer would require to tackle globally important issues like the creation of new vaccines, this presents some serious difficulties." 
    • To begin with, magnetic fields diminish rapidly with distance, so we can only control the qubits that are nearest to the wire. 
    • As we brought in more and more qubits, we'd need to add more and more wires, which would take up a lot of space on the chip." 
    • And, since the device must function at temperatures below -270°C, Dr. Pla claims that adding additional wires will create much too much heat in the chip, jeopardizing the qubits' stability. 
    • "With this wiring method, we're only able to manage a few qubits," Dr. Pla explains. 




    A thorough rethinking of the silicon chip structure was required to solve this issue. 


    • Rather than putting thousands of control lines on a tiny silicon chip with millions of qubits, the researchers investigated the possibility of using a magnetic field generated above the chip to operate all of the qubits at the same time.
    • The idea of controlling all qubits simultaneously was first proposed by quantum computing researchers in the 1990s, but until now, no one had figured out how to do it practically.
    • "After removing the wire next to the qubits, we devised a new method for delivering microwave-frequency magnetic control fields throughout the device. In theory, we could send control fields to as many as four million qubits," says Dr. Pla.



    Dr. Pla and his colleagues inserted a crystal prism called a dielectric resonator immediately above the silicon chip.


    • When microwaves are directed into the resonator, their wavelength is reduced dramatically (a rough calculation follows below).
    • "Because the dielectric resonator shrinks the wavelength to below one millimeter, we now have a highly efficient conversion of microwave power into the magnetic field that controls the spins of all the qubits." This brings two key advantages:

      • "The first is that we don't need a lot of power to create a strong driving field for the qubits, which means we don't generate a lot of heat.
      • The second is that the field is very uniform throughout the device, ensuring that millions of qubits see the same degree of control."
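    As a rough illustration of the wavelength claim (the frequency and permittivity below are assumed example values, not figures from the UNSW paper): inside a dielectric, a microwave's wavelength shrinks by the square root of the material's relative permittivity.

```python
import math

c = 3.0e8       # speed of light in vacuum, m/s
f = 40e9        # assumed microwave frequency: 40 GHz (illustrative)
eps_r = 300.0   # assumed relative permittivity of the resonator crystal

wavelength_vacuum = c / f                                     # 7.5 mm in free space
wavelength_dielectric = wavelength_vacuum / math.sqrt(eps_r)  # shrunk by sqrt(eps_r)

print(f"{wavelength_dielectric * 1e3:.2f} mm")  # ~0.43 mm: below one millimeter
```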



    Despite the fact that Dr. Pla and his team had created a prototype resonator technology, they lacked the silicon qubits with which to test it.


    • So he spoke to his UNSW engineering colleague, Scientia Professor Andrew Dzurak, whose team had demonstrated the first and most precise quantum logic built with the same silicon fabrication process as conventional computer chips over the previous decade.
    • "When Jarryd presented me with his new concept, I was absolutely blown away," Prof. Dzurak recalls, "and we immediately went to work to see how we might combine it with the qubit devices that my team has created. Two of our top Ph.D. students, Ensar Vahapoglu from my team and James Slack-Smith from Jarryd's, were assigned to the project."



    "When the experiment turned out to be a success, we were ecstatic. This issue of controlling millions of qubits had been bothering me for a long time, since it was a significant stumbling block in the development of a full-scale quantum computer." 


    • Quantum computers with thousands of qubits capable of addressing business problems, once just a pipe dream in the 1980s, may now be less than a decade away.
    • In addition, because of their capacity to simulate very complex systems, they are anticipated to bring fresh firepower to addressing global problems and creating new technologies.



    Quantum computing technology has the potential to help climate change, medicine and vaccine development, code decryption, and artificial intelligence. 


    • The team's next goal is to use this new technique to simplify the design of near-term silicon quantum computers.
    • "Removing the on-chip control wire makes room for additional qubits and the rest of the components needed to build a quantum processor.
    • It simplifies the task of moving on to the next stage of manufacturing devices with tens of qubits," says Prof. Dzurak.
    • "While there are still technical hurdles to overcome before computers with a million qubits can be built," Dr. Pla adds, "we are thrilled that we now have a way to control them."



    ~ Jai Krishna Ponnappan

    You may also want to read more about Quantum Computing here.




    Quantum Computing Hype Cycle



      Context: Quantum computing has been classified as an emerging technology since 2005.





      Because quantum computing has been on the up-slope of the Gartner Hype Cycle for more than 10 years, it is arguably the most costly and hardest-to-comprehend emerging technology.


      Quantum computing has been classified as an emerging technology since 2005, and it is still classified as such.

      The idea that theoretical computing techniques cannot be isolated from the physics that governs computing devices is at the heart of quantum computing.





      Quantum physics, in particular, introduces a new paradigm for computer science that fundamentally changes our understanding of information processing and what we previously believed to be the upper limits of computing.



      If quantum mechanics governs nature, we should be able to mimic it using QCs. 





       

      Quantum Computing On The Hype Cycle.


      Since Gartner first placed quantum computing on its Hype Cycle, pundits have predicted that it will take over and permanently change the world.

      Although it's safe to argue that quantum computers might mark the end for traditional cryptography, the truth will most likely be less dramatic. 

      This has obvious ramifications for technologies like blockchain, which are expected to power future financial systems.

      While the Bitcoin system, for example, is expected to keep traditional mining computers busy until 2140, a quantum computer could potentially mine every token almost instantly by brute-force decoding.



      More powerful digital ledger technologies based on quantum cryptography might level the playing field.




      All of this assumes that quantum computing will become widely accessible and inexpensive. As things are, this seems to be feasible. 

      Serious computer companies such as IBM, Honeywell, Google, and Microsoft, as well as younger specialty startups, are all working on putting quantum computing in the cloud right now and welcoming participation from the entire computing community. 

      To assist novice users, introduction packs and development kits are provided. 

      These are significant steps forward that will very probably accelerate progress as users develop more diversified and demanding workloads and find out how to handle them with quantum technology. 

      The predicted democratizing impact of universal cloud access, which should bring more individuals from a wider diversity of backgrounds into touch with quantum to comprehend, utilize, and influence its continued development, is also significant. 




      Despite the fact that it has arrived, quantum computing is still in its infancy. 


      • Commercial cloud services might enable inexpensive access in the future, similar to how scientific and banking institutions today can hire cloud AI applications to perform complicated tasks billed by the amount of compute used.
      • To diagnose genetic problems in newborns, hospitals, for example, are using genome sequencing applications hosted on AI accelerators in hyperscale data centers. The procedure is inexpensive, and the findings are available in minutes, allowing physicians to intervene quickly and possibly save lives.
      • Quantum computing as a service has the potential to improve healthcare and a variety of other sectors, including materials science.
      • Simulating a caffeine molecule, for example, is very challenging with a traditional computer, requiring more than 100 years of processing time. The work could be completed in seconds by a quantum computer.
      • Climate analysis, transit planning, biology, financial services, encryption, and codebreaking are some of the other areas that might benefit.
      • Quantum computing, for all its potential, isn't coming to replace traditional computing or turn the world on its head.
      • Quantum bits (qubits) may hold exponentially more information than traditional binary bits, since qubits can be in both states, 0 and 1, at once, whereas binary bits can be in only one state.
      • Quantum computers, on the other hand, suit only certain kinds of algorithms, since a qubit's state when measured is determined by chance. Other algorithms are best handled by traditional computers.





      Quantum computing will take more than a decade to reach the Plateau of Productivity.




      Because of the massive efficiency it delivers at scale, quantum computing has caught the attention of technological leaders. 

      However, it will take years to mature for most applications, even if it makes early progress in highly specialized sectors like materials science and cryptography in the near future.


      Quantum approaches, on the other hand, are gaining traction with specific AI tools, as seen in recent advancements in natural language processing that could potentially break open the "black box" of today's neural networks.




      • The lambeq toolkit, usually written simply as lambeq, is a conventional Python library available on GitHub.
      • It coincides with the arrival at Cambridge Quantum of well-known AI and NLP researchers, and provides an opportunity for hands-on QNLP experience.
      • The lambeq software is designed to turn sentences into quantum circuits, providing a fresh perspective on text mining, language translation, and bioinformatics corpora. It is named after the late semantics scholar Joachim Lambek.
      • According to Bob Coecke, chief scientist at Cambridge Quantum, quantum NLP may offer explainability that is not feasible in today's "bag of words" neural techniques run on conventional computers.





      These diagrams resemble the parsed sentences on elementary school blackboards.

      Coecke said that current NLP approaches "don't have the capacity to assemble things together to discover a meaning."

      "What we want to do is introduce compositionality in the traditional sense, which means using the same compositional framework. We want to reintroduce logic."

      Honeywell announced earlier this year that it would merge its own quantum computing operations with Cambridge Quantum to form an independent company to pursue cybersecurity, drug discovery, optimization, material science, and other applications, including AI, as part of its efforts to expand quantum infrastructure. 

      Honeywell claimed the new operation will cost between $270 million and $300 million to build. 


      Cambridge Quantum said that it will stay autonomous while collaborating with a variety of quantum computing companies, including IBM. 

      In an e-mail conversation, Cambridge Quantum founder and CEO Ilyas Khan said that the lambeq work is part of a larger AI project that is the company's longest-term initiative. 

      "In terms of timetables, we may be pleasantly surprised, but we feel that NLP is at the core of AI in general, and thus something that will truly come to the fore as quantum computers scale," he added.

      In Cambridge Quantum's opinion, the most advanced application areas are cybersecurity and quantum chemistry. 





      What type of quantum hardware timetable do we expect in the future? 




      • There is well-informed agreement not only on the hardware roadmap but also on the software roadmap (Honeywell and IBM are among the credible corporate players in this regard).
      • Quantum computing is not a general-purpose technology; we cannot use quantum computing to solve all of our existing business challenges.
      • According to Gartner's Hype Cycle for Computing Infrastructure for 2021, quantum computing will take more than ten years to reach the Plateau of Productivity.
      • That's where the analytics company expects IT users to get the most out of a given technology.
      • Quantum computing's current position on Gartner's Peak of Inflated Expectations, a categorization for emerging technologies that are deemed overhyped, is the same as it was in 2020.


      ~ Jai Krishna Ponnappan

      You may also want to read more about Quantum Computing here.



      Quantum Computing - What Exactly Is A Qubit?


      While the idea of a qubit has previously been discussed, it is critical to remember that it is the basic technology of any quantum computing paradigm, whether adiabatic or universal. 


      • A qubit is a physical device that acts as a quantum computer's most basic memory block.
      • Qubits are the quantum counterparts of the classical bits (transistors) used in today's computers and smartphones.
      • Both bits and qubits serve the same purpose: to physically record the data the computer is processing.
      • As information is altered during computation, the bit or qubit must be modified to reflect the change.
      • This is the only way the computer can keep track of what is going on.
      • Because quantum computers store information in quantum states (superposition and entangled states), qubits must be able to physically represent these quantum states.
      • This is difficult, since quantum phenomena appear only under the most extreme circumstances.


      To make matters worse, quantum phenomena arise naturally given the right conditions.

      Such events may be triggered by anything from a beam of light to a change in pressure or temperature, which can excite the qubit into a different quantum state than planned, corrupting the information the qubit was supposed to hold.


      • To address these issues, scientists place quantum computers in extremely controlled environments: temperatures no higher than 0.02 kelvin (colder than outer space), a near-perfect vacuum (100 trillion times below atmospheric pressure), and either extremely weak or extremely strong magnetic fields, depending on the circumstances.
      • All of this effort is aimed at letting a qubit candidate remain predominantly in superposition states.
      • This phenomenon is the core of quantum computing, allowing qubits to store not just 0 or 1 but also a superposition of 0 and 1.
      • Because each qubit may take on many states, potentially infinitely many, these memory blocks can store considerably more information than their binary counterparts (classical bits); a sketch of the resulting measurement statistics follows below.
      • As a result, quantum computers can perform some computations considerably more quickly.
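      A small sketch of what this means statistically, in plain Python/NumPy with an arbitrarily chosen superposition: the amplitudes fix the probabilities, but each individual readout still collapses to a single classical bit.

```python
import numpy as np

rng = np.random.default_rng(seed=42)

# An example unequal superposition: |psi> = sqrt(0.8)|0> + sqrt(0.2)|1>
amplitudes = np.array([np.sqrt(0.8), np.sqrt(0.2)])
probs = np.abs(amplitudes) ** 2   # Born rule: [0.8, 0.2]

# Each measurement yields one classical bit; the amplitudes only
# reveal themselves in the statistics of many repeated runs.
samples = rng.choice([0, 1], size=1000, p=probs)
print(samples.mean())             # close to 0.2
```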


      ~ Jai Krishna Ponnappan

      You may also want to read more about Quantum Computing here.



      Quantum Computer Physics



      Computers, no matter how sophisticated they have gotten over the last century, still rely on binary choices of 0 and 1 to make sense of the chaos around us. 


      However, as our knowledge of the world grows, we become increasingly aware of the limits of this paradigm. 



      Quantum mechanics advancements continue to remind us of our universe's unfathomable complexity. The ideas of superposition and entanglement are at the heart of this rapidly growing area of physics. 

      • Simply stated, this is the notion that subatomic particles such as electrons may exist in many locations at the same time (superposition) and can seemingly interact across empty space (entanglement).
      • These phenomena offer a one-of-a-kind physical mechanism for processing and storing data at rates orders of magnitude faster than traditional computers.
      • QCs, first proposed in 1980, are now widely regarded as the technology to achieve this goal.
      • The concept behind quantum bits (or qubits) is that they may store information not just as 0s or 1s, but as a superposition of both 0 and 1: theoretically endless gradations between 0 and 1.
      • As a result, each qubit can carry enormous quantities of information. If today's computers can do so much with only two states, imagine the potential of a machine that can access millions of superpositions between 0 and 1.
      • QCs will be able to compute information much faster, shattering our present data-processing limits.


      They are the means of bringing to fruition artificial intelligence, risk analysis, optimization, and a slew of other technologies we have long envisioned.

      For many new tasks, they are the logical successor to the contemporary computer that has defined the information era.

      This has ramifications for neurodegenerative disease, energy, agriculture, economics, biochemistry, and a variety of other fields of research.


      ~ Jai Krishna Ponnappan

      You may also want to read more about Quantum Computing here.



      Quantum Computing - Transition from Classical to Quantum Computers (QCs)


      Since the invention of the computer in the 1930s, we have been able to build economic, social, and technical models for many areas of life.

      The binary system is used in these machines. This means that data is represented as a string of 0s and 1s, with each digit being an unambiguous binary choice of 0 or 1.



      Computers need a matching physical mechanism to represent this data. Think of this system as a set of switches, with one position indicating a 1 and the other a 0. On today's microprocessors, there are billions of these switches.



      Information is stored in the form of strings of 0s and 1s, which are then processed, evaluated, and computed using logic gates. 


      These are transistors that have been linked together. Logic gates are the basic building blocks of the massive calculations we ask modern computers to perform, and they may be linked together hundreds of millions of times to execute sophisticated algorithms; a toy example follows below.
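      To make the "linked gates" idea concrete, here is a toy half-adder in Python, one of the smallest useful gate circuits: it combines an XOR gate and an AND gate to add two one-bit numbers, exactly the kind of building block replicated millions of times in a real processor.

```python
def AND(a: int, b: int) -> int:
    return a & b

def XOR(a: int, b: int) -> int:
    return a ^ b

def half_adder(a: int, b: int) -> tuple[int, int]:
    """Add two 1-bit numbers; returns (sum bit, carry bit)."""
    return XOR(a, b), AND(a, b)

for a in (0, 1):
    for b in (0, 1):
        s, carry = half_adder(a, b)
        print(f"{a} + {b} -> sum {s}, carry {carry}")
```

      Chaining half-adders (plus OR gates) into full adders is how processors add integers of any width.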


      ~ Jai Krishna Ponnappan

      You may also want to read more about Quantum Computing here.



      Digital To Quantum Computers At A Breakneck Speed



      Every year, the quantity of data created throughout the globe doubles. As data collection and transport move beyond stationary computers, as many gigabytes, terabytes, petabytes, and exabytes were created, processed, and gathered in 2018 as in all of human history before 2018.

      Smart Phones, Smart Homes, Smart Clothes, Smart Factories, Smart Cities... the Internet is connecting numerous "smart" objects. And they're generating a growing amount of their own data. 

      • As a result, the demand for computer chip performance is increasing at an exponential rate.
      • In fact, during the previous 50 years, computational capacity has roughly doubled every 18 months (see the quick calculation below).
      • The number of components per unit area on integrated circuits grows in accordance with a law proposed in 1965 by Gordon Moore, Intel's future co-founder.
      • The reason the overall volume of data is growing faster than individual computer performance is that the number of data-producing devices is growing at the same time.
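      A quick back-of-the-envelope check of what that compounding implies, taking the doubling-every-18-months figure above at face value:

```python
years = 50
doublings = years * 12 / 18   # one doubling every 18 months
growth = 2 ** doublings
print(f"{doublings:.1f} doublings -> roughly {growth:.1e}x more compute")
# 33.3 doublings -> roughly 1.1e+10x more compute
```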


      Concerns that "Moore's Law" will lose its validity at some time date back 25 years. The reason for this is because component miniaturization is causing issues: 


      • As electrons move through progressively smaller and more numerous circuits, the chips get hotter. But there's a bigger issue: electronic structures have shrunk to fewer than 10 nanometers in size, which is around 40 atoms across.
      • In transistors this small, the principles of quantum physics rule, rendering electron behavior essentially unpredictable. Moore himself forecast the end of his law in 2007, giving it another 10 to 15 years.
      • Indeed, for the first time ever, the semiconductor industry's 2016 roadmap for the following year's chip development did not follow Moore's law.
      • However, thanks to nano-engineers' ingenuity, it is conceivable that even smaller and quicker electronic structures will be achievable in the future, delaying the end of "classical" shrinking for a few more years. But then what?

      How long can we depend on the ability to simply increase the performance of computer chips? 

      The fact that Moore's Law will no longer be true does not indicate that we have reached the end of the road in terms of improving information processing efficiency. 


      However, there is a technique to make computers that are significantly quicker, even billions of times more powerful: quantum computers. 

      • These computers operate in a very different manner than traditional computers. 
      • Rather than ignoring electrons' quantum properties and the challenges associated with ever-increasing component downsizing, a quantum computer explicitly exploits these properties in how it processes data.
      • We might tackle issues that are much too complicated for today's "supercomputers" in physics, biology, weather research, and other fields with the aid of such devices. 
      • The development of quantum computers might spark a technological revolution that will dominate the twenty-first century in the same way that digital circuits dominated the twentieth. 
      • Quantum computers are expected to offer computation speeds that are unimaginable today.

      ~ Jai Krishna Ponnappan

      You may also want to read more about Quantum Computing here.


      Quantum Computing And Digital Evolution



      Today's computer is based on a concept from the 1940s. Although the shrinking of computer chips has forced computer developers to study quantum mechanical rules, today's computers still operate purely on classical physics principles.



      • Tubes and capacitors were used in the earliest computers in the 1940s, and the transistor, initially a "classical" component, is still a vital part of every computer today.
      • The term "transistor" stands for "transfer resistor," which simply means an electrical resistance controlled by a voltage or current.
      • The first transistor patent was filed in 1925. Shortly after, in the 1930s, it was discovered that basic arithmetic operations could be performed by carefully controlling the electric current (for example, in diodes).
      • Low computation speed and high energy consumption are the two primary reasons why point-contact transistors, triodes, and diodes based on electron tubes are found only in technology museums today.
      • Although the components have evolved, the architecture developed by Hungarian mathematician and scientist John von Neumann in 1945 remains the foundation of today's computers.
      • The memory, which holds both program instructions and (temporarily) the data to be processed, is at the heart of von Neumann's computer reference model.
      • A control unit manages the data processing sequentially, that is, step by step, in single binary computing steps. Computer scientists call this a "SISD architecture" (Single Instruction, Single Data).

      Despite the fact that transistors and electron tubes have been replaced by smaller, faster field-effect transistors on semiconductor chips, the architecture of today's computers has remained the same since its inception.


      How does sequential information processing in computers work? 


      Alan Turing, a British mathematician, theoretically outlined the fundamental data units and their processing in 1936. 

      The most basic information units in the system are binary digital units, or "bits." Binary means "two-valued": a bit may assume either the state "1" or the state "0," like a light switch that may be turned on or off.

      • The word "digital" comes from the Latin digitus, which means "finger," and refers to a time when people counted with their fingers. 
      • Today, "digital" refers to information that may be represented by numbers. 
      • In today's computers, electronic data processing entails turning incoming data in the form of many consecutively organized bits into an output that is also in the form of many consecutively ordered bits. 
      • Blocks of individual bits are processed one after the other, much like chocolate bars on an assembly line; for a letter, for example, a block of eight bits, referred to as a "byte," is needed. 
      • There are just two processing options for single bits: a 0 (or 1) stays a 0 (or 1), or a 0 (or 1) transforms to a 1. (or 0). 
      • The fundamental electrical components of digital computers, known as logic gates1, are always the same fundamental fundamental electronic circuits, embodied by physical components such as transistors, through which information is transferred as electric impulses. 
      • The connection of many similar gates allows for more sophisticated processes, such as the addition of two integers. 
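      The byte encoding mentioned above can be seen directly in Python (ASCII encoding is assumed here for illustration):

```python
# A letter occupies one 8-bit block (a "byte"); ASCII assumed.
letter = "A"
bits = format(ord(letter), "08b")
print(bits)               # 01000001
print(chr(int(bits, 2)))  # A -> decoding the bits recovers the letter
```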

      Every computer today is a Turing machine: it does nothing but process information encoded in zeros and ones in a sequential manner, changing it into an output encoded in zeros and ones as well. 


      • However, this ease of data processing comes at a cost: to manage the quantity of data necessary in today's complicated computer systems, a large number of zeros and ones must be handled. 
      • The number of available computational blocks increases a computer's processing capacity linearly: a chip with twice as many circuits can process data twice as quickly.
      • The speed of today's computer chips is measured in gigahertz, billions of operations per second. This requires billions of transistors.
      • To fit that many transistors on a chip the size of a thumbnail, the circuitry must be tiny. Only then can such fast-switching systems' overall size and energy consumption be kept under control.
      • The move from the electron tube to semiconductor-based bipolar and field-effect transistors, invented in 1947, was critical for the miniaturization of fundamental computing units on integrated circuits in microchips.
      • Doped semiconductor layers are used to construct these nanoscale transistors. 


      This is where quantum mechanics enters the picture. 

      • We need a quantum mechanical model for the migration of the electrons within these semiconductors to comprehend and regulate what's going on. 
      • This is the so-called "band model" of electronic energy levels in metallic conductors. 

      Understanding quantum physics was not required for the digital revolution of the twentieth century, but it was a prerequisite for the extreme miniaturization of integrated circuits.


      ~ Jai Krishna Ponnappan

      You may also want to read more about Quantum Computing here.


      Quantum Computing - A Different Approach to Calculation.



      In his 1981 lecture Simulating Physics with Computers, Richard Feynman asked, as part of a philosophical reflection on quantum theory, whether the quantum world could be simulated by an ordinary computer.

      Because quantum variables do not assume fixed values, the difficulty arises from the probabilities associated with quantum states. 

      They do, in fact, occupy a full mathematical space of potential states at any given instant. 


      This greatly expands the scope of the computations. 

      Any traditional computer, Feynman concluded, would be swamped sooner or later. 

      However, he went on to wonder if this challenge might be handled with a computer that merely calculates state probabilities, or a computer whose internal states are quantum variables themselves. 


      • The weird quantum features of atomic and subatomic particles would be openly exploited by such a quantum computer. 
      • Above all, it would have a radically different structure and operation from the von Neumann architecture of today's computers.
      • It would compute in parallel on the many states adopted concurrently by the quantum variables, rather than processing bit by bit like a Turing computer. 
      • In a quantum computer, the basic information units are no longer called "bits," but "quantum bits," or "qubits" for short. 
      • Unfortunately, this term is deceptive since it still includes the term binary, which is precisely what quantum bits are not.  
      • The nature of information in qubits differs significantly from that of traditional data. Quantum bits, or qubits, are no longer binary, accepting both states at the same time, as well as any values in between. 
      • As a result, a qubit can store significantly more information than merely 0 or 1. 


      The unusual capacity of qubits is due to two peculiar qualities that can only be found in quantum physics: 


      1. Superposition of classically exclusive states: Quantum states may exist in superpositions of classically exclusive states. In the microscopic world, the light switch may be turned on and off at the same time. This allows a qubit to assume the states 0 and 1 at the same time, as well as all states in between.
      2. Entanglement: Several qubits may be brought into entangled states, in which they are joined into a non-separable whole as though by an invisible spring. They remain in some form of direct connection with each other even when geographically separated, thanks to a "spooky action at a distance," a phrase Albert Einstein used sarcastically to emphasize his disbelief in this quantum phenomenon. It is as though each quantum bit is aware of what the others are doing and is influenced by it. (A numerical sketch of both properties follows below.)
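      Here is a minimal numerical sketch of both properties in plain NumPy (no quantum SDK assumed): a Hadamard gate puts the first qubit into superposition, and a CNOT gate then entangles the pair into a Bell state whose measurement outcomes are perfectly correlated.

```python
import numpy as np

# A two-qubit register is a length-4 complex vector; start in |00>.
state = np.zeros(4, dtype=complex)
state[0] = 1.0

H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
I = np.eye(2, dtype=complex)
CNOT = np.array([[1, 0, 0, 0],   # flips qubit 1 when qubit 0 is 1
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

state = np.kron(H, I) @ state    # superposition on qubit 0
state = CNOT @ state             # entanglement -> Bell state

print(np.abs(state) ** 2)  # [0.5, 0, 0, 0.5]: only 00 and 11 ever occur,
                           # so measuring one qubit fixes the other
```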


      Superpositions and entanglement were formerly the subject of fierce debate among quantum physicists. 

      • They've now formed the cornerstone of a whole new computer architecture. 
      • Calculations on a quantum computer are substantially different from those on a conventional computer due to the radically distinct nature of qubits. 


      Unlike a traditional logic gate, a quantum gate (or quantum logic gate) is not a technological building block that transforms individual bits into one another in a well-defined manner; rather, it represents a basic physical manipulation of one or more (entangled) qubits.


      • A given quantum gate may be mathematically characterized by a corresponding (unitary) matrix that acts on the states of the qubit ensemble (the quantum register), as illustrated below.
      • The physical structure of the qubits determines what such an operation and the flow of information will look like in each case.
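      A short sketch of that characterization in plain NumPy (the gates shown are the standard Hadamard and NOT): a matrix represents a valid quantum gate exactly when it is unitary, meaning its conjugate transpose is its inverse, which guarantees the operation is reversible and preserves total probability.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)  # Hadamard
X = np.array([[0, 1], [1, 0]], dtype=complex)                # NOT (Pauli-X)

for name, U in [("H", H), ("X", X)]:
    # Unitarity check: U† @ U must equal the identity matrix.
    assert np.allclose(U.conj().T @ U, np.eye(2))
    print(f"{name} is unitary")
```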

      Quantum gates' tangible technological manifestation is still a work in progress.


      ~ Jai Krishna Ponnappan

      You may also want to read more about Quantum Computing here.

