
Digital To Quantum Computers At A Breakneck Speed



Every year, the amount of data created throughout the globe roughly doubles. As data collection and transport move beyond stationary computers, as many gigabytes, terabytes, petabytes, and exabytes were created, processed, and gathered in 2018 alone as in all of human history before that year. 

Smartphones, smart homes, smart clothes, smart factories, smart cities... the Internet is connecting countless "smart" objects, and they are generating a growing amount of data of their own. 

  • As a result, the demand for computer chip performance is increasing at an exponential rate. 
  • In fact, over the previous 50 years, their computational capacity has doubled roughly every 18 months. 
  • The number of components per unit area on integrated circuits grows in accordance with a law proposed in 1965 by Gordon Moore, who later co-founded Intel. 
  • The overall volume of data is growing even faster than individual computer performance because the number of data-producing devices is increasing at the same time.
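The doubling described above compounds dramatically. A rough sketch (assuming, purely for illustration, one doubling every 18 months) shows the cumulative performance factor over 50 years:

```python
# Illustrative sketch: performance doubling every 18 months, as Moore's
# law describes, compounded over a given number of years.
def moore_growth(years: float, doubling_period_years: float = 1.5) -> float:
    """Relative performance factor after `years` of steady doubling."""
    return 2 ** (years / doubling_period_years)

print(f"{moore_growth(50):.2e}")  # roughly a ten-billion-fold increase
```

The function name and the exact doubling period are assumptions for this sketch; the point is only that steady doubling yields astronomical growth.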


Concerns that "Moore's Law" will lose its validity at some point date back 25 years. The reason is that component miniaturization is causing issues: 


  • As electrons move through progressively smaller and more numerous circuits, the chips get hotter. But there's a bigger issue: electronic structures have shrunk to below 10 nanometers in size. This is only around 40 atoms across. 
  • The principles of quantum physics rule in transistors this small, rendering electron behavior unpredictable. Moore himself forecast the end of his law in 2007, giving it another 10 to 15 years. 
  • Indeed, for the first time ever, the semiconductor industry's 2016 roadmap for chip development did not follow Moore's law. 
  • However, thanks to nano-engineers' ingenuity, it is conceivable that even smaller and faster electronic structures will be achievable in the future, delaying the end of "classical" shrinking for a few more years. But then what? 

How long can we depend on the ability to simply increase the performance of computer chips? 

The fact that Moore's Law will eventually cease to hold does not mean that we have reached the end of the road in improving the efficiency of information processing. 


However, there is a technique to make computers that are significantly quicker, even billions of times more powerful: quantum computers. 

  • These computers operate in a very different manner than traditional computers. 
  • Rather than ignoring the quantum qualities of electrons and the challenges associated with ever-increasing component downsizing, a quantum computer explicitly uses these qualities in how it processes data. 
  • With the aid of such devices, we might tackle issues in physics, biology, weather research, and other fields that are much too complicated for today's "supercomputers". 
  • The development of quantum computers might spark a technological revolution that will dominate the twenty-first century in the same way that digital circuits dominated the twentieth. 
  • Quantum computers are expected to offer computation speeds that are unimaginable today.

~ Jai Krishna Ponnappan

You may also want to read more about Quantum Computing here.


Quantum Computing And Digital Evolution



The Computer of Today is based on a concept from the 1940s. Although the shrinking of computer chips has prompted computer developers to study quantum mechanical rules, today's computers still operate purely on classical physics principles. 



  • Tubes and capacitors were used in the earliest computers in the 1940s, and the transistor, which was initially a "classical" component, is still a vital component in any computer today. 
  • The term "transistor" stands for "transfer resistor," which simply indicates that an electrical resistance is controlled by a voltage or current. 
  • The first transistor patent was submitted in 1925. Shortly after, in the 1930s, it was discovered that basic arithmetical operations may be performed by carefully controlling the electric current (for example, in diodes). 
  • Insufficient computation speed and high energy consumption are the two primary reasons why point contact transistors, triodes, and diodes based on electron tubes are only seen in technological museums today. 
  • Although the components have evolved, the architecture developed by Hungarian mathematician and scientist John von Neumann in 1945 remains the foundation for today's computers. 
  • The memory card, which carries both program instructions and (temporarily) the data to be processed, is at the heart of von Neumann's computer reference model. 
  • A control unit manages the data processing sequentially, that is, step by step, in single binary computing steps. Computer scientists call this a "SISD architecture" (Single Instruction, Single Data). 

Although transistors and electron tubes have been replaced with smaller, faster field effect transistors on semiconductor chips, the architecture of today's computers has remained the same since its inception. 


How does sequential information processing in computers work? 


Alan Turing, a British mathematician, theoretically outlined the fundamental data units and their processing in 1936. 

The binary digital units, or "bits," are the most basic information units in the system. Because a bit may assume either the state "1" or the state "0," similar to a light switch that may be turned on or off, binary implies "two-valued." 

  • The word "digital" comes from the Latin digitus, which means "finger," and refers to a time when people counted with their fingers. 
  • Today, "digital" refers to information that may be represented by numbers. 
  • In today's computers, electronic data processing entails turning incoming data in the form of many consecutively organized bits into an output that is also in the form of many consecutively ordered bits. 
  • Blocks of individual bits are processed one after the other, much like chocolate bars on an assembly line; for a letter, for example, a block of eight bits, referred to as a "byte," is needed. 
  • There are just two processing options for a single bit: a 0 (or 1) stays a 0 (or 1), or a 0 (or 1) flips to a 1 (or 0). 
  • The fundamental electronic components of digital computers, known as logic gates, are always the same basic electronic circuits, embodied by physical components such as transistors, through which information is transferred as electric impulses. 
  • The connection of many such gates allows for more sophisticated operations, such as the addition of two integers. 
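The idea of composing simple gates into arithmetic can be sketched in a few lines of Python. This half adder is an illustration of the principle, not a model of any particular hardware:

```python
# Two elementary logic gates, and a half adder composed from them:
# the sum bit is the XOR of the inputs, the carry bit their AND.
def AND(a: int, b: int) -> int:
    return a & b

def XOR(a: int, b: int) -> int:
    return a ^ b

def half_adder(a: int, b: int) -> tuple[int, int]:
    """Add two single bits, returning (sum_bit, carry_bit)."""
    return XOR(a, b), AND(a, b)

for a in (0, 1):
    for b in (0, 1):
        print(f"{a} + {b} -> {half_adder(a, b)}")
```

Chaining such adders (full adders built from two half adders) is exactly how hardware adds multi-bit integers.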

Every computer today is a Turing machine: it does nothing but process information encoded in zeros and ones in a sequential manner, changing it into an output encoded in zeros and ones as well. 


  • However, this ease of data processing comes at a cost: to manage the quantity of data necessary in today's complicated computer systems, a huge number of zeros and ones must be handled. 
  • The number of available computational blocks increases the processing capacity of a computer linearly: a chip with twice as many circuits can process data twice as quickly. 
  • The speed of today's computer chips is measured in gigahertz, or billions of operations per second. This necessitates the use of billions of transistors. 
  • The circuitry must be tiny to fit this many transistors on chips the size of a thumbnail. Only then can such fast-switching systems' total size and energy consumption be kept under control. 
  • The move from the electron tube to semiconductor-based bipolar or field effect transistors, first created in 1947, was critical for the shrinking of fundamental computing units on integrated circuits in microchips. 
  • Doped semiconductor layers are used to construct these nanoscale transistors. 


This is where quantum mechanics enters the picture. 

  • We need a quantum mechanical model for the movement of the electrons within these semiconductors to comprehend and regulate what's going on. 
  • This is the so-called "band model" of electronic energy levels in solids. 

Understanding quantum physics was not required for the digital revolution of the twentieth century, but it was a prerequisite for the extreme downsizing of integrated circuits.


~ Jai Krishna Ponnappan

You may also want to read more about Quantum Computing here.


Quantum Computing - A Different Approach to Calculation.



Richard Feynman posed the question of whether the quantum world might be simulated by a normal computer in his 1981 lecture Simulating Physics with Computers, as part of a philosophical reflection on quantum theory. 

Because quantum variables do not assume fixed values, the difficulty arises from the probabilities associated with quantum states. 

They do, in fact, occupy a full mathematical space of potential states at any given instant. 


This greatly expands the scope of the computations. 

Any traditional computer, Feynman concluded, would be swamped sooner or later. 

However, he went on to wonder if this challenge might be handled with a computer that merely calculates state probabilities, or a computer whose internal states are quantum variables themselves. 


  • The weird quantum features of atomic and subatomic particles would be openly exploited by such a quantum computer. 
  • Above all, it would have a radically different structure and operation from today's computers' von Neumann architecture. 
  • It would compute in parallel on the many states adopted concurrently by the quantum variables, rather than processing bit by bit like a Turing machine. 
  • In a quantum computer, the basic information units are no longer called "bits," but "quantum bits," or "qubits" for short. 
  • Unfortunately, this term is somewhat deceptive, since it still contains the word "bit" (for binary digit), which is precisely what quantum bits are not.  
  • The nature of information in qubits differs significantly from that of traditional data. Qubits are no longer binary: they can take both states at the same time, as well as, in a sense, any value in between. 
  • As a result, a qubit can store significantly more information than merely a 0 or a 1. 


The unusual capacity of qubits is due to two peculiar qualities that can only be found in quantum physics: 


  1. Superposition of classically exclusive states: Quantum states may exist in superpositions of classically exclusive states. The light switch in the tiny world may be turned on and off at the same time. This allows a qubit to assume the states 0 and 1 at the same time, as well as all states in between.
  2. Entanglement: Several qubits may be brought into entangled states, in which they are joined into a non-separable whole as though by an invisible spring. They remain in some form of direct connection with each other even when they are spatially distant, thanks to a "spooky action at a distance," a phrase used sarcastically by Albert Einstein to emphasize his disbelief in this quantum phenomenon. It's as though each quantum bit is aware of what the others are doing and is influenced by it.
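Superposition can be sketched classically as a pair of complex amplitudes whose squared magnitudes give the measurement probabilities. This toy simulation (an illustration only, not real quantum hardware) reproduces the 50/50 statistics of an equal superposition:

```python
import math
import random

# Toy model: a qubit as amplitudes (alpha, beta) with |alpha|^2 + |beta|^2 = 1.
# A measurement yields 0 with probability |alpha|^2, otherwise 1.
def measure(alpha: complex, beta: complex, rng: random.Random) -> int:
    return 0 if rng.random() < abs(alpha) ** 2 else 1

# Equal superposition: the "light switch" that is on and off at once
alpha = beta = 1 / math.sqrt(2)

rng = random.Random(0)
counts = {0: 0, 1: 0}
for _ in range(10_000):
    counts[measure(alpha, beta, rng)] += 1
print(counts)  # roughly 5000 of each outcome
```

Between measurements the qubit genuinely holds both amplitudes; only measurement forces one of the two classical answers.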


Superpositions and entanglement were formerly the subject of fierce debate among quantum physicists. 

  • They now form the cornerstone of a whole new computer architecture. 
  • Calculations on a quantum computer are substantially different from those on a conventional computer due to the radically distinct nature of qubits. 


Unlike a traditional logic gate, a quantum gate (or quantum logic gate) is not a technological building block that transforms individual bits into one another in a well-defined manner; rather, it represents a basic physical manipulation of one or more (entangled) qubits. 


  • A particular quantum gate may be mathematically characterized by a corresponding (unitary) matrix that acts on the states of the qubit ensemble (the quantum register). 
  • The physical structure of the qubits determines how such an operation and the flow of information will look in each situation. 

Quantum gates' tangible technological manifestation is still a work in progress.
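Mathematically, such a gate is just a unitary matrix applied to a state vector. A minimal sketch with the well-known Hadamard gate (pure Python, no quantum hardware assumed):

```python
import math

# The Hadamard gate as a 2x2 unitary matrix: applied to |0> it produces
# an equal superposition, and applying it twice restores the input.
H = [[1 / math.sqrt(2),  1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]

def apply(gate, state):
    """Matrix-vector product: the action of a gate on a state vector."""
    return [sum(gate[i][j] * state[j] for j in range(len(state)))
            for i in range(len(gate))]

ket0 = [1.0, 0.0]
superpos = apply(H, ket0)
print([round(x, 4) for x in superpos])            # [0.7071, 0.7071]
print([round(x, 4) for x in apply(H, superpos)])  # back to [1.0, 0.0]
```

Unitarity is what makes the operation reversible: applying the Hadamard matrix twice returns the original state exactly.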


~ Jai Krishna Ponnappan

You may also want to read more about Quantum Computing here.


Quantum Computing Power On An Exponential Scale.



A single qubit can only accomplish so much. 


  • Only the entanglement of numerous qubits in quantum registers allows for the high-level parallelization of operations that makes quantum computers so powerful. 
  • It's as if a slew of chocolate factories opened their doors at the same moment. 
  • You can process multiple states in parallel if you have more qubits. 
  • Unlike traditional computers, whose processing power rises linearly with the number of computational components, the processing power of a quantum computer grows exponentially with the number of qubits employed. 

  • When 100 extra qubits are added to 100 qubits, the performance of a quantum computer does not merely double. 
  • In principle, it already doubles when just a single qubit is added to the 100 qubits. 


  • In theory, adding 10 qubits to a quantum computer increases its performance by a factor of about 1000 (2^10 = 1024), although in practice other factors limit the increase (see below). 
  • With 20 new qubits, the quantum computer is a million times faster, and with 50 new qubits, a million billion times faster. 
  • And with 100 additional information carriers, the performance gain of a quantum computer can no longer be expressed in everyday numbers, whereas a normal computer with 100 additional components would merely have doubled its performance.
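These scaling claims follow directly from the fact that a register of n qubits spans 2^n basis states. A quick sanity check of the numbers:

```python
# n entangled qubits span 2**n basis states, so each added qubit doubles
# the state space, while classical capacity grows only linearly.
def state_space(n_qubits: int) -> int:
    return 2 ** n_qubits

print(state_space(101) // state_space(100))   # +1 qubit:  factor 2
print(state_space(110) // state_space(100))   # +10 qubits: factor 1024
print(state_space(150) // state_space(100))   # +50 qubits: ~10**15
```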


Quantum computers, even those with just a few hundred qubits, have considerably more computing capacity than conventional computers.  


  • At this point, it's worth noting that entangled states' huge parallelization isn't precisely equivalent to the way parallel assembly lines function in chocolate factories. 
  • The way information is stored and processed in entangled quantum systems is fundamentally different from how information is stored and processed in typical digital computers. 

Quantum computers do not function in parallel in the traditional sense; instead, they arrange information such that it is dispersed over many entangled components of the system as a whole, and then process it in a strangely parallel manner. 


This is shown in the following example.  


For a standard 100-page book, the reader gains 1% of the book's material with each page read. After reading all of the pages, the reader understands all there is to know about the book. 

Things are different in a hypothetical quantum book where the pages are entangled. 

The reader sees just random nonsense while looking at the pages one by one, and after reading all of the pages individually, he or she still knows almost nothing about the book's content. 

Anyone interested in learning what it says must look at all of its pages at the same time. 

This is because the information in a quantum book is nearly entirely contained in the correlations between the pages, rather than on the individual pages. 
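A loose classical analogy for "information in the correlations" is secret sharing: spread a message over many pages of random bytes so that each page alone is pure noise, and only the combination of all pages reveals the content. To be clear, this is ordinary XOR arithmetic, not entanglement; it only illustrates the idea:

```python
import random

# Classical analogy only (XOR secret sharing, no quantum effects): each
# "page" alone looks random; the message lives in their combination.
def combine(pages: list[bytes]) -> bytes:
    """XOR all pages together, byte by byte."""
    out = bytes(len(pages[0]))
    for page in pages:
        out = bytes(a ^ b for a, b in zip(out, page))
    return out

def split_into_pages(message: bytes, n_pages: int, seed: int = 42) -> list[bytes]:
    """Split message into n_pages shares whose XOR equals the message."""
    rng = random.Random(seed)
    pages = [bytes(rng.randrange(256) for _ in message)
             for _ in range(n_pages - 1)]
    pages.append(combine(pages + [message]))  # final page fixes the XOR
    return pages

pages = split_into_pages(b"quantum", 100)
print(pages[0])        # one page alone: random-looking bytes
print(combine(pages))  # all pages combined: b'quantum'
```

In the quantum book the effect is far stronger: no amount of page-by-page reading, in any order, recovers the content.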

 

For the time being, the notion of qubits and quantum computers is primarily theoretical. 


However, quantum engineers have made significant progress in recent years in their efforts to make quantum computers operate in reality. 

Qubits may be built in a variety of ways, and they may be entangled in a variety of ways. 


In theory, the goal is to use ingenious tactics to catch individual quantum systems, such as atoms or electrons, entangle them, and control them appropriately:

 

One approach is to use electric and magnetic fields to fixate ions (electrically charged atoms) and then let them oscillate in a controlled manner, linking them together as qubits. 

Another approach uses atomic spin coupling, which is aligned by external magnetic fields in the same way as nuclear magnetic resonance technologies.

    • So-called quantum dots may also be used to manufacture qubits. 
    • These are regions of a material where electron mobility is highly restricted in all directions.
    • This implies that energy can no longer be released continuously, but only in discrete amounts, according to quantum physics principles.
    • As a result, these dots act as large artificial atoms. 

Other researchers are attempting to build quantum computers by sending electrons around loops in circular superconductors (known as superconducting quantum interference devices, or SQUIDs), which are interrupted by extremely thin layers of insulator.

  • Companies like Google, Microsoft, IBM, and Intel have a specific emphasis on this area.
  • This approach makes use of the Josephson effect, whereby the superconductor's Cooper electron pairs may tunnel through the insulating barrier.
  • The currents may be in two distinct quantum states at the same time, flowing both clockwise and anticlockwise. Superpositions like this may be employed as qubits and entangled. 

Qubits might potentially be made out of certain chemical substances. A complex of a vanadium ion contained by organic sulfur compounds serves as an example. The ion's spin is so thoroughly shielded by the shell that its state (and any entanglements) are kept for a long period. 

The so-called topological quantum computer is currently a completely theoretical idea. Its origins are in mathematics, and it is still unclear whether or not it can be physically realized. It is based on what are known as anyons (not to be confused with the anions in aqueous solutions). These are states in two-dimensional space that exhibit particle-like attributes. As a result, they're also known as "quasi-particles." Anyons may form, for example, at insulator interfaces. 


Topological qubits should form highly stable networks that are significantly more resistant to perturbations than the qubits in other approaches. 

A quantum computer is being developed by a number of research organizations throughout the globe. The suspense is building! Which strategy will win out?


~ Jai Krishna Ponnappan

You may also want to read more about Quantum Computing here.


Quantum Computing Solutions And Problems



Quantum computers have the ability to solve problems as well as create new ones. 


Issues in which today's computers, no matter how powerful, quickly hit their limitations highlight the promise of quantum computers: 


1. Cryptography: Almost every standard encryption technique is based on factoring the product of two very large prime numbers. To decode the message, one must first figure out which two primes a particular integer is made up of. This is simple for the number 39: the corresponding primes are 3 and 13. This job, however, can no longer be done by a traditional computer in any reasonable time once the number of digits exceeds a certain threshold. In 1994, computer scientist Peter Shor created an algorithm that could factorize the products of extremely large prime numbers into their divisors in minutes using a quantum computer. 
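For illustration, here is the brute-force classical approach (trial division) that handles 39 instantly but becomes hopeless once the factors have hundreds of digits; the function name and structure are just for this sketch:

```python
import math

# Classical trial division: find the factors of a semiprime n by testing
# every candidate divisor up to sqrt(n). Fast for tiny numbers,
# hopelessly slow once n has hundreds of digits.
def factor_semiprime(n: int) -> tuple[int, int]:
    for p in range(2, math.isqrt(n) + 1):
        if n % p == 0:
            return p, n // p
    raise ValueError(f"{n} has no nontrivial factors")

print(factor_semiprime(39))  # (3, 13), as in the example above
```

Shor's algorithm sidesteps this search entirely, which is why it threatens factoring-based encryption.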

2. Completing difficult optimization tasks: Finding the best answer from a large number of options is a difficult challenge for mathematicians. The traveling salesman problem is a common example. The goal is to determine the best sequence in which to visit various destinations so that the overall journey is as short as possible. With only 15 cities, there are approximately 43 billion potential routes; with 18 cities, the number rises to over 177 trillion. Similar problems may be found in industrial logistics, semiconductor design, and traffic flow optimization. Even with a modest number of points, traditional computers struggle to find the best answers in an acceptable amount of time. Quantum computers are projected to be substantially more efficient at solving such optimization issues.
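The route counts quoted here are easy to verify: with the starting city fixed and the two travel directions counted as one, n cities admit (n-1)!/2 distinct round trips.

```python
import math

# Number of distinct round trips through n cities, with a fixed start
# and travel direction ignored: (n - 1)! / 2.
def n_routes(n_cities: int) -> int:
    return math.factorial(n_cities - 1) // 2

print(f"{n_routes(15):,}")  # 43,589,145,600 (~43 billion)
print(f"{n_routes(18):,}")  # 177,843,714,048,000 (~177 trillion)
```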

 3. Artificial intelligence: A substantial application might be found in this area. In this discipline, deep neural networks are used to address combinatorial optimization problems that quantum computers may answer far better and quicker than any conventional computer. Quantum computers, for example, might recognize structures considerably quicker in very noisy data (which is very important in practical applications) and learn considerably faster as a result. As a result, the new "mega buzzword" quantum machine learning is presently circulating, combining two buzzwords that already pique the interest of many people. 

4. Searches in huge databases: A traditional computer must evaluate each data point separately while searching unsorted data collections. As a result, the search time scales linearly with the number of data points, and for very large volumes of data the number of computing steps becomes too enormous for a traditional computer to handle. Lov Grover, an Indian-American computer scientist, presented a quantum algorithm in 1996 that requires only about the square root of the number of data points in terms of processing steps. With a quantum computer using the Grover algorithm, instead of taking a thousand times as long to process a billion data entries as opposed to a million, the work would take just over 30 times as long. 
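The quoted speedup is just the square-root scaling, which is easy to check:

```python
import math

# Classical search of unsorted data scales with N; Grover's algorithm
# scales with sqrt(N). Compare a million entries with a billion:
classical_ratio = 10**9 / 10**6                      # 1000x more steps
grover_ratio = math.sqrt(10**9) / math.sqrt(10**6)   # ~31.6x more steps
print(round(classical_ratio), round(grover_ratio, 1))
```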

5. Theoretical chemistry: Quantum computers have the potential to vastly enhance models of electron behavior in solids and molecules, particularly where entanglement is a prominent factor. As far as we know today, the calculation and simulation of quantum systems involving interacting electrons is best done using computers that themselves have quantum mechanical properties, as Feynman had already observed in 1981. Theoretical physicists and chemists nowadays often deal with sophisticated optimization issues involving selecting the best conceivable, i.e., energetically most favorable, arrangement of electrons in an atom, molecule, or solid, from a large number of options. They've been attempting to solve such issues for decades, with mixed results. 

Because quantum computers function as quantum systems themselves, rather than merely applying algorithms to qubits, they may directly map and simulate the quantum behavior of the electrons involved, while conventional computers must frequently pass through a crude abstraction of such systems. 

Physicists refer to these machines as quantum simulators. "Right now, we have to calibrate regularly with experimental data," says Alán Aspuru-Guzik, a pioneer in the modeling of molecules on quantum computers. "If we have a quantum computer, some of it will go away." 

Quantum computing's applications are, of course, of enormous interest to government agencies. For example, with a quantum computer and its code-cracking capabilities, spy services may obtain access to sensitive material held by foreign countries (or their people). 


According to Edward Snowden, the American National Security Agency (NSA) is quite interested in the technology. 


Quantum computers might also usher in a new era of industrial espionage, since company data would no longer be completely secure. 

Some scientists even anticipate that one day, quantum computers will be able to solve all of nature's issues that are impossible to solve on conventional computers due to their complicated quantum features. 



Quantum computers, in particular, might aid in the following tasks: 


  1. Calculate the ground and excited states of complicated chemical and biological compounds, as well as their reaction kinetics. This is significant, for example, in the discovery of active medicinal compounds, the construction of even more effective catalysts, and the optimization of the Haber–Bosch fertilizer manufacturing process. 
  2. Decipher the electrical structures of crystals, which will progress solid state physics and materials science greatly. Nanotechnology would benefit greatly from new discoveries in these sectors. In molecular electronics, one example is the accurate computation of the attributes of prospective novel energy storage devices or components. Another crucial use would be the discovery of new high-temperature superconductors. 
  3. Calculate the behavior of black holes, the early universe's development, and the dynamics of high-energy elementary particle collisions. With the aid of a quantum computer, scientists may better anticipate and comprehend molecules and the specifics of chemical interactions than they can now, finding new forms of treatment on a weekly basis or developing far superior battery technologies within a month. 


Quantum computers pose a danger to data security throughout the world. 

Simultaneously, they may allow scientists to tackle previously intractable issues in a variety of scientific areas, resulting in significant technological advancements.


~ Jai Krishna Ponnappan

You may also want to read more about Quantum Computing here.


When Will A Quantum Computer Be Available?



IBM stated in the spring of 2016 that it would make its quantum computing technology available to the public as a cloud service. 


As part of the IBM Quantum Experience, interested parties may utilize the offered programming and user interface to log into a 5-qubit quantum computer over the Internet and build and run programs.

  • The objective of IBM was to push the development of bigger quantum computers forward. In January 2018, the company made the 20-qubit versions of its quantum computer available to a restricted group of businesses. 
  • Prototypes with 50 qubits are reportedly already available. 
  • The corporation Google then declared in the summer of 2016 that a 50-qubit quantum computer would be ready by 2020. This deadline was subsequently moved up to 2017 or early 2018. 

  • Google announced the release of Bristlecone, a new 72-qubit quantum processor, in March 2018. 
  • According to IBM, quantum computers with up to 100 qubits will be accessible in the mid to late 2020s. 
  • A quantum computer with around 50 qubits, according to most quantum experts, might outperform the processing capabilities of any supercomputer today—at least for certain key computational tasks. 

In the context of quantum supremacy, Google walks the talk. We'll find out very soon what new possibilities actual quantum computers open up. We may be seeing the start of a new age. 


There are still several significant difficulties to tackle on the route to developing working quantum computers:


  • The most important is that under the omnipresent impact of heat and radiation, entangled quantum states decay extremely quickly—often too quickly to complete the intended operations without mistake. 
  • Physicists speak of the “decoherence” of quantum states in this context. Chap. 26 will go through this phenomenon in further depth. 
  • Working with qubits is akin to writing on the water's surface rather than a piece of paper. 
  • The latter may persist hundreds of years, while any writing on water vanishes in a fraction of a second. 
  • As a result, it's critical to be able to operate at very high speeds. Incidentally, even the speeds at which classical computers process data are hard for us humans to imagine. 


Quantum engineers are using a two-pronged approach to solve this obstacle. 


  • On the one hand, they're attempting to lengthen the lifespan of qubits, thus lowering their sensitivity to errors; on the other, they're designing special algorithms to correct any faults that do arise (this is called quantum error correction). 
  • With the use of ultra-cold refrigerators, physicists can limit the consequences of decoherence.
  • Furthermore, strategies for dealing with decoherence-related errors in individual qubits are improving all the time. 


As a result, there is reason to believe that quantum computer dependability will improve dramatically in the future. 

However, quantum engineers' efforts have not yet delivered reliably operating quantum computers (as of fall 2021). 

Companies such as IBM, Google, Intel, Microsoft, and Alibaba plan to deliver working quantum computers in the next years, and they claim to have made great strides in the last several years.


~ Jai Krishna Ponnappan

You may also want to read more about Quantum Computing here.


What is the Quantum Internet?



Due to the delicate nature of qubits, the conveyance of qubit information is technically far more complicated than the transfer of electrons in classical computers (as it occurs in any electric cable) or of electromagnetic waves on the global internet. 

Nonetheless, quantum information can currently be transported across hundreds of kilometers by optical fiber with negligible data loss. 

Quantum entanglement makes this feasible. In this situation, physicists use the term quantum teleportation. 


Quantum Teleportation


The name is unfortunate since quantum teleportation has nothing to do with the conveyance of matter between two places without crossing space, as depicted in popular science fiction. 


  • Quantum teleportation is the transfer of quantum characteristics of particles, often known as quantum states (qubits), from one location to another. 
  • Only quantum information is transferred in this manner, but there is no transmission line for the data to go from sender to receiver. 
  • In principle, entangled particles may be separated indefinitely without their entanglement dissipating. Since the 1990s, physicists have speculated that this characteristic enables quantum teleportation in practice. 
  • Two quantum particles (for example, photons) are entangled in a shared quantum physical state and then geographically separated without losing their shared state. 
  • The sender sends one particle to the receiver while the other stays at the sender. So much for the preparation. The real data transmission may now commence. 
  • A joint measurement of the entangled qubit and the qubit to be teleported is made at the sender (a so-called "Bell measurement"). 
  • According to quantum physics, the measurement of the sender's particle determines the state of the entangled particle at the receiver automatically and instantly, without any direct connection between them. 
  • The result of the measurement at the transmitter is subsequently sent to the receiver over a standard communication channel. 
  • As a result of this measurement, the receiver's entangled qubit is projected onto one of four possible states.
  • Using the knowledge of the measurement result at the sender, the receiver's qubit may then be changed into the same state as the original sender qubit. 
  • Without physically carrying a particle, the required (quantum) information is sent from the transmitter to the receiver in this manner. Of course, by manipulating his or her particle in the same manner, the receiver may also become the transmitter. 
  • Quantum teleportation is not about conveying information faster than light, but rather about safely moving quantum states from one location to another, since the outcome of the measurement is sent normally, i.e., not instantly. 
  • Quantum teleportation enables the transmission, storage, and processing of qubits, or quantum information. 
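The steps above can be traced in a small state-vector simulation. This pure-Python sketch (an illustration, not a physical implementation) uses three qubits, the message qubit plus the two entangled ones, and follows the protocol exactly: entangle, Bell-measure at the sender, transmit the two classical bits, correct at the receiver:

```python
import math
import random

# Minimal state-vector sketch of the teleportation steps listed above.
# Qubit 0 holds the message state; qubits 1 and 2 form the shared Bell
# pair (qubit 2 sits at the receiver). Purely illustrative.

def bit(i: int, q: int) -> int:
    """Value of qubit q (q0 = most significant) in basis index i."""
    return (i >> (2 - q)) & 1

def apply_h(state, q):
    """Hadamard gate on qubit q."""
    s = [0j] * 8
    r = 1 / math.sqrt(2)
    for i, amp in enumerate(state):
        j = i ^ (1 << (2 - q))
        if bit(i, q) == 0:
            s[i] += r * amp
            s[j] += r * amp
        else:
            s[j] += r * amp
            s[i] -= r * amp
    return s

def apply_cnot(state, c, t):
    """CNOT gate: flip qubit t wherever qubit c is 1."""
    s = [0j] * 8
    for i, amp in enumerate(state):
        s[i ^ (1 << (2 - t)) if bit(i, c) else i] += amp
    return s

def measure(state, q, rng):
    """Measure qubit q; return (outcome, collapsed state)."""
    p1 = sum(abs(a) ** 2 for i, a in enumerate(state) if bit(i, q))
    m = 1 if rng.random() < p1 else 0
    norm = math.sqrt(p1 if m else 1 - p1)
    return m, [a / norm if bit(i, q) == m else 0j
               for i, a in enumerate(state)]

alpha, beta = 0.6, 0.8                    # message state alpha|0> + beta|1>
r = 1 / math.sqrt(2)
state = [0j] * 8
state[0b000] = state[0b011] = alpha * r   # message (x) Bell pair
state[0b100] = state[0b111] = beta * r

rng = random.Random(7)
state = apply_cnot(state, 0, 1)           # entangle message with the pair
state = apply_h(state, 0)
m0, state = measure(state, 0, rng)        # Bell measurement at the sender
m1, state = measure(state, 1, rng)

# Receiver applies X (if m1 == 1) then Z (if m0 == 1) to qubit 2:
corrected = [0j] * 8
for i, amp in enumerate(state):
    j = i ^ m1                            # X flips qubit 2 (the LSB)
    corrected[j] += (-amp if (m0 and bit(j, 2)) else amp)
state = corrected

a2 = sum(state[i] for i in range(8) if bit(i, 2) == 0)
b2 = sum(state[i] for i in range(8) if bit(i, 2) == 1)
print(round(abs(a2), 3), round(abs(b2), 3))  # 0.6 0.8: state teleported
```

Whatever the two random measurement outcomes are, the corrections always restore (alpha, beta) on the receiver's qubit, and the message state at the sender is destroyed by the measurement, exactly as the no-cloning theorem requires.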


As a result, a quantum internet, in addition to the quantum computer, looks to be within reach.

Quantum technologies are on the verge of transforming our planet. 


In order to truly appreciate them, we must first comprehend how physicists have learned to characterize the world of atoms. For this purpose, we'll need to delve deeper into the strange realm of quantum physics.


~ Jai Krishna Ponnappan

You may also want to read more about Quantum Computing here.


Precision Measurements with Quantum Technology

 


Measurements that are more precise than ever before are now possible thanks to new quantum technologies. 

The precise measurement of physical quantities, such as the distance between New York and Boston or the number of electrons flowing through a wire in a given period, may seem tedious. 

However, this is not the case. Whatever is being measured, whether meters, seconds, volts, or anything else, the highest level of accuracy can be critical. In this regard, the sensitivity of quantum mechanically entangled states to external disturbances can be very beneficial for many measuring applications. 


The measuring of time by atomic clocks is a well-known example of the metrological application of quantum physical processes. 


Atomic clocks have been in use for more than 70 years. Their time interval is determined by the characteristic frequency of electron transitions in atoms exposed to electromagnetic radiation. 

Incoming electromagnetic waves with a frequency of 9,192,631,770 oscillations per second (in the microwave range) have a maximum resonance for caesium atoms, i.e., a maximum of photons are released at that frequency. 

Thanks to the internationally accepted definition that one second equals 9,192,631,770 of these oscillations, we have a far more precise definition of the second than the statement that one day comprises 86,400 s. Atomic clocks are extremely precise because they are based on the excitation of many caesium atoms, and a mean value of the number of released photons is taken. 


Now that there are roughly 260 standardized atomic clocks across the world that can be compared to each other, the measurement becomes even more precise, resulting in yet another averaging effect. 
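The averaging effect can be illustrated with a toy simulation; the 10 Hz measurement noise assumed below is an arbitrary illustrative figure, not a property of real caesium clocks:

```python
import numpy as np

rng = np.random.default_rng(0)
true_freq = 9_192_631_770.0   # Hz, the caesium transition frequency

# Each "clock" reads the frequency with independent noise (10 Hz std, assumed).
# Averaging n clocks shrinks the typical error roughly by a factor of sqrt(n).
errors = []
for n_clocks in (1, 10, 260):
    readings = rng.normal(true_freq, 10.0, size=(20_000, n_clocks))
    err = np.abs(readings.mean(axis=1) - true_freq).mean()
    errors.append(err)
    print(f"{n_clocks:4d} clocks: typical error ~ {err:.2f} Hz")
```

Going from one clock to 260 cuts the typical error by about a factor of sixteen in this sketch, which is the square-root law at work.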


Thanks to a global network of atomic clocks, time measurement is unbelievably precise: they deviate by at most one second every million years. And yet, for some purposes, even that is not accurate enough. 

How is that possible? After all, we just need our clock to be precise to the second to ensure that we don't miss the start of our favorite television show. 

However, most of us are unaware that the global navigation system GPS would not function without atomic clocks, as it determines locations by measuring the time it takes for a signal to travel between the device and the GPS satellites. 

The time measurement must be accurate to a few billionths of a second in order to identify our position to within a meter. Similarly, digital communication, in which a huge number of phone calls are sent over a single line at the same time, relies on ultraprecise time measurement. 
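A back-of-the-envelope calculation confirms these figures: light covers about 30 cm per nanosecond, so a position fix good to one meter requires timing good to a few billionths of a second.

```python
c = 299_792_458          # speed of light in vacuum, m/s

# Timing error corresponding to a 1 m position error
dt = 1.0 / c
print(f"{dt * 1e9:.2f} ns")   # about 3.34 nanoseconds

# Conversely, a clock that is off by one microsecond
# shifts the computed position by roughly 300 m
print(f"{c * 1e-6:.0f} m")
```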


Atomic clocks manage the switches that route individual digital signals across the network so that they arrive at the correct receiver in the correct order. 


External disturbances, such as electric fields, can impact the accuracy of atomic clocks. 

These extend the frequency spectrum of the photons being measured, resulting in tiny changes in the resonance frequency and, as a result, in the time being recorded. 

Fluctuations in the terrestrial magnetic field are another factor. Today's GPS and digital communications technologies, as well as high-precision measurements in physics experiments, are limited by this. Even with atomic clocks, time measurement is still too imprecise for some GPS applications or many data transmission channels. 

This weakness would be addressed by a new generation of atomic clocks that take advantage of the effect of quantum entanglement. In each clock in the global network, a few atoms would be quantum mechanically entangled. 

Because a measurement on a single atom of one clock is also a measurement on all the others, the clocks stabilize each other in this way; thanks to the nature of entanglement, even the tiniest errors within the network of clocks will be instantaneously corrected. 


Quantum physical processes offer yet another way to improve the accuracy of atomic clocks. 


With a suitable error-correction approach, we could compensate for the disturbing magnetic field fluctuations if we knew, down to fractions of a second, how long they lasted. Nature demonstrates how a magnetic field can be measured ultra-precisely at the atomic level using the effect of quantum entanglement. 

Many migrating bird species have a magnetic sense that they utilize to navigate hundreds of kilometers to their wintering sites. For a long time, ornithologists were astounded by the precision with which they measured the intensity and direction of the Earth's magnetic field. 


Only a few years ago did they discover that birds employ a quantum compass for this purpose: in the robin's eye, the spins of electron pairs are entangled across two molecules. 


These entangled states are highly sensitive to external magnetic fields. Depending on the field's orientation, the electrons revolve in different directions, which translates to different orientations of their "spin." 

The shift in the orientation of the electron spins of these molecules in the bird's eye is enough to turn them into isomers (molecules with the same chemical formula but different spatial structure). 

The varied characteristics of the isomers are very sensitive to the strength and direction of the magnetic field, generating various chemical processes in the bird's retina that eventually lead to perception—the bird's eye therefore becomes a perfect measuring device for magnetic fields. 


Many species of birds have thus evolved a kind of quantum goggles for magnetic fields. 


They can therefore find their way to their winter quarters via quantum phenomena. In addition to time and magnetic fields, local gravitational fields can also be measured extremely precisely using quantum mechanically entangled states, which has attracted major economic interest. 

Today, detailed measurements of the intensity of local gravitational fields are used to find metal and oil resources in the earth. 

Large subterranean gas or water fields can also be detected through local density differences, which result in a slightly stronger or weaker gravitational force; but this is a tiny effect that can only be detected with ultra-sensitive gravity sensors. 

Such measurements might be made much more precise by utilizing the phenomena of quantum mechanical entanglement. Even a single individual might be tracked down using an entanglement-based ultra-sensitive gravity sensor based on the gravitational field formed by their body mass. 


Gas pipelines in the earth, water pipe breaks, sinkholes beneath roadways, and anomalies under a proposed house plot might all be found. 


Furthermore, if archaeologists were able to use gravity sensors to simply "light up" ancient and prehistoric sites, their work would be substantially simplified. Entanglement-based measuring devices might also detect the tiny magnetic fields associated with brain function or cell-to-cell communication in our bodies. 

They would allow for real-time monitoring of individual neurons and their behavior. This would allow us to assess the processes in our brain (and body) considerably more precisely than we can now with EEG recordings. 

Quantum magnetic field sensors are already in use for magnetoencephalography (MEG), which uses Superconducting Quantum Interference Devices (SQUIDs) to measure the magnetic activity of the brain. Perhaps, in the future, we may be able to capture our thoughts from the outside and feed them straight into a computer. 


Future quantum technologies may, in fact, provide the ideal brain–computer interaction. Much of what has previously been unseen will become visible thanks to measurement instruments based on quantum entanglement.


~ Jai Krishna Ponnappan









Quantum Magic Turns into Technology




In a second visionary speech in 1981, Feynman developed what is perhaps an even more radical idea: a whole new kind of computer, called a “quantum computer”, which would make today’s high-powered computers look like the Commodore 64 from the early 1980s. 


The two main differences between a quantum computer and today’s computers are: 


  • In the quantum computer, information processing and storage no longer occur by means of electron currents, but are based on the control and steering of single quantum particles. 
  • Thanks to the quantum effect of superposition, a quantum computer can calculate with numerous states of its quantum bits (qubits) at the same time. Instead of being constrained to the states 0 and 1 and processing each bit separately, the number of possible states that can be processed in one step is thereby multiplied in a quantum computer. 

This allows an unimaginably higher computing speed than today’s computers. 
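This multiplication of states can be made concrete in a classical simulation (which is exactly why such simulations become infeasible for large n: the memory required grows exponentially). Ten qubits already require 2^10 = 1,024 amplitudes, and a single layer of Hadamard gates acts on all of them at once:

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate

n = 10
state = np.zeros(2**n)
state[0] = 1.0            # start in |00...0>

# One Hadamard per qubit: the tensor product H (x) H (x) ... (x) H
U = np.array([[1.0]])
for _ in range(n):
    U = np.kron(U, H)
state = U @ state

print(len(state))                              # 1024 amplitudes
print(np.allclose(state, 1 / np.sqrt(2**n)))   # uniform superposition: True
```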

While quantum computer technology is still in its infancy, when it reaches adulthood it will dramatically speed up a variety of algorithms in common use today, such as searching databases, computing complex chemical compounds, or cracking common encryption techniques. 

What’s more, there are a number of applications for which today’s computers are still not powerful enough, such as certain complex optimizations and even more so potent machine learning. A quantum computer will prove very useful here. And at this point the quantum computer will meet another ground-breaking future technology: the development of artificial intelligence. 


In quantum physics, Richard Feynman no longer saw just the epitome of the abstract, but very concrete future technological possibilities—this is what Quantum Physics 2.0 is about. As Feynman predicted almost 60 years ago, we already use a variety of quantum-physics-based technologies today. 

Common electronic components, integrated circuits on semiconductor chips, lasers, electron microscopes, LED lights, special solid state properties such as superconductivity, special chemical compounds, and even magnetic resonance tomography are essentially based on the properties of large ensembles of quantum particles and the possibilities for controlling them: steered flow of many electrons, targeted excitation of many photons, and measurement of the nuclear spin of many atoms. 

Concrete examples are the tunnel effect in modern transistors, the coherence of photons in lasers, the spin properties of the atoms in magnetic resonance tomography, Bose–Einstein condensation, or the discrete quantum leaps in an atomic clock.

 Physicists and engineers have long since become accustomed to bizarre quantum effects such as quantum tunneling, the fact that many billions of particles can be synchronized as if by magic, and the wave character of matter. 

The statistical behavior of an ensemble of many quantum particles is well captured by the established quantum theory of the Schrödinger equation, now 90 years old, and the underlying processes can still be visualized reasonably well. They constitute the basis of the first generation of quantum technologies. 


The emerging second generation of quantum technologies, on the other hand, is based on something completely new: the directed preparation, control, manipulation, and subsequent selection of states of individual quantum particles and their interactions with each other. 

Of crucial importance here is one of the strangest phenomena in the quantum world, which already troubled the founding fathers of quantum theory. 


With entanglement, precisely that quality of the quantum world comes into focus which so profoundly confused early quantum theorists such as Einstein, Bohr, and others, and whose fundamental significance physicists did not fully recognize until many years after the first formulation of quantum theory. 

It describes how two or more quantum particles can be in a state in which they behave as if linked to each other by some kind of invisible connection, even when they are physically far apart. 

It took nearly fifty years for physicists to get a proper understanding of this strange phenomenon of the quantum world and its violation of the locality principle, so familiar to us, which says that, causally, physical effects only affect their immediate neighborhoods. To many physicists it still looks like magic even today. 


No less magical are the technologies that will become possible by exploiting this quantum phenomenon. 


In recent years, many research centers for quantum technology have sprung up around the world, and many government funded projects with billions in grants have been launched. Moreover, high tech companies have long since been aware of the new possibilities raised by quantum technologies. 

Companies like IBM, Google, and Microsoft are recognizing the huge potential revenues and are thus investing heavily in research on how to exploit entangled quantum states and superposition in technological applications. 

Examples include Google’s partnerships with many academic research groups, the Canadian quantum computing company D-Wave Systems, and the investments of many UK companies in the UK National Quantum Technologies Programme. 

In May 2016, 3,400 scientists signed the Quantum Manifesto, an initiative to promote co-ordination between academia and industry to research and develop new quantum technologies in Europe. Its goal is the research and successful commercial exploitation of new quantum effects. 

This manifesto aimed to draw the attention of politicians to the fact that Europe is in danger of falling behind in the research and development of quantum technologies. China, for example, now dominates the field of quantum communication, and US firms lead in the development of quantum computers. 

The plea proved successful: the EU Commission decided to fund a flagship project for research into quantum technologies with a billion euros over the next ten years. That is a lot of money, given the chronically tight finances of many European countries. 

The project focuses on four areas: communication, computing, sensors, and simulations. The ultimate goal is the development of a quantum computer. 

Politicians have high expectations for this area of research. No wonder so much money is being poured into the field: unimaginable advantages will reward the first to apply and patent quantum effects as the basis for new technologies. 


Here are some examples of such applications, the basics of which physicists do not yet fully understand: 


• The quantum Hall effect, discovered in 1980, and the fractional quantum Hall effect, discovered in 1982; the discoveries were rewarded with Nobel Prizes in 1985 and 1998, respectively. Here it is not only energy that comes in packets: at sufficiently low temperatures, the Hall resistance of a conductor carrying an electric current in a magnetic field (the classical Hall effect) is also quantized. This effect makes possible high-precision measurements of electric current and resistance.

• New wonder materials such as graphene, which conducts electricity and heat very well and is at the same time up to two hundred times stronger than the strongest steel (Nobel Prize 2010). Graphene can be used in electronic systems and could make computers more powerful by several orders of magnitude. 

• Measuring devices based on the fact that even very small forces, such as they occur in ultra-weak electric, magnetic, and gravitational fields, have a quantifiable influence on the quantum states of entangled particles. 

• Quantum cryptography, which is based on the phenomenon of particle entanglement and allows encryption whose security is guaranteed by the laws of physics. The last two examples show what dramatic effects the new quantum technologies, even apart from the quantum computer, may have on our everyday lives. 
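The quantization in the first example can be checked numerically: the plateaus of the quantized Hall resistance sit at integer fractions of the von Klitzing constant R_K = h/e², which follows directly from the defined SI values of h and e.

```python
h = 6.62607015e-34     # Planck constant, J s (exact SI value)
e = 1.602176634e-19    # elementary charge, C (exact SI value)

R_K = h / e**2         # von Klitzing constant
print(f"R_K = {R_K:.3f} ohm")    # about 25812.807 ohm

# Hall resistance plateaus R = R_K / nu for integer filling factors nu
for nu in (1, 2, 3, 4):
    print(f"nu = {nu}: R = {R_K / nu:.1f} ohm")
```

Because R_K depends only on defined constants, the quantum Hall effect serves as a primary resistance standard in metrology.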









New Generation of Quantum Technologies


Richard Feynman, quantum physicist and Nobel laureate, gave a widely cited lecture in 1959 describing how future technologies might work at micro- and nanoscopic scales (one thousandth and one millionth of a millimeter, respectively). 

The talk was entitled "There's Plenty of Room at the Bottom." Feynman's vision was crystal clear: he predicted that humans would soon be able to manipulate matter at the atomic level. 

Feynman's talk is regarded as the big bang of nanotechnology, one of the most fascinating technologies being developed today. Its objective is to manipulate and control individual quantum states. 


Many of Feynman's ideas, in fact, have long since become a reality. 


  1. The electron microscope, which scans the object to be examined point by point with an electron beam whose wavelength is up to 10,000 times shorter than that of visible light. Light microscopes can only attain resolutions of 200 nm (200 × 10⁻⁹ m) and magnifications of 2,000, but electron microscopes can attain resolutions of 50 pm (50 × 10⁻¹² m) and magnifications of 10,000,000. 
  2. Semiconductor-based microscopic data storage systems that allow 500 GB to be stored on a thumbnail-sized surface. 
  3. Integrated circuits with components of just 10 to 100 atoms apiece, which, owing to the large number of them that can be packed into a single microchip, enable superfast data processing in current computers. 
  4. Nanomachines in medicine, which may be implanted into the human body and, for example, hunt for cancer cells autonomously. Many of Feynman's 1959 visions are already part of our daily technological life. 
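The resolution figures in the first item follow from the de Broglie wavelength of accelerated electrons. A rough, non-relativistic estimate (the 10 kV beam voltage is just an illustrative choice):

```python
import math

h   = 6.62607015e-34     # Planck constant, J s
m_e = 9.1093837015e-31   # electron mass, kg
e   = 1.602176634e-19    # elementary charge, C

def electron_wavelength(volts):
    """Non-relativistic de Broglie wavelength of an electron
    accelerated through `volts` volts: lambda = h / sqrt(2 m e V)."""
    return h / math.sqrt(2 * m_e * e * volts)

lam = electron_wavelength(10_000)   # a 10 kV electron beam
print(f"{lam * 1e12:.1f} pm")       # ~12 pm, versus ~500,000 pm for visible light
```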


In 1959, however, Feynman's most groundbreaking insight was the potential of building ultra-small devices capable of manipulating matter at the atomic level. 

These robots would be able to assemble any type of material from a kit of atoms of various elements, much as humans build Lego from a manual, the only requirement being that the synthetically produced compounds be energetically stable. 

Nano wheels that can truly roll a long distance, nano gearwheels that spin along a jagged edge of atoms, nano propellers, hinges, grapples, switches, and other fundamental building blocks are now available in prototype form. 


Nanotechnology is fundamentally a quantum technology, since at scales of millionths of a millimeter matter obeys the principles of quantum physics rather than classical Newtonian mechanics. 


Andreas Eschbach shows how nanomachines might assemble individual atoms and molecules into practically any desired form in his science fiction novel "The Lord of All Things" (German: "Herr aller Dinge," 2011). They eventually start duplicating themselves in such a way that their numbers grow exponentially. 

Thanks to these powers, the nanomachines can create objects almost out of thin air. The novel's central character learns to command them and has them construct whatever he needs at any given moment (cars, planes, even a spaceship). 

Finally, by having them directly measure his brain impulses, he is able to regulate these processes entirely by his own thoughts. 


Is it conceivable to build such nanomachines, or is this just science fiction? 

According to Feynman, there is no natural rule that contradicts their construction. In truth, today's nanoscientists are getting closer and closer to realizing his dream. 

The Nobel Prize in Chemistry 2016 was given to Jean-Pierre Sauvage, Fraser Stoddart, and Bernard Feringa for their work on molecular nanomachines, demonstrating how essential this work is to the scientific community. 

Nanomachines, as predicted by Richard Feynman, could one day assemble (nearly) any material from raw atomic building blocks, or repair existing, even living, material.

The initial steps toward such machines have already been taken, and they will undoubtedly have a significant impact on the twenty-first century.






Quantum Physics Everywhere


Quantum physics may be found in a variety of fields today, from current chemistry to solid state physics, signal processing to medical imaging technologies. 


When we get into a car (and rely on on-board electronics), turn on our computer (which consists of integrated circuits, i.e., electronics based on quantum phenomena), listen to music (CDs are read by lasers, a pure quantum phenomenon), have our bodies scanned with X-rays or MRIs, allow ourselves to be guided by GPS, or communicate via cell phone, we trust its laws. 

According to various estimates, between one-quarter and one-half of the gross national product of industrialized countries today is based, directly or indirectly, on inventions derived from quantum theory. In the coming years, this percentage will rise sharply. 

A second generation of quantum technologies has emerged in the last 25 years, following in the footsteps of nuclear technology, medical applications, lasers, semiconductor technology, and modern physical chemistry, all of which were developed between 1940 and 1990. 


This generation is likely to shape our lives even more dramatically than the first. 


This has also been recognized by the People's Republic of China, which has long been viewed as a developing country in terms of scientific research but has been rapidly catching up in recent years. It has designated new quantum technologies as a key topic of scientific study in its 13th Five-Year Plan. 

In the meantime, Europe has seen the signs of the times and has begun investing heavily in quantum technology. 


The first quantum revolution began to take shape more than a century ago. We are currently witnessing the start of the second quantum revolution.








Quantum Chemistry and Quantum Biology

 


Quantum Chemistry - With quantum theory, scientists also recognized a whole new connection between physics and chemistry. 


How atoms combine to form molecules and other compounds is determined by the quantum properties of the electron shells in those atoms. 

That implies that chemistry is nothing more than applied quantum physics. 

Only with knowledge of quantum physics can the structures of chemical bonds be understood. Some readers may recall the cloud-like structures that form around the atomic nucleus. 

These clouds, which are called orbitals, are nothing but approximate solutions of the fundamental equation of quantum mechanics, the Schrödinger equation. 

They determine the probabilities of finding the electrons at different positions (but note that these solutions only consider the interactions between the electrons and the atomic nucleus, not those between the electrons). 


“Quantum chemistry” consists in calculating the electronic structures of molecules using the theoretical and mathematical methods of quantum physics and thereby analyzing properties such as their reactive behavior, the nature and strength of their chemical bonds, and resonances or hybridizations. 


The ever increasing power of computers makes it possible to determine chemical processes and compounds more and more precisely, and this has gained great significance not only in the chemical industry and in materials research, but also in disciplines such as drug development and agro-chemistry. 


Quantum Biology - Last but not least, quantum physics helps us to better understand the biochemistry of life. 


A few years ago bio-scientists started talking about “quantum biology”. For example, the details of photosynthesis in plants can only be understood by explicitly considering quantum effects. 

And among other things, the genetic code is not completely stable, as protons in DNA are vulnerable to the tunnel effect, and it is this effect that is partly responsible for the emergence of spontaneous mutations. 

Yet as always, when something is labelled with the word “quantum”, there is some fuzziness in the package. 

Theoretically, the structures of atoms and molecules and the dynamics of chemical reactions can be determined by solving the Schrödinger equation (or other quantum equations) for all atomic nuclei and electrons involved in a reaction. 

However, these calculations are so complicated that, using the means available today, an exact solution is possible only for the special case of hydrogen, i.e., for a system with a single proton and a single electron. In more complex systems, i.e., in practically all real applications in chemistry, the Schrödinger equation can only be solved using approximations. 

And this requires the most powerful computers available today. 
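For the exactly solvable hydrogen case, the Schrödinger equation yields the familiar Bohr energy levels E_n = −m e⁴ / (8 ε₀² h² n²), which can be evaluated directly:

```python
m_e  = 9.1093837015e-31   # electron mass, kg
e    = 1.602176634e-19    # elementary charge, C
eps0 = 8.8541878128e-12   # vacuum permittivity, F/m
h    = 6.62607015e-34     # Planck constant, J s

def hydrogen_level(n):
    """Bohr energy of the n-th hydrogen level, in electron volts."""
    E = -m_e * e**4 / (8 * eps0**2 * h**2 * n**2)
    return E / e           # joules -> eV

for n in (1, 2, 3):
    print(f"E_{n} = {hydrogen_level(n):.3f} eV")
# The ground state comes out at about -13.6 eV, the hydrogen binding energy
```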

Theoretically, the equations of quantum theory can be used to calculate any process in the world. 

However, even for simple molecules the calculations are so complex that they require the fastest computers available today, and physicists must nevertheless satisfy themselves with only approximate results. 





