
Quantum Cryptography


The Holy Grail of Data Security 


Let's take a closer look at the second item on the list: quantum cryptography. In today's society, data security has become an ever more crucial problem. 


How can we be sure that no one else has access to our personal digital information? 

Or that third parties don't listen in on our discussions without our knowledge? 


Traditional cryptography encrypts a message with a key in such a way that decrypting it without knowing the key would demand unreasonably large computing power. But this is a never-ending race: ever more sophisticated encryption methods against ever more powerful computers that might crack them. 

At least for the dilemma of the undetected eavesdropper, quantum cryptography offers a solution.

  Quantum key distribution is a critical component of quantum-secure communication: because the key is conveyed using entangled quantum states of light, any interference in the transmission, such as an eavesdropper on the communication channel, is immediately apparent to the users. 

  • Assume A makes a “secure” phone call to B (in quantum cryptography, A and B always stand for Alice and Bob). 
  • Both Alice's and Bob's devices are capable of measuring entangled particles. 
  • If the line is tapped, Alice and Bob quickly recognize that an unwanted third party (commonly referred to as Eve) is present, because by listening in, i.e., measuring the particles, Eve would irreversibly destroy their entanglement. 
  • Nor can Eve simply copy the qubits and forward them to the intended recipient without being caught, because it is impossible to duplicate an (as-yet-unmeasured) quantum state exactly. 
  • As soon as Alice and Bob notice any change to their key, or that the entanglement of their particles has been broken, they change their method of communication and, at least temporarily, shut out the eavesdropper. 
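The detection logic in the steps above can be sketched in a toy simulation. This is a prepare-and-measure (BB84-style) variant rather than the entanglement-based scheme described here, but the detection principle is the same: Eve's measurements disturb the quantum states, and the disturbance shows up as an elevated error rate in the sifted key. All function names are illustrative.

```python
import random

def measure(bit, prep_basis, meas_basis):
    # Measuring in the preparation basis returns the bit deterministically;
    # measuring in the other basis gives a 50/50 random outcome.
    return bit if prep_basis == meas_basis else random.randint(0, 1)

def qber(n, with_eve):
    """Quantum bit error rate over n rounds, with or without an eavesdropper."""
    errors = sifted = 0
    for _ in range(n):
        bit, a_basis = random.randint(0, 1), random.choice("XZ")
        if with_eve:
            e_basis = random.choice("XZ")
            bit2 = measure(bit, a_basis, e_basis)  # Eve measures the qubit...
            a_basis2 = e_basis                     # ...and resends it in her basis
        else:
            bit2, a_basis2 = bit, a_basis
        b_basis = random.choice("XZ")
        b_bit = measure(bit2, a_basis2, b_basis)
        if b_basis == a_basis:                     # keep only matching-basis rounds
            sifted += 1
            errors += (b_bit != bit)
    return errors / sifted

random.seed(1)
print(f"error rate without Eve: {qber(20000, False):.3f}")  # 0.000
print(f"error rate with Eve:    {qber(20000, True):.3f}")   # ~0.25
```

Without Eve the sifted key is error-free; with an intercept-resend attack roughly a quarter of the sifted bits disagree, so Alice and Bob can detect her by comparing a random sample of their key.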


Quantum cryptography relies on a fundamental fact of quantum mechanics: a quantum state can never be copied without disturbing the original state, and with it the original information. 


Engineers are currently striving to exploit the strange qualities of the microworld that caused so much consternation among physicists in the early part of the twentieth century. 

Physicists went back to the theoretical drawing board during the creation of the first generation of quantum technologies to achieve a proper understanding of the principles that govern the micro universe. Meanwhile, they have made great progress in their efforts. 

Quantum physics and all of its main aspects may now be applied in a technology environment. The fascinating aspect of this approach is that scientists and engineers are working on a whole new universe of possibilities that have never been conceived before, rather than just attempting to make current and familiar things quicker or more exact. 


“The nineteenth century was known as the machine era; the twentieth century will go down in history as the information era,” wrote physicist Paul Davies in 1997. “The quantum age, I believe, will begin in the twenty-first century.”



You may also want to read more about Quantum Computing here.





Precision Measurements with Quantum Technology

 


Measurements that are more precise than ever before are now possible thanks to new quantum technologies. 

The precise measurement of physical quantities, such as the distance between New York and Boston or the number of electrons flowing through a wire in a given period, may appear tedious. 

However, this is not the case: regardless of what is being measured, whether meters, seconds, volts, or anything else, the highest level of accuracy can be critical. In this regard, the sensitivity of quantum mechanically entangled states to external disturbances can be very beneficial for many measuring applications. 


The measuring of time by atomic clocks is a well-known example of the metrological application of quantum physical processes. 


Atomic clocks have been in use for more than 70 years. Their time interval is determined by the characteristic frequency of electron transitions in atoms subjected to electromagnetic radiation. 

Incoming electromagnetic waves with a frequency of 9,192,631,770 oscillations per second (in the microwave range) have a maximum resonance for caesium atoms, i.e., a maximum of photons are released at that frequency. 

Thanks to the internationally recognized definition that one second equals 9,192,631,770 of these oscillations, we have a considerably more precise definition of the second than the statement that one day comprises 86,400 s. Atomic clocks are extremely precise because they are based on the excitation of numerous caesium atoms, with a mean taken over the number of released photons. 
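The arithmetic behind this definition is easy to check; a minimal sketch:

```python
CS_FREQ_HZ = 9_192_631_770         # caesium oscillations per second (SI definition)
period_s = 1 / CS_FREQ_HZ          # duration of a single oscillation
osc_per_day = CS_FREQ_HZ * 86_400  # oscillations in one 86,400-second day

print(f"one oscillation lasts {period_s:.4e} s")   # ~1.0878e-10 s
print(f"oscillations per day: {osc_per_day:,}")
```

A single caesium oscillation lasts about a tenth of a nanosecond, which is why counting these oscillations pins down the second far more finely than any astronomical definition.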


Now that there are roughly 260 standardized atomic clocks across the world that can be compared to each other, the measurement becomes even more precise, resulting in yet another averaging effect. 


Thanks to a global network of atomic clocks, time measurement is unbelievably precise: they are accurate to within one second every million years. Yet even that is not accurate enough. 

How is that possible? After all, we just need our clock to be precise to the second to ensure that we don't miss the start of our favorite television show. 

However, most of us are unaware that the global navigation system GPS would not function without atomic clocks, as it determines locations by measuring the time it takes for a signal to travel between the device and the GPS satellites. 

The time measurement must be accurate to a few billionths of a second in order to identify our position to within a meter. Similarly, digital communication, in which a huge number of phone calls are sent over a single line at the same time, relies on ultraprecise time measurement. 
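The link between timing error and position error quoted above is simply distance = speed of light × time. A quick sketch of the numbers (rounded, for illustration):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def position_error(clock_error_s):
    # A ranging error is the clock error multiplied by the signal speed.
    return C * clock_error_s

# Metre-level positioning requires timing good to a few nanoseconds:
print(f"{position_error(3e-9):.2f} m")   # ~0.90 m for a 3 ns clock error

# The atomic-clock accuracy quoted above, 1 s per million years,
# corresponds to a fractional accuracy of roughly 3e-14:
frac = 1 / (1e6 * 365.25 * 86_400)
print(f"fractional accuracy: {frac:.1e}")
```

A 3-nanosecond timing error already translates into nearly a metre of ranging error, which is why GPS is impossible without atomic-clock precision.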


Atomic clocks manage the switches that route individual digital signals across the network so that they arrive at the correct receiver in the correct order. 


External disturbances, such as electric fields, can impact the accuracy of atomic clocks. 

These broaden the frequency spectrum of the photons being measured, resulting in tiny shifts in the resonance frequency and, consequently, in the time being recorded. 

Fluctuations in the terrestrial magnetic field are another factor. Today's GPS and digital communications technologies, as well as high-precision measurements in physics experiments, are limited by this. Even with atomic clocks, time measurement is still too imprecise for some GPS applications or many data transmission channels. 

This weakness would be addressed by a new generation of atomic clocks that take advantage of quantum entanglement. In each clock in the global network, a few atoms would be quantum mechanically entangled. 

Because a measurement on a single atom of one clock is also a measurement on all the others, the clocks would stabilize each other; due to the nature of entanglement, even the tiniest errors within the network of clocks would be instantaneously corrected. 


Quantum physical processes offer yet another way to improve the accuracy of atomic clocks. 


With an adequate error-correction approach, we could compensate for the disturbing magnetic field variations if we could track them on timescales of fractions of a second. Nature demonstrates how a magnetic field can be measured ultra-precisely at the atomic level by exploiting quantum entanglement. 

Many migrating bird species have a magnetic sense that they use to navigate hundreds of kilometers to their wintering sites. For a long time, ornithologists were astounded by the precision with which these birds measure the intensity and direction of the Earth's magnetic field. 


Only a few years ago did researchers discover that birds employ a quantum compass for this purpose. In the robin's eye, pairs of electrons spread across two molecules are entangled via their spins. 


These entangled states are highly sensitive to external magnetic fields. Depending on the magnetic field's orientation, the electrons' spins take on different orientations. 

The shift in the orientation of the electron spins of these molecules in the bird's eye is enough to turn them into isomers (molecules with the same chemical formula but different spatial structure). 

The varied characteristics of the isomers are very sensitive to the strength and direction of the magnetic field, generating various chemical processes in the bird's retina that eventually lead to perception—the bird's eye therefore becomes a perfect measuring device for magnetic fields. 


Many species of birds have thus evolved a kind of quantum spectacles for seeing magnetic fields. 


They thus find their way to their winter quarters via quantum phenomena. In addition to time and magnetic fields, local gravitational fields can also be measured extremely precisely using quantum mechanically entangled states, and this has sparked major economic interest. 

Today, detailed measurements of the intensity of local gravitational fields are used to find metal and oil resources in the earth. 

Large subterranean gas or water fields can also be detected through local density differences, which result in a slightly stronger or weaker gravitational force, but this is a tiny effect that can only be detected with ultra-sensitive gravity sensors. 

Such measurements might be made much more precise by utilizing the phenomena of quantum mechanical entanglement. Even a single individual might be tracked down using an entanglement-based ultra-sensitive gravity sensor based on the gravitational field formed by their body mass. 


Gas pipelines in the earth, water pipe breaks, sinkholes beneath roadways, and anomalies under a proposed house plot might all be found. 


Furthermore, if archaeologists were able to use gravity sensors to simply "light up" ancient and prehistoric sites, their work would be substantially simplified. Entanglement-based measuring devices might also detect the tiny magnetic currents linked to brain function or cell-to-cell communication in our bodies. 

They would allow for real-time monitoring of individual neurons and their behavior. This would allow us to assess the processes in our brain (and body) considerably more precisely than we can now with EEG recordings. 

Quantum magnetic field sensors are already in use for magnetoencephalography (MEG), which uses Superconducting Quantum Interference Devices (SQUIDs) to measure the magnetic activity of the brain. Perhaps, in the future, we may be able to capture our thoughts from the outside and feed them straight into a computer. 


Future quantum technologies may, in fact, provide the ideal brain–computer interface. Much of what has previously been invisible will become visible thanks to measurement instruments based on quantum entanglement.


~ Jai Krishna Ponnappan









Quantum Magick Turns into Technology




In a second visionary speech in 1981, Feynman developed what is perhaps an even more radical idea: a whole new kind of computer, called a “quantum computer”, which would make today’s high-powered computers look like the Commodore 64 from the early 1980s. 


The two main differences between a quantum computer and today’s computers are: 


  • In the quantum computer, information processing and storage no longer occur by means of electron currents, but are based on the control and steering of single quantum particles. 
  • Thanks to the quantum effect of superposition, a quantum computer can calculate on numerous quantum states, called quantum bits (qubits), at the same time. Instead of being constrained to the states 0 and 1 and processing each bit separately, the possible states that can be processed in one step are thereby multiplied in a quantum computer. 

This allows an unimaginably higher computing speed than today’s computers. 
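The "multiplied states" point above can be made concrete: a register of n classical bits holds exactly one of 2^n values at a time, whereas the state of n qubits is described by 2^n amplitudes, all of which a quantum operation acts on at once. A minimal sketch in plain Python (no quantum library; the dictionary-of-amplitudes representation is just for illustration):

```python
import itertools
import math

def uniform_superposition(n):
    """State vector of n qubits after a Hadamard gate on each:
    equal amplitude on every one of the 2**n basis states."""
    amp = 1 / math.sqrt(2 ** n)
    return {bits: amp for bits in itertools.product("01", repeat=n)}

state = uniform_superposition(3)
print(len(state))                           # 8 basis states tracked at once
print(sum(a * a for a in state.values()))   # probabilities sum to ~1
```

Doubling the register from 3 to 4 qubits doubles the number of amplitudes to 16; at 300 qubits the count exceeds the number of atoms in the visible universe, which is the source of the speed-up the text describes.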

While quantum computer technology is still in its infancy, when it reaches adulthood it will dramatically speed up a variety of algorithms in common use today, such as searching databases, computing complex chemical compounds, or cracking common encryption techniques. 

What’s more, there are a number of applications for which today’s computers are still not powerful enough, such as certain complex optimizations and even more so potent machine learning. A quantum computer will prove very useful here. And at this point the quantum computer will meet another ground-breaking future technology: the development of artificial intelligence. 


In quantum physics, Richard Feynman no longer saw just the epitome of the abstract, but very concrete future technological possibilities—this is what Quantum Physics 2.0 is about. As Feynman predicted almost 60 years ago, we already use a variety of quantum-physics-based technologies today. 

Common electronic components, integrated circuits on semiconductor chips, lasers, electron microscopes, LED lights, special solid state properties such as superconductivity, special chemical compounds, and even magnetic resonance tomography are essentially based on the properties of large ensembles of quantum particles and the possibilities for controlling them: steered flow of many electrons, targeted excitation of many photons, and measurement of the nuclear spin of many atoms. 

Concrete examples are the tunnel effect in modern transistors, the coherence of photons in lasers, the spin properties of the atoms in magnetic resonance tomography, Bose–Einstein condensation, or the discrete quantum leaps in an atomic clock.

 Physicists and engineers have long since become accustomed to bizarre quantum effects such as quantum tunneling, the fact that many billions of particles can be synchronized as if by magic, and the wave character of matter. 

This is because the statistical behavior of an ensemble of many quantum particles can be well captured using established quantum theory, given by Schrödinger's equation, now 90 years old, and the underlying processes remain reasonably intuitive. These constitute the basis of the first generation of quantum technologies. 


The emerging second generation of quantum technologies, on the other hand, is based on something completely new: the directed preparation, control, manipulation, and subsequent selection of states of individual quantum particles and their interactions with each other. 

Of crucial importance here is one of the strangest phenomena in the quantum world, which already troubled the founding fathers of quantum theory. 


With entanglement, precisely that quality of the quantum world comes into focus which so profoundly confused early quantum theorists such as Einstein, Bohr, and others, and whose fundamental significance physicists did not fully recognize until many years after the first formulation of quantum theory. 

It describes how a finite number of quantum particles can be in a state in which they behave as if linked to each other by some kind of invisible connection, even when they are physically far apart. 

It took nearly fifty years for physicists to get a proper understanding of this strange phenomenon of the quantum world and its violation of the locality principle, so familiar to us, which says that, causally, physical effects only affect their immediate neighborhoods. To many physicists it still looks like magic even today. 


No less magical are the technologies that will become possible by exploiting this quantum phenomenon. 


In recent years, many research centers for quantum technology have sprung up around the world, and many government funded projects with billions in grants have been launched. Moreover, high tech companies have long since been aware of the new possibilities raised by quantum technologies. 

Companies like IBM, Google, and Microsoft are recognizing the huge potential revenues and are thus investing heavily in research on how to exploit entangled quantum states and superposition in technological applications. 

Examples include Google’s partnerships with many academic research groups, the Canadian quantum computing company D-Wave Systems, and the investments of many UK companies in the UK National Quantum Technologies Program. 

In May 2016, 3,400 scientists signed the Quantum Manifesto, an initiative to promote co-ordination between academia and industry to research and develop new quantum technologies in Europe. Its goal is the research and successful commercial exploitation of new quantum effects. 

This manifesto aimed to draw the attention of politicians to the fact that Europe is in danger of falling behind in the research and development of quantum technologies. China, for example, now dominates the field of quantum communication, and US firms lead in the development of quantum computers. 

This plea has proved successful because the EU Commission has decided to promote a flagship project for research into quantum technologies with a billion euros over the next ten years. That’s a lot of money given the chronically weak financial situation in European countries. 

The project focuses on four areas: communication, computing, sensors, and simulations. The ultimate goal is the development of a quantum computer. 

Politicians have high expectations for this area of research. No wonder so much money is being put into this field, since unimaginable advantages will reward the first to apply and patent quantum effects as the basis for new technologies. 


Here are some examples of such applications, the basics of which physicists do not yet fully understand: 


• The quantum Hall effect, discovered in 1980, and the fractional quantum Hall effect, discovered soon afterwards; these discoveries were rewarded with Nobel Prizes in 1985 and 1998, respectively. At sufficiently low temperatures, the Hall voltage generated in a conductor carrying an electric current in a magnetic field (the classical Hall effect) is quantized: not only energy comes in packets. This effect makes possible high-precision measurements of electric current and resistance.

• New miracle materials such as graphene, which is a very good conductor of electricity and heat and at the same time up to two hundred times stronger than the strongest steel (Nobel Prize 2010). Graphene can be used in electronic systems and could make computers more powerful by several orders of magnitude. 

• Measuring devices based on the fact that even very small forces, such as those occurring in ultra-weak electric, magnetic, and gravitational fields, have a quantifiable influence on the quantum states of entangled particles. 

• Quantum cryptography, which is based on the phenomenon of particle entanglement (Nobel Prize 2012) and allows absolutely secure encryption. By considering the last two examples, we shall show what dramatic effects the new quantum technologies, even apart from the quantum computer, may have on our everyday lives. 









New Generation of Quantum Technologies


Richard Feynman, a quantum physicist and Nobel winner, presented a widely referenced lecture in 1959 that detailed how future technology may function on a micro and nanoscopic scale (scales of one thousandth or one millionth of a millimeter, respectively). 

The title of the talk was "There's Plenty of Room at the Bottom." Feynman's vision was crystal clear: he predicted that humans would soon be able to manipulate matter at the atomic level. 

Feynman's talk is regarded as the big bang of nanotechnology, one of the most fascinating technologies being developed today. Its objective is to manipulate and control individual quantum states. 


Many of Feynman's ideas, in fact, have long since become a reality. 


  1. The electron microscope, which scans the object to be examined point by point with an electron beam whose wavelength is up to 10,000 times shorter than that of visible light. Light microscopes can only attain resolutions of 200 nm (200 × 10⁻⁹ m) and magnifications of about 2,000×, but electron microscopes can attain resolutions of 50 pm (50 × 10⁻¹² m) and magnifications of 10,000,000×. 
  2. Semiconductor-based microscopic data storage systems that allow 500 GB to be stored on a thumbnail-sized surface. 
  3. Integrated circuits with components of just 10 to 100 atoms apiece, which, owing to the large number of them that can be packed into a single microchip, enable superfast data processing in current computers. 
  4. Nanomachines in medicine, which may be implanted into the human body and, for example, hunt for cancer cells autonomously. Many of Feynman's 1959 visions are already part of our daily technological life. 
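The wavelength advantage behind item 1 follows from the de Broglie relation λ = h/p. A sketch using the non-relativistic approximation (at 100 kV the relativistic correction shifts the result by a few percent, so this is an estimate only):

```python
import math

H = 6.62607015e-34       # Planck constant, J·s
M_E = 9.1093837015e-31   # electron mass, kg
E_CHG = 1.602176634e-19  # elementary charge, C

def electron_wavelength(volts):
    # Non-relativistic de Broglie wavelength of an electron
    # accelerated through the given voltage: lambda = h / sqrt(2 m e V).
    return H / math.sqrt(2 * M_E * E_CHG * volts)

# At 100 kV, a typical transmission electron microscope voltage,
# the wavelength is a few picometres, far below the
# ~400-700 nm wavelengths of visible light.
lam = electron_wavelength(100_000)
print(f"{lam * 1e12:.2f} pm")
```

A few picometres versus hundreds of nanometres is roughly the factor of 10,000 quoted in the list above, which is what lets electron microscopes resolve individual atoms.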


In 1959, however, Feynman's most groundbreaking insight was the potential of building ultra-small devices capable of manipulating matter at the atomic level. 

These robots would be able to assemble any type of material from a kit of atoms of various elements, much as humans build Lego from a manual, the only requirement being that the synthetically generated compounds be energetically stable. 

Nano wheels that can truly roll a long distance, nano gearwheels that spin along a jagged edge of atoms, nano propellers, hinges, grapples, switches, and other fundamental building blocks are now available in prototype form. 


Nanotechnology is fundamentally a quantum technology: at scales of millionths of a millimeter, matter obeys the principles of quantum physics rather than traditional Newtonian mechanics. 


Andreas Eschbach shows how nanomachines might assemble individual atoms and molecules into practically any desired form in his science fiction novel "The Lord of All Things" (German: "Herr aller Dinge," 2011). They eventually start duplicating themselves in such a way that their numbers grow exponentially. 

Thanks to their powers, these nanomachines can create objects almost out of thin air. The novel's central character learns to command them and has them construct whatever he needs at any given moment (cars, planes, even a spaceship). 

Finally, by having them directly measure his brain impulses, he is able to regulate these processes entirely by his own thoughts. 


Is it conceivable to build such nanomachines, or is this just science fiction? 

According to Feynman, there is no natural rule that contradicts their construction. In truth, today's nanoscientists are getting closer and closer to realizing his dream. 

The Nobel Prize in Chemistry 2016 was given to Jean-Pierre Sauvage, Fraser Stoddart, and Bernard Feringa for their work on molecular nanomachines, demonstrating how essential this work is to the scientific community. 

Nanomachines, as predicted by Richard Feynman, could potentially build (nearly) any material out of nothing from raw atomic material, or repair existing—even living—material.

The initial steps toward such machines have already been taken, and they will undoubtedly have a significant impact on the twenty-first century.






Quantum Physics Everywhere


Quantum physics may be found in a variety of fields today, from current chemistry to solid state physics, signal processing to medical imaging technologies. 


When we get into a car (and rely on on-board electronics), turn on our computer (which consists of integrated circuits, i.e., electronics based on quantum phenomena), listen to music (CDs are read by lasers, a pure quantum phenomenon), have our bodies scanned with X-rays or MRIs, allow ourselves to be guided by GPS, or communicate via cell phone, we trust the laws of quantum physics. 

According to different estimates, between one-quarter and half of the gross national product of industrialized countries today is based on inventions based on quantum theory, either directly or indirectly. In the future years, this percentage will skyrocket. 

A second generation of quantum technologies has emerged in the last 25 years, following in the footsteps of nuclear technology, medical applications, lasers, semiconductor technology, and modern physical chemistry, all of which were developed between 1940 and 1990. 


This generation is likely to shape our lives even more dramatically than the first. 


This has also been recognized by the People's Republic of China, which has long been viewed as a developing country in terms of scientific research but has been rapidly catching up in recent years. It has designated new quantum technologies as a key topic of scientific study in its 13th Five-Year Plan. 

In the meantime, Europe has seen the signs of the times and has begun investing heavily in quantum technology. 


The first quantum revolution began to take shape more than a century ago. We are currently witnessing the start of the second quantum revolution.








Quantum Chemistry and Quantum Biology

 


Quantum Chemistry - With quantum theory, scientists also recognized a whole new connection between physics and chemistry. 


How atoms combine to form molecules and other compounds is determined by the quantum properties of the electron shells in those atoms. 

That implies that chemistry is nothing more than applied quantum physics. 

Only with knowledge of quantum physics can the structures of chemical bonds be understood. Some readers may recall the cloud-like structures that form around the atomic nucleus. 

These clouds, which are called orbitals, are nothing but approximate solutions of the fundamental equation of quantum mechanics, the Schrödinger equation. 

They determine the probabilities of finding the electrons at different positions (but note that these solutions only consider the interactions between the electrons and the atomic nucleus, not those between the electrons). 


“Quantum chemistry” consists in calculating the electronic structures of molecules using the theoretical and mathematical methods of quantum physics and thereby analyzing properties such as their reactive behavior, the nature and strength of their chemical bonds, and resonances or hybridizations. 


The ever increasing power of computers makes it possible to determine chemical processes and compounds more and more precisely, and this has gained great significance not only in the chemical industry and in materials research, but also in disciplines such as drug development and agro-chemistry. 


Quantum Biology - Last but not least, quantum physics helps us to better understand the biochemistry of life. 


A few years ago bio-scientists started talking about “quantum biology”. For example, the details of photosynthesis in plants can only be understood by explicitly considering quantum effects. 

And among other things, the genetic code is not completely stable, as protons in DNA are vulnerable to the tunnel effect, and it is this effect that is partly responsible for the emergence of spontaneous mutations. 

Yet as always, when something is labelled with the word “quantum”, there is some fuzziness in the package. 

Theoretically, the structures of atoms and molecules and the dynamics of chemical reactions can be determined by solving the Schrödinger equation (or other quantum equations) for all atomic nuclei and electrons involved in a reaction. 

However, these calculations are so complicated that, using the means available today, an exact solution is possible only for the special case of hydrogen, i.e., for a system with a single proton and a single electron. In more complex systems, i.e., in practically all real applications in chemistry, the Schrödinger equation can only be solved using approximations. 

And this requires the most powerful computers available today. 
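For the one exactly solvable case, hydrogen, the solution of the Schrödinger equation gives the familiar energy levels E_n = −13.6 eV / n²; a quick sketch:

```python
RYDBERG_EV = 13.605693  # hydrogen ground-state binding energy, eV

def energy_level(n):
    # Bound-state energies from the exact solution of the Schrödinger
    # equation for hydrogen (one proton, one electron).
    return -RYDBERG_EV / n ** 2

for n in (1, 2, 3):
    print(f"E_{n} = {energy_level(n):.3f} eV")

# Photon energy of the n=2 -> n=1 (Lyman-alpha) transition:
print(f"{energy_level(2) - energy_level(1):.3f} eV")  # ~10.204 eV
```

One closed-form expression covers every bound state of hydrogen; for any larger atom or molecule, no such formula exists and only numerical approximations remain, which is exactly the point made above.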

Theoretically, the equations of quantum theory can be used to calculate any process in the world. 

However, even for simple molecules the calculations are so complex that they require the fastest computers available today, and physicists must nevertheless satisfy themselves with only approximate results. 






Quantum Physics and Electronics




From the Transistor to the Integrated Circuit


The properties of solid matter, such as thermal conductivity, elasticity, and chemical reactivity, are substantially governed by the characteristics and states of its electrons. Quantum effects play a decisive role here as well. 

Quantum physics, among other things, provides a precise explanation for the electrical conductivity of objects, including semiconductors. Their conductivity is intermediate between that of electrical conductors (like copper) and non-conductors (like porcelain), but it may be significantly affected by a variety of factors. 

Changing the temperature of some semiconductors, for example, modifies their conductivity in a way that differs from what happens in metals: it increases rather than decreases as the temperature rises. 

Doping (the technique of introducing foreign atoms into their crystal structure) can also have a substantial impact on their conductivity. 

Micro transistors are thus little more than a collection of differentially doped semiconductor components whose mode of operation is primarily dictated by the flow of electrons within them. 

All of this, once again, is based on quantum physics rules. 

Semiconductor components are the foundations of all electronics, as well as the computer and information technologies that have such a significant impact on our lives today. 

They are packaged in billions on little chips in "integrated circuits," allowing very sophisticated electronic circuits to be coupled on parts as small as a few square millimeters (e.g., in microprocessors and memory chips). 

Individual parts of these integrated circuits nowadays are made up of only a few hundred atomic layers (about 10 nm thick) and everything that happens inside them is governed by quantum physics. 


Today's chips for computers, mobile phones, and other electronic gadgets could not be made without the help of quantum physics. 


The tunnel effect is an example of a quantum phenomenon that is extremely important in microscopic transistors and diodes: 

Quantum particles can cross a barrier with a certain probability, even though, according to conventional physics, they don't have the energy to do so. Simply put, the particle penetrates through the energy barrier. 

Translated to our macro world, this would mean that if we fired a thousand rubber arrows at a lead wall, a few would materialize on the other side, and we could calculate exactly how many. 
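A rough WKB-style estimate T ≈ exp(−2κL) shows how sharply the tunneling probability depends on barrier width; a sketch for an electron at a rectangular barrier (illustrative numbers, not a device model):

```python
import math

HBAR = 1.054571817e-34   # reduced Planck constant, J·s
M_E = 9.1093837015e-31   # electron mass, kg
EV = 1.602176634e-19     # 1 eV in joules

def tunnel_probability(barrier_ev, energy_ev, width_m):
    # WKB-style estimate T ~ exp(-2*kappa*L) for a rectangular barrier,
    # with kappa = sqrt(2 m (V - E)) / hbar.
    kappa = math.sqrt(2 * M_E * (barrier_ev - energy_ev) * EV) / HBAR
    return math.exp(-2 * kappa * width_m)

# A 1 eV barrier and a 0.5 eV electron: probability drops steeply with width.
for width_nm in (0.5, 1.0, 2.0):
    p = tunnel_probability(1.0, 0.5, width_nm * 1e-9)
    print(f"{width_nm} nm barrier: {p:.2e}")
```

The exponential falloff cuts both ways: below roughly 10 nm the probability is large enough that electrons leak through insulating layers, while a controllable tunnel current is exactly what tunnel-effect transistors exploit.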


Quantum tunneling is a strange property with very real and significant implications in today's modern world. 


This is because when the distances between the conductive portions of circuits are reduced to 10 nm or less, difficulties arise: electrons tunnel uncontrollably, causing interference. 

Engineers must devise a variety of strategies to avoid this. They mix multiple materials, for example, to trap electrons, making them less prone to tunnel. 

Meanwhile, scientists have mastered the calculation of the tunnel effect to the point that they can build “tunnel-effect transistors” (TFETs) whose operation depends entirely on the tunnel effect, since the “tunnel current” can itself be controlled. 

In current microelectronics, the tunnel effect of quantum physics plays a fundamental role—on the one hand, as a barrier to ever-increasing downsizing, and on the other, as the foundation of a new transistor technology. 

Aside from the electrical conductivity of solids, everyday properties such as color, transparency, melting point, magnetism, viscosity, deformability, and chemical behavior can only be understood using the rules of quantum physics. 


Without knowledge of quantum phenomena, solid-state physics would be incomprehensible. 


Physicists continue to discover startling effects and behaviors, as well as amazing new macroscopic quantum phenomena that open the door to new applications. Superconductivity, for example, is the total elimination of electrical resistance in some metals at temperatures near absolute zero. 

This phenomenon, first observed in 1911, is explained by a specific many-particle quantum theory devised in 1957: the "BCS theory," named after John Bardeen, Leon Neil Cooper, and John Robert Schrieffer. 

(As a result, John Bardeen became the first and only person to win a second Nobel Prize in physics, in addition to the one for discovering the transistor effect.) 

In 1986, however, researchers discovered ceramic materials that become superconducting at substantially higher temperatures than any previously known superconducting metal (a discovery rewarded with another Nobel Prize only one year later). 

Like many other phenomena in quantum physics, this high-temperature superconductivity is not fully understood (BCS theory does not explain it), yet it holds enormous technological promise. 

The goal of quantum engineers is to discover compounds that superconduct at room temperature. This would allow power to be transmitted across whole countries and continents with no energy loss; current power grids lose roughly 5% of their energy in transmission.


You may also want to read more about Quantum Computing here.




Quantum Physics Shapes the Laser

 


The Laser—Ever More Abstract Theory, Ever More Technology.


However, atomic energy may be used for peaceful purposes, such as in nuclear power plants. Quantum physics has also shaped a number of other extremely helpful technologies, the most well-known of which is the laser. 

According to quantum theory as expressed in Bohr's atomic model, electrons orbiting the atomic nucleus can spontaneously hop from one orbit to another. These are the famous "quantum jumps" (or "quantum leaps"). 

In fact, quantum jumps underpin nearly all of nature's mechanisms for producing light, including chemical reactions like burning (radiation emitted by accelerated charged particles, such as the bremsstrahlung that generates X-rays, is a relatively insignificant source of light). 


But, exactly, how do these leaps happen? 

When an electron jumps to a higher energy level, it absorbs the energy of an incoming light particle (photon); when it jumps down to a lower level, it releases a photon. So far, so good. 


But where do light particles originate and where do they end up? 

Another issue is that single quantum jumps are not causal processes that can be anticipated exactly. Instead, they are instantaneous processes that take place outside of time. 


What exactly does that imply? 

When you flip a light switch, a brief moment passes before the lamp lights up. In other words, the effect takes a fraction of a second to appear. When an electron leaps, however, no time passes at all, not even the tiniest fraction of a second. 

There is no direct trigger for an electron to jump back to its ground state, and we can't pinpoint a certain moment or time period when it happens. 


These quantum problems prompted Einstein to delve more into the subject of light absorption and emission in atoms in 1917. 


The quantized emission of photons from black bodies is described by Planck's radiation formula. From purely theoretical considerations, Einstein found what he himself called an "amazingly simple derivation" of this law of spontaneous light emission. In addition, he discovered an entirely new mechanism that he dubbed "induced light emission." 

This is the emission of photons from adequately prepared (“excited”) atoms that is prompted by another incoming photon rather than occurring spontaneously. The energy produced in this way is released into the electromagnetic field, resulting in the generation of another photon. The triggering photon is still present. 


In an environment where many atoms are excited, i.e., many electrons sit at a higher energy level, a chain reaction of electrons hopping to lower levels can occur, with a corresponding cascade of emitted light. 


The unique aspect here is that each of the freshly produced photons has the same properties as the others: they all oscillate with the same phase, travel in the same direction, and have the same frequency and polarization (direction of oscillation). 

As a result, a very bright light with attributes equal to those of its constituent photons emerges from a few photons that start the chain reaction. 

Physicists also speak of a "coherent light wave." Only in the 1950s and 1960s did physicists succeed in experimentally demonstrating and technologically realizing the stimulated emission of photons that Einstein had described in 1917 on purely theoretical grounds. It became the foundation of the laser, another important quantum technology of the twentieth century. 


A laser is made in two steps: 


  1. First, electrons in a material are driven to higher energy levels by light radiation, an electric current, or other processes (physicists call this "pumping"). 
  2. Then light particles with the same energy (frequency) as the electrons' excitation energy are sent into the medium, causing the electrons to jump back to their ground state. 

As a result, they emit photons that are identical replicas of the incoming photons. The laser's name comes from this process: Light Amplification by Stimulated Emission of Radiation. Even with the laser, scientists were for a long time unsure about the exact nature of the processes involved. 
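The matching of photon energy to excitation energy in step 2 follows the basic relation E = hf = hc/λ. A minimal sketch, using the well-known 632.8 nm red line of a helium-neon laser as an illustrative example:

```python
# Photon energy E = h*c / wavelength links a laser transition's
# excitation energy to the light it emits.

h = 6.62607015e-34    # Planck constant, J*s
c = 2.99792458e8      # speed of light, m/s
eV = 1.602176634e-19  # joules per electronvolt

wavelength = 632.8e-9  # m, the red line of a helium-neon laser

energy_joules = h * c / wavelength
energy_ev = energy_joules / eV

print(f"Photon energy: {energy_ev:.2f} eV")  # ~1.96 eV
```

Every photon released by stimulated emission on this transition carries exactly this energy, which is why the resulting beam is so perfectly monochromatic.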

Only the quantum theory of the electromagnetic field, quantum electrodynamics, could explain the atomic quantum jumps of electrons and the associated spontaneous creation and annihilation of light quanta. 

Even more complex mathematics was required for this description than for basic quantum mechanics. 


The laser once again demonstrates a basic feature of quantum physics: even the most abstract and unintuitive theories may yield very practical applications.










Quantum Physicists - Journey From Magic to Engineering


While the phenomena and properties of the micro world at first seemed magical to physicists, they learned over time to calculate them more and more accurately and eventually tamed this magical world, even though they did not fully comprehend it. 

This intellectual journey led them to quantum theory, which described observable occurrences in the micro universe using wholly new rules and ideas. 

With this theoretical foundation, physicists were no longer magicians but scientists, and eventually engineers, as the new theory allowed for the development of numerous remarkable and occasionally terrible technologies. 

The first of these technologies emerged when scientists applied quantum theory to the atomic nucleus: they found that a huge amount of latent energy lay within it. 

Physicists had to deal with the breakdown of their own established methods of thinking during the years when the world around them was tossed into disarray by two world wars and entire cities were bombed by the warring parties. 

And from this strange new theory sprang a device capable of destroying entire cities in one fell swoop. Even while scientists debated the bizarre and occasionally horrifying qualities of the micro universe away from public view, quantum physics made its debut on the global stage, with a very real and very loud explosion. 


The atomic bomb was the first technical application of quantum physics, and it was the most terrifying weapon ever used by the military. How did such a horrific weapon come to be? 


Since Rutherford's discovery in 1912, the atomic nucleus has been known to be made up of fundamental particles with a positive electric charge (protons). Like-charged particles repel each other, as we learned in school. 


So, how can atomic nuclei stay stable? The atomic nucleus's numerous protons should fly apart! 

Some other force must act attractively, and strongly enough to balance the electric repulsion at the very short distances inside the atomic nucleus. Physicists, however, had no idea what that force could be.


Then there was still another quantum riddle to solve! 


In 1938, Otto Hahn and Lise Meitner conducted experiments with uranium nuclei to learn more about the mysterious force in the atomic nucleus. 

Depending on the isotope, the uranium nucleus has 92 protons and 143 or 146 neutrons. 

When uranium nuclei were bombarded with slow neutrons, two very different elements formed: barium and krypton. Radiochemical methods quickly identified the barium atoms, which have an atomic number of 56 and are considerably lighter than uranium nuclei. 


How did that happen? 

Using theoretical quantum-physical calculations, Meitner concluded that the neutron bombardment had split the uranium nuclei into pieces, and that the fragments had received a large amount of energy, far more than in any previously known atomic process. 


But where did this energy come from? 

Another conundrum. The two nuclei that resulted from the fission (together with three neutrons) weighed somewhat less than the initial uranium nucleus plus the neutron that caused the fission, according to Meitner. 


What had happened to the missing mass? 

The famous formula E = mc², published more than 30 years earlier, came into play at this point: the difference in total mass before and after the fission corresponded exactly to the energy that the fragments had received. 

This was the first known process in which Einstein's equivalence of energy and mass was plainly demonstrated. At the same time, it became evident that these nuclei held tremendous energy! Given the ongoing war, the existence of so much energy in such a small space quickly piqued the military's interest. 
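Meitner's bookkeeping can be sketched numerically: subtract the fragment masses from the initial masses and convert the difference via E = mc², using the standard factor of 931.494 MeV per atomic mass unit. The nuclide masses below are approximate tabulated values, and the barium/krypton fragment pair is just one of many possible outcomes:

```python
# Energy released in one U-235 fission, from the mass defect E = dm * c^2.
# Masses in unified atomic mass units (u), approximate tabulated values.

U235, NEUTRON = 235.043930, 1.008665
BA141, KR92 = 140.914411, 91.926156  # one possible fragment pair
MEV_PER_U = 931.494                  # energy equivalent of 1 u, in MeV

mass_before = U235 + NEUTRON                # nucleus plus incoming neutron
mass_after = BA141 + KR92 + 3 * NEUTRON     # fission also frees 3 neutrons

delta_m = mass_before - mass_after          # the "vanished" mass
energy_mev = delta_m * MEV_PER_U

print(f"mass defect: {delta_m:.4f} u -> about {energy_mev:.0f} MeV per fission")
```

A fraction of a percent of the mass disappears, yet it corresponds to roughly a hundred million times the energy of a typical chemical reaction per atom.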

The American government assembled a team of leading scientists and engineers in complete secrecy (not even the Vice President was briefed). The Manhattan Project's purpose was to build an atomic bomb, the most complicated and demanding engineering project ever undertaken at the time. The scientists succeeded. 


The first atomic bomb was detonated on July 16, 1945, at a test facility in the New Mexico desert.


Its power exceeded even the most optimistic physicists' predictions. They were alarmed, though, as the massive nuclear mushroom cloud rose on the horizon. 


Robert Oppenheimer, the Manhattan Project's scientific director, later recalled that a passage from the Hindu scripture the "Bhagavad Gita" came to his mind: 

"Now I am become Death, the destroyer of worlds." 

Kenneth Bainbridge, one of his colleagues and the test director, put it even more bluntly: 

"Now we are all sons of bitches." 


Their dismay was well founded. Only three weeks later, the second atomic mushroom cloud appeared, this time over Japan, followed by a third only three days later. 

From the scientific discovery of nuclear fission to the atomic mushroom clouds over Hiroshima and Nagasaki, just seven years had passed. 

With the invention of the atomic bomb, quantum physics lost its innocence at the very start.

Physicists had to accept that their quest for knowledge had the potential to destroy not only the dominant worldview, but the whole planet.











Quantum Computing Technological Revolution



    A Microcosm Theory Transformed Our World 


    It all began with three issues: 


    1. In 1900, Max Planck was stumped as to why so-called black bodies radiate energy in "energy packets" of a specific size rather than in random quantities. 
    2. In 1905, Albert Einstein was obliged to concede that light is both a wave and a particle. 
    3. In a surprising experiment in 1912, Ernest Rutherford showed that the atom is made up of a nucleus of protons with electrons circling around it; according to classical physics, however, this should not be possible. 


    Physicists embarked on one of the most thrilling intellectual adventures in human history with these three phenomena in their backpacks. 



    Like the sailors of the fifteenth and sixteenth centuries, they ventured out from the secure shores of classical physics to cross an uncharted ocean in the first 30 years of the twentieth century, eager to see what lay on the other side. 

    Physicists realized around the turn of the twentieth century that the principles of classical physics do not always apply. 


    Tests revealed that many fundamental aspects of the atomic universe cannot be reconciled with either our everyday senses or Western philosophical conceptual systems:


    Superposition: Quantum entities can concurrently be in a mixture of different states that would be mutually exclusive in the classical world. For example, they can move simultaneously along different paths, i.e., they can be at different places at the same time. 


    Randomness in behavior: The measurable properties of a quantum system and their development over time can no longer be absolutely determined. Because a quantum system can be both here and there at the same time, its observable properties can only be specified probabilistically. 


    Dependence of a quantum state on measurement: In the micro world, measurements have a direct influence on the measured object. Even stranger is the fact that only observation assigns a definite state to a quantum particle. In essence, this means that quantum particles have no independent and objective properties; any definite properties they display are conferred by an external observer.


    Entanglement: Quantum particles may be non-locally interconnected. Even if they are spatially far apart, they can still belong to a common physical entity (physicists say a single “wave function”). They are thus coupled together as if by some magic force. 
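The superposition and measurement rules above can be illustrated with a toy simulation. This is plain textbook formalism (amplitudes and the Born rule), not the API of any particular quantum library:

```python
# Toy illustration of superposition and measurement randomness: a qubit
# in the equal superposition (|0> + |1>)/sqrt(2) gives outcome 0 or 1 at
# random, each with probability |amplitude|^2 = 1/2 (the Born rule).
import random

amp0 = amp1 = 2 ** -0.5  # equal amplitudes for the |0> and |1> states

def measure():
    # Born rule: the probability of seeing outcome 0 is amp0 squared.
    return 0 if random.random() < amp0 ** 2 else 1

counts = {0: 0, 1: 0}
for _ in range(10_000):
    counts[measure()] += 1

print(counts)  # roughly 5000 of each outcome
```

Each individual measurement is irreducibly random; only the statistics over many repetitions are predictable, which is exactly the "randomness in behavior" described above.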



    Each of these features of the micro world violates one of four key traditional philosophical principles: 


    1. The principle of uniqueness, according to which things are in definite states (the chair stands in front of the window and not next to the door); 

    2. The principle of causality, according to which every effect must have a cause (if the chair falls over, a force must have acted on it); 

    3. The principle of objectivity (related to the principle of reality), according to which things have an objective existence independently of our subjective perception of them (when we leave the room, the chair remains exactly where it stands and is still there even when we no longer look at it); and 

    4. The principle of independence, according to which things behave individually and independently of one another (the chair is not influenced by the fact that there is another chair in the adjoining room). 


    Humanity's Existential Questions


    For more than 2,500 years, philosophers have grappled with the existential questions of humanity. 

    Democritus wondered whether matter could be split endlessly into smaller and smaller parts and came to the conclusion that there must be minute particles that are indivisible: atoms. Parmenides was in search of the ultimate and changeless substance. 

    Aristotle and Plato were interested in how we as observers relate to the observed. There followed a hundred generations of philosophers who painstakingly sought clarity and coherent descriptions of the world. 

    But then, at the beginning of the 20th century, it became apparent that many philosophical principles found through this tireless and thorough reflection apply only to part of the world. 




