
Quantum Physics and Electronics




From the Transistor to the Integrated Circuit


The properties of a solid form of matter, such as its thermal conductivity, elasticity, and chemical reactivity, are substantially governed by the characteristics and states of its electrons. Quantum effects play a decisive role here as well. 

Quantum physics, among other things, provides a precise explanation for the electrical conductivity of objects, including semiconductors. Their conductivity is intermediate between that of electrical conductors (like copper) and non-conductors (like porcelain), but it may be significantly affected by a variety of factors. 

Changing the temperature of some semiconductors, for example, modifies their conductivity in a way that differs from what happens in metals: it increases rather than decreases as the temperature rises. 

Doping (the technique of introducing foreign atoms into a semiconductor's crystal structure) can also have a substantial impact on its conductivity. 

Microtransistors are thus little more than a collection of differently doped semiconductor components whose mode of operation is primarily dictated by the flow of electrons within them. 

All of this, once again, is based on quantum physics rules. 

Semiconductor components are the foundations of all electronics, as well as the computer and information technologies that have such a significant impact on our lives today. 

Billions of them are packed onto little chips as "integrated circuits," allowing very sophisticated electronic circuits to be combined on parts as small as a few square millimeters (e.g., in microprocessors and memory chips). 

Individual parts of these integrated circuits nowadays are made up of only a few hundred atomic layers (about 10 nm thick) and everything that happens inside them is governed by quantum physics. 


Today's chips for computers, mobile phones, and other electronic gadgets could not be made without the help of quantum physics. 


The tunnel effect is an example of a quantum phenomenon that is extremely important in microscopic transistors and diodes: 

Quantum particles can cross a barrier with a calculable probability, even though according to classical physics they do not have the energy to do so. Simply put, the particle penetrates through the energy barrier. 

In our macro world, this would mean that if we fired a thousand rubber arrows against a lead wall, some would appear on the other side, and we could calculate precisely how many to expect on average. 
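How likely such a crossing is can be estimated with a one-line formula: for a simple rectangular barrier, the transmission probability falls off exponentially with barrier width, roughly T ≈ exp(−2κL) with κ = √(2m(V−E))/ħ. A minimal sketch; the 1 eV barrier height and 0.5 eV electron energy are illustrative assumptions, not values from the text:

```python
import math

HBAR = 1.054571817e-34    # reduced Planck constant (J*s)
M_E = 9.1093837015e-31    # electron mass (kg)
EV = 1.602176634e-19      # one electron volt (J)

def tunnel_probability(barrier_ev, energy_ev, width_nm):
    """Rectangular-barrier estimate T ~ exp(-2*kappa*L), valid for E < V."""
    kappa = math.sqrt(2 * M_E * (barrier_ev - energy_ev) * EV) / HBAR
    return math.exp(-2 * kappa * width_nm * 1e-9)

# An electron with 0.5 eV of energy meeting a 1 eV barrier:
for width_nm in (0.5, 1.0, 2.0):
    print(f"{width_nm} nm barrier: T = {tunnel_probability(1.0, 0.5, width_nm):.2e}")
```

Each extra nanometre of barrier suppresses the tunnel current by several orders of magnitude, which is why tunneling only becomes significant at scales of a few nanometres.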


Quantum tunneling is a strange property with very real and significant implications in today's modern world. 


This is because when the distances between the conductive portions of circuits are reduced to 10 nm or less, difficulties arise: electrons tunnel uncontrolled, causing interference. 

Engineers must devise a variety of strategies to avoid this. They mix multiple materials, for example, to trap electrons, making them less prone to tunnel. 

Meanwhile, scientists have refined the calculation of the tunnel effect to the point that they can build “tunnel-effect transistors” (TFETs) whose operation depends entirely on the tunnel effect, since the "tunnel current" itself can be controlled. 

In current microelectronics, the tunnel effect of quantum physics plays a fundamental role—on the one hand, as a barrier to ever-increasing downsizing, and on the other, as the foundation of a new transistor technology. 

Aside from solids' electrical conductivity, common qualities like color, translucency, freezing point, magnetism, viscosity, deformability, and chemical properties, among others, can only be understood using quantum physics rules. 


Without understanding of quantum phenomena, solid state physics would be impossible to comprehend. 


Physicists continue to discover startling effects and behaviors, as well as amazing new macroscopic quantum phenomena that open the door to new applications. Superconductivity, for example, is the total elimination of electrical resistance in some metals at temperatures near absolute zero. 

This phenomenon, first observed in 1911, is described by a specific many-particle quantum theory, the “BCS theory,” named after John Bardeen, Leon Neil Cooper, and John Robert Schrieffer, who devised it in 1957. 

(As a result, John Bardeen became the first and only person to win a second Nobel Prize in physics, in addition to the one for discovering the transistor effect.) 

In 1986, however, researchers discovered materials that become superconducting at substantially higher temperatures than any previously known superconducting metal (a discovery rewarded with another Nobel Prize only one year later). 

This event, like many others in quantum physics, is not fully understood (BCS theory does not explain it), yet it has enormous technological promise. 

The goal of quantum engineers is to discover superconducting compounds at room temperature. This would allow power to be delivered across whole countries and continents with no energy loss—current power networks lose roughly 5% of their energy.


You may also want to read more about Quantum Computing here.




Quantum Physics Shapes the Laser

 


The Laser—Ever More Abstract Theory, Ever More Technology.


Atomic energy can, however, also be used for peaceful purposes, such as in nuclear power plants. And quantum physics has shaped a number of other extremely helpful technologies, the most well-known of which is the laser. 

According to quantum theory as expressed in Bohr's atomic model, electrons moving around the atomic nucleus can spontaneously hop from one orbit to another. These are the famous "quantum jumps" or "leaps." 

In reality, quantum jumps underpin nearly all of nature's mechanisms for producing light, including chemical reactions like burning (radiation emitted by accelerated charged particles, such as the bremsstrahlung that generates X-rays, is a relatively insignificant source of light). 


But, exactly, how do these leaps happen? 

When an electron jumps to a higher energy level, it absorbs the energy of an incoming light particle (photon); when it jumps down to a lower level, it releases a photon. So far, so good. 


But where do light particles originate and where do they end up? 

Another issue is that single quantum jumps are not causal processes that can be anticipated exactly. Instead, they are instantaneous processes that take place outside of time. 


What exactly does that imply? 

When a light switch is turned on, it takes a moment for the light to appear; the effect follows the cause after a fraction of a second. When an electron leaps, however, no time passes at all, not even the tiniest fraction of a second. 

There is no direct trigger for an electron to jump back to its ground state, and we can't pinpoint a certain moment or time period when it happens. 


These quantum problems prompted Einstein to delve more into the subject of light absorption and emission in atoms in 1917. 


Planck's radiation formula describes the quantized emission of photons from black bodies. From purely theoretical considerations, Einstein found what he himself called an "amazingly simple derivation" of the law of spontaneous light emission. In addition, he discovered an entirely new mechanism, which he dubbed "induced light emission." 

This is the emission of photons from adequately prepared (“excited”) atoms that is prompted by another incoming photon rather than occurring spontaneously. The energy produced in this way is released into the electromagnetic field, resulting in the generation of another photon. The triggering photon is still present. 


In a medium where many atoms are excited, i.e., many electrons sit at a higher energy level, a chain reaction of electrons dropping to lower levels can occur, producing an avalanche of simultaneously emitted light. 


The unique aspect here is that each of the freshly produced photons has the same properties as the others: they all oscillate with the same phase, travel in the same direction, and have the same frequency and polarization (direction of oscillation). 

As a result, a very bright light with attributes equal to those of its constituent photons emerges from a few photons that start the chain reaction. 

Physicists also speak of a “coherent light wave.” Only in the 1950s and 1960s did physicists succeed in experimentally demonstrating and technologically harnessing the stimulated emission of photons that Einstein had described in 1917 on purely theoretical grounds. It became the foundation of the laser, another important quantum technology of the twentieth century. 


A laser is made in two steps: 


  1. First, electrons in a material are excited to higher energy levels by light radiation, an electric current, or other processes (physicists call this “pumping”). 
  2. Then light particles with the same energy (frequency) as the electrons' excitation energy are sent into the medium, causing the electrons to jump back to their ground state. 

As a result, they emit photons that are identical copies of the incoming ones. This process gives the laser its name: Light Amplification by Stimulated Emission of Radiation. Even with the laser, scientists were for a long time unsure about the exact nature of the processes involved. 
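The two steps can be caricatured in a toy model: pumping steadily refills the upper energy level, each photon passing through triggers one stimulated emission and so clones itself, and mirrors and absorption drain a fixed fraction of the light each round trip. All numbers below are illustrative, not real laser parameters:

```python
def simulate_laser(steps=60, pump=200.0, loss=0.9, n_atoms=10_000.0):
    """Toy two-step laser model; every parameter here is illustrative."""
    excited = 0.0   # atoms currently at the upper energy level
    photons = 1.0   # a single seed photon starts the chain reaction
    history = []
    for _ in range(steps):
        excited = min(n_atoms, excited + pump)   # step 1: pumping
        emitted = min(photons, excited)          # step 2: stimulated emission
        excited -= emitted
        photons = (photons + emitted) * loss     # identical clones, minus cavity losses
        history.append(photons)
    return history

h = simulate_laser()
print(f"start: {h[0]:.1f} photons, end: {h[-1]:.0f} photons")
```

The photon number first grows exponentially (each photon begets a copy) and then settles where pumping balances the losses, which is the qualitative behavior of a real laser above threshold.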

Only the quantum theory of the electromagnetic field, or quantum electrodynamics, would be able to explain the atomic quantum jumps of electrons and the related spontaneous generation and destruction of light quanta. 

Even more complex mathematics was required for this description than for the basic quantum mechanics. 


The laser once again demonstrates a basic feature of quantum physics: even the most abstract and unintuitive theories may yield very practical applications.










Quantum Physicists - Journey From Magick to Engineering


While the phenomena and qualities of the micro universe at first appeared magical to physicists, over time they learned to calculate them more and more accurately and eventually tamed this magical world, despite the fact that they did not fully comprehend it. 

Their intellectual journey led scientists to a theory called quantum theory, which described observable occurrences in the micro universe using wholly new rules and ideas. 

With this theoretical foundation, physicists were no longer magicians but scientists, and eventually engineers, as the new theory enabled the development of numerous remarkable and occasionally terrible technologies. 

When scientists applied their quantum physical theories to the atomic nucleus, the first of these phenomena emerged. They found there was a huge amount of latent energy within it. 

Physicists had to deal with the breakdown of their own established methods of thinking during the years when the world around them was tossed into disarray by two world wars and entire cities were bombed by the warring parties. 

And from this strange new theory sprang a device capable of destroying entire cities in one fell swoop. Even while scientists debated the bizarre and occasionally horrifying qualities of the micro universe away from public view, quantum physics made its debut on the global stage, with a very real and very loud explosion. 


The atomic bomb was the first technical application of quantum physics, and it was the most terrifying weapon ever used by the military. How did such a horrific weapon come to be? 


Since Rutherford's discovery in 1911, the atomic nucleus has been known to contain fundamental particles with a positive electric charge (protons). Like-charged particles repel each other, as we learned in school. 


So, how can atomic nuclei stay stable? The atomic nucleus's numerous protons should fly apart! 

Another force must act attractively, and strongly enough to balance the electric repulsion at the very short distances inside the atomic nucleus. At the time, however, physicists had no notion what that force could be.


Then there was still another quantum riddle to solve! 


In 1938, Otto Hahn and Lise Meitner conducted experiments with uranium nuclei to learn more about the mysterious force inside the atomic nucleus. 

Depending on the isotope, the uranium nucleus has 92 protons and 143 or 146 neutrons. 

Bombarding uranium nuclei with slowed neutrons resulted in the formation of two entirely different elements: barium and krypton. Radiochemical methods quickly identified barium atoms, whose atomic number of 56 is far lower than uranium's 92. 


How did that happen? 

Using theoretical quantum physical calculations, Meitner concluded that the neutron bombardment had split the uranium nuclei into pieces, and that the fragments carried away a large amount of energy, far more than in any previously known atomic process. 


But where did this energy come from? 

Another conundrum. The two nuclei that resulted from the fission (together with three neutrons) weighed somewhat less than the initial uranium nucleus plus the neutron that caused the fission, according to Meitner. 


What had happened to the mass that had vanished? 

At this point the famous formula E = mc², derived more than 30 years earlier, came into play: the difference in total mass before and after the fission corresponded exactly to the energy that the fragments had received. 

This was the first known process in which Einstein's equation for the equivalence of energy and mass was so plainly demonstrated. At the same time, it became evident that atomic nuclei hold tremendous energy! Given the looming war, the existence of so much energy in so small a space quickly piqued the military's interest. 
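The bookkeeping behind this is simple arithmetic with E = mc². A sketch using a representative mass defect of about 0.2 atomic mass units per fission event (an assumed round figure; the text gives no numbers):

```python
C = 2.99792458e8            # speed of light (m/s)
AMU_KG = 1.66053906660e-27  # one atomic mass unit (kg)
EV = 1.602176634e-19        # one electron volt (J)

mass_defect_u = 0.2   # assumed, rounded; representative of a uranium fission event
energy_j = mass_defect_u * AMU_KG * C ** 2    # E = m * c^2
energy_mev = energy_j / (1e6 * EV)
print(f"{energy_mev:.0f} MeV released per fission")   # about 186 MeV
```

Multiplied over the enormous number of nuclei in even a gram of uranium, this per-nucleus energy becomes macroscopic.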

In complete secrecy (not even the Vice President was briefed), the American administration assembled a team of senior scientists and engineers. The Manhattan Project's purpose was to build an atomic weapon, the most complicated and demanding engineering project ever undertaken at the time. The scientists succeeded. 


The first atomic bomb was detonated on July 16, 1945, at a test facility in the New Mexico desert.


Its power outstripped even the most optimistic physicists' predictions. They were alarmed, though, as the massive nuclear mushroom cloud rose on the horizon. 


Robert Oppenheimer, the Manhattan Project's scientific director, later said that a passage from the Hindu scripture the "Bhagavad Gita" came to his mind: 

"Now I am become Death, the destroyer of worlds." 

Kenneth Bainbridge, one of his colleagues and the test director, put it even more bluntly: 

"Now we are all sons of bitches." 


Their dismay was well founded. Only three weeks later, a second atomic mushroom cloud rose, this time over Japan, followed by a third only three days later. 

From the scientific discovery of nuclear fission to the atomic mushroom clouds over Hiroshima and Nagasaki, just seven years had passed. 

With the invention of the atomic bomb, quantum physics lost its innocence at the very beginning of its technological career.

Physicists had to accept that their quest for knowledge had the potential to destroy not only the dominant worldview, but the whole planet.











Quantum Computing Technological Revolution



    A Microcosm Theory Transformed Our World 


    It all began with three issues: 


    1. In 1900, Max Planck was stumped as to why so-called black bodies radiate energy in "energy packets" of a specific size rather than in random quantities. 
    2. In 1905, Albert Einstein was obliged to conclude that light is both a wave and a particle. 
    3. In a surprising experiment in 1911, Ernest Rutherford revealed that the atom consists of a compact nucleus with electrons circling around it; yet according to classical physics, such an arrangement should not be possible. 


    Physicists embarked on one of the most thrilling intellectual adventures in human history with these three phenomena in their backpacks. 



    Like the sailors of the fifteenth and sixteenth centuries, they ventured out from the safe shores of classical physics to traverse an undiscovered ocean in the first 30 years of the twentieth century, eager to see what lay on the other side. 

    Physicists realized around the turn of the twentieth century that the principles of classical physics do not always apply. 


    Tests revealed that many fundamental aspects of the atomic universe cannot be reconciled with either our everyday senses or Western philosophical conceptual systems:


    Superposition: Quantum entities can concurrently be in a mixture of different states that would be mutually exclusive in the classical world. For example, they can move simultaneously along different paths, i.e., they can be at different places at the same time. 


    Randomness in behavior: The measurable properties of a quantum system and their temporal development can no longer be absolutely determined. With its ability to be both here and there at the same time, its observable properties can only be specified probabilistically. 


    Dependence of a quantum state on measurement: In the micro world, measurements have a direct influence on the measured object. Even stranger is the fact that only observation assigns a definite state to a quantum particle. In essence, this means that quantum particles have no independent and objective properties. Any properties they have are obtained by an external observer.


    Entanglement: Quantum particles may be non-locally interconnected. Even if they are spatially far apart, they can still belong to a common physical entity (physicists say a single “wave function”). They are thus coupled together as if by some magic force. 
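    The superposition and measurement rules above can be put into numbers: a two-state quantum system carries two complex amplitudes, and a measurement yields each outcome with probability equal to the squared magnitude of its amplitude (the Born rule). A minimal sketch, assuming an equal superposition purely for illustration:

```python
import math
import random

# A quantum bit in an equal superposition of "here" (|0>) and "there" (|1>):
alpha = 1 / math.sqrt(2)   # amplitude for state |0>
beta = 1 / math.sqrt(2)    # amplitude for state |1>

p0, p1 = abs(alpha) ** 2, abs(beta) ** 2
assert math.isclose(p0 + p1, 1.0)   # probabilities must sum to 1

def measure():
    """Measurement forces a single definite outcome, at random (Born rule)."""
    return 0 if random.random() < p0 else 1

counts = [0, 0]
for _ in range(10_000):
    counts[measure()] += 1
print(counts)   # roughly [5000, 5000]
```

    Before measurement only the amplitudes exist; the definite outcome 0 or 1 appears only when `measure()` is called, which mirrors the dependence of a quantum state on measurement described above.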



    Each of these features of the micro world violates one of four key traditional philosophical principles: 


    1. The principle of uniqueness, according to which things are in definite states (the chair stands in front of the window and not next to the door); 

    2. The principle of causality, according to which every effect must have a cause (if the chair falls over, a force must have acted on it); 

    3. The principle of objectivity (related to the principle of reality), according to which things have an objective existence independently of our subjective perception of them (when we leave the room, the chair remains exactly where it stands and is still there even when we no longer look at it); and 

    4. The principle of independence, according to which things behave individually and independently of one another (the chair is not influenced by the fact that there is another chair in the adjoining room). 


    Humanity's Existential Questions


    For more than 2,500 years, philosophers have grappled with the existential questions of humanity. 

    Democritus wondered whether matter could be split endlessly into smaller and smaller parts and came to the conclusion that there must be minute, indivisible particles: the atoms. Parmenides was in search of the ultimate and changeless substance. 

    Aristotle and Plato were interested in how we as observers relate to the observed. There followed a hundred generations of philosophers who painstakingly sought clarity and coherent descriptions of the world. 

    But then, at the beginning of the 20th century, it became apparent that many philosophical principles found through this tireless and thorough reflection apply only to part of the world. 




