How Can We Live in Peace With AI?


The limbic cortex is a region of the brain that neuroanatomists believe is the seat of emotion, addiction, mood, and a variety of other mental and emotional processes. 


The amygdala, the part of the limbic system responsible for basic survival impulses such as fear and aggression, is also known as "the lizard brain" or "the reptilian brain." 



  • This is because the limbic system accounts for essentially all of a lizard's brain function. 
  • The lizard brain is why you're scared, why you don't make all the art you can, why you don't ship when you can. 
  • The lizard brain is the root of the resistance. 
  • The lizard brain is famished, terrified, enraged, and horny. 
  • The lizard brain is primarily concerned with eating and staying safe. 

Because status in the group is necessary for survival, the lizard brain is concerned with what others think. 


The greatest line in Barbara Tuchman's 1962 book The Guns of August sums up the inability to plan for a lengthy World War I, among other mistakes: 


"The inclination of everyone on both sides was not to prepare for the harder alternative, not to act upon what they knew to be true." She also formulated "Tuchman's Law," a historical observation that has since been recognized as a psychological principle of "perceptual readiness" or "subjective probability": 

Disasters are seldom as widespread as they seem in written accounts. They appear continuous and widespread because they are on the record, but they were more likely sporadic in both time and place. Furthermore, as we know from our own experience, the persistence of the normal is typically greater than the impact of the disruption. 

After digesting today's news, one expects to be confronted with a society dominated by strikes, crimes, power outages, broken water mains, delayed trains, school closings, muggers, drug addicts, neo-Nazis, and rapists. 

On a fortunate day, one may get home in the evening without having seen more than one or two of these occurrences. This led me to formulate Tuchman's Law: "the fact of being reported multiplies the apparent extent of any deplorable development by five- to tenfold." 

~ Barbara W. Tuchman 


In other words, people prefer to read about spectacular and overblown occurrences, so events are portrayed as more widespread and continuous than they actually were. 


  • In history and the news, the negative elements of events are often highlighted, while chroniclers frequently overlook the good sides of significant occurrences.
  • Startup failures, for example, are widely publicized, while the achievements of the influential few are seldom covered or recorded in tiny type. 

Many other cognitive bias situations, such as groupthink, fear of authority, lack of creativity, and hyper-rationality, have also been examined by psychologists. 



  • William Whyte coined the word "groupthink," and Irving Janis later developed the concept to explain the poor decision-making that can occur in groups as a consequence of the pressures that bind them together. 
  • The extreme fear of authority is classified as a type of social phobia by mental health professionals. 
  • The biggest adversary of truth, according to Albert Einstein, is blind obedience to authority. 
  • When something appears obvious to those in the know, foreseeable (especially in retrospect), and yet no preparation is made for the unfavorable outcome, it is called a failure of imagination. 
  • There is a lack of imagination if the person lacks the capacity or refuses to pull elements from past experiences and put them together to create an imagined scenario. 
  • Deficits in constructive episodic simulation have been linked to old age and to reliance on other kinds of memory recall. 


Hyper-rationality is a defensive mechanism against anything that is dangerous or unsettling. It depicts circumstances in which reason has been pushed beyond its logical boundaries. 

Artificial Intelligence (AI) on the other hand, offers a distinct perspective on independence: rational, transparent, ever-changing, relentless, and dispassionate. 


  • When applied to real-world circumstances, properly designed self-learning and AI technologies can reduce or overcome cognitive biases. 
  • Depending on human-driven ethical norms, technology may bring both benefit and damage. 
  • If controlled by ethical norms and laws, AI stands a high possibility of becoming a basis for technology that overcomes human weakness. 
  • Biases of various kinds will have varying consequences, but they will always be detrimental. 
  • The four biases above produce, respectively, fear of judgement, fear of failure, fear of the unknown, and fear of the irrational. 
  • This leads to people leaving, hiding, delaying, and freezing, none of which are desirable results for businesses or individuals. 


This seems to be the most frequent observation of contemporary management, particularly with the focus on conflict of interest and fiduciary responsibilities. 


Without any technical knowledge on the board or in management, it is virtually a foregone conclusion that the "correct" thing to do is to do nothing. However, AI has a lot to offer.


~ Jai Krishna Ponnappan


You may also want to read more about Artificial Intelligence here.




Digital To Quantum Computers At A Breakneck Speed



Every year, the quantity of data created throughout the globe doubles. As data collection and transport extend beyond stationary computers, as many gigabytes, terabytes, petabytes, and exabytes were created, processed, and gathered in 2018 as in all of human history prior to 2018. 

Smart Phones, Smart Homes, Smart Clothes, Smart Factories, Smart Cities... the Internet is connecting numerous "smart" objects. And they're generating a growing amount of their own data. 

  • As a result, the demand for computer chip performance is increasing at an exponential rate. 
  • In fact, during the previous 50 years, their computational capacity has roughly doubled every 18 months. 
  • The number of components per unit area on integrated circuits grows in accordance with a law proposed in 1965 by Gordon Moore, who would later co-found Intel. 
  • The reason that the overall volume of data is growing faster than individual computer performance is that the number of data-producing devices is growing at a comparable rate.
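The compounding described in these bullet points is easy to check numerically. Here is a minimal sketch, assuming the commonly quoted 18-month doubling period (the function name and sample horizons are illustrative, not from the text):

```python
# Sketch of Moore's-Law-style compounding: performance doubling every
# 18 months, accumulated over a number of years.

def moore_growth(years, doubling_months=18):
    """Return the performance multiple after `years` of steady doubling."""
    doublings = years * 12 / doubling_months
    return 2 ** doublings

for years in (3, 15, 50):
    print(f"after {years:2d} years: x{moore_growth(years):,.0f}")
```

Three years is two doublings (a factor of 4), and fifty years of steady doubling yields a factor of roughly ten billion.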


Concerns that "Moore's Law" would lose its validity at some point date back 25 years. The reason is that component miniaturization causes problems: 


  • As electrons move through progressively smaller and more numerous circuits, the chips get hotter. But there's a bigger issue: electronic structures have shrunk to fewer than 10 nanometers, which is only around 40 atoms across. 
  • The principles of quantum physics rule in transistors this small, making electron behavior no longer precisely predictable. Moore himself forecast the end of his law in 2007, giving it another 10 to 15 years. 
  • Indeed, for the first time ever, the semiconductor industry's 2016 roadmap for chip development no longer followed Moore's Law. 
  • However, thanks to nano-engineers' ingenuity, it is conceivable that even smaller and faster electronic structures will be achievable in the future, delaying the end of “classical” shrinking for a few more years. But then what? 

How long can we depend on the ability to simply increase the performance of computer chips? 

The fact that Moore's Law will no longer be true does not indicate that we have reached the end of the road in terms of improving information processing efficiency. 


However, there is a technique to make computers that are significantly quicker, even billions of times more powerful: quantum computers. 

  • These computers operate in a very different manner than traditional computers. 
  • Rather than ignoring electron quantum qualities and the challenges associated with ever-increasing component downsizing, a quantum computer overtly uses these qualities in how it processes data. 
  • We might tackle issues that are much too complicated for today's "supercomputers" in physics, biology, weather research, and other fields with the aid of such devices. 
  • The development of quantum computers might spark a technological revolution that will dominate the twenty-first century in the same way that digital circuits dominated the twentieth. 
  • Quantum computers are expected to offer computation speeds that are unimaginable today.

~ Jai Krishna Ponnappan

You may also want to read more about Quantum Computing here.


Quantum Computing And Digital Evolution



Today's computers are based on a concept from the 1940s. Although the shrinking of computer chips has prompted computer developers to study quantum mechanical rules, today's computers still operate purely on classical physics principles. 



  • Tubes and capacitors were used in the earliest computers in the 1940s, and the transistor, which was initially a "classical" component, is still a vital component in any computer today. 
  • The term "transistor" stands for "transfer resistor," which simply indicates that an electrical resistance is controlled by a voltage or current. 
  • The first transistor patent was submitted in 1925. Shortly after, in the 1930s, it was discovered that basic arithmetical operations may be performed by carefully controlling the electric current (for example, in diodes). 
  • Low computation speed and high energy consumption are the two primary reasons why point-contact transistors, triodes, and diodes based on electron tubes are only seen in technological museums today. 
  • Although the components have evolved, the architecture developed by Hungarian mathematician and scientist John von Neumann in 1945 remains the foundation for today's computers. 
  • The memory card, which carries both program instructions and (temporarily) the data to be processed, is at the heart of von Neumann's computer reference model. 
  • A control unit manages the data processing sequentially, that is, step by step, in single binary computing steps. Computer scientists call this a "SISD" architecture (Single Instruction, Single Data). 

Despite the fact that electron tubes and early transistors have been replaced with smaller, faster field-effect transistors on semiconductor chips, the architecture of today's computers has remained the same since its inception. 


How does sequential information processing in computers work? 


Alan Turing, a British mathematician, theoretically outlined the fundamental data units and their processing in 1936. 

The binary digital units, or "bits," are the most basic information units in the system. Because a bit may assume either the state "1" or the state "0," similar to a light switch that may be turned on or off, binary implies "two-valued." 

  • The word "digital" comes from the Latin digitus, which means "finger," and refers to a time when people counted with their fingers. 
  • Today, "digital" refers to information that may be represented by numbers. 
  • In today's computers, electronic data processing entails turning incoming data in the form of many consecutively organized bits into an output that is also in the form of many consecutively ordered bits. 
  • Blocks of individual bits are processed one after the other, much like chocolate bars on an assembly line; for a letter, for example, a block of eight bits, referred to as a "byte," is needed. 
  • There are just two processing options for a single bit: a 0 (or 1) stays a 0 (or 1), or a 0 (or 1) transforms into a 1 (or 0). 
  • The fundamental electrical components of digital computers, known as logic gates, are always the same basic electronic circuits, embodied by physical components such as transistors, through which information is transferred as electric impulses. 
  • The connection of many similar gates allows for more sophisticated processes, such as the addition of two integers. 
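The step from single gates to arithmetic can be sketched in a few lines of Python: a half adder, the simplest circuit that adds two bits, built from an XOR and an AND gate (a toy model for illustration, not circuit-design code):

```python
# A half adder built from two logic gates: XOR produces the sum bit,
# AND produces the carry bit -- connecting gates yields arithmetic.

def AND(a, b):
    return a & b

def XOR(a, b):
    return a ^ b

def half_adder(a, b):
    """Add two single bits; return (sum_bit, carry_bit)."""
    return XOR(a, b), AND(a, b)

for a in (0, 1):
    for b in (0, 1):
        s, carry = half_adder(a, b)
        print(f"{a} + {b} -> sum {s}, carry {carry}")
```

Chaining such adders (a "full adder" per bit position) is exactly how a chip adds whole integers.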

Every computer today is a Turing machine: it does nothing but process information encoded in zeros and ones in a sequential manner, changing it into an output encoded in zeros and ones as well. 


  • However, this ease of data processing comes at a cost: to manage the quantity of data necessary in today's complicated computer systems, a large number of zeros and ones must be handled. 
  • The number of available computational blocks increases the processing capacity of a computer only linearly: a chip with twice as many circuits can process data twice as quickly. 
  • The speed of today's computer chips is measured in gigahertz, or billions of operations per second. This necessitates the use of billions of transistors. 
  • The circuitry must be tiny to fit this many transistors on a chip the size of a thumbnail; only then can such fast-switching systems' total size and energy consumption be kept under control. 
  • The move from the electron tube to semiconductor-based bipolar or field effect transistors, which were created in 1947, was critical for the shrinking of fundamental computing units on integrated circuits in microchips. 
  • Doped semiconductor layers are used to construct these nanoscale transistors. 


This is where quantum mechanics enters the picture. 

  • We need a quantum mechanical model of the movement of electrons within these semiconductors to comprehend and regulate what's going on. 
  • This is the so-called "band model" of electronic energy levels in metallic conductors. 

Understanding quantum physics was not required for the digital revolution of the twentieth century, but it was a need for the extreme downsizing of integrated circuits.


~ Jai Krishna Ponnappan

You may also want to read more about Quantum Computing here.


Quantum Computing - A Different Approach to Calculation.



Richard Feynman posed the question of whether the quantum world could be simulated by a normal computer in his 1981 lecture Simulating Physics with Computers, as part of a philosophical reflection on quantum theory. 

Because quantum variables do not assume fixed values, the difficulty arises from the probabilities associated with quantum states. 

They do, in fact, occupy a full mathematical space of potential states at any given instant. 


This greatly expands the scope of the computations. 

Any traditional computer, Feynman concluded, would be swamped sooner or later. 

However, he went on to wonder if this challenge might be handled with a computer that merely calculates state probabilities, or a computer whose internal states are quantum variables themselves. 


  • The weird quantum features of atomic and subatomic particles would be openly exploited by such a quantum computer. 
  • Above all, it would have a radically different structure and operation from the von Neumann architecture of today's computers. 
  • It would compute in parallel on the many states adopted concurrently by the quantum variables, rather than processing bit by bit like a Turing computer. 
  • In a quantum computer, the basic information units are no longer called "bits," but "quantum bits," or "qubits" for short. 
  • Unfortunately, this term is somewhat deceptive, since it still contains the word "binary," which is precisely what quantum bits are not. 
  • The nature of information in qubits differs significantly from that of traditional data. Qubits are no longer binary: they can take both states at the same time, as well as any combination of the two. 
  • As a result, a qubit can store significantly more information than merely 0 or 1. 


The unusual capacity of qubits is due to two peculiar qualities that can only be found in quantum physics: 


  1. Superposition of classically exclusive states: Quantum states may exist in superpositions of classically exclusive states. The light switch in the tiny world may be turned on and off at the same time. This allows a qubit to assume the states 0 and 1 at the same time, as well as all states in between.
  2. Entanglement: Several qubits may be brought into entangled states, in which they are joined in a non-separable whole as though by an unseen spring. They are in some form of direct communication with each other, even though they are geographically distant, thanks to a "spooky action at a distance," a phrase used sarcastically by Albert Einstein to emphasize his disbelief in this quantum phenomenon. It's as though each quantum bit is aware of what the others are doing and is influenced by it.
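Both properties can be illustrated with a toy state-vector model in plain Python (no quantum library; the two states below, an equal superposition and a Bell pair, are standard textbook examples chosen for illustration):

```python
import math

# States as lists of amplitudes. A single qubit spans |0>, |1>;
# two qubits span |00>, |01>, |10>, |11>.

inv_sqrt2 = 1 / math.sqrt(2)

plus = [inv_sqrt2, inv_sqrt2]          # superposition: |0> and |1> at once
bell = [inv_sqrt2, 0, 0, inv_sqrt2]    # entangled pair: |00> + |11>

def probabilities(state):
    """Born rule: the probability of each outcome is the squared amplitude."""
    return [abs(a) ** 2 for a in state]

print(probabilities(plus))  # 50/50 between 0 and 1
print(probabilities(bell))  # only 00 and 11 occur: the qubits are correlated
```

In the Bell state, the outcomes 01 and 10 have probability zero: measuring one qubit fixes what the other will show, which is the correlation the "unseen spring" metaphor describes.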


Superpositions and entanglement were formerly the subject of fierce debate among quantum physicists. 

  • They've now formed the cornerstone of a whole new computer architecture. 
  • Calculations on a quantum computer are substantially different from those on a conventional computer due to the radically distinct nature of qubits. 


Unlike a traditional logic gate, a quantum gate (or quantum logical gate) represents a basic physical manipulation of one or more (entangled) qubits rather than a technological building block that transforms individual bits into one another in a well-defined manner. 


  • A particular quantum gate may be mathematically characterized by a matching (unitary) matrix that works on the qubit ensemble's states (the quantum register). 
  • The physical structure of the qubits determines how such an operation and the flow of information will seem in each situation. 
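What "a (unitary) matrix that works on the qubit states" means can be sketched with the Hadamard gate, a standard single-qubit gate, in plain Python (an illustrative toy, not a quantum library):

```python
import math

# The Hadamard gate H as a 2x2 unitary matrix acting on a qubit's two
# amplitudes. Applying it twice restores the original state, since H
# is its own inverse.

H = [[1 / math.sqrt(2),  1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]

def apply(gate, state):
    """Matrix-vector product: the gate transforms the state's amplitudes."""
    return [sum(gate[i][j] * state[j] for j in range(len(state)))
            for i in range(len(gate))]

ket0 = [1.0, 0.0]                # the definite state |0>
superposed = apply(H, ket0)      # equal superposition of |0> and |1>
restored = apply(H, superposed)  # back to |0>, up to rounding
print(superposed, [round(x, 9) for x in restored])
```

A gate on an n-qubit register is, likewise, a 2^n-by-2^n unitary matrix; the physical hardware merely has to realize that transformation.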

Quantum gates' tangible technological manifestation is still a work in progress.


~ Jai Krishna Ponnappan

You may also want to read more about Quantum Computing here.


Quantum Computing Power On An Exponential Scale.



A single qubit can only accomplish so much. 


  • Only the entanglement of numerous qubits in quantum registers allows for the high-level parallelization of operations that make quantum computers so powerful. 
  • It's as if a slew of chocolate manufacturers opened their doors at the same moment. 
  • You can process multiple states in parallel if you have more qubits. 
  • Unlike traditional computers, which rise in processing power linearly as the number of computational components rises, the processing power of a quantum computer grows exponentially with the number of qubits employed. 

  • Adding 100 extra qubits to 100 qubits does not merely double the performance of a quantum computer. 
  • In principle, adding just a single qubit to the 100 already doubles it. 


  • In theory, adding 10 qubits to a quantum computer increases its performance by a factor of roughly 1,000, although in practice other factors limit the increase (see below). 
  • With 20 new qubits, the quantum computer is already a million times faster, and with 50 new qubits, the quantum computer is a million billion times faster. 
  • And, with 100 additional information carriers, the performance gain of a quantum computer can hardly be expressed in ordinary numbers, while a conventional computer with 100 additional components would merely have doubled its performance.
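The figures above all reduce to powers of two, since an n-qubit register spans 2^n basis states. A quick check (the function is plain illustrative arithmetic):

```python
# An n-qubit register spans 2**n basis states, so each added qubit
# doubles the state space, while classical capacity grows only linearly.

def state_space(n_qubits):
    return 2 ** n_qubits

base = state_space(100)
print(state_space(101) // base)  # +1 qubit: factor 2
print(state_space(110) // base)  # +10 qubits: factor 1,024 (roughly 1,000)
print(state_space(150) // base)  # +50 qubits: about 1.1e15, a "million billion"
```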


Quantum computers, even those with just a few hundred qubits, have considerably more computing capacity than conventional computers.  


  • At this point, it's worth noting that entangled states' huge parallelization isn't precisely equivalent to the way parallel assembly lines function in chocolate factories. 
  • The way information is stored and processed in entangled quantum systems is fundamentally different from how information is stored and processed in typical digital computers. 

Quantum computers do not function in parallel in the traditional sense; instead, they arrange information such that it is dispersed over many entangled components of the system as a whole, and then process it in a strangely parallel manner. 


This is shown in the following example.  


For a standard 100-page book, the reader gains 1% of the book's material with each page read. After reading all of the pages, the reader understands all there is to know about the book. 

Things are different in a hypothetical quantum book where the pages are entangled. 

The reader sees just random nonsense while looking at the pages one by one, and after reading all of the pages one by one, he or she still knows next to nothing about the book's content. 

Anyone interested in learning its content must look at all of its pages at the same time. 

This is because the information in a quantum book is nearly entirely contained in the correlations between the pages, rather than on the individual pages. 

 

For the time being, the notion of qubits and quantum computers is primarily theoretical. 


However, quantum engineers have made significant progress in recent years in their efforts to make quantum computers operate in reality. 

Qubits may be built in a variety of ways, and they may be entangled in a variety of ways. 


In theory, the goal is to use ingenious tactics to catch individual quantum systems, such as atoms or electrons, entangle them, and control them appropriately:

 

One approach is to use electric and magnetic fields to trap ions (electrically charged atoms) and then let them oscillate in a controlled manner, linking them together as qubits. 

Another approach uses atomic spin coupling, which is aligned by external magnetic fields in the same way as nuclear magnetic resonance technologies.

    • The use of so-called quantum dots may also be used to manufacture qubits. 
    • These are regions of a material where electron mobility is highly restricted in all directions.
    • This implies that energy can no longer be released continuously, but only in discrete amounts, according to quantum physics principles.
    • As a result, these regions act as huge artificial atoms. 

Other researchers are attempting to build quantum computers by sending electrons around loops of superconducting material (known as superconducting quantum interference devices, or SQUIDs) that are interrupted by extremely thin layers of insulator.

  •  Companies like Google, Microsoft, IBM, and Intel have a specific emphasis on this area.
  • The approach takes advantage of the Josephson effect, in which the superconductor's Cooper pairs of electrons may tunnel through the insulating barrier.
  • The current may be in two distinct quantum states at the same time, flowing both clockwise and anticlockwise. Superpositions like this may be employed as qubits and entangled. 

Qubits might also be made out of certain chemical substances. One example is a complex of a vanadium ion enclosed by organic sulfur compounds. The ion's spin is so thoroughly shielded by this shell that its state (and any entanglement) is preserved for a long time. 

The so-called topological quantum computer is currently a completely theoretical idea. Its origins are in mathematics, and it is still unclear whether it can be physically realized. It is based on what are known as anyons (not to be confused with the anions found in aqueous solutions). These are states in two-dimensional space that exhibit particle-like attributes, which is why they are also known as "quasi-particles." Anyons may form, for example, at insulator interfaces. 


Topological qubits should form highly stable networks that are significantly more resistant to perturbations than the qubits of other approaches. 

A quantum computer is being developed by a number of research organizations throughout the globe. The suspense is building: which strategy will win out?


~ Jai Krishna Ponnappan

You may also want to read more about Quantum Computing here.


Quantum Computing Solutions And Problems



Quantum computers have the ability to solve problems as well as create new ones. 


Problems where today's computers, no matter how powerful, quickly hit their limits highlight the promise of quantum computers: 


1. Cryptography: Almost every standard encryption technique is based on the difficulty of factoring the product of two very large prime numbers. To decode the message, one must figure out which two primes a particular integer is made up of. This is simple for the number 39: the corresponding primes are 3 and 13. This job, however, can no longer be done by a traditional computer in any reasonable time once the number of digits exceeds a certain threshold. In 1994, computer scientist Peter Shor devised an algorithm that, using a quantum computer, could factorize the products of extremely large prime numbers into their divisors in minutes. 

2. Completing difficult optimization tasks: Finding the best answer from a large number of options is a difficult challenge for mathematicians. The traveling salesman problem is a classic example: the goal is to determine the best sequence in which to visit various destinations so that the overall journey is as short as possible. With only 15 cities, there are approximately 43 billion potential route choices; with 18 cities, the number rises to over 177 trillion. Similar problems arise in industrial logistics, semiconductor design, and traffic flow optimization. Even with a modest number of points, traditional computers struggle to find the best answers in an acceptable amount of time. Quantum computers are projected to be substantially more efficient at solving such optimization problems.

3. In the area of artificial intelligence, a substantial application might be found: in this discipline, deep neural networks are used to address combinatorial optimization problems that quantum computers could solve far better and faster than any conventional computer. Quantum computers, in particular, might recognize structures considerably faster in very noisy data (which is very important in practical applications) and thus learn considerably faster. As a result, the new "mega buzzword" quantum machine learning is presently circulating, combining two buzzwords that already pique the interest of many people. 

4. Searches in huge databases: A traditional computer must evaluate each data point separately when searching unsorted data collections, so the search time scales linearly with the number of data points. For big volumes of data, the number of computing steps required becomes too enormous for a traditional computer to handle. Lov Grover, an Indian-American computer scientist, presented a quantum algorithm in 1996 that requires only about the square root of the number of data points in processing steps. With a quantum computer using the Grover algorithm, instead of taking a thousand times as long to process a billion data entries as opposed to a million, the work would take just over 30 times as long. 

5. Theoretical chemistry: Quantum computers have the potential to vastly enhance models of electron behavior in solids and molecules, particularly where entanglement is a prominent factor. For, as we know today, the calculation and simulation of quantum systems involving interacting electrons is best done using computers that themselves have quantum mechanical properties, as Feynman observed back in 1981. Theoretical physicists and chemists nowadays often deal with sophisticated optimization problems involving selecting the best conceivable, i.e., energetically most beneficial, arrangement of electrons in an atom, molecule, or solid from a large number of options. They've been attempting to solve such problems for decades, with mixed results. 

Because quantum computers function as quantum systems themselves, they may directly map and simulate the quantum behavior of the electrons involved, while conventional computers must frequently pass through a crude abstraction of such systems. 

Physicists refer to these as quantum simulators. "Right now, we have to calibrate regularly with experimental data," says Alán Aspuru-Guzik, a pioneer in the modeling of molecules on quantum computers. "If we have a quantum computer, some of that will go away." 
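The asymmetry behind application 1 (cryptography) can be made concrete with a naive classical factoring routine. This trial-division sketch is purely illustrative; it is not Shor's algorithm, and the helper name is ours:

```python
# Naive classical factoring by trial division. It is instant for the
# 39 = 3 x 13 example, but becomes hopeless for numbers with hundreds
# of digits -- the gap Shor's quantum algorithm would close.

def factor_semiprime(n):
    """Return two factors p <= q of an odd number n = p * q."""
    d = 3
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 2
    return n, 1  # n itself is prime

print(factor_semiprime(39))
print(factor_semiprime(101 * 113))
```

Multiplying the two primes takes microseconds; undoing the multiplication by search is what blows up as the number grows.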
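The route counts quoted in application 2 follow from a simple formula: with n cities there are (n - 1)!/2 distinct round trips. The sketch below checks those figures and brute-forces a tiny instance (the five city coordinates are invented for illustration):

```python
import math
from itertools import permutations

# Round trips through n cities: fix the start, order the rest, and halve
# because a route and its reverse have the same length.

def route_count(n_cities):
    return math.factorial(n_cities - 1) // 2

print(f"{route_count(15):,}")  # about 43.6 billion
print(f"{route_count(18):,}")  # about 177.8 trillion

# Exhaustive search is feasible only for tiny instances, e.g. five points:
points = {"A": (0, 0), "B": (1, 5), "C": (4, 1), "D": (6, 4), "E": (3, 3)}

def tour_length(order):
    path = ("A", *order, "A")  # start and end at city A
    return sum(math.dist(points[x], points[y]) for x, y in zip(path, path[1:]))

best = min(permutations([c for c in points if c != "A"]), key=tour_length)
print(best, round(tour_length(best), 2))
```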
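The "just over 30 times" figure in application 4 is simply a ratio of square roots, which a few lines of arithmetic confirm (the function names are illustrative, not Grover's):

```python
import math

# Classical search over N unsorted entries scales with N; Grover's
# algorithm scales with sqrt(N). Going from a million to a billion
# entries costs 1000x classically, but only about 31.6x with Grover.

def classical_steps(n):
    return n

def grover_steps(n):
    return math.sqrt(n)

million, billion = 10 ** 6, 10 ** 9
print(classical_steps(billion) // classical_steps(million))
print(round(grover_steps(billion) / grover_steps(million), 1))
```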

Quantum computing's applications are, of course, of enormous interest to government agencies. For example, with a quantum computer and its code-cracking capabilities, spy services may obtain access to sensitive material held by foreign countries (or their citizens). 


According to Edward Snowden, the American National Security Agency (NSA) is quite interested in the technology. 


Quantum computers might also usher in a new era of industrial espionage, since company data would no longer be completely secure. 

Some scientists even anticipate that one day, quantum computers will be able to solve all of nature's issues that are impossible to solve on conventional computers due to their complicated quantum features. 



Quantum computers, in particular, might aid in the following tasks: 


  1. Calculate the ground and excited states of complicated chemical and biological compounds, as well as their reaction kinetics. This is significant, for example, in the discovery of active medicinal compounds, the construction of even more effective catalysts, and the optimization of the Haber–Bosch fertilizer manufacturing process. 
  2. Decipher the electronic structures of crystals, which would greatly advance solid-state physics and materials science. Nanotechnology would benefit greatly from new discoveries in these fields. In molecular electronics, one example is the accurate computation of the properties of prospective new energy storage devices or components. Another crucial application would be the discovery of new high-temperature superconductors. 
  3. Calculate the behavior of black holes, the early universe's development, and the dynamics of high-energy elementary particle collisions. With the aid of a quantum computer, scientists may better anticipate and comprehend molecules and the specifics of chemical interactions than they can now, finding new forms of treatment on a weekly basis or developing far superior battery technologies within a month. 


Quantum computers pose a danger to data security throughout the world. 

Simultaneously, they may allow scientists to tackle previously intractable issues in a variety of scientific areas, resulting in significant technological advancements.


~ Jai Krishna Ponnappan

You may also want to read more about Quantum Computing here.


When Will A Quantum Computer Be Available?



IBM announced in the spring of 2016 that it would make its quantum computing technology available to the public as a cloud service. 


As part of the IBM Quantum Experience, interested parties may utilize the offered programming and user interface to log into a 5-qubit quantum computer over the Internet and build and run programs.

  • IBM's objective was to push the development of bigger quantum computers forward. In January 2018, the company made a 20-qubit version of its quantum computer available to a restricted group of businesses. 
  • Prototypes with 50 qubits are reportedly already available. 
  • Google then declared in the summer of 2016 that a 50-qubit quantum computer would be ready by 2020. This deadline was subsequently moved up to 2017 or early 2018. 

  • Google announced the release of Bristlecone, a new 72-qubit quantum processor, in March 2018. 
  • According to IBM, quantum computers with up to 100 qubits will be accessible in the mid to late 2020s. 
  • A quantum computer with around 50 qubits, according to most quantum experts, might outperform the processing capabilities of any supercomputer today—at least for certain key computational tasks. 

In the context of quantum supremacy, Google walks the talk. We'll find out very soon what new possibilities actual quantum computers open up. We may be seeing the start of a new age. 


There are still several significant difficulties to tackle on the route to developing working quantum computers:


  • The most important is that under the omnipresent impact of heat and radiation, entangled quantum states decay extremely quickly—often too quickly to complete the intended operations without mistake. 
  • The “decoherence” of quantum states is a term used by physicists in this context. Chap. 26 will go through this phenomenon in further depth. 
  • Working with qubits is akin to writing on the water's surface rather than a piece of paper. 
  • The latter may persist hundreds of years, while any writing on water vanishes in a fraction of a second. 
  • As a result, it is critical to be able to operate at very high speeds; even the rates at which classical computers process data are hard for us humans to imagine. 
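The water-writing analogy can be made slightly more concrete with a toy model in which a qubit's coherence decays exponentially with a characteristic time T2. This is only an illustrative sketch; the 100-nanosecond T2 below is an assumed round number, not a figure for any particular hardware.

```python
import math

def coherence_remaining(t_ns, t2_ns=100.0):
    """Toy decoherence model: fraction of a qubit's coherence left after t_ns
    nanoseconds, assuming simple exponential decay with time constant t2_ns."""
    return math.exp(-t_ns / t2_ns)
```

With this assumed T2 of 100 ns, half the coherence is gone after about 69 ns, and after a single microsecond less than 0.005 percent remains, which is why every quantum operation must finish at breakneck speed.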


Quantum engineers are using a two-pronged approach to solve this obstacle. 


  • On the one hand, they are attempting to lengthen the lifespan of qubits, thus lowering their susceptibility to errors; on the other, they are designing special algorithms to correct any faults that do arise (this is called quantum error correction). 
  • With the use of ultra-cold refrigerators, physicists can limit the consequences of decoherence.
  • Furthermore, strategies for dealing with decoherence-related mistakes in individual qubits are improving all the time. 
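The redundancy idea behind quantum error correction can be loosely illustrated with its classical ancestor, the three-bit repetition code; real quantum codes must also correct phase errors and cannot simply copy qubits, so treat this purely as a sketch of the principle.

```python
import random

def encode(bit):
    # Repetition code: store one logical bit in three physical bits.
    return [bit, bit, bit]

def noisy_channel(bits, p):
    # Each physical bit flips independently with probability p
    # (a crude stand-in for decoherence-induced errors).
    return [b ^ (random.random() < p) for b in bits]

def decode(bits):
    # Majority vote corrects any single bit flip.
    return int(sum(bits) >= 2)

def logical_error_rate(p, trials=100_000):
    # Estimate how often the decoded bit is wrong despite the redundancy.
    errors = sum(decode(noisy_channel(encode(0), p)) != 0 for _ in range(trials))
    return errors / trials
```

For a physical error rate of 10 percent, the logical error rate drops to roughly 3 percent; the scheme helps whenever p is below one half, which is the classical analogue of an error-correction threshold.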


As a result, there is reason to believe that quantum computer dependability will improve dramatically in the future. 

However, quantum engineers' efforts have not yet delivered reliably operating quantum computers (as of fall 2021). 

Quantum computers will continue to be developed by companies such as IBM, Google, Intel, Microsoft, and Alibaba in the coming years. They claim to have made great strides in the last several years.


~ Jai Krishna Ponnappan

You may also want to read more about Quantum Computing here.


What is the Quantum Internet?



Owing to the delicate nature of the qubit, conveying qubit information is technically far more complicated than transferring electrons in classical computers (as happens in any electric cable) or electromagnetic waves on the global internet. 

Nonetheless, quantum information can currently be transported across hundreds of kilometers by optical fiber with negligible data loss. 

Quantum entanglement makes this feasible. In this situation, physicists use the term quantum teleportation. 


Quantum Teleportation


The name is unfortunate since quantum teleportation has nothing to do with the conveyance of matter between two places without crossing space, as depicted in popular science fiction. 


  • Quantum teleportation is the transfer of quantum characteristics of particles, often known as quantum states (qubits), from one location to another. 
  • Only quantum information is transferred in this manner, but there is no transmission line for the data to go from sender to receiver. 
  • In principle, entangled particles may be separated indefinitely without their entanglement dissipating. Since the 1990s, physicists have speculated that this characteristic enables quantum teleportation in practice. 
  • Two quantum particles (for example, photons) are entangled in a shared quantum physical state and then geographically separated without losing their shared state. 
  • The sender transmits one particle to the receiver while the other stays at the sender. So much for the preparation. The actual data transmission may now commence. 
  • A simultaneous measurement of the entangled qubit and the qubit to be teleported is made at the sender (a so-called "Bell measurement"). 
  • According to quantum physics, the measurement of the sender's particle determines the state of the entangled particle at the receiver automatically and instantly, without any direct connection between them. 
  • The result of the measurement at the transmitter is subsequently sent to the receiver over a standard communication channel. 
  • As a result of this measurement, the entangled qubit at the receiver is projected onto one of four possible states.
  • The receiver qubit may be changed to be in the same state as the sender qubit using knowledge about the measurement result at the sender. 
  • Without physically carrying a particle, the required (quantum) information is sent from the transmitter to the receiver in this manner. Of course, by manipulating his or her particle in the same manner, the receiver may also become the transmitter. 
  • Quantum teleportation is not about conveying information faster than light, but rather about safely moving quantum states from one location to another, since the outcome of the measurement is sent normally, i.e., not instantly. 
  • Quantum teleportation enables the transmission, storage, and processing of qubits, or quantum information. 
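The steps above can be verified with a small state-vector simulation (a sketch in NumPy; the qubit ordering and correction convention are our own choices, not a standard API): for every one of the four possible Bell-measurement outcomes, the corrected receiver qubit ends up in exactly the sender's original state.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)          # Hadamard gate
X = np.array([[0, 1], [1, 0]])                        # bit-flip correction
Z = np.array([[1, 0], [0, -1]])                       # phase-flip correction
CNOT = np.array([[1, 0, 0, 0], [0, 1, 0, 0],
                 [0, 0, 0, 1], [0, 0, 1, 0]])         # controlled-NOT on two qubits

def teleport(psi):
    """Teleport the single-qubit state psi; return the receiver's corrected
    qubit for each of the four Bell-measurement outcomes (m0, m1)."""
    bell = np.array([1, 0, 0, 1]) / np.sqrt(2)        # entangled pair: qubits 1 and 2
    state = np.kron(psi, bell)                        # 3-qubit state, order |q0 q1 q2>
    state = np.kron(CNOT, np.eye(2)) @ state          # sender: CNOT(q0 -> q1)
    state = np.kron(H, np.eye(4)) @ state             # sender: Hadamard on q0
    outcomes = []
    for m0 in (0, 1):
        for m1 in (0, 1):                             # project onto measurement result
            base = 4 * m0 + 2 * m1
            out = state[base:base + 2].copy()         # receiver's unnormalized qubit
            out = out / np.linalg.norm(out)
            if m1:
                out = X @ out                         # classically signalled bit flip
            if m0:
                out = Z @ out                         # classically signalled phase flip
            outcomes.append(out)
    return outcomes
```

Running `teleport(np.array([0.6, 0.8]))` returns the state (0.6, 0.8) for all four outcomes, confirming that the two classical correction bits are all the receiver needs.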


As a result, a quantum internet, in addition to the quantum computer, looks to be within reach.

Quantum technologies are on the verge of transforming our planet. 


In order to truly appreciate them, we must first comprehend how physicists have learnt to characterize the world of atoms. We'll need to go further into the strange realm of quantum physics for this aim.


~ Jai Krishna Ponnappan

You may also want to read more about Quantum Computing here.


Juno, NASA's Spacecraft, Takes A Close Look At Jupiter's Moon Ganymede

 


From the left to the right: The mosaic and geologic maps of Ganymede, Jupiter's moon, were created using the finest available photos from NASA's Voyager 1 and 2 spacecraft, as well as NASA's Galileo spacecraft. 

Credit: USGS Astrogeology Science Center/Wheaton/NASA/JPL-Caltech 


After more than 20 years, the first of the gas-giant orbiter's back-to-back flybys will deliver a close encounter with the gigantic moon. 

NASA's Juno spacecraft will pass within 645 miles (1,038 kilometers) of Jupiter's biggest moon, Ganymede, on Monday, June 7 at 1:35 p.m. EDT (10:35 a.m. PDT). The flyby will be the closest a spacecraft has come to the solar system's largest natural satellite since NASA's Galileo spacecraft made its last close approach on May 20, 2000. 


The solar-powered spacecraft's flyby will provide insights about the moon's composition, ionosphere, magnetosphere, and ice shell, in addition to stunning photographs. Future missions to the Jovian system will benefit from Juno's studies of the radiation environment around the moon. 

Ganymede, which is bigger than the planet Mercury, is the only moon in the solar system with its own magnetosphere, a bubble-shaped region of charged particles surrounding the celestial body. “Juno contains a suite of sensitive equipment capable of observing Ganymede in ways never previously possible,” said principal investigator Scott Bolton of the Southwest Research Institute in San Antonio. 

“By flying so close, we will bring Ganymede exploration into the twenty-first century, complementing future missions with our unique sensors and assisting in the preparation of the next generation of missions to the Jovian system, including NASA's Europa Clipper and ESA's Jupiter ICy moons Explorer [JUICE] mission.” 


About three hours before the spacecraft's closest approach, Juno's science equipment will begin gathering data. Juno's Microwave Radiometer (MWR) will gaze through Ganymede's water-ice crust, gathering data on its composition and temperature, alongside the Ultraviolet Spectrograph (UVS) and Jovian Infrared Auroral Mapper (JIRAM) sensors. 




A spinning Ganymede globe with a geologic chart placed over a global color mosaic is animated. Credit: USGS Astrogeology Science Center/Wheaton/ASU/NASA/JPL-Caltech 


“The ice shell of Ganymede contains some light and dark parts, implying that certain parts may be pure ice while others include filthy ice,” Bolton explained. 


“MWR will conduct the first comprehensive study of how ice composition and structure change with depth, leading to a deeper understanding of how the ice shell originates and the mechanisms that resurface the ice over time.” 

The findings will complement those from ESA's upcoming JUICE mission, which will observe the ice using radar at different wavelengths when it becomes the first spacecraft to orbit a moon other than Earth's Moon in 2032. 


Juno's X-band and Ka-band radio frequencies will be utilized in a radio occultation experiment to study the moon's fragile ionosphere (the outer layer of an atmosphere where gases are excited by solar radiation to form ions, which have an electrical charge). 

“As Juno travels behind Ganymede, radio signals will travel over Ganymede's ionosphere, generating modest variations in frequency that should be picked up by two antennas at the Deep Space Network's Canberra complex in Australia,” said Dustin Buccino, a Juno mission signal analysis engineer at JPL. “We might be able to grasp the relationship between Ganymede's ionosphere, its intrinsic magnetic field, and Jupiter's magnetosphere if we can monitor this change.” 


With NASA's interactive Eyes on the Solar System, you can see where Juno is right now. 

The Juno spacecraft is a dynamic technical wonder, with three huge blades reaching out 66 feet (20 meters) from its cylindrical, six-sided body, spinning to keep itself steady as it executes oval-shaped orbits around Jupiter. 


Juno's Stellar Reference Unit (SRU) navigation camera is normally responsible for keeping the Jupiter spacecraft on track, but it will perform double duty during the flyby. 


Along with its navigational functions, the camera will collect information on the high-energy radiation environment in the region surrounding Ganymede by capturing a particular collection of photos. 

The camera is adequately insulated against radiation that might otherwise harm it. “In Jupiter's harsh radiation environment, the traces from penetrating high-energy particles appear in the photos as dots, squiggles, and streaks, like static on a television screen,” said Heidi Becker, Juno's radiation monitoring lead at JPL. “We extract these radiation-induced noise patterns from SRU photos to obtain diagnostic pictures of the radiation levels encountered by Juno.” 


Meanwhile, the Advanced Stellar Compass camera, developed by the Technical University of Denmark, will count very intense electrons that pass through its shielding at a quarter-second interval. The JunoCam imager has also been enlisted. 


The camera was designed to convey the thrill and beauty of Jupiter exploration to the public, but it has also provided a wealth of essential science throughout the mission's almost five-year stay there. For the Ganymede flyby, JunoCam will capture photographs at a resolution comparable to the best from Voyager and Galileo. 

The Juno research team will examine the photographs and compare them to those taken by earlier missions, looking for changes in surface features that may have occurred over four decades or more. 

Any changes in the pattern of craters on the surface might aid astronomers in better understanding the present population of objects that collide with moons in the outer solar system. 


Due to the speed of the flyby, the frozen moon will change from a point of light to a visible disk and back to a point of light in roughly 25 minutes from JunoCam's perspective. 


That window leaves just enough time for five photographs. “Things move quickly in the area of flybys, and we have two back-to-back flybys coming up next week. As a result, every second counts,” stated Juno Mission Manager Matt Johnson of the Jet Propulsion Laboratory. 

“On Monday, we'll fly through Ganymede at about 12 miles per second (19 kilometers per second). Less than 24 hours later, we're making our 33rd scientific flyby of Jupiter, swooping low over the cloud tops at around 36 miles per second (58 kilometers per second). It's going to be a roller coaster.” 

More About the Mission

The Juno mission is managed by JPL, a division of Caltech in Pasadena, California, for the principal investigator, Scott J. Bolton of the Southwest Research Institute in San Antonio. Juno is part of NASA's New Frontiers Program, which is administered for the agency's Science Mission Directorate in Washington by NASA's Marshall Space Flight Center in Huntsville, Alabama. 


The spacecraft was manufactured and is operated by Lockheed Martin Space in Denver. 


courtesy www.nasa.com

Posted by Jai Krishna Ponnappan


More data on Juno may be found at,


https://www.nasa.gov/juno

https://www.missionjuno.swri.edu


Follow the mission on social media at 

https://www.facebook.com/NASASolarSystem 

and on Twitter at https://twitter.com/NASASolarSystem 






Nanomachine Manufacture - A World Made of Dust—Nano Assemblers



Let us consider Feynman's ultimate vision: machines that can manufacture any substance from atomic components in the same way that children construct buildings out of Lego bricks. 

In a form of atomic 3D printer, a handful of soil includes all the essential atoms to allow such "assemblers" to construct what we want seemingly out of nowhere. 


  • The term "nano-3D" may become a new tech buzzword in the near future. These devices would not be completely new! They've been around for 1.5 billion years on our planet. 
  • Nanomachines manufacture proteins, cell walls, nerve fibers, muscle fibers, and even bone molecule by molecule in our body's two hundred distinct cell types using specific building blocks (sugar molecules, amino acids, lipids, trace elements, vitamins, and so on). 
  • Here, highly specialized proteins, the enzymes, play a key role. The energy required for these activities comes from the food we consume. 
  • Biological nanomachines carry, create, and process everything we need to exist in numerous metabolic processes, like a small assembly line. 
  • Nature's innovation of cell metabolism in living systems demonstrated that assemblers are conceivable a long time ago. Enzymes are the genuine masters of efficiency as nanomachines. 

What is preventing us, as humans, from producing such technologies? 


We can even take it a step further: if nanomachines can accomplish anything, why couldn't they construct themselves? 


  • Nature has also demonstrated this on the nanoscale: DNA and RNA are nothing more than extremely efficient, self-replicating nanomachines. 
  • Man-made nanomachines may not be as far away from self-replication as they appear. 
  • Nanotechnology opens up a world of possibilities for us to enhance our lives. Nonetheless, most people are put off by the word "nano," as are the phrases "gene" and "atomic," which similarly relate to the incomprehensibly small. 
  • Nanoparticles, genes, and atoms are all invisible to the naked eye, yet the technologies that rely on them are increasingly influencing our daily lives. 


What happens, though, when artificial nanomachines have their own momentum and are able to proliferate inexorably and exponentially? What if nanomaterials turn out to be toxic? 


The first of these issues has already arisen: nanoparticles used in a variety of items, such as cosmetics, can collect in unexpected areas, such as the human lung or in marine fish. 


What impact do they have in that area? 

Which compounds have chemical reactions with them and can attach to their extremely active surfaces? 


  • According to several research, certain nanoparticles are hazardous to microorganisms. 
  • To properly analyze not just the potential, but also the impacts of nanotechnologies, more information and education are necessary. 

This is especially true of the quantum computer.


~ Jai Krishna Ponnappan


You May Also Want To Read More About Nano Technology here.



Nano: Infinite Possibilities On The Invisible Small Scale



Norio Taniguchi was the first to define the word "nanotechnology" in 1974: Nanotechnology is primarily concerned with the separation, consolidation, and deformation of materials by a single atom or molecule. 


  • The word "nano" refers to particle and material qualities that are one nanometer to 100 nanometers in size (1 nm is one millionth of a millimeter). 
  • The DNA double helix has a diameter of 1.8 nm, while a soot particle is 100 nm in size, almost 2,000 times smaller than the full stop at the end of this sentence. 
  • The nanocosm's structures are therefore substantially smaller than visible light wavelengths (about 380–780 nm). 

The nano range is distinguished by three characteristics: 


It is the boundary between quantum physics, which applies to atoms and molecules, and classical rules, which apply to the macro scale. Scientists and engineers can harness quantum phenomena to develop materials with unique features in this intermediate realm. This includes the tunnel effect, which, as indicated in the first chapter, is significant in current transistors. 

When nanoparticles are coupled with other substances, they aggregate a huge number of additional particles around them, which is ideal for scratch-resistant car paints, for example. 

Because surface atoms are more easily detached from atomic assemblies than atoms in the bulk, nanoparticles function as catalysts for chemical reactions. A simple geometric consideration demonstrates this. A cube with a side of one nanometre (about four atoms along each edge) contains roughly 64 atoms, 56 of which sit on the surface (87.5 percent). The bigger the particle, the fewer surface atoms are available for reactions in comparison to bulk atoms. Only 7.3 percent of the atoms in a nanocube with a side of 20 nm (containing 512,000 atoms) are on the surface. Their share declines to about 1.2 percent at 100 nm.
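The geometry can be checked with a few lines of code. The sketch below assumes a simple cubic packing with roughly four atoms per nanometre of edge length, as in the figures above.

```python
def surface_fraction(n):
    """Fraction of atoms on the surface of an n x n x n simple cubic block:
    total atoms minus the (n-2)^3 interior atoms, divided by the total."""
    total = n ** 3
    interior = max(n - 2, 0) ** 3
    return (total - interior) / total

# Roughly four atoms per nanometre of edge:
# 1 nm cube  -> 4 atoms per side
# 20 nm cube -> 80 atoms per side
```

`surface_fraction(4)` gives 0.875 and `surface_fraction(80)` gives about 0.073, matching the 87.5 and 7.3 percent figures above; at 400 atoms per side (100 nm) this simple model yields on the order of one percent.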


Nanoparticles thus consist almost entirely of surface, making them extremely reactive and endowing them with surprising mechanical, electrical, optical, and magnetic properties. 


Physicists have known for a long time that this is true in (quantum) theory. However, the technologies required to isolate and treat materials at the nanoscale have not always been available. 

  • The invention of the Scanning Tunneling Microscope (STM) by Gerd Binnig and Heinrich Rohrer in 1981 was a watershed moment in nanotechnology (it earned them the 1986 Nobel Prize in Physics). Single atoms can be seen with this device. Owing to a special quantum phenomenon (the tunneling effect), the electric current between the microscope's tip and the electrically conductive sample reacts extremely sensitively to changes in their spacing as small as one tenth of a nanometer. 
  • In 1990, Donald Eigler and Erhard Schweizer succeeded in moving individual atoms from point A to point B by altering the voltage applied to the STM tip; the device could now not only view but also move individual atoms. The two researchers “wrote” the IBM logo with 35 xenon atoms on a nickel crystal. Twenty-two years later, researchers were able to construct a one-bit memory cell using just 12 atoms (conventional one-bit memory cells still comprise hundreds of thousands of atoms). 
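The extreme distance sensitivity that makes the STM work can be estimated from the textbook one-dimensional tunneling model, where the current falls off as exp(-2*kappa*d) with kappa = sqrt(2*m*phi)/hbar. The 4.5 eV work function below is a typical assumed value, not a parameter of any specific instrument.

```python
import math

def current_ratio(delta_d_nm, work_function_ev=4.5):
    """Factor by which the STM tunneling current changes when the tip-sample
    gap widens by delta_d_nm, in the simple 1D barrier model I ~ exp(-2*k*d)."""
    hbar = 1.054571817e-34                      # reduced Planck constant, J*s
    m_e = 9.1093837015e-31                      # electron mass, kg
    phi = work_function_ev * 1.602176634e-19    # barrier height, J
    kappa = math.sqrt(2 * m_e * phi) / hbar     # inverse decay length, 1/m
    return math.exp(-2 * kappa * delta_d_nm * 1e-9)
```

Widening the gap by just one tenth of a nanometre cuts the current to roughly a tenth of its value, which is exactly the sensitivity described above.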

What Feynman envisioned as a vision of the future in 1959, namely the atom-by-atom production of exceedingly small goods, is now a reality. 

Physicists and engineers are using quantum physics to not only manipulate atoms and create microscopic components, but also to produce new materials (and better comprehend existing ones).


~ Jai Krishna Ponnappan


You May Also Want To Read More About Nano Technology here.


What Is Artificial General Intelligence?

Artificial General Intelligence (AGI) is defined as the software representation of generalized human cognitive capacities that enables the ...