
Quantum Revolution 2.0 - The Mighty Trio



Overall, three key technical fields will have a significant impact on our civilization in the near future: genetic engineering, artificial intelligence (AI), and quantum technology 2.0. 



Artificial intelligence and gene technology are generally considered dangerous, and the debate over their use and effects is in full swing. 


In reality, these technologies have the potential to transform not just our daily lives, but also humanity itself. 

They might, for example, be used to combine people and machines in the future to enhance our capacities by merging our cognitive skills with machine computing and physical performance. 

It is also possible, however, that machine intelligence will one day surpass ours in general cognitive skills, not only in mathematics, chess, or Go. 

By contrast, quantum technologies 2.0 (such as quantum computers and nanomaterials) are still just a hazy blip on the radar of those concerned about the social effects of emerging technologies. 

At the same time, the three technologies described before are inextricably linked. 

They will cross-fertilize each other, resulting in a considerably greater effect when combined. 



New quantum technologies, for example, have the potential to improve AI and genetic engineering significantly: 


• The processing power of quantum computers may help AI researchers enhance neural network optimization methods once again. 

• Nanomachines might reproduce themselves using a handbook provided by humans and enhance these instructions using genetic algorithms on their own. 

• Using smart nanobots as a genetic editing engine, we might actively alter our DNA to repair and enhance it indefinitely. 


The main issue is deciding who will be responsible for determining what constitutes an optimization.




Quantum Technology 2.0's effect has been grossly underestimated. 



Its contribution to the advancement of artificial intelligence, as well as its prospective use in genetic engineering, will be critical. 

Today, the debate about emerging quantum technologies still focuses primarily on the possible health risks of nanoparticles in the human body. 

This odd rejection of quantum technology's potential isn't completely innocuous. 

This blind spot is exacerbated by another cognitive bias: we've become used to the notion that technological development is accelerating, but we underestimate its absolute pace. 


Aldous Huxley's renowned 1932 book Brave New World is an example of this. 



Quantum Revolution 2.0 - Technology and Social Change



Increased scientific knowledge has always had a significant effect on technical, social, and economic advances, just as it has always entailed enormous ideological revolutions. 



The natural sciences are, in reality, the primary engine of our contemporary wealth. 


The persistent quest for knowledge leads to scientific advancement, which, coupled with the dynamism of free-market competition, leads to equally consistent technical advancement. 

The first gives humanity ever-increasing insights into the structure and processes of nature, while the second provides us with almost unlimited opportunities for individual initiative, economic growth, and improvements in quality of life. 



Here are a few instances from the past: 


• During the Renaissance, new technical breakthroughs such as papermaking, printing, mechanical clocks, navigation tools/shipping, building, and so on ushered in unparalleled wealth for Europeans. 

• The fruits of Newtonian physics found a spectacular technical expression in the shape of steam engines and heat machines, based on the new theory of heat, during the Industrial Revolution of the 18th and 19th centuries. 

• Transportation and manufacturing were transformed by railway and industrial equipment. 

• In the late 1800s, Faraday and Maxwell's electromagnetic field theory led directly to the electrification of cities, modern telecommunications, and electrical devices for a significant portion of the rural population. 

• The technological revolution of the twentieth century roughly corresponds to the first generation of quantum technologies and has brought us lasers, computers, imaging devices, and much more (including, unfortunately, the atomic bomb), resulting in a first wave of political and economic globalization. 



Digitization, with its ever-faster information processing and transmission, industrial integration with information and communication technology, and, of course, the internet, has ushered in a new era of political and economic globalization. 


Something new will emerge from the impending second quantum revolution. 

It will radically transform communication, engagement, and manufacturing once again. 

The Quantum Revolution 2.0, like all other technological revolutions, will usher in yet another significant shift in our way of life and society. 



~ Jai Krishna Ponnappan


You may also want to read more about Quantum Computing here.







Quantum Revolution 2.0



When Nanobots and Quantum Computers Become Part of Our Everyday Lives.



Quantum theory is the biggest scientific revolution of the twentieth century. 


Furthermore, the notion that we live in a universe that is only ostensibly real and predictable is a total departure from our normal thinking patterns. 

We still don't know how this revelation will influence our thinking in the future. 

The philosophical implications of a breakdown of subject–object dualism in the microcosm, the laws of symmetry in theoretical physics, and the non-local effects of entangled particles have yet to pervade our daily lives and thoughts. 

Despite this, quantum physics has already profoundly impacted our contemporary worldview. 



Many individuals today have said their goodbyes to absolute certainty, whether religious, philosophical, or scientific in character. 


They can cope with the ambiguity of contradictory facts (in the sense of Bohr). 

This isn't even the most impressive feature of quantum theory. 

What else is there to look forward to? Great shifts in our perspective have, sooner or later, always profoundly altered our lives: 

• The development of rational philosophical thinking in ancient Greece is the earliest historical example. 

Traditional (religious) solutions to basic issues of mankind, such as how the universe came into existence, what happens to us after death, why this or that natural event occurs, and so on, were no longer sufficient. 

The image of Zeus, the supreme deity, hurling bolts of lightning down to Earth was no longer sufficient; world events were increasingly subjected to rigorous examination based on logical rules and standards of empirical observation. 



It took many centuries for the “transition from myth to logos” to occur (from about 800 to 200 BC). 


The synthesis of a naturalistic and rational view of nature that emerged at this period continues to influence how people think today. 

Then, in the late Renaissance, came the creation of the scientific method. 

People rediscovered the philosophers of Ancient Greece after one and a half millennia of religious rigidity, and they started to evaluate nature scientifically and logically once again. 

What was new was that scientists were now attempting to explain nature using mathematical principles in a systematic and theoretical manner. 

This resulted in significant intellectual, religious, social, and political shifts. 

Humans quickly realized they were no longer at the mercy of the elements. 

Their yearning for a unique way of life, economic independence, and the exploration of new horizons outweighed the intellectual and geographic limitations of the Middle Ages. 



Scientists' efforts to comprehend the world resulted in a rising urge to change it. 


During the Enlightenment, a new, critical style of scientific thought gained popularity. 

God was relegated to the position of watchmaker in Newton's mechanics. 

The religiously justified legitimacy of political, social, and economic authority started to crumble, since there was no longer an everlasting "God-ordained" order. 



Walls between hierarchical social strata that had been impenetrable for thousands of years eventually became porous. 


All of this led to a considerably higher level of human intellectual potential—what we now call "human capital." Had Albert Einstein been born in the early 17th century, he would most likely have followed in his father's footsteps as a modest trader. 

As a physicist in the twentieth century, he was able to alter our worldview. 

Darwin's theory of evolution shifted man's place in the universe, making him the product of a process that all animals and plants had gone through. 

As a consequence, God as Creator and other similar transcendent concepts were rendered obsolete indefinitely. 



Darwin's assertion that each human being is evolutionarily unique fueled the contemporary world's strong individualism. 


The new picture of man had an effect on moral ideals as well: social Darwinism, which was widely accepted at the time, put self-preservation and personal achievement at the center of human ambition. 

Darwin's ideas were quickly applied to the social and political fabric of human life, rather than being limited to physical survival and biological reproduction. 

We may expect millennia-old principles of our existence and the way we perceive ourselves to be further revolutionized as a result of quantum theory's revelation that our reality in its microstructure is non-real and nondeterministic. 

The shifts in our self-perception we've made so far are most likely harbingers of much more dramatic shifts to come. 

The discovery of quantum physics was the most significant intellectual event of the twentieth century, and it is likely to alter our worldview much more than it has already.



~ Jai Krishna Ponnappan


You may also want to read more about Quantum Computing here.





Nanoscale - Surface-To-Volume Ratio



Surface atoms in nanoscale objects behave differently from bulk atoms. 



Consider the ratio of surface area A to volume V of a nanostructure to determine whether surface or bulk effects prevail. 





The ratio A/V of three solids shaped as a sphere, a cube, and a right square pyramid is compared in the table. 

It demonstrates that this ratio scales as 1/r, where r is a linear measure of size. 

All regular, simple shapes are found to scale in the same way. 

Even for a complex structure, if a single size parameter can be identified (for example, by enclosing the structure within a sphere of radius r), the same scaling holds roughly. 


Physically, the 1/r scaling means that the ratio A/V rises as the size of a three-dimensional structure decreases. 



In the example illustrated in the figure, the dramatic impact on surface area can be observed. 




The cube A has 1 m sides and a 6 m² surface area. 

When split into smaller 1 cm cubes (part B), each cube has a surface area of only 6 cm², but there are 10⁶ of them, resulting in a total surface area of 600 m². 

If cube A were instead cut into 1 nm cubes (part C), the total surface area would be 6000 km². 

Although the overall volume stays the same in all three instances, the collective surface area of the cubes is enormously increased. 
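
To make the scaling concrete, here is a minimal Python sketch (the edge lengths are the illustrative values used above) that reproduces these numbers by computing the total surface area of a fixed 1 m³ volume subdivided into cubes of edge length a:

```python
# Total surface area of a 1 m^3 volume subdivided into cubes of edge a (in meters).
def total_surface_area(a, volume=1.0):
    n_cubes = volume / a**3          # number of small cubes
    area_per_cube = 6 * a**2         # surface area of one small cube
    return n_cubes * area_per_cube   # = 6 * volume / a, i.e. it scales as 1/a

for a, label in [(1.0, "1 m"), (1e-2, "1 cm"), (1e-9, "1 nm")]:
    print(f"edge {label:>4}: total area = {total_surface_area(a):.1e} m^2")

# Output:
# edge  1 m: total area = 6.0e+00 m^2
# edge 1 cm: total area = 6.0e+02 m^2
# edge 1 nm: total area = 6.0e+09 m^2   (= 6000 km^2)
```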



A significant increase in the area of surfaces (or interfaces) may result in completely new electrical and vibrational states for each surface. 


Indeed, surface effects are responsible for melting that begins at the surface (pre-melting) and for the lower melting temperature of a nanoscale object compared with its bulk counterpart. 

Furthermore, significant changes in the thermal conductivity of nanostructures may be ascribed in part to increased surface area. 



A nanowire's thermal conductivity, for example, may be considerably lower than that of the bulk material, whereas carbon nanotubes have a much greater thermal conductivity than diamonds. 


Even though bulk gold is chemically inert, it exhibits significant chemical reactivity in the form of nanosized clusters, an example of surface effects. 

This is due in part to the large fraction of surface atoms in a gold nanocluster, which behave more like individual atoms. 

Bulk silver, on the other hand, does not readily react with hydrochloric acid. 

Yet the strong reactivity of silver nanoparticles with hydrochloric acid has been ascribed to the electronic structure of their surface states. 



An increase in surface area affects mechanical and electrical characteristics in addition to thermal and chemical properties. 


Indium arsenide (InAs) nanowires, for example, show a monotonic reduction in mobility as their radius approaches 10 nm. 

The low-temperature transport results clearly indicate that mobility deterioration is caused by surface roughness scattering. 



Furthermore, the existence of surface charges and a decreased coordination of surface atoms may result in a very high stress that is far outside the elastic regime. 


Unusually, charges on the polar surfaces of thin zinc-oxide (ZnO) nanobelts may cause them to spontaneously form rings and coils. 

Unexpected nanoscale phenomena such as shape memory and pseudo-elasticity may be explained by a high surface area and the associated surface effects. 

Young's modulus of films thinner than 10 atomic layers, for example, is found to be 30% lower than the bulk value. 

All of these findings suggest that the increased surface-to-volume ratio is central to the behavior of nanoscale objects.





Making An Appearance At The Nanoscale





Materials that interact with electromagnetic and other fields show a wide variety of spatial and temporal phenomena. 


The independence of any observation with regard to the choice of time, location, and units is a fundamental principle in physics. 


Physical quantities must be rescaled consistently throughout space-time, but this does not mean that physics is scale invariant. 

At the smallest scales, physics requires a quantized description, and Planck's constant h sets the limit of what can be observed. 




Our world is governed by four basic forces: gravity, the weak force, the strong force, and the electromagnetic force, according to the standard model of constituent particles. 


Each of these forces has a distinct coupling strength as well as a distinct distance dependency. 

The gravitational and electromagnetic forces scale as 1/r² (the inverse-square law) and may operate over vast distances, while the weak and strong forces act only over short ranges. 

The strong force is virtually unobservable at distances larger than about 10⁻¹⁴ m, while the weak force is confined to even shorter distances (of the order of 10⁻¹⁷ m or less). 

All of this indicates that we should be aware of the scale and units used to measure various amounts. 



All forces have the property of fading away as one travels away from the source. 


Quantum field theory describes any force between two objects in terms of exchange particles: virtual particles emitted by one object (the source) and absorbed by the other (the sink). Photons, gluons, weak bosons, and gravitons are the four kinds of exchange particles that give rise to the four forces; they carry spin in integer units of ħ = h/(2π) and transfer momentum between the two interacting objects. 

The force produced between the two objects equals the rate at which momentum is transferred. 



Quantum field theory indicates that when the distance between objects grows, this force decreases. 


For example, the electromagnetic force between two charged particles diminishes as 1/r², while for dipole–dipole interactions this dependence becomes 1/r⁴.


Because most physical or chemical characteristics can be traced back to interactions between atomic or molecular components, they all tend to retain vestiges of the inverse-distance dependency and appear as size-dependent traits for nanoscale objects. 



When at least one of the dimensions falls below 100 nm, the following material characteristics become size dependent (to varying degrees): 


• Mechanical properties: elastic moduli, adhesion, friction, and capillary forces; 

• Thermal properties: melting point, thermal conductivity; 

• Chemical properties: reactivity, catalysis; 

• Electrical properties: quantized conductance, Coulomb blockade; 

• Magnetic properties: spin-dependent transport, giant magnetoresistance; 


A practical and beneficial consequence of this size dependence is that engineers may tune one or more characteristics of a bulk material by resizing it into the nano regime (1 nm to 100 nm). 

This property is at the heart of the idea of metamaterials, which are artificially created materials that enable nanotechnology to be used in real-world applications. 


~ Jai Krishna Ponnappan


You May Also Want To Read More About Nano Technology here.






Nanotechnology And Nanoscience.




The International System of Units (SI, short for Système International d'Unités) was formally established in 1960, building on the metric conventions first adopted in 1889. 


It is based on seven basic units for measuring time, length, mass, electric current, temperature, quantity of material, and luminous intensity: second (s), meter (m), kilogram (kg), ampere (A), kelvin (K), mole (mol), and candela (cd). 


Multiples and submultiples of a unit are created by adding prefixes denoting integer powers of ten to these base units. 

The SI system also stipulates that prefixes for negative powers of 10 derive from Latin (e.g., milli (m), micro (µ), nano (n)), while prefixes for positive powers of 10 derive from Greek (e.g., kilo (k), mega (M), giga (G)). 

In 1958, the prefix nano was adopted to denote 10⁻⁹ of an SI unit. 

The term nano comes from the classical Latin nanus, or its ancient Greek etymon nanos (νᾶνος), which means dwarf, according to the Oxford English Dictionary. 



In 1974, Norio Taniguchi used the term nanotechnology to characterize his work on ultrafine machining and its promise for building sub-micrometer devices. 



This term now refers to a transformative technology capable of constructing, manipulating, and directing individual atoms, molecules, or their interactions at the nanometer scale (1 to 100 nm). 

While this usage reflects the spirit of modern nanotechnology, defining it purely by the size of the objects involved has a number of flaws. 

For example, the International Organization for Standardization (ISO) has suggested expanding the scope to include materials with at least one internal or surface feature at which the onset of size-dependent phenomena departs from the behavior of individual atoms and molecules. 

By using nanoscale characteristics, such structures allow new applications and lead to better materials, electronics, and systems. 






The science of tiny devices is known as nanoscience. 


Essentially, nanoscience operates at a scale where we can harness the collective, rather than individual, properties of atoms and molecules. 

As we'll see later, the new features of nanostructures are primarily defined by the aggregate behavior of their individual building blocks. 

The figure below depicts a range of objects with length scales ranging from 0.1 nanometers to one centimeter. 

On the right side of this image, an enlarged view of a few nanoscale (1 to 100 nm) items involved in the development of nanotechnology is displayed. 



~ Jai Krishna Ponnappan


You May Also Want To Read More About Nano Technology here.




Nanotechnology - A Historical Perspective

  


In 1867, James Clerk Maxwell suggested the use of small machines to defy the second law of thermodynamics, which says that the entropy of a closed system cannot decrease. 



According to this law, heat must flow from hot to cold, preventing the construction of a perpetual motion machine. 


Maxwell's demon is a gedanken experiment involving a machine (or demon) guarding a small opening between two gas reservoirs at the same temperature. 

The demon can determine the speed of individual molecules and allow only the fastest to pass, creating a temperature difference between the two reservoirs without expending any work. 

Maxwell's demon is unlikely to succeed, since the second law of thermodynamics has survived the test of time, but it is remarkable that concepts of molecular-level sensing and manipulation were imagined more than 150 years ago. 

More recently, in a 1959 lecture to the American Physical Society titled “There's Plenty of Room at the Bottom,” physicist Richard Feynman alluded to the possibility of having miniaturized devices, made of a small number of atoms and working in compact spaces, for exploiting specific effects unique to their size and shape to control synthetic chemical reactions and produce useful products. 



Humans have used the interaction of light with nanoparticles without knowing the physics underlying it, according to historical data. 


The Lycurgus Cup, illustrated in the figure, is an interesting example. 

It is believed to have been created in the fourth century by Roman artisans. 

The cup is made of glass with gold and silver nanoparticles implanted in it, and it has a color-changing feature that allows it to take on various colors depending on the light source. 

When seen in reflected light, it looks jade green. 

When light is shone through it from inside, however, the cup appears translucent red. 

The ruby-red and deep-yellow hues of the second item in Figure, a stained-glass window at Lancaster Cathedral depicting Edmund and Thomas of Canterbury, are created by trapped gold and silver nanoparticles in the glass. 

Modern theories of plasmon formation can explain these visual phenomena, but how ancient craftsmen arrived at the precise material characteristics and compositions needed to achieve them in practice remains a mystery. 



Long before the contemporary advances that enable humans to harness the power of nanotechnology deliberately, natural processes had been skillfully exploiting nanoscale effects for billions of years. 


Examples include collecting solar energy via photosynthesis, precise replication of the DNA structure, and DNA repair caused by endogenous or external causes. 

The primary goal of nanoscience is to discover such phenomena that are unique to the nanoscale. 

Nanotechnology, which helps society via particular applications such as longer-lasting tennis balls, more efficient solar cells, and cleaner diesel engines, is based on theoretical know-how and understanding gained through nanoscience. 

However, there are numerous examples, from prehistoric times to the present day, where the application of a technology preceded the underlying science: practitioners were unaware of the reasons for the strange behavior they observed in materials and devices, behavior very different from that of familiar individual atoms, molecules, and bulk matter, yet continued to use them in applications – a pattern that modern engineers and scientists appear to be following. 


~ Jai Krishna Ponnappan


You May Also Want To Read More About Nano Technology here.






Quantum Computing - A 3 Qubit Entangled State Achieved

 



In a completely controlled array of spin qubits in silicon, a three-qubit entangled state has been achieved. 




The device in a false-colored scanning electron micrograph. The purple and green structures are the aluminum gates. Six RIKEN scientists used the device to entangle three silicon-based spin qubits. Image: RIKEN Center for Emergent Matter Science.




The number of silicon-based spin qubits that can be entangled has been raised from two to three by an all-RIKEN team, emphasizing the promise of spin qubits for implementing multi-qubit quantum algorithms. 


When it comes to specific kinds of computations, quantum computers have the potential to outperform conventional computers. 

They rely on quantum bits, or qubits, which are the quantum equivalents of the bits used in traditional computers. 

Small blobs of silicon known as silicon quantum dots have many characteristics that make them extremely appealing for realizing qubits, despite being less developed than certain other qubit technologies. 

Long coherence periods, high-fidelity electrical control, high-temperature functioning, and a large scaling potential are among them. 



To link multiple silicon-based spin qubits, however, scientists must be able to entangle more than two qubits, a feat that has eluded them until now. 


Seigo Tarucha and five colleagues from RIKEN's Center for Emergent Matter Science have successfully initialized and measured a three-qubit array on silicon with high fidelity (the fidelity being the probability that a qubit is in the expected state). 

They also used a single chip to integrate the three entangled qubits. 

This demonstration is a first step in expanding the possibilities of spin qubit-based quantum systems. 

"Two-qubit operations are sufficient for performing basic logical computations," Tarucha says. 

"However, for scaling up and incorporating error correction, a three-qubit system is the bare minimum." The team's gadget is controlled by aluminum gates and consists of a triple quantum dot on a silicon/silicon–germanium heterostructure. 



One electron may be found in each quantum dot, and its spin-up and spin-down states encode a qubit. 


An on-chip magnet creates a magnetic-field gradient that splits the three qubits' resonance frequencies, allowing them to be addressed separately. 

The researchers used a two-qubit gate, a tiny quantum circuit that is the building block of quantum computing systems, to entangle two of the qubits. 

By integrating the third qubit with the gate, they were able to achieve three-qubit entanglement. 
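
The construction described here, a two-qubit entangling gate extended to a third qubit, can be illustrated numerically. The sketch below is an idealized toy simulation, not the team's actual gate sequence: it assumes the standard textbook recipe of one Hadamard and two CNOT gates, which prepares a GHZ-type three-qubit entangled state:

```python
import numpy as np

# Single-qubit gates
I = np.eye(2)
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
# CNOT acting on (control, target) as the first two qubits of a pair
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

def kron(*ops):
    out = np.array([[1.0]])
    for op in ops:
        out = np.kron(out, op)
    return out

# Start in |000> (qubit order q0 q1 q2, q0 is the most significant bit)
psi = np.zeros(8)
psi[0] = 1.0

psi = kron(H, I, I) @ psi       # Hadamard on q0: (|000> + |100>)/sqrt(2)
psi = kron(CNOT, I) @ psi       # CNOT q0 -> q1:  (|000> + |110>)/sqrt(2)
psi = kron(I, CNOT) @ psi       # CNOT q1 -> q2:  (|000> + |111>)/sqrt(2)

print(np.round(psi, 3))
# Only the amplitudes of |000> and |111> are nonzero (both 0.707), i.e. a
# GHZ-type state: measuring any one qubit determines the other two.
```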



The resultant three-qubit state had an astonishing 88 percent state fidelity and was in an entangled state that might be utilized for error correction. 


This demonstration is only the start of an ambitious research program aimed at developing a large-scale quantum computer. 

"With the three-qubit gadget, we aim to show basic error correction and build devices with 10 or more qubits," Tarucha adds. 

"We aim to create 50 to 100 qubits and more advanced error-correction procedures in the next decade, opening the path for a large-scale quantum computer."



~ Jai Krishna Ponnappan


You may also want to read more about Quantum Computing here.






Quantum Cryptography - What Is Quantum Cryptography? How Does It Work?





Quantum cryptography makes use of unique quantum characteristics of nature to complete a cryptographic job. 




Most quantum cryptography protocols are information-theoretically secure (at least in theory), which is a very strong notion of security, since it is derived only from information theory. 


Early attempts to utilize quantum characteristics for security purposes can be traced back to the 1970s, when Wiesner attempted to produce unforgeable banknotes. 

However, these ideas appeared impractical, since they required storing a single polarized photon for days without loss (at the time, photon polarization was the only conceived carrier of quantum information). 



Bennett and Brassard made the breakthrough in 1983, when they realized that photons are better suited to transmitting quantum information than to storing it. 


  • They might, for example, be used to convey a random secret key from a sender to a recipient, who would then be able to encrypt and decode sensitive communications using the key. 
  • Bennett and Brassard released the first quantum key distribution (QKD) protocol, dubbed the BB84 protocol, shortly after. 



A QKD protocol allows two parties to create a shared secret key using an unsecured quantum channel and a public classical channel that has been authenticated. 
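
As a rough illustration of how such a protocol works in principle, here is a toy BB84-style simulation in Python. It assumes an ideal channel with no noise and no eavesdropper, and it omits authentication, error correction, and privacy amplification, all of which a real QKD system requires:

```python
import random

def bb84_keys(n_bits=32, seed=1):
    rng = random.Random(seed)

    # Alice picks random bits and random bases (0 = rectilinear, 1 = diagonal).
    alice_bits  = [rng.randint(0, 1) for _ in range(n_bits)]
    alice_bases = [rng.randint(0, 1) for _ in range(n_bits)]

    # Bob measures each photon in a randomly chosen basis.
    bob_bases = [rng.randint(0, 1) for _ in range(n_bits)]
    bob_bits = []
    for bit, a_basis, b_basis in zip(alice_bits, alice_bases, bob_bases):
        if a_basis == b_basis:
            bob_bits.append(bit)                # same basis: deterministic outcome
        else:
            bob_bits.append(rng.randint(0, 1))  # wrong basis: random outcome

    # Sifting: over the public channel they compare bases (never the bits)
    # and keep only the positions where the bases matched.
    key_a = [a for a, x, y in zip(alice_bits, alice_bases, bob_bases) if x == y]
    key_b = [b for b, x, y in zip(bob_bits,  alice_bases, bob_bases) if x == y]
    return key_a, key_b

ka, kb = bb84_keys()
print(ka == kb, ka)   # True, plus the shared sifted key
```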






  • Since then, a slew of new protocols have been suggested – and implemented – propelling QKD to the forefront of quantum cryptography and one of the most important applications of quantum information science. 
  • Furthermore, driven by growing concerns about data security and the possibility of commercialization, quantum cryptography research has drawn the interest of a number of businesses, private organizations, and governments.


 


In reality, quantum cryptography solutions are being offered by an increasing number of businesses and startups across the globe. 


  • In the long run, scientists want to build large-scale quantum networks that will allow safe communication between any subset of users in the network due to quantum entanglement. 
  • In a wider sense, similar networks may be connected together to form a quantum internet, which could be used for much more than secure communication, such as safe access to distant quantum computers. 



Quantum cryptography elegantly integrates concepts and contributions from a variety of disciplines, including quantum information and quantum communication, as well as computer science and conventional encryption. 


  • The interaction of these disparate disciplines leads to theoretical breakthroughs that are of wide interest and transferable to other areas of study. 
  • However, since quantum cryptography, and in particular QKD, has a considerable economic appeal, ongoing research is also driven by more practical goals. 


For example, combined theoretical and practical efforts are continuously dedicated to improving key-generation rates, simplifying experimental setups, and so on. Much of this work focuses on a unique QKD protocol that has lately garnered a lot of attention from the scientific community and is widely regarded as the new standard for long-distance QKD in fiber. 




Twin-field (TF) QKD is a technique that enables two parties to create a secret key across vast distances using single-photon interferometric measurements in an intermediate relay. 


  • In this context, we use current theoretical findings and simulations to examine practical TF-QKD implementations in depth. 
  • With bipartite QKD connections becoming the norm at many research institutions and field deployments across the globe, the next major step would be to join these isolated links into quantum networks to conduct more complex multi-user activities. 
  • The extension of QKD to many users using multipartite QKD, also known as quantum conference key agreement (CKA), is undoubtedly a logical application of future quantum networks. 




When a confidential communication has to be securely broadcast among a group of users, the CKA protocol is used. 


  • The users share a shared secret key—the conference key—with which they may encrypt and decode the secret message when they utilize the CKA protocol. 




In this section, CKA plays a significant part. 


  • We provide an understandable description of CKA's evolution from current QKD protocols to expose the reader to it. 
  • We extend QKD's security architecture to incorporate CKA and concentrate on a multipartite variant of the widely used BB84 protocol. 
  • We also go through some of the most recent experimental implementations of CKA protocols, with a focus on the multipartite BB84 protocol. 
  • We describe a new CKA technique based on the TF-QKD operating principle, in which several users distil a conference key via single-photon interference events. 
  • We demonstrate that the protocol outperforms prior CKA schemes over long distances thanks to this feature, since it uses a W-class state as its entanglement resource instead of the traditional GHZ state.



~ Jai Krishna Ponnappan


You may also want to read more about Quantum Computing here.




What Is Post-Quantum Cryptography?




Cryptography after the Quantum Era (PQC). 




In the last decade, significant developments in quantum computing have reassured the scientific community of the need to develop quantum-resistant cryptosystems. 


  • Quantum computers represent a danger to conventional public-key encryption based on number-theoretic problems (i.e., integer factorization or discrete logarithms), so Post-Quantum Cryptography (PQC) has emerged as the preferred alternative. 



Cryptosystems that are safe against assaults launched on classical computers and possibly quantum computers may be designed using:

 

      1. lattice-based cryptography, 
      2. multivariate cryptography, 
      3. hash-based cryptography schemes, 
      4. isogeny-based cryptography, 
      5. and code-based encryption. 


  • As a result, these methods are known as PQC (Post Quantum Cryptography) algorithms. 




Cryptographic methods based on lattices are relatively easy to construct and come with solid security arguments. 



  • The shortest vector problem (SVP), which involves estimating the minimum Euclidean length of a lattice vector for any basis, is the foundation of lattice-based encryption. 
  • Even with the processing power of a quantum computer, the best known algorithms for solving SVP in the worst case require time exponential in n. 
  • In other words, SVP's complexity remains super-polynomial in n even for quantum computers. 
  • One of the many problems in the lattice family is Short Integer Solutions (SIS). 
  • If SVP is hard in the worst case, then SIS instances are secure on average, a worst-case to average-case guarantee (a toy SIS instance is sketched below). 
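
As a toy illustration of the SIS problem mentioned above, the sketch below brute-forces a short nonzero vector x with entries in {-1, 0, 1} satisfying A·x ≡ 0 (mod q). The parameters q, n, m are deliberately tiny and insecure; they are chosen only so that a solution is guaranteed to exist and the search finishes instantly:

```python
import itertools
import numpy as np

# Tiny SIS instance: find a short nonzero x with A @ x = 0 (mod q).
q, n, m = 7, 2, 6                       # toy parameters, far too small to be secure
# With 2**m > q**n a short solution is guaranteed to exist (pigeonhole argument).
rng = np.random.default_rng(0)
A = rng.integers(0, q, size=(n, m))     # random n x m matrix over Z_q

for x in itertools.product([-1, 0, 1], repeat=m):
    x = np.array(x)
    if x.any() and not ((A @ x) % q).any():
        print("A =\n", A)
        print("short solution x =", x)  # a short vector in the kernel of A mod q
        break
```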



The fundamental assumptions of code-based cryptography are that the scrambled generator matrix is indistinguishable from a random matrix and that generic decoding is hard. 


  • Because they are based on a well-studied issue, these methods take a conservative approach to public key encryption/key encapsulation. 
  • If the key size is decreased, this class of algorithms becomes susceptible. 
  • Researchers have proposed methods for reducing key size without jeopardizing security. 
  • The complexity of solving the finite field multivariate polynomial (MVP) problem inspires multivariate cryptography. 



MVP problems are NP-hard to solve. 


  • MVPs are NP-complete problems if all equations are quadratic over a finite field. 
  • Although certain MVP-based schemes have been shown to be weak, multivariate PQC signature schemes offer competitive signature sizes. 
  • The security of hash-based digital signatures rests on the properties of the underlying symmetric primitives, particularly cryptographic hash functions (leveraging collision resistance and second pre-image resistance); a minimal one-time-signature sketch in this spirit follows below. 
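
To make the hash-based idea concrete, here is a minimal sketch of a Lamport-style one-time signature built from SHA-256. This is the classic textbook construction rather than any specific NIST candidate, and each key pair may sign only a single message; practical schemes such as XMSS or SPHINCS+ layer tree structures on top of this idea:

```python
import hashlib
import secrets

def H(data):
    return hashlib.sha256(data).digest()

def keygen(bits=256):
    # Secret key: two random preimages per digest bit; public key: their hashes.
    sk = [(secrets.token_bytes(32), secrets.token_bytes(32)) for _ in range(bits)]
    pk = [(H(a), H(b)) for a, b in sk]
    return sk, pk

def sign(message, sk):
    digest = H(message)
    bits = [(digest[i // 8] >> (i % 8)) & 1 for i in range(len(sk))]
    # Reveal one preimage per bit of the message digest.
    return [sk[i][b] for i, b in enumerate(bits)]

def verify(message, signature, pk):
    digest = H(message)
    bits = [(digest[i // 8] >> (i % 8)) & 1 for i in range(len(pk))]
    return all(H(sig) == pk[i][b] for i, (sig, b) in enumerate(zip(signature, bits)))

sk, pk = keygen()
sig = sign(b"post-quantum hello", sk)
print(verify(b"post-quantum hello", sig, pk))   # True
print(verify(b"tampered message", sig, pk))     # False
```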



The National Institute of Standards and Technology (NIST) announced that it would launch a standardization project to establish quantum-resistant standards for Key Encapsulation Mechanisms (KEM) and Public Key Encryption (PKE), as well as digital signatures. 




In the request for proposals, NIST specified five distinct security strengths, each tied directly to an existing NIST standard in symmetric cryptography: 



  1. The algorithm is at least as difficult to break as AES-128 (exhaustive key search; the least quantum-resistant level). 
  2. The algorithm is at least as difficult to break as SHA-256 (collision search). 
  3. The algorithm is at least as difficult to break as AES-192 (exhaustive key search; stronger quantum resistance). 
  4. The algorithm is at least as difficult to break as SHA-384 (collision search; very strong quantum resistance). 
  5. The algorithm is at least as difficult to break as AES-256 (exhaustive key search; the strongest level). 


The first round of the NIST PQC competition began in December and received numerous submissions, from which digital-signature contenders and KEM/PKE schemes were selected. 


  • The NIST PQC competition's second-round candidates, covering both digital signatures and KEM/PKE schemes, were revealed in January. 
  • As this work goes to print, NIST has officially announced a third round, which will begin in June. 



The table below summarizes the round candidates, their associated schemes, and the NIST security-level mapping.





~ Jai Krishna Ponnappan


You may also want to read more about Quantum Computing here.






Quantum Computing Keywords




We can start to focus in on qubit modalities by composing a working quantum computing vocabulary:


Table Of Contents
What Are Qubits?
What Is A Universal Quantum Computer?
What Is Quantum Annealing?
What Is Quantum Speedup?
What Is Quantum Edge?
What Is Quantum Supremacy?
What Is A Bloch Sphere?
What Is Coherence in Quantum Computing?
What Is DiVincenzo Criteria?
What Is Quantum Entanglement?
What Is Measurement In Quantum Computing?
What Are Quantum Dots?
What Is Quantum Error Correction?
What Is Quantum Indeterminacy?
What Is Quantum Tunneling?
What Is Superposition?
What Is Teleportation In Quantum Computing?
What Is A Topological Quantum Computer?


What Are Qubits?




Qubits (quantum bits) are the quantum equivalent of conventional digital bits. 


  • Qubits can exist in a state of superposition and operate according to the principles of quantum mechanics. 
  • To alter the state of a qubit, we must apply quantum-mechanical operations. 
  • At the end of a computation, we measure the qubits, projecting their state back onto conventional digital bits (a minimal sketch follows below). 
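
A minimal numerical sketch of these points, using plain NumPy rather than any particular quantum SDK: a qubit is a normalized two-component complex vector, and measuring it projects it onto a classical bit with probabilities given by the squared amplitudes (the Born rule):

```python
import numpy as np

# A qubit state a|0> + b|1>, normalized so |a|^2 + |b|^2 = 1.
a, b = 0.6, 0.8j
psi = np.array([a, b])
assert np.isclose(np.linalg.norm(psi), 1.0)

p0, p1 = np.abs(psi) ** 2            # Born rule: measurement probabilities
print(p0, p1)                        # 0.36 0.64

# Measuring projects the qubit onto a classical bit.
outcome = np.random.choice([0, 1], p=[p0, p1])
print("measured:", outcome)
```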




What Is A Universal Quantum Computer?


A Quantum Turing Machine, also known as a Universal Quantum Computer, is an abstract machine that is used to simulate the effects of a quantum computer. 


  • Any quantum algorithm may be described formally as a particular quantum Turing Machine, similar to the conventional Turing Machine. 


Quantum states defined in Hilbert space are used to represent internal states. 


  • In Hilbert space, the transition function is a collection of unitary matrices. 




What Is Quantum Annealing?


Quantum annealing is a heuristic method that uses quantum fluctuations to find a global minimum over a finite set of candidate solutions. 


  • Quantum Annealing may be used to tackle combinatorial optimization problems having a discrete search space with multiple local minima, such as the traveling salesman problem. 
  • The system begins in a superposition of all possible states (quantum parallelism) and evolves according to the time-dependent Schrödinger equation. 
  • The amplitudes of all states may be altered by changing the transverse field (a magnetic field perpendicular to the axis of the qubit), resulting in Quantum Tunneling between them. 



The aim is to maintain the system as near to the Hamiltonian's ground state as possible. 


  • The system reaches its ground state when the transverse field is finally switched off, and this ground state corresponds to the solution of the optimization problem (a toy numerical illustration follows below). 
  • D-Wave Systems exhibited the first Quantum Annealer in 2011. 
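
The following toy numerical illustration assumes a simple two-qubit Ising problem; the Hamiltonian and the three sampled values of the schedule parameter s are illustrative choices, not how a real annealer is programmed. As the transverse field is turned off (s goes from 0 to 1), the ground state of the interpolated Hamiltonian moves from an equal superposition to the classical bit string that minimizes the problem Hamiltonian:

```python
import numpy as np

I = np.eye(2)
X = np.array([[0, 1], [1, 0]])
Z = np.array([[1, 0], [0, -1]])

def two_qubit(op1, op2):
    return np.kron(op1, op2)

# Problem Hamiltonian H_p = Z(x)Z - 0.5*Z(x)I; its unique classical minimum is |01>.
H_p = two_qubit(Z, Z) - 0.5 * two_qubit(Z, I)
# Driver (transverse field): H_d = -(X1 + X2)
H_d = -(two_qubit(X, I) + two_qubit(I, X))

for s in [0.0, 0.5, 1.0]:
    H = (1 - s) * H_d + s * H_p          # interpolated annealing Hamiltonian
    vals, vecs = np.linalg.eigh(H)       # eigenvalues in ascending order
    ground = vecs[:, 0]
    probs = np.round(np.abs(ground) ** 2, 3)
    print(f"s = {s}: ground-state probabilities over |00>,|01>,|10>,|11> = {probs}")

# At s = 0 the ground state is an equal superposition (all probabilities 0.25);
# at s = 1 it concentrates entirely on the bit string that minimizes H_p (|01>).
```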




What Is Quantum Speedup?


This is the best-case situation, in which no classical algorithm can outperform a quantum algorithm. 


  • In addition to factorization and discrete logarithms, there are a few quantum algorithms that offer a polynomial speedup. 
  • Grover's algorithm is one such algorithm (a toy Grover search is sketched at the end of this entry). 



There have been reports on simulation methods for physical processes in quantum chemistry and solid-state physics. 


  • Quantum algorithms have also been presented that solve the principal ideal problem in polynomial time, approximate the Jones polynomial with a polynomial speedup, and solve Pell's equation. 
  • This area is evolving rapidly. 
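
As a small illustration of a quantum speedup in action, here is a toy simulation of Grover's algorithm on a four-item search space (two qubits), where a single Grover iteration already lands on the marked item with certainty:

```python
import numpy as np

N = 4                                   # 2-qubit search space
marked = 2                              # index of the item we are searching for

psi = np.full(N, 1 / np.sqrt(N))        # uniform superposition (Hadamards on |00>)

oracle = np.eye(N)
oracle[marked, marked] = -1             # phase-flip the marked item

s = np.full(N, 1 / np.sqrt(N))
diffusion = 2 * np.outer(s, s) - np.eye(N)   # inversion about the mean

psi = diffusion @ (oracle @ psi)        # one Grover iteration
print(np.round(np.abs(psi) ** 2, 3))    # [0. 0. 1. 0.] -> the marked item
```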




What Is Quantum Edge?


Quantum computers have a computational advantage. 


  • The idea that quantum computers can execute certain calculations more quickly than traditional computers. 




What Is Quantum Supremacy? 


Quantum computers' prospective capacity to tackle issues that conventional computers can't. 


  • Decoherence is the process by which the quantum information in a qubit is lost over time as a result of interactions with the environment. 
  • Quantum Volume is a practical method to track and compare progress toward lower system-wide gate error rates for quantum computing and error correction operations in the near future. 
  • It's a single-number metric that a concrete protocol can measure on a near-term quantum computer of modest size (n ≤ 50 qubits).




What Is A Bloch Sphere?


The Bloch sphere, named after scientist Felix Bloch, is a geometrical representation of the pure state space of a two-level quantum mechanical system (qubit) in quantum mechanics. 


  • Antipodal points correspond to a pair of mutually orthogonal state vectors on the Bloch sphere, which is a unit sphere. 

The Bloch Sphere's interpretation is as follows: 


  • The poles represent classical bits, and the notation |0⟩ and |1⟩ is used to denote them. 
  • Unlike conventional bit representation, where these are the only conceivable states, quantum bits span the whole sphere. 
  • As a result, quantum bits contain a lot more information, as shown by the Bloch sphere. 
  • When a qubit is measured, it collapses to one of the two poles. 


Which of the two poles collapses depends on which direction the arrow in the Bloch representation points: 

  • if the arrow is closer to the north pole, there is a greater chance of collapsing to that pole; similarly, 
  • if the arrow is closer to the south pole, there is a greater chance of collapsing to that pole. 

This adds the concept of probability to the Bloch sphere: 

  • the angle of the arrow with the vertical axis determines that probability. 
  • If the arrow points to the equator, each pole has a 50/50 chance of collapsing (the short sketch below computes these probabilities).
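
The short sketch below makes this correspondence explicit. Writing the qubit on the Bloch sphere as cos(θ/2)|0⟩ + e^(iφ) sin(θ/2)|1⟩, with θ the polar angle measured from the north pole and φ the azimuth, the collapse probabilities depend only on θ:

```python
import numpy as np

def bloch_state(theta, phi):
    """Qubit at polar angle theta and azimuth phi on the Bloch sphere."""
    return np.array([np.cos(theta / 2),
                     np.exp(1j * phi) * np.sin(theta / 2)])

for theta in [0.0, np.pi / 2, np.pi]:       # north pole, equator, south pole
    psi = bloch_state(theta, phi=0.3)       # phi does not affect the probabilities
    p0, p1 = np.abs(psi) ** 2
    print(f"theta = {theta:.2f}: P(|0>) = {p0:.2f}, P(|1>) = {p1:.2f}")

# theta = 0.00: P(|0>) = 1.00, P(|1>) = 0.00   (north pole -> always 0)
# theta = 1.57: P(|0>) = 0.50, P(|1>) = 0.50   (equator    -> 50/50)
# theta = 3.14: P(|0>) = 0.00, P(|1>) = 1.00   (south pole -> always 1)
```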



What Is Coherence in Quantum Computing?


A qubit's coherence is defined as its capacity to sustain superposition across time. 


  • It is therefore the lack of "decoherence," which is defined as any process that collapses a quantum state into a classical one, such as contact with the environment.



What Is  DiVincenzo Criteria?


The DiVincenzo criteria are a set of requirements for building a quantum computer that were originally suggested by theoretical physicist David P. DiVincenzo in his article "The Physical Implementation of Quantum Computation" in 2000. 


The DiVincenzo criteria are a collection of 5+2 requirements that must be met by an experimental setup in order to effectively execute quantum algorithms like Grover's search algorithm or Shor factorization. 


To perform quantum communication, such as that utilized in quantum key distribution, the two additional requirements are required.


1 – A physically scalable system with well-defined qubits.

2 – The ability to set the qubits' states to a simple fiducial state.

3 – Long decoherence periods that are relevant.

4 – A set of quantum gates that is “universal.”

5 – A measuring capability unique to qubits.

6 — Interconversion of stationary and flying qubits.

7 – The capacity to reliably transfer flying qubits between two points.




What Is Quantum Entanglement?


Quantum entanglement is a unique relationship that exists between two qubits. 

  • Entanglement may be created in a variety of ways. 
  • One method is to entangle two qubits by bringing them close together, performing an operation on them, and then moving them apart again. 
  • You may move them arbitrarily far away from each other after they're entangled, and they'll stay intertwined. 


The results of measurements on these qubits will reflect this entanglement. 

  • When measured, these qubits will always provide a random result of zero or one, regardless of how far apart they are. 


The first characteristic of entanglement is that it cannot be shared; this property is what makes the applications derived from entanglement possible. 

  • If two qubits are maximally entangled, no third party in the universe can share in their entanglement. 
  • This feature is known as the monogamy of entanglement.


Maximum coordination is the second characteristic of entanglement that gives it its strength. 


  • When the qubits are measured, this characteristic is revealed. 
  • When two entangled qubits are measured in the same basis, no matter how far apart they are, the results always agree. 
  • These results are not predetermined; rather, they are entirely random and determined only at the time of measurement (the sketch below reproduces this behavior for a Bell pair).
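
The sketch below reproduces this behavior for a Bell pair, the standard maximally entangled two-qubit state: each individual outcome is random, yet the two outcomes always agree when both qubits are measured in the same (computational) basis:

```python
import numpy as np

rng = np.random.default_rng(7)

# Bell state |Phi+> = (|00> + |11>)/sqrt(2), amplitudes indexed by (qubit A, qubit B)
bell = np.zeros(4)
bell[0b00] = bell[0b11] = 1 / np.sqrt(2)

probs = np.abs(bell) ** 2                       # joint outcome probabilities
outcomes = rng.choice(4, size=10, p=probs)      # repeated joint measurements

for o in outcomes:
    a, b = (o >> 1) & 1, o & 1                  # split the joint outcome into two bits
    print(a, b)                                 # each line is random, but a == b always
```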




What Is Measurement In Quantum Computing?


The act of seeing a quantum state is known as measurement. 


  • This observation will provide traditional data, such as a bit. 
  • It's essential to remember that the quantum state will change as a result of this measurement procedure. 

If the state is in superposition, for example, this measurement will cause it to ‘collapse' into a classical state: zero or one. 

  • This process of collapsing occurs at random. 
  • There is no way of knowing what the result will be until the measurement is completed. 
  • However, the chance of each result may be calculated. 

This probability is a prediction about the quantum state that we can test by preparing it many times, measuring it, and calculating the percentage of each result.
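
This procedure, preparing the same state many times and tallying the outcome frequencies, can be mimicked in a few lines; the 30/70 state below is just a hypothetical example:

```python
import numpy as np
from collections import Counter

psi = np.array([np.sqrt(0.3), np.sqrt(0.7)])     # predicted: P(0) = 0.3, P(1) = 0.7
shots = 10_000

samples = np.random.default_rng(0).choice([0, 1], size=shots, p=np.abs(psi) ** 2)
counts = Counter(samples)
print({bit: counts[bit] / shots for bit in (0, 1)})   # close to {0: 0.3, 1: 0.7}
```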



What Are Quantum Dots?


Quantum dots may be thought of as "manufactured atoms." 


  • They are semiconductor nanocrystals in which an electron-hole pair may be trapped. 
  • Because their nanoscale size is comparable to the electron's quantum (de Broglie) wavelength, the electron may occupy only discrete energy levels, exactly as in an atom. 
  • The dots may be encased in a photonic crystal cavity and probed with laser light.




What Is Quantum Error Correction?



Quantum computers are always in touch with the outside world. This environment has the potential to disrupt the system's computational state, resulting in data loss. 


  • Quantum error correction compensates for this loss by distributing the system's computational state over multiple qubits in an entangled state. 
  • Classical observers outside the system may detect and correct perturbations using this entanglement without ever looking at the computational state directly, which would collapse it (a minimal sketch follows below).
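
A minimal sketch of the simplest such scheme, the three-qubit bit-flip repetition code (real codes such as the surface code are far more elaborate): the logical state is spread over three physical qubits, a bit-flip error is diagnosed from two parity checks (stabilizer measurements) that never reveal the encoded amplitudes, and the error is then undone:

```python
import numpy as np

I = np.eye(2)
X = np.array([[0, 1], [1, 0]])
Z = np.array([[1, 0], [0, -1]])

def op(a, b, c):
    return np.kron(np.kron(a, b), c)

# Encode a|0> + b|1> as a|000> + b|111> (bit-flip repetition code).
a, b = 0.6, 0.8
logical = np.zeros(8)
logical[0b000], logical[0b111] = a, b

# An X (bit-flip) error hits one physical qubit, say the middle one.
corrupted = op(I, X, I) @ logical

# Syndrome: expectation values of the stabilizers Z1Z2 and Z2Z3 (each is +/-1 here).
s12 = corrupted @ op(Z, Z, I) @ corrupted
s23 = corrupted @ op(I, Z, Z) @ corrupted

# (-1,-1) -> flip qubit 2, (-1,+1) -> qubit 1, (+1,-1) -> qubit 3, (+1,+1) -> no error.
correction = {(-1, -1): op(I, X, I), (-1, 1): op(X, I, I),
              (1, -1): op(I, I, X), (1, 1): op(I, I, I)}
recovered = correction[(int(round(s12)), int(round(s23)))] @ corrupted

print(np.allclose(recovered, logical))   # True: the encoded state is restored
```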



What Is Quantum Indeterminacy?



The basic condition of existence, backed by all empirical evidence, in which an isolated quantum system, such as a free electron, does not have fixed characteristics until those attributes are observed in experiments designed to measure them. 


  • That is, unless those characteristics are measured, a particle does not have a particular mass, location, velocity, or spin. 
  • Indeed, the particle does not exist until it is seen in a strict sense.




What Is Quantum Tunneling?


Due to the wave-like nature of particles, quantum tunneling is a quantum mechanical phenomenon in which particles have a finite chance of crossing an energy barrier or passing through an energy state normally forbidden by classical physics. 


  • A particle's probability wave reflects the likelihood of finding the particle in a certain place, and there is a finite chance that the particle is on the opposite side of the barrier (a rough worked example follows below).
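
As a rough worked example, the sketch below uses the standard thick-barrier estimate T ≈ exp(-2κL) with κ = √(2m(V-E))/ħ; the 2 eV barrier and the widths are illustrative numbers, not taken from any particular device:

```python
import numpy as np

hbar = 1.054_571_817e-34      # J*s
m_e  = 9.109_383_7015e-31     # electron mass, kg
eV   = 1.602_176_634e-19      # J

E, V = 1.0 * eV, 2.0 * eV     # electron energy 1 eV below a 2 eV barrier
kappa = np.sqrt(2 * m_e * (V - E)) / hbar

for L in (0.5e-9, 1.0e-9, 2.0e-9):                 # barrier widths in meters
    T = np.exp(-2 * kappa * L)                     # WKB-style transmission estimate
    print(f"L = {L*1e9:.1f} nm: T ~ {T:.2e}")

# Transmission drops exponentially with barrier width, yet stays nonzero:
# classically the electron could never cross, quantum mechanically it sometimes does.
```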




What Is Superposition?


Quantum physics' basic premise is superposition. 


  • It asserts that quantum states, like waves in classical physics, may be joined together – superposed – to produce a new valid quantum state, and that every quantum state can be seen as a linear combination, a sum of other unique quantum states.



What Is Teleportation In Quantum Computing?


Quantum teleportation is a technique that uses entanglement to transmit qubits. 


  • The following is how teleportation works: 

    • Initially, Alice and Bob must create an entangled pair of qubits between them. 
    • Alice next conducts a measurement on the qubit she wishes to transmit as well as the qubit that is entangled with Bob's qubit. 
    • This measurement collapses the qubits and breaks the entanglement, but it also provides her with two classical outcomes in the form of two classical bits. 
    • Alice transmits these two classical bits to Bob over an ordinary classical channel (for example, the Internet). 
    • Bob then applies to his qubit a correction operation determined by these two classical bits. 
    • As a result, he is able to recover the qubit that was previously in Alice's possession. 


It's worth noting that we've now sent a qubit without really utilizing a physical carrier capable of doing so. 

To accomplish this, you'll need entanglement, of course. 


It's also worth noting that quantum teleportation doesn't allow for communication faster than the speed of light. 


  • This is because Bob cannot make sense of the qubit he holds until he receives the classical measurement results from Alice. 
  • The transmission of these classical measurement results takes a certain amount of time. 
  • That time is itself limited by the speed of light.
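
For readers who want to trace the steps above in detail, here is a complete toy simulation in plain NumPy (qubit 0 holds Alice's unknown state, qubit 1 her half of the Bell pair, and qubit 2 Bob's half); the amplitudes of the unknown qubit are arbitrary illustrative values:

```python
import numpy as np

rng = np.random.default_rng(42)

I2 = np.eye(2)
X = np.array([[0, 1], [1, 0]])
Z = np.array([[1, 0], [0, -1]])
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 0, 1], [0, 0, 1, 0]])

# The unknown qubit Alice wants to send (any normalized amplitudes work).
alpha, beta = 0.6, 0.8j
unknown = np.array([alpha, beta])

# Shared Bell pair (|00> + |11>)/sqrt(2) between qubits 1 (Alice) and 2 (Bob).
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)

# Full 3-qubit state, ordered (q0, q1, q2) with q0 the most significant bit.
psi = np.kron(unknown, bell)

# Alice: CNOT with q0 as control and q1 as target, then Hadamard on q0.
psi = np.kron(CNOT, I2) @ psi
psi = np.kron(H, np.kron(I2, I2)) @ psi

# Alice measures q0 and q1: sample (m0, m1) from the joint distribution.
amps = psi.reshape(2, 2, 2)                      # indices (q0, q1, q2)
p = np.sum(np.abs(amps) ** 2, axis=2)            # P(m0, m1)
m0, m1 = np.unravel_index(rng.choice(4, p=p.ravel()), (2, 2))

# Collapse: Bob's qubit, conditioned on Alice's outcomes, then renormalize.
bob = amps[m0, m1, :]
bob = bob / np.linalg.norm(bob)

# Bob's corrections, determined by the two classical bits he receives.
if m1 == 1:
    bob = X @ bob
if m0 == 1:
    bob = Z @ bob

print("measured bits:", m0, m1)
print("recovered == original:", np.allclose(bob, unknown))   # True
```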




What Is A Topological Quantum Computer?


A topological quantum computer is a theoretical quantum computer that uses anyons, two-dimensional quasiparticles whose world lines cross one another to form braids in a three-dimensional spacetime (one temporal plus two spatial dimensions). 


  • The logic gates that make up the computer are formed by these strands. 
  • The benefit of utilizing quantum braiding over trapped quantum particles in a quantum computer is that the former is considerably more stable. 
  • Small, cumulative perturbations may cause quantum states to decohere and create mistakes in computations, but they have no effect on the topological characteristics of the braiding. 
  • This is comparable to the work needed to cut a string and reconnect the ends to create a new braid, rather than a ball (representing an ordinary quantum particle in four-dimensional spacetime) colliding with a wall. 

In 1997, Alexei Kitaev suggested topological quantum computing.




~ Jai Krishna Ponnappan


You may also want to read more about Quantum Computing here.





