
Quantum Cryptography - What Is Quantum Cryptography? How Does It Work?





Quantum cryptography exploits uniquely quantum features of nature to accomplish cryptographic tasks. 




Most quantum cryptography protocols are information-theoretically secure (at least in theory), which is a very strong notion of security because it rests on information theory alone rather than on computational assumptions. 


Early attempts to use quantum properties for security purposes can be traced back to the 1970s, when Wiesner proposed unforgeable quantum banknotes. 

However, these ideas seemed impractical, since they required storing a single polarized photon for days without loss (at the time, photon polarization was the only carrier of quantum information that had been conceived). 



Bennett and Brassard made the breakthrough in 1983, when they realized that photons are better used to transmit quantum information than to store it. 


  • They might, for example, be used to convey a random secret key from a sender to a recipient, who could then encrypt and decrypt sensitive communications using that key. 
  • Shortly afterwards, in 1984, Bennett and Brassard published the first quantum key distribution (QKD) protocol, now known as the BB84 protocol. 



A QKD protocol allows two parties to create a shared secret key using an insecure quantum channel and an authenticated public classical channel. 
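A rough feel for how such a protocol proceeds can be had from the following toy simulation of the BB84 preparation, measurement, and sifting steps. It is a purely classical sketch in which random numbers stand in for photon preparation and measurement; it is illustrative only and omits error correction and privacy amplification.

import secrets

def random_bits(n):
    """Generate n uniformly random bits."""
    return [secrets.randbelow(2) for _ in range(n)]

def bb84_sift(n=1000):
    # Alice picks random key bits and random preparation bases
    # (0 = rectilinear, 1 = diagonal).
    alice_bits = random_bits(n)
    alice_bases = random_bits(n)

    # Bob picks his measurement bases independently at random.
    bob_bases = random_bits(n)

    # Simulated measurement: when the bases match, Bob reads Alice's bit;
    # when they differ, his outcome is random, as quantum mechanics predicts.
    bob_bits = [a if ab == bb else secrets.randbelow(2)
                for a, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

    # Sifting: over the authenticated classical channel, Alice and Bob announce
    # their bases and keep only the positions where the bases agree.
    sifted_alice = [a for a, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
    sifted_bob = [b for b, ab, bb in zip(bob_bits, alice_bases, bob_bases) if ab == bb]
    return sifted_alice, sifted_bob

if __name__ == "__main__":
    ka, kb = bb84_sift()
    print(len(ka), "sifted bits; keys identical:", ka == kb)

On a noiseless, untapped channel the sifted keys agree exactly; roughly half of the transmitted bits survive sifting, since the bases coincide about half of the time.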






  • Since then, a slew of new protocols have been proposed and implemented, propelling QKD to the forefront of quantum cryptography and making it one of the most important applications of quantum information science. 
  • Furthermore, driven by growing concerns about data security and the possibility of commercialization, quantum cryptography research has drawn the interest of a number of businesses, private organizations, and governments.


 


In reality, quantum cryptography solutions are being offered by an increasing number of businesses and startups across the globe. 


  • In the long run, scientists aim to build large-scale quantum networks that, thanks to quantum entanglement, would allow secure communication between any subset of users in the network. 
  • More broadly, such networks could be connected together to form a quantum internet, which could be used for much more than secure communication, for example secure access to remote quantum computers. 



Quantum cryptography elegantly integrates concepts and contributions from a variety of disciplines, including quantum information and quantum communication, as well as computer science and conventional encryption. 


  • The interaction of these disparate disciplines leads to theoretical breakthroughs that are of wide interest and transferable to other areas of study. 
  • However, since quantum cryptography, and in particular QKD, has a considerable economic appeal, ongoing research is also driven by more practical goals. 


For example, combined theoretical and experimental efforts are continuously devoted to improving key-generation rates, simplifying experimental setups, and so on. Much of this work centers on a particular QKD protocol that has recently attracted considerable attention from the scientific community and is widely regarded as the new standard for long-distance QKD in fiber. 




Twin-field (TF) QKD is a technique that enables two parties to establish a secret key over long distances using single-photon interferometric measurements at an intermediate relay. 
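A key practical attraction of TF-QKD is its rate-versus-distance scaling: because the key is distilled from single-photon interference at the middle relay, the achievable rate scales roughly with the square root of the total channel transmittance rather than linearly with it. The back-of-the-envelope sketch below compares the two scalings under simplified assumptions (a nominal 0.2 dB/km fiber loss and ideal devices); it is an illustration of the scaling argument, not a security analysis.

import math

FIBER_LOSS_DB_PER_KM = 0.2  # assumed typical loss of standard telecom fiber

def transmittance(distance_km):
    """Channel transmittance eta of a fiber link of the given length."""
    return 10 ** (-FIBER_LOSS_DB_PER_KM * distance_km / 10)

def conventional_scaling(distance_km):
    # Point-to-point QKD: the key rate is bounded by roughly eta.
    return transmittance(distance_km)

def twin_field_scaling(distance_km):
    # TF-QKD: each party only covers half the distance to the middle relay,
    # so the rate scales roughly as sqrt(eta) of the total channel.
    return math.sqrt(transmittance(distance_km))

for d in (100, 300, 500):
    print(f"{d:4d} km   conventional ~{conventional_scaling(d):.2e}   twin-field ~{twin_field_scaling(d):.2e}")

At a few hundred kilometres the square-root scaling already amounts to several orders of magnitude, which is why TF-QKD is regarded as the new standard for long-distance fiber links.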


  • In this context, we use current theoretical findings and simulations to examine practical TF-QKD implementations in depth. 
  • With bipartite QKD connections becoming the norm at many research institutions and field deployments across the globe, the next major step would be to join these isolated links into quantum networks to conduct more complex multi-user activities. 
  • The extension of QKD to many users using multipartite QKD, also known as quantum conference key agreement (CKA), is undoubtedly a logical application of future quantum networks. 




A CKA protocol is used when a confidential message has to be securely shared among a group of users. 


  • By running the CKA protocol, the users establish a common secret key, the conference key, with which they can encrypt and decrypt the secret message. 




In this section, CKA plays a significant part. 


  • To introduce the reader to CKA, we give an accessible account of how it evolved from existing QKD protocols. 
  • We extend QKD's security architecture to incorporate CKA and concentrate on a multipartite variant of the widely used BB84 protocol. 
  • We also go through some of the most recent experimental implementations of CKA protocols, with a focus on the multipartite BB84 protocol. 
  • We describe a new CKA technique based on the TF-QKD operating principle, in which several users distil a conference key via single-photon interference events. 
  • Thanks to this feature, the protocol uses a W-class state rather than the traditional GHZ state as its entanglement resource, and we show that it outperforms earlier CKA schemes over long distances.



~ Jai Krishna Ponnappan


You may also want to read more about Quantum Computing here.




What Is Post-Quantum Cryptography?




Cryptography for the post-quantum era (PQC). 




In the last decade, significant developments in quantum computing have convinced the scientific community of the need to develop quantum-resistant cryptosystems. 


  • Quantum computers threaten conventional public-key encryption based on number-theoretic problems (i.e., integer factorization or discrete logarithms), so Post-Quantum Cryptography (PQC) has emerged as the preferred alternative. 



Cryptosystems that are secure against attacks mounted on classical computers, and potentially on quantum computers, can be designed using:

 

      1. lattice-based cryptography, 
      2. multivariate cryptography, 
      3. hash-based cryptography schemes, 
      4. isogeny-based cryptography, 
      5. and code-based encryption. 


  • As a result, these methods are known as PQC (Post Quantum Cryptography) algorithms. 




Lattice-based cryptographic schemes are comparatively easy to construct and come with strong security arguments. 



  • The shortest vector problem (SVP), which asks for the shortest nonzero vector of a lattice given an arbitrary basis, is the foundation of lattice-based encryption. 
  • No polynomial-time algorithm, classical or quantum, is known for SVP; the best known approaches still require time roughly exponential in the lattice dimension n. 
  • SVP is therefore believed to remain intractable even with the processing power of a quantum computer. 
  • Short Integer Solutions (SIS) is one of the many problems in the lattice family. 
  • SIS instances are hard on average provided SVP is hard in the worst case (a worst-case to average-case reduction); a toy example from the closely related learning-with-errors (LWE) problem is sketched after this list. 
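As an illustration of how lattice assumptions are used, here is a minimal, deliberately insecure sketch of single-bit Regev-style LWE encryption. The parameters N, Q, and M are toy values chosen purely for readability and carry no security whatsoever; the point is only to show how noisy linear algebra hides a message.

import random

# Deliberately tiny, insecure toy parameters.
N = 16        # secret dimension
Q = 257       # modulus
M = 64        # number of LWE samples in the public key

def keygen():
    s = [random.randrange(Q) for _ in range(N)]                  # secret vector
    A = [[random.randrange(Q) for _ in range(N)] for _ in range(M)]
    e = [random.choice([-1, 0, 1]) for _ in range(M)]            # small noise
    b = [(sum(A[i][j] * s[j] for j in range(N)) + e[i]) % Q for i in range(M)]
    return (A, b), s

def encrypt(pk, bit):
    A, b = pk
    rows = random.sample(range(M), M // 2)                       # random subset sum
    u = [sum(A[i][j] for i in rows) % Q for j in range(N)]
    v = (sum(b[i] for i in rows) + bit * (Q // 2)) % Q
    return u, v

def decrypt(s, ct):
    u, v = ct
    d = (v - sum(u[j] * s[j] for j in range(N))) % Q
    # The accumulated noise is small, so d sits near 0 for bit 0 and near Q/2 for bit 1.
    return 1 if Q // 4 < d < 3 * Q // 4 else 0

pk, sk = keygen()
for msg in (0, 1, 1, 0):
    assert decrypt(sk, encrypt(pk, msg)) == msg
print("toy LWE round-trips correctly")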



The fundamental assumptions of code-based cryptosystems are that a scrambled generator matrix is indistinguishable from a random matrix and that generic decoding of random linear codes is hard. 


  • Because they are based on a well-studied problem, these schemes take a conservative approach to public-key encryption/key encapsulation. 
  • This class of algorithms becomes vulnerable if the key size is reduced too far. 
  • Researchers have proposed methods for reducing key size without compromising security. 
  • Multivariate cryptography is motivated by the difficulty of solving systems of multivariate polynomial (MVP) equations over finite fields. 



MVP problems are NP-hard to solve in general. 


  • The problem remains NP-complete even when all equations are quadratic over a finite field. 
  • Although some MVP-based schemes have been shown to be weak, multivariate PQC signature schemes offer competitive signature sizes. 
  • Hash-based digital signatures derive their security from the properties of the underlying symmetric primitives, in particular cryptographic hash functions (leveraging collision resistance and second-preimage resistance); a minimal one-time-signature sketch follows this list. 
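The Lamport one-time signature below is a minimal illustration of the hash-based idea, using SHA-256 from the Python standard library. Each key pair may sign only a single message; practical schemes such as XMSS and SPHINCS+ build tree structures on top of this primitive, and the helper names here are just illustrative.

import hashlib
import secrets

def sha256(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def keygen():
    # For each of the 256 message-hash bits, keep two random secrets (one per bit value).
    sk = [[secrets.token_bytes(32), secrets.token_bytes(32)] for _ in range(256)]
    pk = [[sha256(pair[0]), sha256(pair[1])] for pair in sk]
    return sk, pk

def sign(sk, message: bytes):
    digest = sha256(message)
    bits = [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]
    # Reveal exactly one secret per bit position, chosen by the message-hash bit.
    return [sk[i][bits[i]] for i in range(256)]

def verify(pk, message: bytes, signature) -> bool:
    digest = sha256(message)
    bits = [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]
    return all(sha256(signature[i]) == pk[i][bits[i]] for i in range(256))

sk, pk = keygen()
sig = sign(sk, b"post-quantum hello")
print(verify(pk, b"post-quantum hello", sig))   # True
print(verify(pk, b"tampered message", sig))     # False

The security rests only on the hash function, which is why this family is considered a conservative post-quantum choice.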



The National Institute of Standards and Technology (NIST) announced that it would launch a standardization project to establish quantum-resistant standards for Key Encapsulation Mechanisms (KEM) and Public Key Encryption (PKE), as well as digital signatures. 




In the call for proposals, NIST specified five distinct security strengths, each defined by reference to an existing NIST symmetric-cryptography standard: 



  1. Algorithm is at least as hard to break as AES-128 (exhaustive key search; the lowest level of quantum resistance). 
  2. Algorithm is at least as hard to break as SHA-256 (collision search). 
  3. Algorithm is at least as hard to break as AES-192 (exhaustive key search). 
  4. Algorithm is at least as hard to break as SHA-384 (collision search). 
  5. Algorithm is at least as hard to break as AES-256 (exhaustive key search; the highest level of quantum resistance). 


The first round of the NIST PQC competition opened in December and attracted submissions from which the digital-signature and KEM/PKE candidates were selected. 


  • The second-round candidates of the NIST PQC competition, again comprising digital-signature and KEM/PKE schemes, were announced in January. 
  • As this work goes to print, NIST has officially announced a third round, which will begin in June. 



The table below summarizes the round candidates, their associated schemes, and the mapping to NIST security levels.





~ Jai Krishna Ponnappan


You may also want to read more about Quantum Computing here.






Potential of Quantum Computing Applications



Despite the threat that a large-scale, fault-tolerant quantum computer (an FTQC) would pose to information security, the prospect of intermediate-scale (NISQ) processors providing unprecedented computing power in the near future opens up a wide opportunity space, especially for critical Defense Department applications and the Defense technology edge. 

The current availability of NISQ processors has drastically changed the development route for quantum applications. 

As a result, a heuristics-driven strategy has been developed, allowing for significantly greater engagement and industry involvement. 

Previously, quantum algorithm research was mostly focused on a far-off FTQC future, and determining the value of a quantum application needed extremely specialized mathematical abilities. 

We believe that in the not-too-distant future such highly specialized skills will no longer be essential for practical quantum advantage. 

As a result, it will be critical, particularly for the Defense Department and other agencies, to have access to NISQ devices, which we anticipate will enable the development of early mission-oriented applications. 

While NISQ processors do not pose a danger to communications security in and of themselves, this recently attained intermediate regime permits quantum hardware and software development to be pursued together under the ‘quantum advantage' regime for the first time, potentially speeding up progress. 


This emphasizes the security apparatus's requirement for a self-contained NISQ capability.




Quantum Computing Threat to Information Security



Current RSA public-key (asymmetric) encryption systems and their variants rely on trapdoor mathematical functions, which make it simple to compute a public key from a private key but computationally infeasible to compute the converse, a private key from a public key.

Commonly used trapdoor functions exploit the difficulty of integer factorization and of elliptic-curve variants of the discrete logarithm problem, neither of which has a known polynomial-time (that is, practically feasible) inversion algorithm on classical computers. 


In a nutshell, this so-called "computational hardness" provides safety. 
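A toy RSA example makes the asymmetry concrete. The numbers below are absurdly small and chosen only for illustration (real keys use moduli thousands of bits long); the snippet assumes Python 3.8 or later for the modular-inverse form of pow().

# Toy RSA with tiny primes, purely to illustrate the trapdoor idea.
p, q = 61, 53
n = p * q                     # public modulus (3233)
phi = (p - 1) * (q - 1)       # Euler's totient, known only to the key owner
e = 17                        # public exponent
d = pow(e, -1, phi)           # private exponent: easy to derive if you know p and q

message = 1234
ciphertext = pow(message, e, n)      # anyone can do this with the public (n, e)
recovered = pow(ciphertext, d, n)    # only the holder of d can undo it efficiently
print(ciphertext, recovered == message)

# An attacker who could factor n back into p and q would recover phi and then d,
# which is exactly the step a large quantum computer running Shor's algorithm makes feasible.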


In 1994, however, Peter Shor proposed a quantum algorithm that, run on a sufficiently large-scale quantum computer, performs integer factorization in polynomial time. 

This now-famous quantum algorithm has since been shown to solve the discrete logarithm and elliptic-curve discrete logarithm problems in polynomial time as well. 
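The quantum speed-up in Shor's algorithm lies entirely in finding the multiplicative order r of a random base a modulo N; the remaining steps are classical number theory. The sketch below brute-forces the order classically (standing in for the quantum subroutine) simply to show how the factors then fall out; it is an illustration, not Shor's algorithm itself.

from math import gcd
import random

def find_order(a, n):
    """Smallest r > 0 with a**r % n == 1 (brute force stands in for the quantum step)."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_style_factor(n):
    while True:
        a = random.randrange(2, n)
        g = gcd(a, n)
        if g > 1:
            return g                      # lucky guess already shares a factor
        r = find_order(a, n)
        if r % 2:
            continue                      # need an even order
        y = pow(a, r // 2, n)
        if y == n - 1:
            continue                      # trivial square root, try another base
        return gcd(y - 1, n)              # guaranteed to be a nontrivial factor

print(shor_style_factor(15))    # 3 or 5
print(shor_style_factor(3233))  # 53 or 61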


The creation of an FTQC, in combination with this quantum algorithm, would therefore jeopardize the security of today's asymmetric public-key cryptography. 

Furthermore, Shor's method exemplifies how advances in the mathematics and physical sciences have the potential to jeopardize secure communications in general. 


In addition to Defense Department and critical cyber infrastructure systems, the world's digital revolution, which includes 4 billion internet users, 2 billion websites, and over $3 trillion in retail transactions, is backed at multiple tiers by existing public-key cryptography. 


While the creation of an FTQC is estimated to be at least a decade or two away, there is still a pressing need to solve this issue because of the ‘record now, exploit later' danger, in which encrypted data is collected and kept for subsequent decryption by an FTQC when one becomes available. 

As a result, the US National Institute of Standards and Technology's Post-Quantum Cryptography Project, which includes worldwide partners, is prioritizing the development of new "quantum hard" public-key algorithms: in effect, a security "patch" for the internet.




Post Quantum Computing Encryption - Future-Proofing Encryption



Encryption in the post-quantum era. 


Many popular media depictions of quantum computing claim that the creation of dependable large-scale quantum computers will bring cryptography to an end and that quantum computers are just around the corner. 

The latter view may turn out to be overly optimistic, or overly pessimistic if you happen to rely on security that is not proof against quantum computing. 

While quantum computers have made significant progress in recent years, there's no certainty that they'll ever advance beyond laboratory proof-of-concept devices to become a realistic daily technology. (For a more thorough explanation, see a recent ASPI study.) 


Nonetheless, if quantum computing becomes a viable technology, several of the most extensively used encryption systems would be vulnerable to quantum computer cryptography assaults because quantum algorithms may drastically shorten the time it takes to crack them. 


For example, the RSA encryption scheme for the secure exchange of encryption keys, which underlies most web-based commerce, is based on the practical difficulty of finding prime factors of very big integers using classical (non-quantum) computers.

However, there is an extremely efficient quantum technique for prime factorization (known as ‘Shor's algorithm') that would make RSA encryption vulnerable to attack, jeopardizing the security of the vast quantity of economic activity that relies on the ability to safeguard moving data. 

Other commonly used encryption protocols, such as the Digital Signature Algorithm (DSA) and Elliptic Curve DSA, rely on mathematical procedures that are difficult to reverse conventionally but may be vulnerable to quantum computing assaults. 


Moving to secure quantum communication channels is one technique to secure communications. 


However, while point-to-point quantum channels are conceivable (and immune to quantum computer assaults), they have large administration overheads, and constructing a quantum ‘web' configuration is challenging. 

A traditional approach is likely to be favored for some time to come for applications such as networking military force units, creating secure communications between intelligence agencies, and putting up a secure wide-area network. 


Fortunately, non-quantum (classical) approaches to data security are expected to remain safe even in the face of quantum computer threats. 


The 256-bit Advanced Encryption Standard (AES-256), which is routinely employed to safeguard sensitive information at rest, has been found to be resistant to quantum attacks. 

Protecting data at rest addresses only half of the problem; a secure mechanism for transferring encryption keys between the start and end locations for data in motion is still required. 
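For the data-at-rest half of the problem, a minimal sketch using AES-256 in GCM mode is shown below. It assumes the third-party Python `cryptography` package is available in the reader's environment; any equivalent AEAD library would serve the same purpose.

# pip install cryptography   (third-party package, assumed available)
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # 256-bit key: believed to keep a comfortable
                                            # margin even against Grover-style quantum search
aesgcm = AESGCM(key)

nonce = os.urandom(12)                      # must be unique per message
plaintext = b"sensitive data at rest"
ciphertext = aesgcm.encrypt(nonce, plaintext, None)
assert aesgcm.decrypt(nonce, ciphertext, None) == plaintext
print("round trip ok")

# The remaining problem is moving `key` safely between endpoints, which is exactly
# where post-quantum key exchange (or QKD) comes in.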


As a result, there's a lot of work being done to construct so-called "post-quantum" encryption systems that rely on mathematical problems for which no efficient quantum algorithms are known. 


IBM has already detailed a quantum-resistant technology for safely transporting data across networks.  If the necessity arises, such a system might possibly replace RSA and other quantum-vulnerable encryption systems.



If all else fails, there are always encryption technologies for the twenty-first century. 


One technique to improve communication security is to be able to ‘narrowcast' in such a way that eavesdropping is physically difficult, if not impossible. 

However, this is not always practicable, and there will always be messages that must pass over channels that are sensitive to eavesdropping. 


Even so-called "secure" channels can be breached at any time. 


The US Navy's tapping of a subsea cable running to a Soviet naval facility on the Kamchatka Peninsula in the 1970s is a good example. The cable was deemed safe because it ran wholly within Soviet territorial waters and was covered by underwater listening posts. 

As a result, it transmitted unencrypted messages. The gathered signals, though not of high intelligence value in and of themselves, gave cleartext ‘cribs' of Soviet naval communications that could be matched with encrypted data obtained elsewhere, substantially simplifying the cryptanalytic work. 

Even some of the LPI/LPD technology systems discussed in earlier sections may be subject to new techniques. 

For example, the Pentagon has funded research on devices that gather single photons reflected off air particles to identify laser signals from outside the beam, with the goal of extracting meaningful information about the beam direction, data speeds, and modulation type. The ultimate objective is to be able to intercept laser signals in the future.  


A prudent communications security approach is to expect that an opponent will find a method to access communications, notwithstanding best attempts to make it as difficult as possible. 


Highly sensitive information must be safeguarded from interception, and certain data must be kept safe for years, if not decades. Cryptographic procedures that render an intercepted transmission unintelligible are required. 

As we saw in the section on the PRC's capabilities, a significant amount of processing power is currently available to target Australian and ally military communications, and the situation is only going to become worse. 

On the horizon are technical dangers, the most well-known of which is the potential for effective quantum computing. Encryption needs to be ‘future proofed.'


Space-based links as secure intermediaries. 


If the connection can be made un-interceptable, space-based communications might provide a secure communication route for terrestrial organizations. Information and control signals between spacecraft and the Earth have been sent by radio waves to and from ground stations until now. 

Interception is achievable when collection systems are close enough to the uplink transmitter to collect energy from the unavoidable side lobes of the main beam, or when the collection system can be positioned inside the same downlink footprint as the receiver. 

The use of laser signals of various wavelengths to replace such RF lines has the potential to boost data speeds while also securing the communications against eavesdropping. 


Using laser communication connection between spacecraft has a number of advantages as well. 

Transmission losses over long distances restrict the efficiency with which spacecraft with low power budgets can exchange vast amounts of data, and RF connections inevitably restrict bandwidth. 


The imposts on space, weight, and power on spacecraft would be reduced if such linkages were replaced by laser communications. 

The benefits might include being able to carry larger sensor and processing payloads, spending more time on mission (owing to reduced downtime to recharge batteries), or a combination of the two. 

In the United States, the Trump administration's Space Force and anticipated NASA operations (including a presence on the moon and deep space missions) have sparked a slew of new space-based communications research initiatives. 


NASA has a ten-year project road map (dubbed the "decade of light") aiming at creating infrared and optical frequency laser communication systems, combining them with RF systems, and connecting many facilities and spacecraft into a reliable, damage-resistant network. 

As part of that effort, it is developing various technology demonstrations. 

Its Laser Communications Relay Demonstration, which is set to be live in June, will utilize lasers to encode and send data at speeds 10 to 100 times faster than radio systems.  

NASA uses the example of transmitting a map of Mars' surface back to Earth, which may take nine years with present radio technology but just nine weeks using laser communications.

The practicality of laser communications has been demonstrated in laboratory prototype systems, and NASA plans to launch space-based versions later this year. The Pentagon's Space Development Agency (SDA) and the Defense Advanced Research Projects Agency (DARPA) are both working on comparable technologies, but with military and intelligence purposes in mind. 


The SDA envisions hundreds of satellites linked by infrared and optical laser communication connections. 

Sensor data will be sent between spacecraft until it reaches a satellite in touch with a ground station, according to the plan. Information from an orbiting sensor grid may therefore be sent to Earth in subsecond time frames, rather than the tens of minutes it can take for a low-Earth-orbiting satellite to pass within line of sight of a ground station. 

Furthermore, because of the narrow beams created by lasers, an eavesdropper has very little chance of intercepting the message. The increased communication efficiency also makes ‘traffic jams' in the far more heavily used radio spectrum significantly less likely. 

This year, the SDA plans to conduct a test with a small number of "cubesats." Moving to even higher frequencies, X-ray beams may theoretically transport very high data-rate messages. In terrestrial applications, ionization of air gases would soon attenuate signals, but this isn't an issue in space, and NASA is presently working on gigabit-per-second X-ray communication lines between spacecraft.  

Although NASA is primarily interested in applications for deep space missions (current methods can take many hours to transmit a single high-resolution photograph of a distant object such as an asteroid after a flyby), the technology has the potential to link future constellations of intelligence-gathering and communications satellites with extremely high data-rate channels. On board the International Space Station, NASA has placed a technology demonstration.



Communications with a low chance of being detected. 


One technique to keep communications safe from an enemy is to never send them over routes that can be detected or intercepted. For mobile force units, this isn't always practicable, but when it is, communications security may be quite effective. 

The German army curtailed its radio transmissions in the run-up to its Ardennes operation in December 1944, depending instead on couriers and landlines operating within the region it held (which was contiguous with Germany, so that command and control traffic could mostly be kept off the airwaves).

The build-up of considerable German forces was overlooked by Allied intelligence, which had been lulled into complacency by having routinely been forewarned of German moves via intercepted radio communications. 

Even today, when fibre-optic connections can transmit data at far greater rates than copper connections, the option to go "off air" when circumstances allow is still valuable. Of course, mobile troops will not always have the luxury of transferring all traffic onto cables, especially in high-speed scenarios, but there are still techniques to substantially minimize the footprint of communication signals and, in some cases, render them effectively undetectable. 


Frequency-hopping and spread-spectrum radios were two previous methods for making signals less visible to an eavesdropper. 


Although these approaches lower the RF footprint of transmissions, they are now vulnerable to detection, interception, and exploitation using wideband receivers and computer spectral analysis tools. Emerging technologies provide a variety of innovative approaches to achieve the same aim while improving security. 

The first is to use extremely directed ‘line of sight' signals that may be focused directly at the intended receiver, limiting an adversary's ability to even detect the broadcast. This might be accomplished, for example, by using tightly concentrated laser signals of various wavelengths that may be precisely directed at the desired recipient's antenna when geography allow. 


A space-based relay, in which two or more force components are linked by laser communication channels with a constellation of satellites, which are connected by secure links (see the following section for examples of ongoing work in that field), offers a difficult-to-intercept communications path. 


As a consequence, data might be sent with far less chance of being intercepted than RF signals. The distances between connecting parties are virtually unlimited for a satellite system with a worldwide footprint for its uplinks and downlinks. Moving radio signals to wavelengths that do not travel over long distances due to atmospheric absorption, but still give effective communications capabilities at small ranges, is a second strategy that is better suited to force elements in close proximity. 


The US Army, for example, is doing research on deep ultraviolet communications (UVC). UVC has the following benefits over radio frequencies such as UHF and VHF: 


• the higher frequency allows for faster data transfer

• very low-powered signals can still be received over short distances

• signal strength rapidly drops off over a critical distance 









Quantum Cryptography


The Holy Grail of Data Security 


Let's take a closer look at the second item on the list: quantum cryptography. In today's society, data security has become an ever more crucial problem. 


How can we be sure that no one else has access to our personal digital information? 

Or that third parties don't listen in on our discussions without our knowledge? 


Traditional encryption encrypts a communication with a key code in such a way that decrypting it without knowing the key would demand unreasonably large processing power. But it's like a never-ending competition to build ever-more sophisticated encryption methods that can't be cracked by ever-more powerful computers. 

Quantum cryptography offers a solution, at least to the problem of the undetected eavesdropper.

  Quantum key distribution is a critical component of quantum-secure communication: because the key is conveyed using entangled quantum states of light, any interference with the transmission, such as an eavesdropper on the communication channel, is immediately apparent to the users. 

  • Assume A makes a “secure” phone call to B (in quantum cryptography, A and B are always taken to stand for Alice and Bob). 
  • Both Alice's and Bob's equipment are capable of measuring entangled particles. 
  • If the line is intercepted, Alice and Bob quickly recognize that an unwanted third party (commonly referred to as Eve) is present, because listening in means measuring the particles, which irreversibly disturbs their entanglement. 
  • Nor can Eve simply copy the qubits and forward the information to the intended recipient without being caught, because it is impossible to duplicate an unknown (yet-to-be-measured) quantum state exactly. 
  • As soon as Alice and Bob observe changes to their key, or find that the entanglement of their particles has been broken, they change their method of communication and, at least temporarily, shut the eavesdropper out (a simple sketch of this check follows below). 
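In practice, detecting Eve comes down to a statistical test: Alice and Bob publicly compare a random sample of their sifted key and abort if the observed error rate exceeds an agreed threshold. The sketch below shows only that comparison step, with the quantum transmission abstracted away; the 11% figure is used here as a commonly cited BB84-style abort level, and the function names are illustrative.

import secrets

QBER_THRESHOLD = 0.11   # roughly the abort threshold quoted for BB84-style protocols

def estimate_error_rate(alice_key, bob_key, sample_fraction=0.2):
    """Publicly compare a random sample of sifted bits and return the error rate."""
    n = len(alice_key)
    sample = secrets.SystemRandom().sample(range(n), int(n * sample_fraction))
    errors = sum(alice_key[i] != bob_key[i] for i in sample)
    return errors / len(sample), sample

def check_for_eve(alice_key, bob_key):
    qber, disclosed = estimate_error_rate(alice_key, bob_key)
    if qber > QBER_THRESHOLD:
        return None                       # abort: channel too noisy or Eve present
    # Discard the publicly disclosed bits and keep the rest as the raw key.
    keep = [i for i in range(len(alice_key)) if i not in set(disclosed)]
    return [alice_key[i] for i in keep]

if __name__ == "__main__":
    alice = [secrets.randbelow(2) for _ in range(500)]
    print("clean channel key kept:", check_for_eve(alice, list(alice)) is not None)
    # Simulate Eve's intercept-resend attack by randomizing every other bit.
    noisy = [b if i % 2 else secrets.randbelow(2) for i, b in enumerate(alice)]
    print("eavesdropped channel key kept:", check_for_eve(alice, noisy) is not None)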


Quantum cryptography relies on a fundamental fact of quantum mechanics: a quantum state can never be copied without disturbing the original state and the information it carries. 


Engineers are currently striving to utilize the odd qualities of the micro universe, which caused so much consternation among physicists in the early part of the twentieth century. 

Physicists went back to the theoretical drawing board during the creation of the first generation of quantum technologies to achieve a proper understanding of the principles that govern the micro universe. Meanwhile, they have made great progress in their efforts. 

Quantum physics and all of its main aspects may now be applied in a technology environment. The fascinating aspect of this approach is that scientists and engineers are working on a whole new universe of possibilities that have never been conceived before, rather than just attempting to make current and familiar things quicker or more exact. 


“The nineteenth century was known as the machine era, the twentieth century will go down in history as the information era,” wrote physicist Paul Davies in 1997. “The quantum age, I believe, will begin in the twenty-first century.”



You may also want to read more about Quantum Computing here.




