
Post Quantum Computing Encryption - Future-Proofing Encryption



Encryption in the post-quantum era. 


Many popular media depictions of quantum computing claim both that the creation of dependable, large-scale quantum computers will bring cryptography to an end and that such machines are just around the corner. 

The latter point of view may turn out to be overly optimistic, or deeply pessimistic if you happen to rely on security that isn't proof against quantum computing. 

While quantum computers have made significant progress in recent years, there's no certainty that they'll ever advance beyond laboratory proof-of-concept devices to become a realistic daily technology. (For a more thorough explanation, see a recent ASPI study.) 


Nonetheless, if quantum computing becomes a viable technology, several of the most widely used encryption systems would be vulnerable to attack by quantum computers, because quantum algorithms may drastically shorten the time it takes to crack them. 


For example, the RSA encryption scheme for the secure exchange of encryption keys, which underlies most web-based commerce, is based on the practical difficulty of finding the prime factors of very large integers using classical (non-quantum) computers.

However, there is an extremely efficient quantum technique for prime factorization (known as Shor's algorithm) that would make RSA encryption vulnerable to attack, jeopardizing the security of the vast quantity of economic activity that relies on the ability to safeguard data in motion. 
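To make that dependence concrete, here is a minimal textbook-RSA sketch in Python. The tiny primes and exponents are purely illustrative (real RSA uses moduli of 2048 bits or more); the point is that anyone who can factor the public modulus can recompute the private key, which is exactly what Shor's algorithm would make feasible.

```python
# Textbook RSA with toy numbers (illustrative only, not production crypto).
from math import gcd

p, q = 61, 53                  # secret primes; real keys use primes hundreds of digits long
n = p * q                      # public modulus (3233)
phi = (p - 1) * (q - 1)        # Euler's totient, computable only if you know p and q
e = 17                         # public exponent
assert gcd(e, phi) == 1
d = pow(e, -1, phi)            # private exponent: modular inverse of e (Python 3.8+)

message = 65
ciphertext = pow(message, e, n)           # encrypt with the public key (e, n)
assert pow(ciphertext, d, n) == message   # decrypt with the private key d

# An attacker who can factor n back into p and q can recompute phi and d.
# That is classically infeasible for large n; Shor's algorithm on a large,
# reliable quantum computer would do it in polynomial time, breaking RSA.
```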

Other commonly used encryption protocols, such as the Digital Signature Algorithm (DSA) and Elliptic Curve DSA, rely on mathematical procedures that are difficult to reverse conventionally but may be vulnerable to quantum computing assaults. 


Moving to quantum communication channels is one technique for securing communications. 


However, while point-to-point quantum channels are conceivable (and immune to quantum computer assaults), they have large administration overheads, and constructing a quantum ‘web' configuration is challenging. 

A traditional approach is likely to be favored for some time to come for applications such as networking military force units, creating secure communications between intelligence agencies, and putting up a secure wide-area network. 


Fortunately, non-quantum (classical) approaches to data security are expected to remain safe even in the face of quantum computer threats. 


The 256-bit Advanced Encryption Standard (AES-256), which is routinely employed to safeguard sensitive information at rest, has been found to be resistant to quantum attacks. 

Protecting data at rest addresses only half of the problem; a secure mechanism for transferring encryption keys between the start and end locations for data in motion is still required. 
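As an illustration of the data-at-rest half of the problem, here is a minimal sketch using AES-256 in GCM mode via the Python cryptography package. It simply assumes both ends already share the 256-bit key; the key-distribution problem described above is out of scope.

```python
# Minimal AES-256-GCM sketch for data at rest (pip install cryptography).
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # 32-byte key; AES-256 is considered quantum-resistant
aesgcm = AESGCM(key)

nonce = os.urandom(12)                      # 96-bit nonce; must never repeat for the same key
plaintext = b"sensitive data at rest"
ciphertext = aesgcm.encrypt(nonce, plaintext, None)   # None = no additional authenticated data

assert aesgcm.decrypt(nonce, ciphertext, None) == plaintext
```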


As a result, there's a lot of work being done to construct so-called "post-quantum" encryption systems that rely on mathematical problems for which no efficient quantum algorithms are known. 


IBM has already detailed a quantum-resistant technology for safely transporting data across networks. If the necessity arises, such a system might replace RSA and other quantum-vulnerable encryption systems.



If everything else fails, there are always encryption technologies for the twenty-first century. 


One technique to improve communication security is to be able to ‘narrowcast' in such a way that eavesdropping is physically difficult, if not impossible. 

However, this is not always practicable, and there will always be messages that must pass over channels that are sensitive to eavesdropping. 


Even so-called "secure" channels can be breached at any time. 


The US Navy's tapping of an undersea cable running to a Soviet naval facility on the Kamchatka Peninsula in the 1970s is a good example. The cable was deemed safe because it ran wholly within Soviet territorial waters and was covered by underwater listening posts. 

As a result, it transmitted unencrypted messages. The gathered signals, though not of high intelligence value in and of themselves, gave cleartext ‘cribs' of Soviet naval communications that could be matched with encrypted data obtained elsewhere, substantially simplifying the cryptanalytic work. 

Even some of the LPI/LPD technology systems discussed in earlier sections may be subject to new techniques. 

For example, the Pentagon has funded research on devices that gather single photons reflected off air particles to identify laser signals from outside the beam, with the goal of extracting meaningful information about the beam direction, data speeds, and modulation type. The ultimate objective is to be able to intercept laser signals in the future.  


A prudent communications security approach is to expect that an opponent will find a method to access communications, notwithstanding best attempts to make it as difficult as possible. 


Highly sensitive information must be safeguarded from interception, and certain data must be kept safe for years, if not decades. Cryptographic procedures that render an intercepted transmission unintelligible are required. 

As we saw in the section on the PRC's capabilities, a significant amount of processing power is currently available to target Australian and ally military communications, and the situation is only going to become worse. 

On the horizon are technical dangers, the most well-known of which is the potential for effective quantum computing. Encryption needs to be ‘future proofed.'


Space-based links as secure intermediaries. 


If the connection can be made un-interceptable, space-based communications might provide a secure communication route for terrestrial organizations. Until now, information and control signals between spacecraft and the Earth have been sent by radio waves to and from ground stations. 

Interception is achievable when a collection system is close enough to the uplink transmitter to collect energy from the unavoidable side lobes of the main beam, or when it can be positioned inside the same downlink footprint as the intended receiver. 

Replacing such RF links with laser signals of various wavelengths has the potential to boost data rates while also securing the communications against eavesdropping. 
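A back-of-the-envelope sketch shows why the optical case is so much harder to eavesdrop on: the footprint of a diffraction-limited beam shrinks with wavelength. The altitude, wavelengths and aperture sizes below are illustrative assumptions, not figures from any particular program.

```python
# Rough comparison of RF vs laser downlink footprints from low Earth orbit.
# Assumes diffraction-limited divergence ~ wavelength / aperture (order of magnitude).

RANGE_M = 500e3   # assumed slant range from a low-Earth-orbit satellite

def footprint_m(wavelength_m: float, aperture_m: float, range_m: float = RANGE_M) -> float:
    """Approximate ground footprint diameter of a diffraction-limited beam."""
    divergence_rad = wavelength_m / aperture_m
    return divergence_rad * range_m

rf = footprint_m(wavelength_m=0.0375, aperture_m=1.0)      # X-band (~8 GHz), 1 m dish
laser = footprint_m(wavelength_m=1.55e-6, aperture_m=0.1)  # 1550 nm laser, 10 cm telescope

print(f"RF footprint:    ~{rf/1000:.0f} km across")   # ~19 km: easy to sit inside
print(f"Laser footprint: ~{laser:.0f} m across")      # ~8 m: an eavesdropper must be almost at the receiver
```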


Using laser communication links between spacecraft has a number of advantages as well. 

Transmission losses over long distances restrict the efficiency with which spacecraft with low power budgets can exchange vast amounts of data, and RF connections inevitably restrict bandwidth. 


The imposts on space, weight, and power on spacecraft would be reduced if such linkages were replaced by laser communications. 

The benefits might include being able to carry larger sensor and processing payloads, spending more time on mission (owing to reduced downtime to recharge batteries), or a combination of the two. 

In the United States, the Trump administration's Space Force and anticipated NASA operations (including a presence on the moon and deep space missions) have sparked a slew of new space-based communications research initiatives. 


NASA has a ten-year project road map (dubbed the "decade of light") aiming at creating infrared and optical frequency laser communication systems, combining them with RF systems, and connecting many facilities and spacecraft into a reliable, damage-resistant network. 

As part of that effort, it is developing various technology demonstrations. 

Its Laser Communications Relay Demonstration, which is set to be live in June, will utilize lasers to encode and send data at speeds 10 to 100 times faster than radio systems.  

NASA uses the example of transmitting a map of Mars' surface back to Earth, which may take nine years with present radio technology but just nine weeks using laser communications.

The practicality of laser communications has been demonstrated in laboratory prototype systems, and NASA plans to launch space-based versions later this year. The Pentagon's Space Development Agency (SDA) and the Defense Advanced Research Projects Agency (DARPA) are both working on comparable technologies, but with military and intelligence purposes in mind. 
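Those figures imply a speedup of roughly 50 times, comfortably within the quoted 10-to-100-times range. A quick sanity check of the arithmetic, using an assumed, purely illustrative radio data rate:

```python
# Sanity check of the '9 years vs 9 weeks' Mars-map example (illustrative rates only).
SECONDS_PER_WEEK = 7 * 24 * 3600

radio_weeks = 9 * 52                      # ~9 years expressed in weeks
speedup = radio_weeks / 9                 # 9 weeks by laser implies a ~52x faster link
print(f"Implied laser speedup: ~{speedup:.0f}x")

radio_rate_bps = 1e6                      # assume a ~1 Mbit/s deep-space radio link
laser_rate_bps = radio_rate_bps * speedup
data_bits = radio_rate_bps * radio_weeks * SECONDS_PER_WEEK   # volume implied by 9 years at radio rate

print(f"Laser transfer time: ~{data_bits / laser_rate_bps / SECONDS_PER_WEEK:.0f} weeks")
```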


The SDA envisions hundreds of satellites linked by infrared and optical laser communication connections. 

According to the plan, sensor data will be relayed between spacecraft until it reaches a satellite in contact with a ground station. Information from an orbiting sensor grid may therefore be sent to Earth in subsecond time frames, rather than the tens of minutes it can take for a low-Earth-orbiting satellite to pass within line of sight of a ground station. 
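The subsecond claim is plausible because relaying over crosslinks is essentially a speed-of-light problem. A rough sketch, with assumed and purely illustrative hop distances:

```python
# Rough propagation delay across a chain of laser crosslinks to a ground station.
C = 3e8                      # speed of light, m/s
hop_distance_m = 4000e3      # assumed spacing between neighbouring satellites
hops = 10                    # assumed number of crosslink hops
downlink_m = 1000e3          # assumed slant range of the final downlink

delay_s = (hops * hop_distance_m + downlink_m) / C
print(f"Propagation delay: ~{delay_s * 1000:.0f} ms")   # ~140 ms, well under a second
# Processing and queuing add overhead, but the total remains far below the
# tens of minutes needed to wait for a ground-station pass.
```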

Furthermore, because of the narrow beams created by lasers, an eavesdropper has very little chance of intercepting the message. And because of the increased communication efficiency, ‘traffic jams' in the considerably more heavily used radio spectrum are significantly less likely to occur. 

This year, the SDA plans to conduct a test with a small number of "cubesats." Moving to even higher frequencies, X-ray beams may theoretically transport very high data-rate messages. In terrestrial applications, ionization of air gases would soon attenuate signals, but this isn't an issue in space, and NASA is presently working on gigabit-per-second X-ray communication lines between spacecraft.  

Although NASA is primarily interested in applications for deep space missions (current methods can take many hours to transmit a single high-resolution photograph of a distant object such as an asteroid after a flyby), the technology has the potential to link future constellations of intelligence-gathering and communications satellites with extremely high data-rate channels. On board the International Space Station, NASA has placed a technology demonstration.



Communications with a low chance of being detected. 


One technique to keep communications safe from an enemy is to never send them over routes that can be detected or intercepted. For mobile force units, this isn't always practicable, but when it is, communications security may be quite effective. 

The German army curtailed its radio transmissions in the run-up to its Ardennes operation in December 1944, depending instead on couriers and landlines operating within the region it held (which was contiguous with Germany, so that command and control traffic could mostly be kept off the airwaves).

The build-up of considerable German forces was overlooked by Allied intelligence, which had been lulled into complacency by having been routinely forewarned of German moves through intercepted radio communications. 

Even today, when fibre-optic connections can transmit data at far greater rates than copper connections, the option to go "off air" when circumstances allow is still valuable. Of course, mobile troops will not always have the luxury of transferring all traffic onto cables, especially in high-speed scenarios, but there are still techniques to substantially minimize the footprint of communication signals and, in some cases, render them effectively undetectable. 


Frequency-hopping and spread-spectrum radios were two previous methods for making signals less visible to an eavesdropper. 


Although these approaches lower the RF footprint of transmissions, they are now vulnerable to detection, interception, and exploitation using wideband receivers and computer spectral analysis tools. Emerging technologies provide a variety of innovative approaches to achieve the same aim while improving security. 

The first is to use highly directional ‘line of sight' signals that can be aimed directly at the intended receiver, limiting an adversary's ability even to detect the transmission. This might be accomplished, for example, by using tightly focused laser signals of various wavelengths that can be precisely directed at the desired recipient's antenna when geography allows. 


A space-based relay, in which two or more force components are linked by laser communication channels with a constellation of satellites, which are connected by secure links (see the following section for examples of ongoing work in that field), offers a difficult-to-intercept communications path. 


As a consequence, data might be sent with far less chance of being intercepted than RF signals. The distances between connecting parties are virtually unlimited for a satellite system with a worldwide footprint for its uplinks and downlinks. Moving radio signals to wavelengths that do not travel over long distances due to atmospheric absorption, but still give effective communications capabilities at small ranges, is a second strategy that is better suited to force elements in close proximity. 


The US Army, for example, is doing research on deep ultraviolet communications (UVC). UVC has the following benefits over radio frequencies such as UHF and VHF (a rough sketch of the drop-off behaviour follows the list): 


• the higher frequency allows for faster data transfer

• very low-powered signals can still be received over short distances

• signal strength rapidly drops off over a critical distance 
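The security value of that steep drop-off can be sketched with a simple path-loss model that combines inverse-square spreading with exponential atmospheric absorption; the absorption coefficient used here is an illustrative assumption, not a measured deep-UV value.

```python
# Sketch: received power vs range for a strongly absorbed (deep-UV-like) signal.
import math

def relative_power(distance_m: float, absorption_per_m: float = 1e-3) -> float:
    """Power relative to 1 m range: 1/r^2 spreading times exp(-alpha * r) absorption."""
    return (1.0 / distance_m**2) * math.exp(-absorption_per_m * distance_m)

for d in (10, 100, 1000, 5000):
    print(f"{d:>5} m: {relative_power(d):.2e}")
# Within a few hundred metres the link still works at very low power; beyond
# roughly a kilometre the exponential term dominates and the signal becomes
# effectively undetectable to a distant eavesdropper.
```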








Precision Measurements with Quantum Technology

 


Measurements that are more precise than ever before are now possible thanks to new quantum technologies. 

The precise measurement of physical quantities such as the distance between New York and Boston, or the number of electrons flowing through a wire in a given period, may appear to be a tedious affair. 

However, this is not the case: regardless of what is being measured, whether meters, seconds, volts or anything else, the highest level of accuracy may be critical. In this regard, the sensitivity of quantum mechanically entangled states to external disturbances can be very beneficial for many measuring applications. 


The measuring of time by atomic clocks is a well-known example of the metrological application of quantum physical processes. 


Atomic clocks have been in use for more than 70 years. The characteristic frequency of electron transitions in atoms subjected to electromagnetic radiation defines their time interval. 

Incoming electromagnetic waves with a frequency of 9,192,631,770 oscillations per second (in the microwave range) have a maximum resonance for caesium atoms, i.e., a maximum of photons are released at that frequency. 

Humans have a considerably more precise definition of the second than the assertion that one day comprises 86,400 s, thanks to the commonly recognized definition that one second equals 9,192,631,770 of these vibrations. Because atomic clocks are based on the stimulation of numerous caesium atoms and a mean value of the number of released photons is taken, they are extremely precise. 


Now that there are roughly 260 standardized atomic clocks across the world that can be compared to each other, the measurement becomes even more precise, resulting in yet another averaging effect. 


Thanks to a global network of atomic clocks, time measurement is unbelievably precise: the clocks are accurate to within 1 second every million years. However, even that is not accurate enough for some purposes. 

How is that possible? After all, we just need our clock to be precise to the second to ensure that we don't miss the start of our favorite television show. 

However, most of us are unaware that the global navigation system GPS would not function without atomic clocks, as it determines locations by measuring the time it takes for a signal to travel between the device and the GPS satellites. 

The time measurement must be accurate to a few billionths of a second in order to identify our position to within a meter. Similarly, digital communication, in which a huge number of phone calls are sent over a single line at the same time, relies on ultraprecise time measurement. 
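That timing requirement follows directly from the speed of light, since GPS converts time of flight into distance. A small sketch of the arithmetic:

```python
# Clock error -> position error for a time-of-flight system such as GPS.
C = 299_792_458.0   # speed of light, m/s

for dt_ns in (1, 3, 10, 100):
    error_m = C * dt_ns * 1e-9
    print(f"{dt_ns:>4} ns clock error -> ~{error_m:.2f} m position error")
# Metre-level positioning therefore requires timing good to a few nanoseconds,
# i.e. a few billionths of a second, as stated above.
```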


Atomic clocks manage the switches that route individual digital signals across the network so that they arrive at the correct receiver in the correct order. 


External disturbances, such as electric fields, can impact the accuracy of atomic clocks. 

These broaden the frequency spectrum of the photons being measured, resulting in tiny shifts in the resonance frequency and, as a result, in the time being recorded. 

Fluctuations in the terrestrial magnetic field are another factor. Today's GPS and digital communications technologies, as well as high-precision measurements in physics experiments, are limited by this. Even with atomic clocks, time measurement is still too imprecise for some GPS applications or many data transmission channels. 

This weakness would be addressed by a new generation of atomic clocks that take advantage of quantum entanglement. In each clock in the global network, a few atoms would be quantum mechanically entangled. 

Because a measurement on a single atom in one clock is also a measurement on all the others, the clocks would stabilize each other in this way; owing to the nature of entanglement, even the tiniest errors within the network of clocks would be rectified almost instantaneously. 


Quantum physical processes offer yet another technique for improving the accuracy of atomic clocks. 


If we knew, down to fractions of a second, how long the disturbing magnetic-field variations lasted, we could compensate for them with a suitable error-correction approach. Nature demonstrates how magnetic fields can be measured ultra-precisely at the atomic level using the effect of quantum entanglement. 

Many migrating bird species have a magnetic sense that they utilize to navigate hundreds of kilometers to their wintering sites. For a long time, ornithologists were astounded by the precision with which they measured the intensity and direction of the Earth's magnetic field. 


Ornithologists discovered only a few years ago that birds employ a quantum compass for this purpose. In the robin's eye, pairs of electrons in two molecules are entangled via their spins. 


These entangled states are quite sensitive to external magnetic fields. The electrons revolve in different directions depending on the magnetic field's orientation, which translates to different orientations of their "spin." 

The shift in the orientation of the electron spins of these molecules in the bird's eye is enough to turn them into isomers (molecules with the same chemical formula but different spatial structure). 

The varied characteristics of the isomers are very sensitive to the strength and direction of the magnetic field, generating various chemical processes in the bird's retina that eventually lead to perception—the bird's eye therefore becomes a perfect measuring device for magnetic fields. 


Many species of birds have thus evolved a kind of quantum spectacles for magnetic fields. 


They can therefore find their way to their winter quarters via quantum phenomena. In addition to time and magnetic fields, local gravity fields can be measured extremely precisely using quantum mechanically entangled states, which has sparked major economic interest. 

Today, detailed measurements of the intensity of local gravitational fields are used to find metal and oil resources in the earth. 

Large subterranean gas or water fields can also be detected by local density differences, which result in a slightly greater or weaker gravitational force—but this is a little impact that can only be detected with ultra-sensitive gravity sensors. 

Such measurements might be made much more precise by utilizing the phenomena of quantum mechanical entanglement. Even a single individual might be tracked down using an entanglement-based ultra-sensitive gravity sensor based on the gravitational field formed by their body mass. 


Gas pipelines in the earth, water pipe breaks, sinkholes beneath roadways, and anomalies under a proposed house plot might all be found. 


Furthermore, if archaeologists were able to use gravity sensors to simply "light up" ancient and prehistoric sites, their work would be substantially simplified. Entanglement-based measuring devices might also detect the small magnetic currents linked to brain function or cell-to-cell communication in our bodies. 

They would allow for real-time monitoring of individual neurons and their behavior. This would allow us to assess the processes in our brain (and body) considerably more precisely than we can now with EEG recordings. 

Quantum magnetic field sensors are already in use for magnetoencephalography (MEG), which uses superconducting quantum interference devices (SQUIDs) to assess the magnetic activity of the brain. Perhaps, in the future, we may be able to capture our thoughts from the outside and feed them straight into a computer. 


Future quantum technologies may, in fact, provide the ideal brain–computer interaction. Much of what has previously been unseen will become visible thanks to measurement instruments based on quantum entanglement.


~ Jai Krishna Ponnappan

You may also want to read more about Quantum Computing here.







