Nano: Infinite Possibilities On The Invisible Small Scale



Norio Taniguchi was the first to define the term "nanotechnology," in 1974: nanotechnology is primarily concerned with the separation, consolidation, and deformation of materials by single atoms or molecules. 


  • The word "nano" refers to particle and material qualities that are one nanometer to 100 nanometers in size (1 nm is one millionth of a millimeter). 
  • The DNA double helix has a diameter of 1.8 nm, while a soot particle is 100 nm in size, almost 2,000 times smaller than the full stop at the end of this sentence. 
  • The nanocosm's structures are therefore substantially smaller than visible light wavelengths (about 380–780 nm). 

The nano range is distinguished by three characteristics: 


First, it is the boundary region between quantum physics, which applies to atoms and molecules, and the classical laws that apply on the macro scale. In this intermediate realm, scientists and engineers can harness quantum phenomena to develop materials with unique features. These include the tunnel effect, which, as indicated in the first chapter, is significant in modern transistors. 

Second, when nanoparticles are combined with other substances, they gather a huge number of additional particles around themselves, which is ideal, for example, for scratch-resistant car paints. 

Third, because surface atoms are more easily detached from the atomic assembly than atoms in the bulk (for instance at a freshly fractured surface), nanoparticles act as catalysts for chemical reactions. A simple geometric consideration demonstrates why. A cube with a side of one nanometre (approximately four atoms) contains on average 64 atoms, 56 of which sit on the surface (87.5 percent). The bigger the particle, the fewer surface atoms are available for reactions relative to bulk atoms. Only 7.3 percent of the atoms in a nanocube with a side of 20 nm (containing 512,000 atoms) are on the surface, and at 100 nm the share drops to roughly 1.5 percent.
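The scaling behind these percentages is easy to check. The sketch below is a minimal illustration under the same rough assumption as the text (about four atom diameters per nanometre of edge length, simple cubic counting); the exact figures depend on the lattice and on rounding.

```python
def surface_fraction(atoms_per_side: int) -> float:
    """Fraction of atoms lying on the surface of a cube with the given edge length in atoms."""
    total = atoms_per_side ** 3
    interior = max(atoms_per_side - 2, 0) ** 3   # atoms that touch no face
    return (total - interior) / total

ATOMS_PER_NM = 4   # rough assumption from the text: ~4 atom diameters per nanometre

for side_nm in (1, 20, 100):
    n = side_nm * ATOMS_PER_NM
    print(f"{side_nm:>3} nm cube: {n**3:>10,} atoms, {surface_fraction(n):5.1%} on the surface")
```

Run as-is, this prints 87.5 percent for the 1 nm cube, about 7.3 percent at 20 nm, and about 1.5 percent at 100 nm.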


Nanoparticles therefore consist almost entirely of surface, which makes them extremely reactive and gives them surprising mechanical, electrical, optical, and magnetic properties. 


Physicists have known for a long time that this is true in (quantum) theory. However, the technologies required to isolate and treat materials at the nanoscale have not always been available. 

  • The invention of the Scanning Tunneling Microscope (STM) by Gerd Binnig and Heinrich Rohrer in 1981 was a watershed moment in nanotechnology (for which they were awarded the 1986 Nobel Prize in Physics). Single atoms can be made visible with this device. The electric current between the microscope's tip and the electrically conductive sample reacts extremely sensitively to changes in their spacing as small as one tenth of a nanometer, due to a particular quantum phenomenon (the tunneling effect). 
  • In 1990, Donald Eigler and Erhard Schweizer succeeded in moving individual atoms from point A to point B by altering the voltage applied to the STM tip; the device could now not only view but also move individual atoms. Using 35 xenon atoms on a nickel crystal, the two researchers “wrote” the IBM logo. Twenty-two years later, researchers were able to construct a one-bit memory cell using just 12 atoms (conventional one-bit memory cells still comprise hundreds of thousands of atoms). 

What Feynman envisioned as a vision of the future in 1959, namely the atom-by-atom construction of extremely small structures, is now a reality. 

Physicists and engineers are using quantum physics to not only manipulate atoms and create microscopic components, but also to produce new materials (and better comprehend existing ones).


~ Jai Krishna Ponnappan


You May Also Want To Read More About Nano Technology here.


Nanomaterials - Materials Of Wonder.



Skilled blacksmiths have been producing the renowned Damascus steel in a complex manufacturing process for 2,000 years. Layers of various steels are stacked, forged together, and repeatedly folded over and flattened, much as a baker kneads dough, until a material consisting of up to several hundred of these layers is eventually created. 

Compared with regular steel, Damascus steel is extremely hard while also being remarkably flexible. It is now recognized that embedded carbon nanotubes, with lengths of up to 50 nm and diameters of 10 to 20 nm, are responsible for these exceptional material properties. 


Of course, ancient and medieval blacksmiths had no knowledge of nanotubes because their procedures were totally dependent on trial and error. 


As further examples, humans were already producing gleaming metallic nanoparticle surfaces on ceramics 3,400 years ago in Mesopotamia and Egypt, while the Romans used nanoparticles to seal their everyday ceramics, and red stained glass windows were made with glass containing gold nanoparticles in the Middle Ages. 


  • Nanoparticle-based materials have been made and utilized since the beginning of time. We can now comprehend and even enhance materials like Damascus steel thanks to quantum physics' insight.
  • Millennia-old forging methods can be further enhanced by carefully specifying the inclusion of particular materials. 
  • Nanometer-sized nickel, titanium, molybdenum, or manganese particles can be introduced into the iron crystal lattice of steel for this purpose. Nickel and manganese, in particular, encourage the development of nanocrystals, which maintain their structure even when the metal is bent, ensuring the material's resilience. 
  • Due to the precise dispersion of these nanocrystals, the steel becomes very flexible and bendable. Despite accounting for a relatively tiny portion of the overall mass, the extra particles provide far better characteristics than the pure iron crystal lattice. This strategy is employed, for example, in the automobile and aerospace industries, where more deformable and robust steels enable lightweight materials and energy-saving building processes.
  • The notion of introducing super-fine distributions of nanoparticles into materials (known as "doping" in semiconductors) underpins a variety of nanomaterial manufacturing processes. 


The “seasoning” of materials with single atoms or nano-atomic compounds can give them completely new properties, allowing us to make: 


• foils that conduct electricity, 

• semiconductors with precisely controlled characteristics (which have been the foundation of computer technology for decades), and 

• creams that filter out UV components from sunlight. 

Nanotechnology can also be used to replicate materials that have evolved naturally. 


 

Spider silk is a fine thread just a few thousandths of a millimetre thick, yet it is very ductile, heat-resistant up to 200 degrees Celsius, and five times stronger than steel. For decades, scientists have wished to create such a material in the lab. This dream has now become a reality. 


  • The secret of natural spider silk is a mixture of chain-shaped proteins and small carbohydrate fragments with lengths in the nanometer range. 
  • Artificial spider silk may be used to make super-textiles: blast-resistant gear for troops, super-elastic clothing for athletes, and breast-implant casings that prevent unpleasant scarring. 

Nanomaterials were created and exploited by evolution long before humanity did. We can reconstruct and even enhance these now thanks to quantum physics discoveries.


~ Jai Krishna Ponnappan


You May Also Want To Read More About Nano Technology here.



Nanomaterials - Diamonds Aren't The Only Thing That's Valuable.



Pure nanomaterials have become available today.


Graphite is a fascinating example. 


  • Graphite is a form of elemental carbon that is commonly used to make pencil leads. It is just a stack of carbon layers, each one the thickness of a single carbon atom. 
  • Each layer is graphene: a two-dimensional lattice of carbon atoms governed by quantum physics. 
  • For many years, scientists have been researching these ultra-thin carbon layers theoretically. 
  • Their quantum-physical calculations and simulations revealed that graphene must have incredible properties: 200 times the strength of steel, outstanding electrical and thermal conductivity, and transparency to visible light. 
  • They merely needed verification that their theoretical calculations were true in practice. 
  • Andre Geim and Konstantin Novoselov then succeeded in isolating pure graphene in 2004. Their trick was to peel individual layers off graphite using ordinary adhesive tape. 
  • In 2010, Geim and Novoselov were awarded the Nobel Prize in Physics for their work. Has a Nobel Prize in Physics ever been granted for anything so simple? 


Graphene is the world's thinnest substance, with thicknesses on the order of one nanometer. 


  • At the same time, its atoms are all held together by densely packed “covalent” chemical bonds. 
  • In a sense, there are no flaws in this material, no places where it could break. 
  • Because every carbon atom in this lattice can take part in chemical reactions from both sides, the material exhibits exceptional chemical, electrical, magnetic, optical, and biological properties. 


Graphene might be used in the following ways: 


• Clean drinking water production: graphene membranes may be utilized to construct extremely efficient desalination facilities. 

• Energy storage: Graphene may be utilized to store electrical energy more effectively and long-term than other materials, allowing for the creation of long-lasting and lightweight batteries. 

• Medicine: graphene-based prosthetic retinas are being studied by experts (see below). 

• Electronics: graphene can be used to build the world's smallest transistors. 

• Special materials: graphene might potentially be utilized as a coating to create flexible touchscreens, allowing mobile phones to be worn like bracelets. 


The EU believes graphene-based technologies have such promising futures that it designated research in this subject as one of two initiatives in the Future and Emerging Technologies Flagship Initiative, each with a one-billion-euro budget. 

The Human Brain Project is the other sponsored project, but a third has emerged in the meantime: the flagship project on quantum technologies. 

Graphene, a nanomaterial, is thought to be a future wonder material.


~ Jai Krishna Ponnappan


You May Also Want To Read More About Nano Technology here.



Microelectronics To Nanoelectronics

 


Doped silicon crystals are the basis of modern microelectronics. We've been pursuing the path from micro to nanoelectronics for quite some time now. 

And some of Feynman's vision has already come to fruition. In 1959, he claimed that a particle of dust could contain the information of 25 million books. 

  • To do this, one bit would have to be stored in 100 atoms. It is now feasible to create elementary storage units with just 12 atoms, so there is capacity for over 200 million books on a particle of dust (see the quick calculation after this list). 
  • Carbon nanotubes, often simply called nanotubes, are an example of future nanomaterials in electronics. 
  • They are graphene layers rolled into tubes: small carbon cylinders with diameters ranging from about one nanometer to a few tens of nanometers. 
  • Only the rules of quantum physics can explain their unique electrical characteristics. 
  • Because electrons pass through a nanotube almost without interference, i.e. without being scattered by obstructing atoms as they would be in a metallic conductor, nanotubes (depending on their diameter) carry electric currents better than any copper conductor. 
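A quick back-of-the-envelope check of the book-count claim, using only the figures quoted in the list above:

```python
books_at_100_atoms_per_bit = 25_000_000   # Feynman's 1959 estimate for one particle of dust
atoms_per_bit_1959 = 100
atoms_per_bit_2012 = 12                   # IBM's 12-atom one-bit memory cell

# The same number of atoms now stores roughly 100/12 times as many bits,
# and therefore roughly 100/12 times as many books.
books_now = books_at_100_atoms_per_bit * atoms_per_bit_1959 // atoms_per_bit_2012
print(f"~{books_now:,} books on the same particle of dust")   # ~208,333,333
```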


Stanford University researchers have built a functioning computer with 178 nanotube transistors. It possesses the processing capacity of a 1955 computer, which would have filled an entire gymnasium. The nanomaterial "silicene" goes even further. 


  • Its atoms are arranged in two-dimensional layers with a honeycomb pattern, similar to graphene. But unlike graphene, which is made of carbon, silicene is a foil made of elemental silicon, a semiconductor, which makes it particularly attractive for computer-chip fabrication. 
  • The first silicene transistor was built in 2014 by researchers at the University of Texas. 

Although silicene's manufacture and processing are still technically challenging (it decays when exposed to oxygen, for example), there are high expectations that this material can dramatically improve the performance of computer chips. 


  • Transistors made of nanotubes or silicene could be switched significantly faster, resulting in significantly more powerful computer processors. 
  • The use of nanotubes in computers, however, is not the end of the story.


Physicists and computer designers want to employ single molecules as transistors in the future. Indeed, some organic molecules can be switched between an electrically conductive and an insulating state by flipping a switch.


~ Jai Krishna Ponnappan


You May Also Want To Read More About Nano Technology here.



When Biotechnology And Nanotechnology Collide



Richard Feynman foresaw sixty years ago that nanoparticles and nanomachines may be extremely useful in medicine. 


This aspect of his vision is also coming to fruition right now. Here are three examples of things that are already being done: 


Nano-Retina, an Israeli startup, has developed an artificial nano-retina that allows blind people to see again. It consists of a small, flat implant carrying a high-resolution array of nano-electrodes. The electrodes collect incoming light, and the nano-retina stimulates the optic nerve so that the signals are relayed to the brain as visual sensations. 

Nanobiosensors detect antibodies and specific enzymes in human bodily fluids in a "lab on a chip." Just one-thousandth of a millilitre of blood, urine, or saliva (or even less) is placed on a credit-card-sized chip. When the sample comes into contact with the target substance, the nanoparticles embedded in the chip register characteristic chemical, optical, or mechanical changes. As a result, the chip can identify a variety of medical markers in only a few minutes. 

Nanoparticles can deliver medications directly to sites of inflammation or to mutated cells, allowing for a more effective pharmacological attack. Because blood is as sticky as honey for such small particles, the question of how to transport nanostructures through the bloodstream remained unanswered for a long time. Magnetic fields, for example, may now be used to direct them. Bioengineers want to use them in precision chemotherapies against cancer cells, among other things. 


Nano-robots, often known as "nanobots," are extremely small machines that hold great promise in medicine. The health checkup we currently visit the doctor for every couple of years could one day be replaced by a continuous nano-check. 


  • Nanobots would roam our bodies indefinitely, detecting viruses, gene changes, and harmful deposits in the circulation before they became a problem. 
  • They would then start treatment right away by administering medications straight to the illness location. 
  • They'd combat infections, reduce inflammation, remove cysts and cellular adhesions, unblock clogged arteries to avoid strokes, and even do surgery. 
  • They would submit the results immediately to the family doctor if required, who would then contact the patient to schedule an appointment. 
  • Doctors envision many small nano-robots—biomarkers, labs-on-a-chip, and other telemedical devices—permanently circulating inside our bodies for health care and healing. Nanoparticles might also be employed in our food. 
  • They would assist us in digesting food in such a way that nutrients are absorbed as efficiently as possible by our bodies. This would be beneficial in the treatment of disorders that now necessitate a tight diet. 


Researchers are also working on developing meals with nanoparticles on the surface that would mimic the flavor of chips, chocolates, or gummy bears while being nutritious and even healthful.


~ Jai Krishna Ponnappan


You May Also Want To Read More About Nano Technology here.



Ultra-Small Nano Machines - Masters Of The Nano-World



Our growing technical mastery of the nanoworld will open up a plethora of new technical possibilities, including Feynman's vision of ultra-small machines operating at the level of single atoms. 

  • Nanowheels, nanomotors, and even a nano-elevator have previously been constructed.
  • There is a nano-car with four distinct motors installed on a central support, powered by the tip of a scanning tunneling microscope. 

Nanotechnologists can make things even smaller. 


  • A single bent thioether molecule lying on a copper surface makes up the world's tiniest electric motor, which is only a nanometre in size. 
  • Two hydrocarbon chains of differing length (a butyl and a methyl group) hang like small arms from a central sulphur atom in this molecule. 
  • The whole molecule is attached to the copper surface in a way that allows it to spin freely. It is powered by a scanning tunneling microscope, whose electrons use the tunnel effect to excite the molecule's rotational degrees of freedom. 
  • The motor's operating speed can be tuned via the electric current and the ambient temperature. Nanomachine development is still in its infancy. 
  • In terms of progress, the molecular motor today is roughly where the electric motor was in the 1830s. In 1830, nobody could have predicted that the electric motor would one day power trains, dishwashers, and vacuum cleaners. 


In awarding the 2016 Nobel Prize in Chemistry, the Nobel Committee in Stockholm saw a comparable promise in molecular nanomachines. 

Molecular motors are anticipated to be employed in sensors, energy storage systems, and the production of novel materials in the near future. 


Nanotechnology has progressed in a number of ways that have mostly gone unnoticed by the general public: 


• The first generation of nanotechnology products, such as Damascus steel, were still passive materials with well-defined properties that did not change when used. 

• The second generation of nanotechnology products, on the other hand, produced tiny machines that “do work”—in other words, they drive an active process, such as a transport vehicle for targeted drug delivery in the body (see the medical examples above). These nanostructures interact and react directly with other substances, changing themselves and/or their surroundings in the process. 

• A third generation of nanotechnologies, known as "integrated nano-systems," is already on the horizon. Various active nano-components, such as copiers, sensors, motors, transistors, and so on, are employed as building blocks and assembled into a working whole, similar to how an engine, clutch, electronics, tires, and so on, when combined, become a car. This opens the door for more complicated nanomachines to emerge.

 

Coupling nanostructures with varied characteristics and capabilities into sophisticated nanomachines is the next stage in nanotechnology.


~ Jai Krishna Ponnappan


You May Also Want To Read More About Nano Technology here.



Nanotechnology's Possibilities: Technology On The Smallest Scales



We currently employ nanotechnology in a variety of ways, but only a small percentage of the population is aware of it. Nanotechnology, in addition to the quantum computer, is the most interesting prospective technological application of quantum theory. 


Many of its uses are now part of our daily routines. Some examples include: 


• Sun cream lotions that use nanotechnology to give UV protection. 

• Nanotechnologically treated surfaces for self-cleaning window panes, scratch-resistant automobile paint, and ketchup that pours evenly from the bottle. 

• Textiles coated with nanoparticles to reduce perspiration odor. Antibacterial silver particles, for example, keep bacteria from turning our odorless perspiration into a foul-smelling body odor. 


The upcoming nanotechnologies are even more amazing. 

These include nano-robots that automatically and permanently detect diseases in the human body, as well as autonomous nanomachines that could build almost anything from a mound of soil. 

Nanotechnology has long been ingrained in our daily lives, but this technological outgrowth of quantum physics has an even brighter future ahead. 

One can get the notion that “nano” is the key to everything fascinating and futuristic. 


~ Jai Krishna Ponnappan


You May Also Want To Read More About Nano Technology here.



Potential of Quantum Computing Applications



Despite the threat that the existence of a large-scale, fault-tolerant quantum computer (an FTQC) poses to information security, the ability of noisy intermediate-scale quantum (NISQ) processors to provide unprecedented computing power in the near future opens up a wide opportunity space, especially for critical Defense Department applications and for maintaining the Defense technology edge. 

The current availability of NISQ processors has drastically changed the development route for quantum applications. 

As a result, a heuristics-driven strategy has been developed, allowing for significantly greater engagement and industry involvement. 

Previously, quantum algorithm research was mostly focused on a far-off FTQC future, and determining the value of a quantum application needed extremely specialized mathematical abilities. 

We believe that in the not-too-distant future, such specialized abilities will no longer be essential for a practical quantum advantage. 

As a result, it will be critical, particularly for the Defense Department and other agencies, to have access to NISQ devices, which we anticipate will enable the development of early mission-oriented applications. 

While NISQ processors do not pose a danger to communications security in and of themselves, this recently attained intermediate regime permits quantum hardware and software development to be merged under the ‘quantum advantage' regime for the first time, potentially speeding up progress. 


This emphasizes the security apparatus's requirement for a self-contained NISQ capability.




Quantum Computing Threat to Information Security



Current RSA public-key (asymmetric) encryption systems and other variants rely on trapdoor mathematical functions, which make it simple to compute a public key from a private key but computationally infeasible to compute the converse, a private key from a public key.

Frequently used trapdoor functions exploit the difficulty of integer factorization and of elliptic-curve variants of the discrete logarithm problem, neither of which has a known algorithm for computing the inverse in polynomial time (that is, on a practical timescale). 


In a nutshell, this so-called "computational hardness" provides safety. 
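As a concrete toy illustration of such a trapdoor, consider RSA: anyone holding the public key (e, n) can encrypt, but computing the private exponent d requires knowing the prime factors of n, and recovering those factors from n alone is the step with no known polynomial-time classical algorithm. The numbers below are deliberately tiny and purely illustrative; real deployments use primes of a thousand or more bits.

```python
# Toy RSA example (illustrative only; real moduli are 2048 bits or more). Requires Python 3.8+.
p, q = 61, 53              # secret primes: the trapdoor information
n = p * q                  # 3233, the public modulus
phi = (p - 1) * (q - 1)    # 3120
e = 17                     # public exponent, chosen coprime to phi
d = pow(e, -1, phi)        # private exponent; computing it requires knowing p and q

message = 42
ciphertext = pow(message, e, n)      # easy with the public key (e, n)
recovered = pow(ciphertext, d, n)    # easy with the private key d
assert recovered == message

# An attacker who sees only (e, n) must factor n to obtain p, q and hence d.
# Trial division handles n = 3233 instantly, but for 2048-bit moduli no known
# classical algorithm finishes on any practical timescale.
```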


In 1994, however, Peter Shor proposed a quantum method that may be employed on a sufficiently large-scale quantum computer to perform integer factorization in polynomial time. 

The now-famous algorithm has since been shown to solve the discrete logarithm and elliptic-curve discrete logarithm problems in polynomial time as well. 
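To make the connection concrete: the quantum part of Shor's algorithm finds the period r of the function a^x mod N, and turning that period into the factors of N is simple classical arithmetic. The sketch below simulates this classical post-processing with a brute-force period finder standing in for the quantum step (which is exactly the part a large quantum computer would accelerate exponentially).

```python
from math import gcd
from random import randrange

def find_order(a: int, n: int) -> int:
    """Smallest r with a**r % n == 1 (brute force stands in for the quantum subroutine)."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_factor(n: int) -> int:
    """Return a non-trivial factor of an odd composite n via the order-finding reduction."""
    while True:
        a = randrange(2, n)
        g = gcd(a, n)
        if g > 1:
            return g                      # lucky: a already shares a factor with n
        r = find_order(a, n)
        x = pow(a, r // 2, n)
        if r % 2 == 0 and x != n - 1:
            return gcd(x - 1, n)          # non-trivial factor in this case

print(shor_factor(3233))   # prints 53 or 61, the primes of the toy RSA modulus above
```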


The creation of an FTQC, in combination with this quantum algorithm, would therefore jeopardize the security of present asymmetric public-key cryptography. 

Furthermore, Shor's method exemplifies how advances in the mathematics and physical sciences have the potential to jeopardize secure communications in general. 


In addition to Defense Department and critical cyber infrastructure systems, the world's digital revolution, which includes 4 billion internet users, 2 billion websites, and over $3 trillion in retail transactions, is backed at multiple tiers by existing public-key cryptography. 


While the creation of an FTQC is estimated to be at least a decade or two away, there is still a pressing need to solve this issue because of the ‘record now, exploit later' danger, in which encrypted data is collected and kept for subsequent decryption by an FTQC when one becomes available. 

As a result, the US National Institute of Standards and Technology's Post Quantum Cryptography Project, which includes worldwide partners—a security "patch" for the internet—is prioritizing the development of new "quantum hard" public-key algorithms.




Quantum Computing - A New Way to Compute

 


Google formally debuted their newly created Sycamore quantum processor in 2019 and claimed to have completed the first computation that was simple for a quantum computer but extremely challenging for even the most powerful supercomputers. 

Previously, continuous breakthroughs in transistor fabrication technology had propelled the world's ever-increasing computer capability. Computing power has increased dramatically during the last 50 years. 


Despite these significant technical advancements, the underlying mathematical laws that govern computers have remained basically constant. 

Google's demonstration of so-called "quantum supremacy," also known as "quantum advantage," was based on 30 years of advancements in mathematics, computer science, physics, and engineering, and it heralded the start of a new era that might cause considerable upheaval in the technology landscape. 

Traditional (‘classical') computers work with data encoded in bits, which are often represented by the presence (or absence) of a little electrical current. 


According to computational complexity theory, this classical encoding leads to problems that will always be too expensive for traditional computers to solve. Simply put, the classical cost of modelling complicated physical or chemical systems doubles with each extra particle added. 
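The exponential cost is easy to see in numbers. Describing the full quantum state of n two-level particles (qubits) on a classical machine requires storing 2^n complex amplitudes, so the memory needed doubles with every particle added. A minimal sketch:

```python
BYTES_PER_AMPLITUDE = 16   # one complex number in double precision (2 x 8 bytes)

for n_particles in (10, 30, 50, 300):
    amplitudes = 2 ** n_particles
    gigabytes = amplitudes * BYTES_PER_AMPLITUDE / 1e9
    print(f"{n_particles:>3} particles -> 2^{n_particles} amplitudes, ~{gigabytes:.3e} GB to store")
```

Thirty particles already need tens of gigabytes, fifty need millions of gigabytes, and for three hundred the number of amplitudes exceeds the number of atoms in the observable universe.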

In the early 1980s, American Nobel Laureate Richard Feynman proposed quantum computers as a solution to avoid this exponential expense. 


Information is encoded in quantum mechanical components called qubits, and quantum computers manage this information. 

In Google's Sycamore processor, for example, qubits are encoded in superconducting electrical currents that can be manipulated by precisely engineered electrical componentry. 

The ‘factoring problem,' in which a computer is tasked with identifying the prime factors of a large number, remained an academic curiosity until quantum computers were shown to be capable of solving it efficiently. 


The RSA public-key cryptosystem, which is a cornerstone of internet security, rests on the hardness of this very problem. 

With that finding, a flurry of research activity erupted throughout the world to see if quantum computers could be developed and if so, how powerful they could be.




Post Quantum Computing Encryption - Future-Proofing Encryption



Encryption in the post-quantum era. 


Many popular media depictions of quantum computing claim that the creation of dependable large-scale quantum computers will bring cryptography to an end and that quantum computers are just around the corner. 

The latter point of view may turn out to be overly optimistic, or indeed pessimistic if you happen to rely on security that is not proof against quantum computing. 

While quantum computers have made significant progress in recent years, there's no certainty that they'll ever advance beyond laboratory proof-of-concept devices to become a realistic daily technology. (For a more thorough explanation, see a recent ASPI study.) 


Nonetheless, if quantum computing becomes a viable technology, several of the most extensively used encryption systems would be vulnerable to quantum cryptanalytic attacks, because quantum algorithms can drastically shorten the time it takes to crack them. 


For example, the RSA encryption scheme for the secure exchange of encryption keys, which underlies most web-based commerce, is based on the practical difficulty of finding prime factors of very big integers using classical (non-quantum) computers.

However, there is an extremely efficient quantum technique for prime factorization (known as ‘Shor's algorithm') that would make RSA encryption vulnerable to attack, jeopardizing the security of the vast quantity of economic activity that relies on the ability to safeguard moving data. 

Other commonly used encryption protocols, such as the Digital Signature Algorithm (DSA) and Elliptic Curve DSA, rely on mathematical procedures that are difficult to reverse conventionally but may be vulnerable to quantum computing assaults. 


Moving to secure quantum communication channels is one technique to secure communications. 


However, while point-to-point quantum channels are conceivable (and immune to quantum computer assaults), they have large administration overheads, and constructing a quantum ‘web' configuration is challenging. 

A traditional approach is likely to be favored for some time to come for applications such as networking military force units, creating secure communications between intelligence agencies, and putting up a secure wide-area network. 


Fortunately, non-quantum (classical) approaches to data security are expected to remain safe even in the face of quantum computer threats. 


The 256-bit Advanced Encryption Standard (AES-256), which is routinely employed to safeguard sensitive information at rest, has been found to be resistant to quantum attacks. 

Protecting data at rest addresses only half of the problem; a secure mechanism for transferring encryption keys between the start and end locations for data in motion is still required. 
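For the data-at-rest half of the problem, quantum-resistant symmetric encryption is already routine and easy to use. Below is a minimal sketch using AES-256 in GCM mode via the third-party Python cryptography package (a tooling choice assumed here, not something prescribed by the text); note that it deliberately leaves open the hard part discussed above, namely how the two ends agree on the key.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM  # pip install cryptography

key = AESGCM.generate_key(bit_length=256)   # AES-256, currently believed quantum-resistant
aesgcm = AESGCM(key)
nonce = os.urandom(12)                      # never reuse a nonce with the same key

ciphertext = aesgcm.encrypt(nonce, b"sensitive data at rest", None)
plaintext = aesgcm.decrypt(nonce, ciphertext, None)
assert plaintext == b"sensitive data at rest"

# The unsolved half for data in motion: both ends need the same `key`, and
# distributing it securely is exactly what post-quantum key exchange (or QKD) targets.
```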


As a result, there's a lot of work being done to construct so-called "post-quantum" encryption systems that rely on mathematical processes for which no efficient quantum algorithms are known. 


IBM has already detailed a quantum-resistant technology for safely transporting data across networks.  If the necessity arises, such a system might possibly replace RSA and other quantum-vulnerable encryption systems.



If everything else fails, there are always encryption technologies for the twenty-first century. 


One technique to improve communication security is to be able to ‘narrowcast' in such a way that eavesdropping is physically difficult, if not impossible. 

However, this is not always practicable, and there will always be messages that must pass over channels that are sensitive to eavesdropping. 


Even so-called "secure" channels can be breached at any time. 


The tapping of a subsea cable running to a Soviet naval facility on the Kamchatka Peninsula by the US Navy in the 1970s is a good example. The cable was deemed safe since it ran wholly within Soviet territorial waters and was covered by underwater listening posts. 

As a result, it transmitted unencrypted messages. The gathered signals, though not of high intelligence value in and of themselves, gave cleartext ‘cribs' of Soviet naval communications that could be matched with encrypted data obtained elsewhere, substantially simplifying the cryptanalytic work. 

Even some of the low-probability-of-intercept/low-probability-of-detection (LPI/LPD) technology systems discussed below may be subject to new techniques. 

For example, the Pentagon has funded research on devices that gather single photons reflected off air particles to identify laser signals from outside the beam, with the goal of extracting meaningful information about the beam direction, data speeds, and modulation type. The ultimate objective is to be able to intercept laser signals in the future.  


A prudent communications security approach is to expect that an opponent will find a method to access communications, notwithstanding best attempts to make it as difficult as possible. 


Highly sensitive information must be safeguarded from interception, and certain data must be kept safe for years, if not decades. Cryptographic procedures that render an intercepted transmission unintelligible are required. 

As we saw in the section on the PRC's capabilities, a significant amount of processing power is currently available to target Australian and ally military communications, and the situation is only going to become worse. 

On the horizon are technical dangers, the most well-known of which is the potential for effective quantum computing. Encryption needs to be ‘future proofed.'


Space-based interconnections as secure intermediaries. 


If the connection can be made un-interceptable, space-based communications might provide a secure communication route for terrestrial organizations. Information and control signals between spacecraft and the Earth have been sent by radio waves to and from ground stations until now. 

Interception is achievable when collection systems are close enough to the uplink transmitter to collect energy from the unavoidable side lobes of the main beam, or when the collection system can be positioned inside the same downlink footprint as the receiver. 

The use of laser signals of various wavelengths to replace such RF lines has the potential to boost data speeds while also securing the communications against eavesdropping. 


Using laser communication links between spacecraft has a number of advantages as well. 

Transmission losses over long distances restrict the efficiency with which spacecraft with low power budgets can exchange vast amounts of data, and RF connections inevitably restrict bandwidth. 


The imposts on space, weight, and power on spacecraft would be reduced if such linkages were replaced by laser communications. 

The benefits might include being able to carry larger sensor and processing payloads, spending more time on mission (owing to reduced downtime to recharge batteries), or a combination of the two. 

In the United States, the Trump administration's Space Force and anticipated NASA operations (including a presence on the moon and deep space missions) have sparked a slew of new space-based communications research initiatives. 


NASA has a ten-year project road map (dubbed the "decade of light") aimed at creating infrared and optical frequency laser communication systems, combining them with RF systems, and connecting many facilities and spacecraft into a reliable, damage-resistant network. 

As part of that effort, it is developing various technology demonstrations. 

Its Laser Communications Relay Demonstration, which is set to be live in June, will utilize lasers to encode and send data at speeds 10 to 100 times faster than radio systems.  

NASA uses the example of transmitting a map of Mars' surface back to Earth, which may take nine years with present radio technology but just nine weeks using laser communications.

The practicality of laser communications has been demonstrated in laboratory prototype systems, and NASA plans to launch space-based versions later this year. The Pentagon's Space Development Agency (SDA) and the Defense Advanced Research Projects Agency (DARPA) are both working on comparable technologies, but with military and intelligence purposes in mind. 


The SDA envisions hundreds of satellites linked by infrared and optical laser communication connections. 

Sensor data will be sent between spacecraft until it reaches a satellite in touch with a ground station, according to the plan. Information from an orbiting sensor grid may therefore be sent to Earth in subsecond time frames, rather than the tens of minutes it can take for a low-Earth-orbiting satellite to pass within line of sight of a ground station. 

Furthermore, because of the narrow beams created by lasers, an eavesdropper has very little chance of intercepting the message. Because of the increased communication efficiency, ‘traffic jams' in the considerably more extensively used radio spectrum are also significantly less likely to occur. 

This year, the SDA plans to conduct a test with a small number of "cubesats." Moving to even higher frequencies, X-ray beams may theoretically transport very high data-rate messages. In terrestrial applications, ionization of air gases would soon attenuate signals, but this isn't an issue in space, and NASA is presently working on gigabit-per-second X-ray communication lines between spacecraft.  

Although NASA is primarily interested in applications for deep space missions (current methods can take many hours to transmit a single high-resolution photograph of a distant object such as an asteroid after a flyby), the technology has the potential to link future constellations of intelligence-gathering and communications satellites with extremely high data-rate channels. On board the International Space Station, NASA has placed a technology demonstration.



Communications with a low chance of being detected. 


One technique to keep communications safe from an enemy is to never send them over routes that can be detected or intercepted. For mobile force units, this isn't always practicable, but when it is, communications security may be quite effective. 

The German army curtailed its radio transmissions in the run-up to its Ardennes operation in December 1944, depending instead on couriers and landlines operating within the region it held (which was contiguous with Germany, so that command and control traffic could mostly be kept off the airwaves).

The build-up of considerable German forces was overlooked by Allied intelligence, which had been lulled into complacency by having been routinely forewarned of German moves via intercepted radio communications. 

Even today, when fibre-optic connections can transmit data at far greater rates than copper connections, the option to go "off air" when circumstances allow is still valuable. Of course, mobile troops will not always have the luxury of transferring all traffic onto cables, especially in high-speed scenarios, but there are still techniques to substantially minimize the footprint of communication signals and, in some cases, render them effectively undetectable. 


Frequency-hopping and spread-spectrum radios were two previous methods for making signals less visible to an eavesdropper. 


Although these approaches lower the RF footprint of transmissions, they are now vulnerable to detection, interception, and exploitation using wideband receivers and computer spectral analysis tools. Emerging technologies provide a variety of innovative approaches to achieve the same aim while improving security. 

The first is to use extremely directional ‘line of sight' signals that may be focused directly at the intended receiver, limiting an adversary's ability to even detect the broadcast. This might be accomplished, for example, by using tightly concentrated laser signals of various wavelengths that may be precisely directed at the desired recipient's antenna when geography allows. 


A space-based relay, in which two or more force components are linked by laser communication channels with a constellation of satellites, which are connected by secure links (see the preceding section for examples of ongoing work in that field), offers a difficult-to-intercept communications path. 


As a consequence, data might be sent with far less chance of being intercepted than RF signals. The distances between connecting parties are virtually unlimited for a satellite system with a worldwide footprint for its uplinks and downlinks. Moving radio signals to wavelengths that do not travel over long distances due to atmospheric absorption, but still give effective communications capabilities at small ranges, is a second strategy that is better suited to force elements in close proximity. 


The US Army, for example, is doing research on deep ultraviolet communications (UVC). UVC has the following benefits over radio frequencies such as UHF and VHF: 


• the higher frequency enables faster data transfer

• very low-powered signals can still be received over short distances

• signal strength rapidly drops off over a critical distance 









Quantum Cryptography


The Holy Grail of Data Security 


Let's take a closer look at the second item on the list: quantum cryptography. In today's society, data security is a problem that has grown more crucial. 


How can we be sure that no one else has access to our personal digital information? 

Or that third parties don't listen in on our discussions without our knowledge? 


Traditional encryption encrypts a communication with a key code in such a way that decrypting it without knowing the key would demand unreasonably large processing power. But it's like a never-ending competition to build ever-more sophisticated encryption methods that can't be cracked by ever-more powerful computers. 

Quantum cryptography offers a solution, at least to the problem of the undetected eavesdropper.

  Quantum key distribution is a critical component of quantum-secure communication: by conveying the key using entangled quantum states of light, any interference in the transmission, such as an eavesdropper in the communication channel, is immediately observable by the user. 

  • Assume A makes a “secure” phone call to B (in quantum cryptography, A and B are always taken to stand for Alice and Bob). 
  • Both Alice's and Bob's equipment are capable of measuring entangled particles. 
  • When the line is tapped, Alice and Bob quickly recognize that an undesirable third party (commonly referred to as Eve) is present, because Eve would irreversibly disrupt the entanglement of the particles while listening in, i.e., by measuring it. 
  • She also can't just copy them and transfer the information, the qubit, to the intended recipient without being caught, because it's impossible to duplicate any (yet-to-be-measured) quantum state exactly. 
  • As soon as Alice and Bob observe any changes to their key, or that the entanglement of their particles has been broken, they alter the method of communication and, at least temporarily, prevent the eavesdropper. 


Quantum cryptography relies on a fundamental fact of quantum mechanics (the no-cloning theorem): a quantum state can never be copied without disturbing the original state and the information it carries. 
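The detection mechanism can be illustrated with a toy, purely classical simulation of a BB84-style prepare-and-measure exchange. This is a simplification, not the entanglement-based scheme described above, but the lesson is the same: an intercept-and-resend eavesdropper unavoidably corrupts roughly a quarter of the bits on which Alice's and Bob's measurement bases agree, and comparing a sample of those bits exposes her.

```python
import random

BASES = "+x"

def bb84_error_rate(n_qubits: int, eavesdrop: bool) -> float:
    """Toy BB84 run: error rate on the sifted key (positions where bases agree)."""
    kept = errors = 0
    for _ in range(n_qubits):
        alice_bit = random.randint(0, 1)
        alice_basis = random.choice(BASES)
        qubit_bit, qubit_basis = alice_bit, alice_basis

        if eavesdrop:  # intercept-resend attack: Eve measures in a random basis
            eve_basis = random.choice(BASES)
            qubit_bit = qubit_bit if eve_basis == qubit_basis else random.randint(0, 1)
            qubit_basis = eve_basis           # the resent qubit now carries Eve's basis

        bob_basis = random.choice(BASES)
        bob_bit = qubit_bit if bob_basis == qubit_basis else random.randint(0, 1)

        if alice_basis == bob_basis:          # sifting: only these bits form the key
            kept += 1
            errors += int(bob_bit != alice_bit)
    return errors / kept

print(f"no eavesdropper:   {bb84_error_rate(20_000, False):.1%}")   # ~0%
print(f"with eavesdropper: {bb84_error_rate(20_000, True):.1%}")    # ~25%, so Eve is detected
```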


Engineers are currently striving to utilize the odd qualities of the micro universe, which caused so much consternation among physicists in the early part of the twentieth century. 

Physicists went back to the theoretical drawing board during the creation of the first generation of quantum technologies to achieve a proper understanding of the principles that govern the micro universe. Meanwhile, they have made great progress in their efforts. 

Quantum physics and all of its main aspects may now be applied in a technology environment. The fascinating aspect of this approach is that scientists and engineers are working on a whole new universe of possibilities that have never been conceived before, rather than just attempting to make current and familiar things quicker or more exact. 


“The nineteenth century was known as the machine era, the twentieth century will go down in history as the information era,” wrote physicist Paul Davies in 1997, adding: “The quantum age, I believe, will begin in the twenty-first century.”



You may also want to read more about Quantum Computing here.





Precision Measurements with Quantum Technology

 


Measurements that are more precise than ever before are now possible thanks to new quantum technologies. 

The precise measurement of physical quantities, such as the distance between New York and Boston or the number of electrons flowing through a wire in a given period, may appear to be tedious. 

However, this is not the case: regardless of what is being measured, whether meters, seconds, volts, or anything else, the highest level of accuracy may be critical. In this regard, the sensitivity of quantum mechanically entangled states to external disturbances can be very beneficial for many measuring applications. 


The measuring of time by atomic clocks is a well-known example of the metrological application of quantum physical processes. 


Atomic clocks have been in use for more than 70 years. Their time interval is determined by the characteristic frequency of electron transitions in atoms exposed to electromagnetic radiation. 

Incoming electromagnetic waves with a frequency of 9,192,631,770 oscillations per second (in the microwave range) have a maximum resonance for caesium atoms, i.e., a maximum of photons are released at that frequency. 

Humans have a considerably more precise definition of the second than the assertion that one day comprises 86,400 s, thanks to the commonly recognized definition that one second equals 9,192,631,770 of these vibrations. Because atomic clocks are based on the stimulation of numerous caesium atoms and a mean value of the number of released photons is taken, they are extremely precise. 


Now that there are roughly 260 standardized atomic clocks across the world that can be compared to each other, the measurement becomes even more precise, resulting in yet another averaging effect. 


Thanks to a global network of atomic clocks, time measurement is unbelievably precise: they are accurate to within 1 second every million years. However, even that is not accurate enough. 

How is that possible? After all, we just need our clock to be precise to the second to ensure that we don't miss the start of our favorite television show. 

However, most of us are unaware that the global navigation system GPS would not function without atomic clocks, as it determines locations by measuring the time it takes for a signal to travel between the device and the GPS satellites. 

The time measurement must be accurate to a few billionths of a second in order to identify our position to within a meter. Similarly, digital communication, in which a huge number of phone calls are sent over a single line at the same time, relies on ultraprecise time measurement. 
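The quoted timing requirement follows directly from the speed of light: a radio signal travels roughly 30 centimetres per nanosecond, so a clock error of a few nanoseconds already corresponds to a position error of around a metre. A quick check:

```python
SPEED_OF_LIGHT = 299_792_458   # metres per second

for clock_error_ns in (1, 3, 10, 1_000):
    position_error_m = SPEED_OF_LIGHT * clock_error_ns * 1e-9
    print(f"clock error {clock_error_ns:>5} ns  ->  position error ~{position_error_m:7.1f} m")
```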


Atomic clocks manage the switches that route individual digital signals across the network so that they arrive at the correct receiver in the correct order. 


External disturbances, such as electric fields, can impact the accuracy of atomic clocks. 

These extend the frequency spectrum of the photons being measured, resulting in tiny changes in the resonance frequency and, as a result, in the time being recorded. 

Fluctuations in the terrestrial magnetic field are another factor. Today's GPS and digital communications technologies, as well as high-precision measurements in physics experiments, are limited by this. Even with atomic clocks, time measurement is still too imprecise for some GPS applications or many data transmission channels. 

This weakness could be addressed by a new generation of atomic clocks that take advantage of the effect of quantum entanglement. In each clock in the global network, a few atoms would be quantum mechanically entangled. 

Because a measurement on a single atom of one clock is also a measurement on all the others, the clocks will stabilize each other in this way; owing to the nature of entanglement, even the tiniest errors within the network of clocks would be instantaneously corrected. 


Quantum physical processes offer yet another technique to improve the accuracy of atomic clocks. 


Using a suitable error-correction approach, we could account for the disturbing magnetic-field variations if we knew, to fractions of a second, how long they lasted. Nature demonstrates how a magnetic field may be measured ultra-precisely at the atomic level using the effect of quantum entanglement. 

Many migrating bird species have a magnetic sense that they utilize to navigate hundreds of kilometers to their wintering sites. For a long time, ornithologists were astounded by the precision with which they measured the intensity and direction of the Earth's magnetic field. 


Only a few years ago did they discover that birds employ a quantum compass for this purpose. In the robin's eye, pairs of electrons on two molecules are entangled via their spins. 


These entanglements are quite sensitive to external magnetic fields. The electrons revolve in different directions depending on the magnetic field's orientation, which translates to different orientations of their "spin." 

The shift in the orientation of the electron spins of these molecules in the bird's eye is enough to turn them into isomers (molecules with the same chemical formula but different spatial structure). 

The varied characteristics of the isomers are very sensitive to the strength and direction of the magnetic field, generating various chemical processes in the bird's retina that eventually lead to perception—the bird's eye therefore becomes a perfect measuring device for magnetic fields. 


Many species of birds have thus evolved a kind of quantum spectacles for magnetic fields. 


They therefore find their way to their winter quarters with the help of quantum phenomena. In addition to time and magnetic fields, local gravitational fields may be measured extremely precisely using quantum mechanically entangled states, which has sparked major economic interest. 

Today, detailed measurements of the intensity of local gravitational fields are used to find metal and oil resources in the earth. 

Large subterranean gas or water fields can also be detected by local density differences, which result in a slightly greater or weaker gravitational force—but this is a little impact that can only be detected with ultra-sensitive gravity sensors. 

Such measurements might be made much more precise by utilizing the phenomena of quantum mechanical entanglement. Even a single individual might be tracked down using an entanglement-based ultra-sensitive gravity sensor based on the gravitational field formed by their body mass. 


Gas pipelines in the earth, water pipe breaks, sinkholes beneath roadways, and anomalies under a proposed house plot might all be found. 


Furthermore, if archaeologists were able to use gravity sensors to simply "light up" ancient and prehistoric sites, their work would be substantially simplified. Entanglement-based measuring devices might also detect the small magnetic currents linked to brain function or cell-to-cell communication in our bodies. 

They would allow for real-time monitoring of individual neurons and their behavior. This would allow us to assess the processes in our brain (and body) considerably more precisely than we can now with EEG recordings. 

Quantum magnetic field sensors are already in use for magnetoencephalography (MEG), which uses Superconducting Quantum Interference Devices (SQUIDs) to assess the magnetic activity of the brain. Perhaps, in the future, we may be able to capture our thoughts from the outside and feed them straight into a computer. 


Future quantum technologies may, in fact, provide the ideal brain–computer interaction. Much of what has previously been unseen will become visible thanks to measurement instruments based on quantum entanglement.


~ Jai Krishna Ponnappan

You may also want to read more about Quantum Computing here.







