Quantum Computing - A New Way to Compute


In 2019, Google formally debuted its newly created Sycamore quantum processor and claimed to have completed the first computation that is simple for a quantum computer but extremely challenging for even the most powerful supercomputers.

For the previous 50 years, continuous breakthroughs in transistor fabrication technology had propelled the world's ever-increasing computing capability, and computing power grew dramatically as a result.

Despite these significant technical advancements, the underlying mathematical laws that govern computers have remained essentially unchanged.

Google's demonstration of so-called "quantum supremacy," also known as "quantum advantage," was based on 30 years of advancements in mathematics, computer science, physics, and engineering, and it heralded the start of a new era that might cause considerable upheaval in the technology landscape. 

Traditional ('classical') computers work with data encoded in bits, each of which takes the value 0 or 1 and is often represented by the presence (or absence) of a small electrical current.

According to computational complexity theory, this choice of encoding leads to problems that will always be too expensive for classical computers to solve. Simply put, the classical cost of modelling a complex physical or chemical system doubles with each additional particle.
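The doubling described above can be made concrete with a short sketch (illustrative, not from the article): describing the full quantum state of n two-level particles requires 2^n complex amplitudes, so the memory a classical simulator needs doubles with each particle added. The function names and the 16-byte amplitude size are assumptions for illustration.

```python
# Why classical simulation cost doubles per particle: a system of
# n two-level particles needs 2**n complex amplitudes to describe
# its full quantum state.

def state_vector_size(n_particles: int) -> int:
    """Number of complex amplitudes for n two-level particles."""
    return 2 ** n_particles

def memory_bytes(n_particles: int, bytes_per_amplitude: int = 16) -> int:
    """Memory to store the state, assuming 16-byte complex doubles."""
    return state_vector_size(n_particles) * bytes_per_amplitude

for n in (10, 30, 50):
    print(f"{n} particles -> {memory_bytes(n):,} bytes")
```

At 50 particles the state already needs on the order of petabytes, which is why even supercomputers hit a wall quickly.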

In the early 1980s, American Nobel Laureate Richard Feynman proposed quantum computers as a way to avoid this exponential cost.

Quantum computers instead encode and manage information in quantum mechanical components called qubits.

In Google's Sycamore processor, for example, qubits are encoded in superconducting electrical currents that can be manipulated by precisely engineered control circuitry.
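Whatever the physical encoding, a single qubit's state can be modelled as a pair of complex amplitudes. A minimal sketch of this (an illustrative model, not Google's hardware interface; the function name is hypothetical):

```python
# A qubit's state is a pair of complex amplitudes (a, b) satisfying
# |a|**2 + |b|**2 == 1; measurement yields 0 with probability |a|**2
# and 1 with probability |b|**2.

import math

def is_valid_qubit(a: complex, b: complex, tol: float = 1e-9) -> bool:
    """Check that the amplitudes are properly normalized."""
    return abs(abs(a) ** 2 + abs(b) ** 2 - 1.0) < tol

# Equal superposition: measuring gives 0 or 1, each with probability 1/2.
a = b = complex(1 / math.sqrt(2), 0)
print(is_valid_qubit(a, b))
print(abs(a) ** 2)  # probability of measuring 0 (approximately 0.5)
```

Unlike a bit, which is definitely 0 or definitely 1, the amplitudes let a qubit hold both possibilities at once until it is measured.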

The 'factoring problem,' in which a computer is tasked with finding the prime factors of a large number, remained an academic curiosity until quantum computers were shown to be capable of solving it efficiently.
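To see why this problem is hard classically, here is a hypothetical sketch of the obvious classical approach, trial division. It must test divisors up to the square root of N, so the work grows exponentially in the number of digits of N; real RSA moduli have hundreds of digits, putting them far out of reach of this method.

```python
# Naive classical factoring by trial division: fine for small
# numbers, hopelessly slow for numbers with hundreds of digits.

def prime_factors(n: int) -> list[int]:
    """Return the prime factors of n in ascending order."""
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)  # leftover n is itself prime
    return factors

print(prime_factors(15))    # [3, 5]
print(prime_factors(3233))  # [53, 61] -- a toy RSA-style modulus
```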

The security of the RSA public-key cryptosystem, a cornerstone of internet security, rests on the practical difficulty of exactly this problem.

With that finding, a flurry of research activity erupted throughout the world to see whether quantum computers could be built and, if so, how powerful they could be.
