# Quantum Computing


How does a computer work? The smallest building block of a computer is the transistor, essentially a switch that either permits or blocks the flow of electrical current. At any moment one can say with certainty whether a signal is passing through (‘1’) or not (‘0’), across the billions of transistors that form the circuitry of a modern computer. This is why computers are often described as nothing but sets of zeros and ones: the two possible states, true and false, of a ‘bit’, the most fundamental unit of storage in a computer.
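The idea that all data reduces to bits can be seen in a few lines of Python. This is a purely illustrative sketch (the helper name `to_bits` is mine, not a standard function): it shows the letter 'A' as the eight 0/1 values of its ASCII code.

```python
# Illustrative sketch: all classical data is ultimately stored as bits.
# Here, the single character 'A' (ASCII code 65) becomes eight 0/1 digits.

def to_bits(char: str) -> str:
    """Return the 8-bit binary string for a single character."""
    return format(ord(char), "08b")

print(to_bits("A"))  # → 01000001
```

Every file, image, or password on a classical machine is just a longer sequence of such true/false states.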

Remarkably, a modern transistor measures only about 5 nanometres, roughly 1,500 times smaller than a red blood cell. A problem arises, however: for decades we have made computers smaller, more efficient and more powerful by packing in ever-smaller transistors, but we may soon reach a limit. Anyone familiar with Moore’s law knows the observation that the number of transistors on a chip doubles roughly every two years, while the cost per transistor falls. But how long can this continue? The very laws of physics that once drove progress in transistor design are becoming a hurdle to shrinking transistors much further.
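The doubling in Moore’s law can be made concrete with a tiny projection. This is a rough sketch only: the starting point (2,300 transistors on Intel’s 4004 in 1971) is a well-known historical figure, but the constant two-year doubling is an idealisation, not real chip data.

```python
# Rough sketch of Moore's law: transistor count doubling every two years.
# Starting point: Intel 4004 (1971) with about 2,300 transistors.

def transistors(start_year: int, end_year: int, initial: int) -> int:
    """Project transistor count assuming one doubling every two years."""
    doublings = (end_year - start_year) // 2
    return initial * 2 ** doublings

# Fifty years of doubling turns thousands into tens of billions:
print(transistors(1971, 2021, 2_300))
```

Exponential growth like this is exactly why the trend cannot continue indefinitely: each doubling demands transistors roughly half the size of the last.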

Interestingly, as transistors shrink, the laws of quantum physics begin to apply, and these differ greatly from the laws of classical physics. The main problem they pose for today’s computers is that even when a transistor is supposed to block the signal (‘0’), a small enough particle can simply appear on the other side of the barrier through a phenomenon called quantum tunnelling, corrupting the flow of information. At such scales, conventional transistors become unreliable.

However, as they say, there is always a silver lining! The same quantum effects point to an opportunity to advance computing dramatically: building quantum computers, which are designed to work with the laws of quantum physics rather than against them. Instead of encoding data into bits that are either 0 or 1, quantum computing uses quantum bits (qubits), which can be 0 and 1 at the same time. Quantum computers can exploit this superposition to explore many possibilities at once. For example, if a password is stored in 128 bits, a classical computer may need to try up to 2^128 (about 3.4 × 10^38) combinations one after another. A quantum computer can, in principle, hold all of these possibilities in superposition and use quantum algorithms to home in on a candidate answer far faster. The practical difficulty today is that such answers are prone to errors; fortunately, work on error correction is ongoing to make this promising concept reliable.
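The superposition idea can be sketched numerically. This is a toy illustration, not a real quantum device: a single qubit is represented as a pair of amplitudes for the states 0 and 1, and the Hadamard gate (a standard quantum gate) turns a definite 0 into an equal superposition. Measurement probabilities are the squares of the amplitudes.

```python
import math

# Toy single-qubit sketch (an illustration, not a quantum computer).
# A qubit's state is a pair of amplitudes (a, b) for outcomes 0 and 1;
# measuring it yields 0 with probability a^2 and 1 with probability b^2.

def hadamard(state):
    """Apply the Hadamard gate, which maps |0> to an equal superposition."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

zero = (1.0, 0.0)               # the definite, classical-like state 0
superposed = hadamard(zero)     # now "both 0 and 1 at the same time"
probs = (superposed[0] ** 2, superposed[1] ** 2)
print(probs)                    # each outcome has probability ~0.5

# The brute-force search space mentioned above:
print(2 ** 128)                 # ≈ 3.4 × 10^38 combinations
```

With 128 qubits such a superposition spans all 2^128 bit patterns at once, which is the source of the speed-up; the hard part, as noted above, is extracting a correct answer from it reliably.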

A transition is already visible today: government agencies and the world’s leading data companies are using quantum computers and supercomputers for analysis and calculation. It is hence apt to say that the future of technology may lie in quantum computing…
