Our lives at present are highly dependent on technology. It has revolutionised our lifestyles entirely: from the ways we communicate to the means by which we travel, from the food we eat to the clothes we wear, from healthcare to education. It’s quite mind-boggling to even imagine how we live our lives now as opposed to a couple of decades ago. In the words of Edward Teller, “The science of today is the technology of tomorrow”. The rate at which technology is advancing is simply astounding. That being said, it’s intuitive to ask, “What could be the next big thing in technology?”

## The Modern Processor Architecture is still the Old Processor Architecture

A quick browse on the web will throw at you a few concepts like Mixed Reality, Artificial Intelligence, 3-D printing, Blockchain and even the Internet of Things (IoT). But I’d like to approach this question from a rather different standpoint. True enough, all of the above have quite the hype among businesses and individuals these days and would undoubtedly disrupt the way we do things, but with a little more thought you’d realise they all have one thing in common: essentially, yet unsurprisingly, they process data to provide some meaningful result.

The difference actually lies in the kind of output provided in each case. But if you look at how the processing takes place (which is where the magic happens), they all still depend on the same fundamental, “ancient” processor architecture. Yes, I did say ancient because, surprise, surprise! These modern magic-making devices are still based on more or less the same processor architecture as the EDVAC (Electronic Discrete Variable Automatic Computer), delivered around 1949 *(told you it’s ancient!)*.

Diving a bit further into processor fundamentals, from a software point of view (myself being a software engineer) and without digging into the nitty-gritty of the electronics, today’s processor is essentially based on the von Neumann architecture, designed by the Hungarian-American mathematician and physicist John von Neumann back in 1945. In essence, it specifies that program instructions must be processed sequentially (one at a time) and that program instructions and data may reside in the same memory. Given the speed at which we see processing take place today, it’s baffling to comprehend that a single processor, at any given instant, is handling no more than a single program instruction! What’s even more alarming is to realise that these “intelligent” processing beasts comprehend nothing beyond a “1” and a “0” (binary).
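To make that concrete, here is a toy sketch of the von Neumann fetch-decode-execute cycle. Everything here (the opcodes, the memory layout) is made up purely for illustration; the point is that instructions and data sit side by side in one memory, and the processor works through exactly one instruction per step.

```python
def run(memory):
    acc, pc = 0, 0  # accumulator and program counter
    while True:
        op, arg = memory[pc]       # fetch the next instruction
        pc += 1
        if op == "LOAD":           # decode and execute, one at a time
            acc = memory[arg]
        elif op == "ADD":
            acc += memory[arg]
        elif op == "STORE":
            memory[arg] = acc
        elif op == "HALT":
            return memory

# Program and data live side by side in the same memory list.
memory = [
    ("LOAD", 4),    # 0: acc = memory[4]
    ("ADD", 5),     # 1: acc += memory[5]
    ("STORE", 6),   # 2: memory[6] = acc
    ("HALT", None), # 3: stop
    2, 3, 0,        # 4, 5, 6: data cells
]
print(run(memory)[6])  # → 5
```

However many billions of times per second a real chip runs this loop, the loop itself is still sequential.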

## How Quantum Computing Works

I often used to imagine that, if we were able to do so much using devices that could hardly differentiate between a “1” and a “0”, what could we accomplish if we made these devices comprehend something at least a bit more than binary? That is when I developed an enthusiasm for quantum computing.

**Qubits, Superposition, Quantum Entanglement and all that Jazz**

Derived from quantum mechanics, this computation system represents data using quantum bits, or qubits, as opposed to the bits of traditional computing, which are encoded as electric voltages (high and low being 1 and 0 respectively).

Quantum computing harnesses the quantum nature of particles: the fact that two or more quantum states can be added together to yield another valid quantum state (called quantum superposition), and the fact that particles can behave as a single “system” rather than individually (called quantum entanglement).

Quite a bit of jargon but what does this really mean?

## How Quantum Computing can Revolutionize Technology

For instance, two bits in your computer can be in four possible states (00, 01, 10, or 11), but only one of them at any time. This limits the computer to processing one input at a time.

In a quantum computer, however, two qubits can represent the exact same four states (00, 01, 10, or 11). The difference is that, because of quantum superposition, the qubits can represent all four states at the same time. That’s a bit like having four regular computers running side by side. The gist is that, compared to classical computing, quantum computing is a much better candidate for processing large volumes of data simultaneously.
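A minimal sketch of that idea, simulated with NumPy. This is only a classical simulation tracking the amplitudes of a two-qubit state, not a real quantum computer, but it shows how applying a Hadamard gate to each qubit spreads a single starting state across all four basis states at once:

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate

# Start in |00>: amplitude 1 for state 00, 0 for 01, 10 and 11.
state = np.array([1, 0, 0, 0], dtype=complex)

# Apply a Hadamard to each qubit (the tensor product acts on both).
state = np.kron(H, H) @ state

# All four basis states 00, 01, 10, 11 now carry equal amplitude,
# so each would be measured with probability 1/4.
print(np.round(np.abs(state) ** 2, 3))  # [0.25 0.25 0.25 0.25]
```

Note the catch this simulation hides: a classical machine needs 2ⁿ amplitudes to track n qubits, which is exactly why the quantum hardware itself becomes interesting.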

Quantum computing is extremely good at tasks like factorising large numbers and working through huge permutation spaces at blazing speed, thanks to the approach it takes to processing. For example, the RSA encryption algorithm, one of the most widely used encryption algorithms at present, would take a classical computer an astronomical amount of time to crack, yet is rendered useless against a quantum computer (how so? Check out “Why RSA is useless against a quantum computer”).
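To see what “cracking RSA” means in miniature, here is a hedged toy illustration. RSA’s security rests on the difficulty of factoring a modulus n = p × q; a classical attack like the trial division below grows explosively with the size of n, whereas Shor’s algorithm on a quantum computer factors n in polynomial time. The tiny textbook modulus here is purely illustrative; real RSA moduli run to hundreds of digits.

```python
def factor(n):
    """Brute-force trial division: find the smallest prime factor of n."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d  # recovered p and q
        d += 1
    return n, 1  # n itself is prime

p, q = factor(3233)  # 3233 = 53 * 61, a classic textbook RSA modulus
print(p, q)  # → 53 61
```

Once p and q are known, the private key can be reconstructed, which is why efficient factoring breaks RSA outright.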

## Quantum Computing for AI, Machine Learning and Big Data. Very Briefly.

Artificial intelligence, machine learning and cryptography are among the technologies that may be complemented by the power of quantum computing.

The adoption of this new computation methodology could mean access to more effective, higher-quality drugs; more accurate machine learning predictions, such as better traffic forecasting and personalised shopping recommendations; safer airplanes governed by far more complex and sophisticated software than today’s; and the opening of a portal to a realm of smarter, more powerful and even more passionate super-organisms.

## When can we Expect Quantum Computing to Become Mainstream?

Having said that, applying the theory of quantum computing in practice is not as straightforward, and materialising the idea poses many challenges. D-Wave, Google and NASA are among the giants that have already begun exploring the possibility of commercialising the concept; D-Wave is already attempting to make quantum computation available as a service. The environmental conditions required, however, such as the ~80 mK (about -273 °C) temperature needed to bring out the quantum nature of particles, account for some of the practical challenges in squeezing a quantum computer into your average laptop or mobile phone.

Moreover, although a quantum computer does possess the potential to crack a (current) state-of-the-art algorithm in a fraction of the time, it may not outperform a classical computer at, say, concatenating two string literals, because of the difference in the nature of the processing required in the two scenarios. Algorithms would also have to be redesigned to run optimally on a quantum computer.

Nevertheless, my hopes remain high for a future world driven by quantum computers. It may not arrive as soon as 2020, but surely soon enough. This is my take on what the next big thing in technology may be; what do you think? I’d like to hear your thoughts too. Feel free to comment and discuss below, and perhaps even shed some light on why you may think otherwise.

In the meantime, you might find following references useful:

- *IEEE Spectrum:* Quantum Computer Comes Closer to Cracking RSA Encryption
- *Phys.org:* How quantum effects could improve artificial intelligence
- *Time Magazine:* 9 Ways Quantum Computing Will Change Everything
- *Wired:* Quantum Computing Is Real, and D-Wave Just Open-Sourced It
- *D-Wave:* Introduction to the D-Wave Quantum Hardware
- *Cosmos Magazine:* Quantum computing for the qubit curious

**Image Courtesy:**

*Header image from unsplash.com/@nasa*

*D-Wave Two system from dwavesys.com*