When we talk about quantum computers, we usually mean fault-tolerant devices. These will be able to run Shor's algorithm for factoring, as well as all the other algorithms that have been developed over the years. But the power comes at a cost: to solve a factoring problem that is not feasible for a classical computer, we will require millions of qubits. This overhead is required for error correction, since most algorithms we know are extremely sensitive to noise.
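To get a feel for why error correction demands so many extra qubits, here is a minimal sketch using a classical 3-bit repetition code as an analogy (real quantum codes such as the surface code are far more involved, and the specific numbers here are illustrative, not from the text above): encoding one logical bit in three physical bits and decoding by majority vote suppresses the error rate from p to roughly 3p², at the cost of tripling the qubit count.

```python
import random

def logical_error_rate(p, n_phys=3, trials=100_000, seed=0):
    """Estimate the error rate of one logical bit encoded in n_phys
    physical bits (repetition code). Each physical bit flips
    independently with probability p; decoding is majority vote.
    NOTE: a classical analogy only -- quantum codes must also handle
    phase errors and cannot copy states, so their overhead is larger."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(trials):
        flips = sum(rng.random() < p for _ in range(n_phys))
        if flips > n_phys // 2:  # majority of bits corrupted: decoding fails
            failures += 1
    return failures / trials

p = 0.05  # illustrative per-bit error probability
print(p, "->", logical_error_rate(p))  # roughly 3*p**2, well below p
```

Driving the logical error rate low enough for a long algorithm like Shor's means stacking many layers of this kind of redundancy, which is where the millions-of-qubits estimate comes from.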

Even so, devices beyond about 50 qubits quickly become extremely difficult to simulate on classical computers. This opens the possibility that devices of this size might be used for the first demonstration of a quantum computer doing something that is infeasible for a classical one. The task will likely be highly abstract and of no practical use, but it will nevertheless be a proof of principle.
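A quick way to see where the ~50-qubit figure comes from (for brute-force state-vector simulation; cleverer methods can push somewhat further) is to count memory. An n-qubit state has 2^n complex amplitudes, each taking 16 bytes as a pair of double-precision floats:

```python
def statevector_bytes(n_qubits):
    """Memory to store a full n-qubit state vector:
    2**n complex amplitudes at 16 bytes each (two float64 values)."""
    return (2 ** n_qubits) * 16

for n in (30, 40, 50):
    print(f"{n} qubits: {statevector_bytes(n) / 2**40:,.3f} TiB")
```

At 30 qubits the state fits in 16 GiB of RAM; at 40 it needs 16 TiB; at 50 it needs 16 PiB, beyond any single machine and into the territory where even supercomputers struggle.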

Once this is done, we'll be in a strange era. We'll know that devices can do things that classical computers can't, but they won't be big enough to provide fault-tolerant implementations of the algorithms we know about. Preskill coined the term 'Noisy Intermediate-Scale Quantum' (NISQ) to describe this era. 'Noisy' because we don't have enough qubits to spare for error correction, and so we'll need to directly use the imperfect qubits at the physical layer. And 'Intermediate-Scale' because of their small (but not too small) number of qubits.

So what applications might devices in the NISQ era have? And how will we design the quantum software to implement them? These questions are far from being fully answered, and will likely require quite different techniques from those used for fault-tolerant quantum computing.

+1 A very clear and concise answer. Makes me wonder: how does software help account for the lack of qubits for error correction? Maybe I will ask this as a formal question here later. – Ntwali B. – 2018-11-18T15:26:57.363