## What is quantum computing vs. what is not quantum computing

That is to say,

• what are some common or popular misconceptions about what constitutes quantum computing?

and

• how are those things misconceptions?

It could help to frame this as if you were explaining it to a layperson, such as Sophie (your meek-and-kindly-but-fox-news-informed great-aunt twice-removed), who has asked you out of genuine curiosity after hearing it referenced in passing multiple times on TV, radio, etc.

Sophie isn't looking for a career in computing, never mind quantum computing (she's a darned good seamstress herself), but she does grok basic maths and logic despite her technologically simpler lifestyle.

Sophie would like to know some mildly political things, such as how and why we fund studies in quantum computing, why quantum computing is studied, and what we're actually using it for, as well as some mildly technical things, such as why her regular computer isn't "quantum" since it computes "quantities", how quantum computers are any faster than her Athlon XP with Windows 2000, why doing things the way we've done them in traditional computing isn't satisfactory by itself, and when she can get that quantum pocket calculator that answers her questions before she asks them.

Of note: I am not Sophie nor do I have any aunt Sophie (to the best of my knowledge anyways; quantum-aunts notwithstanding!).
I asked this question because I've read and heard a lot of random snippets of information on the topic, from which I have my own basic comprehension, but not an understanding which is strongly communicable to other people.
Being slightly more computer-informed than other people around me, I am also asked to try and explain the topic of quantum computing for laypeople.
Obviously I'm hardly an ideal teacher on the subject, but truncating conversations on the topic to the likes of "I know it's not what you just described, but I can't tell you how it's not that" never sits well with me, hence the rather arbitrary framing I offered.

Fox news reported S.2998? – AHusain – 2018-09-02T16:33:16.997

7

The difficulty with explaining quantum computing is that quantum objects and processes have no direct classical analogue; they're an entirely new ontological category. For example, you might have learned in high school physics that light "is both a particle and a wave" in an attempt to relate it to two classical objects you can intuitively understand. In truth, light is neither a particle nor a wave, but rather a quantum phenomenon - something else entirely which requires learning a whole different language to understand. That language is mathematics.

This is far more effort than your average curious layperson (or those in the political process holding the science grant money purse strings) is willing to expend on learning about quantum computing, so it behooves us to come up with simpler explanations. We can group these explanations into two categories:

1. What do quantum computers enable us to accomplish?
2. How do quantum computers work?

The first question has been covered at length in zillions of pop science articles. Breaking RSA, simulating molecular interactions, all that stuff. Your question is specifically aimed at the second: how does a quantum computer work?

Usually when someone asks me this, I start out by asking them how they think classical computers work. If they don't have a somewhat-correct understanding of that, I just say quantum computers make use of weird quantum phenomena like superposition and entanglement to solve certain problems faster.

If they sort of understand how classical computers work, I might explain that superposition enables us to calculate, in some restricted sense, with the values of both 0 and 1 at the same time. However, I make sure to emphasize that this does not mean quantum computers "try every possible value simultaneously", and that, in the end, if you have $n$ qubits you can only get $n$ bits of classical information out of your quantum computation. I explain that we use a phenomenon called quantum interference to make getting the wrong answers less likely, and the right answers more likely.
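If the person happens to be comfortable with a little code, I sometimes make the "only $n$ bits out" point concrete. The sketch below is plain Python, not any real quantum library, and the state representation is deliberately minimal: a single qubit is just a pair of amplitudes, and measuring it returns exactly one classical bit, with probabilities given by the squared magnitudes.

```python
import random

# A qubit's state is a pair of complex amplitudes (a0, a1) for |0> and |1>,
# normalised so that |a0|^2 + |a1|^2 == 1.
def measure(a0, a1):
    """Collapse to ONE classical bit: 0 with prob |a0|^2, 1 with prob |a1|^2."""
    return 0 if random.random() < abs(a0) ** 2 else 1

# An equal superposition "holds" both values at once, but each measurement
# still hands back only a single classical bit.
s = 2 ** -0.5
counts = {0: 0, 1: 0}
for _ in range(10_000):
    counts[measure(s, s)] += 1
print(counts)  # roughly 5000 each: one bit out per run, not "both answers"
```

The point of the toy is the interface, not the physics: however rich the amplitudes are, `measure` only ever yields one bit per qubit.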

Depending on how curious the person is, I might walk them through an explanation of the 2-norm similar to Scott Aaronson's approach in Quantum Computing Since Democritus. It helps if you make this entertaining and engaging for the person, using something like "imagine you're a god, and you're gonna make your own version of the universe; now, just for fun, you want to mess with the way probability works..." etc. Then you can use the concept of "negative probability" to expand on the interference explanation.
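The "negative probability" idea can also be demonstrated in a few lines of plain Python (again a toy amplitude calculation, not a quantum SDK): the Hadamard gate introduces a minus sign, and applying it twice makes the two paths into |1> carry opposite signs and cancel.

```python
import math

# Amplitude-level sketch of interference. The Hadamard gate H maps
#   |0> -> (|0> + |1>)/sqrt(2)     |1> -> (|0> - |1>)/sqrt(2)
# Note the MINUS sign: amplitudes, unlike probabilities, can be negative.
def hadamard(state):
    a0, a1 = state
    s = 1 / math.sqrt(2)
    return (s * (a0 + a1), s * (a0 - a1))

state = (1.0, 0.0)       # start in |0>
state = hadamard(state)  # equal superposition: (~0.707, ~0.707)
state = hadamard(state)  # the two paths into |1> cancel: back to (1, 0)
print(state)
# The contributions to |1> were +1/2 and -1/2; destructive interference
# wipes them out. Quantum algorithms engineer exactly this effect to
# suppress wrong answers and boost right ones.
```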

Past this point, we're getting into where you'd want to start using linear algebra to show them how things actually work, so I usually just direct them to my talk or run through it live depending on how much I like them.

I think that terms such as "understanding" are variable in their meaning and very subjective. Take understanding English as an example: an average native speaker can communicate effectively in most situations. There are, say, one million words in English, and the average person knows maybe 50,000, so they do not know most of the words. Do they still understand English? The answer is yes, because understanding is a variable concept. That is why measurement is always effective: understanding is measured by an outcome of some kind, namely that people understand them. – Trevor Lee Oakley – 2018-09-02T17:28:39.340

3

There are lots of separate questions in there: politics, physics, etc. and I won't pretend to answer all of it, but let me try to get towards what I think is the core of the matter.

How do I explain to the interested non-specialist what I do (the general field)?

My explanation actually varies a lot depending on who I'm talking to, and depends a lot on having a two-way conversation about it. However, one approach that I quite like, because it doesn't tell any lies or half-truths, it helps avoid some of the common misconceptions, and doesn't require any mind-blowing quantum stuff that I don't really have the right language to describe (apart from with maths), is the following:

I assume people are familiar with the concept of logic gates. Now, a computation is just a sequence of logic gates all wired up in a specific order for that specific computation, and you can think of a processor as something that can dynamically rewire different circuits. (Of course, the connectivity isn't dynamically changed; it's done with extra gates.) But the point is that every processor, from your basic pocket calculator up to the most powerful supercomputer, is constructed out of the same fundamental set of logic gates. (In fact, just one type of gate is sufficient: the NAND gate, for example, is 'universal' for classical computation.)
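That NAND-universality claim is easy to verify for yourself. Here's a short Python sketch (the helper names are my own) building NOT, AND, OR and XOR out of nothing but NAND, then checking the full truth tables:

```python
def nand(a, b):
    """The one gate we allow ourselves: NOT (a AND b), on bits 0/1."""
    return 1 - (a & b)

# Every other classical gate, wired up from NAND alone:
def not_(a):    return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b):  return nand(not_(a), not_(b))
def xor_(a, b):
    n = nand(a, b)
    return nand(nand(a, n), nand(b, n))

# Exhaustively check all four input combinations against Python's
# built-in bitwise operators.
for a in (0, 1):
    for b in (0, 1):
        assert not_(a) == 1 - a
        assert and_(a, b) == (a & b)
        assert or_(a, b) == (a | b)
        assert xor_(a, b) == (a ^ b)
print("all gates recovered from NAND")
```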

Now, imagine I suddenly gave you a new type of logic gate that cannot be straightforwardly made out of your existing set of gates. That instantly gives you a lot of potential to rewrite your existing software to take advantage of this new gate. That does not mean that every algorithm will be faster. Some won't benefit at all, while some specific ones might show ridiculous speed improvements. Do we know what proportion fall into each camp? Not really, until you go and work on the algorithms extensively.

What we do know in the case of quantum computation is some specific examples that are radically faster than the existing classical algorithms. Things like Shor's algorithm for factoring numbers, and Grover's search.
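For readers who want slightly more than a name-drop, Grover's speedup can even be caricatured in plain Python. This is only a statevector toy, not a real quantum program: we track one amplitude per database entry, flip the sign of the marked entry (the oracle), and reflect all amplitudes about their mean (the diffusion step). After roughly (π/4)·√N rounds the marked entry holds almost all the probability, versus about N/2 classical lookups on average.

```python
import math

N = 64          # 64 "database" entries, i.e. 6 qubits' worth of states
marked = 42     # the item we're searching for
state = [1 / math.sqrt(N)] * N   # uniform superposition over all entries

rounds = int(round(math.pi / 4 * math.sqrt(N)))   # ~6 rounds for N = 64
for _ in range(rounds):
    state[marked] = -state[marked]         # oracle: flip the answer's sign
    mean = sum(state) / N
    state = [2 * mean - a for a in state]  # diffusion: reflect about the mean

probs = [a * a for a in state]
print(rounds, probs[marked])  # a handful of rounds, probability near 1
```

The striking part is the query count: six oracle calls instead of dozens, and the gap grows as √N versus N.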

If it's that simple, do we have quantum computers already?

Not really. That extra logic gate is hard to build, and you can't just interface it directly with the existing logic gates; you have to build those from scratch as well in a new technology (and, really, we're still scratching around trying to figure out what a good technology to use actually is). It's a work in progress that's receiving a lot of attention at the moment. There are small working prototypes which are potentially on the verge of performing computations that we can't do classically, but they're still a long way from useful implementations of algorithms such as Grover or Shor.

Why should you care?

That depends on what your interests are. For Sophie, unless she's ultra-paranoid about her tap-and-go seamstress payment system being hacked, she probably doesn't care so much about the practical side. It's not yet clear that it's going to have a significant bearing on things. She can probably trust her payment system provider to take care of things as much as they can, and forget about it.

Why do I care? Quantum mechanics is mind-blowing. I'm really excited not just to be a part of trying to categorise its effects and explain certain things that are going on, but to harness that quantum weirdness and bend it to my will (cue maniacal laughter!).

Why does it get funding? Well, there are a few reasons (and I'm going to make some very sweeping statements here that are too broad). For one, the few specific killer applications that we have are really useful to specific groups. Shor's algorithm will allow government agencies to spy on communications that use existing public key cryptography systems (practically everybody). Grover's search will help with all sorts of computer-sciencey problems that people like Google would really like help with.

Another reason is just to drive science on. As science progresses, whether or not the final outcome is ever realised as originally envisaged, all sorts of things come out along the way (the classic example is the space program). Everybody wants to invest to make sure they're at the forefront of those new developments, and it's a game of chicken: once there's major investment from one party, nobody wants to be left behind. Compared to many options, quantum computing appears reasonably accessible (while still being a huge challenge, it's not string theory), and it's amazing how much quantum mechanics has already permeated existing technology. The processor in your computer might not enable quantum computing (because it doesn't have the extra gate), but it still relies on quantum effects for its manufacture and functioning. Lasers are ubiquitous now (CD and DVD players), and their fundamental operating principle is quantum mechanical.

2

When explaining quantum computers, we must deal with the fact that most people don't know the fundamentals of classical computers. Since the difference lies at this fundamental level, that can present a problem.

Nevertheless, many people know that information technology comes in analogue form as well as digital. That already gives them an intuition that there can be different devices that do the same job, but do it differently and with different strengths and weaknesses. This sets up the place that quantum will take, as a third type of information technology distinct from the other two.

For those who are more well-versed in analogue and digital computing, you could invite them to think about a device that combines the strengths of both. That, I think, is a good way to think about what a quantum computer is: wave-particle duality becomes analogue-digital duality when we think in terms of computing.

So I guess I would invite your wise aunt to think about issues like these, tailored to her current knowledge.