## Quantum circuits explain algorithms, so why don't classical circuits?

13

4

When explaining a quantum algorithm, many revert to 'circuit-speak', drawing a diagram of how qubits pass through transformations and measurements. However, rarely, if ever, would someone explaining a classical math algorithm revert to its representation in binary circuits. I understand that this is partly because transforms and such don't exist in the binary world, but:

Doesn't this unnecessary focus on computational details, rather than on the mathematical/statistical/optimization problem that the circuitry merely underlies, detract from the main problem or application at hand? Is the classical mindset just so intuitive and aligned with general human thought that quantum circuits, by contrast, will remain a standard explanation strategy?

1 "...rarely if not never would someone explaining a classical math algorithm revert to its representation in binary circuits" Actually, this was quite common among those involved in the development of computers in the '50s and '60s. In fact, it's how I learned to program (c. 1970). I've always assumed that the reason was that they were more familiar with electronic circuit logic, and that it represented a lower-level logic that could explain the behavior of the higher-level logic of CPUs and programs. – RBarryYoung – 2019-11-19T20:26:31.090

17

You might find this analogy helpful: the development of quantum algorithms is still at the Booth's multiplication algorithm stage; we haven't quite reached dynamic programming or backtracking. Most textbooks explain Booth's algorithm using a hardware circuit diagram. That is, in fact, how the multiplication logic is implemented in most modern processors (with some minor modifications depending on the version). However, this kind of representation quickly becomes tedious when you move on to algorithmic techniques like looping and recursion, which may involve multiple multiplication and division steps, among others. It would be crazy for textbooks to explain more advanced algorithms using hardware-level implementations like that, not to mention that the basic hardware circuitry varies with the processor. If you've ever done assembly language programming, this should resonate.
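For readers who haven't met it, Booth's algorithm itself is small enough to sketch in software. Here is a minimal Python sketch (not the hardware diagram textbooks draw) operating on fixed-width two's-complement values; the function name and the register layout details are my own choices for illustration:

```python
def booth_multiply(m, r, n):
    """Multiply two n-bit two's-complement integers with Booth's algorithm.

    Uses a (2n+1)-bit register P laid out as [accumulator | multiplier | guard bit],
    mirroring the shift-and-add structure a hardware implementation would use.
    """
    width = 2 * n + 1
    mask = (1 << width) - 1
    A = (m & ((1 << n) - 1)) << (n + 1)      # multiplicand, aligned high
    S = ((-m) & ((1 << n) - 1)) << (n + 1)   # its two's-complement negation
    P = (r & ((1 << n) - 1)) << 1            # multiplier, with a 0 guard bit

    for _ in range(n):
        if P & 0b11 == 0b01:                 # last two bits 01 -> add multiplicand
            P = (P + A) & mask
        elif P & 0b11 == 0b10:               # last two bits 10 -> subtract it
            P = (P + S) & mask
        # arithmetic right shift, preserving the sign bit
        P = (P >> 1) | (P & (1 << (width - 1)))

    product = P >> 1                         # drop the guard bit
    if product & (1 << (2 * n - 1)):         # reinterpret as signed 2n-bit value
        product -= 1 << (2 * n)
    return product
```

`booth_multiply(3, 4, 4)` returns `12` and `booth_multiply(-3, 4, 4)` returns `-12`; the point of the analogy is that this register-level bookkeeping is exactly what an algorithms textbook would not want to show.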

Classical algorithm textbooks like CLRS avoid this problem by framing algorithms without any particular processor in mind. The basic procedures like addition, multiplication, looping, etc. are all treated as black boxes. If you're interested in seeing the processor-specific implementation of a CLRS algorithm, you could certainly write it up in a high-level language like C and then convert it to assembly. Fortunately, compilers do that tedious conversion on our behalf!

Now, the interesting part is that the basic building blocks of quantum algorithms are not addition or multiplication as such, but rather operations like the Fourier transform and amplitude amplification. Quantum algorithms are largely framed in terms of these basic transformations, which are very easy to visualize using quantum circuits (at least in the gate model). It's really much more about convenience and much less about intuition.
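To make "basic transformation" concrete: the n-qubit quantum Fourier transform is just a fixed 2^n × 2^n unitary matrix with entries ω^(jk)/√N, where N = 2^n and ω = e^(2πi/N). A minimal pure-Python sketch of the matrix form (not a gate decomposition; the function name is my own):

```python
import cmath

def qft_matrix(n_qubits):
    """Return the QFT on n qubits as a nested list: F[j][k] = omega**(j*k) / sqrt(N)."""
    N = 2 ** n_qubits
    omega = cmath.exp(2j * cmath.pi / N)
    return [[omega ** (j * k) / N ** 0.5 for k in range(N)] for j in range(N)]

# Sanity check: the matrix is unitary, F @ F-dagger = I (verified for 2 qubits).
F = qft_matrix(2)
N = len(F)
for j in range(N):
    for k in range(N):
        s = sum(F[j][m] * F[k][m].conjugate() for m in range(N))
        assert abs(s - (1.0 if j == k else 0.0)) < 1e-9
```

A QFT circuit diagram is essentially a recipe for factoring this one matrix into Hadamard and controlled-phase gates; an algorithm designer usually just treats the whole matrix as a single black box in the circuit.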

Rest assured that if a textbook ever states a generalized quantum equivalent of Dijkstra's algorithm, it won't show you all the gates required to implement it, but will instead frame it in terms of elementary quantum operations whose hardware implementations vary widely depending on the quantum processor you're using. The bottom line is that we're still in the assembly language stage of quantum computing.

Looking forward to when quantum algorithms move beyond the Booth's-algorithm/assembly-language stage and compilers start working on our behalf. But I'd think operations like the Fourier transform and amplitude amplification would still require circuit diagrams. Thanks. – develarist – 2019-11-19T15:37:32.707

Great answer. As far as I can tell, it matches my experiences described in my comment to the OP. – RBarryYoung – 2019-11-19T20:28:24.843

4

Great answer; the only thing I would disagree with is that quantum algorithms are still at the Booth's multiplication algorithm stage. If you look at the papers referenced at the Quantum Algorithm Zoo, you will find many that use pseudo-code and have no circuits.

1 @KliuchnikovVadym That is true, yes. However, those algorithms are generally not found in introductory textbooks and aren't beginner-level as such. I wanted to give the OP a broad overview; your answer, on the other hand, emphasizes the aspect you mentioned. – Sanchayan Dutta – 2019-11-20T18:24:55.420

@KliuchnikovVadym Pseudocode isn't directly shown on the link you provided, from what I could see; only descriptions of the algorithms. Is the pseudocode you mention found in the various arXiv links to the articles themselves? If so, what are good pseudocode-only articles on that page for quantum machine/supervised learning? – develarist – 2019-11-20T20:56:16.530

@SanchayanDutta Another example is Quantum Computing: Lecture Notes. It uses a fair amount of pseudo-code and is only slightly beyond the introductory level; for example, see pages 54 and 61. I do agree that a lot of introductory quantum computing material is very circuit-heavy.

1

@develarist, here are a couple of examples of pseudo-code use in quantum machine learning: Quantum Perceptron Models, Generative training of quantum Boltzmann machines with hidden units

7

Quantum computing technology is still in its infancy, so implementation details are generally important when considering quantum algorithms. The number of gates, the number of operations, and the types of gates (e.g. Clifford vs. non-Clifford) are often necessary information for evaluating the feasibility and value of a quantum algorithm.

In many cases quantum algorithms are still being optimized, and there are often competing approaches with different trade-offs being considered and iterated. As a result, even publications describing very complex algorithms often include circuit diagrams implementing novel functions to improve efficiency (e.g. Fig. 1: controlled SELECT).

The quantum circuit model is also one of the more intuitive ways to depict quantum computations. Quantum circuits are a restricted form of tensor networks (see e.g. here), which are often used more broadly in both physics and classical computing (particularly in machine learning).

Microsoft seems to be one of the leaders in developing the kind of abstraction of quantum computation that you are referring to, embodied in Q#. However, effective abstraction is not always straightforward or necessarily more intuitive (see e.g. here).

7

In classical computing, both circuit diagrams and pseudo-code are used to explain algorithms, and the choice between them depends on the context. If the goal is to explain a highly optimized implementation of an algorithm on an FPGA, a circuit diagram is probably more suitable; for example, see this paper on an AES implementation on FPGA. Pedagogical explanations of AES use pseudo-code.

Similarly in quantum computing, if one wants to explain a highly optimized implementation of a modular adder, they resort to showing circuit diagrams. Papers focused on more high-level quantum algorithms frequently contain no quantum circuit diagrams and use pseudo-code instead. A good example of such a paper is Quantum algorithm for linear systems of equations. If you look through papers referenced at Quantum Algorithm Zoo, you will find many that have no circuit diagrams in them.

It seems that 'circuit-speak' is so common partly because quantum computing is taught from the ground up: quantum circuits are one of the first concepts many people are exposed to when learning quantum computing.

2

There are no classical registers in quantum computing

In classical computers you have a well-defined "current state at a given time" (stored notably in CPU registers and DRAM in modern systems), and this state changes over time (each CPU clock cycle) in a controlled way.

Therefore, it is easier to map a sequential description of an algorithm back to real classical hardware. For example, a classical algorithm might be described sequentially as:

a = b + c
d = 2 * a


and in a classical computer this might actually be implemented in two separate steps:

• a CPU clock tick happens
• one ADD instruction stores the intermediate result in a register that represents a
• a CPU clock tick happens
• one MUL instruction stores the final result in a register that represents d
• a CPU clock tick happens
• ...

In quantum computing, however, you cannot save the "intermediate state of a computation" and operate on it in a later step: you set up the inputs and the circuit, and information flows in a single indivisible step to the measurement device at the end of the circuit, which makes a probabilistic reading.

Therefore, unless we are treating quantum circuits as black boxes between classical registers, sequential algorithm descriptions don't make much sense.

It is this fact that makes quantum computers much harder to program.

So a more useful description of quantum computing looks like the combinatorial logic blocks (i.e. blocks with no registers and therefore no state) in hardware description languages such as Verilog and VHDL, which are just textual descriptions of a graph of circuits.

For example, in a Verilog combinatorial block, when you say:

a = b + c


it doesn't mean "on the next clock cycle of the algorithm, the register a will hold b + c", as it would in, say, C or Python.

It rather means:

• a is a wire
• b is a wire
• c is a wire
• + is an adding circuit with b and c as inputs and a as its output

Therefore, as soon as b or c changes, a also "immediately" changes. "Immediately" is in quotes because in practice electrons do take some time to move, so we can't make the clock period shorter than this propagation time.
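Putting the wire reading together, a complete combinatorial block covering both statements might look like this (a hedged sketch: the module name and the 8-bit widths are arbitrary illustration, not taken from the answer above):

```verilog
// Purely combinatorial: no clock, no registers, no stored state.
// `a` is a wire that continuously tracks `b + c`.
module add_then_double (
    input  wire [7:0] b,
    input  wire [7:0] c,
    output wire [7:0] a,
    output wire [7:0] d
);
  assign a = b + c;   // an adder circuit, not an assignment at a clock edge
  assign d = a << 1;  // 2 * a, again just wiring through a shifter
endmodule
```

This is closer in spirit to a quantum circuit: the whole computation from inputs to outputs is one fixed network, and there is no register in the middle where you could pause and inspect `a`.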

A "propagation time" analogue is also present in quantum computers, where each experiment takes some time to finish; the shorter that time, the faster you can rerun the experiment to reduce the uncertainty of the result.

Of course, for any maximum input size, you could make one huge combinatorial circuit that implements the algorithm. But in classical computing we don't do that, because silicon is expensive to design and produce, so it is much more economical to design a circuit that solves a wider range of problems than to build a huge specialized circuit, even if each individual problem is solved a bit more slowly.

In quantum computers, you don't have that choice. Unless you can use a divide-and-conquer-style algorithm to generate smaller subproblems (which generally implies a problem in P, which might not be so interesting for a quantum computer), you simply need a minimum number of qubits and gates for each given algorithm.

you're saying classical uses sequential algorithm descriptions while quantum uses sequential logic blocks. both have the word sequential in them so what is the difference? And are these types of circuit graphs (the topic here), versus algorithm pseudocodes? As for classical registers, I have seen bits placed within circuit diagrams next to qubits in many algorithms though, and besides that I wasn't asking why classical circuits aren't used to describe quantum algorithms, but why classical circuits aren't used to describe classical algorithms as abundantly as quantum does quantum – develarist – 2019-11-20T14:02:59.037

@develarist Oops, I mixed up sequential and combinatorial; it was meant to be combinatorial logic. Fixed that. But basically: a sequential description is often more intuitive for classical computing and more implementable, but it is impossible for quantum: in quantum, any algorithm description must somehow translate to a circuit. – Ciro Santilli TRUMP BAN IS BAD – 2019-11-20T14:07:22.360

With sequential descriptions being impossible, is this why they use circuit diagrams instead for quantum algorithms? – develarist – 2019-11-20T14:10:05.030

@develarist yes, this is what I understand – Ciro Santilli TRUMP BAN IS BAD – 2019-11-20T14:10:57.480

About what you added where variables are treated as wires: where or how, then, does quantum interference come into play when arriving at the final solution of a quantum algorithm? – develarist – 2019-11-20T14:19:30.780

@develarist I'm not that familiar with quantum, but I believe that in current models interference is introduced in the quantum gates. It is impossible to observe intermediate states to know when something started taking effect without disturbing the outcome; you can only observe a single final result. – Ciro Santilli TRUMP BAN IS BAD – 2019-11-20T14:22:47.460

2

This is a good explanation of why existing abstractions can't be applied to quantum algorithms, but that doesn't mean there won't be new abstractions that are just as powerful as the field develops. There are already programming languages that don't express things as sequential instructions and explicit state: for instance, declarative logic languages such as Prolog, or proof assistants like Coq. – IMSoP – 2019-11-20T16:09:47.777