## What precisely is quantum annealing?

Many people are interested in quantum annealing as an application of quantum technologies, not least because of D-Wave's work on the subject. The Wikipedia article on quantum annealing implies that if one performs the 'annealing' slowly enough, one realises (a specific form of) adiabatic quantum computation. Quantum annealing seems to differ mostly in that it does not presuppose evolution in the adiabatic regime: it allows for the possibility of diabatic transitions.

Still, there seems to be more intuition at play with quantum annealing than just "adiabatic computation done hastily". It seems that one specifically chooses an initial Hamiltonian consisting of a transverse field, and that this is specifically meant to allow for tunnelling effects in the energy landscape (as described in the standard basis, one presumes). This is said to be analogous to (possibly even to formally generalise?) the temperature in classical simulated annealing. This raises the question of whether quantum annealing presupposes features such as an initial transverse field specifically, linear interpolation between Hamiltonians, and so forth; and whether these conditions may be fixed in order to be able to make precise comparisons with classical annealing.
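To fix notation, the transverse-field picture I have in mind is (up to sign and normalisation conventions, which I may be getting slightly wrong) something like

$$H(t) \;=\; -\sum_{i<j} J_{ij}\,\sigma_i^z \sigma_j^z \;-\; \Gamma(t) \sum_i \sigma_i^x,$$

where the field strength $\Gamma(t)$ starts large and is reduced towards zero, playing the role that the temperature plays in classical simulated annealing.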

• Is there a more-or-less formal notion of what quantum annealing consists of, which would allow one to point to something and say "this is quantum annealing" or "this is not precisely quantum annealing because [it involves some additional feature or lacks some essential feature]"?
• Alternatively: can quantum annealing be described in reference to some canonical framework — possibly in reference to one of the originating papers, such as Phys. Rev. E 58, 5355 (1998) [freely available PDF here] — together with some typical variations which are accepted as also being examples of quantum annealing?

• Is there at least a description which is precise enough that we can say that quantum annealing properly generalises classical simulated annealing, not by "working better in practice", or "working better under conditions X, Y, and Z", but in the specific sense that any classical simulated annealing procedure can be efficiently simulated or provably surpassed by a noiseless quantum annealing procedure (just as unitary circuits can simulate randomised algorithms)?

My previous answer to an earlier question about the difference between quantum annealing and adiabatic quantum computation can be found here. I'm in agreement with Lidar that quantum annealing can't be defined without considerations of algorithms and hardware.

That being said, the canonical framework for quantum annealing, and the inspiration for the D-Wave machine, is the work by Farhi et al. (quant-ph/0001106).
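In that paper the evolution is generated, writing $s = t/T$ for a total run-time $T$, by the linear interpolation

$$H(s) \;=\; (1-s)\,H_B \;+\; s\,H_P,$$

where $H_B$ is a 'beginning' Hamiltonian (typically a transverse field) whose ground state is easy to prepare, and $H_P$ is the problem Hamiltonian whose ground state encodes the answer. Adiabatic quantum computation requires $T$ to be large enough for the adiabatic theorem to apply; annealing relaxes that requirement.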

Finally, I'm not sure one can generalize classical simulated annealing using quantum annealing, again without discussing hardware. Here's a thorough comparison: arXiv:1304.4595.

(1) I saw your previous answer, but don't get the point you make here. It's fine for QA not to be universal, and not to have a provable performance to solve a problem, and for these to be motivated by hardware constraints; but surely quantum annealing is something independent of specific hardware or instances, or else it doesn't make sense to give it a name.

(2) Your linking of the AQC paper, together with the excerpt by Vinci and Lidar, strongly suggests that QA is just adiabatic-ish evolution in the not-necessarily-adiabatic regime. Is that essentially correct? Is this true regardless of what the initial and final Hamiltonians are, or what path you trace through Hamiltonian-space, or the parameterisation with respect to time? If there are any extra constraints beyond "possibly somewhat rushed adiabatic-ish computation", what are those constraints, and why are they considered important to the model?

(1+2) Similar to AQC, QA reduces the transverse magnetic field of a Hamiltonian; however, the process is no longer adiabatic, and it depends on the qubits and noise levels of the machine. The initial Hamiltonians are called gauges in D-Wave's vernacular, and can be simple or complicated as long as you know the ground state. As for the 'parameterization with respect to time', I think you mean the annealing schedule, and as stated above this is restricted by hardware constraints.

(3) I also don't see why hardware is necessary to describe the comparison with classical simulated annealing. Feel free to assume that you have perfect hardware with arbitrary connectivity: define quantum annealing as you imagine a mathematician might define annealing, free of niggling details; and consider particular realisations of quantum annealing as attempts to approximate the conditions of that pure model, but involving the compromises an engineer is forced to make on account of having to deal with the real world. Is it not possible to make a comparison?

**The only relation classical simulated annealing has with quantum annealing is that they both have annealing in the name.** The Hamiltonians and the process are fundamentally different.

$$H_{\rm{classical}} = \sum_{i,j} J_{ij} s_i s_j$$

$$H_{\rm{quantum}} = A(t) \sum_{i,j} J_{ij} \sigma_i^z \sigma_j^z + B(t) \sum_i \sigma_i^x$$
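As a small numerical illustration (a sketch with arbitrarily chosen couplings, not any particular machine's Hamiltonian), one can build both operators with numpy and check that setting $B(t) = 0$ leaves a diagonal matrix, i.e. exactly the classical Ising cost function:

```python
import numpy as np

# Single-qubit operators
I2 = np.eye(2)
Z = np.diag([1.0, -1.0])
X = np.array([[0.0, 1.0], [1.0, 0.0]])

def op_on(op, i, n):
    """Return `op` acting on qubit i of an n-qubit register (identity elsewhere)."""
    out = np.array([[1.0]])
    for k in range(n):
        out = np.kron(out, op if k == i else I2)
    return out

def annealing_hamiltonian(J, A, B):
    """H = A * sum_{i<j} J_ij Z_i Z_j + B * sum_i X_i, following the formulas above."""
    n = J.shape[0]
    H = np.zeros((2**n, 2**n))
    for i in range(n):
        for j in range(i + 1, n):
            H += A * J[i, j] * (op_on(Z, i, n) @ op_on(Z, j, n))
        H += B * op_on(X, i, n)
    return H

# A toy 3-spin antiferromagnetic instance (couplings chosen arbitrarily)
J = np.array([[0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0],
              [0.0, 0.0, 0.0]])

H_problem = annealing_hamiltonian(J, A=1.0, B=0.0)  # B = 0: purely classical part
H_driver = annealing_hamiltonian(J, A=0.0, B=1.0)   # A = 0: pure transverse field

# With B = 0 the Hamiltonian is diagonal in the computational basis, i.e. it is
# the classical Ising cost function evaluated on the 2^3 spin configurations.
assert np.allclose(H_problem, np.diag(np.diag(H_problem)))
```

The transverse-field part, by contrast, is purely off-diagonal in the computational basis, which is the formal counterpart of the 'tunnelling between spin configurations' intuition.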

However, if you would like to compare simulated quantum annealing with quantum annealing, Troyer's group at ETH are the pros when it comes to simulated quantum annealing. I highly recommend these slides, largely based on the Boixo et al. paper I linked above.

Performance of simulated annealing, simulated quantum annealing and D-Wave on hard spin glass instances — Troyer (PDF)

(4) Your remark about the initial Hamiltonian is useful and suggests something very general lurking in the background. Perhaps arbitrary (but efficiently computable, monotone, and first differentiable) schedules are also acceptable in principle, with limitations only arising from architectural constraints, and of course also the aim to obtain a useful outcome?

I'm not sure what you're asking. Are arbitrary schedules useful? I'm not familiar with work on arbitrary annealing schedules. In principle, the field should go from high to low, slow enough to avoid a Landau-Zener transition and fast enough to maintain the quantum effects of qubits.

Related: the latest iteration of the D-Wave machine can anneal individual qubits at different rates, but I'm not aware of any studies unaffiliated with D-Wave where this has been implemented.

D-Wave — Boosting integer factoring performance via quantum annealing offsets (PDF)

(5) Perhaps there is less of a difference between the Hamiltonians in QA and CSA than you suggest. $H_{\rm cl}$ is clearly obtained from $H_{\rm qm}$ for $A(t)=1$, $B(t)=0$ if you impose a restriction to standard basis states (which may be benign if $H_{\rm qm}$ is non-degenerate and diagonal). There's clearly a difference in 'transitions', where QA seems to rely on suggestive intuitions of tunnelling/quasiadiabaticity, but perhaps this can be (or already has been?) made precise by a theoretical comparison of QA to a quantum walk. Is there no work in this direction?

With the schedule $A(t)=1$, $B(t)=0$, you're no longer annealing anything: the machine is just sitting there at a finite temperature, so the only transitions you'll get are thermal ones. This can be slightly useful, as shown by Nishimura et al. The following publications discuss the uses of a non-vanishing transverse field.

arXiv:1605.03303

arXiv:1708.00236

Regarding the relation of quantum annealing to quantum walks: it is possible to treat quantum annealing in this way, as shown by Chancellor.

arXiv:1606.06800

(6) One respect in which I suppose the hardware may play an important role, but which you have not explicitly mentioned yet, is the role of dissipation to a bath, which I now vaguely remember being relevant to D-Wave. Quoting from Boixo et al.: "Unlike adiabatic quantum computing [...] quantum annealing is a positive temperature method involving an open quantum system coupled to a thermal bath." Clearly, what bath coupling one expects in a given system is hardware dependent; but is there no notion of what bath couplings are reasonable to consider for hypothetical annealers?

I don't know enough about the hardware aspects to answer this, but if I had to guess, the lower the temperature the better, in order to avoid all the noise-related problems.

You ask "Are arbitrary schedules useful?" I'm confident that the answer is 'no', just as arbitrary unitary circuits are not helpful; but the fact that it is not helpful to perform unitary gates willy-nilly doesn't mean we should define the model to disallow it. We define "things which are allowed" generously, and leave 'helpfulness' as a question of design within the model. You say "In principle, the field should go from high to low, slow enough to avoid a Landau-Zener transition and fast enough to maintain the quantum effects of qubits." That is the helpful thing to do, but you usually don't know just how slow that can or should be, do you? Just how generous would you be in defining what "an annealing schedule" is?

This would be set by the coherence time of the qubits. The D-Wave annealing schedules are on the order of microseconds, with T2 for superconducting qubits being around 100 microseconds. If I had to give a definitive definition of an annealing schedule, it would be 'an evolution of the transverse field within a length of time less than the decoherence time of the qubit implementation'. This allows for different starting strengths, pauses, and readouts of field strengths. It need not be monotonic.
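To make that definition concrete, here is a hypothetical schedule sketch (all numbers are illustrative, not D-Wave's actual values): a ramp down, a pause at an intermediate field, and a final ramp to zero, with the only imposed constraint being that the whole schedule fits inside an assumed coherence budget:

```python
import numpy as np

T_ANNEAL = 20e-6   # total anneal time in seconds (illustrative, microsecond scale)
T2 = 100e-6        # assumed qubit coherence time (illustrative)

def transverse_field(t):
    """Piecewise B(t): ramp down, pause at an intermediate value, ramp to zero.
    Pauses and non-monotonic segments are allowed by the definition above; the
    only constraint imposed here is that the schedule fits within T2."""
    s = t / T_ANNEAL                                  # dimensionless anneal fraction
    if s < 0.4:
        return 1.0 - 2.0 * s                          # initial ramp: 1.0 down to 0.2
    elif s < 0.6:
        return 0.2                                    # pause
    return 0.2 * max(0.0, 1.0 - (s - 0.6) / 0.4)      # final ramp to zero

assert T_ANNEAL < T2  # the schedule fits inside the coherence time

fields = [transverse_field(t) for t in np.linspace(0.0, T_ANNEAL, 101)]
assert fields[0] == 1.0 and abs(fields[-1]) < 1e-9
```

Per-qubit anneal offsets, reverse anneals, and so on would correspond to giving each qubit its own such function.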

I thought maybe dissipation to a bath was sometimes considered helpful to how quantum annealers work, when operating in the non-adiabatic regime (as it often will be when working on NP-hard problems, because we're interested in obtaining answers to problems despite the eigenvalue gap possibly being very small). Is dissipation not potentially helpful then?

I consulted with S. Mandra, and he pointed me to a few papers by P. Love and M. Amin which show that certain baths can speed up quantum annealing, and that thermalization can help find the ground state faster.

arXiv:cond-mat/0609332

I think that maybe if we can clear up the confusion about the annealing schedules, and whether or not the transition has to be along a linear interpolation between two Hamiltonians (as opposed to a more complicated trajectory), it might be helpful to condense the answer to something a bit more concise.

$A(t)$ and $B(t)$ don't necessarily have to be linear or even monotonic. In a recent presentation, D-Wave showed the advantages of pausing the annealing schedule and of backward anneals.

D-Wave — Future Hardware Directions of Quantum Annealing (PDF)

Feel free to condense these responses however you'd like. Thanks.

I don't think I agree with your bolded statement "The only relation classical simulated annealing has with quantum annealing is they both have annealing in the name." I think the analogy is tighter than that. By Wick rotating the time parameter in the quantum annealing problem to imaginary time, can't you map it to a classical annealing problem, where the total runtime $T$ of the quantum annealing problem maps to the inverse temperature $\beta$ of the classical annealing problem? – tparker – 2020-06-20T21:16:36.913

So "slow time evolution" in quantum annealing is indeed formally equivalent to "low temperature" in classical annealing, albeit referring to the temperature of a hypothetical different system, rather than to the physical temperature of the actual quantum annealer (which in principle is always held at zero). – tparker – 2020-06-20T21:18:15.243
