24

9

Recently, a series of research papers has been released (this, this and this, also this) providing classical algorithms with the same runtime as quantum machine learning algorithms for the same purpose. From my understanding, the key to all the algorithms that have been *dequantized* is the same: the quantum state preparation routine has been replaced with $\ell^2$-norm sampling.
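For concreteness, $\ell^2$-norm sampling means drawing an index $i$ with probability $|v_i|^2/\lVert v\rVert^2$ — the same distribution one would get by measuring the amplitude-encoded state $|v\rangle$ in the computational basis. A minimal sketch of this (my own illustration, assuming `numpy`; `l2_sample` is a hypothetical helper name):

```python
import numpy as np

def l2_sample(v, size, rng):
    """Draw indices i with probability |v_i|^2 / ||v||^2 (l2-norm sampling)."""
    p = np.abs(v) ** 2
    p = p / p.sum()
    return rng.choice(len(v), size=size, p=p)

rng = np.random.default_rng(42)
v = np.array([3.0, 4.0])  # ||v||^2 = 25, so P(i = 1) = 16/25 = 0.64
idx = l2_sample(v, size=100_000, rng=rng)
freq1 = np.mean(idx == 1)  # empirical frequency of index 1, close to 0.64
```

This matches the measurement statistics of the two-amplitude state $\frac{1}{5}(3|0\rangle + 4|1\rangle)$.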

A few questions arise: after these results, what are the expectations for the field? Since any quantum algorithm will need state preparation at some point (at some point every algorithm has to encode some classical data into a quantum state), can any quantum machine learning algorithm be dequantized using Tang's method? If not, which algorithms are expected to resist, or do resist, this dequantization?
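As a toy example of why such sample access can substitute for state preparation (this is an editorial illustration of the general idea, not any specific paper's algorithm): given $\ell^2$-sample access to $v$ and query access to the entries of $u$, one can classically estimate $\langle u, v\rangle$ to additive error $\varepsilon\lVert u\rVert\lVert v\rVert$ with $O(1/\varepsilon^2)$ samples, roughly the precision a swap-test-style quantum estimate would give. A sketch assuming `numpy` (`estimate_inner_product` is a hypothetical helper name):

```python
import numpy as np

def estimate_inner_product(u, v, n_samples, rng):
    """Estimate <u, v> from l2-norm samples of v plus entry queries to u.

    Sample i with probability v_i^2 / ||v||^2 and average u_i * ||v||^2 / v_i.
    The estimator is unbiased, and its variance is at most ||u||^2 * ||v||^2,
    so O(1/eps^2) samples give additive error eps * ||u|| * ||v||.
    """
    sq_norm = float(np.dot(v, v))
    p = v ** 2 / sq_norm
    idx = rng.choice(len(v), size=n_samples, p=p)
    return float(np.mean(u[idx] / v[idx]) * sq_norm)

rng = np.random.default_rng(0)
u = rng.standard_normal(1000)
v = rng.standard_normal(1000)
est = estimate_inner_product(u, v, n_samples=200_000, rng=rng)
# est lies within a few multiples of ||u|| * ||v|| / sqrt(n_samples)
# of the exact value np.dot(u, v)
```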

@CarloBeenakker thanks for the great link! Can you elaborate on the FT application? This paper postulates the equivalence between deep NNs and large-dimensional linear operations. Would that also apply here? – fr_andres – 2019-08-07T19:06:56.127

Are you asking about all quantum algorithms or just quantum machine learning algorithms? – Condo – 2020-07-07T18:49:50.780

@Condo my interest is specifically in quantum machine learning algorithms, since not every quantum algorithm is expected to be dequantizable (take, for instance, Grover search, which is one of the paradigmatic ones) – Alex – 2020-07-08T11:49:46.240

3

One algorithm that may be immune to dequantization is the quantum algorithm for topological and geometric analysis of big data (arXiv, Nature) by Lloyd, Garnerone, and Zanardi.

– Mark S – 2020-08-23T22:19:22.287

@MarkS perhaps this is true; however, the algorithm you mention has substantial caveats, outlined at the end of https://www.nature.com/articles/nphys3272, so I don't think it should be chosen as a flagship of hope... – Condo – 2020-09-04T16:11:58.140

1

Note that there is at least one possible application of QML where the assumption that we begin by encoding classical data in a quantum state does not hold: using QML to process quantum rather than classical data. Many types of sensors exploit quantum effects; rather than measuring such a sensor, one could coherently couple it to a quantum computer and perform tasks such as state classification. – Adam Zalcman – 2020-12-17T20:19:03.123

7

In the case of HHL, the classical solution by E. Tang works for low-rank matrices, whereas the original quantum algorithm was for sparse ones, so at least in this case not quite the same problem is being solved. That said, this blog post by E. Tang explaining her technique might be of interest.

– glS – 2019-02-19T19:35:43.850

4

The expectation, voiced for example in arXiv:1811.04909, is that exponential quantum speed-ups are tightly related to problems where high-rank matrices play a crucial role, as in Hamiltonian simulation (quantum chemistry) or the Fourier transform (factorization).

– Carlo Beenakker – 2019-03-31T07:15:02.353
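As a quick numerical illustration of the high-rank point above (an editorial aside, assuming `numpy`): the discrete Fourier transform matrix is unitary, hence full rank, so it offers no low-rank structure for $\ell^2$-sampling-based methods to exploit:

```python
import numpy as np

# n x n unitary DFT matrix: F[j, k] = exp(-2*pi*i*j*k/n) / sqrt(n)
n = 8
j, k = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
F = np.exp(-2j * np.pi * j * k / n) / np.sqrt(n)

rank = np.linalg.matrix_rank(F)                 # full rank: equals n
unitary = bool(np.allclose(F.conj().T @ F, np.eye(n)))  # unitarity check
```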