Quantum machine learning after Ewin Tang


Recently, a series of research papers (this, this and this, also this) have provided classical algorithms whose runtimes are comparable, up to polynomial overhead, to those of quantum machine learning algorithms for the same tasks. From my understanding, the key to all the algorithms that have been dequantized is the same: the quantum state preparation routine has been replaced by $\ell^2$-norm sampling.
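To make the question concrete, here is a minimal sketch (my own illustration, not from any of the papers) of the $\ell^2$-norm sampling primitive: given sampling access to a vector $v$, draw index $i$ with probability $|v_i|^2 / \|v\|^2$, which mimics the outcome distribution of measuring the state $|v\rangle$ in the computational basis.

```python
import numpy as np

# Hypothetical sketch of l2-norm (importance) sampling from a vector v:
# index i is drawn with probability |v_i|^2 / ||v||^2. This classical
# sampling access is the stand-in for quantum state preparation in
# Tang-style dequantized algorithms.

_rng = np.random.default_rng(0)

def l2_sample(v, size=1, rng=_rng):
    """Draw indices i with probability |v_i|^2 / ||v||^2."""
    p = np.abs(np.asarray(v, dtype=float)) ** 2
    p /= p.sum()
    return rng.choice(len(v), size=size, p=p)

v = [3.0, 0.0, 4.0]              # ||v||^2 = 25
samples = l2_sample(v, size=10000)
# index 0 is drawn with probability 9/25, index 2 with 16/25,
# and index 1 (zero amplitude) is never drawn
```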

A few questions arise: after these results, what are the expectations for the field? Since any quantum algorithm that processes classical data must at some point encode that data in a quantum state, can every quantum machine learning algorithm be dequantized using Tang's method? If not, which algorithms are expected to resist, or do resist, this dequantization?

Alex

Posted 2019-02-19T14:21:14.933

Reputation: 351

@CarloBeenakker Thanks for the great link! Can you elaborate on the Fourier-transform application? This paper postulates an equivalence between deep NNs and high-dimensional linear operations. Would that also apply here?

– fr_andres – 2019-08-07T19:06:56.127

Are you asking about all quantum algorithms or just quantum machine learning algorithms? – Condo – 2020-07-07T18:49:50.780

@Condo my interest is specifically in quantum machine learning algorithms, since it is not expected that every quantum algorithm (take, for instance, Grover search, one of the paradigmatic ones) can be dequantized – Alex – 2020-07-08T11:49:46.240


One algorithm that may be immune to dequantization is the quantum algorithm for topological and geometric analysis of big data (arXiv, Nature) by Lloyd, Garnerone, and Zanardi.

– Mark S – 2020-08-23T22:19:22.287

@MarkS perhaps this is true; however, the algorithm you mention has substantial caveats, outlined at the end of https://www.nature.com/articles/nphys3272, so I don't think it should be held up as a flagship of hope...

– Condo – 2020-09-04T16:11:58.140

Note that there is at least one possible application of QML where the assumption that we begin by encoding classical data in a quantum state does not hold: using QML to process quantum rather than classical data. Many types of sensors exploit quantum effects. Rather than measuring such a sensor, one could coherently couple it to a quantum computer and perform tasks such as state classification. – Adam Zalcman – 2020-12-17T20:19:03.123


In the case of HHL, the classical solution by E. Tang works for low-rank matrices, while the original algorithm was for sparse ones, so at least in this case not quite the same problem is being solved. That said, this blog post by E. Tang explaining her technique might be of interest.

– glS – 2019-02-19T19:35:43.850


The expectation, voiced for example in arXiv:1811.04909, is that exponential quantum speed-ups are tightly related to problems in which high-rank matrices play a crucial role, such as Hamiltonian simulation (quantum chemistry) or the Fourier transform (factorization).

– Carlo Beenakker – 2019-03-31T07:15:02.353
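To illustrate the rank distinction in that comment (my own sketch, not from the comment itself): the discrete Fourier transform corresponds to a full-rank unitary matrix, so low-rank sampling arguments of the kind used in Tang-style dequantization do not cover it.

```python
import numpy as np

# Illustrative check: the n x n DFT matrix is unitary and hence full
# rank, with all singular values equal to 1. Dequantization results
# rely on a good low-rank approximation existing, which is impossible
# here: truncating a unitary to rank k < n discards n - k unit
# singular values.
n = 8
F = np.fft.fft(np.eye(n)) / np.sqrt(n)   # unitary DFT matrix
rank = np.linalg.matrix_rank(F)
sing_vals = np.linalg.svd(F, compute_uv=False)
# rank == n, and sing_vals are all (numerically) equal to 1
```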

Answers


I am not an expert in the field but there are a few points that I am aware of:

  1. There are proofs that certain quantum machine learning algorithms cannot be efficiently simulated on a classical computer, even if the classical computer has sampling access to the data analogous to the quantum algorithm's (i.e., they cannot be dequantized) [1-3]. However, there is no proof that these algorithms are better at learning to classify particular datasets than the best classical algorithms available.
  2. There is one paper proving that a certain quantum machine learning algorithm cannot be simulated efficiently on a classical computer and can learn a specific classification task exponentially faster than any classical algorithm [4]. However, it is important to note that the algorithm and the specific classification task in the paper are very contrived, so the task is not practically useful. In short, they use an SVM with a very particular quantum kernel on a dataset generated from the discrete logarithm problem. Nonetheless, it is a really neat proof-of-concept result and gives us some hope that quantum computing can bring a meaningful advantage to machine learning.
  3. As of today, there is no example where we know that a quantum machine learning algorithm outperforms a classical one on a meaningful task, although a recent paper has suggested a methodology for assessing the potential for quantum advantage in prediction on learning tasks [5]. Such separation proofs are also few and far between in classical machine learning, so given the above two points, people are hopeful that by the time quantum computers are powerful enough to run QML algorithms, the field will have advanced enough that we are likely to find use cases with a genuine advantage.

[1] https://www.nature.com/articles/s41586-019-0980-2
[2] https://journals.aps.org/prl/abstract/10.1103/PhysRevLett.122.040504
[3] https://advances.sciencemag.org/content/4/12/eaat9004
[4] https://arxiv.org/abs/2010.02174
[5] https://arxiv.org/abs/2011.01938

Rajiv Krishnakumar

Posted 2019-02-19T14:21:14.933

Reputation: 381