Are there quantum algorithms that demonstrate a speedup for computing classical neural networks (as of 2021)?



It seems like there are a number of different speed-ups for different machine learning algorithms:

[Image: table of quantum speedups for various machine-learning algorithms]

But has anyone created an algorithm showing a speedup for neural networks? A similar question was asked a few years ago, but the answer is not so clear. It seems to have been an open problem as of 2014, and I am interested in knowing whether there have been any recent developments that demonstrate a speedup.

Steven Sagona

Posted 2021-01-02T22:21:50.193

Reputation: 723



It seems clear that any speedup for neural networks will be at most polynomial, since they don't rely on Fourier transforms or other subroutines that we typically associate with exponential quantum speedups. But if we look at quantities related to training, such as sample complexity and learnability, we may find useful advantages in practice. I shared a link at the end of my answer to a related question that may be of interest.

Of course in terms of implementation and hardware, classical deep learning is getting pretty advanced. It may take a long while for any quantum advantages to be realized in practice.


Posted 2021-01-02T22:21:50.193

Reputation: 663

Thanks for your answer. One thing I want to point out is that I specifically asked about speeding up "classical neural networks." Am I incorrect in thinking that quantum neural networks are pretty fundamentally different? I thought that QNNs are really useful for a different set of problems (such as machine learning within a quantum algorithm), and do not just "speed up classical neural nets". – Steven Sagona – 2021-02-14T20:53:34.347

Yes, I would say in general QNNs are different. They're also poorly defined, though. For example, in much of the QML literature, variational approaches are viewed more generally as QNNs, with the notion that the 'neural network' comes from using gradient descent to optimize the parameters of a function (i.e. the variational circuit). But in those cases, there is no 'neuron' or activation function in the traditional sense. That said, there have been attempts to make 'true' QNNs, ones that are the quantum versions of a classical neural net. – Greenstick – 2021-02-14T21:04:38.713

The paper I reference refers to the former (i.e. the variational circuits). That said, the notion of a speedup for variational circuits is hard to define under the quantum query model. Fundamentally this has to do with variational algorithms being heuristic; if an exponential speedup is realized by a variational circuit for, say, simulating the ground state of a molecule, it's a safe bet to assume that the provenance of that exponential speedup will be due to the amenability of the problem to quantum information, not the development of a clever heuristic based on gradient descent. – Greenstick – 2021-02-14T21:09:19.177
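To make the variational-circuit picture from the comments above concrete, here is a minimal single-qubit sketch in plain NumPy. It is a toy illustration, not anyone's actual method: the circuit is just an RY(θ) rotation, the cost is the ⟨Z⟩ expectation value, and the exact gradient comes from the parameter-shift rule. The function names are hypothetical; a real experiment would use a framework such as PennyLane or Qiskit.

```python
import numpy as np

# Toy "variational quantum circuit": state = RY(theta)|0>,
# cost = <Z> expectation value. For this circuit the expectation
# has a closed form: <Z> = cos(theta).

def expectation_z(theta):
    # RY(theta)|0> = [cos(theta/2), sin(theta/2)]
    # <Z> = cos^2(theta/2) - sin^2(theta/2) = cos(theta)
    return np.cos(theta)

def parameter_shift_grad(theta, shift=np.pi / 2):
    # Parameter-shift rule: exact gradient from two circuit
    # evaluations, d<Z>/dtheta = (f(theta+s) - f(theta-s)) / 2
    return (expectation_z(theta + shift)
            - expectation_z(theta - shift)) / 2

theta = 0.1           # initial variational parameter
lr = 0.4              # learning rate
for _ in range(100):  # plain gradient descent, as in classical NN training
    theta -= lr * parameter_shift_grad(theta)

# Minimizing <Z> drives the qubit toward |1>, i.e. theta -> pi
print(round(expectation_z(theta), 3))
```

The "neural network" here is only the loop: gradient descent over circuit parameters. There is no neuron or activation function, which is exactly the distinction drawn in the comment above.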