In general, the efficiency of quantum machine learning techniques will be calibrated and measured less in terms of speed and learning rate than in terms of energy efficiency, the ability to handle complex computational problems (including NP-hard ones), and the ability to ensemble algorithms from different domains. That said, there are exceptionally fast quantum algorithms for specific classes of computational problems.

Quantum backpropagation can be both more energy efficient and faster if the right combination of quantum algorithms is used. It depends on how efficiently the output neurons' states converge to the quantum correlations employed in the feedforward network, and on what kind of control-system algorithms are used for backpropagation. An efficient controlled-NOT (CNOT) gate with low decoherence and an optimized topology is useful for this purpose.
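As a concrete illustration of the role the CNOT gate plays here, the following NumPy sketch (a minimal classical simulation of my own, not tied to any particular hardware) shows the standard CNOT unitary turning a separable two-qubit state into a maximally entangled Bell state, which is the basic correlating operation between layers:

```python
import numpy as np

# CNOT unitary in the computational basis |00>, |01>, |10>, |11>,
# with the first qubit as control and the second as target.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# Control qubit in the equal superposition (|0> + |1>)/sqrt(2), target in |0>,
# so the joint state is (|00> + |10>)/sqrt(2) -- still separable.
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)
zero = np.array([1, 0], dtype=complex)
state = np.kron(plus, zero)

# Applying CNOT entangles the pair into the Bell state (|00> + |11>)/sqrt(2).
bell = CNOT @ state
print(np.round(bell, 3))
```

In a real device, the quality of exactly this kind of entangling step (its fidelity under decoherence) is what limits how well correlations between input and output layers can be built up.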

One interesting approach to quantum backpropagation is to implement a form of quantum adaptive error correction: for a feedforward network, the input layer is conditionally transformed so that it exhibits the firing patterns that solve a given computational problem.

In this approach, quantum backpropagation dynamics are integrated into a two-stage neural cognition scheme. First there is a feedforward learning stage, in which the output neurons' states, initially separable from the input neurons' states, converge during a neural processing time to states correlated with the input layer. Then there is a backpropagation stage, in which the output neurons act as a control system that triggers different quantum circuits implemented on the input neurons, conditionally transforming their state in such a way that a given computational problem is solved.
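A minimal classical simulation of this two-stage scheme might look as follows (a toy one-input, one-output sketch of my own, not code from the paper): the feedforward stage correlates the two layers with a CNOT, and the backpropagation stage uses the measured output to conditionally correct the input so that it always ends in the "firing" state |1⟩, regardless of the initial superposition:

```python
import numpy as np

rng = np.random.default_rng(0)

X = np.array([[0, 1], [1, 0]], dtype=complex)      # Pauli-X (bit flip)
CNOT = np.array([[1, 0, 0, 0],                     # qubit 0 = input neuron,
                 [0, 1, 0, 0],                     # qubit 1 = output neuron
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# Input neuron in an arbitrary superposition a|0> + b|1>, output neuron in |0>.
a, b = 0.6, 0.8
state = np.kron(np.array([a, b], dtype=complex), np.array([1, 0], dtype=complex))

# Stage 1 (feedforward): CNOT correlates the layers: a|00> + b|11>.
state = CNOT @ state

# Stage 2 (backpropagation): measure the output neuron; the outcome controls
# a corrective X circuit on the input neuron.
p_out1 = abs(state[3]) ** 2                        # P(output = 1) = |b|^2
outcome = 1 if rng.random() < p_out1 else 0
if outcome == 0:
    input_state = np.array([1, 0], dtype=complex)  # collapsed branch: input |0>
    input_state = X @ input_state                  # conditional correction
else:
    input_state = np.array([0, 1], dtype=complex)  # collapsed branch: input |1>

# Either branch leaves the input neuron firing (|1>) with certainty.
assert np.allclose(abs(input_state[1]) ** 2, 1.0)
```

The design point is that the correction applied to the input layer is conditioned on the output layer, which is exactly the "output neurons as a control system" idea above.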

The following research paper analyzes quantum backpropagation dynamics in depth through a Hamiltonian framework. It introduces a Hamiltonian framework for quantum neural machine learning with basic feedforward neural networks, integrating quantum measurement theory and dividing the quantum neural dynamics into a learning stage and a backpropagation stage, and then applies the framework to two example problems:

The firing-pattern selection problem, where the neural network places the input layer in a specific, well-defined firing configuration, starting from an arbitrary superposition of neural firing patterns.

The n-to-m Boolean function representation problem, where the goal is for the network to correct the input layer so that it represents an arbitrary n-to-m Boolean function.
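For intuition on the second problem: an n-to-m Boolean function f can be represented on a quantum register by the standard oracle unitary U_f|x⟩|y⟩ = |x⟩|y ⊕ f(x)⟩. The helper below (an illustrative construction of mine, not taken from the paper) builds this permutation matrix and checks it for a 2-to-1 AND function:

```python
import numpy as np

def boolean_oracle(f, n, m):
    """Permutation unitary U_f |x>|y> = |x>|y XOR f(x)> for an n-to-m
    Boolean function f, given as a map from n-bit to m-bit integers."""
    dim = 2 ** (n + m)
    U = np.zeros((dim, dim))
    for x in range(2 ** n):
        for y in range(2 ** m):
            # Column index encodes |x>|y>, row index encodes |x>|y XOR f(x)>.
            U[(x << m) | (y ^ f(x)), (x << m) | y] = 1
    return U

# Example: the 2-to-1 AND function f(x1, x0) = x1 AND x0.
AND = lambda x: (x >> 1) & (x & 1)
U = boolean_oracle(AND, 2, 1)

# U is unitary (here a real permutation matrix)...
assert np.allclose(U @ U.T, np.eye(8))

# ...and maps |x>|0> to |x>|f(x)>, e.g. |11>|0> -> |11>|1>.
basis_110 = np.zeros(8)
basis_110[0b110] = 1
assert np.argmax(U @ basis_110) == 0b111
```

Because U_f is a permutation of basis states, it is reversible and can in principle be compiled into Toffoli/CNOT circuits acting on the input and output registers.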

There is also an experimental implementation of a quantum backpropagation algorithm for a multilayer-perceptron-based artificial neural network, as outlined here.

Are you asking in general or are you asking in the context of PennyLane itself? – cnada – 2018-11-14T12:39:38.483

I'm more interested about quantum backpropagation in general. – asmaier – 2018-11-14T15:40:45.107

Well, at the moment it is only somewhat investigated; people are mostly looking first at a quantum version of gradient descent, where we try to find a complexity that is better than the classical version. – cnada – 2018-11-14T15:46:14.630

Maybe an interesting follow-up on cnada's comment: quantum improvements of gradient descent algorithms are for example covered in this paper: https://arxiv.org/abs/1711.00465

– arriopolis – 2018-11-14T19:57:10.280
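To make the gradient-descent idea from the comments concrete, here is a classical sketch of the parameter-shift rule on a one-parameter circuit (my own toy example, assuming a single RY(θ) rotation on |0⟩ with cost ⟨Z⟩ = cos θ; it illustrates the gradient technique, not the complexity improvements studied in the linked paper):

```python
import math

# Cost: expectation <Z> after RY(theta)|0>, which is cos(theta) in closed form.
# On hardware this would be estimated from measurement samples.
def cost(theta):
    return math.cos(theta)

# Parameter-shift rule: the exact gradient from two shifted circuit
# evaluations, with no finite-difference truncation error.
def grad(theta):
    return 0.5 * (cost(theta + math.pi / 2) - cost(theta - math.pi / 2))

theta, lr = 0.1, 0.4
for _ in range(100):
    theta -= lr * grad(theta)

# Gradient descent drives theta toward pi, where <Z> = -1 is minimal.
print(round(cost(theta), 4))   # -1.0
```

This two-evaluation gradient is the standard primitive behind variational optimizers in frameworks such as PennyLane.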