How is back-propagation done through the classical weights feeding into the quantum unitaries?

In this particular case, the gradient of the quantum variational circuit is computed using the parameter-shift rule. The parameter-shift rule allows us to compute the gradient by evaluating the same variational circuit at shifted parameter values and taking a linear combination of the results, so it works on both quantum simulators and hardware.

For example, given a variational quantum circuit $U(\boldsymbol \theta)$
and some measured observable $\hat{B}$, the derivative of the expectation value

$$\langle \hat{B} \rangle (\boldsymbol\theta) =
\langle 0 \mid U(\boldsymbol\theta)^\dagger \hat{B} U(\boldsymbol\theta) \mid 0\rangle$$

with respect to the input circuit parameters $\boldsymbol{\theta}$ is given by

$$\nabla_{\theta_i}\langle \hat{B} \rangle(\boldsymbol\theta)
= \frac{1}{2}
\left[
\langle \hat{B} \rangle\left(\boldsymbol\theta + \frac{\pi}{2}\hat{\mathbf{e}}_i\right)
- \langle \hat{B} \rangle\left(\boldsymbol\theta - \frac{\pi}{2}\hat{\mathbf{e}}_i\right)
\right].$$

Thus, the gradient of the expectation value can be calculated by evaluating the same variational
quantum circuit, but with shifted parameter values.
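As a minimal sketch of the rule above, consider a toy single-qubit circuit consisting of one $RX(\theta)$ rotation on $\mid 0\rangle$ with $\hat{B} = Z$, whose expectation value is $\cos\theta$ in closed form (so no quantum simulator is needed; the circuit and its closed form are illustrative assumptions, not from the paper):

```python
import numpy as np

# Toy circuit: RX(theta) applied to |0>, measuring Pauli-Z.
# Its expectation value is <Z>(theta) = cos(theta), written in
# closed form here so the example runs without a simulator.
def expval(theta):
    return np.cos(theta)

def parameter_shift_grad(theta, s=np.pi / 2):
    # Two evaluations of the SAME circuit at shifted parameters,
    # combined exactly as in the formula above.
    return 0.5 * (expval(theta + s) - expval(theta - s))

theta = 0.3
grad = parameter_shift_grad(theta)
# Agrees with the analytic derivative d/dtheta cos(theta) = -sin(theta).
assert np.isclose(grad, -np.sin(theta))
```

Note that the shifted evaluations give the *exact* gradient here, not a finite-difference approximation.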

To integrate the quantum circuit into the larger classical ML model, we use PennyLane, which allows you to create an arbitrary hybrid classical-quantum ML model that supports backpropagation using TensorFlow, PyTorch, or Autograd.
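To make the end-to-end backpropagation concrete, here is a hedged NumPy sketch (not PennyLane's actual internals) of how a classical weight feeding a gate angle is differentiated: the framework composes the quantum gradient from the parameter-shift rule with the classical gradient via the ordinary chain rule. The circuit and variable names are illustrative assumptions:

```python
import numpy as np

# A classical weight w produces the gate angle theta = w * x.
# Backprop through the hybrid model composes:
#   d<B>/dw = (d<B>/dtheta) * (dtheta/dw)
# where the quantum factor comes from the parameter-shift rule
# and the classical factor from ordinary autodiff.
def expval(theta):
    return np.cos(theta)  # toy <Z>(theta) for a single RX rotation

def quantum_grad(theta, s=np.pi / 2):
    return 0.5 * (expval(theta + s) - expval(theta - s))

x, w = 0.7, 1.2                    # classical input and weight
theta = w * x                      # classical pre-processing layer
grad_w = quantum_grad(theta) * x   # chain rule: dtheta/dw = x

# Matches the analytic gradient of cos(w * x) with respect to w.
assert np.isclose(grad_w, -np.sin(w * x) * x)
```

In the real hybrid model, PyTorch (or TensorFlow/Autograd) supplies the classical factors automatically, and PennyLane supplies the quantum factors via parameter shifts.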

In this particular case, two good resources are:

The Quantum transfer learning tutorial. This is a self-contained tutorial that follows the general structure of the PyTorch transfer learning tutorial, with the difference that a quantum circuit performs the final classification task.

The XanaduAI/quantum-transfer-learning GitHub repo, which contains the code for the numerical experiments in the paper.

How do they even calculate cross-entropy loss?

In this case, since PennyLane lets us use PyTorch to drive the overall model, we simply use the `nn.CrossEntropyLoss()` class provided by PyTorch.
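For reference, here is a NumPy sketch of what `nn.CrossEntropyLoss` computes with its default settings (the function name and example values here are illustrative, not from the repo): it takes raw logits (no softmax applied beforehand) and integer class labels, applies log-softmax, and averages the negative log-likelihood of the correct class:

```python
import numpy as np

def cross_entropy(logits, targets):
    # Numerically stable log-softmax over the class axis.
    z = logits - logits.max(axis=1, keepdims=True)
    log_probs = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
    # Mean negative log-likelihood of the target class per sample.
    return -log_probs[np.arange(len(targets)), targets].mean()

# Two samples, two classes; targets are integer class indices.
logits = np.array([[2.0, 0.5],
                   [0.1, 1.5]])
targets = np.array([0, 1])
loss = cross_entropy(logits, targets)
```

The key point is that the model's output layer produces raw logits; the softmax is folded into the loss itself, which is why no explicit softmax appears in the classifier.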
