Can quantum computing speed up Bayesian learning?

8

One of the biggest drawbacks of Bayesian learning compared to deep learning is runtime: applying Bayes' theorem requires knowledge of how the data are distributed, and this usually means either evaluating expensive integrals or resorting to some sampling mechanism (with the corresponding drawbacks).

Since at the end of the day it is all about propagating distributions, and this is (as far as I understand) the nature of quantum computing, is there a way to perform these computations efficiently? If so, what limitations apply?
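
To make the runtime issue concrete, here is a minimal sketch (a toy one-dimensional example of my own, in Python/numpy) of the two classical options I mean, quadrature and sampling; the first scales exponentially with dimension and the second can be slow to mix:

```python
# Toy example: posterior mean of a Gaussian mean parameter theta.
# The evidence integral p(D) = int p(D|theta) p(theta) dtheta is what
# makes exact Bayes expensive; quadrature and MCMC are the workarounds.
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(loc=1.5, scale=1.0, size=50)   # observed data D

def log_likelihood(theta):
    return -0.5 * np.sum((data - theta) ** 2)    # Gaussian, sigma = 1

def log_prior(theta):
    return -0.5 * theta ** 2                     # standard normal prior

# Option 1: quadrature -- cost blows up exponentially with dimension.
grid = np.linspace(-5, 5, 10_000)
log_post = np.array([log_likelihood(t) + log_prior(t) for t in grid])
weights = np.exp(log_post - log_post.max())      # unnormalised posterior
post_mean_quad = np.sum(grid * weights) / np.sum(weights)

# Option 2: Metropolis sampling -- dimension-friendly, but slow to mix.
theta, samples = 0.0, []
for _ in range(20_000):
    prop = theta + rng.normal(scale=0.5)
    log_accept = (log_likelihood(prop) + log_prior(prop)
                  - log_likelihood(theta) - log_prior(theta))
    if np.log(rng.uniform()) < log_accept:
        theta = prop
    samples.append(theta)
post_mean_mcmc = np.mean(samples[5_000:])        # drop burn-in

print(post_mean_quad, post_mean_mcmc)            # both ~ 50/51 * mean(data)
```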

Edit (directly related links):

fr_andres

Posted 2018-03-30T04:12:01.383

Reputation: 694

There hasn't been a lot of work on this (that I know of). For Bayesian networks there is 1404.0055, in which the authors use a variation of Grover search to obtain a quadratic speed-up. On the related topic of Markov models there are also a few things; see the references on the wiki and 1611.08104. I am not qualified enough to build an answer out of these, though.

– glS – 2018-03-31T08:02:59.150
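
To give a feeling for where the quadratic speed-up in 1404.0055 comes from: the classical baseline is rejection sampling on the network, whose expected cost grows as 1/P(evidence), whereas the paper's amplitude-amplification variant brings this down to roughly 1/sqrt(P(evidence)). A minimal sketch of the classical baseline (a toy two-node network of my own, not code from the paper):

```python
# Classical rejection sampling on a toy Bayesian network Rain -> WetGrass,
# estimating P(Rain = 1 | WetGrass = 1).  The `tries` counter is the
# quantity the quantum variant in 1404.0055 quadratically reduces.
import numpy as np

rng = np.random.default_rng(1)
P_RAIN = 0.2                       # P(Rain = 1)
P_WET = {0: 0.1, 1: 0.9}           # P(WetGrass = 1 | Rain)

accepted, rain_count, tries = 0, 0, 0
while accepted < 1_000:            # want 1000 samples matching the evidence
    tries += 1
    rain = rng.uniform() < P_RAIN              # sample from the prior
    wet = rng.uniform() < P_WET[int(rain)]
    if wet:                                    # evidence: WetGrass = 1
        accepted += 1
        rain_count += int(rain)

print(rain_count / accepted)       # ~ P(Rain=1 | Wet=1) = 0.18/0.26 ~ 0.69
print(tries)                       # ~ accepted / P(Wet=1) = 1000 / 0.26
```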

@glS Just wanted to tell you about HC's answer; it looks really interesting (in case you didn't know about that paper). Thanks a lot for your references and brief explanations too; if you want to elaborate them into an answer I will be glad to upvote it – fr_andres – 2018-04-01T21:46:26.500

Answers

4

Gaussian processes are a key component of the model-building procedure at the core of Bayesian optimization. Therefore, speeding up the training of Gaussian processes directly enhances Bayesian optimization. The recent paper by Zhao et al. on quantum algorithms for training Gaussian processes does exactly this.
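
To see where a speed-up would bite: exact GP regression reduces to solving a linear system against the n × n kernel matrix, which costs O(n³) classically. A minimal sketch of that bottleneck (toy data of my own choosing, not code from the paper):

```python
# Exact GP regression on toy data.  The Cholesky factorisation of the
# n x n kernel matrix is the O(n^3) classical bottleneck that a quantum
# linear-systems approach targets.
import numpy as np

rng = np.random.default_rng(2)
X = rng.uniform(-3, 3, size=40)                    # training inputs
y = np.sin(X) + 0.1 * rng.normal(size=40)          # noisy observations

def rbf(a, b, lengthscale=1.0):
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / lengthscale ** 2)

K = rbf(X, X) + 1e-2 * np.eye(len(X))              # kernel + noise term
L = np.linalg.cholesky(K)                          # O(n^3): the bottleneck
alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))

Xs = np.linspace(-3, 3, 5)                         # test inputs
mean = rbf(Xs, X) @ alpha                          # posterior mean
print(mean)                                        # ~ sin(Xs)
```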

hopefully coherent

Posted 2018-03-30T04:12:01.383

Reputation: 607

Just to complement your answer: the same authors have recently published a new paper in which they use quantum training of Gaussian processes to train deep learning architectures, providing (theoretical) speedups with respect to classical training.

– Alex – 2018-07-26T12:58:14.567