A quantum computer can be modeled as a single unitary transformation of a (large) effective quantum state into another. To keep errors under control, quantum error correction is assumed. A logical qubit lives in a "logical heat bath" whose effective entropy and temperature must be far below those of the ambient world in order for the error rate of the computation to be below unity.

Thus it should be possible to compute the energy cost of any quantum computation from the entropy difference between the required effective "logical heat bath" (at temperature $T_B$) and the ambient environment (at temperature $T_A$). Since the entropy change scales as $$dS = \frac{dQ}{T_A} - \frac{dQ}{T_B},$$ this would imply that an arbitrarily long/large calculation requires a zero-temperature bath and an infinite amount of energy, in apparent contradiction with the quantum threshold theorem. Such an energy cost would apply to classical calculations as well.
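For scale, here is a rough numeric sketch of the entropy-to-energy conversion this argument relies on, assuming only the Landauer bound $E \ge k_B T b \ln 2$ for erasing $b$ bits at ambient temperature $T$ (the function name and the choice of $10^{12}$ bits are illustrative, not from the argument above):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K


def landauer_energy(bits, temperature_kelvin):
    """Minimum energy (J) to erase `bits` bits at the given temperature,
    per the Landauer bound E = k_B * T * b * ln(2)."""
    return k_B * temperature_kelvin * bits * math.log(2)


# Erasing 10^12 bits at room temperature (300 K):
E = landauer_energy(1e12, 300.0)  # ≈ 2.9e-9 J, i.e. only a few nanojoules
```

Note that this bound is linear in the number of erased bits, which is part of why the exponential cost suggested above would be surprising.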

Is there an error in my reasoning? If not, has anyone published actual calculations of the energy required to perform a computation of a given size, or a formula for the entropy of the heat bath needed to satisfy the threshold theorem? This would seem to imply that cryptographically relevant computations such as Shor's algorithm will never be practical on energy grounds: the required precision on the logical state is exponential in the number of bits, and the corresponding requirement on the heat bath should be as well.

Can you elaborate on "logical heat bath which must have an effective entropy and temperature far below that of the ambient world"? It seems like your reasoning would also apply to classical computers, where the "state" of the bits in a computer is also exponential in the number of bits, yet the energy costs of error correction are manageable. As I understand it, most quantum error correction tries to create energy barriers between logical states that scale exponentially with the size of the code -- does that fit with what you're saying and/or answer your question? – Sam Jaques – 2020-07-08T11:18:54.230

Yes, the logic applies to classical computers, as I mentioned. The basic idea is to transform the physical qubit + heat bath into the logical qubit + heat bath. Since a heat bath is uncorrelated, to first order any transformation on it leaves the heat bath invariant. But an error-correcting transformation must lower the entropy of the heat bath to maintain the coherence of the logical qubit. As entropy corresponds directly to an energy quantity, this represents a lower bound on the energy required for the quantum computation. You can do this directly on the physical qubit, e.g. in atom traps at nK temperatures. – Bob McElrath – 2020-07-08T15:03:18.803

So no, @SamJaques, this doesn't answer my question: "energy barriers that scale exponentially with the size of the code" would imply an exponential decrease in entropy, and an exponential amount of energy to create the state. – Bob McElrath – 2020-07-08T15:04:48.470

My mistake, energy barriers scale linearly (https://arxiv.org/abs/1411.6643, III(c)). That paper shows that the transition probabilities decrease exponentially with the size of the energy barrier, though. Since the energy cost of erasing $b$ bits is $kTb\ln 2$, error correction should be a linear cost. My guess is that the logarithms in entropy are playing a role; e.g., maybe the entropy difference is proportional to the log of the required precision. But I don't know enough thermodynamics to give the details. – Sam Jaques – 2020-07-09T15:09:01.827
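The exponential suppression mentioned in the last comment can be sketched with a simple Boltzmann-factor toy model (an illustrative assumption on my part, not a calculation from the linked paper): if the energy barrier grows only linearly with code distance, the thermally activated error probability still falls off exponentially, which is how a linear resource cost can buy exponential error suppression.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K


def thermal_flip_prob(barrier_joules, temperature_kelvin):
    """Boltzmann/Arrhenius suppression factor exp(-E / k_B T) for a
    thermally activated transition over an energy barrier."""
    return math.exp(-barrier_joules / (k_B * temperature_kelvin))


# Toy model: barrier height grows linearly with code distance d
# (barrier_per_unit is a hypothetical value chosen for illustration),
# so the flip probability decreases exponentially in d.
barrier_per_unit = 5e-21  # J per unit of code distance, hypothetical
probs = [thermal_flip_prob(d * barrier_per_unit, 300.0) for d in (1, 2, 4, 8)]
```

Doubling the barrier squares the suppression factor, so each added unit of code distance multiplies the error probability by the same constant factor.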