26

5

Plain and simple. Does Moore's law apply to quantum computing, or is it similar but with the numbers adjusted (ex. triples every 2 years). Also, if Moore's law doesn't apply, why do qubits change it?

23

If you take as definition "*the number of transistors in a dense integrated circuit doubles about every two years*", it definitely does not apply: as answered in Do the 'fundamental circuit elements' have a correspondence in quantum technologies?, a quantum computer has no transistors as fundamental components, nor any fundamental component that plays a parallel role.

If you take the more general definition "*chip performance doubles approximately every 18 months*", the question makes more sense, and the answer is still that **it does not apply**, mainly because Moore's law is not a law of fundamental physics. Rather, in its early stages, it was an observation about an established industry. Later, as pointed out in a comment,[1] it has been described as functioning as an "*evolving target*" and as a "*self-fulfilling prophecy*" for that same industry.

The key is that **we do not have an established industry producing quantum computers.** We are not at the quantum equivalent of 1965. Arguably we will move faster, but in many respects we are rather in the 17th-18th centuries. For perspective, check this timeline of computing hardware before 1950.

For a more productive answer, there are a few fundamental differences and a few possible parallels between classical and quantum hardware, in the context of Moore's law:

- For many architectures, in a certain sense we already work with the smallest possible component. We might develop ion traps (of a fixed size) that fit more ions, but we cannot develop smaller ions: they are of atomic size.
- Even when we are able to come up with tricks, such as Three addressable spin qubits in a molecular single-ion magnet, they are still fundamentally limited by quantum mechanics. We need control over 8 energy levels to control 3 qubits ($2^n$ levels for $n$ qubits), which is doable but not scalable.
- Precisely because scalability is one of the hardest problems we have with quantum computers (not just having a larger number of qubits, but also being able to entangle them), it's dangerous to extrapolate from current progress. See for illustration the history of NMR quantum computers, which stalled after a very early string of successes: in theory, increasing the number of qubits in the device was trivial; in practice, every time you wanted to control one more qubit you needed to double the resolution of your machine, which became unfeasible very quickly.
- If and when there exists an industry that relies on an evolving technology which is able to produce some kind of integrated quantum chips, then yes, at that point we will be able to draw a real parallel to Moore's law. For a taste of how far we are from that point, see Are there any estimates on how complexity of quantum engineering scales with size?
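To make the scaling point in the list above concrete, here is a minimal illustrative sketch (the qubit counts are arbitrary examples) of how the number of addressable energy levels grows with the number of qubits encoded in a single quantum system:

```python
# Controlling n qubits encoded in one quantum system requires addressing
# 2**n distinct energy levels, which is why tricks like the 3-qubit
# molecular magnet (8 levels) work at small n but do not scale.
def levels_needed(n_qubits: int) -> int:
    """Number of addressable energy levels for n qubits in one system."""
    return 2 ** n_qubits

for n in (1, 3, 10, 50):
    print(f"{n:>2} qubits -> {levels_needed(n):,} energy levels")
```

The exponential blow-up is the whole point: 3 qubits need only 8 levels, but 50 qubits would need over a quadrillion.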

[1] *Thanks to Sebastian Mach for that insight and Wikipedia link. For more details, see Getting New Technologies Together: Studies in Making Sociotechnical Order, edited by Cornelis Disco and Barend van der Meulen, p. 206, and Gordon Moore says aloha to Moore's Law.*

3"*Moore's law is not one of fundamental physics but one of observation of an established industry. We do not have an established industry producing quantum computers.*" Exactly so, and I am glad to see more people on this site saying so, as you have done very clearly. Quantum computing is not really here yet --- though it is coming. – Niel de Beaudrap – 2018-04-17T06:46:05.397

2I am not sure if Moore's Law is just observational. I believe more that it's a dogma or agenda; kind of the industry's TODO- and *Good Enough*-list. – Sebastian Mach – 2018-04-17T08:20:39.257

How about the number of qubits over time? https://goo.gl/images/3Y4v51

– JollyJoker – 2018-04-17T08:36:36.433

@JollyJoker: "*Moore's law is not one of fundamental physics but one of observation of an established industry. We do not have an established industry producing quantum computers.*" As an observation about the very early development of quantum technologies, it is possible that there happens to be a recent trend, just as the horoscope in the paper may happen to give me useful advice today. It doesn't mean that it indicates a particularly reliable basis for prediction. There are better ways to investigate progress in quantum technology. – Niel de Beaudrap – 2018-04-17T09:56:57.727

@NieldeBeaudrap Can't claim I'd consider a doubling every six years, starting from one qubit in 1990, to be a very reliable way of predicting the future, but it seems like a completely reasonable guess. – JollyJoker – 2018-04-17T11:07:08.883

1@JollyJoker: By that estimate, we should have about 25-26 qubits, as opposed to 19, 49, 72, or 2000. Perhaps you're considering one particular platform? Also, how reliable are these qubits and what can you do with them (and is this standard being held consistent with time over many doubling periods)? It seems to me that we don't learn very much of any importance from any simple projected figure, and that to understand how quantum technology is advancing, we may need to draw back the curtain to investigate what there is behind the hype. – Niel de Beaudrap – 2018-04-17T11:21:12.583

@NieldeBeaudrap I'm just considering the pic I found by googling for Moore's law quantum computing and linked above. I haven't read the blog post it's from. I don't disagree with your points on reliability and general usefulness but I do think Moore's law for transistor count, clock speed, hard drive space or whatever suffers from the same problems as a useful predictor. It's just an empirical observation of exponential growth. – JollyJoker – 2018-04-17T12:37:19.490

@JollyJoker if you're looking for even more scaling "laws", some Yale researchers claim a Moore's Law-like trend for qubit coherence times http://science.sciencemag.org/content/339/6124/1169/F3

– forky40 – 2018-04-17T16:13:26.573

@SebastianMach: I hear people claim all the time that Moore's law is just sociological or dogma, but it makes no sense to me. Can you elaborate? There are obviously fundamental physics, engineering, and economics reasons that other industries (e.g., cars, education, catering, oil) don't have a Moore's law; if they could increase performance exponentially, they would. Can you point to any industries that physically *could* have Moore's-like scaling but don't for sociological/agenda reasons? If not, in what sense is this about agenda? – Jess Riedel – 2018-04-25T15:13:25.550

@JessRiedel I believe this should not work as a thread for discussion, but rather as comments suggesting changes to the answer. In any case, exponentials (upwards and downwards) are used to approximate fossil fuel extractions of any given field https://en.wikipedia.org/wiki/Hubbert_peak_theory#Hubbert_curve

– agaitaarino – 2018-04-25T15:29:18.423

Agreed, this would work better as a separate thread, but unfortunately there's no way to force a thread to be created except to wait for the website to automatically suggest it (1, 2, 3).

– Jess Riedel – 2018-04-25T15:44:52.760

I'm happy to simply hear an explanation/link from Sebastian Mach without needing to engage in a back-and-forth. Until then, I suggest agaitaarino modify his answer to reflect the very speculative nature of this claim. – Jess Riedel – 2018-04-25T15:44:56.487

@JessRiedel: It's basically my baseline skepticism. Nature is not based on powers of two or, more generally, on doublings and halvings. And empirically, it seems improbable that such a big industry just doubles everything every n years without fundamental breakthroughs, or without stuff like simply adding multiple CPU sockets, which actually existed long before multicore CPUs (which, more than a decade since their commoditization, are still under-utilized or even just more-than-enough). Science always fluctuates, industries fluctuate, people fluctuate. It's basically applied experience and skepticism. – Sebastian Mach – 2018-04-25T16:08:59.417

@JessRiedel: ... on a sidenote: https://en.wikipedia.org/wiki/Moore%27s_law#As_an_evolving_target_for_industry (just found it after building my opinion)

– Sebastian Mach – 2018-04-25T16:09:48.600

8

**tl;dr:** Moore's law won't necessarily apply to the quantum computing industry. A deciding factor may be whether the manufacturing processes can be iteratively improved to exponentially increase something analogous to transistor count, or some other metric roughly proportional to performance.

It's important to note that Moore's law was about the numbers of transistors in high-density integrated circuits, *not* the performance or speed of electronics despite common approximate restatements of Moore's law.

"*Moore's law is the observation that the number of transistors in a dense integrated circuit doubles about every two years.*" – "Moore's law", Wikipedia

Underlying Moore's law was the simple fact that, for a given integrated-circuit size, the number of transistors we could cram into it was roughly inversely proportional to the volume of an individual transistor:

$$ n_{\text{transistors}}~{\approx}~\frac{V_{\text{integrated circuit}}}{V_{\text{transistor}}}. $$

So, Moore's law was sorta like:

The volume of a transistor halved about every two years.
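As a hedged numerical sketch of the relation above (the chip and transistor volumes are made-up round numbers; only the ratio matters):

```python
# Sketch of n_transistors ~ V_chip / V_transistor, with transistor
# volume halving every two years. The volumes below are hypothetical
# placeholders, not real device dimensions.
def transistor_count(v_chip: float, v_transistor_0: float, years: float,
                     halving_period: float = 2.0) -> float:
    """Transistor count after `years`, given an initial transistor volume."""
    v_transistor = v_transistor_0 * 0.5 ** (years / halving_period)
    return v_chip / v_transistor

n0 = transistor_count(1.0, 1e-9, 0)    # baseline count
n10 = transistor_count(1.0, 1e-9, 10)  # ten years = five halvings
print(n10 / n0)  # -> 32.0, i.e. 2**5
```

Halving transistor volume every two years is exactly a doubling of count every two years, which is why the two statements of the law are interchangeable.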

Then the question becomes, why were transistors able to shrink so rapidly?

This was largely because transistors are basically made of microscopically fabricated wires in an integrated circuit, and as manufacturing technology progressed, we were able to make smaller and smaller wires.

The process of making crazy-small wires in an integrated circuit took a lot of research know-how, so folks in industry basically set out to iteratively improve their fabrication processes at such a rate to maintain Moore's law.

However, Moore's law is now basically over. Our fabrication processes are nearing the atomic scale such that the physics of the situation are changing, so we can't just keep shrinking further.

As noted above, Moore's law is basically ending. Computers will likely pick up speed due to other advances, but we aren't planning to make sub-atomic transistors at this time, so despite industry's strong desire to maintain the trend, it seems unlikely to continue.

If we assume similar behavior in a future quantum-computing industry, then we might assume that something like Moore's law might arise if industry finds itself in a similar position, where it can iteratively improve the components' manufacturing process to exponentially increase their count (or some similar metric).

At this time, it's unclear what basic industrial metric quantum computer manufacturers might iteratively improve over the course of decades to recreate a trend like Moore's law, largely because it's unclear what sort of quantum computing architectural technologies might find widespread deployment like modern integrated circuits have.

6

The first thing to understand about Moore’s law is that it is not a law in the absolute sense, mathematically provable, or even postulated (like a law of physics). Really, it was just a rule of thumb that said the number of transistors in a processor would double every x years. This can be seen in the way that the value x has changed over time. Originally, it was x=1, then it became x=2, then what it was applied to (processor speed) changed. It has proved to be a useful rule of thumb, partly because it was the rule of thumb that was used to set targets for new generations of processor.

So, there is absolutely no reason why Moore’s law should apply to quantum computers, but it would not be unreasonable to guess that, past some basic threshold, qubit numbers will double every y years. For most implementations of quantum computation, we don’t yet have enough data points to start extrapolating an estimate for the value y. Some might argue that it’s not even clear yet whether we’re in the “vacuum tube” or “transistor” era of quantum computing (Moore’s law didn’t start until the transistor era).

We might start to try and extrapolate for some systems. For example, D-wave has a history of doubling its processor sizes. This started as y=1, and currently has about y=2. Of course, this is not a universal quantum computing device. The next best thing we might look at is the IBM quantum processor. In a year, the computer available on the IBM quantum experience went from 5 qubits to 16, although I don’t think it’s reasonable to extrapolate based on this.
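As an illustrative sketch of the extrapolation described above, one can estimate a doubling period $y$ from two data points. Using the D-Wave figures quoted later in this thread (128 qubits in 2011, 2048 qubits in 2017), and remembering that D-Wave machines are quantum annealers rather than universal devices:

```python
import math

# Estimate the doubling period from two (year, qubit-count) data points.
def doubling_period(year0: int, count0: int, year1: int, count1: int) -> float:
    """Years per doubling, assuming clean exponential growth between points."""
    doublings = math.log2(count1 / count0)  # how many doublings occurred
    return (year1 - year0) / doublings

# D-Wave: 128 qubits (2011) -> 2048 qubits (2017) is 4 doublings in 6 years.
print(doubling_period(2011, 128, 2017, 2048))  # -> 1.5
```

Of course, two data points from a single non-universal platform say almost nothing about the industry as a whole; this is exactly the kind of extrapolation the answer warns against taking too seriously.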

4

Plain and simple. Does Moore's law apply to quantum computing, or is it similar but with the numbers adjusted (ex. triples every 2 years). Also, if Moore's law doesn't apply, why do qubits change it?

A great question, with great answers; still, I will try my hand at it.

No, most quantum computers do not have qubits created in silicon, and even the few that do are not fabricated using computational lithography. Quantum computing is in its earliest days; it can't be compared directly to a mature technology of an entirely different kind.

Information to support that short answer:

This question was asked at physics.SE: "Reasonable to expect Moore's law for quantum computing?", receiving one answer that was not particularly well received (400 views in 144 days, and 1 upvote).

It is termed Rose's Law, by some; after the CTO of D-Wave Systems. See this article: "Quantum computing Rose’s Law is Moore’s Law on steroids" or the Flickr page of the Managing Director of the investment firm Draper Fisher Jurvetson, Steve Jurvetson: "Rose’s Law for Quantum Computers".

The chart runs a bit ahead of itself, and it applies to quantum annealing computers, so it's not exactly comparable to universal quantum computing.

The reason Moore's Law is not exactly comparable is that it refers to transistors and an entirely different manufacturing process: you're comparing a manufacturing process that was well established at the time with one where the computer is in its earliest days and is essentially hand built.

Wikipedia's webpage describes Moore's Law this way:

"Moore's law is the observation that the number of transistors in a dense integrated circuit doubles about every two years. The observation is named after Gordon Moore, the co-founder of Fairchild Semiconductor and Intel, whose 1965 paper described a doubling every year in the number of components per integrated circuit, and projected this rate of growth would continue for at least another decade. In 1975, looking forward to the next decade, he revised the forecast to doubling every two years. The period is often quoted as 18 months because of Intel executive David House, who predicted that chip performance would double every 18 months (being a combination of the effect of more transistors and the transistors being faster).".

Gordon E. Moore's graphic from 1965 looked like this:

The article by Max Roser and Hannah Ritchie (2018) - "Technological Progress", published online at OurWorldInData.org, explains how exponential equations have been used to describe everything from Moore's Law, computational power (both operations per second and clock speed * cores * threads), the progress of human flight or even human genome DNA sequencing.

Moore's law is an observation and projection of an historical trend and not a physical or natural law. Although the rate held steady from 1975 until around 2012, the rate was faster during the first decade. A nostalgic look at the early days of personal computing is given in this Ars Technica feature: "The creation of the modern laptop: An in-depth look at lithium-ion batteries, industrial design, Moore's law, and more".

In this Communications of the ACM, Vol. 60 No. 1 article: "Exponential Laws of Computing Growth" the authors, Denning and Lewis, explain:

"The three kinds of exponential growth, as noted—doubling of components, speeds, and technology adoptions—have all been lumped under the heading of Moore's Law. Because the original Moore's Law applies only to components on chips, not to systems or families of technologies, other phenomena must be at work. We will use the term "Moore's Law" for the component-doubling rule Moore proposed and "exponential growth" for all the other performance measures that plot as straight lines on log paper. What drives the exponential growth effect? Can we continue to expect exponential growth in the computational power of our technologies?

Exponential growth depends on three levels of adoption in the computing ecosystem (see the table here). The chip level is the domain of Moore's Law, as noted. However, the faster chips cannot realize their potential unless the host computer system supports the faster speeds and unless application workloads provide sufficient parallel computational work to keep the chips busy. And the faster systems cannot reach their potential without rapid adoption by the user community. The improvement process at all three levels must be exponential; otherwise, the system or community level would be a bottleneck, and we would not observe the effects often described as Moore's Law.

With supporting mathematical models, we will show what enables exponential doubling at each level. Information technology may be unique in being able to sustain exponential growth at all three levels. We will conclude that Moore's Law and exponential doubling have scientific bases. Moreover, the exponential doubling process is likely to continue across multiple technologies for decades to come.

**Self-Fulfillment**

The continuing achievement signified by Moore's Law is critically important to the digital economy. Economist Richard G. Anderson said, "Numerous studies have traced the cause of the productivity acceleration to technological innovations in the production of semiconductors that sharply reduced the prices of such components and of the products that contain them (as well as expanding the capabilities of such products)."1 Robert Colwell, Director of DARPA's Microsystems Technology Office, echoes the same conclusion, which is why DARPA has invested in overcoming technology bottlenecks in post-Moore's-Law technologies.5 If and when Moore's Law ends, that end's impact on the economy will be profound.

It is no wonder then that the standard explanation of the law is economic; it became a self-fulfilling prophesy of all chip companies to push the technology to meet the expected exponential growth and sustain their markets. A self-fulfilling prophecy is a prediction that causes itself to become true. For most of the past 50-plus years of computing, designers have emphasized performance. Faster is better. To achieve greater speed, chip architects increased component density by adding more registers, higher-level functions, cache memory, and multiple cores to the same chip area and the same power dissipation. Moore's Law became a design objective.".

Moore's Law had a lot of help; shaping the future and maintaining the growth was an objective of those who stood to profit, not something entirely constrained by technological limitations. If consumers wanted something, sometimes it was provided and other times a *better* idea was offered; what was popular (clock speed) sold at a premium, and what was, at one time, not well understood (more cores and threads) was promoted as the way forward.

Moore's Law was well received, evolving into many things, like Kurzweil's "The Law of Accelerating Returns". Here's an updated version of Moore’s Law (based on Kurzweil’s graph):

Top500.Org provides another fact-based chart showing the exponential growth of supercomputer power:

The Missouri University of Science and Technology's article: "Forecasting Consumer Adoption of Technological Innovation: Choosing the Appropriate Diffusion Models for New Products and Services before Launch" explains that the Bass Model (a modification of the logistic curve) is a sound method to predict future growth (based upon past statistics).

The logistic curve features a slow start, great mid-term progress, followed by an eventual slowdown; often replaced by something new.

On forecasting models the authors had this to say:

"**MODELS**

The Box and Cox and Generalized Bass models were the best models when it came to curve-fitting while the Simple Logistic model did the poorest. **However, the results of the research showed that a curve-fitting advantage did not translate into a forecasting advantage when creating a forecast for an innovation without a market history**. The popularity of the Bass model derives from two unique factors. As this research has reinforced, the Bass model is very robust. In addition, the Bass model’s two coefficients have a theoretical foundation. The Bass model variants created for this research deliberately violated the assumption of a constant $m$. This resulted in a model (Bv) that outperformed any of the others in the radical low-priced innovation context. Unfortunately, there was just one innovation in this context – additional research is recommended to test the viability of this variation with more datasets in various contexts.

The Simple Logistic model is one of the oldest diffusion models known. It is a very basic model, but it clearly outperformed the other models in the context of really new low-priced innovations. The Gompertz model it is not recommended for forecasting the diffusion of really new or radical innovations before the launch of an innovation. However, the Gompertz model may be very well suited for forecasts generated well after the launch of an innovation. While not the focus of this research, it was observed that the diffusion of the Projection Television innovation follows a perfect Gompertz curve.

The Flexible Logistic Box and Cox model has a problem where the c variable tends to run to infinity in some scenarios. This was addressed by capping the upper limit of $c$ to 100,000. Despite (or because of) this fix, the authors must admit to being skeptical as to how well the Box and Cox model would do in comparison to the other models. As it turned out, the Box and Cox was second only to the Bass model in terms of robustness. The Box and Cox was also the best model in the context of radical high-priced innovations.".

**Moore's position as a co-founder of Intel helped ensure that he could help his prediction come true and keep it on track.** Quantum computing is too near its genesis to be pushed forward simply by pouring money on it; with so many paths to creating a successful quantum computing device, money needs to be apportioned wisely to make the most gains from the many branches that research has taken.

"The European Quantum Technologies Roadmap" (11 Dec 2017) lists some of the challenges, following the introduction:

"**Introduction**

A quantum computer based on the unitary evolution of a modest number of robust logical qubits (N>100) operating on a computational state space with $2^N$ basis states would outperform conventional computers for a number of well identified tasks. A viable implementation of a quantum computer has to meet a set of requirements known as the DiVincenzo criteria: That is, a quantum computer operates on

(1) an easily extendable set of well characterized qubits

(2) whose coherence times are long enough for allowing coherent operation

(3) and whose initial state can be set

(4). The qubits of the system can be operated on logically with a universal set of gates

(5) and the final state can be measured

(6). To allow for communication, stationary qubits can be converted into mobile ones

(7) and transmitted faithfully.

It is also understood that it is essential for the operation of any quantum computer to correct for errors that are inevitable and much more likely than in classical computers.

Today quantum processors are implemented using a range of physical systems. Quantum processors operating on registers of such qubits have so far been able to demonstrate many elementary instances of quantum algorithms and protocols. The development into a fully featured large quantum computer faces a scalability challenge which comprises of integrating a large number of qubits and correcting quantum errors. Different fault-tolerant architectures are proposed to address these challenges. The steadily growing efforts of academic labs, startups and large companies are a clear sign that large scale quantum computation is considered a challenging but potentially rewarding goal.".

...

There are too many paths to choose from, and no clear best way forward, to plot a model for growth (like Moore's Law); nor should so straight a line be expected.

With D-Wave's computer, each doubling of qubits represents a doubling of computation power for the subset of problems it is suited for; for universal quantum computers, each single additional qubit represents a doubling of power. Unfortunately, each logical qubit needs to be represented by multiple physical qubits, to permit error correction and maintain coherence. Some technologies used to implement qubits allow fewer physical qubits per logical qubit, or even single ones, because they are less error prone and have longer coherence and greater fidelity. Speed of control is also an important consideration when choosing which technology to implement, and while it will affect the plot of the curve, it is out of the scope of the answer offered here.
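A small sketch of the arithmetic above. The 1000:1 physical-to-logical overhead below is a purely hypothetical figure chosen for illustration, not a property of any specific error-correcting code:

```python
# Each additional logical qubit doubles the computational state space,
# but each logical qubit may cost many physical qubits for error
# correction. The overhead ratio here is a hypothetical placeholder.
def state_space(n_logical: int) -> int:
    """Dimension of the state space spanned by n logical qubits."""
    return 2 ** n_logical

def physical_qubits(n_logical: int, overhead: int = 1000) -> int:
    """Physical qubits needed under a flat (hypothetical) EC overhead."""
    return n_logical * overhead

print(state_space(51) / state_space(50))  # -> 2.0 (one qubit doubles it)
print(physical_qubits(100))               # -> 100000
```

This is why "number of physical qubits over time" and "useful computational power over time" can follow very different curves, and why a single Moore's-law-style figure is hard to define for quantum hardware.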

Further reading: "Coherent control of single electrons: a review of current progress" (1 Feb 2018), "Hyperfine-assisted fast electric control of dopant nuclear spins in semiconductors" (30 Mar 2018), "A >99.9%-fidelity quantum-dot spin qubit with coherence limited by charge noise" (4 Aug 2017).

1Unfortunately it is already 2018 and no "Faster than Universe" QC has been developed :( – Trect – 2018-06-17T19:50:42.167

3

This article seems to adequately explain what you are asking. It shows the growth of usable qubits in quantum computers.

So the question comes up whether Moore’s Law can also be applied to quantum qubits. And early evidence suggests that indeed it may [...]

The adiabatic line would be a prediction for quantum annealing machines like the D-Wave computers. These have followed the Moore’s Law prediction pretty closely so far with the D-Wave 1 at 128 qubits in 2011, the D-Wave 2 at 512 qubits in 2013, the D-Wave 2X at 1097 qubits in 2015, and a 2048 qubit machine in 2017. [...]

The Physical curve predicts the number of physical qubits that will be available. There is less historical data on these, but there are indications that these will progress rapidly too. As examples, IBM has a 5 qubit machine that is available in the cloud through the IBM Quantum Experience and Google has demonstrated a 9 qubit machine. Both of these companies, and others have indicated that these densities will increase rapidly so the Physical curve maintains the improvement rate of a doubling every year for the next 10 years and a doubling every two years thereafter.

1

If the formulation of this question seems too vague, I asked a more refined version previously on Physics.SE: Once scalable fault-tolerance is achieved, how should we expect the number of qubits in a single device to scale in time? I am very happy to see discussion on this site. If Alex Jone and the community find it appropriate, I suggest editing the question here by simply copying my version in whole or part.

– Jess Riedel – 2018-04-25T15:06:09.687

@JessRiedel I'd say, while respecting the original (compact-and-straightforward) question, and trying not to change the scope too much in order to avoid invalidating current answers, feel free to edit this question to include a longer version. – agaitaarino – 2018-05-07T05:54:37.257

Recent popular article: https://www.quantamagazine.org/does-nevens-law-describe-quantum-computings-rise-20190618/

– Jess Riedel – 2019-06-18T15:37:38.797