**Digital and Analog**

The question about analog computing is important.

Digital circuitry gained popularity as a replacement for analog circuitry during the four decades between 1975 and 2015 due to three compelling qualities.

- Greater noise immunity
- Greater drift immunity (accuracy)
- No leakage of stored values

This quickly led to digital signaling standards, general-purpose computing architectures, and central processing units on a chip. The latter, combined with an array of registers to perform elementary operations, is what the word microprocessor means.

**Quanta and Computing**

Regarding quantum computing, there have been some interesting proposals to pack digital gates into much smaller volumes, but the notion that a computer can be made of transistors the size of electrons might be a bit fantastic. That's what the term *quantum computing* implies. That degree of miniaturization would have to defy principles of particle physics that are very strongly supported by amassed empirical evidence. Among them is Heisenberg's uncertainty principle.

All computing involves quanta, but statistically. For a transistor in a digital circuit to be statistically stable, there must be a sufficient number of Si atoms, with at least a 0.1% molar concentration of the dopant atoms used to form the PN junction. Otherwise the transistor will not switch reliably.

The lithographic limit of most mass-produced VLSI chips is around 7 nm as of this writing. Crystalline Si is about 0.2 nm nucleus to nucleus, so the miniaturization of a stable transistor is already near its quantum limit. Exceeding that limit by a considerable amount destabilizes the digital circuitry. That is a quantum physics limitation, not a lithographic one.
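A back-of-the-envelope sketch makes these numbers concrete. The 7 nm feature size, ~0.2 nm lattice spacing, and 0.1% dopant fraction come from the figures above; the cubic-channel geometry is an illustrative assumption, not a real device model:

```python
# Rough estimate of how few atoms remain in a minimum-size transistor
# feature, and how few of those are dopant atoms.
# Cubic-channel geometry is an illustrative assumption.

FEATURE_NM = 7.0         # lithographic feature size
LATTICE_NM = 0.2         # approximate Si nucleus-to-nucleus spacing
DOPANT_FRACTION = 0.001  # 0.1% molar dopant concentration

atoms_per_edge = FEATURE_NM / LATTICE_NM          # ~35 atoms across
atoms_in_volume = atoms_per_edge ** 3             # ~43,000 atoms total
dopant_atoms = atoms_in_volume * DOPANT_FRACTION  # ~43 dopant atoms

print(f"atoms per edge:  {atoms_per_edge:.0f}")
print(f"atoms in volume: {atoms_in_volume:.0f}")
print(f"dopant atoms:    {dopant_atoms:.0f}")
```

With only tens of dopant atoms in the active region, the statistical averaging that makes the transistor behave predictably is already thin, which is the point made above.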

**Projections, Models, and Techniques to Push Limits**

Moore's law was simply an approximate model for the chip industry during the period between the invention of the integrated circuit and the physical limit imposed by the atomic composition of transistors, which we are now approaching.
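As an approximate model, Moore's law is just an exponential with a doubling period. A minimal sketch follows; the 1971 starting point (~2,300 transistors, the commonly cited Intel 4004 figure) and the two-year doubling period are illustrative assumptions, not part of the text above:

```python
# Moore's law as a naive doubling model: transistor count doubles
# roughly every two years. Base figures are illustrative
# (Intel 4004, 1971, ~2,300 transistors).

def moore_transistors(year, base_year=1971, base_count=2300,
                      doubling_years=2.0):
    """Transistor count predicted by the simple doubling model."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

for y in (1971, 1991, 2011):
    print(y, f"{moore_transistors(y):.3g}")
```

The model says nothing about *why* the doubling eventually stops; the atomic limits described above are what break it.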

Field effect transistors (FETs) can take the miniaturization only slightly further than the mechanics of PN junctions. 3-D circuitry has theoretical promise, but no repeatable nanotechnology mechanisms have yet been developed to form complex circuitry in the third dimension.

**Returning to the Primary Question**

Placing aside the magical idea that quantum computing will somehow completely revolutionize circuitry, we have a question that is both feasible and predictable.

Can an analog computer implement real-valued neural networks and hence do artificial network computation better?

If we define *better* in this context as cheaper and faster, while maintaining reliability and accuracy, the answer is straightforward.

It definitely takes fewer transistors to create the feed-forward part of an artificial network using an analog approximation of the closed forms resulting from the calculus of artificial networks than using a digital one. Both are approximations: analog circuitry has noise and drift, and digital circuitry has rounding error. Beyond rounding, digital multiplication is much more complex in terms of circuitry than analog multiplication, and multiplication is used heavily in artificial network implementations.
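The two error models can be compared numerically. This sketch models digital rounding as half-precision (float16) arithmetic and analog error as gain drift plus additive output noise; the drift and noise magnitudes are illustrative assumptions, not measured device figures:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(256)  # input signal vector
w = rng.standard_normal(256)  # learned parameter vector

exact = float(np.dot(x, w))   # float64 reference value

# Digital approximation: rounding error from reduced precision.
digital = float(np.dot(x.astype(np.float16), w.astype(np.float16)))

# Analog approximation: multiplicative drift plus additive noise.
# Magnitudes are illustrative assumptions, not device measurements.
drift = 1.0 + rng.normal(0.0, 1e-3)   # ~0.1% gain drift
noise = rng.normal(0.0, 1e-2)         # additive output noise
analog = exact * drift + noise

print(f"exact   {exact:+.6f}")
print(f"digital {digital:+.6f}  err {digital - exact:+.2e}")
print(f"analog  {analog:+.6f}  err {analog - exact:+.2e}")
```

Under either model the result deviates slightly from the ideal real-valued dot product, which is the sense in which both implementations are approximations.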

**Limitation Interplay of Gödel and Turing**

The idea from the tail end of the title of the book this question referenced, "Beyond the Turing Limit," is also a little fantastic. The thought experiment of Alan Turing that led to the Turing machine and the associated computability theory (including Turing completeness) was not developed to be a limit. Quite the opposite: it was an answer to Gödel's incompleteness theorems. People in Turing's time saw the work of Gödel's genius as the annoying but undismissable limit threatening the centuries-old vision of using machines to automatically expand human knowledge. To summarize this work, we can state this with assurance.

The theory limiting what computers can do is not related to how the numbers are represented in electronic circuit implementations. It is a limitation of information mechanics.

The following are important but have little to do with that limitation.

- Analog computing
- Miniaturization
- Parallel computing
- Ways that stochastic injection can help
- How algorithms can be defined in programming languages

The above has to do with the feasibility of a project for which some person or corporation must pay and the intellectual capacity required to complete it, not with the hard limit on what is possible.

Defining what a super-Turing capability might be would amount to a dismissal or a dismantling of what mathematicians consider well-constructed theory. Dismantling or shifting the contextual frame of some computability theory is plausible; dismissing the work that has been done would be naive and counterproductive.

**Real Numbers are Now Less Real Than Integers**

The compelling idea contained in the question is the reference to continuity, physicality, and the real valued nature of parameters that acquire a learned state during the training of artificial networks.

To multiply a vector of digital signals by a digital parameter matrix requires many transistors and can take a significant number of clock cycles, even when dedicated hardware is used. Only a few transistors per signal path are required to perform the analog equivalent, and the throughput potential is very high.

To say that real values cannot be represented in digital circuits is inaccurate. The IEEE standards for floating point numbers, when processed in a time series, represent real-valued signals well. Analog circuits suffer from noise and drift as stated above. Both analog and digital signals only appear to be composed of real number values. Real numbers are not real except in the world of mathematical models. What we call quantities in the laboratory are essentially measurements of means of distributions. Solidifying and integrating the probabilistic nature of reality into science and technology may be the primary triumph of the twentieth century.
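The claim that floating point represents real-valued signals well can be checked directly. This sketch samples a continuous signal and measures how far the IEEE half- and single-precision encodings drift from the full-precision values:

```python
import numpy as np

# A "real-valued" signal sampled in time, then stored in IEEE
# floating point at two precisions; the representation error is
# bounded by the rounding step of each format.
t = np.linspace(0.0, 1.0, 1000)
signal = np.sin(2 * np.pi * 5 * t)          # float64 reference

err16 = float(np.max(np.abs(signal - signal.astype(np.float16))))
err32 = float(np.max(np.abs(signal - signal.astype(np.float32))))

print(f"max float16 error: {err16:.2e}")  # well under 1e-3
print(f"max float32 error: {err32:.2e}")  # well under 1e-6
```

The digital "real number" is thus an approximation with a known, bounded error, just as the analog voltage is an approximation with noise and drift.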

For instance, when dealing in milli-Amps (mA), electric current seems to be a continuous phenomenon, but when dealing in nano-Amps (nA), the quantum nature of electric current begins to appear. This is much like what happens with the miniaturization of the transistor. Real numbers can only be represented in analog circuits through the flow of discrete electrons. The key advantage of an analog feed-forward stage in artificial networks is solely that the density of network cells can be considerably higher, reducing the cost of the network in its VLSI space.
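The mA-versus-nA contrast can be made concrete by counting electrons. This sketch uses the elementary charge and the usual counting-statistics estimate that the relative fluctuation of a count of N discrete events scales as 1/√N:

```python
# How many electrons pass per second at a given current, and the
# relative counting fluctuation ~ 1/sqrt(N) for N discrete events.
E_CHARGE = 1.602e-19  # elementary charge in coulombs

def electrons_per_second(current_amps):
    """Mean number of electrons crossing per second at this current."""
    return current_amps / E_CHARGE

for label, amps in (("1 mA", 1e-3), ("1 nA", 1e-9)):
    n = electrons_per_second(amps)
    rel = n ** -0.5
    print(f"{label}: {n:.2e} electrons/s, relative fluctuation ~{rel:.1e}")
```

At 1 mA the relative fluctuation is around a part in a hundred million, so the current looks perfectly continuous; at 1 nA it is a thousand times larger, and the discreteness starts to matter.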

In summary, real numbers received the name for their type prior to the emergence of quantum physics. The idea that quantities formerly considered real and continuous were actually statistical averages of discrete activities at a quantum level revolutionized the fields of thermodynamics and microelectronics. This is something that disturbed Einstein in his later years. In essence, mathematics using real numbers is effective in engineering because it simplifies what physicists now believe are distributions of large numbers of quantum phenomena occurring in concert.

**Summarizing the Probable Future of Analog Computing**

This phrase from the question is not precisely scientific, even though it points to a strong likelihood.

It is quite possible that quantum computers will be analogue.

This modified version is phrased more consistently with scientific fact, and is also factual.

It is possible that computers dealing with signals at a near quantum level of miniaturization will have a higher proportion of analog circuitry.

This question and its answers have many links to papers and research work regarding analog computing: *If digital values are mere estimates, why not return to analog for AI?*

> Many thanks! Excellent answer and excellent pointer to the other thread. – TomR – 2019-01-24T08:53:36.937