Analog Sound Board

Photo by Dmitry Demidov: an analog sound board.

The very first computers were based on analog circuits; computation was done by modifying electronic signals. For example, addition was performed by combining two signals with a simple amplifier circuit. These analog computers were some of the fastest computers ever designed, but they had two major issues: they were very prone to errors, and there was a limit to the size of the computation they could handle.

The analog computers were error prone due to manufacturing variations in their components. A great example is the resistors used in the analog computing circuits. A standard resistor has a tolerance of 10 percent, which means that if you expect to use a 100 ohm resistor, it can actually be anywhere from 90 to 110 ohms. If you use two of them in an addition circuit, the errors can stack. Let's say you are adding 100 + 100: depending on the actual resistor values, the final answer could come out anywhere from 180 to 220. This happened quite frequently in the early days of computers; so much so that NASA chose hand calculation with paper and pencil over the computers during the lunar missions, because the computer results could not be trusted.
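To see how a 10 percent tolerance stacks up, here is a quick Python sketch of the arithmetic (just the worst-case math, not a real circuit simulation):

```python
import random

nominal, tolerance = 100, 0.10  # a standard 10 percent resistor

# Each "100 ohm" part can really be anywhere from 90 to 110 ohms,
# so when two of them stack in an addition circuit the worst cases add up:
low = round(2 * nominal * (1 - tolerance))
high = round(2 * nominal * (1 + tolerance))
print(low, high)  # 180 220

# Any single build lands somewhere in that band:
sample = sum(nominal * random.uniform(1 - tolerance, 1 + tolerance)
             for _ in range(2))
print(low - 1 < sample < high + 1)  # True
```

Chain more tolerant parts together and the band only gets wider, which is exactly the accumulation problem described above.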

The second problem was that as the size of the problem you were trying to solve grew, the number of components in the computation circuit also grew, and this growth was not always linear; in most cases it was exponential. This led to computers the size of small buildings before they became useful for scientific research. If you are unsure of the difference, linear growth is like counting (2, 4, 6, 8…) while exponential growth is like repeated doubling (2, 4, 8, 16, 32…).
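The difference between the two kinds of growth is easy to see in a couple of lines of Python (the step size and doubling factor here are just illustrative):

```python
# Linear growth: add a fixed amount at each step (counting by twos).
linear = [2 * n for n in range(1, 6)]        # [2, 4, 6, 8, 10]

# Exponential growth: multiply by a fixed factor at each step (doubling).
exponential = [2 ** n for n in range(1, 6)]  # [2, 4, 8, 16, 32]

print(linear)
print(exponential)
```

After only five steps the doubling sequence has already pulled well ahead, and the gap widens with every additional step, which is why exponentially growing circuits quickly become building-sized.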

Several techniques were designed to correct the errors in analog computers, but most of them involved adding even more components to the computation circuits, making the size problem even worse. That was until someone came up with the idea of digital computing, which nearly eliminated the errors completely. Digital computing works on the concept of a switch: a signal is always either on (1) or off (0) and never considered to be a value in between. The digital computer significantly reduced the complexity of computers and brought about a new revolution in technology.
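A few lines of Python can illustrate the switch idea (the 0.5 threshold here is an arbitrary choice for the sketch):

```python
def restore(voltage, threshold=0.5):
    """Digital logic only asks whether the signal is above or below the
    threshold, so any analog noise short of the threshold simply vanishes."""
    return 1 if voltage >= threshold else 0

# Noisy analog readings of an intended 1-0-1 bit pattern:
noisy = [0.93, 0.08, 1.07]
bits = [restore(v) for v in noisy]
print(bits)  # [1, 0, 1]
```

Because the signal snaps back to a clean 0 or 1 at every stage, small component variations no longer accumulate the way they do in an analog adder.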

Over the course of nearly 80 years, we have reached the point where digital computers can no longer achieve the computations necessary for continued scientific research, and new methodologies for computation are being developed. You might say we have come full circle, as nearly all of the new computing technologies on the horizon are analog in nature. Quantum computing uses analog measurements of the rotation of atoms to simulate quantum chemistry systems; optical neural networks use analog devices to process light signals for machine learning; and both face the same problems the century-old analog computers did.

New research at MIT has produced error correction techniques that give hope for optical neural networks. Prior to Ryan Hamerly's research, it was not thought possible to build an optical neural network large enough to simulate any real-world problems, because the error rate would be far too high. Hamerly and his group of graduate student researchers developed a single hardware component that can be added to the optical switches to keep errors from accumulating. The technique actually runs counter to expectations: larger circuits should produce larger accumulated errors, but adding their component causes the error to decrease as the circuits grow larger. As a result, his research has effectively solved both the error rate and scalability issues for optical neural networks. It might not be long before we see optical computers taking machine learning research to the next level and bringing a century-old computing technique back into the mainstream.

Until next week, stay safe and learn something new.

Scott Hamilton is an Expert in Emerging Technologies at ATOS and can be reached with questions and comments via email to or through his website at
