Analog Computing and AI
Scott Hamilton
Senior Expert in Emerging Technology
Have you ever seen an old technology come back to solve a new problem? That is exactly what has begun to happen in the realm of Artificial Intelligence (AI). You, like most people, might think of AI as a new technology, but the first AI machine was built not from digital logic but from analog circuitry. It was called the Perceptron, and it was built in 1958 by Frank Rosenblatt at the Cornell Aeronautical Laboratory. The Perceptron was touted as a far more advanced system than Rosenblatt had actually designed, which hurt the field of AI and helped bring on what modern computer scientists call the first AI winter.
Rosenblatt based his system on his knowledge of psychology and on studies of how neurons work in the human brain. Even today, much of our knowledge of neurons and neurology remains theoretical, but the theory is good enough to allow fairly solid simulations of neural activity. Rosenblatt theorized that a neural network, a cluster of connected neurons, learns through the strength of the connections between neurons: a strong enough combined signal triggers the firing of the next neuron. The human brain has tens of billions of neurons linked by trillions of connections, so it is not yet possible to build a neural network capable of human thought, but we can already do simpler things like recognize objects and drive cars with AI.
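To make that idea concrete, here is a minimal sketch in Python of a single artificial neuron of the kind Rosenblatt described; the inputs, weights, and threshold below are illustrative values of my own, not his original design:

    # A single artificial neuron: it "fires" (outputs 1) when the
    # weighted sum of its inputs crosses a threshold.
    def neuron(inputs, weights, threshold):
        strength = sum(i * w for i, w in zip(inputs, weights))
        return 1 if strength >= threshold else 0

    # Two strongly weighted inputs are enough to trigger firing.
    print(neuron([1, 1, 0], [0.6, 0.5, 0.1], threshold=1.0))  # prints 1
    print(neuron([0, 0, 1], [0.6, 0.5, 0.1], threshold=1.0))  # prints 0

Learning, in this picture, is nothing more than adjusting those connection weights until the neuron fires at the right times.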
Rosenblatt’s theoretical neural network is at the heart of machine learning and AI. Ironically, much of his research was done on analog computers, which, as it turns out, are much faster at the matrix multiplications required for machine learning than our modern digital computers, but much less accurate at the arithmetic itself. AI is one area of computer science where the precision of the math matters far less, so analog computers are a good fit, and engineers have already begun reviving technology from the 1940s and 1950s to solve modern computing problems.
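To see why precision matters so little here, consider this short sketch (in Python with the NumPy library; the layer sizes and the 2 percent noise level are assumptions of mine for illustration). It evaluates one neural-network layer as a matrix multiplication, then repeats the calculation with small random errors of the kind an analog circuit would introduce; the strongest output, which is what actually decides the network's answer, almost always stays the same:

    import numpy as np

    rng = np.random.default_rng(0)
    W = rng.normal(size=(4, 8))   # connection weights for one layer
    x = rng.normal(size=8)        # the input signal

    exact = W @ x                 # a neural layer is just a matrix multiply
    noisy = (W + 0.02 * rng.normal(size=W.shape)) @ x  # ~2% analog-style error

    print(np.argmax(exact) == np.argmax(noisy))  # almost always True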
To give a rough idea of how neural networks have grown: Rosenblatt’s Perceptron, which could do rudimentary character recognition, consisted of a single layer of 64 neurons and was trained by manually tuning the connection values, adjusting the resistance of an analog multiplication circuit. Modern AI systems contain on the order of 700 million neurons and require high-powered computing: graphics processing units (GPUs) that draw upwards of 700 watts each are used to train them.
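Rosenblatt later formalized that hand-tuning into the perceptron learning rule: nudge each connection up or down whenever the output is wrong. Here is a simple sketch of it in Python; the learning rate, epoch count, and OR-style example data are my own illustrative choices:

    # Perceptron learning rule: the automated equivalent of turning
    # each resistance knob a little whenever the answer is wrong.
    def train(samples, weights, rate=0.1, epochs=20):
        for _ in range(epochs):
            for inputs, target in samples:
                output = 1 if sum(i * w for i, w in zip(inputs, weights)) >= 0 else 0
                error = target - output  # -1, 0, or +1
                weights = [w + rate * error * i for w, i in zip(weights, inputs)]
        return weights

    # Learn a simple OR-like rule from four labeled examples; the
    # constant 1 input lets the threshold itself be learned.
    data = [([1, 0, 0], 0), ([1, 0, 1], 1), ([1, 1, 0], 1), ([1, 1, 1], 1)]
    print(train(data, weights=[0.0, 0.0, 0.0]))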
Recent developments in AI use analog circuits to compute the matrix-matrix multiplications these large neural networks require. The analog chips currently in development at Mythic AI, based in Texas, can do nearly the same computation using just three watts of power. Digital computers are reaching the end of their usefulness in AI, because neural networks are growing faster than the computational power available to run them. AI is the perfect use case for analog computing: it relies on a single dominant algorithm, matrix-matrix multiplication, and it does not require high precision. The biggest challenge facing neural networks, their enormous power consumption, can be overcome with analog computing, and the biggest challenge of analog computing, its lack of precision, hardly matters to AI, making the two a perfect match.
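How does a circuit multiply in the first place? In the analog approach, physics does the math: Ohm's law multiplies (current equals voltage times conductance) and Kirchhoff's current law adds (currents flowing into one wire simply sum). The Python sketch below is a simplified model of that idea, not Mythic AI's actual design:

    # Simplified model of an analog dot product.
    voltages = [0.5, 1.0, 0.2]        # inputs, encoded as voltages
    conductances = [0.3, 0.8, 0.1]    # weights, stored as conductances

    # Ohm's law multiplies; Kirchhoff's current law adds.
    currents = [v * g for v, g in zip(voltages, conductances)]
    total_current = sum(currents)     # the dot product, read out as one current

    print(total_current)  # 0.97

Repeat that trick across a grid of programmable conductances and you have an entire matrix multiplication happening at the speed of electricity, for a few watts instead of hundreds.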
For a great video about analog computing and AI check out Derek Muller’s video on his channel Veritasium at https://youtu.be/GvsUOuSjvcg.
Until next week, stay safe and learn something new.
Scott Hamilton is an Expert in Emerging Technologies at ATOS and can be reached with questions and comments via email to shamilton@techshepherd.org or through his website at https://www.techshepherd.org.