The Rise, The Fall – and The Resurrection of Analogue Computing
When old technology dies, it usually stays dead.
No one expects rotary phones or cathode-ray television sets to come crawling back from the tech cemetery. Floppy disks, VHS tapes, pagers and fax machines are the old guard of technology that shall forever rest in peace. Similarly, we will not see analogue computers popping up in data centres anytime soon. They were monstrous beasts the size of houses, limited in accuracy, and a pain in the behind to maintain.
Or so I thought, till I came across a puzzling statement.
The preface of the reissued book “Analog Computing” by German mathematician Bernd Ulmann says –
“Bringing back analog computers in much more advanced forms than their historic ancestors will change the world of computing drastically and forever.”
I found this statement to be absolutely asinine. If we follow the evolution tree of technology, digital was clearly the next logical step in the metamorphosis. So why would Ulmann say something so controversial yet so brave? To see what “analogue” even means here, look at photography – in the pre-digital era, variations in light triggered chemical reactions on a piece of film, and an image emerged as a representation, or an analogue, of reality. Modern cameras, on the other hand, convert those light variations into digital values, which are processed by the camera’s CPU before being stored as a long string of 1s and 0s.
Analogue?
Engineers coined the term “analogue” in the later part of the 19th century to describe computers that simulated real-world conditions. Analogue remained the frontier of computing until the 1970s, when digital computing took over. In 2002, the world was storing more information digitally than in analogue format, and by the 2010s the transition from analogue to digital was complete.
According to the researcher Martin Hilbert of the University of Southern California, within the next century we could have the computational power, and the ability to store as much information, as can be held in the DNA molecules of all humankind.
Reasoning by analogy is a fundamental human trait. Whenever we observe patterns in the physical world, we tend to scale them down into smaller models and use them to predict future designs. Take the Antikythera mechanism, for example – in 1901, divers exploring an ancient shipwreck off the coast of Greece recovered a treasure that is now regarded as the oldest known analogue computer. The mechanism was a hand-powered model of the Solar System, used to compute the positions and motions of celestial bodies and thereby predict eclipses decades in advance.
Okay. But seriously, why analogue?
Lyle Bickley, a founding member of the Computer History Museum in Mountain View, California, has served for years as an expert witness in patent suits. He maintains an encyclopedic knowledge of everything that has been done, and is still being done, in the realm of data processing.
He said, “A lot of Silicon Valley companies have secret projects doing analogue chips.”
Really? But why?
“Because they take so little power.”
He explained that when brute-force natural-language AI systems distil millions of words from the internet, the process is insanely power-hungry. The human brain, on the other hand, can store petabytes of information and runs on about 20 watts of power, roughly the same as a light bulb. “Yet if we try to do the same thing with digital computers, it takes megawatts,” he said. For that kind of application, digital is “not going to work. It’s not a smart way to do it.”
Mike Henry, cofounder of Mythic, which claimed to be marketing the industry’s first AI analogue matrix processor, expanded on Bickley’s point. The neural network that powers GPT-3, he said, has 175 billion synapses, comparing the model’s parameters to the connections between neurons in the brain. “Every time you run that model to do one thing, you have to load 175 billion values. Very large data-center systems can barely keep up.”
This is because, Henry said, they are digital. Modern AI systems store their data in static RAM, or SRAM, which needs constant power to hold on to that data even when no work is being done. Engineers have come up with various tricks, such as lowering the supply voltage, to make SRAM more efficient, but there is only so far they can go before hitting a hard limit.
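To get a feel for the scale Henry is describing, here is a rough back-of-the-envelope sketch in Python. The bandwidth and energy-per-byte figures are my own illustrative assumptions, not the specs of any real system – only the 175-billion-parameter count comes from his example.

```python
# Rough back-of-envelope sketch: why streaming 175 billion weights per
# inference pass is expensive. The bandwidth and energy figures below are
# illustrative assumptions, not measured values for any specific system.

PARAMS = 175e9            # GPT-3-scale parameter count (from the article)
BYTES_PER_PARAM = 2       # assume 16-bit weights
BANDWIDTH = 2e12          # assume ~2 TB/s of memory bandwidth
ENERGY_PER_BYTE = 10e-12  # assume ~10 pJ to move one byte from memory

weight_bytes = PARAMS * BYTES_PER_PARAM           # ~350 GB of weights
time_per_pass = weight_bytes / BANDWIDTH          # seconds spent just streaming weights
energy_per_pass = weight_bytes * ENERGY_PER_BYTE  # joules spent on data movement alone

print(f"Weights to move per pass: {weight_bytes / 1e9:.0f} GB")
print(f"Time just to stream them: {time_per_pass * 1e3:.0f} ms")
print(f"Energy for data movement alone: {energy_per_pass:.1f} J")
```

Even with these generous assumptions, most of the time and energy goes into shuffling data back and forth rather than into the arithmetic itself.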
Diving deep into neural networks
In the ’90s and early 2000s, analogue gave way to digital because of digital’s lower cost, smaller size, and greater accuracy and precision. But the science of AI now builds on what are known as deep neural networks (DNNs). They do not require the same precision, and their workload is dominated by matrix multiplication.
Analogue computers are really, really good at matrix multiplication.
According to Mythic, “Today’s digital AI processors are tremendously expensive to develop and rely on traditional computer architectures, limiting innovation to only the largest technology companies. Inference at the edge requires low latency and low power, and must be cost-effective and compact. Digital processors are just not able to meet these challenging needs of edge AI.”
Plus, the flash memory of analogue chips can hold different voltages to simulate how neurons work in the brain. Now, isn’t the whole point of AI that it works like a human brain?
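To make that concrete, here is a minimal sketch in Python of the idea behind an analogue crossbar: each weight is stored as a conductance (for example, a programmed flash cell), the inputs are applied as voltages, and the summed column currents are the matrix-vector product. The numbers are made up for illustration – a real analogue array does this in one continuous physical step, with no code at all.

```python
import numpy as np

# Minimal sketch of in-memory analogue matrix multiplication:
# store each weight as a conductance G (e.g. a programmed flash cell),
# apply the input vector as voltages V, and read each column's summed current.
# Ohm's law gives I = G * V per cell; Kirchhoff's current law sums a column.
# All values below are made up for illustration.

rng = np.random.default_rng(0)

weights = rng.uniform(0.0, 1.0, size=(4, 3))  # conductances, one per cell (siemens)
inputs = rng.uniform(0.0, 1.0, size=4)        # input activations encoded as voltages (volts)

# Each output "neuron" is simply the total current flowing out of its column.
# The physics performs the multiply-accumulate; there is no clocked
# fetch-multiply-add loop and no shuttling of weights in and out of SRAM.
output_currents = weights.T @ inputs          # amperes, i.e. the matrix-vector product

print(output_currents)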
Now, you may ask, “If digital computing is more general-purpose and precise than analogue, why isn’t it the future?”
The short and not-so-sweet answer is that we are approaching the limits of digital computing. Claude Shannon’s groundbreaking thesis, which showed that Boolean algebra can express any logical or numerical relationship, has taken us far. But that ride is slowing to a halt. For decades, ever-smaller transistors gave us ever-faster digital computers. The trend was captured by Moore’s Law, which states that the number of transistors in a dense integrated circuit doubles about every two years. The odd thing about Moore’s Law is that it is no law at all – it was merely an observation extrapolated into the future. The latest transistor architectures are so dense that features are approaching the size of atoms. Soon enough, we will hit the physical limits of circuit architecture – how do we pack things closer together than atoms?
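Moore’s “law” really is just a simple extrapolation, which a few lines of Python make plain. The starting year and transistor count below are illustrative, chosen to resemble an early-1970s microprocessor rather than to cite any particular chip.

```python
# Moore's "law" as the extrapolation it really is: transistor count doubling
# roughly every two years. The base year and count are illustrative values
# resembling an early-1970s chip, not a citation of a specific product.

def transistors(year, base_year=1971, base_count=2300, doubling_years=2):
    """Extrapolated transistor count: N(t) = N0 * 2**((t - t0) / doubling_period)."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

for year in (1971, 1991, 2011, 2031):
    print(year, f"{transistors(year):.3g}")
```

Run the extrapolation far enough and the count explodes into the trillions – which is exactly why a trend line, unlike a law of physics, eventually collides with the size of atoms.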
The Return of the Old Guard
Many of the most challenging real-world phenomena we are trying to predict are described by differential equations. Fluid flow, for instance, is governed by a set of equations known as the Navier–Stokes equations. The strange thing about them is that, in general, they have no known analytical solutions and can only be solved numerically, on computers. These equations are typically used to predict global tides, weather patterns and the like. But when we combine this with the sheer amount of data that machine learning brings with it, we need faster and better computing.
When it comes to solving certain differential equations, analogue computers can outpace their digital counterparts.
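Here is a minimal sketch, in Python, of how an analogue machine “programs” such an equation: a damped oscillator is rewired as a chain of integrators in a feedback loop. The loop below only imitates those integrators step by step; on a real analogue computer they are op-amp circuits integrating continuously in real time, which is where the speed comes from. All constants and the step size are illustrative assumptions.

```python
# Sketch of the analogue-computer approach to a differential equation:
# the damped oscillator  x'' = -(c/m) * x' - (k/m) * x  becomes two
# integrators in a feedback loop. Here the integrators are imitated with a
# simple Euler loop; on an analogue machine they run continuously as circuits.
# All constants are illustrative, not taken from any real problem.

m, c, k = 1.0, 0.3, 4.0       # mass, damping, stiffness (made-up values)
x, v = 1.0, 0.0               # initial position and velocity
dt = 1e-3                     # digital time step (an analogue machine has none)

trajectory = []
for step in range(10_000):
    a = -(c / m) * v - (k / m) * x   # summing junction: combine the feedback terms
    v += a * dt                      # first integrator:  acceleration -> velocity
    x += v * dt                      # second integrator: velocity -> position
    trajectory.append(x)

print(f"x after {len(trajectory) * dt:.1f}s: {trajectory[-1]:.4f}")
```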
The question arises: could we develop a general-purpose analogue computer whose architecture can be instantly rebuilt to solve different problems as required?
Scientific researchers and tech startups working in this field believe that analogue computing has far more potential than it is currently given credit for.
When Ning Guo, a Research Hardware Engineer at Facebook Reality Labs and the developer of the 4th-order hybrid computer, was asked about the possible applications of analogue computing, this is what he had to say –
“The law of diminishing returns applies to the digital realm, yet it still dominates the industry. If we applied as many people and as much money to the analogue domain, I think we could have some kind of analogue coprocessing happening to accelerate the existing algorithms. Digital computers are very good at scalability. Analog is very good at complex interactions between variables. In the future, we may combine these advantages.”
Long Live Analogue
Whether a project gets greenlit in business always comes down to one thing – money. Much of it is being thrown at AI, smarter drug molecules, agile robots and a dozen other applications that try to make sense of the complex physical world. If power consumption and heat dissipation become expensive problems, offloading part of the digital workload onto miniaturised analogue coprocessors becomes significantly cheaper. And no one would care if that analogue computation were done inside a giant steel box filled with vacuum tubes, as long as it got the job done cheaply.
Reality is imprecise, no matter how much we might prefer otherwise, and when you want to model it with truly elegant faithfulness, digitising it might not be the sensible way to go about it. Therefore, in the words of Omar Ahmad, I must conclude:
“We live in a digital world, but we’re fairly analogue creatures.”
Author: Amar Chowdhury