The incredible zombie comeback of analog computers

    My personal term for the imprecise nature of the cluttered, fuzzy world was muzzy. But then, in 1980, I bought an Ohio Scientific desktop computer and found instant, lasting relief. All its operations were based on binary arithmetic, in which a 1 was always exactly a 1 and a 0 was a true 0, with no fractions in between. The 1 of existence, and the 0 of nothingness! I fell in love with the purity of digital and learned to write code, which became a lifelong refuge from fuzzy math.

    Of course, digital values still had to be stored in fallible physical components, but margins of error took care of that. In a modern 5-volt digital chip, 1.5 volts or less would represent the number 0, while 3.5 volts or more would represent the number 1. Components on a decently designed motherboard would stay within those limits, so there shouldn’t be any misunderstandings.
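
    Those margins are easy to picture in code. Here is a minimal sketch (my own illustration, using the 1.5-volt and 3.5-volt cutoffs from the paragraph above; the function name is hypothetical) of how a 5-volt logic gate interprets a measured voltage:

```python
def logic_level(voltage: float) -> str:
    """Interpret a measured voltage the way a 5 V logic gate would.

    Thresholds from the article: <= 1.5 V reads as 0, >= 3.5 V reads as 1.
    Anything in between is the forbidden zone the error margins guard against.
    """
    if voltage <= 1.5:
        return "0"
    if voltage >= 3.5:
        return "1"
    return "undefined"  # noise has pushed the signal out of spec

print(logic_level(0.4))  # "0" -- a slightly noisy low still reads as a clean 0
print(logic_level(4.8))  # "1"
print(logic_level(2.5))  # "undefined"
```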

    So when Bernd Ulmann predicted that analog computers were due for a zombie comeback, I wasn’t just skeptical. I found the idea a little… disturbing.

    Hoping for a reality check, I consulted Lyle Bickley, one of the founders of the Computer History Museum in Mountain View, California. Bickley has served as an expert witness in patent cases for many years and has an encyclopedic knowledge of everything that has been done, and continues to be done, in computing.

    “Many companies in Silicon Valley have secret projects involving analog chips,” he told me.

    Really? But why?

    “Because they use so little power.”

    Bickley explained that when, for example, brute-force natural-language AI systems distill millions of words from the internet, the process is insanely power-hungry. The human brain runs on a small amount of electricity, he said, about 20 watts. (That’s about the same as a light bulb.) “But if we try to do the same thing with digital computers, it will take megawatts.” Digital “is not going to work for those kinds of applications. It’s not a smart way to do it.”
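
    The size of that gap is worth making concrete. A back-of-the-envelope comparison (my own arithmetic, using the 20-watt figure Bickley quoted; the 1-megawatt figure for the digital side is a round-number placeholder, not a measurement):

```python
# Back-of-the-envelope power comparison. The 20 W brain figure is from
# the article; 1 MW for a large digital AI system is an assumed
# round number for illustration.
brain_watts = 20
digital_watts = 1_000_000  # one megawatt

ratio = digital_watts / brain_watts
print(f"The digital system draws roughly {ratio:,.0f}x more power.")
# -> roughly 50,000x
```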

    Bickley said he would be breaking confidentiality if he told me details, so I started looking for startups. Soon I found a company in the San Francisco Bay Area called Mythic, which claimed to be marketing the “industry-first AI analog matrix processor.”

    Mike Henry co-founded Mythic at the University of Michigan in 2013. He is an energetic man with a neat haircut and a well-ironed shirt, like an old-school IBM salesman. He expanded on Bickley’s point, citing the brain-like neural network that powers GPT-3. “It has 175 billion synapses,” Henry said, comparing processing elements to connections between neurons in the brain. “So every time you run that model to do one thing, you have to load 175 billion values. Very large data center systems can barely keep up.”
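
    Henry’s point about loading 175 billion values is easy to sanity-check. A rough sketch (my assumptions, not Mythic’s figures: 16-bit weights, and every weight read once per inference step; the real traffic varies with precision and batching):

```python
# Rough memory-traffic estimate for a GPT-3-scale model.
# Assumptions (mine, for illustration): weights stored as 16-bit floats,
# and every weight is read once per inference step.
params = 175_000_000_000  # 175 billion, per the article
bytes_per_param = 2       # fp16

gigabytes_moved = params * bytes_per_param / 1e9
print(f"~{gigabytes_moved:.0f} GB of weights per pass")  # ~350 GB
```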

    That’s because, Henry said, they’re digital. Modern AI systems use a type of memory called static RAM, or SRAM, which requires constant power to store data. The circuit must remain powered even when it isn’t performing a task. Engineers have done a lot to improve SRAM’s efficiency, but there is a limit. “Tricks like lowering the supply voltage are running out,” said Henry.
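
    What an “analog matrix processor” does instead, at least in principle, is compute the multiply-accumulate step directly in physics: store each weight as a conductance, apply the inputs as voltages, and let Ohm’s law and Kirchhoff’s current law sum the products as currents. The simulation below is my own conceptual sketch of that idea, not Mythic’s actual design; the 1% noise figure is an arbitrary placeholder:

```python
import numpy as np

# Minimal simulation of an analog crossbar doing a matrix-vector multiply.
# Weights become conductances G, inputs become voltages V, and each output
# line's current is I = G @ V (Ohm's law per cell, Kirchhoff's law per column).

rng = np.random.default_rng(0)

G = rng.uniform(0.0, 1.0, size=(4, 8))  # conductances: the stored weights
V = rng.uniform(-1.0, 1.0, size=8)      # input voltages

I_ideal = G @ V  # what a digital multiply-accumulate would compute

# Analog parts are imprecise: model, say, 1% multiplicative noise per cell.
noise = 1.0 + rng.normal(0.0, 0.01, size=G.shape)
I_analog = (G * noise) @ V

print("ideal:  ", np.round(I_ideal, 3))
print("analog: ", np.round(I_analog, 3))
# The two agree to a couple of decimal places -- close enough for a neural
# network, with the multiply-accumulate done almost for free in power terms.
```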