Forget digital. The future of AI is… analog? At least, that’s the claim of Mythic, an AI chip company that says it is “making a performance leap forward” by going back in time. Sort of.
Before ENIAC, the world’s first room-sized programmable, general-purpose electronic digital computer, came to life in 1945, all computers were arguably analog, and had been for as long as computers existed.
Analog computers are a bit like stereo amplifiers, using a variable range to represent desired values. In an analog computer, numbers are represented by currents or voltages instead of the zeros and ones used in a digital computer. While ENIAC marked the beginning of the end for analog computers, analog machines endured in some form until the 1950s or 1960s, when digital transistors took hold.
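To make that distinction concrete, here is a minimal Python sketch, purely illustrative and tied to no real hardware, of the same number held both ways: as a continuous voltage and as a quantized bit code.

```python
# Purely illustrative (no real hardware modeled): the same value held as a
# continuous "voltage" versus a quantized n-bit digital code.

def to_digital(value: float, bits: int = 8) -> int:
    """Quantize a value in [0, 1] to an n-bit integer code."""
    return round(value * (2 ** bits - 1))

def from_digital(code: int, bits: int = 8) -> float:
    """Recover the approximate value from its integer code."""
    return code / (2 ** bits - 1)

signal = 0.7372
analog_voltage = signal            # analog: the value simply *is* the voltage
digital_code = to_digital(signal)  # digital: one of 256 discrete levels

print(f"analog:  {analog_voltage} V (continuous range)")
print(f"digital: {digital_code:08b} -> {from_digital(digital_code):.4f}")
```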
“Digital computing replaced analog computing,” Tim Vehling, Mythic’s senior vice president of product and business development, told Digital Trends. “It was cheaper, faster, more powerful, and so on. [As a result,] analog went away for a while.”
To paraphrase a famous quote often attributed to Mark Twain, reports of the death of analog computers may have been greatly exaggerated. If the digital transistor’s triumph was the beginning of the end for analog computers, it may only have been the end of the beginning.
Building the next great AI processor
Mythic is not deliberately building retro technology, however. This is no steampunk startup operating out of an old clock tower HQ filled with Tesla coils; it is a well-funded technology company, based in Redwood City, California, and Austin, Texas, whose Mythic Analog Matrix Processors (Mythic AMP) promise advances in power, performance, and cost thanks to a unique analog compute architecture that differs significantly from ordinary digital architectures.
Devices like the announced M1076, a single-chip analog computing device, are expected to usher in an era of compute-intensive processing at impressively low power consumption.
“There’s definitely a lot of interest in making the next great AI processor,” Vehling said. “There’s certainly a lot of investment and venture capital money flowing into this space. That’s not in question.”
The analog approach is not just a marketing gimmick, either. Mythic sees trouble ahead for Moore’s Law, Intel co-founder Gordon Moore’s famous 1965 observation that the number of transistors that can be squeezed onto an integrated circuit doubles roughly every 18 months. That observation has helped usher in a period of sustained, exponential improvement in computers over the past 60 years, and it has underpinned the amazing advances AI research has made over the same period.
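To get a feel for how quickly that observation compounds, here is a quick back-of-envelope calculation in Python using the 18-month doubling figure cited above; the starting transistor count is roughly the Intel 4004’s (1971), used purely for illustration.

```python
# Back-of-envelope Moore's Law: transistor counts doubling every 18 months.
# The starting count is roughly the Intel 4004's (1971), for illustration only.
count = 2_300
months_per_doubling = 18

for years in (10, 20, 50):
    doublings = years * 12 / months_per_doubling
    print(f"after {years} years: ~{count * 2 ** doublings:,.0f} transistors")
```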
But Moore’s Law faces challenges of a physical nature. Progress has slowed because components can only be miniaturized so far. Approaches such as optical and quantum computing offer one possible way out. Mythic’s analog approach, meanwhile, attempts to create compute-in-memory elements that work like tunable resistors, supplying inputs as voltages and collecting outputs as currents. The idea is that the company’s chips can handle the matrix multiplication needed to make artificial neural networks work in innovative new ways.
As the company explains, “We use analog data processing for our core neural network matrix operations, where we multiply an input vector by a weight matrix. Analog computing offers several key advantages. First, it is amazingly efficient; it eliminates memory movement for the neural network weights, since they are used in place as resistors. Second, it is high performance; when we do one of these vector operations, hundreds of thousands of multiply-accumulate operations happen in parallel.”
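The physics doing the work is essentially Ohm’s law plus Kirchhoff’s current law: store each weight as a conductance, apply each input as a voltage, and the currents summing on a shared output wire perform the multiply-accumulates in parallel. Here is a rough NumPy sketch of that idea; it is an idealized model, not Mythic’s actual circuit, and the noise figure is invented.

```python
import numpy as np

# Idealized model of analog compute-in-memory (not Mythic's actual circuit).
# Weights sit in place as conductances G (siemens); inputs arrive as voltages V.
# Ohm's law gives each cell's current I = G * V, and Kirchhoff's current law
# sums those currents on each output wire, so the whole matrix-vector product
# happens in one parallel step with no weight movement.

rng = np.random.default_rng(0)
G = rng.uniform(0.0, 1e-3, size=(4, 8))   # weight matrix stored as conductances
V = rng.uniform(0.0, 1.0, size=8)         # input vector applied as voltages

I_out = G @ V  # output currents: all 32 multiply-accumulates occur at once

# Real devices also contend with imprecision: weights are programmed with some
# error and outputs are read back through an ADC, so add illustrative noise.
noise = rng.normal(0.0, 1e-6, size=I_out.shape)
print("ideal output currents (A):", I_out)
print("with read noise:          ", I_out + noise)
```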
“There are many ways to approach the problem of AI compute,” Vehling said, citing the different approaches being explored by different hardware companies. “There’s no wrong way. But we fundamentally believe that the approach of throwing more and more transistors at it, making the process nodes smaller and smaller (essentially the Moore’s Law approach) is no longer practical. It’s already proving to be impractical. Whether you use analog computers or not, companies will have to find a different approach to build next-generation products that deliver high compute performance, low power consumption, [et cetera].”
The future of AI
Left unaddressed, this issue will have a major impact on the further development of AI, especially AI that runs locally on devices. Right now, some of the AI we rely on every day combines on-device processing with the cloud. Think of it as having an employee who can make decisions up to a certain level, but who must then call their boss to ask for advice.
Smart speakers, for example, work on this model: they perform tasks such as keyword spotting (“OK, Google”) locally, but outsource the actual voice queries to the cloud, allowing household devices to harness the power of supercomputers housed in enormous data centers thousands of kilometers away.
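In code, that division of labor might look something like the sketch below. Everything here is hypothetical: the wake-word check is a stub and the cloud endpoint is an invented URL, but it captures the split between a cheap, always-on local decision and a heavyweight query shipped off only when needed.

```python
import urllib.request

CLOUD_ENDPOINT = "https://speech.example.com/query"  # hypothetical service URL

def detect_wake_word(audio_chunk: bytes) -> bool:
    """Tiny always-on local check; a stub standing in for a real keyword model."""
    return b"OK_GOOGLE" in audio_chunk  # placeholder, not a real classifier

def handle_audio(audio_chunk: bytes) -> None:
    # Step 1: the "employee" decides locally whether the boss is needed.
    if not detect_wake_word(audio_chunk):
        return  # nothing to do: no network traffic, no cloud cost
    # Step 2: only now is the heavy lifting shipped to the data center.
    request = urllib.request.Request(
        CLOUD_ENDPOINT, data=audio_chunk, method="POST"
    )
    with urllib.request.urlopen(request) as response:
        print("cloud answer:", response.read())
```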
That’s all well and good, but some tasks need immediate answers. And as AI gets smarter, we will expect more of it. “We’re seeing a lot of what we call edge AI, which doesn’t rely on the cloud, when it comes to industrial applications, machine vision applications, drones, or video surveillance,” Vehling said. “[For example], you might want a camera that tries to identify someone and takes immediate action. There are plenty of applications that need immediate results.”
AI chips also need to keep up with other hardware breakthroughs. Cameras, for example, keep getting better. Image resolution has increased dramatically over the past few decades, which means that deep AI models for image recognition must be able to parse ever-growing amounts of pixel data to perform their analysis.
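The arithmetic behind that pressure is simple: a vision model’s workload scales with pixel count, so each jump in resolution multiplies the compute bill. A rough illustration follows; the per-pixel cost is an invented placeholder, and real networks downsample early, but the scaling pressure is the same.

```python
# How pixel counts balloon the work an image-recognition model faces.
# The multiply-accumulates-per-pixel figure is an invented placeholder.
resolutions = {"480p": (640, 480), "1080p": (1920, 1080), "4K": (3840, 2160)}
macs_per_pixel = 10_000  # hypothetical per-pixel cost of a deep vision model

for name, (width, height) in resolutions.items():
    pixels = width * height
    print(f"{name}: {pixels:>10,} pixels -> ~{pixels * macs_per_pixel:.2e} MACs")
```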
Add to this the growing expectations of what people believe should be extractable from an image, whether that’s mapping objects in real time, identifying multiple objects at once, or determining the three-dimensional context of a scene, and you realize the immense challenge AI systems now face.
Whether it’s delivering more computing power while keeping devices small, or meeting privacy requirements that demand local processing rather than offloading to the cloud, Mythic believes its compact chips have plenty to offer.
The rollout
“We’re [currently] in the early commercialization phase,” said Vehling. “We’ve announced a few products. So far, we have a number of customers evaluating [our technology] for use in their own products… Hopefully by the end of this year or early next year, we’ll see companies using our technology in their products.”
Initially, that is likely to mean enterprise and industrial applications such as video surveillance, high-end drones, and automation. Don’t expect consumer applications to lag too far behind, though.
“Beyond 2022, [and] going into 2023 and 2024, we’re going to start seeing consumer tech companies [adopt our technology] too,” he said.
If analog computing turns out to be the innovation driving the augmented and virtual reality needed for the metaverse to work… well, isn’t this the most perfect meeting point of steampunk and cyberpunk you could wish for?
Hopefully, Mythic’s chips will prove rather less mythical than the company’s chosen name suggests.