A startup, Mythic, has created analog chips that perform the linear algebra at the heart of AI using nothing more than the physics of voltage and conductance! This is exciting news because as Moore's law runs out of steam in the coming years, analog chips could become a scalable, efficient drop-in replacement for the digital chips we use today.
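The trick is that a grid of programmable conductances multiplies a matrix by a vector essentially for free: Ohm's law (I = G·V) gives each cell's current, and Kirchhoff's current law sums the currents on each shared output wire. Here's a minimal sketch of that idea; the conductance and voltage values are illustrative, not Mythic's actual chip parameters.

```python
def analog_matvec(G, V):
    """Each output wire collects current I_i = sum_j G[i][j] * V[j].
    Ohm's law gives each cell's current (I = G * V); Kirchhoff's
    current law sums those currents on the shared output wire."""
    return [sum(g * v for g, v in zip(row, V)) for row in G]

# Conductances (siemens) encode the weight matrix; voltages encode inputs.
G = [[0.5, 1.0],
     [2.0, 0.25]]
V = [3.0, 4.0]  # volts

print(analog_matvec(G, V))  # output currents in amperes: [5.5, 7.0]
```

In the physical chip this "computation" is instantaneous and nearly free in energy terms, because it is just current flowing through resistive elements rather than clocked digital arithmetic.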
⚡ The attached clip is a short excerpt from a recent Veritasium video explaining this new technology. It shows how these chips can perform 25 trillion math operations per second at only 3 watts, far more efficient than the NVIDIA Titan V GPU, which delivers 110 teraflops at 250 watts.
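A quick back-of-the-envelope check of those figures from the clip (operations per watt, in trillions of ops per second):

```python
# Figures from the clip: 25 trillion ops/s at 3 W for the analog chip,
# 110 teraflops at 250 W for the NVIDIA Titan V.
mythic_ops, mythic_watts = 25, 3
titan_ops, titan_watts = 110, 250

mythic_eff = mythic_ops / mythic_watts  # ~8.3 trillion ops per watt
titan_eff = titan_ops / titan_watts     # 0.44 trillion ops per watt

print(round(mythic_eff / titan_eff))  # roughly 19x the ops per watt
```

So by these headline numbers, the analog chip gets roughly 19 times more work done per watt than the GPU.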
Featured image: The Heathkit EC-1 Educational Analog Computer (1960), via Wikimedia Commons
🤖 Efficiency matters since a massive carbon footprint is one of the most significant vulnerabilities and negative externalities for both artificial intelligence and blockchain. And it's not just state-of-the-art systems! Compare the wattage of any computer to what an adult human brain consumes: 12-20 watts.
🧠 Indeed, the brain is efficient because biological neural networks orchestrate analog and digital signaling seamlessly. Moreover, I'm convinced that biological systems and processes inspire the best human engineering. Biomimicry makes for the most robust materials, the most efficient designs, superior structural integrity, and much more. So why second-guess 3.7 billion years of evolution?!
🔮 As promising as analog computing seems, it's hard to tell what will happen in the next decade. Neuromorphic computing goes beyond changing the hardware's signaling, also attempting to mimic biology through different materials and architectures. And then there's quantum computing, which might just blow everything out of the water but may never be consumer-facing.