Computers are going to increase in size as they consume more and more power, requiring increasingly elaborate cooling systems.
Just as the transistor made big, hot vacuum tubes obsolete, maybe we’ll see some analog breakthrough do the same thing to transistors, at least for AI.
I doubt there is a world where we use analog for general-purpose computing, but it seems perfect for messy, probabilistic processes like thinking.
I think the problem here would be figuring out how much of the brain's power draw to attribute to the multiplication. A brain is more akin to a motherboard than a single CPU, with all kinds of I/O, internal regulation, and other ancillary stuff going on all the time.
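A rough sketch of why the attribution matters, using commonly cited ballpark figures (whole-brain draw of roughly 20 W, on the order of 1e14 synaptic events per second); the compute_fraction is a pure assumption for illustration, not a measurement:

    # Back-of-envelope: the attribution problem in one calculation.
    # All figures are rough literature estimates, not measurements.

    BRAIN_POWER_W = 20.0       # whole-brain draw, commonly cited ~20 W
    SYNAPTIC_OPS_PER_S = 1e14  # often-quoted order of magnitude

    # Naive attribution: charge every watt to "multiplication".
    naive = BRAIN_POWER_W / SYNAPTIC_OPS_PER_S
    print(f"naive: {naive:.1e} J/op")        # ~2e-13 J/op

    # But much of the draw is housekeeping: ion pumps, glia, I/O,
    # internal regulation. If only, say, 30% is attributable to
    # synaptic signalling, the per-op figure shifts accordingly.
    compute_fraction = 0.3     # pure assumption, illustrative only
    attributed = BRAIN_POWER_W * compute_fraction / SYNAPTIC_OPS_PER_S
    print(f"attributed: {attributed:.1e} J/op")  # ~6e-14 J/op

Depending on where you draw that line, the wetware number moves by an order of magnitude, which is exactly the problem with the comparison.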
We can surely build more efficient and capable hardware than our current evolved wetware, since all of the details of how to build it are generally externalized. If the chips had to fab themselves, it would be a different story.
The software is another matter. Sure, the brain does all sorts of things that aren't necessary for $TASK, but we aren't necessarily going to be able to correctly identify which is which. Is your inner experience of your arm motion needed to fully parse the meaning in "raise a glass to toast the bride and groom", or to respond meaningfully to someone who says that? Or perhaps it doesn't really matter: language is already a decent tool for bridging disjoint creature realities; maybe it'll stretch to synthetic consciousness too.
All of computation is realised by a small set of arithmetic operations. Measure the energy efficiency of wetware and hardware on those primitive operations; any remaining difference can then be attributed to algorithms.
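For the hardware side that measurement is straightforward in principle. A minimal sketch, assuming you can read average device power externally (wall meter, RAPL, nvidia-smi); DEVICE_POWER_W here is just a placeholder for that reading:

    # Time a large matrix multiply, count the multiply-accumulates it
    # implies, and convert to joules per op using a measured power draw.
    import time
    import numpy as np

    DEVICE_POWER_W = 65.0  # assumed average package power during the run

    n = 2048
    a = np.random.rand(n, n).astype(np.float32)
    b = np.random.rand(n, n).astype(np.float32)

    a @ b                  # warm-up, excluded from timing
    reps = 10
    start = time.perf_counter()
    for _ in range(reps):
        a @ b
    elapsed = time.perf_counter() - start

    macs = reps * n ** 3   # an nxn by nxn matmul does n^3 multiply-accumulates
    joules_per_mac = DEVICE_POWER_W * elapsed / macs
    print(f"{macs / elapsed:.2e} MAC/s, ~{joules_per_mac:.1e} J per MAC")

The hard part is the wetware column of the table: as the comment above notes, there's no clean way to run "just the multiplies" on a brain and meter them in isolation.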