Hacker News

Is there a comparison of the power efficiency of a human brain doing a 50-digit multiplication vs. a multiplier circuit doing it?



I think the problem here would be figuring out how much of the brain's power draw to attribute to the multiplication. A brain is more akin to a motherboard than a single CPU, with all kinds of I/O, internal regulation, and other ancillary stuff going on all the time.
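Even granting that attribution problem, a crude back-of-envelope estimate suggests the gap is enormous either way. The numbers below are all rough assumptions, not measurements: ~20 W for whole-brain power draw, ~10 minutes of pencil-and-paper work, and ~10 pJ per 64-bit hardware multiply in modern CMOS.

```python
# Back-of-envelope sketch; every constant here is an assumption, not a measurement.
BRAIN_POWER_W = 20.0               # assumed whole-brain power draw
HUMAN_TIME_S = 10 * 60             # assumed ~10 minutes of pencil-and-paper work

CIRCUIT_ENERGY_PER_MUL_J = 10e-12  # assumed ~10 pJ per 64-bit multiply in modern CMOS
WORDS = 3                          # 50 decimal digits ≈ 167 bits ≈ three 64-bit words
N_MULS = WORDS ** 2                # schoolbook multiplication: 9 partial products

human_energy_j = BRAIN_POWER_W * HUMAN_TIME_S
circuit_energy_j = N_MULS * CIRCUIT_ENERGY_PER_MUL_J

print(f"human:   {human_energy_j:.0f} J")
print(f"circuit: {circuit_energy_j:.1e} J")
print(f"ratio:   {human_energy_j / circuit_energy_j:.1e}")
```

Under these assumptions the circuit wins by something like fourteen orders of magnitude, so the conclusion is robust even if the attribution question (how much of the 20 W to charge to the multiplication) shifts the human figure by a factor of 100.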


Is the issue, then, that we haven't discovered the magical algorithm our brain runs? If we discover it, digital circuits will beat the brain handsomely.


We can surely build more efficient and capable hardware than our current evolved wetware, since all of the details of how to build it are generally externalized. If the chips had to fab themselves, it would be a different story.

The software is a different story. Sure, the brain does all sorts of things that aren't necessary for $TASK, but we aren't necessarily going to be able to correctly identify which are which. Is your inner experience of your arm motion needed to fully parse the meaning in "raise a glass to toast the bride and groom", or respond meaningfully to someone who says that? Or perhaps it doesn't really matter - language is already a decent tool for bridging disjoint creature realities, maybe it'll stretch to synthetic consciousness too.


All of computation reduces to a small set of arithmetic operations. So test the energy efficiency of wetware and hardware on just those operations; any remaining difference can then be attributed to algorithms.



