
Thank you for this - the name neural networks has made a whole generation of people forget that they have an endocrine system.

We know things like sleep, hunger, fear, and stress all impact how we think, yet people still want to build this mental model that synapses are just dot products that either reach an activation threshold or don't.
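For reference, the caricature being criticized here is easy to state in code: a minimal sketch of an artificial "neuron" as a dot product plus a hard threshold (the function name and toy numbers are illustrative, not from the thread):

```python
import numpy as np

def neuron(inputs, weights, bias, threshold=0.0):
    """The mental model under discussion: a 'synapse' as a dot
    product, firing only if the weighted sum clears a threshold."""
    activation = np.dot(inputs, weights) + bias
    return 1 if activation > threshold else 0

# 1.0*0.6 + 0.5*0.4 - 0.5 = 0.3 > 0, so the unit fires
neuron(np.array([1.0, 0.5]), np.array([0.6, 0.4]), bias=-0.5)  # -> 1
```

Real biological neurons, of course, sit in a chemical bath of hormones and neuromodulators that this picture leaves out entirely, which is the commenter's point.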




Fortunately for academics looking for a fresh start in industry, this widespread misunderstanding has made it all too easy to transition from a slow-paced career in computational neuroscience to an overwhelmingly lucrative one in machine learning!


There have been people on HN arguing that the human brain is a biological LLM, because they can't think of any other way it could work, as if we evolved to generate the next token rather than for fitness as organisms in the real world, where things like eating, sleeping, shelter, avoiding danger, social bonds, reproduction, and child rearing matter. Things that require a body.


It's also frustrating because LLMs aren't even the only kind of AI/ML out there, they're just the kind currently getting investment and headlines.


I'm one of those people. To me those things just sounded like a different prompt: priorities set for the LLM.


Isn't that taking the analogy too literally? You're saying nature is prompting humans to generate the next token to be output? What about all the other organisms that don't have language? How do you distinguish nature's prompts from nature's training datasets? What makes you think nature is tokenized? What makes you think language generation is fundamental to biology?


Here's the hubris of thinking that way:

I would imagine the baseline assumption of your thinking is that things like sleep and emotions are a 'bug' in terms of cognition (or at the very least, 'prompts' that are optional).

Said differently, the assumption is that with the right engineering, you could reach human-parity cognition with a model that doesn't sleep or feel emotions (after all, what's the point of an LLM that gets tired and sometimes doesn't want to answer your questions? Or worse, one that knowingly deceives you because it's mad at you or prejudiced against you?).

The problem with that assumption is that as far as we can tell, every being with even the slightest amount of cognition sleeps in some form and has something akin to emotional states. As far as we can prove, sleep and emotions are necessary preconditions to cognition.

A worldview where the 'good' parts of the brain (reasoning and logic) are replicated in LLM but the 'bad' parts (sleep, hunger, emotions, etc.) are not is likely an incomplete model.


Do airplanes need sleep because they fly like birds who also require sleep?


Ah a very fun 'snippy' question that just proves my point further. Thank you.

No, airplanes do not sleep. That's part of why their flying is fundamentally different from birds'.

You'll likely also notice that birds flap their wings while planes use jet engines and fixed wings.

My entire point is that it is foolish to imagine airplanes as mechanical birds, since they are in fact completely different and require their own mental models to understand.

This is analogous to LLMs. They do something completely different from what our brains do and require their own mental models in order to understand them completely.


I'm reluctant to ask, but how do ornithopters fit into a sleep paradigm?


Great follow up!

Ornithopters are designed by humans who sleep - the complex computers needed to make them work replicate things humans told them to do, right?

It is a very incomplete model of an ornithopter to not include the human.


Here, it's actually fun to respond to your comment in another way, so let's try this out:

Yes, sleep is in fact a prerequisite to planes flying. We have very strict laws about it actually. Most planes are only able to fly because a human (who does sleep) is piloting it.

The drones and other vehicles that can fly without pilots were still programmed by a person (who also needed sleep) FWIW.


They do need scheduled maintenance.


Birds flap their wings and maneuver differently. They don't fly the same way.


People will spout off about how machine learning is based on the brain while having no idea how the brain works.


It is based on the brain, but only in the loosest possible terms; ML is a cargo cult of biology. It's kind of surprising that it works at all.


It works because, well, it's actually pretty primitive at its core. The whole learning process is pretty brutal: millions of iterations with random (and semi-random) adjustments.


I think I've fallen into the "it's just a very fancy kind of lossy compression" camp.


Honestly once you understand maximum-likelihood estimation, empirical risk minimization, automatic differentiation, and stochastic gradient descent, it's not that much of a surprise it works.
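To make the claim concrete, here is a minimal sketch of that pipeline: empirical risk minimization (mean squared error) by stochastic gradient descent, with the gradient written out by hand rather than via automatic differentiation. The toy linear-regression problem, learning rate, and step count are illustrative assumptions, not anything from the thread:

```python
import numpy as np

# Toy dataset: y is a noisy linear function of X with known weights.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
true_w = np.array([2.0, -1.0])
y = X @ true_w + 0.01 * rng.normal(size=100)

# Empirical risk minimization via SGD: one random sample per step.
w = np.zeros(2)
lr = 0.05
for step in range(1000):
    i = rng.integers(len(X))             # "stochastic": pick one example
    grad = 2 * (X[i] @ w - y[i]) * X[i]  # gradient of the squared error
    w -= lr * grad                       # descend

# w ends up close to true_w: the "learning" is just repeated
# nudges downhill on an error surface.
```

Nothing in the loop is mysterious; scaling the same recipe to billions of parameters is, loosely, what the commenters above mean by "brutal".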



