How did researchers before that explain what the neurons do if they believed they did not have action potentials? Did they believe communication was done solely through chemical messaging?



Classical action potentials are just one mechanism of INTRAcellular communication - you could think of them as a special case of signaling via chemical concentration, where the chemical is cations and the propagation is faster and more directed than diffusion. INTERcellular signaling is only rarely mediated directly by voltage. Also, action potentials are most "useful" for propagating a signal rapidly over a long distance - they effectively accelerate and error-correct (i.e., reverse diffusive broadening) voltage signals down a linear path. Action potentials are so well known mostly because they show up in structures that are easy to observe (long motor neurons) and they're easy to quantify.
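
(If it helps to see the "error-corrects diffusive broadening" point concretely, here's a toy sketch in Python/numpy - a 1D chain of compartments where a voltage pulse either spreads passively or gets regenerated to full amplitude whenever it crosses a threshold. All numbers are made-up illustrative values, not physiology.)

    import numpy as np

    n, steps, d = 200, 400, 0.2        # compartments, time steps, diffusion rate (toy values)
    threshold, peak = 0.3, 1.0         # regeneration threshold and full amplitude

    def propagate(regenerate):
        v = np.zeros(n)
        v[5] = peak                    # initial depolarization at one end
        for _ in range(steps):
            # passive spread: discrete diffusion, which broadens and attenuates the pulse
            v = v + d * (np.roll(v, 1) + np.roll(v, -1) - 2 * v)
            if regenerate:
                # active step: any compartment above threshold is driven back to full
                # amplitude, restoring the waveform instead of letting it decay
                # (no repolarization/refractory period is modeled here)
                v[v > threshold] = peak
        return v

    print("passive peak:", propagate(False).max())   # ~0.03, smeared out
    print("active  peak:", propagate(True).max())    # 1.0, amplitude preserved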

Somewhat related: there is a roughly inverse correlation between neuron count and "computational power per neuron." The neurons of "older and simpler" critters are more likely to be less specialized and to use hundreds of different chemicals for transmitting intercellular signals, while the neurons of "newer and more advanced" critters are more likely to be specialized and to use just one chemical for transmitting intercellular signals.


Neural computing without action potentials is commonplace. Computational interactions among cells and neurons in the retina are almost all graded potentials that modulate transmitter release or conductances through gap junctions. Retinal ganglion cells, of course, do generate conventional spikes to pass a data summary to the midbrain, hypothalamus, and dorsal thalamus.

Action potentials are almost strictly INTRAcellular events (a minor exception being ephaptic effects) that are converted in a surprisingly noisy way into presynaptic transmitter release and variable postsynaptic changes in conductance.

Action potentials are a clever kludge necessitated by being big, having long axons, and needing to act quickly.
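
To make the graded (non-spiking) alternative concrete, here's a toy sketch in Python/numpy: presynaptic release modeled as a smooth, noisy function of membrane potential, with no threshold and no all-or-none event. The sigmoid shape and noise level are illustrative assumptions, not fitted to retinal data.

    import numpy as np

    rng = np.random.default_rng(0)

    def graded_release(v_mem, noise_sd=0.05):
        # Continuous transmitter release as a smooth function of membrane potential:
        # small voltage changes give small, graded changes in release. The Gaussian
        # noise stands in for the noisiness of the voltage-to-release conversion.
        release = 1.0 / (1.0 + np.exp(-(v_mem + 40.0) / 5.0))   # sigmoid centered at -40 mV
        return np.clip(release + rng.normal(0.0, noise_sd, np.shape(v_mem)), 0.0, 1.0)

    v = np.linspace(-60.0, -20.0, 9)          # a slow voltage ramp, no spikes anywhere
    print(np.round(graded_release(v), 2))     # release rises smoothly with depolarization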


You don't need spikes to have computation. Deep networks don't have spikes.
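
For instance, a plain dense network passes only real-valued activations between layers - a rough sketch in Python/numpy (random weights, purely illustrative):

    import numpy as np

    rng = np.random.default_rng(1)

    # Two layers; every unit carries a continuous activation, and what gets
    # "communicated" between layers is a real number, not a spike train.
    W1, b1 = rng.normal(size=(16, 4)), np.zeros(16)
    W2, b2 = rng.normal(size=(3, 16)), np.zeros(3)

    def forward(x):
        h = np.maximum(0.0, W1 @ x + b1)   # graded nonlinearity (ReLU), no all-or-none event
        return W2 @ h + b2

    print(forward(np.array([0.5, -1.0, 2.0, 0.1])))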



