
# synapses should be analogous to # model parameters no? And # model parameters should be linear in # transistors.
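Linear, roughly, if the weights live on-chip. A back-of-the-envelope sketch (my assumptions: 16-bit weights held in standard 6-transistor SRAM cells; DRAM or flash would change the constant, not the linearity):

    BITS_PER_PARAM = 16           # assumes fp16 weights
    TRANSISTORS_PER_SRAM_BIT = 6  # standard 6T SRAM cell

    transistors_per_param = BITS_PER_PARAM * TRANSISTORS_PER_SRAM_BIT  # 96

    def params_for(n_transistors: int) -> int:
        # linear in transistor count, with a constant of order 10^2
        return n_transistors // transistors_per_param

    print(params_for(10**12))  # ~10 billion parameters per trillion transistors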



We can't come close to claiming our current networks match synapses in performance or function, because architecturally we still use feedforward networks: no recurrence, no timing elements, very static connections. Transistors will definitely have advantages in being able to synchronize information and steps to a far better degree than biological neurons, but as long as we stick with transformers it's the equivalent of trying to get to space by stacking sand. Could you get there eventually? Yes, but there are better ways.
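To make "no recurrence, no timing elements" concrete, a toy sketch (NumPy, all names are mine): a feedforward layer is a pure function of its current input, a recurrent step carries state across time, and a leaky integrate-and-fire neuron carries information in when it spikes:

    import numpy as np

    def feedforward(x, W):
        # static: output depends only on the current input
        return np.tanh(W @ x)

    def recurrent_step(x, h, W, U):
        # stateful: output depends on the input AND the network's own
        # history -- the recurrence a stack of transformer layers lacks
        return np.tanh(W @ x + U @ h)

    def lif_step(v, i_in, leak=0.9, threshold=1.0):
        # toy leaky integrate-and-fire neuron: the membrane potential v
        # decays over time, and information lives in WHEN the spike
        # fires, not just in a stored weight (the "timing element")
        v = leak * v + i_in
        spike = v >= threshold
        return (0.0 if spike else v), spike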


>> # synapses should be analogous to # model parameters no?

I think they're equivalent to a parameter AND the multiplier: the synapse both stores the weight and performs the multiply. Or in analog terms it'd just be a resistor whose value can be changed. Digital hardware is not a good fit for this.
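A minimal sketch of that distinction (plain Python, illustrative names): digitally, the weight sits in memory and gets shuttled to a separate multiplier unit; a synapse fuses storage and multiply into one element:

    import numpy as np

    # Digital view: weights live in memory, a separate multiplier
    # computes each product, and an adder accumulates them.
    def digital_neuron(inputs: np.ndarray, weights: np.ndarray) -> float:
        acc = 0.0
        for x, w in zip(inputs, weights):
            acc += x * w  # fetch w from memory, multiply, accumulate
        return acc

    # Synapse view: each synapse IS the stored weight and the multiply,
    # so the neuron just sums whatever its synapses pass along.
    class Synapse:
        def __init__(self, weight: float):
            self.weight = weight       # the stored parameter
        def transmit(self, x: float) -> float:
            return self.weight * x     # storage and multiply in one place

    def synaptic_neuron(inputs, synapses):
        return sum(s.transmit(x) for x, s in zip(inputs, synapses))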


> Or in analog terms they'd just be a resistor whose value can be changed

For what it's worth, that's actually a thing (ReRAM/memristors), but I think it got put on the back burner because it requires novel materials and nobody figured out how to cost-effectively scale up the fabrication versus scaling up flash memory. I saw some mention recently that advances in perovskite materials (a big deal lately due to potential solar applications) might revive the concept.
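For anyone curious why those devices map so neatly onto neural nets: arrange them in a crossbar, store each weight as a conductance, drive the rows with input voltages, and Ohm's law plus Kirchhoff's current law perform the matrix-vector multiply in one step. An idealized NumPy sketch (no device noise or nonlinearity, values made up):

    import numpy as np

    # Idealized memristor crossbar: weights stored as conductances G
    # (siemens), inputs applied as row voltages V (volts). Each cell
    # passes current I = G_ij * V_i (Ohm's law); each column wire sums
    # its cells' currents (Kirchhoff's current law), so the vector of
    # column currents is G^T @ V -- a matrix-vector multiply done by
    # the physics itself.
    G = np.abs(np.random.randn(4, 3)) * 1e-6  # 4x3 crossbar, ~microsiemens
    V = np.array([0.2, 0.5, 0.1, 0.3])        # voltages on the 4 rows

    I_columns = G.T @ V                       # currents read on the 3 columns
    print(I_columns)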



