
> NNs are currently at a similar point as where physics was before Newton and before calculus.

I'm more inclined to compare with the era after Newton and Leibniz, but prior to the development of rigorous analysis. If you look at this time period, the analogy fits a bit better IMO -- you have a proliferation of people using calculus techniques to great advantage for solving practical problems, but no real foundations propping the whole thing up (e.g., no definition of a limit, continuity, notions of how to deal with infinite series, etc.).
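For concreteness, the missing rigor that eventually arrived is the sort of thing captured by the epsilon-delta definition of a limit (Cauchy/Weierstrass, roughly a century and a half after Newton and Leibniz). A standard statement, just as an illustration of what "foundations" means here:

    \lim_{x \to a} f(x) = L
    \iff
    \forall \varepsilon > 0 \;\exists \delta > 0 :\;
    0 < |x - a| < \delta \implies |f(x) - L| < \varepsilon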




Maybe. On the other hand, maybe a rigorous mathematical analysis of NNs is about as useful as a rigorous mathematical analysis of computer architectures - not very useful. Maybe all you need is to keep scaling things up while adding clever optimizations along the way (none of the great CPU ideas - caches, pipelining, out-of-order execution, branch prediction, etc. - came from rigorous mathematical analysis).

Or maybe it's as useful as a rigorous mathematical analysis of a brain - again, not very useful, because for us (people who develop AI systems) it would be far more valuable to understand a brain at the circuit level, or the architecture level, than at the level of mathematical theory. The latter would be interesting, but probably too complex to be useful, while the former would most likely lead to dramatic breakthroughs in the performance and capabilities of AI systems.

So maybe we just need to keep doing what we have been doing in the DL field for the last 10 years - trying and revisiting various ideas, scaling them up, and evolving the architectures the same way we've been evolving our computers for the last 100 years, in the hope that there will be more clues from neuroscience along the way. I think we just need more ideas like transformers, capsules, or neural Turing machines, plus computers that keep getting ~20% faster every year.



