
I was talking about the ability to use thin layers on top of broad learning. If neural networks are a realistic analogy at all, I think it fits. The roots of all this ML stuff are not straight from linear algebra, even though much of the math is.



You're playing it extremely fast and loose with concepts like "low-level prerequisite knowledge" and what it means for something to "rel[y] on that knowledge", though. These aren't physical quantities like temperature, where we -- as a species -- have the massive amount of low-level prerequisite knowledge that allows us to make rapid high-level judgments relying on that knowledge. The previous sentence is an example of how easy this reasoning is to abuse.



