
With kNN, if you have enough data and a good definition of "distance", then yes, it is in a way similar to "deep learning" -- just "deep" in the sense that it's data- and human-labor-intensive.
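To make that concrete, here's a minimal toy sketch of kNN (my own code, not from any particular library) -- all of the modeling effort goes into gathering labeled data and supplying a meaningful distance function:

    import numpy as np

    def knn_predict(query, X_train, y_train, distance, k=1):
        # Predict by majority vote over the k training points closest to `query`.
        dists = np.array([distance(query, x) for x in X_train])
        nearest = np.argsort(dists)[:k]
        labels, counts = np.unique(np.asarray(y_train)[nearest], return_counts=True)
        return labels[np.argmax(counts)]

    def euclidean(a, b):
        # A naive distance; in practice the "good definition of distance" is the hard part.
        return np.linalg.norm(a - b)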

I'm oversimplifying here, but what linear regression and logistic regression did relative to kNN is automate the "distance" function -- you still have to manually construct the features. What DL did is one step further: don't even bother with feature engineering, the network can construct the features itself.
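A rough illustration of that middle stage (the features and data below are made up, and scikit-learn is just one convenient way to fit the weights): a human still decides which features exist, and the model only learns how to weigh them.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    def hand_engineered_features(text):
        # Manual feature engineering: a human guesses what might matter.
        return [len(text), text.count("!"), int("free" in text.lower())]

    texts = ["FREE prize, click now!!!", "Meeting moved to 3pm",
             "You won a FREE cruise!", "Lunch tomorrow?"]
    labels = [1, 0, 1, 0]  # toy spam / not-spam labels

    X = np.array([hand_engineered_features(t) for t in texts])
    clf = LogisticRegression().fit(X, labels)  # the weights are learned, the features are not
    print(clf.predict([hand_engineered_features("Claim your FREE gift!!")]))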

You see, there isn't a function that kNN can't approximate. If you have a sufficiently rich feature list and a training datum whose every feature is exactly the same as your input, there is no reason not to directly use that datum's output. It's the feasibility that matters.
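In that degenerate case (again a toy sketch): an exact feature match has distance zero, so 1-NN reduces to looking up that training datum's output.

    import numpy as np

    X_train = np.array([[0.2, 1.0], [0.9, 0.1]])
    y_train = np.array(["cat", "dog"])
    query = np.array([0.9, 0.1])  # identical to the second training point
    nearest = np.argmin(np.linalg.norm(X_train - query, axis=1))
    print(y_train[nearest])  # -> "dog"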

Of course, DL is a huge step forward. It has made "impossible things possible, and hard things easy". The author also acknowledged DL's importance. However, that doesn't mean we should stop at DL.



