Hacker News

I highly recommend Andrew Ng's lectures as well. I was going through them myself, and he does a good job of explaining the intuition behind many things (the math, although important, is only a formalization of that intuition).

One piece of advice I would give from my own experience is that you have to play with data to get practical experience applying the methods. Machine learning is not a set of plug-and-play black boxes that you can feed random input into and get clean output. You have to spend a lot of effort understanding your data and how they relate to the specific method you're using. For example, if you use linear regression as your learning model, you have to understand what kind of relationship is assumed between the inputs and outputs (in this case, that the output is a linear combination of the inputs).
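To make the linear-regression point concrete, here's a small sketch (with made-up data; the quadratic relationship and noise level are just illustrative assumptions): when the true relationship is nonlinear in the raw input, a plain linear fit underperforms, but transforming the input so the linearity assumption actually holds fixes it.

```python
import numpy as np

# Hypothetical data: the true relationship is y = 2*x^2 + noise, so
# "y is a linear combination of the raw inputs" is a bad assumption.
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 200)
y = 2 * x**2 + rng.normal(0, 5, size=x.size)

def r_squared(y, y_hat):
    ss_res = np.sum((y - y_hat) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return 1 - ss_res / ss_tot

# Fit y ~ a*x + b: linear in the raw input, which violates the assumption.
A_lin = np.column_stack([x, np.ones_like(x)])
coef_lin, *_ = np.linalg.lstsq(A_lin, y, rcond=None)
r2_lin = r_squared(y, A_lin @ coef_lin)

# Transform the input so the assumed relationship holds: y ~ a*(x^2) + b.
# This is still linear regression -- linear in the transformed feature.
A_sq = np.column_stack([x**2, np.ones_like(x)])
coef_sq, *_ = np.linalg.lstsq(A_sq, y, rcond=None)
r2_sq = r_squared(y, A_sq @ coef_sq)

print(f"R^2 with raw x:       {r2_lin:.3f}")
print(f"R^2 with x^2 feature: {r2_sq:.3f}")
```

The point is that "understanding your data" here means knowing enough about the input/output relationship to choose features under which the model's assumptions are approximately true.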

I know this because when I started, I would just toss unclean, unfiltered, and untransformed data into a method and hope for good results. Of course I fed garbage in, so I got garbage out.

Another word of advice is to watch out for overfitting. Often, you'll find that your training gives you good in-sample statistics (for example, with linear regression you'll get a great R^2 and low p-values on the coefficients). However, when you test out of sample, you'll quickly realize that most of the models you've fit are overfit to the data you trained on. Just something to be aware of.
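A quick illustration of that in-sample vs. out-of-sample gap (the sine signal, sample sizes, and polynomial degree are all made-up choices for the sketch): a high-degree polynomial fit to a handful of noisy points scores a great R^2 on the points it was trained on, and a much worse one on fresh data from the same process.

```python
import numpy as np

rng = np.random.default_rng(1)

def make_data(n):
    # True signal is sin(x) plus noise; any model with far more
    # parameters than the data supports will chase the noise.
    x = rng.uniform(-3, 3, n)
    y = np.sin(x) + rng.normal(0, 0.3, n)
    return x, y

def r_squared(y, y_hat):
    ss_res = np.sum((y - y_hat) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return 1 - ss_res / ss_tot

x_train, y_train = make_data(15)    # small training set
x_test, y_test = make_data(200)     # fresh data from the same process

# Degree-9 polynomial: 10 parameters for 15 points -- a classic overfit.
coef = np.polyfit(x_train, y_train, deg=9)
r2_train = r_squared(y_train, np.polyval(coef, x_train))
r2_test = r_squared(y_test, np.polyval(coef, x_test))

print(f"in-sample R^2:     {r2_train:.3f}")
print(f"out-of-sample R^2: {r2_test:.3f}")
```

Holding out a test set (or cross-validating) is the standard way to catch this before the model meets real data.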

I guess both of these may be very abstract and useless for you right now, but hopefully one day you'll look back and be able to find use for them.




Yep, they are indeed abstract concepts to me right now, but it's great to have them mentioned, because they'll definitely come in handy later. Thanks!



