Hacker News

Introduction to Statistical Learning

https://faculty.marshall.usc.edu/gareth-james/ISL/

Elements of Statistical Learning

https://web.stanford.edu/~hastie/ElemStatLearn/

Machine Learning: A Probabilistic Perspective

https://mitpress.mit.edu/books/machine-learning-1




"Machine Learning: A Probabilistic Perspective" is more of an encyclopedia of algorithms, I would say, and it has lots of typos. I personally would not recommend it, except for the sheer number of algorithms it covers, many of which are not usually found in other books.


Thanks for the early warning. I'll have to keep that in mind.


Are those really the best starts for "Bayesian statistics"?

The first two, especially, are standard "intro to ML" textbooks with a frequentist focus. ISL may even have zero Bayesian content (Naive Bayes is not "Bayesian"), while ESL has maybe 10% Bayesian content, if that.

Instead, I would suggest the following for learning Bayesian methods, especially given the HN crowd: https://github.com/CamDavidsonPilon/Probabilistic-Programmin...
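For a taste of what "Bayesian methods" means in practice, here is a minimal sketch of Bayesian updating in plain Python (not the PyMC3 API that series uses): a conjugate Beta-Bernoulli model, where the prior and posterior are both Beta distributions and updating reduces to counting.

```python
# Bayesian updating with a conjugate prior: a Beta(a, b) prior on the
# bias of a coin, updated with observed flips (1 = heads, 0 = tails).
# Because the Beta prior is conjugate to the Bernoulli likelihood, the
# posterior is again a Beta, with parameters updated by simple counts.

def update_beta(a, b, flips):
    """Return posterior Beta(a', b') parameters after observing `flips`."""
    heads = sum(flips)
    tails = len(flips) - heads
    return a + heads, b + tails

# Start from a uniform prior, Beta(1, 1), and observe 7 heads in 10 flips.
a, b = update_beta(1, 1, [1, 1, 1, 0, 1, 1, 0, 1, 0, 1])
posterior_mean = a / (a + b)
print(a, b, posterior_mean)  # 8 4 0.666...
```

The posterior mean (2/3) sits between the prior mean (1/2) and the observed frequency (7/10), pulled toward the data as evidence accumulates; that prior-to-posterior flow is the core idea the linked series builds on, with PPLs handling the non-conjugate cases where no closed form exists.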


You make a good point. It's been a while since I flipped through them; they just come up in a lot of discussions on this topic. I agree that the series you link to is really great for PPLs and Bayesian methods. One caveat: the library it's based on (PyMC3) is built on top of Theano, which has been abandoned and deprecated. PyMC4 is around the corner and uses TensorFlow Probability; early, informal reports say it's 10x faster.


Thanks a ton for these. Added them to my list of things I know that I don't know. ;)



