Hacker News

Well, I'm pretty mathematically inclined, but ignorant of Bayesian statistics, and I still didn't really get it. For instance, the first full-length paragraph:

The full Bayesian probability model includes the unobserved parameters. The marginal distribution over parameters is known as the “prior” parameter distribution, as it may be computed without reference to observable data. The conditional distribution over parameters given observed data is known as the “posterior” parameter distribution.

uses too much jargon; I'm sure I'd understand it if he'd defined "marginal distribution" and "conditional distribution" and clarified exactly what the difference between observable and unobservable data and/or parameters is. The hypothetical audience for this seems to be people who are intimately familiar with statistical terminology but know absolutely nothing about Bayesian statistics.




I think those concepts are best understood by example.
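For concreteness, here is one such example (my own choice, not from the article): a beta-binomial coin-flip model, where the unobserved parameter is the coin's bias. The prior is the distribution over the bias before any flips are observed; the posterior is the conditional distribution over the bias given the observed flips.

```python
# Beta-binomial coin-flip sketch (an illustrative example, not the
# article's own). The unobserved parameter is the coin's bias p;
# the observed data are the counts of heads and tails.

def posterior_mean(prior_a, prior_b, heads, tails):
    """Mean of the Beta posterior over the coin's bias p.

    The Beta(prior_a, prior_b) prior is stated without reference to
    any data; conditioning on observed flips just adds the counts,
    giving a Beta(prior_a + heads, prior_b + tails) posterior.
    """
    a = prior_a + heads
    b = prior_b + tails
    return a / (a + b)

prior_mean = 2 / (2 + 2)           # Beta(2, 2) prior mean: 0.5, no data used
post = posterior_mean(2, 2, 8, 2)  # posterior mean after 8 heads, 2 tails
print(prior_mean, post)            # posterior shifts toward the observed data
```

The prior mean is 0.5, while after seeing 8 heads in 10 flips the posterior mean moves to 10/14 ≈ 0.714, which is the whole idea: the "prior" ignores the data, the "posterior" conditions on it.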



