
Most of those things you'll encounter if you read up on Hamiltonian MC and information field theory, though it might take some additional reading to get all the required background knowledge.

The first two are really just tricks. For the weighted estimates, you just go from:

    log(P) = \sum_i log(P_i)
to

    log(P) = \sum_i w_i log(P_i)
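As a minimal sketch of that one-line change in numpy (the unit-variance Gaussian per-point likelihood, and the names log_likelihood, x, m, w, are all illustrative assumptions, not anything from the original):

    import numpy as np

    def log_likelihood(x, m, w=None):
        # Per-point Gaussian log-densities log(P_i); unit variance is an
        # illustrative assumption.
        log_p_i = -0.5 * (x - m) ** 2 - 0.5 * np.log(2 * np.pi)
        if w is None:
            return np.sum(log_p_i)      # log(P) = \sum_i log(P_i)
        return np.sum(w * log_p_i)      # log(P) = \sum_i w_i log(P_i)

Setting w to all ones recovers the plain unweighted sum.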
For the sufficient statistics you get derivations like:

    log(P) = -\sum_i (1/2) (x_i - m)^2 + const
           = -\sum_i ((1/2) x_i^2 - x_i m + (1/2) m^2) + const
           = -(\sum_i (1/2) x_i^2) + (\sum_i x_i) m - (N/2) m^2 + const
to show that (\sum_i (1/2) x_i^2), (\sum_i x_i), and the count N fully determine the posterior distribution.
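In code, the same idea means a single pass over the data is enough; a sketch under the same unit-variance Gaussian assumption (function names here are hypothetical):

    import numpy as np

    def sufficient_stats(x):
        # One pass over the data; these three numbers are all the
        # posterior over m depends on.
        return np.sum(x ** 2), np.sum(x), len(x)

    def log_likelihood_from_stats(stats, m):
        # log(P) up to the dropped additive constant, rebuilt from the
        # sufficient statistics instead of the raw x_i.
        sum_x2, sum_x, n = stats
        return -0.5 * sum_x2 + sum_x * m - 0.5 * n * m ** 2

Evaluating log_likelihood_from_stats(stats, m) for any m then matches the full sum over the raw data, up to that constant.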


