Hacker News

Would you mind saying some more about these topics? Or provide some references? I'd like to learn a bit more on all of these!



Most of those things are stuff you'll encounter if you read up on Hamiltonian MC and information field theory, but it might take some additional reading to get all the required background knowledge.

The first two are just tricks really. For the weighted estimates you just go from:

    log(P) = \sum log(P_i)
to

    log(P) = \sum w_i log(P_i)
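As a small sketch of that weighting trick (the data, weights, and Gaussian log-density here are my own illustrative choices, not from the comment above):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=1.0, size=5)   # hypothetical data points
w = np.array([1.0, 1.0, 0.5, 2.0, 1.0])      # hypothetical per-point weights w_i

def log_p(x, m, s=1.0):
    # per-point Gaussian log-density log(P_i), up to an additive constant
    return -0.5 * ((x - m) / s) ** 2

# unweighted: log(P) = \sum_i log(P_i)
unweighted = np.sum(log_p(x, m=2.0))

# weighted:   log(P) = \sum_i w_i log(P_i)
weighted = np.sum(w * log_p(x, m=2.0))
```

Each point's contribution to the log-posterior just gets scaled by its weight; everything downstream (gradients for HMC, etc.) is otherwise unchanged.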
and for the sufficient statistics you get derivations like:

    log(P) = \sum_i (1/2) (x_i - m)^2
           = \sum_i (1/2) x_i^2 - x_i m + (1/2) m^2
           = (\sum_i (1/2) x_i^2) - (\sum_i x_i) m + (n/2) m^2
to show that (\sum_i (1/2) x_i^2) and (\sum_i x_i) fully determine the posterior distribution.
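A quick numerical check of that derivation (the data here is made up; `n` is the number of points, as in the last line above):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=1000)  # hypothetical data
n = len(x)

# the two sufficient statistics
s1 = np.sum(x)              # \sum_i x_i
s2 = 0.5 * np.sum(x ** 2)   # \sum_i (1/2) x_i^2

def log_p_full(m):
    # direct evaluation over the raw data
    return np.sum(0.5 * (x - m) ** 2)

def log_p_suff(m):
    # same quantity from (s1, s2, n) alone -- the raw data never appears
    return s2 - s1 * m + 0.5 * n * m ** 2

assert np.allclose(log_p_full(1.3), log_p_suff(1.3))
```

Once `s1`, `s2`, and `n` are computed in one pass, the posterior can be evaluated for any `m` in O(1), which is the whole point of the trick.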


I wonder how much he knows; interpreting it as a thermodynamic ensemble is really just going back to thinking about probabilities. Information field theory is something very different. The cheesy explanation for it is that it is just quantum field theory in imaginary time.


>it is just quantum field theory in imaginary time.

Quantum field theory is just thermodynamics in imaginary time, not sure why you object to calling it a thermodynamic ensemble. Besides, I could hardly introduce a subject as 'quantum field theory in imaginary time' now could I?



