Hacker News

I don't really understand this "Entropy Is All About Arrangements" takeaway. A fair coin has higher entropy than a biased one. What are the "arrangements" in this case?



The coins don't have entropy; the sequences they produce do (in the information-theory sense; this is not about physics).

For a long sequence (say 1000 flips), the former will produce close to 500 heads and 500 tails. The latter, assuming heads is three times as probable as tails, will produce around 750 heads and 250 tails. There are many more different sequences of the first kind.
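A quick sketch of that count, assuming the 1000-flip sequences and the 3:1 bias from the example above. The number of length-n sequences with exactly k heads is the binomial coefficient C(n, k), so we can compare the two cases directly:

```python
import math

n = 1000  # sequence length (from the example above)

# Number of distinct length-n sequences with exactly k heads is C(n, k).
fair_typical = math.comb(n, n // 2)        # ~500 heads, fair coin
biased_typical = math.comb(n, 3 * n // 4)  # ~750 heads, 3:1 biased coin

# Compare on a log scale, since the raw counts are astronomically large.
print(math.log2(fair_typical))    # close to 1000 bits (n * H(0.5))
print(math.log2(biased_typical))  # roughly 806 bits, noticeably fewer
```

The log of the count is within a few bits of n times the per-flip entropy, which is exactly the bridge between "number of arrangements" and the information-theoretic formula.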


I'm referring to the entropy of the Bernoulli distribution. If the coin is fair, the entropy is 1 bit... if the coin isn't fair, then the entropy of the distribution is less than 1 bit. I'm having trouble reconciling the information theory way of thinking about entropy as a function of a distribution, with how physicists tend to think of entropy of arrangements.


There is a relationship between the distribution of x_i and the sequences x_1, x_2, ..., x_n generated from that distribution. If the coin isn't fair, there are fewer arrangements possible. If the coin has two heads, there is only one possible sequence.
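That relationship can be made precise: (1/n) log2 C(n, pn) converges to the Bernoulli entropy H(p) as n grows, so the "entropy of the distribution" is the per-symbol log-count of typical arrangements. A minimal sketch, assuming the 3:1 bias (p = 0.75) discussed above:

```python
import math

def H(p):
    """Shannon entropy of a Bernoulli(p) variable, in bits."""
    if p in (0.0, 1.0):
        return 0.0  # a two-headed coin: one possible sequence, zero entropy
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

p = 0.75  # assumed bias: heads three times as likely as tails
for n in (100, 1000, 10000):
    k = round(p * n)
    # Per-flip log-count of sequences with the typical number of heads.
    print(n, math.log2(math.comb(n, k)) / n)  # approaches H(0.75) ~ 0.811
```

The p = 1 branch recovers the two-headed-coin case: one arrangement, zero bits, matching both the physicist's counting picture and the information-theoretic formula.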



