How Life and Death Spring from Disorder (quantamagazine.org)
59 points by spyhi on July 19, 2017 | 12 comments



This article looks extremely interesting, but there is something that I don't understand and which prevents me from following the whole reasoning: why does the "demon" have to wipe its memory, and why does that cost energy? (Instead of, for example, just "writing" new information slices on top of old unused ones, without flushing the whole memory.)


By overwriting old information, it is still being erased and lost, so Landauer's principle holds. Some explanation is here: https://en.wikipedia.org/wiki/Landauer%27s_principle


A short summary as I understand it:

Imagine you write a '1' bit to a memory location. Now you know for sure that the location contains '1', since you can check it at any time. But you have absolutely no information about the past: it could have been either '1' or '0' before. So the number of possible states (and the entropy) has seemingly decreased, if you consider only this bit-level macrostate. But according to the 2nd law, global entropy never decreases, so an at-least-equal entropy increase eventually has to show up somewhere in the system (in the microstates, the states you did not treat as logically separate), for example as a small increase in heat.
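This bound has a concrete numerical form. A minimal sketch (not from the article; the Landauer limit of k_B · T · ln 2 joules per erased bit is the standard statement of the principle):

```python
from math import log

# Landauer's principle: erasing one bit of information must dissipate
# at least k_B * T * ln(2) of heat into the environment.
K_B = 1.380649e-23  # Boltzmann constant, J/K

def landauer_limit(temperature_kelvin: float, bits: int = 1) -> float:
    """Minimum heat (joules) dissipated by erasing `bits` of information."""
    return bits * K_B * temperature_kelvin * log(2)

# At room temperature (~300 K), erasing a single bit costs roughly 2.87e-21 J.
print(landauer_limit(300.0))
```

Tiny per bit, but it is a hard floor: any demon that keeps resetting its memory pays it on every erasure.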


I see. So the point is not so much that the "demon" has to do a complete flush of its memory, but that its record-keeping causes a small energy cost of its own, which compensates for the perceived decrease of entropy in the manipulated system. This allows me to proceed, thanks!


This is also why reversible computing is interesting, because (at least theoretically) you can do computation with minimal energy expenditure & wasted heat if your operations are all physically reversible.


This stuff is fascinating. Does anybody have any recommendations for a book-length treatment of the same topics? (Something suitable for your average EE)


"What is Life", by E. Schroedinger is referenced in the article and covers these topics.

"Order out of Chaos" by Prigogine and Stengers, and "Into the Cool" by Sagan and Schneider would also be decent recommendations.


I really wanted to read this article. But the text doesn't scale on my phone and I couldn't read across the whole line.


There's a handy PDF link in the top left corner; I don't know if it shows on mobile.


" This randomness is equated with the thermodynamic quantity called entropy — a measurement of disorder "

So sad that even such a nice article, with so many smart people from the SFI to proofread it, still describes entropy as disorder.

Sigh.

PS: before I get downvoted: [1] https://en.m.wikipedia.org/wiki/Entropic_force#Hydrophobic_f... [2] http://glotzerlab.engin.umich.edu/home/publications-pdfs/201...


Are you trying to imply that entropy sometimes decreases (things become more ordered) or are you trying to say that the term disorder is inaccurate?

Those links appear to show examples of "spontaneous" decreases in entropy. The problem is that entropy can easily decrease in local systems, without violating thermodynamics or the fact that entropy as a whole always increases.

For example plant life can form, decreasing entropy, as long as on a global scale entropy increases.

On the other hand, if you're trying to say that disorder is a confusing term, then that's true. The everyday meaning of disorder is often the opposite of what is meant when talking about entropy and thermodynamics. But if you take the word "disorder" in the thermodynamic sense, as referring to the full microscopic description of the system, then an increase in entropy does imply an increase in disorder. So although it's a confusing term, it's still a valid description.

https://en.m.wikipedia.org/wiki/Entropy_(order_and_disorder)
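The thermodynamic sense of "disorder" is just a count of microstates, via Boltzmann's formula S = k_B · ln W. A toy sketch (my illustration, not from the thread) using coin flips, where the macrostate is the number of heads:

```python
from math import log, comb

# Boltzmann's formula S = k_B * ln(W): entropy counts the microstates W
# compatible with a macrostate, not visual "messiness".
K_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(microstates: int) -> float:
    return K_B * log(microstates)

# Toy example: N coins, macrostate = number of heads. "All heads" has exactly
# one microstate; "half heads" has astronomically many, hence higher entropy.
N = 100
w_all_heads = comb(N, N)       # 1 microstate
w_half_heads = comb(N, N // 2) # ~1e29 microstates
print(boltzmann_entropy(w_half_heads) > boltzmann_entropy(w_all_heads))  # True
```

Whether a macrostate "looks" ordered to the eye is irrelevant; only the microstate count matters.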


"if you're trying to say that disorder is a confusing term, then that's true. "

Yes, that is what I mean. The number of microstates can increase, and yet it can look, to the eye, like the system is more ordered than before, even though entropy is increasing.



