Thermodynamics II: Disorder and Decay

This article is cross-posted to My Other Weblog™, the Patrick Operator: Introduction to Physics: Thermodynamics II: Disorder and Decay.

This is part 5 of my Introduction to Physics series of posts.

More thermodynamics! Well, there’s plenty left: we haven’t even looked at the laws of thermodynamics and, actually, we’re not going to look at all of them. Instead, we’ll focus on the second law, which I think is the most interesting one.

The second law can be expressed in many different ways. The simplest of these is “over time, the entropy of an isolated system increases”. But what’s entropy?

The most popular synonym for “entropy” is “disorder”: so the second law becomes “over time, things become more disordered”. But what precisely do we mean by disorder? The plain English definition actually hits surprisingly close to the mark: ordered things decay into disordered things, machines break down, components wear out and the cables under my desk get more and more tangled.

That said, we are physicists, and we should probably be a little more precise about what disorder is. Suppose that one day you decide to sort all the books on your bookshelf in alphabetical order, by author’s name. We’ll think of this as the low entropy (i.e., highly ordered) state. Over time, you take books out, read them and then put them back. But since you’re like me, you’re too lazy to put the books back in their correct, alphabetical position. After a couple of weeks, the books are more or less in alphabetical order, but not perfectly so. After a year, the books might as well have been ordered randomly. This is a high-entropy (i.e., highly disordered) state.

The crucial insight here is that there’s only one ordered state (ABCDE…) and there are lots of disordered states (for a tiny library of just 12 books, there are just over 479 million non-alphabetical ways of arranging the books; for a modest library of 60 books there are 8.3×10⁸¹ disorderings… that’s a freaking huge number that’s roughly equal to the estimated number of particles in the entire universe). As a consequence, over time, you’re overwhelmingly more likely to end up in a disordered (high-entropy) state than in an ordered (low-entropy) state.
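If you want to check my arithmetic, here’s a quick Python sketch. The counting is just factorials: n books can be shelved in n! ways, exactly one of which is alphabetical.

```python
import math

# n books can be arranged in n! ways; exactly one of those is
# alphabetical, so the number of "disordered" shelvings is n! - 1.
def disordered_arrangements(n):
    return math.factorial(n) - 1

print(disordered_arrangements(12))          # 479001599 — just over 479 million
print(f"{disordered_arrangements(60):.1e}") # roughly 8.3e+81
```

The ratio of disordered to ordered states grows so fast that for anything bigger than a handful of books, “pick an arrangement at random” and “pick a disordered arrangement” are effectively the same thing.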

The same thing happens with matter. Instead of “books” think “particles”, instead of “pulling books off the shelf and putting them back” think “particles jiggling around due to temperature”. States that are useful to us humans (such as houses and food and planets) tend to be lower entropy than states that are not useful to humans (such as the heat death of the universe, where all of existence is a uniform blob of fairly cold gas).
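Here’s a toy model of that jiggling (my own illustration, nothing more): particles in a box with a left half and a right half, where a randomly chosen particle hops sides at each step. Start everything in the ordered “all on the left” state and watch it drift towards the 50/50 mix, purely because there are vastly more mixed states than sorted ones.

```python
import random

random.seed(1)          # fixed seed so the run is repeatable
N = 1000
on_left = [True] * N    # ordered, low-entropy start: all particles on the left

for step in range(20000):
    i = random.randrange(N)       # pick a random particle...
    on_left[i] = not on_left[i]   # ...and let it hop to the other side

print(sum(on_left) / N)  # close to 0.5: the disordered, mixed state
```

Nothing forces the particles to spread out; the mixed state simply wins the counting argument, exactly like the bookshelf.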

An Arrow in Time

If you think back to classical mechanics, one interesting observation you can make is that Newton’s laws say nothing about the direction of time: in fact, they work just as well with time going backwards as they do with time going forwards! (We say that classical mechanics is time-reversible.) So there’s nothing in those laws that says which way time should go, which raises the question: why does time go forward?

Entropy provides one answer: the thermodynamic arrow of time. Time flows in the direction of increasing entropy. As we go forward in time, entropy always increases; if we could go backwards in time, entropy would decrease (in violation of the second law!).
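You can see time-reversibility in action with a little Python sketch of Newtonian free fall (again, my own illustration), integrated with the velocity Verlet scheme, which is itself time-reversible. Run the trajectory forwards, flip the velocity, run it again for the same duration, and the particle lands exactly back where it started — nothing in the dynamics prefers one direction of time.

```python
g = -9.81   # gravitational acceleration, m/s^2
dt = 0.001  # time step, s

def step(x, v):
    # one velocity Verlet step with constant acceleration g
    x = x + v * dt + 0.5 * g * dt * dt
    v = v + g * dt
    return x, v

x, v = 100.0, 0.0      # drop from 100 m at rest
for _ in range(3000):  # 3 seconds forwards in time
    x, v = step(x, v)

v = -v                 # reverse the motion...
for _ in range(3000):  # ...and run another 3 seconds
    x, v = step(x, v)

print(round(x, 6))  # 100.0 — right back where we started
```

Reversing the velocity and stepping forwards is exactly equivalent to running the clock backwards, and the equations don’t object in the slightest. It’s only the second law, not mechanics, that breaks the tie.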

This is an example of what physicists refer to as symmetry breaking. The laws of classical mechanics are symmetric in time: d = ts (distance travelled is speed multiplied by time) works just as well if time flows backwards as it does if time flows forwards.

By contrast, the second law of thermodynamics is not time-symmetric: it tells you that entropy increases as you go forwards in time. The second law of thermodynamics prohibits entropy from decreasing, and so prevents time from flowing backwards. Thus the apparent symmetry between “forwards in time” and “backwards in time” is broken. A lot of modern physics research goes into looking at other ways the universe is not time-symmetric (or not symmetric in other ways).

Think about this some more and you might stumble across a philosophical problem with all this: if entropy always increases, how did we manage to start out in a state with sufficiently low entropy that we ended up with our planet and life and all the good things? If you figure this out, please tell me, so I can leave your body in a gutter somewhere and claim my Nobel prize. 🙂