An ice cube on a summer day

Pratyush Tiwary

In this short article, we will try to understand a driving force of nature that is arguably one of the most fundamental and omnipresent, yet generally rather poorly understood by the average high school student. This force of nature is called entropy, and in connivance with its better-known accomplice, energy, it drives all creation and destruction. This essay is aimed primarily at high school teachers who want a better way to explain entropy to their students, helping set them up for a deeper understanding of various principles in physics and chemistry.

We will start with an intuitive definition of entropy. We will then introduce, without proof, a law that governs how entropy should behave, commonly known as the second law of thermodynamics. Throughout the article we will look at a few real-life events and, using entropy as a tool, attempt to understand these phenomena from a deeper perspective and get a feel for why they really happen. Finally, we will very briefly connect entropy with information theory, giving us a sense of why entropy and the second law do what they do. That will be the end of this little jaunt. Some key statements are emphasized in bold throughout this article for easy accessibility.

Recall those hot summer days when you take an ice cube out of the freezer and it spontaneously melts, even before you can put it in your soft drink? And as a second example: when you, the teacher, leave the classroom for just a few seconds and come back, you never find the students the way you left them! Most likely, you find them in complete “chaos”, almost biting each others’ heads off! And the longer you leave them “isolated” from your supervision, the greater the state of chaos or disorder you are likely to find them in.

Entropy is a way to quantify the overall disorder or chaos of a system.

And the second law of thermodynamics says that the entropy of an isolated system can never decrease.

By an isolated system, we mean that neither energy nor mass crosses the system’s boundary in or out. An example is when you put hot soup in a perfectly insulated thermos. Neither heat nor mass can be lost to the external surroundings.

Let’s now be slightly more quantitative. If there are N ways in which a system can exist or arrange itself, then the entropy S of that system increases as N increases. This is easy to understand through one of our examples. If there were one student in your class that you left unguarded, how much chaos could he or she cause? Probably less than if you had five students, which in turn would lead to less chaos than if you had a hundred students (which unfortunately does happen in many classrooms). Thus we say that: Entropy S increases with N, the number of ways in which a system can exist.

More precisely, S is proportional to the logarithm of N. That leads us to a more rigorous definition of entropy: S = kB log N … (1)
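As a small illustration of Eq. (1), here is a minimal Python sketch (the language, the SI value of kB and the toy values of N are chosen purely for illustration):

    import math

    k_B = 1.380649e-23  # Boltzmann's constant in joules per kelvin (SI value)

    def boltzmann_entropy(num_states):
        # Entropy S = kB log N for a system with N equally likely ways of existing;
        # "log" here is the natural logarithm, as in Eq. (1).
        return k_B * math.log(num_states)

    for n in (1, 100, 10000):
        print(n, boltzmann_entropy(n))

Notice that going from 1 state to 100 states adds exactly as much entropy as going from 100 states to 10,000 states: that is the logarithm at work.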

Boltzmann’s tomb in Vienna, Austria, with W used instead of N as in our notation. Photograph by Thomas Schneider, 2002. http://alum.mit.edu/www/toms/boltzmann.html
This notion was originally introduced by the physicist Boltzmann, who liked it so much that it is engraved on his tombstone (see figure). kB is a proportionality constant, named “Boltzmann’s constant” in honor of the legendary Austrian physicist. This constant forms the link between microscopic and macroscopic physics, and provides a measure of the amount of energy corresponding to the random thermal motions of the molecules in any substance. For a one-dimensional classical system at equilibrium at temperature T, this average energy is kBT/2.
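To get a feel for the numbers (a rough, back-of-the-envelope estimate): at room temperature T is about 300 K, so kBT is roughly 1.38 x 10^-23 J/K times 300 K, i.e., about 4 x 10^-21 J, and kBT/2 is about 2 x 10^-21 J. Tiny by everyday standards, but it is precisely this restless thermal energy that keeps the molecules of the ice cube jiggling about their lattice positions.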

Why the logarithm of N in Eq. (1)? This is because entropy, unlike say temperature, is required to be an extensive property. When one combines two systems that independently have N1 and N2 ways of existing, the combined system has N1*N2 ways of existing, not N1+N2. But the overall entropy should be the sum of the individual entropies, i.e., S1+S2 and not S1*S2 (to understand this, think of what would happen if exactly one of the systems had N=1 and thus zero entropy!). The logarithm in Eq. (1) serves precisely this role, because log(N1*N2) = log N1 + log N2. No function other than the logarithm satisfies this requirement!
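As a concrete check with made-up numbers: suppose the first system has N1 = 4 arrangements and the second has N2 = 3. Together they have 4*3 = 12 arrangements, and indeed S12 = kB log 12 = kB log 4 + kB log 3 = S1 + S2, exactly as extensivity demands.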

Now that we know how to calculate entropy, recall the second law, by which the entropy of an isolated system can never decrease. Thus, left on its own, an isolated system will always tend to drift towards more chaos, i.e., towards a higher number of ways N in Eq. (1) in which it can exist.

Let’s go back to the ‘ice cube on a summer day’ example and look at it in the light of entropy. The ice cube has molecules neatly arranged on a crystalline lattice, executing thermal motions around their average positions; but on average, the molecules are confined to their positions on the crystal. If the ice cube melts, the molecules in the liquid are free to explore much more space, thus increasing their entropy. What has actually happened here is that heat flowed from the warmer surroundings into the colder ice cube, causing a phase transformation from solid to liquid. This leads us to an alternative statement of the second law of thermodynamics: Unless driven by an agent to do otherwise, heat will always flow from a warmer to a cooler body.

A possible point of confusion (it confused me too!) lies in seeing the equivalence of these two statements of the second law. So let’s take a moment to address it using another analogy.

In our analogy, heat can be thought of as money. The warmer body is akin to a hard-working, 9-to-6 parent with no free time. The cooler body is like a 14-year-old teenager on vacation from school. A hundred rupees does not mean much to the busy parent, who barely has any free time in which to spend it. But for the teenager on vacation, those hundred rupees enormously increase the number of ways in which to spend his or her time. Thus the movement of a hundred rupees from parent to child, or of heat from a warmer to a cooler body, slightly decreases the entropy of the warmer body; but the rise in the entropy of the colder body is so dominant that, overall, the entropy of the combined system increases. Note that to make this useful but hand-waving analogy more rigorous, we have to ensure that the combined system is also isolated.
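One can make this quantitative using a standard thermodynamic relation, stated here without derivation: transferring a small amount of heat Q to or from a body at temperature T changes its entropy by Q/T. When heat Q leaves the warmer body at temperature Th, its entropy drops by Q/Th; when the same Q enters the cooler body at Tc, its entropy rises by Q/Tc. Since Th is greater than Tc, the total change Q/Tc - Q/Th is positive, so the entropy of the combined, isolated system indeed increases, just as the second law demands.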

The above discussion, intuitive probably more to parents than to teenagers, leads to yet another very interesting example: evolution. Some critics of science use the second law to claim that evolution is non-scientific. According to them, the double helix of DNA and all the machinery of the human body are too ordered, too low in entropy, to have arisen without an external designer; it cannot be simple evolution.

What these critics fail to take into account is that the human body is not an isolated system; the second law, as we saw today, speaks only of the entropy of an isolated system. Further, entropy has several associated contributions beyond the ordered machinery itself. While it is true that the machinery inside us is more ordered than a simple bunch of carbon atoms, think for instance of all the chaos this machinery has caused on our beautiful planet earth! Thus, we cannot conclude that evolution contradicts the second law of thermodynamics. The true relation between entropy and evolution calls for a much more detailed discussion, something you could encourage your students to think about.

Before we end, let us look at entropy from yet another perspective: information, or the lack thereof. As the number of ways in which a system can exist increases, it becomes harder for an observer to tell which of these states the system will actually take. That is, if there is only one possible state, you know the system has to be arranged in that state. But if there were 100 possible, equally likely states, the observer would be far less well informed about which state the system actually takes. Thus, entropy is also inversely related to the amount of information available about a system, and to the certainty with which this is known. The second law can then be re-interpreted as saying that this certainty can only decrease in the course of time. The explanation for this is eventually rooted in the quantum nature of reality and Heisenberg’s uncertainty principle, and that is something we will not get into here.
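For the numerically inclined, here is a tiny Python sketch of how information theory counts this missing information in “bits” (the base-2 logarithm convention and the example numbers are used purely for illustration):

    import math

    def missing_information_bits(num_states):
        # Missing information about a system that could be in any of
        # num_states equally likely states, measured in bits (base-2 logarithm).
        return math.log2(num_states)

    print(missing_information_bits(1))    # 0.0: only one possible state, nothing is unknown
    print(missing_information_bits(100))  # about 6.64 bits of uncertainty for 100 states

The more states there are, the more bits of information the observer is missing, which is exactly the sense in which higher entropy means less certainty.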

One could write a whole book about entropy, what it means and what it does not mean, and indeed there are countless excellent books; some are mentioned in the references below. But this is a good time to stop. The author hopes that this short explanatory article will be useful for introducing young, curious minds to the concept of entropy and the second law of thermodynamics, possibly inspiring them to look at ideas in physics and chemistry from a real-life perspective. And perhaps teachers will also be a little kinder to the naughty students who create chaos when left alone; like all of us, they are just following the second law of thermodynamics.

Acknowledgments
The author would like to thank Jagannath Mondal, Megan Newcombe and Nisheeth Srivastava for discussions regarding this article.

References

  1. “Thermal Physics” by Charles Kittel and Herbert Kroemer. Publisher: Macmillan.
  2. “Thermodynamics” by Enrico Fermi. Publisher: Dover Books. Available for free online at: http://gutenberg.net.au/ebooks13/1305021p.pdf

The author is a post-doctoral scientist at Columbia University where he uses entropy and related concepts from the field of statistical mechanics to design engineering materials and pharmaceutical drugs. He can be reached at pt2399@columbia.edu.
