
How to teach thermodynamics

Libb Thims

This blog offers some guidance on how to teach thermodynamics, particularly in the opening weeks, to undergraduate engineering students. It started as a response to nanomechanics engineer Zhigang Suo's 19 Dec 2010 request for advice on which textbook to use and how to teach thermodynamics, as he is apprehensive about teaching his first thermodynamics class (Engineering Science 181: Engineering Thermodynamics, Harvard). In any event, to give some quick advice on how to teach thermodynamics:

(a) Show students the visual timeline of how thermodynamics originated, namely how Parmenides' 485 BC denial of the void led to the development of the barometer (1643), the Guericke vacuum engine (1652), the gas laws (1658), the Papin digester (1679), and then the Papin engine (1690), which is the prototype engine model for the Carnot cycle (as described in steps by Papin), and hence the original model for the thermodynamic system, i.e. the volume of whatever substance is inside the piston and cylinder. It is important to understand the relation between the creation of the vacuum and work, and to get a visual of what exactly the "working substance" (thermodynamic system) is, as defined in the Papin engine model.

(b) Introduce students to Boerhaave’s law (1720), i.e. that all bodies of the universe can be made to expand or contract in volume; this is the opening citation to Lavoisier’s caloric theory, based on experiments using Papin's digester. This is very important to the understanding of entropy.

(c) Then introduce students to Roger Boscovich's 1758 stationary-point atom model of gases (one in a long line of atomic theories), in which the atoms of a gas were thought to oscillate about points of equilibrium rather than to move about in trajectories. This seems to be the model that scientists, in particular Lavoisier and Carnot, had in mind prior to August Kronig's 1856 paper "A General Theory of Gases" and Clausius' 1857 follow-up paper "On the Nature of the Motion which we Call Heat", which launched the kinetic theory of gases (and hence statistical mechanics). As Kronig put it: "the molecules of a gas do not oscillate about definite positions of equilibrium, but instead move about with velocity." Lavoisier speaks of caloric particles as something that accumulates in the interstices of the regions between the atoms of a gas; thus both he and Carnot (who adopted Lavoisier's theory) seem to have had a Boscovich-type model in mind when they held the view that all physical bodies expand and contract to their original atomic configuration, based on the number of caloric particles in them, the amount of caloric remaining unchanged per engine cycle (described by Carnot as the re-establishment of equilibrium in the caloric). This is very important to the understanding of the difference between a "reversible" and an "irreversible" cycle or process, and hence to the underlying understanding of the second law and entropy increase.

(d) Then introduce students to Gustave Coriolis' principle of the transmission of work, as derived in his 1829 Calculation of the Effect of Machines; there is no English translation (you have to do your own French-to-English translation to read the derivation), but this is where most of the geometry behind Clausius' derivation of internal energy stems from, and it is the origin of the mathematical definition of work.

(e) Then introduce students to the mechanical equivalent of heat (1842); this is also a difficult concept to grasp, but it was through this model that "caloric" became converted into "entropy", and beyond this the entire unit system of energy (the joule) is based on this measurement. Clausius' original name for entropy was "transformation content", as he explains in great detail, and it is on this model that the equivalence of heat and work rests: heat and work are transformable into each other, and hence caloric is not indestructible as Lavoisier and Carnot viewed things.

(f) Then I would strongly suggest the required reading assignment of the first 38 pages (mathematical introduction + first law derivation) of Rudolf Clausius' The Mechanical Theory of Heat, the 1879 2nd edition translation by Walter Browne (~$20 new at Amazon). All engineering thermodynamics textbooks are simply a rehashing of this 1879 textbook, which is the core of all of thermodynamics. This textbook should be a required purchase for all engineers. As Einstein put it, of all the universal theories of science, the theory contained in Clausius' textbook is the least likely ever to be overthrown.
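A minimal sketch, in modern notation rather than the original 19th-century forms, of the mathematical thread running through points (d) through (f):

```latex
% (d) Coriolis: the mathematical definition of work
W = \int \vec{F} \cdot d\vec{s} \;=\; \int p \, dV
\quad \text{(for a gas expanding in a piston)}

% (e) Joule: the mechanical equivalent of heat (modern value)
W = J\,Q, \qquad J \approx 4.186 \ \mathrm{J/cal}

% (f) Clausius: the first law, the core of The Mechanical Theory of Heat
dU = \delta Q - \delta W
```

Here $\delta$ marks the path-dependent (inexact) increments of heat and work, while $dU$ is an exact differential of the state function internal energy; this distinction is exactly where Coriolis' geometry of work feeds into Clausius' derivation.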

All of this should be introduced in the first week or so of class; then you can fill in the rest of the class with whatever engineering textbook you choose. Students looking for a fuller or deeper understanding of entropy can then go back later and read the key chapters of Clausius' textbook (3, 4, 5, 9, 10) on their own time. The math of thermodynamics is certainly difficult, but more often than not the harder part is the intuitive basis for why one is learning the math and doing the derivations. Introducing this foundation may help with that.

 

Comments

Libb Thims

Here's a useful page of thermodynamics lectures: a collection of 54 lecture parts on different topics in thermodynamics, mostly from MIT and Yale, which might be useful to students.

Libb Thims

Just a few quick points about how to progress to step two once you have given the framework of step one (above). In particular, Suo states that he is going to do the following for step one:

This is all a waste of time. The quantum mechanical/statistical description of entropy can come in later parts of the course (and it is only one of many interpretations of entropy, not the core model). What you need to do next, following the historical foundation (above), is:

(a) Explain how Clausius used (1/T) as the integrating factor of heat (Q), or what was previously considered caloric, to conclude that the resulting function is an extensive state function (entropy) that can be used to measure heat indirectly, such that the product T dS is a conjugate-variable-pair measure of the energy of heat.

(b) Then move on to Gibbs (1876) and how he added the other versions of the conjugate variable pairs for other types of work: elongation, chemical, electrical, etc.

(c) Then move on to the Boltzmann (1878), Planck (1900), and Nernst (1907) models: quantum states, radiation thermodynamics, entropy at absolute zero, and logarithmic models of entropy.
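In modern notation, points (a) and (b) above amount to the following sketch (not the historical derivations themselves):

```latex
% (a) 1/T as integrating factor: \delta Q is inexact, but dS is exact,
% i.e. entropy is a state function
dS = \frac{\delta Q_{\mathrm{rev}}}{T}, \qquad
\oint \frac{\delta Q_{\mathrm{rev}}}{T} = 0

% so T dS is the conjugate-pair measure of the energy of heat:
\delta Q_{\mathrm{rev}} = T\,dS

% (b) Gibbs (1876): the same pattern extended to other kinds of work
dU = T\,dS - p\,dV + \sum_i \mu_i\,dN_i + \cdots
```

Each term on the right is an intensive quantity paired with the differential of its extensive conjugate: temperature with entropy, pressure with volume, chemical potential with species amount, and so on for elongation, electrical, and other work terms.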

Once you have this framework instilled, then you can move on to newer applications, e.g. thermodynamics of fuel cells, thermodynamics of ecological or biological systems, solar cell thermodynamics, etc.

Zhigang Suo

Dear Libb: Thank you very much for the wealth of information. This historical approach is intriguing. I have been reading and thinking about how to teach this undergraduate class, and will post when things gel together for me.

Libb Thims

Yes, good luck to you and your class. I just thought I would put my two cents in. There are so many branches of thermodynamics, e.g. a new one I discovered today is "interfacial thermodynamics", that I believe it is important to teach the core basics of thermodynamics first (i.e. Clausius) before jumping off into other areas, e.g. statistical mechanics, quantum states interpretations of thermodynamics, or biological applications. Myself, I originally learned thermodynamics via Stanley Sandler and his method of teaching thermodynamics, which he describes in his own words as:

“Most of the previous books presented thermodynamics in a way that required students to memorize a specific way to do every problem. I thought it would be more effective to have a small set of very general equations and then be able to treat every new problem as a special case of those equations. That's how I taught myself, so I decided to write a textbook to teach others in this way.”

His method is pretty good, but it is geared towards chemical engineers. A downside of his method is that you will never know what entropy is, as he simply introduces it as a new variable and then jumps into pages of derivations. It took me years to figure out what entropy was. I used to think it had something to do with heat lost to friction on the walls of the piston; then I was convinced it had something to do with Lazare Carnot and his 1803 Fundamental Principles of Equilibrium and Movement, as passed on to his son Sadi Carnot; it wasn't until I actually got my hands on a copy of Rudolf Clausius' 1865 Mechanical Theory of Heat, which prior to 2005 was only available in original copies (I paid $600 for mine), that I actually began to understand what entropy was.

 

Zhigang Suo

Thank you for relating your experience with learning entropy. Perhaps many of us should recall our own experience, our own aha moments. My undergraduate degree was in engineering mechanics. We had perhaps half a course on thermodynamics. I recall being told of Clausius's way of introducing entropy. I didn't get much out of it. In graduate school, I took an undergraduate course, which used the textbook by Adkins. In many ways, the book amplifies the book by Pippard. Once again entropy was introduced in Clausius's way, which did not register with me. Meanwhile I had to take courses in materials science, which used thermodynamics the way one uses calculus: thermodynamics was assumed as a prerequisite, and was used in complicated situations. By this time, it had become clear that I should go back to the basics of thermodynamics.

In 1994 I was on sabbatical in Stuttgart, and read the two short papers by Gibbs. For the first time, phase transition made sense to me. His graphical interpretation helped me to see what entropy does. This approach was later solidified for me by the textbook of Callen. In Gibbs, as well as in Callen, the approach is "stop soul searching and maximize a function". The behavior of entropy is carefully described, in words and in equations. And we then maximize entropy. No more soul-searching questions: How was entropy discovered? What is entropy? Why does it tend to a maximum? I started to use thermodynamics heavily in my own research, and published a review article on the motion of microscopic surfaces in materials in 1997.

What is entropy, anyway? But are we being unfair to entropy? How many of us bother to ask similar questions about other quantities? What is energy? What is electric charge? What is volume?

My aha moment with entropy came when I read the introductory pages of Kittel and Kroemer. I was so deeply moved by the experience that I entered a review on Amazon. In hindsight, my enthusiasm for the book might be a little ridiculous and unfair to other authors. After all, the approach taken by Kittel and Kroemer had been around for some time, in quite a few fine textbooks.

I have since used the approach in my graduate courses, and written up some class notes.  The students seemed to like the approach.  The basic approach was described here.  I am now deciding if this approach is suitable for an undergraduate course on engineering thermodynamics.   

Let's hope that other iMechanicians will join us to relate their own experience in learning thermodynamics. 

Libb Thims

One of the inherent problems with microstate logarithmic versions of entropy, and thus with versions of thermodynamics based on this mode of argument, is that the proof of the derivation doesn't go all the way down to the roots. Boltzmann made the approximation in 1872 that the velocities of the particles were representative of entropy; Planck built on this in 1901 to argue, in radiation-thermodynamic terms, that Boltzmann's model would also apply to the quantum states of black bodies; and ever since, the logarithm/multiplicity model has been assumed to be universally true, especially in the thermal physics community. There is certainly much to be learned from this model, but since it is an approximation of an approximation of heat, there is a certain lack of mental cohesiveness and universality in this approach.

What I have found is that when you speculate on, say, the entropy associated with interactions of a microscopic region of a surface or a protein-protein interaction, no one raises much objection; but when you speculate on how entropy applies to (or governs) an aspect of human existence, great emotions arise and heated debate results, as exemplified by the 1902 entropy debate (on choice, God, and irreversibility in nature), the 2006 Rossini debate (on politics and thermodynamics), and the 2009 Moriarty debate (on entropy and microstates of student arrangements in a field). In such debates one is forced into a deeper understanding of thermodynamics, where the proofs and derivations go all the way down to core principles and one understands the assumptions behind each variable intimately.

As an aside, I notice that as of 2001 you had 40 thermodynamics books in your collection; it might be useful to others (such as myself) if you posted your thermodynamics book collection list online, as I have done.

Libb Thims

One sideline topic that you should also address (a growing issue of confusion in recent years), at least at one point in your class, is the question: "what does Shannon entropy have to do with Clausius entropy?" American engineer Myron Tribus was asked this question during his 1948 doctoral examination at UCLA, but had no response, and would spend the next 10 years fretting about it. The answer is nothing.

One has to do with mathematically quantifying the variations of 1s and 0s (high and low voltages in electrical current) coming through a telegraph wire, the other with physically quantifying heat (atomic movement) in a heat engine. What originated as an inside joke between chemical engineer John von Neumann and electrical engineer Claude Shannon has turned into a growing mess of confusion about the assumed connection between information theory and thermodynamics.

The relation between the two is what is called a mathematical isomorphism, i.e. equations with a similar structure (e.g. a quantity proportional to the logarithm of another quantity), but on completely different topics. One can find papers published almost monthly (many by engineers) arguing for a thermodynamic foundation to information theory or vice versa. The three recent books on entropy by Israeli physical chemist Arieh Ben-Naim are a typical example of this type of mislogic: he argues that we should throw out Planck's constant, redefine the absolute temperature scale, and make entropy unitless, all in the name of redefining Clausius entropy (and hence the entire field of thermodynamics) in terms of Shannon's mathematical theory of communication.
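The isomorphism claim can be made concrete with a toy sketch (illustrative only; the function names are mine): both quantities are logarithms, but Shannon's H is a dimensionless property (in bits) of a probability distribution, while Boltzmann's S is a physical quantity carrying units of joules per kelvin.

```python
import math

def shannon_entropy(probs):
    """Shannon's H = -sum(p * log2(p)): expected information, in bits,
    of a symbol source with the given probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def boltzmann_entropy(multiplicity):
    """Statistical-mechanical S = k ln W: a physical quantity in J/K,
    where W is the number of microstates of the system."""
    return K_B * math.log(multiplicity)

# Same logarithmic shape, entirely different subject matter and units:
print(shannon_entropy([0.5, 0.5]))  # a fair coin: 1.0 bit
print(boltzmann_entropy(2))         # a two-state system: ~9.57e-24 J/K
```

The structural similarity (a logarithm of a count or distribution) is exactly the isomorphism described above; nothing in the code forces any physical connection between the two quantities.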

Libb Thims

Just a quick note on your last post, i.e. about going back years later, after digging through Gibbs and Callen, to figure out why entropy tends to a maximum and why we maximize the function to find the condition for equilibrium. This is why I strongly suggest you spend a few minutes on step one (above), points (c) and (e), and then refer students to Clausius, chapter 10, "On Non-Reversible Processes", to understand why entropy tends to a maximum. It is simply a mathematical way of quantifying the effect that heat is converted non-reversibly into internal work inside the system, and that in each step of the cycle or process the numerical magnitudes of this non-recoverable conversion sum to a maximum, ceasing at equilibrium (when the process stops). If you don't attempt to address this, your students may end up searching years later for the answer (as you did; and as Gibbs did himself, with his groping for a "mixed-up-ness" explanation).
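In modern notation, a sketch of the chapter 10 argument referred to above (Clausius' own symbols differ):

```latex
% Clausius' "uncompensated transformation" N: for any cycle,
N = -\oint \frac{\delta Q}{T} \;\ge\; 0,
\qquad N = 0 \ \text{only for a reversible cycle}

% For an isolated system this becomes the entropy-increase statement:
dS \ge 0 \quad\Rightarrow\quad S \to S_{\max} \ \text{at equilibrium}
```

The uncompensated transformations N accumulate with each irreversible step, which is the "sum to a maximum, ceasing at equilibrium" behavior described above.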
