A layperson's introduction to Thermodynamics, part 1: Energy, work, heat
@wanda-seldon has been giving us an introduction to quantum physics. For now, she is taking a short break to prepare new material. In the meantime I will be covering some classical mechanics, more specifically thermodynamics. In part 1 we need to work our way through some of the drier concepts, so that we can understand and appreciate the horrifying implications of the fun parts. I promise this will be the most verbose part of the series.
Some of you may have briefly seen an earlier version of this posted; that was due to me misunderstanding the schedule with @wanda-seldon. If you saw that one, I'll mention that I rewrote nearly all of it to be more readable.
Now, on today's agenda: the basics of heat, work and energy, and how they are all related.
Previous posts can be found here: https://tildes.net/~science/8al/meta_post_for_a_laypersons_introduction_to_series
When @wanda-seldon mentions "energy" in her posts, it's most likely in the context of energy operators, which is a concept in quantum physics. I'm not going to pretend I understand them, so I won't be explaining the difference. Here we will cover what energy is in classical mechanics. So keep that in mind when you read something from either of us.
What is heat? Using a lot of fancy words, we can describe it as follows: heat is energy that is transferred between systems by thermal interaction. And what is work? Work is energy that is applied in a way that performs... well, work. The combined energy residing in a system is called its internal energy. This energy can be transformed into other forms or transferred to other systems.
These are a lot of new words, so let's break them down a bit.
A system is just a catch-all term for something that can be defined by a boundary of sorts: mass, volume, shape, container, position, etc. A canister, your tea mug, the steam inside a boiler, your body, a cloud, a room, Earth. They are all systems, because you can in some way define what is within the boundary and what is beyond it.
In theory, you could define every single particle in the universe as its own system, but that would be counter-intuitive. In thermodynamics we tend to lump things together into a system and treat it as one thing, as opposed to quantum mechanics, which looks at the smallest quantities. Calculating every single water molecule in my coffee would be pure insanity, so we just treat the mug as the boundary and the coffee inside it as the system. And just so it's mentioned: systems can contain other systems, for instance a tea mug inside a room.
Energy is a quantifiable property that comes in the form of either heat or work. It can be transferred to other systems, or change between the different energy types. An example of transfer is my coffee cooling down because it's in a cold room: heat has been transferred from one system (my mug) to another system (the room). Alternatively, you could say my hot mug is warming up the room, or that the room is cooling down my coffee. Thermodynamics is a LOT about perspective. An example of transforming energy types is rubbing our hands together: we convert work (the rubbing) into heat. It's really not more complicated than that. An interaction, in this context, is just one system having an effect on another system. So a thermal interaction is an interaction due to heat (like in the mug example).
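The mug example can even be put into a toy model. Newton's law of cooling (my addition, not something from the post) says the mug loses heat at a rate proportional to the temperature difference with the room; all the numbers below are illustrative assumptions:

```python
import math

def mug_temperature(t_min, T0=90.0, T_room=20.0, k=0.05):
    """Mug temperature (deg C) after t_min minutes, per Newton's law of cooling.

    T0 is the starting temperature, T_room the room temperature, and k an
    assumed cooling constant (per minute) -- all illustrative values.
    """
    return T_room + (T0 - T_room) * math.exp(-k * t_min)

print(round(mug_temperature(0), 1))   # 90.0, fresh coffee
print(round(mug_temperature(30), 1))  # ~35.6, the room has gained that heat
```

Note the perspective point from above: the energy the mug loses is exactly the energy the room gains.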
This brings us to an extremely important point. So important, it's considered "law". The first law of thermodynamics, even. Energy cannot be created or destroyed; it can only change forms.
Your battery's charge is never really lost, and neither is the heat of your mug of coffee. It just changed form or went somewhere else. The combined energy of all types residing inside a system is called its internal energy.
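The first law is really just energy bookkeeping: the change in a system's internal energy equals the heat added to it minus the work it does on its surroundings. A minimal sketch with made-up numbers:

```python
# First law of thermodynamics as bookkeeping:
# delta_U = Q - W
# Q: heat added TO the system, W: work done BY the system (both in joules).

def internal_energy_change(heat_in_J, work_out_J):
    """Change in internal energy (J) from the first law."""
    return heat_in_J - work_out_J

# A gas absorbs 500 J of heat and does 200 J of work pushing a piston:
delta_U = internal_energy_change(500.0, 200.0)
print(delta_U)  # 300.0 -> 300 J remain in the system, nothing was destroyed
```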
Heat and work
Let's say we have a system, like a room, and all windows and doors are closed so that no energy can leave. In this system you have a running table fan connected to a power line, getting energy from outside the system. The fan is making you feel cool. Is the fan cooling down the room, heating it up, or doing nothing? Think about it for a moment.
The first thought of many would be that the fan cools the room down; it certainly makes you feel cooler! But it's actually heating the room up. As we remember, internal energy is the energy inside a system (here, the room). The fan is getting energy from outside and uses this energy to perform work: it accelerates the air inside the room, and this moving air evaporates some of your sweat, which makes you feel cool. But energy cannot be destroyed. We are importing energy into the system, increasing its internal energy. Some of the fan's work is also converted directly to heat, since the fan's motor gets hot.
So if we are not getting rid of any of this excess energy, we are increasing the internal energy, and therefore actively increasing the temperature of the room.
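We can even put a rough number on this. Assuming (my assumptions, for illustration) that the room is sealed and well mixed, and that all of the fan's electrical power eventually ends up as heat in the air:

```python
# Rough estimate of how fast a small fan heats a sealed room.
# Assumptions (mine): all electrical power ends up as heat, and the room air
# behaves as a closed system heated at constant volume.

fan_power_W = 50.0      # electrical power drawn by the fan
room_volume_m3 = 40.0   # a smallish room
air_density = 1.2       # kg/m^3, air at roughly room conditions
cv_air = 718.0          # J/(kg*K), specific heat of air at constant volume

air_mass_kg = air_density * room_volume_m3
# Energy balance: P = m * cv * dT/dt  =>  dT/dt = P / (m * cv)
heating_rate_K_per_hour = fan_power_W / (air_mass_kg * cv_air) * 3600
print(round(heating_rate_K_per_hour, 2))  # ~5.22 K per hour
```

In a real room, heat leaking through the walls would slow this down long before anything dramatic happened, but the sign of the effect is clear: the fan heats the room.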
To use a more tangible example: simplified, this phenomenon is why greenhouse gases are bad. Let's define Earth as a system. Earth gets a lot of energy from the sun, and a lot of this energy is reflected and sent back to space. Greenhouse gases reflect back some of the energy trying to leave Earth. So instead of a roughly equal amount of energy entering the system (from the sun, from us doing stuff, etc.) and leaving it out into space, we have an increasing amount of energy on Earth. This, as a consequence, increases the temperature.
Now, what are the maybe not so obvious implications of this?
Waste heat, whether from supplied energy or from inefficient work, is a constant headache in engineering. If we cannot remove enough heat, we will actively heat objects up until they are destroyed. That's why good cooling systems are important in cars, computers, etc.
Now, this was not so bad. In future parts we will cover phase changes, equilibria, entropy, the heat death of the universe, and briefly touch upon engines. So that's most likely two more parts after this one. After that, @wanda-seldon will take over again.
I plan on doing one main part per week, but if something is asked that warrants a small topic, I might do smaller ones in between.
Something unclear? Got questions? Got feedback? Or requests of topics to cover? Leave a comment.
Can you expand further on the difference, on a fundamental level, between heat and work? The way I understand it, thermal energy is like a collective average measure of the kinetic energy of the individual molecules in a system. Meanwhile work, for example in a moving piston, is about the kinetic energy of the movement of the piston, which, if we were to be pedantic, is also a collection of molecules. So both seem to be about energy in the motions of a collection of molecules, only in one case the motions are random and in the other case the molecules collectively move together in one direction. If so, what would be the boundary for the "level of randomness" between heat and work?
Let's say I have a big room filled with a bunch of randomly bouncing, normal-sized balls. I then transfer energy into the room by continuously hitting the wall with high-velocity rocks from the outside, causing the wall to vibrate and bounce the balls inside more violently. If the balls' movements are random, is it correct to say this collection of balls receives an increase in thermal energy?
If I place a grain of pollen in a bowl of water and the pollen starts moving around, do the water molecules transfer heat or work to the pollen?
You actually hit one of the core definitions straight on the head. Heat is considered "low quality" energy because it cannot be directed; it is not orderly, it just moves randomly. Work is "high quality" energy: it can be directed, so it is orderly.
We can convert (about) 100% of high quality energy into low quality energy. For instance, an electric heater wastes (nearly) none of its energy on movement or deformation; it all goes into heating the room. So we take high quality energy and use it to create low quality energy.
But we cannot do it in reverse. It's not possible to take low quality energy and turn 100% of it into high quality energy. If you burn a log, it's a lot harder to turn the heat back into a log. This is the basis of entropy (and it's so important that it's the second law of thermodynamics!). We can only increase the chaos in a closed system, never decrease it.
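The "can't convert 100% of heat into work" point even has a classic number attached to it: the Carnot limit (not named in the post above, but it's the standard way to quantify this). Even a perfect engine running between a hot and a cold reservoir can convert at most a fixed fraction of the heat into work:

```python
# Carnot limit: the best possible heat-to-work conversion efficiency for an
# engine running between a hot and a cold reservoir (temperatures in kelvin):
#   eta_max = 1 - T_cold / T_hot

def carnot_efficiency(t_hot_K, t_cold_K):
    """Maximum fraction of heat convertible to work; always < 1 for T_cold > 0."""
    return 1.0 - t_cold_K / t_hot_K

# Steam at 600 K dumping waste heat into a 300 K environment:
eta = carnot_efficiency(600.0, 300.0)
print(eta)  # 0.5 -> at best, half the heat becomes work; the rest stays heat
```

Getting 100% efficiency would require a cold reservoir at absolute zero, which is exactly the asymmetry between high and low quality energy described above.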
You would most likely lose that energy again since vibration is work.
Think of it rather this way: you are supplying energy to a closed system. In this example, the balls move around more, so the energy you supplied is used to perform more work. If we assume there is no air resistance or friction inside the room, then no change in heat should occur. The balls will just move faster, but that's it.
But in reality the balls will collide with molecules in the air, and collide with the wall. So you will generate heat due to friction. And the temperature increases.
EDIT: We will actually cover the types of energy and entropy in part 3.
Hm, I suppose being "low" or "high quality" energy is not the defining feature for a particular energy exchange to be called "heat" or "work" (for example, a thermoelectric generator converting heat to electricity seems like a very ordered use of a heat flow; energy stored in chemical bonds seems to be another type of high quality energy besides work), is it? If it is, then what is the misconception in my examples, and if it isn't, how can we define heat and work in a way that makes the distinction clear?
No, you're sort of on to it. You can convert energy from one type to another, but you cannot convert 100% of heat into work, whereas you can convert 100% of work into heat.
The big factor that makes this easier to think about is a property we call "entropy", which I will cover in more detail in a later post.
In your example, the only thing I think you had "wrong" is that you treated the balls as atoms/molecules. But the balls are arrangements of rubber molecules with an inherent structure, so a ball can absorb an impact through deformation, as opposed to the kinds of interactions we see in "heat". If the room were rigid in every way and the balls were one molecule big, you'd be right.
Do you have any recommendation on how to go about "defining" a system (non-physical)? For example in network dynamics, could we define something analogous to Energy/work/heat?
I remember that a way to think about black hole radii, Hawking radiation, etc. (and do back-of-the-envelope calculations) was to define a temperature/energy/entropy for the black hole system. It was pretty cool, but it was also not related to the normal definitions (for example, energy was related to the surface area of the event horizon). So I wonder how people go about defining these concepts, and whether it is possible to do so for every system, or whether there are prerequisites for what a system must have in order to have thermodynamic properties.
Sorta. This is the kind of stuff researchers or physicists are much better at than me. Generally speaking, just like with thermodynamics, it's mostly about defining things properly and setting up good models.
I mostly work with reactive systems. But any pattern is something you can simplify or build expectations from. If it behaves similarly to a natural system, you can start making rules.
That is completely beyond me, sorry.
Within the context of physics, a system is defined so that you can either keep a property constant, or so that you can track the influx (positive or negative) of a quantity. So that's what the starting point should be when you want to define a system in a useful manner; the quantity you want to track.
This is true: black holes can be described thermodynamically (though the details are way outside my field). But I take issue with saying it's not related to the normal definitions. Energy is still energy; it's just expressed in terms of surface area.
We must be able to describe the system in terms of state variables (pressure, volume, etc.). In physics, we look for the right state function (e.g. F(P,V)) to describe a system. Crucially, these state variables must depend only on the equilibrium states of the system: the way the system got to its current state must not matter, only the current state itself. If you cannot describe your system within these constraints, it's not a thermodynamic system.
EDIT: Many microscopic systems are non-equilibrium systems and therefore cannot be described by classical thermodynamics. Theories for non-equilibrium many-body systems do exist, but I haven't dealt with them much. One book I have studied is Field Theory of Non-Equilibrium Systems by Alex Kamenev, which gets around the restrictions I described above by using time contours.
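A concrete example of such a state relation is the ideal gas law: it ties the state variables P, V, n, T together regardless of how the gas reached its current state. A small sketch (the specific numbers are just an illustration):

```python
# Ideal gas law, P * V = n * R * T, as an example of a state equation:
# knowing any three of (P, V, n, T) fixes the fourth, with no dependence
# on the system's history -- only on its current equilibrium state.

R = 8.314  # J/(mol*K), molar gas constant

def pressure(n_mol, temp_K, volume_m3):
    """Pressure (Pa) of an ideal gas in a given equilibrium state."""
    return n_mol * R * temp_K / volume_m3

# 1 mol of gas at 300 K in a 25-litre container:
P = pressure(1.0, 300.0, 0.025)
print(round(P))  # ~99768 Pa, close to atmospheric pressure
```

Whether the gas was compressed, heated, or left alone to get here makes no difference: the same (n, T, V) always gives the same P.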
Very interesting (and thanks @ducks too for your answer!). Sorry if I oversimplified the black hole thing.
So say I have a simulation of an interacting particle system and it reaches equilibrium. How do I find state variables? Could I relate them to eigenfunctions of the Laplacian?
I guess my other question is: how do you decide a system has reached equilibrium? For example, a steady state would be an equilibrium, but in an ideal gas you still have the particles moving around.
Is there any good book on the mathematical aspects of statistical mechanics and thermodynamics?
I am not sure what you mean. You want to find state variables that can track a system's 'movement' through its phase space, so for each (macroscopic) degree of freedom you want a state variable. I am not entirely sure how the eigenfunctions would help you here.
An equilibrium state is not the same as a state in which nothing is happening. Sure, an ideal gas has particles moving around, but the power of thermodynamics is that we don't need to care. We look at the entire system and see no net changes in energy: there is no energy coming in and no energy going out, so the gas's net energy does not change. An equilibrium state is a state in which there are no net flows of energy (or matter, or whatever else is relevant).
I don't know any good books that work from a mathematical point of view. Most physicists prefer to learn by examining physical systems and so all the books I know of on statistical mechanics explain things by reformulating known concepts (quantum mechanics usually) in new terms. If you are familiar with quantum mechanics I can suggest a few books.
Thanks for taking this on. The idea that we are free to define the systems goes a long way in making my old physics classes make sense. One of the most frustrating was Statics, which I suspect is because of its way of defining what the system is.
I remember the pain of statics very well. The problem with it is also that engineering statics takes so many formulas and predefined rules from empirical data. If the source material isn't too concerned with figuring out why things are the way they are, it's very hard to make students understand why things are that way. Though I might be biased, since my lecturer might have been the most incompetent man to walk the earth.
Thermodynamics is a lot more elegant in that way. It's not perfect, but everything is related, and ideally you want to define as much as possible from first principles.
I think they cloned those statics teachers. LOL.
I thankfully never had to take statics as a chemical engineer. One of my thermo professors said something like, "Chemical engineering almost never deals with static systems, if you want statics, set the derivative to zero and solve it, next topic."
What is the relationship between thermodynamic entropy and information-theoretic entropy?
I don't know information-theory, so I don't feel qualified to answer this. Hopefully someone else will come to the rescue.