4.14: The Second Law of Thermodynamics - Biology

A living cell’s primary tasks of obtaining, transforming, and using energy to do work may seem simple, but the second law of thermodynamics explains why they are not: strictly speaking, no energy transfer is completely efficient, because some energy is lost in an unusable form.

An important concept in physical systems is that of order and disorder (also known as randomness). The more energy that is lost by a system to its surroundings, the less ordered and more random the system is. Scientists refer to the measure of randomness or disorder within a system as entropy. High entropy means high disorder and low energy (Figure 1). To better understand entropy, think of a student’s bedroom. If no energy or work were put into it, the room would quickly become messy. It would exist in a very disordered state, one of high entropy. Energy must be put into the system, in the form of the student doing work and putting everything away, in order to bring the room back to a state of cleanliness and order. This state is one of low entropy. Similarly, a car or house must be constantly maintained with work in order to keep it in an ordered state. Left alone, the entropy of the house or car gradually increases through rust and degradation. Molecules and chemical reactions have varying amounts of entropy as well. For example, as chemical reactions reach a state of equilibrium, entropy increases, and as molecules at a high concentration in one place diffuse and spread out, entropy also increases.

Try It Yourself

Set up a simple experiment to understand how energy is transferred and how a change in entropy results. (A rough numeric sketch of the entropy arithmetic appears after the steps below.)

  1. Take a block of ice. This is water in solid form, so it has a high structural order. This means that the molecules cannot move very much and are in a fixed position. The temperature of the ice is 0°C. As a result, the entropy of the system is low.
  2. Allow the ice to melt at room temperature. What is the state of molecules in the liquid water now? How did the energy transfer take place? Is the entropy of the system higher or lower? Why?
  3. Heat the water to its boiling point. What happens to the entropy of the system when the water is heated?
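
For step 2, here is a minimal sketch of the entropy bookkeeping, using the definition that a change in entropy equals heat absorbed divided by absolute temperature. The specific numbers (1 kg of ice, a 20°C room, and a rounded latent heat of fusion) are illustrative assumptions, not values from the text:

```python
# Entropy bookkeeping for melting 1 kg of ice in a warm room (illustrative numbers).
LATENT_HEAT_FUSION = 334_000.0  # J/kg, approximate latent heat of fusion of ice
T_ICE = 273.15                  # K, melting point of ice (0 °C)
T_ROOM = 293.15                 # K, assumed room temperature (20 °C)

q = 1.0 * LATENT_HEAT_FUSION    # heat absorbed by 1 kg of melting ice, in joules

dS_ice = q / T_ICE              # entropy gained by the ice as it melts
dS_room = -q / T_ROOM           # entropy lost by the warmer surroundings
dS_total = dS_ice + dS_room     # net change for ice plus surroundings

print(f"ice:   {dS_ice:+.0f} J/K")
print(f"room:  {dS_room:+.0f} J/K")
print(f"total: {dS_total:+.0f} J/K")  # positive: total entropy increased
```

The melted water gains more entropy than the warmer room loses, so the total entropy rises, which is exactly the pattern the second law requires.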

All physical systems can be thought of in this way: Living things are highly ordered, requiring constant energy input to be maintained in a state of low entropy. As living systems take in energy-storing molecules and transform them through chemical reactions, they lose some amount of usable energy in the process, because no reaction is completely efficient. They also produce waste and by-products that aren’t useful energy sources. This process increases the entropy of the system’s surroundings. Since all energy transfers result in the loss of some usable energy, the second law of thermodynamics states that every energy transfer or transformation increases the entropy of the universe. Even though living things are highly ordered and maintain a state of low entropy, the entropy of the universe in total is constantly increasing due to the loss of usable energy with each energy transfer that occurs. Essentially, living things are in a continuous uphill battle against this constant increase in universal entropy.


4.14: The Second Law of Thermodynamics - Biology

There are four laws of thermodynamics; however, only the first two are relevant for this course:

  1. Energy may be transferred from place to place or transformed into different forms, but it cannot be created or destroyed.
  2. In a closed system, the entropy (a measure of disorder) will either increase or stay the same.

In this outcome, we will learn exactly how these laws are important to understanding biology.

Learning Objectives

  • Distinguish between an open and a closed system
  • Understand how the first law of thermodynamics applies to biological systems
  • Understand how the second law of thermodynamics applies to biological systems

Unscrambling the Second Law of Thermodynamics

The second law of thermodynamics surely qualifies as one of the most talked-about principles in all of physics. Depending on who you ask, it is either incredibly mysterious or fairly mundane. Some physicists think the second law is connected to fundamental ideas such as time and the origin of the universe, 1 yet it is also an aspect of everyday experiences, such as how a morning cup of coffee cools down, or the fact that you cannot unscramble an egg. The second law has even been invoked by rock band Muse to explain why, in their view, economic growth cannot continue for much longer. 2 However, trying to find a clear explanation of what the second law actually is (and why it is true) can be a frustrating experience.

I first encountered the second law as a teenager, while reading an issue of the fundamentalist Christian magazine Creation, given to me by my grandmother. Since the author wanted to argue against biological evolution, the article claimed that the second law of thermodynamics implies evolution is impossible. Its definition of the second law was that disorder always increases with time. At first glance, this does seem incompatible with evolution by natural selection, which can lead to more complex, “better designed” organisms over time. 3 At the time, I thought it was unlikely that mainstream biology would flagrantly contradict mainstream physics, so I remained sceptical of this argument, even though I couldn’t understand the counterarguments I found on the Internet at the time.

During my first university physics course, I was excited when I learned we’d be studying thermodynamics. Finally, I thought, I would be able to understand the second law properly (along with the other, less popular laws). Alas, my expectations were not met, despite having a good lecturer. Instead of discussing big-picture issues like evolution, economics, or cosmology, we worked out the maximum possible efficiency of refrigerators and steam engines. I’m sure these are interesting in their own ways, but I was disappointed.

The version of the second law we studied was related to concepts of heat and temperature, and little else. A familiar consequence of this version of the second law is that heat always flows from a hot object to a cooler one, and not the other way around. Your morning cup of coffee cools down and heats up the air around it; it doesn’t heat up further while cooling the room, even though that possibility is compatible with other laws of physics such as the conservation of energy.

The second law is formalised by defining a quantity called entropy. When heat flows out of one object and into another, the first object’s entropy goes down, by an amount that depends on its temperature. The change in the entropy is the amount of heat energy transferred, Q (usually measured in joules), divided by the temperature of the object, T1, measured in kelvin. When that same heat energy flows into another object, that object’s entropy goes up by Q divided by the temperature of the second object, T2. The second law of thermodynamics can then be stated as follows: if you add up all of the changes in entropy of all the objects you are studying, the result must be a positive number or zero. It can’t be negative. In other words, the total entropy must either increase or stay the same. When a cup of coffee cools down, its entropy decreases, but the entropy of its surroundings increases by an even greater amount, since the coffee is hotter than the surroundings.
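
Here is a minimal Python sketch of that bookkeeping. The numbers (1000 J of heat leaving coffee at 350 K and entering air at 295 K) are my own illustrative assumptions:

```python
# Clausius entropy bookkeeping for a cooling cup of coffee (made-up numbers).
Q = 1000.0   # heat energy transferred out of the coffee, in joules
T1 = 350.0   # temperature of the coffee, in kelvin
T2 = 295.0   # temperature of the surrounding air, in kelvin

dS_coffee = -Q / T1          # coffee loses entropy as heat flows out
dS_air = +Q / T2             # air gains entropy as it absorbs that heat
dS_total = dS_coffee + dS_air

# Because T1 > T2, the air's gain outweighs the coffee's loss.
print(f"coffee: {dS_coffee:+.2f} J/K")
print(f"air:    {dS_air:+.2f} J/K")
print(f"total:  {dS_total:+.2f} J/K")  # positive, as the second law demands
```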

The version of the second law I just described, usually attributed to the 19th century German physicist Rudolf Clausius, certainly has its uses. However, it is a far cry from the lofty fundamental principle I had expected to learn. What did it have to do with evolution? The illusion that organisms are well-designed doesn’t have anything to do with heat being transferred. It doesn’t have much to do with the economy either, except very indirectly, because machines are useful and the second law prohibits certain kinds of machines.

Confusingly, as we learned the Clausius version of the second law, we also discussed phenomena such as gas diffusion, shown in this animated gif:

In the animation, purple gas atoms start above the horizontal barrier in the middle of the box, and green atoms start below the barrier. As time progresses, the purple and green atoms end up spread throughout the entire box. This “spreading out”, in a physical sense, was claimed to be an example of the second law, yet I could never find what heat was being transferred in this example. If no heat is being transferred, the Clausius second law doesn’t apply, and we are left with hand-waving about disorder.
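
To see that the mixing argument needs no heat at all, here is a toy, Ehrenfest-style hopping simulation (my own illustration, not from the original article): all the atoms start on one side, randomly chosen atoms hop across the barrier, and the near-50/50 split takes over simply because vastly more arrangements correspond to it.

```python
# Toy mixing simulation: randomly chosen atoms hop across a barrier.
import random

random.seed(0)

N = 100   # total number of atoms
left = N  # all atoms start on the left side; the rest are on the right

for step in range(1, 2001):
    # Pick an atom uniformly at random; it hops to the other side.
    if random.randrange(N) < left:
        left -= 1
    else:
        left += 1
    if step % 500 == 0:
        print(f"step {step:4d}: {left} left / {N - left} right")

# The count hovers near 50/50, not because any force pushes it there, but
# because far more arrangements correspond to an even split than to
# "all atoms on one side".
```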

So is there a version of the second law that relates to concepts more general than heat and temperature? It turns out the answer is yes, but I had to wait many years before I learned it. Surprisingly, it turns out this more general second law isn’t really a principle of physics, but rather a principle of reasoning. This more general version of the second law not only explains why the Clausius version is true, but gives us a tool for much more general questions, like the evolution question or Muse’s economic musings. It also appears in everyday life, and not just in situations involving heat and temperature. For example, why is darts difficult? Why can’t most men sing operatic high Cs? And why are political polls (somewhat) accurate?

Uncertainty and volume

During my PhD studies, I became intensely interested in Bayesian statistics 4 and how to use it in astronomical data analysis. During this process, I discovered the work of heterodox physicist E. T. Jaynes (1922–1998), 5 whose clear-headed thinking and combative writing style changed the direction of my PhD project and my life.

One day I came across a particular Jaynes paper, which is now one of my favourite journal articles of all time. 6 I didn’t understand it immediately, but had a strong sense that I should persist because it seemed important. Every so often, I’d return to re-read it, understanding just a little more each time. The breakthrough came after dinner one night.

I had taken out a tub of ice cream from the freezer for dessert. After dishing the ice cream into a bowl, I tried putting the tub back in the freezer, but it wouldn’t fit (due solely to my packing abilities). This got me thinking about volumes, and suddenly it clicked: Jaynes was explaining that the second law of thermodynamics, both the Clausius heat/temperature version and (importantly) a more general version, reduces to the principle that big things cannot fit into small spaces unless they are compressed. This is common sense when applied to physical objects, but to get the second law, you have to apply it to an abstract object: a volume of possibilities. This idea did not necessarily originate with Jaynes, as it can be found (albeit less explicitly stated) in the work of J. Willard Gibbs, among others. Another source that helped me understand was the posts of irreverent blogger “Pierre Laplace”. 7

To demonstrate the idea of a volume of possibilities, consider the 52 books on my office bookshelf, 50 of which are non-fiction. [It’s a coincidence that there are 52 cards in a standard deck of cards. I try to avoid explanations involving cards, coin flipping, and dice, because people tend to (incorrectly) think they understand those already.]

Now I tell you that one of the books on the shelf was signed by the author. Which one? I’m not telling. Based solely on this information, it makes sense for you to assign a probability of 1/52 to each of the books, describing how plausible you think it is for each book to be the signed one.

From this state of near-ignorance, it seems like you might not be able to draw any conclusions with high confidence. But that’s an illusion; it depends on what questions you ask. Imagine if I were to ask you whether the signed book is a non-fiction book. If your probability is 1/52 for each book being the signed one, the probability the signed one is non-fiction must be 50/52 (adding the probabilities of the 50 non-fiction books together), or approximately 96%. That’s an impressive level of confidence — much more than you should have in the conclusion of a single peer-reviewed science paper! Of course, a high probability doesn’t mean the expected result is guaranteed. It just means it’s very plausible based on the information that you explicitly put into the calculation.
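
A quick numerical check of that arithmetic (purely illustrative):

```python
# Uniform probabilities over 52 books; sum over the 50 non-fiction ones.
p_each = 1 / 52
p_nonfiction = 50 * p_each
print(f"P(signed book is non-fiction) = {p_nonfiction:.4f}")  # ~0.9615
```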

Here is a (very simple) diagram of my bookshelf, with the blue regions being non-fiction books, and the red being fiction. Clearly the safest bet is that the signed book is non-fiction, simply because there are more of them—they occupy a large majority of the volume of possibilities (which, in this case, corresponds to a physical volume on my bookshelf).

The general principle here, which Jaynes pointed out, is this: If you consider all possibilities that are consistent with the information you have, and the vast majority of those possibilities imply a certain outcome, then that outcome is very plausible. This is also a consequence of probability theory. The only way around this conclusion is to have some reason to assign highly non-uniform probabilities to the possibilities. For example, if I had some reason to think fiction books were more likely to be signed, I wouldn’t have assigned an equal probability of 1/52 to each book.

Clausius from Jaynes

The Jaynes version of the second law (about uncertainty) can be applied to all sorts of questions, and it can also explain why the Clausius version (about heat and temperature) is true.

When I say that I have a hot cup of coffee and that the air around it is cooler, it seems like quite a specific statement. But in a certain sense it is actually a very vague statement, in that it leaves out a large number of details about what’s actually happening. I didn’t tell you the colour of the mug, whether my window was open, whether the coffee was instant or not (usually instant with soy milk and two pills of sucralose if you’re wondering…I can handle the hate mail). More importantly, I also left out key details about the position and velocity (speed and direction of movement) of every molecule in the mug and the surrounding air. The high temperature of the coffee means its molecules are moving fairly rapidly, but there’s precious little information apart from that.

Based on this very vague information about the cup of coffee, can we predict what will happen in the future? Our common sense and experience say yes, very loudly. Hot cups of coffee cool down. Duh! But to a physicist the standard way of predicting the future is to use the laws of motion, which predict how particles (such as molecules of coffee and air) will move around. The catch is that we need to give the initial conditions: what are all the positions and velocities that we’re making our prediction from?

Since we don’t know the initial conditions (we only have the ‘vague’ information, specifically, the temperature), we can’t actually apply the standard laws of motion to see what will happen in the future. Any prediction we make will be a guess, or, more formally, a probability. We’re in a situation analogous to the bookshelf one. If we think of all possibilities for what the positions and velocities of the coffee and air molecules might be, the vague information we have can only restrict the possibilities down to a subset, shown as the red area below:

Now, what the laws of motion will do for you is tell you how a point in this diagram will move around over time (equivalently, how the positions and velocities of all the molecules will change over time). At a particular instant, the physical reality is represented by a single point somewhere in the red zone. As time goes on, that point will move around (as depicted by the curved arrow in the diagram). Where will it end up? That depends on precisely where in the red zone the point was initially. However, a special property of the laws of motion is that if your uncertainty about the initial point is represented by the red zone, then your uncertainty about where that point ends up will be represented by a region of the same volume. In physics, zones of uncertainty are like containers of ice cream: they can’t be compressed.

Notice that the red zone is much smaller than the green zone, representing the set of possibilities compatible with the vague statement, “The coffee is cool and the surroundings are slightly warmer (than they would be in the red zone)”. Therefore, if the red zone is moved around and changes in shape (but retains its volume), it’s possible for it to fit entirely within the green zone, simply because it’s smaller. On the diagram it’s about 20 times the area or so, but in physics it can be bigger by a factor of something like 10^(10^20). Therefore, the following rule of thumb is at least possible: if the starting position is in the red zone, the final position will be in the green zone.

The opposite situation doesn’t work. Instead, we have to admit that if the starting position is in the green zone, the final position will almost certainly not be in the red zone. Translated back into statements about coffee, ‘hot cups of coffee tend to cool down’ is a reasonable rule of thumb, but ‘cold cups of coffee in cool rooms tend to heat up’ is not, and this is all due to the volumes of the red and green regions in the set of possibilities.

There is a freaky subset of states within the green zone which would lead back to a hot coffee. But its volume is so absurdly tiny that we could never hope to engineer a coffee/air system that’s actually in that subset. Thus, the idea of making a plausible guess based on your incomplete information explains why the thermodynamic ‘heat and temperature’ version of the second law is true. In fact, the connection is so close that Clausius’s definition of entropy corresponds exactly to the size of the regions of uncertainty.

Interestingly, this explanation doesn’t really explain anything about time, since notions of time were assumed in the explanation itself. Presumably, a more fundamental explanation for time would explain it in terms of some other concepts.

Entropy: it’s not what’s there, it’s what you know

Many physicists will happily talk about “the entropy of the universe” being low in the past, or how Stephen Hawking derived the formula for “the entropy of a black hole”.

These are probably correct and interesting results (I am not very familiar with all the details of them), but I have a nit to pick. Entropy is too subtle a concept to be taken for granted. Since the exact same physical system can have more than one entropy (depending on how we want to think about it), we should be totally explicit about how we are defining and using it.

In terms of the previous diagram, entropy is not a property of the arrow (which describes what actually happened) but a property of the red and green regions (describing uncertainty). Entropy describes what is known about a system, or how much information a vague statement provides (or would provide) about the system. There is no ‘true’ entropy that we could calculate even if we knew everything there was to be known!

What rules of thumb are we looking for? Different ones may exist, and we’ll find them by using different entropies. It all depends on what specific vague statements we are interested in. For example, we might want to see whether hot coffee tends to cool down, and in this case entropy will involve heat and temperature, or we might be interested in whether gas in one half of a box will tend to mix with the gas in the other half of the box (like in the animated gif). In that case, the entropy will have nothing to do with heat transfer, but rather how much of each type of gas is on either side of the barrier.

When people first hear that a “law of physics” can be derived as a prediction based on near-ignorance, their natural inclination is to worry that a prediction based on near-ignorance might be wrong. Of course it’s possible. However, when this occurs, it’s a good opportunity to learn. If 99.99999999999% of the possibilities would imply one outcome, and that outcome does not occur, you need to figure out why, and in doing so, you might discover something new.

Evolution and economics

It’s worth thinking about whether the second law really does forbid evolution by natural selection. We don’t need to get particularly technical with the concept of entropy in order to have a go at that. All we need to ask is whether it’s plausible that a population of self-replicating organisms will tend to improve their survival and reproductive fitness over time.

The answer is yes, provided the mutation rate is sufficiently low. And if the organisms reproduce sexually, the population’s average fitness will increase even faster. 8 This isn’t the Clausius version of the second law, but an example of the Jaynes one: of all the possible deaths, reproduction events, and mutations that could plausibly occur, most would lead to an increase in the average fitness of the population. The probability that a population’s fitness would decrease is low because, for that to happen, the organisms with worse genomes would have to be reproducing more than the ones with better genomes.
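
Here is a toy simulation of that argument (my own sketch, not the author’s; every parameter is arbitrary). Parents are chosen in proportion to fitness, mutations are rare, and the population’s mean fitness almost always climbs:

```python
# Toy fitness-proportional selection with a low mutation rate (illustrative).
import random

random.seed(1)

POP_SIZE = 200
MUTATION_RATE = 0.01     # chance a birth perturbs the offspring's fitness
GENERATIONS = 50

# Each individual is represented only by a fitness value in [0, 1).
population = [random.uniform(0.0, 1.0) for _ in range(POP_SIZE)]

def mean(xs):
    return sum(xs) / len(xs)

print(f"initial mean fitness: {mean(population):.3f}")

for _ in range(GENERATIONS):
    # Fitter individuals are proportionally more likely to reproduce.
    children = random.choices(population, weights=population, k=POP_SIZE)
    # Rare mutations nudge fitness up or down at random.
    population = [
        max(0.0, f + random.gauss(0.0, 0.1)) if random.random() < MUTATION_RATE else f
        for f in children
    ]

print(f"final mean fitness:   {mean(population):.3f}")
# Typically ends far above the initial mean of ~0.5: most of the plausible
# histories raise the average, just as the argument above claims.
```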

I’m not an economist, so I don’t know whether Muse are right about the economic consequences of the second law. The economic mainstream seems to think continued economic growth is plausible and desirable. Some dissenters exist, and occasionally they seem (to my uneducated mind) to have some good points, although some others just seem to be Malthusians with a desire to take up gardening.

The second law in everyday life

The probabilistic reasoning underlying the second law is also applicable to other fields. I like to apply it to singing, which is one of my favourite hobbies. About ten years ago, I decided to learn how to sing, since it seemed fun, and not having to carry an instrument around has its advantages. Back then I couldn’t sing very well and in particular I couldn’t sing above E4, the E just above middle C. This was frustrating but very common. Most men can’t sing along with high songs unless they transpose the melody down by an octave.

It turns out that singing high notes requires very specific conditions to be achieved in the larynx, regarding air pressure and so on. To get this right, you need to apply muscular effort in your torso in a particular way that singers traditionally call support. 9 You also can’t be quiet unless you want a very gentle “falsetto” sound. The volume required is more than most people would intuitively feel is necessary. For example, I can’t (attempt to) sing high-pitched rock songs while my wife is in the same room because it hurts her ears (it’s loud regardless of whether it’s good or bad…). Your choice of vowels is also more restricted than at low pitches.

All these specific conditions (loud volume, correct support, a vowel that works) must be met in order to sing a high note. And the reason it’s hard is the same reason it’s hard to hit a hole-in-one, or carry out any precise physical feat: of all the things we could do, most of them don’t lead to a successful good-sounding note (or an accurate golf shot). If (almost) all roads lead to Rome, it’s a good bet you’ll end up in Rome. The second law of thermodynamics is as simple as that.

Understanding its logic, and how it arises from uncertain reasoning (essentially little more than probability theory), is the key to extending its use outside of physics in a sensible way.

Brendon Brewer is a senior lecturer in the Department of Statistics in Auckland. Follow him on Twitter @brendonbrewer


Biology 171


Thermodynamics refers to the study of energy and energy transfer involving physical matter. The matter and its environment relevant to a particular case of energy transfer are classified as a system, and everything outside that system is the surroundings. For instance, when heating a pot of water on the stove, the system includes the stove, the pot, and the water. Energy transfers within the system (between the stove, pot, and water). There are two types of systems: open and closed. An open system is one in which energy can transfer between the system and its surroundings. The stovetop system is open because it can lose heat into the air. A closed system is one that cannot transfer energy to its surroundings.

Biological organisms are open systems. Energy is exchanged between them and their surroundings, as they consume energy-storing molecules and release energy to the environment by doing work. Like all things in the physical world, energy is subject to the laws of physics. The laws of thermodynamics govern the transfer of energy in and among all systems in the universe.

The First Law of Thermodynamics

The first law of thermodynamics deals with the total amount of energy in the universe. It states that this total amount of energy is constant. In other words, there has always been, and always will be, exactly the same amount of energy in the universe. Energy exists in many different forms. According to the first law of thermodynamics, energy may transfer from place to place or transform into different forms, but it cannot be created or destroyed. The transfers and transformations of energy take place around us all the time. Light bulbs transform electrical energy into light energy. Gas stoves transform chemical energy from natural gas into heat energy. Plants perform one of the most biologically useful energy transformations on earth: that of converting sunlight energy into the chemical energy stored within organic molecules. (Figure) shows examples of energy transformations.

The challenge for all living organisms is to obtain energy from their surroundings in forms that they can transfer or transform into usable energy to do work. Living cells have evolved to meet this challenge very well. Chemical energy stored within organic molecules such as sugars and fats transforms through a series of cellular chemical reactions into energy within ATP molecules. Energy in ATP molecules is easily accessible to do work. Examples of the types of work that cells need to do include building complex molecules, transporting materials, powering the beating motion of cilia or flagella, contracting muscle fibers to create movement, and reproduction.


The Second Law of Thermodynamics

A living cell’s primary tasks of obtaining, transforming, and using energy to do work may seem simple. However, the second law of thermodynamics explains why these tasks are harder than they appear. None of the energy transfers that we have discussed, along with all energy transfers and transformations in the universe, is completely efficient. In every energy transfer, some amount of energy is lost in a form that is unusable. In most cases, this form is heat energy. Thermodynamically, scientists define heat energy as energy transferred from one system to another without being used to do work. For example, when an airplane flies through the air, it loses some of its energy as heat energy due to friction with the surrounding air. This friction actually heats the air by temporarily increasing air molecule speed. Likewise, some energy is lost as heat energy during cellular metabolic reactions. This is good for warm-blooded creatures like us, because heat energy helps to maintain our body temperature. Strictly speaking, no energy transfer is completely efficient, because some energy is lost in an unusable form.

An important concept in physical systems is that of order and disorder (or randomness). The more energy that a system loses to its surroundings, the less ordered and more random the system. Scientists refer to the measure of randomness or disorder within a system as entropy. High entropy means high disorder and low energy ((Figure)). To better understand entropy, think of a student’s bedroom. If no energy or work were put into it, the room would quickly become messy. It would exist in a very disordered state, one of high entropy. Energy must be put into the system, in the form of the student doing work and putting everything away, in order to bring the room back to a state of cleanliness and order. This state is one of low entropy. Similarly, a car or house must be constantly maintained with work in order to keep it in an ordered state. Left alone, a house’s or car’s entropy gradually increases through rust and degradation. Molecules and chemical reactions have varying amounts of entropy as well. For example, as chemical reactions reach a state of equilibrium, entropy increases, and as molecules at a high concentration in one place diffuse and spread out, entropy also increases.

Transfer of Energy and the Resulting Entropy

Set up a simple experiment to understand how energy transfers and how a change in entropy results.

  1. Take a block of ice. This is water in solid form, so it has a high structural order. This means that the molecules cannot move very much and are in a fixed position. The ice’s temperature is 0°C. As a result, the system’s entropy is low.
  2. Allow the ice to melt at room temperature. What is the state of molecules in the liquid water now? How did the energy transfer take place? Is the system’s entropy higher or lower? Why?
  3. Heat the water to its boiling point. What happens to the system’s entropy when the water is heated?

Think of all physical systems in this way: Living things are highly ordered, requiring constant energy input to maintain themselves in a state of low entropy. As living systems take in energy-storing molecules and transform them through chemical reactions, they lose some amount of usable energy in the process, because no reaction is completely efficient. They also produce waste and by-products that are not useful energy sources. This process increases the entropy of the system’s surroundings. Since all energy transfers result in losing some usable energy, the second law of thermodynamics states that every energy transfer or transformation increases the universe’s entropy. Even though living things are highly ordered and maintain a state of low entropy, the universe’s entropy in total is constantly increasing due to losing usable energy with each energy transfer that occurs. Essentially, living things are in a continuous uphill battle against this constant increase in universal entropy.


Section Summary

In studying energy, scientists use the term “system” to refer to the matter and its environment involved in energy transfers. Everything outside of the system is the surroundings. Single cells are biological systems. We can think of systems as having a certain amount of order. It takes energy to make a system more ordered. The more ordered a system, the lower its entropy. Entropy is a measure of a system’s disorder. The more disordered a system becomes, the lower its energy and the higher its entropy.

The laws of thermodynamics are a series of laws that describe the properties and processes of energy transfer. The first law states that the total amount of energy in the universe is constant. This means that energy cannot be created or destroyed, only transferred or transformed. The second law of thermodynamics states that every energy transfer involves some loss of energy in an unusable form, such as heat energy, resulting in a more disordered system. In other words, no energy transfer is completely efficient, and all transfers trend toward disorder.

Free Response

Imagine an elaborate ant farm with tunnels and passageways through the sand where ants live in a large community. Now imagine that an earthquake shook the ground and demolished the ant farm. In which of these two scenarios, before or after the earthquake, was the ant farm system in a state of higher or lower entropy?

The ant farm had lower entropy before the earthquake because it was a highly ordered system. After the earthquake, the system became much more disordered and had higher entropy.

Energy transfers take place constantly in everyday activities. Think of two scenarios: cooking on a stove and driving. Explain how the second law of thermodynamics applies to these two scenarios.

While cooking, food heats up on the stove, but not all of the heat goes into cooking the food; some of it is lost as heat energy to the surrounding air, increasing entropy. While driving, cars burn gasoline to run the engine and move the car. This reaction is not completely efficient, as some energy during this process is lost as heat energy, which is why the hood and the components underneath it heat up while the engine is running. The tires also heat up because of friction with the pavement, which is additional energy loss. This energy transfer, like all others, also increases entropy.


Second law of thermodynamics examples

Examples based on the entropy statement

1) Air leaks from the balloon on its own

You might have noticed this on your birthday.

The air leaks from the balloon on its own after some time.

The air never goes back inside the balloon on its own. This example is based on the entropy statement of the second law of thermodynamics. It is an example of a spontaneous process.

2) Two gases will mix on their own

If we remove the separator between the two gases, then what happens?

The two gases will mix with each other, and this process also occurs on its own. Thus this is an example of the second law of thermodynamics, which shows that the entropy of the universe increases due to this spontaneous process.

3) Hot coffee cools down automatically

This example is also based on the principle of increase in entropy.

On a shivering winter day, if your mom prepares a hot coffee for you and you do not drink it within a few minutes, then what happens to this coffee?

This coffee will cool down after some time. Right?

Yes. Obviously it will cool down. And this process occurs on its own (spontaneously).

As this process occurs spontaneously, the entropy of the universe will increase.

4) Objects fall to the ground on their own

You have already experienced this.

A stone or any other object always falls down to the ground on its own. These objects never go up automatically.

This process of falling indicates that it is a spontaneous process and for such spontaneous processes, the entropy of the universe increases.

5) Ice melts automatically

You have already seen this at least once in your life.

What happens if you keep ice on a table for some time?

It melts. This thermodynamic process occurs spontaneously (on its own).

Because of this spontaneous process, the entropy of the universe increases.

6) Water always flows from a higher level to a lower level

Have you seen water going upward automatically?

The answer is NO.

Water always flows spontaneously from a higher level to a lower level.

It never goes up automatically.

Thus this spontaneous flow of water from a higher level to a lower level indicates the increase of the entropy of the universe.

7) A gas takes the entire volume of the container

What will happen if you release some gas into a closed container? It will spread throughout the container (on its own) and occupy the entire space of the container.

This is exactly like applying perfume to your shirt: the fragrance spreads through the entire room.

This process occurs spontaneously (i.e., on its own), and because of this spontaneous process, the entropy of the universe increases.

Example based on the Kelvin-Planck statement

1) Car and bike engines

Now let me explain how a car engine or bike engine is an example of the second law, via the Kelvin-Planck statement.

According to the Kelvin-Planck statement, there must be at least two thermal reservoirs (one at a higher temperature and the other at a lower temperature); only then will the engine work.

In car and bike engines, there is a higher-temperature reservoir where heat is produced and a lower-temperature reservoir where the heat is released.

Thus these engines are examples of the second law of thermodynamics.

Example based on Clausius’s statement

1) Refrigerator using electricity to change the direction of heat flow

Do you know what happens in a refrigerator?

Heat travels from the lower-temperature body (i.e., the inside of the refrigerator) to the higher-temperature body (i.e., the outside of the refrigerator).

But this process is not possible on its own. To make this heat flow possible, external energy is supplied to the refrigerator. This external energy is electrical energy, which is used in the refrigerator's compressor to produce mechanical work.

Thus this example satisfies Clausius's statement of the second law of thermodynamics.



4.14: The Second Law of Thermodynamics - Biology

The Second Law of Thermodynamics,
Evolution, and Probability
Copyright © 1995-1997 by Frank Steiger
This document may be reproduced without royalty for non-profit, non-commercial use.

Creationists believe that the second law of thermodynamics does not permit order to arise from disorder, and therefore the macro evolution of complex living things from single-celled ancestors could not have occurred. The creationist argument is based on their interpretation of the relationship between probability and a thermodynamic property called "entropy."

By way of background, and in order to clarify the creationist position, let me quote from the creationist literature:

The Remarkable Birth of Planet Earth, by Henry Morris:

(p. 14) All processes manifest a tendency toward decay and disintegration, with a net increase in what is called the entropy, or state of randomness or disorder, of the system. This is called the Second Law of Thermodynamics.

(p. 19) There is a universal tendency for all systems to go from order to disorder, as stated in the Second Law, and this tendency can only be arrested and reversed under very special circumstances. We have already seen, in Chapter I, that disorder can never produce order through any kind of random process. There must be present some form of code or program, to direct the ordering process, and this code must contain at least as much "information" as is needed to provide this direction.
Furthermore, there must be present some kind of mechanism for converting the environmental energy into the energy required to produce the higher organization of the system involved. . . .
Thus, any system that experiences even a temporary growth in order and complexity must not only be "open" to the sun's energy but must also contain a "program" to direct the growth and a "mechanism" to energize the growth.

Scientific Creationism, edited by Henry Morris:

(p.25) The Second Law (Law of Energy Decay) states that every system left to its own devices always tends to move from order to disorder, its energy tending to be transformed into lower levels of availability, finally reaching the state of complete randomness and unavailability for further work.

Of course, the creationist application of the second law of thermodynamics to the development of living things is inconsistent with any model of origins. Creationists get around this problem by invoking the supernatural:

The Genesis Flood , by Whitcomb and Morris:

(p. 223) But during the period of Creation, God was introducing order and organization into the universe in a very high degree, even to life itself! It is thus quite plain that the processes used by God in creation were utterly different from the processes which now operate in the universe!

As will be shown later on, it is only the over-all entropy of a complete, or closed system that must increase when spontaneous change occurs. In the case of spontaneously interacting sub-systems of a closed system, some may gain entropy, while others may lose entropy. For example, it is a fundamental axiom of thermodynamics that when heat flows from subsystem A to subsystem B, the entropy of A decreases and the entropy of B increases. The statement that an increase in order can only occur as the result of a directional mechanism, program, or code is misleading. Any process that can be demonstrated to take place with an increase in order/decrease in entropy is arbitrarily deemed to be the consequence of an undefined "directional mechanism."

Probability, as used in thermodynamics, means the probability that some specific change will occur. Probability is related to the thermodynamic concept of irreversibility. An irreversible physical or chemical change is a change that will not spontaneously reverse itself without some change in the surrounding conditions. Irreversible changes have a high degree of probability. The probability of an irreversible change spontaneously reversing itself without outside interference is zero.

When we say that a change is irreversible (in the thermodynamics sense) it means only that the change will not spontaneously reverse itself without some change in the surrounding conditions. It does not mean that it cannot be reversed by any means at all!

It is important to remember that a change that has a high degree of probability under one set of circumstances may have a very low degree of probability under a different set of circumstances. To illustrate: If the temperature drops below freezing, the probability of water becoming ice is very high. The change from water to ice is thermodynamically irreversible. If the surrounding temperature should happen to rise above the freezing point, the probability of water becoming ice, or remaining as ice, is zero. Under these conditions the reverse change of ice to liquid water is also thermodynamically irreversible.

Failure to understand that in thermodynamics probabilities are not fixed entities has led to a misinterpretation that is responsible for the widespread and totally false belief that the second law of thermodynamics does not permit order to spontaneously arise from disorder. In fact, there are many examples in nature where order does arise spontaneously from disorder: Snowflakes with their six-sided crystalline symmetry are formed spontaneously from randomly moving water vapor molecules. Salts with precise planes of crystalline symmetry form spontaneously when water evaporates from a solution. Seeds sprout into flowering plants and eggs develop into chicks.

Thermodynamics is an exact science that is based on a limited number of specific mathematical concepts. It is not explainable in terms of qualitative metaphors. In order to understand the relationship between probability and the second law, the reader must be familiar with the relationship between probability and entropy. Entropy is a mathematically defined entity which is the fundamental basis of the second law of thermodynamics and all of its engineering and physical chemistry ramifications.

In the following sections we will try to explain the true relation between entropy and probability and show why this relationship does not preclude the possibility of order spontaneously arising from disorder.

In describing the laws of thermodynamics we often refer to "systems." A system is a specific entity or object or region in space to be evaluated in terms of its thermodynamic properties and possible changes. It could be an ice cube, a toy balloon, a steam turbine, or even the entire earth itself.

The concept of entropy is fundamental to understanding the second law of thermodynamics. Entropy (or more specifically, increase in entropy) is defined as heat (in calories or Btu's) absorbed by a system, divided by the absolute temperature of the system at the time the heat is absorbed. Absolute temperature is the number of degrees above "absolute zero", the coldest temperature that can exist.

The total entropy in a system is represented by the symbol S. The symbol ΔS is used to represent a given change in the entropy content of a system. If the symbol q is used to represent the amount of heat absorbed by a system, the equation for the resulting entropy increase is:

ΔS = q/T .... (1)

where T is the absolute temperature. When heat is absorbed, the entropy of a system increases; when heat flows out of a system, its entropy decreases.

The "surroundings" of a system is everything outside of the system that can interact with it; the surroundings can usually be defined as the space that surrounds a system. When heat is evolved by a system, that same heat is absorbed by its surroundings. When heat is absorbed by a system, that same heat must necessarily come from its surroundings. Therefore any entropy increase in a system due to heat flow must be accompanied by an entropy decrease in the surroundings, and vice versa. When heat flows spontaneously from a hotter region to a cooler region, the entropy decrease in the hotter region will always be less than the entropy increase in the cooler region, because the greater the absolute temperature, the smaller the entropy change for any particular heat flow. (See equation (1), above.)

As an example, consider the entropy change when a large rock at 500 degrees absolute is dropped into water at 650 degrees absolute. (We are using an absolute temperature scale based on Fahrenheit degrees; on this scale, water freezes at 492 degrees.) For each Btu of heat that flows into the rock at these temperatures, the entropy increase in the rock is 1/500 = 0.0020 and the entropy decrease of the water is 1/650 = 0.0015. The difference between these values is 0.0020 - 0.0015 = 0.0005. This represents the over-all entropy increase of the system (rock) and its surroundings (water).

Of course the rock will warm up to, and the water cool to, a temperature intermediate between their original temperatures, thus considerably complicating the calculation of total entropy change after equilibrium is achieved. Nevertheless, for every Btu of heat transferred from water to rock there will always be an increase of over-all net entropy.
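
As a quick check of the per-Btu arithmetic above (using the same illustrative temperatures on the article's Fahrenheit-based absolute scale):

```python
# Per-Btu entropy bookkeeping for the rock-and-water example (illustrative).
T_ROCK = 500.0    # degrees absolute: the cooler rock
T_WATER = 650.0   # degrees absolute: the hotter water

heat = 1.0  # one Btu flowing from the water into the rock

dS_rock = heat / T_ROCK       # +0.0020 Btu/deg gained by the rock
dS_water = -heat / T_WATER    # -0.0015 Btu/deg lost by the water
print(f"net entropy change: {dS_rock + dS_water:+.4f} Btu/deg")  # +0.0005
```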

As was mentioned before, a spontaneous change is an irreversible change. Therefore an increase in the overall net entropy can be used as a measure of the irreversibility of spontaneous heat flow.

Irreversible changes in a system can, and often do, take place even though there may be no interaction, and negligible heat flow, between system and surroundings. In cases like these the entropy "content" of the system is greater after the change than before. Even when heat flow does not occur between system and surroundings, spontaneous changes inside an isolated system are always accompanied by an increase in the system's entropy, and this calculated entropy increase can be used as a measure of irreversibility. The following paragraphs will explain how this entropy increase can, at least in some cases, be calculated.

It is an axiom of thermodynamics that entropy, like temperature, pressure, density, etc., is a property of a system and depends only on the existing condition of the system. Regardless of the procedures followed in achieving a given condition, the entropy content for that condition is always the same. In other words, for any given set of values for pressure, temperature, density, composition, etc., there can be only one value for the entropy content. It is essential to remember this: When a system that has undergone an irreversible change is restored to its original condition (same temperature, pressure, volume, etc.) its entropy content will likewise be the same as it was before.

In cases where an isolated system undergoes an entropy increase as the result of a spontaneous change inside the system, we can calculate that entropy increase by postulating a procedure whereby the system's entropy increase is transferred to the surroundings in a manner such that there is no further increase in net entropy and the system is restored to its original condition. The entropy increase of the surroundings can then be readily calculated by equation (1): ΔS = q/T, where q = heat absorbed by the surroundings, and T = absolute temperature of the surroundings.

It bears repeating that when the system is restored to its original condition, its entropy content will be the same as it was before its irreversible change. Therefore the amount of entropy absorbed by the surroundings during restoration must necessarily be the same as the entropy increase accompanying the system's original irreversible change, providing that there is no further increase in net entropy during restoration.

This postulated restoration procedure and the postulated properties of the surroundings are for the purpose of calculation only. Since we are not dealing with the surroundings as such, they can be postulated in whatever form necessary to simplify the calculations; it is neither necessary nor desirable that the surroundings correspond to any condition that could actually exist. Therefore, we will postulate a theoretical restoration procedure that takes place with no further increase in net entropy, even though such a procedure can not actually be obtained experimentally.

The restoration process, if it were to take place in actuality, would have to be accompanied by at least a small amount of irreversibility, and hence an additional increase in the entropy of the surroundings beyond the entropy increase from the system's original irreversible change. This is because heat will not flow without a temperature differential, friction cannot be entirely eliminated, etc. Therefore the restoring process, if it is to take place with no further increase in over-all net entropy, must be postulated to take place with no irreversibility. If such a process could be actually realized, it would be characterized by a continuous state of equilibrium (i.e. no pressure or temperature differentials) and would occur at a rate so slow as to require infinite time. Processes like these are called "reversible" processes. Remember, reversible processes are postulated to simplify the calculation of the entropy change in a system; it is not necessary that they be capable of being achieved experimentally.

It should not be assumed that equation (1) requires that q, the heat absorbed, must necessarily be absorbed reversibly. The concept of reversibility is merely a means to an end: the calculation of entropy change accompanying an irreversible process.

The following example will illustrate the calculation of a reversible restoring process and at the same time develop the equation which is the basis for the thermodynamical relationship between probability and the second law. We will postulate a system consisting of an "ideal" gas contained in a tank connected to a second tank that has been completely evacuated, with the valve between the two tanks closed. The temperature of the system and its surroundings is postulated to be the same. An ideal gas is one whose molecules are infinitely small and have no attractive or repulsive forces on each other. (Under ordinary conditions hydrogen and helium closely approximate the properties of an ideal gas.) An ideal gas is chosen in order to develop the basic relationship without introducing complicating correction factors to account for the size of the molecules and the forces they exert on each other.

When the valve is opened the gas expands irreversibly from V1 (its original volume) to V2 (the volume of both tanks). There is no work of compression by or upon the surroundings. Because the gas is ideal there is no temperature change, and hence no heat flow takes place.

After expanding irreversibly from V1 to V2, the gas is restored to its original condition by reversibly compressing it back to V1. This compression requires work (force applied through a distance) which in turn generates heat in the gas, heat that is absorbed by the surroundings so that there is no increase in the gas temperature. In our mathematical model of this reversible restoring process, the surroundings are postulated to be so large that they also do not undergo any temperature increase. The temperature T remains unchanged during the entire irreversible expansion and subsequent reversible restoration process.

The work of compressing the gas during restoration is equal to the pressure of the gas times the volume change due to compression. Because the pressure increases during compression, the work of compression must be determined by the calculus integral:

W = ∫ P dV

The integral sign indicates the summation of all the individual values of P dV.

The equation relating temperature, pressure, and volume of an ideal gas is:

PV = RT .... (2)

(for one mole of gas, where R is the gas constant). In the case of a reversible, isothermal compression of an ideal gas we may substitute P from equation (2) into the equation for compression work. When this is done, we have:

W = RT ∫ dV/V .... (3)

Although it is not necessary that our postulated reversible restoration process be capable of being carried out in a practical sense, it is nevertheless sometimes helpful to be able to visualize the process. To this end, the reader may consider the restoring compression process being brought about by a piston fitted into the end of the second tank. On compression from V2 to V1, the piston moves down the length of the second tank, and with no mechanical friction forces all the gas contained therein back into the first tank V1.

Since the work of compression is equal to q, the heat absorbed by the surroundings, q may be substituted in equation (3) to give:

q = RT ∫ dV/V .... (4)

From equation (1), the entropy gained by the surroundings during restoration from V2 to V1 is ΔS = q/T. Substituting from equation (4):

ΔS = R ∫ dV/V

Upon integrating (a calculus procedure for summing up the individual values of dV/V) we have:

ΔS = R ln(V2/V1) .... (5)

where ln(V2/V1) is the natural logarithm of the ratio of the expanded volume to the initial volume, and ΔS is equal to the entropy increase in the surroundings upon restoration compression from V2 to V1. As we have seen, ΔS is also equal to the entropy increase of the gas caused by its original expansion from V1 to V2. This is because V1 is the same volume both before expansion and after restoration compression, and therefore has the same entropy content. Therefore the entropy transferred to the surroundings during restoration is equal to that gained by the system in expanding from V1 to V2.

The ratio of the probability that all the gas molecules are evenly distributed between the two tanks to the probability that all the molecules, of their own accord and by random motion, would be in tank V1 is equal to (V2/V1)^N, where N is the number of molecules.

If V2/V1 were equal to 2.0, for example, and N were equal to 10, the probability ratio would be 2 to the tenth power, or 1024. For N = 100, the ratio would be approximately 1.27 times ten to the 30th power. It is clear that the random motion of trillions of gas molecules heavily favors a uniform distribution. From the probability equation, where X2 and X1 are the probabilities of the final (uniform) and initial (all in V1) distributions, we have:

V2/V1 = (X2/X1)^(1/N)

Taking the natural logarithm of both sides, and then multiplying both sides by R, the gas constant:

R ln(V2/V1) = (R/N) ln(X2/X1)

Substituting in equation (5):

ΔS = (R/N) ln(X2/X1) .... (6)
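
These ratios are easy to verify numerically; the following check is illustrative and not part of the original article:

```python
# Probability ratio of "evenly spread" to "all in tank V1", which scales
# as (V2/V1)**N for N molecules.
V2_over_V1 = 2.0

for N in (10, 100):
    ratio = V2_over_V1 ** N
    print(f"N = {N:3d}: ratio = {ratio:.3g}")

# N =  10: ratio = 1.02e+03  (i.e. 1024)
# N = 100: ratio = 1.27e+30
```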

Equation (6) represents the fundamental relationship between probability and the second law of thermodynamics. It states that the entropy of a gaseous system increases when its molecular distribution changes from a lower probability to a higher probability (X2 greater than X1).

Based on the belief that the laws of thermodynamics are universal, this equation has been assumed to apply to all systems, not just gaseous ones. In other words, any entropy change is proportional to the logarithm of the ratio of probabilities. Therefore, for the general case, equation (6) can be written:

ΔS = K ln(X2/X1) .... (7)

where K is a constant depending on the particular change involved. However, individual values of K, X1, or X2 are seldom, if ever, known for non-gaseous systems.

As we have seen before, ΔS can be either positive or negative. When ΔS is negative, equation (7) can be written:

ΔS = K ln(X1/X2), with X1 less than X2

Therefore a system can go from a more probable state (X2) to a less probable state (X1), providing ΔS for the system is negative. In cases where the system interacts with its surroundings, ΔS can be negative providing the over-all entropy of the system and its interacting surroundings is positive; the over-all change can be positive if the entropy increase of the surroundings is numerically greater than the entropy decrease of the system.

In the case of the formation of the complex molecules characteristic of living organisms, creationists raise the point that when living things decay after death, the process of decay takes place with an increase in entropy. They also point out, correctly, that a spontaneous change in a system takes place with a high degree of probability. They fail to realize, however, that probability is relative, and a spontaneous change in a system can be reversed, providing the system interacts with its surroundings in such a manner that the entropy increase in the surroundings is more than enough to reverse the system's original entropy increase.

The application of energy can reverse a spontaneous, thermodynamically "irreversible" reaction. Leaves will spontaneously burn (combine with oxygen) to form water and carbon dioxide. The sun's energy, through the process of photosynthesis, will produce leaves from water and carbon dioxide, releasing oxygen.

If we unplug a refrigerator, heat will flow to the interior from the surroundings; the entropy increase inside the refrigerator will be greater than the entropy decrease in the surroundings, and the net entropy change is positive. If we plug it in, this spontaneous "irreversible" change is reversed. Due to the input of electrical energy to the compressor, the heat transferred to the surroundings from the condenser coils is greater than the heat extracted from the refrigerator's interior, and the entropy increase of the surroundings is greater than the entropy decrease of the interior, in spite of the fact that the surroundings are at a higher temperature. Here again, the net entropy change is positive, as would be expected for any spontaneous process.
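To make the bookkeeping concrete, here is a minimal Python sketch; the temperatures, heat flow, and compressor work are assumed round numbers, not measurements from any real appliance:

```python
# Entropy bookkeeping for a running refrigerator (illustrative values).
T_cold = 275.0   # interior temperature, K
T_hot  = 295.0   # room temperature, K
Q_cold = 1000.0  # heat extracted from the interior, J
W      = 200.0   # electrical work input to the compressor, J
Q_hot  = Q_cold + W  # heat rejected at the condenser coils, J

dS_interior     = -Q_cold / T_cold  # interior cools: entropy decrease
dS_surroundings =  Q_hot  / T_hot   # room warms: entropy increase
print(dS_interior)                    # ~ -3.64 J/K
print(dS_surroundings)                # ~ +4.07 J/K
print(dS_interior + dS_surroundings)  # ~ +0.43 J/K: positive overall,
                                      # as the second law requires
```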

In a similar manner, the application of electrical energy can reverse the spontaneous reaction of hydrogen and oxygen to form water: when a current is passed through water made conductive by a small amount of dissolved electrolyte, hydrogen is liberated at one electrode and oxygen at the other.

As can easily be confirmed experimentally, agitating water raises its temperature. When water falls freely from a higher elevation to a lower elevation, its energy is changed from potential to kinetic, and finally to heat as it splashes at the end of its fall. The second law of thermodynamics states that the water will not spontaneously raise itself to its original elevation using the heat produced on splashing as the sole source of energy. To do so would require a heat engine that would convert all of the heat of splashing to mechanical energy.

The efficiency of a heat engine is thermodynamically limited by the Carnot cycle, which in this case limits the efficiency to \(\Delta T/T\), where \(\Delta T\) is the temperature increase due to splashing and T is the absolute temperature. Since \(\Delta T\) is only a tiny fraction of T, no device could be constructed that would allow all of the water to spontaneously jump back to its former elevation.

We can, at least in theory, calculate the entropy increase of the water resulting from its irreversible change in falling. In a manner analogous to that used in the previous example, the entropy increase would be equal to the heat generated by splashing agitation, divided by the absolute temperature. If some of the energy of the falling water is extracted by a water wheel, there will be less heat of splashing and hence less entropy increase.
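A short, hedged calculation makes both points quantitatively; the 50 m fall height and the ambient temperature below are assumed purely for illustration:

```python
g   = 9.81    # gravitational acceleration, m/s^2
h   = 50.0    # assumed fall height, m
c_p = 4186.0  # specific heat of water, J/(kg*K)
T   = 293.0   # assumed ambient absolute temperature, K

q  = g * h           # heat of splashing per kg of water, J/kg (~490 J/kg)
dT = q / c_p         # temperature rise from agitation, K (~0.12 K)
eta_carnot = dT / T  # Carnot limit on a splash-heat engine (~0.0004)
dS = q / T           # entropy increase per kg of water, J/(kg*K) (~1.67)
print(dT, eta_carnot, dS)
```

With only about 0.04% of the splash heat recoverable as work, the water plainly cannot use it to climb back up.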

A properly designed turbine could extract most of the water's kinetic energy. This is not the same thing as trying to utilize the heat of splashing as an energy source for a heat engine to raise the water. In other words, using the energy before it becomes heat is much more efficient than trying to use it after it becomes heat.

If a water wheel is connected by shafts, belts, pulleys, etc. to a pump, the pump can raise water from the downstream side of the water wheel to an elevation even higher than that of the upstream reservoir. Some of the water would spontaneously raise itself to an elevation even higher than original, but the rest of it would end up below the water wheel on the downstream side.

While it is not possible for all of the water to raise itself to an elevation higher than its initial elevation, it is possible for some of the water to spontaneously raise itself to an elevation higher than initial.

As with any other irreversible change, there will be an increase in over-all entropy. This means that the entropy increase of the water going over the wheel is greater than the entropy decrease of water pumped up to the higher elevation.

This will be confirmed mathematically in the following paragraphs. The Greek letter \(\gamma\) (gamma) will stand for unit weight in pounds per cubic foot, and an increase in the value of a parameter will be represented by \(\Delta\) (delta). From the flow equation, energy in = energy out: the total energy available, \(\gamma h\), is divided into pump work, \(f\gamma(h+\Delta h)\), and energy lost, \(T\Delta S\):

\[ \gamma h = f\gamma(h+\Delta h) + T\Delta S \tag{8} \]

Rearranging:

\[ T\Delta S = \gamma h - f\gamma(h+\Delta h) \]

When no pump work is done, then:

\[ T\Delta S' = \gamma h \tag{9} \]

Combining equations (8) and (9), we get:

\[ T\Delta S' = T\Delta S + f\gamma(h+\Delta h) \tag{10} \]

Here \(\Delta S'\) is the entropy increase in the case where the water falls freely without turning the water wheel or operating the pump. Equation (10) shows that \(\Delta S'\) is larger than \(\Delta S\), and that the entropy increase due to pump friction and downstream agitation is "backed up" by the even larger entropy increase that takes place when water falls freely. Equation (10) also shows that the lower the value of \(\Delta S\) (that is, the more efficient the installation), the greater the value of f, the fraction of water pumped.
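The following sketch checks the balance of equations (8) through (10) numerically; the unit weight, head, and pump lift are assumed values chosen only for illustration:

```python
import math

gamma = 62.4   # unit weight of water, lb/ft^3
h     = 10.0   # head available at the water wheel, ft
dh    = 2.0    # extra lift supplied by the pump, ft

TdS_free = gamma * h                        # eq. (9): free fall, no pump work
for f in (0.0, 0.4, 0.8):                   # fraction of water pumped
    TdS = gamma * h - f * gamma * (h + dh)  # eq. (8) rearranged: energy lost
    # eq. (10): free-fall entropy term = pumped-case term + pump work
    assert math.isclose(TdS + f * gamma * (h + dh), TdS_free)
    print(f, TdS)

# TdS must stay positive, so f can be at most h/(h + dh), about 0.83 here:
# some water always ends up downstream, below the wheel.
```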

Creationists assume that a change characterized by a decrease in entropy cannot occur under any circumstances. In fact, spontaneous entropy decreases can, and do, occur all the time, provided sufficient energy is available. The fact that the water wheel and pump are man-built contraptions has no bearing on the case: thermodynamics does not concern itself with the detailed description of a system; it deals only with the relationship between the initial and final states of a given system (in this case, the water wheel and pump).

A favorite argument of creationists is that the probability of evolution occurring is about the same as the probability that a tornado blowing through a junkyard could assemble an airplane. They base this argument on their belief that changes in living things have a very low probability and could not occur without "intelligent design," which overcomes the laws of thermodynamics. This position contains a fundamental contradiction: evolution (they say) is inconsistent with thermodynamics because thermodynamics does not permit order to arise spontaneously from disorder, yet creationism (in the guise of intelligent design) is not required to be consistent with those same laws.

A simpler analogy to the airplane/junkyard scenario would be the stacking of three blocks neatly on top of each other. To do this, intelligent design is required, but stacking does not violate the laws of thermodynamics; the same relations hold for this activity as for any other process involving energy changes. It is true that the blocks will not stack themselves, but as far as thermodynamics is concerned, all that is required is the energy to pick them up and place them one on top of the other. Thermodynamics merely correlates the energy relationships in going from state A to state B. If the energy relationships permit, the change may occur; if they don't, it cannot. A ball will not spontaneously leap up from the floor, but if it is dropped, it will spontaneously bounce up from the floor. Whether the ball is lifted by intelligent design or just happens to fall makes no difference.

On the other hand, thermodynamics does not rule out the possibility of intelligent design; it is simply not a factor in the calculation of thermodynamic probability.

Considering the earth as a system, any change that is accompanied by an entropy decrease (and hence a move from a more probable to a less probable state) is possible as long as sufficient energy is available. The ultimate source of most of that energy is, of course, the sun.

The numerical calculation of entropy changes accompanying physical and chemical changes is very well understood and is the basis of the mathematical determination of free energy, the emf characteristics of voltaic cells, equilibrium constants, refrigeration cycles, steam turbine operating parameters, and a host of other quantities. The creationist position would necessarily discard the entire mathematical framework of thermodynamics and would provide no basis for the engineering design of turbines, refrigeration units, industrial pumps, etc. It would do away with the well-developed mathematical relationships of physical chemistry, including the effect of temperature and pressure on equilibrium constants and phase changes.




Forward and Backward

You might hear the term reversibility. Scientists use the term reversibility to describe processes in which a system remains in equilibrium with itself and the environment around it, so that any change occurring in one direction is balanced by an equal change in the opposite direction. Reversibility means that the effects of a process can be completely undone: both the system and its surroundings can be restored to their original states, leaving no net change in either.


Why is it that when you leave an ice cube at room temperature, it begins to melt? Why do we get older and never younger? And why is it that whenever rooms are cleaned, they become messy again in the future? Certain things happen in one direction and not the other; this one-way character of events is called the "arrow of time," and it encompasses every area of science. The thermodynamic arrow of time is entropy, the measure of disorder within a system. Denoted \(\Delta S\), the change in entropy suggests that time itself is asymmetric with respect to the order of an isolated system: as time increases, the system becomes more disordered.

Major players in developing the Second Law

  • Nicolas Léonard Sadi Carnot was a French physicist who is considered the "father of thermodynamics," for he is responsible for the origins of the Second Law of Thermodynamics, as well as various other concepts. The current form of the second law uses entropy rather than caloric, the term Carnot used to describe heat. Carnot came to realize that some caloric is always lost in the working cycle of an engine; perfect thermodynamic reversibility is therefore an idealization, and some irreversibility accompanies every real system that produces work.
  • Rudolf Clausius was a German physicist who developed the Clausius statement, which says "Heat generally cannot flow spontaneously from a material at a lower temperature to a material at a higher temperature."
  • William Thomson, also known as Lord Kelvin, formulated the Kelvin statement, which states "It is impossible to convert heat completely into work in a cyclic process." This means that there is no way to convert all the energy of a system into work without losing some of it.
  • Constantin Carathéodory, a Greek mathematician, created his own statement of the second law, arguing that "In the neighborhood of any initial state, there are states which cannot be approached arbitrarily closely through adiabatic changes of state."

To understand why entropy increases and decreases, it is important to recognize that two changes in entropy have to be considered at all times: the entropy change of the surroundings and the entropy change of the system itself. The entropy change of the universe is the sum of the changes in entropy of the system and surroundings:

\[ \Delta S_{univ} = \Delta S_{sys} + \Delta S_{surr} \]

In an isothermal reversible expansion, the heat q absorbed by the system from the surroundings is

\[ q_{rev} = nRT\,\ln\frac{V_2}{V_1} \]

Since the heat absorbed by the system is the amount lost by the surroundings, \(q_{sys} = -q_{surr}\). Therefore, for a truly reversible process, the entropy change of the universe is

\[ \Delta S_{univ} = \frac{q_{rev}}{T} + \left(-\frac{q_{rev}}{T}\right) = 0 \]

If the process is irreversible, however, the entropy change of the universe is

\[ \Delta S_{univ} = \Delta S_{sys} + \Delta S_{surr} > 0 \]

If we put the two equations for \(\Delta S_{univ}\) together for both types of processes, we are left with the second law of thermodynamics,

\[ \Delta S_{univ} = \Delta S_{sys} + \Delta S_{surr} \geq 0 \]

where \(\Delta S_{univ}\) equals zero for a truly reversible process and is greater than zero for an irreversible process. In reality, truly reversible processes never happen (or would take an infinitely long time to happen), so it is safe to say that all the thermodynamic processes we encounter every day are irreversible in the direction they occur.
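A brief Python sketch illustrates the two cases, under assumed conditions (1 mol of ideal gas doubling its volume at 298 K):

```python
import math

R, T, n = 8.314, 298.0, 1.0
V1, V2 = 1.0, 2.0

# Entropy is a state function, so the system's change is the same
# for any path between the same two states:
dS_sys = n * R * math.log(V2 / V1)   # ~ +5.76 J/K

# Reversible path: the surroundings supply q_rev at the same temperature.
q_rev = n * R * T * math.log(V2 / V1)
dS_surr_rev = -q_rev / T
print(dS_sys + dS_surr_rev)    # ~0 (within floating-point error): reversible

# Free (irreversible) expansion into a vacuum: no heat is exchanged.
dS_surr_irrev = 0.0
print(dS_sys + dS_surr_irrev)  # ~ +5.76 J/K > 0: irreversible
```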

The second law of thermodynamics can also be stated as: "all spontaneous processes produce an increase in the entropy of the universe".


Three Laws of Biology

Immense scientific progress has been made in recent centuries, and the time required to double our knowledge continues to shrink. In recent decades, the sequencing of genomes from diverse species has been a primary driving force behind the expansion of biological knowledge, and it has become central to the study of molecular and organismal evolution. The technologies that enable genomics, molecular medicine, and computing to forge ahead at such rapid, interdependent paces are recognized as central to our understanding of Earth's biosphere and to sustaining it for future generations.

In recent years, biology has been at the forefront of science as we satisfy our desires to understand the nature of living organisms and their evolutionary histories. The statements that follow are based on reams of evidence. Only when each statement is integrated with the others does a reasonably complete picture of life become possible. We enlist the assistance of the international scientific community to inform us of any modifications and exceptions to existing scientific dogma so that our concepts can continuously be refined. Only via this approach has it been possible to establish some basic laws of biology. The First Law of Biology: all living organisms obey the laws of thermodynamics. The Second Law of Biology: all living organisms consist of membrane-encased cells. The Third Law of Biology: all living organisms arose in an evolutionary process.

The First Law of Biology: all living organisms obey the laws of thermodynamics. This law is fundamental because the laws of the inanimate world determine the course of the universe. All organisms on all planets, including humans, must obey these laws. The laws of thermodynamics govern energy transformations and mass distributions. Cells that comprise living organisms (see The Second Law) are open systems that allow both mass and energy to cross their membranes. Cells exist in open systems so as to allow acquisition of minerals, nutrients, and novel genetic traits while extruding end products of metabolism and toxic substances. Genetic variation, which results in part from gene transfer in prokaryotes and sexual reproduction in higher organisms, allows tremendously increased phenotypic variability in a population as well as an accelerated rate of evolutionary divergence.

A corollary of the First Law is that life requires the temporary creation of order in apparent contradiction to the second law of thermodynamics. However, considering a completely closed system, including the materials and energy sources provided by the environment for the maintenance of life, living organisms affect the system strictly according to this law, by increasing randomness or chaos (entropy). Resource utilization by living organisms thus increases the entropy of the world. A second corollary of the First Law is that an organism at biochemical equilibrium is dead. When living organisms reach equilibrium with their surrounding environment, they no longer exhibit the quality of life. Life depends on interconnected biochemical pathways to allow for growth, macromolecular synthesis, and reproduction. Thus, all life forms are far from equilibrium with their environments.

The Second Law of Biology: all living organisms consist of membrane-encased cells. Enveloping membranes allow physical separation between the living and the non-living worlds. Viruses, plasmids, transposons, prions, and other selfish biological entities cannot reproduce by themselves; they depend on a living cell for this purpose, and by definition, therefore, they are not alive. A corollary of the Second Law is that the cell is the only structure that can grow and divide independently of another life form. A second corollary of the Second Law is that all life is programmed by genetic instructions. Genetic instructions are required for cell division, morphogenesis, and differentiation. From single-celled prokaryotic organisms to normal or cancerous tissues in multicellular animals and plants, genetic instructions are required for the maintenance of life.

The Third Law of Biology: all living organisms arose in an evolutionary process. This law correctly predicts the relatedness of all living organisms on Earth. It explains all of their programmed similarities and differences. Natural selection occurs at organismal (phenotypic) and molecular (genotypic) levels. Organisms can live, reproduce, and die. If they die without reproducing, their genes are usually removed from the gene pool, although exceptions exist. At the molecular level, genes and their encoding proteins can evolve “selfishly,” and these can combine with other selfish genes to form selfish operons, genetic units and functional parasitic elements such as viruses.

Two corollaries of the Third Law are that (1) all living organisms contain homologous macromolecules (DNA, RNA, and proteins) derived from a common ancestor, and (2) the genetic code is universal. These two observations provide compelling evidence for the Third Law of Biology. Because of his accurate enunciation of the Third Law, Charles Darwin is considered by many to be the greatest biologist of all time.

Although science is continually pushing back the frontiers of our knowledge, we will never know everything. In fact, we do not even know what we do not know. For example, we may never know how life arose. Although life may be sprinkled throughout the universe, life is not required for the continuity of inanimate matter; that is, living organisms are not essential for the universe to function. The laws of physics continue to apply regardless of the presence of life. To the best of our knowledge, life can only arise from pre-existing life. This of course raises the question of how the first living cell(s) might have arisen. Did life spontaneously arise from inanimate nature just once, or more than once? Can life be transferred between receptive planets through space travel? We simply do not know. The mechanisms that may have led to the origin of a cell capable of autonomous growth and division are a mystery. This is an area of biology that will require a tremendous amount of scientific research if evidence is ever to become available, and there are no guarantees.

The rules of biology and science cannot be broken. They are not artificial human-made laws. They are natural laws that govern all life while living organisms are evolving on our planet. In recent decades, humans have altered our common, shared biosphere with resource depletion and pollution. We know that these activities have upset the balance of Nature, causing extensive species extinction. The most significant forms of pollution can be attributed to too many humans consuming too many non-renewable resources at an ever-increasing rate. Much of this harm is driven by pleasure, greed, conflict, and the desire for power. To varying degrees, we are all to blame.

Why do so many people assault the biosphere in such a primitive manner? Some are ignorant of the outcome. They are oblivious to the consequences of their actions. They do not recognize that incorrect action can have disastrous outcomes for our biosphere and us all. They do not understand that natural selection is cruel and can cause immense suffering and death. They think only of the moment and refuse to accept that it is their offspring who will have to face calamity. Still others are fully aware of the ultimate consequences, and those of us who are aware must take action to pass on our knowledge in an attempt to avoid, or at least delay, our self-imposed fate. Research tells us that we are assaulting our biosphere and that the planet cannot accommodate our huge human population. We depend on natural resources for the continuance of our existence, but we are not living sustainably. This planet does not need more consumption and pollution; it is groaning under the weight of our ever-increasing human population. Entropy will have its way. It would help if everyone understood science and our natural world, so as to recognize what is required for the survival of the human species with some reasonable quality of life; the first step in this direction is to understand the basic laws of physics, chemistry, and biology and how they govern our biosphere, which is currently under assault and in need of rescue. However, without profound respect for Nature and compassion for life, all life, knowledge is likely to be insufficient. We must develop into more caring, sensitive, and compassionate beings.

