B.4.2 Entropy and system evolution (HL only)

Entropy: A Measure of Disorder

Entropy is a fundamental concept in thermodynamics that describes the degree of disorder or randomness in a system.

Definition

Entropy

Entropy is a measure of the disorder or randomness in a system. It quantifies the number of possible ways a system can be arranged.

Calculating Entropy Change

The change in entropy, ΔS, is defined as:

ΔS = ΔQ / T

where:

  • ΔQ is the heat added to the system.
  • T is the absolute temperature (in Kelvin) at which the heat is added.

Example

If 200 J of heat is added to a system at 400 K, the change in entropy is:

ΔS = 200 J / 400 K = 0.5 J K⁻¹
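
The same arithmetic can be scripted. Below is a minimal Python sketch; the function name entropy_change and the example values are illustrative, not part of the syllabus.

```python
def entropy_change(heat_joules, temperature_kelvin):
    """Return the entropy change dS = dQ / T for heat added at constant temperature."""
    if temperature_kelvin <= 0:
        raise ValueError("Absolute temperature must be positive (in kelvin).")
    return heat_joules / temperature_kelvin

# 200 J of heat added to a system at 400 K
print(entropy_change(200, 400))  # 0.5 J K^-1
```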

Tip

Entropy increases when heat is added to a system and decreases when heat is removed.

The Second Law of Thermodynamics

The second law of thermodynamics introduces the concept of entropy to explain why certain processes occur spontaneously while others do not.

Definition

The second law of thermodynamics

The second law of thermodynamics states that the total entropy of an isolated system never decreases: it increases in any real (irreversible) process and remains constant only in an idealized reversible process.

Entropy Increases in Isolated Systems

  1. In an isolated system (where no energy or matter is exchanged with the surroundings), the total entropy always increases or remains constant.
  2. This is often stated as:
    • Entropy increases in realistic, irreversible processes.
    • Entropy remains constant in idealized, reversible processes.

Example

  • When a hot object is placed in contact with a cold object, heat flows from the hot object to the cold one.
  • The entropy of the hot object decreases, but the entropy of the cold object increases by a larger amount, leading to a net increase in entropy.
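
To see the "larger amount" claim numerically, here is a rough Python sketch that treats both objects as large reservoirs at fixed temperatures (an assumption; real objects change temperature as heat flows, and the values are illustrative).

```python
# Heat Q flows from a hot reservoir (T_hot) to a cold reservoir (T_cold).
Q = 100.0       # heat transferred, J (illustrative value)
T_hot = 500.0   # K
T_cold = 300.0  # K

dS_hot = -Q / T_hot    # hot object loses entropy
dS_cold = +Q / T_cold  # cold object gains more entropy (smaller T in the denominator)
dS_total = dS_hot + dS_cold

print(dS_hot, dS_cold, dS_total)  # -0.2, +0.333..., +0.133... J K^-1 (net increase)
```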

Common Mistake

A common misconception is that entropy always increases.

In fact, entropy can decrease locally (e.g., when a gas is compressed), but the total entropy of the system and its surroundings will always increase or remain constant.

Illustration of entropy.

Reversible and Irreversible Processes

Definition

Reversible processes

Reversible processes are idealized processes that occur in such a way that the system and its surroundings can be returned to their original states without any net change.

In these processes, the entropy change of the system is exactly balanced by the entropy change of the surroundings, resulting in no net change in the total entropy.
Definition

Irreversible processes

Irreversible processes are real-world processes that cannot be undone without leaving a net change in the system or surroundings.

These processes always result in an increase in the total entropy of the universe.

Example

  • The isothermal expansion of a gas is a reversible process if it occurs infinitely slowly, allowing the system to remain in equilibrium with its surroundings.
  • In contrast, the rapid expansion of a gas into a vacuum is an irreversible process, leading to an increase in entropy.
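
For an ideal gas, both expansions give the gas the same entropy change, ΔS = nR ln(V₂/V₁), because entropy is a state function; the difference lies in the surroundings. The sketch below assumes an ideal gas doubling its volume, with illustrative values.

```python
import math

R = 8.31            # molar gas constant, J K^-1 mol^-1
n = 1.0             # moles of gas (illustrative)
V1, V2 = 1.0, 2.0   # initial and final volumes (any consistent units)

dS_gas = n * R * math.log(V2 / V1)   # same for both expansions (state function)

# Reversible isothermal expansion: the surroundings supply heat Q = T * dS_gas
# at the same temperature, so their entropy falls by exactly dS_gas: no net change.
dS_total_reversible = dS_gas + (-dS_gas)

# Free expansion into a vacuum: no heat or work exchanged, surroundings unchanged.
dS_total_irreversible = dS_gas + 0.0

print(dS_gas, dS_total_reversible, dS_total_irreversible)  # ~5.76, 0.0, ~5.76 J K^-1
```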

Note

The second law of thermodynamics can also be stated as:

It is impossible for heat to flow spontaneously from a colder body to a hotter body without external work being done.

Tip

Processes in real isolated systems are irreversible, so in practice the entropy of a real isolated system always increases.

Entropy and Microstates

Entropy can also be understood from a statistical perspective, which connects macroscopic thermodynamic properties to microscopic configurations.

Microstates and Macrostates

  1. A microstate is a specific arrangement of particles in a system, while a macrostate is defined by macroscopic properties such as temperature, pressure, and volume.
  2. The number of microstates, Ω, corresponding to a given macrostate determines the entropy of the system.

Boltzmann’s Entropy Formula

The entropy, S, of a system is given by Boltzmann’s formula:

S = k_B ln Ω

where:

  • k_B is the Boltzmann constant (1.38 × 10⁻²³ J K⁻¹).
  • Ω is the number of microstates.

Example

Consider a system with 100 possible microstates. The entropy of the system is:

S = k_B ln 100 ≈ 4.6 k_B ≈ 6.4 × 10⁻²³ J K⁻¹
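
A minimal Python sketch of this calculation (the function and variable names are illustrative):

```python
import math

K_B = 1.38e-23  # Boltzmann constant, J K^-1

def boltzmann_entropy(num_microstates):
    """Return S = k_B * ln(Omega) for a macrostate with the given number of microstates."""
    return K_B * math.log(num_microstates)

print(boltzmann_entropy(100))  # ~6.4e-23 J K^-1
```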

Analogy

Think of a deck of cards. An ordered deck (e.g., all suits in sequence) has only one microstate, while a shuffled deck has many possible arrangements. The shuffled deck has higher entropy because it is more disordered.

Entropy and Probability

  1. The statistical definition of entropy highlights its connection to probability.
  2. A macrostate with a larger number of microstates is more probable and has higher entropy.
  3. This explains why systems tend to evolve toward states of higher entropy—these states are simply more likely to occur.
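
One way to make the probability argument concrete is to count microstates for coin flips (a common illustration, not from the syllabus): each macrostate is "number of heads", and its multiplicity is a binomial coefficient.

```python
import math

N = 10  # number of coins (illustrative)

# Multiplicity (number of microstates) for each macrostate "k heads out of N"
for k in range(N + 1):
    omega = math.comb(N, k)
    print(f"{k:2d} heads: {omega:4d} microstates")

# The half-heads macrostate (k = 5) has the most microstates (252),
# so it is the most probable and has the highest entropy.
```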

Theory of Knowledge

  • How does the concept of entropy challenge our understanding of time?
  • Why do we perceive time as moving forward, and how is this related to the second law of thermodynamics?

Applications and Implications

  1. Entropy has profound implications beyond thermodynamics.
  2. It helps explain why certain processes are irreversible and why energy transformations are never 100% efficient.
  3. Entropy also plays a critical role in fields such as information theory, where it measures the uncertainty or randomness of information.

Example

  • In a refrigerator, mechanical work is used to transfer heat from a cold interior to a warmer exterior.
  • While the entropy of the interior decreases, the entropy of the surroundings increases by a larger amount, ensuring that the total entropy of the universe increases.
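
A rough entropy balance for the refrigerator example, assuming heat is extracted from the cold interior and a larger amount (that heat plus the compressor work) is rejected to the warm room; all values are illustrative.

```python
Q_cold = 100.0        # heat removed from the interior, J
W = 50.0              # work done by the compressor, J
Q_hot = Q_cold + W    # heat rejected to the room (energy conservation)

T_cold = 275.0        # interior temperature, K
T_hot = 300.0         # room temperature, K

dS_interior = -Q_cold / T_cold   # interior entropy decreases
dS_room = +Q_hot / T_hot         # room entropy increases by more
dS_total = dS_interior + dS_room

print(dS_interior, dS_room, dS_total)  # ~-0.364, +0.5, ~+0.136 J K^-1 (net increase)
```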

Reflection and Self-Assessment

Self review

  1. What is the difference between reversible and irreversible processes in terms of entropy?
  2. How does the statistical definition of entropy connect to the macroscopic definition?
  3. Why does the second law of thermodynamics prevent 100% efficient energy conversion?

Entropy provides a powerful framework for understanding the directionality of natural processes and the limitations imposed by the second law of thermodynamics.

Questions

Recap questions

Question 1

How does Boltzmann's entropy formula connect the microscopic arrangements of particles to the macroscopic properties of a system?

Note

Entropy and System Evolution

Entropy is a fundamental concept in thermodynamics that helps us understand how systems evolve over time. It is a measure of disorder or randomness in a system.

  • Entropy is a measure of disorder or randomness in a system
  • It helps us understand why certain processes occur spontaneously
  • Higher entropy = more possible arrangements of a system

Definition

Entropy

A measure of the disorder or randomness in a system, representing the number of possible ways a system can be arranged.

Analogy

Think of entropy like a messy room: there are many more ways for the room to be disorganized than organized.

Example

When ice melts into water, the entropy increases because the water molecules can move more freely in many more possible arrangements.
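
Since melting happens at a fixed temperature, the entropy increase can be estimated with ΔS = Q/T, where Q = mL_f is the latent heat absorbed. A short Python sketch follows; the mass is illustrative, and L_f ≈ 3.34 × 10⁵ J kg⁻¹ for water.

```python
m = 0.10        # mass of ice, kg (illustrative)
L_f = 3.34e5    # specific latent heat of fusion of water, J kg^-1
T_melt = 273.0  # melting point of ice, K

Q = m * L_f     # heat absorbed while melting
dS = Q / T_melt # entropy increase of the ice as it becomes water

print(Q, dS)    # 33400 J, ~122 J K^-1
```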