What is Thermodynamics?

Defining Thermodynamics

Thermodynamics is the branch of science that studies the energy and work of systems. Chemical processes are governed by the laws of thermodynamics.

To understand the properties of a system, the system must first be defined. Defining a system sets it apart as distinct from its surroundings, and its boundaries are chosen based on the analysis being performed, so that the interactions between the system and its environment, or among multiple systems, can be analyzed. In chemistry a system is often defined as a chemical reaction or a series of chemical reactions. For example, a system could be a piston in a car engine, the entire engine, the engine plus the transmission and axles, or the entire car.

Thermodynamics is a branch of physical science that investigates the energy and work of systems. Work (w) is energy that is transferred when a force acts on an object over a distance. The heat, or thermal energy, of a reaction is related to the energy available to the reaction system to do work. Equilibrium is a vital consideration in thermodynamics. Equilibrium is a state in which the rates of the forward and reverse reactions are equal. A system, or chemical reaction, that has reached equilibrium will come to rest, and no further changes to the system will occur. A system that has not yet reached equilibrium will continue to change until equilibrium is reached. The direction in which the reaction equilibrium will shift is determined by the relative formation energies of reactants and products. Comparing the formation energies, the energies required to form bonds, of the reactants and products gives the direction in which the reaction will shift. If the formation energies are equal, then the reaction is at equilibrium; i.e., there is no energy available to shift the reaction toward reactants or products.

Spontaneous and Nonspontaneous Processes

Processes can be described as either spontaneous or nonspontaneous. Spontaneous processes are those that give off free energy when they occur, moving the system to a lower energy state. Nonspontaneous processes require an input of energy to occur.
When a system is not in equilibrium, the system will change until it reaches equilibrium. This process, in which energy is released as a system changes, is a spontaneous change, or spontaneous process.

Spontaneous Process

A spontaneous process can be reversible or irreversible. For example, compressing gas in a chamber is a reversible process. When the size of the chamber is expanded and contracted, the gas changes from less compressed to more compressed and back again. However, releasing the gas from the chamber to the atmosphere is an irreversible process. Once the gas is out, it is impossible to collect it and put it back in the chamber. Releasing the gas into another part of a closed system, however, can be reversed with a vacuum pump. In contrast to a spontaneous process, in a nonspontaneous change, or nonspontaneous process, energy must be added to the system for the change to occur.

Nonspontaneous Process

In chemistry, spontaneous and nonspontaneous processes generally refer to chemical reactions. A spontaneous reaction will occur provided the reactants are present. A nonspontaneous reaction will not occur without some input of energy from outside the system, such as thermal energy, electrical energy, or chemical energy. Many chemical reactions are spontaneous in one direction but not the other, such as the oxidation of iron, or rusting. Iron(III) oxide will spontaneously form from iron and water in air, but iron, water, and air will not spontaneously form from iron(III) oxide. In other words, iron rusts in the presence of water and oxygen, but rust does not spontaneously turn back into iron, water, and air. A nonspontaneous reaction requires energy input, but the spontaneity of a reaction is not determined by whether or not it requires an input of energy to proceed. The change in Gibbs free energy of a system, which is the energy available to the system to do work, indicates whether a reaction is spontaneous.
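The relationship between Gibbs free energy and spontaneity can be sketched in a few lines of code. This is an illustration, not a rigorous thermodynamics calculation: the function name is made up, and the values used for the rusting of iron are rough, round-number approximations of tabulated data.

```python
def is_spontaneous(delta_h, delta_s, temperature):
    """Return True if the Gibbs free energy change is negative.

    delta_h: enthalpy change in J/mol
    delta_s: entropy change in J/(mol*K)
    temperature: absolute temperature in K
    """
    delta_g = delta_h - temperature * delta_s  # delta_G = delta_H - T * delta_S
    return delta_g < 0

# Rusting of iron (formation of iron(III) oxide) is strongly exothermic,
# so even with a negative entropy change it is spontaneous at room
# temperature. Approximate values: delta_H ~ -824 kJ/mol, delta_S ~ -270 J/(mol*K).
print(is_spontaneous(-824_000, -270.0, 298.15))  # True
```

A negative delta_G means the reaction releases free energy and can proceed without outside input; a positive delta_G marks a nonspontaneous reaction, which is why rust never turns back into iron, water, and air on its own.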

Entropy

Entropy is a measure of the disorder of a system. A system with greater disorder has a greater possible variety of energy distributions.

When discussing thermodynamics, it is important to understand the concept of entropy. Entropy (S) is a measure of the disorder of a system. For example, a tower of blocks arranged one on top of the other has low entropy. There is little disorder: each block sits on top of another block, except for the bottom block, which sits on the ground. There is little uncertainty: below each block is another block, except for the ground beneath the bottom block. A toppling tower of blocks has high entropy. There is much disorder: each block's position relative to the other blocks and the ground is difficult to describe. There is uncertainty; the relationships between the positions of the blocks are random.

In real-world systems, macroscopic properties, such as temperature, pressure, volume, density, velocity, and mass, can be measured. The combination of these measurable macroscopic properties of a system is called the macrostate. Entropy describes how organized a system is as defined by microstates. A microstate is a possible energy and positional configuration of the particles of a system. Microstates represent probabilities; the exact positions and velocities of the molecules involved cannot be known, but probabilities for them can be known.

Consider measuring the temperature of a flask of water. It is impossible to measure the temperature of every water molecule. The temperature that is measured, a macrostate, is a measure of the average kinetic energy of all the molecules. Each possible energy configuration is a different microstate. As temperature increases, the number of microstates, possible energy configurations, also increases. The number of microstates is influenced by other macrostates as well. Ludwig Boltzmann, an Austrian physicist and philosopher, developed an equation around 1872 that uses the number of microstates to calculate entropy:
$S=k{\;\rm{ln}}\;W$
In this formula, S is entropy, in joules per kelvin (J/K), k is the Boltzmann constant, which is $1.38\;\times\;10^{-23}\;{\rm J/K,}$ ln is the natural logarithm, and W is the number of microstates. To understand entropy and microstates, consider a system of four particles enclosed in a glass container. The container consists of two bulbs connected by a thin tube, and the particles can freely move between the two bulbs. Consider the different ways the particles can be distributed in the system. All four particles can be in one bulb and none in the other. Or three particles can be in one bulb and one in the other bulb. Alternatively two particles can be in each bulb.
Each distribution of the particles in the container is a microstate of the system. This simple system depends only on the position of each particle, so the number of microstates can be calculated as $a^n$, where a is the number of possible positions and n is the number of particles. Therefore the system consisting of four particles, each with two possible positions, has $2^4$, or 16, possible microstates. Using Boltzmann's equation, the number of microstates can be used to calculate the entropy of this system.
\begin{aligned}S&=k\;{\rm ln}\;W\\&=(1.38\times10^{-23}\;{\rm J/K})\;{\rm ln}\;16\\&=3.8\times10^{-23}\;{\rm J/K}\end{aligned}
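The same calculation can be carried out directly from Boltzmann's equation. The helper function here is illustrative, not from any particular library.

```python
import math

K_B = 1.38e-23  # Boltzmann constant, in J/K


def boltzmann_entropy(microstates):
    """Entropy from the number of microstates: S = k ln W."""
    return K_B * math.log(microstates)


# Four particles, each with two possible positions: W = 2**4 = 16
s = boltzmann_entropy(2 ** 4)
print(f"S = {s:.2e} J/K")  # S = 3.83e-23 J/K
```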
Notice that several of these microstates are identical to each other, varying only in which particle is in each container. Each collection of microstates that have identical distributions is an arrangement. Five arrangements exist in this system:
• Arrangement A: four particles in the left bulb and none in the right; there is one microstate with this arrangement
• Arrangement B: three particles in the left bulb and one in the right; there are four microstates with this arrangement
• Arrangement C: two particles in each bulb; there are six microstates with this arrangement
• Arrangement D: one particle in the left bulb and three in the right; there are four microstates with this arrangement
• Arrangement E: no particles in the left bulb and four in the right; there is one microstate with this arrangement

The probability for the system to be in arrangement A is 1 in 16, or 6.25%, and the probability for arrangement E is also 6.25%. The probability that the system is in arrangement B is 4 in 16, or 25%, and the probability for arrangement D is also 25%. The probability for arrangement C is 6 in 16, or 37.5%.
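The five arrangements and their probabilities can be verified by enumerating all 16 microstates explicitly. The code below is a sketch of that counting exercise; the labels "L" and "R" for the two bulbs are just illustrative.

```python
from collections import Counter
from itertools import product

# Every microstate of 4 distinguishable particles, each in the
# left ("L") or right ("R") bulb: a**n = 2**4 = 16 microstates.
microstates = list(product("LR", repeat=4))

# Group microstates into arrangements by the number of particles
# in the left bulb.
arrangements = Counter(state.count("L") for state in microstates)

total = len(microstates)
for left in sorted(arrangements, reverse=True):
    count = arrangements[left]
    print(f"{left} left / {4 - left} right: "
          f"{count} microstates, probability {count / total:.2%}")
```

Running this reproduces the counts above (1, 4, 6, 4, and 1 microstates) and the probabilities 6.25%, 25%, 37.5%, 25%, and 6.25%, with the even split between the bulbs being the most probable arrangement.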

This simple system shows that the most probable state of the system has the particles distributed evenly between the two containers. However, real systems contain far more than four particles, with far more than two possible positions and energy values per particle. As the number of particles increases, the number of microstates becomes uncountably large. The probability of the system being in the state in which the particles are evenly distributed becomes so large that all other possibilities can be disregarded.

When considering chemical reactions, an analysis of the reactants and products allows scientists to predict whether the change in entropy for the reaction is positive or negative. If only gases are involved in the reaction, then the number of moles of reactants and products can be used to determine the sign of the entropy change. If the number of moles of products is less than the number of moles of reactants, then the entropy change is negative. The reverse is also true. If the reaction involves multiple states of matter, then the side of the reaction with the gas typically has higher entropy than the side without.

For example, in the reaction $2{\rm {H}_{2}}(g)+{\rm O}_{2}(g)\rightarrow 2{\rm H_{2}{O}}(g)$, there are more moles on the reactants side than the products side, so the change in entropy is negative.

In the reaction $2{\rm {H_2}{O}_{2}}(l)\rightarrow 2{\rm {H}_{2}O}(l)+{\rm {O}_{2}}(g)$, the gas is present on the product side, so the change in entropy is positive.
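This mole-counting heuristic is simple enough to express as a short function. This is only the rough rule described above, not a substitute for tabulated entropy data, and the function name is illustrative.

```python
def entropy_change_sign(gas_moles_reactants, gas_moles_products):
    """Predict the sign of the entropy change from moles of gas on each side."""
    if gas_moles_products > gas_moles_reactants:
        return "positive"
    if gas_moles_products < gas_moles_reactants:
        return "negative"
    return "indeterminate from mole counts alone"


# 2 H2(g) + O2(g) -> 2 H2O(g): 3 moles of gas become 2 moles of gas
print(entropy_change_sign(3, 2))  # negative

# 2 H2O2(l) -> 2 H2O(l) + O2(g): gas appears only on the product side
print(entropy_change_sign(0, 1))  # positive
```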