Difference Between Enthalpy and Entropy (With Table)

To understand thermodynamics, enthalpy and entropy are two foundational concepts that cannot be missed. Knowing the difference between enthalpy and entropy not only helps us pass a science exam but also provides a rational explanation for many processes we witness in daily life. From phase changes to the transfer of energy within a single state of matter, thermodynamics can explain all of it.

Enthalpy vs Entropy

The main difference between enthalpy and entropy is that enthalpy is a measure of the total heat content of a system, equal to the sum of its internal energy and the product of its pressure and volume. Entropy, on the other hand, is the amount of thermal energy in a system that is not available for conversion into work.

The enthalpy of a thermodynamic system is a state function evaluated at constant pressure (for example, in the open atmosphere). Its unit is the same as that of energy, the joule (J) in SI, because it is the sum of the system's internal energy and the product of its pressure and volume. The total enthalpy of a system cannot be measured directly, so we measure the change in the enthalpy of a system instead.

In simple words, entropy is the measure of randomness or chaos in a system. It is an extensive property, which means that the value of entropy changes according to the amount of matter present in the system. If a system is highly ordered (less chaotic), it has low entropy, and vice versa. The SI unit of entropy is J⋅K⁻¹.

Comparison Table Between Enthalpy and Entropy

| Parameters of Comparison | Enthalpy | Entropy |
|---|---|---|
| Definition | Enthalpy is the sum of the internal energy and the product of the pressure and volume of a thermodynamic system. | Entropy is the amount of thermal energy of a system that is not available for conversion into mechanical or useful work. |
| Measurement | The total enthalpy of a system cannot be measured directly, hence we calculate the change in enthalpy. | Measuring the entropy of a system refers to the amount of disorder or chaos present in it. |
| Unit | The SI unit of enthalpy is the same as that of energy, so it is measured in J. | The SI unit of entropy per unit mass is J⋅K⁻¹⋅kg⁻¹ and per unit amount of substance is J⋅K⁻¹⋅mol⁻¹. |
| Symbol | Enthalpy is denoted by H. | Entropy is denoted by S. |
| History | The scientist Heike Kamerlingh Onnes coined the term “enthalpy.” | The German physicist Rudolf Clausius coined the term “entropy.” |
| Favoring Conditions | A thermodynamic system always favors minimum enthalpy. | A thermodynamic system always prefers maximum entropy. |

What is Enthalpy?

Enthalpy is a thermodynamic property that refers to the sum of the internal energy and the product of the pressure and volume of a system. The enthalpy of a system signifies its capacity to release heat, and thus it has the same units as energy (joules, calories, etc.). Enthalpy is denoted by H.

It is not possible to calculate the total enthalpy of a system because its zero point cannot be known. So, the change in enthalpy between one state and another is calculated at constant pressure. The formula for enthalpy is H = E + PV, where E is the internal energy of the system, P is the pressure, and V is the volume.
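
As a rough sketch of how this definition is used in practice, the short Python snippet below computes a change in enthalpy at constant pressure from ΔH = ΔE + PΔV. The function name and the numerical values are illustrative assumptions, not part of the original article.

```python
# Illustrative sketch: change in enthalpy at constant pressure,
# dH = dE + P * dV, following from the definition H = E + PV.

def enthalpy_change(delta_E, pressure, delta_V):
    """Return the enthalpy change in joules.

    delta_E  -- change in internal energy (J)
    pressure -- constant pressure (Pa)
    delta_V  -- change in volume (m^3)
    """
    return delta_E + pressure * delta_V

# Hypothetical example: a gas gains 500 J of internal energy and expands
# by 0.001 m^3 against standard atmospheric pressure (101325 Pa).
dH = enthalpy_change(delta_E=500.0, pressure=101325.0, delta_V=0.001)
print(f"Change in enthalpy: {dH:.1f} J")  # 500 + 101.3 ≈ 601.3 J
```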

Enthalpy is of great significance in a thermodynamic system, as the sign of the enthalpy change tells us whether a chemical reaction is endothermic (positive ΔH, heat absorbed) or exothermic (negative ΔH, heat released). It is also used to calculate the heat of reaction, the minimum power requirement for a compressor, and more.

What is Entropy?

Entropy is an extensive property, and it is the measure of randomness or chaos in a thermodynamic system. The value of entropy changes with the amount of matter in the system. Entropy is denoted by S, and its common units are joules per kelvin (J⋅K⁻¹), or J⋅K⁻¹⋅kg⁻¹ for entropy per unit mass. Since entropy measures randomness, a highly ordered system has low entropy.

There are several methods to calculate the entropy of a system, but two of the most common are the statistical (Boltzmann) formula and the formula for a reversible isothermal process. The Boltzmann formula is S = k_B ln W, where k_B is the Boltzmann constant, equal to 1.380649 × 10⁻²³ J/K, and W is the number of possible microstates. For a reversible isothermal process, the formula is ΔS = Q / T, where Q is the heat transferred and T is the absolute temperature of the system in kelvin.
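
For concreteness, here is a minimal Python sketch of both formulas; the inputs fed into it are made-up examples, and the function names are my own rather than anything standard.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K

def boltzmann_entropy(num_microstates):
    """Statistical entropy S = k_B * ln(W), in J/K."""
    return K_B * math.log(num_microstates)

def isothermal_entropy_change(heat, temperature):
    """Entropy change dS = Q / T for a reversible isothermal process, in J/K."""
    return heat / temperature

# Hypothetical inputs: a system with 10^20 microstates, and 1000 J of heat
# absorbed reversibly at 300 K.
print(boltzmann_entropy(1e20))                   # ≈ 6.36e-22 J/K
print(isothermal_entropy_change(1000.0, 300.0))  # ≈ 3.33 J/K
```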

The melting of ice into water, followed by its vaporization into steam, is an example of increasing chaos and increasing entropy. When the ice cube gains energy, the heat loosens its structure to form a liquid and thus increases the chaos in the system. A similar thing happens when the liquid changes to the vapor state. Focusing on the system, its entropy increases, while the entropy of the surroundings, which give up heat, decreases.
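
To put a number on the melting step, the sketch below applies ΔS = Q / T to a small ice cube, using the standard textbook values of about 334 J/g for the latent heat of fusion and 273.15 K for the melting point; the 10 g mass is an arbitrary illustrative choice.

```python
# Entropy gained by the system when ice melts, via dS = Q / T.
mass_g = 10.0                # grams of ice (illustrative value)
latent_heat_j_per_g = 334.0  # heat absorbed per gram of ice on melting
melting_point_k = 273.15     # melting temperature of ice in kelvin

heat_absorbed = mass_g * latent_heat_j_per_g    # Q, in joules
entropy_gain = heat_absorbed / melting_point_k  # dS, in J/K

print(f"Q = {heat_absorbed:.0f} J, dS = {entropy_gain:.2f} J/K")  # ≈ 12.23 J/K
```

Because the surroundings give up the same heat at a temperature at or above 273.15 K, their entropy falls by at most this amount, so the total entropy of system plus surroundings does not decrease.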

Main Differences Between Enthalpy and Entropy

  1. Enthalpy is the sum of the internal energy and the product of the pressure and volume of a thermodynamic system. On the other hand, entropy is the amount of thermal energy of a system that is not available for conversion into mechanical or useful work.
  2. Measuring enthalpy means measuring the change in enthalpy of a system whereas measuring entropy refers to the amount of disorder or chaos in a system.
  3. The SI unit of enthalpy is the same as that of energy, so it is measured in J, whereas the SI unit of entropy per unit mass is J⋅K⁻¹⋅kg⁻¹ and per unit amount of substance is J⋅K⁻¹⋅mol⁻¹.
  4. Enthalpy is denoted by H whereas entropy is denoted by S.
  5. The Dutch physicist Heike Kamerlingh Onnes coined the term “enthalpy,” whereas the German physicist Rudolf Clausius coined the term “entropy.”
  6. A thermodynamic system tends toward minimum enthalpy and, at the same time, toward maximum entropy.

Conclusion

Understanding the difference between enthalpy and entropy is very important for explaining many processes in nature. From the melting of ice to solving complex thermodynamics problems, the basic concepts of enthalpy and entropy apply.

However, enthalpy cannot be calculated directly, so we calculate its change between two states. Entropy, on the other hand, is calculated as the degree of randomness in a system. When a system gains energy, its disorder increases and so does its entropy, and vice versa.
