
Section 1.2 Entropy Taxonomy

Note ID: 202604110002 | Tags: <thermodynamics>, <statistical mechanics>, <information theory>
A taxonomy of entropy across various domains.

Subsection 1.2.1 Entropy in Thermodynamics

Definition 1.2.1. Clausius Entropy.

The Clausius entropy is the change in the entropy of a system due to a reversible process in which it absorbs an amount of heat \(Q_\text{in}\) at a constant temperature \(T\text{:}\)
\begin{equation*} \Delta S_\text{system} := \int_\text{rev}{\frac{d \, Q_\text{in}}{T}} . \end{equation*}
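For an isothermal reversible process the integral collapses to \(Q_\text{in}/T\). A minimal numerical sketch, with illustrative (assumed) values for the heat and temperature:

```python
# Clausius entropy change for a reversible *isothermal* process,
# where the integral reduces to Q_in / T.
Q_in = 1500.0  # heat absorbed reversibly, in joules (assumed value)
T = 300.0      # constant absolute temperature, in kelvin (assumed value)

delta_S = Q_in / T  # entropy change of the system, in J/K
print(delta_S)  # 5.0
```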

Subsection 1.2.2 Entropy in Statistical Mechanics

Definition 1.2.2. Boltzmann Entropy.

The Boltzmann entropy of a macroscopic system in a state with multiplicity \(\Omega\) is given by:
\begin{equation*} S_\text{Boltzmann} := k_B \ln{\Omega} . \end{equation*}
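The definition above can be sketched directly in code; the multiplicity passed in is an assumed example value, and \(k_B\) is the exact SI Boltzmann constant:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def boltzmann_entropy(omega: float) -> float:
    """Entropy in J/K of a macrostate with multiplicity omega."""
    return k_B * math.log(omega)

# A macrostate with a single microstate carries zero entropy:
print(boltzmann_entropy(1.0))  # 0.0
```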

Definition 1.2.3. Gibbs Entropy.

The Gibbs entropy of a macroscopic system is defined in terms of the probabilities \(\mathbb{p}_i\) of being in microstate \(i\text{:}\)
\begin{equation*} S_\text{Gibbs} := -k_B \sum_i{\mathbb{p}_i \ln{\mathbb{p}_i}} . \end{equation*}
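A sketch of the sum above, using the convention \(p \ln p \to 0\) as \(p \to 0\); it also checks the standard consistency fact that for a uniform distribution over \(\Omega\) microstates the Gibbs entropy reduces to the Boltzmann entropy \(k_B \ln \Omega\):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def gibbs_entropy(probs):
    """S = -k_B * sum_i p_i ln(p_i), skipping zero-probability microstates."""
    return -k_B * sum(p * math.log(p) for p in probs if p > 0)

# Uniform distribution over Omega = 4 microstates (assumed example):
omega = 4
uniform = [1 / omega] * omega
# gibbs_entropy(uniform) agrees with the Boltzmann form k_B * ln(omega)
print(gibbs_entropy(uniform))
```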

Subsection 1.2.3 Entropy in Information Theory

Definition 1.2.4. Shannon Entropy.

The Shannon entropy of a discrete random variable with possible outcomes \(i\in\{1,2,\ldots,n\}\) where \(n\in\mathbb{N}\) and corresponding probabilities \(\mathbb{p}_i\) is defined as:
\begin{equation*} H_\text{Shannon} := -\sum_{i=1}^n{\mathbb{p}_i \log_2{\mathbb{p}_i}} . \end{equation*}
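A minimal sketch of the Shannon entropy in bits, again with the convention that zero-probability outcomes contribute nothing to the sum; a fair coin is the standard one-bit example:

```python
import math

def shannon_entropy(probs):
    """H = -sum_i p_i log2(p_i), in bits; zero-probability outcomes contribute 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))  # fair coin: 1.0 bit
print(shannon_entropy([1.0]))       # certain outcome: 0.0 bits
```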
πŸ–‡οΈLinked Notes: AssumptionΒ B.1.2
πŸ”–References: [Lecture 6: Entropy]