
What is Joint Entropy in Classical Information Theory?


Definition¹

Consider two discrete random variables $X$ and $Y$ with joint probability mass function $p$. The joint entropy of $X$ and $Y$ is defined as follows.

$$ H(X, Y) := - \sum_{i,j} p(x_{i}, y_{j}) \log_{2}p(x_{i}, y_{j}) $$
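For example, if $X$ and $Y$ are two independent fair coin flips, each of the four outcomes has probability $1/4$, so

$$ H(X, Y) = -4 \cdot \frac{1}{4} \log_{2} \frac{1}{4} = 2 \text{ bits}, $$

which is exactly the sum $H(X) + H(Y)$ of the two individual entropies.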

Explanation

Compared with the entropy of a single random variable, nothing changes except that the probability mass function in the definition is replaced by the joint probability mass function $p(x_{i}, y_{j})$.
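As a concrete check, here is a minimal Python sketch that evaluates the sum in the definition directly. The function name `joint_entropy` and the example distributions are illustrative choices, not from the source.

```python
import numpy as np

def joint_entropy(p_xy) -> float:
    """Joint entropy H(X, Y) in bits, given a joint pmf as a 2-D array."""
    p = np.asarray(p_xy, dtype=float)
    p = p[p > 0]                       # treat 0 * log 0 as 0
    return float(-np.sum(p * np.log2(p)))

# Two independent fair coin flips: every outcome has probability 1/4.
p_independent = np.full((2, 2), 0.25)
print(joint_entropy(p_independent))    # 2.0 bits

# Perfectly correlated bits: only (0, 0) and (1, 1) occur.
p_correlated = np.array([[0.5, 0.0],
                         [0.0, 0.5]])
print(joint_entropy(p_correlated))     # 1.0 bit
```

The second case illustrates that the joint entropy can be strictly smaller than the sum of the individual entropies when the variables are correlated.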


  1. Stephen M. Barnett, Quantum Information (2009), pp. 10-11 ↩︎