
What is Joint Entropy in Classical Information Theory?


Definition¹

Consider two discrete random variables $X, Y$ with joint probability mass function $p$. The joint entropy of $X$ and $Y$ is defined as follows.

$$ H(X, Y) := - \sum_{i,j} p(x_{i}, y_{j}) \log_{2} p(x_{i}, y_{j}) $$

Explanation

The definition is the same as that of ordinary Shannon entropy, except that the probability mass function is replaced by the joint probability mass function, so the sum runs over all pairs $(x_{i}, y_{j})$.
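As a minimal sketch, the definition above can be computed directly from a joint pmf. The function name `joint_entropy` and the dict-based pmf representation here are illustrative choices, not from the source:

```python
import math

def joint_entropy(joint_pmf):
    """Joint entropy H(X, Y) in bits, where joint_pmf is a dict
    mapping (x, y) pairs to their joint probabilities."""
    # Terms with p = 0 contribute nothing (p log p -> 0), so skip them.
    return -sum(p * math.log2(p) for p in joint_pmf.values() if p > 0)

# Two independent fair bits: all four outcomes have probability 1/4,
# so H(X, Y) = 2 bits.
pmf = {(x, y): 0.25 for x in (0, 1) for y in (0, 1)}
print(joint_entropy(pmf))  # → 2.0

# Perfectly correlated bits (Y = X): only two outcomes, so H(X, Y) = 1 bit.
pmf_corr = {(0, 0): 0.5, (1, 1): 0.5}
print(joint_entropy(pmf_corr))  # → 1.0
```

Note that when $X$ and $Y$ are independent the joint entropy is just the sum of the individual entropies, as the two examples illustrate.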


  1. Stephen M. Barnett, Quantum Information (2009), pp. 10–11 ↩︎