What is Joint Entropy in Classical Information Theory?
Definition1
Let’s consider two discrete random variables $X$ and $Y$ with their joint probability mass function $p(x, y)$. The joint entropy of $X$ and $Y$ is defined as follows.

$$ H(X, Y) := -\sum_{x, y} p(x, y) \log_{2} p(x, y) $$
Explanation
Compared with the entropy of a single random variable, $H(X) = -\sum_{x} p(x) \log_{2} p(x)$, the definition simply changes so that the sum runs over the joint probability mass function $p(x, y)$.
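As a minimal sketch of the definition, the following computes the joint entropy of a joint pmf stored as a dictionary; the function name `joint_entropy` and the fair-coin example are illustrative, not from the source.

```python
import math

def joint_entropy(pmf):
    """H(X, Y) = -sum over (x, y) of p(x, y) * log2 p(x, y).

    `pmf` maps outcome pairs (x, y) to probabilities; zero-probability
    entries are skipped, following the convention 0 * log 0 = 0.
    """
    return -sum(p * math.log2(p) for p in pmf.values() if p > 0)

# Illustrative example: two fair coins tossed independently,
# so each of the four outcome pairs has probability 1/4.
pmf = {("H", "H"): 0.25, ("H", "T"): 0.25,
       ("T", "H"): 0.25, ("T", "T"): 0.25}
print(joint_entropy(pmf))  # → 2.0 (bits)
```

For independent variables the joint entropy is the sum of the individual entropies, so two fair coins give $1 + 1 = 2$ bits, as the example confirms.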
Stephen M. Barnett, *Quantum Information* (2009), pp. 10–11 ↩︎