Covariance Matrix
Definition
For a $p$-dimensional random vector $X = (X_{1}, \cdots, X_{p})$, the matrix defined as follows is called the covariance matrix.
$$ \left( \operatorname{Cov}(X) \right)_{ij} := \operatorname{Cov}( X_{i}, X_{j} ) $$
Explanation
To put the definition more simply, it can be stated as follows:
$$ \operatorname{Cov}(X) := \begin{pmatrix} \operatorname{Var}(X_{1}) & \operatorname{Cov}(X_{1}, X_{2}) & \cdots & \operatorname{Cov}(X_{1}, X_{p}) \\ \operatorname{Cov}(X_{2}, X_{1}) & \operatorname{Var}(X_{2}) & \cdots & \operatorname{Cov}(X_{2}, X_{p}) \\ \vdots & \vdots & \ddots & \vdots \\ \operatorname{Cov}(X_{p}, X_{1}) & \operatorname{Cov}(X_{p}, X_{2}) & \cdots & \operatorname{Var}(X_{p}) \end{pmatrix} $$
Every covariance matrix is a positive semidefinite matrix. In other words, for every vector $x \in \mathbb{R}^{p}$, the following holds:
$$ 0 \le x^{T} \operatorname{Cov}(X) x $$
This is because $x^{T} \operatorname{Cov}(X) x = \operatorname{Var} \left( x^{T} X \right)$, and a variance is never negative.
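Positive semidefiniteness can be illustrated numerically. The sketch below is only an illustration under stated assumptions: NumPy's sample covariance `np.cov` stands in for the true covariance matrix, and the data, mixing matrix, and seed are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(42)
# 1000 samples of a 3-dimensional random vector with correlated components
mixing = np.array([[2.0, 0.0, 0.0],
                   [1.0, 1.0, 0.0],
                   [0.0, 0.5, 0.3]])
X = rng.standard_normal((1000, 3)) @ mixing.T

C = np.cov(X, rowvar=False)          # sample covariance matrix (p x p)
eigenvalues = np.linalg.eigvalsh(C)  # eigvalsh applies since C is symmetric

# Positive semidefinite: x^T C x >= 0 for all x,
# equivalently all eigenvalues are >= 0 (up to floating-point error)
print(eigenvalues.min() >= -1e-12)   # True
```

Checking the eigenvalues is equivalent to checking $0 \le x^{T} \operatorname{Cov}(X) x$ for all $x$, since a symmetric matrix is positive semidefinite exactly when its eigenvalues are nonnegative.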
Theorems
[1]: If $\mu \in \mathbb{R}^{p}$ is given as $\mu := ( E X_{1}, \cdots, E X_{p} )$, then
$$ \operatorname{Cov}(X) = E \left[ X X^{T} \right] - \mu \mu^{T} $$
[2]: If a matrix of constants $A \in \mathbb{R}^{k \times p}$ is given as $(A)_{ij} := a_{ij}$, then
$$ \operatorname{Cov}(AX) = A \operatorname{Cov}(X) A^{T} $$
[3]: The covariance matrix is a symmetric matrix.
$$ \operatorname{Cov}(X) = \left[ \operatorname{Cov}(X) \right]^{T} $$
Proof
[1]
$$ \begin{align*} \operatorname{Cov}(X) =& E \left[ (X - \mu)(X - \mu)^{T} \right] \\ =& E \left[ X X^{T} - \mu X^{T} - X \mu^{T} + \mu \mu^{T} \right] \\ =& E \left[ X X^{T} \right] - \mu E \left[ X^{T} \right] - E[X] \mu^{T} + E \left[ \mu \mu^{T} \right] \\ =& E \left[ X X^{T} \right] - \mu \mu^{T} \end{align*} $$
■
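Identity [1] can be sanity-checked numerically. In the sketch below (an illustration assuming NumPy, with arbitrary data), the sample mean plays the role of $\mu$ and the plug-in moment estimates play the roles of the expectations; `bias=True` makes `np.cov` use the matching $1/N$ normalization.

```python
import numpy as np

rng = np.random.default_rng(0)
# 5000 samples of a 3-dimensional random vector with a nonzero mean
X = rng.standard_normal((5000, 3)) + np.array([1.0, -2.0, 0.5])

mu = X.mean(axis=0)      # empirical version of mu = (E X_1, ..., E X_p)
Exx = X.T @ X / len(X)   # empirical E[X X^T], an average of outer products

# Theorem [1]: Cov(X) = E[X X^T] - mu mu^T
lhs = np.cov(X, rowvar=False, bias=True)  # bias=True matches the 1/N moments
rhs = Exx - np.outer(mu, mu)
print(np.allclose(lhs, rhs))   # True
```

The identity holds exactly for the empirical moments as well, which is why the two sides agree to floating-point precision rather than only approximately.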
[2]
$$ \begin{align*} \operatorname{Cov}(AX) =& E \left[ (AX - A\mu)(AX - A\mu)^{T} \right] \\ =& E \left[ A (X - \mu)(X - \mu)^{T} A^{T} \right] \\ =& A E \left[ (X - \mu)(X - \mu)^{T} \right] A^{T} \\ =& A \operatorname{Cov}(X) A^{T} \end{align*} $$
■
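Identity [2] can likewise be checked numerically. The sketch below is an illustration assuming NumPy; the constant matrix $A$ and the data are arbitrary, and the sample covariance satisfies the same linearity, so the two sides agree exactly up to floating-point error.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((2000, 3))   # samples of a 3-dimensional vector (p = 3)
A = np.array([[1.0, 2.0, 0.0],
              [0.0, -1.0, 3.0]])     # constant k x p matrix with k = 2

Y = X @ A.T                          # each row is A x for one sample x

# Theorem [2]: Cov(AX) = A Cov(X) A^T
lhs = np.cov(Y, rowvar=False, bias=True)
rhs = A @ np.cov(X, rowvar=False, bias=True) @ A.T
print(np.allclose(lhs, rhs))   # True
```

Note that the result is a $k \times k$ matrix, matching the dimension of the transformed vector $AX$.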
[3]
This holds because the covariance of two random variables satisfies $\operatorname{Cov}(X_{i}, X_{j}) = \operatorname{Cov}(X_{j}, X_{i})$, so the $(i,j)$ and $(j,i)$ entries of $\operatorname{Cov}(X)$ are equal.
■
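The symmetry in [3] also shows up in any numerically computed covariance matrix. A minimal check, assuming NumPy and arbitrary sample data:

```python
import numpy as np

rng = np.random.default_rng(7)
X = rng.standard_normal((500, 4))   # samples of a 4-dimensional vector

C = np.cov(X, rowvar=False)         # sample covariance matrix
# Theorem [3]: the covariance matrix equals its own transpose
print(np.allclose(C, C.T))          # True
```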