Covariance Matrix
Definition
For a $p$-dimensional random vector $X = \left( X_{1} , \cdots , X_{p} \right)$, the matrix $\operatorname{Cov}(X)$ defined as follows is called the covariance matrix of $X$.
$$ \left( \operatorname{Cov}(X) \right)_{ij} := \operatorname{Cov} \left( X_{i} , X_{j} \right) $$
Explanation
Written out entry by entry, the definition reads as follows.
$$ \operatorname{Cov}(X) := \begin{pmatrix} \operatorname{Var} \left( X_{1} \right) & \operatorname{Cov} \left( X_{1} , X_{2} \right) & \cdots & \operatorname{Cov} \left( X_{1} , X_{p} \right) \\ \operatorname{Cov} \left( X_{2} , X_{1} \right) & \operatorname{Var} \left( X_{2} \right) & \cdots & \operatorname{Cov} \left( X_{2} , X_{p} \right) \\ \vdots & \vdots & \ddots & \vdots \\ \operatorname{Cov} \left( X_{p} , X_{1} \right) & \operatorname{Cov} \left( X_{p} , X_{2} \right) & \cdots & \operatorname{Var} \left( X_{p} \right) \end{pmatrix} $$
Every covariance matrix is positive semi-definite; that is, for all vectors $x \in \mathbb{R}^{p}$, the following holds.
$$ 0 \le x^{T} \operatorname{Cov}(X) x $$
This is because $x^{T} \operatorname{Cov}(X) x = \operatorname{Var} \left( x^{T} X \right)$, and a variance can never be negative.
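As a quick sanity check, this property can also be seen numerically on a sample covariance matrix. The sketch below is only an illustration and assumes numpy is available; the dimensions, seed, and mixing matrix are arbitrary choices, not part of the definition above.

```python
import numpy as np

rng = np.random.default_rng(42)
n, p = 1_000, 4

# n samples of a p-dimensional random vector (rows are observations)
X = rng.standard_normal((n, p)) @ rng.standard_normal((p, p))

C = np.cov(X, rowvar=False)          # sample covariance matrix (p x p)

# symmetric, and all eigenvalues are nonnegative (up to floating point)
assert np.allclose(C, C.T)
assert np.all(np.linalg.eigvalsh(C) >= -1e-12)

# equivalently, x^T C x >= 0 for any vector x
x = rng.standard_normal(p)
assert x @ C @ x >= 0
print("sample covariance matrix is symmetric positive semi-definite")
```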
Theorems
- [1]: If $\mu \in \mathbb{R}^{p}$ is given as $\mu := \left( E X_{1} , \cdots , E X_{p} \right)$, then
$$ \operatorname{Cov}(X) = E \left[ X X^{T} \right] - \mu \mu^{T} $$
- [2]: If a matrix of constants $A \in \mathbb{R}^{k \times p}$ is given as $(A)_{ij} := a_{ij}$, then
$$ \operatorname{Cov}(AX) = A \operatorname{Cov}(X) A^{T} $$
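Both identities can be checked numerically. In the sketch below (a minimal illustration assuming numpy; all names and dimensions are arbitrary), $E[\cdot]$ is taken to be the sample average over $n$ draws with a $1/n$ normalization throughout, so [1] and [2] hold exactly for the sample moments, not just approximately.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, k = 10_000, 3, 2

# samples of X as rows; E[.] below means the sample average over the n rows
X = rng.standard_normal((n, p)) @ rng.standard_normal((p, p))

mu = X.mean(axis=0)                      # μ = (EX_1, ..., EX_p)
cov = (X - mu).T @ (X - mu) / n          # Cov(X) = E[(X-μ)(X-μ)^T]

# [1]: Cov(X) = E[XX^T] - μμ^T
exxt = X.T @ X / n                       # E[XX^T]
assert np.allclose(cov, exxt - np.outer(mu, mu))

# [2]: Cov(AX) = A Cov(X) A^T, for a constant k×p matrix A
A = rng.standard_normal((k, p))
Y = X @ A.T                              # each row of Y is AX for one sample
mu_y = Y.mean(axis=0)
cov_y = (Y - mu_y).T @ (Y - mu_y) / n
assert np.allclose(cov_y, A @ cov @ A.T)
print("identities [1] and [2] hold for the sample moments")
```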
Proof
[1]
$$ \begin{align*} \operatorname{Cov}(X) =& E \left[ (X - \mu)(X - \mu)^{T} \right] \\ =& E \left[ X X^{T} - \mu X^{T} - X \mu^{T} + \mu \mu^{T} \right] \\ =& E \left[ X X^{T} \right] - \mu E \left[ X^{T} \right] - E[X] \mu^{T} + E \left[ \mu \mu^{T} \right] \\ =& E \left[ X X^{T} \right] - \mu \mu^{T} \end{align*} $$
■
[2]
$$ \begin{align*} \operatorname{Cov}(AX) =& E \left[ (AX - A\mu)(AX - A\mu)^{T} \right] \\ =& E \left[ A (X - \mu)(X - \mu)^{T} A^{T} \right] \\ =& A E \left[ (X - \mu)(X - \mu)^{T} \right] A^{T} \\ =& A \operatorname{Cov}(X) A^{T} \end{align*} $$
■
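The proof of [1] uses nothing beyond expanding $(X - \mu)(X - \mu)^{T}$ and the linearity of expectation, so it can be traced term by term on a small discrete distribution. The support points and probabilities in the sketch below are arbitrary illustrative choices.

```python
import numpy as np

# X takes each row of `support` with the matching probability
support = np.array([[0.0, 1.0],
                    [1.0, 0.0],
                    [1.0, 2.0],
                    [2.0, 1.0]])
probs = np.array([0.1, 0.2, 0.3, 0.4])

E = lambda f: sum(p * f(x) for x, p in zip(support, probs))  # exact expectation

mu = E(lambda x: x)                                   # μ = (EX_1, ..., EX_p)
lhs = E(lambda x: np.outer(x - mu, x - mu))           # E[(X-μ)(X-μ)^T]

# the four terms of the expansion in the proof of [1]
expansion = (E(lambda x: np.outer(x, x))              #  E[XX^T]
             - np.outer(mu, E(lambda x: x))           # -μ E[X^T]
             - np.outer(E(lambda x: x), mu)           # -E[X] μ^T
             + np.outer(mu, mu))                      # +μμ^T

rhs = E(lambda x: np.outer(x, x)) - np.outer(mu, mu)  # E[XX^T] - μμ^T

assert np.allclose(lhs, expansion) and np.allclose(lhs, rhs)
print("E[(X-μ)(X-μ)^T] = E[XX^T] - μμ^T  (exact on this distribution)")
```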