
Spectral Decomposition

Definition 1

In spectral theory, the statement that $A$ is a Hermitian matrix is equivalent to it being unitarily diagonalizable:
$$A = A^{\ast} \iff A = Q \Lambda Q^{\ast}$$

The expression $A = Q \Lambda Q^{\ast}$ from the spectral theorem is referred to as the **Spectral Decomposition** of $A$, and it can be written as a series over the eigenpairs $\left\{ \left( \lambda_{k}, e_{k} \right) \right\}_{k=1}^{n}$:
$$A = \sum_{k=1}^{n} \lambda_{k} e_{k} e_{k}^{\ast}$$
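As a quick numerical check of the definition, the following sketch (using NumPy and a small, arbitrarily chosen Hermitian matrix) verifies that a Hermitian $A$ is unitarily diagonalizable, i.e. that $Q$ is unitary and $A = Q \Lambda Q^{\ast}$:

```python
import numpy as np

# A small Hermitian matrix chosen for illustration: A equals its conjugate transpose.
A = np.array([[2.0,      1 - 1j, 0.0],
              [1 + 1j,   3.0,    1j ],
              [0.0,     -1j,     1.0]])
assert np.allclose(A, A.conj().T)  # A = A*

# For Hermitian input, eigh returns real eigenvalues and orthonormal eigenvectors.
eigvals, Q = np.linalg.eigh(A)
Lam = np.diag(eigvals)

# Q is unitary, and A = Q Λ Q* (unitary diagonalization).
assert np.allclose(Q.conj().T @ Q, np.eye(3))
assert np.allclose(Q @ Lam @ Q.conj().T, A)
```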

Description

Especially in statistics, covariance matrices are often positive definite, and positive definite matrices are Hermitian. Beyond covariance matrices, for a design matrix $X$ the product $X^{T} X$ is symmetric; in particular, if $X \in \mathbb{R}^{m \times n}$, then $X^{T} X$ is real symmetric and hence again Hermitian. Under such conditions, by the spectral theorem, $A$ admits a matrix $Q$ composed of orthonormal eigenvectors $e_{1}, \cdots, e_{n}$ and can be rewritten as follows.
$$\begin{align*} & A \\ = & Q \Lambda Q^{\ast} \\ = & Q \begin{bmatrix} \lambda_{1} & 0 & \cdots & 0 \\ 0 & \lambda_{2} & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \lambda_{n} \end{bmatrix} \begin{bmatrix} e_{1}^{\ast} \\ e_{2}^{\ast} \\ \vdots \\ e_{n}^{\ast} \end{bmatrix} \\ = & \begin{bmatrix} e_{1} & e_{2} & \cdots & e_{n} \end{bmatrix} \begin{bmatrix} \lambda_{1} e_{1}^{\ast} \\ \lambda_{2} e_{2}^{\ast} \\ \vdots \\ \lambda_{n} e_{n}^{\ast} \end{bmatrix} \\ = & \lambda_{1} e_{1} e_{1}^{\ast} + \lambda_{2} e_{2} e_{2}^{\ast} + \cdots + \lambda_{n} e_{n} e_{n}^{\ast} \\ = & \sum_{k=1}^{n} \lambda_{k} e_{k} e_{k}^{\ast} \end{align*}$$
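The derivation above can be sketched numerically. In this illustration (the random design matrix is an assumption, not from the source), $A = X^{T} X$ is built from a design matrix $X$, eigendecomposed, and then reassembled as the sum of rank-one terms $\lambda_{k} e_{k} e_{k}^{\ast}$:

```python
import numpy as np

# Hypothetical design matrix X; A = X^T X is real symmetric, hence Hermitian.
rng = np.random.default_rng(0)
X = rng.standard_normal((5, 3))
A = X.T @ X

# Eigenpairs (λ_k, e_k): eigh returns eigenvalues and orthonormal eigenvectors.
eigvals, Q = np.linalg.eigh(A)

# Spectral decomposition: rebuild A as the sum of rank-one outer products λ_k e_k e_k^*.
A_rebuilt = sum(lam * np.outer(e, e.conj())
                for lam, e in zip(eigvals, Q.T))  # rows of Q.T = columns of Q
assert np.allclose(A_rebuilt, A)
```

Note that since $A = X^{T} X$ is positive semidefinite here, all the eigenvalues returned are nonnegative.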


  1. Johnson. (2013). Applied Multivariate Statistical Analysis (6th Edition): p99. ↩︎