

Singular Value Decomposition for Least Squares

Algorithm

For a matrix $A \in \mathbb{C}^{m \times n}$ and a vector $\mathbf{b} \in \mathbb{C}^{m}$ with $\operatorname{rank} A = n$, let $\mathbf{x}_{\ast}$ denote the least squares solution of $A \mathbf{x} = \mathbf{b}$.

Step 1. Singular Value Decomposition

Find a matrix $\widehat{U}$ with orthonormal columns, a diagonal matrix $\widehat{\Sigma}$, and a unitary matrix $V$ satisfying $A = \widehat{U} \widehat{\Sigma} V^{\ast}$, the reduced singular value decomposition of $A$.
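
As a minimal sketch of this step (assuming NumPy and a randomly generated complex matrix $A$ purely for illustration), `np.linalg.svd` with `full_matrices=False` returns exactly the reduced factors $\widehat{U}$, the diagonal of $\widehat{\Sigma}$, and $V^{\ast}$:

```python
import numpy as np

# Hypothetical example data: a tall complex matrix A (m > n) with full column rank.
rng = np.random.default_rng(0)
m, n = 6, 3
A = rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n))

# Reduced (thin) SVD: U_hat is m-by-n with orthonormal columns,
# sigma holds the diagonal entries of Sigma_hat, and Vh equals V*.
U_hat, sigma, Vh = np.linalg.svd(A, full_matrices=False)

# Verify the factorization A = U_hat Sigma_hat V*.
print(np.allclose(A, U_hat @ np.diag(sigma) @ Vh))  # True
```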


Step 2.

Using the $\widehat{U}$ obtained from the singular value decomposition, compute the projection $P := \widehat{U} \widehat{U}^{\ast}$.

Since $A \mathbf{x}_{\ast} = P \mathbf{b}$, we have $\widehat{U} \widehat{\Sigma} V^{\ast} \mathbf{x}_{\ast} = \widehat{U} \widehat{U}^{\ast} \mathbf{b}$, and multiplying both sides on the left by $\widehat{U}^{\ast}$ (using $\widehat{U}^{\ast} \widehat{U} = I$) gives $\widehat{\Sigma} V^{\ast} \mathbf{x}_{\ast} = \widehat{U}^{\ast} \mathbf{b}$.


Step 3.

Compute $\mathbf{y} := \widehat{U}^{\ast} \mathbf{b}$ to get $\widehat{\Sigma} V^{\ast} \mathbf{x}_{\ast} = \mathbf{y}$.
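
Continuing the sketch above (same assumed random $A$ and $\mathbf{b}$), Steps 2 and 3 amount to projecting $\mathbf{b}$ onto the column space of $A$: the projector is $P = \widehat{U}\widehat{U}^{\ast}$ and $\mathbf{y} = \widehat{U}^{\ast}\mathbf{b}$, so that $P\mathbf{b} = \widehat{U}\mathbf{y}$.

```python
import numpy as np

# Hypothetical example data, set up as in the previous sketch.
rng = np.random.default_rng(0)
m, n = 6, 3
A = rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n))
b = rng.standard_normal(m) + 1j * rng.standard_normal(m)

U_hat, sigma, Vh = np.linalg.svd(A, full_matrices=False)

# Step 2: orthogonal projector onto the column space of A.
P = U_hat @ U_hat.conj().T

# Step 3: y = U_hat* b, so that P b = U_hat y.
y = U_hat.conj().T @ b
print(np.allclose(P @ b, U_hat @ y))  # True
```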


Step 4.

Setting $\mathbf{w} := V^{\ast} \mathbf{x}_{\ast}$ in $\widehat{\Sigma} V^{\ast} \mathbf{x}_{\ast} = \mathbf{y}$, solve the equation $\widehat{\Sigma} \mathbf{w} = \mathbf{y}$ for $\mathbf{w}$; since $\widehat{\Sigma}$ is diagonal with nonzero singular values (because $\operatorname{rank} A = n$), this is just the entrywise division $w_{i} = y_{i} / \sigma_{i}$.
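
In the same hypothetical setup, the diagonal solve of this step is a single vectorized division:

```python
import numpy as np

# Continuing the hypothetical sketch: sigma and y as computed in the earlier steps.
rng = np.random.default_rng(0)
m, n = 6, 3
A = rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n))
b = rng.standard_normal(m) + 1j * rng.standard_normal(m)
U_hat, sigma, Vh = np.linalg.svd(A, full_matrices=False)
y = U_hat.conj().T @ b

# Step 4: Sigma_hat is diagonal, so solving Sigma_hat w = y is entrywise division.
w = y / sigma
print(np.allclose(np.diag(sigma) @ w, y))  # True
```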


Step 5.

Since $V$ is a unitary matrix, recover the solution as $\mathbf{x}_{\ast} = V \mathbf{w}$.
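
Putting the five steps together, a minimal end-to-end sketch (again assuming NumPy and the same hypothetical $A$ and $\mathbf{b}$; the function name `svd_least_squares` is ours) can be checked against `np.linalg.lstsq`:

```python
import numpy as np

def svd_least_squares(A, b):
    """Least squares solution of A x = b via the reduced SVD (Steps 1-5)."""
    U_hat, sigma, Vh = np.linalg.svd(A, full_matrices=False)  # Step 1
    y = U_hat.conj().T @ b                                    # Steps 2-3
    w = y / sigma                                             # Step 4 (rank A = n)
    return Vh.conj().T @ w                                    # Step 5: x = V w

# Hypothetical example data.
rng = np.random.default_rng(0)
m, n = 6, 3
A = rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n))
b = rng.standard_normal(m) + 1j * rng.standard_normal(m)

x = svd_least_squares(A, b)
x_ref = np.linalg.lstsq(A, b, rcond=None)[0]
print(np.allclose(x, x_ref))  # True
```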