Singular Value Decomposition for Least Squares
Algorithm
Given $A \in \mathbb{C}^{m \times n}$ and a vector $\mathbf{b} \in \mathbb{C}^{m}$ with $\operatorname{rank} A = n$, let $\mathbf{x}_{\ast}$ denote the least squares solution of $A \mathbf{x} = \mathbf{b}$.
Step 1. Singular Value Decomposition
Find a matrix $\widehat{U} \in \mathbb{C}^{m \times n}$ with orthonormal columns, a diagonal matrix $\widehat{\Sigma} \in \mathbb{R}^{n \times n}$, and a unitary matrix $V \in \mathbb{C}^{n \times n}$ satisfying the reduced singular value decomposition $A = \widehat{U} \widehat{\Sigma} V^{\ast}$.
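As a minimal sketch of this step in NumPy, assuming a small illustrative $4 \times 2$ matrix $A$ and vector $\mathbf{b}$ (both made up for the example, not taken from the text), `np.linalg.svd` with `full_matrices=False` returns the reduced factors: $\widehat{U}$, the diagonal of $\widehat{\Sigma}$, and $V^{\ast}$.

```python
import numpy as np

# Illustrative data (assumption for the sketch): m = 4, n = 2, full column rank.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0, 2.0, 4.0])

# Reduced SVD: U_hat is m x n with orthonormal columns,
# s holds the n singular values (the diagonal of Sigma_hat),
# Vh is V* (n x n unitary).
U_hat, s, Vh = np.linalg.svd(A, full_matrices=False)
```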
Step 2.
Using $\widehat{U}$ obtained from the singular value decomposition, compute the orthogonal projection onto the column space of $A$, $P := \widehat{U} \widehat{U}^{\ast}$.
Since the least squares solution satisfies $A \mathbf{x}_{\ast} = P \mathbf{b}$, we have $\widehat{U} \widehat{\Sigma} V^{\ast} \mathbf{x}_{\ast} = \widehat{U} \widehat{U}^{\ast} \mathbf{b}$; multiplying both sides on the left by $\widehat{U}^{\ast}$ and using $\widehat{U}^{\ast} \widehat{U} = I$ gives $\widehat{\Sigma} V^{\ast} \mathbf{x}_{\ast} = \widehat{U}^{\ast} \mathbf{b}$.
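Continuing the sketch above, the projection can be formed directly from $\widehat{U}$; it is shown here only to illustrate the identity $A\mathbf{x}_{\ast} = P\mathbf{b}$, and the later steps never use $P$ itself.

```python
# Orthogonal projection onto the column space of A.
P = U_hat @ U_hat.conj().T      # P = U_hat U_hat*
Pb = P @ b                      # component of b lying in range(A)
```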
Step 3.
Compute $\mathbf{y} := \widehat{U}^{\ast} \mathbf{b}$ to get $\widehat{\Sigma} V^{\ast} \mathbf{x}_{\ast} = \mathbf{y}$.
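In the same sketch, Step 3 is a single matrix-vector product:

```python
# y = U_hat* b; this is all of P b that the remaining steps need.
y = U_hat.conj().T @ b
```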
Step 4.
Setting $\mathbf{w} := V^{\ast} \mathbf{x}_{\ast}$ in $\widehat{\Sigma} V^{\ast} \mathbf{x}_{\ast} = \mathbf{y}$, solve the equation $\widehat{\Sigma} \mathbf{w} = \mathbf{y}$ for $\mathbf{w}$.
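Because $\widehat{\Sigma}$ is diagonal with the singular values on its diagonal, solving $\widehat{\Sigma}\mathbf{w} = \mathbf{y}$ in the sketch reduces to componentwise division (valid here since $\operatorname{rank} A = n$ means no singular value is zero).

```python
# Solve Sigma_hat w = y: divide each entry of y by the matching singular value.
w = y / s
```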
Step 5.
Since $V$ is unitary and $\mathbf{w} = V^{\ast} \mathbf{x}_{\ast}$, recover the solution as $\mathbf{x}_{\ast} = V \mathbf{w}$.
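Closing out the sketch:

```python
# Recover the least squares solution from w = V* x_*.
x_star = Vh.conj().T @ w        # x_* = V w
```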
- Similar to the Least Squares Method using QR Decomposition, but since $\widehat{\Sigma}$ is diagonal, no forward or back substitution is required: solving $\widehat{\Sigma}\mathbf{w} = \mathbf{y}$ reduces to dividing each component of $\mathbf{y}$ by the corresponding singular value (see the check below).
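As a quick sanity check on the sketch, the computed solution can be compared with NumPy's built-in least squares routine on the same illustrative data:

```python
# Compare against NumPy's least squares solver on the same data.
x_ref, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.allclose(x_star, x_ref))   # expected: True
```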