Proof of the Matrix Determinant Lemma
Theorem
For an invertible matrix $A \in \mathbb{R}^{n \times n}$ and vectors $\mathbf{u} , \mathbf{v} \in \mathbb{R}^{n}$, the following holds. $$ \det \left( A + \mathbf{u} \mathbf{v}^{T} \right) = \left( 1 + \mathbf{v}^{T} A^{-1} \mathbf{u} \right) \det A $$ In particular, with the classical adjoint matrix $\text{adj} (A) = \det (A) A^{-1}$, this can be written as follows. $$ \det \left( A + \mathbf{u} \mathbf{v}^{T} \right) = \det A + \mathbf{v}^{T} \text{adj} (A) \mathbf{u} $$
- $\mathbf{u}^{T}$ denotes the transpose of $\mathbf{u}$.
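Before the proof, the identity can be checked numerically. The following is a small NumPy sketch (not part of the proof); the random seed and the diagonal shift that keeps $A$ well conditioned are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
# Add n*I so that A is comfortably invertible (assumption for this sketch)
A = rng.standard_normal((n, n)) + n * np.eye(n)
u = rng.standard_normal(n)
v = rng.standard_normal(n)

# Left side: det(A + u v^T)
lhs = np.linalg.det(A + np.outer(u, v))
# Right side: (1 + v^T A^{-1} u) det A
rhs = (1 + v @ np.linalg.inv(A) @ u) * np.linalg.det(A)
assert np.isclose(lhs, rhs)
```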
Proof 1
Strategy: Derive the identity directly from the properties of determinants. Unless otherwise stated, block matrices are partitioned according to the dimensions of the matrices involved.
Properties of determinants: Let $A,B$ be $n\times n$ matrices, and let $k$ be a constant. The determinant satisfies the following properties.
- (b) $\det(AB) = \det(A)\det(B)$
- (c) $\det(AB)=\det(BA)$
Properties of block matrices: Let $A = \begin{bmatrix} A_{1} & A_{2} \\ O & A_{3} \end{bmatrix}$ be a block matrix. The following holds for its determinant. $$ \det A = \det A_{1} \det A_{3} $$
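The block-matrix property can be illustrated numerically. A minimal NumPy sketch (the concrete matrices are arbitrary examples, not from the proof):

```python
import numpy as np

A1 = np.array([[2.0, 1.0], [0.0, 3.0]])
A2 = np.array([[5.0, 7.0], [1.0, 2.0]])
A3 = np.array([[4.0, 0.0], [2.0, 1.0]])

# Block upper-triangular matrix [[A1, A2], [O, A3]]
A = np.block([[A1, A2], [np.zeros((2, 2)), A3]])

# det A = det A1 * det A3
assert np.isclose(np.linalg.det(A), np.linalg.det(A1) * np.linalg.det(A3))
```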
The identity matrix $I \in \mathbb{R}^{n \times n}$, the zero vector $\mathbf{0} \in \mathbb{R}^{n}$, and $\mathbf{w} := A^{-1} \mathbf{u} \in \mathbb{R}^{n}$ satisfy the following. $$ \begin{align*} & \begin{bmatrix} I & \mathbf{0} \\ \mathbf{v}^{T} & 1 \end{bmatrix} \begin{bmatrix} I + \mathbf{w} \mathbf{v}^{T} & \mathbf{w} \\ \mathbf{0}^{T} & 1 \end{bmatrix} \begin{bmatrix} I & \mathbf{0} \\ - \mathbf{v}^{T} & 1 \end{bmatrix} \\ =& \begin{bmatrix} I + \mathbf{w} \mathbf{v}^{T} & \mathbf{w} \\ \mathbf{v}^{T} + \mathbf{v}^{T} \mathbf{w} \mathbf{v}^{T} & \mathbf{v}^{T} \mathbf{w} + 1 \end{bmatrix} \begin{bmatrix} I & \mathbf{0} \\ - \mathbf{v}^{T} & 1 \end{bmatrix} \\ =& \begin{bmatrix} I & \mathbf{w} \\ \mathbf{0}^{T} & 1 + \mathbf{v}^{T} \mathbf{w} \end{bmatrix} \end{align*} $$
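The three-factor block identity above can also be confirmed numerically. A NumPy sketch, with arbitrary random $\mathbf{w}$ and $\mathbf{v}$ (the seed is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3
w = rng.standard_normal(n)
v = rng.standard_normal(n)
I = np.eye(n)
col0 = np.zeros((n, 1))  # the zero column vector
one = np.ones((1, 1))

# The three block factors, in the order they appear in the product
L = np.block([[I, col0], [v[None, :], one]])
M = np.block([[I + np.outer(w, v), w[:, None]], [col0.T, one]])
R = np.block([[I, col0], [-v[None, :], one]])

# Expected result: [[I, w], [0^T, 1 + v^T w]]
target = np.block([[I, w[:, None]], [col0.T, one * (1 + v @ w)]])
assert np.allclose(L @ M @ R, target)
```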
Since the determinant of a triangular matrix is the product of its diagonal entries, applying $\det$ and using determinant property (b) together with the block-matrix property yields the following. $$ \begin{align*} 1 + \mathbf{v}^{T} \mathbf{w} =& \det \begin{bmatrix} I & \mathbf{w} \\ \mathbf{0}^{T} & 1 + \mathbf{v}^{T} \mathbf{w} \end{bmatrix} \\ =& \det \begin{bmatrix} I & \mathbf{0} \\ \mathbf{v}^{T} & 1 \end{bmatrix} \det \begin{bmatrix} I + \mathbf{w} \mathbf{v}^{T} & \mathbf{w} \\ \mathbf{0}^{T} & 1 \end{bmatrix} \det \begin{bmatrix} I & \mathbf{0} \\ - \mathbf{v}^{T} & 1 \end{bmatrix} \\ =& 1 \cdot \det \begin{bmatrix} I + \mathbf{w} \mathbf{v}^{T} & \mathbf{w} \\ \mathbf{0}^{T} & 1 \end{bmatrix} \cdot 1 \\ =& \det \begin{bmatrix} I + \mathbf{w} \mathbf{v}^{T} \end{bmatrix} \cdot \det \begin{bmatrix} 1 \end{bmatrix} \\ =& \det \left( I + \mathbf{w} \mathbf{v}^{T} \right) \end{align*} $$ Multiplying both sides by $\det A$ and substituting $\mathbf{w} = A^{-1} \mathbf{u}$, determinant properties (b) and (c) yield the following. $$ \begin{align*} & & \left( 1 + \mathbf{v}^{T} \mathbf{w} \right) \cdot \det A =& \det A \cdot \det \left( I + A^{-1} \mathbf{u} \mathbf{v}^{T} \right) \\ \implies & & \left( 1 + \mathbf{v}^{T} A^{-1} \mathbf{u} \right) \det A =& \det \left( A + \mathbf{u} \mathbf{v}^{T} \right) \end{align*} $$
Properties of the classical adjoint matrix: In particular, if $A$ is invertible, the classical adjoint can be expressed as follows. $$ \text{adj} (A) = \det (A) A^{-1} $$
Finally, substituting $\det (A) A^{-1} = \text{adj} (A)$ into $\mathbf{v}^{T} A^{-1} \mathbf{u} \det A$ gives $\mathbf{v}^{T} \text{adj} (A) \mathbf{u}$, so the following is obtained. $$ \det \left( A + \mathbf{u} \mathbf{v}^{T} \right) = \det A + \mathbf{v}^{T} \text{adj} (A) \mathbf{u} $$
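The adjugate form can likewise be checked numerically. A NumPy sketch that computes $\text{adj}(A)$ via the identity $\text{adj}(A) = \det(A) A^{-1}$ (valid only for invertible $A$, as assumed here):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4
# Diagonal shift keeps A invertible (assumption for this sketch)
A = rng.standard_normal((n, n)) + n * np.eye(n)
u = rng.standard_normal(n)
v = rng.standard_normal(n)

detA = np.linalg.det(A)
adjA = detA * np.linalg.inv(A)  # adj(A) = det(A) A^{-1} for invertible A

# det(A + u v^T) = det A + v^T adj(A) u
lhs = np.linalg.det(A + np.outer(u, v))
rhs = detA + v @ adjA @ u
assert np.isclose(lhs, rhs)
```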
■
This lemma is used chiefly in the derivation of the Sherman-Morrison formula.