Proof of the Matrix Determinant Lemma 📂Matrix Algebra

Theorem

Let $A \in \mathbb{R}^{n \times n}$ be an invertible matrix and $\mathbf{u}, \mathbf{v} \in \mathbb{R}^{n}$. Then the following holds.
$$\det \left( A + \mathbf{u} \mathbf{v}^{T} \right) = \left( 1 + \mathbf{v}^{T} A^{-1} \mathbf{u} \right) \det A$$
In particular, using the classical adjoint matrix $\text{adj}(A) = A^{-1} \det A$, it can be represented as follows.
$$\det \left( A + \mathbf{u} \mathbf{v}^{T} \right) = \det A + \mathbf{v}^{T} \text{adj}(A) \mathbf{u}$$
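Before turning to the proof, the statement can be sanity-checked numerically. This is a minimal sketch assuming an arbitrary well-conditioned random matrix and random vectors (all names here are illustrative, not part of the lemma).

```python
import numpy as np

# Hypothetical small instance: a random invertible A and random u, v.
rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n)) + n * np.eye(n)  # diagonal shift keeps A invertible
u = rng.standard_normal(n)
v = rng.standard_normal(n)

# Left side: determinant of the rank-one update.
lhs = np.linalg.det(A + np.outer(u, v))
# Right side: the lemma's closed form.
rhs = (1 + v @ np.linalg.inv(A) @ u) * np.linalg.det(A)
assert np.isclose(lhs, rhs)
```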


Proof 1

Strategy: derive the result directly from the properties of determinants. Unless stated otherwise, block matrices are partitioned according to the dimensions of the matrices involved.

Properties of determinants: Let $A, B$ be $n \times n$ matrices and let $k$ be a constant. The determinant satisfies the following properties.

  • (b) $\det(AB) = \det(A)\det(B)$
  • (c) $\det(AB) = \det(BA)$

Properties of block matrices: Let $A = \begin{bmatrix} A_{1} & A_{2} \\ O & A_{3} \end{bmatrix}$ be a block matrix. The following holds for its determinant.
$$\det A = \det A_{1} \det A_{3}$$
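The block-triangular determinant property above can be checked numerically as well. The block sizes below are arbitrary choices for illustration.

```python
import numpy as np

# Hypothetical blocks: A1 (3x3), A2 (3x2), A3 (2x2), with a zero block below A1.
rng = np.random.default_rng(1)
A1 = rng.standard_normal((3, 3))
A2 = rng.standard_normal((3, 2))
A3 = rng.standard_normal((2, 2))
O = np.zeros((2, 3))

# Assemble the block upper triangular matrix and compare determinants.
A = np.block([[A1, A2], [O, A3]])
assert np.isclose(np.linalg.det(A), np.linalg.det(A1) * np.linalg.det(A3))
```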


For the identity matrix $I \in \mathbb{R}^{n \times n}$, the zero vector $\mathbf{0} \in \mathbb{R}^{n}$, and $\mathbf{w} := A^{-1} \mathbf{u} \in \mathbb{R}^{n}$, the following holds.
$$\begin{align*} & \begin{bmatrix} I & \mathbf{0} \\ \mathbf{v}^{T} & 1 \end{bmatrix} \begin{bmatrix} I + \mathbf{w} \mathbf{v}^{T} & \mathbf{w} \\ \mathbf{0}^{T} & 1 \end{bmatrix} \begin{bmatrix} I & \mathbf{0} \\ - \mathbf{v}^{T} & 1 \end{bmatrix} \\ =& \begin{bmatrix} I + \mathbf{w} \mathbf{v}^{T} & \mathbf{w} \\ \mathbf{v}^{T} + \mathbf{v}^{T} \mathbf{w} \mathbf{v}^{T} & \mathbf{v}^{T} \mathbf{w} + 1 \end{bmatrix} \begin{bmatrix} I & \mathbf{0} \\ - \mathbf{v}^{T} & 1 \end{bmatrix} \\ =& \begin{bmatrix} I & \mathbf{w} \\ \mathbf{0}^{T} & 1 + \mathbf{v}^{T} \mathbf{w} \end{bmatrix} \end{align*}$$
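The three-factor collapse above can be verified concretely. This sketch builds the $(n+1) \times (n+1)$ block matrices with `np.block` for arbitrary random vectors (a hypothetical instance, not part of the proof).

```python
import numpy as np

# Hypothetical instance: check that the triple product collapses to
# the block upper triangular matrix [[I, w], [0^T, 1 + v^T w]].
rng = np.random.default_rng(2)
n = 3
I = np.eye(n)
v = rng.standard_normal(n)
w = rng.standard_normal(n)

L = np.block([[I, np.zeros((n, 1))], [v[None, :], np.ones((1, 1))]])
M = np.block([[I + np.outer(w, v), w[:, None]], [np.zeros((1, n)), np.ones((1, 1))]])
R = np.block([[I, np.zeros((n, 1))], [-v[None, :], np.ones((1, 1))]])

expected = np.block([[I, w[:, None]],
                     [np.zeros((1, n)), np.array([[1 + v @ w]])]])
assert np.allclose(L @ M @ R, expected)
```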

Since the determinant of a block triangular matrix is the product of the determinants of its diagonal blocks, applying $\det$ and using property (b) together with the block matrix property yields the following.
$$\begin{align*} 1 + \mathbf{v}^{T} \mathbf{w} =& \det \begin{bmatrix} I & \mathbf{w} \\ \mathbf{0}^{T} & 1 + \mathbf{v}^{T} \mathbf{w} \end{bmatrix} \\ =& \det \begin{bmatrix} I & \mathbf{0} \\ \mathbf{v}^{T} & 1 \end{bmatrix} \det \begin{bmatrix} I + \mathbf{w} \mathbf{v}^{T} & \mathbf{w} \\ \mathbf{0}^{T} & 1 \end{bmatrix} \det \begin{bmatrix} I & \mathbf{0} \\ - \mathbf{v}^{T} & 1 \end{bmatrix} \\ =& 1 \cdot \det \begin{bmatrix} I + \mathbf{w} \mathbf{v}^{T} & \mathbf{w} \\ \mathbf{0}^{T} & 1 \end{bmatrix} \cdot 1 \\ =& \det \left( I + \mathbf{w} \mathbf{v}^{T} \right) \cdot \det \begin{bmatrix} 1 \end{bmatrix} \\ =& \det \left( I + \mathbf{w} \mathbf{v}^{T} \right) \end{align*}$$
Multiplying both sides by $\det A$ and using properties (b) and (c) yields the following.
$$\begin{align*} \left( 1 + \mathbf{v}^{T} \mathbf{w} \right) \det A =& \det A \cdot \det \left( I + A^{-1} \mathbf{u} \mathbf{v}^{T} \right) \\ \implies \left( 1 + \mathbf{v}^{T} A^{-1} \mathbf{u} \right) \det A =& \det \left( A \left( I + A^{-1} \mathbf{u} \mathbf{v}^{T} \right) \right) = \det \left( A + \mathbf{u} \mathbf{v}^{T} \right) \end{align*}$$
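The key intermediate identity derived above, $\det \left( I + \mathbf{w} \mathbf{v}^{T} \right) = 1 + \mathbf{v}^{T} \mathbf{w}$, is the lemma with $A = I$ and can be checked on its own. The vectors below are an arbitrary illustrative choice.

```python
import numpy as np

# Hypothetical vectors: verify det(I + w v^T) = 1 + v^T w.
rng = np.random.default_rng(3)
n = 5
v = rng.standard_normal(n)
w = rng.standard_normal(n)

assert np.isclose(np.linalg.det(np.eye(n) + np.outer(w, v)), 1 + v @ w)
```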

Properties of classical adjoint matrices: In particular, if $A$ is an invertible matrix, the classical adjoint matrix can be represented as follows.
$$\text{adj}(A) = \det(A) A^{-1}$$

Finally, substituting $\text{adj}(A) = \det(A) A^{-1}$ gives $\left( 1 + \mathbf{v}^{T} A^{-1} \mathbf{u} \right) \det A = \det A + \mathbf{v}^{T} \left( \det (A) A^{-1} \right) \mathbf{u}$, so the following is obtained.
$$\det \left( A + \mathbf{u} \mathbf{v}^{T} \right) = \det A + \mathbf{v}^{T} \text{adj}(A) \mathbf{u}$$
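The adjugate form can likewise be confirmed numerically. Here $\text{adj}(A)$ is computed via the invertible-case formula $\det(A) A^{-1}$ used above; the matrix and vectors are an illustrative random instance.

```python
import numpy as np

# Hypothetical instance: verify det(A + u v^T) = det(A) + v^T adj(A) u,
# computing adj(A) = det(A) * A^{-1} since A is invertible.
rng = np.random.default_rng(4)
n = 4
A = rng.standard_normal((n, n)) + n * np.eye(n)  # shift keeps A invertible
u = rng.standard_normal(n)
v = rng.standard_normal(n)

adjA = np.linalg.det(A) * np.linalg.inv(A)
lhs = np.linalg.det(A + np.outer(u, v))
rhs = np.linalg.det(A) + v @ adjA @ u
assert np.isclose(lhs, rhs)
```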

This lemma is mainly used in the derivation of the Sherman-Morrison formula.