
Proof of Cramer's Rule

Overview

Cramer’s Rule is not an efficient method for solving systems of equations, but when $A_{j}$ or $A$ itself is given in a form whose determinant is easy to compute, it can be quite useful for directly finding just the unknowns you need.

Theorem

Suppose the system of equations $A \mathbb{x} = \mathbb{b}$ is given by an invertible matrix $$ A = \begin{bmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} & \cdots & a_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ a_{n1} & a_{n2} & \cdots & a_{nn} \end{bmatrix} $$ and the two vectors $$ \mathbb{x} = \begin{bmatrix} x_{1} \\ x_{2} \\ \vdots \\ x_{n} \end{bmatrix}, \qquad \mathbb{b} = \begin{bmatrix} b_{1} \\ b_{2} \\ \vdots \\ b_{n} \end{bmatrix} $$ If $A_{j}$ denotes the matrix obtained by replacing the $j$th column of $A$ with $\mathbb{b}$, then $$ x_{j} = {{ \det A_{j} } \over { \det A }} $$
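For example, for the $2 \times 2$ system $$ \begin{align*} x_{1} + 2 x_{2} =& 5 \\ 3 x_{1} + 4 x_{2} =& 6 \end{align*} $$ we have $\det A = 1 \cdot 4 - 2 \cdot 3 = -2$, and the rule gives $$ x_{1} = {{ \det \begin{bmatrix} 5 & 2 \\ 6 & 4 \end{bmatrix} } \over { -2 }} = {{ 8 } \over { -2 }} = -4 , \qquad x_{2} = {{ \det \begin{bmatrix} 1 & 5 \\ 3 & 6 \end{bmatrix} } \over { -2 }} = {{ -9 } \over { -2 }} = {{ 9 } \over { 2 }} $$ which indeed satisfies both equations.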

Proof

Let $C_{ij}$ denote the $(i,j)$ cofactor of $A$.

For a chosen column $j$, $\displaystyle \det A = \sum_{i=1}^{n} a_{ij} C_{ij}$

Since the $j$th column of $A$ consists of the entries $a_{1j} , a_{2j} , \cdots , a_{nj}$, the Laplace expansion gives

$$ \det A = a_{1j} C_{1j} + a_{2j} C_{2j} + \cdots + a_{nj} C_{nj} $$

Meanwhile, for any column $k \ne j$ of $A$, with entries $a_{1k} , a_{2k} , \cdots , a_{nk}$, the sum $a_{1k} C_{1j} + a_{2k} C_{2j} + \cdots + a_{nk} C_{nj}$ equals the determinant of the matrix obtained by replacing the $j$th column of $A$ with its $k$th column. That matrix has two identical columns, so

$$ a_{1k} C_{1j} + a_{2k} C_{2j} + \cdots + a_{nk} C_{nj} = 0 $$
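These two identities can also be checked numerically. The following is a minimal sketch (assuming NumPy; the random matrix, the chosen indices, and the `cofactor` helper are illustrative, not part of the proof), which expands a $3 \times 3$ determinant along a column and confirms that the mismatched cofactor sum vanishes.

```python
import numpy as np

def cofactor(A, i, j):
    # C_ij = (-1)^(i+j) times the determinant of A with row i and column j removed
    minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
    return (-1) ** (i + j) * np.linalg.det(minor)

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
n = A.shape[0]

j = 1  # expand along this column (0-indexed)
laplace = sum(A[i, j] * cofactor(A, i, j) for i in range(n))
print(np.isclose(laplace, np.linalg.det(A)))  # True: det A = sum_i a_ij C_ij

k = 2  # any other column, k != j
alien = sum(A[i, k] * cofactor(A, i, j) for i in range(n))
print(np.isclose(alien, 0.0))                 # True: sum_i a_ik C_ij = 0
```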

Writing out the system of equations $A \mathbb{x} = \mathbb{b}$ componentwise,

$$ \begin{align*} a_{11} x_{1} + a_{12} x_{2} + \cdots + a_{1j} x_{j} + \cdots + a_{1n} x_{n} =& b_{1} \\ a_{21} x_{1} + a_{22} x_{2} + \cdots + a_{2j} x_{j} + \cdots + a_{2n} x_{n} =& b_{2} \\ &\vdots \\ a_{n1} x_{1} + a_{n2} x_{2} + \cdots + a_{nj} x_{j} + \cdots + a_{nn} x_{n} =& b_{n} \end{align*} $$

Multiplying both sides of the $i$th equation by $C_{ij}$,

$$ \begin{align*} a_{11} x_{1} C_{1j} + a_{12} x_{2} C_{1j} + \cdots + a_{1j} x_{j} C_{1j} + \cdots + a_{1n} x_{n} C_{1j} =& b_{1} C_{1j} \\ a_{21} x_{1} C_{2j} + a_{22} x_{2} C_{2j} + \cdots + a_{2j} x_{j} C_{2j} + \cdots + a_{2n} x_{n} C_{2j} =& b_{2} C_{2j} \\ &\vdots \\ a_{n1} x_{1} C_{nj} + a_{n2} x_{2} C_{nj} + \cdots + a_{nj} x_{j} C_{nj} + \cdots + a_{nn} x_{n} C_{nj} =& b_{n} C_{nj} \end{align*} $$

Summing all of these equations, every group of terms on the left side involving $x_{k}$ with $k \ne j$ vanishes, since $\displaystyle x_{k} \sum_{i=1}^{n} a_{ik} C_{ij} = 0$; only the middle group $\displaystyle \sum_{i=1}^{n} a_{ij} x_{j} C_{ij}$ survives, so

$$ \sum_{i=1}^{n} a_{ij} x_{j} C_{ij} = b_{1} C_{1j} + b_{2} C_{2j} + \cdots + b_{n} C_{nj} $$

Meanwhile, since $A_{j}$ is the matrix whose $j$th column has been replaced by $b_{1} , b_{2} , \cdots , b_{n}$, expanding its determinant along that column gives

$$ \det A_{j} = b_{1} C_{1j} + b_{2} C_{2j} + \cdots + b_{n} C_{nj} $$

Pulling $x_{j}$ out of the sum $\displaystyle \sum_{i=1}^{n} a_{ij} x_{j} C_{ij}$ and combining the two displays above,

$$ x_{j} \sum_{i=1}^{n} a_{ij} C_{ij} = \det A_{j} $$

Since $\displaystyle \sum_{i=1}^{n} a_{ij} C_{ij} = \det A$, this simplifies to $x_{j} \det A = \det A_{j}$. Because $A$ is assumed to be invertible, $\det A \ne 0$. Therefore,

$$ x_{j} = {{ \det A_{j} } \over {\det A}} $$
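As a closing illustration, here is a small sketch (again assuming NumPy; `cramer_solve` and the sample system are names chosen just for this example) that applies the formula directly: replace the $j$th column of $A$ with $\mathbb{b}$, then divide determinants.

```python
import numpy as np

def cramer_solve(A, b):
    # x_j = det(A_j) / det(A), where A_j has its j-th column replaced by b
    detA = np.linalg.det(A)
    x = np.empty(len(b))
    for j in range(len(b)):
        Aj = A.copy()
        Aj[:, j] = b
        x[j] = np.linalg.det(Aj) / detA
    return x

A = np.array([[2.0, 1.0, -1.0],
              [1.0, 3.0,  2.0],
              [1.0, 0.0,  1.0]])
b = np.array([3.0, 13.0, 4.0])

print(cramer_solve(A, b))      # should agree with the direct solver
print(np.linalg.solve(A, b))
```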