
Inverse Matrices and Systems of Linear Equations 📂Matrix Algebra


Theorem: Equivalent Conditions for an Invertible Matrix1

Let $A$ be a square matrix of size $n\times n$. Then the following statements are equivalent.

(a) $A$ is an invertible matrix.

(e) $A\mathbf{x}=\mathbf{b}$ has at least one solution for every $n\times 1$ matrix $\mathbf{b}$.

(f) $A\mathbf{x}=\mathbf{b}$ has exactly one solution for every $n\times 1$ matrix $\mathbf{b}$, namely $\mathbf{x}=A^{-1}\mathbf{b}$.

Description

The equivalence of (e) and (f) means that if the linear system $A \mathbf{x} = \mathbf{b}$ has at least one solution for every $n \times 1$ matrix $\mathbf{b}$, then that solution is in fact unique.
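As a quick numerical sanity check of (f), the sketch below uses a hypothetical $2\times 2$ matrix (not from the source) and verifies that $\mathbf{x} = A^{-1}\mathbf{b}$ solves the system and agrees with a direct solver:

```python
import numpy as np

# Hypothetical invertible 2x2 example: since A is invertible,
# Ax = b has exactly one solution, x = A^{-1} b.
A = np.array([[2.0, 1.0],
              [1.0, 1.0]])
b = np.array([3.0, 2.0])

x = np.linalg.inv(A) @ b  # x = A^{-1} b

assert np.allclose(A @ x, b)                   # x really solves Ax = b
assert np.allclose(x, np.linalg.solve(A, b))   # and matches the direct solve
```

In practice `np.linalg.solve` is preferred over forming $A^{-1}$ explicitly, but both yield the same unique solution here, which is exactly what (f) asserts.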

Proof

(a) $\implies$ (f)

Assume $A$ is an invertible matrix. Then $A(A^{-1}\mathbf{b}) = (AA^{-1})\mathbf{b} = I\mathbf{b} = \mathbf{b}$, so $\mathbf{x} = A^{-1}\mathbf{b}$ is a solution to $A \mathbf{x} = \mathbf{b}$. Now let $\mathbf{x}_{0}$ be any solution, so that $A \mathbf{x}_{0} = \mathbf{b}$. Multiplying both sides on the left by $A^{-1}$ gives:

$$ \begin{align*} && A^{-1} A \mathbf{x}_{0} &= A^{-1} \mathbf{b} \\ \implies && \mathbf{x}_{0} &= A^{-1}\mathbf{b} \end{align*} $$

Therefore every solution equals $A^{-1}\mathbf{b}$, which means the solution is unique as $\mathbf{x} = A^{-1} \mathbf{b}$.

(f) $\implies$ (e)

This is obvious.

(e) $\implies$ (a)

Assume the linear system $A \mathbf{x} = \mathbf{b}$ has a solution for every $n \times 1$ matrix $\mathbf{b}$. Then, in particular, the following $n$ linear systems all have solutions:

$$ A \mathbf{x} = \begin{bmatrix}1 \\ 0 \\ \vdots \\ 0 \end{bmatrix},\quad A \mathbf{x} = \begin{bmatrix}0 \\ 1 \\ \vdots \\ 0 \end{bmatrix},\quad \dots,\quad A \mathbf{x} = \begin{bmatrix}0 \\ 0 \\ \vdots \\ 1 \end{bmatrix} $$

Let $\mathbf{x}_{1}, \mathbf{x}_{2}, \dots, \mathbf{x}_{n}$ denote the solutions to these linear systems, in order, and let $C$ be the matrix that has these solutions as its column vectors.

$$ C = \begin{bmatrix} \mathbf{x}_{1} & \mathbf{x}_{2} & \cdots & \mathbf{x}_{n} \end{bmatrix} $$

Calculating $AC$ gives the following, so $C$ is a right inverse of $A$. Since $A$ and $C$ are square, a one-sided inverse is automatically two-sided, so $C = A^{-1}$ and $A$ is invertible.

$$ AC = A\begin{bmatrix} \mathbf{x}_{1} & \mathbf{x}_{2} & \cdots & \mathbf{x}_{n} \end{bmatrix} = \begin{bmatrix} A\mathbf{x}_{1} & A\mathbf{x}_{2} & \cdots & A\mathbf{x}_{n} \end{bmatrix} = \begin{bmatrix} 1 & 0 & \cdots & 0 \\ 0 & 1 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & 1 \end{bmatrix} = I_{n} $$
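The proof's construction can be carried out numerically. The sketch below (using a hypothetical $2\times 2$ matrix, not from the source) solves $A\mathbf{x} = \mathbf{e}_{i}$ for each standard basis vector $\mathbf{e}_{i}$, collects the solutions as the columns of $C$, and checks that $AC = I_{n}$:

```python
import numpy as np

# Hypothetical invertible example matrix.
A = np.array([[2.0, 1.0],
              [1.0, 1.0]])
n = A.shape[0]
I = np.eye(n)

# x_i solves A x = e_i; stack the solutions as the columns of C.
C = np.column_stack([np.linalg.solve(A, I[:, i]) for i in range(n)])

assert np.allclose(A @ C, I)              # AC = I_n, as in the proof
assert np.allclose(C, np.linalg.inv(A))   # so C is the inverse of A
```

This is essentially how an inverse can be computed column by column: each column of $A^{-1}$ is the solution of one linear system with a standard basis vector on the right-hand side.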


  1. Howard Anton, Elementary Linear Algebra: Applications Version (12th Edition, 2019), pp. 64-65 ↩︎