
Relationship Between Orthogonality and Linear Independence

Definition¹

Two vectors $\mathbf{u}, \mathbf{v}$ of an inner product space $V$ are said to be orthogonal if they satisfy $\langle \mathbf{u}, \mathbf{v} \rangle = 0$.

A set made up of elements of $V$ in which each element is orthogonal to every other element is called an orthogonal set.

If the norm of every element in an orthogonal set is $1$, then it is called an orthonormal set.
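As an illustration (a worked example not taken from the cited text), consider $\mathbb{R}^{3}$ with the standard dot product and the set

$$
S = \left\{ (1,1,0),\ (1,-1,0),\ (0,0,2) \right\}
$$

Every pairwise dot product is $0$, so $S$ is an orthogonal set; it is not orthonormal, since for instance $\left\| (1,1,0) \right\| = \sqrt{2} \ne 1$. Dividing each vector by its norm produces an orthonormal set.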

Theorem

If a subset $S = \left\{ \mathbf{v}_{1}, \mathbf{v}_{2}, \cdots, \mathbf{v}_{n} \right\}$ of an inner product space $V$ is an orthogonal set that contains no zero vector, then $S$ is linearly independent.
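Note that the converse does not hold: linear independence does not force orthogonality. For example (not from the cited text), in $\mathbb{R}^{2}$ the set $\left\{ (1,0), (1,1) \right\}$ is linearly independent, yet $\langle (1,0), (1,1) \rangle = 1 \ne 0$.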

Proof

To prove that $S$ is linearly independent, it suffices to show that the only solution to the following equation

$$
k_{1} \mathbf{v}_{1} + k_{2} \mathbf{v}_{2} + \cdots + k_{n} \mathbf{v}_{n} = \mathbf{0}
$$

is $k_{1} = k_{2} = \cdots = k_{n} = 0$. Taking the inner product of both sides of the above equation with each vector $\mathbf{v}_{i}$, we have

$$
\begin{align*}
0 &= \langle \mathbf{0}, \mathbf{v}_{i} \rangle \\
&= \langle k_{1} \mathbf{v}_{1} + k_{2} \mathbf{v}_{2} + \cdots + k_{n} \mathbf{v}_{n}, \mathbf{v}_{i} \rangle \\
&= k_{1} \langle \mathbf{v}_{1}, \mathbf{v}_{i} \rangle + k_{2} \langle \mathbf{v}_{2}, \mathbf{v}_{i} \rangle + \cdots + k_{i} \langle \mathbf{v}_{i}, \mathbf{v}_{i} \rangle + \cdots + k_{n} \langle \mathbf{v}_{n}, \mathbf{v}_{i} \rangle \\
&= k_{i} \langle \mathbf{v}_{i}, \mathbf{v}_{i} \rangle
\end{align*}
$$

The last equality holds because $S$ is an orthogonal set, so $\langle \mathbf{v}_{j}, \mathbf{v}_{i} \rangle = 0$ for every $j \ne i$. Moreover, since $S$ does not include the zero vector, $\langle \mathbf{v}_{i}, \mathbf{v}_{i} \rangle > 0$. Therefore,

$$
k_{i} = 0, \quad \forall\ 1 \le i \le n
$$

which shows that $S$ is linearly independent.
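As a quick numerical sanity check of the theorem (a minimal sketch, not part of the original article; it assumes NumPy and the standard dot product on $\mathbb{R}^{3}$), one can verify that an orthogonal set of nonzero vectors has full column rank, i.e., is linearly independent:

```python
import numpy as np

# Orthogonal set of nonzero vectors in R^3 (same example as above).
v1 = np.array([1.0, 1.0, 0.0])
v2 = np.array([1.0, -1.0, 0.0])
v3 = np.array([0.0, 0.0, 2.0])

# Pairwise inner products vanish, so {v1, v2, v3} is an orthogonal set.
assert np.isclose(v1 @ v2, 0.0)
assert np.isclose(v1 @ v3, 0.0)
assert np.isclose(v2 @ v3, 0.0)

# Stack the vectors as columns; full column rank (= 3) means the only
# solution of k1*v1 + k2*v2 + k3*v3 = 0 is k1 = k2 = k3 = 0.
A = np.column_stack([v1, v2, v3])
print(np.linalg.matrix_rank(A))  # prints 3: the set is linearly independent
```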


  1. Howard Anton, Elementary Linear Algebra: Applications Version (12th Edition, 2019), p. 361-362 ↩︎