

Basis of Vector Space

Definition¹

Let $S = \left\{ \mathbf{v}_{1}, \mathbf{v}_{2}, \dots, \mathbf{v}_{r} \right\}$ be a subset of a vector space $V$. If $S$ satisfies the following two conditions, then $S$ is called a basis of $V$.

(a) $S$ is linearly independent.

(b) $S$ spans $V$.

Explanation

As the name suggests, a basis is 'the smallest thing that can create a vector space'. The spanning condition carries the meaning of 'creating a vector space', and linear independence carries the meaning of being 'the smallest'. Even once creating a vector space is understood, the need to be smallest may not be immediately clear, but one simple example makes it so. For instance, we do not represent the vector $(2,3)$ as

$$ (2,3) = 1(1,0) + 2(0,1) + 1(1,1) $$

This is because $(1,1)$ can itself be represented as a linear combination of $(1,0)$ and $(0,1)$, so the expression above is merely an unnecessarily lengthy way of writing it. Linear independence therefore ensures that, when representing any vector as a linear combination of the basis, it is done in the neatest manner possible, including only what is necessary.
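To see this redundancy concretely, here is a small sketch (using numpy; the article itself contains no code) showing that over the linearly dependent set $\{(1,0), (0,1), (1,1)\}$ the same vector admits more than one representation:

```python
import numpy as np

# Columns are (1,0), (0,1), (1,1); the third is the sum of the
# first two, so this set is linearly dependent.
A = np.array([[1, 0, 1],
              [0, 1, 1]])

c = np.array([1, 2, 1])  # (2,3) = 1(1,0) + 2(0,1) + 1(1,1)
k = np.array([2, 3, 0])  # (2,3) = 2(1,0) + 3(0,1) + 0(1,1)

print(A @ c)  # [2 3]
print(A @ k)  # [2 3] -- two different coefficient tuples, same vector
```

With a linearly independent set this cannot happen, which is exactly the uniqueness theorem proved below.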

It is important to note that a basis for a vector space is not necessarily unique. For example, $\left\{ (1,0), (0,1) \right\}$ is a basis that spans $\mathbb{R}^{2}$. However, according to the definition, $\left\{ (2,0), (0,2) \right\}$ is also a basis for $\mathbb{R}^{2}$, and even $\left\{ (1,1), (-1,1) \right\}$ has no problem being a basis that spans $\mathbb{R}^{2}$. In general, however, for $\mathbb{R}^{n}$ the basis consisting of the following vectors is the one usually discussed.
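One quick way to check that a set like $\{(1,1), (-1,1)\}$ really is a basis of $\mathbb{R}^{2}$ is to place the vectors as columns of a matrix and verify the determinant is nonzero. A minimal sketch with numpy (an assumption, not part of the article):

```python
import numpy as np

# Columns are the candidate basis vectors (1,1) and (-1,1).
B = np.array([[1, -1],
              [1,  1]])

# A nonzero determinant means the columns are linearly independent,
# and two independent vectors in R^2 automatically span R^2.
det = np.linalg.det(B)
print(det)  # nonzero, so B's columns form a basis
```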

$$ \mathbf{e}_{1} = (1,0,0,\dots,0), \quad \mathbf{e}_{2} = (0,1,0,\dots,0), \quad \dots, \quad \mathbf{e}_{n} = (0,0,0,\dots,1) $$

Such a basis is called the standard basis for $\mathbb{R}^{n}$, and each $\mathbf{e}_{i}$ is referred to as a standard unit vector. For $n=3$ in particular, the following notations are commonly used.

$$ \begin{align*} \hat{\mathbf{x}} =&\ \mathbf{e}_{1} = \hat{\mathbf{x}}_{1} = \mathbf{i} = (1,0,0) \\ \hat{\mathbf{y}} =&\ \mathbf{e}_{2} = \hat{\mathbf{x}}_{2} = \mathbf{j} = (0,1,0) \\ \hat{\mathbf{z}} =&\ \mathbf{e}_{3} = \hat{\mathbf{x}}_{3} = \mathbf{k} = (0,0,1) \end{align*} $$
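As a sketch (numpy assumed), the standard basis of $\mathbb{R}^{3}$ is just the rows of the identity matrix, and any vector decomposes along it componentwise:

```python
import numpy as np

# Rows of the identity matrix are the standard unit vectors i, j, k.
i, j, k = np.eye(3, dtype=int)

v = np.array([4, 5, 6])
recombined = v[0] * i + v[1] * j + v[2] * k
print(recombined)  # [4 5 6]
```

This componentwise decomposition is what makes the standard basis the default choice: the coordinates of a vector relative to it are just its entries.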

From the following theorem, the concept of coordinates can also be discussed in abstract vector spaces. If $\mathbf{v} \in V$ is represented as in $(1)$, then $[\mathbf{v}]_{S}$ is called the coordinate vector of $\mathbf{v}$ relative to the basis $S$.

$$ [\mathbf{v}]_{S} = \begin{bmatrix} c_{1} \\ c_{2} \\ \vdots \\ c_{n} \end{bmatrix} $$
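Computationally, finding $[\mathbf{v}]_{S}$ amounts to solving a linear system whose coefficient matrix has the basis vectors as columns. A sketch with numpy, using the basis $\{(1,1), (-1,1)\}$ as an illustrative choice:

```python
import numpy as np

# Basis S = {(1,1), (-1,1)} as the columns of B.
B = np.array([[1, -1],
              [1,  1]])
v = np.array([2, 3])

# Solving B c = v yields the coordinate vector [v]_S.
coords = np.linalg.solve(B, v)
print(coords)  # [2.5 0.5], i.e. (2,3) = 2.5(1,1) + 0.5(-1,1)
```

Because $B$ is invertible, `np.linalg.solve` returns exactly one solution, mirroring the uniqueness theorem below.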

Theorem: Uniqueness of Basis Representation

Let $S = \left\{ \mathbf{v}_{1}, \mathbf{v}_{2}, \dots, \mathbf{v}_{n} \right\}$ be a basis of a vector space $V$. Then, for every vector $\mathbf{v} \in V$,

$$ \mathbf{v} = c_{1}\mathbf{v}_{1} + c_{2}\mathbf{v}_{2} + \cdots + c_{n}\mathbf{v}_{n} \tag{1} $$

there exists a unique representation of this form. In other words, there exists a unique ordered tuple of coefficients $(c_{1}, c_{2}, \dots, c_{n})$ satisfying the above equation.

Proof

Since $S$ spans $V$, by the definition of spanning, every vector in $V$ can be expressed as a linear combination of the vectors in $S$. Suppose some vector $\mathbf{v}$ can be expressed by the following two linear combinations.

$$ \begin{align*} \mathbf{v} &= c_{1}\mathbf{v}_{1} + c_{2}\mathbf{v}_{2} + \cdots + c_{n}\mathbf{v}_{n} \\ \mathbf{v} &= k_{1}\mathbf{v}_{1} + k_{2}\mathbf{v}_{2} + \cdots + k_{n}\mathbf{v}_{n} \end{align*} $$

Subtracting the equations from each other yields the following.

$$ \mathbf{0} = (c_{1} - k_{1})\mathbf{v}_{1} + (c_{2} - k_{2})\mathbf{v}_{2} + \cdots + (c_{n} - k_{n})\mathbf{v}_{n} $$

However, since $\mathbf{v}_{1}, \mathbf{v}_{2}, \dots, \mathbf{v}_{n}$ are linearly independent, the only solution satisfying the above equation is

$$ c_{1} - k_{1} = 0, \quad c_{2} - k_{2} = 0, \quad \dots, \quad c_{n} - k_{n} = 0 $$

Therefore, the following holds.

$$ c_{1} = k_{1}, \quad c_{2} = k_{2}, \quad \dots, \quad c_{n} = k_{n} $$

Hence, the two linear combination representations are identical.


  1. Howard Anton, Elementary Linear Algebra: Applications Version (12th Edition, 2019), p240 ↩︎