
Basis of Vector Space 📂Linear Algebra


Definition 1

Let $S = \left\{ \mathbf{v}_{1}, \mathbf{v}_{2}, \dots, \mathbf{v}_{r} \right\}$ be a subset of a vector space $V$. If $S$ satisfies the following two conditions, then $S$ is called a basis of $V$.

(a) $S$ is linearly independent.

(b) $S$ spans $V$.

Explanation

As the name suggests, a basis corresponds to 'the smallest thing that can create a vector space'. The spanning condition captures 'creating a vector space', and linear independence captures being 'the smallest'. While creating a vector space is easy to understand, the need to be smallest may not be immediately clear. However, one simple example makes it so: we do not represent the vector $(2,3)$ as

$$ (2,3)=1(1,0) + 2(0,1) + 1(1,1) $$

because $(1,1)$ can itself be represented as a linear combination of $(1,0)$ and $(0,1)$, so the expression above is merely an unnecessarily long way of writing it. Linear independence therefore ensures that when any vector is represented as a linear combination of the basis, the representation is as neat as possible, containing only what is necessary.

It is important to note that a basis for a vector space need not be unique. For example, $\left\{ (1,0) , (0,1) \right\}$ is a basis that spans $\mathbb{R}^{2}$, but by the definition $\left\{ (2,0) , (0,2) \right\}$ is also a basis for $\mathbb{R}^2$. In fact, even $\left\{ (1,1) , (-1,1) \right\}$ is a perfectly good basis spanning $\mathbb{R}^2$. Generally, however, for $\mathbb{R}^{n}$ one works with the basis consisting of the following vectors.
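As a quick numerical check of the claims above: $n$ vectors in $\mathbb{R}^{n}$ form a basis exactly when the matrix having them as columns is invertible, i.e. has nonzero determinant. A minimal sketch with NumPy (the function name `is_basis` is ours, for illustration):

```python
import numpy as np

def is_basis(vectors):
    """Return True if the given vectors form a basis of R^n.

    n vectors in R^n form a basis exactly when the matrix with
    those vectors as columns is invertible (nonzero determinant).
    """
    A = np.column_stack(vectors)
    if A.shape[0] != A.shape[1]:
        return False  # need exactly n vectors in R^n
    return not np.isclose(np.linalg.det(A), 0.0)

# All three sets below are bases of R^2, as claimed above.
print(is_basis([(1, 0), (0, 1)]))    # standard basis
print(is_basis([(2, 0), (0, 2)]))
print(is_basis([(1, 1), (-1, 1)]))
# A linearly dependent set is not a basis.
print(is_basis([(1, 2), (2, 4)]))
```

The determinant test combines both conditions of the definition at once: for a square matrix, invertibility is equivalent to the columns being linearly independent, and also to the columns spanning $\mathbb{R}^{n}$.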

$$ \mathbf{e}_{1} = (1,0,0,\dots,0), \quad \mathbf{e}_{2}=(0,1,0,\dots,0),\quad \dots,\quad \mathbf{e}_{n}=(0,0,0,\dots,1) $$

Such a basis is called the standard basis for $\mathbb{R}^{n}$, and each $\mathbf{e}_{i}$ is referred to as a standard unit vector. For $n=3$ in particular, the following notations are common.

$$ \begin{align*} \hat{\mathbf{x}} =&\ \mathbf{e}_{1} = \hat{\mathbf{x}}_{1} = \mathbf{i}=(1,0,0) \\ \hat{\mathbf{y}} =&\ \mathbf{e}_{2} = \hat{\mathbf{x}}_{2} = \mathbf{j}=(0,1,0) \\ \hat{\mathbf{z}} =&\ \mathbf{e}_{3} = \hat{\mathbf{x}}_{3} = \mathbf{k}=(0,0,1) \end{align*} $$

By the following theorem, the concept of coordinates carries over to abstract vector spaces: if $\mathbf{v} \in V$ is represented as in $(1)$, then $[\mathbf{v}]_{S}$ below is called the coordinate vector of $\mathbf{v}$ relative to the basis $S$.

$$ [\mathbf{v}]_{S} = \begin{bmatrix} c_{1} \\ c_{2} \\ \vdots \\ c_{n} \end{bmatrix} $$
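Concretely, for a basis of $\mathbb{R}^{n}$ the coordinate vector can be computed by solving the linear system $A\mathbf{c} = \mathbf{v}$, where $A$ has the basis vectors as columns. A minimal sketch in NumPy (the helper name `coordinate_vector` is ours, for illustration):

```python
import numpy as np

def coordinate_vector(v, basis):
    """Coordinates of v relative to the ordered basis S.

    Solves c_1 v_1 + ... + c_n v_n = v, i.e. A c = v where A has
    the basis vectors as columns; the uniqueness theorem below
    guarantees there is exactly one solution.
    """
    A = np.column_stack(basis)
    return np.linalg.solve(A, np.asarray(v, dtype=float))

# (2, 3) relative to the basis {(1, 1), (-1, 1)}:
# (2, 3) = 2.5*(1, 1) + 0.5*(-1, 1)
print(coordinate_vector((2, 3), [(1, 1), (-1, 1)]))
```

Relative to the standard basis, the coordinate vector of $\mathbf{v}$ is simply $\mathbf{v}$ itself, since $A$ is then the identity matrix.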

Theorem: Uniqueness of Basis Representation

Let $S = \left\{ \mathbf{v}_{1}, \mathbf{v}_{2}, \dots, \mathbf{v}_{n} \right\}$ be a basis of a vector space $V$. Then every vector $\mathbf{v} \in V$ can be represented as

$$ \begin{equation} \mathbf{v} = c_{1}\mathbf{v}_{1} + c_{2}\mathbf{v}_{2} + \cdots + c_{n}\mathbf{v}_{n} \end{equation} $$

in exactly one way. In other words, there exists a unique ordered tuple of coefficients $(c_{1},c_{2},\dots,c_{n})$ that satisfies the above equation.

Proof

Since $S$ spans $V$, by the definition of spanning every vector in $V$ can be expressed as a linear combination of the elements of $S$. Suppose some vector $\mathbf{v}$ admits the following two such representations.

$$ \begin{align*} \mathbf{v} &= c_{1}\mathbf{v}_{1} + c_{2}\mathbf{v}_{2} + \cdots + c_{n}\mathbf{v}_{n} \\ \mathbf{v} &= k_{1}\mathbf{v}_{1} + k_{2}\mathbf{v}_{2} + \cdots + k_{n}\mathbf{v}_{n} \end{align*} $$

Subtracting the second equation from the first yields the following.

$$ \mathbf{0} = (c_{1} - k_{1}) \mathbf{v}_{1} + (c_{2} - k_{2}) \mathbf{v}_{2} + \cdots + (c_{n} - k_{n}) \mathbf{v}_{n} $$

But since $\mathbf{v}_{1}, \mathbf{v}_{2}, \dots, \mathbf{v}_{n}$ are linearly independent, the only solution of the above equation is

$$ c_{1} - k_{1} = 0,\quad c_{2} - k_{2} = 0,\quad \dots,\quad c_{n} - k_{n} = 0 $$

Therefore, the following holds.

$$ c_{1} = k_{1},\quad c_{2} = k_{2},\quad \dots,\quad c_{n} = k_{n} $$

Hence, the two linear combination representations are identical.


  1. Howard Anton, Elementary Linear Algebra: Applications Version (12th Edition, 2019), p240 ↩︎