Dimension of the Vector Space

Definition¹

The number of elements (vectors) of a basis for a vector space $V$ is defined as the dimension of $V$ and is denoted as follows.

$$ \dim (V) $$

Explanation

This generalization of dimension is not just an exercise in studying vector spaces; it underlies many of the technologies around us. It might seem pointless to consider anything beyond the $3$ dimensions of our world, or the $4$ dimensions we cannot even draw, but that only appears so if Euclidean space is taken to be the only kind of vector space. For example, a record in a statistical dataset can be viewed as a vector: if a person named ‘Adam’ has a height of 175, a weight of 62, an age of 22, an IQ of 103, and a vision of 1.2, this can be written as ‘Adam = (175, 62, 22, 103, 1.2)’. Even such simple data already occupies $5$ dimensions, so restricting ourselves to $3$ or $4$ dimensions would be severely limiting.
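
In this view, ‘Adam’ is simply a vector in $\mathbb{R}^{5}$, and the dimension of the subspace spanned by a collection of such vectors equals the number of elements in any basis of that subspace, which can be computed as a matrix rank. Below is a minimal sketch in Python, assuming NumPy is available; the helper name `dim_of_span` and the sample records are made up for illustration.

```python
import numpy as np

def dim_of_span(vectors):
    """Dimension of the subspace spanned by the given vectors,
    i.e. the number of elements in any basis of that subspace,
    computed as the rank of the matrix whose rows are the vectors."""
    return np.linalg.matrix_rank(np.array(vectors, dtype=float))

# Each record (height, weight, age, IQ, vision) is a vector in R^5.
adam = [175, 62, 22, 103, 1.2]
records = [adam, [168, 55, 31, 98, 0.8], [182, 80, 27, 110, 1.5]]

print(dim_of_span(records))                            # 3: the records are independent
print(dim_of_span([[1, 0, 0], [0, 1, 0], [1, 1, 0]]))  # 2: the third row is dependent
```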

On the other hand, since the basis of a vector space is not unique, the above definition is well defined only if every basis has the same number of elements. The following two theorems show that all bases of a finite-dimensional vector space indeed have the same number of vectors.

Theorems

Let $S = \left\{ \mathbf{v}_{1}, \mathbf{v}_{2}, \dots, \mathbf{v}_{n} \right\}$ be any basis of the vector space $V$.

(a) A subset of $V$ that has more vectors than the basis is linearly dependent.

(b) A subset of $V$ that has fewer vectors than the basis cannot span $V$.
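
For a concrete illustration, take $V = \mathbb{R}^{2}$ with the standard basis $S = \left\{ (1,0),\ (0,1) \right\}$, so $n = 2$. A set with more vectors, such as $\left\{ (1,0),\ (0,1),\ (1,1) \right\}$, is linearly dependent, since $(1,1) = (1,0) + (0,1)$, while a set with fewer vectors, such as $\left\{ (1,1) \right\}$, cannot span $\mathbb{R}^{2}$, since for example $(1,0)$ is not a scalar multiple of $(1,1)$.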

Proof²

(a)

Let $W=\left\{ \mathbf{w}_{1},\ \mathbf{w}_{2},\ \cdots ,\ \mathbf{w}_{m} \right\} \subset V$ be a subset with $m > n$. Since $S$ is a basis of $V$, each element of $W$ can be expressed as a linear combination of the vectors of $S$.

$$ \begin{equation} \begin{aligned} \mathbf{w}_{1} &= a_{11}\mathbf{v}_{1}+a_{21}\mathbf{v}_{2} + \cdots + a_{n1}\mathbf{v}_{n}=\sum \limits _{i=1}^{n} a_{i1}\mathbf{v}_{i} \\ \mathbf{w}_{2} &= a_{12}\mathbf{v}_{1}+a_{22}\mathbf{v}_{2} + \cdots + a_{n2}\mathbf{v}_{n}=\sum \limits _{i=1}^{n} a_{i2}\mathbf{v}_{i} \\ & \vdots \\ \mathbf{w}_{m} &= a_{1m}\mathbf{v}_{1}+a_{2m}\mathbf{v}_{2} + \cdots + a_{nm}\mathbf{v}_{n}=\sum \limits _{i=1}^{n} a_{im}\mathbf{v}_{i} \end{aligned} \label{wlincom1} \end{equation} $$

To show that $W$ is linearly dependent,

$$ \begin{equation} k_{1}\mathbf{w}_{1} + k_2\mathbf{w}_{2} + \cdots + k_{m}\mathbf{w}_{m}= \mathbf{0} \label{wlincom2} \end{equation} $$

it suffices to show that there exists $(k_{1},k_{2},\dots,k_{m}) \ne (0,0,\dots,0)$ that satisfies the equation. By substituting $(1)$ into $(2)$, we get the following.

$$ \begin{align*} &k_{1}(a_{11}\mathbf{v}_{1} + a_{21}\mathbf{v}_{2} + \cdots + a_{n1}\mathbf{v}_{n}) \\ + &k_2(a_{12}\mathbf{v}_{1} + a_{22}\mathbf{v}_{2} + \cdots + a_{n2}\mathbf{v}_{n}) \\ + &\cdots \\ + &k_{m}(a_{1m}\mathbf{v}_{1} + a_{2m}\mathbf{v}_{2} + \cdots + a_{nm}\mathbf{v}_{n}) = \mathbf{0} \end{align*} $$

Arranging this for $\mathbf{v}_{i}$ gives the following.

$$ \left( \sum \limits _{j=1} ^{m} k_{j}a_{1j} \right)\mathbf{v}_{1} + \left( \sum \limits _{j=1} ^{m} k_{j}a_{2j} \right)\mathbf{v}_{2} + \cdots + \left( \sum \limits _{j=1} ^{m} k_{j}a_{nj} \right)\mathbf{v}_{n} = \mathbf{0} $$

Since $S$ is a basis of $V$, its vectors are linearly independent, so the above equation holds only when every coefficient is $0$. Hence $(2)$ is equivalent to the following homogeneous system.

$$ \begin{align*} a_{11}k_{1} + a_{12}k_{2} + \cdots + a_{1m}k_{m} = 0 \\ a_{21}k_{1} + a_{22}k_{2} + \cdots + a_{2m}k_{m} = 0 \\ \vdots \\ a_{n1}k_{1} + a_{n2}k_{2} + \cdots + a_{nm}k_{m} = 0 \end{align*} $$

This system has $n$ equations in the $m$ unknowns $k_{1},\dots,k_{m}$. Since $m > n$, there are more unknowns than equations, so the homogeneous system has infinitely many solutions and in particular a nontrivial one: some $(k_{1},\dots,k_{m}) \ne (0,\dots,0)$ satisfies it. Hence $W$ is linearly dependent. The same argument applies to any subset of $V$ with more elements than the basis.
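
This argument can be checked numerically in the special case $V = \mathbb{R}^{n}$ with the standard basis: any $m > n$ vectors give an underdetermined homogeneous system, and a nontrivial solution $k$ is an explicit dependence relation among them. Below is a minimal sketch in Python, assuming NumPy; the random vectors are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 3, 5                      # m > n: more vectors than dim(R^n)
W = rng.normal(size=(m, n))      # rows are the vectors w_1, ..., w_m in R^n

# The system above has the matrix form A k = 0, where the columns of A
# are the coordinate vectors of the w_j: n equations, m unknowns.
A = W.T                          # shape (n, m)

# With m > n the null space of A is nontrivial; the last right-singular
# vector from the SVD always lies in it.
_, _, Vt = np.linalg.svd(A)
k = Vt[-1]

print(np.allclose(A @ k, 0))                # True: k_1 w_1 + ... + k_m w_m = 0
print(np.isclose(np.linalg.norm(k), 1.0))   # True: k is a unit vector, so k != 0
```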

(b)

The proof is by contradiction.

Let $W=\left\{ \mathbf{w}_{1},\ \mathbf{w}_{2},\ \cdots ,\ \mathbf{w}_{m} \right\} \subset V$ be a subset with $m < n$, and suppose that $W$ spans $V$. Then every vector of $V$, and in particular each basis vector in $S$, can be expressed as a linear combination of the vectors of $W$.

$$ \begin{equation} \begin{aligned} \mathbf{v}_{1} &= a_{11}\mathbf{w}_{1}+a_{21}\mathbf{w}_{2} + \cdots + a_{m1}\mathbf{w}_{m} \\ \mathbf{v}_{2} &= a_{12}\mathbf{w}_{1}+a_{22}\mathbf{w}_{2} + \cdots + a_{m2}\mathbf{w}_{m} \\ & \vdots \\ \mathbf{v}_{n} &= a_{1n}\mathbf{w}_{1}+a_{2n}\mathbf{w}_{2} + \cdots + a_{mn}\mathbf{w}_{m} \end{aligned} \label{vlincom1} \end{equation} $$

From this we will derive the contradiction that $\left\{ \mathbf{v}_{1}, \mathbf{v}_{2}, \dots, \mathbf{v}_{n} \right\}$ is linearly dependent. Consider the following homogeneous equation.

$$ k_{1}\mathbf{v}_{1} + k_2\mathbf{v}_{2} + \cdots + k_{n}\mathbf{v}_{n}= \mathbf{0} $$

Substituting $(3)$ into it, we get the following.

$$ \begin{align*} &k_{1}(a_{11}\mathbf{w}_{1} + a_{21}\mathbf{w}_{2} + \cdots + a_{m1}\mathbf{w}_{m}) \\ + &k_2(a_{12}\mathbf{w}_{1} + a_{22}\mathbf{w}_{2} + \cdots + a_{m2}\mathbf{w}_{m}) \\ + &\cdots \\ + &k_{n}(a_{1n}\mathbf{w}_{1} + a_{2n}\mathbf{w}_{2} + \cdots + a_{mn}\mathbf{w}_{m}) = \mathbf{0} \end{align*} $$

Arranging this for $\mathbf{w}_{i}$ gives the following.

$$ \left( \sum \limits _{j=1} ^{n} k_{j}a_{1j} \right)\mathbf{w}_{1} + \left( \sum \limits _{j=1} ^{n} k_{j}a_{2j} \right)\mathbf{w}_{2} + \cdots + \left( \sum \limits _{j=1} ^{n} k_{j}a_{mj} \right)\mathbf{w}_{m} = \mathbf{0} $$

If the $k_{j}$ can be chosen so that the coefficient of every $\mathbf{w}_{i}$ vanishes, the equation above is satisfied. This amounts to the following homogeneous linear system in the unknowns $k_{1},\dots,k_{n}$.

$$ \begin{align*} a_{11}k_{1} + a_{12}k_{2} + \cdots + a_{1n}k_{n} = 0 \\ a_{21}k_{1} + a_{22}k_{2} + \cdots + a_{2n}k_{n} = 0 \\ \vdots \\ a_{m1}k_{1} + a_{m2}k_{2} + \cdots + a_{mn}k_{n} = 0 \end{align*} $$

Since this system has $m$ equations in $n$ unknowns and $m < n$, it has a nontrivial solution $(k_{1},\dots,k_{n}) \ne (0,\dots,0)$. With such coefficients, $k_{1}\mathbf{v}_{1} + k_{2}\mathbf{v}_{2} + \cdots + k_{n}\mathbf{v}_{n} = \mathbf{0}$, so $S = \left\{ \mathbf{v}_{1}, \mathbf{v}_{2}, \dots, \mathbf{v}_{n} \right\}$ would be linearly dependent. This contradicts the fact that $S$, being a basis, is linearly independent, so the assumption fails. Thus $W$ cannot span $V$.
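
Theorem (b) can likewise be illustrated numerically for $V = \mathbb{R}^{n}$: $m < n$ vectors span a subspace of dimension at most $m$, so a generic vector lies outside their span. Below is a minimal sketch in Python, again assuming NumPy; the random vectors and the target are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 4, 2                        # m < n: fewer vectors than dim(R^n)
W = rng.normal(size=(m, n))        # rows are the vectors w_1, ..., w_m in R^n

# The span of the w_j has dimension at most m < n, so it cannot be all of R^n.
print(np.linalg.matrix_rank(W))    # at most 2

# A generic target vector therefore lies outside the span: the best
# least-squares combination of the w_j leaves a nonzero residual.
target = rng.normal(size=n)
coeffs, *_ = np.linalg.lstsq(W.T, target, rcond=None)
residual = np.linalg.norm(W.T @ coeffs - target)
print(residual > 1e-8)             # True: target is not a combination of the w_j
```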


  1. Howard Anton, Elementary Linear Algebra: Applications Version (12th Edition, 2019), p248 ↩︎

  2. Howard Anton, Elementary Linear Algebra: Applications Version (12th Edition, 2019), p252-253 ↩︎