
Linear Combination, Span

Definition: Linear Combination¹

Let $\mathbf{w}$ be a vector in the vector space $V$. If $\mathbf{w}$ can be expressed as follows for vectors $\mathbf{v}_{1}, \mathbf{v}_{2}, \cdots, \mathbf{v}_{r}$ in $V$ and arbitrary constants $k_{1}, k_{2}, \cdots, k_{r}$, then $\mathbf{w}$ is called a linear combination of $\mathbf{v}_{1}, \mathbf{v}_{2}, \cdots, \mathbf{v}_{r}$.

$$ \mathbf{w} = k_{1}\mathbf{v}_{1} + k_{2}\mathbf{v}_{2} + \cdots + k_{r}\mathbf{v}_{r} $$

Additionally, in this case, the constants $k_{1}, k_{2}, \cdots, k_{r}$ are referred to as the coefficients of the linear combination $\mathbf{w}$.

Explanation

Though it might seem unfamiliar presented in a formulaic manner, it is not a complex concept. The representation of vectors in a two-dimensional Cartesian coordinate system is precisely a linear combination of the two unit vectors $\hat{\mathbf{x}} = (1,0)$ and $\hat{\mathbf{y}} = (0,1)$.

$$ \mathbf{v} = (v_{1}, v_{2}) = (v_{1}, 0) + (0, v_{2}) = v_{1}(1,0) + v_{2}(0,1) = v_{1}\hat{\mathbf{x}} + v_{2}\hat{\mathbf{y}} $$

Theorem

Let $S = \left\{ \mathbf{w}_{1}, \mathbf{w}_{2}, \dots, \mathbf{w}_{r} \right\}$ be a non-empty subset of the vector space $V$. Then the following hold.

(a) Let $W$ be the set of all possible linear combinations of elements of $S$. Then $W$ is a subspace of $V$.

(b) The $W$ from (a) is the smallest subspace of $V$ that includes $S$. That is, if $W^{\prime}$ is a subspace of $V$ that includes $S$, then the following holds.

$$ S \subset W \le W^{\prime} $$

Proof

(a)

To demonstrate that $W$ is closed under addition and scalar multiplication, one can apply the subspace test. Note that $W$ is non-empty, since each $\mathbf{w}_{i} = 1 \cdot \mathbf{w}_{i}$ is itself a linear combination and therefore lies in $W$. Let $\mathbf{u}, \mathbf{v} \in W$ be arbitrary, say

$$ \mathbf{u} = c_{1} \mathbf{w}_{1} + c_{2} \mathbf{w}_{2} + \cdots + c_{r} \mathbf{w}_{r}, \quad \mathbf{v} = k_{1} \mathbf{w}_{1} + k_{2} \mathbf{w}_{2} + \cdots + k_{r} \mathbf{w}_{r} $$

  • (A1)

    $\mathbf{u}+\mathbf{v}$ is as follows.

    $$ \mathbf{u} + \mathbf{v} = (c_{1} + k_{1})\mathbf{w}_{1} + (c_{2} + k_{2})\mathbf{w}_{2} + \cdots + (c_{r} + k_{r})\mathbf{w}_{r} $$

    Since this is a linear combination of $\mathbf{w}_{1}, \mathbf{w}_{2}, \dots, \mathbf{w}_{r}$, we have $\mathbf{u} + \mathbf{v} \in W$.

  • (M1)

    For any constant $k$, $k\mathbf{u}$ is as follows.

    $$ k\mathbf{u} = (k c_{1})\mathbf{w}_{1} + (k c_{2})\mathbf{w}_{2} + \cdots + (k c_{r})\mathbf{w}_{r} $$

    Since this is a linear combination of $\mathbf{w}_{1}, \mathbf{w}_{2}, \dots, \mathbf{w}_{r}$, we have $k\mathbf{u} \in W$.

  • Conclusion

    Since $W$ is closed under addition and scalar multiplication, by the subspace test, $W$ is a subspace of $V$.

    $$ W \le V $$

(b)

Note first that $S \subset W$, since each $\mathbf{w}_{i} = 1 \cdot \mathbf{w}_{i}$ is a linear combination of elements of $S$. Now assume $W^{\prime}$ is a subspace of $V$ that includes $S$. Since $W^{\prime}$ is closed under addition and scalar multiplication, all linear combinations of elements of $S$ are elements of $W^{\prime}$. Therefore,

$$ W \le W^{\prime} $$

Definition: Span

The $W$ in the theorem is referred to as the subspace of $V$ spanned by $S$. Furthermore, it is said that the vectors $\mathbf{w}_{1}, \mathbf{w}_{2}, \dots, \mathbf{w}_{r}$ span $W$, which is denoted as follows.

$$ W = \text{span}\left\{ \mathbf{w}_{1}, \mathbf{w}_{2}, \dots, \mathbf{w}_{r} \right\} \quad \text{or} \quad W = \text{span}(S) $$

Explanation

The concept of span is needed when contemplating the smallest set that contains certain elements, which is precisely what the theorem above states. Additionally, eliminating all redundant (linearly dependent) elements from $S$ yields a basis of the vector space it spans.
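Membership in a span can be tested numerically: $\mathbf{u} \in \operatorname{span}(S)$ exactly when appending $\mathbf{u}$ as a column does not increase the rank of the matrix whose columns are the vectors of $S$. A minimal sketch with NumPy (the vectors and the helper `in_span` are illustrative choices, not standard library API):

```python
import numpy as np

# Spanning set S = {w1, w2} in R^3 (illustrative choice)
w1 = np.array([1.0, 0.0, 1.0])
w2 = np.array([0.0, 1.0, 1.0])
A = np.column_stack([w1, w2])

def in_span(A, u, tol=1e-10):
    """u lies in the column span of A iff appending u does not raise the rank."""
    return np.linalg.matrix_rank(np.column_stack([A, u]), tol=tol) \
        == np.linalg.matrix_rank(A, tol=tol)

u = 2 * w1 - 3 * w2            # a linear combination, so it must lie in span(S)
v = np.array([0.0, 0.0, 1.0])  # (a, b, a+b) = (0, 0, 1) has no solution

print(in_span(A, u))  # True
print(in_span(A, v))  # False
```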

Theorem

Let $S = \left\{ \mathbf{v}_{1}, \mathbf{v}_{2}, \dots, \mathbf{v}_{r} \right\}$ and $S^{\prime} = \left\{ \mathbf{w}_{1}, \mathbf{w}_{2}, \dots, \mathbf{w}_{k} \right\}$ be non-empty subsets of the vector space $V$. Then,

$$ \text{span} \left\{ \mathbf{v}_{1}, \mathbf{v}_{2}, \dots, \mathbf{v}_{r} \right\} = \text{span} \left\{ \mathbf{w}_{1}, \mathbf{w}_{2}, \dots, \mathbf{w}_{k} \right\} $$

holds if and only if every vector of $S$ can be expressed as a linear combination of vectors of $S^{\prime}$, and every vector of $S^{\prime}$ can be expressed as a linear combination of vectors of $S$.
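This mutual-expressibility condition can also be checked numerically: the column spans of two matrices coincide exactly when $\operatorname{rank}(V) = \operatorname{rank}(W) = \operatorname{rank}([V \mid W])$, since equal ranks force each span to contain the other. A sketch under that rank criterion (the helper `same_span` and the example sets are illustrative):

```python
import numpy as np

def same_span(V, W, tol=1e-10):
    """Columns of V and W span the same subspace iff
    rank(V) == rank(W) == rank([V | W])."""
    rV = np.linalg.matrix_rank(V, tol=tol)
    rW = np.linalg.matrix_rank(W, tol=tol)
    rVW = np.linalg.matrix_rank(np.hstack([V, W]), tol=tol)
    return rV == rW == rVW

# S = {(1,0), (0,1)} and S' = {(1,1), (1,-1)} both span R^2
V = np.array([[1.0, 0.0], [0.0, 1.0]])
W = np.array([[1.0, 1.0], [1.0, -1.0]])
print(same_span(V, W))  # True

# {(1,0)} spans only the x-axis, a proper subspace of R^2
print(same_span(V, np.array([[1.0], [0.0]])))  # False
```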


  1. Howard Anton, *Elementary Linear Algebra: Applications Version* (12th Edition, 2019), pp. 220–222