Linear Transformation Space
Definition
The set of all linear transformations from the vector space V to W is denoted as L(V,W).
$$L(V,W) := \left\{ T : V \to W \mid T \text{ is linear} \right\}$$
This is also expressed as follows, referred to as the homomorphism space.
$$\operatorname{Hom}(V,W) = L(V,W) = \left\{ T : V \to W \mid T \text{ is linear} \right\}$$
Additionally, when W=V, it is also denoted as follows, referred to as the endomorphism space.
$$\operatorname{End}(V) = \operatorname{Hom}(V,V) = \operatorname{Hom}(V) = L(V,V) = L(V)$$
The notations $L(V,W)$ and $\operatorname{Hom}(V,W)$ are the ones mainly used.
Explanation
A homomorphism is, as the word suggests, almost an isomorphism: it satisfies the same structure-preserving condition, with only the requirement of invertibility omitted.
Let $T, U \in L(V,W)$ and $a \in \mathbb{R}$. The sum $T + U$ of two linear transformations and the scalar multiple $aT$ of a linear transformation are defined as follows.

$$(T+U)(x) = T(x) + U(x) \quad \text{and} \quad (aT)(x) = aT(x) \qquad \text{for } x \in V,\ a \in \mathbb{R}$$
Then $aT + U$ is again a linear transformation in $L(V,W)$ (that is, $L(V,W)$ is closed under both operations), and $L(V,W)$ becomes a vector space under these operations.
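This closure can be checked numerically. A minimal sketch in NumPy: the matrices $A$, $B$ and the scalars below are arbitrary illustrative choices, with $T, U : \mathbb{R}^3 \to \mathbb{R}^2$ given by matrix multiplication.

```python
import numpy as np

# Two concrete linear maps T, U : R^3 -> R^2, given by matrices
# (A and B are arbitrary illustrative choices).
A = np.array([[1.0, 2.0, 0.0],
              [0.0, -1.0, 3.0]])
B = np.array([[2.0, 0.0, 1.0],
              [1.0, 1.0, -1.0]])

T = lambda x: A @ x
U = lambda x: B @ x
a = 2.5

# (aT + U)(x) is defined pointwise, exactly as in the text
S = lambda x: a * T(x) + U(x)

# Check that aT + U is itself linear on random inputs
rng = np.random.default_rng(0)
x, y = rng.standard_normal(3), rng.standard_normal(3)
c = 1.7
assert np.allclose(S(c * x + y), c * S(x) + S(y))
```

As expected, the pointwise combination agrees with the map given by the matrix $aA + B$, which is what makes $L(V,W)$ closed under both operations.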
Basis
Let $V, W$ be vector spaces of dimensions $n, m$, respectively, and let $\beta_V = \{v_1, \dots, v_n\}$ and $\beta_W = \{w_1, \dots, w_m\}$ be bases of $V, W$, respectively. Define the linear transformation $\phi_{ij}$ as follows.

$$\phi_{ij} : V \to W,\qquad \phi_{ij}(v_{j'}) = \begin{cases} w_i & \text{if } j' = j \\ 0 & \text{if } j' \ne j \end{cases} \tag{1}$$
Then the set $\{\phi_{ij} : 1 \le i \le m,\ 1 \le j \le n\}$ is a basis of $L(V,W)$.
Linear Independence
$$\sum_{i,j} \lambda_{ij} \phi_{ij} = 0 \implies \lambda_{ij} = 0 \quad \forall\, i, j$$
Showing linear independence means showing the implication above. The zero vector of the vector space $L(V,W)$ is the zero transformation $T_0$, so suppose $\sum_{i,j} \lambda_{ij} \phi_{ij} = T_0$. Substituting $v_1$ into both sides gives the following.
$$\sum_{i,j} \lambda_{ij} \phi_{ij}(v_1) = \sum_i \lambda_{i1} w_i = T_0(v_1) = 0$$

$$\implies \sum_i \lambda_{i1} w_i = 0$$
Since $\{w_i\}$ is a basis, it follows that $\lambda_{i1} = 0$ for all $i$. Similarly, substituting each of the $v_j$ into both sides of $\sum_{i,j} \lambda_{ij} \phi_{ij} = T_0$ gives the following result.
$$\lambda_{ij} = 0 \quad \forall\, i, j$$
Therefore, $\{\phi_{ij}\}$ is linearly independent.
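In coordinates, each $\phi_{ij}$ corresponds to the $m \times n$ matrix unit $E_{ij}$ with a $1$ in entry $(i,j)$ and $0$ elsewhere. A small sketch (the dimensions $m = 2$, $n = 3$ are arbitrary illustrative choices) that confirms their independence by flattening each unit into a vector and checking the rank:

```python
import numpy as np

m, n = 2, 3  # dim W = m, dim V = n (illustrative choices)

# The matrix of phi_ij is the matrix unit E_ij: it sends v_j to w_i
# and every other basis vector v_j' to 0.
units = []
for i in range(m):
    for j in range(n):
        E = np.zeros((m, n))
        E[i, j] = 1.0
        units.append(E)

# Stack the flattened units as rows; full rank m*n means the
# mn transformations are linearly independent.
M = np.stack([E.ravel() for E in units])
assert np.linalg.matrix_rank(M) == m * n
```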
Span
It remains to show that any $T \in L(V,W)$ can be expressed as a linear combination of the $\phi_{ij}$. Suppose $T$ maps the basis $\beta_V$ as follows.

$$T(v_j) = \sum_{i=1}^{m} b_{ij} w_i$$
Then, by $(1)$, $w_i = \sum_{k=1}^{n} \phi_{ik}(v_j)$, so for $x = \sum_{j=1}^{n} a_j v_j \in V$, $T(x)$ is as follows.

$$T(x) = T\Big(\sum_j a_j v_j\Big) = \sum_j a_j T(v_j) = \sum_j a_j \sum_i b_{ij} w_i = \sum_{i,j} a_j b_{ij} w_i = \sum_{i,j} a_j b_{ij} \sum_k \phi_{ik}(v_j)$$
But here, $\phi_{ik}(v_j) \ne 0$ forces $k = j$, so for a fixed $j$ the following holds.

$$b_{ij} \sum_k \phi_{ik}(v_j) = \sum_k b_{ik} \phi_{ik}(v_j)$$
Therefore,
$$T(x) = \sum_{i,j} a_j \sum_k b_{ik} \phi_{ik}(v_j) = \sum_{i,j,k} b_{ik} \phi_{ik}(a_j v_j) = \sum_{i,k} b_{ik} \phi_{ik}(x) = \sum_{i,j} b_{ij} \phi_{ij}(x) = \Big( \sum_{i,j} b_{ij} \phi_{ij} \Big)(x)$$

$$\implies T = \sum_{i,j} b_{ij} \phi_{ij}$$
Since $\{\phi_{ij}\}$ spans $L(V,W)$, it is a basis. Moreover, $b_{ij}$ here is the $(i,j)$ entry of the matrix representation $[T]_{\beta_V}^{\beta_W}$ of $T$.

$$[T]_{\beta_V}^{\beta_W} = [b_{ij}]$$
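The recipe $T(v_j) = \sum_i b_{ij} w_i$ says the $j$-th column of $[T]$ holds the $\beta_W$-coordinates of $T(v_j)$. A concrete sketch, assuming $V = \mathbb{R}^3$, $W = \mathbb{R}^2$ with standard bases and an arbitrary illustrative $T$:

```python
import numpy as np

# An illustrative linear map T : R^3 -> R^2 (standard bases on both sides).
def T(x):
    return np.array([x[0] + 2 * x[1], 3 * x[2] - x[1]])

n = 3
basis_V = np.eye(n)  # v_1, v_2, v_3 = standard basis of R^3

# Column j of [T] is the coordinate vector of T(v_j); with the standard
# basis of W those coordinates are just T(v_j) itself.
mat = np.column_stack([T(v) for v in basis_V])

# The matrix representation reproduces T: T(x) = [T] @ x for any x.
x = np.array([1.0, -2.0, 0.5])
assert np.allclose(T(x), mat @ x)
```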
Representation with respect to Dual Basis
The content is much easier to understand when expressed in terms of dual bases. Let $\{v_1^{\ast}, \dots, v_n^{\ast}\}$ be the dual basis of $\beta_V$. Then we may consider the following linear transformation corresponding to $w_i$ and $v_j^{\ast}$.

$$w_i v_j^{\ast} : V \to W,\qquad x \mapsto v_j^{\ast}(x) w_i \qquad \forall\, i, j$$
This is a rank-$1$ linear transformation: for a fixed $i$ it maps everything to a scalar multiple of $w_i$, which is essentially the same as the definition in $(1)$. Since $\{w_i\}$ is a basis, it is immediate that $\{w_i v_j^{\ast}\}$ is linearly independent in the index $i$, and checking $\sum_j \lambda_{ij} w_i v_j^{\ast}(v_{j'}) = 0$ in the manner shown above shows independence in the index $j$ as well. It is also easy to show that any $T \in L(V,W)$ can be expressed as a linear combination of $\{w_i v_j^{\ast}\}$, the coefficients being the entries $b_{ij}$ of the matrix representation of $T$. For $x = \sum_j x_j v_j \in V$ and $T(v_j) = \sum_i b_{ij} w_i$,
$$T(x) = T\Big(\sum_j x_j v_j\Big) = \sum_j x_j T(v_j) = \sum_j x_j \sum_i b_{ij} w_i = \sum_{i,j} b_{ij} x_j w_i = \sum_{i,j} b_{ij} v_j^{\ast}(x) w_i = \sum_{i,j} b_{ij} w_i v_j^{\ast}(x) = \Big( \sum_{i,j} b_{ij} w_i v_j^{\ast} \Big)(x)$$

$$\implies T = \sum_{i,j} b_{ij} w_i v_j^{\ast}$$
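With standard bases, $v_j^{\ast}$ reads off the $j$-th coordinate of $x$, and $w_i v_j^{\ast}$ becomes the rank-$1$ outer product $e_i e_j^{\mathsf T}$. A sketch of the decomposition above (the matrix $B$ of entries $b_{ij}$ is an arbitrary illustrative choice):

```python
import numpy as np

m, n = 2, 3
B = np.array([[1.0, 2.0, 0.0],
              [0.0, -1.0, 3.0]])  # entries b_ij of [T] (illustrative)

e_W = np.eye(m)  # coordinates of w_i (standard basis of W)
e_V = np.eye(n)  # coordinates of v_j; v_j* picks out the j-th coordinate

# In coordinates, w_i v_j* is the rank-1 outer product e_i e_j^T,
# and summing b_ij * (w_i v_j*) reconstructs the matrix of T.
recon = sum(B[i, j] * np.outer(e_W[i], e_V[j])
            for i in range(m) for j in range(n))

assert np.allclose(recon, B)
assert np.linalg.matrix_rank(np.outer(e_W[0], e_V[0])) == 1
```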