Eigenvalues and Eigenvectors of Finite-Dimensional Linear Transformations
Definition¹
Let $V$ be a finite-dimensional vector space over a field $F$, and let $T : V \to V$ be a linear transformation. A non-zero vector $x \in V$ satisfying $$ Tx = \lambda x $$ for some $\lambda \in F$ is called an eigenvector of $T$.
The scalar $\lambda \in F$ is called the eigenvalue corresponding to the eigenvector $x$.
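As a concrete illustration, the following is a minimal numerical sketch using NumPy; the matrix $A$, representing a hypothetical transformation on $\mathbb{R}^{2}$ in the standard basis, is chosen for this example and does not come from the source text. It checks the defining relation $Tx = \lambda x$ for each eigenpair.

```python
import numpy as np

# Hypothetical example (not from the source): T(x, y) = (3x + y, 2y),
# represented in the standard basis of R^2 by the matrix A.
A = np.array([[3.0, 1.0],
              [0.0, 2.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose columns are
# the corresponding (unit-norm) eigenvectors.
eigenvalues, eigenvectors = np.linalg.eig(A)

for lam, x in zip(eigenvalues, eigenvectors.T):
    # Verify the defining relation T x = lambda x for each pair.
    assert np.allclose(A @ x, lam * x)
    print(f"eigenvalue {lam:.1f} with eigenvector {x}")
```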
Explanation
In some texts the term eigenvector is replaced by characteristic vector or proper vector, and eigenvalue by characteristic value or proper value, although the author has never actually encountered these terms in use.
Eigenvalues and eigenvectors are related to the diagonalization of linear transformations.
Theorem
A linear transformation $T : V \to V$ on an $n$-dimensional vector space $V$ is diagonalizable if and only if there exists an ordered basis $\beta$ of $V$ consisting of eigenvectors of $T$. In other words, $T$ is diagonalizable if and only if there exist $n$ linearly independent eigenvectors of $T$.
Moreover, if $T$ is diagonalizable, $\beta = \left\{ v_{1}, \dots, v_{n} \right\}$ is an ordered basis of eigenvectors of $T$, and $D = \left[ T \right]_{\beta}$, then $D$ is a diagonal matrix and $D_{jj}$ is the eigenvalue corresponding to $v_{j}$ for $1 \le j \le n$.
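To connect the theorem with a calculation, here is a short sketch, again using the hypothetical matrix $A$ from the example above: taking the eigenvectors as the columns of $P$ (the change of basis to $\beta$) and forming $P^{-1} A P$ gives a diagonal matrix whose diagonal entries are the corresponding eigenvalues.

```python
import numpy as np

# Same hypothetical matrix A as above; its eigenvectors form a basis of R^2,
# so the transformation it represents is diagonalizable.
A = np.array([[3.0, 1.0],
              [0.0, 2.0]])

eigenvalues, P = np.linalg.eig(A)   # columns of P: the eigenvector basis beta

# Changing basis to beta: D = [T]_beta = P^{-1} A P should be diagonal,
# with D_jj equal to the eigenvalue of the j-th basis vector.
D = np.linalg.inv(P) @ A @ P

assert np.allclose(D, np.diag(eigenvalues))
print(np.round(D, 10))
```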
See Also
Stephen H. Friedberg, Linear Algebra (4th Edition, 2002), pp. 245–264