Eigenvalues and Eigenvectors
Definition
Let $A$ be an $n \times n$ matrix. A scalar $\lambda$ is an eigenvalue of $A$ if there exists a nonzero column vector $v$ — called an eigenvector — such that

$$A v = \lambda v.$$
Multiplication by $A$ leaves the direction of $v$ unchanged (or reverses it), scaling it by $\lambda$. If $v$ is an eigenvector then so is any nonzero scalar multiple $cv$; eigenvectors are therefore only determined up to scale.
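The defining equation is easy to check numerically. A minimal sketch with NumPy, using a hypothetical symmetric matrix whose eigenpair $(\lambda = 3,\; v = (1, 1))$ can be verified by hand:

```python
import numpy as np

# Hypothetical example: A has eigenvalue 3 with eigenvector [1, 1].
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
v = np.array([1.0, 1.0])
lam = 3.0

# A v should equal lambda * v.
print(np.allclose(A @ v, lam * v))               # True

# Any nonzero scalar multiple of v is also an eigenvector.
print(np.allclose(A @ (5 * v), lam * (5 * v)))   # True
```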
Intuition
Eigenvectors are the special directions in space that a linear transformation does not rotate — it only stretches or compresses them (and possibly flips them if $\lambda < 0$). Every other vector gets both rotated and scaled, but eigenvectors stay on their own lines through the origin. The eigenvalue $\lambda$ tells you exactly how much stretching occurs along that direction. A transformation can be fully understood once all its eigendirections and associated scales are known.
Formal Description
Characteristic equation. Rewriting the eigenvalue equation using the identity matrix $I$,

$$(A - \lambda I)\, v = 0.$$
For nonzero solutions $v$ to exist, the matrix $A - \lambda I$ must be singular:

$$\det(A - \lambda I) = 0.$$
This is the characteristic equation of $A$. By the Leibniz formula it is a degree-$n$ polynomial in $\lambda$, so an $n \times n$ matrix has exactly $n$ eigenvalues counted with multiplicity (over $\mathbb{C}$). Once an eigenvalue $\lambda$ is found, the corresponding eigenvector is obtained by solving $(A - \lambda I)\, v = 0$.
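The two steps — find roots of the characteristic polynomial, then solve for the eigenvector — can be sketched directly. The matrix below is a hypothetical example chosen so the eigenvalues come out to 2 and 5; the null space of $A - \lambda I$ is recovered here via SVD, one common numerical approach:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Step 1: eigenvalues as roots of det(A - lambda*I) = 0.
coeffs = np.poly(A)          # characteristic polynomial coefficients
eigvals = np.roots(coeffs)   # roots: 5 and 2 for this A
print(sorted(eigvals.real))

# Step 2: eigenvector for lambda = 5, i.e. a null vector of (A - 5I).
lam = 5.0
_, _, Vt = np.linalg.svd(A - lam * np.eye(2))
v = Vt[-1]                   # right singular vector for singular value ~0
print(np.allclose(A @ v, lam * v))   # True
```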
2×2 case. For $A = \begin{pmatrix} a & b \\ c & d \end{pmatrix}$ the characteristic equation is

$$\lambda^2 - \tau\lambda + \Delta = 0,$$

where $\tau = a + d$ is the trace and $\Delta = ad - bc$ is the determinant. The discriminant $\tau^2 - 4\Delta$ determines the nature of the roots.
Distinct real eigenvalues ($\tau^2 - 4\Delta > 0$). The two roots are real. Eigenvectors corresponding to distinct eigenvalues are linearly independent, so $A$ is diagonalisable over $\mathbb{R}$.
Complex conjugate eigenvalues ($\tau^2 - 4\Delta < 0$). The roots form a conjugate pair $\lambda_{1,2} = \alpha \pm i\beta$ with $\alpha = \tau/2$, $\beta = \tfrac{1}{2}\sqrt{4\Delta - \tau^2}$. The corresponding eigenvectors are also complex conjugates of each other. A real matrix with complex eigenvalues represents a rotation-scaling transformation; it is diagonalisable over $\mathbb{C}$ but not over $\mathbb{R}$.
Repeated eigenvalue ($\tau^2 - 4\Delta = 0$). The single eigenvalue $\lambda = \tau/2$ may or may not yield two linearly independent eigenvectors; the matrix may or may not be diagonalisable.
More generally, an $n \times n$ matrix may have fewer than $n$ distinct eigenvalues, and eigenvalues can be real or complex.
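The three 2×2 cases can be illustrated with one hypothetical matrix each — a symmetric matrix (distinct real roots), a 90° rotation (conjugate pair), and a shear (repeated, defective eigenvalue):

```python
import numpy as np

def discriminant(A):
    """tau^2 - 4*Delta for a 2x2 matrix A."""
    tau, delta = np.trace(A), np.linalg.det(A)
    return tau**2 - 4 * delta

# Distinct real eigenvalues: symmetric matrix, discriminant > 0.
print(discriminant(np.array([[2.0, 1.0], [1.0, 2.0]])) > 0)   # True

# Complex conjugate pair: 90-degree rotation, discriminant < 0.
R = np.array([[0.0, -1.0], [1.0, 0.0]])
print(discriminant(R) < 0)       # True
print(np.linalg.eigvals(R))      # conjugate pair +/- i

# Repeated eigenvalue: a shear, discriminant = 0 (only one eigendirection).
S = np.array([[1.0, 1.0], [0.0, 1.0]])
print(np.isclose(discriminant(S), 0))                          # True
```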
Applications
- Diagonalisation of matrices for efficient computation (e.g., matrix powers, differential equations).
- Principal Component Analysis (PCA): eigenvectors of the covariance matrix are the principal components.
- Stability analysis of dynamical systems: eigenvalues determine whether perturbations grow or decay.
- Spectral graph theory: eigenvalues of the graph Laplacian encode connectivity structure.
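The PCA application above reduces to a single eigendecomposition. A minimal sketch on synthetic correlated data (the mixing matrix and seed are arbitrary choices for illustration):

```python
import numpy as np

# Synthetic correlated 2-D data (hypothetical mixing matrix).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2)) @ np.array([[3.0, 0.0], [1.0, 0.5]])

C = np.cov(X, rowvar=False)             # sample covariance matrix
eigvals, eigvecs = np.linalg.eigh(C)    # eigh: for symmetric matrices
order = np.argsort(eigvals)[::-1]       # largest variance first
components = eigvecs[:, order]          # principal components as columns

# Projection onto the first component captures the most variance.
p1 = X @ components[:, 0]
p2 = X @ components[:, 1]
print(p1.var(ddof=1) >= p2.var(ddof=1))   # True
```

`eigh` is used rather than `eig` because the covariance matrix is symmetric, which guarantees real eigenvalues and orthogonal eigenvectors.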
Trade-offs
- Solving the characteristic polynomial is exact for $n \le 4$ (closed-form root formulas exist) but numerically unstable for large $n$; iterative methods (QR algorithm, power iteration) are used instead.
- Complex eigenvalues require working over $\mathbb{C}$ even when $A$ is real, adding computational overhead.
- A repeated eigenvalue may lack a full set of linearly independent eigenvectors (a defective matrix), making diagonalisation impossible; the Jordan normal form is required in that case.
Links
- Determinants — the characteristic equation is evaluated via $\det(A - \lambda I) = 0$
- Matrix Diagonalization — eigenvectors form the columns of the diagonalising matrix