Matrix Operations
Definition
Matrix operations include addition, scalar multiplication, matrix-matrix multiplication, matrix-vector multiplication, and the transpose. They form the algebraic foundation for linear algebra.
Intuition
Matrix–matrix multiplication computes all dot products between rows of the left matrix and columns of the right matrix — think of it as composing two linear transformations. The transpose geometrically “reflects” a matrix across its diagonal. The Hadamard product is purely element-wise and has no linear-map interpretation on its own.
Formal Description
Matrix Addition
Matrices can be added only if they have the same dimensions; addition is element-wise:
$C = A + B$, i.e. $c_{ij} = a_{ij} + b_{ij}$.
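A minimal NumPy sketch of element-wise addition (the array values here are illustrative, not from the text):

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[5, 6], [7, 8]])
C = A + B          # element-wise: c_ij = a_ij + b_ij
print(C)           # entries: [[6, 8], [10, 12]]
```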
Scalar Multiplication
Every element of the matrix $A$ is multiplied by the scalar $\lambda$: $(\lambda A)_{ij} = \lambda\, a_{ij}$.
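The same operation as a NumPy sketch (matrix and scalar chosen for illustration):

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])
lam = 3              # the scalar lambda in the formula above
scaled = lam * A     # every entry is multiplied by the scalar
print(scaled)        # entries: [[3, 6], [9, 12]]
```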
Matrix–Matrix Multiplication
An $n \times k$ matrix $A$ can be multiplied by a $k \times m$ matrix $B$; the result $C = AB$ is $n \times m$:
$c_{ij} = \sum_{l=1}^{k} a_{il} b_{lj}$
Example ($2 \times 2$):
$\begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix} \begin{pmatrix} 5 & 6 \\ 7 & 8 \end{pmatrix} = \begin{pmatrix} 19 & 22 \\ 43 & 50 \end{pmatrix}$
The Hadamard product (element-wise product) is denoted $A \odot B$, with $(A \odot B)_{ij} = a_{ij} b_{ij}$, and is distinct from standard matrix multiplication.
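The distinction is easy to check in NumPy, where `@` is the standard matrix product and `*` is the element-wise (Hadamard) product (example arrays are illustrative):

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[5, 6], [7, 8]])

standard = A @ B     # rows of A dotted with columns of B
hadamard = A * B     # purely element-wise

print(standard)      # entries: [[19, 22], [43, 50]]
print(hadamard)      # entries: [[5, 12], [21, 32]]
```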
Matrix–Vector Multiplication
For $A \in \mathbb{R}^{m \times n}$ and $x \in \mathbb{R}^n$, the product $Ax = b$ with $b \in \mathbb{R}^m$ represents a system of $m$ linear equations in $n$ unknowns.
The dot product of two vectors $x, y \in \mathbb{R}^n$ is the matrix product $x^\top y$.
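A short NumPy sketch of both products (the vectors and matrix are illustrative):

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6]])    # A is 2x3
x = np.array([1, 0, 1])      # x in R^3
b = A @ x                    # b in R^2: one dot product per row of A
print(b)                     # entries: [4, 10]

y = np.array([2, 3, 4])
dot = x @ y                  # the dot product x^T y as a matrix product
print(dot)                   # 6
```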
Transpose
The transpose switches rows and columns: $(A^\top)_{ij} = a_{ji}$. For an $m \times n$ matrix $A$, the transpose $A^\top$ is $n \times m$.
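In NumPy the transpose and the resulting shape swap look like this (shapes chosen for illustration):

```python
import numpy as np

A = np.arange(6).reshape(2, 3)   # a 2x3 matrix
print(A.shape)                   # (2, 3)
print(A.T.shape)                 # (3, 2): rows and columns swapped

# (A^T)_{ij} = a_{ji}
assert A.T[2, 1] == A[1, 2]
```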
Matrix multiplication properties:
- Distributive: $A(B + C) = AB + AC$
- Associative: $A(BC) = (AB)C$
- Not commutative in general: $AB \neq BA$
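These properties can be verified numerically on small integer matrices (chosen here so the non-commutativity is visible):

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[0, 1], [1, 0]])
C = np.array([[1, 0], [2, 1]])

assert np.array_equal(A @ (B + C), A @ B + A @ C)   # distributive
assert np.array_equal(A @ (B @ C), (A @ B) @ C)     # associative
assert not np.array_equal(A @ B, B @ A)             # not commutative here
```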
Transpose properties:
- Dot product commutativity: $x^\top y = y^\top x$
Symmetry:
- $A$ is symmetric if $A = A^\top$.
- $A$ is skew-symmetric if $A = -A^\top$ (so its diagonal elements must be zero).
Examples: $\begin{pmatrix} 1 & 2 \\ 2 & 3 \end{pmatrix}$ is symmetric; $\begin{pmatrix} 0 & 1 \\ -1 & 0 \end{pmatrix}$ is skew-symmetric.
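A quick NumPy check of both definitions (the matrices are illustrative):

```python
import numpy as np

S = np.array([[1, 2], [2, 3]])     # symmetric: S == S^T
K = np.array([[0, 1], [-1, 0]])    # skew-symmetric: K == -K^T

assert np.array_equal(S, S.T)
assert np.array_equal(K, -K.T)
assert np.all(np.diag(K) == 0)     # skew-symmetry forces a zero diagonal
```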
Applications
- Expressing systems of linear equations as $Ax = b$, where $A \in \mathbb{R}^{m \times n}$, $x \in \mathbb{R}^n$, and $b \in \mathbb{R}^m$.
- Broadcasting in deep learning: $C = A + b$ adds the vector $b$ to every row of the matrix $A$.
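A sketch of that broadcasting behavior in NumPy (shapes chosen for illustration):

```python
import numpy as np

A = np.zeros((3, 2))           # a 3x2 matrix
b = np.array([10.0, 20.0])     # a length-2 vector
C = A + b                      # b is added to every row of A via broadcasting
print(C)                       # each row equals [10., 20.]
```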
Trade-offs
- Matrix multiplication is not commutative: $AB \neq BA$ in general, even when both products are defined. Reversing operand order is a common source of bugs.
- Dimensions must be compatible for multiplication ($n \times k$ by $k \times m$); addition requires identical shapes.
- The Hadamard product is commutative but does not correspond to function composition — mixing it with standard multiplication requires care.
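A sketch of the dimension-compatibility rule, including the error NumPy raises on a mismatch (shapes are illustrative):

```python
import numpy as np

A = np.ones((2, 3))
B = np.ones((3, 4))
print((A @ B).shape)     # (2, 4): inner dimensions 3 and 3 match

try:
    B @ A                # inner dimensions 4 and 2 do not match
except ValueError:
    print("incompatible shapes for matrix multiplication")
```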