Outer Product of Two Vectors
Definition
The outer product of two column vectors $\mathbf{u} = \begin{bmatrix} u_{1} \\ \vdots \\ u_{n} \end{bmatrix}$ and $\mathbf{v} = \begin{bmatrix} v_{1} \\ \vdots \\ v_{n} \end{bmatrix}$ is defined as follows.
$$ \mathbf{u} \otimes \mathbf{v} = \mathbf{u}\mathbf{v}^{\mathsf{T}} = \begin{bmatrix} u_{1} \\ u_{2} \\ \vdots \\ u_{n} \end{bmatrix} \begin{bmatrix} v_{1} & v_{2} & \cdots & v_{n} \end{bmatrix}= \begin{bmatrix} u_{1}v_{1} & u_{1}v_{2} & \cdots & u_{1}v_{n} \\ u_{2}v_{1} & u_{2}v_{2} & \cdots & u_{2}v_{n} \\ \vdots & \vdots & \ddots & \vdots \\ u_{n}v_{1} & u_{n}v_{2} & \cdots & u_{n}v_{n} \end{bmatrix} $$
Here, ${}^{\mathsf{T}}$ denotes the transpose of a matrix.
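As a quick numerical illustration of the definition (not part of the original text), the matrix $\mathbf{u}\mathbf{v}^{\mathsf{T}}$ can be formed in NumPy either with `np.outer` or as an explicit column-times-row product; the vectors below are arbitrary examples.

```python
import numpy as np

# Arbitrary example vectors (n = 3)
u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0, 6.0])

# Outer product via the dedicated routine
outer = np.outer(u, v)

# The same matrix as a column vector times a row vector, u v^T
outer_matmul = u.reshape(-1, 1) @ v.reshape(1, -1)

print(outer)
# [[ 4.  5.  6.]
#  [ 8. 10. 12.]
#  [12. 15. 18.]]
print(np.array_equal(outer, outer_matmul))  # True
```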
Explanation
There are various operations involving vectors and matrices, so care must be taken to avoid confusion.
- The cross product defined in 3-dimensional space is often also translated as "outer product", so the two terms are easy to confuse. Given the English name and its properties, the cross product is more appropriately called the vector product or the wedge product.
- The outer product is a special case of the Kronecker product (tensor product): it is $A \otimes B$ where $A$ is a column vector and $B$ is a row vector.
- It is also a special case of matrix multiplication: it is the product $AB$ where $A$ is a column vector ($n \times 1$ matrix) and $B$ is a row vector ($1 \times m$ matrix), as checked in the sketch after this list.
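Both special-case claims are easy to verify numerically. The following is a minimal sketch using NumPy (the vectors are arbitrary examples): `np.kron` applied to a column and a row vector, and the ordinary matrix product of the same two, both reproduce `np.outer`.

```python
import numpy as np

u = np.array([[1.0], [2.0], [3.0]])   # column vector, shape (3, 1)
vT = np.array([[4.0, 5.0, 6.0]])      # row vector, shape (1, 3)

outer = np.outer(u, vT)   # np.outer flattens its inputs, so this is u v^T

# Special case of the Kronecker product: column vector (x) row vector
kron = np.kron(u, vT)

# Special case of matrix multiplication: (3 x 1) times (1 x 3)
matmul = u @ vT

print(np.array_equal(outer, kron))    # True
print(np.array_equal(outer, matmul))  # True
```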
The scalar product (dot product, inner product) of two vectors is a scalar, the vector product of two vectors is a vector, and the outer product of two vectors is a matrix (a tensor). The table below summarizes the three operations.
| Operation | Scalar Product (Inner Product) | Vector Product | Outer Product (Tensor Product) |
|---|---|---|---|
| Dimension | $n$-dimensional vector | $3$-dimensional vector | $n$-dimensional vector |
| Notation | $\mathbf{u} \cdot \mathbf{v} = \mathbf{u}^{\mathsf{T}} \mathbf{v}$ | $\mathbf{u} \times \mathbf{v}$ | $\mathbf{u} \otimes \mathbf{v} = \mathbf{u}\mathbf{v}^{\mathsf{T}}$ |
| Result | Scalar $= 1 \times 1$ matrix | $3$-dimensional vector | $n \times n$ matrix |
| Value | $\sum_{i} u_{i}v_{i} = u_{1}v_{1} + \cdots + u_{n}v_{n}$ | $\begin{bmatrix} u_{2}v_{3} - u_{3}v_{2} \\ u_{3}v_{1} - u_{1}v_{3} \\ u_{1}v_{2} - u_{2}v_{1} \end{bmatrix}$ | $\begin{bmatrix} u_{1}v_{1} & u_{1}v_{2} & \cdots & u_{1}v_{n} \\ u_{2}v_{1} & u_{2}v_{2} & \cdots & u_{2}v_{n} \\ \vdots & \vdots & \ddots & \vdots \\ u_{n}v_{1} & u_{n}v_{2} & \cdots & u_{n}v_{n}\end{bmatrix}$ |
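For concreteness, the three results in the table can be reproduced in NumPy for a single pair of 3-dimensional vectors (the vector product is only defined in this dimension); the values below are just an illustrative example.

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0, 6.0])

print(np.dot(u, v))    # scalar product: 32.0
print(np.cross(u, v))  # vector product: [-3.  6. -3.]
print(np.outer(u, v))  # outer product: a 3 x 3 matrix
```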
Generalization
In fact, unlike scalar and vector products, the sizes of the two vectors involved in this operation do not need to be the same for it to be well-defined. For example, the outer product of two vectors $\mathbf{u} = \begin{bmatrix} u_{1} \\ \vdots \\ u_{n} \end{bmatrix}$ and $\mathbf{v} = \begin{bmatrix} v_{1} \\ \vdots \\ v_{m} \end{bmatrix}$ can be defined as follows.
$$ \mathbf{u} \otimes \mathbf{v} = \mathbf{u}\mathbf{v}^{\mathsf{T}} = \begin{bmatrix} u_{1} \\ \vdots \\ u_{n} \end{bmatrix} \begin{bmatrix} v_{1} & \cdots & v_{m} \end{bmatrix} = \begin{bmatrix} u_{1}v_{1} & u_{1}v_{2} & \cdots & u_{1}v_{m} \\ u_{2}v_{1} & u_{2}v_{2} & \cdots & u_{2}v_{m} \\ \vdots & \vdots & \ddots & \vdots \\ u_{n}v_{1} & u_{n}v_{2} & \cdots & u_{n}v_{m} \end{bmatrix} $$
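As a quick illustration of this rectangular case, `np.outer` accepts vectors of different lengths and returns the corresponding $n \times m$ matrix (the sizes below are arbitrary):

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])   # n = 3
v = np.array([10.0, 20.0])      # m = 2

M = np.outer(u, v)
print(M.shape)  # (3, 2)
print(M)
# [[10. 20.]
#  [20. 40.]
#  [30. 60.]]
```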
Properties
Let $\mathbf{u} = \begin{bmatrix} u_{1} & \cdots & u_{n} \end{bmatrix}^{\mathsf{T}}$, $\mathbf{v} = \begin{bmatrix} v_{1} & \cdots & v_{n} \end{bmatrix}^{\mathsf{T}}$, and $\mathbf{w} = \begin{bmatrix} w_{1} & \cdots & w_{n} \end{bmatrix}^{\mathsf{T}}$. The following holds.
$$ (\mathbf{u} \otimes \mathbf{v})^{\mathsf{T}} = \mathbf{v} \otimes \mathbf{u} \tag{1} $$
$(2)$ Linearity: $$ (\mathbf{v} + \mathbf{w}) \otimes \mathbf{u} = \mathbf{v} \otimes \mathbf{u} + \mathbf{w} \otimes \mathbf{u} $$ $$ \mathbf{u} \otimes (\mathbf{v} + \mathbf{w}) = \mathbf{u} \otimes \mathbf{v} + \mathbf{u} \otimes \mathbf{w} $$
Let $\alpha \in \mathbb{R}$ be a constant.
$$ (\alpha \mathbf{v}) \otimes \mathbf{u} = \alpha (\mathbf{v} \otimes \mathbf{u}) = \mathbf{v} \otimes (\alpha \mathbf{u}) $$
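Properties $(1)$ and $(2)$ are easy to spot-check numerically. The sketch below uses arbitrary random vectors and `np.allclose`, purely as an illustration of the identities above.

```python
import numpy as np

rng = np.random.default_rng(0)
u, v, w = rng.standard_normal((3, 4))  # three arbitrary 4-dimensional vectors
alpha = 2.5

# (1) the transpose swaps the factors
print(np.allclose(np.outer(u, v).T, np.outer(v, u)))                     # True
# (2) linearity in each argument, including scalar multiples
print(np.allclose(np.outer(v + w, u), np.outer(v, u) + np.outer(w, u)))  # True
print(np.allclose(np.outer(u, v + w), np.outer(u, v) + np.outer(u, w)))  # True
print(np.allclose(np.outer(alpha * v, u), alpha * np.outer(v, u)))       # True
```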
$$ (\mathbf{u} \otimes \mathbf{v}) \mathbf{w} = (\mathbf{v} \cdot \mathbf{w}) \mathbf{u} \tag{3} $$
$$ \mathbf{w}^{\mathsf{T}} (\mathbf{u} \otimes \mathbf{v}) = (\mathbf{w} \cdot \mathbf{u}) \mathbf{v}^{\mathsf{T}} \tag{4} $$
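Properties $(3)$ and $(4)$ can be checked the same way; note that in NumPy a 1-D array multiplied on the left of a matrix plays the role of $\mathbf{w}^{\mathsf{T}}$. The vectors are again arbitrary examples.

```python
import numpy as np

rng = np.random.default_rng(1)
u, v, w = rng.standard_normal((3, 4))  # arbitrary 4-dimensional vectors

# (3) (u ⊗ v) w = (v · w) u
print(np.allclose(np.outer(u, v) @ w, np.dot(v, w) * u))  # True
# (4) w^T (u ⊗ v) = (w · u) v^T
print(np.allclose(w @ np.outer(u, v), np.dot(w, u) * v))  # True
```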
$(5)$ Associativity:
Associativity cannot even be stated for the outer product as defined above, since $\mathbf{u} \otimes \mathbf{v}$ is a matrix rather than a vector. However, when $\otimes$ is extended to the Kronecker product, the following holds.
$$ (\mathbf{u} \otimes_{\text{Kron}} \mathbf{v}) \otimes_{\text{Kron}} \mathbf{w} = \mathbf{u} \otimes_{\text{Kron}} (\mathbf{v} \otimes_{\text{Kron}} \mathbf{w}) $$
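Associativity of the Kronecker-product extension can likewise be spot-checked with `np.kron`; the vector sizes below are arbitrary, and the three factors may even have different lengths.

```python
import numpy as np

rng = np.random.default_rng(2)
u = rng.standard_normal(2)
v = rng.standard_normal(3)
w = rng.standard_normal(4)

left = np.kron(np.kron(u, v), w)   # (u ⊗ v) ⊗ w
right = np.kron(u, np.kron(v, w))  # u ⊗ (v ⊗ w)
print(np.allclose(left, right))    # True
```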
Proof
$(1)$
This can be seen from the properties of the transpose.
$$ \begin{align*} (\mathbf{u} \otimes \mathbf{v})^{\mathsf{T}} &= (\mathbf{u} \mathbf{v}^{\mathsf{T}})^{\mathsf{T}} \\ &= \mathbf{v} \mathbf{u}^{\mathsf{T}} \\ &= \mathbf{v} \otimes \mathbf{u} \\ \end{align*} $$
■
$(2)$
This holds since matrix multiplication distributes over addition.
$$ \begin{align*} (\mathbf{v} + \mathbf{w}) \otimes \mathbf{u} &= (\mathbf{v} + \mathbf{w}) \mathbf{u}^{\mathsf{T}} \\ &= \mathbf{v} \mathbf{u}^{\mathsf{T}} + \mathbf{w} \mathbf{u}^{\mathsf{T}} \\ &= \mathbf{v} \otimes \mathbf{u} + \mathbf{w} \otimes \mathbf{u} \\ \end{align*} $$
This holds since the transpose is linear and matrix multiplication distributes over addition.
$$ \begin{align*} \mathbf{u} \otimes (\mathbf{v} + \mathbf{w}) &= \mathbf{u} (\mathbf{v} + \mathbf{w})^{\mathsf{T}} \\ &= \mathbf{u} (\mathbf{v}^{\mathsf{T}} + \mathbf{w}^{\mathsf{T}}) \\ &= \mathbf{u} \mathbf{v}^{\mathsf{T}} + \mathbf{u} \mathbf{w}^{\mathsf{T}} \\ &= \mathbf{u} \otimes \mathbf{v} + \mathbf{u} \otimes \mathbf{w} \\ \end{align*} $$
$$ \begin{align*} (\alpha \mathbf{v}) \otimes \mathbf{u} &= (\alpha \mathbf{v}) \mathbf{u}^{\mathsf{T}} \\ &= \alpha (\mathbf{v} \mathbf{u}^{\mathsf{T}}) \\ &= \alpha (\mathbf{v} \otimes \mathbf{u}) \\ \end{align*} $$
■
$(3)$
$$ \begin{align*} (\mathbf{u} \otimes \mathbf{v}) \mathbf{w} &= (\mathbf{u} \mathbf{v}^{\mathsf{T}}) \mathbf{w} \\ &= \mathbf{u} (\mathbf{v}^{\mathsf{T}} \mathbf{w}) \\ &= \mathbf{u} (\mathbf{v} \cdot \mathbf{w}) \\ &= (\mathbf{v} \cdot \mathbf{w}) \mathbf{u} \\ \end{align*} $$
Note that $\mathbf{v}^{\mathsf{T}} \mathbf{w}$ is a scalar.
■
$(4)$
$$ \begin{align*} \mathbf{w}^{\mathsf{T}} (\mathbf{u} \otimes \mathbf{v}) &= \mathbf{w}^{\mathsf{T}} (\mathbf{u} \mathbf{v}^{\mathsf{T}}) \\ &= (\mathbf{w}^{\mathsf{T}} \mathbf{u}) \mathbf{v}^{\mathsf{T}} \\ &= (\mathbf{w} \cdot \mathbf{u}) \mathbf{v}^{\mathsf{T}} \\ \end{align*} $$
■