Wiener Process
Definition
When $s < t < t+u$, a stochastic process $\left\{ W_{t} \right\}$ that satisfies the following conditions is called a Wiener Process (a simulation sketch follows the definition):
- (i): $W_{0} = 0$
- (ii): $\left( W_{t+u} - W_{t} \right) \perp W_{s}$
- (iii): $\left( W_{t+u} - W_{t} \right) \sim N ( 0, u )$
- (iv): The sample paths of $W_{t}$ are almost surely continuous.
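Although the conditions above are abstract, they translate directly into a simulation: start at $0$ by (i) and accumulate independent $N(0, \Delta t)$ increments as in (ii) and (iii). The following is a minimal sketch in Python, assuming NumPy; the horizon `T`, step count `n`, and seed are arbitrary illustrative choices, and the discrete path only approximates the continuity in (iv).

```python
import numpy as np

rng = np.random.default_rng(42)  # fixed seed, arbitrary choice

T = 1.0      # time horizon (illustrative)
n = 1_000    # number of steps (illustrative)
dt = T / n

# (iii): each increment W_{t+dt} - W_t is an independent N(0, dt) sample
increments = rng.normal(loc=0.0, scale=np.sqrt(dt), size=n)

# (i): start at W_0 = 0, then accumulate the increments
W = np.concatenate(([0.0], np.cumsum(increments)))

t_grid = np.linspace(0.0, T, n + 1)  # time points matching the samples in W
```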
Basic Properties
- [1]: $\displaystyle W_{t} \sim N ( 0 , t )$
- [2]: $\displaystyle E ( W_{t} ) = 0$
- [3]: $\displaystyle \operatorname{Var} ( W_{t} ) = t$
- [4]: $\displaystyle \operatorname{cov} ( W_{t} , W_{s} ) = E (W_{t}W_{s}) = {{1} \over {2}} (|t| + |s| - |t-s|) = \min \left\{ t , s \right\}$ (checked numerically after the proof below)
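Properties [1] through [3] can be sanity-checked by Monte Carlo simulation, building $W_{t}$ from independent increments exactly as in the definition; [4] gets its own check after the proof below. A minimal sketch assuming NumPy, with the time $t$, the discretization, and the sample size all arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)

t = 2.0                            # an arbitrary time point
n_steps, n_paths = 200, 100_000    # discretization and sample size (illustrative)
dt = t / n_steps

# Many samples of W_t, each built as a sum of independent N(0, dt) increments
increments = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
W_t = increments.sum(axis=1)

print(W_t.mean())  # [2]: should be close to 0
print(W_t.var())   # [1], [3]: should be close to t = 2.0
```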
Description
The Wiener Process is also called Brownian Motion.
(ii): Having $\left( W_{t+u} - W_{t} \right) \perp W_{s}$ means that the increment after time $t$ is independent of the past value $W_{s}$. In other words, the Wiener Process has independent increments: how it moves from time $t$ onward does not depend on its history up to time $s$.
(iii): The increments follow a normal distribution $N(0,u)$, whose variance depends only on the length $u$ of the time interval and not on the time point $t$ itself. This signifies that the Wiener Process does not care about specific points in time, but the uncertainty grows as the gap between two points in time grows.
(iv): The fact that sample paths are almost surely continuous means that a point following the Wiener Process has, in effect, zero chance of 'teleporting'. If that is too hard to picture, it is enough to know that the path makes no sudden jumps.
[1]: An interesting fact is that the probability density function of $W_{t}$ $$ f_{W_{t}} (x,t) = {{1} \over { \sqrt{ 2 \pi t } }} e^{ - {{x^2} \over {2t} } } $$ is a solution of the heat equation $$ {{\partial u } \over { \partial t }} = {{1} \over {2}} {{\partial^2 u } \over { \partial x^2 }} $$
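This can be verified by direct differentiation. Since $$ {{\partial f_{W_{t}} } \over { \partial t }} = \left( {{x^{2}} \over {2t^{2}}} - {{1} \over {2t}} \right) f_{W_{t}} , \qquad {{\partial f_{W_{t}} } \over { \partial x }} = - {{x} \over {t}} f_{W_{t}} , \qquad {{\partial^{2} f_{W_{t}} } \over { \partial x^{2} }} = \left( {{x^{2}} \over {t^{2}}} - {{1} \over {t}} \right) f_{W_{t}} $$ it follows that $\displaystyle {{\partial f_{W_{t}} } \over { \partial t }} = {{1} \over {2}} {{\partial^{2} f_{W_{t}} } \over { \partial x^{2} }}$.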
[4]: It’s not common to see a covariance expressed as a minimum. It’s highly recommended to follow the proof process and understand how this expression is derived.
Proof
[1]
By (i) and (iii), $W_{t} = W_{t} - 0 = W_{t} - W_{0} \sim N ( 0 , t )$
■
[2]
Since $W_{t} \sim N ( 0 , t )$ by [1], $\displaystyle E ( W_{t} ) = 0$
■
[3]
Since $W_{t} \sim N ( 0 , t )$ by [1], $\displaystyle \operatorname{Var} ( W_{t} ) = t$
■
[4]
Let $t > s$. Then, by the definition of covariance and [2], $$ \operatorname{cov} ( W_{t} , W_{s} ) = E \left( \left[ W_{t} - E ( W_{t} ) \right] \left[ W_{s} - E ( W_{s} ) \right] \right) = E \left( W_{t} W_{s} \right) $$
Writing $W_{t} = ( W_{t} - W_{s} ) + W_{s}$ gives
$$ \begin{align*} E \left( W_{t} W_{s} \right) =& E \left[ \left( ( W_{t} - W_{s} ) + W_{s} \right) \cdot W_{s} \right] \\ =& E \left[ ( W_{t} - W_{s} ) \cdot W_{s} \right] + E \left( W_{s}^{2} \right) \end{align*} $$
The first term vanishes by (ii) and [2]:
$$ E \left[ ( W_{t} - W_{s} ) \cdot W_{s} \right] = E ( W_{t} - W_{s} ) \cdot E ( W_{s} ) = 0 \cdot 0 = 0 $$
The second term, by [2] and [3], is
$$ E \left( W_{s}^{2} \right) = E \left( W_{s}^{2} \right) - 0^2 = E \left( W_{s}^{2} \right) - \left[ E ( W_{s} ) \right]^2 = \operatorname{Var} ( W_{s} ) = s $$
Summarizing, $\displaystyle \operatorname{cov} ( W_{t} , W_{s} ) = s$ when $t > s$. The same argument with the roles of $t$ and $s$ swapped gives $\operatorname{cov} ( W_{t} , W_{s} ) = t$ when $s > t$, therefore
$$ \operatorname{cov} ( W_{t} , W_{s} ) = \min \left\{ t , s \right\} $$
■
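As promised above, the result can also be sanity-checked numerically, sampling $W_{s}$ and the independent increment $W_{t} - W_{s}$ as in (ii) and (iii). A minimal sketch assuming NumPy; the values of $s$ and $t$ and the sample size are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(1)

s, t = 0.3, 0.7      # any 0 < s < t; arbitrary illustrative values
n_paths = 500_000    # Monte Carlo sample size (illustrative)

# (ii), (iii): W_s ~ N(0, s), and W_t - W_s ~ N(0, t - s) independently of W_s
W_s = rng.normal(0.0, np.sqrt(s), size=n_paths)
W_t = W_s + rng.normal(0.0, np.sqrt(t - s), size=n_paths)

# E(W_t W_s) should be close to min(t, s) = 0.3
print(np.mean(W_t * W_s))
```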