
Stochastic Differential Equations with White Noise


Motivation

$$ \xi (t) \overset{?}{:=} \dot{W}(t) = {{d W (t)} \over {dt}} $$

Imagine $\xi$ defined as the derivative of a Wiener process, as shown above. Thinking of Brownian motion, $\xi (t)$ would be the noise representing random fluctuations at time $t$. This looks intuitive and not at all awkward, but regrettably there is a problem: $\dot{W}(t)$ does not exist in the usual sense.

Non-differentiability of the Wiener Process 1

$$ Y_{h} := {{W (t + h) - W(t)} \over {h}} $$

Consider the stochastic process of average rates of change $\left\{ Y_{h} \right\}$ of the Wiener process $\left\{ W_{t} \right\}_{t \ge 0}$, as shown above. Clearly $Y_{h}$ follows a normal distribution, and its mean and variance are computed as follows. $$ \begin{align*} E \left( Y_{h} \right) & = {{ 1 } \over { h }} E \left( W (t+h) - W (t) \right) = 0 \\ \operatorname{Var} \left( Y_{h} \right) & = {{ 1 } \over { h^{2} }} \operatorname{Var} \left( W (t+h) - W (t) \right) = {{ 1 } \over { h^{2} }} \cdot h = {{ 1 } \over { h }} \end{align*} $$ Therefore $Y_{h}$ can be written as $\displaystyle Y_{h} = {{ 1 } \over { \sqrt{h} }} Z$ for a random variable $Z$ following the standard normal distribution $N(0,1)$. Now consider, for an arbitrary constant $k > 0$, the probability $$ P \left( \left| Y_{h} \right| > k \right) = P \left( \left| {{ Z } \over { \sqrt{h} }} \right| > k \right) = P \left( \left| Z \right| > k \sqrt{h} \right) $$ No matter what $k$ is, as $h \to 0$ the threshold $k \sqrt{h}$ shrinks to $0$, so $$ \lim_{h \to 0} P \left( \left| {{ Z } \over { \sqrt{h} }} \right| > k \right) = \lim_{h \to 0} P \left( \left| Z \right| > k \sqrt{h} \right) = 1 $$ In other words, the supposed average rate of change $Y_{h}$ diverges in probability as $h \to 0$, and hence $W(t)$ is nowhere differentiable.
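This blow-up of $\operatorname{Var} \left( Y_{h} \right) = 1/h$ is easy to observe numerically. Below is a minimal sketch, assuming NumPy is available (sample sizes and values of $h$ are arbitrary choices): the increment $W(t+h) - W(t) \sim N(0, h)$ is sampled directly, and the variance of the difference quotient is estimated for shrinking $h$.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000  # Monte Carlo samples per value of h

empirical_var = {}
for h in [1.0, 0.01, 0.0001]:
    # sample the Wiener increment W(t+h) - W(t) ~ N(0, h) directly
    increment = rng.normal(0.0, np.sqrt(h), size=n)
    Y_h = increment / h                  # average rate of change
    empirical_var[h] = Y_h.var()         # should be close to 1/h

for h, v in empirical_var.items():
    print(f"h = {h:g}: Var(Y_h) is about {v:.1f} (theory: {1/h:g})")
```

As $h$ shrinks by a factor of 100, the estimated variance grows by the same factor, matching $\operatorname{Var} \left( Y_{h} \right) = 1/h$.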

Buildup

Since defining white noise through the Wiener process and the classical derivative ran into trouble, we circumvent the issue and formulate the definition differently. First, let $\left\{ X_{t} \right\}_{t \ge 0}$ be a Gaussian stochastic process with finite variance. In other words, for all $t \ge 0$, $$ E \left( X_{t}^{2} \right) < \infty $$ If for all $t_{1}, t_{2} \ge 0$ $$ E \left( X_{t_{1}} \right) = E \left( X_{t_{2}} \right) $$ holds and the covariance is given, for some function $h : \mathbb{R} \to \mathbb{R}$, by $$ \operatorname{Cov} \left( X_{t_{1}} , X_{t_{2}} \right) = E \left( X_{t_{1}} \cdot X_{t_{2}} \right) = h \left( t_{2} - t_{1} \right) $$ then such a Gaussian stochastic process $\left\{ X_{t} \right\}_{t \ge 0}$ is called stationary in the wide sense. This is a sensible relaxation of stationarity from time-series analysis: the mean is constant, and the covariance depends only on the time lag, as described by the function $h$.
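As a quick numerical illustration of wide-sense stationarity, consider a stationary Gaussian AR(1) process (a stand-in example not taken from the text; the coefficient $a = 0.8$ and all sizes are arbitrary). Its covariance depends only on the lag, $h \left( t_{2} - t_{1} \right) = a^{\left| t_{2} - t_{1} \right|}$, which shows up as constant diagonals in the sample covariance matrix.

```python
import numpy as np

rng = np.random.default_rng(1)
a = 0.8                          # hypothetical AR(1) coefficient, |a| < 1
n_paths, n_times = 200_000, 6

# stationary Gaussian AR(1): X_{k+1} = a X_k + sqrt(1 - a^2) Z_k, with X_0 ~ N(0, 1)
X = np.empty((n_paths, n_times))
X[:, 0] = rng.standard_normal(n_paths)
for k in range(n_times - 1):
    X[:, k + 1] = a * X[:, k] + np.sqrt(1 - a**2) * rng.standard_normal(n_paths)

cov = np.cov(X, rowvar=False)
# wide-sense stationarity: Cov(X_{t1}, X_{t2}) depends only on |t2 - t1|
print(cov.round(2))
```

Every diagonal of the printed matrix is (approximately) constant: $1$ on the main diagonal, $0.8$ at lag one, $0.64$ at lag two, and so on.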

Dirac delta function: Let $\delta_{x_0}$ denote the Dirac measure defined as follows. $$ \delta_{x_0} (E) := \begin{cases} 1 & x_0 \in E \\ 0 & x_0 \notin E \end{cases} $$

In particular, if $E X_{t} = 0$ and $h$ is the Dirac measure with $x_{0} = 0$, then the intuitive meaning of $\left\{ X_{t} \right\}_{t \ge 0}$ is a noise satisfying both of the following. $$ \begin{align*} X_{t} & \sim N \left( 0 , 1 \right) & \left( \because E X_{t} = 0 \land h(0) = \delta_{0} \left( \left\{ 0 \right\} \right) = 1 \implies \operatorname{Var} X_{t} = 1 \right) \\ X_{t_{1}} & \perp X_{t_{2}} & \left( \because t_{1} \ne t_{2} \implies \operatorname{Cov} \left( X_{t_{1}}, X_{t_{2}} \right) = \delta_{0} \left( \left\{ t_{2} - t_{1} \right\} \right) = 0 \right) \end{align*} $$ Note that zero covariance implies independence here precisely because the process is Gaussian.

This means the noise does not drift: its mean is $0$, it follows a normal distribution with constant variance, and the noise at any one time is independent of the noise at any other time. Comparing with the definition of the Wiener process, properties (i) and (ii) are matched, while (iii) and (iv) concern intervals rather than single points and are therefore not significant when discussing white noise.
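In discrete time this intuition is easy to check: a sequence of i.i.d. $N(0,1)$ draws plays the role of white noise. A minimal sketch, assuming NumPy is available (sample sizes are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(42)
n_paths, n_times = 200_000, 5
# each column xi[:, t] plays the role of xi(t): i.i.d. N(0, 1) across times
xi = rng.standard_normal((n_paths, n_times))

means = xi.mean(axis=0)          # close to 0 at every time: no drift
cov = np.cov(xi, rowvar=False)   # close to the identity: Var = 1, Cov = 0 off-diagonal
print(means.round(3))
print(cov.round(2))
```

The sample means are near $0$ and the sample covariance matrix is near the identity, matching the two displayed properties.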

Wiener Process: For $s < t < t+u$, a stochastic process $\left\{ W_{t} \right\}$ satisfying the following conditions is called a Wiener process.

  • (i): $W_{0} = 0$
  • (ii): $\left( W_{t+u} - W_{t} \right) \perp W_{s}$
  • (iii): $\left( W_{t+u} - W_{t} \right) \sim N ( 0, u )$
  • (iv): The sample paths of $W_{t}$ are almost everywhere continuous.

Accordingly, we introduce the following definition, not as an actual derivative of the Wiener process, but as a noise that is adequate at each point in time. Indeed, once this definition is in place, expressions such as the following are used naturally in formulas. $$ \begin{align*} \xi (t) &= {{d W (t)} \over {dt}} \\ d W (t) &= \text{noise} \cdot dt \end{align*} $$
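In simulation, this formal relation is realized by scaling discrete white noise: since $\operatorname{Var} \left( W_{t + \Delta t} - W_{t} \right) = \Delta t$, each increment is $\xi_{k} \sqrt{\Delta t}$ with $\xi_{k} \sim N(0,1)$, which is the standard Euler–Maruyama discretization. A sketch assuming NumPy, with arbitrary path and step counts, summing such increments and checking that $W(1) \sim N(0,1)$:

```python
import numpy as np

rng = np.random.default_rng(7)
n_paths, n_steps = 50_000, 100
dt = 1.0 / n_steps

# discrete white noise: independent N(0, 1) draws at every step
xi = rng.standard_normal((n_paths, n_steps))
# dW = xi * sqrt(dt): each increment is N(0, dt), as condition (iii) requires
dW = xi * np.sqrt(dt)
W1 = dW.sum(axis=1)  # W(1) built by summing increments, with W(0) = 0

print(W1.mean(), W1.var())  # close to 0 and 1, i.e. W(1) ~ N(0, 1)
```

Note the $\sqrt{\Delta t}$ rather than $\Delta t$ scaling of the noise; this is exactly the $1/\sqrt{h}$ behavior that made the classical derivative fail.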

Extending the definition of the Wiener process to distributions, the distributional derivative of the Wiener process satisfies the definition of white noise. In other words, white noise is a weak derivative of the Wiener process.

Definition 2

A Gaussian stochastic process $\left\{ \xi (t) \right\}$ is called white noise if it is stationary in the wide sense and satisfies the following two conditions:

  • (i): $E \left( \xi (0) \right) = 0$
  • (ii): $\operatorname{Cov} \left( \xi \left( t_{1} \right), \xi \left( t_{2} \right) \right) = \delta_{0} \left( \left\{ t_{2} - t_{1} \right\} \right)$


  1. Panik. (2017). Stochastic Differential Equations: An Introduction with Applications in Population Dynamics Modeling: p110.

  2. Panik. (2017). Stochastic Differential Equations: An Introduction with Applications in Population Dynamics Modeling: p109.