Increment of Stochastic Processes

Definition

Let $\xi (t)$ be a stochastic process defined on a time set $T$, and let $t_{0} < t_{1} < \cdots < t_{n} \in T$.

  1. For $s, t \in T$ with $s < t$, the difference $\xi ( t ) - \xi ( s )$ is called an increment.
  2. If the increments $\xi ( t_{i} ) - \xi ( t_{i-1} )$, $i = 1, \cdots, n$, are mutually independent, then $\xi (t)$ is said to have independent increments.
  3. If for all $h>0$ and all $t, s, t+h, s+h \in T$, the increment $\xi (t+h) - \xi ( s+h )$ has the same probability distribution as $\xi (t) - \xi (s)$, then $\xi (t)$ is said to have stationary increments. (A simulation sketch illustrating these definitions follows below.)

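As a concrete illustration (not part of the definition itself), the following Python sketch simulates a discrete analogue of such a process, a random walk with iid Gaussian steps, and extracts the increments $\xi ( t_{i} ) - \xi ( t_{i-1} )$ over a chosen partition $t_{0} < t_{1} < \cdots < t_{n}$. The step distribution, the time set, and the partition points are all assumptions made purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Discrete-time stand-in for xi(t): a random walk built from iid Gaussian steps.
# Because the steps are iid, non-overlapping increments are independent, and the
# distribution of an increment depends only on the time difference.
T = np.arange(0, 1001)                            # time set T = {0, 1, ..., 1000}
steps = rng.normal(loc=0.0, scale=1.0, size=len(T) - 1)
xi = np.concatenate(([0.0], np.cumsum(steps)))    # xi(0) = 0, xi(t) = sum of first t steps

# A partition t_0 < t_1 < ... < t_n of points in T and the corresponding increments.
t = np.array([0, 100, 250, 600, 1000])
increments = xi[t[1:]] - xi[t[:-1]]               # xi(t_i) - xi(t_{i-1}), i = 1, ..., n

for (s, u), inc in zip(zip(t[:-1], t[1:]), increments):
    print(f"xi({u}) - xi({s}) = {inc:+.3f}   (time difference {u - s})")
```
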
Explanation

The term ‘increment’ is originally the noun form of ‘increase’ and means ‘growth’. In the context of a stochastic process, however, it should not be read too literally as ‘growing’: we are not particularly concerned with whether the value of the process actually goes up or down, so a term like ‘change’ would not be quite right either. All that matters is that there is an earlier time $s$ and a later time $t$; it is the time that has increased, and the increment is the resulting difference $\xi (t) - \xi (s)$.

If the increments depend only on the time difference, not on the particular point in time, the process becomes much more convenient to handle, in a spirit similar to a Markov Chain. Independence and stationarity of increments are essential conditions for a ‘well-behaved stochastic process’: when both hold, the increments over non-overlapping intervals of equal length are iid (independent and identically distributed). Note also that, as a theoretical condition, stationarity of increments in stochastic processes is somewhat stronger than the stationarity usually required in time series analysis.
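As a sanity check on the last two points, here is a minimal sketch, assuming a Gaussian random walk as a stand-in process and arbitrarily chosen sample sizes: its increments over a fixed lag $h$ have (approximately) the same variance no matter where the interval starts, while the process itself is not stationary, since the variance of $\xi (t)$ grows with $t$.

```python
import numpy as np

rng = np.random.default_rng(1)

# Many sample paths of a Gaussian random walk: xi(t) = sum of t iid N(0, 1) steps.
n_paths, n_steps = 5000, 500
paths = np.cumsum(rng.normal(size=(n_paths, n_steps)), axis=1)

h = 50  # fixed lag for the increments xi(t + h) - xi(t)

# Stationary increments: the distribution of xi(t + h) - xi(t) depends only on h,
# so the sample variance is roughly h regardless of the starting time t.
for t in (0, 100, 300):
    inc = paths[:, t + h - 1] - (paths[:, t - 1] if t > 0 else 0.0)
    print(f"Var[xi({t}+{h}) - xi({t})] ≈ {inc.var():.1f}   (theory: {h})")

# The process itself is not stationary: Var[xi(t)] = t grows with t.
for t in (50, 200, 450):
    print(f"Var[xi({t})] ≈ {paths[:, t - 1].var():.1f}   (theory: {t})")
```

In other words, stationarity of increments is a statement about differences of the process over intervals, not about the distribution of the process values themselves.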