Discrete Markov Chains 📂Probability Theory

Definition

A discrete-time Markov chain (DTMC), or simply a Markov Chain, is a discrete-time stochastic process $\left\{ X_{n} \right\}$ with a countable state space that satisfies the following for all states $i, j, k, \cdots, l$ and all $n$: $$ p \left( X_{n+1} = j \mid X_{n} = i , X_{n-1} = k , \cdots , X_{0} = l \right) = p \left( X_{n+1} = j \mid X_{n} = i \right) $$

Description

$p_{ij}:= p \left( X_{n+1} = j \mid X_{n} = i \right)$ is called the Transition Probability, where $i$ is the Source State and $j$ is the Target State. Throughout, the chain is assumed to be time-homogeneous, meaning this probability does not depend on $n$. The transition probability after $k$ steps is written $p_{ij}^{(k)} := p \left( X_{n+k} = j \mid X_{n} = i \right)$.

A Markov Chain is a stochastic process in which the probability of the next step, given the entire history, equals the probability given only the current state. In simpler terms, if one knows the current state exactly, the past has no further influence on the future of the process, a property often referred to as Memorylessness. Naturally, this assumption greatly simplifies calculation and analysis.

Consider modeling precipitation as a Markov chain. Regardless of whether it rained yesterday, assume the probability of rain tomorrow depends only on whether it rained today. Let state $0$ represent rain and state $1$ represent no rain. $$\begin{matrix} p_{00} = 0.7 & p_{01} = 0.3 \\ p_{10} = 0.4 & p_{11} = 0.6 \end{matrix}$$ This means that if it rained today, the probability of rain tomorrow is $70\%$ and of no rain is $30\%$; if it did not rain today, the probability of no rain tomorrow is $60\%$ and of rain is $40\%$.
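As a quick illustration, here is a minimal simulation sketch of this two-state chain, assuming Python with NumPy. The matrix `P` and the rain/no-rain state labels come from the example above; the function name `simulate` and the seed are just illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(42)

# Transition probabilities from the example: state 0 = rain, state 1 = no rain
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])

def simulate(P, start, steps, rng):
    """Sample a path of the chain: each next state depends only on the current one."""
    path = [start]
    for _ in range(steps):
        current = path[-1]
        path.append(rng.choice(len(P), p=P[current]))
    return path

path = simulate(P, start=0, steps=10, rng=rng)
print(path)  # e.g. [0, 0, 1, 1, 0, ...] -- rain/no-rain over 10 days
```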

Now, using the matrix $$P:= \begin{bmatrix} p_{00} & p_{01} \\ p_{10} & p_{11} \end{bmatrix} = \begin{bmatrix} 0.7 & 0.3 \\ 0.4 & 0.6 \end{bmatrix} $$ we can simplify such probability calculations. This matrix is called the Transition Probability Matrix, and the transition probability matrix after $k$ steps is written $P^{(k)}:= \left( p_{ij}^{ ( k ) } \right)$. A useful property of the transition probability matrix is $P^{(n)} = P^{n}$, which can be proven from the Chapman-Kolmogorov equation $\displaystyle p_{ij}^{(m+n)} = \sum_{k} p_{ik}^{(m)} p_{kj}^{(n)}$.

If it rained today, the probability of rain two days from now is given by $$ \begin{align*} P^{ (2) } =& P^{2} \\ =& \begin{bmatrix} 0.7 & 0.3 \\ 0.4 & 0.6 \end{bmatrix} \begin{bmatrix} 0.7 & 0.3 \\ 0.4 & 0.6 \end{bmatrix} \\ =& \begin{bmatrix} 0.61 & 0.39 \\ 0.52 & 0.48 \end{bmatrix} \end{align*} $$ which agrees exactly with the direct computation $p_{00}^{(2)} = (0.7)^2 + (0.3) (0.4) = 0.61$. In practice, the questions of interest are more complicated and concern the more distant future, so handling such matrices becomes essential.
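The same computation is easy to check numerically. This is a minimal sketch assuming NumPy; `np.linalg.matrix_power` is the standard NumPy routine for repeated matrix multiplication.

```python
import numpy as np

P = np.array([[0.7, 0.3],
              [0.4, 0.6]])

# Two-step transition matrix: P^(2) = P @ P
P2 = np.linalg.matrix_power(P, 2)
print(P2)
# [[0.61 0.39]
#  [0.52 0.48]]

# Entry (0, 0) matches the hand computation 0.7*0.7 + 0.3*0.4,
# up to floating-point rounding
print(P2[0, 0])  # approximately 0.61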

Formal Definition of the Transition Probability Matrix

Note that the entries in any single row of the transition probability matrix must sum to $1$, which is natural since the probabilities of all possible next states add up to $1$. Formally, $\displaystyle \sum_{j} p_{ij} = 1$ for every $i$. Depending on the text, this property may even be taken as part of the definition.
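As a final sketch (again assuming NumPy), the row-sum property is easy to verify for the rain example above; `np.allclose` is used to allow for floating-point rounding.

```python
import numpy as np

P = np.array([[0.7, 0.3],
              [0.4, 0.6]])

# Each row is a probability distribution over the next state,
# so every row of a transition matrix must sum to 1.
row_sums = P.sum(axis=1)
print(row_sums)                    # [1. 1.]
assert np.allclose(row_sums, 1.0)  # holds for any valid transition matrix
```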