
Derivation of the Chapman-Kolmogorov Equation

Theorem

A stochastic process satisfies the following equations for its transition probabilities $p_{ij}^{(n)}$, $p_{ij}(t)$ and transition probability matrices $P^{(n)}$, $P(t)$.

Discrete Stochastic Process

$$ \begin{align*} p_{ ij }^{ (n+m) } =& \sum _{ k } p_{ ik }^{ (n) } p _{ kj }^{ (m) } \\ P^{(n+m)} =& P^{(n)} P^{(m)} \end{align*} $$
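As a quick numerical sanity check (not part of the original statement), the matrix form of the identity can be verified for a small chain; the 3-state transition matrix `P` and the step counts `n`, `m` below are made up for illustration.

```python
import numpy as np

# Hypothetical 3-state transition matrix (each row sums to 1).
P = np.array([
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.2, 0.2, 0.6],
])

n, m = 2, 3

# The n-step transition matrix is the n-th matrix power of P.
P_n = np.linalg.matrix_power(P, n)
P_m = np.linalg.matrix_power(P, m)
P_nm = np.linalg.matrix_power(P, n + m)

# Chapman-Kolmogorov in matrix form: P^(n+m) = P^(n) P^(m).
print(np.allclose(P_nm, P_n @ P_m))  # True
```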

Continuous Stochastic Process

$$ \begin{align*} p_{ij} (t + s) =& \sum _{ k } p_{ ik } \left( t \right) p _{ kj } \left( s \right) \\ P(t+s) =& P(t) P(s) \end{align*} $$
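For the continuous case, one standard way to realize such a family $P(t)$ (an assumption made for this illustration, not stated above) is as the matrix exponential $P(t) = e^{Qt}$ of a generator matrix $Q$; the particular $Q$ and the times $t$, $s$ below are invented for the example.

```python
import numpy as np
from scipy.linalg import expm

# Hypothetical generator matrix Q: nonnegative off-diagonal rates, rows sum to 0.
Q = np.array([
    [-0.7,  0.4,  0.3],
    [ 0.2, -0.5,  0.3],
    [ 0.5,  0.1, -0.6],
])

def P(t):
    """Transition matrix at time t, assuming P(t) = exp(Q t)."""
    return expm(Q * t)

t, s = 1.3, 0.4

# Semigroup property, i.e. Chapman-Kolmogorov: P(t+s) = P(t) P(s).
print(np.allclose(P(t + s), P(t) @ P(s)))  # True
```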

Explanation

This says that the $n+m$ steps it takes to go from state $i$ to state $j$ can be broken into $n$ steps followed by $m$ steps. Even without a proof, the claim is intuitively plausible: the probability of going from $i$ to $k$ in $n$ steps and then from $k$ to $j$ in $m$ steps is the probability of passing from $i$ to $j$ through $k$, and summing these products over all intermediate states $k$ gives the probability of going from $i$ to $j$, no matter what happens in between.

Derivation

Strategy: The proof is given only for discrete stochastic processes. Assume $X_0 = i$ from the start, so that “starting from $i$” does not need to be repeated. The expression inside the sum is rewritten using the fact that multiplying both sides of the conditional probability $P(A|B) = P(AB)/P(B)$ by $P(B)$ gives $P(AB) = P(A|B)P(B)$.


Assuming $ { X }_{ 0 }=i$, $$ \begin{align*} { p } _{ ij }^{ (n+m) } =& P({ X }_{ n+m }=j ) \\ =& \sum _{ k }^{ }{ P({ X }_{ n+m }=j , { X }_{ n }=k) } \\ =& \sum _{ k }^{ }{ P({ X }_{ n+m }=j | { X }_{ n }=k)P({ X }_{ n }=k) } \\ =& \sum _{ k }^{ }{ { p }_{ ik }^{ (n) } { p }_{ kj }^{ (m) } } \end{align*} $$

The last equality holds because, with $X_{0} = i$, the Markov property and time-homogeneity give $P(X_{n+m}=j \mid X_{n}=k) = p_{kj}^{(m)}$ and $P(X_{n}=k) = p_{ik}^{(n)}$.
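The same decomposition can also be checked by simulation, mirroring the sum over the intermediate state $k$ in the derivation; the chain, the states, and the trial count below are again made-up illustration values, and the two printed numbers only agree approximately because they are Monte Carlo estimates.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 3-state chain (same kind of example as above).
P = np.array([
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.2, 0.2, 0.6],
])

def run(start, steps):
    """Simulate `steps` transitions from `start` and return the final state."""
    state = start
    for _ in range(steps):
        state = rng.choice(len(P), p=P[state])
    return state

i, j, n, m, trials = 0, 2, 2, 3, 20_000

# Direct estimate of p_ij^(n+m).
direct = np.mean([run(i, n + m) == j for _ in range(trials)])

# Estimates of p_ik^(n) and p_kj^(m), combined as sum_k p_ik^(n) p_kj^(m).
p_ik = np.array([np.mean([run(i, n) == k for _ in range(trials)]) for k in range(3)])
p_kj = np.array([np.mean([run(k, m) == j for _ in range(trials)]) for k in range(3)])

print(direct, p_ik @ p_kj)  # both approximate p_ij^(n+m)
```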