Continuous-Time Markov Chain
Definition
A Continuous-Time Markov Chain (CTMC) is a stochastic process $\left\{ X_{t} \right\}$ with a countable state space that satisfies the following Markov property for every finite sequence of time points $0 \le t_{0} \le \cdots \le t_{n} \le t_{n+1}$: $$ P \left( X_{t_{n+1}} = j \mid X_{t_{n}} = i , X_{t_{n-1}} = k , \cdots , X_{t_{0}} = l \right) = P \left( X_{t_{n+1}} = j \mid X_{t_{n}} = i \right) $$
Explanation
The definition extends the concept of a Markov chain naturally: to qualify as a continuous-time Markov chain, the Markov property must hold for every finite sequence of time points $0 \le t_{0} \le \cdots \le t_{n+1}$, not merely at integer steps; otherwise it is simply an ordinary (discrete-time) Markov chain. The term used in contrast to CTMC is Discrete-Time Markov Chain (DTMC).
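A standard way to realize a CTMC in practice is to draw an exponentially distributed holding time in the current state, then jump according to the transition rates; the memorylessness of the exponential distribution is what makes the resulting process Markov in continuous time. The following is a minimal sketch under assumed, made-up rates for a hypothetical two-state chain (the rate table `RATES` and function name `simulate_ctmc` are illustrative, not from the text):

```python
import random

# Hypothetical two-state CTMC: from state 0 jump to 1 at rate 2.0,
# from state 1 jump back to 0 at rate 1.0 (rates chosen for illustration).
RATES = {0: [(1, 2.0)], 1: [(0, 1.0)]}

def simulate_ctmc(start, horizon, seed=0):
    """Simulate one sample path on [0, horizon].

    In each state, wait an Exp(total outgoing rate) holding time
    (memoryless, so the past does not matter), then jump to a
    neighbor chosen with probability proportional to its rate.
    """
    rng = random.Random(seed)
    t, state = 0.0, start
    path = [(t, state)]
    while True:
        transitions = RATES[state]
        total_rate = sum(r for _, r in transitions)
        t += rng.expovariate(total_rate)   # exponential holding time
        if t >= horizon:
            break
        u = rng.random() * total_rate      # pick next state by rate
        for nxt, r in transitions:
            u -= r
            if u <= 0:
                state = nxt
                break
        path.append((t, state))
    return path

path = simulate_ctmc(start=0, horizon=10.0)
```

Each entry of `path` records a jump time and the state entered at that time, so the path is piecewise constant between jumps, as a CTMC trajectory should be.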