Continuous Markov Chain
Definition
A Continuous-Time Markov Chain (CTMC) is a continuous-time stochastic process $\left\{ X_{t} : t \ge 0 \right\}$ with a countable state space $S$ that satisfies the Markov property for all finite sequences of time points $0 \le t_{0} < t_{1} < \cdots < t_{n} < t_{n+1}$ and all states $i_{0}, \dots, i_{n}, j \in S$:
$$ P \left( X_{t_{n+1}} = j \mid X_{t_{0}} = i_{0}, \dots, X_{t_{n}} = i_{n} \right) = P \left( X_{t_{n+1}} = j \mid X_{t_{n}} = i_{n} \right) $$
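As a concrete illustration of the definition, the following is a minimal sketch that simulates a two-state CTMC by drawing exponential holding times and flipping the state at each jump. The function name simulate_ctmc and the rates q01, q10 are assumptions made for this example, not part of the definition above.

```python
import random

def simulate_ctmc(q01=1.0, q10=0.5, t_max=10.0, seed=0):
    """Simulate a two-state CTMC on {0, 1} up to time t_max.

    q01, q10 are the assumed transition rates 0 -> 1 and 1 -> 0.
    Returns a list of (jump_time, state) pairs.
    """
    rng = random.Random(seed)
    t, state = 0.0, 0
    path = [(0.0, state)]
    while True:
        rate = q01 if state == 0 else q10
        # The holding time in the current state is exponentially distributed
        # with that state's exit rate; this gives the memoryless (Markov) property.
        t += rng.expovariate(rate)
        if t > t_max:
            break
        state = 1 - state  # with only two states, every jump flips the state
        path.append((t, state))
    return path

if __name__ == "__main__":
    for time, s in simulate_ctmc():
        print(f"t = {time:.3f}, state = {s}")
```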
Explanation
A CTMC arises naturally from the concept of a Markov Chain: to qualify as a continuous-time Markov chain, the Markov property must hold for every finite sequence of time points, not merely at discrete steps; otherwise the process is simply an ordinary Markov chain. The term used in contrast to CTMC is the Discrete-Time Markov Chain (DTMC). Observing a CTMC only at its jump times yields such a discrete-time chain, as in the sketch below.
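To make the contrast with a Discrete-Time Markov Chain concrete, one can discard the continuous time stamps and keep only the sequence of visited states (the embedded jump chain), which is a DTMC. This short usage sketch assumes the hypothetical simulate_ctmc function from the example above is in scope.

```python
# Dropping the jump times from the simulated path leaves an ordinary
# discrete-time Markov chain (the embedded jump chain).
path = simulate_ctmc(t_max=5.0)
jump_chain = [state for _, state in path]
print(jump_chain)  # states visited in order, with no time information
```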