A Markov chain that evolves in continuous time, meaning the probability of moving from one state to another depends on the length of time that has elapsed rather than on a count of discrete steps.
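For illustration, a minimal Python sketch of simulating such a chain: the time spent in each state is exponentially distributed, so the chance of having moved after an interval depends only on that interval's length. The two-state setup and the rate matrix `Q` below are assumptions made for the example, not taken from the source.

```python
import random

# Hypothetical generator (rate) matrix Q for a two-state chain, chosen only
# for illustration: off-diagonal entries are transition rates, rows sum to 0.
Q = [
    [-0.5, 0.5],   # from state 0: leave at total rate 0.5
    [1.0, -1.0],   # from state 1: leave at total rate 1.0
]

def simulate_ctmc(q, start, t_end, seed=0):
    """Simulate one path of a continuous-time Markov chain up to time t_end.

    The chain stays in state i for an exponentially distributed holding
    time with rate -q[i][i], then jumps to state j with probability
    q[i][j] / (-q[i][i]).
    """
    rng = random.Random(seed)
    t, state = 0.0, start
    path = [(t, state)]
    while True:
        exit_rate = -q[state][state]
        t += rng.expovariate(exit_rate)          # exponential holding time
        if t >= t_end:
            break
        # Choose the next state in proportion to its transition rate.
        rates = [r if j != state else 0.0 for j, r in enumerate(q[state])]
        state = rng.choices(range(len(q)), weights=rates)[0]
        path.append((t, state))
    return path

print(simulate_ctmc(Q, start=0, t_end=10.0))
```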