Reducible Markov Chain


A Markov chain that is not irreducible, i.e., one in which at least one state cannot be reached from some other state in any number of steps. Equivalently, the directed graph whose edges are the positive-probability transitions is not strongly connected, so the state space splits into parts between which some transitions can never occur.
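The reachability condition above can be checked directly from a transition matrix. Below is a minimal sketch (the function name is_reducible and the example matrix P are illustrative, not from the original entry): it treats every positive entry of P as a directed edge and reports the chain as reducible whenever some state fails to reach all others.

```python
import numpy as np

def is_reducible(P, tol=1e-12):
    """Return True if the Markov chain with transition matrix P is reducible.

    The chain is reducible when some state j cannot be reached from some
    state i in any number of steps, i.e. the directed graph formed by the
    positive entries of P is not strongly connected.
    """
    P = np.asarray(P, dtype=float)
    n = P.shape[0]
    adj = P > tol                      # edge i -> j whenever P[i, j] > 0

    def reachable_from(start):
        # Depth-first search over positive-probability transitions.
        seen = {start}
        stack = [start]
        while stack:
            i = stack.pop()
            for j in range(n):
                if adj[i, j] and j not in seen:
                    seen.add(j)
                    stack.append(j)
        return seen

    # Irreducible iff every state reaches every other state.
    return any(len(reachable_from(i)) < n for i in range(n))

# Example: states 0 and 1 can never reach state 2, so the chain is reducible.
P = [[0.5, 0.5, 0.0],
     [0.7, 0.3, 0.0],
     [0.2, 0.3, 0.5]]
print(is_reducible(P))  # True
```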