Markov Chains


A Markov chain is a stochastic process in which the probability of moving to the next state depends only on the current state, not on the earlier history of the process (the Markov property). The state space is discrete (finite or countable), and time usually advances in discrete steps.
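The Markov property can be sketched with a small simulation. The following is a minimal illustration, not taken from the source: the two-state "weather" chain, its state names, and its transition probabilities are all invented for the example. Note that each next state is sampled using only the current state.

```python
import random

# Hypothetical two-state chain (states and probabilities are illustrative).
STATES = ["sunny", "rainy"]
P = {
    "sunny": {"sunny": 0.9, "rainy": 0.1},  # rows sum to 1
    "rainy": {"sunny": 0.5, "rainy": 0.5},
}

def step(state, rng):
    """Sample the next state from P[state] -- the Markov property:
    only the current state matters, not the path taken to reach it."""
    r = rng.random()
    cum = 0.0
    for nxt, p in P[state].items():
        cum += p
        if r < cum:
            return nxt
    return nxt  # guard against floating-point rounding

def simulate(start, n, seed=0):
    """Run the chain for n steps from a given start state."""
    rng = random.Random(seed)
    chain = [start]
    for _ in range(n):
        chain.append(step(chain[-1], rng))
    return chain

print(simulate("sunny", 5))
```

Because the transition probabilities are fixed per state, long simulated runs of a chain like this settle toward a stationary distribution over the states, regardless of the start state.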