# Markov chain


## Markov chain

[′mar‚kȯf ‚chān]
(mathematics)
A Markov process whose state space is finite or countably infinite.
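A countably infinite state space is allowed by this definition. As an illustration (not part of the original entry), a simple random walk on the integers is a Markov chain whose state space is countably infinite, yet each step still depends only on the current state:

```python
import random

def random_walk_step(state, p=0.5):
    # From state n, move to n+1 with probability p, otherwise to n-1.
    # The state space (all integers) is countably infinite, but the
    # Markov property holds: the next state depends only on the current one.
    return state + 1 if random.random() < p else state - 1

random.seed(0)
state = 0
for _ in range(10):
    state = random_walk_step(state)
print(state)
```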

## Markov Chain

A concept in probability theory that emerged from the work of the Russian mathematician A. A. Markov (the elder) on sequences of dependent trials and on sums of random variables associated with them. The development of the theory of Markov chains facilitated the creation of the general theory of Markov processes.

## Markov chain

(probability)
(Named after Andrei Markov) A model of sequences of events in which the probability of the next event depends only on the current state, not on the full history of preceding events.

A Markov process is governed by a Markov chain.
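A minimal sketch of such a chain (the states and transition probabilities below are illustrative, not taken from the entry) represents the transitions as a table of probabilities and samples each next state using only the current one:

```python
import random

# Hypothetical two-state chain: each row lists the probabilities of
# moving to each next state given the current state.
TRANSITIONS = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def next_state(current):
    # Sample the next state from the row for the current state --
    # the defining Markov property: no earlier history is consulted.
    r = random.random()
    cumulative = 0.0
    for state, p in TRANSITIONS[current]:
        cumulative += p
        if r < cumulative:
            return state
    return state  # guard against floating-point rounding

def simulate(start, steps):
    path = [start]
    for _ in range(steps):
        path.append(next_state(path[-1]))
    return path
```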

In simulation, the principle of the Markov chain is applied to draw samples from a probability density function that drives the model. Simscript II.5 uses this approach for some modelling functions.
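A classic instance of sampling from a density via a Markov chain is the Metropolis algorithm (shown here as a generic illustration, not as Simscript II.5's actual mechanism): each proposed sample depends only on the current one, so the sequence of samples forms a Markov chain whose stationary distribution is the target density.

```python
import math
import random

def metropolis_sample(log_density, start, n_samples, step=1.0):
    # Metropolis algorithm: a Markov chain whose long-run distribution
    # matches the target density. Each proposal depends only on the
    # current sample, never on earlier ones.
    samples = []
    x = start
    logp_x = log_density(x)
    for _ in range(n_samples):
        proposal = x + random.uniform(-step, step)
        logp_prop = log_density(proposal)
        # Accept with probability min(1, p(proposal)/p(x)),
        # computed in log space for numerical stability.
        if math.log(random.random()) < logp_prop - logp_x:
            x, logp_x = proposal, logp_prop
        samples.append(x)
    return samples

# Example target: a standard normal density, up to a constant factor.
def log_normal(x):
    return -0.5 * x * x
```

Running the chain for a few thousand steps yields samples whose mean and spread approximate the target distribution.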
