# Markov chain


## Markov chain

[′mar‚kȯf ‚chān] (mathematics)

A Markov process whose state space is finite or countably infinite.

McGraw-Hill Dictionary of Scientific & Technical Terms, 6E, Copyright © 2003 by The McGraw-Hill Companies, Inc.

The following article is from

*The Great Soviet Encyclopedia* (1979). It might be outdated or ideologically biased.

## Markov Chain

a concept in probability theory that emerged from the work of the Russian mathematician A. A. Markov (the elder) on sequences of dependent trials and the sums of random variables associated with them. The development of the theory of Markov chains facilitated the creation of the general theory of Markov processes.

The Great Soviet Encyclopedia, 3rd Edition (1970-1979). © 2010 The Gale Group, Inc. All rights reserved.

## Markov chain

(probability) (Named after Andrei Markov) A model of sequences of events in which the probability of the next event depends only on the event immediately preceding it.

A Markov process is governed by a Markov chain.
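The dependence described above can be captured by a transition table giving, for each current state, the probabilities of the possible next states. A minimal sketch, assuming an illustrative two-state "weather" chain (the states and probabilities are not from the source):

```python
import random

# Illustrative transition probabilities: each row sums to 1, and the
# next state depends only on the current state (the Markov property).
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def simulate(start, steps, rng=random):
    """Walk the chain for `steps` transitions, returning all visited states."""
    state = start
    path = [state]
    for _ in range(steps):
        row = TRANSITIONS[state]
        # Sample the next state according to the current state's row.
        state = rng.choices(list(row), weights=list(row.values()))[0]
        path.append(state)
    return path

path = simulate("sunny", 10)
```

Running `simulate` many times and counting visits approximates the chain's long-run (stationary) distribution over the states.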

In simulation, the principle of the Markov chain is applied to the selection of samples from a probability density function to be applied to the model. Simscript II.5 uses this approach for some modelling functions.
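One standard instance of this principle is the Metropolis algorithm (not described in the source, but a well-known way of applying a Markov chain to sampling): the chain's states are candidate samples, and transitions are accepted or rejected so that the chain's long-run distribution matches the target density. A hedged sketch, using an illustrative unnormalised standard normal as the target:

```python
import math
import random

def target_density(x):
    # Unnormalised N(0, 1) density; any positive function proportional
    # to the target distribution works for the acceptance ratio below.
    return math.exp(-x * x / 2.0)

def metropolis(n_samples, step=1.0, rng=random):
    """Draw n_samples via a Metropolis random walk over the real line."""
    x = 0.0
    samples = []
    for _ in range(n_samples):
        proposal = x + rng.uniform(-step, step)  # symmetric proposal
        # Accept with probability min(1, p(proposal) / p(x));
        # on rejection the chain stays at the current state.
        if rng.random() < target_density(proposal) / target_density(x):
            x = proposal
        samples.append(x)
    return samples
```

Because only a ratio of densities is needed, the normalising constant of the target never has to be computed, which is what makes this approach practical in simulation.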


This article is provided by FOLDOC - Free Online Dictionary of Computing (**foldoc.org**).