Markov chain


Markov chain

[′mar‚kȯf ‚chān]
A Markov process whose state space is finite or countably infinite.
McGraw-Hill Dictionary of Scientific & Technical Terms, 6E, Copyright © 2003 by The McGraw-Hill Companies, Inc.
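The definition above can be made concrete with a small simulation. The sketch below is illustrative (the two-state "weather" chain and its probabilities are invented for the example, not taken from the dictionary entry): the next state is sampled using only the current state, which is the defining Markov property.

```python
import random

# Hypothetical 2-state chain; states and probabilities are illustrative.
states = ["sunny", "rainy"]
P = {
    "sunny": {"sunny": 0.9, "rainy": 0.1},
    "rainy": {"sunny": 0.5, "rainy": 0.5},
}

def step(state, rng=random):
    """Sample the next state; it depends only on the current state."""
    r = rng.random()
    cum = 0.0
    for nxt, p in P[state].items():
        cum += p
        if r < cum:
            return nxt
    return nxt  # guard against floating-point rounding

def simulate(start, n, seed=0):
    """Run the chain for n steps from a given start state."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(n):
        path.append(step(path[-1], rng))
    return path
```

Because the state space here is finite, this is a Markov chain in the sense defined above: a Markov process on a finite (or countably infinite) state space.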
The following article is from The Great Soviet Encyclopedia (1979). It might be outdated or ideologically biased.

Markov Chain


a concept in probability theory that emerged from the work of the Russian mathematician A. A. Markov (the elder) on sequences of dependent trials and sums of random variables associated with them. The development of the theory of Markov chains facilitated the creation of the general theory of Markov processes.

The Great Soviet Encyclopedia, 3rd Edition (1970-1979). © 2010 The Gale Group, Inc. All rights reserved.

Markov chain

(Named after Andrei Markov) A model of sequences of events in which the probability of the next event depends only on the immediately preceding event.

A Markov process is governed by a Markov chain.

In simulation, the principle of the Markov chain is applied to the selection of samples from a probability density function to be applied to the model. Simscript II.5 uses this approach for some modelling functions.
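One standard way a Markov chain is used to draw samples from a target distribution is the Metropolis algorithm: the chain's long-run state frequencies approach the target. The sketch below is a minimal illustration of that principle, not Simscript II.5's actual mechanism; the target weights and the uniform proposal are assumptions made for the example.

```python
import random

# Target distribution over four states (unnormalized weights also work).
target = [0.1, 0.2, 0.3, 0.4]

def metropolis(n_samples, seed=1):
    """Run a Metropolis chain whose stationary distribution is `target`."""
    rng = random.Random(seed)
    x = 0
    out = []
    for _ in range(n_samples):
        y = rng.randrange(len(target))          # symmetric uniform proposal
        if rng.random() < target[y] / target[x]:
            x = y                                # accept the proposed move
        out.append(x)
    return out

samples = metropolis(50000)
freqs = [samples.count(i) / len(samples) for i in range(4)]
```

After many steps the empirical frequencies `freqs` come close to the target probabilities (0.1, 0.2, 0.3, 0.4), which is the sense in which sample selection is "governed by" a Markov chain.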

This article is provided by FOLDOC - Free Online Dictionary of Computing.
References in periodicals archive
The weights for the weighted Markov chain model are computed by standardizing the Kappa coefficient κ_wt computed by the following equation:
The corresponding Markov chain model takes values in the state space S = {G_0, G_1, G_23, Y} (Figure 1(c)) and is described by the following generating matrix, [mathematical expression not reproducible]:
Markov Chains. Define a finite set [mathematical expression not reproducible], where ⊕ represents a novel operation mark among matrices, and Γ is a set corresponding to [mathematical expression not reproducible].
A Markov chain is homogeneous if all transition probabilities are independent of time.
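For a time-homogeneous chain, the same transition matrix P applies at every step, so the n-step transition probabilities are simply the matrix power P^n. The sketch below uses an illustrative 2×2 matrix (not from the snippet above); raising it to a high power shows the rows converging to the chain's stationary distribution.

```python
# Plain-Python matrix multiply to avoid external dependencies.
def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

# Illustrative time-homogeneous transition matrix (rows sum to 1).
P = [[0.9, 0.1],
     [0.5, 0.5]]

# Compute P^50; both rows approach the stationary distribution (5/6, 1/6).
Pn = P
for _ in range(49):
    Pn = matmul(Pn, P)
```

Solving π = πP for this matrix by hand gives π = (5/6, 1/6), which the rows of P^50 match to high precision; for an inhomogeneous chain no single matrix power would describe the n-step behavior.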
In the case Δr = 1, the mean number d_j of transitions needed by the Markov chain, starting from j, to reach either 1 or N can be expressed as follows:
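The hitting-time quantity d_j can be illustrated on a simple special case: a symmetric random walk on {1, ..., N} with absorbing boundaries at 1 and N. This walk is an assumption made for the example (the snippet above does not specify its chain); for it, d_j satisfies d_j = 1 + (d_{j-1} + d_{j+1})/2 with d_1 = d_N = 0, whose closed form is d_j = (j-1)(N-j). The sketch solves the system by simple sweep iteration.

```python
# Mean number of steps for a symmetric random walk on {1,...,N},
# started at j, to reach boundary 1 or N (both absorbing).
def hitting_times(N, sweeps=5000):
    d = [0.0] * (N + 1)                  # d[1] = d[N] = 0 stay fixed
    for _ in range(sweeps):
        for j in range(2, N):
            # Interior recurrence: one step plus the average of neighbors.
            d[j] = 1.0 + 0.5 * (d[j - 1] + d[j + 1])
    return d
```

For N = 6 the iteration converges to d_j = (j-1)(6-j), e.g. d_3 = 6.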
Series expansion techniques for Markov chains go by different names in literature, including perturbation techniques, the power series algorithm, and light-traffic approximations.
Let {Y_n, n = 1, 2, ..., L} be an n-order Markov chain on a finite set, where Y_n is the n-indexed set of pixels obtained by a row, column, zigzag, or Hilbert scanning method.
It is also observed that the posterior variance and the convergence of the Markov chain can be used as good measures of parameter accuracy in posterior statistics.
Analysis of a Markov Chain Model of a Multistage Manufacturing System with Inspection, Rejection, and Rework.
These characterizations are given in terms of subgraphs of the underlying graph of the Markov chain: For the variance-covariance matrix, we only have to consider all cycles.
The statistical method best suited for the analysis of facies transitions data in stratigraphic sections is the Markov Chain Transition Matrix.