Markov chain


Markov chain

[′mar‚kȯf ‚chān]
A Markov process whose state space is finite or countably infinite.
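As a minimal illustration of this definition (the two-state chain below is hypothetical, not from any of the cited sources), a Markov chain on a finite state space is fully specified by a row-stochastic transition matrix, and the distribution over states after n steps follows by matrix powers:

```python
import numpy as np

# Hypothetical two-state chain (e.g. "sunny"/"rainy"): the next state
# depends only on the current state, via a row-stochastic matrix P.
P = np.array([[0.9, 0.1],   # transition probabilities from state 0
              [0.5, 0.5]])  # transition probabilities from state 1

# The distribution after n steps from an initial distribution pi0 is pi0 @ P^n.
pi0 = np.array([1.0, 0.0])           # start in state 0 with certainty
pi3 = pi0 @ np.linalg.matrix_power(P, 3)
print(pi3)        # distribution over the two states after 3 steps
print(pi3.sum())  # rows of P sum to 1, so the distribution stays normalized
```

Running this gives the exact three-step distribution (0.844, 0.156); no simulation is needed because the chain's law is determined entirely by P and the starting distribution.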
McGraw-Hill Dictionary of Scientific & Technical Terms, 6E, Copyright © 2003 by The McGraw-Hill Companies, Inc.
The following article is from The Great Soviet Encyclopedia (1979). It might be outdated or ideologically biased.

Markov Chain


a concept in probability theory that emerged from the works of the Russian mathematician A. A. Markov (the elder) that dealt with the study of sequences of dependent trials and sums of random variables associated with them. The development of the theory of Markov chains facilitated the creation of the general theory of Markov processes.

The Great Soviet Encyclopedia, 3rd Edition (1970-1979). © 2010 The Gale Group, Inc. All rights reserved.

Markov chain

(Named after Andrei Markov) A model of sequences of events in which the probability of each event depends only on the event that immediately preceded it (the Markov property).

A Markov process is governed by a Markov chain.

In simulation, the principle of the Markov chain is applied to the selection of samples from a probability density function to be applied to the model. Simscript II.5 uses this approach for some modelling functions.
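The sampling idea above can be sketched as follows. This is an illustrative simulation, not Simscript II.5's actual mechanism: the chain, its states, and the probabilities are invented for the example. Long-run visit frequencies of a simulated chain approximate its stationary distribution:

```python
import random

# Illustrative two-state chain; each row lists (next_state, probability).
P = {
    "idle": [("idle", 0.7), ("busy", 0.3)],
    "busy": [("idle", 0.4), ("busy", 0.6)],
}

def step(state, rng):
    """Draw the next state with probabilities given by the current row."""
    states, weights = zip(*P[state])
    return rng.choices(states, weights=weights)[0]

rng = random.Random(42)          # fixed seed for reproducibility
state = "idle"
counts = {"idle": 0, "busy": 0}
for _ in range(100_000):
    state = step(state, rng)
    counts[state] += 1

# For this chain the stationary distribution is (4/7, 3/7) ~ (0.571, 0.429),
# and the empirical frequency should land close to it.
print(counts["idle"] / 100_000)
```

The same pattern underlies Markov chain Monte Carlo methods, where the chain is constructed so that its stationary distribution matches a target density to be sampled.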

This article is provided by FOLDOC - Free Online Dictionary of Computing.
References in periodicals archive
The discrete-time Markov chains are solved to calculate the medium access probabilities of all user priorities.
Based on the normalization condition of discrete-time Markov chains, the sum of all state probabilities equals 1, which leads to the set of eight equations:
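The role of the normalization condition can be shown on a small example (the eight-equation system from the cited work is not reproduced; the 3x3 transition matrix below is illustrative). The balance equations pi = pi P are linearly dependent because the rows of P sum to 1, so one of them is replaced by sum(pi) = 1 to pin down a unique solution:

```python
import numpy as np

# Illustrative row-stochastic transition matrix for a 3-state chain.
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.1, 0.3, 0.6]])

n = P.shape[0]
A = P.T - np.eye(n)   # balance equations: (P^T - I) pi = 0
A[-1, :] = 1.0        # replace one redundant equation with normalization
b = np.zeros(n)
b[-1] = 1.0           # ... whose right-hand side is 1
pi = np.linalg.solve(A, b)
print(pi, pi.sum())   # stationary probabilities, summing to 1
```

The resulting vector satisfies both pi @ P == pi (up to floating-point error) and pi.sum() == 1, which is exactly the combination of balance and normalization conditions described above.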
The discussion is organized into chapters dealing with introductory matters, transient behavior and discrete-time Markov chains, first passage times and discrete-time Markov chains, limiting behavior and discrete-time Markov chains, Poisson processes, continuous-time Markov chains, queuing models, renewal processes, Markov regenerative processes, and diffusion processes.
