Markov chain


Markov chain

[′mar‚kȯf ‚chān]
(mathematics)
A Markov process whose state space is finite or countably infinite.
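A chain over a finite state space can be simulated directly from its transition probabilities. The sketch below is purely illustrative: the two-state "weather" chain and its probabilities are invented for this example, not drawn from the entry above.

```python
import random

# Hypothetical two-state chain; the transition probabilities are made up.
TRANSITIONS = {
    "sunny": [("sunny", 0.9), ("rainy", 0.1)],
    "rainy": [("sunny", 0.5), ("rainy", 0.5)],
}

def step(state, rng):
    """Advance one step: the next state depends only on the current state."""
    r = rng.random()
    cumulative = 0.0
    for nxt, p in TRANSITIONS[state]:
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding

def simulate(start, n, seed=0):
    """Return a sample path of length n + 1 starting from `start`."""
    rng = random.Random(seed)
    state = start
    path = [state]
    for _ in range(n):
        state = step(state, rng)
        path.append(state)
    return path

print(simulate("sunny", 5))
```

Because the state space here is finite, this is a Markov chain in the sense of the definition above; a countably infinite state space (e.g. a random walk on the integers) would qualify equally.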

Markov Chain

a concept in probability theory that emerged from the work of the Russian mathematician A. A. Markov (the elder) on sequences of dependent trials and the sums of random variables associated with them. The development of the theory of Markov chains facilitated the creation of the general theory of Markov processes.

Markov chain

(probability)
(Named after the Russian mathematician Andrei Markov) A model of a sequence of events in which the probability of each event depends only on the immediately preceding event, not on the earlier history (the Markov property).

A Markov chain is a Markov process with a discrete (finite or countably infinite) state space.
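For a finite-state chain, the one-step probabilities form a transition matrix, and matrix powers give multi-step probabilities; for a well-behaved chain the rows converge to the stationary distribution. A minimal sketch, using a hypothetical 2x2 matrix chosen for this example:

```python
def mat_mul(a, b):
    """Multiply two matrices given as nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))]
            for i in range(len(a))]

# Hypothetical transition matrix: row i holds P(next = j | current = i).
P = [[0.9, 0.1],
     [0.5, 0.5]]

# P^50: entry (i, j) is the probability of being in state j
# fifty steps after starting in state i.
Pn = P
for _ in range(49):
    Pn = mat_mul(Pn, P)

print(Pn[0])  # both rows are now close to the stationary distribution
```

For this particular matrix the stationary distribution works out to (5/6, 1/6), which can be checked by solving pi = pi P directly.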

In simulation, the principle of the Markov chain is applied to drawing samples from a probability density function that feeds the model. Simscript II.5 uses this approach for some of its modelling functions.

