Bayes' theorem



Bayes' theorem

[¦bāz ′thir·əm]
(mathematics)
A theorem stating that the probability of a hypothesis, given the original data and some new data, is proportional to the product of the probability of the hypothesis given the original data only and the probability of the new data given the original data and the hypothesis. Also known as inverse probability principle.
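In symbols, writing H for the hypothesis, D for the original data, and D′ for the new data (notation introduced here only to restate the definition above), this reads:

    P(H | D, D′) ∝ P(H | D) · P(D′ | D, H)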
References in periodicals archive
Reference 2 discusses the application of Bayes' Theorem to a horse-racing example.
Bayes' theorem tells us that the value of a piece of evidence in testing a particular assertion is determined by its likelihood ratio.
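For reference, the likelihood ratio mentioned here is conventionally the ratio of how probable the evidence is if the assertion holds to how probable it is if the assertion does not hold:

    LR = P(evidence | assertion true) / P(evidence | assertion false)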
Let's analyze SPOT using Bayes' Theorem and some numerical approximations and conservative assumptions.
That being the case, Bayes' theorem tells us that when the level of statistical significance is set at P <0.
(5) Bayes' theorem postulates that the probability of an event E occurring, conditioned on a hypothesis H, is proportional to the product of the probability of E and the probability of H conditioned on E; equivalently, the probability of E conditioned on H changes in the same direction and by the same magnitude as the probability of H conditioned on E (see de Finetti (1931a)(1931b)).
Bayes' Theorem can also be written as Equation 3 below, where α consists of the set of m parameters of the cumulative (unknown) distribution Φ(x | α).
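The excerpt's Equation 3 is not reproduced in the snippet. As a hedged sketch, a parametric form of Bayes' theorem for a parameter set α given an observation x, with L denoting the likelihood implied by Φ and p(α) a prior density over the parameters (both names assumed here rather than taken from the article), typically looks like:

    p(α | x) = L(x | α) p(α) / ∫ L(x | α′) p(α′) dα′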
In general, Bayes' theorem tells us that new information should be interpreted in light of what is already known.
When we observe the training data D, we adjust the prior density to a posterior density using Bayes' theorem (level-1 inference).
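The "level-1 inference" setting and the particular densities involved are specific to the cited source and not reproduced here. As a minimal generic sketch of adjusting a prior density to a posterior density after observing training data D, the Python example below uses an assumed conjugate Beta prior over a Bernoulli success probability, with invented data:

    # Generic prior-to-posterior update: a Beta prior over a Bernoulli success
    # probability is updated in closed form after observing training data D.
    # The Beta(2, 2) prior and the data below are illustrative assumptions.
    alpha_prior, beta_prior = 2.0, 2.0      # Beta prior density over theta
    D = [1, 0, 1, 1, 0, 1, 1, 1]            # observed training data (1 = success)

    successes = sum(D)
    failures = len(D) - successes

    # Bayes' theorem with a conjugate prior: the posterior is again a Beta density.
    alpha_post = alpha_prior + successes
    beta_post = beta_prior + failures

    posterior_mean = alpha_post / (alpha_post + beta_post)
    print(f"posterior: Beta({alpha_post:.0f}, {beta_post:.0f}), mean = {posterior_mean:.3f}")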
In technical terms, Bayes' Theorem states that the subjective posterior odds (odds after being exposed to new data) (32) that a hypothesis is true can be determined by multiplying the prior odds (or odds before exposure to the new data) (33) by the ratio of (1) the probability that the data would have been observed if the hypothesis were true to (2) the probability that the data would have been observed if the hypothesis were not true.
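A minimal numerical sketch of this odds form, with all figures invented for illustration rather than taken from the cited article:

    # Odds form of Bayes' theorem: posterior odds = prior odds * likelihood ratio.
    # Every number below is hypothetical, chosen only to show the update.
    prior_odds = 1 / 4            # hypothesis judged 1:4 against before the new data
    p_data_if_true = 0.8          # P(data | hypothesis true)
    p_data_if_false = 0.08        # P(data | hypothesis false)

    likelihood_ratio = p_data_if_true / p_data_if_false   # about 10
    posterior_odds = prior_odds * likelihood_ratio        # about 2.5, i.e. 2.5:1 in favour

    # Convert odds to a probability for readability.
    posterior_prob = posterior_odds / (1 + posterior_odds)
    print(f"posterior odds = {posterior_odds:.2f}, posterior probability = {posterior_prob:.2f}")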
Bayes' theorem adds common sense to the maths used to work out how likely something is.
This article discusses ways to respond to the need for criterion-referenced information using Bayes' theorem, a method coupled with criterion-referenced testing in the early 1970s.
Bayes' Theorem makes it possible to infer the probability of an event by examining past events and new observations, and chaining the probabilities to maximize the amount of learned information that can be applied to a problem.
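As one hedged illustration of such chaining, pieces of evidence can be folded in one at a time, with each posterior serving as the prior for the next update; the likelihood values below are invented for the example and assume the observations are conditionally independent given the hypothesis:

    # Sequential Bayesian updating: fold in several pieces of evidence, letting each
    # posterior become the prior for the next update. The numbers are invented and the
    # observations are assumed conditionally independent given the hypothesis.
    prior = 0.10                                      # initial P(hypothesis)
    evidence = [(0.9, 0.2), (0.7, 0.3), (0.8, 0.4)]   # (P(e | H), P(e | not H)) per observation

    p = prior
    for p_e_given_h, p_e_given_not_h in evidence:
        numerator = p_e_given_h * p
        denominator = numerator + p_e_given_not_h * (1 - p)
        p = numerator / denominator                   # Bayes' theorem for this observation
        print(f"updated P(hypothesis | evidence so far) = {p:.3f}")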