List of probability topics
This is a list of probability topics, by Wikipedia page.
It overlaps with the (alphabetical) list of statistical topics. There are also the topic outline of probability, the catalog of articles in probability theory, the list of probabilists and the list of statisticians.
General aspects
- Probability
- Randomness, Pseudorandomness, Quasirandom
- Randomization, hardware random number generator
- Random number generator
- Random sequence
- Coin flipping/tossing; checking if a coin is biased
- Uncertainty
- Statistical dispersion
- Observational error
- Equiprobable
- Equipossible
- Average
- Probability interpretations
- Markovian
- Statistical regularity
- Central tendency
- Bean machine
- Relative frequency
- Frequency probability
- Maximum likelihood
- Bayesian probability
- Principle of indifference
- Cox's theorem
- Principle of maximum entropy
- Information entropy
- Urn problems
- Extractor
- Aleatoric, aleatoric music
- Free probability
- Exotic probability
- Schrödinger method
- Empirical measure
- Glivenko–Cantelli theorem
- Zero-one law
- Kolmogorov's zero-one law
- Hewitt–Savage zero-one law
- Law of Truly Large Numbers
- Littlewood's law
- Infinite monkey theorem
- Littlewood–Offord problem
- Inclusion–exclusion principle
- Impossible event
- Information geometry
- Talagrand's concentration inequality
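Several entries above (coin flipping, relative frequency, frequency probability) revolve around the frequentist idea that an event's probability is the limit of its relative frequency over many trials. A minimal simulation sketch, assuming a hypothetical coin with bias `p`:

```python
import random

def relative_frequency(p, n, seed=0):
    """Estimate P(heads) for a coin with true bias p as the relative
    frequency of heads over n simulated tosses."""
    rng = random.Random(seed)
    heads = sum(1 for _ in range(n) if rng.random() < p)
    return heads / n

# As n grows, the estimate settles near the true bias (law of large numbers).
estimate = relative_frequency(p=0.6, n=100_000)
```

Comparing such estimates for increasing `n` against 0.5 is also a crude way of checking whether a coin is biased.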
Foundations of probability theory
- Probability theory
- Probability space
- Sample space
- Standard probability space
- Random element
- Random compact set
- Dynkin system
- Probability axioms
- Normalizing constant
- Event (probability theory)
- Complementary event
- Elementary event
- Mutually exclusive
- Boole's inequality
- Probability density function
- Cumulative distribution function
- Law of total cumulance
- Law of total expectation
- Law of total probability
- Law of total variance
- Almost surely
- Cox's theorem
- Bayesianism
- Prior probability
- Posterior probability
- Borel's paradox
- Bertrand's paradox (probability)
- Coherence (philosophical gambling strategy)
- Dutch book
- Algebra of random variables
- Belief propagation
- Transferable belief model
- Dempster–Shafer theory
- Possibility theory
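Bayes' theorem and the law of total probability, both listed above, combine in the textbook posterior calculation P(A|B) = P(B|A)P(A)/P(B). A numeric sketch with made-up diagnostic-test numbers (the prior, sensitivity, and false-positive rate are illustration values, not from the source):

```python
def bayes_posterior(prior, sensitivity, false_positive_rate):
    """P(condition | positive test) via Bayes' theorem; the denominator
    expands P(positive) using the law of total probability."""
    evidence = sensitivity * prior + false_positive_rate * (1 - prior)
    return sensitivity * prior / evidence

# A rare condition plus an imperfect test yields a modest posterior.
posterior = bayes_posterior(prior=0.01, sensitivity=0.99, false_positive_rate=0.05)
```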
Random variables
- Discrete random variable
- Probability mass functionProbability mass functionIn probability theory and statistics, a probability mass function is a function that gives the probability that a discrete random variable is exactly equal to some value...
- Probability mass function
- Constant random variable
- Expected valueExpected valueIn probability theory, the expected value of a random variable is the weighted average of all possible values that this random variable can take on...
- Jensen's inequalityJensen's inequalityIn mathematics, Jensen's inequality, named after the Danish mathematician Johan Jensen, relates the value of a convex function of an integral to the integral of the convex function. It was proved by Jensen in 1906. Given its generality, the inequality appears in many forms depending on the context,...
- Jensen's inequality
- VarianceVarianceIn probability theory and statistics, the variance is a measure of how far a set of numbers is spread out. It is one of several descriptors of a probability distribution, describing how far the numbers lie from the mean . In particular, the variance is one of the moments of a distribution...
- Standard deviationStandard deviationStandard deviation is a widely used measure of variability or diversity used in statistics and probability theory. It shows how much variation or "dispersion" there is from the average...
- Geometric standard deviationGeometric standard deviationIn probability theory and statistics, the geometric standard deviation describes how spread out are a set of numbers whose preferred average is the geometric mean...
- Standard deviation
- Multivariate random variableMultivariate random variableIn mathematics, probability, and statistics, a multivariate random variable or random vector is a list of mathematical variables each of whose values is unknown, either because the value has not yet occurred or because there is imperfect knowledge of its value.More formally, a multivariate random...
- Joint probability distribution
- Marginal distributionMarginal distributionIn probability theory and statistics, the marginal distribution of a subset of a collection of random variables is the probability distribution of the variables contained in the subset. The term marginal variable is used to refer to those variables in the subset of variables being retained...
- Kirkwood approximationKirkwood approximationThe Kirkwood superposition approximation was introduced by Matsuda as a means of representing a discrete probability distribution. The name apparently refers to a 1942 paper by John G. Kirkwood...
- Independent identically-distributed random variables
- Independent and identically-distributed random variables
- Statistical independenceStatistical independenceIn probability theory, to say that two events are independent intuitively means that the occurrence of one event makes it neither more nor less probable that the other occurs...
- Conditional independenceConditional independenceIn probability theory, two events R and B are conditionally independent given a third event Y precisely if the occurrence or non-occurrence of R and the occurrence or non-occurrence of B are independent events in their conditional probability distribution given Y...
- Pairwise independencePairwise independenceIn probability theory, a pairwise independent collection of random variables is a set of random variables any two of which are independent. Any collection of mutually independent random variables is pairwise independent, but some pairwise independent collections are not mutually independent...
- CovarianceCovarianceIn probability theory and statistics, covariance is a measure of how much two variables change together. Variance is a special case of the covariance when the two variables are identical.- Definition :...
- Covariance matrixCovariance matrixIn probability theory and statistics, a covariance matrix is a matrix whose element in the i, j position is the covariance between the i th and j th elements of a random vector...
- De Finetti's theoremDe Finetti's theoremIn probability theory, de Finetti's theorem explains why exchangeable observations are conditionally independent given some latent variable to which an epistemic probability distribution would then be assigned...
- Conditional independence
- CorrelationCorrelationIn statistics, dependence refers to any statistical relationship between two random variables or two sets of data. Correlation refers to any of a broad class of statistical relationships involving dependence....
- UncorrelatedUncorrelatedIn probability theory and statistics, two real-valued random variables are said to be uncorrelated if their covariance is zero. Uncorrelatedness is by definition pairwise; i.e...
- Correlation functionCorrelation functionA correlation function is the correlation between random variables at two different points in space or time, usually as a function of the spatial or temporal distance between the points...
- Uncorrelated
- Canonical correlationCanonical correlationIn statistics, canonical correlation analysis, introduced by Harold Hotelling, is a way of making sense of cross-covariance matrices. If we have two sets of variables, x_1, \dots, x_n and y_1, \dots, y_m, and there are correlations among the variables, then canonical correlation analysis will...
- Convergence of random variablesConvergence of random variablesIn probability theory, there exist several different notions of convergence of random variables. The convergence of sequences of random variables to some limit random variable is an important concept in probability theory, and its applications to statistics and stochastic processes...
- Weak convergence of measures
- Helly–Bray theoremHelly–Bray theoremIn probability theory, the Helly–Bray theorem relates the weak convergence of cumulative distribution functions to the convergence of expectations of certain measurable functions. It is named after Eduard Helly and Hubert Evelyn Bray....
- Slutsky's theoremSlutsky's theoremIn probability theory, Slutsky’s theorem extends some properties of algebraic operations on convergent sequences of real numbers to sequences of random variables.The theorem was named after Eugen Slutsky. Slutsky’s theorem is also attributed to Harald Cramér....
- Helly–Bray theorem
- Skorokhod's representation theoremSkorokhod's representation theoremIn mathematics and statistics, Skorokhod's representation theorem is a result that shows that a weakly convergent sequence of probability measures whose limit measure is sufficiently well-behaved can be represented as the distribution/law of a pointwise convergent sequence of random variables...
- Lévy's continuity theoremLévy's continuity theoremIn probability theory, the Lévy’s continuity theorem, named after the French mathematician Paul Lévy, connects convergence in distribution of the sequence of random variables with pointwise convergence of their characteristic functions...
- Uniform integrability
- Weak convergence of measures
- Markov's inequalityMarkov's inequalityIn probability theory, Markov's inequality gives an upper bound for the probability that a non-negative function of a random variable is greater than or equal to some positive constant...
- Chebyshev's inequalityChebyshev's inequalityIn probability theory, Chebyshev’s inequality guarantees that in any data sample or probability distribution,"nearly all" values are close to the mean — the precise statement being that no more than 1/k2 of the distribution’s values can be more than k standard deviations away from the mean...
= Chernoff boundChernoff boundIn probability theory, the Chernoff bound, named after Herman Chernoff, gives exponentially decreasing bounds on tail distributions of sums of independent random variables... - Chernoff's inequality
- Bernstein inequalities (probability theory)
- Hoeffding's inequalityHoeffding's inequalityIn probability theory, Hoeffding's inequality provides an upper bound on the probability for the sum of random variables to deviate from its expected value. Hoeffding's inequality was proved by Wassily Hoeffding.LetX_1, \dots, X_n \!...
- Hoeffding's inequality
- Kolmogorov's inequalityKolmogorov's inequalityIn probability theory, Kolmogorov's inequality is a so-called "maximal inequality" that gives a bound on the probability that the partial sums of a finite collection of independent random variables exceed some specified bound...
- Etemadi's inequalityEtemadi's inequalityIn probability theory, Etemadi's inequality is a so-called "maximal inequality", an inequality that gives a bound on the probability that the partial sums of a finite collection of independent random variables exceed some specified bound...
- Khintchine inequality
- Paley–Zygmund inequality
- Laws of large numbers
- Asymptotic equipartition property
- Typical set
- Law of large numbers
- Random field
- Conditional random field
- Borel–Cantelli lemma
- Wick product
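As an aside, the law of large numbers listed above lends itself to a minimal simulation sketch (illustrative only; the sample sizes are arbitrary choices): the sample mean of i.i.d. fair coin flips approaches 1/2 as the number of flips grows.

```python
import random

# Illustrative sketch of the weak law of large numbers:
# the sample mean of i.i.d. Bernoulli(1/2) trials approaches 1/2.
random.seed(0)

def sample_mean(n_flips):
    """Average of n_flips fair coin flips (1 = heads, 0 = tails)."""
    return sum(random.randint(0, 1) for _ in range(n_flips)) / n_flips

large_sample = sample_mean(100_000)
# With 100,000 flips the deviation from 1/2 is very small.
print(abs(large_sample - 0.5) < 0.01)
```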
Conditional probability
- Conditioning (probability)
- Conditional expectation
- Conditional probability distribution
- Regular conditional probability
- Disintegration theorem
- Bayes' theorem
- de Finetti's theorem
- Exchangeable random variables
- Rule of succession
- Conditional independence
- Conditional event algebra
- Goodman–Nguyen–van Fraassen algebra
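Bayes' theorem, listed above, admits a short worked example. The numbers below (prevalence, sensitivity, false-positive rate) are hypothetical and chosen only to make the arithmetic clear.

```python
# Hypothetical worked example of Bayes' theorem:
# P(A|B) = P(B|A) * P(A) / P(B), with P(B) from the law of total probability.
p_disease = 0.01            # prior P(A): 1% prevalence (illustrative)
p_pos_given_disease = 0.99  # likelihood P(B|A): test sensitivity
p_pos_given_healthy = 0.05  # false-positive rate P(B|not A)

p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))  # total probability P(B)
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos

# Despite the accurate test, the posterior is only 1/6, about 16.7%.
print(round(p_disease_given_pos, 3))  # → 0.167
```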
Probability distributions
- Probability distribution function
- Quantile
- Moment (mathematics)
- Moment about the mean
- Standardized moment
- Skewness
- Kurtosis
- Locality
- Cumulant
- Factorial moment
- Expected value
- Law of the unconscious statistician
- Second moment method
- Variance
- Coefficient of variation
- Variance-to-mean ratio
- Covariance function
- An inequality on location and scale parameters
- Taylor expansions for the moments of functions of random variables
- Moment problem
- Hamburger moment problem
- Carleman's condition
- Hausdorff moment problem
- Trigonometric moment problem
- Stieltjes moment problem
- Prior probability distribution
- Total variation distance
- Hellinger distance
- Wasserstein metric
- Lévy–Prokhorov metric
- Lévy metric
- Continuity correction
- Heavy-tailed distribution
- Truncated distribution
- Infinite divisibility
- Stability (probability)
- Indecomposable distribution
- Power law
- Anderson's theorem
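The expected value and variance listed above can be computed directly from a probability mass function. The distribution below is a made-up illustrative example.

```python
# Sketch: first two moments of a small discrete distribution.
# The pmf values here are hypothetical; probabilities sum to 1.
pmf = {0: 0.2, 1: 0.5, 2: 0.3}

mean = sum(x * p for x, p in pmf.items())                  # E[X]
variance = sum((x - mean) ** 2 * p for x, p in pmf.items())  # Var(X)

print(mean, round(variance, 2))  # → 1.1 0.49
```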
Discrete probability distributions
- Bose–Einstein statistics
- Fermi–Dirac statistics
- Bernoulli distribution
- Bernoulli trial
- Binomial distribution
- Binomial probability
- Coupon collector's problem
- Degenerate distribution
- Dirichlet distribution
- Geometric distribution
- Graphical model
- Hypergeometric distribution
- Maxwell–Boltzmann statistics
- Multinomial distribution
- Negative binomial distribution
- Negative hypergeometric distribution
- Poisson distribution
- Compound Poisson distribution
- Poisson binomial distribution
- (a,b,0) class of distributions
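The binomial distribution above has the pmf P(X = k) = C(n, k) p^k (1 − p)^(n−k); a quick sketch verifies that the probabilities sum to one. The parameters n and p below are arbitrary illustrative choices.

```python
from math import comb

# Sketch of the binomial pmf; n and p are illustrative choices.
def binomial_pmf(k, n, p):
    """P(X = k) for X ~ Binomial(n, p)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

n, p = 10, 0.3
probs = [binomial_pmf(k, n, p) for k in range(n + 1)]

# The pmf sums to 1 and the mean equals n*p.
print(abs(sum(probs) - 1.0) < 1e-9)  # → True
```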
Continuous probability distributions
- Beta distribution
- Cantor distribution
- Cauchy distribution
- Chi-squared distribution
- Erlang distribution
- Exponential distribution
- Exponential family
- F-distribution
- Fisher–Tippett distribution
- Gamma distribution
- Lévy distribution
- Normal distribution, also called the Gaussian distribution
- Error function
- Multivariate normal distribution
- Matrix normal distribution
- Integration of the normal density function
- Cramér's theorem (first part)
- Memorylessness
- Pareto distribution
- Phase-type distribution
- Stable distributions
- Student's t-distribution
- Triangular distribution
- Weibull distribution
- Wigner semicircle distribution
- Wishart distribution
- Zeta distribution
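Memorylessness, listed above, characterizes the exponential distribution: P(X > s + t | X > s) = P(X > t). A short analytic check using the survival function exp(−λt) (the rate λ and the times s, t below are arbitrary choices):

```python
from math import exp

# Sketch of memorylessness for the exponential distribution;
# lam, s, and t are illustrative values.
lam = 1.5

def survival(t):
    """P(X > t) for X ~ Exponential(lam)."""
    return exp(-lam * t)

s, t = 0.7, 1.2
conditional = survival(s + t) / survival(s)  # P(X > s+t | X > s)

# The conditional survival probability equals the unconditional one.
print(abs(conditional - survival(t)) < 1e-9)  # → True
```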
Properties of probability distributions
- Central limit theorem
- Illustration of the central limit theorem
- Berry–Esseen theorem
- De Moivre–Laplace theorem
- Lyapunov's central limit theorem
- Martingale central limit theorem
- Infinite divisibility (probability)
- Method of moments (probability theory)
- Stability (probability)
- Stein's lemma
- Characteristic function (probability theory)
- Lévy continuity theorem
- Edgeworth series
- Helly–Bray theorem
- Location parameter
- Maxwell's theorem
- Moment-generating function
- Negative probability
- Probability-generating function
- Vysochanskij–Petunin inequality
- Mutual information
- Kullback–Leibler divergence
- Normally distributed and uncorrelated does not imply independent
- Le Cam's theorem
- Large deviations theory
- Contraction principle (large deviations theory)
- Varadhan's lemma
- Tilted large deviation principle
- Rate function
- Laplace principle (large deviations theory)
- Exponentially equivalent measures
- Cramér's theorem (second part)
Applied probability
- Empirical findings
- Benford's law
- Pareto principle
- Zipf's law
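Benford's law above predicts that leading digits d occur with probability log10(1 + 1/d). A classic deterministic illustration: the leading digits of the powers of 2 follow this distribution (the cutoff N below is an arbitrary choice).

```python
from math import log10

# Sketch of Benford's law: leading digits of 2**n for n = 1..N
# approximate the Benford probabilities log10(1 + 1/d).
N = 5000
counts = {d: 0 for d in range(1, 10)}
power = 1
for n in range(1, N + 1):
    power *= 2
    counts[int(str(power)[0])] += 1

observed = {d: counts[d] / N for d in counts}
benford = {d: log10(1 + 1 / d) for d in range(1, 10)}

# The empirical frequency of leading digit 1 is near log10(2), about 0.301.
print(abs(observed[1] - benford[1]) < 0.02)
```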
Stochastic processStochastic processIn probability theory, a stochastic process , or sometimes random process, is the counterpart to a deterministic process...
es
- Adapted processAdapted processIn the study of stochastic processes, an adapted process is one that cannot "see into the future". An informal interpretation is that X is adapted if and only if, for every realisation and every n, Xn is known at time n...
- Bernoulli processBernoulli processIn probability and statistics, a Bernoulli process is a finite or infinite sequence of binary random variables, so it is a discrete-time stochastic process that takes only two values, canonically 0 and 1. The component Bernoulli variables Xi are identical and independent...
- Bernoulli schemeBernoulli schemeIn mathematics, the Bernoulli scheme or Bernoulli shift is a generalization of the Bernoulli process to more than two possible outcomes. Bernoulli schemes are important in the study of dynamical systems, as most such systems exhibit a repellor that is the product of the Cantor set and a smooth...
- Bernoulli scheme
- Branching processBranching processIn probability theory, a branching process is a Markov process that models a population in which each individual in generation n produces some random number of individuals in generation n + 1, according to a fixed probability distribution that does not vary from individual to...
- Point processPoint processIn statistics and probability theory, a point process is a type of random process for which any one realisation consists of a set of isolated points either in time or geographical space, or in even more general spaces...
- Wiener processWiener processIn mathematics, the Wiener process is a continuous-time stochastic process named in honor of Norbert Wiener. It is often called standard Brownian motion, after Robert Brown...
- Brownian motionBrownian motionBrownian motion or pedesis is the presumably random drifting of particles suspended in a fluid or the mathematical model used to describe such random movements, which is often called a particle theory.The mathematical model of Brownian motion has several real-world applications...
- Geometric Brownian motionGeometric Brownian motionA geometric Brownian motion is a continuous-time stochastic process in which the logarithm of the randomly varying quantity follows a Brownian motion, also called a Wiener process...
- Donsker's theoremDonsker's theoremIn probability theory, Donsker's theorem, named after M. D. Donsker, identifies a certain stochastic process as a limit of empirical processes. It is sometimes called the functional central limit theorem....
- Empirical processEmpirical processThe study of empirical processes is a branch of mathematical statistics and a sub-area of probability theory. It is a generalization of the central limit theorem for empirical measures...
- Wiener equation
- Wiener sausageWiener sausageIn the mathematical field of probability, the Wiener sausage is a neighborhood of the trace of a Brownian motion up to a time t, given by taking all points within a fixed distance of Brownian motion. It can be visualized as a sausage of fixed radius whose centerline is Brownian motion...
- Brownian motion
- Chapman–Kolmogorov equation
- Chinese restaurant process
- Coupling (probability)Coupling (probability)In probability theory, coupling is a proof technique that allows one to compare two unrelated variables by "forcing" them to be related in some way.-Definition:...
- Ergodic theoryErgodic theoryErgodic theory is a branch of mathematics that studies dynamical systems with an invariant measure and related problems. Its initial development was motivated by problems of statistical physics....
- Maximal ergodic theorem
- Ergodic (adjective)Ergodic (adjective)In mathematics, the term ergodic is used to describe a dynamical system which, broadly speaking, has the same behavior averaged over time as averaged over space. In physics the term is used to imply that a system satisfies the ergodic hypothesis of thermodynamics.-Etymology:The word ergodic is...
- Galton–Watson process
- Gauss-Markov process
- Gaussian processGaussian processIn probability theory and statistics, a Gaussian process is a stochastic process whose realisations consist of random values associated with every point in a range of times such that each such random variable has a normal distribution...
- Gaussian random fieldGaussian random fieldA Gaussian random field is a random field involving Gaussian probability density functions of the variables. A one-dimensional GRF is also called a Gaussian process....
- Gaussian isoperimetric inequality
- Large deviations of Gaussian random functionsLarge deviations of Gaussian random functionsA random function – of either one variable , or two or more variables – is called Gaussian if every finite-dimensional distribution is a multivariate normal distribution. Gaussian random fields on the sphere are useful when analysing* the anomalies in the cosmic microwave background...
- Gaussian random field
- Girsanov's theorem
- Itô's lemma: used in Itô stochastic calculus to find the differential of a function of a particular type of stochastic process; named after its discoverer, Kiyoshi Itô.
- Law of the iterated logarithm: describes the magnitude of the fluctuations of a random walk; the original statement is due to A. Y. Khinchin.
- Lévy flight: a random walk in which the step lengths have a heavy-tailed probability distribution; in dimension greater than one, the steps are made in isotropic random directions.
- Lévy process: a continuous-time stochastic process that starts at 0, admits a càdlàg modification and has stationary independent increments; named after the French mathematician Paul Lévy.
- Loop-erased random walk: a model for a random simple path, with important applications in combinatorics and, in physics, quantum field theory; intimately connected to the uniform spanning tree, a model for a random tree.
- Markov chain: a mathematical system that undergoes transitions between a finite or countable number of states; it is memoryless, in that the next state depends only on the current state and not on the past.
- Continuous-time Markov process
- Examples of Markov chains: a game of snakes and ladders, or any other game whose moves are determined entirely by dice, is an absorbing Markov chain, in contrast to card games such as blackjack, where the cards represent a "memory" of past moves.
- Detailed balance: the principle that, at equilibrium, each elementary process in a kinetic system is equilibrated by its reverse process.
- Markov property: the memoryless property of a stochastic process; named after the Russian mathematician Andrey Markov.
- Hidden Markov model: a statistical Markov model in which the system being modelled is assumed to be a Markov process with unobserved states; it can be considered the simplest dynamic Bayesian network.
- Markov chain mixing time: the time until a Markov chain is "close" to its steady-state distribution; a finite-state irreducible aperiodic chain has a unique stationary distribution to which it converges.
- Martingale (probability theory): a model of a fair game in which no knowledge of past events can help predict future winnings; the expected next value of the sequence, given the realized past, equals the present value.
- Doob martingale: a construction of a stochastic process which approximates a given random variable and has the martingale property with respect to the given filtration.
- Optional stopping theorem: under certain conditions, the expected value of a martingale at a stopping time is equal to its initial value.
- Martingale representation theorem: a random variable which is measurable with respect to the filtration generated by a Brownian motion can be written in terms of an Itô integral with respect to this Brownian motion.
- Azuma's inequality
- Wald's equation: an identity (also called Wald's identity or Wald's lemma) that simplifies the calculation of the expected value of the sum of a random number of random quantities.
- Poisson process: a stochastic process in which events occur continuously and independently of one another; named after the French mathematician Siméon-Denis Poisson.
- Population process: a Markov chain in which the state of the chain is analogous to the number of individuals in a population, and changes to the state are analogous to the addition or removal of individuals.
- Process with independent increments
- Progressively measurable process: a stochastic process for which events defined in terms of values of the process across a range of times can be assigned probabilities; progressive measurability is strictly stronger than adaptedness.
- Queueing theory: the mathematical study of waiting lines, or queues, covering arrival at the queue, waiting in the queue, and being served at the front of the queue.
- Erlang unit: a dimensionless unit used in telephony as a statistical measure of offered or carried load on service-providing elements such as telephone circuits or switching equipment; named after the Danish telephone engineer A. K. Erlang.
- Random walk: a mathematical formalisation of a trajectory that consists of taking successive random steps; examples include the path traced by a molecule in a liquid or gas, the search path of a foraging animal, and the price of a fluctuating stock.
- Random walk Monte Carlo
- Skorokhod's embedding theorem: either of two theorems that allow a suitable collection of random variables to be regarded as a Wiener process evaluated at a collection of stopping times; named for the Ukrainian mathematician A. V. Skorokhod.
- Stationary process: a stochastic process whose joint probability distribution does not change when shifted in time or space.
- Stochastic calculus: a branch of mathematics that operates on stochastic processes, allowing a consistent theory of integration of stochastic processes with respect to stochastic processes.
- Itô calculus: extends the methods of calculus to stochastic processes such as Brownian motion, with important applications in mathematical finance and stochastic differential equations; named after Kiyoshi Itô.
- Malliavin calculus: a theory of variational stochastic calculus providing the mechanics to compute derivatives of random variables; named after Paul Malliavin.
- Stratonovich integral: a stochastic integral, the most common alternative to the Itô integral.
- Time series analysis
- Autoregressive model: a type of random process often used in statistics and signal processing to model and predict various natural phenomena.
- Moving average model: a common approach for modelling univariate time series; MA(q) denotes the moving-average model of order q.
- Autoregressive moving average model: ARMA models, sometimes called Box–Jenkins models after the iterative Box–Jenkins methodology usually used to estimate them, are typically applied to autocorrelated time series data.
- Autoregressive integrated moving average model
- Anomaly time series: in atmospheric sciences, the time series of deviations of a quantity from some mean; a standardized anomaly series contains deviations divided by a standard deviation.
- Renewal theory: the branch of probability theory that generalizes Poisson processes to arbitrary holding times; applications include calculating the expected time for a monkey randomly tapping at a keyboard to type the word "Macbeth".
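Several of the Markov-chain entries above come down to one computation: a finite irreducible aperiodic chain has a unique stationary distribution π satisfying π = πP, and the chain converges to it from any start. A minimal sketch by power iteration (the two-state transition matrix is an invented example):

```python
# Stationary distribution of a two-state Markov chain by power iteration.
# The transition matrix P is a made-up example: from state 0, move to
# state 1 with probability 0.3; from state 1, move to state 0 with 0.4.
P = [[0.7, 0.3],
     [0.4, 0.6]]

def step(dist, P):
    """One step of the chain: new_dist[j] = sum_i dist[i] * P[i][j]."""
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

dist = [1.0, 0.0]        # start deterministically in state 0
for _ in range(100):     # iterate until numerically stationary
    dist = step(dist, P)

# For this P the exact stationary distribution is pi = (4/7, 3/7).
print(dist)
```

Starting from `[0.0, 1.0]` instead converges to the same limit, which is the uniqueness property the Markov chain mixing time entry refers to.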
Geometric probability
- Buffon's needle: a question first posed in the 18th century by Georges-Louis Leclerc, Comte de Buffon; the earliest problem in geometric probability to be solved, it can be treated using integral geometry.
- Integral geometry: the theory of measures on a geometrical space invariant under the symmetry group of that space; in more recent times broadened to include invariant transformations between spaces of functions on geometrical spaces.
- Hadwiger's theorem: characterises the valuations on convex bodies in R^n; proved by Hugo Hadwiger.
- Wendel's theorem: gives the probability that N points distributed uniformly at random on an n-dimensional hypersphere all lie on the same "half" of the hypersphere; named after James G. Wendel.
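The Buffon's needle entry above admits a direct simulation: a needle of length L dropped on a floor ruled with parallel lines a distance T apart (L ≤ T) crosses a line with probability 2L/(Tπ), so inverting the observed crossing frequency estimates π. A sketch, with an arbitrary fixed seed and invented L, T:

```python
import math, random

# Buffon's needle Monte Carlo: estimate pi from the crossing frequency.
random.seed(0)           # fixed seed so the run is reproducible
L, T = 1.0, 2.0          # needle length and line spacing (L <= T)
drops, hits = 200_000, 0
for _ in range(drops):
    y = random.uniform(0.0, T / 2)            # centre's distance to nearest line
    theta = random.uniform(0.0, math.pi / 2)  # acute angle with the lines
    if y <= (L / 2) * math.sin(theta):        # needle tip reaches the line
        hits += 1

pi_est = 2 * L * drops / (T * hits)
print(pi_est)  # near 3.14; Monte Carlo error shrinks like 1/sqrt(drops)
```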
Gambling
- Luck: good fortune which occurs beyond one's control, without regard to one's will, intention, or desired result.
- Game of chance: a game whose outcome is strongly influenced by some randomizing device, and upon which contestants may or may not wager money or anything of monetary value.
- Odds: the odds in favour of an event are expressed as the ratio of the probability that the event will happen to the probability that it will not.
- Gambler's fallacy: also known as the Monte Carlo fallacy or the fallacy of the maturity of chances; the belief that if deviations from expected behaviour are observed in repeated independent trials of some random process, future deviations in the opposite direction become more likely.
- Inverse gambler's fallacy: named by the philosopher Ian Hacking; the formal fallacy of concluding, on the basis of an unlikely outcome of a random process, that the process is likely to have occurred many times before.
- Parrondo's paradox: a paradox in game theory described as "a losing strategy that wins"; named after its creator, the Spanish physicist Juan Parrondo, who discovered it in 1996.
- Pascal's wager: the suggestion by the French philosopher, mathematician, and physicist Blaise Pascal that even if the existence of God cannot be determined through reason, a rational person should wager as though God exists.
- Gambler's ruin: a number of related statistical ideas; originally, that a gambler who raises his bet to a fixed fraction of bankroll when he wins, but does not reduce it when he loses, will eventually go broke, even with a positive expected value on each bet.
- Poker probability: the probability of each type of 5-card hand, computed as the proportion of hands of that type among all possible hands.
- Poker probability (Omaha): probabilities and odds for commonly occurring events in the game of Omaha hold 'em, determined by direct calculation.
- Poker probability (Texas hold 'em): probabilities and odds for commonly occurring events in the game of Texas hold 'em, determined by direct calculation.
- Pot odds: the ratio of the current size of the pot to the cost of a contemplated call, often compared to the probability of winning the hand with a future card to estimate the call's expected value.
- Roulette: a casino game named after a French diminutive for "little wheel", in which players may place bets on a single number, a range of numbers, the colours red or black, or whether the number is odd or even.
- Martingale (betting system): originally a class of betting strategies popular in 18th-century France; the simplest was designed for a game in which the gambler wins his stake if a coin comes up heads and loses it on tails.
- The Man Who Broke the Bank at Monte Carlo: a 1935 American romantic comedy film made by 20th Century Fox, directed by Stephen Roberts and starring Ronald Colman, Joan Bennett, and Colin Clive.
- Lottery: a form of gambling which involves the drawing of lots for a prize; outlawed by some governments, while others endorse it to the extent of organizing a national or state lottery.
- Lottery machine: the machine used to draw the winning numbers for a lottery; early lotteries were done by drawing numbers, or winning tickets, from a container.
- Pachinko: a type of game originating in Japan, used both as a recreational arcade game and, much more frequently, as a gambling device, filling a niche in Japanese gambling comparable to that of the slot machine in Western gambling.
- Coherence (philosophical gambling strategy): in a thought experiment proposed by the Italian probabilist Bruno de Finetti to justify Bayesian probability, an array of wagers is coherent precisely if it does not expose the wagerer to certain loss regardless of the outcomes of the events wagered on.
- Coupon collector's problem: describes "collect all coupons and win" contests; given n coupon types collected with replacement, how many draws are needed to collect them all?
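The gambler's ruin entry above has a classical closed form: making unit bets with win probability p, starting from i units and stopping at 0 or N, the chance of reaching N is (1 − (q/p)^i)/(1 − (q/p)^N) for p ≠ 1/2 (and i/N for a fair game). A sketch that cross-checks this against the defining recurrence h(i) = p·h(i+1) + q·h(i−1); the values p = 0.47, N = 20 are arbitrary:

```python
# Gambler's ruin: probability of reaching N before 0 with unit bets.
p, N = 0.47, 20          # invented example values (p != 1/2)
q = 1 - p

def ruin_closed_form(i):
    """Closed-form probability of reaching N before 0 from bankroll i."""
    r = q / p
    return (1 - r**i) / (1 - r**N)

# Solve the recurrence h(i) = p*h(i+1) + q*h(i-1), h(0)=0, h(N)=1,
# by forward substitution: set a trial h(1)=1, then rescale so h(N)=1
# (the solution is linear in the trial value).
h = [0.0, 1.0]
for i in range(1, N):
    h.append((h[i] - q * h[i - 1]) / p)
h = [x / h[N] for x in h]

print(h[10], ruin_closed_form(10))  # the two values agree
```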
Coincidence
- Birthday paradox: pertains to the probability that, in a set of n randomly chosen people, some pair of them will have the same birthday; by the pigeonhole principle the probability reaches 100% when the number of people reaches 366.
- Birthday problem
- Index of coincidence: in cryptography, coincidence counting is the technique of putting two texts side by side and counting the number of times identical letters appear in the same position in both texts.
- Bible code: also known as the Torah code, a purported set of secret messages encoded within the text of the Hebrew Bible, selected letter by letter, describing prophecies and other guidance regarding the future.
- Spurious relationship: in statistics, a mathematical relationship in which two events or variables have no direct causal connection, yet it may be wrongly inferred that they do, due to either coincidence or the presence of an unseen third factor.
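The birthday paradox probability is a short exact calculation: the chance that n people all have distinct birthdays is the product of (365 − k)/365 for k = 0..n−1, and the shared-birthday probability is its complement. A sketch assuming 365 equally likely birthdays (no leap years):

```python
# Birthday paradox: exact probability that at least two of n people
# share a birthday, under the uniform-365-day assumption.
def shared_birthday(n, days=365):
    p_distinct = 1.0
    for k in range(n):
        p_distinct *= (days - k) / days
    return 1.0 - p_distinct

print(shared_birthday(23))   # already above 1/2 with only 23 people
print(shared_birthday(366))  # pigeonhole: certainty at 366 people
```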
Algorithmics
- Probable prime: in number theory, an integer that satisfies a specific condition also satisfied by all prime numbers; different types of probable primes have different specific conditions.
- Probabilistic algorithm (randomised algorithm)
- Monte Carlo method: a class of computational algorithms that rely on repeated random sampling to compute their results, often used in computer simulations of physical and mathematical systems.
- Las Vegas algorithm: a randomized algorithm that always gives correct results (or informs of failure); it gambles only with the resources used, never with the verity of the result.
- Probabilistic Turing machine: a non-deterministic Turing machine which randomly chooses between the available transitions at each point according to some probability distribution.
- Stochastic programming: a framework for modelling optimization problems that involve uncertainty; whereas deterministic optimization problems are formulated with known parameters, real-world problems almost invariably include some unknown parameters.
- Probabilistically checkable proof
- Box–Muller transform
- Metropolis algorithm
- Gibbs sampling: in statistics and statistical physics, an algorithm to generate a sequence of samples from the joint probability distribution of two or more random variables.
- Inverse transform sampling method: also known as the inverse probability integral transform or Smirnov transform; a basic method for pseudo-random number sampling, i.e. for generating sample numbers at random from any probability distribution given its cumulative distribution function.
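The inverse transform sampling entry above describes a one-line recipe: if U is uniform on (0, 1) and F is an invertible CDF, then F⁻¹(U) has distribution F. A sketch for the exponential distribution, where F(x) = 1 − e^(−λx) gives F⁻¹(u) = −ln(1 − u)/λ; the seed and rate λ = 2 are arbitrary choices:

```python
import math, random

# Inverse transform sampling: draw Exponential(lam) variates from
# uniform variates via the inverse CDF  F^{-1}(u) = -ln(1 - u) / lam.
random.seed(1)           # fixed seed for reproducibility
lam = 2.0
samples = [-math.log(1.0 - random.random()) / lam for _ in range(100_000)]

mean = sum(samples) / len(samples)
print(mean)  # should be near the true mean 1/lam = 0.5
```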
Financial mathematics
- Risk: the potential that a chosen action or activity will lead to a loss; potential losses themselves may also be called "risks".
- Value at risk: in financial mathematics and financial risk management, a widely used measure of the risk of loss on a specific portfolio of financial assets.
- Market risk: the risk that the value of a portfolio will decrease due to changes in the market risk factors; the four standard factors are stock prices, interest rates, foreign exchange rates, and commodity prices.
- Risk-neutral measure: a prototypical case of an equivalent martingale measure, heavily used in the pricing of financial derivatives due to the fundamental theorem of asset pricing, which implies that in a complete market a derivative's price is a discounted expectation.
- Volatility (finance): a measure of variation of the price of a financial instrument over time; historic volatility is derived from time series of past market prices.
- Technical analysis: a security-analysis discipline for forecasting the direction of prices through the study of past market data, primarily price and volume.
- Kelly criterion: a formula used to determine the optimal size of a series of bets; in most gambling scenarios, and some investing scenarios under simplifying assumptions, the Kelly strategy does better in the long run than any essentially different strategy.
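For the Kelly criterion entry above: with win probability p and net odds b (win b units per unit staked), the fraction of bankroll that maximizes expected logarithmic growth is f* = p − (1 − p)/b. A sketch with invented numbers p = 0.6, b = 1 (an even-money bet), checking that f* beats nearby fractions:

```python
import math

# Kelly criterion for a simple repeated bet.
def kelly_fraction(p, b):
    """Log-optimal stake fraction f* = p - q/b."""
    return p - (1 - p) / b

def expected_log_growth(f, p, b):
    """Expected log bankroll growth per bet when staking fraction f."""
    return p * math.log(1 + f * b) + (1 - p) * math.log(1 - f)

p, b = 0.6, 1.0          # invented example: 60% win chance, even money
f_star = kelly_fraction(p, b)   # 0.2 for these numbers
print(f_star)

# f* should dominate nearby stake fractions in expected log growth:
print(expected_log_growth(f_star, p, b) > expected_log_growth(0.1, p, b))
print(expected_log_growth(f_star, p, b) > expected_log_growth(0.3, p, b))
```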
Physics
- Probability amplitude: in quantum mechanics, a complex number whose modulus squared represents a probability or probability density.
- Statistical physics: the branch of physics that uses methods of probability theory and statistics, particularly the mathematical tools for dealing with large populations and approximations, in solving physical problems.
- Boltzmann factor: a weighting factor that determines the relative probability of a particle being in a state i in a multi-state system in thermodynamic equilibrium at temperature T.
- Feynman–Kac formula: establishes a link between parabolic partial differential equations and stochastic processes, offering a method of solving certain PDEs by simulating random paths; named after Richard Feynman and Mark Kac.
- Fluctuation theorem: originating in statistical mechanics, deals with the relative probability that the entropy of a system currently away from thermodynamic equilibrium will increase or decrease over a given amount of time.
- Information entropy: a measure of the uncertainty associated with a random variable; usually the Shannon entropy, which quantifies the expected value of the information contained in a message, typically in units such as bits.
- Vacuum expectation value: in quantum field theory, the average expected value of an operator in the vacuum, usually denoted ⟨O⟩.
- Cosmic variance: the statistical uncertainty inherent in observations of the universe at extreme distances, arising because only part of the universe can be observed at one particular time.
- Negative probability: a concept introduced, along with negative energies, in Paul Dirac's 1942 paper "The Physical Interpretation of Quantum Mechanics".
- Gibbs state: in probability theory and statistical mechanics, an equilibrium probability distribution which remains invariant under future evolution of the system.
- Master equation: used in physics, chemistry and related fields to describe the time evolution of a system that can be modelled as being in exactly one of a countable number of states at any given time, with switching between states treated probabilistically.
- Partition function (mathematics): an abstraction of the partition function of statistical mechanics, used in probability theory, information science and dynamical systems; a special case of a normalizing constant.
- Quantum probability: developed in the 1980s as a noncommutative analogue of the Kolmogorovian theory of stochastic processes, aiming to clarify the mathematical foundations of quantum theory and its statistical interpretation.
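The Boltzmann factor and partition function entries above combine into one normalization: state probabilities p_i = e^(−E_i/kT)/Z with Z = Σ e^(−E_i/kT). A sketch for a toy three-level system, with energies measured in units of kT (so kT = 1); the energy values are an invented example:

```python
import math

# Boltzmann factors and partition function for a toy three-level system.
energies = [0.0, 1.0, 2.0]                   # invented, in units of k*T
weights = [math.exp(-E) for E in energies]   # Boltzmann factors e^{-E/kT}
Z = sum(weights)                             # partition function
probs = [w / Z for w in weights]             # equilibrium probabilities

print(probs)  # sums to 1; lower energy -> higher probability
```

The partition function here plays exactly the role of the normalizing constant named in the Partition function (mathematics) entry.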
Genetics
- Punnett square: a diagram used to predict the outcome of a particular cross or breeding experiment; named after Reginald C. Punnett, who devised the approach, and used by biologists to determine the probability of an offspring having a particular genotype.
- Hardy–Weinberg principle
- Ewens's sampling formula: in population genetics, describes the probabilities associated with counts of how many different alleles are observed a given number of times in a sample.
- Population genetics: the study of allele frequency distribution and change under the influence of the four main evolutionary processes: natural selection, genetic drift, mutation and gene flow.
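The Punnett square entry above is itself a small probability table: each parent passes one of its two alleles with equal probability, so a monohybrid Aa × Aa cross yields genotypes AA : Aa : aa in ratio 1 : 2 : 1, matching the Hardy–Weinberg frequencies p², 2pq, q² at p = q = 0.5. A sketch (the `punnett` helper is an illustrative function, not a standard library routine):

```python
from itertools import product
from collections import Counter

# Punnett square: enumerate all equally likely allele pairings of a cross.
def punnett(parent1, parent2):
    """Genotype frequencies for a one-gene cross, e.g. punnett('Aa', 'Aa')."""
    counts = Counter()
    for a, b in product(parent1, parent2):
        counts["".join(sorted(a + b))] += 1   # 'aA' and 'Aa' are the same genotype
    total = sum(counts.values())
    return {g: n / total for g, n in counts.items()}

freqs = punnett("Aa", "Aa")
print(freqs)  # {'AA': 0.25, 'Aa': 0.5, 'aa': 0.25}
```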