Method of moments (statistics)
In statistics, the method of moments is a method of estimation of population parameters such as mean, variance, median, etc. (which need not be moments), by equating sample moments with unobservable population moments and then solving those equations for the quantities to be estimated.
Methodology
Suppose that the problem is to estimate k unknown parameters θ1, ..., θk characterizing a distribution f(x; θ1, ..., θk). Suppose the first k moments of the true distribution can be expressed as functions of the θs:

μ_j = E[X^j] = g_j(θ1, ..., θk), j = 1, ..., k.
Let μ̂_j = (1/n) Σ_{i=1}^n x_i^j be the j-th sample moment corresponding to the population moment μ_j. The method of moments estimator for θ1, ..., θk, denoted by θ̂1, ..., θ̂k, is defined as the solution (if there is one) to the equations:

μ̂_j = g_j(θ̂1, ..., θ̂k), j = 1, ..., k.
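As a minimal one-parameter illustration of these moment equations (using the exponential distribution, which is an assumption for illustration and not part of the example below): the first population moment of an exponential distribution with rate λ is E[X] = 1/λ, so equating it with the first sample moment m1 and solving gives λ̂ = 1/m1.

```python
def exponential_mom(xs):
    # One-parameter sketch of the moment equations: for an exponential
    # distribution, E[X] = 1/lambda, so matching the first population
    # moment to the first sample moment m1 gives lambda_hat = 1/m1.
    m1 = sum(xs) / len(xs)   # first sample moment
    return 1.0 / m1          # solve m1 = 1/lambda for lambda

print(exponential_mom([0.5, 1.5, 2.0]))   # m1 = 4/3, so lambda_hat = 0.75
```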
Example
Suppose X1, ..., Xn are independent identically distributed random variables with a gamma distribution with probability density function

f(x; α, β) = x^(α−1) e^(−x/β) / (β^α Γ(α)) for x > 0, and 0 for x < 0.
The first moment, i.e., the expected value, of a random variable with this probability distribution is

E[X] = αβ,

and the second moment, i.e., the expected value of its square, is

E[X²] = β²α(α + 1).
These are the "population moments".
The first and second "sample moments" m1 and m2 are respectively

m1 = (1/n) Σ_{i=1}^n Xi and m2 = (1/n) Σ_{i=1}^n Xi².

Equating the population moments with the sample moments, we get

αβ = m1 and β²α(α + 1) = m2.

Solving these two equations for α and β, we get

α = m1² / (m2 − m1²) and β = (m2 − m1²) / m1.
We then use these two quantities as estimates, based on the sample, of the two unobservable population parameters α and β.
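The worked example above can be sketched in code; a minimal version, assuming the shape–scale parameterization of the gamma density used here:

```python
def gamma_method_of_moments(xs):
    # Method-of-moments estimates for the gamma parameters:
    # alpha (shape) and beta (scale), from the first two sample moments.
    n = len(xs)
    m1 = sum(xs) / n                    # first sample moment
    m2 = sum(x * x for x in xs) / n     # second sample moment
    alpha_hat = m1 ** 2 / (m2 - m1 ** 2)
    beta_hat = (m2 - m1 ** 2) / m1
    return alpha_hat, beta_hat
```

For example, the sample [1, 2, 3, 4] has m1 = 2.5 and m2 = 7.5, giving α̂ = 5 and β̂ = 0.5; one can check that α̂β̂ recovers m1 and β̂²α̂(α̂ + 1) recovers m2.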
Advantages and disadvantages of this method
In some respects, when estimating parameters of a known family of probability distributions, this method was superseded by Fisher's method of maximum likelihood, because maximum likelihood estimators have a higher probability of being close to the quantities to be estimated.
However, in some cases, as in the above example of the gamma distribution, the likelihood equations may be intractable without computers, whereas the method-of-moments estimators can be quickly and easily calculated by hand as shown above.
Estimates by the method of moments may be used as the first approximation to the solutions of the likelihood equations, and successive improved approximations may then be found by the Newton–Raphson method. In this way the method of moments and the method of maximum likelihood are symbiotic.
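A sketch of that refinement for the gamma example, under two stated assumptions: the scale β = m1/α is profiled out so Newton–Raphson is applied to the shape score equation ln α − ψ(α) = ln m1 − mean(ln xi), and the digamma/trigamma functions ψ, ψ′ are approximated by crude finite differences of the standard library's log-gamma rather than a special-function library.

```python
import math

def digamma(x, h=1e-5):
    # Finite-difference stand-in for the digamma function psi(x),
    # built from math.lgamma (an assumption for self-containment).
    return (math.lgamma(x + h) - math.lgamma(x - h)) / (2 * h)

def trigamma(x, h=1e-5):
    # Second finite difference of log-gamma, approximating psi'(x).
    return (math.lgamma(x + h) - 2 * math.lgamma(x) + math.lgamma(x - h)) / h ** 2

def gamma_mle(xs, alpha0, iters=20):
    # Refine a method-of-moments shape estimate alpha0 by Newton-Raphson
    # on the profile score equation: log(a) - psi(a) = log(m1) - mean(log x).
    n = len(xs)
    m1 = sum(xs) / n
    s = math.log(m1) - sum(math.log(x) for x in xs) / n
    a = alpha0
    for _ in range(iters):
        f = math.log(a) - digamma(a) - s     # score residual
        fprime = 1.0 / a - trigamma(a)       # its derivative (always < 0)
        a -= f / fprime                      # Newton-Raphson step
    return a, m1 / a                         # (alpha_hat, beta_hat)
```

Starting Newton–Raphson from the method-of-moments estimate, as the text suggests, typically converges in a handful of iterations because that starting point is already close to the likelihood solution.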
In some cases, infrequent with large samples but not so infrequent with small samples, the estimates given by the method of moments are outside of the parameter space; it does not make sense to rely on them then. That problem never arises in the method of maximum likelihood. Also, estimates by the method of moments are not necessarily sufficient statistics, i.e., they sometimes fail to take into account all relevant information in the sample.
When estimating other structural parameters (e.g., parameters of a utility function, instead of parameters of a known probability distribution), appropriate probability distributions may not be known, and moment-based estimates may be preferred to maximum likelihood estimation.
See also
- Generalized method of moments