Method of Moments
Introduction to Method of Moments
Recommended Prerequisites
- Probability
- Probability 2
Definition
The Method of Moments is a technique used to estimate the parameters of a probability distribution by equating sample moments to the corresponding population moments.
It serves as a less common alternative to maximum likelihood estimation (MLE) and is often simpler to compute, though its estimates are generally less efficient.
Sample Moments
Given a set of data \(X_1, X_2, \dots, X_n\), the sample moments are empirical counterparts of population moments. The k-th sample moment is defined as:
$$m'_{k}=\frac{1}{n}\sum_{i=1}^{n}X_{i}^k$$
Similarly, the k-th sample central moment is given by:
$$m_{k}=\frac{1}{n}\sum_{i=1}^{n}(X_{i}-\bar{X})^k$$
where \(\bar{X}\) is the sample mean.
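Both kinds of sample moments can be computed directly from data. A minimal sketch using NumPy, with a purely illustrative data array:

```python
import numpy as np

# Purely illustrative data; any 1-D array of observations works.
X = np.array([2.1, 0.7, 1.8, 3.2, 1.1, 2.6])

def raw_moment(x, k):
    """k-th sample raw moment: (1/n) * sum(x_i**k)."""
    return np.mean(x ** k)

def central_moment(x, k):
    """k-th sample central moment: (1/n) * sum((x_i - xbar)**k)."""
    return np.mean((x - np.mean(x)) ** k)

m1 = raw_moment(X, 1)       # equals the sample mean
m2c = central_moment(X, 2)  # equals the (1/n) sample variance
```

Note that the first central moment is always zero, and the second central moment is the biased (divide-by-\(n\)) sample variance.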
Performing the Method of Moments
The idea behind the Method of Moments is to estimate the parameters of a probability distribution by matching the first few sample moments with their corresponding population moments. Suppose a distribution is characterized by a set of parameters \(\theta_1,\theta_2,\dots,\theta_k\).
The equations that relate sample moments to the theoretical moments are called moment conditions.
For a distribution parameterized by \(\theta\), let \(g(X,\theta)\) represent a moment condition. The MoM estimates \(\hat{\theta}\) by solving the moment conditions:
$$\mathbb{E}[g(X,\theta)]=0$$
To estimate these parameters:
- Compute the first k sample moments \(m'_1,m'_2,\dots,m'_k\)
- Equate the sample moments to the corresponding population moments expressed as functions of \(\theta_1,\theta_2,\dots,\theta_k\).
- Solve the resulting system of equations to find estimates of the parameters.
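When the moment equations cannot be inverted by hand, the three steps above can be carried out numerically. A minimal sketch, assuming SciPy is available, using simulated Beta(2, 5) data and the beta distribution's first two raw moments \(\mu'_1=\frac{\alpha}{\alpha+\beta}\) and \(\mu'_2=\frac{\alpha(\alpha+1)}{(\alpha+\beta)(\alpha+\beta+1)}\):

```python
import numpy as np
from scipy.optimize import fsolve

rng = np.random.default_rng(0)
X = rng.beta(2.0, 5.0, size=10_000)  # simulated data; true (alpha, beta) = (2, 5)

m1 = np.mean(X)       # first sample raw moment
m2 = np.mean(X ** 2)  # second sample raw moment

def moment_conditions(theta):
    a, b = theta
    # Population raw moments of Beta(a, b) minus their sample counterparts.
    mu1 = a / (a + b)
    mu2 = a * (a + 1) / ((a + b) * (a + b + 1))
    return [mu1 - m1, mu2 - m2]

alpha_hat, beta_hat = fsolve(moment_conditions, x0=[1.0, 1.0])
```

The numerical route generalizes to any distribution whose population moments can be written down, even when no closed-form solution exists.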
Example
The probability density of the gamma distribution is given by:
$$f(x;\alpha,\beta)=\frac{x^{\alpha-1}e^{-x/\beta}}{\beta^\alpha\Gamma(\alpha)}\quad\text{for }x>0$$
where \(\alpha\) is the shape parameter and \(\beta\) is the scale parameter.
The population moments of the gamma distribution are:
$$\mu'_1=\alpha\beta$$
$$\mu'_2=\alpha(\alpha+1)\beta^2$$
To estimate \(\alpha\) and \(\beta\) using the Method of Moments, we equate the first two sample moments to the corresponding population moments:
$$m'_1=\alpha\beta$$
$$m'_2=\alpha(\alpha+1)\beta^2$$
Solving for \(\beta\) using the first equation:
$$\beta=\frac{m'_1}{\alpha}$$
and substituting into the second equation:
$$m'_2=\alpha(\alpha+1)\left(\frac{m'_1}{\alpha}\right)^2$$
Simplifying gives \(\frac{m'_2}{(m'_1)^2}=\frac{\alpha+1}{\alpha}=1+\frac{1}{\alpha}\), which yields \(\hat{\alpha}=\frac{(m'_1)^2}{m'_2-(m'_1)^2}\), and then \(\hat{\beta}=\frac{m'_1}{\hat{\alpha}}=\frac{m'_2-(m'_1)^2}{m'_1}\).
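The resulting gamma estimators can be sanity-checked on simulated data. A minimal sketch, with arbitrarily chosen true shape and scale:

```python
import numpy as np

rng = np.random.default_rng(42)
# Simulated gamma data; shape = 3.0 and scale = 2.0 are arbitrary choices.
X = rng.gamma(shape=3.0, scale=2.0, size=50_000)

m1 = np.mean(X)       # first sample raw moment
m2 = np.mean(X ** 2)  # second sample raw moment

# MoM estimates: alpha = m1^2 / (m2 - m1^2), beta = (m2 - m1^2) / m1.
alpha_hat = m1 ** 2 / (m2 - m1 ** 2)
beta_hat = (m2 - m1 ** 2) / m1
```

With 50,000 samples the estimates should land close to the true shape and scale.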
Table of Estimates
Here \(M_1=\frac{1}{n}\sum_{i=1}^{n}X_i\) and \(M_2=\frac{1}{n}\sum_{i=1}^{n}X_i^2\) denote the first and second sample raw moments.

| Distribution | Parameters | MoM Estimates |
| --- | --- | --- |
| Normal (Gaussian) | \(\mu, \sigma\) | \(\hat{\mu} = M_1\); \(\hat{\sigma}^2 = M_2 - M_1^2 = \frac{1}{n}\sum_{i=1}^{n}(X_i - M_1)^2\) |
| Exponential | \(\lambda\) | \(\hat{\lambda} = \frac{1}{M_1}\) |
| Poisson | \(\lambda\) | \(\hat{\lambda} = M_1\) |
| Uniform \((a, b)\) | \(a, b\) | \(\hat{a} = M_1 - \sqrt{3(M_2 - M_1^2)}\); \(\hat{b} = M_1 + \sqrt{3(M_2 - M_1^2)}\) |
| Binomial \((n, p)\) | \(n, p\) | \(\hat{p} = \frac{M_1}{n}\) (\(n\) known); if \(n\) is also unknown, solve \(M_1 = np\) and \(M_2 = np(1-p) + (np)^2\) jointly |
| Gamma \((\alpha, \beta)\) | \(\alpha, \beta\) | \(\hat{\alpha} = \frac{M_1^2}{M_2 - M_1^2}\); \(\hat{\beta} = \frac{M_2 - M_1^2}{M_1}\) |
| Beta \((\alpha, \beta)\) | \(\alpha, \beta\) | with \(v = M_2 - M_1^2\): \(\hat{\alpha} = M_1\left(\frac{M_1(1-M_1)}{v} - 1\right)\); \(\hat{\beta} = (1-M_1)\left(\frac{M_1(1-M_1)}{v} - 1\right)\) |
| Negative Binomial \((r, p)\) | \(r, p\) | \(\hat{p} = 1 - \frac{M_1}{M_2 - M_1^2}\); \(\hat{r} = \frac{M_1^2}{M_2 - M_1^2 - M_1}\) (parameterized so that the mean is \(rp/(1-p)\)) |
| Log-Normal \((\mu, \sigma)\) | \(\mu, \sigma\) | \(\hat{\mu} = \log M_1 - \frac{1}{2}\log\frac{M_2}{M_1^2}\); \(\hat{\sigma}^2 = \log\frac{M_2}{M_1^2}\) |
| Weibull \((\lambda, k)\) | \(\lambda, k\) | \(\hat{k}\) solves \(\frac{\Gamma(1+2/k)}{\Gamma(1+1/k)^2} = \frac{M_2}{M_1^2}\) (numerically); \(\hat{\lambda} = \frac{M_1}{\Gamma(1+1/\hat{k})}\) |
| Chi-Squared \((k)\) | \(k\) | \(\hat{k} = M_1\) |
| Cauchy \((x_0, \gamma)\) | \(x_0, \gamma\) | moments do not exist, so MoM is not applicable; the sample median and half the interquartile range are the usual substitutes for \(x_0\) and \(\gamma\) |
| Pareto \((x_m, \alpha)\) | \(x_m, \alpha\) | \(\hat{\alpha} = \frac{M_1}{M_1 - x_m}\) (\(x_m\) known; requires \(\alpha > 1\)) |
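Any row of the table can be checked by simulation. A minimal sketch for the uniform row, with arbitrarily chosen true endpoints:

```python
import numpy as np

rng = np.random.default_rng(1)
# Simulated Uniform(a, b) data; a = 2 and b = 8 are arbitrary choices.
X = rng.uniform(2.0, 8.0, size=100_000)

M1 = np.mean(X)       # first sample raw moment
M2 = np.mean(X ** 2)  # second sample raw moment
v = M2 - M1 ** 2      # (1/n) sample variance

# Uniform(a, b): mean = (a + b)/2, variance = (b - a)^2 / 12.
a_hat = M1 - np.sqrt(3 * v)
b_hat = M1 + np.sqrt(3 * v)
```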
Related Topics
- Generalized Method of Moments
- Method of Moments Estimator
- Method of Moments Explanation
Method of Moments Practice Problems
- Suppose \(X_1, X_2,\dots, X_n\) are iid samples from a binomial distribution \(\text{Bin}(n,p)\).
- Write down the first moment (mean) of the binomial distribution in terms of n and p
- Use the method of moments to derive an estimator for p, assuming n is known.
- If \(n=10\) and the sample mean is 4, estimate p.
- Let \(X_1,X_2,\dots,X_n\) be iid samples from an exponential distribution with parameter \(\theta\), where the PDF is given by:
$$f(x;\theta)=\frac{1}{\theta}e^{-x/\theta}$$
- Write down the first moment of the exponential distribution
- Use the method of moments to derive an estimator for \(\theta\)
- If the sample mean is 5, estimate \(\theta\)
- Suppose \(X_1, X_2,\dots,X_n\) are iid samples from a uniform distribution on \((a, b)\):
- Write down the first and second moments of the uniform distribution in terms of a and b.
- Use the method of moments to derive estimators for both a and b
- If the sample mean is 4 and the sample variance is 9, estimate a and b.
- Suppose \(X_1, X_2,\dots,X_n\) are iid samples from a gamma distribution whose pdf is given by:
$$f(x;k,\theta)=\frac{1}{\Gamma(k)\theta^k}x^{k-1}e^{-x/\theta};\quad x\gt 0$$
where k is known and \(\theta\) is the parameter of interest. The following integral may be useful:
$$\int_{0}^{\infty}x^ne^{-ax}dx=\frac{n!}{a^{n+1}}$$
- Write down the first moment (mean) of the gamma distribution in terms of k and \(\theta\)
- Use the method of moments to derive an estimator for \(\theta\)
- Given k=2 and a sample mean of 6, estimate \(\theta\)
- Suppose \(X_1, X_2,\dots,X_n\) are iid samples from a beta distribution whose pdf is given by:
$$f(x;\alpha,\beta)=\frac{\Gamma(\alpha+\beta)}{\Gamma(\alpha)\Gamma(\beta)}x^{\alpha-1}(1-x)^{\beta-1}$$
The following integral may be useful:
$$\int_0^{1}x^{a}(1-x)^{b}\,dx=B(a+1,b+1)=\frac{\Gamma(a+1)\Gamma(b+1)}{\Gamma(a+b+2)}$$
- Write down the first two moments of the beta distribution in terms of \(\alpha\) and \(\beta\)
- Use the method of moments to derive estimators for \(\alpha\) and \(\beta\).
- Given a sample mean of 0.4 and a sample variance of 0.1, estimate \(\alpha\) and \(\beta\).