
Introduction to Moment Generating Functions

In statistics, a moment is a quantitative measure of the shape of a distribution. The moment generating function of a random variable, when it exists, is a unique descriptor of the variable's probability distribution.

What is a Moment?

The nth moment is the expected value of X^n, denoted E(X^n).

The first moment is E(X), which represents the mean of the distribution. The second moment, E(X^2), forms part of the expression for variance: E(X^2) - E(X)^2.

They are, in essence, measures of the shape of the distribution. If your distribution is bounded (i.e., it doesn’t stretch infinitely to either side), it has a unique collection of moments E(X), E(X^2), E(X^3), \dots
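As a quick illustration of moments computed straight from the definition (a fair six-sided die, an example not in the original text), the first two moments give us both the mean and the variance:

```python
from fractions import Fraction

# Fair six-sided die: outcomes 1..6, each with probability 1/6 (illustrative example)
outcomes = range(1, 7)
p = Fraction(1, 6)

m1 = sum(x * p for x in outcomes)       # first moment  E(X)
m2 = sum(x**2 * p for x in outcomes)    # second moment E(X^2)
variance = m2 - m1**2                   # E(X^2) - E(X)^2

print(m1)        # 7/2
print(m2)        # 91/6
print(variance)  # 35/12
```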

What is a Moment Generating Function?

A Moment Generating Function (MGF) is a function whose derivatives, evaluated at t = 0, generate each moment of a distribution.

The MGF for a continuous random variable is

\begin{align*} M(t) = E[e^{tx}] = \int_{-\infty}^{\infty} e^{tx}f(x)\,dx \end{align*}

and the MGF for a discrete random variable is

\begin{align*} M(t) = E[e^{tx}] = \sum_{x} e^{tx}f(x) \end{align*}

Once you find M(t), take the nth derivative and plug in 0 for t to get your nth moment.
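As a sketch of this recipe (using the standard normal MGF, M(t) = e^{t^2/2}, a known result not derived in this post, and simple finite differences standing in for symbolic derivatives):

```python
import math

def mgf_normal(t):
    # MGF of the standard normal distribution: M(t) = exp(t^2 / 2)
    return math.exp(t * t / 2)

def first_moment(M, h=1e-4):
    # Central-difference approximation of M'(0), the first moment
    return (M(h) - M(-h)) / (2 * h)

def second_moment(M, h=1e-4):
    # Central-difference approximation of M''(0), the second moment
    return (M(h) - 2 * M(0) + M(-h)) / (h * h)

print(first_moment(mgf_normal))   # approximately 0 (the mean)
print(second_moment(mgf_normal))  # approximately 1 (E(X^2))
```

Since the mean is 0 here, E(X^2) is also the variance, matching the familiar N(0, 1) parameters.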

Proof

So why and how does this work? We can look to the Taylor series for e^x for our answer.

We know that

\begin{align*} e^x &= 1 + x + \frac{x^2}{2!} + \dots + \frac{x^n}{n!} + \dots \\ e^{tx} &= 1 + tx + \frac{(tx)^2}{2!} + \dots + \frac{(tx)^n}{n!} + \dots \end{align*}

If we take the expected value of etx, we get

\begin{align*} E(e^{tx}) &= E(1) + E(tx) + E\left(\frac{(tx)^2}{2!}\right) + \dots + E\left(\frac{(tx)^n}{n!}\right) + \dots \\ &= E(1) + tE(x) + \frac{t^2}{2!}E(x^2) + \dots + \frac{t^n}{n!}E(x^n) + \dots \end{align*}

and then if we take the derivative of the expression (with respect to t), we get

\begin{align*} \frac{d}{dt}E(e^{tx}) &= \frac{d}{dt}E(1) + \frac{d}{dt}tE(X) + \frac{d}{dt}\frac{t^2}{2!}E(X^2) + \dots + \frac{d}{dt}\frac{t^n}{n!}E(X^n) + \dots \\ &= 0 + E(X) + tE(X^2) + \dots + \frac{nt^{n-1}}{n!}E(X^n) + \dots \\ \text{plug in } t = 0 \quad &= 0 + E(X) + 0 + \dots + 0 + \dots \\ &= E(X) \end{align*}

Thus proving that M'(t)\big|_{t=0} = E(X). So, we’ve proven the first derivative; what about the rest? Look at the pattern occurring as we take the derivative. First, each fraction reduces to the form of the previous term. The power rule takes the n in the exponent and puts it in the numerator of the fraction:

\begin{align*} \frac{d}{dt}\frac{t^n}{n!} = \frac{nt^{n-1}}{n!} \end{align*}

This n now cancels with the leading factor of the factorial in the denominator, reducing it to the factorial of n - 1:

\begin{align*} \frac{t^{n-1}}{(n-1)!} \end{align*}

This is now the coefficient of the (n-1)th term.

Second, notice how only one term “survives” after we plug in t = 0: the first term that doesn’t evaluate to 0 after taking the necessary number of derivatives. Every earlier term has already been differentiated down to 0, and every later term still carries a factor of t, which turns the whole term into 0 once t = 0 is plugged in.

So the nth derivative of the series is

\begin{align*} \frac{d}{dt}E(e^{tx}) &= 0 + E(X) + tE(X^2) + \dots + \frac{t^{n-1}}{(n-1)!}E(X^n) + \dots \\ \frac{d^2}{dt^2}E(e^{tx}) &= 0 + 0 + E(X^2) + \dots + \frac{t^{n-2}}{(n-2)!}E(X^n) + \dots \\ \frac{d^3}{dt^3}E(e^{tx}) &= 0 + 0 + 0 + E(X^3) + \dots + \frac{t^{n-3}}{(n-3)!}E(X^n) + \dots \end{align*}

and then when we plug in t=0:

\begin{align*} \frac{d}{dt}E(e^{tx}) &= 0 + E(X) + 0 + \dots + 0 + \dots = E(X) \\ \frac{d^2}{dt^2}E(e^{tx}) &= 0 + 0 + E(X^2) + \dots + 0 + \dots = E(X^2) \\ \frac{d^3}{dt^3}E(e^{tx}) &= 0 + 0 + 0 + E(X^3) + \dots + 0 + \dots = E(X^3) \end{align*}

Examples

Example 1: Continuous Random Variable

Let’s take a continuous random variable with density f(x) = e^{-x} for x \geq 0 and 0 otherwise. The moment generating function is then

\begin{align*} M(t) = E[e^{tx}] = \int_0^{\infty} e^{tx}f(x)\,dx = \lim_{n \to \infty}\int_0^n e^{(t-1)x}\,dx = \lim_{n \to \infty}\frac{1}{t-1}\left(e^{(t-1)n} - 1\right) \end{align*}

From here there are three cases to consider:

  • For t > 1, the term e^{(t-1)n} diverges to infinity as n increases. Because the limit diverges, the MGF is undefined
  • For t = 1, the term \frac{1}{t-1} becomes undefined. Thus, the MGF is undefined here as well
  • For t < 1, the term e^{(t-1)n} converges to 0, so the MGF is defined

So, the Moment Generating Function for t<1 is

M(t) = \frac{1}{t-1}(0)-\frac{1}{t-1} = (1-t)^{-1}

Also for t<1, the first and second derivatives are

M'(t) = \frac{d}{dt} (1-t)^{-1} = (1-t)^{-2}

M''(t) = \frac{d^2}{dt^2} (1-t)^{-1} = 2(1-t)^{-3}

If we are trying to find the expectation for this function in the defined range, we plug in 0 for t in M'(t), which gives us an expectation of 1.

For variance, we would use the equation

M''(0) - M'(0)^2

which gives us a variance of 2 - 1 = 1
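We can sanity-check these derivatives numerically. This is a quick sketch (not from the original post) that applies finite differences to the closed form M(t) = (1-t)^{-1} instead of differentiating by hand:

```python
def mgf_exp(t):
    # MGF of the exponential density f(x) = e^{-x}, valid for t < 1
    return 1.0 / (1.0 - t)

h = 1e-4
m1 = (mgf_exp(h) - mgf_exp(-h)) / (2 * h)                 # approximates M'(0)  = E(X)
m2 = (mgf_exp(h) - 2 * mgf_exp(0) + mgf_exp(-h)) / h**2   # approximates M''(0) = E(X^2)

print(m1)            # approximately 1
print(m2)            # approximately 2
print(m2 - m1**2)    # approximately 1, the variance
```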

Example 2: Discrete Random Variable

Given a random variable Y such that

| y | P(Y = y) |
|---|----------|
| 0 | 0.50     |
| 1 | 0.25     |
| 2 | 0.25     |

The MGF is

\begin{align*} M(t) = \sum_y e^{ty}P(Y=y) = 0.5 + 0.25e^{t} + 0.25e^{2t} \end{align*}

So the Expectation is

\begin{align*} M'(t)\Big|_{t=0} = \frac{d}{dt}\left[ \sum_y e^{ty}P(Y=y) \right]_{t=0} = \left[0.25e^t + 0.5e^{2t}\right]_{t=0} = 0.75 \end{align*}

and the Variance is

\begin{align*} M''(t)\Big|_{t=0} - E(Y)^2 &= \frac{d^2}{dt^2}\left[ \sum_y e^{ty}P(Y=y) \right]_{t=0} - 0.75^2\\ &= \left[0.25e^t + e^{2t}\right]_{t=0} - 0.5625 = 1.25 - 0.5625 = 0.6875 \end{align*}
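The discrete case is easy to check both ways: directly from the pmf, and by numerically differentiating the MGF. A small sketch (the finite-difference approach is mine, not from the original post):

```python
import math

# pmf of Y from the table above
pmf = {0: 0.5, 1: 0.25, 2: 0.25}

def mgf(t):
    # M(t) = sum over y of e^{ty} P(Y=y)
    return sum(math.exp(t * y) * p for y, p in pmf.items())

# Moments straight from the definition, for comparison
mean = sum(y * p for y, p in pmf.items())        # E(Y)   = 0.75
second = sum(y**2 * p for y, p in pmf.items())   # E(Y^2) = 1.25

# Finite-difference approximations of M'(0) and M''(0)
h = 1e-4
m1 = (mgf(h) - mgf(-h)) / (2 * h)
m2 = (mgf(h) - 2 * mgf(0) + mgf(-h)) / h**2

print(mean, second - mean**2)   # 0.75 0.6875
```

Both routes agree: the MGF derivatives reproduce the moments computed directly from the table.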