
What is the Moment Generating Function?

Definition 1

For a random variable $X$ and some positive number $h > 0$, if $E(e^{tX})$ exists in $-h < t < h$, then $M(t) = E(e^{tX})$ is defined as the Moment Generating Function of $X$.
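
As a quick illustration of the definition (a standard example, not specific to this textbook), consider an exponential random variable $X$ with rate $\lambda > 0$. A direct computation gives
$$
M(t) = E(e^{tX}) = \int_{0}^{\infty} e^{tx} \lambda e^{-\lambda x} dx = {{\lambda} \over {\lambda - t}}, \qquad t < \lambda
$$
The integral diverges for $t \ge \lambda$, so the mgf exists only on an interval $-h < t < h$ with $h \le \lambda$ rather than on all of $\mathbb{R}$, which is exactly why the definition asks for existence only on such an interval.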

Explanation

The moment generating function (mgf) appears fairly early in mathematical statistics, yet its unfamiliar definition and seemingly contextless introduction can easily turn readers off the subject. The difficulty usually comes from textbooks diving straight into the definition and its applications, leaving readers knowing what the mgf is but not why it takes this form or what it is for. As the name suggests, ‘moment generating function’ is simply ‘moment’ combined with ‘generating function’. For readers pressed for time, the key points are:

  1. There is no need to agonize over what a moment is: fundamentally, a moment is an abstract quantity that encompasses the mean, the variance, and so on. Moments become meaningful statistical measures only when combined appropriately according to their order; on their own they carry no particular statistical meaning. It is enough to accept a moment as it is, without forcing a connection to a specific statistical measure.
  2. The moment generating function is merely a kind of generating function: a generating function is just a power series whose coefficients encode a sequence, and the moment generating function is the one whose coefficients are (scaled) moments. Keeping this in mind makes its properties much easier to understand (a short note making this concrete follows the list).
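
To make this concrete, a generating function for a sequence $a_{0}, a_{1}, a_{2}, \dots$ is nothing more than the power series that stores the sequence in its coefficients,
$$
G(t) = \sum_{k=0}^{\infty} a_{k} t^{k}
$$
and the moment generating function is exactly this object with $a_{k} = \displaystyle {{E(X^{k})} \over {k!}}$, as the expansion below shows.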

Expanding the moment generating function in a Maclaurin series yields the following. [ NOTE: The rationale for requiring convergence on $-h < t < h$ in the definition can be seen here. ]
$$
\begin{align*} M(t) =& E(e^{tX}) \\ =& 1 + E(tX) + {{E(t^2 X^2)} \over {2!}} + \cdots \end{align*}
$$
Since expectation is linear, this can be written as a generating function in $t$, as shown below.
$$
M(t) = 1 + E(X) t + {{E(X^2) t^2} \over {2!}} + \cdots
$$
Note that the coefficient of the $t^{k}$ term is a constant multiple of the $k$-th moment, namely $\displaystyle {{E(X^{k})} \over {k!}}$. Now, differentiating both sides $n$ times with respect to $t$ and substituting $t = 0$ gives
$$
M^{(n)}(0) = E(X^{n})
$$
Therefore the function $M$ literally generates the moments, and it is entirely appropriate to call it the moment generating function. Had the definition explained $M$ as a generating function, instead of simply handing it over, it would have been considerably easier to grasp.
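
The relation $M^{(n)}(0) = E(X^{n})$ is easy to verify symbolically. Below is a minimal sketch using sympy (the choice of library and of the exponential distribution are mine, purely for illustration): it differentiates the mgf $\lambda/(\lambda - t)$ of an exponential random variable and checks the result against the known moments $E(X^{n}) = n!/\lambda^{n}$.

```python
import sympy as sp

t, lam = sp.symbols("t lambda", positive=True)

# mgf of an Exponential(lambda) random variable, valid for t < lambda
M = lam / (lam - t)

# The n-th derivative of M at t = 0 should equal the n-th moment E(X^n) = n!/lambda^n
for n in range(1, 5):
    derivative_at_zero = sp.diff(M, t, n).subs(t, 0)
    known_moment = sp.factorial(n) / lam**n
    assert sp.simplify(derivative_at_zero - known_moment) == 0
    print(f"M^({n})(0) =", sp.simplify(derivative_at_zero))
```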

Now suppose random variables $X$ and $Y$ have moment generating functions $M_{X}$ and $M_{Y}$, and suppose these two functions are identical. Moments were devised precisely to compute the statistical measures that are ultimately of interest, and if every moment of $X$ equals the corresponding moment of $Y$, it is natural to expect $X$ and $Y$ to follow the same distribution. The theorem below confirms this: when distributions possess moment generating functions, comparing the moment generating functions is as good as comparing the distributions themselves. Distribution functions, written as integrals, are convenient for expressing probabilities but awkward for handling the distributions themselves. This is where moment generating functions earn their keep, and they are used constantly when arguing mathematically about which distribution a random variable follows.

Theorem

Let $X$ and $Y$ be random variables with moment generating functions $M_{X}$ and $M_{Y}$ and cumulative distribution functions $F_{X}$ and $F_{Y}$, respectively. Then $F_{X}(z) = F_{Y}(z)$ for all $z \in \mathbb{R}$ holds if and only if there exists some $h > 0$ such that $M_{X}(t) = M_{Y}(t)$ for all $t \in (-h, h)$.


  • $\mathbb{R}$ represents the set of real numbers.
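
As a typical use of the theorem (a standard example): if a random variable $X$ is shown to satisfy
$$
M_{X}(t) = \exp \left( \mu t + {{\sigma^{2} t^{2}} \over {2}} \right)
$$
for all $t$ in some interval $(-h, h)$, then $M_{X}$ agrees with the moment generating function of the normal distribution $N(\mu, \sigma^{2})$, and the theorem allows the conclusion $X \sim N(\mu, \sigma^{2})$ without ever working with the distribution function directly.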

  1. Hogg et al. (2013). Introduction to Mathematical Statistics (7th Edition): p59. ↩︎