
What is the Moment Generating Function?

Definition 1

For a random variable $X$, if there exists a positive number $h>0$ such that $E(e^{tX})$ exists for all $-h< t < h$, then $M(t) = E( e^{tX} )$ is called the Moment Generating Function of $X$.
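To make the role of $h$ concrete, consider two standard examples. For $X \sim \text{Bernoulli}(p)$, the mgf $M(t) = (1-p) + p e^{t}$ exists for every real $t$. For an exponential distribution with rate $\lambda$, however, $$ M(t) = \int_{0}^{\infty} e^{tx} \lambda e^{-\lambda x} \, dx = {{\lambda} \over {\lambda - t}}, \qquad t < \lambda $$ so the expectation exists only on part of the real line. This is why the definition asks for existence only on some interval $(-h, h)$ around $0$.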

Explanation

The moment generating function (mgf) is a concept encountered relatively early in mathematical statistics, yet its unfamiliar definition and seemingly context-free introduction can make it a reason to dislike the subject. The difficulty usually comes from textbooks diving straight into the definition and its applications, leaving readers knowing what the mgf is but not why it takes this form or what it is for. The term ‘moment generating function’ is simply the combination of ‘moment’ and ‘generating function’. For readers pressed for time, here are the key points:

  1. There is no need to agonize over what a moment is: fundamentally, a moment is an abstract quantity that subsumes the mean, the variance, and so on. Moments can be combined, according to their order, into meaningful statistical measures, but a moment on its own is not inherently meaningful in a statistical sense. It is enough to know what a moment is without forcing a connection to a specific statistic.
  2. The moment generating function is merely one kind of generating function: a generating function is nothing more than a power series whose coefficients encode a sequence of numbers, and the moment generating function is the one whose coefficients are (scaled) moments. Keeping this in mind makes its properties much easier to understand.

Expanding the moment generating function as a Maclaurin series yields the following. [ NOTE: This is also where the interval $-h<t<h$ in the definition comes from: the series must converge on a neighborhood of $0$. ] $$ \begin{align*} M(t) =& E(e^{tX}) \\ =& 1 + E(tX) + {{E(t^2 X^2)} \over {2!}} + \cdots \end{align*} $$ Since expectation is linear, this can be written as a generating function in $t$: $$ M(t) = 1 + E(X) t+ {{E( X^2) t^2 } \over {2!}} + \cdots $$ Note that the coefficient of the $t^k$ term is $\displaystyle {{E(X^{k})} \over {k!}} $, a constant multiple of the $k$-th moment. Now, differentiating both sides $n$ times with respect to $t$ and substituting $t=0$ gives $$ M^{(n)} (0) = E(X^{n}) $$ In other words, the function $M$ generates the moments of $X$, so the name moment generating function is entirely appropriate. If textbooks presented this generating-function viewpoint instead of simply handing us the definition of $M$, the concept would be considerably easier to grasp.
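The relation $M^{(n)}(0) = E(X^{n})$ can be checked directly for any concrete distribution. Below is a minimal sympy sketch; the three-point distribution is an arbitrary choice made purely for illustration.

```python
import sympy as sp

# A small finite distribution, chosen arbitrarily for illustration:
# X takes the values 0, 1, 3 with probabilities 1/2, 1/3, 1/6.
t = sp.symbols('t')
values = [0, 1, 3]
probs = [sp.Rational(1, 2), sp.Rational(1, 3), sp.Rational(1, 6)]

# Moment generating function M(t) = E(e^{tX}) = sum_i p_i * e^{t * x_i}
M = sum(p * sp.exp(t * x) for p, x in zip(probs, values))

# M^{(n)}(0) should equal the n-th raw moment E(X^n).
for n in range(1, 5):
    from_mgf = sp.simplify(sp.diff(M, t, n).subs(t, 0))
    raw_moment = sum(p * x**n for p, x in zip(probs, values))
    print(n, from_mgf, raw_moment)
```

Each printed pair agrees, matching the derivation above: differentiating the mgf $n$ times at $0$ recovers $E(X^{n})$.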

Now consider random variables $X$ and $Y$ with moment generating functions $M_{X}$ and $M_{Y}$, and suppose these two functions are identical on some interval around $0$. Moments were devised precisely to compute the statistical measures we ultimately care about, and if the mgfs agree then all of the moments agree; by the theorem below, $X$ and $Y$ then follow the same distribution. Thanks to this theorem, whenever distributions possess moment generating functions it is reasonable to compare the mgfs as stand-ins for the distributions themselves. Distribution functions, expressed as integrals, are convenient for representing probabilities but awkward for handling distributions as objects; this is where the utility of moment generating functions comes in, and they are used constantly in mathematical arguments about which distribution a random variable follows.
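As a quick sketch of how mgfs identify a distribution, the snippet below checks that the product of two Poisson mgfs is again a Poisson mgf. It relies on two standard facts not proven in this post: the mgf of a sum of independent random variables is the product of their mgfs, and the Poisson($\lambda$) mgf is $\exp(\lambda(e^{t}-1))$.

```python
import sympy as sp

t = sp.symbols('t')
lam1, lam2 = sp.symbols('lambda1 lambda2', positive=True)

def poisson_mgf(lam):
    # Standard closed form of the Poisson(lam) mgf: M(t) = exp(lam * (e^t - 1)).
    return sp.exp(lam * (sp.exp(t) - 1))

# For independent X ~ Poisson(lam1) and Y ~ Poisson(lam2),
# M_{X+Y}(t) = M_X(t) * M_Y(t).
M_sum = poisson_mgf(lam1) * poisson_mgf(lam2)

# The product coincides with the mgf of Poisson(lam1 + lam2), so by the
# uniqueness theorem below, X + Y ~ Poisson(lam1 + lam2).
M_target = poisson_mgf(lam1 + lam2)
print(M_sum.equals(M_target))   # True
```

The whole argument never touches a distribution function: matching mgfs on an interval is enough.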

Theorem

Let $X$ and $Y$ be random variables with moment generating functions $M_{X}$ and $M_{Y}$ and cumulative distribution functions $F_{X}$ and $F_{Y}$, respectively. Then $F_{X} (z) = F_{Y}(z)$ for all $z \in \mathbb{R}$ if and only if there exists some $h>0$ such that $M_{X}(t) = M_{Y}(t)$ for all $t \in (-h,h)$.


  • $\mathbb{R}$ represents the set of real numbers.
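As an illustration of how the theorem is typically applied: the mgf of the normal distribution $N(\mu, \sigma^{2})$ is known to be $$ M(t) = \exp \left( \mu t + {{\sigma^{2} t^{2}} \over {2}} \right) $$ so if a random variable $Y$ can be shown to have exactly this mgf on some interval $(-h, h)$, the theorem lets us conclude $Y \sim N(\mu, \sigma^{2})$ without ever computing its distribution function.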

  1. Hogg et al. (2013). Introduction to Mathematical Statistics (7th Edition): p. 59. ↩︎