Expectation of the Sum of Random Variables in the Form of Functions
Theorem
Given a random sample $X_{1} , \cdots , X_{n}$ and a function $g : \mathbb{R} \to \mathbb{R}$ for which $E g \left( X_{1} \right)$ and $\operatorname{Var} g \left( X_{1} \right)$ exist, the following hold:
- [1] Mean:
$$ E \left( \sum_{k=1}^{n} g \left( X_{k} \right) \right) = n E g \left( X_{1} \right) $$
- [2] Variance:
$$ \operatorname{Var} \left( \sum_{k=1}^{n} g \left( X_{k} \right) \right) = n \operatorname{Var} g \left( X_{1} \right) $$
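As a quick sanity check, the two identities can be tested by Monte Carlo. The sketch below uses arbitrary illustrative choices not taken from the theorem itself: $g(x) = x^{2}$ and $X_{k} \sim \text{Uniform}(0, 1)$, for which $E g(X_{1}) = 1/3$ and $\operatorname{Var} g(X_{1}) = 4/45$.

```python
import numpy as np

# Monte Carlo check of [1] and [2] for g(x) = x**2, X_k ~ Uniform(0, 1).
# (Both g and the distribution are arbitrary choices for illustration.)
rng = np.random.default_rng(0)
n, trials = 5, 200_000

samples = rng.uniform(0.0, 1.0, size=(trials, n))  # each row: one sample X_1, ..., X_n
sums = (samples ** 2).sum(axis=1)                  # sum_{k=1}^n g(X_k), one value per trial

# Theoretical values: E g(X_1) = E X^2 = 1/3,
# Var g(X_1) = E X^4 - (E X^2)^2 = 1/5 - 1/9 = 4/45.
mean_theory = n * (1 / 3)
var_theory = n * (4 / 45)

print(sums.mean(), mean_theory)  # empirical vs. n E g(X_1)
print(sums.var(), var_theory)    # empirical vs. n Var g(X_1)
```

With 200,000 trials the empirical mean and variance of the sum agree with $n E g(X_{1})$ and $n \operatorname{Var} g(X_{1})$ to within sampling error.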
Explanation
The critical point to note in this theorem is that $\left\{ X_{k} \right\}_{k=1}^{n}$ is a random sample, in other words iid. For example, if the variables are completely dependent, say $X_{i} = X_{j}$ for all $i \ne j$, and $g(x) = x$, then, as is well known from the properties of variance,
$$ \operatorname{Var} \left( \sum_{k=1}^{n} X_{k} \right) = \operatorname{Var} \left( n X_{1} \right) = n^{2} \operatorname{Var} X_{1} $$
This shows that the independence condition is absolutely necessary to derive theorem [2].
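This failure is easy to see numerically. A minimal sketch, again assuming $X \sim \text{Uniform}(0, 1)$ for illustration:

```python
import numpy as np

# Completely dependent case: X_1 = X_2 = ... = X_n.
# The variance of the sum then scales as n**2, not n, so [2] fails.
rng = np.random.default_rng(1)
n, trials = 5, 200_000

x = rng.uniform(0.0, 1.0, size=trials)  # a single X per trial
dependent_sum = n * x                   # X_1 + ... + X_n with every X_k equal to X

var_x = 1 / 12                          # Var X for Uniform(0, 1)
print(dependent_sum.var())              # close to n**2 * Var X, far from n * Var X
print(n ** 2 * var_x, n * var_x)
```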
Proof
[1]
Since expectation is linear and $X_{1} , \cdots , X_{n}$ are identically distributed, the following holds:
$$ \begin{align*} E \left( \sum_{k=1}^{n} g \left( X_{k} \right) \right) =& \sum_{k=1}^{n} E g \left( X_{k} \right) & \because \text{linearity} \\ =& n E g \left( X_{1} \right) & \because \text{identically distributed} \end{align*} $$
■
[2]
Since $X_{1} , \cdots , X_{n}$ are independent, $\operatorname{Cov} \left( g \left( X_{i} \right) , g \left( X_{j} \right) \right) = 0$ whenever $i \ne j$. Therefore
$$ \begin{align*} & \operatorname{Var} \left( \sum_{k=1}^{n} g \left( X_{k} \right) \right) \\ =& E \left[ \sum_{k=1}^{n} g \left( X_{k} \right) - E \sum_{k=1}^{n} g \left( X_{k} \right) \right]^{2} \\ =& E \left[ \sum_{k=1}^{n} \left[ g \left( X_{k} \right) - E g \left( X_{k} \right) \right] \right]^{2} \\ =& \sum_{k=1}^{n} E \left[ g \left( X_{k} \right) - E g \left( X_{k} \right) \right]^{2} + \sum_{i \ne j} E \left[ \left( g \left( X_{i} \right) - E g \left( X_{i} \right) \right) \left( g \left( X_{j} \right) - E g \left( X_{j} \right) \right) \right] \\ =& \sum_{k=1}^{n} \operatorname{Var} g \left( X_{k} \right) + \sum_{i \ne j} \operatorname{Cov} \left( g \left( X_{i} \right) , g \left( X_{j} \right) \right) \\ =& n \operatorname{Var} g \left( X_{1} \right) + 0 \end{align*} $$
holds.
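The key step above is that the cross terms vanish. This too can be checked empirically; the sketch below assumes, for illustration only, $g(x) = x^{2}$ and $X_{i}, X_{j} \sim \text{Uniform}(0, 1)$ drawn independently.

```python
import numpy as np

# For an iid sample, Cov(g(X_i), g(X_j)) = 0 whenever i != j —
# this is what collapses the double sum in the proof of [2].
rng = np.random.default_rng(2)
trials = 200_000

xi = rng.uniform(0.0, 1.0, size=trials)  # draws of X_i
xj = rng.uniform(0.0, 1.0, size=trials)  # draws of X_j, independent of X_i
cov = np.cov(xi ** 2, xj ** 2)[0, 1]     # sample Cov(g(X_i), g(X_j)), g(x) = x**2
print(cov)                               # close to 0
```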
■