Definition of Weighted Average
Definition
The following is called the Weighted Mean of the data $\mathbf{x} = \left\{ x_{1} , \cdots , x_{n} \right\}$ with respect to a vector $\mathbf{w} = \left( w_{1} , \cdots , w_{n} \right) \in \mathbb{R}^{n}$ satisfying $\sum_{k=1}^{n} w_{k} \ne 0$. $$ {{ \sum_{k=1}^{n} w_{k} x_{k} } \over { \sum_{k=1}^{n} w_{k} }} = {{ w_{1} x_{1} + \cdots + w_{n} x_{n} } \over { w_{1} + \cdots + w_{n} }} $$ Here $\mathbf{w}$ is called a weight vector, or simply a weight.
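As a quick illustration, here is a minimal Python sketch of the definition; the data and weights are made-up values for the example, not anything from the text.

```python
def weighted_mean(x, w):
    """Return (sum_k w_k * x_k) / (sum_k w_k)."""
    if len(x) != len(w):
        raise ValueError("x and w must have the same length")
    total_weight = sum(w)
    if total_weight == 0:
        raise ValueError("weights must not sum to zero")
    return sum(wk * xk for wk, xk in zip(w, x)) / total_weight

print(weighted_mean([1.0, 2.0, 3.0], [1.0, 1.0, 1.0]))  # 2.0, the arithmetic mean
print(weighted_mean([1.0, 2.0, 3.0], [0.0, 0.0, 1.0]))  # 3.0, all weight on x_3
```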
Description
The Weighted Mean is a statistic that appears frequently in mathematical statistics and across many branches of mathematics. It generalizes the arithmetic mean, which is the special case where all weights are equal: if $\mathbf{w} = \left( a , \cdots , a \right)$ with $a \ne 0$, it reduces to the familiar sample mean as follows. $$ {{ a x_{1} + \cdots + a x_{n} } \over { a + \cdots + a }} = {{ x_{1} + \cdots + x_{n} } \over { n }} $$

If $\mathbf{x}$ is extended to multiple dimensions, the weighted mean can be interpreted geometrically as the centroid of a set of points, with repetition allowed. In physics, the center of mass is defined as a weighted mean in which each particle's mass serves as its weight. $$ \mathbf{r}_{cm}=\frac{m_{1}\mathbf{r}_{1}+m_{2}\mathbf{r}_{2}+\cdots + m_{n}\mathbf{r}_{n}}{m_{1}+ m_{2}+ \cdots+ m_{n}}=\frac{\sum m_{i}\mathbf{r}_{i}}{m} $$

In statistics, its home ground, examples are hard to single out precisely because there are so many; the weighted mean is so familiar that it tends to appear without any special comment. For example, the pooled sample variance $s_{p}^{2}$ of several samples is a weighted mean of the individual sample variances $s_{i}^{2}$ with weights $n_{i} - 1$. $$ s_{p}^{2} = {{ \left( n_{1} - 1 \right) s_{1}^{2} + \cdots + \left( n_{m} - 1 \right) s_{m}^{2} } \over { \left( n_{1} - 1 \right) + \cdots + \left( n_{m} - 1 \right) }} = {{ \sum_{i=1}^{m} \left( n_{i} - 1 \right) s_{i}^{2} } \over { \sum_{i=1}^{m} \left( n_{i} - 1 \right) }} $$
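A hedged sketch of the pooled-variance example: it is nothing more than `numpy.average` of the $s_{i}^{2}$ with weights $n_{i} - 1$. The three samples below are synthetic, and the sample sizes and distribution are assumptions chosen only to illustrate the formula.

```python
import numpy as np

# Three synthetic samples of different sizes from the same distribution.
rng = np.random.default_rng(0)
samples = [rng.normal(loc=0.0, scale=2.0, size=n) for n in (10, 25, 40)]

variances = np.array([s.var(ddof=1) for s in samples])  # s_i^2
weights = np.array([len(s) - 1 for s in samples])       # n_i - 1

# Pooled variance as the weighted mean of the sample variances.
s_p2 = np.average(variances, weights=weights)
print(s_p2)
```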
Exponentially Weighted Average
For time series data $\left\{ x_{t} \right\}_{t=1}^{n}$ and $\beta \in (0,1)$, the following value is called the exponentially weighted average of $\left\{ x_{t} \right\}_{t=1}^{n}$.
$$ \begin{align*} \dfrac{\beta^{n-1}x_{1} + \beta^{n-2}x_{2} + \cdots + \beta^{0}x_{n}}{\beta^{n-1} + \beta^{n-2} + \cdots + \beta^{0}} &= (1 - \beta) \dfrac{\beta^{n-1}x_{1} + \beta^{n-2}x_{2} + \cdots + \beta^{0}x_{n}}{1 - \beta^{n}} \\ &= \dfrac{ (1 - \beta) \sum\limits_{t=1}^{n}\beta^{n-t}x_{t} }{1 - \beta^{n}} \end{align*} $$
The first equality follows from the formula for the sum of a geometric series, $\beta^{n-1} + \beta^{n-2} + \cdots + \beta^{0} = \frac{1 - \beta^{n}}{1 - \beta}$. In other words, the $x_{t}$ are added up with weights that decay exponentially the further the data lie in the past. The same quantity can also be defined recursively as follows.
$$ \begin{align*} y_{0} &= 0 \\ y_{t} &= \beta y_{t-1} + (1-\beta) x_{t} = (1-\beta) \sum\limits_{j=1}^{t} \beta^{t-j} x_{j} \end{align*} $$
Since the weights $(1-\beta)\beta^{t-j}$ sum to $1 - \beta^{t}$ rather than $1$, $y_{t}$ is only a weighted sum; dividing by $1 - \beta^{t}$ yields the corresponding weighted average.
$$ \hat{y}_{t} = \dfrac{y_{t}}{1 - \beta^{t}} $$
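A minimal sketch of the recursion $y_{t} = \beta y_{t-1} + (1-\beta) x_{t}$ together with the correction $\hat{y}_{t} = y_{t} / (1 - \beta^{t})$, cross-checked against the closed form from the definition. The series `x` and the choice $\beta = 0.9$ are illustrative assumptions, not values from the text.

```python
def exponentially_weighted_average(x, beta=0.9):
    """Return the corrected averages y_t / (1 - beta**t) for t = 1, ..., n."""
    y = 0.0
    corrected = []
    for t, x_t in enumerate(x, start=1):
        y = beta * y + (1 - beta) * x_t        # weighted sum y_t
        corrected.append(y / (1 - beta ** t))  # weighted average \hat{y}_t
    return corrected

x = [3.0, 1.0, 4.0, 1.0, 5.0]
print(exponentially_weighted_average(x))

# Cross-check the final value against the direct weighted-average formula.
beta, n = 0.9, len(x)
direct = (sum(beta ** (n - t) * x[t - 1] for t in range(1, n + 1))
          / sum(beta ** (n - t) for t in range(1, n + 1)))
print(direct)  # equals the last corrected value above
```

The last printed value agrees with the final entry of the recursive computation, confirming that the bias correction turns the running weighted sum into the weighted average of the definition.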