
Autoregressive Process

Model 1

Let $\left\{ e_{t} \right\}_{t \in \mathbb{N}}$ be white noise. The process defined by $Y_{t} := \phi_{1} Y_{t-1} + \phi_{2} Y_{t-2} + \cdots + \phi_{p} Y_{t-p} + e_{t}$ is called a $p$th-order autoregressive process, denoted $AR(p)$.

  • (1): $AR(1) : Y_{t} = \phi Y_{t-1} + e_{t}$
  • (2): $AR(2) : Y_{t} = \phi_{1} Y_{t-1} + \phi_{2} Y_{t-2} + e_{t}$
  • (p): $AR(p) : Y_{t} = \phi_{1} Y_{t-1} + \phi_{2} Y_{t-2} + \cdots + \phi_{p} Y_{t-p} + e_{t}$
  • (∞): $AR( \infty ) : Y_{t} = e_{t} + \phi_{1} Y_{t-1} + \phi_{2} Y_{t-2} + \cdots $
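The recursions above translate directly into a simulation. The following is a minimal sketch (NumPy assumed; `simulate_ar` is a hypothetical helper, not from the source) that generates a sample path of an $AR(p)$ process driven by standard normal white noise, starting from zero initial values:

```python
import numpy as np

def simulate_ar(phi, n, seed=0):
    """Simulate Y_t = phi_1 Y_{t-1} + ... + phi_p Y_{t-p} + e_t
    with standard normal white noise e_t and zero initial values."""
    rng = np.random.default_rng(seed)
    e = rng.standard_normal(n)  # white noise {e_t}
    y = np.zeros(n)
    for t in range(n):
        # add the autoregressive terms that are available so far
        for j, ph in enumerate(phi, start=1):
            if t - j >= 0:
                y[t] += ph * y[t - j]
        y[t] += e[t]
    return y

y1 = simulate_ar([0.8], 200)         # AR(1) with phi = 0.8
y2 = simulate_ar([0.5, -0.3], 200)   # AR(2) with phi_1 = 0.5, phi_2 = -0.3
```

Passing a longer coefficient list gives a higher-order $AR(p)$; the $AR(\infty)$ case is of course only approximable by truncating the coefficient sequence.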

Explanation


$AR(p)$ is called an ‘autoregressive process’ because it literally has the form of a regression equation in which the process's own past values play the role of the independent variables. Note that, unlike in ordinary regression, independence among these regressors is clearly not assumed. Moreover, the definition does not require stationarity, and it is not hard to guess that, depending on the sign and magnitude of $\phi$, $AR(1) : Y_{t} = \phi Y_{t-1} + e_{t}$ can show simple movements such as increasing, decreasing, or oscillating.
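The dependence of the $AR(1)$ behavior on the sign of $\phi$ can be checked empirically. The sketch below (NumPy assumed; `ar1` and `lag1_corr` are hypothetical helpers) simulates two $AR(1)$ paths and compares their sample lag-1 autocorrelations: a positive $\phi$ yields smooth, persistent movements, while a negative $\phi$ yields oscillation between signs:

```python
import numpy as np

def ar1(phi, n=2000, seed=1):
    """Sample path of Y_t = phi * Y_{t-1} + e_t with standard normal noise."""
    rng = np.random.default_rng(seed)
    e = rng.standard_normal(n)
    y = np.zeros(n)
    for t in range(1, n):
        y[t] = phi * y[t - 1] + e[t]
    return y

def lag1_corr(y):
    """Sample correlation between Y_t and Y_{t-1}."""
    return np.corrcoef(y[:-1], y[1:])[0, 1]

pos = lag1_corr(ar1(0.8))    # positive: consecutive values move together
neg = lag1_corr(ar1(-0.8))   # negative: consecutive values tend to alternate
```

With $|\phi| < 1$ the sample lag-1 autocorrelation settles near $\phi$ itself, which matches the intuition above.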


  1. Cryer. (2008). *Time Series Analysis: With Applications in R* (2nd Edition): p66. ↩︎