
Autoregressive Process

Model ¹

Given white noise $\left\{ e_{t} \right\}_{t \in \mathbb{N}}$, the process defined by
$$Y_{t} := \phi_{1} Y_{t-1} + \phi_{2} Y_{t-2} + \cdots + \phi_{p} Y_{t-p} + e_{t}$$
is called a $p$th order **autoregressive process** $AR(p)$.

  • (1): $AR(1) : Y_{t} = \phi Y_{t-1} + e_{t}$
  • (2): $AR(2) : Y_{t} = \phi_{1} Y_{t-1} + \phi_{2} Y_{t-2} + e_{t}$
  • (p): $AR(p) : Y_{t} = \phi_{1} Y_{t-1} + \phi_{2} Y_{t-2} + \cdots + \phi_{p} Y_{t-p} + e_{t}$
  • (∞): $AR(\infty) : Y_{t} = e_{t} + \phi_{1} Y_{t-1} + \phi_{2} Y_{t-2} + \cdots$

  • $\mathbb{N}$ represents the set of natural numbers $\left\{ 1, 2, 3, \cdots \right\}$.
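
To make the defining recursion concrete, here is a minimal simulation sketch (Python/NumPy; the function name, Gaussian white noise, and zero initial values are illustrative assumptions, not from the source):

```python
import numpy as np

def simulate_ar(phi, n, sigma=1.0, seed=0):
    """Simulate Y_t = phi_1 Y_{t-1} + ... + phi_p Y_{t-p} + e_t.

    phi   : coefficients (phi_1, ..., phi_p)
    n     : length of the simulated series
    sigma : standard deviation of the white noise e_t
    """
    rng = np.random.default_rng(seed)
    p = len(phi)
    e = rng.normal(0.0, sigma, size=n)   # white noise {e_t}
    y = np.zeros(n)
    for t in range(n):
        # apply the defining recursion, using only the lags that exist so far
        for k in range(1, p + 1):
            if t - k >= 0:
                y[t] += phi[k - 1] * y[t - k]
        y[t] += e[t]
    return y

# e.g. an AR(2) sample path with phi_1 = 0.5, phi_2 = -0.3
path = simulate_ar([0.5, -0.3], n=200)
```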

Explanation


$AR(p)$ is called an ‘autoregressive process’ because it literally has the form of a regression equation in which the process’s own past values play the role of the independent variables. Naturally, independence among these variables cannot be assumed. Moreover, the definition does not require stationarity; for example, it is not hard to guess that $AR(1) : Y_{t} = \phi Y_{t-1} + e_{t}$ can show simple behavior such as increasing, decreasing, or oscillating, depending on $\phi$.
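
As a quick illustration of those simple movements, one can simulate $AR(1)$ for a few values of $\phi$ (a rough sketch, assuming Gaussian noise and a nonzero starting value; the particular parameter choices are only for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

def ar1(phi, n=100, sigma=0.1, y0=1.0):
    """AR(1): Y_t = phi * Y_{t-1} + e_t, started from Y_0 = y0."""
    e = rng.normal(0.0, sigma, size=n)
    y = np.empty(n)
    y[0] = y0
    for t in range(1, n):
        y[t] = phi * y[t - 1] + e[t]
    return y

growing     = ar1(1.05)    # phi > 1: tends to grow without bound
decaying    = ar1(0.5)     # 0 < phi < 1: decays toward the noise level
oscillating = ar1(-1.05)   # phi < -1: alternates in sign with growing amplitude
```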


  1. Cryer. (2008). Time Series Analysis: With Applications in R (2nd Edition): p66. ↩︎