

Extended Autocorrelation Function

Buildup

The PACF helps determine the order of an $AR(p)$ model, while the ACF is useful for setting the order of an $MA(q)$ model. For an $ARMA(p,q)$ model, however, this breaks down: a stationary $AR(p)$ process admits an $MA(\infty)$ representation and an invertible $MA(q)$ process admits an $AR(\infty)$ representation, so neither plot cuts off cleanly. Various methods have therefore been devised to get around this and identify an ARMA model.

Definition

The Extended Autocorrelation Function (EACF) is one such method. For lags $k$ and $j$, it is defined through
$$ W_{t,k,j} := Y_{t} - \tilde{\phi}_{1} Y_{t-1} - \cdots - \tilde{\phi}_{k} Y_{t-k} $$

Explanation

Understanding the EACF from its definition alone is difficult, so let's work through the equations. Given lags $k$ and $j$, an $ARMA(k,j)$ model can be written as
$$ Y_{t} = \sum_{i = 1}^{k} \phi_{i} Y_{t-i} + e_{t} - \sum_{i = 1}^{j} \theta_{i} e_{t-i} $$
Regressing $Y_{t}$ on $Y_{t-1} , \cdots , Y_{t-k}$ by multiple regression yields coefficients $\tilde{\phi}_{1} , \cdots , \tilde{\phi}_{k}$, which are estimates of $\phi_{1} , \cdots , \phi_{k}$. The residuals then take the form
$$ Y_{t} - \sum_{i = 1}^{k} \tilde{\phi}_{i} Y_{t-i} = e_{t} - \sum_{i = 1}^{j} \theta_{i} e_{t-i} $$
On closer inspection, writing the left-hand side as $W_{t,k,j}$ leaves exactly an $MA(j)$ model:
$$ W_{t,k,j} = e_{t} - \sum_{i = 1}^{j} \theta_{i} e_{t-i} $$
Then, just as with the ACF, the sample autocorrelation of $W_{t,k,j}$ approximately follows a normal distribution $N \left( 0 , \dfrac{1}{n - k - j} \right)$, and this is used for hypothesis testing.
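To make this concrete, here is a minimal Python sketch of the test above for a single cell $(k, j)$. It follows the simplified single-regression version described in this post (the standard EACF uses iterated regressions), and the simulated model, sample size, and helper names are all illustrative assumptions.

```python
import numpy as np
from statsmodels.tsa.arima_process import ArmaProcess

# Simulate an ARMA(1,1) series: (1 - 0.6B) Y_t = (1 - 0.4B) e_t.
np.random.seed(42)
y = ArmaProcess(ar=[1.0, -0.6], ma=[1.0, -0.4]).generate_sample(nsample=500)

def ar_residuals(y, k):
    """OLS-regress Y_t on Y_{t-1}, ..., Y_{t-k} and return the residuals
    W_t = Y_t - phi~_1 Y_{t-1} - ... - phi~_k Y_{t-k}."""
    n = len(y)
    X = np.column_stack([y[k - i : n - i] for i in range(1, k + 1)])
    coef, *_ = np.linalg.lstsq(X, y[k:], rcond=None)
    return y[k:] - X @ coef

def sample_acf(w, lag):
    """Sample autocorrelation of w at the given lag."""
    w = w - w.mean()
    return np.dot(w[lag:], w[: len(w) - lag]) / np.dot(w, w)

# For cell (k, j): if W really is MA(j), its ACF beyond lag j vanishes.
k, j = 1, 1
w = ar_residuals(y, k)
r = sample_acf(w, j + 1)
bound = 1.96 / np.sqrt(len(y) - k - j)   # 95% band from N(0, 1/(n-k-j))
print(f"r = {r:.3f}, band = +/-{bound:.3f}, reject H0: {abs(r) > bound}")
```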

To put it simply, this approach untangles the complicated $ARMA(k,j)$ model by fixing the lag $k$ to strip away the $AR(k)$ part, and then fixing the lag $j$ so that only the $MA(j)$ part remains to be examined. It is aptly named the Extended Autocorrelation Function because, in the end, it is the ACF that identifies $MA(j)$.

Exercise

[Figure: theoretical EACF table for an $ARMA(1,1)$ model, with each cell marked O or X]

Since the lags run along two axes, unlike the ACF or PACF it is impossible to draw a correlogram¹; instead, a table is used that simply marks with O and X whether the null hypothesis is rejected at each cell. Note that this table is not drawn from actual data: it is the theoretical pattern that an ideally behaved $ARMA(1,1)$ model should produce. Real tables are rarely this clean.
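With real (or simulated) data, the whole O/X table can be tabulated in the same way. This sketch reuses the hypothetical ar_residuals and sample_acf helpers from the previous snippet; because it skips the full EACF's iterated regressions, its output is usually noisier than the idealized pattern shown above.

```python
def eacf_table(y, max_p=5, max_q=5):
    """Simplified EACF table: cell (k, j) is 'O' when the lag-(j+1)
    sample autocorrelation of W_{t,k,j} lies inside the 95% band
    implied by N(0, 1/(n-k-j)), and 'X' otherwise."""
    n = len(y)
    table = []
    for k in range(max_p + 1):
        w = ar_residuals(y, k) if k > 0 else y - y.mean()  # k = 0: no AR fit
        table.append([
            "O" if abs(sample_acf(w, j + 1)) <= 1.96 / np.sqrt(n - k - j) else "X"
            for j in range(max_q + 1)
        ])
    return table

for k, row in enumerate(eacf_table(y)):
    print(k, " ".join(row))
```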

The method to read the table is as follows:

  • Step 1. Starting from the upper left, find a vertex from which a line of O's continues horizontally to the right.
  • Step 2. From that vertex, find the line of O's that drops diagonally down and to the right, forming an acute angle with the horizontal line.
  • Step 3. If both lines can be identified, analyze with the $ARMA(p,q)$ corresponding to the vertex (a toy implementation of this scan is sketched below).
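As mentioned in Step 3, the wedge-finding heuristic itself can be sketched in code. This toy scan looks only for the idealized all-O wedge and returns the topmost, then leftmost vertex; the helper name and tie-breaking rule are assumptions, and real tables usually call for judgment rather than a mechanical scan.

```python
def find_vertex(table):
    """Find the upper-left vertex (p, q) of an all-'O' wedge: row p is
    'O' from column q onward, row p+1 from column q+1 onward, and so on.
    Returns the first such (p, q), or None if no clean wedge exists."""
    n_rows, n_cols = len(table), len(table[0])
    for p in range(n_rows):
        for q in range(n_cols):
            if all(
                table[p + d][j] == "O"
                for d in range(n_rows - p)
                for j in range(q + d, n_cols)
            ):
                return p, q
    return None

# Ideally (1, 1) for the simulated ARMA(1,1) series above.
print(find_vertex(eacf_table(y)))
```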

Following this method, the vertex O* in the illustration makes $ARMA(1,1)$ the candidate model. As anyone who has studied time series for a while will sense from this explanation, actual analysis involves a great deal of subjectivity and ambiguity. The only way to get accustomed to it is to perform many analyses yourself.



  1. Technically, it could be drawn in three dimensions, but this is avoided because it is inconvenient to read; the analyst only needs to check whether each value falls inside or outside the confidence interval.