Causal Autoregressive Time Series

In the MAT8181 graduate course on Time Series, we will discuss (almost) only causal models. For instance, with an $AR(1)$ process,

$$X_t=\phi X_{t-1}+\varepsilon_t$$

with some white noise $(\varepsilon_t)$, those models are obtained when $\vert\phi\vert<1$. In that case, we've seen that $(\varepsilon_t)$ was actually the innovation process, and we can write

$$X_t=\sum_{h=0}^{+\infty}\phi^h\varepsilon_{t-h}$$

which is a mean-square convergent series (the partial sums form a Cauchy sequence in $L^2$, since $\sum_h\phi^{2h}<+\infty$). From that expression, we can easily see that $(X_t)$ is stationary, since $\mathbb{E}(X_t)=0$ (which does not depend on $t$) and

$$\text{cov}(X_t,X_{t-h})=\frac{\phi^h}{1-\phi^2}\,\sigma^2$$

(which does not depend on $t$ either).
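To see where this expression comes from, expand the covariance using the series above; since the noise is uncorrelated, only the terms in which the same $\varepsilon$ appears twice survive:

$$\text{cov}(X_t,X_{t-h})=\mathbb{E}\left[\left(\sum_{i=0}^{+\infty}\phi^i\varepsilon_{t-i}\right)\left(\sum_{j=0}^{+\infty}\phi^j\varepsilon_{t-h-j}\right)\right]=\sigma^2\sum_{j=0}^{+\infty}\phi^{h+j}\phi^j=\frac{\phi^h}{1-\phi^2}\,\sigma^2$$

for $h\geq 0$, where $\sigma^2$ denotes the variance of the noise.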

Consider now the case where $\vert\phi\vert>1$. Clearly, we have some problem here, since

$$X_t=\sum_{h=0}^{+\infty}\phi^h\varepsilon_{t-h}$$

cannot be defined (the series does not converge in $L^2$). Nevertheless, it is still possible to write

$$X_t=\frac{1}{\phi}X_{t+1}-\frac{1}{\phi}\varepsilon_{t+1}$$

It is then possible to iterate (as in the previous case) and write

$$X_t=\sum_{h=1}^{+\infty}\frac{-1}{\phi^h}\,\varepsilon_{t+h}$$

which is actually well defined. And in that case, the sequence of random variables $(X_t)$ obtained from this equation is the unique stationary solution of the recursive equation $X_t=\phi X_{t-1}+\varepsilon_t$. This might be confusing, but this stationary solution should not be confused with the usual non-stationary solution of $X_t=\phi X_{t-1}+\varepsilon_t$ obtained by iterating forward from some starting value $X_0$, as in the code written to generate a time series in the previous post.
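To see why the series above is well defined, iterate the one-step relation $k$ times:

$$X_t=\frac{1}{\phi^k}X_{t+k}-\sum_{h=1}^{k}\frac{1}{\phi^h}\varepsilon_{t+h}$$

For a stationary solution, the variance of $\phi^{-k}X_{t+k}$ is $\phi^{-2k}\text{var}(X_t)$, which goes to $0$ since $\vert\phi\vert>1$; so the first term vanishes in $L^2$ as $k\rightarrow\infty$, and the remaining sum converges.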

Now, let us spend some time with this stationary time series, considered as unnatural in Brockwell and Davis (1991). One point is that, in the previous case (where $\vert\phi\vert<1$), $(\varepsilon_t)$ was the innovation process, so the variable $X_t$ was not correlated with the future of the noise, $\sigma\{\varepsilon_{t+1},\varepsilon_{t+2},\cdots\}$. This is no longer the case when $\vert\phi\vert>1$.
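Indeed, this can be read directly off the expansion in future values of the noise:

$$\text{cov}(X_t,\varepsilon_{t+1})=\text{cov}\left(-\sum_{h=1}^{+\infty}\frac{1}{\phi^h}\varepsilon_{t+h},\,\varepsilon_{t+1}\right)=-\frac{\sigma^2}{\phi}\neq 0$$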

All that looks nice, if you are willing to understand things at a theoretical level. But what does all that mean from a computational perspective? Consider some white noise (the same simulated noise will be used for all the series defined below),

> n=10000
> e=rnorm(n)
> plot(e,type="l",col="red")

To start with, if we look at the simple (causal) case,

> phi=.8
> X=rep(0,n)
> for(t in 2:n) X[t]=phi*X[t-1]+e[t]

The time series (the last 1,000 observations) looks like this:

Now, if we use the weighted cumulated sum of past values of the noise,

> Y=rep(0,n)
> # truncated moving-average expansion: Y[t] = e[t] + phi*e[t-1] + ... + phi^(t-1)*e[1]
> for(t in 2:n) Y[t]=sum(phi^((0:(t-1)))*e[t-(0:(t-1))])

we get

Which is exactly the same process! This should not surprise us, because that is what the theory told us.
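In fact, the two constructions can be compared directly: with the starting value $X_1=0$, they differ only by the transient term $\phi^{t-1}e_1$, which vanishes geometrically. A quick numerical check,

> max(abs(X[50:n]-Y[50:n]))

returns something numerically negligible.

Now, consider the problematic case, where $\vert\phi\vert>1$,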

> phi=1.1
> X=rep(0,n)
> for(t in 2:n) X[t]=phi*X[t-1]+e[t]

Clearly, that series is non-stationary (just look at the first 1,000 values)

Now, if we look at the series obtained from the weighted cumulated sum of future values of the noise,

> Y=rep(0,n)
> # truncated expansion in future noise: Y[t] = -sum of phi^(-h)*e[t+h], h=1..n-t
> # (note the minus sign, to match the formula above)
> for(t in 1:(n-1)) Y[t]=-sum((1/phi)^((1:(n-t)))*e[t+(1:(n-t))])

We get something which is, actually, stationary.
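Before looking at this series more closely, two quick sanity checks, using the vectors defined above: this series does solve the original recursive equation, and, unlike the causal case, it is correlated with the future of the noise.

> # Y satisfies Y[t] = phi*Y[t-1] + e[t], up to floating-point error
> max(abs(Y[2:(n-1)]-(phi*Y[1:(n-2)]+e[2:(n-1)])))
> # correlation with the future of the noise: theoretically -sqrt(phi^2-1)/phi, about -0.42
> cor(Y[1:(n-1)],e[2:n])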

So, what is this series, exactly? If you look at the autocorrelation function,

> acf(Y)

we get the autocorrelation function of a (stationary) $AR(1)$ process,

> acf(Y)[1]

Autocorrelations of series ‘Y’, by lag

    1 
0.908 

> 1/phi
[1] 0.9090909

Observe that there is a white noise, call it $(\eta_t)$, such that

$$X_t=\frac{1}{\phi}X_{t-1}+\eta_t$$

This is what we call the canonical form of the stationary process $(X_t)$.
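We can recover those canonical innovations from the simulated series and check that they do behave like white noise (a quick sketch using the vectors defined above; the index range is truncated because Y[n] was left at zero):

> # canonical innovations: eta[t] = Y[t] - (1/phi)*Y[t-1]
> eta=Y[2:(n-1)]-(1/phi)*Y[1:(n-2)]
> acf(eta)    # no significant autocorrelation: eta is a (weak) white noise
> sd(eta)     # close to 1/phi, the theoretical standard deviation of eta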




2 thoughts on “Causal Autoregressive Time Series”

  1. If I am not wrong, you can differentiate both series using the third or fourth moment, for instance, right?

    That’s why you need to rule out gaussianity to be able to check if a series is causal or not. This is quite important in the MA case, i.e. if the series is fundamental or not, for impulse-response functions in macroeconomics, for example.

    Very nice post!

    1. Thanks,

now, to get back to your question, I don't think I need the Gaussian assumption… I believe that I do not even need the existence of the third or fourth moment, actually. I will check that point when I can find some time!
