Time Series 3—ARMA
The autoregressive moving average (ARMA) model combines autoregressive and moving average components. These notes work through its building blocks: stationarity, ergodicity, moving average and autoregressive processes, and their autocorrelation and partial autocorrelation functions.
Stationarity
A stochastic process $\{Y_t\}$ is covariance stationary, or weakly stationary, if for all $t$ and $j$ the mean $E(Y_t) = \mu$ and the autocovariance $E[(Y_t - \mu)(Y_{t-j} - \mu)] = \gamma_j$ exist and do not depend on the date $t$.

If a process is covariance stationary, the covariance between $Y_t$ and $Y_{t-j}$ depends only on the lag $j$, not on the date $t$.

For a covariance stationary series, we can define the autocorrelation between $Y_t$ and $Y_{t-j}$ as

$$\rho_j = \frac{\gamma_j}{\gamma_0},$$

where $\gamma_0 = \mathrm{Var}(Y_t)$. Note that $\rho_0 = 1$ and $|\rho_j| \le 1$.
Ergodicity
Imagine a battery of $I$ identical computers, each generating its own realization $\{y_t^{(i)}\}$ of the process, and consider selecting the observation dated $t$ from each realization.

We might view $E(Y_t)$ as the probability limit of the ensemble average

$$E(Y_t) = \operatorname*{plim}_{I \to \infty} \frac{1}{I} \sum_{i=1}^{I} y_t^{(i)}.$$

The above expectations of a time series in terms of ensemble averages may seem a bit contrived. Usually we have a single realization of size $T$ from the process, from which we compute the time average

$$\bar{y} = \frac{1}{T} \sum_{t=1}^{T} y_t.$$

A covariance stationary process is said to be ergodic for the mean if the time average converges in probability to the ensemble mean, $\bar{y} \xrightarrow{p} E(Y_t)$, as $T \to \infty$. A sufficient condition is that the autocovariances are absolutely summable: $\sum_{j=0}^{\infty} |\gamma_j| < \infty$.
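To make the distinction concrete, here is a minimal numpy sketch (the parameter values and the specific processes are illustrative, not from the text). It contrasts a process that is ergodic for the mean with the classic counterexample $Y_t = \mu + Z + \varepsilon_t$, where $Z$ is drawn once per realization: the latter is covariance stationary, yet the time average of a single realization does not converge to the ensemble mean $\mu$.

```python
import numpy as np

rng = np.random.default_rng(0)
mu, T, n_realizations = 2.0, 5_000, 5_000

# Ergodic case: Y_t = mu + eps_t (white noise around a fixed mean).
# The time average of one realization converges to the ensemble mean mu.
eps = rng.normal(size=T)
print("ergodic, time average     :", (mu + eps).mean())        # ~ 2.0

# Non-ergodic but covariance stationary case: Y_t = mu + Z + eps_t,
# where Z ~ N(0, 1) is drawn once per realization.  Every autocovariance
# gamma_j equals Var(Z) for j >= 1, so they are not absolutely summable.
Z = rng.normal()
print("non-ergodic, time average :", (mu + Z + eps).mean())    # ~ mu + Z, not mu

# The ensemble average across many realizations at a fixed date t is still mu.
ensemble = mu + rng.normal(size=n_realizations) + rng.normal(size=n_realizations)
print("non-ergodic, ensemble avg :", ensemble.mean())          # ~ 2.0
```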
Moving Average Processes
The First-Order Moving Average Process
Let $\{\varepsilon_t\}$ be a white noise process with variance $\sigma^2$, and consider

$$Y_t = \mu + \varepsilon_t + \theta \varepsilon_{t-1},$$

where $\mu$ and $\theta$ are constants. This is the first-order moving average, or MA(1), process.
Expectation
The expectation of $Y_t$ is

$$E(Y_t) = \mu + E(\varepsilon_t) + \theta E(\varepsilon_{t-1}) = \mu.$$
Variance
The variance of $Y_t$ is

$$\gamma_0 = E(Y_t - \mu)^2 = E(\varepsilon_t + \theta \varepsilon_{t-1})^2 = (1 + \theta^2)\sigma^2.$$
Autocovariance
The first autocovariance of $Y_t$ is

$$\gamma_1 = E[(\varepsilon_t + \theta \varepsilon_{t-1})(\varepsilon_{t-1} + \theta \varepsilon_{t-2})] = \theta \sigma^2.$$

Higher autocovariances are all zero: for all $j > 1$, $\gamma_j = 0$.
Autocorrelation
The first autocorrelation is

$$\rho_1 = \frac{\theta \sigma^2}{(1 + \theta^2)\sigma^2} = \frac{\theta}{1 + \theta^2},$$

and $\rho_j = 0$ for $j > 1$.
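As a quick numerical check, the following sketch (illustrative parameter values; any white noise generator would do) simulates an MA(1) series with numpy and compares the sample mean, variance, and first two autocovariances with the formulas above.

```python
import numpy as np

rng = np.random.default_rng(1)
mu, theta, sigma, T = 0.5, 0.6, 1.0, 200_000   # illustrative values

# Simulate Y_t = mu + eps_t + theta * eps_{t-1}
eps = rng.normal(scale=sigma, size=T + 1)
y = mu + eps[1:] + theta * eps[:-1]

ybar = y.mean()
gamma0 = np.mean((y - ybar) ** 2)
gamma1 = np.mean((y[1:] - ybar) * (y[:-1] - ybar))
gamma2 = np.mean((y[2:] - ybar) * (y[:-2] - ybar))

print("mean  :", round(ybar, 3),            " theory:", mu)
print("gamma0:", round(gamma0, 3),          " theory:", (1 + theta**2) * sigma**2)
print("gamma1:", round(gamma1, 3),          " theory:", theta * sigma**2)
print("gamma2:", round(gamma2, 3),          " theory: 0")
print("rho1  :", round(gamma1 / gamma0, 3), " theory:", round(theta / (1 + theta**2), 3))
```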
The $q$th-Order Moving Average Process

A $q$th-order moving average process, denoted MA($q$), is characterized by

$$Y_t = \mu + \varepsilon_t + \theta_1 \varepsilon_{t-1} + \theta_2 \varepsilon_{t-2} + \cdots + \theta_q \varepsilon_{t-q}.$$
Expectation
The expectation of $Y_t$ is again $E(Y_t) = \mu$.
Variance
The variance of $Y_t$ is

$$\gamma_0 = (1 + \theta_1^2 + \theta_2^2 + \cdots + \theta_q^2)\sigma^2.$$
Autocovariance
The autocovariance of $Y_t$ at lag $j$, for $j = 1, 2, \ldots, q$, is

$$\gamma_j = (\theta_j + \theta_{j+1}\theta_1 + \theta_{j+2}\theta_2 + \cdots + \theta_q \theta_{q-j})\sigma^2.$$

For all $j > q$, $\gamma_j = 0$.
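The lag-$j$ autocovariance formula is easy to verify numerically. The sketch below (illustrative MA(3) coefficients, plain numpy) computes $\gamma_j$ from the formula and compares it with sample autocovariances of a long simulated series; `ma_acov` is just a helper name introduced here.

```python
import numpy as np

rng = np.random.default_rng(2)
theta = np.array([1.0, 0.4, -0.3, 0.25])   # theta_0 = 1, theta_1..theta_3; q = 3
sigma2, T = 1.0, 500_000

def ma_acov(theta, sigma2, j):
    """gamma_j = sigma^2 * sum_k theta_{j+k} * theta_k for j <= q, else 0."""
    q = len(theta) - 1
    if j > q:
        return 0.0
    return sigma2 * float(np.sum(theta[j:] * theta[:len(theta) - j]))

# Simulate y_t = sum_k theta_k * eps_{t-k} via convolution and compare.
eps = rng.normal(scale=np.sqrt(sigma2), size=T + len(theta))
y = np.convolve(eps, theta, mode="valid")
ybar = y.mean()
for j in range(6):
    sample = np.mean((y[j:] - ybar) * (y[:len(y) - j] - ybar))
    print(f"gamma_{j}: formula {ma_acov(theta, sigma2, j):+.3f}   sample {sample:+.3f}")
```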
The Infinite-Order Moving Average Process
Consider the process obtained when $q \to \infty$:

$$Y_t = \mu + \sum_{j=0}^{\infty} \psi_j \varepsilon_{t-j} = \mu + \psi_0 \varepsilon_t + \psi_1 \varepsilon_{t-1} + \psi_2 \varepsilon_{t-2} + \cdots.$$

This could be described as an MA($\infty$) process.

The MA($\infty$) process is covariance stationary provided the coefficients are square summable: $\sum_{j=0}^{\infty} \psi_j^2 < \infty$.

It is often easier to work with a slightly stronger condition called absolute summability: $\sum_{j=0}^{\infty} |\psi_j| < \infty$.
Expectation
The mean of an MA($\infty$) process with absolutely summable coefficients is $E(Y_t) = \mu$.
Autocovariance
The autocovariance of an MA($\infty$) process at lag $j$ is

$$\gamma_j = \sigma^2 \sum_{k=0}^{\infty} \psi_{j+k}\psi_k = \sigma^2(\psi_j \psi_0 + \psi_{j+1}\psi_1 + \psi_{j+2}\psi_2 + \cdots).$$
Autoregressive Processes
The First-Order Autoregressive Process
A first-order autoregression, denoted AR(1), satisfies the stochastic difference equation

$$Y_t = c + \phi Y_{t-1} + \varepsilon_t.$$

When $|\phi| < 1$, this process is covariance stationary. It can be rewritten as the MA($\infty$) process

$$Y_t = \frac{c}{1 - \phi} + \varepsilon_t + \phi \varepsilon_{t-1} + \phi^2 \varepsilon_{t-2} + \cdots.$$

We can derive the expectation and autocovariances of $Y_t$ either from this representation or directly from the difference equation.

Expectation

Taking expectations on both sides and using stationarity ($E(Y_t) = E(Y_{t-1}) = \mu$) gives $\mu = c + \phi \mu$, so

$$E(Y_t) = \mu = \frac{c}{1 - \phi}.$$

Autocovariance

Using the MA($\infty$) representation,

$$\gamma_0 = \frac{\sigma^2}{1 - \phi^2}, \qquad \gamma_j = \phi^j \gamma_0 = \frac{\phi^j \sigma^2}{1 - \phi^2},$$

so the autocorrelations are $\rho_j = \phi^j$.
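A short simulation confirms these AR(1) moments (the parameter values below are illustrative and not part of the derivation).

```python
import numpy as np

rng = np.random.default_rng(3)
c, phi, sigma, T = 2.0, 0.7, 1.0, 300_000

# Simulate Y_t = c + phi * Y_{t-1} + eps_t, starting at the stationary mean
y = np.empty(T)
y[0] = c / (1 - phi)
eps = rng.normal(scale=sigma, size=T)
for t in range(1, T):
    y[t] = c + phi * y[t - 1] + eps[t]

ybar = y.mean()
gamma0 = y.var()
gamma1 = np.mean((y[1:] - ybar) * (y[:-1] - ybar))

print("mean  :", round(ybar, 3),   " theory:", round(c / (1 - phi), 3))
print("gamma0:", round(gamma0, 3), " theory:", round(sigma**2 / (1 - phi**2), 3))
print("gamma1:", round(gamma1, 3), " theory:", round(phi * sigma**2 / (1 - phi**2), 3))
```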
The $p$th-Order Autoregressive Process

A $p$th-order autoregression, denoted AR($p$), satisfies

$$Y_t = c + \phi_1 Y_{t-1} + \phi_2 Y_{t-2} + \cdots + \phi_p Y_{t-p} + \varepsilon_t.$$
The Autocorrelation Function
The autocorrelation function (ACF) and the partial autocorrelation function (PACF) are useful for identifying the type of process that generated a time series.

For the AR(1) model $y_t = a_0 + a_1 y_{t-1} + \varepsilon_t$, the ACF can be derived in two ways.

- Method 1: Yule-Walker equations. Multiply the model, written in deviations from the mean, by $y_{t-s}$ and take expectations to get $\gamma_s = a_1 \gamma_{s-1}$ for $s \geq 1$, so $\gamma_s = a_1^s \gamma_0$.
  So the autocorrelation function for the AR(1) process is $\rho_s = \gamma_s / \gamma_0 = a_1^s$.
- Method 2: solve the difference equation. If the process started at time zero, repeated substitution gives
  $$y_t = a_0 \sum_{i=0}^{t-1} a_1^i + a_1^t y_0 + \sum_{i=0}^{t-1} a_1^i \varepsilon_{t-i}.$$
  Take the expectation of $y_t$ and of $(y_t - E y_t)(y_{t-s} - E y_{t-s})$.
  If $|a_1| < 1$, these moments converge as $t \to \infty$: $E y_t \to a_0/(1 - a_1)$, $\gamma_0 \to \sigma^2/(1 - a_1^2)$, and $\gamma_s \to a_1^s \sigma^2/(1 - a_1^2)$, so again $\rho_s = a_1^s$ (verified numerically below).
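Both methods predict $\rho_s = a_1^s$. The sketch below simulates an AR(1) series and compares its sample ACF (computed with `statsmodels.tsa.stattools.acf`, assuming statsmodels is available) against $a_1^s$; the parameter values are illustrative.

```python
import numpy as np
from statsmodels.tsa.stattools import acf

rng = np.random.default_rng(4)
a0, a1, T = 1.0, 0.6, 100_000          # illustrative AR(1) parameters

y = np.empty(T)
y[0] = a0 / (1 - a1)                   # start at the unconditional mean
for t in range(1, T):
    y[t] = a0 + a1 * y[t - 1] + rng.normal()

sample_acf = acf(y, nlags=5)
theory_acf = a1 ** np.arange(6)        # rho_s = a1**s
for s in range(6):
    print(f"rho_{s}: sample {sample_acf[s]:+.3f}   theory {theory_acf[s]:+.3f}")
```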
The Autocorrelation Function of an AR(2) Process
We assume that the AR(2) process

$$y_t = a_1 y_{t-1} + a_2 y_{t-2} + \varepsilon_t$$

is stationary; the intercept is dropped to keep the algebra simple.

Using the Yule-Walker equations: multiply the second-order difference equation by $y_{t-s}$ for $s = 0, 1, 2, \ldots$ and take expectations.

By definition, the autocovariances of a stationary series are such that $E y_t y_{t-s} = E y_{t-k} y_{t-k-s} = \gamma_s$.

We also know that the coefficient on $\varepsilon_t$ in the moving average representation of $y_t$ is one, so $E \varepsilon_t y_t = \sigma^2$ and $E \varepsilon_t y_{t-s} = 0$ for $s > 0$.

Now we can get the Yule-Walker system

$$\gamma_0 = a_1 \gamma_1 + a_2 \gamma_2 + \sigma^2, \qquad \gamma_1 = a_1 \gamma_0 + a_2 \gamma_1, \qquad \gamma_s = a_1 \gamma_{s-1} + a_2 \gamma_{s-2} \;\; (s \geq 2).$$

We know $\rho_0 = 1$; dividing by $\gamma_0$ gives $\rho_1 = a_1/(1 - a_2)$, and all higher autocorrelations follow recursively from $\rho_s = a_1 \rho_{s-1} + a_2 \rho_{s-2}$.
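The recursion makes the AR(2) ACF easy to compute. Below is a small numpy sketch (coefficients chosen to satisfy the stationarity conditions; purely illustrative) that builds $\rho_s$ from the Yule-Walker recursion and checks it against the sample ACF of a simulated AR(2) series.

```python
import numpy as np

rng = np.random.default_rng(5)
a1, a2, nlags, T = 0.5, 0.3, 8, 300_000   # stationary AR(2), illustrative values

# Theoretical ACF from the Yule-Walker recursion
rho = np.empty(nlags + 1)
rho[0] = 1.0
rho[1] = a1 / (1 - a2)
for s in range(2, nlags + 1):
    rho[s] = a1 * rho[s - 1] + a2 * rho[s - 2]

# Sample ACF from a simulated series y_t = a1 y_{t-1} + a2 y_{t-2} + eps_t
y = np.zeros(T)
eps = rng.normal(size=T)
for t in range(2, T):
    y[t] = a1 * y[t - 1] + a2 * y[t - 2] + eps[t]
yd = y[1000:] - y[1000:].mean()           # drop burn-in, demean
sample = [float(np.dot(yd[s:], yd[:len(yd) - s]) / np.dot(yd, yd))
          for s in range(nlags + 1)]

for s in range(nlags + 1):
    print(f"rho_{s}: Yule-Walker {rho[s]:+.3f}   sample {sample[s]:+.3f}")
```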
The Autocorrelation Function of an MA(1) Process
Consider the MA(1) process $y_t = \varepsilon_t + \theta \varepsilon_{t-1}$.

Applying the Yule-Walker technique, multiply $y_t$ by $y_{t-s}$ and take expectations:

$$\gamma_0 = (1 + \theta^2)\sigma^2, \qquad \gamma_1 = \theta \sigma^2, \qquad \gamma_s = 0 \;\; \text{for } s > 1.$$

So the ACF of the MA(1) process is $\rho_1 = \theta/(1 + \theta^2)$ and $\rho_s = 0$ for $s > 1$.
The Autocorrelation Function of an ARMA(1,1) Process
Consider the ARMA(1,1) process $y_t = a_1 y_{t-1} + \varepsilon_t + \theta \varepsilon_{t-1}$. Multiplying by $y_{t-s}$ and taking expectations gives

$$\gamma_0 = a_1 \gamma_1 + \sigma^2 + \theta(a_1 + \theta)\sigma^2, \qquad \gamma_1 = a_1 \gamma_0 + \theta \sigma^2, \qquad \gamma_s = a_1 \gamma_{s-1} \;\; (s \geq 2).$$

Solve the first two equations to get

$$\gamma_0 = \frac{1 + 2 a_1 \theta + \theta^2}{1 - a_1^2}\,\sigma^2, \qquad \gamma_1 = \frac{(1 + a_1\theta)(a_1 + \theta)}{1 - a_1^2}\,\sigma^2.$$

And the ACF is

$$\rho_1 = \frac{(1 + a_1\theta)(a_1 + \theta)}{1 + 2 a_1 \theta + \theta^2}, \qquad \rho_s = a_1 \rho_{s-1} \;\; \text{for } s \geq 2.$$
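As a cross-check, the closed-form ACF above can be compared with the theoretical ACF returned by statsmodels' `ArmaProcess` (assuming statsmodels is available; note its lag-polynomial convention, where AR coefficients enter with a sign flip). The parameter values are illustrative.

```python
import numpy as np
from statsmodels.tsa.arima_process import ArmaProcess

a1, theta = 0.7, 0.4                     # illustrative ARMA(1,1) parameters

# Closed-form ACF derived above
rho = [1.0, (1 + a1 * theta) * (a1 + theta) / (1 + 2 * a1 * theta + theta**2)]
for s in range(2, 7):
    rho.append(a1 * rho[-1])             # rho_s = a1 * rho_{s-1} for s >= 2

# statsmodels convention: AR polynomial [1, -a1], MA polynomial [1, theta]
proc = ArmaProcess(ar=[1, -a1], ma=[1, theta])
print("closed form :", np.round(rho, 4))
print("ArmaProcess :", np.round(proc.acf(lags=7), 4))
```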
The Partial Autocorrelation Function
In the AR(1) process, $y_t$ and $y_{t-2}$ are correlated even though $y_{t-2}$ does not appear directly in the model: $\rho_2 = a_1^2$ arises entirely through the intervening value $y_{t-1}$. The partial autocorrelation at lag $s$ eliminates the effects of the intervening values $y_{t-1}$ through $y_{t-s+1}$.
Method to find the PACF:
- Form the demeaned series $y_t^* = y_t - \mu$, where $\mu$ is the mean of $\{y_t\}$.
- Form the first-order autoregression equation $y_t^* = \phi_{11} y_{t-1}^* + e_t$, where $e_t$ is an error term; the coefficient $\phi_{11}$ is the partial autocorrelation at lag 1 (it equals $\rho_1$, since there are no intervening values to control for).
- Form the second-order autoregression $y_t^* = \phi_{21} y_{t-1}^* + \phi_{22} y_{t-2}^* + e_t$; the coefficient $\phi_{22}$ is the partial autocorrelation at lag 2. In general, the PACF at lag $s$ is the coefficient $\phi_{ss}$ from an autoregression of order $s$ (a numerical check is sketched below).
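Here is a minimal sketch of that regression recipe (plain numpy least squares on a simulated AR(1); `pacf_ols` is a helper name introduced here, and the comparison with `statsmodels.tsa.stattools.pacf` assumes that function is available). For an AR(1), the PACF should be close to $a_1$ at lag 1 and near zero afterwards.

```python
import numpy as np
from statsmodels.tsa.stattools import pacf

rng = np.random.default_rng(6)
a1, T = 0.6, 50_000

# Simulate an AR(1) series and demean it
y = np.zeros(T)
for t in range(1, T):
    y[t] = a1 * y[t - 1] + rng.normal()
ystar = y - y.mean()

def pacf_ols(x, s):
    """phi_ss: the last coefficient of an order-s autoregression fitted by OLS."""
    X = np.column_stack([x[s - k:len(x) - k] for k in range(1, s + 1)])
    coef, *_ = np.linalg.lstsq(X, x[s:], rcond=None)
    return coef[-1]

sm_pacf = pacf(y, nlags=4)
for s in range(1, 5):
    print(f"phi_{s}{s}: OLS {pacf_ols(ystar, s):+.3f}   statsmodels {sm_pacf[s]:+.3f}")
```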
Sample Autocorrelations
Given that a series is stationary, we can use the sample mean, variance and autocorrelations to estimate the parameters of the actual data-generating process.
The estimates of the autocorrelations of a series $\{y_t\}$ of length $T$ are

$$r_s = \frac{\sum_{t=s+1}^{T} (y_t - \bar{y})(y_{t-s} - \bar{y})}{\sum_{t=1}^{T} (y_t - \bar{y})^2}.$$

Box and Pierce used the sample autocorrelations to form the statistic

$$Q = T \sum_{k=1}^{s} r_k^2.$$

If the data are generated from a stationary ARMA process, $Q$ is asymptotically $\chi^2$ distributed with $s$ degrees of freedom (with $s - p - q$ degrees of freedom when applied to the residuals of an estimated ARMA($p,q$) model).

Ljung and Box proposed a modified statistic with better small-sample properties,

$$Q = T(T + 2) \sum_{k=1}^{s} \frac{r_k^2}{T - k},$$

which has the same asymptotic distribution.
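Below is a minimal sketch of both statistics, computed directly from the formulas above for a simulated white noise series (scipy is used only for the $\chi^2$ p-values; statsmodels' `acorr_ljungbox` offers a packaged alternative). Since the series is white noise, neither test should reject.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
T, s = 500, 10

y = rng.normal(size=T)                  # white noise: no serial correlation
ybar = y.mean()
denom = np.sum((y - ybar) ** 2)
r = np.array([np.sum((y[k:] - ybar) * (y[:T - k] - ybar)) / denom
              for k in range(1, s + 1)])   # sample autocorrelations r_1..r_s

Q_bp = T * np.sum(r**2)                                           # Box-Pierce
Q_lb = T * (T + 2) * np.sum(r**2 / (T - np.arange(1, s + 1)))     # Ljung-Box

print("Box-Pierce Q:", round(Q_bp, 2), " p-value:", round(stats.chi2.sf(Q_bp, df=s), 3))
print("Ljung-Box  Q:", round(Q_lb, 2), " p-value:", round(stats.chi2.sf(Q_lb, df=s), 3))
```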