## Regression Models with Time Series Errors

### What Are Regression Models with Time Series Errors?

Regression models with time series errors attempt to explain the mean behavior of a response series (*y _{t}*, *t* = 1,...,*T*) by accounting for linear effects of predictors (*X*) using a multiple linear regression (MLR). However, the errors (*u _{t}*), called *unconditional disturbances*, are time series rather than white noise, which is a departure from the linear model assumptions. Unlike the ARIMA model that includes exogenous predictors, regression models with time series errors preserve the sensitivity interpretation of the regression coefficients (*β*) [2].

These models are particularly useful for econometric data. Use these models to:

- Analyze the effects of a new policy on a market indicator (an intervention model).
- Forecast population size, adjusting for predictor effects such as the expected prevalence of a disease.
- Study the behavior of a process, adjusting for calendar effects. For example, you can analyze traffic volume by adjusting for the effects of major holidays. For details, see [3].
- Estimate the trend by including time (*t*) in the model.
- Forecast total energy consumption, accounting for current and past prices of oil and electricity (a distributed lag model).

Use these tools in Econometrics Toolbox™ to:

- Specify a regression model with ARIMA errors (see `regARIMA`).
- Estimate parameters using a specified model, and response and predictor data (see `estimate`).
- Simulate responses using a model and predictor data (see `simulate`).
- Forecast responses using a model and future predictor data (see `forecast`).
- Infer residuals and estimated unconditional disturbances using a model and predictor data (see `infer`).
- Filter innovations through a model using the model and predictor data (see `filter`).
- Generate impulse responses (see `impulse`).
- Compare a regression model with ARIMA errors to an ARIMAX model (see `arima`).

### Conventions

A regression model with time series errors has the following form (in lag operator notation):

$$\begin{array}{c}{y}_{t}=c+{X}_{t}\beta +{u}_{t}\\ a\left(L\right)A\left(L\right){\left(1-L\right)}^{D}\left(1-{L}^{s}\right){u}_{t}=b\left(L\right)B\left(L\right){\epsilon}_{t},\end{array} \tag{1}$$

*t* = 1,...,*T*, where:

- *y _{t}* is the response series.
- *X _{t}* is row *t* of *X*, the matrix of concatenated predictor data vectors. That is, *X _{t}* is observation *t* of each predictor series.
- *c* is the regression model intercept.
- *β* is the regression coefficient.
- *u _{t}* is the disturbance series.
- *ε _{t}* is the innovations series.
- *L* is the lag operator: $${L}^{j}{y}_{t}={y}_{t-j}.$$
- $$a\left(L\right)=\left(1-{a}_{1}L-\mathrm{...}-{a}_{p}{L}^{p}\right)$$ is the degree *p*, nonseasonal autoregressive polynomial.
- $$A\left(L\right)=\left(1-{A}_{1}L-\mathrm{...}-{A}_{{p}_{s}}{L}^{{p}_{s}}\right)$$ is the degree *p _{s}*, seasonal autoregressive polynomial.
- $${\left(1-L\right)}^{D}$$ is the degree *D*, nonseasonal integration polynomial.
- $$\left(1-{L}^{s}\right)$$ is the degree *s*, seasonal integration polynomial.
- $$b\left(L\right)=\left(1+{b}_{1}L+\mathrm{...}+{b}_{q}{L}^{q}\right)$$ is the degree *q*, nonseasonal moving average polynomial.
- $$B\left(L\right)=\left(1+{B}_{1}L+\mathrm{...}+{B}_{{q}_{s}}{L}^{{q}_{s}}\right)$$ is the degree *q _{s}*, seasonal moving average polynomial.
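To make the notation concrete, the following minimal Python sketch (not part of the toolbox; all coefficient values are illustrative) simulates the simplest case of Equation 1: one predictor, AR(1) disturbances, and no moving average, integration, or seasonal terms.

```python
import random

random.seed(1)

# Simplest case of Equation 1: y_t = c + X_t*beta + u_t with
# (1 - a1*L) u_t = eps_t. All coefficient values are illustrative,
# not estimates from any data set.
c, beta, a1 = 2.0, 0.5, 0.6
T = 500

x = [random.gauss(0.0, 1.0) for _ in range(T)]    # predictor series X_t
eps = [random.gauss(0.0, 0.1) for _ in range(T)]  # innovations eps_t

# (1 - a1*L) u_t = eps_t  =>  u_t = a1*u_{t-1} + eps_t
u = [eps[0]]
for t in range(1, T):
    u.append(a1 * u[t - 1] + eps[t])

# y_t = c + X_t*beta + u_t
y = [c + beta * x[t] + u[t] for t in range(T)]
```

Because the disturbances *u _{t}* here are stationary with mean 0, the intercept and regression term carry the mean behavior of the simulated response.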

Following the Box and Jenkins methodology, *u _{t}* is a stationary or unit root nonstationary, regular, linear time series. However, if *u _{t}* is unit root nonstationary, then you do not have to explicitly difference the series as Box and Jenkins recommend in [1]. You can simply specify the seasonal and nonseasonal integration degrees using the software. For details, see Create Regression Models with ARIMA Errors.

Another deviation from the Box and Jenkins methodology is that *u _{t}* does not have a constant term (conditional mean), and therefore its unconditional mean is 0. However, the regression model contains an intercept term, *c*.
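Consequently, in the stationary case the regression component alone determines the unconditional mean of the response:

$$E\left({y}_{t}\right)=c+{X}_{t}\beta +E\left({u}_{t}\right)=c+{X}_{t}\beta .$$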

**Note**

If the unconditional disturbance process is nonstationary (i.e., the nonseasonal or seasonal integration degree is greater than 0), then the regression intercept, *c*, is not identifiable. For details, see Intercept Identifiability in Regression Models with ARIMA Errors.
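One way to see the intuition: when *D* = 1 (and there is no seasonal integration), differencing both sides of the regression equation annihilates the constant, since (1 − *L*)*c* = 0:

$$\left(1-L\right){y}_{t}=\left(1-L\right){X}_{t}\beta +\left(1-L\right){u}_{t},$$

so the differenced, stationary representation carries no information about *c*.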

The software enforces stability and invertibility of the ARMA process. That is,

$$\psi (L)=\frac{b(L)B(L)}{a(L)A(L)}=1+{\psi}_{1}L+{\psi}_{2}{L}^{2}+\mathrm{...},$$

where the series {*ψ _{j}*} must be absolutely summable. The conditions for {*ψ _{j}*} to be absolutely summable are:

- *a*(*L*) and *A*(*L*) are *stable* (i.e., the eigenvalues of *a*(*L*) = 0 and *A*(*L*) = 0 lie inside the unit circle).
- *b*(*L*) and *B*(*L*) are *invertible* (i.e., the eigenvalues of *b*(*L*) = 0 and *B*(*L*) = 0 lie inside the unit circle).

The software uses maximum likelihood for parameter estimation. You can choose either a Gaussian or Student’s *t* distribution for the innovations, *ε _{t}*.

The software treats predictors as nonstochastic variables for estimation and inference.

## References

[1] Box, G. E. P., G. M. Jenkins, and G. C. Reinsel. *Time Series Analysis: Forecasting and Control*. 3rd ed. Englewood Cliffs, NJ: Prentice Hall, 1994.

[2] Hyndman, R. J. (2010, October). “The ARIMAX Model Muddle.” *Rob J. Hyndman*. Retrieved May 4, 2017 from `https://robjhyndman.com/hyndsight/arimax/`.

[3] Tsay, R. S. “Regression Models with Time Series Errors.” *Journal of the American Statistical Association*. Vol. 79, Number 385, March 1984, pp. 118–124.

## See Also

`regARIMA` | `arima` | `estimate` | `filter` | `forecast` | `impulse` | `infer` | `simulate`

## Related Examples

- Alternative ARIMA Model Representations
- Intercept Identifiability in Regression Models with ARIMA Errors