Time Series Regression

What Is Time Series Regression?

Time series regression is a statistical method for predicting a future response based on the response history (known as autoregressive dynamics) and the transfer of dynamics from relevant predictors. Time series regression can help you understand and predict the behavior of dynamic systems from experimental or observational data. Common uses of time series regression include modeling and forecasting of economic, financial, biological, and engineering systems.

You can start a time series analysis by building a design matrix \(X_t\), also called a feature or regressor matrix, which can include current and past observations of predictors ordered by time \(t\). Then, apply ordinary least squares (OLS) to the multiple linear regression (MLR) model

\[y_t = X_t\beta + e_t\]

to get an estimate of the linear relationship between the response \(y_t\) and the design matrix. Here \(\beta\) is the vector of linear parameters to be estimated and \(e_t\) is the innovation term. This form can be generalized to a multivariate response (vector \(y_t\)), to exogenous inputs such as control signals, and to correlated residuals. For more difficult cases, the linear relationship can be replaced by a nonlinear one, \(y_t = f(X_t, e_t)\), where \(f(\cdot)\) is a nonlinear function such as a neural network.
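As a minimal sketch of this workflow, the snippet below builds a lagged design matrix and fits the MLR model by OLS. Python with NumPy and statsmodels is used purely for illustration (the toolboxes cited below are MATLAB products), and the simulated data, single predictor, and lag order are assumptions for the demo:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Simulated illustration: AR(1) response dynamics plus one exogenous predictor.
n = 200
x = rng.normal(size=n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.6 * y[t - 1] + 0.8 * x[t] + rng.normal(scale=0.5)

# Design matrix X_t: a constant, the lagged response y_{t-1} (autoregressive
# dynamics), and the current predictor x_t (transfer from a predictor).
X = sm.add_constant(np.column_stack([y[:-1], x[1:]]))

# OLS estimate of beta in y_t = X_t * beta + e_t.
fit = sm.OLS(y[1:], X).fit()
print(fit.params)  # should land roughly near the true values 0.6 and 0.8
```

The same lag-construction pattern extends to multiple predictors and deeper lag orders; only the columns of the design matrix change.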

Typically, time series modeling involves choosing a model structure (such as an ARMA form or a transfer function) and incorporating known attributes of the system, such as nonstationarity. Common examples include:

  • Autoregressive integrated moving average with exogenous predictors (ARIMAX), illustrated in the sketch after this list
  • Distributed lag models (transfer functions)
  • State space models
  • Spectral models
  • Nonlinear ARX models

The choice of model depends on your goals for the analysis and the properties of the data.  See Econometrics Toolbox™ and System Identification Toolbox™ for more details.
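For example, an ARIMAX structure can be estimated and used for forecasting as in this sketch, again in Python with statsmodels; the (1, 1, 1) order, the simulated data, and the placeholder future regressor values are illustrative choices, not a modeling recommendation:

```python
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(1)

# Simulated illustration: an integrated (nonstationary) response driven by
# one exogenous predictor.
n = 300
exog = rng.normal(size=n)
y = np.cumsum(0.5 * exog + rng.normal(scale=0.5, size=n))

# ARIMAX(1, 1, 1): one AR lag, first differencing, one MA lag, plus the
# exogenous regressor.
fit = SARIMAX(y, exog=exog, order=(1, 1, 1)).fit(disp=False)

# Forecasting requires future values of the exogenous predictor; zeros are
# used here only as a placeholder assumption.
print(fit.forecast(steps=10, exog=np.zeros((10, 1))))
```

Note that the differencing step handles the nonstationarity directly in the model structure, which is why inspecting the data's properties before choosing a structure matters.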

See also: cointegration, GARCH models, DSGE, equity trading, predictive modeling