Common Approaches to Univariate Time Series

There are a number of approaches to modeling time series. We outline a few of the most common approaches below.

Trend, Seasonal, Residual Decompositions
One approach is to decompose the time series into trend, seasonal, and residual components.

Triple exponential smoothing is an example of this approach. Another example, called seasonal loess, is based on locally weighted least squares and is discussed by Cleveland (1993). We do not discuss seasonal loess in this handbook.
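
A minimal sketch of such a decomposition in Python is shown below, using the seasonal_decompose function from the statsmodels library. The simulated monthly series and the choice of period = 12 are assumptions made purely for illustration.

    # Sketch: additive trend / seasonal / residual decomposition of a simulated
    # monthly series (assumed period of 12) with statsmodels' seasonal_decompose.
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.seasonal import seasonal_decompose

    rng = np.random.default_rng(0)
    t = np.arange(120)                      # 10 years of monthly data (assumed)
    y = 0.05*t + 2.0*np.sin(2*np.pi*t/12) + rng.normal(0, 0.5, t.size)
    series = pd.Series(y, index=pd.date_range("2000-01-01", periods=t.size, freq="MS"))

    result = seasonal_decompose(series, model="additive", period=12)
    print(result.trend.dropna().head())     # estimated trend component
    print(result.seasonal.head(12))         # one full cycle of the seasonal component
    print(result.resid.dropna().head())     # residual component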

Frequency Based Methods
Another approach, commonly used in scientific and engineering applications, is to analyze the series in the frequency domain. An example of this approach in modeling a sinusoidal-type data set is shown in the beam deflection case study. The spectral plot is the primary tool for the frequency analysis of time series.

Detailed discussions of frequency-based methods are included in Bloomfield (1976), Jenkins and Watts (1968), and Chatfield (1996).
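
A minimal sketch of a frequency-domain analysis in Python is shown below: a periodogram (one common form of spectral plot) of a noisy sinusoid, computed with scipy.signal.periodogram. The sampling rate and signal frequency are assumptions made for illustration.

    # Sketch: periodogram of a noisy 5 Hz sinusoid sampled at 100 Hz (both assumed).
    import numpy as np
    from scipy.signal import periodogram

    rng = np.random.default_rng(1)
    fs = 100.0                              # sampling rate in Hz (assumed)
    t = np.arange(0, 10, 1/fs)
    x = np.sin(2*np.pi*5.0*t) + rng.normal(0, 0.5, t.size)

    freqs, power = periodogram(x, fs=fs)
    print(freqs[np.argmax(power)])          # peak of the spectrum, near 5 Hz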

Autoregressive (AR) Models

A common approach for modeling univariate time series is the autoregressive (AR) model:

X_t = \delta + \phi_1 X_{t-1} + \phi_2 X_{t-2} + \cdots + \phi_p X_{t-p} + A_t

where X_t is the time series, A_t is white noise, and

\delta = \left( 1 - \sum_{i=1}^{p} \phi_i \right) \mu ,
with μ denoting the process mean.

An autoregressive model is simply a linear regression of the current value of the series against one or more prior values of the series. The value of p is called the order of the AR model.

AR models can be fit with a variety of methods, including standard linear least squares techniques. They also have a straightforward interpretation.
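
To illustrate the least squares point, the sketch below fits an AR(2) model by ordinary least squares, regressing X_t on X_{t-1}, X_{t-2}, and an intercept (δ). The simulated coefficients are assumptions chosen purely for illustration.

    # Sketch: fit an AR(2) model by ordinary least squares.
    import numpy as np

    rng = np.random.default_rng(2)
    n, phi1, phi2, delta = 500, 0.6, 0.2, 1.0   # true values (assumed for the simulation)
    x = np.zeros(n)
    for t in range(2, n):
        x[t] = delta + phi1*x[t-1] + phi2*x[t-2] + rng.normal()

    # Design matrix [1, X_{t-1}, X_{t-2}] for t = 2, ..., n-1
    X = np.column_stack([np.ones(n - 2), x[1:-1], x[:-2]])
    y = x[2:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    print(coef)                                 # estimates of (delta, phi_1, phi_2)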

Moving Average (MA) Models

Another common approach for modeling univariate time series is the moving average (MA) model:

X_t = \mu + A_t - \theta_1 A_{t-1} - \theta_2 A_{t-2} - \cdots - \theta_q A_{t-q}

where X_t is the time series, μ is the mean of the series, A_{t-i} are white noise terms, and θ_1, …, θ_q are the parameters of the model. The value of q is called the order of the MA model.

That is, a moving average model is conceptually a linear regression of the current value of the series against the white noise or random shocks of one or more prior values of the series. The random shocks at each point are assumed to come from the same distribution, typically a normal distribution, with location at zero and constant scale. The distinction in this model is that these random shocks are propagated to future values of the time series. Fitting the MA estimates is more complicated than with AR models because the error terms are not observable. This means that iterative non-linear fitting procedures need to be used in place of linear least squares. MA models also have a less obvious interpretation than AR models.

Sometimes the ACF and PACF will suggest that an MA model would be a better model choice, and sometimes both AR and MA terms should be used in the same model (see Section 6.4.4.5).

Note, however, that the error terms after the model is fit should be independent and follow the standard assumptions for a univariate process.
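
As an illustration of the iterative fitting point above, the sketch below estimates an MA(1) model with the ARIMA class from statsmodels, using order (0, 0, 1). The simulated series is an assumption for illustration; note also that statsmodels writes the MA terms with plus signs, so the reported coefficient has the opposite sign to the convention used above.

    # Sketch: fit an MA(1) model by maximum likelihood with statsmodels' ARIMA class.
    import numpy as np
    from statsmodels.tsa.arima.model import ARIMA

    rng = np.random.default_rng(3)
    n, mu, theta1 = 500, 10.0, 0.7                  # true values (assumed for the simulation)
    a = rng.normal(0, 1, n + 1)                     # unobserved white-noise shocks
    x = mu + a[1:] - theta1*a[:-1]                  # X_t = mu + A_t - theta_1 A_{t-1}

    fit = ARIMA(x, order=(0, 0, 1)).fit()
    # Estimated constant (near mu), MA coefficient (near -theta1 in statsmodels'
    # sign convention), and innovation variance.
    print(fit.params)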

Box-Jenkins Approach

Box and Jenkins popularized an approach that combines the moving average and the autoregressive approaches in the book “Time Series Analysis: Forecasting and Control” (Box, Jenkins, and Reinsel, 1994).

Although both autoregressive and moving average approaches were already known (and were originally investigated by Yule), the contribution of Box and Jenkins was in developing a systematic methodology for identifying and estimating models that could incorporate both approaches. This makes Box-Jenkins models a powerful class of models. The next several sections will discuss these models in detail.
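
As a preview of those sections, the sketch below fits a simple combined model with one AR and one MA term, an ARMA(1,1), using the ARIMA class from statsmodels with order (1, 0, 1). The simulated coefficients are assumptions chosen purely for illustration.

    # Sketch: simulate and fit an ARMA(1,1) model, combining AR and MA terms.
    import numpy as np
    from statsmodels.tsa.arima.model import ARIMA
    from statsmodels.tsa.arima_process import ArmaProcess

    # Lag-polynomial form: AR polynomial [1, -phi_1], MA polynomial [1, theta_1]
    # (theta_1 in statsmodels' plus-sign convention).
    ar = np.array([1.0, -0.6])
    ma = np.array([1.0, 0.4])
    x = ArmaProcess(ar, ma).generate_sample(nsample=500)

    fit = ARIMA(x, order=(1, 0, 1)).fit()
    print(fit.params)                       # estimated constant, AR and MA terms, variance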