In time series analysis, the moving-average model (MA model), also known as the moving-average process, is a common approach for modeling univariate time series.[1][2] The moving-average model specifies that the output variable is a linear combination of the current and past values of a stochastic (imperfectly predictable) white-noise error term.
Together with the autoregressive (AR) model, the moving-average model is a special case and key component of the more general ARMA and ARIMA models of time series,[3] which have a more complicated stochastic structure. Unlike the AR model, the finite MA model is always stationary.
The moving-average model should not be confused with the moving average, a distinct concept despite some similarities.[1]
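As a concrete illustration of the process described above, the sketch below simulates an MA(1) process, X_t = μ + ε_t + θ·ε_{t−1}, where the ε_t are independent standard-normal innovations. The function name, parameter choices, and seed are illustrative assumptions, not part of the original text. A defining property of the MA(1) process is that its theoretical autocorrelation is θ / (1 + θ²) at lag 1 and zero at every higher lag, which the sample estimates should approximate for a long simulation:

```python
import numpy as np

def simulate_ma1(theta, n, mu=0.0, seed=0):
    """Simulate n steps of an MA(1) process: X_t = mu + eps_t + theta * eps_{t-1}.

    (Hypothetical helper for illustration; not from the original article.)
    """
    rng = np.random.default_rng(seed)
    eps = rng.standard_normal(n + 1)  # white-noise innovations, one extra for the lag
    return mu + eps[1:] + theta * eps[:-1]

theta = 0.6
x = simulate_ma1(theta=theta, n=100_000)

# Sample lag-1 autocorrelation: theoretically theta / (1 + theta**2) ≈ 0.441
r1 = np.corrcoef(x[:-1], x[1:])[0, 1]
# Sample lag-2 autocorrelation: theoretically 0 (MA(1) "forgets" after one lag)
r2 = np.corrcoef(x[:-2], x[2:])[0, 1]
print(r1, r2)
```

Because the process is a finite linear combination of white noise, its mean and autocovariances do not depend on t, which is why a finite MA model is always stationary regardless of the coefficient values.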