# Stationary process

In mathematics and statistics, a **stationary process** (a.k.a. a **strict/strictly stationary process** or **strong/strongly stationary process**) is a stochastic process whose unconditional joint probability distribution does not change when shifted in time.^{[1]} Consequently, parameters such as mean and variance also do not change over time.

Since stationarity is an assumption underlying many statistical procedures used in time series analysis, non-stationary data is often transformed to become stationary. The most common cause of violation of stationarity is a trend in the mean, which can be due either to the presence of a unit root or to a deterministic trend. In the former case of a unit root, stochastic shocks have permanent effects and the process is not mean-reverting. In the latter case of a deterministic trend, the process is called a trend stationary process, and stochastic shocks have only transitory effects, after which the variable tends toward a deterministically evolving (non-constant) mean.

A trend stationary process is not strictly stationary, but can easily be transformed into a stationary process by removing the underlying trend, which is solely a function of time. Similarly, processes with one or more unit roots can be made stationary through differencing. An important type of non-stationary process that does not include a trend-like behavior is a cyclostationary process, which is a stochastic process that varies cyclically with time.

## Definition

Formally, let $\left\{X_t\right\}$ be a stochastic process and let $F_X(x_{t_1+\tau}, \ldots, x_{t_n+\tau})$ represent the cumulative distribution function of the unconditional (i.e., with no reference to any particular starting value) joint distribution of $\left\{X_t\right\}$ at times $t_1+\tau, \ldots, t_n+\tau$. Then, $\left\{X_t\right\}$ is said to be strictly (or strongly) stationary if, for all $n \in \mathbb{N}$, for all $\tau \in \mathbb{R}$, and for all $t_1, \ldots, t_n \in \mathbb{R}$,

$$F_X(x_{t_1+\tau}, \ldots, x_{t_n+\tau}) = F_X(x_{t_1}, \ldots, x_{t_n}).$$

Since $\tau$ does not affect $F_X(\cdot)$, $F_X$ is not a function of time.

## Examples

White noise is the simplest example of a stationary process.

An example of a discrete-time stationary process where the sample space is also discrete (so that the random variable may take one of *N* possible values) is a Bernoulli scheme. Other examples of a discrete-time stationary process with continuous sample space include some autoregressive and moving average processes which are both subsets of the autoregressive moving average model. Models with a non-trivial autoregressive component may be either stationary or non-stationary, depending on the parameter values, and important non-stationary special cases are where unit roots exist in the model.
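As an illustrative sketch of the last point (the function and parameter names below are ours, not from the text), the following simulates an AR(1) process $X_t = \varphi X_{t-1} + \varepsilon_t$: with $|\varphi| < 1$ the process is asymptotically stationary with variance $1/(1-\varphi^2)$, while $\varphi = 1$ gives a unit root (a random walk) whose variance grows without bound.

```python
import random

def simulate_ar1(phi, n, seed=0):
    """Simulate X_t = phi * X_{t-1} + e_t with standard normal shocks e_t.

    The process is (asymptotically) stationary iff |phi| < 1; phi = 1
    gives a unit root (a random walk), which is non-stationary.
    """
    rng = random.Random(seed)
    x, path = 0.0, []
    for _ in range(n):
        x = phi * x + rng.gauss(0, 1)
        path.append(x)
    return path

# For |phi| < 1 the stationary variance is 1 / (1 - phi**2) (about 1.33
# for phi = 0.5); for phi = 1 the variance at time t grows like t.
stationary = simulate_ar1(0.5, 10_000)
unit_root = simulate_ar1(1.0, 10_000)
```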

Let $Y$ be any scalar random variable, and define a time-series $\left\{X_t\right\}$ by

$$X_t = Y \qquad \text{for all } t.$$

Then $\left\{X_t\right\}$ is a stationary time series, for which realisations consist of a series of constant values, with a different constant value for each realisation. A law of large numbers does not apply in this case, as the limiting value of an average from a single realisation takes the random value determined by $Y$, rather than taking the expected value of $Y$.

As a further example of a stationary process for which any single realisation has an apparently noise-free structure, let $Y$ have a uniform distribution on $(0, 2\pi]$ and define the time series $\left\{X_t\right\}$ by

$$X_t = \cos(t + Y) \qquad \text{for } t \in \mathbb{R}.$$

Then $\left\{X_t\right\}$ is strictly stationary, since $(t + Y)$ modulo $2\pi$ is uniformly distributed on $(0, 2\pi]$ for every $t$.
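As a rough numerical check of this example (names are illustrative): every realisation is a noise-free cosine, yet the ensemble distribution of $X_t$ is the same at every $t$; in particular the ensemble mean is 0 for all $t$.

```python
import math
import random

def cosine_process(ts, seed):
    """One realisation of X_t = cos(t + Y), with Y ~ Uniform(0, 2*pi]."""
    y = random.Random(seed).uniform(0.0, 2 * math.pi)
    return [math.cos(t + y) for t in ts]

# Each realisation is a deterministic cosine, but averaging over many
# realisations at fixed t gives (approximately) the ensemble mean 0.
ts = [0.0, 1.0, 2.0]
samples_at = {t: [] for t in ts}
for seed in range(5000):
    for t, x in zip(ts, cosine_process(ts, seed)):
        samples_at[t].append(x)
means = {t: sum(v) / len(v) for t, v in samples_at.items()}
```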

## Weaker forms of stationarity

### Weak or wide-sense stationarity

A weaker form of stationarity commonly employed in signal processing is known as **weak-sense stationarity**, **wide-sense stationarity** (WSS), **covariance stationarity**, or **second-order stationarity**. WSS random processes only require that the first moment (i.e. the mean) and the autocovariance do not vary with respect to time. Any strictly stationary process which has a finite mean and covariance is also WSS.

So, a continuous-time random process $x(t)$ which is WSS has the following restrictions on its mean function $m_x(t) \triangleq \operatorname{E}[x(t)]$

$$m_x(t) = m_x(t + \tau) \qquad \text{for all } \tau \in \mathbb{R}$$

and autocovariance function $K_{xx}(t_1, t_2) \triangleq \operatorname{E}\!\left[(x(t_1) - m_x(t_1))(x(t_2) - m_x(t_2))\right]$

$$K_{xx}(t_1, t_2) = K_{xx}(t_1 - t_2, 0) \qquad \text{for all } t_1, t_2 \in \mathbb{R}.$$

The first property implies that the mean function $m_x(t)$ must be constant. The second property implies that the autocovariance function depends only on the *difference* between $t_1$ and $t_2$ and only needs to be indexed by one variable rather than two. Thus, instead of writing

$$K_{xx}(t_1 - t_2, 0),$$

the notation is often abbreviated by the substitution $\tau = t_1 - t_2$ and written as

$$K_{xx}(\tau) \triangleq K_{xx}(t_1 - t_2, 0).$$

This also implies that the autocorrelation depends only on $\tau = t_1 - t_2$, that is

$$R_{xx}(t_1, t_2) = R_{xx}(t_1 - t_2, 0) \triangleq R_{xx}(\tau).$$
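As an illustrative sketch (the estimator name is ours), the sample autocovariance of a single WSS series can be indexed by the lag alone. For white noise the autocovariance is the variance at lag 0 and approximately 0 at every nonzero lag.

```python
import random

def sample_autocovariance(x, lag):
    """Biased sample autocovariance K(lag) of a single series x."""
    n = len(x)
    m = sum(x) / n
    return sum((x[t] - m) * (x[t + lag] - m) for t in range(n - lag)) / n

# White noise with unit variance is WSS: K(0) is close to 1 and
# K(lag) is close to 0 for any nonzero lag.
rng = random.Random(1)
x = [rng.gauss(0, 1) for _ in range(20_000)]
k0 = sample_autocovariance(x, 0)   # close to 1
k5 = sample_autocovariance(x, 5)   # close to 0
```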

The main advantage of wide-sense stationarity is that it places the time series in the context of Hilbert spaces. Let $H$ be the Hilbert space generated by $\{x(t)\}$ (that is, the closure of the set of all linear combinations of these random variables in the Hilbert space of all square-integrable random variables on the given probability space). By the positive definiteness of the autocovariance function, it follows from Bochner's theorem that there exists a positive measure $\mu$ on the real line such that $H$ is isomorphic to the Hilbert subspace of $L^2(\mu)$ generated by $\{e^{-2\pi i \xi \cdot t}\}$. This then gives the following Fourier-type decomposition for a continuous-time stationary stochastic process: there exists a stochastic process $\omega_\xi$ with orthogonal increments such that, for all $t$,

$$x(t) = \int e^{-2\pi i \xi \cdot t} \, d\omega_\xi,$$

where the integral on the right-hand side is interpreted in a suitable (Riemann) sense. The same result holds for a discrete-time stationary process, with the spectral measure now defined on the unit circle.

When processing WSS random signals with linear, time-invariant (LTI) filters, it is helpful to think of the correlation function as a linear operator. Since it is a circulant operator (depends only on the difference between the two arguments), its eigenfunctions are the Fourier complex exponentials. Additionally, since the eigenfunctions of LTI operators are also complex exponentials, LTI processing of WSS random signals is highly tractable—all computations can be performed in the frequency domain. Thus, the WSS assumption is widely employed in signal processing algorithms.
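The eigenfunction property mentioned above can be sketched directly (the helper below is our own, not a standard API): circularly convolving a complex exponential with any filter returns the same exponential scaled by the filter's frequency response.

```python
import cmath

def circular_convolve(h, x):
    """Circular convolution y[n] = sum_k h[k] * x[(n - k) mod N]."""
    n = len(x)
    return [sum(h[k] * x[(i - k) % n] for k in range(len(h)))
            for i in range(n)]

# A complex exponential e^{2*pi*i*f*n/N} is an eigenfunction of any
# circulant (LTI) operator: the output is the same exponential scaled
# by the filter's frequency response H(f).
N, f = 16, 3
x = [cmath.exp(2j * cmath.pi * f * n / N) for n in range(N)]
h = [0.5, 0.25, 0.25] + [0.0] * (N - 3)   # a simple averaging filter
H_f = sum(h[k] * cmath.exp(-2j * cmath.pi * f * k / N) for k in range(N))
y = circular_convolve(h, x)
# y[n] equals H_f * x[n] for every n, up to floating-point error
```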

### Other terminology

The terminology used for types of stationarity other than strict stationarity can be rather mixed. Some examples follow.

- Priestley uses **stationary up to order *m*** if conditions similar to those given here for wide-sense stationarity apply relating to moments up to order *m*.^{[2]}^{[3]} Thus wide-sense stationarity would be equivalent to "stationary to order 2", which is different from the definition of second-order stationarity given here.

- Honarkhah and Caers also use the assumption of stationarity in the context of multiple-point geostatistics, where higher n-point statistics are assumed to be stationary in the spatial domain.^{[4]}

- Tahmasebi and Sahimi have presented an adaptive Shannon-based methodology that can be used for modeling of any non-stationary systems.^{[5]}

## Differencing

One way to make some time series stationary is to compute the differences between consecutive observations. This is known as differencing.

Transformations such as logarithms can help to stabilize the variance of a time series. Differencing can help stabilize the mean of a time series by removing changes in the level of a time series, and so eliminating trend and seasonality.
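As a minimal sketch (function names are ours): if $x_t = a + bt + \varepsilon_t$, then the first difference $x_t - x_{t-1} = b + (\varepsilon_t - \varepsilon_{t-1})$ has a mean that no longer depends on $t$, so differencing removes a linear deterministic trend.

```python
def difference(x, lag=1):
    """Return the lag-differenced series x_t - x_{t-lag}."""
    return [x[t] - x[t - lag] for t in range(lag, len(x))]

# A pure linear trend x_t = 2 + 0.5*t (no noise, for clarity):
# after first differencing every value equals the slope 0.5,
# i.e. the trend in the mean has been removed.
trend = [2.0 + 0.5 * t for t in range(100)]
diffed = difference(trend)
```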

One way to identify a non-stationary time series is the ACF plot. For a stationary time series, the ACF will drop to zero relatively quickly, while the ACF of non-stationary data decreases slowly.^{[6]}
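This contrast can be sketched numerically (the estimator below is a plain sample autocorrelation, with illustrative names): white noise has near-zero ACF at any nonzero lag, while a random walk's ACF stays close to 1 even at moderate lags.

```python
import random

def acf(x, lag):
    """Sample autocorrelation of the series x at the given lag."""
    n = len(x)
    m = sum(x) / n
    var = sum((v - m) ** 2 for v in x) / n
    cov = sum((x[t] - m) * (x[t + lag] - m) for t in range(n - lag)) / n
    return cov / var

rng = random.Random(0)
noise = [rng.gauss(0, 1) for _ in range(5000)]   # stationary white noise
walk, s = [], 0.0
for e in noise:                                   # non-stationary random walk
    s += e
    walk.append(s)

acf_noise = acf(noise, 10)   # near 0: the ACF drops off quickly
acf_walk = acf(walk, 10)     # near 1: the ACF decays very slowly
```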

## See also

- Lévy process
- Stationary ergodic process
- Wiener–Khinchin theorem
- Ergodicity
- Statistical regularity
- Autocorrelation
- Whittle likelihood

## References

- ↑ Gagniuc, Paul A. (2017). *Markov Chains: From Theory to Implementation and Experimentation*. USA, NJ: John Wiley & Sons. pp. 1–256. ISBN 978-1-119-38755-8.
- ↑ Priestley, M. B. (1981). *Spectral Analysis and Time Series*. Academic Press. ISBN 0-12-564922-3.
- ↑ Priestley, M. B. (1988). *Non-linear and Non-stationary Time Series Analysis*. Academic Press. ISBN 0-12-564911-8.
- ↑ Honarkhah, M.; Caers, J. (2010). "Stochastic Simulation of Patterns Using Distance-Based Pattern Modeling". *Mathematical Geosciences*. **42** (5): 487–517. doi:10.1007/s11004-010-9276-7.
- ↑ Tahmasebi, P.; Sahimi, M. (2015). "Reconstruction of nonstationary disordered materials and media: Watershed transform and cross-correlation function" (PDF). *Physical Review E*. **91** (3). doi:10.1103/PhysRevE.91.032401.
- ↑ "8.1 Stationarity and differencing | OTexts". *www.otexts.org*. Retrieved 2016-05-18.

## Further reading

- Enders, Walter (2010). *Applied Econometric Time Series* (Third ed.). New York: Wiley. pp. 53–57. ISBN 978-0-470-50539-7.
- Jestrovic, I.; Coyle, J. L.; Sejdic, E. (2015). "The effects of increased fluid viscosity on stationary characteristics of EEG signal in healthy adults". *Brain Research*. **1589**: 45–53. doi:10.1016/j.brainres.2014.09.035. PMC 4253861. PMID 25245522.
- Hyndman, R. J.; Athanasopoulos, G. (2013). *Forecasting: Principles and Practice*. OTexts. https://www.otexts.org/fpp/8/1