Proof of Stationarity Conditions for a First-Order Autoregressive (AR(1)) Model


The Formal Theorem

Consider the AR(1) process defined by $X_t = c + \phi X_{t-1} + \epsilon_t$, where $\epsilon_t \sim WN(0, \sigma^2)$. The process is covariance-stationary if and only if the autoregressive parameter satisfies $|\phi| < 1$. Under this condition, the mean is $E[X_t] = \frac{c}{1-\phi}$ and the autocovariance function is given by:

$$\gamma(k) = \operatorname{Cov}(X_t, X_{t-k}) = \frac{\sigma^2 \phi^{|k|}}{1 - \phi^2}$$
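The stated moment formulas follow from back-substituting the recursion; a sketch of the derivation under the stated white-noise assumption:

```latex
% Iterate X_t = c + \phi X_{t-1} + \epsilon_t backwards t times:
X_t = c\sum_{j=0}^{t-1}\phi^{j} + \phi^{t}X_0 + \sum_{j=0}^{t-1}\phi^{j}\epsilon_{t-j}.
% For |\phi| < 1, letting t \to \infty removes the \phi^t X_0 term and
% yields the MA(\infty) representation
X_t = \frac{c}{1-\phi} + \sum_{j=0}^{\infty}\phi^{j}\epsilon_{t-j},
% so E[X_t] = c/(1-\phi). Since the shocks are uncorrelated with
% variance \sigma^2, only matching shock terms survive in the covariance:
\gamma(k) = \operatorname{Cov}(X_t, X_{t-k})
          = \sigma^2 \sum_{j=0}^{\infty} \phi^{\,j+|k|}\,\phi^{\,j}
          = \frac{\sigma^2\,\phi^{|k|}}{1-\phi^{2}},
% which depends only on the lag k, establishing covariance-stationarity.
```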

Analytical Intuition.

Imagine the AR(1) process as a ball rolling on a surface with friction. If the feedback coefficient $\phi$ equals exactly $1$, the process becomes a random walk, drifting arbitrarily far from the origin with variance growing boundlessly over time; for $|\phi| > 1$ the process is outright explosive. In both regimes, memory never fades. However, when $|\phi| < 1$, the system possesses a 'forgetting' mechanism. Each step incorporates the past but dampens it by a factor of $\phi$. Like a damped oscillator returning to equilibrium, the influence of the initial state $X_0$ decays geometrically as $\phi^t$. Because the past is continuously eroded by this contraction, the process sheds its dependence on time, converging to a stable, time-invariant probability distribution. Stationarity, therefore, is the mathematical manifestation of equilibrium: the point where the stochastic energy injected by $\epsilon_t$ is exactly balanced by the dissipation induced by $\phi$.
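The 'forgetting' mechanism can be seen directly by driving two copies of the recursion with the same shocks from different starting points: the gap between the paths shrinks exactly as $\phi^t$. A minimal sketch (the values $\phi = 0.8$, $c = 0$, unit shocks are illustrative assumptions, not from the text):

```python
import numpy as np

phi, c = 0.8, 0.0                     # illustrative parameters
rng = np.random.default_rng(1)
eps = rng.normal(0.0, 1.0, 50)        # one shared shock sequence

def simulate(x0):
    """Run X_t = c + phi*X_{t-1} + eps_t from a given starting value."""
    x = [x0]
    for e in eps[1:]:
        x.append(c + phi * x[-1] + e)
    return np.array(x)

# Two paths driven by the *same* shocks, started 20 units apart.
a = simulate(10.0)
b = simulate(-10.0)
gap = a - b   # shocks cancel, so gap_t = 20 * phi**t exactly

print(gap[0])    # 20.0
print(gap[10])   # 20 * 0.8**10 ≈ 2.147
```

Because the shocks are identical on both paths, the difference is purely deterministic, which isolates the geometric decay of the initial condition.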
CAUTION

Students frequently conflate stationarity with the existence of a finite mean. Even when $|\phi| = 1$, a mean can technically exist if the process is anchored at a specific starting point, but the variance grows without bound as $t \to \infty$, violating the fundamental requirement of time-invariant variance.
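For $|\phi| < 1$, the stationary mean $c/(1-\phi)$ and variance $\sigma^2/(1-\phi^2)$ can be checked by simulation; a minimal sketch (the values $c = 2$, $\phi = 0.6$, $\sigma = 1$ are illustrative assumptions):

```python
import numpy as np

c, phi, sigma = 2.0, 0.6, 1.0        # illustrative parameters
rng = np.random.default_rng(0)

n = 200_000
eps = rng.normal(0.0, sigma, n)
x = np.empty(n)
x[0] = c / (1 - phi)                  # start at the stationary mean
for t in range(1, n):
    x[t] = c + phi * x[t - 1] + eps[t]

mean_theory = c / (1 - phi)           # 5.0
var_theory = sigma**2 / (1 - phi**2)  # 1.5625

print(x.mean(), mean_theory)          # sample mean ≈ 5.0
print(x.var(), var_theory)            # sample variance ≈ 1.5625
```

With a long sample, both empirical moments settle near their theoretical values, consistent with a time-invariant distribution.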

Academic Inquiries.

01

Why is the condition $|\phi| < 1$ necessary for finite variance?

If $|\phi| \ge 1$, the variance $\operatorname{Var}(X_t) = \sigma^2 \sum_{j=0}^{t-1} \phi^{2j}$ (taking $X_0$ fixed) forms a divergent series as $t \to \infty$, causing the process to wander indefinitely.

02

What happens if $\phi = 1$?

The model becomes a random walk. It is non-stationary because its variance is time-dependent ($t\sigma^2$ after $t$ steps from a fixed starting value) and it exhibits unit-root behavior.
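The contrast between the two regimes can be made concrete by evaluating the partial-sum variance from the first inquiry for $\phi = 0.9$ versus $\phi = 1$ (illustrative values, with $\sigma^2 = 1$):

```python
def var_at(phi, t, sigma2=1.0):
    """Var(X_t) = sigma^2 * sum_{j=0}^{t-1} phi^(2j), with X_0 held fixed."""
    return sigma2 * sum(phi ** (2 * j) for j in range(t))

for t in (10, 100, 1000):
    print(t, var_at(0.9, t), var_at(1.0, t))

# var_at(0.9, t) converges to 1 / (1 - 0.81) ≈ 5.263 (the stationary variance),
# while var_at(1.0, t) equals t and grows without bound.
```

The stable case saturates at $\sigma^2/(1-\phi^2)$, while the unit-root case reproduces the linear $t\sigma^2$ growth noted above.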

Standardized References.

  • Hamilton, J. D. (1994). Time Series Analysis. Princeton University Press.

