
Auto-regressive Processes

B. Nag and J. Christophersen

MET - 6155

November 09, 2011


Outline of the talk

Introduction to AR(p) Processes

Formal Definition

White Noise

Deriving the First Moment

Deriving the Second Moment

Lag 1: AR(1)

Lag 2: AR(2)


Introduction

The dynamics of many physical processes obey a second-order differential equation:

$$a_2\frac{d^2x(t)}{dt^2} + a_1\frac{dx(t)}{dt} + a_0 x(t) = z(t) \qquad (1)$$

where $z(t)$ is some external forcing function. Time discretization yields

$$x_t = \alpha_1 x_{t-1} + \alpha_2 x_{t-2} + z'_t \qquad (2)$$


Formal Definition

$\{X_t : t \in \mathbb{Z}\}$ is an auto-regressive process of order $p$ if there exist real constants $\alpha_k$, $k = 0, \ldots, p$, with $\alpha_p \neq 0$, and a white noise process $\{Z_t : t \in \mathbb{Z}\}$ such that

$$X_t = \alpha_0 + \sum_{k=1}^{p} \alpha_k X_{t-k} + Z_t \qquad (3)$$

Note: $X_t$ is independent of the part of the noise process that lies in the future, but depends on the parts of the noise process in the present and the past.
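To make the definition concrete, here is a minimal simulation sketch of eqn. (3), assuming NumPy and Gaussian white noise (the definition only requires a white noise process; the Gaussian choice, the function name simulate_ar, and its defaults are our assumptions):

```python
import numpy as np

def simulate_ar(alphas, alpha0=0.0, sigma=1.0, n=1000, burn_in=500, seed=0):
    """Simulate X_t = alpha0 + sum_k alphas[k-1]*X_{t-k} + Z_t, eqn. (3),
    with Z_t i.i.d. Gaussian white noise of standard deviation sigma.
    A burn-in stretch is discarded so the returned sample is near-stationary."""
    rng = np.random.default_rng(seed)
    p = len(alphas)
    x = np.zeros(n + burn_in)
    z = rng.normal(0.0, sigma, size=n + burn_in)
    for t in range(p, n + burn_in):
        x[t] = alpha0 + sum(a * x[t - k] for k, a in enumerate(alphas, 1)) + z[t]
    return x[burn_in:]

x = simulate_ar([0.9, -0.6], n=500)  # an AR(2) example
```

The sketches that follow reuse this helper.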


White Noise

Consider a time series

$$X_t = D_t + N_t \qquad (4)$$

with $D_t$ and $N_t$ being the deterministic (dynamical) and stochastic (random) components, respectively. If $D_t$ is independent of $N_t$, then $D_t$ is purely deterministic. $N_t$ masks deterministic oscillations when they are present. Let us consider the case $k = 1$:

$$X_t = \alpha_1 X_{t-1} + N_t = \alpha_1(D_{t-1} + N_{t-1}) + N_t = \alpha_1 D_{t-1} + \alpha_1 N_{t-1} + N_t$$

where $\alpha_1 N_{t-1}$ can be regarded as the contribution of the noise to the dynamics. The spectrum of a white noise process is flat, i.e. all frequencies contribute equally, and hence the name (by analogy with white light).
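The flat-spectrum claim can be checked numerically: averaging periodograms of white noise realizations over an ensemble gives approximately constant power at every frequency. A minimal sketch, assuming NumPy:

```python
import numpy as np

rng = np.random.default_rng(1)
n, members = 4096, 100
ensemble = rng.normal(0.0, 1.0, size=(members, n))  # sigma = 1 white noise

# Periodogram of each member; for white noise its expected value is the
# same at every frequency, which is why the spectrum is called "flat".
power = np.abs(np.fft.rfft(ensemble, axis=1))**2 / n
mean_power = power.mean(axis=0)

print(mean_power.mean(), mean_power.std())  # level near sigma^2 = 1, small spread
```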


Figure: A realization of a process $X_t = D_t + N_t$ whose dynamical component $D_t = 0.7X_{t-1}$ is affected by the stochastic component $N_t$. (a) $N_t$; (b) $X_t$. All plots are made from a 100-member ensemble.


First-Order Moment: Mean of an AR(p) Process

Assumptions: $\mu_X$ and $\sigma_X^2$ are independent of time (stationarity), and $\varepsilon(Z_t) = 0$.

Taking expectations on both sides of the general eqn. (3):

$$\varepsilon(X_t) = \varepsilon(\alpha_0) + \varepsilon\Big(\sum_{k=1}^{p}\alpha_k X_{t-k}\Big) + \varepsilon(Z_t) = \alpha_0 + \sum_{k=1}^{p}\alpha_k\,\varepsilon(X_{t-k}) = \alpha_0 + \sum_{k=1}^{p}\alpha_k\,\varepsilon(X_t)$$

Solving for $\varepsilon(X_t)$:

$$\varepsilon(X_t) = \frac{\alpha_0}{1 - \sum_{k=1}^{p}\alpha_k} \qquad (5)$$
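A quick numerical check of eqn. (5), reusing the simulate_ar sketch from the definition above (the parameter values are illustrative, not from the talk):

```python
# Reusing simulate_ar from the earlier sketch (our hypothetical helper).
alphas, alpha0 = [0.5, 0.2], 1.0
x = simulate_ar(alphas, alpha0=alpha0, n=200_000)

mean_theory = alpha0 / (1.0 - sum(alphas))  # eqn. (5): 1.0 / 0.3
print(x.mean(), mean_theory)                # both close to 3.33
```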


Second-Order Moment: Variance of an AR(p) Process

Proposition:

$$\mathrm{Var}(X_t) = \sum_{k=1}^{p}\alpha_k\rho_k\,\mathrm{Var}(X_t) + \mathrm{Var}(Z_t)$$

Proof: Let $\mu = \varepsilon(X_t)$. Rewriting eqn. (3), with the constant $\alpha_0$ absorbed using eqn. (5):

$$X_t - \mu = \sum_{k=1}^{p}\alpha_k(X_{t-k} - \mu) + Z_t \qquad (6)$$

Multiplying both sides by $X_t - \mu$ and taking expectations:

$$\mathrm{Var}(X_t) = \varepsilon\big((X_t-\mu)^2\big) = \varepsilon\Big(\sum_{k=1}^{p}\alpha_k(X_t-\mu)(X_{t-k}-\mu)\Big) + \varepsilon\big((X_t-\mu)Z_t\big) = \sum_{k=1}^{p}\alpha_k\,\varepsilon\big((X_t-\mu)(X_{t-k}-\mu)\big) + \varepsilon\big((X_t-\mu)Z_t\big)$$


$$\mathrm{Var}(X_t) = \sum_{k=1}^{p}\alpha_k\rho_k\,\mathrm{Var}(X_t) + \varepsilon\big((X_t-\mu)Z_t\big) \qquad (7)$$

where $\rho_k$ is the auto-correlation function, defined as

$$\rho_k = \frac{\varepsilon\big((X_t-\mu)(X_{t-k}-\mu)\big)}{\mathrm{Var}(X_t)} \qquad (8)$$

Lemma: $\varepsilon\big((X_t-\mu)Z_t\big) = \mathrm{Var}(Z_t)$

Proof:

$$\varepsilon\big((X_t-\mu)Z_t\big) = \varepsilon(X_tZ_t - \mu Z_t) = \varepsilon(X_tZ_t) - \varepsilon(\mu Z_t) \qquad (9)$$

Again, substituting eqn. (6) for $X_t$:

$$\varepsilon(X_tZ_t) = \varepsilon\Big(\Big(\sum_{k=1}^{p}\alpha_k(X_{t-k}-\mu) + Z_t + \mu\Big)Z_t\Big) = \sum_{k=1}^{p}\alpha_k\,\varepsilon(X_{t-k}Z_t) - \mu\sum_{k=1}^{p}\alpha_k\,\varepsilon(Z_t) + \varepsilon(Z_t^2) + \mu\,\varepsilon(Z_t)$$


Since $\varepsilon(Z_t) = 0$, the terms in $\mu$ vanish and

$$\varepsilon(X_tZ_t) = \sum_{k=1}^{p}\alpha_k\,\varepsilon(X_{t-k}Z_t) + \varepsilon(Z_t^2) = \sum_{k=1}^{p}\alpha_k\,\varepsilon(X_{t-k}Z_t) + \mathrm{Var}(Z_t) \qquad (10)$$

Since $X_t$ is independent of the part of the noise process that lies in the future, $X_{t-k}$ and $Z_t$ are independent for $k \geq 1$. Hence

$$\varepsilon(X_{t-k}Z_t) = 0$$

and we get

$$\varepsilon(X_tZ_t) = \mathrm{Var}(Z_t) \qquad (11)$$

From equation (5),

$$\varepsilon(\mu Z_t) = \mu\,\varepsilon(Z_t) = \frac{\alpha_0}{1 - \sum_{k=1}^{p}\alpha_k} \times 0 = 0$$


Thus

$$\varepsilon\big((X_t-\mu)Z_t\big) = \mathrm{Var}(Z_t) \qquad (12)$$

and eqn. (7) reduces to

$$\mathrm{Var}(X_t) = \sum_{k=1}^{p}\alpha_k\rho_k\,\mathrm{Var}(X_t) + \mathrm{Var}(Z_t)$$

$$\mathrm{Var}(X_t) = \frac{\mathrm{Var}(Z_t)}{1 - \sum_{k=1}^{p}\alpha_k\rho_k} \qquad (13)$$
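Eqn. (13) can be checked against a simulated series, again reusing the simulate_ar sketch; the autocorrelations $\rho_k$ are estimated from the sample itself:

```python
import numpy as np

# Reusing simulate_ar from the earlier sketch; sigma_Z = 1.
alphas, sigma = [0.9, -0.6], 1.0
x = simulate_ar(alphas, sigma=sigma, n=500_000)
x = x - x.mean()
var_x = x.var()

# Sample autocorrelations rho_1, rho_2 as in eqn. (8).
rho = [np.mean(x[k:] * x[:-k]) / var_x for k in (1, 2)]

var_theory = sigma**2 / (1.0 - sum(a * r for a, r in zip(alphas, rho)))  # eqn. (13)
print(var_x, var_theory)  # the two values should agree closely
```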


AR(1) Processes

Consider the following first-order equation:

$$a_1\frac{dx}{dt} + a_0 x = z(t) \qquad (14)$$

Discretizing again:

$$a_1(x_t - x_{t-1}) + a_0 x_t = z_t$$
$$a_1 x_t - a_1 x_{t-1} + a_0 x_t = z_t$$
$$x_t(a_1 + a_0) - a_1 x_{t-1} = z_t$$

Therefore we obtain

$$x_t = \alpha_1 x_{t-1} + z'_t \qquad (15)$$

where $\alpha_1 = \frac{a_1}{a_1 + a_0}$ and $z'_t = \frac{z_t}{a_1 + a_0}$.


AR(1) Processes Continued

Hence an AR(1) process can be represented as

$$X_t = \alpha_1 X_{t-1} + Z_t \qquad (16)$$

For convenience we assume $\alpha_0 = 0$, so that $\varepsilon(X_t) = \mu = 0$. The expectation of the product of $X_t$ with $X_{t-1}$ is

$$\varepsilon(X_tX_{t-1}) = \alpha_1\,\varepsilon(X_{t-1}^2) + \varepsilon(Z_tX_{t-1})$$

Since $X_{t-1}$ does not depend on the part of the noise process that is in the future,

$$\varepsilon(Z_tX_{t-1}) = 0$$

Also, since the variance is independent of time, $\varepsilon(X_{t-1}^2) = \varepsilon(X_t^2)$, so

$$\varepsilon(X_tX_{t-1}) = \alpha_1\,\varepsilon(X_t^2) \qquad (17)$$

Hence

$$\alpha_1 = \frac{\varepsilon(X_tX_{t-1})}{\mathrm{Var}(X_t)} \qquad (18)$$
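Eqn. (18) doubles as an estimator: given a sample of an AR(1) process, $\alpha_1$ can be fitted as the ratio of the sample lag-1 autocovariance to the sample variance. A sketch, assuming NumPy and the simulate_ar helper from earlier:

```python
import numpy as np

def fit_ar1(x):
    """Estimate alpha1 via eqn. (18): lag-1 autocovariance over variance."""
    x = np.asarray(x) - np.mean(x)
    return np.mean(x[1:] * x[:-1]) / np.var(x)

x = simulate_ar([0.7], n=100_000)  # reusing the earlier sketch
print(fit_ar1(x))                  # close to the true alpha1 = 0.7
```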


AR(1) Processes Continued

Substituting $k = 1$ in eqn. (8) (with $\mu = 0$) yields

$$\rho_1 = \frac{\varepsilon(X_tX_{t-1})}{\mathrm{Var}(X_t)} \qquad (19)$$

Hence $\rho_1 = \alpha_1$.

Using this, we can write eqn. (13) for an AR(1) process as

$$\mathrm{Var}(X_t) = \frac{\mathrm{Var}(Z_t)}{1 - \sum_{k=1}^{p}\alpha_k\rho_k} = \frac{\sigma_Z^2}{1 - \alpha_1^2} \qquad (20)$$

This result shows that the variance of the random variable $X_t$ is a linear function of the white noise variance $\sigma_Z^2$, but a nonlinear function of $\alpha_1$. If $\alpha_1 \approx 0$, then $\mathrm{Var}(X_t) \approx \mathrm{Var}(Z_t)$. For $\alpha_1 \in (0, 1)$ we see that $\mathrm{Var}(X_t) > \mathrm{Var}(Z_t)$, and as $\alpha_1$ approaches 1, $\mathrm{Var}(X_t)$ approaches $\infty$.
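The blow-up of the variance as $\alpha_1 \to 1$ is easy to see numerically by comparing sample variances against the closed form (20), again reusing the simulate_ar sketch:

```python
# Sample variance of AR(1) realizations versus eqn. (20), sigma_Z = 1.
# A long burn-in matters when alpha1 is close to 1.
for a1 in (0.0, 0.3, 0.9, 0.99):
    x = simulate_ar([a1], n=200_000, burn_in=5_000)
    print(a1, x.var(), 1.0 / (1.0 - a1**2))
```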


Figure: AR(1) processes with $\alpha_1 = 0.3$ (top) and $\alpha_1 = 0.9$ (bottom).


AR(2) Processes

Returning to eqn. (1):

$$a_2\frac{d^2x(t)}{dt^2} + a_1\frac{dx(t)}{dt} + a_0 x(t) = z(t) \qquad (21)$$

where $z(t)$ is some external forcing function. Time discretization yields

$$a_2(x_t + x_{t-2} - 2x_{t-1}) + a_1(x_t - x_{t-1}) + a_0 x_t = z_t$$
$$(a_0 + a_1 + a_2)x_t = (a_1 + 2a_2)x_{t-1} - a_2 x_{t-2} + z_t$$

Alternatively,

$$x_t = \alpha_1 x_{t-1} + \alpha_2 x_{t-2} + z'_t \qquad (22)$$

where

$$\alpha_1 = \frac{a_1 + 2a_2}{a_0 + a_1 + a_2}, \qquad \alpha_2 = -\frac{a_2}{a_0 + a_1 + a_2}, \qquad z'_t = \frac{z_t}{a_0 + a_1 + a_2}$$
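The map from the ODE coefficients $(a_0, a_1, a_2)$ to the AR(2) coefficients of eqn. (22) is a two-liner; the function name and example values are ours:

```python
def ode_to_ar2(a0, a1, a2):
    """Coefficients of eqn. (22) from the discretized ODE, eqn. (21)."""
    s = a0 + a1 + a2
    return (a1 + 2.0 * a2) / s, -a2 / s  # (alpha1, alpha2); noise scales by 1/s

print(ode_to_ar2(1.0, 0.5, 0.25))  # illustrative coefficients, not from the talk
```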


Figure: AR(2) processes with $\alpha_1 = 0.9$ and $\alpha_2 = -0.8$ (top) and with $\alpha_1 = \alpha_2 = 0.3$ (bottom).


Parameterizing AR(2) Processes

For an AR(2) process to be stationary, $\alpha_1$ and $\alpha_2$ must satisfy three conditions:

(1) $\alpha_1 + \alpha_2 < 1$
(2) $\alpha_2 - \alpha_1 < 1$
(3) $-1 < \alpha_2 < 1$

This defines a triangular region in the $(\alpha_1, \alpha_2)$-plane.

Note that if $\alpha_2 = 0$, we recover AR(1) processes, for which $-1 < \alpha_1 < 1$ defines the region where the AR(1) model is stationary.
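The three conditions translate directly into a membership test for the stationarity triangle; a minimal sketch, with our own function name:

```python
def ar2_is_stationary(alpha1, alpha2):
    """True if (alpha1, alpha2) lies inside the AR(2) stationarity triangle."""
    return alpha1 + alpha2 < 1 and alpha2 - alpha1 < 1 and -1 < alpha2 < 1

print(ar2_is_stationary(0.9, -0.6))  # True: inside the triangle
print(ar2_is_stationary(0.7, 0.5))   # False: alpha1 + alpha2 >= 1
```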


Parameterizing AR(2) Processes Continued

Figure: Region of stationary points for AR(1) and AR(2) processes.


Parameterizing AR(2) Processes Continued

The figure above shows:

AR(1) processes are special cases:
- $\alpha_1 > 0$ gives exponential decay of the autocorrelation
- $\alpha_1 < 0$ gives damped oscillations
- $\alpha_1 > 0$ for most meteorological phenomena

The second parameter $\alpha_2$:
- allows a more complex relationship between lags
- for $(\alpha_1, \alpha_2) = (0.9, -0.6)$, a slowly damped oscillation around 0
- AR(2) models can therefore represent pseudoperiodicity (see the sketch below)
- barometric pressure variations due to midlatitude synoptic systems follow such pseudoperiodic behavior
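One way to see the pseudoperiodicity is through the autocorrelation function. Multiplying eqn. (3) by $X_{t-k}$ and taking expectations (the same manipulation as in the variance proof) gives the recursion $\rho_k = \alpha_1\rho_{k-1} + \alpha_2\rho_{k-2}$, a standard result not derived in the talk. For $(0.9, -0.6)$ the recursion produces a damped oscillation:

```python
# Autocorrelation of an AR(2) process via rho_k = a1*rho_{k-1} + a2*rho_{k-2},
# with rho_0 = 1 and rho_1 = alpha1 / (1 - alpha2).
alpha1, alpha2 = 0.9, -0.6
rho = [1.0, alpha1 / (1.0 - alpha2)]
for k in range(2, 20):
    rho.append(alpha1 * rho[-1] + alpha2 * rho[-2])

print([round(r, 3) for r in rho])  # sign changes reveal the damped oscillation
```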


Parameterizing AR(2) Processes Continued

Figure: Four synthetic time series illustrating some properties of autoregressive models. (a) $\alpha_1 = 0.0$, $\alpha_2 = 0.1$; (b) $\alpha_1 = 0.5$, $\alpha_2 = 0.1$; (c) $\alpha_1 = 0.9$, $\alpha_2 = -0.6$; (d) $\alpha_1 = 0.09$, $\alpha_2 = 0.11$.


References

von Storch, H., and F. W. Zwiers, 1999: Statistical Analysis in Climate Research. 1st ed., Cambridge University Press, 494 pp.

Wilks, D. S., 1995: Statistical Methods in the Atmospheric Sciences. 1st ed., Academic Press, 467 pp.

Scheaffer, R. L., 1994: Introduction to Probability and Its Applications. 2nd ed., Duxbury Press, 377 pp.


Questions

??
