# Stationary Processes (ECE 250, Winter 2017, University of California, San Diego) — Lecture 9


9.1 STRICT-SENSE STATIONARITY

Stationarity refers to time invariance of some, or all, of the statistics of a random process, such as the mean, autocorrelation, and n-th order distribution. We define two types of stationarity: strict-sense stationarity (SSS) and wide-sense stationarity (WSS).

A random process {X(t)} is said to be SSS (or just stationary) if all its finite-order distributions are time invariant, i.e., the joint cdfs (pdfs, pmfs) of

X(t1), X(t2), …, X(tk)   and   X(t1 + τ), X(t2 + τ), …, X(tk + τ)

are the same for every k, every t1, t2, …, tk, and every time shift τ. So for an SSS process, the first-order distribution is independent of t, and the second-order distribution (the joint distribution of any two samples X(t1) and X(t2)) depends only on τ = t2 − t1. To see this, note that from the definition of stationarity, for any t, the joint distribution of X(t1) and X(t2) is the same as the joint distribution of X(t) = X(t1 + (t − t1)) and X(t2 + (t − t1)) = X(t + (t2 − t1)).

Example 9.1. IID processes are SSS.

Example 9.2. The random walk is not SSS. In fact, no independent increment process is SSS.

Example 9.3. The Gauss–Markov process defined in Section … is not SSS. However, if we set X1 to the steady-state distribution of Xn, it becomes SSS (see Problem …).
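The claim in Example 9.2 is easy to check empirically: for a random walk with i.i.d. ±1 steps, Var(Xn) = n grows with n, so even the first-order distribution is not time invariant. A minimal Monte Carlo sketch (illustrative code, not part of the lecture; the sample sizes and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)

# Random walk X_n = Z_1 + ... + Z_n with i.i.d. +/-1 steps.
steps = rng.choice([-1, 1], size=(100_000, 50))
paths = steps.cumsum(axis=1)

# Sample variances at n = 5 and n = 50; theory gives 5 and 50.
var_at_5 = float(paths[:, 4].var())
var_at_50 = float(paths[:, 49].var())
```

Since the variance at n = 50 is roughly ten times the variance at n = 5, no single first-order distribution describes all times, ruling out SSS.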

9.2 WIDE-SENSE STATIONARITY

A random process {X(t)} is said to be wide-sense stationary (WSS) if its mean and autocorrelation functions are time invariant, i.e.,

1. E[X(t)] = μ is independent of t, and
2. RX(t1, t2) is a function only of the time difference t2 − t1.

As a technical condition, we also assume that E[X(t)²] < ∞. Since RX(t1, t2) = RX(t2, t1), for any wide-sense stationary process {X(t)}, the autocorrelation function RX(t1, t2) is a function only of |t2 − t1|. Clearly SSS implies WSS. The converse is not necessarily true.

Example 9.4. Let

X(t) = { +sin t  w.p. 1/4,
         −sin t  w.p. 1/4,
         +cos t  w.p. 1/4,
         −cos t  w.p. 1/4.

Note that E[X(t)] = 0 and RX(t1, t2) = (1/2) cos(t2 − t1); thus {X(t)} is WSS. But X(0) and X(π/4) have different pmfs (i.e., the first-order pmf is not time invariant) and the process is not SSS.
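Both claims in Example 9.4 can be verified by direct enumeration over the four equally likely branches. A small illustrative sketch (not part of the lecture):

```python
import numpy as np

# The four equally likely sample functions in Example 9.4.
branches = [lambda t: np.sin(t), lambda t: -np.sin(t),
            lambda t: np.cos(t), lambda t: -np.cos(t)]

def mean_X(t):
    """E[X(t)]: average over the four equally likely branches."""
    return sum(f(t) for f in branches) / 4.0

def R_X(t1, t2):
    """E[X(t1) X(t2)]: average of the products over the branches."""
    return sum(f(t1) * f(t2) for f in branches) / 4.0

# First-order supports at t = 0 and t = pi/4; they differ, so the
# first-order pmf is not time invariant and the process is not SSS.
support_0 = sorted(round(float(f(0.0)), 6) for f in branches)
support_pi4 = sorted(round(float(f(np.pi / 4)), 6) for f in branches)
```

At t = 0 the process takes values {−1, 0, 0, 1}, while at t = π/4 it takes values ±√2/2, so the two pmfs cannot coincide even though the mean and autocorrelation are time invariant.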

For a Gaussian random process, WSS implies SSS, since every finite-order distribution of the process is completely specified by its mean and autocorrelation functions. The random walk is not WSS, since RX(n1, n2) = min{n1, n2} is not time invariant. In fact, no independent increment process can be WSS.

9.3 AUTOCORRELATION FUNCTION OF WSS PROCESSES

Let {X(t)} be a WSS process. We relabel RX(t1, t2) as RX(τ), where τ = t1 − t2. The autocorrelation function RX(τ) satisfies the following properties.

1. RX(τ) is even, i.e., RX(τ) = RX(−τ) for every τ.
2. |RX(τ)| ≤ RX(0) = E[X²(t)], the average power of X(t).
3. If RX(T) = RX(0) for some T ≠ 0, then RX(τ) is periodic with period T and so is X(t) (with probability 1).

Example 9.5. Let X(t) = α cos(ωt + Θ) be periodic with random phase Θ. Then RX(τ) = (α²/2) cos ωτ, which is also periodic.

The above properties of RX(τ) are necessary but not sufficient for a function to qualify as an autocorrelation function of a WSS process. The necessary and sufficient condition for a function R(τ) to be an autocorrelation function of a WSS process is that it be even and nonnegative definite, that is, for any n, any t1, t2, …, tn, and any real vector a = (a1, …, an),

Σ_{i=1}^{n} Σ_{j=1}^{n} a_i a_j R(t_i − t_j) ≥ 0.

To see why this is necessary, recall that the correlation matrix of a random vector must be nonnegative definite, so if we take a set of n samples from the WSS random process, their correlation matrix must be nonnegative definite. The condition is sufficient since such an R(τ) can specify a zero-mean stationary Gaussian random process. The nonnegative definite condition may be difficult to verify directly. It turns out, however, to be equivalent to the condition that the Fourier transform of RX(τ), which is called the power spectral density SX(f), is nonnegative for all frequencies f.

Example 9.6. Consider the following functions.

[Figure: eight candidate functions, (a) through (h). The surviving labels indicate exponential shapes (e^{−τ}, e^{−|τ|}), a sinc, triangular and rectangular pulses of half-width T, and the discrete sequence 2^{−|n|}; the individual plots are not recoverable.]

Here, the functions in (a), (c), and (g) are not autocorrelation functions, and the other

functions are autocorrelation functions of some WSS processes.
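A candidate R(τ) can be tested numerically by forming the matrix [R(t_i − t_j)] on a grid of sample times and checking that its smallest eigenvalue is nonnegative. The sketch below is illustrative (the triangle and rectangle of half-width 1 stand in for two of the candidate shapes; the grid of times is arbitrary): the triangular pulse passes the test while the rectangular pulse fails it, consistent with the rectangle's sinc-shaped Fourier transform taking negative values.

```python
import numpy as np

def min_eig(R, ts):
    """Smallest eigenvalue of the correlation matrix [R(t_i - t_j)]."""
    M = np.array([[R(ti - tj) for tj in ts] for ti in ts])
    return float(np.linalg.eigvalsh(M).min())

tri = lambda tau: max(0.0, 1.0 - abs(tau))          # triangular pulse, half-width 1
rect = lambda tau: 1.0 if abs(tau) <= 1.0 else 0.0  # rectangular pulse, half-width 1

ts = [0.0, 0.8, 1.6]
tri_min = min_eig(tri, ts)    # nonnegative: triangle is nonnegative definite
rect_min = min_eig(rect, ts)  # negative: rectangle is not a valid ACF
```

For these three sample times the rectangle's matrix has determinant −1, so it must have a negative eigenvalue; no choice of grid can make the triangle's matrix fail, since its spectrum (a sinc² shape) is nonnegative.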

If RX(τ) drops quickly with τ, this means that samples become uncorrelated quickly as we increase τ. Conversely, if RX(τ) drops slowly with τ, samples are highly correlated. The figure below illustrates autocorrelation functions in these two cases. Hence, RX(τ) is a measure of the rate of change of X(t) with time t. It turns out that this is not just an intuitive interpretation: as will be proved in Section …, the Fourier transform of RX(τ) (the power spectral density) is in fact the average power density of X(t) over frequency.

[Figure: autocorrelation functions RX1(τ) and RX2(τ) when (a) correlation is low and (b) correlation is high.]

9.4 POWER SPECTRAL DENSITY

The power spectral density (psd) of a WSS random process {X(t)} is the Fourier transform of RX(τ):

SX(f) = F[RX(τ)] = ∫_{−∞}^{∞} RX(τ) e^{−i2πfτ} dτ.


For a discrete-time process {Xn}, the power spectral density is the discrete-time Fourier transform of the sequence RX(n):

SX(f) = Σ_{n=−∞}^{∞} RX(n) e^{−i2πnf},   |f| < 1/2.

By taking the inverse Fourier transform, RX(τ) (or RX(n)) can be recovered from SX(f) as

RX(τ) = ∫_{−∞}^{∞} SX(f) e^{i2πfτ} df,
RX(n) = ∫_{−1/2}^{1/2} SX(f) e^{i2πnf} df.

The power spectral density SX(f) satisfies the following properties.

1. SX(f) is real and even.
2. ∫_{−∞}^{∞} SX(f) df = RX(0) = E[X²(t)], that is, the area under SX(f) is the average power.
3. SX(f) is the average power density, i.e., the average power of X(t) in the frequency band [f1, f2] is

∫_{−f2}^{−f1} SX(f) df + ∫_{f1}^{f2} SX(f) df = 2 ∫_{f1}^{f2} SX(f) df.

4. SX(f) ≥ 0.

The first two properties follow from the definition of the power spectral density as the Fourier transform of the real and even function RX(τ). The third and fourth properties will be proved later in Section …. In general, a function S(f) is a psd if and only if it is real, even, nonnegative, and

∫_{−∞}^{∞} S(f) df < ∞.

Example 9.7. We consider a few pairs of autocorrelation functions and power spectral densities.

(a) RX(τ) = e^{−α|τ|}   ↔   SX(f) = 2α / (α² + (2πf)²).


(b) RX(τ) = (α²/2) cos(2πf0τ)   ↔   SX(f) = (α²/4)[δ(f − f0) + δ(f + f0)].

(c) RX(n) = 2^{−|n|}   ↔   SX(f) = 3 / (5 − 4 cos 2πf),   |f| < 1/2.

Example 9.8 (Discrete-time white noise process). Let X1, X2, … be zero-mean, uncorrelated random variables with common variance N, for example, i.i.d. N(0, N). The autocorrelation function and the power spectral density, plotted in the figure below, are

RX(n) = N if n = 0, and 0 otherwise,
SX(f) = N,   |f| < 1/2.

[Figure: autocorrelation function and power spectral density of the discrete-time white noise process.]
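A quick empirical check of Example 9.8 (illustrative code; the variance N = 2, sample size, and seed are arbitrary choices): the sample autocorrelation of i.i.d. N(0, N) data is approximately N at lag 0 and approximately 0 at every other lag.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 2.0                                      # common variance
x = rng.normal(0.0, np.sqrt(N), size=200_000)

def R_hat(k):
    """Sample estimate of R_X(k) = E[X_n X_{n+k}]."""
    return float(np.mean(x[: len(x) - k] * x[k:]))

r0, r1, r5 = R_hat(0), R_hat(1), R_hat(5)    # expect roughly N, 0, 0
```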

Example 9.9 (Band-limited white noise process). Let {X(t)} be a WSS zero-mean process with power spectral density and autocorrelation function, plotted in the figure below,

SX(f) = N/2 if |f| ≤ B, and 0 otherwise,
RX(τ) = NB sinc(2Bτ),

where sinc(x) = sin(πx)/(πx). Note that for any t, the samples X(t + n/(2B)) for n = 0, ±1, ±2, … are uncorrelated, since RX(n/(2B)) = NB sinc(n) = 0 for n ≠ 0.

[Figure: autocorrelation function and power spectral density of the band-limited white noise process.]
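The transform pair in Example 9.9 can be verified by numerically inverting SX(f): integrating (N/2) e^{i2πfτ} over [−B, B] should reproduce NB sinc(2Bτ), which vanishes at the sample spacing τ = n/(2B) for n ≠ 0. An illustrative sketch using NumPy's np.sinc, which implements the same normalized sinc(x) = sin(πx)/(πx) convention (the values N = 2 and B = 5 are arbitrary):

```python
import numpy as np

N, B = 2.0, 5.0

def R_from_psd(tau, n_grid=20_000):
    """Midpoint-rule inverse Fourier transform of S_X(f) = N/2 on [-B, B]."""
    df = 2 * B / n_grid
    f = -B + (np.arange(n_grid) + 0.5) * df
    return float(np.real(np.sum(N / 2 * np.exp(1j * 2 * np.pi * f * tau))) * df)

# Compare against N*B*sinc(2*B*tau) at a few lags, including n/(2B) zeros.
taus = [0.0, 0.03, 1 / (2 * B), 3 / (2 * B)]
numeric = [R_from_psd(t) for t in taus]
closed = [N * B * np.sinc(2 * B * t) for t in taus]
```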

Example 9.10 (White noise process). If we let B → ∞ in the previous example, we obtain a white noise process, which has

SX(f) = N/2 for all f,
RX(τ) = (N/2) δ(τ).

If, in addition, {X(t)} is Gaussian, then we obtain the famous white Gaussian noise (WGN) process. A sample path of the white noise process is depicted in the figure below. For a white noise process, all samples are uncorrelated. The process is not physically realizable, since it has infinite power. However, it plays a role in random processes similar to that of a point mass in physics or a delta function in EE. Thermal noise and shot noise are well modeled as white Gaussian noise, since they have a very flat psd over a very wide band (on the order of GHz).

[Figure: a sample function of the white noise process.]


9.5 WSS PROCESSES AND LTI SYSTEMS

Consider a linear time-invariant (LTI) system with impulse response h(t) and transfer function H(f) = F[h(t)]. Suppose that the input to the system is a WSS process {X(t)}, as depicted in the figure below. We wish to characterize its output

Y(t) = X(t) * h(t) = ∫_{−∞}^{∞} X(τ) h(t − τ) dτ.

[Figure: an LTI system with impulse response h(t), input X(t), and output Y(t).]

Assuming that the system is in steady state, it turns out (not surprisingly) that the output is also a WSS process. In fact, X(t) and Y(t) are jointly WSS, namely, X(t) and Y(t) are each WSS, and their crosscorrelation function

RXY(t1, t2) = E[X(t1)Y(t2)]

is time invariant, i.e., RXY(t1, t2) is a function of τ = t1 − t2.

Henceforth, we relabel RXY(t1, t2) as RXY(τ), where τ = t1 − t2. Note that unlike RX(τ), RXY(τ) is not necessarily even, but it satisfies

RXY(τ) = RYX(−τ).

Example 9.11. Let Θ ~ Unif[0, 2π]. Consider the two processes

X(t) = α cos(ωt + Θ),
Y(t) = α sin(ωt + Θ).

These processes are jointly WSS, since each is WSS (in fact SSS) and

RXY(t1, t2) = E[α² cos(ωt1 + Θ) sin(ωt2 + Θ)] = (α²/2) sin(ω(t2 − t1)).
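The expectation over Θ ~ Unif[0, 2π] can be approximated by averaging over a fine uniform grid of phases, which is exact up to floating point for sinusoids once the grid is dense enough. The sketch below (illustrative; the values α = 2 and ω = 3 are arbitrary) checks the crosscorrelation formula above:

```python
import numpy as np

alpha, omega = 2.0, 3.0
theta = (np.arange(4096) + 0.5) * (2 * np.pi / 4096)  # uniform phase grid

def R_XY(t1, t2):
    """E[X(t1) Y(t2)] for X = alpha*cos(omega*t + Theta), Y = alpha*sin(omega*t + Theta)."""
    x = alpha * np.cos(omega * t1 + theta)
    y = alpha * np.sin(omega * t2 + theta)
    return float(np.mean(x * y))

t1, t2 = 0.4, 1.1
numeric = R_XY(t1, t2)
closed = (alpha ** 2 / 2) * np.sin(omega * (t2 - t1))
```

Note that the result depends only on t2 − t1 and is an odd function of the lag, consistent with RXY(τ) = RYX(−τ) rather than with evenness.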
