
Stationary Processes - University of California, San Diego (circuit.ucsd.edu/~yhk/ece250-win17/pdfs/lect09.pdf)



LECTURE 9

Stationary Processes

9.1 STRICT-SENSE STATIONARITY

Stationarity refers to time invariance of some, or all, of the statistics of a random pro-

cess, such as mean, autocorrelation, and n-th order distribution. We define two types of

stationarity: strict-sense stationarity (SSS) and wide-sense stationarity (WSS).

A random process {X(t)} is said to be SSS (or just stationary) if all its finite-order

distributions are time invariant, i.e., the joint cdfs (pdfs, pmfs) of

X(t1), X(t2), . . . , X(tk) and

X(t1 + τ), X(t2 + τ), . . . , X(tk + τ) are the same for every k, every t1, t2, . . . , tk, and every time shift τ. So for a SSS process,

the first-order distribution is independent of t, and the second-order distribution (the distribution of any two samples X(t1) and X(t2)) depends only on τ = t2 − t1. To see this, note that from the definition of stationarity, for any t, the joint distribution of X(t1) and X(t2) is the same as the joint distribution of X(t) = X(t1 + (t − t1)) and X(t2 + (t − t1)) = X(t + (t2 − t1)).

Example 9.1. IID processes are SSS.

Example 9.2. The random walk is not SSS. In fact, no independent increment process is

SSS.

Example 9.3. The Gauss–Markov process defined in an earlier lecture is not SSS. However, if we set X1 to the steady-state distribution of Xn, it becomes SSS (see Problem 9.6).

9.2 WIDE-SENSE STATIONARITY

A random process {X(t)} is said to be wide-sense stationary (WSS) if its mean and auto-

correlation functions are time invariant, i.e.,

• E[X(t)] = μ is independent of t, and

• RX(t1, t2) is a function only of the time difference t2 − t1.


As a technical condition, we also assume that

E[X(t)2] < ∞.

Since RX(t1, t2) = RX(t2, t1), for any wide-sense stationary process {X(t)}, the autocorrelation function RX(t1, t2) is a function only of |t2 − t1|. Clearly SSS implies WSS. The

converse is not necessarily true.

Example 9.4. Let

    X(t) = +sin t  w.p. 1/4,
           −sin t  w.p. 1/4,
           +cos t  w.p. 1/4,
           −cos t  w.p. 1/4.

Note that E[X(t)] = 0 and RX(t1, t2) = (1/2) cos(t2 − t1); thus X(t) is WSS. But X(0) and X(π/4) have different pmfs (i.e., the first-order pmf is not time-invariant) and the process is not SSS.
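As a quick numerical sanity check of this example, the sketch below (Python with numpy; the sample size and the time points t1, t2 are arbitrary choices) estimates the mean and correlation by Monte Carlo:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
# Each realization picks one of the four equally likely sample functions.
choice = rng.integers(4, size=n)

def X(t):
    # Values of the four sample functions at time t; one entry per realization.
    vals = np.array([np.sin(t), -np.sin(t), np.cos(t), -np.cos(t)])
    return vals[choice]

t1, t2 = 0.3, 1.1
x1, x2 = X(t1), X(t2)
print(x1.mean())                 # close to 0 = E[X(t)]
print((x1 * x2).mean())          # close to (1/2) cos(t2 - t1) = R_X(t1, t2)
print(0.5 * np.cos(t2 - t1))
```

The empirical mean and correlation match the WSS statistics above, while the sets of values taken by X(0) and X(π/4) differ, as the example states.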

For a Gaussian random process, WSS implies SSS, since every finite-order distribution of the process is completely specified by its mean and autocorrelation functions. The random walk is not WSS, since RX(n1, n2) = min{n1, n2} is not time-invariant. In fact, no independent increment process can be WSS.

9.3 AUTOCORRELATION FUNCTION OF WSS PROCESSES

Let {X(t)} be a WSS process. We relabel RX(t1, t2) as RX(τ), where τ = t1 − t2. The autocorrelation function RX(τ) satisfies the following properties.

1. RX(τ) is even, i.e., RX(τ) = RX(−τ) for every τ.

2. |RX(τ)| ≤ RX(0) = E[X²(t)], the “average power” of X(t).

3. If RX(T) = RX(0) for some T ≠ 0, then RX(τ) is periodic with period T and so is X(t) (with probability 1).

Example 9.5. Let X(t) = α cos(ωt + Θ) be periodic with random phase. Then

    RX(τ) = (α²/2) cos ωτ,

which is also periodic.

The above properties of RX(τ) are necessary but not sufficient for a function to qualify as an autocorrelation function for a WSS process. The necessary and sufficient condition for a function R(τ) to be an autocorrelation function for a WSS process is that it


be even and nonnegative definite, that is, for any n, any t1, t2, . . . , tn, and any real vector a = (a1, . . . , an),

    ∑_{i=1}^{n} ∑_{j=1}^{n} ai aj R(ti − tj) ≥ 0.

To see why this is necessary, recall that the correlation matrix for a random vector must be nonnegative definite, so if we take a set of n samples from the WSS random process, their correlation matrix must be nonnegative definite. The condition is sufficient since such an R(τ) can specify a zero-mean stationary Gaussian random process. The nonnegative definite condition may be difficult to verify directly. It turns out, however, to be equivalent to the condition that the Fourier transform of RX(τ), which is called the power spectral density SX(f), is nonnegative for all frequencies f.
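A small numerical illustration of the nonnegative definite condition (Python with numpy; the grid spacing and the two test functions are arbitrary choices): sampling a candidate R(τ) on a grid and checking the smallest eigenvalue of the resulting Toeplitz correlation matrix exposes a function that is not nonnegative definite.

```python
import numpy as np

def min_eig(r):
    # Toeplitz correlation matrix R[i, j] = r(|i - j|) from samples r[0], r[1], ...
    idx = np.abs(np.arange(len(r))[:, None] - np.arange(len(r))[None, :])
    return np.linalg.eigvalsh(r[idx]).min()

tau = 0.2 * np.arange(60)
r_exp = np.exp(-np.abs(tau))           # e^{-|tau|}: a valid autocorrelation function
r_rect = (tau <= 2.0).astype(float)    # rectangular pulse: not a valid one
print(min_eig(r_exp))                  # nonnegative (up to rounding)
print(min_eig(r_rect))                 # strictly negative
```

The rectangular pulse fails because its Fourier transform (a sinc) goes negative, consistent with the psd criterion stated above.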

Example 9.6. Consider the following functions.

[Sketches (a)-(h) omitted; among them are e−ατ (one-sided), e−α|τ|, sinc τ, 2−|n|, and pulses supported on [−T, T].]

Here, the functions in (a), (c), and (g) are not autocorrelation functions, and the other functions are autocorrelation functions of some WSS processes.

If RX(τ) drops quickly with τ, this means that samples become uncorrelated quickly as we increase τ. Conversely, if RX(τ) drops slowly with τ, samples are highly correlated. Figure 9.1 illustrates autocorrelation functions in these two cases. Hence, RX(τ) is a measure of the rate of change of X(t) with time t. It turns out that this is not just an intuitive interpretation: as will be proved in Section 9.5, the Fourier transform of RX(τ) (the power spectral density) is in fact the average power density of X(t) over frequency.

Figure 9.1. Autocorrelation functions when (a) correlation is low and (b) correlation is high.

9.4 POWER SPECTRAL DENSITY

The power spectral density (psd) of a WSS random process {X(t)} is the Fourier transform of RX(τ):

    SX(f) = F[RX(τ)] = ∫_{−∞}^{∞} RX(τ) e−i2πfτ dτ.


For a discrete-time process {Xn}, the power spectral density is the discrete-time Fourier transform of the sequence RX(n):

    SX(f) = ∑_{n=−∞}^{∞} RX(n) e−i2πnf,   |f| < 1/2.

By taking the inverse Fourier transform, RX(τ) (or RX(n)) can be recovered from SX(f) as

    RX(τ) = ∫_{−∞}^{∞} SX(f) ei2πfτ df,

    RX(n) = ∫_{−1/2}^{1/2} SX(f) ei2πnf df.

The power spectral density SX(f) satisfies the following properties.

1. SX(f) is real and even.

2. ∫_{−∞}^{∞} SX(f) df = RX(0) = E[X²(t)], that is, the area under SX(f) is the average power.

3. SX(f) is the average power density, i.e., the average power of X(t) in the frequency band [f1, f2] is

    ∫_{−f2}^{−f1} SX(f) df + ∫_{f1}^{f2} SX(f) df = 2 ∫_{f1}^{f2} SX(f) df.

4. SX(f) ≥ 0.

The first two properties follow from the definition of the power spectral density as the Fourier transform of a real and even function RX(τ). The third and fourth properties will be proved later in Section 9.5. In general, a function S(f) is a psd if and only if it is real, even, nonnegative, and ∫_{−∞}^{∞} S(f) df < ∞.

Example 9.7. We consider a few pairs of autocorrelation functions and power spectral densities.

(a) RX(τ) = e−2α|τ|,   SX(f) = α / (α² + (πf)²).

(b) RX(τ) = (α²/2) cos ωτ,   SX(f) = (α²/4) δ(f − ω/2π) + (α²/4) δ(f + ω/2π).

(c) RX(n) = 2−|n|,   SX(f) = 3 / (5 − 4 cos 2πf),   |f| < 1/2.

Example 9.8 (Discrete-time white noise process). Let X1, X2, . . . be zero mean, uncorrelated, with common variance N, for example, i.i.d. N(0, N). The autocorrelation function and the power spectral density are plotted in Figure 9.2:

    RX(n) = N for n = 0, and 0 otherwise;   SX(f) = N for |f| < 1/2.

Figure 9.2. Autocorrelation function and power spectral density of the discrete-time white noise process.

Example 9.9 (Band-limited white noise process). Let {X(t)} be a WSS zero-mean process with autocorrelation function and power spectral density plotted in Figure 9.3. Note that for any t, the samples X(t ± n/(2B)) for n = 0, 1, 2, . . . are uncorrelated.


    SX(f) = N/2 for |f| ≤ B (and 0 otherwise);   RX(τ) = NB sinc 2Bτ.

Figure 9.3. Autocorrelation function and power spectral density of the band-limited white noise process.

Example 9.10 (White noise process). If we let B → ∞ in the previous example, we obtain a white noise process, which has

    SX(f) = N/2 for all f,

    RX(τ) = (N/2) δ(τ).

If, in addition, {X(t)} is Gaussian, then we obtain the famous white Gaussian noise (WGN) process. A sample path of the white noise process is depicted in Figure 9.4. For a white noise process, all samples are uncorrelated. The process is not physically realizable, since it has infinite power. However, it plays a similar role in random processes to the role of a point mass in physics and the delta function in EE. Thermal noise and shot noise are well modeled as white Gaussian noise, since they have a very flat psd over a very wide band (GHz).

Figure 9.4. A sample function of the white noise process.


9.5 WSS PROCESSES AND LTI SYSTEMS

Consider a linear time invariant (LTI) system with impulse response h(t) and transfer function H(f) = F[h(t)]. Suppose that the input to the system is a WSS process {X(t) : t ∈ ℝ}, as depicted in Figure 9.5. We wish to characterize its output

    Y(t) = X(t) ∗ h(t) = ∫_{−∞}^{∞} X(τ) h(t − τ) dτ.

Figure 9.5. An LTI system driven by a WSS process input.

Assuming that the system is in steady state, it turns out (not surprisingly) that the output is also a WSS process. In fact, X(t) and Y(t) are jointly WSS, namely,

• X(t) and Y(t) are WSS, and

• their crosscorrelation function RXY(t1, t2) = E[X(t1)Y(t2)] is time invariant, i.e., RXY(t1, t2) is a function only of τ = t1 − t2.

Henceforth, we relabel RXY(t1, t2) as RXY(τ), where τ = t1 − t2. Note that unlike RX(τ), RXY(τ) is not necessarily even, but it satisfies

    RXY(τ) = RYX(−τ).

Example 9.11. Let Θ ∼ Unif[0, 2π]. Consider the two processes

    X(t) = α cos(ωt + Θ),   Y(t) = α sin(ωt + Θ).

These processes are jointly WSS, since each is WSS (in fact SSS) and

    RXY(t1, t2) = E[α² cos(ωt1 + Θ) sin(ωt2 + Θ)]
                = (α²/4π) ∫_{0}^{2π} [sin(ω(t1 + t2) + 2θ) − sin(ω(t1 − t2))] dθ
                = −(α²/2) sin(ω(t1 − t2)),

which is a function only of t1 − t2.
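A Monte Carlo check of this crosscorrelation (Python; the values of α, ω, t1, t2 are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(1)
alpha, omega = 1.5, 2.0
t1, t2 = 0.7, 0.2
theta = rng.uniform(0.0, 2 * np.pi, 1_000_000)   # the common random phase
r_mc = np.mean(alpha * np.cos(omega * t1 + theta)
               * alpha * np.sin(omega * t2 + theta))
r_theory = -(alpha ** 2 / 2) * np.sin(omega * (t1 - t2))
print(r_mc, r_theory)                            # close agreement
```
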


We define the cross power spectral density for jointly WSS processes {X(t)} and {Y(t)} as

    SXY(f) = F[RXY(τ)].

Example 9.12. Let Y(t) = X(t) + Z(t), where {X(t)} and {Z(t)} are zero-mean uncorrelated WSS processes with power spectral densities SX(f) and SZ(f). Then {X(t)} and {Y(t)} are jointly WSS. First, we show that {Y(t)} is WSS, since it is zero mean and

    RY(t1, t2) = E[(X(t1) + Z(t1))(X(t2) + Z(t2))]
     (a)       = E[X(t1)X(t2)] + E[Z(t1)Z(t2)]
               = RX(τ) + RZ(τ),

where (a) follows since {X(t)} and {Z(t)} are zero mean and uncorrelated.

Taking the Fourier transform of both sides,

    SY(f) = SX(f) + SZ(f).

To show that Y(t) and X(t) are jointly WSS, consider

    RXY(t1, t2) = E[X(t1)(X(t2) + Z(t2))]
                = E[X(t1)X(t2)] + E[X(t1)Z(t2)]
                = RX(t1, t2),

which is time invariant. Relabeling RX(t1, t2) = RX(τ) and taking the Fourier transform,

    SXY(f) = SX(f).

Let {X(t) : t ∈ ℝ} be a WSS process input to an LTI system with impulse response h(t)

and transfer function H(f). If the system is stable, i.e.,

    ∫_{−∞}^{∞} |h(t)| dt < ∞

(so that, in particular, |H(0)| < ∞), then the input X(t) and the output Y(t) are jointly WSS with the following properties:

1. E[Y(t)] = H(0) E[X(t)].

2. RYX(τ) = h(τ) ∗ RX(τ).

3. RY(τ) = h(τ) ∗ RX(τ) ∗ h(−τ).

4. SYX(f) = H(f) SX(f).

5. SY(f) = |H(f)|² SX(f).

These properties are summarized in Figure 9.6.
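These properties hold verbatim in discrete time with sums in place of integrals. A quick simulation check of property 3 (Python; the two-tap filter h = (1, 1) and unit-variance white noise input are arbitrary choices). For this h, property 3 gives RY(0) = 2 and RY(±1) = 1:

```python
import numpy as np

rng = np.random.default_rng(2)
h = np.array([1.0, 1.0])                  # FIR impulse response (example choice)
x = rng.standard_normal(500_000)          # white noise input: R_X(n) = delta(n)
y = np.convolve(x, h, mode="valid")       # Y_n = sum_s h(s) X_{n-s}

r0 = np.mean(y * y)                       # estimate of R_Y(0)
r1 = np.mean(y[1:] * y[:-1])              # estimate of R_Y(1)
print(r0, r1)                             # close to 2 and 1 = (h * R_X * h(-))(0), (1)
```
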

Figure 9.6. Properties of the output of an LTI system driven by a WSS process input.

To show the first property, consider

    E[Y(t)] = E[∫_{−∞}^{∞} X(τ) h(t − τ) dτ]
            = ∫_{−∞}^{∞} E[X(τ)] h(t − τ) dτ
            = E[X(t)] ∫_{−∞}^{∞} h(t − τ) dτ
            = E[X(t)] H(0).

For the second property,

    RYX(τ) = E[Y(t + τ)X(t)]
           = E[∫_{−∞}^{∞} h(s) X(t + τ − s) X(t) ds]
           = ∫_{−∞}^{∞} h(s) RX(τ − s) ds
           = h(τ) ∗ RX(τ).

For the third property, consider

    RY(τ) = E[Y(t + τ)Y(t)]
          = E[Y(t + τ) ∫_{−∞}^{∞} h(s) X(t − s) ds]
          = ∫_{−∞}^{∞} h(s) RYX(τ + s) ds
          = RYX(τ) ∗ h(−τ).

In the above, we have shown that E[Y(t + τ)X(t)] and E[Y(t + τ)Y(t)] are functions of τ (not of t), confirming that {X(t)} and {Y(t)} are jointly WSS. The fourth and fifth properties follow by taking the Fourier transforms of RYX(τ) and RY(τ), respectively.


We can use the above properties to prove that SX(f) is indeed the power spectral density of X(t), that is, SX(f) df is the average power contained in the frequency band [f, f + df]. Consider the ideal band-pass filter shown in Figure 9.7 with output

    Y(t) = h(t) ∗ X(t)

driven by a WSS process {X(t)}. Then the average power of X(t) in the band [f1, f2] is

    E[Y²(t)] = ∫_{−∞}^{∞} SY(f) df
             = ∫_{−∞}^{∞} |H(f)|² SX(f) df
             = ∫_{−f2}^{−f1} SX(f) df + ∫_{f1}^{f2} SX(f) df
             = 2 ∫_{f1}^{f2} SX(f) df.

This argument also shows that SX(f) ≥ 0 for all f.

Figure 9.7. The transfer function of an ideal band-pass filter over [f1, f2].

Example 9.13 (kT/C noise). The noise in a resistor R (in ohms) due to thermal agitation is modeled as a WGN voltage source V(t) in series with R; see Figure 9.8. The power spectral density of V(t) is SV(f) = 2kTR V²/Hz for all f, where k is the Boltzmann constant and T is the temperature in degrees K. We find the average output noise power for the RC circuit shown in Figure 9.9. First, note that the transfer function of the circuit is

    H(f) = 1 / (1 + i2πfRC),

which implies that

    |H(f)|² = 1 / (1 + (2πfRC)²).


Figure 9.8. A model of the thermal noise in a resistor.

Figure 9.9. An RC circuit driven by the thermal noise.

Now we write the output power spectral density in terms of the input power spectral density as

    SVo(f) = SV(f) |H(f)|² = 2kTR / (1 + (2πfRC)²).

Thus the average output power is

    E[Vo²(t)] = ∫_{−∞}^{∞} SVo(f) df
              = (2kTR / 2πRC) ∫_{−∞}^{∞} 1 / (1 + (2πfRC)²) d(2πfRC)
              = (kT/πC) ∫_{−∞}^{∞} 1 / (1 + x²) dx
              = (kT/πC) [arctan x] from −∞ to +∞
              = kT/C,

which is independent of R.
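The independence of R is easy to check numerically (Python; the component values are arbitrary choices, and the frequency grid is scaled to the corner frequency 1/(2πRC) of each R):

```python
import numpy as np

k = 1.380649e-23          # Boltzmann constant (J/K)
T = 300.0                 # temperature (K)
C = 1e-12                 # capacitance: 1 pF

powers = []
for R in (1e3, 1e6):      # the answer should not depend on R
    fc = 1.0 / (2 * np.pi * R * C)                 # corner frequency of the RC filter
    f = np.linspace(-1e5 * fc, 1e5 * fc, 2_000_001)
    S = 2 * k * T * R / (1 + (f / fc) ** 2)        # S_Vo(f)
    powers.append(np.sum(S) * (f[1] - f[0]))       # Riemann sum of the psd
print(powers, k * T / C)                           # all close to kT/C
```
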


9.6 LINEAR ESTIMATION OF RANDOM PROCESSES

Let {X(t)} and {Y(t)} be zero-mean jointly WSS processes with known autocorrelation and crosscorrelation functions RX(τ), RY(τ), and RXY(τ). We observe the random process Y(s) for t − a ≤ s ≤ t + b (where −a ≤ b) and wish to find the linear MMSE estimate of the signal X(t), i.e., the estimate of the form

    X̂(t) = ∫_{−b}^{a} h(s) Y(t − s) ds

that minimizes the mean square error

    E[(X(t) − X̂(t))²].

By the orthogonality principle, the linear MMSE estimate must satisfy

    (X(t) − X̂(t)) ⊥ Y(t − s),   −b ≤ s ≤ a,

or equivalently,

    E[(X(t) − X̂(t)) Y(t − s)] = 0,   −b ≤ s ≤ a.

Thus the optimal estimation filter h(s) must satisfy

    RXY(τ) = E[X(t) Y(t − τ)]
           = E[X̂(t) Y(t − τ)]
           = E[∫_{−b}^{a} h(s) Y(t − s) Y(t − τ) ds]
           = ∫_{−b}^{a} h(s) RY(τ − s) ds

for −b ≤ τ ≤ a. To find h(s), we need to solve an infinite set of integral equations. Solving

these equations analytically is not possible in general. However, it can be done for two important special cases:

• Infinite smoothing: a, b → ∞.

• Filtering: a → ∞ and b = 0.

We discuss only the first case. The second case leads to the Wiener–Hopf equations from which the famous Wiener filter is derived.

When a, b → ∞, the integral equations for the linear MMSE estimate become

    RXY(τ) = ∫_{−∞}^{∞} h(s) RY(τ − s) ds,   −∞ < τ < ∞.

In other words,

    RXY(τ) = h(τ) ∗ RY(τ).


By taking Fourier transforms, we have

    SXY(f) = H(f) SY(f),

which implies that the optimal infinite smoothing filter is

    H(f) = SXY(f) / SY(f).

Observe the similarity between the optimal filter and the LMMSE estimate

    X̂ = Σ_YX^T Σ_Y^{−1} (Y − E[Y]) + E[X]

for the vector case discussed earlier.

The LMMSE filter achieves the MSE

    E[(X(t) − X̂(t))²] = E[(X(t) − X̂(t)) X(t)] − E[(X(t) − X̂(t)) X̂(t)]
          (a)        = E[(X(t) − X̂(t)) X(t)]
                     = E[X²(t)] − E[X̂(t) X(t)],

where (a) follows by orthogonality. To evaluate the second term, consider

    RXX̂(τ) = E[X(t + τ) X̂(t)]
            = E[X(t + τ) ∫_{−∞}^{∞} h(s) Y(t − s) ds]
            = ∫_{−∞}^{∞} h(s) RXY(τ + s) ds
            = RXY(τ) ∗ h(−τ),

which implies that

    SXX̂(f) = H(−f) SXY(f).

Therefore,

    E[X̂(t) X(t)] = RXX̂(0)
                 = ∫_{−∞}^{∞} SXX̂(f) df
                 = ∫_{−∞}^{∞} H(−f) SXY(f) df
                 = ∫_{−∞}^{∞} |SXY(f)|² / SY(f) df


and the minimum MSE is

    E[(X(t) − X̂(t))²] = E[X²(t)] − E[X̂(t) X(t)]
                      = ∫_{−∞}^{∞} SX(f) df − ∫_{−∞}^{∞} |SXY(f)|² / SY(f) df
                      = ∫_{−∞}^{∞} [SX(f) − |SXY(f)|² / SY(f)] df.

This MSE can be compared to

    σX² − Σ_YX^T Σ_Y^{−1} Σ_YX

for the vector case.

Example 9.14 (Additive white noise channel). Let {X(t)} and {Z(t)} be zero-mean uncorrelated WSS processes with

    SX(f) = P/2 for |f| ≤ B, and 0 otherwise,

and

    SZ(f) = N/2 for all f,

as shown in Figure 9.10. In other words, the signal {X(t)} is band-limited white noise and the noise {Z(t)} is white. We find the optimal infinite smoothing filter for estimating X(t) given

    Y(τ) = X(τ) + Z(τ),   −∞ < τ < ∞.

The transfer function of the optimal filter is

    H(f) = SXY(f) / SY(f)
         = SX(f) / (SX(f) + SZ(f))
         = P/(P + N) for |f| ≤ B, and 0 otherwise,

as shown in Figure 9.11. The MSE of the optimal filter is

    ∫_{−∞}^{∞} [SX(f) − |SXY(f)|² / SY(f)] df = ∫_{−B}^{B} [P/2 − (P/2)² / (P/2 + N/2)] df
                                              = PB − [(P²/4) / ((P + N)/2)] · 2B
                                              = NPB / (N + P).


Figure 9.10. Power spectral densities SX(f) and SZ(f).

Figure 9.11. The transfer function H(f) of the optimal infinite smoothing filter.

PROBLEMS

9.1. Moving average process. Let Z0, Z1, Z2, . . . be i.i.d. N(0, 1).

(a) Let Xn = (1/2)Zn−1 + Zn for n ≥ 1. Find the mean and autocorrelation function of Xn.

(b) Is {Xn} wide-sense stationary?

(c) Is {Xn} Gaussian?

(d) Is {Xn} strict-sense stationary?

(e) Find E(X3 | X1, X2).

(f) Find E(X3 | X2).

(g) Is {Xn} Markov?

(h) Does {Xn} have independent increments?

(i) Let Yn = Zn−1 + Zn for n ≥ 1. Find the mean and autocorrelation functions of {Yn}.


(j) Is {Yn} wide-sense stationary?

(k) Is {Yn} Gaussian?

(l) Is {Yn} strict-sense stationary?

(m) Find E(Y3 | Y1, Y2).

(n) Find E(Y3 | Y2).

(o) Is {Yn} Markov?

(p) Does {Yn} have independent increments?

9.2. Random binary modulation. Let {Xn} be a zero-mean wide-sense stationary random process with autocorrelation function RX(n), and let Z1, Z2, . . . be i.i.d. Bern(p) random variables, i.e.,

    Zi = 1 with probability p, and 0 with probability 1 − p.

Let

    Yn = Xn · Zn,   n = 1, 2, . . . .

(a) Find the mean and the autocorrelation function of {Yn} in terms of RX(n) and p.

(b) Is {Yn} jointly wide-sense stationary with {Xn}?

9.3. Random binary waveform. Let {N(t)}, t ≥ 0, be a Poisson process with rate λ, and let Z be independent of {N(t)} with P(Z = 1) = P(Z = −1) = 1/2. Define

    X(t) = Z · (−1)^N(t),   t ≥ 0.

(a) Find the mean and autocorrelation function of {X(t)}.

(b) Is {X(t)} wide-sense stationary?

(c) Find the first-order pmf pX(t)(x) = P(X(t) = x).

(d) Find the second-order pmf pX(t1),X(t2)(x1, x2) = P(X(t1) = x1, X(t2) = x2).
(Hint: ∑_{k even} x^k/k! = (e^x + e^{−x})/2 and ∑_{k odd} x^k/k! = (e^x − e^{−x})/2.)

9.4. QAM random process. Consider the random process

    X(t) = Z1 cos ωt + Z2 sin ωt,   −∞ < t < ∞,

where Z1 and Z2 are i.i.d. discrete random variables such that pZi(+1) = pZi(−1) = 1/2.

(a) Is X(t) wide-sense stationary? Justify your answer.

(b) Is X(t) strict-sense stationary? Justify your answer.


9.5. Mixture of two WSS processes. Let X(t) and Y(t) be two zero-mean WSS processes with autocorrelation functions RX(τ) and RY(τ), respectively. Define the process

    Z(t) = X(t) with probability 1/2, and Y(t) with probability 1/2.

Find the mean and autocorrelation functions for Z(t). Is Z(t) a WSS process?

9.6. Stationary Gauss–Markov process. Let

    X0 ∼ N(0, a),
    Xn = (1/2)Xn−1 + Zn,   n ≥ 1,

where Z1, Z2, Z3, . . . are i.i.d. N(0, 1), independent of X0.

(a) Find a such that Xn is stationary. Find the mean and autocorrelation functions of Xn.

(b) (Difficult.) Consider the sample mean Sn = (1/n) ∑_{i=1}^{n} Xi, n ≥ 1. Show that Sn converges to the process mean in probability even though the sequence Xn is not i.i.d. (A stationary process for which the sample mean converges to the process mean is called mean ergodic.)

9.7. AM modulation. Consider the AM modulated random process

    X(t) = A(t) cos(2πt + Θ),

where the amplitude A(t) is a zero-mean WSS process with autocorrelation function RA(τ) = e−|τ|/2, the phase Θ is a Unif[0, 2π) random variable, and A(t) and Θ are independent. Is X(t) a WSS process?

9.8. Random-delay mixture. Let {X(t)}, −∞ < t < ∞, be a zero-mean wide-sense stationary process with autocorrelation function RX(τ) = e−|τ|. Let

    Y(t) = X(t − U),

where U is a random delay, independent of {X(t)}. Suppose that U ∼ Bern(1/2), that is,

    Y(t) = X(t) with probability 1/2, and X(t − 1) with probability 1/2.

(a) Find the mean and autocorrelation functions of {Y(t)}.

(b) Is {Y(t)} wide-sense stationary? Justify your answer.

(c) Find the average power E(Y(t)²) of {Y(t)}.

(d) Now suppose that U ∼ Exp(1), i.e., fU(u) = e−u, u ≥ 0. Find the mean and autocorrelation function of {Y(t)}.


9.9. Linear estimation. Let X(t) be a zero-mean WSS process with autocorrelation function RX(τ) = e−|τ|. Let Y(t) be another zero-mean process which is jointly WSS with X(t), with crosscorrelation function RXY(τ) = ∫_{0}^{1} RX(s − τ) ds.

(a) Find the linear MMSE estimate of Y(t) given X(t). Leave your answer in terms of e.

(b) Find the linear MMSE estimate of Y(t) given X(t) and X(t + 1).

9.10. LTI system with WSS process input. Let Y(t) = h(t) ∗ X(t) and Z(t) = X(t) − Y(t), as shown in Figure 9.12.

(a) Find SZ(f).

(b) Find E(Z²(t)).

Your answers should be in terms of SX(f) and the transfer function H(f) = F[h(t)].

Figure 9.12. LTI system.

9.11. Echo filtering. A signal X(t) and its echo arrive at the receiver as Y(t) = X(t) + X(t − Δ) + Z(t). Here the signal X(t) is a zero-mean WSS process with power spectral density SX(f), and the noise Z(t) is a zero-mean WSS process with power spectral density SZ(f) = N0/2, uncorrelated with X(t).

(a) Find SY(f) in terms of SX(f), Δ, and N0.

(b) Find the best linear filter to estimate X(t) from {Y(s)}, −∞ < s < ∞.

9.12. Discrete-time LTI system with white noise input. Let {Xn : −∞ < n < ∞} be a discrete-time white noise process, i.e., E(Xn) = 0 for −∞ < n < ∞, and

    RX(n) = 1 for n = 0, and 0 otherwise.

The process is filtered using a linear time invariant system with impulse response

    h(n) = α for n = 0, β for n = 1, and 0 otherwise.


Find α and β such that the output process Yn has

    RY(n) = 2 for n = 0, 1 for |n| = 1, and 0 otherwise.

9.13. Finding time of flight. Finding the distance to an object is often done by sending a signal and measuring the time of flight, the time it takes for the signal to return (assuming the speed of the signal, e.g., light, is known). Let X(t) be the signal sent and Y(t) = X(t − δ) + Z(t) be the signal received, where δ is the unknown time of flight. Assume that X(t) and Z(t) (the sensor noise) are uncorrelated zero-mean WSS processes. The estimated crosscorrelation function RYX(t) of Y(t) and X(t) is shown in Figure 9.13. Find the time of flight δ.

Figure 9.13. Crosscorrelation function RYX(t) (a peak of height 3 at t = 5, supported on [2, 8]).

9.14. Finding impulse response of LTI system. To find the impulse response h(t) of an LTI system (e.g., a concert hall), i.e., to identify the system, white noise X(t), −∞ < t < ∞, is applied to its input and the output Y(t) is measured. Given the input and output sample functions, the crosscorrelation RYX(τ) is estimated. Show how RYX(τ) can be used to find h(t).

9.15. Generating a random process with a prescribed power spectral density. Let S(f) ≥ 0, for −∞ < f < ∞, be a real and even function such that

    ∫_{−∞}^{∞} S(f) df = 1.

Define the random process

    X(t) = cos(2πFt + Θ),

where F ∼ S(f) and Θ ∼ Unif[−π, π) are independent. Find the power spectral density of X(t). Interpret the result.

9.16. Integration. Let X(t) be a zero-mean WSS process with autocorrelation function RX(τ) = e−|τ|. Let

    Y(t) = ∫_{t}^{t+1} X(s) ds.

(a) Is Y(t) WSS?

(b) Is (X(t), Y(t)) jointly WSS?

(c) Find the linear MMSE estimate of Y(t) given X(t). Leave your answer in terms of e.

(d) Find the linear MMSE estimate of Y(t) given X(t) and X(t + 1).

9.17. Integrators. Let Y(t) be a short-term integration of a WSS process X(t):

    Y(t) = (1/T) ∫_{t−T}^{t} X(u) du.

Find SY(f) in terms of SX(f).

9.18. Derivatives of stochastic processes. Let {X(t)} be a wide-sense stationary random

process with mean zero and autocorrelation function R(τ) = e−|τ|. Recall that a

random process {Y(t)} is continuous in mean square if E[(Y(t + ε) − Y(t))²] → 0 as ε → 0.

(a) Find the mean and the variance of X(t).

(b) Is X(t) continuous in mean square? Justify your answer.

(c) Now let

    Zε(t) = (X(t + ε) − X(t)) / ε

be an ε-approximation of the derivative X′(t). Find the mean and the variance of Zε(t).

(d) Find the linear MMSE estimate of Zε(t) given (X(t), X(t + ε)) and the associated MSE.

(e) Find the linear MMSE estimate of Zε(t) given X(t) and the associated MSE.

(f) Find the limiting mean and variance of Zε(t) as ε → 0.