Estimation Theory Chapter 12

Source: University of Texas at Dallas, aldhahir/6343/Ch12.pdf

Page 1

Estimation Theory

Chapter 12

Page 2

Linear Bayesian Estimators

• Optimal MMSE Bayesian estimators are in general difficult to compute in closed form, except for the jointly Gaussian case. But in many situations we cannot make the Gaussian assumption.

• Instead, we keep the MMSE cost function but constrain the estimator to be linear. In this case an explicit form for the estimator can be determined which depends only on the 1st and 2nd moments of the pdf. This is analogous to the BLUE in classical estimation.

• The resulting linear MMSE (LMMSE) estimator is called the Wiener filter.

Page 3

Linear MMSE Estimation (Scalar Case)

Problem: Estimate a scalar random parameter θ based on X = [X(0) X(1) ... X(N-1)]^T by considering the class of linear (affine) estimators

$$\hat{\theta} = \sum_{n=0}^{N-1} a_n X(n) + a_N$$

and minimizing

$$\mathrm{Bmse}(\hat{\theta}) = E\big[(\theta - \hat{\theta})^2\big],$$

where the expectation is with respect to the joint pdf p(X, θ).

Notes:
1) a_N allows for the case of non-zero-mean X and θ.
2) The LMMSE estimator is suboptimal unless the optimal MMSE estimator E(θ|X) happens to be linear, as in the case of the linear model X = Hθ + W.
3) LMMSE relies on statistical dependence (correlation) between θ and X.

Page 4

Determining the Linear MMSE Estimator

1) Calculate the optimum a_N:

$$\frac{\partial}{\partial a_N}\,E\Big[\Big(\theta - \sum_{n=0}^{N-1} a_n X(n) - a_N\Big)^2\Big] = -2\,E\Big[\theta - \sum_{n=0}^{N-1} a_n X(n) - a_N\Big] = 0$$

$$\Rightarrow\; a_N = E(\theta) - \sum_{n=0}^{N-1} a_n E\big(X(n)\big),$$

which will be zero if the means are zero.

2) Calculating a_0, a_1, ..., a_{N-1}: substituting a_N gives

$$\mathrm{Bmse}(\hat{\theta}) = E\Big[\Big(\sum_{n=0}^{N-1} a_n\big(X(n) - E(X(n))\big) - \big(\theta - E(\theta)\big)\Big)^2\Big]
= \mathbf{a}^T C_{XX}\mathbf{a} - \mathbf{a}^T C_{X\theta} - C_{\theta X}\mathbf{a} + C_{\theta\theta},$$

where $\mathbf{a} = [a_0\; a_1\; \cdots\; a_{N-1}]^T$ and $\hat{\theta} = \mathbf{a}^T X + a_N$.

Page 5

Determining the Linear MMSE Estimator (cont'd)

Setting the gradient to zero:

$$\frac{\partial\,\mathrm{Bmse}(\hat{\theta})}{\partial \mathbf{a}} = 2 C_{XX}\mathbf{a} - 2 C_{X\theta} = 0 \;\Rightarrow\; \mathbf{a} = C_{XX}^{-1} C_{X\theta}$$

$$\hat{\theta} = \mathbf{a}^T X + a_N = E(\theta) + C_{\theta X} C_{XX}^{-1}\big(X - E(X)\big),$$

which is identical to the MMSE estimator for the jointly Gaussian case (since the latter happens to be linear!).

For the zero-mean case: $\hat{\theta} = C_{\theta X} C_{XX}^{-1} X$.

The minimum Bayesian MSE is

$$\mathrm{Bmse}(\hat{\theta}) = C_{\theta\theta} - C_{\theta X} C_{XX}^{-1} C_{X\theta}.$$
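As a quick numerical sanity check of the closed form above, the following sketch (not part of the original slides; the uniform prior and unit-variance noise are illustrative assumptions) builds the scalar LMMSE estimator from empirical first and second moments and compares its empirical Bmse with $C_{\theta\theta} - C_{\theta X}C_{XX}^{-1}C_{X\theta}$.

```python
import numpy as np

# Minimal sketch: scalar LMMSE estimator built from 1st and 2nd moments only,
# theta_hat = E(theta) + C_thetaX Cxx^{-1} (x - E(x)).
rng = np.random.default_rng(0)
N, trials = 10, 100_000

theta = rng.uniform(-1.0, 1.0, trials)          # non-Gaussian prior (assumed uniform)
W = rng.normal(0.0, 1.0, (trials, N))           # WGN, unit variance (assumption)
X = theta[:, None] + W                          # X(n) = theta + W(n)

# Moments (estimated empirically here; in the slides they are assumed known)
mx, mtheta = X.mean(axis=0), theta.mean()
Cxx = np.cov(X, rowvar=False)                   # N x N covariance of X
Cxtheta = ((X - mx) * (theta - mtheta)[:, None]).mean(axis=0)

a = np.linalg.solve(Cxx, Cxtheta)               # a = Cxx^{-1} C_Xtheta
theta_hat = mtheta + (X - mx) @ a               # LMMSE estimate for every trial

bmse = np.mean((theta - theta_hat) ** 2)
bmse_theory = np.var(theta) - Cxtheta @ a       # C_thetatheta - C_thetaX Cxx^{-1} C_Xtheta
print(bmse, bmse_theory)                        # the two agree closely
```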

Page 6

Example

For a DC level in WGN with a uniform prior pdf, we found that the MMSE estimator cannot be determined in closed form due to the required integration.

$$X = A\mathbf{1} + W$$

Applying the (suboptimal) LMMSE estimator with E(A) = 0:

$$E(X) = \mathbf{0}, \qquad C_{XX} = E(XX^T) = \sigma_A^2\,\mathbf{1}\mathbf{1}^T + \sigma^2 I, \qquad C_{AX} = E(A X^T) = \sigma_A^2\,\mathbf{1}^T$$

$$\hat{A} = C_{AX} C_{XX}^{-1} X = \sigma_A^2\,\mathbf{1}^T\big(\sigma_A^2\,\mathbf{1}\mathbf{1}^T + \sigma^2 I\big)^{-1} X$$

Using the matrix inversion lemma (MIL):

$$\big(\sigma_A^2\,\mathbf{1}\mathbf{1}^T + \sigma^2 I\big)^{-1} = \frac{1}{\sigma^2}\Big(I - \frac{\sigma_A^2\,\mathbf{1}\mathbf{1}^T}{\sigma^2 + N\sigma_A^2}\Big)$$

Page 7

Example (cont'd)

We get

$$\hat{A} = \frac{\sigma_A^2}{\sigma_A^2 + \sigma^2/N}\,\bar{x}, \qquad \text{where } \sigma_A^2 = \frac{(2A_0)^2}{12} = \frac{A_0^2}{3},$$

and $\hat{A} \to \bar{x}$ as $N \to \infty$.

The closed-form solution depends only on knowledge of the means E(A), E(X) and the second-order moments of A and W(n), not on the pdf.
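A short sketch (illustrative values, not from the slides) checks this result: for small N the LMMSE shrinkage toward the prior mean lowers the Bmse relative to the sample mean, while for large N both estimators behave the same.

```python
import numpy as np

# DC level A ~ U[-A0, A0] in WGN: compare A_hat = sigma_A^2/(sigma_A^2 + sigma^2/N) * xbar
# with the sample mean xbar for a few record lengths N (A0 and sigma^2 are assumptions).
rng = np.random.default_rng(1)
A0, sigma2 = 1.0, 1.0
sigma_A2 = A0**2 / 3.0                    # variance of the uniform prior

for N in (1, 10, 100):
    A = rng.uniform(-A0, A0, 50_000)
    X = A[:, None] + rng.normal(0.0, np.sqrt(sigma2), (50_000, N))
    xbar = X.mean(axis=1)
    A_lmmse = sigma_A2 / (sigma_A2 + sigma2 / N) * xbar
    print(N,
          np.mean((A - A_lmmse) ** 2),    # Bmse of the LMMSE estimator
          np.mean((A - xbar) ** 2))       # Bmse of the sample mean
```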

Page 8

Geometrical Interpretation

• Similar to the geometrical interpretation of LLSE, except that the vector θ is now random, and all vectors are assumed zero-mean so that Cov(X,Y) = E(XY) - E(X)E(Y) = E(XY); hence orthogonality and uncorrelatedness become equivalent.

$$\mathrm{MSE} = E\big[(\theta - \hat{\theta})^2\big] = \Big\|\theta - \sum_{n=0}^{N-1} a_n X(n)\Big\|^2,$$

where we define the squared length (norm) of a random vector as

$$\|X\|^2 = \langle X, X\rangle = E(X^2), \qquad \text{inner product: } (X, Y) = E(XY).$$

• Two vectors are orthogonal iff (X, Y) = E(XY) = 0.
• Geometrically, the norm of the error vector is minimized when the error ε is orthogonal to the span of (X(0), X(1), ..., X(N-1)).

Page 9

Geometrical Interpretation (cont'd)

Orthogonality principle:

$$E\big[(\theta - \hat{\theta})X(n)\big] = 0 \quad \text{for } n = 0, 1, \ldots, N-1$$

$$\Rightarrow\; E\Big[\Big(\theta - \sum_{m=0}^{N-1} a_m X(m)\Big)X(n)\Big] = 0, \quad n = 0, 1, \ldots, N-1$$

$$\Rightarrow\; \sum_{m=0}^{N-1} a_m E\big(X(m)X(n)\big) = E\big(\theta X(n)\big), \quad n = 0, 1, \ldots, N-1$$

In matrix form,

$$\begin{bmatrix} E[X^2(0)] & E[X(0)X(1)] & \cdots & E[X(0)X(N-1)] \\ E[X(1)X(0)] & E[X^2(1)] & \cdots & E[X(1)X(N-1)] \\ \vdots & \vdots & \ddots & \vdots \\ E[X(N-1)X(0)] & E[X(N-1)X(1)] & \cdots & E[X^2(N-1)] \end{bmatrix}
\begin{bmatrix} a_0 \\ a_1 \\ \vdots \\ a_{N-1} \end{bmatrix}
=
\begin{bmatrix} E[\theta X(0)] \\ E[\theta X(1)] \\ \vdots \\ E[\theta X(N-1)] \end{bmatrix}$$

i.e., $C_{XX}\,\mathbf{a} = C_{X\theta}$.

This is the famous Normal Equation!

Page 10

Normal Equations

Solving gives $\mathbf{a} = C_{XX}^{-1} C_{X\theta}$, so for the zero-mean case

$$\hat{\theta} = \mathbf{a}^T X = C_{\theta X} C_{XX}^{-1} X.$$

The minimum Bayesian MSE follows by using the orthogonality of the error to the data (the cross term vanishes):

$$\mathrm{Bmse}(\hat{\theta}) = E\Big[\Big(\theta - \sum_n a_n X(n)\Big)\Big(\theta - \sum_m a_m X(m)\Big)\Big]
= E\Big[\Big(\theta - \sum_n a_n X(n)\Big)\theta\Big]$$

$$= C_{\theta\theta} - \sum_n a_n E\big(\theta X(n)\big) = C_{\theta\theta} - \mathbf{a}^T C_{X\theta} = C_{\theta\theta} - C_{\theta X} C_{XX}^{-1} C_{X\theta}.$$

Page 11

Vector LMMSE

For a vector parameter θ,

$$\hat{\theta} = E(\theta) + C_{\theta X} C_{XX}^{-1}\big(X - E(X)\big)$$

with error covariance matrix

$$C_{\hat{\epsilon}} = E\big[(\theta - \hat{\theta})(\theta - \hat{\theta})^T\big] = C_{\theta\theta} - C_{\theta X} C_{XX}^{-1} C_{X\theta},$$

and $\mathrm{Bmse}(\hat{\theta}_i) = \big[C_{\hat{\epsilon}}\big]_{ii}$.

For the zero-mean case, using the orthogonality principle:

$$E\big[(\theta - AX)X^T\big] = 0 \;\Rightarrow\; A\,C_{XX} = C_{\theta X} \;\Rightarrow\; A = C_{\theta X} C_{XX}^{-1}, \qquad \hat{\theta} = AX = C_{\theta X} C_{XX}^{-1} X.$$

Page 12

Properties of LMMSE

1) If α = Aθ + b, then α̂ = Aθ̂ + b: the LMMSE estimator commutes over affine transformations (like MMSE & MAP). (The proof is given as a review problem later.)

2) If α = θ₁ + θ₂, then α̂ = θ̂₁ + θ̂₂, where

$$\hat{\theta}_1 = E(\theta_1) + C_{\theta_1 X} C_{XX}^{-1}\big(X - E(X)\big), \qquad \hat{\theta}_2 = E(\theta_2) + C_{\theta_2 X} C_{XX}^{-1}\big(X - E(X)\big).$$

Page 13

Bayesian Gauss-Markov Theorem

For the Bayesian linear model X = Hθ + W (no Gaussian assumption):

$$\hat{\theta}_{LMMSE} = E(\theta) + C_{\theta\theta}H^T\big(H C_{\theta\theta}H^T + C_W\big)^{-1}\big(X - H E(\theta)\big)
= E(\theta) + \big(C_{\theta\theta}^{-1} + H^T C_W^{-1} H\big)^{-1} H^T C_W^{-1}\big(X - H E(\theta)\big)$$

Define the error ε = θ - θ̂; its covariance is

$$C_{\epsilon} = \big(C_{\theta\theta}^{-1} + H^T C_W^{-1} H\big)^{-1} = C_{\theta\theta} - C_{\theta\theta}H^T\big(H C_{\theta\theta}H^T + C_W\big)^{-1} H C_{\theta\theta}.$$

The results are identical to those for the Bayesian linear model in Ch. 11, except without the Gaussian assumption. This estimator will not be optimum unless the conditional expectation E(θ|X) happens to be linear (as in the jointly Gaussian case).

Advantage of LMMSE: a closed form that depends only on the means and covariances; very popular in practice.
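A compact sketch of the theorem (illustrative dimensions, prior, and noise covariance assumed below) shows both the estimator and its error covariance in a few lines.

```python
import numpy as np

# Bayesian Gauss-Markov estimator for X = H theta + W with known prior mean/covariance
# and known noise covariance (all values below are illustrative assumptions).
rng = np.random.default_rng(2)
N, p = 20, 3
H = rng.normal(size=(N, p))
C_theta = np.diag([2.0, 1.0, 0.5])      # prior covariance (assumed)
mu_theta = np.zeros(p)                  # prior mean (assumed)
C_w = 0.5 * np.eye(N)                   # noise covariance (assumed)

theta = rng.multivariate_normal(mu_theta, C_theta)
x = H @ theta + rng.multivariate_normal(np.zeros(N), C_w)

# theta_hat = E(theta) + C_theta H^T (H C_theta H^T + C_w)^{-1} (x - H E(theta))
K = C_theta @ H.T @ np.linalg.inv(H @ C_theta @ H.T + C_w)
theta_hat = mu_theta + K @ (x - H @ mu_theta)

# Error covariance C_eps = (C_theta^{-1} + H^T C_w^{-1} H)^{-1}
C_eps = np.linalg.inv(np.linalg.inv(C_theta) + H.T @ np.linalg.inv(C_w) @ H)
print(theta, theta_hat, np.diag(C_eps))
```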

Page 14

BLUE vs. LLSE vs. LMMSE

• BLUE (Ch. 6): classical estimator (deterministic parameter), unbiased, only the noise assumed random.

$$\hat{\theta}_{BLUE} = \big(H^T C_w^{-1} H\big)^{-1} H^T C_w^{-1} X$$

• LLSE (Ch. 8): no statistical assumptions, only the linear model assumption.

$$J_{LLSE}(\theta) = \big\|W^{1/2}(X - H\theta)\big\|^2; \qquad \hat{\theta}_{LLSE} = \big(H^T W H\big)^{-1} H^T W X$$

• LMMSE (Ch. 12): Bayesian estimator, random parameters; converges to the BLUE with no a priori information.

$$\hat{\theta} = E(\theta) + C_{\theta X} C_{XX}^{-1}\big(X - E(X)\big);$$

for the linear model this equals

$$E(\theta) + \big(C_{\theta\theta}^{-1} + H^T C_W^{-1} H\big)^{-1} H^T C_W^{-1}\big(X - H E(\theta)\big).$$
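The "converges to BLUE with no a priori information" statement can be seen numerically: as the prior covariance grows, the LMMSE estimate approaches the BLUE. The sketch below uses illustrative H, noise covariance, and true parameter values.

```python
import numpy as np

# As the prior covariance grows (no a priori information), the LMMSE estimator
# for X = H theta + W approaches the BLUE (illustrative assumptions throughout).
rng = np.random.default_rng(6)
N, p = 30, 2
H = rng.normal(size=(N, p))
C_w = np.diag(rng.uniform(0.5, 2.0, N))            # known noise covariance (assumed)
x = H @ np.array([1.0, -2.0]) + rng.multivariate_normal(np.zeros(N), C_w)

Cw_inv = np.linalg.inv(C_w)
blue = np.linalg.solve(H.T @ Cw_inv @ H, H.T @ Cw_inv @ x)

for prior_var in (1.0, 100.0, 1e6):                # C_theta = prior_var * I, zero prior mean
    C_theta_inv = np.eye(p) / prior_var
    lmmse = np.linalg.solve(C_theta_inv + H.T @ Cw_inv @ H, H.T @ Cw_inv @ x)
    print(prior_var, lmmse, blue)                  # lmmse -> blue as prior_var grows
```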

Page 15

Wiener Filtering

Most important application of LMMSE. Assumptions: the data is WSS with zero mean ⇒ C_XX = R_XX is symmetric Toeplitz.

1) Filtering: S(n) is to be estimated based on X(m) = S(m) + W(m) for m = 0, 1, ..., n, i.e., based on present and past data only. This is the application of a causal filter to the data.

With θ = S(n) and X = [X(0) ... X(n)]^T,

$$\hat{S}(n) = C_{\theta X} C_{XX}^{-1} X = \mathbf{a}^T X, \qquad \Rightarrow\; R_{XX}\,\mathbf{a} = \mathbf{r}'_{SS},$$

where

$$R_{XX} = E(XX^T) = R_{SS} + R_{WW}, \qquad \mathbf{r}'_{SS} = E\big(S(n)X\big) = \big[r_{SS}(n)\; r_{SS}(n-1)\; \cdots\; r_{SS}(0)\big]^T,$$

so

$$\hat{S}(n) = \mathbf{r}'^{\,T}_{SS}\big(R_{SS} + R_{WW}\big)^{-1} X.$$

The prime on $\mathbf{r}'_{SS}$ denotes flipping $\mathbf{r}_{SS}$ into reverse order; note that it is a function of n.

Page 16

FIR Wiener Filter

$$\hat{S}(n) = \sum_{k=0}^{n} a_k^{(n)} X(k) = \sum_{k=0}^{n} h^{(n)}(k)\,X(n-k), \qquad \text{where } h^{(n)}(k) = a_{n-k}^{(n)}$$

⇒ a time-varying FIR filter (h^{(n)} is the time-reversed version of the coefficient vector a^{(n)}).

We have

$$R_{XX}\,h^{(n)} = \mathbf{r}_{ss}$$

(Proof: $(J R_{XX} J)(J\mathbf{a}^{(n)}) = J\mathbf{r}'_{ss}$, where J is the reversal matrix, $J^2 = I$, and $J R_{XX} J = R_{XX}$ for a symmetric Toeplitz matrix), with

$$R_{XX} = \begin{bmatrix} r_{XX}(0) & r_{XX}(1) & \cdots & r_{XX}(n) \\ r_{XX}(1) & r_{XX}(0) & \cdots & r_{XX}(n-1) \\ \vdots & \vdots & \ddots & \vdots \\ r_{XX}(n) & r_{XX}(n-1) & \cdots & r_{XX}(0) \end{bmatrix}, \qquad \mathbf{r}_{ss} = \begin{bmatrix} r_{ss}(0) \\ r_{ss}(1) \\ \vdots \\ r_{ss}(n) \end{bmatrix}.$$

It is possible to derive an order-recursive formula (Levinson algorithm) where the (n+1)th-order filter is computed from the nth-order filter.
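A minimal sketch of the nth-order FIR Wiener filter follows. The AR(1) signal autocorrelation is an illustrative assumption; the Toeplitz solver in SciPy uses the Levinson-style recursion mentioned above.

```python
import numpy as np
from scipy.linalg import solve_toeplitz   # Toeplitz solver (Levinson-Durbin based)

# nth-order FIR Wiener filter h solving R_XX h = r_ss for X(m) = S(m) + W(m),
# with an AR(1) signal ACF and white noise of variance sw2 (both assumptions).
n, rho, sw2 = 9, 0.9, 1.0
lags = np.arange(n + 1)
r_ss = rho ** lags / (1 - rho**2)          # ACF of an AR(1) signal (unit innovation var)
r_xx = r_ss + sw2 * (lags == 0)            # r_XX(l) = r_SS(l) + sw2 * delta(l)

h = solve_toeplitz(r_xx, r_ss)             # exploits the symmetric Toeplitz structure
# Estimate: S_hat(n) = sum_k h[k] * X[n-k], a causal FIR filter of order n
print(h)
```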

Page 17

IIR Wiener Filtering

Asymptotic case (infinite-order filter), n → ∞: estimate S(n) based on the present sample and the infinite past,

$$\hat{S}(n) = \sum_{k=0}^{\infty} h(k)\,X(n-k).$$

Using the orthogonality principle, $(S(n) - \hat{S}(n)) \perp \big(X(n), X(n-1), \ldots\big)$:

$$E\big[(S(n) - \hat{S}(n))\,X(n-l)\big] = 0 \quad \text{for } l = 0, 1, \ldots$$

$$\Rightarrow\; E\big(S(n)X(n-l)\big) = \sum_{k=0}^{\infty} h(k)\,E\big(X(n-k)X(n-l)\big)$$

$$\Rightarrow\; \sum_{k=0}^{\infty} h(k)\,r_{XX}(l-k) = r_{SS}(l) \quad \text{for } l = 0, 1, \ldots \qquad (*)$$

Remark: note that (*) cannot be solved using the Fourier transform since it holds only for l ≥ 0 (the left side is the convolution h(l) * r_XX(l) restricted to l ≥ 0). We'll come back to solve it soon!

When l < 0 is also allowed, we have the smoothing problem, which we'll study next.

Page 18

Two-Sided Wiener Filtering

2) Smoothing (recover the signal from its entire noisy observation): estimate S = [S(0) ... S(N-1)]^T (or a subset of it) based on the entire (past and future) data X(n), n = 0, 1, ..., N-1.

$$\hat{S} = C_{SX} C_{XX}^{-1} X, \qquad \text{where } C_{SX} = E\big[S(S+W)^T\big] = R_{SS} \;\;(N \times N)$$

$$\Rightarrow\; \hat{S} = R_{SS}\big(R_{SS} + R_{WW}\big)^{-1} X.$$

This is the Wiener smoothing filter. The covariance matrix of the error is

$$M = R_{SS} - R_{SS}\big(R_{SS} + R_{WW}\big)^{-1} R_{SS}.$$

Asymptote (N → ∞): Equation (*) now holds for all l, so h(n) * r_{XX}(n) = r_{SS}(n) for all n, and taking the Fourier transform gives

$$H(f) = \frac{P_{SS}(f)}{P_{SS}(f) + P_{WW}(f)} = \frac{\mathrm{SNR}(f)}{\mathrm{SNR}(f) + 1}, \qquad \mathrm{SNR}(f) = \frac{P_{SS}(f)}{P_{WW}(f)}$$

$$0 \le H(f) \le 1: \quad H(f) \to 1 \text{ as } \mathrm{SNR}(f) \to \infty, \qquad H(f) \to 0 \text{ as } \mathrm{SNR}(f) \to 0.$$

This smoothing filter is non-causal.
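The matrix form of the smoother is easy to exercise numerically. The sketch below (illustrative AR(1) signal model and noise level, not from the slides) compares the empirical MSE of $\hat{S} = R_{SS}(R_{SS}+R_{WW})^{-1}X$ with the theoretical value $\mathrm{tr}(M)/N$.

```python
import numpy as np

# Matrix Wiener smoother S_hat = Rss (Rss + Rww)^{-1} X for an AR(1) signal in white noise.
rng = np.random.default_rng(3)
N, rho, sw2 = 200, 0.95, 1.0
lags = np.abs(np.subtract.outer(np.arange(N), np.arange(N)))
Rss = rho ** lags / (1 - rho**2)            # Toeplitz ACF matrix of the signal (assumed)
Rww = sw2 * np.eye(N)

# One realization consistent with Rss, plus white noise
S = np.linalg.cholesky(Rss) @ rng.normal(size=N)
X = S + rng.normal(0.0, np.sqrt(sw2), N)

S_hat = Rss @ np.linalg.solve(Rss + Rww, X)        # Wiener smoothing (non-causal)
M = Rss - Rss @ np.linalg.solve(Rss + Rww, Rss)    # error covariance matrix
print(np.mean((S - S_hat)**2), np.trace(M) / N)    # empirical vs. theoretical MSE
```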

Page 19

Prediction Wiener Filter

3) Prediction (future data from past data points): estimate X(N-1+l) from {X(0), X(1), ..., X(N-1)}. This is an l-step prediction problem.

$$C_{\theta X} = E\big[X(N-1+l)\,[X(0)\; X(1)\; \cdots\; X(N-1)]\big] = \big[r_{XX}(N-1+l)\; r_{XX}(N-2+l)\; \cdots\; r_{XX}(l)\big] = \mathbf{r}'^{\,T}$$

$$\hat{X}(N-1+l) = \mathbf{r}'^{\,T} R_{XX}^{-1} X = \mathbf{a}^T X, \qquad R_{XX}\,\mathbf{a} = \mathbf{r}'$$

$$\begin{bmatrix} r_{XX}(0) & r_{XX}(1) & \cdots & r_{XX}(N-1) \\ r_{XX}(1) & r_{XX}(0) & \cdots & r_{XX}(N-2) \\ \vdots & \vdots & \ddots & \vdots \\ r_{XX}(N-1) & r_{XX}(N-2) & \cdots & r_{XX}(0) \end{bmatrix}
\begin{bmatrix} a_0 \\ a_1 \\ \vdots \\ a_{N-1} \end{bmatrix}
=
\begin{bmatrix} r_{XX}(N-1+l) \\ r_{XX}(N-2+l) \\ \vdots \\ r_{XX}(l) \end{bmatrix}$$

Since $\mathbf{a} = R_{XX}^{-1}\mathbf{r}'$,

$$\hat{X}(N-1+l) = \sum_{k=0}^{N-1} a_k X(k) = \sum_{k=1}^{N} h(k)\,X(N-k), \qquad \text{where } h(k) = a_{N-k},$$

and the minimum MSE is

$$\mathrm{MMSE} = r_{XX}(0) - \mathbf{r}'^{\,T} R_{XX}^{-1}\mathbf{r}'.$$

We can also consider interpolation problems where we estimate missing sample(s) within the data record

Page 20

Filtering Problem

Back to the filtering problem (see SK 12.16):

$$\sum_{k=0}^{\infty} h(k)\,r_{XX}(l-k) = r_{SS}(l), \qquad l \ge 0.$$

Define the one-sided z-transform

$$[X(z)]_+ = \sum_{n=0}^{\infty} x(n)\,z^{-n} \qquad \text{(causal part)}.$$

Since h(n) * r_{XX}(n) - r_{SS}(n) = 0 for n ≥ 0, where h(n) is causal, the causal part must be zero:

$$\big[H(z)P_{XX}(z) - P_{SS}(z)\big]_+ = 0.$$

By the spectral factorization theorem,

$$P_{XX}(z) = B(z)\,B(z^{-1}),$$

with B(z) causal and minimum-phase, and B(z^{-1}) anti-causal and maximum-phase, so

$$\big[H(z)B(z)B(z^{-1}) - P_{SS}(z)\big]_+ = 0.$$

Page 21

Filtering Problem (cont'd)

Let G(z) = H(z)B(z) (a causal function). Then

$$\big[G(z)B(z^{-1}) - P_{SS}(z)\big]_+ = 0, \qquad (*)$$

i.e., $G(z)B(z^{-1}) - P_{SS}(z)$ must be anti-causal.

But $B(z^{-1})$ is the z-transform of an anti-causal sequence (powers of z only); hence (*) is satisfied iff

$$G(z) - \frac{P_{SS}(z)}{B(z^{-1})}$$

is also strictly anti-causal. Now, since $P_{SS}(z)/B(z^{-1})$ is a two-sided sequence, its causal component must be equal to G(z), and hence

$$G(z) = \Big[\frac{P_{SS}(z)}{B(z^{-1})}\Big]_+ \qquad \text{(take the inverse z-transform, keep } n \ge 0\text{, and transform back)}$$

$$\Rightarrow\quad H(z) = \frac{G(z)}{B(z)} = \frac{1}{B(z)}\Big[\frac{P_{SS}(z)}{B(z^{-1})}\Big]_+.$$

Page 22

Causal Wiener Filtering Example

• Suppose $X_n = S_n + W_n$, where $P_{WW}(z) = 1$ and

$$P_{SS}(z) = \frac{0.19}{(1 - 0.9z^{-1})(1 - 0.9z)}.$$

• Find the causal Wiener filter to estimate $S_n$ from $X_n, X_{n-1}, \ldots$

• Solution:

$$P_{XX}(z) = P_{SS}(z) + 1 \;\Rightarrow\; P_{XX}(z) = \frac{1.436\,(1 - 0.627z^{-1})(1 - 0.627z)}{(1 - 0.9z^{-1})(1 - 0.9z)}$$

$$B(z) = \sqrt{1.436}\;\frac{1 - 0.627z^{-1}}{1 - 0.9z^{-1}}.$$

Page 23

Example (Cont'd)

$$G(z) = \Big[\frac{P_{SS}(z)}{B(z^{-1})}\Big]_+
= \Big[\frac{0.19}{(1-0.9z^{-1})(1-0.9z)} \times \frac{1-0.9z}{\sqrt{1.436}\,(1-0.627z)}\Big]_+
= \frac{1}{\sqrt{1.436}}\Big[\frac{0.19}{(1-0.9z^{-1})(1-0.627z)}\Big]_+$$

$$= \frac{1}{\sqrt{1.436}}\Big[\frac{0.436}{1-0.9z^{-1}} + \frac{0.273}{z^{-1}-0.627}\Big]_+
\;\Rightarrow\; G(z) = \frac{0.436}{\sqrt{1.436}\,(1-0.9z^{-1})}$$

(the second term is anti-causal and is discarded).

$$H(z) = \frac{G(z)}{B(z)} = \frac{0.436}{\sqrt{1.436}\,(1-0.9z^{-1})} \times \frac{1-0.9z^{-1}}{\sqrt{1.436}\,(1-0.627z^{-1})} = \frac{0.304}{1-0.627z^{-1}}$$

$$h(k) = 0.304\,(0.627)^k \quad \text{for } k \ge 0.$$
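A quick numerical check of this causal filter (my own sketch; the AR(1) signal generator below is chosen so that its spectrum matches the stated P_SS(z)) applies H(z) = 0.304/(1 - 0.627 z^{-1}) recursively and compares the MSE with that of the raw noisy data.

```python
import numpy as np
from scipy.signal import lfilter

# S_n = 0.9 S_{n-1} + U_n with var(U) = 0.19 gives P_SS(z) = 0.19/((1-0.9z^-1)(1-0.9z));
# W_n is unit-variance white noise, so P_WW(z) = 1 (matching the example above).
rng = np.random.default_rng(4)
n = 200_000
S = lfilter([1.0], [1.0, -0.9], np.sqrt(0.19) * rng.normal(size=n))
X = S + rng.normal(size=n)

# H(z) = 0.304 / (1 - 0.627 z^-1)  <=>  s_hat[n] = 0.627 s_hat[n-1] + 0.304 x[n]
S_hat = lfilter([0.304], [1.0, -0.627], X)

print(np.mean((S - X) ** 2),      # MSE of the raw noisy data (about 1)
      np.mean((S - S_hat) ** 2))  # MSE of the causal Wiener estimate (smaller)
```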

Page 24

Prediction Filter Example

$X_n = A_n + W_n$, $n = 0, 1, \ldots$, is WSS; A is a zero-mean random process with autocorrelation sequence $r_{AA}(l) = \sigma_A^2 (0.1)^{|l|}$; the noise is a zero-mean white random process with variance $\sigma_W^2$. Compute the 2-tap Wiener prediction filter to predict X(2) from X(0) and X(1).

Sol:

$$r_{XX}(l) = E\big((A_n + W_n)(A_{n+l} + W_{n+l})\big) = r_{AA}(l) + r_{WW}(l) = \sigma_A^2 (0.1)^{|l|} + \sigma_W^2\,\delta(l)$$

With N = 2 and l = 1,

$$\begin{bmatrix} r_{XX}(0) & r_{XX}(1) \\ r_{XX}(1) & r_{XX}(0) \end{bmatrix}
\begin{bmatrix} a_0 \\ a_1 \end{bmatrix}
=
\begin{bmatrix} r_{XX}(2) \\ r_{XX}(1) \end{bmatrix}.$$

Page 25

Prediction - Example (cont'd)

$$r_{XX}(0) = \sigma_A^2 + \sigma_W^2, \qquad r_{XX}(1) = 0.1\,\sigma_A^2, \qquad r_{XX}(2) = 0.01\,\sigma_A^2$$

$$\begin{bmatrix} a_0 \\ a_1 \end{bmatrix}
= \begin{bmatrix} \sigma_A^2 + \sigma_W^2 & 0.1\,\sigma_A^2 \\ 0.1\,\sigma_A^2 & \sigma_A^2 + \sigma_W^2 \end{bmatrix}^{-1}
\begin{bmatrix} 0.01\,\sigma_A^2 \\ 0.1\,\sigma_A^2 \end{bmatrix}
= \frac{1}{(\sigma_A^2 + \sigma_W^2)^2 - 0.01\,\sigma_A^4}
\begin{bmatrix} 0.01\,\sigma_A^2\,\sigma_W^2 \\ 0.1\,\sigma_A^2\,(0.99\,\sigma_A^2 + \sigma_W^2) \end{bmatrix},$$

where $(\sigma_A^2 + \sigma_W^2)^2 - 0.01\,\sigma_A^4 = 0.99\,\sigma_A^4 + 2\,\sigma_A^2\sigma_W^2 + \sigma_W^4$, and

$$\hat{X}(2) = \sum_{k=0}^{1} a_k X(k) = a_0 X(0) + a_1 X(1).$$

Page 26

Prediction - Example (cont'd)

$$\mathrm{MMSE} = r_{XX}(0) - \mathbf{r}'^{\,T}\mathbf{a} = r_{XX}(0) - \sigma_A^2\,[0.01\;\;0.1]\,\mathbf{a}$$

Exercise: consider the limiting case as the SNR becomes infinite.

In general: if $X_n = A_n + W_n$, where A and W are independent WSS random processes, then $r_{XX}(l) = r_{AA}(l) + r_{WW}(l)$.
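For a concrete check of the closed-form coefficients and MMSE above, the following sketch plugs in the illustrative values $\sigma_A^2 = \sigma_W^2 = 1$ (these values are my assumption, not the slides') and compares the direct solve with the closed form.

```python
import numpy as np

# 2-tap Wiener predictor of X(2) from X(0), X(1); sigma_A^2 = sigma_W^2 = 1 assumed.
sA2, sW2 = 1.0, 1.0
r = lambda l: sA2 * 0.1**abs(l) + sW2 * (l == 0)   # r_XX(l)

Rxx = np.array([[r(0), r(1)], [r(1), r(0)]])
rp  = np.array([r(2), r(1)])                       # r' = [r_XX(2), r_XX(1)]
a   = np.linalg.solve(Rxx, rp)
mmse = r(0) - rp @ a

# Closed form from the slide
den = (sA2 + sW2)**2 - 0.01 * sA2**2
a_closed = np.array([0.01*sA2*sW2, 0.1*sA2*(0.99*sA2 + sW2)]) / den
print(a, a_closed, mmse)
```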

Practice Problems (from SK) : 12.1,12.2,12.6,12.8

Page 27

Summary of Estimation Methods: Classical vs. Bayesian Approach

[Flowchart] Decision tree for choosing between the classical and Bayesian approaches to a signal processing problem, branching on whether prior knowledge is available and whether dimensionality is a problem. With prior knowledge, the Bayesian approach is used; without it, the classical approach is used; when neither applies, a new data model is needed (or new data must be taken), otherwise estimation is not possible.

Page 28

Summary of Estimation Methods: Bayesian Approach

[Flowchart] PDF known? Yes: if the mean of the posterior PDF can be computed → MMSE estimator; if not, but the posterior PDF can be maximized → MAP estimator; otherwise not possible. PDF not known: if the first two moments are known → LMMSE estimator; otherwise not possible.

Page 29

Summary of Estimation Methods: Classical Approach

[Flowchart] PDF known? Yes: if the CRLB is satisfied → MVU estimator; if not, but a complete sufficient statistic exists → make it unbiased → MVU estimator; otherwise evaluate the MLE → MLE, or evaluate the method-of-moments estimator → moments estimator. PDF not known: for a signal in noise with a linear signal model and the first two noise moments known → BLUE; with only the linear model assumption → LSE; otherwise not possible.

Page 30

Review Problems

12.1: Consider the quadratic estimator $\hat{\theta} = a\,x^2(0) + b\,x(0) + c$; denote x(0) by x.

$$\mathrm{Bmse}(\hat{\theta}) = E\big[(\theta - a x^2 - b x - c)^2\big]$$

$$\frac{\partial\,\mathrm{Bmse}}{\partial a} = 0 \;\Rightarrow\; E(\theta x^2) = a\,E(x^4) + b\,E(x^3) + c\,E(x^2)$$

$$\frac{\partial\,\mathrm{Bmse}}{\partial b} = 0 \;\Rightarrow\; E(\theta x) = a\,E(x^3) + b\,E(x^2) + c\,E(x)$$

$$\frac{\partial\,\mathrm{Bmse}}{\partial c} = 0 \;\Rightarrow\; E(\theta) = a\,E(x^2) + b\,E(x) + c$$

In matrix form,

$$\begin{bmatrix} E(x^4) & E(x^3) & E(x^2) \\ E(x^3) & E(x^2) & E(x) \\ E(x^2) & E(x) & 1 \end{bmatrix}
\begin{bmatrix} a \\ b \\ c \end{bmatrix}
=
\begin{bmatrix} E(\theta x^2) \\ E(\theta x) \\ E(\theta) \end{bmatrix}$$

$$\mathrm{MMSE} = E\big[(\theta - a x^2 - b x - c)\,\theta\big] = E(\theta^2) - a\,E(\theta x^2) - b\,E(\theta x) - c\,E(\theta).$$

Page 31

Review Problems (Cont'd)

Now, if $x(0) \sim U\big(-\tfrac{1}{2}, \tfrac{1}{2}\big)$ and $\theta = \cos\big(2\pi x(0)\big)$:

$$E(x) = 0, \quad E(\theta) = \int_{-1/2}^{1/2}\cos(2\pi x)\,dx = 0, \quad E(x^2) = \tfrac{1}{12}, \quad E(x^3) = 0, \quad E(x^4) = \tfrac{1}{80}, \quad E(\theta x) = 0, \quad E(\theta x^2) = \int_{-1/2}^{1/2} x^2\cos(2\pi x)\,dx = -\frac{1}{2\pi^2}$$

$$\begin{bmatrix} 1/80 & 0 & 1/12 \\ 0 & 1/12 & 0 \\ 1/12 & 0 & 1 \end{bmatrix}
\begin{bmatrix} a \\ b \\ c \end{bmatrix}
=
\begin{bmatrix} -1/(2\pi^2) \\ 0 \\ 0 \end{bmatrix}
\;\Rightarrow\;
a = -\frac{90}{\pi^2}, \quad b = 0, \quad c = \frac{15}{2\pi^2}$$

$$\mathrm{MMSE} = E(\theta^2) - a\,E(\theta x^2) = \frac{1}{2} - \frac{45}{\pi^4} \approx 0.04$$

Question: how does this estimator relate to the Taylor series expansion of $\theta = \cos\big(2\pi x(0)\big)$?
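A brief Monte Carlo sketch (my own check, not part of the problem) confirms the quadratic estimator's coefficients and its Bmse of about 0.04.

```python
import numpy as np

# x ~ U(-1/2, 1/2), theta = cos(2*pi*x), quadratic estimator theta_hat = a*x^2 + b*x + c.
rng = np.random.default_rng(5)
x = rng.uniform(-0.5, 0.5, 1_000_000)
theta = np.cos(2 * np.pi * x)

a, b, c = -90 / np.pi**2, 0.0, 15 / (2 * np.pi**2)
theta_hat = a * x**2 + b * x + c

print(np.mean((theta - theta_hat) ** 2),   # empirical Bmse
      0.5 - 45 / np.pi**4)                 # closed form, approx 0.04
```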

Page 32

Review Problems (Cont'd)

Using a linear estimator $\hat{\theta} = b\,x(0) + c$, we have the normal equations

$$E(\theta x) = b\,E(x^2) + c\,E(x), \qquad E(\theta) = b\,E(x) + c.$$

But $E(\theta x) = 0$ and $E(\theta) = 0$, so $b = c = 0$, i.e., $\hat{\theta} = 0$, giving

$$\mathrm{MMSE} = E(\theta^2) = \tfrac{1}{2}.$$

Thus the quadratic estimator (MMSE ≈ 0.04) far outperforms the best linear estimator here, because θ and x(0) are uncorrelated even though they are strongly dependent.

Page 33

Review Problems

Chapter 8, Prob 8.5:

$$s[n] = \sum_{i=1}^{p} A_i \cos(2\pi f_i n), \qquad
H = \begin{bmatrix} 1 & 1 & \cdots & 1 \\ \cos(2\pi f_1) & \cos(2\pi f_2) & \cdots & \cos(2\pi f_p) \\ \vdots & \vdots & & \vdots \\ \cos\big(2\pi f_1 (N-1)\big) & \cos\big(2\pi f_2 (N-1)\big) & \cdots & \cos\big(2\pi f_p (N-1)\big) \end{bmatrix}, \qquad \theta = \begin{bmatrix} A_1 \\ A_2 \\ \vdots \\ A_p \end{bmatrix}$$

$$\hat{\theta} = (H^T H)^{-1} H^T x \quad \text{are the normal equations.}$$

For these $f_i$ and N, the columns of H are orthogonal (see (4.13)), so

$$H^T H = \frac{N}{2} I \;\Rightarrow\; \hat{\theta} = \frac{2}{N} H^T x \;\Rightarrow\; \hat{A}_i = \frac{2}{N}\sum_{n=0}^{N-1} x[n]\cos(2\pi f_i n).$$

Page 34

Review Problems (Cont'd)

$$J_{\min} = x^T\big(I - H(H^TH)^{-1}H^T\big)x = x^T P^{\perp} x = x^T x - x^T H(H^TH)^{-1}H^T x$$

Using $H^T H = \frac{N}{2}I$,

$$J_{\min} = \sum_{n=0}^{N-1} x^2[n] - \frac{N}{2}\sum_{i=1}^{p} \hat{A}_i^{\,2}.$$

For WGN, the PDF of $\hat{\theta}$ is $\hat{\theta} \sim \mathcal{N}\big(\theta,\; \sigma^2 (H^TH)^{-1}\big) = \mathcal{N}\big(\theta,\; \tfrac{2\sigma^2}{N} I\big)$.

Page 35

Review Problems (Cont'd)

Since $\hat{\theta} = (H^TH)^{-1}H^T x$ is a linear function of $x$, and $x \sim \mathcal{N}(H\theta, \sigma^2 I)$, $\hat{\theta}$ is Gaussian.

$$E(\hat{\theta}) = (H^TH)^{-1}H^T E(x) = (H^TH)^{-1}H^T H\theta = \theta$$

$$C_{\hat{\theta}} = E\big[(\hat{\theta}-\theta)(\hat{\theta}-\theta)^T\big] = (H^TH)^{-1}H^T C_{ww} H(H^TH)^{-1} = (H^TH)^{-1}H^T(\sigma^2 I)H(H^TH)^{-1} = \sigma^2(H^TH)^{-1} = \frac{2\sigma^2}{N} I.$$

Page 36

Review Problems

Prob 12.2: From (12.27), for the data $x[n] = A r^n + w[n]$, $n = 0, \ldots, N-1$, i.e., $x = A\mathbf{h} + w$ with $\mathbf{h} = [1\; r\; r^2\; \cdots\; r^{N-1}]^T$, where A has prior mean $\mu_A$ and variance $\sigma_A^2$ and w is WGN with variance $\sigma^2$:

$$\hat{A} = \mu_A + \frac{\sigma_A^2 \displaystyle\sum_{n=0}^{N-1} r^n\big(x[n] - \mu_A r^n\big)}{\sigma^2 + \sigma_A^2 \displaystyle\sum_{n=0}^{N-1} r^{2n}}$$

and, using (12.29) & (12.30),

$$\mathrm{Bmse}(\hat{A}) = \frac{1}{\dfrac{1}{\sigma_A^2} + \dfrac{1}{\sigma^2}\displaystyle\sum_{n=0}^{N-1} r^{2n}}.$$

Page 37

Review Problems

Prob 12.6: Using (12.20),

$$\hat{s} = C_{sx} C_{xx}^{-1} x, \qquad \text{where } C_{xx} = E(xx^T) = E\big[(s+w)(s+w)^T\big] = C_{ss} + C_{ww} = (\sigma_s^2 + \sigma^2)\,I$$

$$C_{sx} = E\big[s(s+w)^T\big] = C_{ss} = \sigma_s^2\,I$$

$$\therefore\; \hat{s} = \frac{\sigma_s^2}{\sigma_s^2 + \sigma^2}\,x, \qquad \text{or } \hat{s}[n] = \frac{\sigma_s^2}{\sigma_s^2 + \sigma^2}\,x[n].$$

Page 38

Review Problems (Cont'd)

From (12.21),

$$M_{\hat{s}} = C_{ss} - C_{sx} C_{xx}^{-1} C_{xs}
= \sigma_s^2 I - \frac{\sigma_s^4}{\sigma_s^2 + \sigma^2}\,I
= \Big(\sigma_s^2 - \frac{\sigma_s^4}{\sigma_s^2 + \sigma^2}\Big) I
= \frac{\sigma_s^2\,\sigma^2}{\sigma_s^2 + \sigma^2}\,I.$$

Page 39

Review Problems

Prob 12.8:

i) Let $\alpha = A\theta + b$. Then $E(\alpha) = A E(\theta) + b$, and

$$C_{\alpha x} = E\big[(\alpha - E(\alpha))(x - E(x))^T\big] = E\big[A(\theta - E(\theta))(x - E(x))^T\big] = A\,C_{\theta x}.$$

$$\therefore\; \hat{\alpha} = E(\alpha) + C_{\alpha x} C_{xx}^{-1}\big(x - E(x)\big) = A E(\theta) + b + A\,C_{\theta x} C_{xx}^{-1}\big(x - E(x)\big) = A\hat{\theta} + b.$$

Page 40

Review Problems (Cont'd)

ii) Let $\alpha = \theta_1 + \theta_2$. Then $E(\alpha) = E(\theta_1) + E(\theta_2)$, and

$$C_{\alpha x} = E\big[\big(\theta_1 + \theta_2 - E(\theta_1) - E(\theta_2)\big)(x - E(x))^T\big]
= E\big[(\theta_1 - E(\theta_1))(x - E(x))^T\big] + E\big[(\theta_2 - E(\theta_2))(x - E(x))^T\big]
= C_{\theta_1 x} + C_{\theta_2 x}.$$

$$\therefore\; \hat{\alpha} = E(\theta_1) + E(\theta_2) + \big(C_{\theta_1 x} + C_{\theta_2 x}\big) C_{xx}^{-1}\big(x - E(x)\big) = \hat{\theta}_1 + \hat{\theta}_2.$$