Lecture 9 Inexact Theories


Page 1: Lecture 9 Inexact Theories

Lecture 9

Inexact Theories

Page 2:

Syllabus
Lecture 01  Describing Inverse Problems
Lecture 02  Probability and Measurement Error, Part 1
Lecture 03  Probability and Measurement Error, Part 2
Lecture 04  The L2 Norm and Simple Least Squares
Lecture 05  A Priori Information and Weighted Least Squares
Lecture 06  Resolution and Generalized Inverses
Lecture 07  Backus-Gilbert Inverse and the Trade-Off of Resolution and Variance
Lecture 08  The Principle of Maximum Likelihood
Lecture 09  Inexact Theories
Lecture 10  Nonuniqueness and Localized Averages
Lecture 11  Vector Spaces and Singular Value Decomposition
Lecture 12  Equality and Inequality Constraints
Lecture 13  L1, L∞ Norm Problems and Linear Programming
Lecture 14  Nonlinear Problems: Grid and Monte Carlo Searches
Lecture 15  Nonlinear Problems: Newton’s Method
Lecture 16  Nonlinear Problems: Simulated Annealing and Bootstrap Confidence Intervals
Lecture 17  Factor Analysis
Lecture 18  Varimax Factors, Empirical Orthogonal Functions
Lecture 19  Backus-Gilbert Theory for Continuous Problems; Radon’s Problem
Lecture 20  Linear Operators and Their Adjoints
Lecture 21  Fréchet Derivatives
Lecture 22  Exemplary Inverse Problems, incl. Filter Design
Lecture 23  Exemplary Inverse Problems, incl. Earthquake Location
Lecture 24  Exemplary Inverse Problems, incl. Vibrational Problems

Page 3:

Purpose of the Lecture

Discuss how an inexact theory can be represented

Solve the inexact, linear Gaussian inverse problem

Use maximization of relative entropy as a guiding principle for solving inverse problems

Introduce the F-test as a way to determine whether one solution is “better” than another

Page 4:

Part 1

How Inexact Theories can be Represented

Page 5:

How do we generalize the case of

an exact theory

to one that is inexact?

Page 6:

[Figure: the exact theory case. Axes: model m (0 to 5) against datum d (0 to 5). The theory d = g(m) is a sharp curve; the observed datum dobs and a priori model map project onto the curve to give the estimate mest and the prediction dpre.]

Page 7:

[Figure: the same construction, with the theory d = g(m) drawn as a fuzzy band rather than a sharp curve.]

to make the theory inexact ... must make the theory probabilistic or fuzzy

Page 8:

[Figure: three panels in the (model m, datum d) plane: the a priori p.d.f. (centered on map and dobs), the theory (a fuzzy band around d = g(m)), and their combination, whose peak marks mest and dpre.]

Page 9:

how do you combine

two probability density functions?

Page 10:

how do you combine

two probability density functions?

so that the information in them is combined ...

Page 11:

desirable properties

order shouldn’t matter

combining something with the null distribution should leave it unchanged

combination should be invariant under change of variables
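One combination rule with all three properties is multiplication of the two p.d.f.s followed by renormalization (dividing by the null distribution when it is not constant). A 1-D sketch with two hypothetical Gaussians, assuming a constant null distribution:

```python
import numpy as np

m = np.linspace(-10.0, 10.0, 2001)
dm = m[1] - m[0]

def gaussian(m, mu, sigma):
    return np.exp(-0.5 * ((m - mu) / sigma) ** 2) / (np.sqrt(2 * np.pi) * sigma)

pA = gaussian(m, 1.0, 2.0)   # first source of information
pB = gaussian(m, 3.0, 1.0)   # second source of information

# combine: multiply and renormalize (order clearly cannot matter)
pC = pA * pB
pC /= np.sum(pC) * dm

# the product of two Gaussians is again Gaussian, with
# 1/sigma_C^2 = 1/sigma_A^2 + 1/sigma_B^2 and an inverse-variance weighted mean
mean_C = np.sum(m * pC) * dm
print(mean_C)  # close to 0.8 * (1.0/4 + 3.0/1) = 2.6
```

Commutativity of the product gives order-independence, and a constant (null) factor drops out in the renormalization, matching the first two desirable properties.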

Page 12:

Answer

Page 13:

[Figure: panels (D), (E), (F) in the (model m, datum d) plane, showing the a priori p.d.f. pA, the theory pg, and the total p.d.f. pT; the peak of the total p.d.f. marks mest and dpre.]

Page 14:

“solution to inverse problem” = maximum likelihood point of the total p.d.f. pT(m, d)

(with pN ∝ constant)

simultaneously gives mest and dpre

Page 15:

pT(m, d): the probability that the estimated model parameters are near m and the predicted data are near d

pT(m): the probability that the estimated model parameters are near m, irrespective of the value of the predicted data

Page 16:

conceptual problem

pT(m, d) and pT(m) do not necessarily have maximum likelihood points at the same value of m

Page 17:

[Figure: left, the total p.d.f. in the (model m, datum d) plane, whose maximum likelihood point gives mest and dpre; right, the marginal p(m), whose maximum mest’ can fall at a different value of m.]

Page 18:

illustrates the problem in defining a definitive solution

to an inverse problem

Page 19:

illustrates the problem in defining a definitive solution

to an inverse problem

fortunately, if all distributions are Gaussian

the two points are the same

Page 20:

Part 2

Solution of the inexact linear Gaussian inverse problem

Page 21:

Gaussian a priori information

Page 22:

Gaussian a priori information

the a priori values of the model parameters, and their uncertainty

Page 23:

Gaussian observations

Page 24:

Gaussian observations

observed data

measurement error

Page 25:

Gaussian theory

Page 26:

Gaussian theory

linear theory

uncertainty in theory
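The three Gaussian ingredients above (the equations themselves are lost from this transcript) presumably take the standard forms, with ⟨m⟩ the a priori model, dobs the observed data, and [cov g] the covariance expressing the uncertainty in the theory:

```latex
p_A(\mathbf{m}) \propto \exp\!\left[-\tfrac{1}{2}(\mathbf{m}-\langle\mathbf{m}\rangle)^T [\operatorname{cov}\mathbf{m}]_A^{-1}(\mathbf{m}-\langle\mathbf{m}\rangle)\right]

p_A(\mathbf{d}) \propto \exp\!\left[-\tfrac{1}{2}(\mathbf{d}-\mathbf{d}^{\mathrm{obs}})^T [\operatorname{cov}\mathbf{d}]^{-1}(\mathbf{d}-\mathbf{d}^{\mathrm{obs}})\right]

p_g(\mathbf{m},\mathbf{d}) \propto \exp\!\left[-\tfrac{1}{2}(\mathbf{d}-G\mathbf{m})^T [\operatorname{cov}\mathbf{g}]^{-1}(\mathbf{d}-G\mathbf{m})\right]
```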

Page 27:

mathematical statement of problem

find (m,d) that maximizes

pT(m,d) = pA(m) pA(d) pg(m,d)

and, along the way, work out the form of pT(m,d)

Page 28:

notational simplification

group m and d into single vector x = [dT, mT]T

group [cov m]A and [cov d]A into single matrix

write d-Gm=0 as Fx=0 with F=[I, –G]
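A minimal numerical check of this grouping (hypothetical sizes N and M and a random linear theory, for illustration only):

```python
import numpy as np

rng = np.random.default_rng(0)
N, M = 3, 2
G = rng.standard_normal((N, M))   # linear theory d = G m
m = rng.standard_normal(M)
d = G @ m + 0.1 * rng.standard_normal(N)

x = np.concatenate([d, m])        # x = [d^T, m^T]^T
F = np.hstack([np.eye(N), -G])    # F = [I, -G]

# F x reproduces the theory error d - G m, so d - G m = 0 becomes F x = 0
print(np.allclose(F @ x, d - G @ m))  # True
```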

Page 29:

after much algebra, we find pT(x) is a Gaussian distribution

with mean

and variance

Page 30:

after much algebra, we find pT(x) is a Gaussian distribution

with mean

and variance

solution to inverse problem

Page 31:

after pulling mest out of x*

Page 32:

after pulling mest out of x*

reminiscent of the minimum length solution G^T (G G^T)^(-1)

Page 33:

after pulling mest out of x*

error in theory adds to error in data

Page 34:

after pulling mest out of x*

solution depends on the values of the prior information only to the extent that the model resolution matrix is different from an identity matrix

Page 35:

and after algebraic manipulation

which also equals

reminiscent of the least squares solution (G^T G)^(-1) G^T

Page 36:

interesting aside

weighted least squares solution

is equal to the

weighted minimum length solution
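This equality is easy to verify numerically. The sketch below assumes a hypothetical G, covariances [cov m] and [cov d], and a zero prior mean; it rests on the matrix identity [cov m] G^T (G [cov m] G^T + [cov d])^(-1) = (G^T [cov d]^(-1) G + [cov m]^(-1))^(-1) G^T [cov d]^(-1):

```python
import numpy as np

rng = np.random.default_rng(1)
N, M = 5, 3
G = rng.standard_normal((N, M))
d = rng.standard_normal(N)
Cm = 2.0 * np.eye(M)   # a priori model covariance (hypothetical values)
Cd = 0.5 * np.eye(N)   # data covariance (hypothetical values)

# weighted least squares form: (G^T Cd^-1 G + Cm^-1)^-1 G^T Cd^-1 d
m_wls = np.linalg.solve(G.T @ np.linalg.inv(Cd) @ G + np.linalg.inv(Cm),
                        G.T @ np.linalg.inv(Cd) @ d)

# weighted minimum length form: Cm G^T (G Cm G^T + Cd)^-1 d
m_wml = Cm @ G.T @ np.linalg.solve(G @ Cm @ G.T + Cd, d)

print(np.allclose(m_wls, m_wml))  # True: the two forms agree
```

The first form inverts an M-by-M matrix and the second an N-by-N matrix, so in practice one picks whichever is smaller.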

Page 37:

what did we learn?

for linear Gaussian inverse problem

inexactness of theory

just adds to

inexactness of data

Page 38:

Part 3

Use maximization of relative entropy as a guiding principle for

solving inverse problems

Page 39:

from last lecture

Page 40:

assessing the information content in pA(m)

Do we know a little about m

or a lot about m?

Page 41:

Information Gain, S

-S is called the Relative Entropy
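For 1-D Gaussians the information gain has a closed form, which brute-force integration reproduces. The sketch below assumes the usual definition S = ∫ pA ln(pA/pN) dm, with hypothetical widths for the a priori and null distributions:

```python
import numpy as np

m = np.linspace(-30.0, 30.0, 100001)
dm = m[1] - m[0]

def gaussian(m, mu, sigma):
    return np.exp(-0.5 * ((m - mu) / sigma) ** 2) / (np.sqrt(2 * np.pi) * sigma)

mu_A, sig_A = 0.0, 1.0   # a priori p.d.f. pA (narrow: we know a lot about m)
mu_N, sig_N = 0.0, 5.0   # null p.d.f. pN (broad: the state of no knowledge)

pA = gaussian(m, mu_A, sig_A)
pN = gaussian(m, mu_N, sig_N)

# information gain: S = integral of pA ln(pA / pN) dm
S_num = np.sum(pA * np.log(pA / pN)) * dm

# closed form for two Gaussians
S_exact = (np.log(sig_N / sig_A)
           + (sig_A**2 + (mu_A - mu_N)**2) / (2.0 * sig_N**2) - 0.5)

print(S_num, S_exact)  # both about 1.13; narrowing sig_A raises S
```

S is non-negative and grows as pA narrows relative to pN, which is why it measures how much we claim to know a priori.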

Page 42:

[Figure: (A) a narrow a priori p.d.f. pA(m) compared with a broad null p.d.f. pN(m); (B) the information gain S(σA) plotted as a function of the a priori width σA.]

Page 43:

Principle of Maximum Relative Entropy

or, if you prefer,

Principle of Minimum Information Gain

Page 44:

find the solution p.d.f. pT(m) that has the largest relative entropy with respect to the a priori p.d.f. pA(m)

or, if you prefer,

find the solution p.d.f. pT(m) that contains the smallest possible new information compared to the a priori p.d.f. pA(m)

Page 45:

Page 46:

properly normalized

p.d.f.

data are satisfied in the mean, or

expected value of error is zero

Page 47:

After minimization using the method of Lagrange multipliers

pT(m) is Gaussian with maximum likelihood point mest satisfying

Page 48:

After minimization using the method of Lagrange multipliers

pT(m) is Gaussian with maximum likelihood point mest satisfying

just the weighted minimum length solution

Page 49:

What did we learn?

Only that the

Principle of Maximum Entropy

is yet another way of deriving

the inverse problem solutions

we are already familiar with

Page 50:

Part 4

The F-test as a way to determine whether one solution is

“better” than another

Page 51:

Common Scenario

two different theories

Theory A: solution mestA, MA model parameters, prediction error EA

Theory B: solution mestB, MB model parameters, prediction error EB

Page 52:

Suppose EB < EA. Is B really better than A?

Page 53:

What if B has many more model parameters than A,

MB >> MA ?

Is it any surprise that B fits better?

Page 54:

Need to test against the Null Hypothesis:

the difference in error is due to random variation

Page 55:

suppose the error e has a Gaussian p.d.f.,

uncorrelated, with uniform variance σd²

Page 56:

estimate variance

Page 57:

want to know the probability density function of

Page 58:

actually, we’ll use the quantity

which is the same, as long as the two theories being tested are applied to the same data

Page 59:

[Figure: the p.d.f.s p(F_{N,2}), p(F_{N,5}), p(F_{N,25}), and p(F_{N,50}) plotted against F, each for N ranging from 2 to 50.]

p.d.f. of F is known

Page 60:

as is its mean and variance

Page 61:

example

same dataset fit with

a straight line

and

a cubic polynomial

Page 62:

[Figure: the same dataset (zi, di) fit two ways. (A) Linear fit, N-M = 9, E = 0.030. (B) Cubic fit, N-M = 7, E = 0.006.]

Page 63:

[Figure: the linear and cubic fits from the previous slide. (A) Linear fit, N-M = 9, E = 0.030. (B) Cubic fit, N-M = 7, E = 0.006.]

Fest = F7,9 = 4.1

Page 64:

probability that F > Fest (cubic fit seems better than linear fit) by random chance alone

or F < 1/Fest (linear fit seems better than cubic fit) by random chance alone

Page 65:

In MATLAB:

P = 1 - (fcdf(Fobs,vA,vB)-fcdf(1/Fobs,vA,vB));
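An equivalent computation in Python (a sketch assuming scipy is available; Fest = 4.1 and the degrees of freedom 7 and 9 are taken from the fits above, with the ordering following the F7,9 label):

```python
from scipy.stats import f

Fobs = 4.1      # estimated variance ratio from the linear vs. cubic fits
vA, vB = 7, 9   # degrees of freedom N - M of the two fits

# two-sided probability that so large (or so small) a ratio
# arises by random variation alone
P = 1 - (f.cdf(Fobs, vA, vB) - f.cdf(1 / Fobs, vA, vB))
print(P)  # a few percent, but above 0.05
```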

Page 66:

answer: 6%

The Null Hypothesis

that the difference is due to random variation

cannot be rejected with 95% confidence