Image Analysis, Random Fields and Dynamic MCMC By Marc Sobel


Page 1: Image Analysis, Random Fields and Dynamic MCMC By Marc Sobel

Image Analysis, Random Fields and Dynamic MCMC

By Marc Sobel

Page 2

A field!!!!!!!!

Page 3

Random Fields

A random field (RF) consists of a collection of points P = {p} and neighborhoods {N_p} of points (a neighborhood N_p does not contain p itself). The field imposes 'label' values f = {f[p]} on the points. We use the notation f[S] for the label values imposed on a set S. Random fields have one central property, which is closely related to the Markov property:

P( f[p] \mid f[P \setminus \{p\}] ) = P( f[p] \mid f[N_p] )

Page 4

Reasoning: Hammersley-Clifford Theorem

Under certain assumptions, assuming the points can be enumerated p_1, …, p_N, we have that the joint distribution can be generated from its full conditionals:

\frac{P\big(f[p_1], \ldots, f[p_N]\big)}{P\big(f^*[p_1], \ldots, f^*[p_N]\big)} = \prod_{i=1}^{N} \frac{P\big( f[p_i] \mid f[p_1], \ldots, f[p_{i-1}],\, f^*[p_{i+1}], \ldots, f^*[p_N] \big)}{P\big( f^*[p_i] \mid f[p_1], \ldots, f[p_{i-1}],\, f^*[p_{i+1}], \ldots, f^*[p_N] \big)}

Page 5

Gibbs Random Fields

Gibbs random fields are characterized by

P(f) = \frac{\exp\{-U(f)/T\}}{Z}; \qquad Z = \sum_{f} \exp\{-U(f)/T\}

where U(f) = \sum_{c \in \mathcal{C}} V_c(f) and the c \in \mathcal{C} are cliques. Cliques are contained in neighborhoods: V_c(f) = 0 if c \not\subset N.

For example, if the cliques c = \{c_1, c_2\} are all pairs, we could put:

V_c(f) = \begin{cases} -1 & \text{if } f[c_1] = f[c_2] \\ +1 & \text{otherwise} \end{cases} \quad \text{(autologistic model)}

Page 6

Gibbs = Markov!

Under a Gibbs field, conditioning on the rest of the field reduces to conditioning on neighborhoods:

P\big( f[p] \mid f[P \setminus \{p\}] \big) = \frac{\exp\big\{ -\sum_{c \ni p} V_c(f) \big\}}{\sum_{f[p]} \exp\big\{ -\sum_{c \ni p} V_c(f) \big\}} = P\big( f[p] \mid f[N_p] \big)

But the term \exp\big\{ -\sum_{c \not\ni p} V_c(f) \big\} cancels in numerator and denominator, giving the result: every clique containing p lies inside \{p\} \cup N_p, so the conditional depends only on f[N_p].
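The cancellation above can be checked numerically: on a tiny 1-d autologistic chain we can compute the full conditional P(f[p] | rest) by brute force from the joint Gibbs distribution and confirm it matches the local formula built from only the cliques containing p. A minimal sketch (the 5-point chain, pair potentials V_c = -1 if the labels agree and +1 otherwise, and the temperature are illustrative choices, not values from the slides):

```python
import itertools
import math

T = 1.5          # temperature (illustrative)
N = 5            # chain length; cliques are adjacent pairs (c, c+1)

def U(f):
    """Energy: sum of pair potentials, V_c = -1 if labels agree, +1 otherwise."""
    return sum(-1 if f[i] == f[i + 1] else 1 for i in range(len(f) - 1))

# Joint Gibbs distribution over all +/-1 configurations
configs = list(itertools.product([-1, 1], repeat=N))
w = {f: math.exp(-U(f) / T) for f in configs}

def full_conditional(p, f):
    """P(f[p] | all other coordinates), computed from the joint weights."""
    den = sum(w[f[:p] + (s,) + f[p + 1:]] for s in (-1, 1))
    return w[f] / den

def local_conditional(p, f):
    """P(f[p] | f[N_p]) using only the cliques that contain p."""
    def e(s):
        g = f[:p] + (s,) + f[p + 1:]
        cliques = [c for c in range(N - 1) if c == p or c + 1 == p]
        return sum(-1 if g[c] == g[c + 1] else 1 for c in cliques)
    return math.exp(-e(f[p]) / T) / sum(math.exp(-e(s) / T) for s in (-1, 1))

f = (1, -1, -1, 1, 1)
print(abs(full_conditional(2, f) - local_conditional(2, f)) < 1e-12)  # True
```

The clique terms not involving p appear in both numerator and denominator of the full conditional, which is exactly why the two functions agree.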

Page 7

Examples of Random Fields

Automodels: all cliques have one or two members.

Autobinomial models (how to build a k-color map): labels are 0, 1, …, k and neighborhoods are of size M. Conditionally on a neighborhood statistic t built from the counts \#\{p' \in N_p : f[p'] = s\}, the label at p is binomial:

P\big( f[p] = s \mid t \big) = \binom{k}{s}\, \eta_p^{\,s} (1 - \eta_p)^{k-s}; \qquad \eta_p = \frac{\exp\{t\}}{1 + \exp\{t\}}

Autologistic model (i.e., the model which imposes energy +1 when contiguous elements are different and -1 otherwise):

V_c(f) = \begin{cases} -1 & \text{if } f[c_1] = f[c_2] \\ +1 & \text{otherwise} \end{cases} \quad \text{(autologistic model)}
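Drawing from the autobinomial conditional is just a binomial draw whose success probability is the logistic transform of the neighborhood statistic. A sketch, assuming for illustration that t is a scaled sum of the neighboring labels (the exact form of t is not pinned down above; `alpha` is a hypothetical interaction parameter):

```python
import numpy as np

rng = np.random.default_rng(0)

def autobinomial_draw(neighbor_labels, k, alpha=0.1):
    """Draw f[p] in {0, ..., k} given the labels on N_p.

    t is taken here as alpha * (sum of neighbor labels) -- an assumed
    neighborhood statistic; eta is its logistic transform.
    """
    t = alpha * np.sum(neighbor_labels)
    eta = np.exp(t) / (1.0 + np.exp(t))
    # P(f[p] = s) = C(k, s) eta^s (1 - eta)^(k - s)
    return rng.binomial(k, eta)

s = autobinomial_draw(neighbor_labels=[3, 2, 4, 1], k=5)
print(0 <= s <= 5)  # True
```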

Page 8

A Metropolis-Hastings update for autologistic field models

1) Propose a flip f^*[p] = -f[p] at a randomly selected point p.

2) The move probability is:

\text{Probmove} = 1 \wedge \frac{\exp\big\{ f^*[p] \sum_{p^* \in N(p)} f[p^*] / T \big\}}{\exp\big\{ f[p] \sum_{p^* \in N(p)} f[p^*] / T \big\}} = 1 \wedge \exp\Big\{ -\frac{2}{T}\, f[p] \sum_{p^* \in N(p)} f[p^*] \Big\}

Page 9

The 1-d Autologistic

The Metropolis move probability for the 1-d autologistic is:

\text{Probmove} = 1 \wedge \exp\Big\{ -\frac{2}{T}\, f[p] \big( f[p-1] + f[p+1] \big) \Big\}

The effect of the prior is to smooth out the results.

Page 10

The 2-d autologistic or Ising Model

In a 2-d setting we update using:

\text{Probmove} = 1 \wedge \exp\Big\{ -\frac{2}{T}\, f[p_1, p_2] \sum_{p^* \in N(p_1, p_2)} f[p^*] \Big\}

where N(p_1, p_2) consists of the four nearest neighbors of (p_1, p_2).
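The 2-d update rule can be run as a single-site Metropolis sampler over a ±1 grid. A minimal sketch (grid size, temperature, sweep count, and free boundary conditions are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(1)

def ising_metropolis(n=32, T=2.0, sweeps=50):
    """Single-site Metropolis for the 2-d autologistic (Ising) field."""
    f = rng.choice([-1, 1], size=(n, n))
    for _ in range(sweeps * n * n):
        i, j = rng.integers(n), rng.integers(n)
        # sum over the four nearest neighbors (free boundary)
        s = 0
        if i > 0:
            s += f[i - 1, j]
        if i < n - 1:
            s += f[i + 1, j]
        if j > 0:
            s += f[i, j - 1]
        if j < n - 1:
            s += f[i, j + 1]
        # Probmove = 1 ^ exp(-2 f[p] * neighbor_sum / T)
        if rng.random() < min(1.0, np.exp(-2.0 * f[i, j] * s / T)):
            f[i, j] = -f[i, j]  # accept the flip
    return f

field = ising_metropolis()
print(field.shape)
```

Low T drives neighbors to agree (fewer 'islands'); high T leaves the field closer to random, matching the pictures on the later slides.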

Page 11

Example: The Ising Model: Each rectangle below is a field configuration f: black=1 and white=-1. Color results from multiple label values

Page 12

Extensions: Segmentation

Start with a map Y (over a 2-d grid). Assume we would like to distinguish which points in the map are important and which are background.

Devise an Ising field model prior which captures the important points of the map and downweights the others. E.g.,

U(X, Y) = -\sum_{(i,j)} X_{i,j} \big( X_{i+1,j} + X_{i,j+1} \big) - \frac{1}{T} \sum_{(i,j)} Y_{i,j}\, X_{i,j}

Page 13

Extensions (concluded)

So the potential contains a 'magnetic field' (based on the first, interaction term) and an 'external field' (based on the second, data term); minimizing it balances the two.

Other extensions are to Line processes, image reconstruction, texture representation.

Page 14

Random Field Priors: The Ising model or autologistic model: (Metropolis Hastings updates): Temperature=5; at time t=10000. Note the presence of more ‘islands’.

Page 15

Random Field Priors: The Ising model (Metropolis Hastings updates): Temperature=.005. Note the presence of fewer islands.

Page 16

Generalized Ising models: Mean Field Equation

The energy is:

E(x) = -\sum_{\langle i, j \rangle} x_i x_j + 3 \sum_{j} x_j

What is the impact of this prior? Use mean field equations to get the closest possible prior (in KLD) which makes the field points mutually independent.

Page 17

Generalized Field: Note the Swiss Cheese aspect. Temperature=10.

Page 18

Mean Field Equation

The mean field equation minimizes

\text{KLD} = E_Q \Big[ \log \frac{Q(X)}{F(X)} \Big]

over distributions Q which make the points mutually independent. For the generalized field model, writing a[p] for the Q-mean of the spin at p, the mean field equation is:

a[p] = \tanh\Big( \frac{1}{T} \Big( \sum_{p' \in N(p)} a[p'] - 3 \Big) \Big)
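The mean field equation is a fixed point in the site means a[p] and can be solved by simple iteration. A sketch, assuming the standard tanh consistency form for ±1 spins with the external-field term 3 from the energy above, a 4-neighbor grid, and illustrative size and iteration count:

```python
import numpy as np

def mean_field(n=16, T=10.0, iters=200):
    """Iterate a[p] = tanh((sum_{p' in N(p)} a[p'] - 3) / T) to a fixed point."""
    a = np.zeros((n, n))
    for _ in range(iters):
        nbr = np.zeros_like(a)
        nbr[1:, :] += a[:-1, :]   # neighbor above
        nbr[:-1, :] += a[1:, :]   # neighbor below
        nbr[:, 1:] += a[:, :-1]   # neighbor left
        nbr[:, :-1] += a[:, 1:]   # neighbor right
        a = np.tanh((nbr - 3.0) / T)
    return a

a = mean_field()
print(bool(np.all(np.abs(a) <= 1.0)))  # True: means of +/-1 spins lie in [-1, 1]
```

Simulating independent ±1 spins with these means gives the mean field approximation shown on the next slide.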

Page 19

Mean Field Approximation to the General Ising Field at temperature T=10. We simulate from the mean field prior. We retain the Swiss cheese but lose the islands.

Page 20

Gaussian Processes

Autonormal models: if labels are real numbers (i.e., we are trying to build a picture with many different grey levels):

f[p] \mid f[N_p] \sim N\Big( \mu_p + \sum_{p' \in N(p)} \beta_{p,p'} \big( f[p'] - \mu_{p'} \big),\; \sigma^2 \Big)

Page 21

Gaussian Processes

For Gaussian processes, the covariance satisfies

\text{Cov}(f[p], f[p']) = \sum_{p''} \beta_{p',p''}\, \text{Cov}(f[p], f[p'']) + \sigma^2 \delta_{p,p'}.

This gives the Yule-Walker equation \text{COV} = B \cdot \text{COV} + \sigma^2 I; or \text{COV}^{-1} = (I - B)/\sigma^2. So the likelihood is given by:

Page 22

Gaussian Processes

The likelihood is Gaussian with mean \mu and inverse covariance matrix (I - B)/\sigma^2:

\pi(f) \propto \exp\Big\{ -\frac{ (f - \mu)'(I - B)(f - \mu) }{ 2\sigma^2 } \Big\}

Example: assume a likelihood centered at i + j, i.e. observe f[i, j] \sim N(i + j,\, 1). Assume a Gaussian process prior with \beta_{p,p'} = 1/8 for each p' \in N(p) (the 8-point neighborhood) and \sigma^2 = .01.

Page 23

Posterior Distribution for the Gaussian Model

\mu[p] \;\Big|\; F, \mu[N(p)] \sim N\left( \frac{ \sigma^{-2} \cdot \frac{1}{8} \sum_{p' \in N(p)} \mu[p'] + F(p) }{ \sigma^{-2} + 1 },\; \frac{1}{\sigma^{-2} + 1} \right)
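A Gibbs-sampler sweep for μ draws each pixel from this normal full conditional. A sketch with σ² = .01, an 8-point neighborhood with weights 1/8, and data F centered at i + j as in the example above (interior pixels only, for brevity; grid size is illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)

def gibbs_sweep_mu(mu, F, sig2=0.01):
    """One in-place Gibbs sweep: mu[p] | F, mu[N(p)] is normal with
    mean (sig2^-1 * (1/8) * sum of the 8 neighbors + F[p]) / (sig2^-1 + 1)
    and variance 1 / (sig2^-1 + 1)."""
    n, m = mu.shape
    prec = 1.0 / sig2 + 1.0
    for i in range(1, n - 1):
        for j in range(1, m - 1):
            nbr = (mu[i - 1, j - 1:j + 2].sum()      # row above (3 cells)
                   + mu[i + 1, j - 1:j + 2].sum()    # row below (3 cells)
                   + mu[i, j - 1] + mu[i, j + 1])    # left and right
            mean = ((nbr / 8.0) / sig2 + F[i, j]) / prec
            mu[i, j] = rng.normal(mean, np.sqrt(1.0 / prec))
    return mu

n = 20
ii, jj = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
F = ii + jj + rng.normal(0, 1, (n, n))   # data centered at i + j
mu = gibbs_sweep_mu(F.astype(float).copy(), F)
print(mu.shape)
```

With σ² = .01 the conditional precision is dominated by the neighbor term, which is why repeated sweeps produce the very smooth mesh shown on the next slide.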

Page 24

Gaussian field at time t=20,000 with conditional prior variance=.01. Mesh is over a realization of μ. Note how smooth the mesh is:

Page 25

Maximum A Posteriori Estimates

\hat{\mu}[p] = \frac{ \sigma^{-2} \cdot \frac{1}{8} \sum_{p' \in N(p)} \mu[p'] + F(p) }{ \sigma^{-2} + 1 }
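The MAP estimate is the fixed point of this update (the conditional mean with no noise added) and can be found by Jacobi-style iteration. A minimal sketch under the same assumed setup (σ², 8-neighborhood of weight 1/8; edge handling and iteration count are illustrative):

```python
import numpy as np

def map_estimate(F, sig2=0.01, iters=100):
    """Iterate mu[p] <- (sig2^-1 * (1/8) * sum_{p' in N(p)} mu[p'] + F[p])
    / (sig2^-1 + 1) toward the MAP fixed point."""
    mu = F.astype(float).copy()
    prec = 1.0 / sig2 + 1.0
    for _ in range(iters):
        # 8-neighbor sums via padded shifts (edges reuse the edge values)
        p = np.pad(mu, 1, mode="edge")
        nbr = (p[:-2, :-2] + p[:-2, 1:-1] + p[:-2, 2:]
               + p[1:-1, :-2] + p[1:-1, 2:]
               + p[2:, :-2] + p[2:, 1:-1] + p[2:, 2:])
        mu = ((nbr / 8.0) / sig2 + F) / prec
    return mu

n = 20
ii, jj = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
rng = np.random.default_rng(3)
F = ii + jj + rng.normal(0, 1, (n, n))
mu = map_estimate(F)
print(bool(mu.var() < F.var()))  # True: the MAP field is smoother than the data
```

Larger prior variance keeps the estimate closer to the data; smaller prior variance (e.g. .01 vs .5, as on the next slides) smooths harder.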

Page 26

MAP Estimator with prior variance =.5

Page 27

Maximum A Posteriori Estimate with prior variance =.01

Page 28

Smoothness Priors

Suppose we observe data

d[i, j] = f[i, j] + z_{i,j}

with one of the priors

\pi_1(f) \propto \exp\Big\{ -\sum_{i,j} \big[ (f_{i+1,j} - f_{i,j})^2 + (f_{i,j+1} - f_{i,j})^2 \big] \Big\};

\pi_2(f) \propto \exp\Big\{ -\sum_{i,j} \big[ (f_{i+1,j} - 2 f_{i,j} + f_{i-1,j})^2 + (f_{i,j+1} - 2 f_{i,j} + f_{i,j-1})^2 \big] \Big\}
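The two priors differ only in whether the penalty is on first or second differences. A sketch computing the two (negative log-prior) energies with np.diff:

```python
import numpy as np

def energy_pi1(f):
    """First-difference penalty: sum of squared gradients (pi_1)."""
    return np.sum(np.diff(f, axis=0) ** 2) + np.sum(np.diff(f, axis=1) ** 2)

def energy_pi2(f):
    """Second-difference penalty: sum of squared discrete curvatures (pi_2)."""
    return (np.sum(np.diff(f, 2, axis=0) ** 2)
            + np.sum(np.diff(f, 2, axis=1) ** 2))

# A linear ramp has constant first differences but zero curvature:
ramp = np.add.outer(np.arange(5.0), np.arange(5.0))
print(energy_pi1(ramp) > 0, energy_pi2(ramp) == 0)  # True True
```

The ramp example shows the distinction made on the next slide: π1 penalizes any slope, while π2 penalizes only curvature.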

Page 29

Smoothness priors

The smoothness prior π1 has the effect of imposing a small ‘derivative’ on the field.

The smoothness prior π2 has the effect of imposing a small curvature on the field.

Page 30

Smoothness Priors

Smoothness priors have the same kind of impact as choosing a function which minimizes the 'loss'

L(f, d) = \sum_{i} \big( f(i) - d_i \big)^2 + \int \big( f''(x) \big)^2 \, dx

assuming the likelihood d_i = f(i) + z_i.
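In 1-d, a discrete analogue of this loss can be minimized in closed form: stacking the data-fit and second-difference terms gives the linear system (I + λ D'D) f = d, where D is the second-difference matrix and λ is a smoothing parameter introduced here for illustration. A sketch, using the step data from the following slides (-5 below 50, +5 above):

```python
import numpy as np

def smooth_fit(d, lam=25.0):
    """Minimize sum_i (f_i - d_i)^2 + lam * sum_i (f_{i+1} - 2 f_i + f_{i-1})^2."""
    n = len(d)
    D = np.diff(np.eye(n), 2, axis=0)    # (n-2) x n second-difference matrix
    return np.linalg.solve(np.eye(n) + lam * D.T @ D, d)

d = np.where(np.arange(100) < 50, -5.0, 5.0)   # step data: -5 below 50, +5 above
f = smooth_fit(d)
print(bool(np.max(np.abs(np.diff(f))) < np.max(np.abs(np.diff(d)))))  # True
```

The fitted curve rounds off the jump at 50, exactly the smoothing effect pictured on the next two slides; smaller prior (conditional) variance corresponds to heavier smoothing.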

Page 31

Data = -5 below 50 and Data=5 above 50. Conditional prior variance is .5

Page 32

Data = -5 below 50 and Data=5 above 50. Conditional prior variance is .005