
Bayesian image modeling by generalized sparse Markov random fields and loopy belief propagation

Kazuyuki Tanaka
Graduate School of Information Sciences (GSIS), Tohoku University, Sendai, Japan
http://www.smapip.is.tohoku.ac.jp/~kazu/

Collaborators:
Muneki Yasuda (GSIS, Tohoku University, Japan)
D. Michael Titterington (University of Glasgow, UK)

12 June 2012, Interdisciplinary Workshop on Inference 2012 (Paris)

Outline

1. Introduction
2. Bayesian Image Modeling by Generalized Sparse Prior
3. Noise Reduction by Generalized Sparse Prior
4. Extensions to Segmentation
5. Concluding Remarks

Noise Reduction by Bayesian Image Modeling

Original Image → (transmission with noise) → Degraded Image → Estimate of the original image

Assumption 1: Original images are randomly generated according to a prior probability.

Assumption 2: Degraded images are randomly generated from the original image according to a conditional probability describing the degradation process.

The Bayes formula combines the two into the posterior:

\Pr\{\text{Original Image} \mid \text{Degraded Image}\} \propto \Pr\{\text{Degraded Image} \mid \text{Original Image}\} \, \Pr\{\text{Original Image}\}

(Posterior ∝ Degradation Process × Prior.)

Prior in Bayesian Image Modeling

Assumption: the prior probability is given by the following Gibbs distribution, with interaction α between every nearest-neighbour pair of pixels:

\Pr\{F = f \mid p, \alpha\} = \frac{1}{Z(p,\alpha)} \exp\Bigl( -\frac{\alpha}{2} \sum_{\{i,j\} \in E} |f_i - f_j|^p \Bigr), \qquad f_i \in \{0, 1, \ldots, q-1\}.

p = 0: q-state Potts model.
p = 2: q-Ising model (discrete Gaussian graphical model).
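
To make the energy concrete, here is a minimal numpy sketch of the sum inside the exponential; the function name `prior_energy` and the 2-D array convention are mine, not from the talk.

```python
import numpy as np

def prior_energy(f, p):
    """Sum of |f_i - f_j|^p over nearest-neighbour pairs of a 2-D label
    array f with entries in {0, ..., q-1}.  The (unnormalized) prior
    weight of f is then exp(-0.5 * alpha * prior_energy(f, p))."""
    df = f.astype(float)
    dv = np.abs(np.diff(df, axis=0))   # vertical neighbour differences
    dh = np.abs(np.diff(df, axis=1))   # horizontal neighbour differences
    if p == 0:                         # Potts convention: |x|^0 = 1 iff x != 0
        return np.count_nonzero(dv) + np.count_nonzero(dh)
    return float(np.sum(dv ** p) + np.sum(dh ** p))
```

For p = 0 this counts unequal neighbour pairs (Potts), and for p = 2 it is the quadratic q-Ising energy.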

Prior in Bayesian Image Modeling: Hyperparameter Estimation

Given an original image f*, the interaction α is estimated by maximum likelihood with loopy belief propagation,

\alpha^* = \arg\max_\alpha \ln \Pr\{F = f^* \mid p, \alpha\},

which is equivalent to the moment-matching condition u(α*, p) = u*, where

u(\alpha, p) = \frac{1}{|E|} \sum_{\{i,j\} \in E} \langle |f_i - f_j|^p \rangle_{P(f \mid p, \alpha)}, \qquad u^* = \frac{1}{|E|} \sum_{\{i,j\} \in E} |f_i^* - f_j^*|^p.

In the region 0 < p < 0.3504…, a first-order phase transition appears and the solution α* does not exist.

Outline

1. Introduction
2. Bayesian Image Modeling by Generalized Sparse Prior
3. Noise Reduction by Generalized Sparse Prior
4. Extensions to Segmentation
5. Concluding Remarks

Prior by Conditional Maximization of Entropy in Bayesian Image Modeling

Assumption: the prior probability is given by conditional maximization of entropy under constraints:

P(f) = \arg\max_{P} \Bigl[ -\sum_{z} P(z) \ln P(z) \Bigr] \quad \text{subject to} \quad \sum_{z} P(z) = 1, \qquad \frac{1}{|E|} \sum_{\{i,j\} \in E} \sum_{z} |z_i - z_j|^p P(z) = u.

Introducing a Lagrange multiplier α(u, p) for the moment constraint recovers the Gibbs form

\Pr\{F = f \mid u, p\} = \frac{1}{Z(\alpha(u,p), p)} \exp\Bigl( -\frac{\alpha(u,p)}{2} \sum_{\{i,j\} \in E} |f_i - f_j|^p \Bigr).

The multiplier is determined iteratively: compute (1/|E|) Σ_{{i,j}∈E} ⟨|f_i − f_j|^p⟩ by loopy belief propagation, adjust α towards the constraint value u, and repeat until α converges.
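
As a hedged sketch of this moment matching, assuming p > 0 and a lattice tiny enough to enumerate exactly (the talk uses LBP in place of enumeration; `u_of_alpha` and `solve_alpha` are illustrative names of mine):

```python
import itertools
import numpy as np

def u_of_alpha(alpha, p, q=4, h=2, w=2):
    """u(alpha, p) = (1/|E|) * E[ sum |f_i - f_j|^p ] under the Gibbs prior,
    by brute-force enumeration of all q**(h*w) configurations."""
    edges = ([((r, c), (r + 1, c)) for r in range(h - 1) for c in range(w)]
             + [((r, c), (r, c + 1)) for r in range(h) for c in range(w - 1)])
    z, acc = 0.0, 0.0
    for flat in itertools.product(range(q), repeat=h * w):
        f = np.asarray(flat, dtype=float).reshape(h, w)
        e = sum(abs(f[a] - f[b]) ** p for a, b in edges)
        wgt = np.exp(-0.5 * alpha * e)
        z, acc = z + wgt, acc + wgt * e
    return acc / (z * len(edges))

def solve_alpha(u_star, p, lo=1e-4, hi=50.0, iters=60):
    """Bisection on alpha: u(alpha, p) decreases monotonically in alpha,
    so find the alpha whose moment u(alpha, p) matches u*."""
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if u_of_alpha(mid, p) > u_star:
            lo = mid        # neighbour differences still too large
        else:
            hi = mid
    return 0.5 * (lo + hi)
```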

Prior Analysis by LBP and Conditional Maximization of Entropy in Bayesian Image Modeling

In LBP, the single-site and nearest-neighbour-pair marginals of the prior are computed from messages M_{k→i}(f_i); for an edge {i, j} ∈ E,

P(f_i, f_j \mid \alpha, p) \propto \exp\Bigl( -\frac{\alpha}{2} |f_i - f_j|^p \Bigr) \prod_{k \in \partial i \setminus \{j\}} M_{k \to i}(f_i) \prod_{k \in \partial j \setminus \{i\}} M_{k \to j}(f_j).

The pair marginals give the LBP estimate of u(α, p) = (1/|E|) Σ_{{i,j}∈E} ⟨|f_i − f_j|^p⟩, and α is updated repeatedly until it converges to the value matching u* = (1/|E|) Σ_{{i,j}∈E} |f_i* − f_j*|^p.

[Figure: sample patterns drawn from the prior probability for q = 16 with p = 0.2 and with p = 0.5.]
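
A compact sum-product LBP sketch for this prior on a small 4-neighbour lattice; the synchronous message schedule and all names (`lbp_prior_marginals` and so on) are my own conventions, not the talk's exact algorithm.

```python
import numpy as np

def lbp_prior_marginals(q, p, alpha, h=8, w=8, iters=200, tol=1e-8):
    """Sum-product loopy BP for P(f) ∝ exp(-(alpha/2) Σ_{{i,j}∈E} |f_i-f_j|^p)
    on an h x w lattice; returns functions giving site and pair marginals."""
    d = np.abs(np.subtract.outer(np.arange(q), np.arange(q))).astype(float)
    psi = np.exp(-0.5 * alpha * d ** p)        # symmetric pairwise potential

    def nbrs(i):
        r, c = i
        for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            if 0 <= r + dr < h and 0 <= c + dc < w:
                yield (r + dr, c + dc)

    sites = [(r, c) for r in range(h) for c in range(w)]
    msg = {(i, j): np.full(q, 1.0 / q) for i in sites for j in nbrs(i)}

    for _ in range(iters):                      # synchronous message updates
        new, delta = {}, 0.0
        for (i, j), m in msg.items():
            prod = np.ones(q)
            for k in nbrs(i):
                if k != j:
                    prod *= msg[(k, i)]
            out = psi @ prod    # out[f_j] = Σ_{f_i} psi[f_i, f_j] prod[f_i]; psi symmetric
            out /= out.sum()
            new[(i, j)] = out
            delta = max(delta, np.abs(out - m).max())
        msg = new
        if delta < tol:
            break

    def site_marginal(i):
        b = np.ones(q)
        for k in nbrs(i):
            b *= msg[(k, i)]
        return b / b.sum()

    def pair_marginal(i, j):                    # i, j must be nearest neighbours
        bi, bj = np.ones(q), np.ones(q)
        for k in nbrs(i):
            if k != j:
                bi *= msg[(k, i)]
        for k in nbrs(j):
            if k != i:
                bj *= msg[(k, j)]
        b = psi * np.outer(bi, bj)
        return b / b.sum()

    return site_marginal, pair_marginal
```

The LBP estimate of u(α, p) averages Σ_{a,b} |a − b|^p · pair_marginal(i, j)[a, b] over all edges; that edge statistic is what the α update described above consumes.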

Prior in Bayesian Image Modeling: Estimating p

Log-likelihood of p when the original image f* is given:

\ln \Pr\{F = f^* \mid u^*, p\} = -\frac{\alpha(u^*, p)}{2} \sum_{\{i,j\} \in E} |f_i^* - f_j^*|^p - \ln Z(\alpha(u^*, p), p), \qquad u^* = \frac{1}{|E|} \sum_{\{i,j\} \in E} |f_i^* - f_j^*|^p.

The free energy of the prior, ln Z, is evaluated by LBP, and the exponent is estimated as

p^* = \arg\max_p \ln \Pr\{F = f^* \mid u^*, p\}.

[Figure: log-likelihood as a function of p for a q = 16 test image.]
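
A hedged, brute-force illustration of this model selection on a toy lattice, profiling over α on a grid instead of solving u(α, p) = u* exactly (the toy image and all names are mine; a real image needs the LBP free energy instead of enumeration):

```python
import itertools
import numpy as np

def log_likelihood(f_star, p, alpha):
    """ln P(f* | p, alpha) for the tiny-lattice prior, with ln Z by enumeration."""
    h, w = f_star.shape
    q = int(f_star.max()) + 1
    edges = ([((r, c), (r + 1, c)) for r in range(h - 1) for c in range(w)]
             + [((r, c), (r, c + 1)) for r in range(h) for c in range(w - 1)])
    energy = lambda f: sum(abs(float(f[a]) - float(f[b])) ** p for a, b in edges)
    logZ = np.log(sum(np.exp(-0.5 * alpha * energy(np.reshape(z, (h, w))))
                      for z in itertools.product(range(q), repeat=h * w)))
    return -0.5 * alpha * energy(f_star) - logZ

f_star = np.array([[0, 1], [1, 2]])            # toy "original image", q = 3
p_grid = np.linspace(0.5, 2.0, 7)
a_grid = np.linspace(0.1, 5.0, 25)
score = {p: max(log_likelihood(f_star, p, a) for a in a_grid) for p in p_grid}
p_hat = max(score, key=score.get)   # p* = argmax_p max_alpha ln P(f* | p, alpha)
```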

Outline

1. Introduction
2. Bayesian Image Modeling by Generalized Sparse Prior
3. Noise Reduction by Generalized Sparse Prior
4. Extensions to Segmentation
5. Concluding Remarks

Degradation Process in Bayesian Image Modeling

Assumption: the degraded image is generated from the original image by additive white Gaussian noise with mean 0 and variance σ²:

\Pr\{G = g \mid F = f, \sigma\} = \prod_{i \in V} \frac{1}{\sqrt{2\pi\sigma^2}} \exp\Bigl( -\frac{(g_i - f_i)^2}{2\sigma^2} \Bigr).
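
This channel is one line to simulate; a sketch (the helper name and fixed seed are mine):

```python
import numpy as np

def degrade(f, sigma, seed=0):
    """g_i = f_i + n_i with independent n_i ~ N(0, sigma^2)
    (additive white Gaussian noise on every pixel)."""
    rng = np.random.default_rng(seed)
    return f + rng.normal(0.0, sigma, size=f.shape)
```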

Posterior Probability and Conditional Maximization of Entropy in Bayesian Image Modeling

The posterior is also obtained by conditional maximization of entropy:

P(f) = \arg\max_{P} \Bigl[ -\sum_{z} P(z) \ln P(z) \Bigr] \quad \text{subject to} \quad \sum_{z} P(z) = 1, \qquad \frac{1}{|V|} \sum_{i \in V} \sum_{z} (z_i - g_i)^2 P(z) = \sigma^2, \qquad \frac{1}{|E|} \sum_{\{i,j\} \in E} \sum_{z} |z_i - z_j|^p P(z) = u.

The solution is the Gibbs form

\Pr\{F = f \mid G = g, u, p, \sigma^2\} = \frac{1}{Z(p, B, C, g)} \exp\Bigl( -\frac{B}{2} \sum_{i \in V} (f_i - g_i)^2 - \frac{C}{2} \sum_{\{i,j\} \in E} |f_i - f_j|^p \Bigr),

where the Lagrange multipliers B = B(u, p, σ²) and C = C(u, p, σ²) play the roles of 1/σ² and α(u, p), consistently with the Bayes formula Posterior ∝ Degradation Process × Prior.

Posterior Probability and Conditional Maximization of Entropy: Deterministic Equations

With the posterior P(f | p, B, C, g) above and the corresponding prior P(f | p, C) ∝ exp( −(C/2) Σ_{{i,j}∈E} |f_i − f_j|^p ), the multipliers B and C and the moment u satisfy the deterministic equations

\frac{1}{|E|} \sum_{\{i,j\} \in E} \langle |f_i - f_j|^p \rangle_{P(f \mid p, B, C, g)} = \frac{1}{|E|} \sum_{\{i,j\} \in E} \langle |f_i - f_j|^p \rangle_{P(f \mid p, C)} = u,

\frac{1}{|V|} \sum_{i \in V} \langle (f_i - g_i)^2 \rangle_{P(f \mid p, B, C, g)} = \frac{1}{B}.

These equations are solved for B, C and u.

Noise Reduction Procedure Based on LBP and Conditional Maximization of Entropy

Input the exponent p and the data g, then iterate:

1. Repeat until C and the messages M converge: compute the pair marginals P(f_i, f_j | p, C) ∝ exp( −(C/2)|f_i − f_j|^p ) Π_{k∈∂i∖{j}} M_{k→i}(f_i) Π_{k∈∂j∖{i}} M_{k→j}(f_j) by LBP, and update C so that (1/|E|) Σ_{{i,j}∈E} ⟨|f_i − f_j|^p⟩_{P(f|p,C)} = u.

2. Repeat until u, B and the messages converge: compute the single-site and pair marginals P(f_i | p, B, C, g) and P(f_i, f_j | p, B, C, g) of the posterior by LBP, and update u and B from (1/|E|) Σ_{{i,j}∈E} ⟨|f_i − f_j|^p⟩ = u and (1/|V|) Σ_{i∈V} ⟨(f_i − g_i)²⟩ = 1/B.

3. Restore the image pixelwise from the converged single-site marginals,

\hat{f}_i = \arg\max_{f_i} P(f_i \mid p, \hat{B}, \hat{C}, g).

The marginals and free energies from LBP also give the marginal likelihood of p; schematically,

\ln \Pr\{G = g \mid p, \hat{u}, \hat{\sigma}^2\} = \ln Z(p, \hat{B}, \hat{C}, g) - \ln Z(\alpha(\hat{u}, p), p) - \frac{|V|}{2} \ln(2\pi\hat{\sigma}^2), \qquad \hat{p} = \arg\max_p \ln \Pr\{G = g \mid p, \hat{u}, \hat{\sigma}^2\}.
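
The talk computes these posterior marginals with LBP; as a simpler, plainly substituted illustration of the same posterior energy, here is an iterated-conditional-modes (ICM) denoiser, with B and C treated as given rather than solved for (all names are mine):

```python
import numpy as np

def icm_denoise(g, p, q, B, C, sweeps=20):
    """Greedy coordinate descent on the posterior energy
       E(f) = (B/2) Σ_i (f_i - g_i)^2 + (C/2) Σ_{<ij>} |f_i - f_j|^p,
    an ICM stand-in for the talk's LBP-based procedure."""
    h, w = g.shape
    f = np.clip(np.rint(g), 0, q - 1).astype(int)   # start from the data
    labels = np.arange(q)
    for _ in range(sweeps):
        changed = False
        for r in range(h):
            for c in range(w):
                local = 0.5 * B * (labels - g[r, c]) ** 2
                for r2, c2 in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
                    if 0 <= r2 < h and 0 <= c2 < w:
                        local = local + 0.5 * C * np.abs(labels - f[r2, c2]) ** p
                best = int(np.argmin(local))
                changed |= best != f[r, c]
                f[r, c] = best
        if not changed:
            break
    return f
```

With B = 1/σ̂² and C = Ĉ from the fixed point above, `icm_denoise(g, p, q, B, C)` returns a labelling at a local optimum of the energy, rather than the maximizer of the posterior marginals that the talk's procedure produces.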

Noise Reduction by Generalized Sparse Priors and Loopy Belief Propagation

[Figure: an original image, its degraded version, and restored images for p = 0.2, p = 0.5 and p = 1.]

[Figure: the same comparison (original, degraded, and restorations with p = 0.2, 0.5 and 1) for a second test image.]

Outline

1. Introduction
2. Bayesian Image Modeling by Generalized Sparse Prior
3. Noise Reduction by Generalized Sparse Prior
4. Extensions to Segmentation
5. Concluding Remarks

Segmentation Based on Bayesian Image Modeling by the Gaussian Mixture Model

Here g is the observed (original) image and f is the segmented label image. Labels and intensities are modeled jointly by

P(f, g \mid \mu, \sigma, \gamma) = \prod_{i \in V} \frac{\gamma_{f_i}}{\sqrt{2\pi\sigma_{f_i}^2}} \exp\Bigl( -\frac{(g_i - \mu_{f_i})^2}{2\sigma_{f_i}^2} \Bigr),

with means μ = (μ_0, μ_1, …, μ_{q−1}), standard deviations σ = (σ_0, σ_1, …, σ_{q−1}) and mixing weights γ = (γ_0, γ_1, …, γ_{q−1}). The parameters are estimated by maximum likelihood,

(\hat{\mu}, \hat{\sigma}, \hat{\gamma}) = \arg\max_{\mu, \sigma, \gamma} P(g \mid \mu, \sigma, \gamma), \qquad P(g \mid \mu, \sigma, \gamma) = \prod_{i \in V} \sum_{l=0}^{q-1} \frac{\gamma_l}{\sqrt{2\pi\sigma_l^2}} \exp\Bigl( -\frac{(g_i - \mu_l)^2}{2\sigma_l^2} \Bigr),

and each pixel is then assigned the label that maximizes its posterior probability given \hat{\mu}, \hat{\sigma}, \hat{\gamma}.
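
A self-contained EM sketch for exactly this pixelwise mixture fit (the numerical guards, seeding and names are mine):

```python
import numpy as np

def gmm_em(g, q, iters=100, seed=0):
    """EM for a q-component 1-D Gaussian mixture on pixel intensities:
    fit (mu_l, sigma_l^2, gamma_l), then label each pixel with the
    component of highest responsibility."""
    x = g.ravel().astype(float)
    rng = np.random.default_rng(seed)
    mu = rng.choice(x, q)                      # initialise means from the data
    var = np.full(q, x.var() + 1e-6)
    gam = np.full(q, 1.0 / q)
    for _ in range(iters):
        # E-step: responsibilities r[n, l] ∝ gamma_l N(x_n | mu_l, var_l)
        logp = (np.log(gam) - 0.5 * np.log(2 * np.pi * var)
                - 0.5 * (x[:, None] - mu) ** 2 / var)
        logp -= logp.max(axis=1, keepdims=True)
        r = np.exp(logp)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: weighted means, variances, and mixing proportions
        nk = r.sum(axis=0) + 1e-12
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk + 1e-9
        gam = nk / nk.sum()
    f = np.argmax(r, axis=1).reshape(g.shape)  # segmented label image
    return f, mu, var, gam
```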

Extension of the Gaussian Mixture Model Based on the Generalized Sparse Prior in Bayesian Segmentation

The pixelwise Gaussian mixture is combined with the generalized sparse prior on the label image f:

P(f, g \mid p, C, \mu, \sigma) = \frac{1}{Z(p, C, \mu, \sigma)} \exp\Bigl( -\sum_{i \in V} \frac{(g_i - \mu_{f_i})^2}{2\sigma_{f_i}^2} - \frac{C}{2} \sum_{\{i,j\} \in E} |f_i - f_j|^p \Bigr),

with μ = (μ_0, μ_1, …, μ_{q−1})^T and σ = (σ_0, σ_1, …, σ_{q−1})^T. As in noise reduction, C and u satisfy the deterministic equations

\frac{1}{|E|} \sum_{\{i,j\} \in E} \langle |f_i - f_j|^p \rangle_{P(f \mid g, p, C, \mu, \sigma)} = \frac{1}{|E|} \sum_{\{i,j\} \in E} \langle |f_i - f_j|^p \rangle_{P(f \mid p, C)} = u,

where P(f | p, C) ∝ exp( −(C/2) Σ_{{i,j}∈E} |f_i − f_j|^p ); both expectations are computed by LBP.
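
And a matching hedged sketch for the combined model, again using ICM in place of the talk's LBP updates (names are mine; the C, u fixed point and the prior normalization are left out, with the parameters taken as given, e.g. from `gmm_em` above):

```python
import numpy as np

def icm_segment(g, mu, var, p, C, sweeps=10):
    """Greedy labelling for the sparse-prior segmentation model: minimise
       Σ_i (g_i - mu_{f_i})^2 / (2 var_{f_i}) + (C/2) Σ_{<ij>} |f_i - f_j|^p
    by iterated conditional modes."""
    h, w = g.shape
    q = len(mu)
    f = np.argmin((g[..., None] - mu) ** 2 / var, axis=-1)  # unary-only start
    labels = np.arange(q)
    for _ in range(sweeps):
        for r in range(h):
            for c in range(w):
                local = (g[r, c] - mu) ** 2 / (2 * var)
                for r2, c2 in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
                    if 0 <= r2 < h and 0 <= c2 < w:
                        local = local + 0.5 * C * np.abs(labels - f[r2, c2]) ** p
                f[r, c] = int(np.argmin(local))
    return f
```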

Segmentation by Generalized Sparse Prior and Loopy Belief Propagation

[Figure: an original image g and segmented images for q = 3, q = 4 and q = 5, each with p = 0.2.]

Outline

1. Introduction
2. Bayesian Image Modeling by Generalized Sparse Prior
3. Noise Reduction by Generalized Sparse Prior
4. Extensions to Segmentation
5. Concluding Remarks

Summary

A formulation of Bayesian image modeling for image processing by means of generalized sparse priors and loopy belief propagation has been proposed.

Our formulation is based on the conditional maximization of entropy under constraints.

Although first-order phase transitions often appear in our sparse priors, our algorithm works well even in such cases.

Numerical experimental results for noise reduction and segmentation have been shown.
