
Embedding Gestalt Laws in Markov Random Fields by Song-Chun Zhu


Page 1: Embedding Gestalt Laws in Markov Random Fields by Song-Chun Zhu

Embedding Gestalt Laws

in Markov Random Fields

by Song-Chun Zhu

Page 2: Embedding Gestalt Laws in Markov Random Fields by Song-Chun Zhu

Purpose of the Paper

Proposes functions to measure Gestalt features of shapes

Adapts the [Zhu, Wu, Mumford] FRAME method to shapes

Exhibits effect of MRF model obtained by putting these together.

Page 3: Embedding Gestalt Laws in Markov Random Fields by Song-Chun Zhu

Recall Gestalt Features (à la [Lowe] and others)

Collinearity

Cocircularity

Proximity

Parallelism

Symmetry

Continuity

Closure

Familiarity

Page 4: Embedding Gestalt Laws in Markov Random Fields by Song-Chun Zhu

FRAME [Zhu, Wu, Mumford]

Filters

Random fields

And

Maximum

Entropy

A general procedure for constructing MRF models

Page 5: Embedding Gestalt Laws in Markov Random Fields by Song-Chun Zhu

Three Main Parts

Data

Learn MRF models from data

Test generative power of learned model

Page 6: Embedding Gestalt Laws in Markov Random Fields by Song-Chun Zhu

Elements of Data

A set of images representative of the chosen application domain

An adequate collection of feature measures or filters

The (marginal) statistics of applying the feature measures or filters to the set of images

Page 7: Embedding Gestalt Laws in Markov Random Fields by Song-Chun Zhu

Data: Images

Zhu considers 22 animal shapes and their horizontal flips

The resulting histograms are symmetric

More data can be obtained

But are there other effects?

Page 8: Embedding Gestalt Laws in Markov Random Fields by Song-Chun Zhu

Sample Animate Images

Page 9: Embedding Gestalt Laws in Markov Random Fields by Song-Chun Zhu

Contour-based Feature Measures

Goal is to be generic

But generic shape features are hard to find

φ1 = κ(s), the curvature

κ(s) = 0 implies the linelets on either side of Γ(s) are collinear

φ2 = κ'(s), its derivative

κ'(s) = 0 implies three sequential linelets are cocircular

“Other contour-based shape filters can be defined in the same way”
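To make the two contour measures concrete, here is a minimal numerical sketch, assuming the shape is given as a closed polygonal contour of (x, y) points; the central-difference discretization and the small tolerance constants are illustrative choices, not the paper's scheme.

```python
import numpy as np

def cyclic_diff(v):
    # Central difference with periodic wrap-around, appropriate for a closed contour.
    return (np.roll(v, -1) - np.roll(v, 1)) / 2.0

def curvature_features(contour):
    """phi1 = curvature kappa(s) and phi2 = its arc-length derivative,
    approximated on a closed polygonal contour given as an (N, 2) array
    of (x, y) points."""
    x, y = contour[:, 0], contour[:, 1]
    dx, dy = cyclic_diff(x), cyclic_diff(y)
    ddx, ddy = cyclic_diff(dx), cyclic_diff(dy)
    speed = np.sqrt(dx**2 + dy**2)
    # Curvature of a planar parametric curve: (x'y'' - y'x'') / (x'^2 + y'^2)^(3/2)
    kappa = (dx * ddy - dy * ddx) / np.maximum(speed**3, 1e-12)
    # phi2: derivative of curvature with respect to arc length
    dkappa = cyclic_diff(kappa) / np.maximum(speed, 1e-12)
    return kappa, dkappa
```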

Page 10: Embedding Gestalt Laws in Markov Random Fields by Song-Chun Zhu

Zhu's Symmetry Function

ψ(s) pairs linelets across medial axes

It is defined and computed by minimizing an energy functional constructed so that

Paired linelets are as close, parallel and symmetric as possible, and

There are as few discontinuities as possible

Page 11: Embedding Gestalt Laws in Markov Random Fields by Song-Chun Zhu

Region-based Feature Measures

φ3(s) = dist(s, ψ(s))

Measures proximity of paired linelets across a region

φ4(s) = φ3'(s), the derivative

φ4(s) = 0 implies paired linelets are parallel

φ5(s) = φ4'(s) = φ3''(s)

φ5(s) = 0 implies paired linelets are symmetric
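A small sketch of the three region-based measures, assuming the symmetry pairing ψ has already been computed and is given as an integer index array (the energy minimization that produces ψ is not reproduced here); cyclic_diff plays the same role as in the curvature sketch above.

```python
import numpy as np

def cyclic_diff(v):
    # Central difference with periodic wrap-around along the contour.
    return (np.roll(v, -1) - np.roll(v, 1)) / 2.0

def region_features(contour, psi):
    """phi3, phi4, phi5 for a closed contour ((N, 2) array) and a symmetry
    pairing psi, taken here to be an integer array with psi[i] the index of
    the linelet paired with linelet i."""
    paired = contour[psi]
    phi3 = np.linalg.norm(contour - paired, axis=1)  # proximity of paired linelets
    phi4 = cyclic_diff(phi3)                         # ~0 where the pair is parallel
    phi5 = cyclic_diff(phi4)                         # ~0 where the pair is symmetric
    return phi3, phi4, phi5
```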

Page 12: Embedding Gestalt Laws in Markov Random Fields by Song-Chun Zhu

Another Possible Shape Feature

φ6(s) = 1 where ψ(s) is discontinuous

0 otherwise

Counts the number of “parts” a shape has

Can Gestalt “familiarity” be (statistically?) measured?

Page 13: Embedding Gestalt Laws in Markov Random Fields by Song-Chun Zhu

The Statistic

The histogram of feature φ over curve Γ is

H(z; φk, Γ) = ∫δ(z-φk(s)) ds

δ is the Dirac function: mass 1 at 0, and 0 otherwise

μ(z; φk) denotes the average over all images

Zhu claims μ is a close estimation of the marginal distribution of the “true distribution” over shape space, assuming the total number of linelets is small.
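In discrete form, H and μ reduce to a normalized histogram per shape and its average over the training set. A minimal sketch, assuming the feature values have already been evaluated along each contour; the bin edges used to quantize z are an implementation choice, not fixed by the slide.

```python
import numpy as np

def feature_histogram(values, bins):
    """Discrete stand-in for H(z; phi, Gamma): a normalized histogram of one
    feature's values along one contour."""
    h, _ = np.histogram(values, bins=bins)
    return h / max(h.sum(), 1)

def mu(per_shape_values, bins):
    """mu(z; phi): the feature histogram averaged over all training shapes."""
    return np.mean([feature_histogram(v, bins) for v in per_shape_values], axis=0)
```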

Page 14: Embedding Gestalt Laws in Markov Random Fields by Song-Chun Zhu

Statistical Observations

Histograms of φ1 at scales 0, 1, 2

Histograms of φ3, φ4, φ5

Computed on the 22 images and their flips

Page 15: Embedding Gestalt Laws in Markov Random Fields by Song-Chun Zhu

Construct a Model

Ω is the space of shapes

Φ is a finite subset of feature filters

We seek a probability distribution p on Ω

∫Ω p(Γ) dΓ = 1 (1)

That reproduces the observed statistics for all φ in Φ

∫Ω p(Γ) H(z; φ, Γ) dΓ = μ(z; φ) (2)

Page 16: Embedding Gestalt Laws in Markov Random Fields by Song-Chun Zhu

Construct a Model, 2

Idea: Choose the p with maximal entropy

Seems reasonable and fair, but is it really the best target/energy function?

Lagrange multipliers and calculus of variations lead to

p(Γ; Φ, Λ) = exp( –∑φ∈Φ ∫ λφ(z) H(z; φ, Γ) dz ) / Z, where Z is the usual normalizing factor

Λ = { λφ | φ ∈ Φ }
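With the potentials λφ discretized over the same bins as the histograms, evaluating the model up to Z reduces to inner products. A minimal sketch of the energy U(Γ), with p(Γ) ∝ exp(–U(Γ)); the partition function Z is never computed explicitly.

```python
import numpy as np

def gibbs_energy(histograms, lambdas):
    """U(Gamma) = sum over phi of the inner product <lambda_phi, H(.; phi, Gamma)>,
    where each lambda_phi is a vector over the same bins as its histogram."""
    return sum(float(np.dot(lam, h)) for lam, h in zip(lambdas, histograms))
```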

Page 17: Embedding Gestalt Laws in Markov Random Fields by Song-Chun Zhu

It's a Gibbs Distribution

In other words, it has the form of a Gibbs distribution, and therefore determines a Markov Random Field (MRF) model.

Page 18: Embedding Gestalt Laws in Markov Random Fields by Song-Chun Zhu

Markov Chain Monte Carlo

Too hard to compute λ's and p analytically

Idea: Sample Ω according to the distribution p, stochastically update Λ to update p, and repeat until p reproduces all μ(z; φ) for φ ∈ Φ

Monte Carlo because of random walk

Markov chain because of the iterative nature of the loop
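A sketch of one plausible form of that loop, assuming a routine (here called estimate_model_stats, a hypothetical stand-in for the MCMC sampler) that returns the model statistics μ' under the current Λ; the step size and iteration count are assumptions, not the paper's settings.

```python
def update_lambdas(lambdas, mu_obs, estimate_model_stats, lr=0.1, iters=100):
    """Stochastic-approximation update of the Lagrange multipliers.
    With p ~ exp(-sum lambda . H), moving each lambda in the direction
    (mu' - mu_obs) climbs the log-likelihood, driving mu' toward mu_obs."""
    for _ in range(iters):
        mu_model = estimate_model_stats(lambdas)   # list of mu'(z; phi), one per phi in Phi
        for k in range(len(lambdas)):
            lambdas[k] = lambdas[k] + lr * (mu_model[k] - mu_obs[k])
    return lambdas
```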

Page 19: Embedding Gestalt Laws in Markov Random Fields by Song-Chun Zhu

Markov Chain Monte Carlo, 2

From the sampling, produce μ'(z; φ)

Same as μ(z; φ), except based on a random sample of shape space

For the purposes of today's discussion, the details are not important

At convergence, for φ ∈ Φ

μ'(z; φ) = μ(z; φ)

Zhu et al. assume there exists a “true underlying distribution”

Page 20: Embedding Gestalt Laws in Markov Random Fields by Song-Chun Zhu

The Nonaccidental Statistic

For φ' not in the set Φ we expect

μ'(z; φ') ≠ μ(z; φ')

μ'(z; φ') is the accidental statistic for φ'

It is a measure of correlation between φ' and Φ

The “distance” (L1, L2, or other) between μ'(z; φ') and μ(z; φ') is the nonaccidental statistic for φ'

It is a measure of how much “additional information” φ' carries above what is already in Φ
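In code, the nonaccidental statistic is just a distance between two histograms; the sketch below uses L1, one of the choices the slide names, and L2 or others could be swapped in.

```python
import numpy as np

def nonaccidental_statistic(mu_model, mu_obs):
    """L1 distance between the accidental statistic mu'(z; phi') produced by
    the current model and the observed statistic mu(z; phi')."""
    return float(np.abs(np.asarray(mu_model) - np.asarray(mu_obs)).sum())
```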

Page 21: Embedding Gestalt Laws in Markov Random Fields by Song-Chun Zhu

The Algorithm (simplified)

Enter your set Γ = { γ } of shapes

Enter a (large) set { φ } of candidate feature measures

Compute the observed statistics μ(z; φ) over the training shapes for every candidate φ

Compute μ'(z; φ) relative to a uniform distribution on Ω

Until the nonaccidental statistic of all unused features is small enough, repeat:

Page 22: Embedding Gestalt Laws in Markov Random Fields by Song-Chun Zhu

Algorithm, 2

Of the remaining φ, add to Φ the one with maximal nonaccidental statistic

Update:

The set of Lagrange multipliers Λ = { λ }

Probability distribution model p(Φ, Λ)

The μ'(φ) for remaining candidate features φ
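Putting the two algorithm slides together, here is a sketch of the pursuit loop; fit_model (update Λ and p for the chosen features) and model_stats (sample the current model and return μ' for every candidate) are hypothetical stand-ins for the MCMC machinery sketched earlier, and nonaccidental_statistic is the function from the previous sketch.

```python
def pursue_features(candidates, mu_obs, fit_model, model_stats, threshold):
    """Greedy feature pursuit: repeatedly add the candidate feature whose
    statistics the current model explains worst, until every unused feature's
    nonaccidental statistic falls below the threshold."""
    chosen = []
    model = None                              # start from the uniform model on Omega
    mu_model = model_stats(model)             # dict: candidate -> mu'(z; phi)
    while True:
        gaps = {phi: nonaccidental_statistic(mu_model[phi], mu_obs[phi])
                for phi in candidates if phi not in chosen}
        best = max(gaps, key=gaps.get)
        if gaps[best] < threshold:            # remaining features carry little new information
            break
        chosen.append(best)
        model = fit_model(chosen, mu_obs)     # update Lambda and the distribution p
        mu_model = model_stats(model)         # refresh mu' for the remaining candidates
    return chosen, model
```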

Page 23: Embedding Gestalt Laws in Markov Random Fields by Song-Chun Zhu

Experiments and Discussion

Let my description of these experiments stimulate your thoughts on such issues as

Are there better Gestalt feature measures?

What is the best possible outcome of a generative model of shape?

What feature measures should be added to the Gestalt ones?

How useful were these experiments, and what others might be worth doing?

Page 24: Embedding Gestalt Laws in Markov Random Fields by Song-Chun Zhu

Experiment 1

When the only feature used is the curvature κ, the model generated shapes like these

Page 25: Embedding Gestalt Laws in Markov Random Fields by Song-Chun Zhu

Experiment 1, continued

A Gaussian model (with the same κ-variance) produced

Page 26: Embedding Gestalt Laws in Markov Random Fields by Song-Chun Zhu

Experiment 2

Experiment 2 uses both κ and κ'

The nonaccidental statistic of κ' with respect to the model based on κ can be seen here

Page 27: Embedding Gestalt Laws in Markov Random Fields by Song-Chun Zhu

Experiment 2, continued

This time the model generated these shapes, purported to be smoother and more scale invariant

Page 28: Embedding Gestalt Laws in Markov Random Fields by Song-Chun Zhu

Experiment 3

The nonaccidental statistics of the three region-based shape features relative to the model produced in Experiment 2

Page 29: Embedding Gestalt Laws in Markov Random Fields by Song-Chun Zhu

Experiment 3, continued

So r'' (i.e., φ5) was omitted; this model has

Φ = { κ, κ', r, r' }

Page 30: Embedding Gestalt Laws in Markov Random Fields by Song-Chun Zhu

Experiment 3, continued

This model produced such shapes as

Page 31: Embedding Gestalt Laws in Markov Random Fields by Song-Chun Zhu

Concluding Discussion

Zhu acknowledges that the selection of training shapes might introduce a bias; but

Page 32: Embedding Gestalt Laws in Markov Random Fields by Song-Chun Zhu

Discussion, continued

Zhu acknowledges that the paucity of Gestalt features limits the possible neighborhood structures used to define an MRF.

Zhu acknowledges that these models do not account for high-level shape properties, and suggests that a composition system might address this problem.

Page 33: Embedding Gestalt Laws in Markov Random Fields by Song-Chun Zhu

Questions and Comments

Although it is in the nature of an MRF model to propagate local properties, I think there needs to be a higher-level basis (than linelets) for measuring the Gestalt features of a shape!

Are there better Gestalt feature measures?

What feature measures should be added to the Gestalt ones?

Page 34: Embedding Gestalt Laws in Markov Random Fields by Song-Chun Zhu

More Questions for Discussion

What is the best possible outcome of a generative model of shape? Is such a thing worth pursuing?

How useful were Zhu's experiments, and what others might be worth doing?