Chapter 12 Object Recognition


12.1 Patterns and pattern classes

• Definition of a pattern class: a family of patterns that share some common properties
• Feature and descriptor
• Pattern arrangements used in practice: vectors, strings, and trees
• Pattern vectors:
– The nature of a pattern vector x depends on the approach used to describe the physical pattern itself
– Example: discriminant analysis of iris flowers
– The classic feature selection problem: the degree of class separability depends on the choice of descriptors selected for the application

• A pattern vector is written x = (x1, x2, …, xn)^T, where each component xi is the i-th descriptor and n is the number of descriptors associated with the pattern
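A minimal sketch of how such pattern vectors might be assembled, assuming iris-style petal measurements (length and width) are the chosen descriptors; the measurement values below are illustrative, not real data.

```python
import numpy as np

# Minimal sketch: building pattern vectors from two descriptors
# (petal length and petal width, as in the classic iris example).
# The numbers below are illustrative, not real measurements.
iris_setosa_measurements = [(1.4, 0.2), (1.3, 0.2), (1.5, 0.3)]

# Each pattern vector x = (x1, x2)^T stacks the chosen descriptors.
patterns = np.array(iris_setosa_measurements)   # shape (N, 2), one row per pattern
print(patterns[0])                              # first pattern vector, e.g. [1.4 0.2]
```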


• Noisy object and its corresponding signature
– Select the descriptors on which to base each component of a pattern vector
• Pattern characteristics may instead be described by structural properties
– For example, fingerprint recognition is based on minutiae
– Pattern classes are based on quantitative information such as size and location
– Features are based on spatial relationships: abrupt endings, branching, merging, and disconnected segments
• String pattern example: a staircase pattern
– This pattern could be sampled and expressed in terms of a pattern vector, but the basic structure would be lost in that method of description
– Solution: define the elements a and b and let the pattern be the string of symbols w = …ababab… (see the sketch after this list)
• String descriptions generate patterns of objects and other entities whose structure is based on the relatively simple connectivity of primitives, usually associated with boundary shape
• Hierarchical ordering leads to tree structures
– For example, a satellite image described by the structural relationship "composed of"
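As a toy illustration of the string idea, the sketch below checks whether a boundary that has already been reduced to the primitives a and b forms the repeating staircase string w = …ababab…; the primitive-extraction step and the function name are assumptions made for illustration.

```python
import re

# Minimal sketch: structural recognition of a "staircase" boundary that has
# already been reduced to a string of primitives a (horizontal step) and
# b (vertical step). Primitive extraction itself is assumed done elsewhere.
def is_staircase(w: str) -> bool:
    # Accept strings of the form abab...ab (at least one "ab" pair).
    return re.fullmatch(r"(ab)+", w) is not None

print(is_staircase("abababab"))  # True
print(is_staircase("abba"))      # False
```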


12.2 Recognition based on decision-theoretic methods

• Decision functions: a pattern x is said to belong to class wi if
– di(x) > dj(x) for j = 1, 2, …, W; j ≠ i
• Decision boundary separating class wi from class wj:
– di(x) − dj(x) = 0

12.2.1 Matching

• Represent each class by a prototype pattern vector

Minimum distance classifier

• Define the prototype of each class to be the mean vector of the patterns of that class:
m_j = (1/N_j) Σ_{x ∈ w_j} x,  j = 1, 2, …, W
where N_j is the number of pattern vectors from class w_j

• Determine the class membership of an unknown pattern vector x by assigning it to the class of its closest prototype
– Use the Euclidean distance to determine closeness, computing the distance measures
D_j(x) = ‖x − m_j‖,  j = 1, 2, …, W
– Assign x to class wi if Di(x) is the smallest distance: the best match
– Selecting the smallest distance is equivalent to evaluating the functions
d_j(x) = x^T m_j − (1/2) m_j^T m_j,  j = 1, 2, …, W
and assigning x to class wi if di(x) yields the largest numerical value
– The decision boundary between class wi and class wj for a minimum distance classifier is
d_ij(x) = d_i(x) − d_j(x) = x^T (m_i − m_j) − (1/2)(m_i − m_j)^T (m_i + m_j) = 0
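A minimal NumPy sketch of the minimum distance classifier described above; the prototypes are simply the class mean vectors, and the data and function names are illustrative assumptions.

```python
import numpy as np

def fit_prototypes(samples_by_class):
    """Prototype of each class = mean vector of its training patterns."""
    return [np.mean(np.asarray(s), axis=0) for s in samples_by_class]

def classify_min_distance(x, prototypes):
    """Assign x to the class whose linear decision function
    d_j(x) = x^T m_j - 0.5 m_j^T m_j is largest, which is equivalent to
    choosing the smallest Euclidean distance ||x - m_j||."""
    x = np.asarray(x, dtype=float)
    scores = [x @ m - 0.5 * m @ m for m in prototypes]
    return int(np.argmax(scores))

# Illustrative two-class, two-feature data (not from the book).
class_0 = [(1.4, 0.2), (1.3, 0.2), (1.5, 0.3)]
class_1 = [(4.5, 1.5), (4.7, 1.4), (4.9, 1.6)]
prototypes = fit_prototypes([class_0, class_1])
print(classify_min_distance((1.6, 0.25), prototypes))  # -> 0
```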

• Matching by correlation
– The correlation between f(x, y) and w(x, y) is
c(x, y) = Σ_s Σ_t f(s, t) w(x + s, y + t)
– Disadvantage: sensitive to changes in the amplitude of f and w
– Solution: perform the matching via the correlation coefficient, which normalizes f and w to zero mean over the overlap region
– Obtaining normalization for changes in size and rotation can be difficult and adds significant computation
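A minimal sketch of the correlation-coefficient computation at a single displacement, assuming the template lies entirely inside the image; size and rotation normalization are not handled, and the names and values are illustrative.

```python
import numpy as np

def correlation_coefficient(f, w, x, y):
    """Correlation coefficient between template w and the region of image f
    whose top-left corner is at (x, y). Both are made zero-mean and scaled,
    so the result is insensitive to amplitude changes (range [-1, 1])."""
    h, k = w.shape
    region = f[x:x + h, y:y + k].astype(float)
    t = w.astype(float) - w.mean()
    r = region - region.mean()
    denom = np.sqrt((r * r).sum() * (t * t).sum())
    return (r * t).sum() / denom if denom > 0 else 0.0

# Toy example: the template matches the image region exactly -> coefficient 1.0
f = np.array([[0, 0, 0, 0],
              [0, 1, 2, 0],
              [0, 3, 4, 0],
              [0, 0, 0, 0]])
w = np.array([[1, 2],
              [3, 4]])
print(correlation_coefficient(f, w, 1, 1))  # ~1.0
```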

12.2.2 Optimum statistical classifiers

• A probabilistic approach to recognition
• Important because of the randomness with which pattern classes normally are generated


• Foundation
– p(wi/x): the probability that a particular pattern x comes from class wi
– The average loss incurred in assigning x to class wj is
r_j(x) = Σ_{k=1}^{W} L_kj p(w_k/x)
where L_kj is the loss for assigning x to class wj when it actually comes from class wk
– Using p(A/B) = [p(A) p(B/A)] / p(B), rewrite the average loss as
r_j(x) = (1/p(x)) Σ_{k=1}^{W} L_kj p(x/w_k) P(w_k)
– Dropping 1/p(x), which is positive and common to all the r_j(x), the average loss reduces to
r_j(x) = Σ_{k=1}^{W} L_kj p(x/w_k) P(w_k)
– The Bayes classifier: the classifier that minimizes the total average loss
• Assign an unknown pattern x to class wi if ri(x) < rj(x) for all j ≠ i, that is, if
Σ_{k=1}^{W} L_ki p(x/w_k) P(w_k) < Σ_{q=1}^{W} L_qj p(x/w_q) P(w_q)
– With a loss of unity for incorrect decisions and a loss of zero for correct decisions (L_kj = 1 − δ_kj),
r_j(x) = Σ_{k=1}^{W} (1 − δ_kj) p(x/w_k) P(w_k) = p(x) − p(x/w_j) P(w_j)
– A pattern vector x is then assigned to the class whose decision function
d_j(x) = p(x/w_j) P(w_j),  j = 1, 2, …, W
yields the largest numerical value
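A minimal numeric sketch of these relations: it evaluates the conditional risks r_j(x) from an assumed loss matrix, likelihoods, and priors, and checks that with the 0-1 loss, picking the class with the smallest risk agrees with picking the largest d_j(x) = p(x/w_j) P(w_j); all numbers are made up for illustration.

```python
import numpy as np

# Illustrative two-class example (numbers are made up).
likelihoods = np.array([0.30, 0.05])   # p(x / w_k) at the observed x, k = 1, 2
priors      = np.array([0.40, 0.60])   # P(w_k)
L = np.array([[0.0, 1.0],              # L[k, j] = loss for deciding w_j when the
              [1.0, 0.0]])             # true class is w_k (0-1 loss here)

# Conditional average risk r_j(x) = sum_k L[k, j] * p(x/w_k) * P(w_k)
risks = L.T @ (likelihoods * priors)
# Bayes decision functions for the 0-1 loss: d_j(x) = p(x/w_j) P(w_j)
d = likelihoods * priors

print(risks)                             # conditional risks; smallest for class index 0
print(np.argmin(risks), np.argmax(d))    # 0 0 -> both rules select the same class
```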

• Bayes classifier for Gaussian pattern classes
– The Bayes decision functions have the form
d_j(x) = p(x/w_j) P(w_j),  j = 1, 2, …, W
– In the one-dimensional case, the Gaussian density of the patterns in the j-th pattern class has the form
p(x/w_j) = (1/(√(2π) σ_j)) exp(−(x − m_j)² / (2σ_j²))
– In the n-dimensional case, the mean vector and covariance matrix of each class are estimated from its training patterns (Eqs. 12.2-22, 12.2-23)
– If all the covariance matrices are equal, C_j = C for j = 1, 2, …, W, the decision functions become linear (hyperplanes) (Eq. 12.2-27)
– If, in addition, C = I and P(w_j) = 1/W for j = 1, 2, …, W, then d_j(x) reduces to the minimum distance classifier (Eq. 12.2-28)
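A minimal sketch of the n-dimensional Gaussian Bayes classifier, evaluated in logarithmic form for numerical convenience; the class parameters below are illustrative assumptions, not values from the chapter.

```python
import numpy as np

def gaussian_bayes_decision(x, means, covs, priors):
    """Evaluate the log form of the Gaussian Bayes decision functions
    d_j(x) = ln P(w_j) - 0.5 ln|C_j| - 0.5 (x - m_j)^T C_j^{-1} (x - m_j)
    and return the index of the largest one together with all scores."""
    x = np.asarray(x, dtype=float)
    scores = []
    for m, C, P in zip(means, covs, priors):
        diff = x - m
        mahal = diff @ np.linalg.solve(C, diff)   # (x - m)^T C^{-1} (x - m)
        scores.append(np.log(P) - 0.5 * np.log(np.linalg.det(C)) - 0.5 * mahal)
    return int(np.argmax(scores)), scores

# Illustrative parameters for two 2-D Gaussian classes (not from the book).
means  = [np.array([0.0, 0.0]), np.array([3.0, 3.0])]
covs   = [np.eye(2), np.eye(2)]
priors = [0.5, 0.5]
label, _ = gaussian_bayes_decision([0.5, 0.4], means, covs, priors)
print(label)  # -> 0
```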

