Image Features
Last update 2015.04.12
Heejune Ahn, SeoulTech
Outline
- Goal of feature extraction and representation
- External features
  - Shape vectors
  - Single-parameter shape descriptors
  - Signatures and radial Fourier expansion
  - Statistical moments of a region
- Internal features
  - Statistical measures of texture
- Principal Component Analysis
0. Feature extraction
Purpose of feature extraction
Feature types (representation):
- external: boundaries or shapes of objects
- internal: texture (variation of intensity/colors)
- statistical: e.g. PCA
Questions to consider: which features to extract, and how to extract them?
[Diagram: image (N x M, big data) -> image processing -> concise features (small number: points, boundaries, texture, etc.) -> recognition, classification]
1. Landmarks and shape vectors
Landmarks
Desired properties: a small number of points (not all pixels on the boundary lines); 'robust' landmarks (appear similar to different observers)
Types of landmarks:
- Mathematical (black): extremes of gradient or curvature, e.g. Harris corner points
- Anatomical/true (red): identified by experts
- Pseudo landmarks (green): e.g. points spaced along the boundary between other landmarks
Shape vectors: the sequence of landmark points, stacked as x = (x1, y1, ..., xn, yn)
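A minimal MATLAB sketch of extracting mathematical landmarks with the Image Processing Toolbox corner function; the file name 'shape.png' and the choice of 20 points are illustrative assumptions, not from the slides:

```matlab
% Mathematical landmarks: Harris corner points (extremes of gradient/curvature).
% 'corner' requires the Image Processing Toolbox; 'shape.png' is a placeholder.
I = im2gray(imread('shape.png'));
pts = corner(I, 'Harris', 20);          % up to 20 strongest corners, rows are [x y]

% Stack the landmarks into a shape vector x = (x1, y1, ..., xn, yn)'
shapeVec = reshape(pts', [], 1);

imshow(I); hold on;
plot(pts(:,1), pts(:,2), 'r+');         % overlay the landmarks
```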
2. Shape descriptors: single parameters
Simple and concise values for classification
Verbal expressions: long and curved, roughly square, thin, etc.
Ex 9.1 & Fig 9.2, MATLAB: regionprops(labeledImage, 'Area', 'Perimeter', ...); see the code
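A hedged sketch in the spirit of Ex 9.1; the file name 'objects.png' is a placeholder, and circularity (4*pi*Area/Perimeter^2) is one common single-parameter descriptor added here for illustration:

```matlab
% Single-parameter shape descriptors per region (cf. Ex 9.1 / Fig 9.2).
bw = imread('objects.png') > 0;          % placeholder binary image
L  = bwlabel(bw);                        % label connected regions
stats = regionprops(L, 'Area', 'Perimeter', 'Eccentricity');

% Circularity = 4*pi*Area/Perimeter^2: 1 for a circle, smaller for thin shapes
for k = 1:numel(stats)
    c = 4*pi*stats(k).Area / stats(k).Perimeter^2;
    fprintf('region %d: area %d, circularity %.2f\n', k, stats(k).Area, c);
end
```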
3. Signatures and radial Fourier expansion
Definition: the boundary, a set of points in 2-D, is reduced to a 1-D function r(θ), the distance from the centroid to the boundary as a function of angle; r(θ) is periodic with period 2π.
Radial Fourier expansion: r(θ) = a0/2 + Σk (ak cos kθ + bk sin kθ), keeping only a limited number of coefficients.
Strength: robustness to transforms
- Translation invariant: r is calculated from the centroid
- Scale: a simple constant multiple of the coefficients
- Rotation: a phase shift in the coefficients
Weakness: cannot handle shapes where r(θ) is not single-valued
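A sketch of computing the radial signature and a limited number of Fourier coefficients, assuming a star-shaped region so that r(θ) is single-valued from the centroid; the file name and grid size are placeholders:

```matlab
% Radial signature r(theta) and its Fourier coefficients.
% Assumes a star-shaped region so r(theta) is single-valued from the centroid.
bw = imread('shape.png') > 0;            % placeholder binary image
B  = bwboundaries(bw);  b = B{1};        % boundary points, rows are [row col]
c  = mean(b, 1);                         % centroid of the boundary

theta = atan2(b(:,1) - c(1), b(:,2) - c(2));   % angle of each boundary point
r     = hypot(b(:,1) - c(1), b(:,2) - c(2));   % distance from the centroid

% Resample onto a uniform angle grid, then keep a few coefficients
[theta, iu] = unique(theta);  r = r(iu);
tg = linspace(-pi, pi, 257);  tg(end) = [];
rg = interp1(theta, r, tg, 'linear', 'extrap');
coefs = fft(rg) / numel(rg);
descriptor = abs(coefs(1:8));            % magnitudes: rotation appears only as phase
```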
4. Statistical moments as region descriptors
Definitions (for an image region I(x, y)):
- moment: m_pq = Σx Σy x^p y^q I(x, y)
- central moment: μ_pq = Σx Σy (x - x̄)^p (y - ȳ)^q I(x, y)  (translation invariant)
- normalized moment: η_pq = μ_pq / μ00^γ, γ = (p + q)/2 + 1  (translation and scale invariant)
- Hu moments: combinations of normalized moments that are also rotation invariant, e.g.
  H1 = η20 + η02,  H2 = (η20 - η02)^2 + 4 η11^2
Ex 9.3 & Fig 9.5: bwlabel the image, then compute m_pq, μ_pq, η_pq, and H1, H2 for each region
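A minimal sketch of H1 and H2 for a binary region, implementing the definitions above with anonymous functions; 'shape.png' is a placeholder:

```matlab
% First two Hu moments of a binary region, from the definitions above.
bw = imread('shape.png') > 0;            % placeholder binary image
[y, x] = find(bw);                       % coordinates of the region's pixels

mu  = @(p, q) sum((x - mean(x)).^p .* (y - mean(y)).^q);   % central moment
eta = @(p, q) mu(p, q) / mu(0, 0)^((p + q)/2 + 1);         % normalized moment

H1 = eta(2,0) + eta(0,2);                % invariant to translation, scale, rotation
H2 = (eta(2,0) - eta(0,2))^2 + 4*eta(1,1)^2;
```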
5. Texture features
Texture: fluctuations of intensity among neighboring pixels
Statistical texture features, computed over a local neighborhood:
- Range = max - min
- Variance = <I^2> - <I>^2
- Entropy, from information theory: -Σ p_i log2 p_i
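These three measures map directly onto Image Processing Toolbox sliding-window filters; a minimal sketch ('texture.png' is a placeholder):

```matlab
% Local range, standard deviation, and entropy as texture feature maps.
I = im2gray(imread('texture.png'));      % placeholder image

R = rangefilt(I);        % local max - min (3x3 neighborhood by default)
S = stdfilt(I);          % local standard deviation, i.e. sqrt of the variance
E = entropyfilt(I);      % local entropy -sum(p_i * log2 p_i) (9x9 by default)

montage({mat2gray(R), mat2gray(S), mat2gray(E)});   % compare the feature maps
```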
6. Principal component analysis
Concept: an expert can explain in 10 words what a layperson needs 100 words for; PCA finds such a compact description of the data.
[Diagram: huge-size features -> eigenvalue analysis -> principal components & dimension reduction -> classification / synthesis]
Example: a 100x100 binary image has 2^10000 possible cases, but only a tiny fraction are faces (e.g. 10k faces). A face is determined by a few attributes (eye colors and shapes, face shapes, skin colors); e.g. about 1000 unique faces combined with weighting can cover them. Not every combination of pixels is meaningful for human faces. Why? Correlation between pixels.
7. An illustrative PCA example
Categorizing people with few parameters: Korean medicine uses 4 types of human body; the MBTI test in psychology categorizes 16 types.
Physical measures: {age, height, weight, waist size, arm length, neck circumference, finger length} => {age, 0.7*height + 0.3*weight}
Algorithm for PCA (see the sketch below):
1. Calculate principal axis i
2. Calculate the residuals
3. Repeat from step 1 while i <= N and the error > eTarget
Note: the new coordinates do not have physical meanings.
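A minimal sketch of the iterative algorithm above, using deflation of the covariance matrix; the random data and the choice of three axes are placeholder assumptions:

```matlab
% Iterative PCA by deflation: find an axis, remove its variance, repeat.
X = randn(200, 7);                  % placeholder: 200 samples of 7 measures
X = X - mean(X, 1);                 % center the data
R = X' * X / (size(X,1) - 1);       % covariance matrix

nAxes = 3;
W = zeros(size(R,1), nAxes);
for i = 1:nAxes
    [V, D] = eig(R);
    [~, k] = max(diag(D));
    W(:, i) = V(:, k);              % principal axis i: largest remaining variance
    R = R - D(k,k) * V(:,k) * V(:,k)';   % residual covariance (deflation)
end
Y = X * W;                          % new coordinates (no physical meaning)
```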
8. Theory of PCA
Goal and problem: given N (>= M) samples of M-dimensional vectors x = <x1, x2, ..., xM>, we want to find principal axes W and the transform y = W'(x - x̄) such that the components of y are uncorrelated, i.e. the covariance matrix of y is diagonal.
Solution: this is the typical eigenvector/eigenvalue problem; if R is the covariance matrix of x, with eigenvector matrix W and eigenvalue matrix Λ, then W' R W = Λ (diagonal).
Explanation: the definition of an eigenvector/eigenvalue of a matrix R is R w = λ w; writing this for all eigenvectors at once in matrix form gives R W = W Λ.
Note: we can choose the largest components only.
9. PCA calculation procedure
Do Ex 9.5, Fig. 9.6. MATLAB: [U,S,V] = svd(X) returns a diagonal matrix S of the same dimension as X, with nonnegative diagonal elements in decreasing order, and unitary matrices U and V, so that X = U*S*V'.
Question: for eigenfaces, M = W x H is large. (How to avoid the M x M covariance matrix is addressed in section 13.)
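A minimal sketch of the SVD-based procedure on centered data; the random matrix is a placeholder:

```matlab
% PCA via SVD on centered data (rows = observations, columns = variables).
X  = randn(100, 5);                 % placeholder data
Xc = X - mean(X, 1);                % center each variable

[U, S, V] = svd(Xc, 'econ');        % Xc = U*S*V', singular values decreasing
pcAxes    = V;                      % principal axes as columns
scores    = Xc * V;                 % principal components (projections)
variances = diag(S).^2 / (size(X,1) - 1);   % variance along each axis
```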
10. Principal axes and components
Dimensions: M = number of variables, N = number of observations
Principal components: the projection of the data onto the principal axes
11. PCA key properties
- Principal axes are mutually orthogonal.
- Data in the new axes are uncorrelated.
- Successive axes add maximal remaining variance.
- The eigenvalues (the diagonal of Λ) are the variances along the axes.
Ex 9.6 & Fig. 9.11: the data set is the pixel offsets from the centroid in binary images; PCA yields the shape's two key directions (see the sketch below).
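A sketch of Ex 9.6-style direction finding, assuming a binary image 'shape.png':

```matlab
% Two key directions of a binary shape from PCA of pixel offsets.
bw = imread('shape.png') > 0;       % placeholder binary image
[y, x] = find(bw);
D = [x y] - mean([x y], 1);         % pixel offsets from the centroid

[V, L] = eig(cov(D));               % 2x2 covariance of the offsets
[~, k] = max(diag(L));
mainAxis = V(:, k);                 % direction of maximal variance
```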
12. PCA dimensionality reduction
Sort the components in order of variance and discard those with small variance; this is the main purpose of PCA (see the sketch below the table).
E.g. keep Y1 = 0.8*height + 0.2*weight + 0.1*waist and drop the rest.
Physical vars:   X1, X2, X3, ..., Xn   with VAR(X1) = 60%, VAR(X2) = 50%, VAR(X3) = 20%, ..., VAR(Xn) = 3%
Principal axes:  Y1, Y2, Y3, ..., Yn   with VAR(Y1) = 60%, VAR(Y2) = 20%, VAR(Y3) = 10%, ..., VAR(Yn) = 1%
(variance concentrates in the first few principal axes, so the trailing ones can be discarded)
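A sketch of discarding small-variance components, here keeping enough axes to cover 95% of the variance; the threshold and the data are placeholder assumptions:

```matlab
% Discard small-variance components: keep the smallest K covering 95%.
X  = randn(500, 10);                % placeholder data
Xc = X - mean(X, 1);
[~, S, V] = svd(Xc, 'econ');

varExplained = diag(S).^2 / sum(diag(S).^2);  % variance fraction per axis
K = find(cumsum(varExplained) >= 0.95, 1);    % smallest K reaching 95%
Y = Xc * V(:, 1:K);                 % reduced data: K instead of 10 columns
```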
13. PCA in image processing
Covariance calculation: over images vs. over elements (pixels).
M (# of pixels) is often >> N (# of images), so the image-domain covariance is used: Cov[image i, image j], an N x N matrix, instead of Cov[position i, position j], an M x M matrix. (If v is an eigenvector of the N x N matrix X'X, then Xv is an eigenvector of the M x M matrix XX' with the same eigenvalue; see the sketch below.)
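A sketch of this trick; the sizes are placeholders, and the lifting step w = Xv follows from X'Xv = λv implying (XX')(Xv) = λ(Xv):

```matlab
% Image-domain covariance trick for M >> N (columns of X are images).
X  = randn(10000, 20);              % placeholder: 20 images of 100x100 pixels
Xc = X - mean(X, 2);                % subtract the mean image

[V, D] = eig(Xc' * Xc);             % small N x N eigenproblem
W = Xc * V;                         % lift: columns are eigenvectors of Xc*Xc'
W = W ./ vecnorm(W);                % normalize (drop near-zero eigenvalues in practice)
```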
14. Out of sample
Training data vs. test data
- Training data: the data set used for model generation
- Test data (i.e. out-of-sample data): used to test the performance
Procedure: project a test sample onto the principal axes, y = W'(x - x̄), since W and x̄ were obtained using the training data only.
Accuracy vs. compactness: keeping more components improves accuracy at the cost of compactness.
15. Eigenface
Face library and face identification:
- N registered face images: K principal-axis coefficients <ak> stored for each face n
- Test face: its K coefficients <ak>
- Similarity measure: Euclidean distance between coefficient vectors (see the sketch below)
[Figures: sample of registered faces; top 6 principal components (eigenfaces)]
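A self-contained sketch of the identification step; all data here are random placeholders standing in for the face library:

```matlab
% Eigenface identification: nearest neighbor in K-dim coefficient space.
M = 10000; N = 50; K = 6;                      % placeholder sizes
faces    = rand(M, N);                         % registered faces as columns
testFace = faces(:, 7) + 0.01*randn(M, 1);     % noisy copy of face 7
meanFace = mean(faces, 2);

Xc = faces - meanFace;
[V, D] = eig(Xc' * Xc);                        % image-domain trick (section 13)
[~, ord] = sort(diag(D), 'descend');
W = Xc * V(:, ord(1:K));  W = W ./ vecnorm(W); % top-K principal axes

A     = W' * Xc;                               % K coefficients <ak> per face
aTest = W' * (testFace - meanFace);            % coefficients of the test face
[dist, best] = min(vecnorm(A - aTest));        % Euclidean similarity measure
fprintf('best match: face %d (distance %.3f)\n', best, dist);
```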