
Page 1: PCA Extension

PCA Extension

By Jonash

Page 2: PCA Extension

Outline

• Robust PCA

• Generalized PCA
  – Clustering points on a line
  – Clustering lines on a plane
  – Clustering hyperplanes in a space

Page 3: PCA Extension

Robust PCA

• Robust Principal Component Analysis for Computer Vision
  – Fernando De la Torre
  – Michael J. Black

• CS, Brown University

Page 4: PCA Extension

PCA is a Least-Squares Fit

Page 5: PCA Extension

PCA is a Least-Squares Fit

Page 6: PCA Extension

Robust Statistics

• Recover the best fit for the majority of the data

• Detect and reject outliers

Page 7: PCA Extension

Robust PCA

Page 8: PCA Extension

Robust PCA

Page 9: PCA Extension

Robust PCA

• Training images

Page 10: PCA Extension

Robust PCA

• Naïve PCA

• Simply reject

• Robust PCA

Page 11: PCA Extension

RPCA

• In traditional PCA, we minimize
  Σ_{i=0}^{n} ||d_i − B B^T d_i||^2 = Σ_{i=0}^{n} ||d_i − B c_i||^2

• EM PCA: the σ → 0 limit of the model D = B C + noise with covariance σ^2 I
  – E-step: C = (B^T B)^{-1} B^T D
  – M-step: B = D C^T (C C^T)^{-1}
  (a numerical sketch of this alternation follows below)

[Figure: a data point d_i and the subspace spanned by B; the coefficients are c_i = B^T d_i and the reconstruction is B B^T d_i]
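The E/M alternation above can be tried in a few lines. The following is a minimal sketch, assuming NumPy and random toy data; the variable names and sizes are illustrative, not taken from the paper.

```python
import numpy as np

# Minimal EM-PCA sketch on toy data (illustrative, not the authors' code):
# alternate the E-step C = (B^T B)^{-1} B^T D and
# the M-step  B = D C^T (C C^T)^{-1}.

rng = np.random.default_rng(0)
D = rng.standard_normal((50, 200))   # 50-dimensional data, 200 samples (toy)
k = 5                                # number of principal components
B = rng.standard_normal((50, k))     # random initial basis

for _ in range(100):
    C = np.linalg.solve(B.T @ B, B.T @ D)    # E-step: coefficients
    B = D @ C.T @ np.linalg.inv(C @ C.T)     # M-step: basis update

# B spans the leading principal subspace but need not be orthonormal,
# so orthonormalize before forming the reconstruction B B^T d_i.
B, _ = np.linalg.qr(B)
reconstruction = B @ (B.T @ D)
```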

Page 12: PCA Extension

RPCA

• Xu and Yuille [1995] minimize

  – Σ_{i=1}^{n} [ V_i ||d_i − B c_i||^2 + η (1 − V_i) ], where V_i ∈ {0, 1} flags whether d_i is an inlier and η is a fixed penalty for rejecting a point

  – Hard to solve (mixed continuous and discrete optimization)

Page 13: PCA Extension

RPCA

• Gabriel and Zamir [1979] minimize

  – Σ_{i=1}^{n} Σ_{p=1}^{d} [ w_{pi} (d_{pi} − b_p^T c_i)^2 ], where b_p^T is the p-th row of B

  – Impractical for high dimensions

  – "Lower rank approximation of matrices by least squares with any choice of weights", 1979

Page 14: PCA Extension

RPCA

• The idea is to use a robust error function ρ
  – Geman-McClure: ρ(x, σ) = x^2 / (x^2 + σ^2)

  – Minimize Σ_{i=1}^{n} Σ_{p=1}^{d} ρ( d_{pi} − μ_p − Σ_{j=1}^{k} b_{pj} c_{ji}, σ_p )

  – ρ is approximated by a local quadratic function
  – Minimized by gradient descent
  – The rest is nothing but heuristics (a sketch of ρ and its quadratic weighting follows below)
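As a concrete illustration of the robust error, here is a small sketch of the Geman-McClure function and the weight of its local quadratic approximation; the IRLS-style weight shown is a common choice, not necessarily the exact heuristic used in the paper.

```python
import numpy as np

def geman_mcclure(x, sigma):
    """Geman-McClure robust error: rho(x, sigma) = x^2 / (x^2 + sigma^2)."""
    return x**2 / (x**2 + sigma**2)

def quadratic_weight(x, sigma):
    """Weight rho'(x) / (2x) = sigma^2 / (x^2 + sigma^2)^2 of the local
    quadratic approximation (an IRLS-style choice, for illustration)."""
    return sigma**2 / (x**2 + sigma**2) ** 2

# Small residuals behave like least squares; large (outlier) residuals get
# weights near zero and barely influence the fitted basis.
residuals = np.array([0.1, 1.0, 10.0])
print(geman_mcclure(residuals, sigma=1.0))     # [~0.01, 0.5, ~0.99]
print(quadratic_weight(residuals, sigma=1.0))  # [~0.98, 0.25, ~0.0001]
```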

Page 15: PCA Extension

RPCA

Page 16: PCA Extension

Robust PCA - Experiment

• 256 training images (120×160)

• Obtain 20 RPCA basis vectors

• 3 hrs on a 900 MHz Pentium III in Matlab

Page 17: PCA Extension

Outline

• Robust PCA

• Generalized PCA
  – Clustering points on a line
  – Clustering lines on a plane
  – Clustering hyperplanes in a space

Page 18: PCA Extension

Generalized PCA

• Generalized Principal Component Analysis
  – René Vidal
  – Yi Ma
  – Shankar Sastry

• UC Berkeley and UIUC

Page 19: PCA Extension

GPCA

Page 20: PCA Extension

GPCA Example 1

Page 21: PCA Extension

GPCA Example 2

Page 22: PCA Extension

GPCA Example 3

Page 23: PCA Extension

GPCA Goals

1. Number of subspaces and their dimensions
2. A basis for each subspace
3. Segmentation of the data

Page 24: PCA Extension

GPCA Ideas

• A union of subspaces is the zero set of a certain set of polynomials

Page 25: PCA Extension

Outline

• Robust PCA

• Generalized PCA
  – Clustering points on a line
  – Clustering lines on a plane
  – Clustering hyperplanes in a space

Page 26: PCA Extension

GPCA 1D Case

Page 27: PCA Extension

GPCA 1D Case Cont’d

Page 28: PCA Extension

GPCA 1D Case Cont’d

There are M_n = n + 1 unknowns (the coefficients of p_n).

To have a unique solution, rank(V_n) = n = M_n − 1

Page 29: PCA Extension

GPCA 1D Example

• n = 2 groups
• p_n(x) = (x − μ_1)(x − μ_2)

• No polynomial of degree 1 vanishes on both groups; infinitely many polynomials of degree 3 do

• p_n(x) = x^2 + c_1 x + c_2 ⇒ factor the polynomial to recover μ_1 and μ_2 (see the sketch below)
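The factorization step can be sketched numerically: fit the degree-n polynomial that vanishes on the data by taking the null space of the embedded data matrix V_n, then read the group centers off its roots. The data and noise level below are made up for illustration.

```python
import numpy as np

# 1-D GPCA sketch (toy data): n = 2 groups around mu1 = -2 and mu2 = 3.
rng = np.random.default_rng(1)
x = np.concatenate([-2.0 + 0.01 * rng.standard_normal(30),
                     3.0 + 0.01 * rng.standard_normal(30)])

n = 2
# Row i of V_n is [x_i^n, ..., x_i, 1]; p_n has M_n = n + 1 coefficients.
V = np.vander(x, N=n + 1)

# The coefficient vector c_n spans the (approximate) null space of V_n:
# take the right singular vector with the smallest singular value.
c = np.linalg.svd(V)[2][-1]

# The roots of p_n(x) = c_0 x^2 + c_1 x + c_2 are the cluster centers mu_j.
print(np.sort(np.roots(c)))   # approximately [-2, 3]
```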

Page 30: PCA Extension

Outline

• Robust PCA

• Generalized PCA
  – Clustering points on a line
  – Clustering lines on a plane
  – Clustering hyperplanes in a space

Page 31: PCA Extension

GPCA 2D Case

• L_j = { x = [x, y]^T : b_{j1} x + b_{j2} y = 0 }

• (b_{11} x + b_{12} y = 0) or (b_{21} x + b_{22} y = 0) or …

Page 32: PCA Extension

GPCA 2D Case Cont’d

• (b_{11} x + b_{12} y = 0) or (b_{21} x + b_{22} y = 0) or …

• p_n(x) = (b_{11} x + b_{12} y) ⋯ (b_{n1} x + b_{n2} y) = Σ_{k=0}^{n} c_k x^{n−k} y^k = 0

Page 33: PCA Extension

GPCA 2D Case Cont’d

• Take n = 2 for example
• p_2(x) = (b_{11} x + b_{12} y)(b_{21} x + b_{22} y)

• ∇p_2(x) = (b_{21} x + b_{22} y) b_1 + (b_{11} x + b_{12} y) b_2, where b_j = [b_{j1}, b_{j2}]^T

• If x ∈ L_1, then ∇p_2(x) ∝ b_1; otherwise ∇p_2(x) ∝ b_2

Page 34: PCA Extension

GPCA 2D Case Cont’d

• Given one point y_j ∈ L_j on each line, the normal vector of L_j is b_j ∝ ∇p_n(y_j)

• Three steps (a worked sketch follows below):
  1. Determine n as min{ j : rank(V_j) = j } (i.e. rank(V_j) = M_j − 1)
  2. Solve V_n c_n = 0 for c_n
  3. Find the normal vectors b_j ∝ ∇p_n(y_j)
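A small end-to-end sketch of steps 2 and 3 for n = 2 lines through the origin; the toy data and the true normals below are assumptions made for the example, not from the paper.

```python
import numpy as np

# Toy data: points on two lines through the origin in R^2.
rng = np.random.default_rng(2)
t = rng.standard_normal(40)
pts = np.vstack([np.outer(t[:20], [1.0, 1.0]),     # line with normal (1, -1)
                 np.outer(t[20:], [2.0, -1.0])])   # line with normal (1, 2)
x, y = pts[:, 0], pts[:, 1]

# Step 2 (with n = 2 known): build V_n from degree-2 monomials, solve V_n c = 0.
Vn = np.column_stack([x**2, x * y, y**2])
c = np.linalg.svd(Vn)[2][-1]

# Step 3: grad p_2(x, y) = [2 c0 x + c1 y, c1 x + 2 c2 y] is parallel to the
# normal of whichever line the point lies on.
grads = np.column_stack([2 * c[0] * x + c[1] * y,
                         c[1] * x + 2 * c[2] * y])
grads /= np.linalg.norm(grads, axis=1, keepdims=True)

# Segment: points with (anti)parallel gradients belong to the same line.
labels = (np.abs(grads @ grads[0]) > 0.99).astype(int)
print(labels)   # 1 for points on the same line as point 0, else 0
```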

Page 35: PCA Extension

Outline

• Robust PCA

• Generalized PCA
  – Clustering points on a line
  – Clustering lines on a plane
  – Clustering hyperplanes in a space

Page 36: PCA Extension

GPCA Hyperplanes

• Still assume d_1 = … = d_n = d = D − 1
• S_j = { x : b_j^T x = b_{j1} x_1 + b_{j2} x_2 + … + b_{jD} x_D = 0 }

Page 37: PCA Extension

GPCA Hyperplanes

M_n = C(D + n − 1, D − 1) = C(D + n − 1, n), the number of degree-n monomials in D variables
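M_n here is just a stars-and-bars count, which a one-liner can verify against the low-dimensional cases above (the function name below is mine, for illustration only).

```python
from math import comb

def num_monomials(n, D):
    """Number of degree-n monomials in D variables: M_n = C(n + D - 1, D - 1)."""
    return comb(n + D - 1, D - 1)

print(num_monomials(2, 2))   # 3 -> matches the x^2, xy, y^2 embedding
print(num_monomials(3, 4))   # 20 coefficients for 3 hyperplanes in R^4
```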

Page 38: PCA Extension

GPCA Hyperplanes

Page 39: PCA Extension

GPCA Hyperplanes

• Since we know n, we can solve for the coefficients c_k

• c_k ⇒ b_k via ∇p_n(x)

• If we know one point y_j on each S_j, finding b_j is easy

Page 40: PCA Extension

GPCA Hyperplanes

• One point y_j on each hyperplane S_j

• Consider a random line L = { t v + x_0 }

• Obtain y_j by intersecting L and S_j
  – y_j = t_j v + x_0
  – Find the roots t_j of p_n(t v + x_0) = 0 (see the sketch below)
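A brief sketch of this intersection trick, assuming p_n is the product of two known hyperplane normals in R^3; the normals, random line, and sample points are made up for illustration.

```python
import numpy as np

# Two hyperplanes through the origin in R^3 with (assumed) normals b_1, b_2.
b = [np.array([1.0, 0.0, -1.0]), np.array([0.0, 1.0, 1.0])]

rng = np.random.default_rng(3)
v, x0 = rng.standard_normal(3), rng.standard_normal(3)   # random line t*v + x0

def p_on_line(t):
    """p_n restricted to the line: prod_j b_j^T (t*v + x0), a polynomial in t."""
    return np.prod([bj @ (t * v + x0) for bj in b])

# Recover that univariate polynomial from n + 1 samples and take its roots t_j.
ts = np.linspace(-3.0, 3.0, len(b) + 1)
coeffs = np.polyfit(ts, [p_on_line(t) for t in ts], deg=len(b))
points = [float(np.real(t)) * v + x0 for t in np.roots(coeffs)]

# Each recovered point y_j lies on exactly one hyperplane (b_j^T y_j ~ 0),
# and grad p_n(y_j) would then give that hyperplane's normal up to scale.
print([min(abs(bj @ y) for bj in b) for y in points])
```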

Page 41: PCA Extension

GPCA Hyperplanes

• To summarize:
  1. We want to find n in order to solve for c
  2. To get the normal b_j of each S_j, evaluate ∇p_n(x)
  3. To get the label j, take one point y_j = t_j v + x_0 per hyperplane by solving p_n(t v + x_0) = 0

Page 42: PCA Extension
Page 43: PCA Extension

One More Thing

Page 44: PCA Extension

One More Thing

• Previously we assumed d_1 = … = d_n = D − 1
• Actually, we cannot assume that…

• Please read Sections 4.2 & 4.3 by yourself
  – They discuss how to recursively reduce the dimension

Page 45: PCA Extension