Transcript
Page 1:

Algorithm Evaluation and Error Analysis

class 7

Multiple View Geometry, Comp 290-089, Marc Pollefeys

Page 2:

Multiple View Geometry course schedule (subject to change)

Jan. 7, 9: Intro & motivation / Projective 2D Geometry
Jan. 14, 16: (no class) / Projective 2D Geometry
Jan. 21, 23: Projective 3D Geometry / (no class)
Jan. 28, 30: Parameter Estimation / Parameter Estimation
Feb. 4, 6: Algorithm Evaluation / Camera Models
Feb. 11, 13: Camera Calibration / Single View Geometry
Feb. 18, 20: Epipolar Geometry / 3D reconstruction
Feb. 25, 27: Fund. Matrix Comp. / Structure Comp.
Mar. 4, 6: Planes & Homographies / Trifocal Tensor
Mar. 18, 20: Three View Reconstruction / Multiple View Geometry
Mar. 25, 27: Multiple View Reconstruction / Bundle adjustment
Apr. 1, 3: Auto-Calibration / Papers
Apr. 8, 10: Dynamic SfM / Papers
Apr. 15, 17: Cheirality / Papers
Apr. 22, 24: Duality / Project Demos

Page 3:

• Maximum Likelihood Estimation

• DLT is not invariant to normalization
• Geometric minimization is invariant

• Iterative minimization
• Cost function
• Parameterization
• Initialization
• Minimization algorithm

ML cost (error in both images): $\sum_i d(x_i, \hat x_i)^2 + d(x'_i, \hat x'_i)^2$

Page 4:

Automatic computation of H

Objective: compute the homography between two images

Algorithm:

(i) Interest points: Compute interest points in each image

(ii) Putative correspondences: Compute a set of interest point matches based on some similarity measure

(iii) RANSAC robust estimation: Repeat for N samples

(a) Select 4 correspondences and compute H

(b) Calculate the distance d for each putative match

(c) Compute the number of inliers consistent with H (d<t)

Choose H with most inliers

(iv) Optimal estimation: re-estimate H from all inliers by minimizing ML cost function with Levenberg-Marquardt

(v) Guided matching: Determine more matches using prediction by computed H

Optionally iterate last two steps until convergence
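A minimal Python/NumPy sketch of steps (iii) and (iv) above: a plain 4-point DLT inside a fixed-iteration RANSAC loop, with the symmetric transfer distance as d. Hartley normalization, adaptive sample counts, the Levenberg-Marquardt ML refinement and guided matching are omitted; the helper names (dlt_homography, transfer_error, ransac_homography) are illustrative, not from the slides.

```python
import numpy as np

def dlt_homography(src, dst):
    """Direct linear transform from >= 4 correspondences ((n, 2) arrays)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    return Vt[-1].reshape(3, 3)            # null vector = homography entries

def transfer_error(H, src, dst):
    """Squared symmetric transfer distance per putative match."""
    def project(M, pts):
        ph = np.column_stack([pts, np.ones(len(pts))]) @ M.T
        return ph[:, :2] / ph[:, 2:3]
    d1 = np.sum((project(H, src) - dst) ** 2, axis=1)
    d2 = np.sum((project(np.linalg.inv(H), dst) - src) ** 2, axis=1)
    return d1 + d2

def ransac_homography(x1, x2, t=25.0, n_samples=500, rng=np.random.default_rng(0)):
    """Steps (iii)-(iv): robust H from putative matches x1 <-> x2 ((n, 2) arrays)."""
    best = np.zeros(len(x1), dtype=bool)
    for _ in range(n_samples):
        idx = rng.choice(len(x1), 4, replace=False)   # (a) minimal sample of 4
        H = dlt_homography(x1[idx], x2[idx])
        try:
            d = transfer_error(H, x1, x2)             # (b) distance for each match
        except np.linalg.LinAlgError:                 # degenerate sample, skip it
            continue
        inliers = d < t                               # (c) inliers consistent with H
        if inliers.sum() > best.sum():
            best = inliers
    # (iv) re-estimate from all inliers; the slides minimize the ML cost with
    # Levenberg-Marquardt here, a linear DLT is used for brevity
    H = dlt_homography(x1[best], x2[best])
    return H / H[2, 2], best
```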

Page 5:

Algorithm Evaluation and Error Analysis

• Bounds on performance
• Covariance estimation

(figure: residual error, uncertainty)

Page 6:

Algorithm evaluation

measured coordinates: $x_i, x'_i$
estimated quantities: $\hat H, \hat x_i, \hat x'_i$
true coordinates: $\bar H, \bar x_i, \bar x'_i$

Test on real data, or test on synthetic data:

• Generate synthetic correspondences $\bar x_i \leftrightarrow \bar x'_i$
• Add Gaussian noise, yielding $x_i \leftrightarrow x'_i$
• Estimate $\hat H$ from $x_i \leftrightarrow x'_i$ using the algorithm (maybe also $\hat x_i, \hat x'_i$)
• Verify how well $x'_i \approx \hat H x_i$ or $\bar x'_i \approx \hat H \bar x_i$
• Repeat many times (different noise, same $\sigma$)
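A sketch of this synthetic test protocol for the error-in-one-image case, assuming some estimator callable estimate_H(x, xp) (for instance the DLT/RANSAC sketch above); the two returned numbers correspond to the residual and estimation errors defined on the following slides.

```python
import numpy as np

def apply_H(H, pts):
    """Map inhomogeneous 2D points ((n, 2) array) through a 3x3 homography."""
    ph = np.column_stack([pts, np.ones(len(pts))]) @ H.T
    return ph[:, :2] / ph[:, 2:3]

def evaluate_one_image_error(estimate_H, H_true, x_bar, sigma=1.0, n_trials=200,
                             rng=np.random.default_rng(0)):
    """Synthetic test, error-in-one-image model: x_i exact, x'_i noisy.
    Returns the RMS residual error (against the measured x') and the RMS
    estimation error (against the true x'), averaged over n_trials draws."""
    xp_bar = apply_H(H_true, x_bar)                        # exact correspondences
    res, est = [], []
    for _ in range(n_trials):
        xp = xp_bar + rng.normal(0.0, sigma, xp_bar.shape)  # add Gaussian noise
        H_hat = estimate_H(x_bar, xp)                       # estimate from noisy data
        xp_hat = apply_H(H_hat, x_bar)
        res.append(np.mean((xp - xp_hat) ** 2))         # (1/2n) sum d(x', x'_hat)^2
        est.append(np.mean((xp_bar - xp_hat) ** 2))     # (1/2n) sum d(x'_true, x'_hat)^2
    return np.sqrt(np.mean(res)), np.sqrt(np.mean(est))
```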

Page 7:

Error in one image

$\bar x_i \mapsto \bar x_i$ (kept exact), $\bar x'_i \mapsto x'_i = \bar x'_i + \Delta x'_i$ with $\Delta x'_i \sim N(0, \sigma^2)$

Estimate $\hat H$, then

$e_{res} = \left( \frac{1}{2n} \sum_{i=1}^{n} d(x'_i, \hat x'_i)^2 \right)^{1/2}$, with $\hat x'_i = \hat H \bar x_i$

Error in two images

$x_i = \bar x_i + \Delta x_i$, $x'_i = \bar x'_i + \Delta x'_i$ with $\Delta x_i, \Delta x'_i \sim N(0, \sigma^2)$

Estimate $\hat H$, $\hat x_i$, $\hat x'_i$ so that $\hat x'_i = \hat H \hat x_i$, then

$e_{res} = \left( \frac{1}{4n} \left[ \sum_{i=1}^{n} d(x_i, \hat x_i)^2 + \sum_{i=1}^{n} d(x'_i, \hat x'_i)^2 \right] \right)^{1/2}$

Note: the residual error is not an absolute measure of the quality of $\hat H$; e.g. estimation from 4 points yields $e_{res} = 0$, and more points give better results even though $e_{res}$ increases.

Page 8:

Optimal estimators (MLE)

Estimate the expected residual error of the ML estimator; other algorithms can then be judged against this standard.

$f: \mathbb{R}^M \to \mathbb{R}^N$ (parameter space to measurement space), $\bar X = f(\bar P)$

The image of $f$ is a submanifold $S_M \subset \mathbb{R}^N$; the dimension of $S_M$ equals the number of essential parameters.

Page 9:

$X = \bar X + \Delta X$ with $\Delta X \sim N(0, \sigma^2 I_N)$; the ML estimate $\hat X$ is the point of $S_M$ closest to $X$.

Assume $S_M$ is locally planar around $\bar X$.

The projection of an isotropic Gaussian distribution on $\mathbb{R}^N$ with total variance $N\sigma^2$ onto a subspace of dimension $s$ is an isotropic Gaussian distribution with total variance $s\sigma^2$.

Page 10:

$N$ measurements (independent Gaussian noise of standard deviation $\sigma$), model with $d$ essential parameters (use $s = d$ and $s = N - d$):

(i) RMS residual error for the ML estimator:
$e_{res} = E\!\left[ \| \hat X - X \|^2 / N \right]^{1/2} = \sigma \left( 1 - d/N \right)^{1/2}$

(ii) RMS estimation error for the ML estimator:
$e_{est} = E\!\left[ \| \hat X - \bar X \|^2 / N \right]^{1/2} = \sigma \left( d/N \right)^{1/2}$


Page 11:

Error in one image ($d = 8$, $N = 2n$):

$e_{res} = \sigma \left( 1 - 4/n \right)^{1/2}$
$e_{est} = \sigma \left( 4/n \right)^{1/2}$

Error in two images ($d = 2n + 8$, $N = 4n$):

$e_{res} = \sigma \left( \frac{n - 4}{2n} \right)^{1/2}$
$e_{est} = \sigma \left( \frac{n + 4}{2n} \right)^{1/2}$
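These expected errors are easy to tabulate; a small helper under the same assumptions (homography estimation, isotropic Gaussian noise of standard deviation sigma, n correspondences):

```python
import numpy as np

def expected_ml_errors(n, sigma, both_images=False):
    """Expected RMS residual / estimation error of the ML homography estimate
    from n correspondences (d = 8, N = 2n for error in one image;
    d = 2n + 8, N = 4n for error in both images)."""
    d, N = (2 * n + 8, 4 * n) if both_images else (8, 2 * n)
    return sigma * np.sqrt(1 - d / N), sigma * np.sqrt(d / N)

# e.g. n = 20 points, sigma = 1 pixel
print(expected_ml_errors(20, 1.0))                      # (~0.894, ~0.447)
print(expected_ml_errors(20, 1.0, both_images=True))    # (~0.632, ~0.775)
```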

Page 12:

Covariance of estimated model

• Previous question: how close is the error to the smallest possible error?

• Independent of point configuration

• Real question: how close is the estimated model to the real model?

• Dependent on point configuration (e.g. 4 points close to a line)

Page 13:

Forward propagation of covariance

Let $v$ be a random vector in $\mathbb{R}^M$ with mean $\bar v$ and covariance matrix $\Sigma$, and suppose that $f: \mathbb{R}^M \to \mathbb{R}^N$ is an affine mapping defined by $f(v) = f(\bar v) + A(v - \bar v)$. Then $f(v)$ is a random variable with mean $f(\bar v)$ and covariance matrix $A \Sigma A^T$.

Note: does not assume A is a square matrix

Page 14:

Example:

$x \sim N(0, 1)$, $y \sim N(0, 2^2)$, independent

$x' = f(x, y) = 3x + 2y + 7$

$\bar x' = f(0, 0) = 7$

$A = \begin{pmatrix} 3 & 2 \end{pmatrix}$, $\quad \Sigma = \begin{pmatrix} 1 & 0 \\ 0 & 4 \end{pmatrix}$

$A \Sigma A^T = \begin{pmatrix} 3 & 2 \end{pmatrix} \begin{pmatrix} 1 & 0 \\ 0 & 4 \end{pmatrix} \begin{pmatrix} 3 \\ 2 \end{pmatrix} = 25$

$\mathrm{std}(x') = 5$
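A quick NumPy check of this example; the same lines also cover the 2x2 mapping on the next slide by changing A.

```python
import numpy as np

Sigma = np.diag([1.0, 4.0])      # covariance of (x, y): sigma_x^2 = 1, sigma_y^2 = 4
A = np.array([[3.0, 2.0]])       # f(x, y) = 3x + 2y + 7 is affine with this A

print(A @ Sigma @ A.T)           # [[25.]]  ->  std(x') = 5

# Monte Carlo sanity check
rng = np.random.default_rng(0)
v = rng.multivariate_normal([0.0, 0.0], Sigma, size=200_000)
print(np.var(3 * v[:, 0] + 2 * v[:, 1] + 7))   # close to 25
```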

Page 15:

Example:

$x \sim N(0, 1)$, $y \sim N(0, 2^2)$, independent

$x' = 3x + 2y$
$y' = 3x - 2y$

$A = \begin{pmatrix} 3 & 2 \\ 3 & -2 \end{pmatrix}$, $\quad \Sigma = \begin{pmatrix} 1 & 0 \\ 0 & 4 \end{pmatrix}$

$A \Sigma A^T = \begin{pmatrix} 3 & 2 \\ 3 & -2 \end{pmatrix} \begin{pmatrix} 1 & 0 \\ 0 & 4 \end{pmatrix} \begin{pmatrix} 3 & 3 \\ 2 & -2 \end{pmatrix} = \begin{pmatrix} 25 & -7 \\ -7 & 25 \end{pmatrix}$

$E[x'x'] = 25$, $\quad E[y'y'] = 25$, $\quad E[x'y'] = -7$

Page 16:

Non-linear propagation of covariance

Let $v$ be a random vector in $\mathbb{R}^M$ with mean $\bar v$ and covariance matrix $\Sigma$, and suppose that $f: \mathbb{R}^M \to \mathbb{R}^N$ is differentiable in a neighborhood of $\bar v$. Then, up to a first-order approximation, $f(v)$ is a random variable with mean $f(\bar v)$ and covariance matrix $J \Sigma J^T$, where $J$ is the Jacobian matrix evaluated at $\bar v$:

$f(v) \approx f(\bar v) + J (v - \bar v)$, $\quad J_{ij} = \partial f_i / \partial v_j$

Note: this is a good approximation if $f$ is close to linear within the variability of $v$.
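A generic first-order propagation routine in the spirit of this result, using a central-difference Jacobian (an illustrative sketch; in practice the analytic Jacobian is usually available and preferred):

```python
import numpy as np

def propagate_covariance(f, v_mean, Sigma, eps=1e-6):
    """First-order covariance propagation through f: R^M -> R^N.
    The Jacobian J is approximated by central differences at v_mean;
    returns the approximate mean f(v_mean) and covariance J Sigma J^T."""
    v_mean = np.asarray(v_mean, dtype=float)
    f0 = np.atleast_1d(f(v_mean)).astype(float)
    J = np.zeros((f0.size, v_mean.size))
    for j in range(v_mean.size):
        dv = np.zeros_like(v_mean)
        dv[j] = eps
        J[:, j] = (np.atleast_1d(f(v_mean + dv)) - np.atleast_1d(f(v_mean - dv))) / (2 * eps)
    return f0, J @ Sigma @ J.T

# A quadratic of the kind used in the following example (with sigma = 1):
mean, cov = propagate_covariance(lambda v: v[0]**2 + 3*v[0] + 2*v[1] + 5,
                                 [0.0, 0.0], np.diag([1.0, 4.0]))
print(mean, cov)   # first-order values ~5 and ~25; the exact moments are larger
```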

Page 17:

Example:

$\mathbf{x} = (x, y)^T \sim N\!\left( \begin{pmatrix} 0 \\ 0 \end{pmatrix}, \ \sigma^2 \begin{pmatrix} 1 & 0 \\ 0 & 4 \end{pmatrix} \right)$

$x' = f(x, y) = x^2 + 3x + 2y + 5$

Exact:
$\bar x' = \iint P(x, y)\, f(x, y)\, dx\, dy = 5 + \sigma^2$
$\sigma_{x'}^2 = \iint P(x, y) \left( f(x, y) - \bar x' \right)^2 dx\, dy = 25\sigma^2 + 2\sigma^4$
with $P(x, y) = \frac{1}{4\pi\sigma^2}\, e^{-(x^2 + y^2/4)/(2\sigma^2)}$

First-order approximation:
$J = \begin{pmatrix} 3 & 2 \end{pmatrix}$ (evaluated at the mean), so $\bar x' \approx f(0, 0) = 5$ and $\sigma_{x'}^2 \approx J \Sigma J^T = 25\sigma^2$

Page 18:

Example:

$f(x, y) = a x^2 + b x y + c y^2 + d x + e y + f$, with independent $x \sim N(0, \sigma_x^2)$, $y \sim N(0, \sigma_y^2)$

Exact:
mean $= a \sigma_x^2 + c \sigma_y^2 + f$
variance $= 2 a^2 \sigma_x^4 + b^2 \sigma_x^2 \sigma_y^2 + 2 c^2 \sigma_y^4 + d^2 \sigma_x^2 + e^2 \sigma_y^2$

First-order estimate:
est. mean $= f$
est. variance $= d^2 \sigma_x^2 + e^2 \sigma_y^2$

Page 19:

Backward propagation of covariance

$f: \mathbb{R}^M \to \mathbb{R}^N$, $\quad \bar X = f(\bar P)$

Given the covariance $\Sigma_X$ of the measurement $X$, find the covariance $\Sigma_P$ of the estimated parameters $\hat P = f^{-1}(X)$ (with $f^{-1}$ suitably defined, since in general $X$ does not lie exactly on the image of $f$).

Page 20:

Backward propagation of covariance

Assume $f$ is affine: $f(P) = f(\bar P) + J (P - \bar P)$, so $X - \bar X = J (P - \bar P)$ for points on the image of $f$.

A general measurement $X$ lies off that surface, so $f^{-1}(X)$ is not directly defined; what about $f^{-1} \circ \eta$, first projecting $X$ onto the surface?

Solution: take $\hat P$ to minimize the Mahalanobis distance
$\| X - f(P) \|_{\Sigma_X}^2 = (X - f(P))^T \Sigma_X^{-1} (X - f(P))$,
which gives
$\hat P - \bar P = (J^T \Sigma_X^{-1} J)^{-1} J^T \Sigma_X^{-1} (X - \bar X)$

Page 21:

Backward propagation of covariance

$\hat P = \bar P + A (X - \bar X)$ with $A = (J^T \Sigma_X^{-1} J)^{-1} J^T \Sigma_X^{-1}$, so by forward propagation

$\Sigma_P = A \Sigma_X A^T = (J^T \Sigma_X^{-1} J)^{-1} J^T \Sigma_X^{-1} \, \Sigma_X \, \Sigma_X^{-1} J (J^T \Sigma_X^{-1} J)^{-1} = (J^T \Sigma_X^{-1} J)^{-1}$

Page 22:

Backward propagation of covariance

If $f$ is affine, then $\Sigma_P = (J^T \Sigma_X^{-1} J)^{-1}$.

In the non-linear case, obtain a first-order approximation by using the Jacobian of $f$.

Page 23:

Over-parameterization

In this case $f$ is not one-to-one and $\mathrm{rank}\, J < M$, so $\Sigma_P = (J^T \Sigma_X^{-1} J)^{-1}$ cannot hold (e.g. a scale ambiguity would mean infinite variance!).

However, if constraints are imposed, then it is ok:

$\Sigma_P = A \left( A^T J^T \Sigma_X^{-1} J A \right)^{-1} A^T$

with $A$ an $M \times d$ matrix whose columns locally span the constraint surface; invert a $d \times d$ matrix instead of an $M \times M$ one.

Page 24:

Over-parameterization

When the constraint surface is locally orthogonal to the null space of $J$:

$\Sigma_P = \left( J^T \Sigma_X^{-1} J \right)^{+}$  (pseudo-inverse)

e.g. for the usual constraint $\| P \| = 1$.
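A small sketch of backward propagation covering both cases, under the first-order model X = X̄ + J(P − P̄): a full-rank J uses the inverse of JᵀΣ_X⁻¹J, while an over-parameterized (rank-deficient) J with a constraint surface orthogonal to the null space (e.g. ‖P‖ = 1) uses the pseudo-inverse.

```python
import numpy as np

def backward_covariance(J, Sigma_X, over_parameterized=False):
    """Backward propagation: covariance of the estimated parameters P given
    the measurement covariance Sigma_X and the Jacobian J = dX/dP.
    With over-parameterization (rank J < #parameters) and a constraint
    surface orthogonal to the null space of J (e.g. ||P|| = 1), the inverse
    is replaced by the pseudo-inverse."""
    JtSJ = J.T @ np.linalg.inv(Sigma_X) @ J
    return np.linalg.pinv(JtSJ) if over_parameterized else np.linalg.inv(JtSJ)
```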

Page 25:

Example: error in one image

(i) Estimate the transformation $\hat H$ from the data.
(ii) Compute the Jacobian $J_f = \partial X / \partial h$, evaluated at $\hat h$.
(iii) The covariance matrix of the estimated $\hat h$ is given by $\Sigma_h = (J^T \Sigma_{x'}^{-1} J)^{+}$.

For one point, writing $(\tilde x'_i, \tilde y'_i, \tilde w'_i)^T = H \bar x_i$:

$J_i = \partial x'_i / \partial h = \frac{1}{\tilde w'_i} \begin{pmatrix} \bar x_i^T & 0^T & -(\tilde x'_i/\tilde w'_i)\, \bar x_i^T \\ 0^T & \bar x_i^T & -(\tilde y'_i/\tilde w'_i)\, \bar x_i^T \end{pmatrix}$

and, summing over all points,

$\Sigma_h = \left( \sum_i J_i^T \Sigma_{x'_i}^{-1} J_i \right)^{+}$
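A sketch assembling this per-point Jacobian and the resulting covariance of h, assuming isotropic image noise of standard deviation sigma in the second image; the pseudo-inverse accounts for the ‖h‖ = 1 over-parameterization. The function name is illustrative.

```python
import numpy as np

def homography_covariance(H, x_bar, sigma=1.0):
    """Covariance of the 9-vector h (error-in-one-image model): accumulate
    the 2x9 Jacobians of x'_i w.r.t. h over all points x_bar ((n, 2) array)
    and take the pseudo-inverse (||h|| = 1 gauge)."""
    A = np.zeros((9, 9))
    for x, y in x_bar:
        xb = np.array([x, y, 1.0])
        xt, yt, wt = H @ xb                   # (x~', y~', w~') = H x_bar_i
        Ji = np.zeros((2, 9))
        Ji[0, 0:3] = xb
        Ji[0, 6:9] = -(xt / wt) * xb
        Ji[1, 3:6] = xb
        Ji[1, 6:9] = -(yt / wt) * xb
        Ji /= wt
        A += Ji.T @ Ji / sigma**2             # J_i^T Sigma_x'_i^{-1} J_i
    return np.linalg.pinv(A)
```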

Page 26:

Example: error in both images

$J = \begin{pmatrix} A \mid B \end{pmatrix}$ (separated into homography and point parameters)

$J^T \Sigma_x^{-1} J = \begin{pmatrix} A^T \Sigma_x^{-1} A & A^T \Sigma_x^{-1} B \\ B^T \Sigma_x^{-1} A & B^T \Sigma_x^{-1} B \end{pmatrix}$

Page 27:

Using the covariance matrix in point transfer

Error in one image: $\Sigma_{x'} = J_h \Sigma_h J_h^T$

Error in two images: $\Sigma_{x'} = J_h \Sigma_h J_h^T + J_x \Sigma_x J_x^T$
(if $h$ and $x$ are independent, i.e. new points)
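A sketch of point transfer with uncertainty: J_h is the 2x9 Jacobian of the transferred point with respect to h (as on the previous slide) and J_x its 2x2 Jacobian with respect to the new point x; the second term is added only when a covariance for x is supplied.

```python
import numpy as np

def transfer_covariance(H, x, Sigma_h, Sigma_x=None):
    """Covariance of the transferred point x' = H x.
    Error in one image:  J_h Sigma_h J_h^T.
    Error in two images: also add J_x Sigma_x J_x^T (h, x independent)."""
    xb = np.array([x[0], x[1], 1.0])
    xt, yt, wt = H @ xb
    J_h = np.zeros((2, 9))                        # d x' / d h  (2 x 9)
    J_h[0, 0:3] = xb
    J_h[0, 6:9] = -(xt / wt) * xb
    J_h[1, 3:6] = xb
    J_h[1, 6:9] = -(yt / wt) * xb
    J_h /= wt
    Sigma_xp = J_h @ Sigma_h @ J_h.T
    if Sigma_x is not None:
        # d x' / d x  (2 x 2), for the inhomogeneous mapping
        J_x = (H[:2, :2] - np.outer(np.array([xt, yt]) / wt, H[2, :2])) / wt
        Sigma_xp += J_x @ Sigma_x @ J_x.T
    return Sigma_xp
```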

Page 28:

Example (Criminisi ’97) [figure]: σ = 1 pixel, … = 0.5 cm

Page 29:

Example (Criminisi ’97) [figure]: σ = 1 pixel, … = 0.5 cm

Page 30:

Example (Criminisi ’97) [figure]

Page 31:

Monte Carlo estimation of covariance

• To be used when the previous assumptions do not hold (e.g. the surface is not locally flat within the variability of the noise) or when the covariance is too complicated to compute analytically.

• Simple and general, but computationally expensive

• Generate samples according to the assumed noise distribution, carry out the computation for each sample, and observe the distribution of the results
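A minimal sketch of the Monte Carlo approach, assuming an arbitrary estimator estimate(data) that returns a parameter vector and a data array perturbed with isotropic Gaussian noise; the sample covariance of the repeated results is the covariance estimate.

```python
import numpy as np

def monte_carlo_covariance(estimate, data, sigma, n_samples=1000,
                           rng=np.random.default_rng(0)):
    """Empirical covariance of an estimator: perturb the data with the assumed
    noise, re-run the estimator, and take the sample covariance of the
    resulting parameter vectors."""
    params = np.array([np.ravel(estimate(data + rng.normal(0.0, sigma, data.shape)))
                       for _ in range(n_samples)])
    return np.cov(params, rowvar=False)
```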

Page 32:

Next class: Camera models

