
Relations between image coordinates

Given coordinates in one image, and the transformation between cameras, T = [R t], what are the image coordinates in the other camera's image?

Definitions

Essential Matrix: relates image coordinates between two camera coordinate systems that differ by a rotation R and a translation t:

The vectors $\vec{O'x'}$, $\vec{Ox}$, and $\vec{OO'}$ are coplanar, so:

$$\vec{O'x'} \cdot \left( \vec{Ox} \times \vec{OO'} \right) = 0$$

In homogeneous coordinates the two cameras are related by

$$x' = \begin{bmatrix} R & t \\ 0\;0\;0 & 1 \end{bmatrix} x$$

so the coplanarity condition becomes

$$x' \cdot (t \times R x) = 0$$

Writing the cross product with $t$ as a matrix multiplication:

$$x' \cdot (E x) = 0 \qquad\Longleftrightarrow\qquad x'^{\top} E\, x = 0$$

with

$$E = \begin{bmatrix} 0 & -t_z & t_y \\ t_z & 0 & -t_x \\ -t_y & t_x & 0 \end{bmatrix} R$$

What does the Essential matrix do?

It maps an image point to the normal of the epipolar line in the other image:

$$n = E x$$

If $x'$ lies on the epipolar line, then $n \cdot x' = 0$:

$$n_1 x'_1 + n_2 x'_2 + n_3 \cdot 1 = 0$$

The normal defines a line in image 2:

$$(y = m x + b) \;\Rightarrow\; m = -\frac{n_1}{n_2}, \quad b = -\frac{n_3}{n_2}$$
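As a concrete sketch (the rotation, translation, and 3D point below are invented test values, not from the slides), the construction $E = [t]_\times R$ and the epipolar-line mapping $n = Ex$ can be checked numerically with NumPy:

```python
import numpy as np

def skew(t):
    """Cross-product matrix [t]_x, so that skew(t) @ v == np.cross(t, v)."""
    return np.array([[0.0, -t[2], t[1]],
                     [t[2], 0.0, -t[0]],
                     [-t[1], t[0], 0.0]])

# Assumed example pose: small rotation about the y-axis plus a translation.
th = 0.1
R = np.array([[np.cos(th), 0.0, np.sin(th)],
              [0.0, 1.0, 0.0],
              [-np.sin(th), 0.0, np.cos(th)]])
t = np.array([1.0, 0.0, 0.2])

E = skew(t) @ R                     # essential matrix E = [t]_x R

# A 3D point seen in camera 1 and camera 2 (x' = R x + t), expressed in
# normalized (calibrated) image coordinates after dividing by depth.
X = np.array([0.5, -0.3, 4.0])
x = X / X[2]
Xp = R @ X + t
xp = Xp / Xp[2]

assert abs(xp @ E @ x) < 1e-12      # epipolar constraint x'^T E x = 0

# Epipolar line in image 2: n = E x with n1*u + n2*v + n3 = 0,
# i.e. slope m = -n1/n2 and intercept b = -n3/n2.
n = E @ x
m, b = -n[0] / n[1], -n[2] / n[1]
assert abs(m * xp[0] + b - xp[1]) < 1e-9   # x' lies on the line
```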

What if cameras are uncalibrated? Fundamental Matrix

Choose world coordinates to coincide with camera 1. Then the extrinsic parameters for camera 2 are just R and t. However, the intrinsic parameters of both cameras are unknown. Let C1 and C2 denote the matrices of intrinsic parameters. Then the pixel coordinates actually measured are not appropriate for the essential matrix. Correcting for this distortion creates a new matrix: the Fundamental Matrix.

$$x'_{measured} = C_2\, x', \qquad x_{measured} = C_1\, x$$

$$x'^{\top} E\, x = 0 \;\Rightarrow\; \left( C_2^{-1} x'_{measured} \right)^{\top} E \left( C_1^{-1} x_{measured} \right) = 0$$

$$x_{measured}'^{\top}\, F\, x_{measured} = 0, \qquad F = C_2^{-\top} E\, C_1^{-1}$$
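A quick numerical check of the relation $F = C_2^{-\top} E\, C_1^{-1}$ (the intrinsic matrices and the pose below are invented for illustration):

```python
import numpy as np

def skew(t):
    """Cross-product matrix [t]_x."""
    return np.array([[0.0, -t[2], t[1]],
                     [t[2], 0.0, -t[0]],
                     [-t[1], t[0], 0.0]])

R = np.eye(3)                        # assumed pose: pure translation
t = np.array([1.0, 0.0, 0.0])
E = skew(t) @ R

# Assumed intrinsics (focal lengths and principal points are made up).
C1 = np.array([[800.0, 0, 320], [0, 800, 240], [0, 0, 1]])
C2 = np.array([[700.0, 0, 310], [0, 700, 250], [0, 0, 1]])

F = np.linalg.inv(C2).T @ E @ np.linalg.inv(C1)   # F = C2^{-T} E C1^{-1}

# Project one 3D point into both cameras to get pixel coordinates.
X = np.array([0.2, -0.1, 5.0])
x = X / X[2]                         # calibrated coordinates, camera 1
Xp = R @ X + t
xp = Xp / Xp[2]                      # calibrated coordinates, camera 2
x_meas = C1 @ x                      # measured pixel coordinates
xp_meas = C2 @ xp

assert abs(xp_meas @ F @ x_meas) < 1e-9    # x'_m^T F x_m = 0
assert np.allclose(C2.T @ F @ C1, E)       # E recovered from F
```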

Computing the Fundamental Matrix

Number of correspondences: given perfect image points (no noise) in general position, each point correspondence generates one constraint on the fundamental matrix.

Constraint for one point: $x'^{\top} F\, x = 0$, which is linear in the nine entries of $F$. Each constraint can be rewritten as a dot product with the vector of entries of $F$; stacking several of these results in a linear system.

Stereo Reconstruction

≥ 8 point matches are needed to compute $F$.
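A minimal sketch of this stacking (the standard unnormalized eight-point approach, on synthetic noise-free correspondences in calibrated coordinates, so the recovered $F$ equals the essential matrix here; pose and points are assumptions):

```python
import numpy as np

def skew(t):
    """Cross-product matrix [t]_x."""
    return np.array([[0.0, -t[2], t[1]],
                     [t[2], 0.0, -t[0]],
                     [-t[1], t[0], 0.0]])

rng = np.random.default_rng(0)
th = 0.2
R = np.array([[np.cos(th), 0.0, np.sin(th)],
              [0.0, 1.0, 0.0],
              [-np.sin(th), 0.0, np.cos(th)]])
t = np.array([1.0, 0.3, 0.1])
E_true = skew(t) @ R

# Synthetic correspondences in calibrated coordinates (so F == E here).
Xs = rng.uniform([-1.0, -1.0, 4.0], [1.0, 1.0, 8.0], size=(12, 3))
A = []
for X in Xs:
    x = X / X[2]
    Xp = R @ X + t
    xp = Xp / Xp[2]
    # x'^T F x = 0  <=>  kron(x', x) . vec(F) = 0  (row-major vec)
    A.append(np.kron(xp, x))
A = np.array(A)

# vec(F) spans the null space of A: take the right singular vector
# belonging to the smallest singular value.
F = np.linalg.svd(A)[2][-1].reshape(3, 3)

# F is recovered only up to scale (and sign); normalize before comparing.
F /= np.linalg.norm(F)
E_n = E_true / np.linalg.norm(E_true)
assert np.allclose(F, E_n, atol=1e-6) or np.allclose(F, -E_n, atol=1e-6)
```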

Reconstruction Steps

Determining Extrinsic Camera Parameters

$$M_1 = C_1 [\, I \mid 0 \,], \qquad M_2 = C_2 [\, R \mid t \,]$$

Why can we just use $[\, I \mid 0 \,]$ as the extrinsic parameters for camera 1? Because we are only interested in the relative position of the two cameras.

• First we undo the intrinsic camera distortions by defining new normalized cameras:

$$M_1^{norm} = C_1^{-1} M_1 \quad\text{and}\quad M_2^{norm} = C_2^{-1} M_2$$

• The normalized cameras contain the unknown parameters:

$$M_1^{norm} = C_1^{-1} M_1 \;\Rightarrow\; M_1^{norm} = [\, I \mid 0 \,]$$

$$M_2^{norm} = C_2^{-1} M_2 \;\Rightarrow\; M_2^{norm} = [\, R \mid t \,]$$

• However, those parameters can be extracted from the Fundamental matrix

$$F = C_2^{-\top} E\, C_1^{-1} \;\Rightarrow\; E = C_2^{\top} F\, C_1$$

$$E = \begin{bmatrix} 0 & -t_z & t_y \\ t_z & 0 & -t_x \\ -t_y & t_x & 0 \end{bmatrix} R = [\, t \,]_{\times} R$$

Extract t and R from the Essential Matrix

How do we recover t and R? Answer: the SVD of E.

$$E = U S V^{\top}$$

with $S$ diagonal and $U, V$ orthogonal with $\det = 1$ (rotations). Then

$$R = U W V^{\top} \quad\text{or}\quad R = U W^{\top} V^{\top}, \qquad W = \begin{bmatrix} 0 & -1 & 0 \\ 1 & 0 & 0 \\ 0 & 0 & 1 \end{bmatrix}$$

and

$$t = u_3 \quad\text{or}\quad t = -u_3$$

(where $u_3$ is the third column of $U$).

Reconstruction Ambiguity: so we have 4 possible combinations of translation and rotation, giving 4 possibilities for $M_2^{norm} = [\, R \mid t \,]$:

1. $M_2^{norm} = [\, U W^{\top} V^{\top} \mid t \,]$
2. $M_2^{norm} = [\, U W V^{\top} \mid t \,]$
3. $M_2^{norm} = [\, U W^{\top} V^{\top} \mid -t \,]$
4. $M_2^{norm} = [\, U W V^{\top} \mid -t \,]$

Which one is right?

The reconstructed points must lie in front of both cameras (positive depth in each); only one of the four combinations satisfies this.
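A sketch of the whole recovery (SVD of E, the four candidates, and the point-in-front-of-both-cameras test), using an assumed ground-truth pose to check against; the triangulation helper is the standard linear (DLT) method:

```python
import numpy as np

def skew(t):
    """Cross-product matrix [t]_x."""
    return np.array([[0.0, -t[2], t[1]],
                     [t[2], 0.0, -t[0]],
                     [-t[1], t[0], 0.0]])

def triangulate(x, xp, R, t):
    """Standard linear (DLT) triangulation with M1 = [I|0], M2 = [R|t]."""
    M1 = np.hstack([np.eye(3), np.zeros((3, 1))])
    M2 = np.hstack([R, t.reshape(3, 1)])
    A = np.array([x[0] * M1[2] - M1[0],
                  x[1] * M1[2] - M1[1],
                  xp[0] * M2[2] - M2[0],
                  xp[1] * M2[2] - M2[1]])
    Xh = np.linalg.svd(A)[2][-1]
    return Xh[:3] / Xh[3]

# Assumed ground-truth pose (unit-length t, since t is recovered only up
# to scale) and one true correspondence for the cheirality test.
th = 0.3
R_true = np.array([[np.cos(th), 0.0, np.sin(th)],
                   [0.0, 1.0, 0.0],
                   [-np.sin(th), 0.0, np.cos(th)]])
t_true = np.array([0.6, 0.0, 0.8])
E = skew(t_true) @ R_true

X_true = np.array([0.3, -0.2, 3.0])
x = X_true / X_true[2]
Xp = R_true @ X_true + t_true
xp = Xp / Xp[2]

U, S, Vt = np.linalg.svd(E)
# E is defined only up to sign, so U and V may be flipped wholesale to
# make them proper rotations (det = +1).
if np.linalg.det(U) < 0:
    U = -U
if np.linalg.det(Vt) < 0:
    Vt = -Vt
W = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])

candidates = [(U @ W @ Vt, U[:, 2]), (U @ W @ Vt, -U[:, 2]),
              (U @ W.T @ Vt, U[:, 2]), (U @ W.T @ Vt, -U[:, 2])]

# Keep the candidate that puts the point in front of BOTH cameras.
for R, t in candidates:
    X = triangulate(x, xp, R, t)
    if X[2] > 0 and (R @ X + t)[2] > 0:
        break

assert np.allclose(R, R_true, atol=1e-9)
assert np.allclose(t, t_true, atol=1e-9)
```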

How do we backproject?

$$x'_{measured} = C_2\, x', \qquad x_{measured} = C_1\, x$$

Knowing $C_i$ allows us to determine the undistorted image points:

$$x' = C_2^{-1} x'_{measured}, \qquad x = C_1^{-1} x_{measured}$$

Recalling the projection equations allows us to relate the world point to the image points:

$$z' x' = C_2^{-1} C_2\, M_2^{norm} X, \qquad z x = C_1^{-1} C_1 [\, I \mid 0 \,] X$$

$$z' x' = M_2^{norm} X, \qquad z x = [\, I \mid 0 \,] X$$

Backprojection to 3D

We now know x, x', R, and t. We need X.

Solving:

$$z\, x_i = M^{norm} X_i$$

$$z \begin{bmatrix} u_i \\ v_i \\ 1 \end{bmatrix} = \begin{bmatrix} m_1^{\top} \\ m_2^{\top} \\ m_3^{\top} \end{bmatrix} X_i \;\Rightarrow\; \begin{cases} z\, u_i = m_1^{\top} X_i \\ z\, v_i = m_2^{\top} X_i \\ z = m_3^{\top} X_i \end{cases}$$

$$\Rightarrow\; \begin{cases} (m_3^{\top} X_i)\, u_i = m_1^{\top} X_i \\ (m_3^{\top} X_i)\, v_i = m_2^{\top} X_i \end{cases} \;\Rightarrow\; \begin{cases} (m_3^{\top} X_i)\, u_i - m_1^{\top} X_i = 0 \\ (m_3^{\top} X_i)\, v_i - m_2^{\top} X_i = 0 \end{cases}$$

$$\Rightarrow\; \begin{bmatrix} u_i\, m_3^{\top} - m_1^{\top} \\ v_i\, m_3^{\top} - m_2^{\top} \end{bmatrix} X_i = 0$$

Applying the same two equations to the second (normalized) camera gives

$$\begin{bmatrix} u'_i \,({}^2m_3^{\top}) - {}^2m_1^{\top} \\ v'_i \,({}^2m_3^{\top}) - {}^2m_2^{\top} \end{bmatrix} X_i = 0$$

Combining the constraints from both cameras:

$$\begin{bmatrix} u_i\, m_3^{\top} - m_1^{\top} \\ v_i\, m_3^{\top} - m_2^{\top} \\ u'_i \,({}^2m_3^{\top}) - {}^2m_1^{\top} \\ v'_i \,({}^2m_3^{\top}) - {}^2m_2^{\top} \end{bmatrix} X_i = 0 \qquad\Longleftrightarrow\qquad A X_i = 0$$

This has a solvable form: solve using the minimum-eigenvalue / eigenvector approach (e.g. $X_i = \mathrm{Null}(A)$), where ${}^2m_j^{\top}$ denotes the $j$-th row of the second camera's normalized projection matrix.
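The 4×4 system above can be exercised directly; the pose and the 3D point here are assumed test values:

```python
import numpy as np

# Assumed normalized cameras M1 = [I | 0] and M2 = [R | t].
th = 0.25
R = np.array([[np.cos(th), 0.0, np.sin(th)],
              [0.0, 1.0, 0.0],
              [-np.sin(th), 0.0, np.cos(th)]])
t = np.array([0.5, 0.1, 0.0])
M1 = np.hstack([np.eye(3), np.zeros((3, 1))])
M2 = np.hstack([R, t.reshape(3, 1)])

# Ground-truth homogeneous 3D point and its two exact projections.
X_true = np.array([0.4, -0.2, 3.0, 1.0])
u, v = (M1 @ X_true)[:2] / (M1 @ X_true)[2]
up, vp = (M2 @ X_true)[:2] / (M2 @ X_true)[2]

# Rows u*m3^T - m1^T etc. from both cameras form the 4x4 matrix A.
A = np.array([u * M1[2] - M1[0],
              v * M1[2] - M1[1],
              up * M2[2] - M2[0],
              vp * M2[2] - M2[1]])

# X_i = Null(A): the right singular vector of the smallest singular value.
X = np.linalg.svd(A)[2][-1]
X = X / X[3]                        # dehomogenize
assert np.allclose(X, X_true, atol=1e-9)
```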

Finishing up

What else can you do with these methods? Synthesize new views: from Image 1 and Image 2, a new view 60° away can be synthesized (Avidan & Shashua, 1997; Faugeras & Robert, 1996).

Undo perspective distortion (for a plane). Original images (left and right):

Transfer and superimposed images

The "transfer" image is the left image projectively warped so that points on the plane containing the Chinese text are mapped to their positions in the right image.

• The "superimposed" image is a superposition of the transfer and right images. The planes exactly coincide; however, points off the plane (such as the mug) do not coincide.

This is an example of planar projectively induced parallax: lines joining corresponding points off the plane in the "superimposed" image intersect at the epipole.

It's all about point matches

Point match ambiguity in human perception

Traditional Solutions

• Try out lots of possible point matches.

• Apply constraints to weed out the bad ones.

Find matches and apply epipolar uniqueness constraint

Compute lots of possible matches

1) Compute "match strength"
2) Find matches with highest strength

Optimization problem with many possible solutions
