
Localization for Mobile Robot Using Monocular Vision

Hyunsik Ahn
Jan. 2006
Tongmyong University


1. Introduction (1)

Self-localization methods for mobile robots
- Position tracking: encoders, ultrasonic sensors, other local sensors
- Global localization: laser range scanners, vision-based methods

Vision-based methods for indoor applications
- Stereo vision: directly recovers geometric information, but needs complicated hardware and much processing time
- Omni-directional view: uses a conic mirror, low resolution
- Monocular view using landmarks: relies on artificial landmarks


1. Introduction (2)

Related work on monocular methods
- Sugihara (1988) did pioneering work on localization using vertical edges.
- Atiya and Hager (1993) used geometric tolerance to describe observation error.
- Kosaka and Kak (1992) proposed a model-based monocular vision system with a 3D geometric model.
- Munoz and Gonzalez (1998) added an optimization procedure.
- Talluri and Aggarwal (1996) considered the correspondence problem between a stored 3D model and 2D images in an outdoor urban environment.
- Aider et al. (2005) proposed an incremental model-based localization method using view-invariant regions.
- Another approach adopts the SIFT (Scale-Invariant Feature Transform) algorithm to compute correspondences between stored SIFT features and images acquired during navigation.


1. Introduction (3)

A self-localization method using vertical lines with a monocular view is proposed.

Indoor environments offer horizontal and vertical line features (doors, furniture).

- Find vertical lines and compute pattern vectors
- Match the lines with the corners of the map
- Find the position (x, y, θ) from the matched information


2. Localization algorithm

[Flowchart nodes: map-making and path planning; input image; detect line segments; line segments ≥ 3?; matching lines with map; uncertainty > T?; localization (x, y, θ); destination?; end]

Fig. 1 The flowchart of self-localization
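A minimal Python sketch of this control flow, assuming hypothetical helper callables (capture, detect_lines, match_to_map, solve_pose, at_destination) for the steps detailed on the following slides; the threshold values are illustrative, not taken from the paper.

```python
def localization_loop(capture, detect_lines, match_to_map, solve_pose,
                      at_destination, uncertainty_limit=1.0, min_lines=3):
    """Image -> vertical lines -> match with map -> pose (x, y, theta),
    repeated until the destination is reached (Fig. 1)."""
    pose = None
    while pose is None or not at_destination(pose):
        lines = detect_lines(capture())             # input image, detect line segments
        if len(lines) < min_lines:                  # need at least 3 line segments
            continue
        matches, uncertainty = match_to_map(lines)  # matching lines with map
        if uncertainty > uncertainty_limit:         # too uncertain: take another image
            continue
        pose = solve_pose(matches)                  # localization (x, y, theta)
    return pose
```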


2.1 Line feature detection

- Vertical Sobel operation
- Vertically projected histogram
- One-dimensional averaging and thresholding
- Local maxima are indexed as feature points (see the sketch after Fig. 2)

Fig. 2 Projected histogram and a local maximum

[Figure: histogram value U over feature positions (x1, x2, ..., xn); local maxima above the threshold value occur at x1, x2, x3]
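A minimal sketch of the four steps above, assuming the input is a grayscale image as a 2-D numpy array; the smoothing width and threshold ratio are illustrative parameters, not values from the paper.

```python
import numpy as np

def detect_vertical_lines(gray, smooth=5, thresh_ratio=0.5):
    """Vertical-edge Sobel -> column-projected histogram -> 1-D averaging
    -> thresholding -> local maxima indexed as feature points (x1..xn)."""
    g = gray.astype(float)
    # Sobel-like response to vertical edges: vertical (1, 2, 1) smoothing
    # followed by a horizontal difference of columns.
    p = g[:-2, :] + 2.0 * g[1:-1, :] + g[2:, :]
    edge = np.abs(p[:, 2:] - p[:, :-2])
    hist = edge.sum(axis=0)                         # projected histogram U(x)
    kernel = np.ones(smooth) / smooth
    hist = np.convolve(hist, kernel, mode="same")   # one-dimensional averaging
    threshold = thresh_ratio * hist.max()           # thresholding
    peaks = [x for x in range(1, len(hist) - 1)
             if hist[x] > threshold
             and hist[x] >= hist[x - 1] and hist[x] >= hist[x + 1]]
    return peaks                                    # column indices of vertical lines
```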


2.2 Correspondence of feature vectors (1)

Using the geometrical information of the line features of the map:
- Feature vectors are defined with hue (H) and saturation (S)
- Feature vectors of the right and left regions of each line are defined

Check whether a line meets the floor region: contacted line or non-contacted line.

Define the visibility of the regions of a contacted line: visible region or occluded region.

\[
\mathbf{L}_i = (l_1, l_2)^t, \qquad \mathbf{R}_i = (r_1, r_2)^t, \qquad i = 1, 2, \ldots, n \qquad (1)
\]
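As a rough illustration of Eq. (1), the sketch below forms each pair (L_i, R_i) from the mean hue and saturation of narrow regions to the left and right of a detected line; the HSV input format and the window half-width are assumptions made for the example.

```python
import numpy as np

def line_feature_vectors(hsv, line_xs, half_width=10):
    """For each vertical line at column x, build feature vectors from the mean
    hue (H) and saturation (S) of the regions just left (L_i) and right (R_i)
    of the line, as in Eq. (1). `hsv` is an H x W x 3 array."""
    height, width, _ = hsv.shape
    features = []
    for x in line_xs:
        left = hsv[:, max(0, x - half_width):x, :2]                  # H, S of left region
        right = hsv[:, x + 1:min(width, x + 1 + half_width), :2]     # H, S of right region
        L = left.reshape(-1, 2).mean(axis=0) if left.size else np.zeros(2)
        R = right.reshape(-1, 2).mean(axis=0) if right.size else np.zeros(2)
        features.append((L, R))                                      # (L_i, R_i)
    return features
```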


2.2 Correspondence using feature vectors (2)

Matching of the line feature vectors with the map:
- Lines with both regions visible, lines with one visible region, and non-contacted lines are handled separately.

The correspondence of neighboring lines is investigated using the geometrical relationships between the lines.

Fig. 3 Floor-contacted lines and visible regions
- Contacted lines: x1, x2, x3
- Non-contacted line: x4
- Visible regions: l1, l2, r2, l3, r3, r4
- Occluded regions: r1, l4


2.3 Self-localization using vertical lines (1)

The coordinates of the feature points \((x_1, x_2, \ldots, x_n)\) are matched to the camera coordinates \((Cx_i, Cy_i)\) of the map.

Fig. 4 Global and camera coordinates
[Figure: global axes (X, Y), camera axes (Xc, Yc), camera position (a, b), and map points (Gx1, Gy1), (Gx2, Gy2), (Gx3, Gy3)]


2.3 Self-localization using vertical lines (2)

Fig. 5 Perspective transformation of camera coordinates
[Figure: camera axes (Xc, Yc, Zc), image plane axes (U, V), camera-frame points (Cx1, Cy1), (Cx2, Cy2), (Cx3, Cy3) projected to image features x1, x2, x3]

- \((U, V)\): image plane coordinates
- \((Xc, Yc, Zc)\): camera coordinates
- \((Cx_i, Cy_i),\; i = 1, 2, \ldots, n\): feature points in camera coordinates
- \((x_1, x_2, \ldots, x_n)\): features on the image plane
- \(f\): focal length of the camera


2.3 Self-localization using vertical lines (3)

Camera coordinates can be transformed to world coordinates by a rigid body transformation T.

\[
\begin{bmatrix} Gx_i \\ Gy_i \\ 0 \\ 1 \end{bmatrix}
= T \begin{bmatrix} Cx_i \\ Cy_i \\ 0 \\ 1 \end{bmatrix},
\qquad i = 1, 2, \ldots, n \qquad (2)
\]

The camera coordinates and the world coordinates are related by a translation and a rotation. The transformation T can be defined as

\[
T = T_{trans}\, T_{z} =
\begin{bmatrix}
1 & 0 & 0 & a \\
0 & 1 & 0 & b \\
0 & 0 & 1 & c \\
0 & 0 & 0 & 1
\end{bmatrix}
\begin{bmatrix}
\cos\theta & -\sin\theta & 0 & 0 \\
\sin\theta & \cos\theta & 0 & 0 \\
0 & 0 & 1 & 0 \\
0 & 0 & 0 & 1
\end{bmatrix}
\qquad (3)
\]
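A minimal numpy sketch of Eqs. (2) and (3); taking the z-offset c as zero by default is an assumption for the planar case.

```python
import numpy as np

def rigid_transform(a, b, theta, c=0.0):
    """Homogeneous transformation T = T_trans * T_z of Eq. (3): rotation by
    theta about the z axis followed by translation (a, b, c)."""
    t_trans = np.array([[1, 0, 0, a],
                        [0, 1, 0, b],
                        [0, 0, 1, c],
                        [0, 0, 0, 1]], dtype=float)
    t_z = np.array([[np.cos(theta), -np.sin(theta), 0, 0],
                    [np.sin(theta),  np.cos(theta), 0, 0],
                    [0, 0, 1, 0],
                    [0, 0, 0, 1]], dtype=float)
    return t_trans @ t_z

def camera_to_world(cx, cy, a, b, theta):
    """Map a camera-frame feature point (Cx_i, Cy_i) to world coordinates
    (Gx_i, Gy_i) as in Eq. (2)."""
    g = rigid_transform(a, b, theta) @ np.array([cx, cy, 0.0, 1.0])
    return g[0], g[1]
```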


2.3 Self-localization using vertical lines (4)

Global coordinates are mapped to camera coordinates by the inverse transformation:

\[
\begin{bmatrix} Cx_i \\ Cy_i \\ 0 \\ 1 \end{bmatrix}
= T^{-1} \begin{bmatrix} Gx_i \\ Gy_i \\ 0 \\ 1 \end{bmatrix},
\qquad i = 1, 2, \ldots, n \qquad (4)
\]

The perspective transformation is

\[
x_i = f\,\frac{Cx_i}{Cy_i}, \qquad i = 1, 2, \ldots, n \qquad (5)
\]

The perspective transformation and the rigid transformation of the coordinates induce a system of nonlinear equations \(f_i(a, b, \theta)\) derived from (4) and (5):

\[
\mathbf{F}(a, b, \theta) = \left(f_1(a, b, \theta),\, f_2(a, b, \theta),\, \ldots,\, f_n(a, b, \theta)\right)^t \qquad (6)
\]
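As a sketch of how (4)-(6) fit together, the code below predicts the image coordinate of a map point for a candidate pose (inverse rigid transform, then perspective projection) and stacks the differences against the measured line positions into F. Reading f_i as "predicted minus measured" and treating the focal length as a free parameter are interpretations made for this example.

```python
import numpy as np

def predicted_feature(gx, gy, a, b, theta, focal=1.0):
    """Predict the image coordinate of a map point (Gx_i, Gy_i) seen from a
    camera at (a, b) with heading theta: inverse rigid transform (Eq. 4)
    followed by perspective projection (Eq. 5). Assumes the point lies in
    front of the camera (Cy_i > 0)."""
    dx, dy = gx - a, gy - b                        # translate into the camera origin
    cx = np.cos(theta) * dx + np.sin(theta) * dy   # rotate by -theta (inverse of T_z)
    cy = -np.sin(theta) * dx + np.cos(theta) * dy
    return focal * cx / cy                         # x_i = f * Cx_i / Cy_i

def residuals(pose, map_points, measured_xs, focal=1.0):
    """One reading of F(a, b, theta) in Eq. (6): predicted minus measured
    image coordinates of the matched vertical lines."""
    a, b, theta = pose
    return np.array([predicted_feature(gx, gy, a, b, theta, focal) - x
                     for (gx, gy), x in zip(map_points, measured_xs)])
```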


2.3 Self-localization using vertical lines (5)

The Jacobian matrix is

\[
J(a, b, \theta) =
\begin{bmatrix}
\dfrac{\partial f_1}{\partial a}(a,b,\theta) & \dfrac{\partial f_1}{\partial b}(a,b,\theta) & \dfrac{\partial f_1}{\partial \theta}(a,b,\theta) \\
\dfrac{\partial f_2}{\partial a}(a,b,\theta) & \dfrac{\partial f_2}{\partial b}(a,b,\theta) & \dfrac{\partial f_2}{\partial \theta}(a,b,\theta) \\
\vdots & \vdots & \vdots \\
\dfrac{\partial f_n}{\partial a}(a,b,\theta) & \dfrac{\partial f_n}{\partial b}(a,b,\theta) & \dfrac{\partial f_n}{\partial \theta}(a,b,\theta)
\end{bmatrix}
\qquad (7)
\]

Newton's method for finding the solution of the nonlinear equations \(\mathbf{F}(a, b, \theta) = 0\), given an initial value \(\mathbf{p}^{(0)}\), is

\[
\mathbf{p}^{(k)} = \mathbf{p}^{(k-1)} - J^{-1}\!\left(\mathbf{p}^{(k-1)}\right)\mathbf{F}\!\left(\mathbf{p}^{(k-1)}\right) \qquad (8)
\]

where \(\mathbf{p}^{(k)} = (a_k, b_k, \theta_k)^t\).
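A sketch of the iteration in (8), building on the residuals function from the previous sketch. The Jacobian of (7) is approximated here by finite differences, and because F has n components but only three unknowns, a pseudo-inverse stands in for the inverse of J (a Gauss-Newton style step); the iteration count and step sizes are illustrative.

```python
import numpy as np

def newton_localize(p0, map_points, measured_xs, focal=1.0,
                    iterations=20, fd_step=1e-6):
    """Solve F(a, b, theta) = 0 by the iteration of Eq. (8), starting from
    the initial value p0 = (a0, b0, theta0)."""
    p = np.asarray(p0, dtype=float)                    # p^(0)
    for _ in range(iterations):
        F = residuals(p, map_points, measured_xs, focal)
        # Finite-difference approximation of the n x 3 Jacobian of Eq. (7)
        J = np.empty((len(F), 3))
        for j in range(3):
            dp = np.zeros(3)
            dp[j] = fd_step
            J[:, j] = (residuals(p + dp, map_points, measured_xs, focal) - F) / fd_step
        step = np.linalg.pinv(J) @ F                   # J^-1 F via pseudo-inverse
        p = p - step                                   # p^(k) = p^(k-1) - J^-1 F(p^(k-1))
        if np.linalg.norm(step) < 1e-9:                # converged
            break
    return p                                           # estimated (a, b, theta)
```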


3. Experimental results (1)

No. | Real X (mm) | Real Y (mm) | Real angle (°) | Measured X (mm) | Measured Y (mm) | Measured angle (°)
1 | 0 | 0 | 0 | 23.79 | 46.13 | 0.04
2 | -160 | 100 | 0 | 32.51 | 41.54 | 1.53
3 | -160 | 200 | 0 | 49.90 | 58.28 | 1.24
4 | -160 | 400 | 0 | 34.54 | 74.82 | 1.39
5 | -160 | 600 | 0 | 37.67 | 61.41 | 1.20
6 | -160 | 800 | 0 | 29.35 | 43.86 | 1.31
7 | -160 | 1000 | 0 | 26.57 | 100.37 | 1.37
8 | -160 | 1200 | 0 | 30.46 | 18.47 | 1.38
9 | -160 | 1400 | 0 | 18.59 | 96.39 | 1.49
10 | -160 | 1600 | 0 | 14.18 | 93.74 | 1.31
11 | -160 | 1800 | 0 | 9.38 | 9.46 | 2.00
12 | 0 | 660 | 0 | 20.21 | 7.84 | 3.32
13 | 0 | 1150 | 0 | 34.61 | 72.98 | 2.29
14 | 0 | 1555 | 0 | 24.36 | 44.78 | 1.83
15 | 0 | 2005 | 0 | 15.66 | 35.60 | 0.26
16 | -650 | 380 | 0 | 50.08 | 88.55 | 0.61
17 | -630 | 660 | 0 | 15.84 | 32.29 | 0.89
18 | -560 | 1045 | -48 | 4.15 | 27.56 | 1.78
19 | -465 | 1300 | -45 | 36.01 | 59.97 | 0.84
20 | -375 | 1465 | -47 | 28.03 | 41.17 | 2.82
21 | -285 | 1740 | -37 | 11.27 | 25.81 | 0.83
22 | -190 | 2125 | -32 | 7.33 | 73.29 | 1.69
23 | -75 | 2435 | -22 | 18.21 | 70.05 | 1.15
24 | -25 | 2715 | -10 | 19.25 | 44.07 | 3.24
25 | 165 | 3034 | -23 | 10.03 | 70.43 | 1.97
26 | 325 | 3455 | -27 | 80.63 | 66.94 | 1.45
27 | 370 | 3915 | -15 | 9.045 | 11.04 | 2.5
Errors | | | | 32.83 | 53.80 | 1.58

Table 1 Real positions and errors


3. Experimental results (2)

Fig. 6 Mobile robot

Fig. 7 The procedure of detecting vertical lines: (a) original image, (b) vertical edges, (c) projected histogram, (d) vertical lines


3. Experimental results (3)

Fig. 8 Input image of each sequence


3. Experimental results (4)

Fig. 9 The result of localization in the given map

Fig. 10 Errors along the Y axis


4. Conclusions

- A self-localization method using vertical line segments with a monocular view was proposed.
- Line features are detected from the projected histogram of the edge image.
- Pattern vectors and their geometrical properties are used for matching with the points of the map.
- A system of nonlinear equations involving the perspective and rigid transformations of the matched points is derived, and Newton's method is used to solve it.
- The proposed algorithm using a monocular view is simple and applicable to indoor environments.