
Advanced Robotics, 2014, Vol. 28, No. 18, 1243–1251, http://dx.doi.org/10.1080/01691864.2014.920721

FULL PAPER

Scan matching method using projection in dominant direction of indoor environment

Shigeru Bando a*, Takashi Tsubouchi a and Shin’ichi Yuta b

a Graduate School of Systems and Information Engineering, University of Tsukuba, 1-1-1 Tennoudai, Tsukuba, Ibaraki 305-8573, Japan; b Department of Electrical Engineering, Shibaura Institute of Technology, 3-7-5 Toyosu, Koto-ku, Tokyo 135-8548, Japan

(Received 16 October 2013; revised 24 February 2014; accepted 6 April 2014)

In this paper, we propose a new scan matching method which uses the feature of structure in an artificial environment where walls are located parallel or perpendicular to one another. This method can precisely obtain the relative pose between two scan data using the dominant direction with a small calculation cost. The efficacy of this method is demonstrated by an experiment conducted in an indoor environment.

Keywords: scan matching; dominant direction; mobile robot; indoor

1. Introduction

Precise recognition of the current location is crucial for autonomous mobile robots. Generally, an algorithm which aligns scan data obtained by sensors and estimates the current pose is implemented in robots. This process is known as SLAM (Simultaneous Localization and Mapping), and various techniques have been developed and reported so far.

Translational and rotational movements are determined based on the pose which gives the best overlap between a current and a previous scan data. In two-dimensional (2D) coordinates, many scan matching methods have been proposed. One of the most well-known methods is Iterative Closest Point (ICP),[1] which uses point cloud data. Other methods which use projection of data,[2] or image processing techniques for matching, have also been reported.[3,4] We have developed a scan matching algorithm using the feature of the direction of a building in an artificial environment. It exploits the dominant direction of an artificial environment. This method can precisely give the relative pose between two scan data using the dominant direction in an indoor environment with a small calculation cost, even when a robot makes a relatively large move and no initial values for the matching are given. We compare our method with ICP regarding the calculation time and precision. The efficacy of our method is examined for various kinds of scan data under different conditions.

2. Related works

Scan matching methods using two scan data obtained by a 2D laser scanner have been widely investigated. Here, the scan data under analysis is called the current scan data, and matching is conducted between the current scan data and the data collected one step before, which is called the reference scan data. In a scan matching, the pose at the current step is obtained when the degree of matching between the current scan data and the reference is maximized.

*Corresponding author. Email: [email protected]

ICP [1] is known as one of the most popular scan matching methods that use point cloud data. In ICP and ICP-like algorithms,[5–7] an initial value is tentatively given to the relative displacement (translation and rotation) of the current scan data with respect to the reference, closest points are picked from both as corresponding points, and the sum of all the distances between corresponding points is used to evaluate the matching result. The calculation is repeated until the sum of distances reaches its minimum value. The relative displacement is obtained from the pose at which the calculation converges. In general, this method gives a good result when the scan data are composed of densely distributed points and the initial estimate of the displacement is close to the true value. When the initial estimate of the displacement is far from the true value, the calculation sometimes falls into a local minimum and gives an incorrect result.

There is another approach, which applies the Hough transform [8] to scan data and conducts the matching in Hough coordinates.[3,4] In this approach, the relative displacement is quantized with a certain spacing and every quantized state is regarded as a candidate for the displacement. Since all possible displacements are included in the candidates, information about the initial displacement is not required for the matching. However, the accuracy of matching depends on the resolution of the quantization, which cannot be


altered once the matching process begins. A high calculation cost required for high-resolution quantization could also be a problem.

Previous studies on scan matching have not taken into account the characteristic properties of the environment in artificial facilities such as buildings, namely that most of the structures are composed of planes connected parallel or perpendicular to each other. However, humans use these features effectively to recognize the environment. If we can install this recognition process in a mobile robot, scan matching can be conducted more efficiently and stably. In this paper, we present a scan matching method using the feature of shapes of an indoor environment, which can be conducted without an initial value of the displacement and realizes highly accurate scan matching with a lower calculation cost. This method is based on the same approach as Hough Scan Matching, but, compared with it, our method achieves an improvement in the calculation cost.

3. New approach of scan matching using dominant direction

In this study, we focused on the artificial environment, especially the indoor space. Most artificial environments consist of planes which are aligned along specific directions. Especially in buildings, walls and furniture are usually located parallel or perpendicular to one another (Figure 1). Since common specific directions can be found from any location in such an environment over the whole area, it would be helpful if a mobile robot could exploit such information while recognizing and understanding the environment. Hereafter, the specific direction is called the dominant direction of the environment. In general, an indoor environment has two dominant directions and they are perpendicular to each other.

A dominant direction of the environment is obtained as a direction of lines which appear in a scan data, where a large number of points form lines extending in the same direction. In other words, the projected profile of the scan data along the dominant direction shows quite sharp peaks. So, the projected profiles obtained in the two dominant directions show the spatial feature of the environment (Figure 2).

Figure 2. An example of 2D scan data and their 1D projected profile. (a) Reflection points of 2D scan data. (b-1) and (b-2) Case of an angle parallel to the dominant direction: the histogram shows a sharp profile. (c) Case of an angle off-parallel to the wall: the histogram shows a flat profile. (d) Scenery of the inside of the hallway.

Based on this feature of the dominant direction, our method determines the displacement in translation and rotation, respectively.

The proposed scan matching method is conducted as follows.

Estimation of rotation

• The dominant directions are obtained for two sets of scan data. The dominant directions are assumed to be the same for both scan data.

• However, there is a possibility that the dominant directions for the current scan data are oriented 0°, 90°, 180°, or 270° relative to those for the reference. This arbitrariness in direction can be avoided by a technique which exploits the maximum value of the cross-correlation of projected profiles, as shown in the next step.

Figure 1. Photos of the inside of the building (University of Tsukuba, 3L302-1 and Hall L).

Estimation of translation

• The translation is obtained between the poses which maximize the cross-correlation between the projected profiles of the current scan data and the reference scan data. The projection is conducted along the dominant direction.

Decision of validity of matching

• The result of the scan matching is evaluated based on the cross-correlation value of the projected profiles. In general, the scan matching cannot give a correct pose unless a pair of scan data are well overlapped. Therefore, if the difference in shape of the environment is too large, the cross-correlation value of the projected profiles is not available for the evaluation of the scan matching.

The proposed method has essentially the same scheme as the previous works,[3,4] which obtain the rotational and translational movement using a projected profile of scan data. The previous studies did not use the feature of shapes in the artificial environment, and examined projected profiles in all directions. Thus, they need to calculate the cross-correlation of the 2D projected profiles. In contrast, our method only requires calculation of the cross-correlation for a one-dimensional (1D) projected profile, because it simultaneously deals with projected profiles in the two orthogonal dominant directions. As a result, the translational movement can be obtained with a low calculation cost even at a high resolution. Also, the rotational movement is not accompanied by accumulated errors, because it is estimated as an angle with respect to the dominant directions of the real environment instead of by comparison of the current scan data with the reference. In the next section, the details of this method are described.

4. Detailed algorithm and implementation of scan matching using dominant direction

4.1. Structure of processing

The proposed method is composed of two parts as shown in Algorithm 1, which is described in pseudo-code. The first part is an algorithm for dominant direction detection in the scan data (see Algorithm 2); the second is an algorithm for obtaining the relative displacement using cross-correlation as a parameter (see Algorithm 3). The input data for the proposed method are indicated as Scan_ref and Scan_cur for the reference and the current scan data, respectively. Both Scan_ref and Scan_cur are expressed in a point cloud format. The output of the proposed method is the relative displacement T = (Δx, Δy, Δθ) of the pose between Scan_ref and Scan_cur.

Algorithm 1 Overview of the proposed method

Ensure: T = ScanMatching(Scan_ref, Scan_cur)
  Scan_ref′ ⇐ ThinOut(Scan_ref)
  Scan_cur′ ⇐ ThinOut(Scan_cur)
  θ_D^ref ⇐ GetDominantDirection(Scan_ref′)
  θ_D^cur ⇐ GetDominantDirection(Scan_cur′)
  T ⇐ GetRelativePosition(Scan_ref′, θ_D^ref, Scan_cur′, θ_D^cur)
  return T
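A structural Python sketch of Algorithm 1 is given below for illustration only; the callables thin_out, get_dominant_direction, and get_relative_position are hypothetical stand-ins for the steps detailed in Sections 4.2–4.4, and returning None on failure replaces the pseudo-code's return of false.

```python
from typing import Callable, Optional, Tuple
import numpy as np

Scan = np.ndarray                  # polar scan data: shape (N, 2), columns (phi_i [rad], d_i [m])
Pose = Tuple[float, float, float]  # relative displacement (dx, dy, dtheta)

def scan_matching(scan_ref: Scan, scan_cur: Scan,
                  thin_out: Callable[[Scan], Scan],
                  get_dominant_direction: Callable[[Scan], float],
                  get_relative_position: Callable[[Scan, float, Scan, float], Optional[Pose]]
                  ) -> Optional[Pose]:
    """Skeleton of Algorithm 1: thin out both scans, detect their dominant
    directions, then estimate the relative pose; None signals a failed match."""
    scan_ref = thin_out(scan_ref)
    scan_cur = thin_out(scan_cur)
    theta_ref = get_dominant_direction(scan_ref)
    theta_cur = get_dominant_direction(scan_cur)
    return get_relative_position(scan_ref, theta_ref, scan_cur, theta_cur)
```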

A 2D laser scanner, UTM-30LX (Hokuyo Automatic Co., Ltd.), is used for this work. The area which can be observed by the scanner is 270° in viewing angle (2ϕ_o) and 30 m in distance. The viewing angle 2ϕ_o is divided by N (= 1080), which gives Δϕ (= 0.25°). The error of distance is less than 1%. The divergence of the laser beam is less than 1°.

4.2. Thin out of scan data – ThinOut –

Figure 3 shows the overview of thinning out of points in a scan data. This process is conducted to homogenize the density of points in a scan data. The points in a scan data obtained by a 2D laser scanner have a tendency to be localized due to the mechanistic properties of the sensor. The density of the scan data varies with the distance of reflection points; that is, it becomes high in the area close to the origin. As a result, the projected profile of the scan data is affected by the distance from the origin. In this method, the scan data are thinned out if the distance between a pair of points is shorter than a threshold, so that the density of the scan data is kept almost constant. In this work, the threshold of distance is set at 0.08 m.
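A minimal Python sketch of this thinning step is shown below; the function name thin_out and the rule of comparing each point with the most recently kept point are illustrative assumptions, not the exact implementation used in this work.

```python
import numpy as np

def thin_out(points_xy: np.ndarray, min_dist: float = 0.08) -> np.ndarray:
    """Homogenize point density: keep a point only if it lies at least
    `min_dist` [m] away from the previously kept point (cf. Section 4.2)."""
    kept = [points_xy[0]]
    for p in points_xy[1:]:
        if np.linalg.norm(p - kept[-1]) >= min_dist:
            kept.append(p)
    return np.asarray(kept)

# Example: convert a dummy scan (phi_i, d_i) to Cartesian points and thin it out.
phi = np.deg2rad(np.linspace(-135.0, 135.0, 1080))   # UTM-30LX field of view
d = np.full_like(phi, 3.0)                           # all ranges 3 m (dummy data)
xy = np.column_stack((d * np.cos(phi), d * np.sin(phi)))
print(len(xy), "->", len(thin_out(xy)))              # point count drops noticeably
```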

4.3. Dominant direction detection in the scan data – GetDominantDirection –

4.3.1. Flow of processing

The algorithm for dominant direction detection has already been proposed by the authors.[9,10] The dominant direction of the scan data, θ_D, is obtained as the one which gives the maximum value of the evaluation parameter M(θ) (see Algorithm 2). In this section, the details of the algorithms are described.

Figure 3. Example of an original 2D laser range sensor data (left) and after thinning out (right).


4.3.2. Projected profile of scan data – GetProjectedProfile –

The projected profile of the scan data is obtained by counting the number of points which are included in each bin on the projection axis. A set of scan data obtained by a 2D laser scanner is defined as follows:

Scan(ϕ_i, d_i) : {i = 1, · · · , N}    (1)

where i is the reflection point number obtained by the 2D laser scanner, and N is the number of reflection points. Setting the front side of the sensor as the origin, the direction of the reflection point is obtained as follows (see Figure 3):

ϕ_i = −ϕ_o + Δϕ · i    (2)

Δϕ = 2ϕ_o / (N − 1)    (3)

where Δϕ is the angular resolution of the sensor. The range of the angle is from −ϕ_o to ϕ_o, and d_i is the measured distance of the reflection point at the angle ϕ_i. For each projection angle θ, the reflection point is projected to the bin, and the projected profile of the scan data is obtained as follows:

ρ_θ(i) = d_i · sin(ϕ_i + θ)    (4)

P_θ(κ) = Σ_{i=1}^{N} f(ρ_θ(i), κ),  κ = 0, · · · , len − 1    (5)

f(ρ, κ) = 1 if κΔρ − d_max < ρ ≤ (κ + 1)Δρ − d_max, and 0 otherwise    (6)

where len is the length of the one-directional array of the projected profile. We implemented the algorithm so that it collects the projected profile of the scan data for each θ. In this work, the resolution of θ was set at 0.1°, the maximum distance d_max at 10 m, and the width of a bin of the projected profile Δρ at 0.04 m.
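A direct Python transcription of Equations (4)–(6) could look like the sketch below; the function name get_projected_profile is illustrative, and the half-open binning convention differs from Equation (6) only at bin boundaries.

```python
import numpy as np

def get_projected_profile(phi: np.ndarray, d: np.ndarray, theta: float,
                          d_max: float = 10.0, d_rho: float = 0.04) -> np.ndarray:
    """Projected profile P_theta(kappa) of a polar scan (phi_i [rad], d_i [m]).
    Bins of width d_rho cover the interval [-d_max, d_max); see Equations (4)-(6)."""
    rho = d * np.sin(phi + theta)                      # Equation (4)
    n_bins = int(round(2.0 * d_max / d_rho))           # 'len' in the text
    kappa = np.floor((rho + d_max) / d_rho).astype(int)
    valid = (kappa >= 0) & (kappa < n_bins)            # drop points outside the range
    return np.bincount(kappa[valid], minlength=n_bins).astype(float)

# Points on a wall at x = 2 m collapse into a single bin when the projection
# angle is aligned with the wall normal, and spread over many bins otherwise.
phi = np.deg2rad(np.linspace(-45.0, 45.0, 300))
d = 2.0 / np.cos(phi)
print(get_projected_profile(phi, d, theta=np.pi / 2).max())   # 300.0 (sharp peak)
print(get_projected_profile(phi, d, theta=0.0).max())         # a few points per bin
```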

4.3.3. Evaluation of the direction using power spectrum – EvaluateDirection –

In this method, the direction of the projection is evaluated using the power spectrum of the projected profile of the scan data by the following steps.

• The projected profile of the scan data P_θ(κ) is Fourier transformed to obtain H(θ, ω).

• The second moment of the power spectrum is obtained for each θ as follows:

M(θ) = ∫_0^∞ ω² |H(θ, ω)| dω    (7)

The value M(θ) is used to find θ for the dominant direction.
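One possible realization of this evaluation step (used as EvaluateDirection in Algorithm 2 below) is sketched here, under the assumption that a discrete FFT of the binned profile and a discrete sum over frequencies are adequate stand-ins for the continuous transform and integral in Equation (7).

```python
import numpy as np

def evaluate_direction(profile: np.ndarray) -> float:
    """Second moment of the spectrum of a projected profile, a discrete
    approximation of M(theta) in Equation (7). Sharp peaks in the profile
    produce strong high-frequency content, so M is large when the projection
    angle is aligned with a dominant direction."""
    spectrum = np.abs(np.fft.rfft(profile))      # |H(theta, omega)|
    omega = np.fft.rfftfreq(profile.size)        # normalized frequencies
    return float(np.sum(omega ** 2 * spectrum))  # discrete sum instead of the integral

# A peaky profile scores higher than a flat profile of equal total mass.
peaky = np.zeros(500)
peaky[300] = 300.0
flat = np.full(500, 300.0 / 500.0)
print(evaluate_direction(peaky) > evaluate_direction(flat))   # True
```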

Algorithm 2 Dominant direction detection

Ensure: θ_D = GetDominantDirection(Scan)
  θ_D ⇐ 0, M_max ⇐ 0
  θ ⇐ −180°
  while θ < 180° do
    P_θ ⇐ GetProjectedProfile(Scan, θ)
    M ⇐ EvaluateDirection(P_θ)
    if M > M_max then
      M_max ⇐ M
      θ_D ⇐ θ
    end if
    θ ⇐ θ + 0.1°
  end while
  return θ_D

4.4. Relative pose estimation – GetRelativePose –

4.4.1. Flow of processing

Initially, an axis x′ is assumed as a dominant direction in the reference scan data, and the axis y′ is taken perpendicular to x′. Algorithm 3 shows the overview of the relative pose estimation method. First, the projected profiles are obtained from the reference scan data along the axes x′ and y′. Second, the cross-correlation is calculated between the projected profiles of the current scan data and the reference, which is conducted in the range (−λ_max, λ_max). λ_max is the maximum value of λ, where λ is the distance of translational movement of the current scan data along the dominant direction. The cross-correlation is examined for all patterns of the dominant direction with θ_offset = {0°, 90°, 180°, 270°}. The cross-correlation value along the axis x′ is denoted as Corr_x′ and that along the axis y′ as Corr_y′. θ_offset is determined as the one which gives the maximum value of E = Corr_x′max · Corr_y′max. Finally, the relative displacement in the x′–y′ coordinates (Δx′, Δy′) is obtained. The result of scan matching is the global maximum with respect to the relative displacement, because our method conducts matching without defining an initial value.

The decision on success or failure of the scan matching is made based on the value of the cross-correlation. The value is defined by the product of the zero-mean normalized cross-correlation (ZNCC) for both axes, x′ and y′. When the value is bigger than the threshold, the scan matching is regarded as successful. The threshold is set to avoid errors in the relative displacement which are caused by occlusion due to movement of the sensors.
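As a sketch of this decision step (the names zncc and matching_is_valid are illustrative, and the threshold value 0.25 is the one reported in Section 5.1):

```python
import numpy as np

def zncc(a: np.ndarray, b: np.ndarray) -> float:
    """Zero-mean normalized cross-correlation of two equally long profiles,
    returning a value in [-1, 1]."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(np.dot(a, b) / denom) if denom > 0.0 else 0.0

def matching_is_valid(p_ref_x, p_cur_x, p_ref_y, p_cur_y,
                      threshold: float = 0.25) -> bool:
    """Accept the match only if the product of the ZNCC values along the two
    dominant-direction axes x' and y' exceeds the empirical threshold."""
    return zncc(p_ref_x, p_cur_x) * zncc(p_ref_y, p_cur_y) > threshold
```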

In the above process, the relative displacement is obtained in the x′–y′ coordinates, which are the dominant direction coordinates of the scan data. Hence, the relative displacement in the global coordinates T = (Δx, Δy, Δθ) is obtained as follows:

T = (Δx, Δy, Δθ)
Δx = Δx′ cos(θ_D^ref) − Δy′ sin(θ_D^ref)
Δy = Δx′ sin(θ_D^ref) + Δy′ cos(θ_D^ref)
Δθ = (θ_D^cur + θ_offset) − θ_D^ref    (8)
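Equation (8) amounts to a single 2D rotation of the displacement plus an angle difference, as in the following sketch (the function name to_global is illustrative; angles are in radians).

```python
import numpy as np

def to_global(dx_p: float, dy_p: float,
              theta_d_ref: float, theta_d_cur: float, theta_offset: float):
    """Equation (8): rotate the displacement expressed in the reference scan's
    dominant-direction frame (x', y') into the reference (global) coordinates,
    and take the rotation as the difference of dominant directions."""
    dx = dx_p * np.cos(theta_d_ref) - dy_p * np.sin(theta_d_ref)
    dy = dx_p * np.sin(theta_d_ref) + dy_p * np.cos(theta_d_ref)
    dtheta = (theta_d_cur + theta_offset) - theta_d_ref
    return dx, dy, dtheta
```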

Algorithm 3 Relative Pose Estimation

Ensure: T = GetRelativePosition(Scan_r, θ_r, Scan_c, θ_c)
  E_max ⇐ 0, Δx′ ⇐ 0, Δy′ ⇐ 0
  P^r_x′ ⇐ GetProjectedProfile(Scan_r, θ_r)
  P^r_y′ ⇐ GetProjectedProfile(Scan_r, θ_r + 90°)
  θ_offset ⇐ 0°
  while θ_offset ≤ 270° do
    θ_c′ ⇐ θ_c + θ_offset
    Corr_x′max ⇐ 0, Corr_y′max ⇐ 0, Δx′_tmp ⇐ 0, Δy′_tmp ⇐ 0
    λ ⇐ −λ_max
    while λ ≤ λ_max do
      P^c_x′ ⇐ GetProjectedProfileWithλ(Scan_c, θ_c′, λ)
      Corr_x′ ⇐ GetCorrelation(P^r_x′, P^c_x′)
      if Corr_x′ > Corr_x′max then
        Corr_x′max ⇐ Corr_x′
        Δx′_tmp ⇐ λ
      end if
      P^c_y′ ⇐ GetProjectedProfileWithλ(Scan_c, θ_c′ + 90°, λ)
      Corr_y′ ⇐ GetCorrelation(P^r_y′, P^c_y′)
      if Corr_y′ > Corr_y′max then
        Corr_y′max ⇐ Corr_y′
        Δy′_tmp ⇐ λ
      end if
      λ ⇐ λ + Δλ
    end while
    E ⇐ Corr_x′max · Corr_y′max
    if E > E_max then
      E_max ⇐ E
      Δx′ ⇐ Δx′_tmp
      Δy′ ⇐ Δy′_tmp
      Δθ ⇐ (θ_c + θ_offset) − θ_r
    end if
    θ_offset ⇐ θ_offset + 90°
  end while
  if E_max < E_threshold then
    return false
  else
    Δx ⇐ Δx′ cos(θ_r) − Δy′ sin(θ_r)
    Δy ⇐ Δx′ sin(θ_r) + Δy′ cos(θ_r)
    T ⇐ {Δx, Δy, Δθ}
    return T
  end if

4.4.2. Projected profile of the scan data – GetProjectedProfileWithλ –

The relative displacement is estimated using the projected profiles of the scan data obtained along the dominant direction θ_D and the direction perpendicular to it. The projected profile of the reference scan data is obtained using Equation (5). However, the projected profile of the current scan data is obtained by a different process. The current scan data are moved by λ along the dominant direction θ_D, where λ is varied from −λ_max to λ_max with a step of Δλ. Then, the projected profile of the scan data is obtained as follows:

P^cur_θD(λ, κ) = Σ_{i=1}^{N} f(ρ_θD(i) + λ, κ),  κ = 0, · · · , len − 1    (9)

4.4.3. Cross-correlation of the projected profile

The cross-correlation of the projected profile between the reference scan and the current scan is obtained as follows:

Corr(λ) = Σ_{κ=0}^{len−1} P^ref_θr(κ) · P^cur_θc′(λ, κ)    (10)

where θ_r is the dominant direction of the reference scan data, and θ_c′ = θ_c + θ_offset is the dominant direction of the current scan data, which is the same as that of the reference scan. P^ref_θr(κ) and P^cur_θc′(λ, κ) are the projected profiles obtained in the dominant direction θ_r. This process can be conducted by a 1D search and by parallel processing. Thus, it can find the most probable displacement from all possible ones with a low calculation cost. The relative displacement of translation in the dominant direction θ_r is obtained from the λ which gives the maximum value of Corr(λ).
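The sketch below illustrates Equations (9) and (10) along one dominant direction: for each candidate shift λ, the projections of the current scan are offset, re-binned, and correlated with the reference profile, and the λ that maximizes the correlation is returned. Re-binning at every λ is only one simple realization; the function name best_translation and the default values (d_max = 10 m, Δρ = 0.04 m, λ_max = 2.5 m, Δλ = 0.005 m, the values reported in Sections 4.3.2 and 5.1) are assumptions for illustration.

```python
import numpy as np

def best_translation(rho_ref: np.ndarray, rho_cur: np.ndarray,
                     d_max: float = 10.0, d_rho: float = 0.04,
                     lam_max: float = 2.5, d_lam: float = 0.005):
    """1D search over the shift lambda along one dominant direction.
    rho_ref, rho_cur are the projections d_i * sin(phi_i + theta) of the two
    scans onto that direction (Equation (4)). For each lambda the current
    projections are shifted and binned (Equation (9)) and correlated with the
    reference profile (Equation (10))."""
    n_bins = int(round(2.0 * d_max / d_rho))

    def profile(rho: np.ndarray) -> np.ndarray:
        kappa = np.floor((rho + d_max) / d_rho).astype(int)
        valid = (kappa >= 0) & (kappa < n_bins)
        return np.bincount(kappa[valid], minlength=n_bins).astype(float)

    p_ref = profile(rho_ref)
    best_lam, best_corr = 0.0, -np.inf
    for lam in np.arange(-lam_max, lam_max + d_lam, d_lam):
        corr = float(np.dot(p_ref, profile(rho_cur + lam)))   # Equation (10)
        if corr > best_corr:
            best_lam, best_corr = lam, corr
    return best_lam, best_corr

# Example: two "walls" seen in the reference, and a current scan shifted by 0.30 m.
rho_ref = np.concatenate([np.full(200, 2.0), np.full(150, -1.5)])
print(best_translation(rho_ref, rho_ref - 0.30)[0])   # ~0.30 (within one bin of the true shift)
```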

In the proposed method, the projected profile is not treated as circular, because, in the implementation, the proposed method does not use the Fourier transform but simply calculates the cross-correlation between two profiles. In this process, the cross-correlation value between two profiles is obtained as the sum of products of corresponding bins in the region where the value of the profile is positive.

5. Experiments for evaluation

In this research, we examined our method with two kinds of experiments regarding the processing time and the accuracy of estimation of the displacement. Scan matching of a pair of scan data and map building using sequential scan matching were tried, respectively. In these experiments, the scan data were obtained by a 2D laser scanner (UTM-30LX, Hokuyo Automatic Co., Ltd.), and a horizontal plane at a height of 0.3 m from the floor was scanned. The experiments were conducted in the hallway of a building at the Department of Engineering, University of Tsukuba.

5.1. Results and evaluation of the pair of the scan matching

In this experiment, the proposed method is evaluated with the processing time, the errors of the matching results, and the final decision of success. An ICP scan matching implemented in the Mobile Robot Programming Toolkit (MRPT [11]) is used for comparison. The ICP is used with default parameters, and the optimizer that minimizes the average square error between two scan data follows the Levenberg-Marquardt algorithm.[12]


Figure 4. Map of experiment (scan positions No. 0–14; axes X [m], Y [m]).

Table 1. Percentage of overlapped area between a pair of scan data.

Note: shaded cells indicate that the proportion is more than 40%.

The measurement of scan data was automatically carried out whenever the robot made a translation of about 0.4 m or a rotation of 10°. First, the robot followed the path shown in Figure 4 and acquired 15 sets of scan data. The proposed method and ICP were applied to all combinations of scan data; the number of combinations is 105 in total. Table 1 shows the percentage of overlapped area for each matching. For some of the matchings, more than 50% of the scan data overlaps, which occurs for scan data measured at neighboring locations. For the other combinations, the overlap of the scan data is less than 30%.

ICP assumes that the origin of the pose is the same for the first two sets of scan data. On the other hand, our method is conducted based on the cross-correlation of the projected profile from the initial pose to ±λ_max = 2.5 m with a step of Δλ = 0.005 m.

Tables 2 and 3 show the results of the scan matching, i.e. whether the matching is successful (S) or failed (F). If the result has a big difference from the true value (i.e. more than 0.2 m or 30°), it is regarded as a failure. The results show that the number of successful matchings of the proposed method is larger than that of ICP. When the overlap of the scan data is large, both methods work similarly well. However, in some cases, even though the overlapping proportion is high, the ICP method fails. That is attributed to a big difference between the initial estimation and the true value of the relative displacement. The threshold value for evaluation of matching, indicated as ZNCC, for the axes x′ and y′ is set at 0.25. When the value is bigger than the threshold (> 0.25), the scan matching is determined as a success. The value of the threshold was determined empirically with various patterns of scan data.


Table 2. Experimental results of the scan matching by ICP.

Note: S: success, F: failure.

Table 3. Experimental results of the scan matching by the proposed method.

Note: S: success, F: failure.

The processing time of ICP is 0.011 s and that of the proposed method is 0.017 s. So, both methods need almost the same calculation time for matching.

In the case of ICP, the average error is (0.07 m, 0.51°) and the standard deviation of the errors is (0.03 m, 0.32°). In the case of the proposed method, the average error is (0.04 m, 0.33°) and the standard deviation of the errors is (0.02 m, 0.25°). The result shows that our method gives more accurate results than ICP regarding the error of the relative displacement. This experiment shows that the proposed method does not need an initial value for matching and is able to deal with a larger movement with good accuracy at almost the same calculation cost.

Tables 1 and 3 show that the proposed method can obtain the correct pose when the proportion of the overlapping part is more than 45%. Also, the accuracy of the proposed method is better than that of ICP at a comparable calculation time. Thus, the efficacy of this method is demonstrated in the indoor environment.

5.2. Experiment of large map building

In this experiment, pose tracking of the robot and map building were conducted by sequential scan matching in the indoor environment. The building in which the experiment was conducted is 136 m long and 236 m wide (see Figure 5). The total distance of the travelled path was about 1124 m. The measurement of scan data was automatically carried out.


Figure 5. Track of experiment (1124 m).

Figure 6. Map built by the proposed method.

The robot followed this path once and acquired 2800 sets of scan data. The scan matching was conducted with the proposed method using the same parameters as in the previous experiment described in Section 5.1. Conventional matching methods using ICP or other approaches would have errors in each estimation of rotation; therefore, the angular errors of the robot would accumulate and the map building would fail when the environment is very large. However, Figure 6 shows that our method succeeded in building the map without errors in the direction. The accumulated errors of the map are evaluated by measuring the errors between the initial and final poses of a circuital route. In this result, the errors for two circuital routes which began at (85, 33) and (83, −19) in Figure 6 were measured. For the route which began at (85, 33), the robot followed the path in the order of D, C, A, B, and D. The errors were 0.3 m in the X direction and 0.8 m in the Y direction. Compared with the travel distance of 398 m over this circuital route, these errors are very small. For the other route, which began at (83, −19), the robot followed the path in the order of G, D, C, A, B, D, C, B, F, and G. The errors were 0.5 m in the X direction and 1.2 m in the Y direction. Compared with the travel distance of 901 m over this circuital route, these errors are also very small. Thus, the efficacy of this method was demonstrated for building a map of a large indoor area.

6. Conclusion

We have developed a scan matching algorithm using the feature of structures in the artificial environment. In this paper, we proposed a new scan matching method which uses the dominant direction in the artificial environment. This method can precisely obtain the relative pose between two scan data using the dominant direction in an indoor environment with a small calculation cost, especially when the movement is large and initial values for the matching are not given. Our approach uses this feature regarding the direction, and the translational and rotational movements are obtained individually. This approach is similar to the recognition of the environment conducted by humans. We compared our method and ICP regarding the calculation cost, time, and precision; the efficacy of our method was examined for matching of 2D shapes in the indoor environment and was finally demonstrated.

Acknowledgements
This work was supported by JSPS Grant-in-Aid for JSPS Fellows, Grant Number 25-1761.

Notes on contributors
Shigeru Bando received the BE, ME, and PhD degrees in Engineering from the University of Tsukuba, Japan, in 2009, 2011, and 2014. He is currently a JSPS Postdoctoral Research Fellow at the Intelligent Robot Laboratory of the Graduate School of Systems and Information Engineering, University of Tsukuba. His current research interests are localization and navigation in indoor or outdoor environments, service robots, and field robots. He is a Member of the RSJ.

Takashi Tsubouchi received the BS degree in Physics, and the ME and PhD degrees in Information Science and Electronics from the University of Tsukuba, Tsukuba, Japan, in 1983, 1985, and 1988, respectively. He is currently a Professor at the Graduate School of Systems and Information Engineering, University of Tsukuba. His current research interests include service robots, navigation and localization in outdoor or indoor environments, automation of construction machines at open-cut mines, and control configured vehicles. He is a Fellow of the JSME, and a Member of the IEEE, RSJ, and JSAI.

Shin’ichi Yuta completed his PhD in Electrical Engineering at Keio University in 1975. Since 1978, he had been at the University of Tsukuba, becoming a full professor in 1992. He served as a vice-president for research in 2004–2006, and then as the director of the Tsukuba Industrial Liaison and Cooperative Research Center. In March 2012, he retired from the University of Tsukuba, and he is now an adjunct professor at the Shibaura Institute of Technology, Tokyo, as well as an emeritus professor at the University of Tsukuba. As an expert in robotics, he has conducted an autonomous mobile robot project since 1980 and has published more than 500 technical papers in this field. He has kept a close relationship and collaboration with many industries for the development of practical mobile robot systems and devices for intelligent robots. Typical achievements include the development of small-size scanning laser range sensors for mobile robots, which are produced by Hokuyo Ltd. and are now widely used in the world for robotics applications. He is a Fellow of the RSJ and IEEE.

References
[1] Besl PJ, McKay ND. A method for registration of 3-D shapes. IEEE Trans. Pattern Anal. Mach. Intell. 1992;14:239–256.
[2] Weiss G, Wetzler C, Puttkamer E. Keeping track of position and orientation of moving indoor systems by correlation of range-finder scans. In: Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems. Munich; 1994. p. 595–602.
[3] Grossmann A, Poli R. Robust mobile robot localisation from sparse and noisy proximity readings using Hough transform and probability grids. Robot. Auton. Syst. 2001;37:1–18.
[4] Censi A, Iocchi L, Grisetti G. Scan matching in the Hough domain. In: Proceedings of the IEEE International Conference on Robotics and Automation. Barcelona; 2005. p. 2739–2744.
[5] Lu F, Milios E. Robot pose estimation in unknown environments by matching 2D range scans. J. Intell. Rob. Syst. 1997;18:249–275.
[6] Pfister S, Kreichbaum K, Roumeliotis S, Burdick J. A weighted range sensor matching algorithm for mobile robot displacement estimation. In: Proceedings of the IEEE International Conference on Robotics and Automation. Washington; 2002. p. 1667–1674.
[7] Minguez J, Montesano L, Lamiraux F. Metric-based iterative closest point scan matching for sensor displacement estimation. IEEE Trans. Rob. 2006;22:1047–1054.
[8] Duda R, Hart P. Use of the Hough transformation to detect lines and curves in pictures. Commun. ACM. 1972;15:11–15.
[9] Bando S, Yuta S. Use of the parallel and perpendicular characteristics of building shape for indoor map making and positioning. In: Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems. Taipei; 2010. p. 4318–4323.
[10] Bando S, Yuta S. Making 2D map of environment based on major direction detection on range data using 2D power spectrum. In: Proceedings of 16th Robotics Symposia. Kagoshima; 2011. p. 457–462. Japanese.
[11] The Mobile Robot Programming Toolkit. Available from: http://www.mrpt.org/.
[12] Fitzgibbon A. Robust registration of 2D and 3D point sets. In: Proceedings of the British Machine Vision Conference. Manchester; 2001. p. 1–10.
