

Robotics and Autonomous Systems 63 (2015) 219–225


Consistent robot localization using Polar Scan Matching based on Kalman Segmentation

José Manuel Cuadra Troncoso a,∗, José Ramón Álvarez-Sánchez a, Israel Navarro Santosjuanes b, Félix de la Paz López a, Raúl Arnau Prieto c

a Dpto. de Inteligencia Artificial, UNED, Madrid, Spain
b Airbus Defence and Space, Madrid, Spain
c Fundación Centro Tecnológico de Componentes, Santander, Spain

Highlights

• Consistent robot localization using only scan matching.
• Consistent robot mapping using only scan matching.
• Laser scan segmentation by means of regression using Kalman filtering.

Article info

Article history: Available online 13 October 2014

Keywords: Robot localization; Polar scan-matching; Kalman segmentation; Area center method; SLAM

Abstract

This work presents a revision of the polar scan-matching procedure (PSM) aimed at obtaining accurate robot localization in long runs without applying any later correction procedure. To obtain an accurate robot pose estimation in long runs, PSM relies on some SLAM procedure to correct the error introduced by the scan-matching process itself. PSM compares scan points obtained from different positions, and we have observed that when it associates points belonging to different objects, significant errors can appear in the pose estimation. In our approach, an advanced line segmentation algorithm based on Kalman filtering is used to prevent this kind of association as much as possible and to improve other parts of the original procedure. The new scan-matching procedure is named Polar Scan Matching based on Kalman Segmentation (PSM-KS). Experimental results in simulation show a reasonably accurate robot pose estimation that outperforms the accuracy obtained using odometry alone, with low execution times. Consistent maps are obtained by simply overlapping estimated segments drawn from the estimated poses.

© 2014 Elsevier B.V. All rights reserved.

1. Introduction

Laser scan matching is an approach to solve the problem of mobile robot Simultaneous Localization and Mapping (SLAM). SLAM provides a robot with the capability to perform its tasks autonomously in an unknown environment by creating a map of its surroundings while at the same time locating itself within it [1,2]. There are two versions of the SLAM problem [3]: online SLAM, which tries to estimate the current position and the map from

∗ Corresponding author. Tel.: +34 913987144.
E-mail addresses: [email protected] (J.M. Cuadra Troncoso), [email protected] (J.R. Álvarez-Sánchez), [email protected] (I. Navarro Santosjuanes), [email protected] (F. de la Paz López), [email protected] (R. Arnau Prieto).

http://dx.doi.org/10.1016/j.robot.2014.09.017
0921-8890/© 2014 Elsevier B.V. All rights reserved.

past measures, and off-line SLAM, which tries to estimate the entire path and the map from the set of all measures. Scan matching can be applied to static and dynamic environments and can be performed in 2D as well as in 3D. Also, scan matching approaches can be local [4] or global [5]. In local scan matching the robot pose is estimated by comparing a pair of laser scans [6]. Starting from a known position, the matching process is repeated at regular intervals. For each stretch traveled, the scan taken at the initial position (the reference scan) is stored. The scan taken at the end of the stretch (the current scan) is then matched against the reference scan; from this comparison, the robot pose is calculated relative to its initial pose. This process provides translation and orientation corrections to offset errors in the robot pose calculation obtained from odometry data alone. In global scan matching [7,8], on the other hand, the current scan is matched against a global map of features (determined a priori or built while exploring the environment) or a database of scans.


The core of the scan-matching process is the methodology employed to associate and then compare the data obtained in the current scan with the reference scan data. Some of the methods are: feature to feature [9,10,6], point to line [8], and point to point [11,4]. These three approaches are usually defined in a Cartesian coordinate frame and do not take advantage of the native polar coordinate system of a laser scan. Diosi [12,13] proposed a novel method called Polar Scan Matching (PSM). PSM's main advantage is its ability to establish point associations by simply matching points with the same bearing, thus eliminating the search for corresponding points.

Robot pose estimation using scan matching, or any other procedure, has two sources of error [4,3]: odometry errors and errors in the sensor measures. However, we also have to take into account the errors produced by the procedure itself. When PSM is used in long runs without a SLAM procedure, the results obtained by odometry outperform scan matching [13]; other scan-matching procedures suffer from the same problem [14]. It is striking that procedures giving good results in static experiments behave worse than odometry when the robot is moving. This problem also appeared in the initial version of our work on this subject [15]: in that paper we only studied short robot runs, but when we tried to use the procedure in long runs it produced worse pose estimates than odometry. During the matching process, associations can be established between points belonging to different segments (bad associations). We have identified this issue as one of the main sources of pose estimation error.

In order to ease the understanding of the proposed changes, we summarize below the steps of the original PSM algorithm [13], which will be referenced in Section 2:

1. Scan preprocessing: median filtering and basic segmentation.
2. Current scan projection from the reference scan estimated pose, interpolating the projected values to the reference scan bearings. The differences between projected values and reference values are the matching errors.
3. Orientation estimation by brute-force search for the angle with the least matching error.
4. Translation estimation by non-linear regression.
5. Repetition of steps 3 and 4 until the pose modification is small (a sketch of this loop is given after the list).
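As a reading aid, the steps above can be arranged into the following loop skeleton. This is a minimal sketch, not the authors' code: project_scan, orientation_search and translation_search are hypothetical callables standing in for steps 2–4, and the tolerances are illustrative.

import numpy as np

def psm_iterate(ref_scan, cur_scan, pose0,
                project_scan, orientation_search, translation_search,
                max_iter=20, tol_xy=0.001, tol_theta=np.radians(0.1)):
    """Outer PSM loop (steps 2-5 of the list above).

    ref_scan, cur_scan: (N, 2) arrays of (bearing, range) in the sensor frame.
    pose0: initial (x, y, theta) guess, e.g. taken from odometry.
    The three callables are placeholders for the procedures summarized above.
    """
    pose = np.asarray(pose0, dtype=float).copy()
    for _ in range(max_iter):
        # Step 2: express the current scan as seen from the reference viewpoint.
        projected = project_scan(cur_scan, pose)
        # Step 3: brute-force search for the orientation with least matching error.
        dtheta = orientation_search(ref_scan, projected)
        # Step 4: translation update by (weighted) non-linear regression.
        dx, dy = translation_search(ref_scan, projected)
        pose += np.array([dx, dy, dtheta])
        # Step 5: stop when the pose correction becomes small.
        if np.hypot(dx, dy) < tol_xy and abs(dtheta) < tol_theta:
            break
    return pose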

In this paper we detail our new improvements to PSM, which minimize the presence of bad associations as much as possible in order to obtain a reasonably accurate estimation of the robot pose.

2. PSM using Kalman Segmentation (PSM-KS) overview

In previous work [15] we presented a modification of the original PSM [step 1]. That modification consists in the use of an advanced line segmentation algorithm (KS), described in [16]. The algorithm uses an extended Kalman filter, equivalent to a non-linear weighted least squares method, to estimate straight segment equations in polar coordinates. Original PSM considers a segment to be a set of consecutive scan points such that the distance from each point to the next is less than 20 cm; this original procedure can merge several different lines into a single set, or split a segment when the rays' incidence angles are small. We used the segmentation results only for filtering sensor measurement noise and for providing the ends of the segments. We did not use the statistical values from the line estimation: line parameters, error standard deviation, etc. These values can help to identify segments and to avoid bad associations as much as possible.
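To make the idea concrete, the sketch below fits the standard polar line model r(θ) = r0 / cos(θ − φ0) to consecutive scan points with sequential extended-Kalman-style updates. It is a minimal illustration under that parametrization, not the authors' KS implementation; the initialization, noise values and function names are assumptions.

import numpy as np

def ekf_line_update(state, P, theta, r_meas, r_var=0.01**2):
    """One EKF measurement update for the polar line r(theta) = r0 / cos(theta - phi0).

    state: [r0, phi0]; P: 2x2 covariance; (theta, r_meas): one scan point.
    """
    r0, phi0 = state
    u = theta - phi0
    c = np.cos(u)
    h = r0 / c                                          # predicted range on the line
    H = np.array([[1.0 / c, -r0 * np.sin(u) / c**2]])   # Jacobian of h wrt (r0, phi0)
    y = r_meas - h                                      # innovation
    S = (H @ P @ H.T).item() + r_var                    # innovation variance
    K = (P @ H.T) / S                                   # Kalman gain, shape (2, 1)
    state = state + (K * y).ravel()
    P = (np.eye(2) - K @ H) @ P
    return state, P

def fit_segment(points, r_var=0.01**2):
    """Fit (r0, phi0) to consecutive (theta, r) scan points by sequential updates."""
    thetas, ranges = np.asarray(points, dtype=float).T
    # Crude initialization: assume the first point lies at the foot of the normal.
    state = np.array([ranges[0], thetas[0]])
    P = np.eye(2)
    for th, r in zip(thetas, ranges):
        state, P = ekf_line_update(state, P, th, r, r_var)
    return state, P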

In this paper we have reconsidered some of our initial ideas andPSM procedures. The main changes with respect to the previouswork are:

• The use of the statistical values from line estimation mentioned above.

• New procedures for projection and interpolation of the current scan in the reference frame of coordinates [step 2]. In the original PSM we have identified some situations that originate bad associations: undetected occlusions and discretization in the interpolation. Projection and interpolation are described in Section 2.1.

• A new computation of the matching error [steps 3 and 4], in order to fit it into the weighted least squares theoretical frame, to reflect the changes in scan projection and interpolation, and to take segment reliability into account. Matching error computation is discussed in Section 2.2.

• In the orientation and translation search [steps 3 and 4], the new computation of the matching error has induced several changes with respect to the original procedure; they are discussed in Section 2.3.

2.1. Projection and interpolation changes

Once we have de-noised the reference and current scans, we must get into a position to compare them [step 2]. For this purpose, we must find a common viewpoint; in this work, the point selected is the reference scan viewpoint. This means that we have to determine the values of the range readings obtained in the final robot pose (i.e. the current scan) as if they had been obtained instead from the reference viewpoint.

One case where a bad association appears between a current scan point and its associated reference scan point is when the current point is not visible from the reference scan pose (an occlusion). Many occlusions are easily detected by the original algorithm, see Fig. 1(a), but other, less probable occlusions are not detected, see Fig. 1(b). These cases are filtered by the original algorithm if the absolute difference between the measure in the projection and the measure in the reference scan is bigger than 1 m. When there are no bad associations, the orientation search error graph shows a clear minimum; see Fig. 1(c). When bad associations appear, the error minimum is not very clear, see Fig. 1(d); in this case errors due to linearization in the translation search and in succeeding iterations can lead to a bigger pose error.
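As a rough illustration of these two filters (the bearing-order occlusion test illustrated in Fig. 1(a) and the 1 m difference threshold), consider the sketch below. It assumes the projected points are already associated with reference points index by index, which simplifies the real procedure.

import numpy as np

def flag_bad_points(proj_bearings, proj_ranges, ref_ranges, max_diff=1.0):
    """Flag projected points that should not be matched: occlusions revealed by
    non-increasing projection bearings, and points whose projected range differs
    from the associated reference range by more than max_diff metres."""
    proj_bearings = np.asarray(proj_bearings, dtype=float)
    occluded = np.zeros(len(proj_bearings), dtype=bool)
    max_seen = -np.inf
    for i, b in enumerate(proj_bearings):
        if b <= max_seen:      # bearing went backwards: point lies behind a closer surface
            occluded[i] = True
        else:
            max_seen = b
    too_far = np.abs(np.asarray(proj_ranges) - np.asarray(ref_ranges)) > max_diff
    return occluded | too_far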

In PSM-KS, reference scan segments and projected segments are compared, and if their estimated parameters are similar then their points can be associated. The line parameters in polar coordinates are the radius r0, the distance from the origin to the line, and the angle ϕ0 from the x axis to the line perpendicular to it. In the experiments in this paper, two segments are considered to represent the same object if their radii differ by less than 20 cm and their angles differ by less than 5°.
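A hedged sketch of this similarity test, with the thresholds quoted above (parameter names are illustrative):

import numpy as np

def same_object(seg_a, seg_b, max_dr=0.20, max_dphi=np.radians(5.0)):
    """Decide whether two estimated segments, each given as (r0 in metres, phi0 in
    radians), plausibly describe the same object, using the 20 cm / 5 degree
    thresholds from the text."""
    r_a, phi_a = seg_a
    r_b, phi_b = seg_b
    # Wrap the angle difference to [-pi, pi] before thresholding.
    dphi = np.arctan2(np.sin(phi_a - phi_b), np.cos(phi_a - phi_b))
    return abs(r_a - r_b) < max_dr and abs(dphi) < max_dphi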

Another cause of bad associations is the discretization in the interpolation process. In the original PSM, measure values in the projection scan are compared with values in the reference scan, so the projection values have to be interpolated to the corresponding ray bearings in the reference scan; this is a discretization process. If several points in the projection scan are associated with the same point in the reference scan, the smaller measure is selected for comparison. Sometimes, due to discretization, an existing smaller value is not selected, different segments are associated, and thus a bigger value is used in the comparison.

We propose instead to interpolate values in the reference scan for the angles obtained in the scan projection, and to compare these new values with the scan projection values. Interpolation is done using the line equation, so this procedure does not imply discretization. The line equation can also be used for extrapolation at the ends of succeeding segments, to obtain alternative values for the measure; the value producing less error is used later for the comparisons.
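Since the line x cos φ0 + y sin φ0 = r0 reads r(θ) = r0 / cos(θ − φ0) in polar form, interpolation (or extrapolation) at any projected bearing is a single evaluation of that equation. A minimal sketch, assuming the segment parameters come from the Kalman segmentation:

import numpy as np

def range_on_line(r0, phi0, bearing):
    """Range predicted by the polar line r(theta) = r0 / cos(theta - phi0) at an
    arbitrary bearing. The same expression serves for interpolation inside a
    segment and for extrapolation at its ends; validity near bearing = phi0 +/- 90
    degrees is not checked here."""
    return r0 / np.cos(bearing - phi0)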

2.2. Matching error computation

Original PSM computes the matching error as the sum of the absolute values of the individual errors for associated points [steps 3 and 4], but it computes a least squares estimation for the translation search.


Fig. 1. Occlusions. (a) and (b) show examples of consecutive current scan sensor readings and their projection from the estimated reference scan pose. In (a), the occlusion of point 2 is detected because its projection angle is less than point 1's projection angle, or some previous point's projection angle. In (b) it is not possible to detect the occlusions of points 1 and 2 in the same way, since there are no previous points. The differences between the projections and the true reference scan readings are the projection errors. (c) and (d) show orientation search error graphs for (a) and (b) in the case of an ideal experiment without odometry and sensor errors, so the only source of error is the procedure itself.

Least squares procedures try to minimize a sum of squared values; thus we think that it is better to work with squared errors. Using the sum of absolute values, the error graphs in the orientation search look like the letter ''V'', see Fig. 1; using squared values, the graphs look like the letter ''U''. We think that this issue is a source of error for the orientation refinement in the original PSM, where a parabolic shape is assumed.

As we commented in Section 2.1, the comparison is done between associated measures from the current scan projection and the reference scan interpolated values; interpolation is done at the current scan projection angles.

In the original PSM, a weighting scheme based on the one proposed in [17] is used in the translation search. We have refined the weighting scheme to take segment reliability into account. As features measuring segment reliability we use the standard deviation of the segment's estimation error (in the least squares sense) and its number of points. If the segment error standard deviation is small, its points are really close to a straight line, and if the segment has enough points, one can be quite confident of having a real object formed by a straight segment, at least at the viewing height of the sensor. So we weight the differences between values for associated points directly proportionally to the segment's number of points and inversely proportionally to its standard deviation.
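A small sketch of a matching error that combines squared differences with the reliability weighting described above; the exact normalization is an assumption, and the segment attributes are given per associated point.

import numpy as np

def matching_error(proj_ranges, ref_ranges, seg_n_points, seg_sigma, eps=1e-6):
    """Weighted sum of squared range differences between associated points.

    Each point is weighted by n_points / sigma of the segment it belongs to,
    reflecting the reliability criterion described in the text."""
    w = np.asarray(seg_n_points, dtype=float) / (np.asarray(seg_sigma, dtype=float) + eps)
    d = np.asarray(proj_ranges, dtype=float) - np.asarray(ref_ranges, dtype=float)
    return float(np.sum(w * d**2) / np.sum(w))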

2.3. Alternating orientation and translation search

Following the general approach in [13], we first attempt to update the robot orientation θrobot and then the robot position (xrobot, yrobot) [steps 3 and 4]. This process is repeated until convergence is reached or a maximum number of iterations is consumed [step 5]. The orientation search process is based on a brute-force approach. Starting from the initial orientation guess provided by the odometry sensors, a systematic re-evaluation of the objective function f is performed every 1° on the interval [θrobot − 20°, θrobot + 20°]. A second search is performed for refinement every 0.1° on the interval [θmin − 1°, θmin + 1°], θmin being

the orientation corresponding to the minimum of the first search. Note that every evaluation of each proposed θrobot implies performing the projection, interpolation and pruning process again; this implies a longer execution time than the original approach.
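The two-stage search can be sketched as follows; eval_error is a hypothetical callable that re-projects the current scan for a candidate orientation and returns the matching error, and the grid parameters mirror the values quoted above.

import numpy as np

def orientation_search(eval_error, theta0,
                       coarse_span=np.radians(20.0), coarse_step=np.radians(1.0),
                       fine_span=np.radians(1.0), fine_step=np.radians(0.1)):
    """Brute-force orientation search: 1-degree grid over +/-20 degrees around the
    odometry guess, then 0.1-degree refinement around the coarse minimum."""
    def best_in(center, span, step):
        candidates = np.arange(center - span, center + span + step / 2, step)
        errors = [eval_error(t) for t in candidates]
        return candidates[int(np.argmin(errors))]

    theta_coarse = best_in(theta0, coarse_span, coarse_step)
    return best_in(theta_coarse, fine_span, fine_step)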

The robot pose update is only allowed if the matching error after the orientation and translation search is less than the matching error at the beginning of the search. The procedure can fail for two reasons: too big an error, or not enough associated points; in the experiments in this paper this number is 30 or more points. In that case, the procedure execution is delayed to the next scan or following scans, and meanwhile the robot pose is updated from odometry.
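This acceptance rule reduces to a simple check, sketched here with the 30-point threshold reported for the experiments:

def accept_update(error_after, error_before, n_associated, min_points=30):
    """Accept the pose update only if the matching error decreased and enough
    points were associated; otherwise the caller falls back to odometry until
    the next scan, as described above."""
    return error_after < error_before and n_associated >= min_points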

3. Experiments

In this section we comment on some experiments in which a map is made by accumulating segments from every matched scan, drawn from their estimated poses. The experiments were performed in a simulator using maps made of segments only and one map made of segments and points; in this last map, object shapes are not necessarily straight segments. The experiments are separated into two groups. In the first group the robot travels for ten minutes on a random path using the Area Center method for robot navigation [18,16]. The Area Center method drives the robot at 60 cm/s in uncrowded areas; it also produces a relatively high rotation speed, so it tests PSM-KS in a hard way, not the way recommended for making maps. In the second group the simulated robot was driven by hand.

Maps for the first group of experiments are shown in Fig. 2. Map (a) contains 32 segments, (b) contains 130 segments, (c) contains 130 segments and 15,114 points and (d) contains 163 segments. Map (c) is intended to test the effect of curved or thin objects and (d) to test long corridors.

Maps for the second group of experiments are shown in Fig. 3 (139 segments) and Fig. 4 (507 segments).

In all experiments the odometry error was simulated by adding uniform noise to each wheel, equivalent to 5% of the wheel velocity.


Fig. 2. Real maps for experiments using the Area Center method for robot navigation. (a), (b) and (d) are formed only by straight segments; (c) has segments and points, so objects can show a curved shape.

Table 1. Mean values of several results for experiments on the maps in Fig. 2: number of scan matches, total distance traveled in meters, sum of absolute values of rotations in turns, time used in segmentation and time used in scan matching. Experiments were run using an i7 processor at 2.5 GHz.

Map   Matches   Dist. (m)   Rot. (turns)   Seg. t. (ms)   Match t. (ms)
(a)   498.8     312.04      10.6           1.57           10.71
(b)   558.6     274.3       13.08          1.49           11.36
(c)   568       202.18      14.76          1.94           13.86
(d)   388.8     187.36      7.08           1.65           12.61

Also, the error in the simulated laser was modeled as a Gaussian distribution with mean 0 and standard deviation 1 cm. Experiments were run using an i7 processor at 2.5 GHz, computing everything in a single execution thread.
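For reference, these noise models can be sketched as below; the 1 cm Gaussian range noise follows the text, while the exact interval of the uniform wheel noise is an assumption.

import numpy as np

rng = np.random.default_rng(0)

def noisy_wheel_speed(v):
    """Uniform noise of up to 5% of the commanded wheel velocity."""
    return v + rng.uniform(-0.05, 0.05) * abs(v)

def noisy_range(r, sigma=0.01):
    """Simulated laser reading: Gaussian noise with mean 0 and std 1 cm."""
    return r + rng.normal(0.0, sigma)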

4. Results

Results for the first group of experiments are shown in Tables 1 and 2. The first table shows mean values for the number of scan matches, total distance traveled in meters, sum of absolute values of rotations in turns (360°), time used in segmentation and time used in scan matching. The values of these magnitudes are similar across the experiments on a given map, so means are illustrative enough.

The values for number of matches, distance traveled and rotation show that the experiments correspond to reasonably long tests. The time for segmentation is small compared with the robot cycle, 100 ms on a Pioneer 3-AT. The time between successive scan matches is about 10 times the robot cycle; thus the time for scan matching is small enough for it to be performed on-line, even with hardware slower than the one we have used.

Results for pose estimation are shown in Table 2. It must be noted that, depending on map shape and crowdedness, the navigation procedure can drive the robot at different velocities.

It can be seen that PSM-KS outperforms the odometry results in both robot translation and orientation. In the simple map, (a), PSM-KS is about ten times more accurate than odometry. In more complex maps, (b), with smaller segments and many occlusions due to holes between objects, PSM-KS accuracy is a little lower than with simple maps. When curved and thin objects appear, map (c), PSM-KS accuracy drops by about a half with respect to the similar map with only segments, but in this case the odometry errors in orientation were especially large compared with the other experiments, an extra problem that PSM-KS could handle. In the corridor map, (d), PSM-KS errors are bigger than in the other cases but still smaller than the odometry errors, except in one case; note that in this case the odometry errors in translation were especially large compared with the other experiments.

For the second group of experiments, map making while driving the robot by hand, results are shown in Figs. 3 and 4, both maps formed only by segments. The map characteristics are detailed in the figure captions.

In the first experiment, PSM-KS yields a reasonably correct estimation of the robot path in spite of the big error in the odometric path. The PSM-KS map is very consistent compared with the odometry map.

In the second experiment, with long corridors, PSM-KS results also outperform the odometry results, although an isolated big error, see Fig. 4(c), degrades performance. Not all bad associations implying big errors are filtered, but they are rare.

5. Conclusions and future work

In this paper we have presented modifications to the PSM algorithm that affect many of its procedures while maintaining its basic structure. Our objective was to obtain a robot pose estimation better than the one obtained from robot odometry. The results detailed in this paper show that the objective has been reached, at least under the constraints used in the experiments: maps formed by straight segments, synchronous sensor measures and a static environment. The execution time of the procedure is fast enough for it to be used on-line on most current robotic platforms equipped with a laser.

The constraints mentioned above open different lines of improvement. PSM-KS has shown good performance on a map with curved and thin object shapes, but it could be improved by developing a Kalman Segmentation procedure for curved lines, such as second or third order splines.

Real robots get asynchronous laser measures, so a straight line facing a moving robot is not perceived as straight from the robot position when the scan set is received. In this situation the segmentation procedure fails, so we need to develop a synchronization procedure to be applied before line estimation.

Dynamic environments are a problem for mapping purposes. PSM-KS could deal with moving objects in some situations: if they look small, they are considered individual points not used in matching, or if their speed is high enough they appear as different segments when comparing the current and reference scans. In other cases, they could negatively affect the procedure's performance.

The ability of PSM-KS to localize the robot with acceptable accuracy could be increased by combining it with some on-line SLAM procedure, to integrate the collection of scans and estimated robot poses into a real map.


Table 2. Mean and maximum errors in robot pose estimation, for robot translation ∆r and orientation ∆|θ|, using PSM-KS and using odometry information only. Maps are those from Fig. 2.

Map   ∆r scan match (cm)   ∆r odometry (cm)   ∆|θ| scan match (deg)   ∆|θ| odometry (deg)
      mean    max          mean    max        mean    max             mean    max
(a)   5.2     14.9         68.6    180.2      0.23    0.61            2.31    10.34
      12.4    24.8         60.8    164.2      0.23    0.85            3.97    9.27
      16.9    35.5         85.3    168.7      0.63    1.42            2.45    7.43
      7.6     16.5         55.2    196.3      0.30    0.78            2.78    7.76
      9.9     25.1         198.6   620.8      0.36    0.92            14.08   25.56
(b)   9.9     47.0         49.6    163.7      1.03    2.94            4.54    10.53
      12.4    33.5         40.9    77.8       1.46    2.66            3.08    7.97
      9.0     18.4         121.1   213.3      0.91    1.78            4.77    13.77
      12.1    26.0         88.2    206.0      1.29    2.54            7.68    14.32
      6.6     20.6         82.9    278.3      0.22    1.22            7.86    20.76
(c)   22.0    62.4         58.0    161.9      2.84    5.44            10.36   26.80
      9.4     33.0         111.3   309.0      0.95    5.09            9.83    23.56
      21.9    90.1         49.3    115.4      3.58    7.69            4.77    12.97
      21.6    45.1         67.9    187.9      2.29    5.64            5.00    13.28
      45.3    116.3        104.7   217.5      3.54    5.31            2.84    8.11
(d)   56.2    102.5        458.4   999.2      0.69    1.36            7.19    13.97
      78.6    172.6        284.2   758.6      2.72    5.79            5.41    19.72
      76.5    147.3        113.8   365.4      1.03    2.29            2.31    5.95
      52.1    194.6        98.4    211.0      2.14    3.93            1.58    6.23
      25.5    90.3         121.0   414.6      0.90    2.00            3.01    6.71

Fig. 3. This office map covers 200 m²; the robot traveled 94.3 m and rotated 7.9 turns. (a) shows the real map with the real path, starting and ending at S, together with the paths estimated by scan matching and by odometry; (b) and (c) show the errors in translation ∆r and orientation ∆|θ| along the robot's travel. Maps obtained from PSM-KS (d) and odometry (e).


Fig. 4. Columbia map from the Arnl library (Pioneer 3-AT); it covers approx. 1200 m², the robot traveled 133.8 m and rotated 9.9 turns. (a) shows the real map with the real path starting at S and ending at E; (b) and (c) show the errors in translation ∆r and orientation ∆|θ|. Maps obtained from PSM-KS (d) and odometry (e).

References

[1] J.J. Leonard, H.F. Durrant-Whyte, Simultaneous map building and localization for an autonomous mobile robot, in: Proc. IEEE Int. Workshop on Intelligent Robots and Systems, Osaka, Japan, 1991, pp. 1442–1447.

[2] R. Smith, M. Self, P. Cheeseman, Estimating uncertain spatial relationships in robotics, in: UAI, 1986, pp. 435–461.

[3] S. Thrun, W. Burgard, D. Fox, Probabilistic Robotics, MIT Press, Cambridge, MA, 2005.

[4] F. Lu, E. Milios, Robot pose estimation in unknown environments by matching 2D range scans, in: Computer Vision and Pattern Recognition, 1994. Proceedings CVPR '94, 1994 IEEE Computer Society Conference on, 1994, pp. 935–938.

[5] M. Tomono, A scan matching method using Euclidean invariant signature for global localization and map building, in: Robotics and Automation, 2004. Proceedings. ICRA '04. 2004 IEEE International Conference on, vol. 1, 2004, pp. 866–871.

[6] G. Weiss, E. Puttkamer, A map based on laserscans without geometric interpretation, in: Proceedings of Intelligent Autonomous Systems 4, IAS-4, IOS Press, 1995, pp. 403–407.

[7] S. Borthwick, H. Durrant-Whyte, Simultaneous localisation and map building for autonomous guided vehicles, in: Intelligent Robots and Systems '94. 'Advanced Robotic Systems and the Real World', IROS '94. Proceedings of the IEEE/RSJ/GI International Conference on, vol. 2, 1994, pp. 761–768.

[8] I. Cox, Blanche—an experiment in guidance and navigation of an autonomous robot vehicle, IEEE Trans. Robot. Autom. 7 (2) (1991) 193–204.

[9] J.-S. Gutmann, Robuste Navigation autonomer mobiler Systeme (Ph.D. thesis), Albert-Ludwigs-Universität Freiburg, Institut für Informatik, 2000.

[10] K. Lingemann, H. Surmann, A. Nüchter, J. Hertzberg, Indoor and outdoor localization for fast mobile robots, in: Intelligent Robots and Systems, 2004, IROS 2004, Proceedings. 2004 IEEE/RSJ International Conference on, vol. 3, 2004, pp. 2185–2190.

[11] P. Besl, H. McKay, A method for registration of 3-D shapes, IEEE Trans. Pattern Anal. Mach. Intell. 14 (2) (1992) 239–256.

[12] A. Diosi, L. Kleeman, Laser scan matching in polar coordinates with application to SLAM, in: Intelligent Robots and Systems, 2005, IROS 2005, 2005 IEEE/RSJ International Conference on, 2005, pp. 3317–3322. http://dx.doi.org/10.1109/IROS.2005.1545181.


[13] A. Diosi, L. Kleeman, Fast laser scan matching using polar coordinates, Int. J. Robot. Res. 26 (10) (2007) 1125–1153.

[14] K. Lingemann, A. Nüchter, J. Hertzberg, H. Surmann, High-speed laser localization for mobile robots, Robot. Auton. Syst. 51 (4) (2005) 275–296. http://dx.doi.org/10.1016/j.robot.2005.02.004.

[15] I.N. Santosjuanes, J.M.C. Troncoso, F. de la Paz López, R.A. Prieto, Improved polar scan-matching using an advanced line segmentation algorithm, in: Natural and Artificial Computation in Engineering and Medical Applications, in: LNCS, vol. 7931, 2013, pp. 32–44.

[16] J.M. Cuadra Troncoso, J.R. Álvarez-Sánchez, F. de la Paz López, A. Fernández-Caballero, Improving area center robot navigation using a novel range scan segmentation method, in: Foundations on Natural and Artificial Computation, in: LNCS, vol. 6686, 2011, pp. 233–245.

[17] G. Dudek, M.R.M. Jenkin, Computational Principles of Mobile Robotics, Cambridge University Press, 2000.

[18] J.R.Á. Sánchez, F. de la Paz López, J.M.C. Troncoso, D. de Santos Sierra, Reactive navigation in real environments using partial center of area method, Robot. Auton. Syst. 58 (12) (2010) 1231–1237.

José Manuel Cuadra Troncoso received his B.Sc. and M.Sc. degrees in Mathematics (2000) and his Ph.D. degree in Computer Science and Artificial Intelligence (2011) from Univ. Nacional de Educación a Distancia, Spain. He is an assistant professor at the Artificial Intelligence Department of Univ. Nacional de Educación a Distancia, Spain. His main research interests include mobile robotics and artificial neural networks.

José Ramón Álvarez Sánchez received his B.Sc. and M.Sc. degrees in Physics (1988) from Univ. Complutense de Madrid, Spain, and his Ph.D. degree in Physics (1997) from Univ. Nacional de Educación a Distancia, Madrid, Spain. He is an associate professor at the Artificial Intelligence Department of Univ. Nacional de Educación a Distancia, Spain. His main research interests include artificial neural networks and mobile robotics.

Israel Navarro Santosjuanes received his B.S. (1994) and M.S. (1997) degrees in Aerospace Engineering from the University of Tennessee and the University of Washington respectively, and his M.Sc. degree in Computer Science and Artificial Intelligence (2013) from Univ. Nacional de Educación a Distancia, Spain. His main research interests include mobile robotics and aerial robotics.

Félix de la Paz López received his B.Sc. and M.Sc. degrees in Physics (1995) from Univ. Complutense de Madrid, Spain, and his Ph.D. degree in Physics (2003) from Univ. Nacional de Educación a Distancia, Madrid, Spain. He is an associate professor at the Artificial Intelligence Department of Univ. Nacional de Educación a Distancia, Spain. His main research interests include navigation and representation tasks in mobile robots.

Raúl Arnau Prieto received his M.Eng. degree in Telecommunications Engineering (2005) from the University of Cantabria and his M.Sc. in Artificial Intelligence (2013) from Univ. Nacional de Educación a Distancia, Spain. He is a project manager at CTC, where he focuses his research activities on real-time control system design and development. His current interests are in the field of mobile robotics and integrated navigation system design.