Research Article: An Analytical Measuring Rectification Algorithm of Monocular Systems in Dynamic Environment
Deshi Li (1,2) and Xiaoliang Wang (1)

1 Electronic Information School, Wuhan University, Wuhan, Hubei 430072, China
2 Collaborative Innovation Center of Geospatial Technology, 129 Luoyu Road, Wuhan 430079, China

Correspondence should be addressed to Xiaoliang Wang: xiaoliangwang@whu.edu.cn
Received 16 October 2015; Revised 25 January 2016; Accepted 2 March 2016
Academic Editor: Yassine Ruichek
Copyright © 2016 D. Li and X. Wang. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Range estimation is crucial for maintaining a safe distance, in particular for vision navigation and localization. Monocular autonomous vehicles are appropriate for outdoor environments due to their mobility and operability. However, accurate range estimation using a vision system is challenging because of the nonholonomic dynamics and susceptibility of vehicles. In this paper, a measuring rectification algorithm for range estimation under shaking conditions is designed. The proposed method focuses on how to estimate range using monocular vision when a shake occurs, and the algorithm only requires the pose variations of the camera to be acquired. Simultaneously, it solves the problem of how to assimilate results from different kinds of sensors. To eliminate measuring errors caused by shakes, we establish a pose-range variation model. Afterwards, the algebraic relation between distance increment and a camera's pose variation is formulated. The pose variations are presented in the form of roll, pitch, and yaw angle changes to evaluate the pixel coordinate increment. To demonstrate the superiority of our proposed algorithm, the approach is validated in a laboratory environment using Pioneer 3-DX robots. The experimental results demonstrate that the proposed approach improves the range accuracy significantly.
1. Introduction
The applications of mobile robots for observation and rescue missions have received increasing attention in recent years. Current advances in sensing and computing promote mobile robots as a suitable option in occasions such as search and rescue, SLAM (Simultaneous Localization and Mapping), automatic navigation, and target detection. For mobile robots, retrieving their position is one of the important issues. In recent years, vision sensors have attracted a lot of attention for this problem because they are relatively inexpensive and compact, with low power consumption. Furthermore, methods using vision sensors can localize the robot in various environments where general localization methods like wheel odometry and GPS are difficult to apply. If localization can be performed using image information alone, a robot's flexibility will be improved remarkably.
However, the precondition of a successful intelligent robot system is the exact perception of surroundings, in which the range and azimuth information of nearby targets play an important role. The range estimation algorithms using vision sensors are known as VO (visual odometry).
Approaches for range estimation can mainly be divided into three categories: radar-based, laser-based, and vision-based. As a typical paradigm of noncontact approaches, ultrasonic sensors have the advantages of time efficiency and measurement accuracy. However, it is arduous to detect objects with small surfaces or situated at a wide angle relative to the ultrasonic sensor(s). Among all perception sensors, computer vision has the added advantage of acquiring a large amount of information at a lower cost. The vision-based method can solve both range and azimuth estimation problems using only the acquired images themselves. There has been much interest in research on object detection by stereo camera [1–4], but the monocular camera is still strongly advantageous for its large sensing area, low cost, and easy installation.
To achieve precise VO in outdoor environments, some problems remain to be solved. In this work, the problems of VO on bumpy courses, which exist mostly in outdoor
Hindawi Publishing Corporation, Journal of Sensors, Volume 2016, Article ID 4132721, 9 pages, http://dx.doi.org/10.1155/2016/4132721
environments, are considered, where bumpy courses mean environments that include rough roads on which the VO accuracy is dynamically affected by the pose change of vision sensors. Furthermore, if precise VO can be realized in environments that include rough roads, we believe that it can be utilized in any outdoor environment.
In research on intelligent unmanned vehicle systems, computer vision generally adopts image processing algorithms. In those works, the image features are extracted along with a model of the ambient environment for vehicle localization and obstacle avoidance. Range and azimuth information are then refined from the above model using the vision system. It is unrealistic to assume that the road is absolutely flat in this process. This paper concentrates on the dynamic measurement rectification problem, in which the camera pose changes abruptly. This approach is particularly suitable for applications such as navigating autonomous vehicles running on rough terrain. Pose variations are first measured by a three-axis angle sensor and then applied to calculate the distance offsets using the proposed range-pixel model. Although monocular visual odometry has an advantage in wide FOV (field of view), factors such as lighting conditions, shadows, and random noise unavoidably decrease the measurement precision; these are induced by both human limitations and sensor characteristics. On the contrary, noncontact sensors such as sonar are typically not susceptible to those external conditions which would affect the result accuracy. Nevertheless, one main defect is the existence of inherent blind areas. In view of these possible advantages and corresponding limitations, a sensor assimilation technique based on the OI (Optimal Interpolation) method is employed. The main contributions of this paper are summarized as follows:
(1) The relation between range increment and the camera's pose variation has been formulated, based on which a feasible data rectification algorithm has been designed to modify the metrical results. To the best of our knowledge, this is the first work to solve the range estimation problem under camera shaking conditions.

(2) An improved estimation mechanism of range information in the OI model has been developed, which enhances the adaptability and accuracy of the multisensor measuring system.

(3) Experiments on mobile robots and analytical results have been demonstrated.
The rest of this paper is organized as follows. The following section provides some background and a more detailed literature review. Section 3 formulates the problem. Section 4 details the proposed approach for measurement rectification and sensor fusion. Finally, experiment results and conclusions are given in Sections 5 and 6.
2. Related Works
Visual distance estimation is a specialized set of approaches which focus on real-time and accurate image capture followed by range information acquisition. Several of these mechanisms have been developed as foundational elements of 3D reconstruction, simultaneous localization, and map building.
Some basic algorithms, as well as their improvements, have been developed for range estimation: the epipolar constraint model [2], the defocusing method [3–5], the coordinate mapping scheme [6, 7], and the camera movement approach [1, 8]. Katsuyuki et al. proposed a coupled estimation of unknown vehicle width and following distance by sequential Bayesian estimation. The method can run in real time and produce a highly accurate estimate of the following distance, under the precondition that no camera shaking happens.
The proposed methods can be divided into two categories: monocular and stereo systems. Monocular approaches involve a single unsophisticated camera and compute the pixel size or coordinates, which are used for range estimation. Examples of these are studied in [9]. Stereo vision approaches can provide much higher accuracy than monocular ones, but they have a small field of view and high operational complexity. Several intelligent and operable algorithms [10, 11] fall into this category.
Monocular and stereo vision approaches have advantages in different respects. Monocular approaches are usually easy to implement, have an optimal view scope, and require a much lower cost than stereo systems. Stereo vision methods, in contrast, perform well in accuracy due to subpixel synthetic localization technology, while their biggest drawback lies in the complicated operations and high computational complexity, especially during the calibration process.
Among existing research, most work assumes that the camera pose is fixed [3, 6, 9, 10, 12–14]. Some notable exceptions, which have similarity to the present work, are as follows: (1) Guo et al. [15] put forward a parallel constraint method based on the two lane boundaries. (2) Vehicles are equipped with an angle sensor to accurately acquire the pitching angle of the camera in [13, 16, 17], where the authors proposed an improved algorithm for angle calculation using a function of the angles representing the two parallel lane lines.
Some other approaches have also been proposed. Typical paradigms are as follows. Han et al. [18] devise a feature-point-based method for monocular measurement, but it hinders real-time implementation. Malis and Rives [19] design a hybrid algorithm to minimize the token relative displacements between two frames and then estimate the image-space distance.
3. Problem Formulation
In Figure 1, suppose P′ is a point in the image plane of the camera at a pose Ps, and suppose that we have an estimate of the pose of the camera from a three-axis gyroscope. From this information, a standard ground-constrained model [18] can be used to estimate the position of P in the world coordinate system. If the camera's pose suddenly changes to Pf, we can use this information to project point P into the camera's image plane,
Figure 1: The imaging geometry of the observation model. (The figure shows the world axes X_W, Y_W, Z_W with origin O, the ground point P(x_w, y_w), and its projections P′(u, v) and P′′(u, v) in the imaging plane.)
which yields a second point P′′. Now, assuming that the pose measurement is reasonably accurate and that the position estimation algorithm works well, the problem is to estimate P utilizing measurements including P′, P′′, and the pose variation of the camera.
The initial and final poses of the camera are denoted by Ps = (r_s, p_s, y_s) and Pf = (r_f, p_f, y_f), respectively, where r_i, p_i, and y_i (i = s, f) stand for the roll, pitch, and yaw angles. Although the actual relative distance from the optical center to the target changes only slightly, the measured results deviate from the truth significantly. This is mainly because of the nonlinear mapping between pixel coordinates and the corresponding distance values. The problem is to correct the actual measurements to be close to the truth by eliminating the pose perturbation of the camera.
4. Data Rectification Algorithm
In this section, we describe our approach to the problem of monocular vision-based measurement rectification. Since a robot's trajectory is most conveniently described in a world coordinate system, while a target on the ground is generally described in the camera coordinate system, we start with a preview of these two coordinate systems. To model the problem in a general geometric architecture, the algebraic relation between the camera pose displacement and the displacement of the measured distance is derived.
4.1. World and Camera Coordinate Systems. Assume that XYZ and X′Y′Z′ are, respectively, a world coordinate system and the camera's coordinate system, as shown in Figure 2. The coordinates of a point P under these two coordinate systems are transformed by
\[
\begin{bmatrix} X \\ Y \\ Z \end{bmatrix}
= \mathbf{R} \begin{bmatrix} X' \\ Y' \\ Z' \end{bmatrix} + \mathbf{T}
\tag{1}
\]
where (X, Y, Z) and (X′, Y′, Z′) are the coordinates of point P in the world and camera coordinate systems. Moreover,
Figure 2: The world and camera coordinate systems (world axes X, Y, Z; camera axes X′, Y′, Z′; point P).
R and T are, respectively, the rotation and translation from the camera to the world coordinate system, which determine the position and orientation of the camera in the world coordinate system. Furthermore, for a 3D point P in the FOV of the camera, its image coordinates are given by the following projection equation:
\[
\begin{bmatrix} u & v \end{bmatrix}^{T}
= \begin{bmatrix} \dfrac{X'}{Z'} & \dfrac{Y'}{Z'} \end{bmatrix}^{T} \cdot f
\tag{2}
\]
where (u, v) are the coordinates of P in the image coordinate system and f is the camera's focal length.
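As a minimal sketch of these two steps, the rigid-body transform of (1) can be inverted to move a world point into camera coordinates, after which the projection of (2) yields image coordinates. This is an illustrative reconstruction, not the authors' code; the function name and the unit focal length in the example are assumptions.

```python
import numpy as np

def world_to_image(P_world, R, T, f):
    """Project a 3-D world point to image coordinates.

    Eq. (1) maps camera coordinates to world coordinates,
    [X, Y, Z]^T = R [X', Y', Z']^T + T, so camera coordinates are
    recovered as R^T (P - T); Eq. (2) then gives
    (u, v) = f * (X'/Z', Y'/Z').
    """
    P_world = np.asarray(P_world, dtype=float)
    # R is orthogonal, so its inverse is its transpose.
    X_c, Y_c, Z_c = R.T @ (P_world - T)
    return f * X_c / Z_c, f * Y_c / Z_c
```

For example, with R the identity, T zero, and f = 1, the world point (1, 2, 4) projects to (0.25, 0.5).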
4.2. Chebyshev Best Uniform Approximation Rectification Algorithm. The distance-orientation information between targets and the camera can be derived from the corresponding pixel coordinates in the image [20, 21]. It is found that the ratio of image pixel motion to the camera rotation angles varies nonlinearly along the main optical axis. The main idea of the designed algorithm is to piecewise linearize this nonlinear rate and then calculate the rate of change with respect to the rotation angles as well as the measured distance. Equation (3) presents the rotation matrix in 3D space:
\[
R =
\begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos\theta & \sin\theta \\ 0 & -\sin\theta & \cos\theta \end{bmatrix}
\cdot
\begin{bmatrix} \cos\psi & 0 & -\sin\psi \\ 0 & 1 & 0 \\ \sin\psi & 0 & \cos\psi \end{bmatrix}
\cdot
\begin{bmatrix} \cos\varphi & \sin\varphi & 0 \\ -\sin\varphi & \cos\varphi & 0 \\ 0 & 0 & 1 \end{bmatrix}
\tag{3}
\]
Variations of the pixel coordinates are associated with the world coordinates by a rotation matrix whose parameters are the attitude angles of the camera, as described by
\[
\Delta u^{k+1} = F_x \cdot \left( \frac{R^{k+1}_{11} x_w + R^{k+1}_{12} y_w}{R^{k+1}_{31} x_w + R^{k+1}_{32} y_w} - \frac{R^{k}_{11} x_w + R^{k}_{12} y_w}{R^{k}_{31} x_w + R^{k}_{32} y_w} \right)
\]
\[
\Delta v^{k+1} = F_y \cdot \left( \frac{R^{k+1}_{21} x_w + R^{k+1}_{22} y_w}{R^{k+1}_{31} x_w + R^{k+1}_{32} y_w} - \frac{R^{k}_{21} x_w + R^{k}_{22} y_w}{R^{k}_{31} x_w + R^{k}_{32} y_w} \right)
\tag{4}
\]
where the inner parameters F_x and F_y are determined only by the CCD structure itself.
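The mapping of (3)-(4) can be sketched in code as follows. This is an illustrative reconstruction under the paper's angle convention (successive rotations about the X-, Y-, and Z-axes by θ, ψ, and φ); the function names are chosen here, and the sample focal parameters in the usage note are the calibrated F_x and F_y of Table 2.

```python
import numpy as np

def rotation_matrix(theta, psi, phi):
    """Rotation matrix of Eq. (3): rotations about the X-axis (theta),
    the Y-axis (psi), and the Z-axis (phi), multiplied in that order."""
    Rx = np.array([[1.0, 0.0, 0.0],
                   [0.0, np.cos(theta), np.sin(theta)],
                   [0.0, -np.sin(theta), np.cos(theta)]])
    Ry = np.array([[np.cos(psi), 0.0, -np.sin(psi)],
                   [0.0, 1.0, 0.0],
                   [np.sin(psi), 0.0, np.cos(psi)]])
    Rz = np.array([[np.cos(phi), np.sin(phi), 0.0],
                   [-np.sin(phi), np.cos(phi), 0.0],
                   [0.0, 0.0, 1.0]])
    return Rx @ Ry @ Rz

def pixel_increment(R_k, R_k1, xw, yw, Fx, Fy):
    """Pixel increments of Eq. (4) between camera poses k and k+1
    for a ground point (xw, yw)."""
    def term(R, row):
        # (R_row1 * xw + R_row2 * yw) / (R_31 * xw + R_32 * yw)
        return (R[row, 0] * xw + R[row, 1] * yw) / (R[2, 0] * xw + R[2, 1] * yw)
    du = Fx * (term(R_k1, 0) - term(R_k, 0))
    dv = Fy * (term(R_k1, 1) - term(R_k, 1))
    return du, dv
```

An unchanged pose gives zero increments, while a small yaw disturbance between poses k and k+1 shifts the pixel coordinates.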
For convenience of discussion, we assume that the camera pose changes mainly along the yaw angle direction.
Denote Ψ(θ, ψ, φ) = (R₁₁x_w + R₁₂y_w)/(R₃₁x_w + R₃₂y_w); using (3) and (4), we obtain

\[
\Psi(\theta, \psi, \varphi) = \cos\partial + \sin\partial \cot\beta
\tag{5}
\]

where

\[
\partial = \arctan\left(\frac{x_w}{y_w}\right) + \arctan\left(\frac{y_w}{x_w}\right), \qquad
\beta = \varphi - \arctan\left(\frac{y_w}{x_w}\right)
\tag{6}
\]
Substituting (6) into (5) results in

\[
\Psi(\theta, \psi, \varphi) = \cot\left(\varphi - \arctan\left(\frac{y_w}{x_w}\right)\right)
\tag{7}
\]
From Figure 3, we can see that the slope of the curves tends to be constant within a sliding interval of the independent variable. This interval becomes smaller as the ratio of x_w to y_w increases. The Chebyshev approximation method has the characteristic of uniform approximation on a selected closed interval. Inspired by this, the nonlinear rate can be approximated by a linear polynomial, and the deviation caused by pose changes of a camera can be effectively compensated. The second derivative is
\[
\Psi''(\varphi) = 2\cot\left(\varphi - \arctan\left(\frac{y_w}{x_w}\right)\right)
\cdot \left(\cot^{2}\left(\varphi - \arctan\left(\frac{y_w}{x_w}\right)\right) + 1\right)
\tag{8}
\]
Considering the yaw angle variations of a PTZ (Pan/Tilt/Zoom) camera caused by uneven pavement during practical robot motion, a closed subinterval [a, b] ([a, b] ⊆ [0, π/12]) is chosen for further deduction. Since (8) is a continuous function and keeps a consistent sign, the best uniform approximation method can be used.
Denote a*₁ = (Ψ(b) − Ψ(a))/(b − a); using this, the normal equation of the approximation is

\[
\Psi'(\varphi) = a^{*}_{1}
\tag{9}
\]
Let x_Δ be the solution of (9). Then the approximation equation is written as follows:

\[
p^{*}(x) = \frac{f(a) + f(x_{\Delta})}{2} + a^{*}_{1} \cdot \left(x - \frac{a + x_{\Delta}}{2}\right)
\tag{10}
\]
We explore the slope of the line after linear approximation to study the effect of different ratios of x_w to y_w. Results show that the slope converges to its limit uniformly; moreover, this limiting value is irrelevant to the ratio above:

\[
\kappa = \frac{\Psi(b) - \Psi(a)}{b - a}
\tag{11}
\]
Figure 3: Pixel coordinates along the U-axis vary with the yaw angle (rad); a set of y_w/x_w values is considered (curves for X/Y = 0.5, 0.2941, 0.2, 0.1, and 1e−6), where (x_w, y_w) is the coordinate of P in the world coordinate system.
Substituting (7) into (11) results in

\[
\kappa = \frac{1}{b - a} \cdot \frac{1 + (y/x)^2}{(\tan a - y/x)(\tan b - y/x)} \cdot (\tan a - \tan b)
\tag{12}
\]
To demonstrate the convergence of κ, we have also analyzed its limit value, given by

\[
\lim_{y/x \to \infty} \kappa = \frac{\tan a - \tan b}{b - a}
\tag{13}
\]
The solid curves in Figure 4 are the results of the actual slope and the linear approximation, respectively. These two curves coincide well after a translation operation, which indicates high slope accuracy for the linear approximation. Figure 5 manifests the convergence of the slope with respect to the metric y_w/x_w, which is in good agreement with the experimental results. Another important property is that the function value rapidly reaches convergence after a dramatic increase; that is, the measured range varies with the metric y_w/x_w with high nonlinearity. This also implies that the measurement should be conducted on the smooth interval to reduce the errors caused by camera shaking. On the other hand, it is impossible to compensate for the deviations when the metric is too small.
4.3. Sonar and Camera Data Assimilation Model. The Optimal Interpolation Algorithm generates least squares results for vectors of observations and background fields, assuming a priori known statistical models for the background error covariance. The Optimal Interpolation technique, based on minimum-variance estimation, plays an important role in data assimilation: it uses several different real-world observations to produce a corrected output that is closer to the truth.
Figure 4: Pixel values vary with the yaw angle (Y/X = 300); our approach can successfully approximate the actual changes (curves: the actual function and its linear approximation).
Figure 5: The slope converges to a stable value (about −1.033 near Y/X = 34) as the metric Y/X increases.
The motivation of the proposed method comes from the similar characteristics and phenomena shared by a camera-sonar measurement system and an OI algorithm. First, a camera and sonar system can be considered an OI system that produces an optimal output from several groups of observations. Second, the OI algorithm has dimension-extensible and loose a priori characteristics that are attractive for a camera and sonar measurement system.
The following are given:

(i) a background field x_b, available in two or three dimensions;
(ii) a set of p observations x_obs, available at irregular positions.
The optimal estimation is described by

\[
x_a = x_b + W \left( x_{\mathrm{obs}} - x_b \right)
\tag{14}
\]
Table 1: Initial state of camera before calibration.

Rotation angle: 0; Pitch angle: −15°; Zoom: 0
Table 2: Calibration results.

f_x = 731.88, f_y = 738.65, U_0 = 339.28, V_0 = 249.7
The errors are given by

\[
\varepsilon_a = x_a - x_t, \qquad
\varepsilon_b = x_b - x_t, \qquad
\varepsilon_{\mathrm{obs}} = x_{\mathrm{obs}} - x_t
\tag{15}
\]
The optimal weight is then as follows:

\[
W = \frac{(\sigma_b)^2}{(\sigma_{\mathrm{obs}})^2 + (\sigma_b)^2}
\tag{16}
\]

where σ_obs and σ_b represent the standard deviations of ε_obs and ε_b. As data from camera and sonar are unrelated, it is assumed that E(ε_b ε_obs) = 0.
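A one-dimensional sketch of the update in (14)-(16) follows; the function name and the sample values in the usage note are assumptions, with the background taken from sonar and the observation from the camera.

```python
def oi_assimilate(x_b, x_obs, var_b, var_obs):
    """Optimal Interpolation analysis of Eqs. (14)-(16).

    x_b: background value (e.g. sonar range); x_obs: observation
    (e.g. camera range); var_b, var_obs: their error variances.
    """
    W = var_b / (var_obs + var_b)       # Eq. (16)
    return x_b + W * (x_obs - x_b)      # Eq. (14)
```

With equal error variances the analysis is the plain average of the two readings; a more accurate sensor (smaller variance) pulls the result toward itself.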
5. Evaluations and Analysis
In this section, we present the results of a set of physical experiments to demonstrate the performance of the algorithm proposed in Section 4. To validate the effectiveness of the proposed data rectification algorithm, we compared the results before and after a pose change with the truth. Moreover, we have conducted a set of experiments under different initial poses of the camera to testify to the robustness of this method. Besides, comparative experiments have been designed to show the validity of the data assimilation approach.
Autonomous vehicles can be modeled as mobile robots, so we use the mobile robot Pioneer 3-DX (Figure 6), mounted with a camera, for the experiments. To prepare for the experiments, the PTZ camera is first calibrated.
5.1. Camera Calibration. The grid size of the calibration board in the experiments is 30 mm × 30 mm. The picture resolution of the VCC50I is fixed at 640 × 480. To ensure error balance and calibration accuracy, groups of calibration images containing four images from various poses are collected at a distance interval of 10 cm. The calibration distance ranges from 1500 mm to 4000 mm. Considering the effects of pitch and rotation angles as well as the zoom value, the camera state during calibration is fixed as given in Table 1. The internal parameters, which are crucial to distance measurement, are listed in Table 2.
5.2. Performance Evaluation of the Data Rectification Algorithm. Angle variations are acquired by a three-axis angle sensor, which acts as the input of the rectification algorithm.
Figure 6: Pioneer 3-DX robot equipped with a Canon VCC50I PTZ camera.
Table 3: Different conditions set for measurement.

ψ (°)    θ (°)    φ (°)
2.13     7.18     0.05
4.89     4.84     2.81
5.68     4.05     3.59
2.93     6.42     2.77
4.07     5.01     1.10
The module (MPU6050) has the advantages of low temperature dependency, high resolution, and low noise. Due to these advantages, this module is chosen as the tool to measure the Euler angles.
To validate the robustness of the algorithm, a target is set at different positions randomly. For each metric y/x, results under a set of camera poses are analyzed (the yaw, pitch, and roll angles are random for each metric, as set in Table 3). The initial readings of the angle sensor are as follows: θ (pitch) 9.65°, φ (yaw) −0.27°, and ψ (roll) −0.94°. The target position tuples are set as (1500, 100), (2000, 100), (2500, 100), and (3000, 100).
Figure 7(a) shows measurements from the initial camera state. Compared with the truth, it shows bias along both the horizontal and vertical directions.
The deviations of the VO results caused by the camera motions are rectified independently. Based on the analysis of the range model in Section 3, the pitch angle is an independent variable of the distance function; therefore, we recalculate the pitch angle instead of performing inversion operations on pixel coordinates. Figures 7(b)-7(f) show results before and after rectification. In Figure 7(e), the distance error along the optical axis is almost as high as 50% using direct measurement; this value decreases to only 6% using the proposed algorithm. We can also see that the least improvement in accuracy in this direction is 10%, as shown in Figure 7(c). Accordingly, a much more remarkable effect can be seen from the results along the direction perpendicular to the optical axis. In the worst case, as shown in Figure 7(f), the measured distance along the X-axis is rectified from the measured −780 mm to the final 98 mm; the gain in measuring precision approaches 87.8%. Even in a general situation, this percentage can be close to 35%, as demonstrated in Figure 7(c). These figures also show that the range deviation becomes larger as the distance along the optical axis direction increases. This is mainly because the ratio of physical distance to pixel unit increases along the optical axis.
5.3. Data Assimilation Evaluation. For generality, the assimilation results under different metric sizes (i.e., manipulating y/x) with a fixed camera pose are demonstrated. Data from the sonar sensors are set as the background field value, and those from the camera are set as the observation field value. In Figure 8(a), measurement results at some positions are missing, which indicates that blind zones exist when the sonar system alone is adopted. The range data in Figure 8(b) are the results of sensor assimilation. They demonstrate the accuracy improvement along both the X-axis and the Y-axis compared with measurements solely from the vision system or the sonar sensors. The assimilated results proved to be as much as 25 percent more accurate along the X-axis and 9 percent more accurate along the Y-axis compared to those acquired using a single type of sensor. This is mainly because new information is brought in to compensate the output from a single measuring system, that is, the wide FOV of the camera and the high measurement accuracy of the sonar sensors.
6. Conclusions

In this paper, we have proposed an analytical measuring rectification algorithm for monocular range estimation under camera shaking conditions. Specifically, we have established a pose-range model, and the algebraic relation between
Figure 7: Comparisons between results after and before rectification; a set of camera poses is set to validate the robustness of the designed algorithm. Each panel plots Y (mm) against X (mm) and compares the camera-measured value, the true value, and the rectified value: (a) initial pose, θ = 9.65°, φ = −0.27°, ψ = −0.94°; (b) θ = 7.18°, φ = 0.05°, ψ = 2.13°; (c) θ = 4.84°, φ = 2.81°, ψ = 4.89°; (d) θ = 4.05°, φ = 3.59°, ψ = 5.68°; (e) θ = 6.42°, φ = 2.77°, ψ = 2.93°; (f) θ = 5.01°, φ = 1.10°, ψ = 4.07°.
distance increment and a camera's pose variation has been formulated. We have also designed a data assimilation system to provide reliable range information using different types of transducer systems. Physical experiments were conducted to validate the effectiveness and robustness of the proposed algorithm. For future work, we will try to implement our algorithms on multiple-robot formations as well as swarm coordination applications.
Figure 8: Comparisons between results before and after assimilation; both panels plot distance along the Y-axis (mm) against distance along the X-axis (mm). (a) Original data from the sonar and camera sensors (results before assimilation). (b) Results after assimilation using the Optimal Interpolation Algorithm. (Results of the camera are represented as Δ and those of the sonar as ∗; the symbol ∘ stands for truth values; the remaining markers are the assimilated results.)
Competing Interests
The authors declare that they have no competing interests.
Acknowledgments
This research is financially supported by the Natural Science Foundation of China (Grant no. 61571334) and the Natural Science Foundation of China (Grant no. 2014AA09A512).
References
[1] M. Mikhalsky and J. Sitte, "Active motion vision for distance estimation in autonomous mobile robots," in Proceedings of the IEEE Conference on Robotics, Automation and Mechatronics, vol. 2, pp. 1060–1065, December 2004.
[2] N. Yamaguti, S. Oe, and K. Terada, "A method of distance measurement by using monocular camera," in Proceedings of the 36th SICE Annual Conference (SICE '97), International Session Papers, pp. 1255–1260, IEEE, Tokushima, Japan, July 1997.
[3] X. Liu, T. Qin, W. Chen, and X. Yin, "Real-time distance measurement using a modified camera," in Proceedings of the IEEE Sensors Applications Symposium (SAS '10), pp. 54–58, IEEE, Limerick, Ireland, February 2010.
[4] U. Mudenagudi and S. Chaudhuri, "Depth estimation using defocused stereo image pairs," in Proceedings of the 7th IEEE International Conference on Computer Vision (ICCV '99), vol. 1, pp. 483–488, IEEE, September 1999.
[5] A. N. Rajagopalan, S. Chaudhuri, and U. Mudenagudi, "Depth estimation and image restoration using defocused stereo pairs," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 26, no. 11, pp. 1521–1525, 2004.
[6] M. N. A. Wahab, N. Sivadev, and K. Sundaraj, "Target distance estimation using monocular vision system for mobile robot," in Proceedings of the 2nd IEEE International Conference on Open Systems (ICOS '11), pp. 11–15, IEEE, Langkawi, Malaysia, September 2011.
[7] K. Umeda and T. Takahashi, "Subpixel stereo method: a new methodology of stereo vision," in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA '00), vol. 4, pp. 3215–3220, San Francisco, Calif, USA, April 2000.
[8] T.-H. Wang, C.-C. Hsu, C.-C. Chen, C.-W. Huang, and Y.-C. Lu, "Three-dimensional measurement of a remote object with a single CCD camera," in Proceedings of the 4th International Conference on Autonomous Robots and Agents (ICARA '09), pp. 187–192, IEEE, Wellington, New Zealand, February 2009.
[9] K. Nakamura, K. Ishigaki, T. Ogata, and S. Muramatsu, "Real-time monocular ranging by Bayesian triangulation," in Proceedings of the IEEE Intelligent Vehicles Symposium (IV '13), pp. 1368–1373, IEEE, Gold Coast, Australia, June 2013.
[10] A. Tsalatsanis, K. Valavanis, and N. Tsourveloudis, "Mobile robot navigation using sonar and range measurements from uncalibrated cameras," Journal of Intelligent and Robotic Systems, vol. 48, no. 2, pp. 253–284, 2007.
[11] V. Tucakov, M. Sahota, D. Murray et al., "A stereoscopic visually guided mobile robot," in Proceedings of the Hawaii International Conference on Systems Sciences, Maui, Hawaii, USA, January 1997.
[12] B. Fidan, V. Gazi, S. Zhai, N. Cen, and E. Karatas, "Single-view distance-estimation-based formation control of robotic swarms," IEEE Transactions on Industrial Electronics, vol. 60, no. 12, pp. 5781–5791, 2013.
[13] X. Li and L. Wang, "A monocular distance estimation method used in video sequence," in Proceedings of the IEEE International Conference on Information and Automation (ICIA '12), pp. 390–394, Shenyang, China, June 2012.
[14] J. Yu, J. Song, X. Sun, and T. Zhang, "Design and experiment of distance measuring system with single camera for picking robot," in Proceedings of the IEEE World Automation Congress (WAC '10), pp. 445–448, Kobe, Japan, September 2010.
[15] L. Guo, Y. Xu, K. Li et al., "Study on real-time distance detection based on monocular vision technique," Journal of Image and Graphics, vol. 11, no. 1, pp. 74–81, 2006.
[16] S. Derrouich, K. Izumida, and K. Shiiya, "A combination of monocular CCD camera and inertial-sensor for range estimation," in Proceedings of the 28th Annual Conference of the IEEE Industrial Electronics Society (IECON '02), pp. 2191–2196, IEEE, November 2002.
[17] D. Wang, W. Chen, Z. Zhao, and T. C. Ng, "Inter-vehicle separation measurement using monocular vision," in Proceedings of the 7th IEEE Conference on Industrial Electronics and Applications (ICIEA '12), pp. 258–263, Singapore, July 2012.
[18] Y. X. Han, Z. S. Zhang, and M. Dai, "Monocular vision system for distance measurement based on feature points," Optics and Precision Engineering, vol. 19, no. 5, pp. 1110–1117, 2011.
[19] E. Malis and P. Rives, "Robustness of image-based visual servoing with respect to depth distribution errors," in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA '03), vol. 1, pp. 1056–1061, IEEE, Taipei, Taiwan, September 2003.
[20] R. F. Stengel, Optimal Control and Estimation, Dover, New York, NY, USA, 1994.
[21] J. Kim, Y. Kim, and S. Kim, "An accurate localization for mobile robot using extended Kalman filter and sensor fusion," in Proceedings of the International Joint Conference on Neural Networks (IJCNN '08), pp. 2928–2933, IEEE World Congress on Computational Intelligence, Hong Kong, June 2008.
2 Journal of Sensors
environments are considered, where the bumpy courses mean that the environments include rough roads on which the VO accuracy is dynamically affected by the pose change of the vision sensors. Furthermore, if precise VO is realized in environments that include rough roads, we believe it can be utilized in any outdoor environment.
In research on intelligent unmanned vehicle systems, computer vision generally adopts image-processing methods. In those works, image features are extracted along with a model of the ambient environment for vehicle localization and obstacle avoidance; range and azimuth information is then refined from that model using the vision system. It is unrealistic to assume that the road is absolutely flat in this process. This paper concentrates on the dynamic measurement rectification problem in which the camera pose changes abruptly. The approach is particularly suitable for applications such as navigating autonomous vehicles on rough terrain. Pose variations are first measured by a three-axis angle sensor and then applied to calculate the distance offsets using the proposed range-pixel model. Although monocular visual odometry has the advantage of a wide FOV (field of view), factors such as lighting conditions, shadows, and random noise unavoidably decrease the measurement precision; these arise from both human limitations and sensor characteristics. By contrast, noncontact sensors such as sonar are typically not susceptible to the external conditions that would affect result accuracy; nevertheless, one main defect is the existence of inherent blind areas. In view of these advantages and limitations, a sensor assimilation technique based on the OI (Optimal Interpolation) method is employed. The main contributions of this paper are summarized as follows.
(1) The relation between range increment and the camera's pose variation has been formulated, based on which a feasible data rectification algorithm has been designed to modify the metrical results. To the best of our knowledge, this is the first work to solve the range estimation problem under camera shaking conditions.
(2) An improved estimation mechanism for range information in the OI model has been developed, which enhances the adaptability and accuracy of the multisensor measuring system.
(3) Experiments on mobile robots and analytical results have been demonstrated.
The rest of this paper is organized as follows. The following section provides background and a more detailed literature review. Section 3 formulates the problem. Section 4 details the proposed approach for measurement rectification and sensor fusion. Finally, experiment results and conclusions are given in Sections 5 and 6.
2. Related Works
Visual distance estimation is a specialized set of approaches that focus on real-time and accurate image capture followed by range information acquirement. Several of these mechanisms have been developed as foundational elements of 3D reconstruction, simultaneous localization, and map building.
Some basic algorithms, as well as their improvements, have been developed for range estimation: the epipolar constraint model [2], defocusing methods [3–5], coordinate mapping schemes [6, 7], and camera movement approaches [1, 8]. Nakamura et al. [9] proposed a coupled estimation of unknown vehicle width and following distance by sequential Bayesian estimation; the method runs in real time and produces highly accurate estimates of the following distance, under the precondition that no camera shaking happens.
These methods can be divided into two categories: monocular and stereo systems. Monocular approaches involve a single, unsophisticated camera and compute the pixel sizes or coordinates used for range estimation; examples are studied in [9]. Stereo vision approaches can provide much higher accuracy than monocular ones, but they have a small field of view and high operational complexity. Several intelligent and operable algorithms [10, 11] fall into this category.
Monocular and stereo vision approaches have advantages in different aspects. Monocular approaches are usually easy to implement, have a wide view scope, and require much lower cost. Stereo vision methods, in contrast, perform well in accuracy thanks to subpixel synthetical localization technology, while their biggest drawback lies in complicated operations and high computational complexity, especially during the calibration process.
Among the existing research, most work assumes the camera pose is fixed [3, 6, 9, 10, 12–14]. Some notable exceptions with similarity to the present work are as follows: (1) Guo et al. [15] put forward a parallel constraint method based on the two lane boundaries. (2) Vehicles are equipped with an angle sensor to accurately acquire the pitching angle of the camera in [13, 16, 17], where the authors proposed an improved angle-calculation algorithm using a function of the angles representing the two parallel lane lines.
Some other approaches have also been proposed. Typical paradigms are as follows: Han et al. [18] devise a feature-point-based method for monocular measurement, but its computational cost hinders real-time implementation. Malis and Rives [19] design a hybrid algorithm to minimize the token relative displacements between two frames and then estimate the image-space distance.
3. Problem Formulation
In Figure 1, suppose P′ is a point in the image plane of the camera at pose Ps, and suppose that we have an estimate of the pose of the camera from a three-axis gyroscope. From this information, a standard ground-constrained model [18] can be used to estimate the position of P in world coordinates. If the camera's pose suddenly changes to Pf, we can use this information to project point P into the camera's image plane,
Figure 1: The imaging geometry of the observation model (world axes X_W, Y_W, Z_W with origin O; ground point P(x_w, y_w); image points P′ and P″ in the imaging plane).
which yields a second point P″. Now, assuming that the pose measurement is reasonably accurate and that the position estimation algorithm works well, the problem is to estimate P utilizing measurements including P′, P″, and the pose variation of the camera.
The initial and final poses of the camera are denoted by Ps = (r_s, p_s, y_s) and Pf = (r_f, p_f, y_f), respectively, where r_i, p_i, and y_i (i = s, f) stand for the roll, pitch, and yaw angles. Although the actual relative distance from the optical center to the target changes only slightly, the measured results deviate from the truth significantly. This is mainly because of the nonlinear mapping between pixel coordinates and the corresponding distance values. The problem is to correct the actual measurements to be close to the truth by eliminating the pose perturbation of the camera.
4. Data Rectification Algorithm
In this section we describe our approach to the problem of monocular vision-based measurement rectification. Since a robot's trajectory is most conveniently described in a world coordinate system, while a target on the ground is generally described in the camera coordinate system, we start with a review of these two coordinate systems. To model the problem in a general geometrodynamical architecture, the algebraic relation between the camera pose displacement and the displacement of the measured distance is derived.
4.1. World and Camera Coordinate Systems. Assume that XYZ and X′Y′Z′ are, respectively, a world coordinate system and the camera's coordinate system, as shown in Figure 2. The coordinates of a point P under these two coordinate systems are transformed by

$$\begin{bmatrix} X \\ Y \\ Z \end{bmatrix} = \mathbf{R} \begin{bmatrix} X' \\ Y' \\ Z' \end{bmatrix} + \mathbf{T} \qquad (1)$$
where (X, Y, Z) and (X′, Y′, Z′) are the coordinates of point P in the world and camera coordinate systems. Moreover,
Figure 2: The world (XYZ) and camera (X′Y′Z′) coordinate systems.
R and T are, respectively, the rotation and translation from the camera to the world coordinate system, which determine the position and orientation of the camera in the world coordinate system. Furthermore, for a 3D point P in the FOV of the camera, its image coordinates are given by the projection equation:
$$[u, v]^T = \left[\frac{X'}{Z'}, \frac{Y'}{Z'}\right]^T \cdot f \qquad (2)$$
where (u, v) are the coordinates of P in the image coordinate system and f is the camera's focal length.
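As a concrete illustration, (1) and (2) can be chained to project a ground point into the image. The sketch below is ours, not from the paper; the pose (identity rotation, zero translation), focal length, and point coordinates are illustrative values only:

```python
import numpy as np

def world_to_pixel(p_world, R, T, f):
    """Project a world-frame point to pixel coordinates.

    Eq. (1), [X, Y, Z]^T = R [X', Y', Z']^T + T, maps camera coordinates
    to world coordinates, so the camera-frame point is recovered by
    inverting the rigid transform; Eq. (2) then applies the pinhole
    projection (u, v) = f * (X'/Z', Y'/Z')."""
    p_cam = np.linalg.solve(np.asarray(R, float),
                            np.asarray(p_world, float) - np.asarray(T, float))
    return f * p_cam[0] / p_cam[2], f * p_cam[1] / p_cam[2]

# With the camera frame coincident with the world frame (R = I, T = 0),
# a point 2 m in front of the camera projects to:
u, v = world_to_pixel([0.1, 0.2, 2.0], np.eye(3), np.zeros(3), f=700.0)
# u = 700 * 0.1 / 2.0 = 35.0 pixels, v = 700 * 0.2 / 2.0 = 70.0 pixels
```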
4.2. Chebyshev Best Uniform Approximation Rectification Algorithm. The distance-orientation information between targets and the camera can be derived from the corresponding pixel coordinates in the image [20, 21]. The ratio of image pixel motion to the camera rotation angles is found to vary nonlinearly along the main optical axis. The main idea of the designed algorithm is to piecewise-linearize this nonlinear rate and then calculate the rate of change with respect to the rotation angles as well as the measured distance. Equation (3) presents the rotation matrix in 3D space:
$$\mathbf{R} = \begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos\theta & \sin\theta \\ 0 & -\sin\theta & \cos\theta \end{bmatrix} \cdot \begin{bmatrix} \cos\psi & 0 & -\sin\psi \\ 0 & 1 & 0 \\ \sin\psi & 0 & \cos\psi \end{bmatrix} \cdot \begin{bmatrix} \cos\varphi & \sin\varphi & 0 \\ -\sin\varphi & \cos\varphi & 0 \\ 0 & 0 & 1 \end{bmatrix} \qquad (3)$$
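The composition in (3) is straightforward to reproduce. The sketch below (our naming; the sample angles are illustrative) builds R as the product of the three elementary rotations, with θ, ψ, and φ applied about the X-, Y-, and Z-axes, respectively, in the order written in (3):

```python
import numpy as np

def rotation_matrix(theta, psi, phi):
    """Rotation matrix of Eq. (3): elementary rotations about the X-,
    Y-, and Z-axes, multiplied in the order written in the text."""
    c, s = np.cos, np.sin
    Rx = np.array([[1, 0, 0],
                   [0,  c(theta), s(theta)],
                   [0, -s(theta), c(theta)]])
    Ry = np.array([[c(psi), 0, -s(psi)],
                   [0,      1,  0],
                   [s(psi), 0,  c(psi)]])
    Rz = np.array([[ c(phi), s(phi), 0],
                   [-s(phi), c(phi), 0],
                   [ 0,      0,      1]])
    return Rx @ Ry @ Rz

# Illustrative angles (radians); any valid R is orthonormal with det = 1.
R = rotation_matrix(np.radians(9.65), np.radians(-0.94), np.radians(-0.27))
```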
Variations of the pixel coordinates are associated with the world coordinates by a rotation matrix whose parameters are the attitude angles of the camera, as described by
$$\Delta u_{k+1} = F_x \cdot \left( \frac{R_{11}^{k+1} x_w + R_{12}^{k+1} y_w}{R_{31}^{k+1} x_w + R_{32}^{k+1} y_w} - \frac{R_{11}^{k} x_w + R_{12}^{k} y_w}{R_{31}^{k} x_w + R_{32}^{k} y_w} \right),$$

$$\Delta v_{k+1} = F_y \cdot \left( \frac{R_{21}^{k+1} x_w + R_{22}^{k+1} y_w}{R_{31}^{k+1} x_w + R_{32}^{k+1} y_w} - \frac{R_{21}^{k} x_w + R_{22}^{k} y_w}{R_{31}^{k} x_w + R_{32}^{k} y_w} \right) \qquad (4)$$
where the inner parameters F_x and F_y are determined only by the CCD structure itself.
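A minimal sketch of (4), assuming the rotation matrices for steps k and k+1 are already available as 3×3 nested lists; the function and variable names are ours, not from the paper:

```python
def pixel_increments(R_k, R_k1, xw, yw, Fx, Fy):
    """Pixel-coordinate increments of Eq. (4).

    R_k, R_k1 : 3x3 rotation matrices (nested lists) at poses k and k+1.
    (xw, yw)  : ground target in world coordinates.
    Fx, Fy    : intrinsic scale factors of the CCD."""
    def ratio(R, row):
        # (R_row,1 * xw + R_row,2 * yw) / (R_3,1 * xw + R_3,2 * yw)
        return (R[row][0] * xw + R[row][1] * yw) / (R[2][0] * xw + R[2][1] * yw)
    du = Fx * (ratio(R_k1, 0) - ratio(R_k, 0))
    dv = Fy * (ratio(R_k1, 1) - ratio(R_k, 1))
    return du, dv

# A pitched pose (rotation about X); identical poses give zero increments.
Rk = [[1.0, 0.0, 0.0], [0.0, 0.8, 0.6], [0.0, -0.6, 0.8]]
du, dv = pixel_increments(Rk, Rk, xw=1500.0, yw=100.0, Fx=731.0, Fy=738.0)
```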
For convenience of discussion, we assume that the camera pose changes mainly along the yaw angle direction.
Denote Ψ(θ, ψ, φ) = (R₁₁x_w + R₁₂y_w)/(R₃₁x_w + R₃₂y_w); using (3) and (4), we obtain

$$\Psi(\theta, \psi, \varphi) = \cos\partial + \sin\partial \cdot \cot\beta \qquad (5)$$
where

$$\partial = \arctan\left(\frac{x_w}{y_w}\right) + \arctan\left(\frac{y_w}{x_w}\right), \qquad \beta = \varphi - \arctan\left(\frac{y_w}{x_w}\right) \qquad (6)$$
Substituting (6) into (5) results in

$$\Psi(\theta, \psi, \varphi) = \cot\left(\varphi - \arctan\left(\frac{y_w}{x_w}\right)\right) \qquad (7)$$
From Figure 3 we can see that the slope of the curves tends to be constant within a sliding interval of the independent variable; this interval becomes smaller as the ratio of x_w to y_w increases. The Chebyshev approximation method has the characteristic of uniform approximation on a selected closed interval. Inspired by this, the nonlinear rate can be approximated by a linear polynomial, and the deviation caused by pose changes of the camera can be effectively compensated. The second derivative is

$$\Psi''(\varphi) = 2\cot\left(\varphi - \arctan\left(\frac{y_w}{x_w}\right)\right) \cdot \left(\cot^2\left(\varphi - \arctan\left(\frac{y_w}{x_w}\right)\right) + 1\right) \qquad (8)$$
Considering the yaw angle variations of a PTZ (Pan/Tilt/Zoom) camera caused by uneven pavement during practical robot motion, a closed subinterval [a, b] ⊆ [0, π/12] is chosen for further deduction. Since (8) is continuous and of consistent sign on this subinterval, the best uniform approximation method can be used.

Denote a₁* = (Ψ(b) − Ψ(a))/(b − a); using this, we acquire the normal equation of the approximation:

$$\Psi'(\varphi) = a_1^* \qquad (9)$$
Let x_Δ denote the solution of (9). Then the approximating line is written as

$$p^*(x) = \frac{\Psi(a) + \Psi(x_\Delta)}{2} + a_1^* \cdot \left(x - \frac{a + x_\Delta}{2}\right) \qquad (10)$$
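The construction behind (9) and (10) can be sketched as follows: compute the secant slope a₁*, solve Ψ′(x) = a₁* for x_Δ (here by bisection, exploiting the monotonicity of Ψ′ implied by the sign-consistent second derivative (8)), and assemble the approximating line p*(x). The helper names and the root-finding method are ours:

```python
import math

def Psi(phi, y_over_x):
    """Eq. (7): Psi = cot(phi - arctan(y/x))."""
    return 1.0 / math.tan(phi - math.atan(y_over_x))

def best_linear_approx(a, b, y_over_x):
    """Best uniform linear approximation of Psi on [a, b], per (9)-(10)."""
    a1 = (Psi(b, y_over_x) - Psi(a, y_over_x)) / (b - a)   # secant slope a1*

    def dPsi(x, h=1e-7):                 # numerical derivative of Psi
        return (Psi(x + h, y_over_x) - Psi(x - h, y_over_x)) / (2 * h)

    lo, hi = a, b                        # bisection on dPsi(x) - a1 = 0
    for _ in range(80):
        mid = 0.5 * (lo + hi)
        if (dPsi(lo) - a1) * (dPsi(mid) - a1) <= 0:
            hi = mid
        else:
            lo = mid
    x_delta = 0.5 * (lo + hi)

    def p_star(x):                       # Eq. (10)
        return (Psi(a, y_over_x) + Psi(x_delta, y_over_x)) / 2 \
               + a1 * (x - (a + x_delta) / 2)
    return p_star, a1, x_delta
```

By construction, the approximation error at a and at x_Δ have equal magnitude and opposite sign, which is the equioscillation property the Chebyshev construction targets.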
We examine the slope of the line after linear approximation to study the effect of different ratios of x_w to y_w. Results show that the slope converges uniformly to a limit; moreover, this constant value is irrelevant to the ratio above:

$$\kappa = \frac{\Psi(b) - \Psi(a)}{b - a} \qquad (11)$$
Figure 3: Pixel coordinates along the U-axis vary with the yaw angle (rad); a set of ratios X/Y = 0.5, 0.2941, 0.2, 0.1, and 10⁻⁶ is considered. (x_w, y_w) is the coordinate of P in the world coordinate system.
Substituting (7) into (11) results in

$$\kappa = \frac{1}{b - a} \cdot \frac{1 + (y/x)^2}{(\tan a - y/x)(\tan b - y/x)} \cdot (\tan a - \tan b) \qquad (12)$$
To demonstrate the convergence of κ, we have also analyzed the limit value, given by

$$\lim_{y/x \to \infty} \kappa = \frac{\tan a - \tan b}{b - a} \qquad (13)$$
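Equations (7), (11), and (13) can be checked numerically. The short sketch below (illustrative subinterval and ratios, chosen by us) reproduces the convergence of the slope toward its limit:

```python
import math

def Psi(phi, y_over_x):
    """Eq. (7): Psi = cot(phi - arctan(y/x))."""
    return 1.0 / math.tan(phi - math.atan(y_over_x))

def kappa(a, b, y_over_x):
    """Eq. (11): slope of Psi over the subinterval [a, b]."""
    return (Psi(b, y_over_x) - Psi(a, y_over_x)) / (b - a)

a, b = 0.05, math.pi / 12                         # subinterval of [0, pi/12]
limit = (math.tan(a) - math.tan(b)) / (b - a)     # Eq. (13)
# For a large metric y/x the slope is already indistinguishable from the
# limit, mirroring the convergence behavior shown in Figure 5.
```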
The solid curves in Figure 4 are the results of the actual slope and the linear approximation, respectively. These two curves coincide well after a translation operation, which indicates a high slope accuracy for the linear approximation. Figure 5 manifests the convergence of the slope with respect to the metric y_w/x_w, which is in good agreement with experimental results. Another important property is that the function value rapidly reaches convergence after a dramatic increase; that is, the measured range varies with the metric y_w/x_w with high nonlinearity. This also implies that the measurement should be conducted on the smooth interval to reduce the errors caused by camera shaking. On the other hand, it is impossible to compensate the deviations when the metric is too small.
4.3. Sonar and Camera Data Assimilation Model. The Optimal Interpolation Algorithm generates the least squares results for vectors of observations and background fields, assuming a priori known statistical models for the background error covariance. The Optimal Interpolation technique, based on the minimization of variance estimation, plays an important role in data assimilation: it uses several different real-world observations to produce a corrected output that is closer to the truth.
Figure 4: Pixel values vary with the yaw angle (Y/X = 300; curves show the actual function and its linear proximity); our approach successfully approximates the actual changes.
Figure 5: The slope converges to a stable value (about −1.033) as Y/X increases.
The motivation of the proposed method comes from the similar characteristics of a camera-sonar measurement system and an OI algorithm. First, a camera-sonar system can be considered an OI system that produces an optimal output through several groups of observations. Second, the OI algorithm has dimension-extensible and loose a priori characteristics that are attractive for a camera-sonar measurement system.

The following are given:

(i) a background field x_b, available in two or three dimensions;
(ii) a set of p observations x_obs, available at irregular positions.
The optimal estimation is described by

$$x_a = x_b + W (x_{\text{obs}} - x_b) \qquad (14)$$
Table 1: Initial state of camera before calibration.
Rotation angle: 0; Pitch angle: −15°; Zoom: 0.

Table 2: Calibration results.
f_x = 731.88; f_y = 738.65; U_0 = 339.28; V_0 = 249.7.
The errors are given by

$$\varepsilon_a = x_a - x_t, \qquad \varepsilon_b = x_b - x_t, \qquad \varepsilon_{\text{obs}} = x_{\text{obs}} - x_t \qquad (15)$$
The optimal weight is then

$$W = \frac{(\sigma_b)^2}{(\sigma_{\text{obs}})^2 + (\sigma_b)^2} \qquad (16)$$

where σ_b and σ_obs denote the standard deviations of ε_b and ε_obs. As the data from the camera and sonar are unrelated, it is assumed that E(ε_b ε_obs) = 0.
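A one-dimensional sketch of the update in (14) and (16), with illustrative numbers rather than measured values; the sonar reading is taken as the background field and the camera reading as the observation:

```python
def oi_fuse(x_b, x_obs, sigma_b, sigma_obs):
    """One-dimensional Optimal Interpolation update.

    Eq. (16) weights the innovation by the background error variance,
    and Eq. (14) corrects the background field with it."""
    W = sigma_b ** 2 / (sigma_obs ** 2 + sigma_b ** 2)   # Eq. (16)
    return x_b + W * (x_obs - x_b)                        # Eq. (14)

# Equal error levels give the arithmetic mean of the two sensor readings.
fused = oi_fuse(x_b=2000.0, x_obs=2100.0, sigma_b=10.0, sigma_obs=10.0)  # 2050.0
```

As σ_obs shrinks relative to σ_b, W tends to 1 and the fused range follows the observation; as it grows, the fused range falls back to the background field.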
5. Evaluations and Analysis
In this section we present the results of a set of physical experiments to demonstrate the performance of the algorithm proposed in Section 4. To validate the effectiveness of the proposed data rectification algorithm, we compared the results before and after a pose change with the ground truth. Moreover, we conducted a set of experiments under different initial camera poses to test the robustness of the method. In addition, comparative experiments were designed to show the validity of the data assimilation approach.
Autonomous vehicles can be modeled as mobile robots, so we use the mobile robot Pioneer 3-DX (Figure 6), mounted with a camera, for the experiments. To prepare for the experiments, the PTZ camera is first calibrated.
5.1. Camera Calibration. The grid size of the calibration board is 30 mm × 30 mm. The picture resolution of the VCC50I is fixed at 640 × 480. To ensure error balance and calibration accuracy, a group of calibrated images containing four images from various poses is collected at a distance interval of 10 cm; the calibration distance ranges from 1500 mm to 4000 mm. Considering the effects of the pitch and rotation angles as well as the zoom value, the camera state during calibration is fixed as given in Table 1. The internal parameters, which are crucial to distance measurement, are listed in Table 2.
5.2. Performance Evaluation of the Data Rectification Algorithm. Angle variations are acquired by a three-axis angle sensor, which acts as the input of the rectification algorithm.
Figure 6: Pioneer 3-DX robot equipped with a Canon VCC50I PTZ camera.
Table 3: Different conditions set for measurement (degrees).

ψ: 2.13, θ: 7.18, φ: 0.05
ψ: 4.89, θ: 4.84, φ: 2.81
ψ: 5.68, θ: 4.05, φ: 3.59
ψ: 2.93, θ: 6.42, φ: 2.77
ψ: 4.07, θ: 5.01, φ: 1.10
The module (MPU6050) has the advantages of low temperature dependency, high resolution, and low noise; because of these advantages, it is chosen to measure the Euler angles.

To validate the robustness of the algorithm, a target is set at different positions randomly. For each metric y/x, results under a set of camera poses are analyzed (the yaw, pitch, and roll angles are random for each metric, as set in Table 3). The initial readings of the angle sensor are θ (pitch) 9.65°, φ (yaw) −0.27°, and ψ (roll) −0.94°. The target position tuples are set as (1500, 100), (2000, 100), (2500, 100), and (3000, 100).
Figure 7(a) shows measurements from the initial camera state. Compared with the truth, it shows bias along both the horizontal and vertical directions.
The deviations of the VO results caused by the camera motions are rectified independently. Based on the analysis of the range model in Section 3, the pitch angle is an independent variable of the distance function; therefore, we recalculate the pitch angle instead of performing inversion operations on the pixel coordinates. Figures 7(b)–7(f) show results before and after rectification. In Figure 7(e), the distance error along the optical axis is almost as high as 50% using direct measurement; this value decreases to only 6% using the proposed algorithm. The least improvement in accuracy in this direction is 10%, as shown in Figure 7(c). Accordingly, a much more remarkable effect can be seen in the results along the direction perpendicular to the optical axis. In the worst case, shown in Figure 7(f), the measured distance along the X-axis is rectified from the measured −780 mm to the final 98 mm; the percentage gain in measuring precision approaches 87.8%. Even in a general situation this percentage can be close to 35%, as demonstrated in Figure 7(c). These figures also show that the range deviation becomes larger as the distance along the optical axis direction increases, mainly because the ratio of physical distance to pixel unit increases along the optical axis.
5.3. Data Assimilation Evaluation. For generality, the assimilation results under different metric sizes (i.e., manipulating y/x) with a fixed camera pose are demonstrated. Data from the sonar sensors are set as the background field value, and those from the camera are set as the observation field value. In Figure 8(a), measurement results at some positions are missing, which indicates that blind zones exist when only the sonar system is adopted. The range data in Figure 8(b) are the results of sensor assimilation. They demonstrate the accuracy improvement along both the X-axis and the Y-axis compared with measurements solely from the vision system or the sonar sensors. The assimilated results prove to be as much as 25% more accurate along the X-axis and 9% more accurate along the Y-axis compared to those acquired using a single type of sensor. This is mainly because new information is brought in to compensate the output from a single measuring system, namely, the wide FOV of the camera and the high measurement accuracy of the sonar sensors.
6. Conclusions
In this paper we have proposed an analytical measuring rectification algorithm for monocular range estimation under camera shaking conditions. Specifically, we have established a pose-range model, and then the algebraic relation between
Figure 7: Comparisons between results after and before rectification; a set of camera poses is used to validate the robustness of the designed algorithm. Each panel plots Y (mm) against X (mm) for the camera-measured, true, and rectified values. Panel poses: (a) θ = 9.65°, φ = −0.27°, ψ = −0.94° (initial); (b) θ = 7.18°, φ = 0.05°, ψ = 2.13°; (c) θ = 4.84°, φ = 2.81°, ψ = 4.89°; (d) θ = 4.05°, φ = 3.59°, ψ = 5.68°; (e) θ = 6.42°, φ = 2.77°, ψ = 2.93°; (f) θ = 5.01°, φ = 1.10°, ψ = 4.07°.
distance increment and a camera's pose variation has been formulated. We have also designed a data assimilation system to provide reliable range information using different types of transducer systems. Physical experiments were conducted to validate the effectiveness and robustness of the proposed algorithm. For future work, we will try to implement our algorithms on multiple-robot formations as well as swarm coordination applications.
Figure 8: Comparisons between results before and after assimilation. (a) Original data from the sonar and camera sensors. (b) Results of assimilation using the Optimal Interpolation Algorithm. Both panels plot distance along the Y-axis (mm) against distance along the X-axis (mm); camera results are represented as Δ, sonar results as ∗, the symbol ∘ stands for truth values, and the remaining markers are the assimilated results.
Competing Interests
The authors declare that they have no competing interests.
Acknowledgments
This research is financially supported by the Natural Science Foundation of China (Grants no. 61571334 and no. 2014AA09A512).
References

[1] M. Mikhalsky and J. Sitte, "Active motion vision for distance estimation in autonomous mobile robots," in Proceedings of the IEEE Conference on Robotics, Automation and Mechatronics, vol. 2, pp. 1060–1065, December 2004.
[2] N. Yamaguti, S. Oe, and K. Terada, "A method of distance measurement by using monocular camera," in Proceedings of the 36th SICE Annual Conference (SICE '97), International Session Papers, pp. 1255–1260, IEEE, Tokushima, Japan, July 1997.
[3] X. Liu, T. Qin, W. Chen, and X. Yin, "Real-time distance measurement using a modified camera," in Proceedings of the IEEE Sensors Applications Symposium (SAS '10), pp. 54–58, IEEE, Limerick, Ireland, February 2010.
[4] U. Mudenagudi and S. Chaudhuri, "Depth estimation using defocused stereo image pairs," in Proceedings of the 7th IEEE International Conference on Computer Vision (ICCV '99), vol. 1, pp. 483–488, IEEE, September 1999.
[5] A. N. Rajagopalan, S. Chaudhuri, and U. Mudenagudi, "Depth estimation and image restoration using defocused stereo pairs," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 26, no. 11, pp. 1521–1525, 2004.
[6] M. N. A. Wahab, N. Sivadev, and K. Sundaraj, "Target distance estimation using monocular vision system for mobile robot," in Proceedings of the 2nd IEEE International Conference on Open Systems (ICOS '11), pp. 11–15, IEEE, Langkawi, Malaysia, September 2011.
[7] K. Umeda and T. Takahashi, "Subpixel stereo method: a new methodology of stereo vision," in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA '00), vol. 4, pp. 3215–3220, San Francisco, Calif, USA, April 2000.
[8] T.-H. Wang, C.-C. Hsu, C.-C. Chen, C.-W. Huang, and Y.-C. Lu, "Three-dimensional measurement of a remote object with a single CCD camera," in Proceedings of the 4th International Conference on Autonomous Robots and Agents (ICARA '09), pp. 187–192, IEEE, Wellington, New Zealand, February 2009.
[9] K. Nakamura, K. Ishigaki, T. Ogata, and S. Muramatsu, "Real-time monocular ranging by Bayesian triangulation," in Proceedings of the IEEE Intelligent Vehicles Symposium (IV '13), pp. 1368–1373, IEEE, Gold Coast, Australia, June 2013.
[10] A. Tsalatsanis, K. Valavanis, and N. Tsourveloudis, "Mobile robot navigation using sonar and range measurements from uncalibrated cameras," Journal of Intelligent and Robotic Systems, vol. 48, no. 2, pp. 253–284, 2007.
[11] V. Tucakov, M. Sahota, D. Murray, et al., "A stereoscopic visually guided mobile robot," in Proceedings of the Hawaii International Conference on Systems Sciences, Maui, Hawaii, USA, January 1997.
[12] B. Fidan, V. Gazi, S. Zhai, N. Cen, and E. Karatas, "Single-view distance-estimation-based formation control of robotic swarms," IEEE Transactions on Industrial Electronics, vol. 60, no. 12, pp. 5781–5791, 2013.
[13] X. Li and L. Wang, "A monocular distance estimation method used in video sequence," in Proceedings of the IEEE International Conference on Information and Automation (ICIA '12), pp. 390–394, Shenyang, China, June 2012.
[14] J. Yu, J. Song, X. Sun, and T. Zhang, "Design and experiment of distance measuring system with single camera for picking robot," in Proceedings of the IEEE World Automation Congress (WAC '10), pp. 445–448, Kobe, Japan, September 2010.
[15] L. Guo, Y. Xu, K. Li, et al., "Study on real-time distance detection based on monocular vision technique," Journal of Image and Graphics, vol. 11, no. 1, pp. 74–81, 2006.
[16] S. Derrouich, K. Izumida, and K. Shiiya, "A combination of monocular CCD camera and inertial-sensor for range estimation," in Proceedings of the 28th Annual Conference of the IEEE Industrial Electronics Society (IECON '02), pp. 2191–2196, IEEE, November 2002.
[17] D. Wang, W. Chen, Z. Zhao, and T. C. Ng, "Inter-vehicle separation measurement using monocular vision," in Proceedings of the 7th IEEE Conference on Industrial Electronics and Applications (ICIEA '12), pp. 258–263, Singapore, July 2012.
[18] Y. X. Han, Z. S. Zhang, and M. Dai, "Monocular vision system for distance measurement based on feature points," Optics and Precision Engineering, vol. 19, no. 5, pp. 1110–1117, 2011.
[19] E. Malis and P. Rives, "Robustness of image-based visual servoing with respect to depth distribution errors," in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA '03), vol. 1, pp. 1056–1061, IEEE, Taipei, Taiwan, September 2003.
[20] R. F. Stengel, Optimal Control and Estimation, Dover, New York, NY, USA, 1994.
[21] J. Kim, Y. Kim, and S. Kim, "An accurate localization for mobile robot using extended Kalman filter and sensor fusion," in Proceedings of the International Joint Conference on Neural Networks (IJCNN '08), pp. 2928–2933, IEEE World Congress on Computational Intelligence, Hong Kong, June 2008.
Journal of Sensors 3
Figure 1: The imaging geometry of the observation model. (Shown: the world frame O-X_W Y_W Z_W, the ground point P(x_w, y_w), and its projections P′ and P′′ on the imaging plane.)
which obtains a second point P′′. Now, assuming that the pose measurement is reasonably accurate and that the position estimation algorithm works well, the problem is to estimate P utilizing measurements including P′, P′′, and the pose variation of the camera.

The initial and final poses of the camera are denoted by P_s = (r_s, p_s, y_s) and P_f = (r_f, p_f, y_f), respectively, where r_i, p_i, and y_i (i = s, f) stand for the initial roll, pitch, and yaw angles. Although the actual relative distance from the optical center to the target changes only slightly, the measured results deviate from the truth significantly. This is mainly because of the nonlinear mapping between pixel coordinates and the corresponding distance values. The problem is to correct the actual measurements to be close to the truth by eliminating the pose perturbation of the camera.
4. Data Rectification Algorithm
In this section we describe our approach to the problem of monocular vision-based measurement rectification. Since a robot's trajectory is most conveniently described in a world coordinate system, while the target on the ground is generally described in the camera coordinate system, we start with a review of these two coordinate systems. To model the problem in a general geometrodynamical architecture, the algebraic relation between the camera pose displacement and the displacement of the measured distance is derived.
4.1. World and Camera Coordinate Systems. Assume that XYZ and X′Y′Z′ are, respectively, a world coordinate system and the camera's coordinate system, as shown in Figure 2. The coordinates of a point P under these two coordinate systems are transformed by

$$\begin{bmatrix} X \\ Y \\ Z \end{bmatrix} = \mathbf{R} \begin{bmatrix} X' \\ Y' \\ Z' \end{bmatrix} + \mathbf{T}, \qquad (1)$$
where (X, Y, Z) and (X′, Y′, Z′) are the coordinates of point P in the world and camera coordinate systems. Moreover, R and T are, respectively, the rotation and translation from the camera to the world coordinate system, which determine the position and orientation of the camera in the world coordinate system.

Figure 2: The world and camera coordinate systems (axes X, Y, Z and X′, Y′, Z′, with a point P).

Furthermore, for a 3D point P in the FOV of the camera, its image coordinates are given by the projection equation as follows:

$$[u, v]^T = \left[\frac{X'}{Z'}, \frac{Y'}{Z'}\right]^T \cdot f, \qquad (2)$$

where (u, v) are the coordinates of P in the image coordinate system and f is the camera's focal length.
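As a concrete sketch of (1)-(2), the snippet below maps a world point into pixel coordinates; the identity pose, the sample point, and the focal length of 700 px are illustrative assumptions, not the paper's calibration.

```python
import numpy as np

# Sketch of the transform (1) and the projection (2); all values are
# illustrative assumptions, not the paper's calibration.
def world_to_image(P_world, R, T, f):
    """Project a world point to image coordinates (u, v).

    (1) states P_world = R @ P_cam + T, so with orthonormal R the
    camera coordinates are P_cam = R.T @ (P_world - T).
    (2) then gives [u, v] = f * [X'/Z', Y'/Z'].
    """
    P_cam = R.T @ (np.asarray(P_world, dtype=float) - T)
    Xc, Yc, Zc = P_cam
    return f * Xc / Zc, f * Yc / Zc

# Identity pose: camera frame coincides with the world frame.
u, v = world_to_image([0.5, 0.25, 2.0], np.eye(3), np.zeros(3), f=700.0)
# u = 700 * 0.5 / 2.0 = 175.0, v = 700 * 0.25 / 2.0 = 87.5
```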
4.2. Chebyshev Best Uniform Approximation Rectification Algorithm. The distance-orientation information between targets and the camera can be derived from the corresponding pixel coordinates in the image [20, 21]. It is found that the ratio of image pixel motion to the camera rotation angles varies nonlinearly along the main optical axis. The main idea of the designed algorithm is to piecewise-linearize the nonlinear rate and then calculate the rate of change with respect to the rotation angles as well as the measured distance. Equation (3) presents the rotation matrix in 3D space:

$$\mathbf{R} = \begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos\theta & \sin\theta \\ 0 & -\sin\theta & \cos\theta \end{bmatrix} \cdot \begin{bmatrix} \cos\psi & 0 & -\sin\psi \\ 0 & 1 & 0 \\ \sin\psi & 0 & \cos\psi \end{bmatrix} \cdot \begin{bmatrix} \cos\varphi & \sin\varphi & 0 \\ -\sin\varphi & \cos\varphi & 0 \\ 0 & 0 & 1 \end{bmatrix}. \qquad (3)$$
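The product in (3) can be assembled directly; the sketch below builds R with the paper's sign conventions and checks that the result is a proper rotation (the sample angles are arbitrary assumptions).

```python
import numpy as np

def rotation_matrix(theta, psi, phi):
    """R of (3): product of the theta (about X), psi (about Y), and
    phi (about Z) factors, with the signs used in the paper."""
    Rx = np.array([[1.0, 0.0, 0.0],
                   [0.0, np.cos(theta), np.sin(theta)],
                   [0.0, -np.sin(theta), np.cos(theta)]])
    Ry = np.array([[np.cos(psi), 0.0, -np.sin(psi)],
                   [0.0, 1.0, 0.0],
                   [np.sin(psi), 0.0, np.cos(psi)]])
    Rz = np.array([[np.cos(phi), np.sin(phi), 0.0],
                   [-np.sin(phi), np.cos(phi), 0.0],
                   [0.0, 0.0, 1.0]])
    return Rx @ Ry @ Rz

R = rotation_matrix(0.10, 0.02, 0.05)   # arbitrary small angles (rad)
# A product of rotations is orthonormal with determinant 1.
```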
Variations of the pixel coordinates are associated with the world coordinates by a rotation matrix whose parameters are the attitude angles of the camera, as described by

$$\Delta u_{k+1} = F_x \cdot \left( \frac{R^{k+1}_{11} x_w + R^{k+1}_{12} y_w}{R^{k+1}_{31} x_w + R^{k+1}_{32} y_w} - \frac{R^{k}_{11} x_w + R^{k}_{12} y_w}{R^{k}_{31} x_w + R^{k}_{32} y_w} \right),$$

$$\Delta v_{k+1} = F_y \cdot \left( \frac{R^{k+1}_{21} x_w + R^{k+1}_{22} y_w}{R^{k+1}_{31} x_w + R^{k+1}_{32} y_w} - \frac{R^{k}_{21} x_w + R^{k}_{22} y_w}{R^{k}_{31} x_w + R^{k}_{32} y_w} \right), \qquad (4)$$

where the inner parameters F_x and F_y are determined only by the CCD structure itself.
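Equation (4) can be evaluated directly from two rotation matrices. The sketch below (the helper name `pixel_increment` and the intrinsic values are placeholders, not the paper's code) checks the obvious sanity case that identical poses produce zero pixel increment.

```python
import numpy as np

def pixel_increment(Rk, Rk1, xw, yw, Fx, Fy):
    """Delta u_{k+1}, Delta v_{k+1} of (4) between poses k and k+1."""
    def proj(R, row):
        return (R[row, 0] * xw + R[row, 1] * yw) / (R[2, 0] * xw + R[2, 1] * yw)
    du = Fx * (proj(Rk1, 0) - proj(Rk, 0))
    dv = Fy * (proj(Rk1, 1) - proj(Rk, 1))
    return du, dv

# A pure rotation about X keeps the denominator of (4) nonzero for yw != 0.
th = 0.1
Rx = np.array([[1.0, 0.0, 0.0],
               [0.0, np.cos(th), np.sin(th)],
               [0.0, -np.sin(th), np.cos(th)]])
du, dv = pixel_increment(Rx, Rx, xw=0.1, yw=1.0, Fx=730.0, Fy=740.0)
# identical poses -> no pixel shift
```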
For the convenience of discussion, we assume that the camera pose changes mainly along the yaw angle direction. Denote $\Psi(\theta, \psi, \varphi) = (R_{11} x_w + R_{12} y_w)/(R_{31} x_w + R_{32} y_w)$; using (3) and (4) we obtain

$$\Psi(\theta, \psi, \varphi) = \cos\partial + \sin\partial \cot\beta, \qquad (5)$$

where

$$\partial = \arctan\left(\frac{x_w}{y_w}\right) + \arctan\left(\frac{y_w}{x_w}\right), \qquad \beta = \varphi - \arctan\left(\frac{y_w}{x_w}\right). \qquad (6)$$

Substituting (3) in (5) results in

$$\Psi(\theta, \psi, \varphi) = \cot\left(\varphi - \arctan\left(\frac{y_w}{x_w}\right)\right). \qquad (7)$$
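The collapse of (5) into (7) rests on the identity arctan(x/y) + arctan(y/x) = π/2 for positive arguments, which makes the cos∂ term vanish; a quick numerical check (arbitrary positive x_w, y_w assumed):

```python
import math

# (5)-(7): for positive xw, yw, d = arctan(xw/yw) + arctan(yw/xw) = pi/2,
# so cos(d) + sin(d)*cot(beta) reduces to cot(beta), with
# beta = phi - arctan(yw/xw). Sample values are arbitrary.
phi, xw, yw = 0.4, 1.0, 3.0
d = math.atan(xw / yw) + math.atan(yw / xw)
beta = phi - math.atan(yw / xw)
psi_eq5 = math.cos(d) + math.sin(d) / math.tan(beta)   # right side of (5)
psi_eq7 = 1.0 / math.tan(beta)                         # right side of (7)
```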
From Figure 3 we can see that the slope of the curves tends to be constant within a sliding interval of the independent variable. This interval becomes smaller when the ratio of x_w to y_w increases. The Chebyshev approximation method has the characteristic of uniform approximation on a selected closed interval. Inspired by this, the nonlinear rate can be approximated by a linear polynomial, and the deviation caused by pose changes of the camera can be effectively compensated. The second derivative is
$$\Psi''(\varphi) = 2\cot\left(\varphi - \arctan\left(\frac{y_w}{x_w}\right)\right) \cdot \left(\cot^2\left(\varphi - \arctan\left(\frac{y_w}{x_w}\right)\right) + 1\right). \qquad (8)$$
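A finite-difference sanity check of (8) (the interval point and the ratio y_w/x_w below are assumed values):

```python
import math

# Psi(phi) = cot(phi - c) with c = arctan(yw/xw); (8) claims
# Psi''(phi) = 2*cot(phi - c)*(cot(phi - c)**2 + 1).
c = math.atan(0.3)                      # yw/xw = 0.3 (assumed)
f = lambda p: 1.0 / math.tan(p - c)
phi, h = 0.6, 1e-4
numeric = (f(phi + h) - 2.0 * f(phi) + f(phi - h)) / h**2   # central difference
cot = 1.0 / math.tan(phi - c)
analytic = 2.0 * cot * (cot**2 + 1.0)                        # eq (8)
```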
Considering the yaw angle variations of a PTZ (Pan/Tilt/Zoom) camera caused by uneven pavement during practical robot motion, a closed subinterval [a, b] ⊆ [0, π/12] is chosen for further deduction. Since (8) is a continuous function and does not change sign on this subinterval, the best uniform approximation method can be used.
Denote $a^*_1 = (\Psi(b) - \Psi(a))/(b - a)$; using this, the normal equation of the approximation is

$$\Psi'(\varphi) = a^*_1. \qquad (9)$$

Let $x_\Delta$ be the solution of (9). Then the approximation polynomial is written as follows:

$$p^*(x) = \frac{\Psi(a) + \Psi(x_\Delta)}{2} + a^*_1 \cdot \left(x - \frac{a + x_\Delta}{2}\right). \qquad (10)$$

We explore the slope of the line after linear approximation to study its behavior for different ratios of x_w to y_w. Results show that the slope converges to its limit uniformly; moreover, this limit is independent of the ratio above:

$$\kappa = \frac{\Psi(b) - \Psi(a)}{b - a}. \qquad (11)$$
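The construction in (9)-(11) is the classical Chebyshev best linear (uniform) approximation: take the secant slope a*₁, solve Ψ′(x) = a*₁ for x_Δ, and center the line by (10). A sketch (the interval and y_w/x_w are assumed values) that also verifies the equal-ripple property of the resulting error:

```python
import math

# Best uniform linear approximation of Psi(phi) = cot(phi - c) on [a, b],
# following (9)-(10); c, a, b are assumed values with (a - c) in (0, pi/2).
c = math.atan(0.1)                          # arctan(yw/xw)
f = lambda x: 1.0 / math.tan(x - c)
a, b = 0.15, 0.25

a1 = (f(b) - f(a)) / (b - a)                # secant slope, cf. (11)
x_delta = c + math.atan(1.0 / math.sqrt(-a1 - 1.0))  # solves Psi'(x) = a1, eq (9)
p = lambda x: (f(a) + f(x_delta)) / 2.0 + a1 * (x - (a + x_delta) / 2.0)  # eq (10)

# Chebyshev equioscillation: the error attains equal magnitude with
# alternating sign at a, x_delta, and b.
err = lambda x: f(x) - p(x)
```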
Figure 3: Pixel coordinates along the U-axis vary with the yaw angle (rad); a set of y_w/x_w values is considered (X/Y = 0.5, 0.2941, 0.2, 0.1, and 1e−6). (x_w, y_w) is the coordinate of P in the world coordinate system.
Substituting (7) in (11) results in

$$\kappa = \frac{1}{b - a} \cdot \frac{1 + (y/x)^2}{(\tan(a) - y/x)(\tan(b) - y/x)} \cdot (\tan(a) - \tan(b)). \qquad (12)$$

To demonstrate the convergence of κ, we have also analyzed the limit value, given by

$$\lim_{y/x \to \infty} \kappa = \frac{\tan(a) - \tan(b)}{b - a}. \qquad (13)$$
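The limit (13) can be checked numerically from (12); the interval [a, b] below is an assumption (with these values the limit happens to be ≈ −1.03, the same order as the plateau seen in Figure 5, though the paper's own [a, b] is not stated).

```python
import math

# kappa of (12) as a function of m = y/x, and its limit (13);
# the interval [a, b] = [0.1, 0.25] is an assumption.
a, b = 0.1, 0.25

def kappa(m):
    num = (1.0 + m * m) * (math.tan(a) - math.tan(b))
    den = (b - a) * (math.tan(a) - m) * (math.tan(b) - m)
    return num / den

limit = (math.tan(a) - math.tan(b)) / (b - a)   # eq (13)
```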
The solid curves in Figure 4 are the results of the actual slope and its linear approximation, respectively. These two curves coincide with each other well after a translation operation, which indicates a high slope accuracy using linear approximation. Figure 5 manifests the convergence of the slope with respect to the metric y_w/x_w, which is in good agreement with the experimental results. Another important property that should be noted is that the function value rapidly reaches convergence after a dramatic increase; that is, the measured range varies with the metric y_w/x_w with high nonlinearity. This also implies that the measurement should be conducted on the smooth interval to reduce the errors caused by camera shaking. On the other hand, it is impossible to compensate the deviations when the metric is too small.
4.3. Sonar and Camera Data Assimilation Model. The Optimal Interpolation (OI) Algorithm is derived to generate the least-squares result for vectors of observations and background fields, assuming a priori known statistical models for the background error covariance. The Optimal Interpolation technique, based on the minimization of the variance of the estimate, plays an important role in data assimilation. It uses several different real-world observations to produce a corrected output which is closer to the truth.
Figure 4: Pixel values vary with the yaw angle (Y/X = 300); our approach can successfully approximate the actual changes (curves: the function and its linear approximation).
Figure 5: Slope converges to a stable value as Y/X increases (data cursor: X = 34, Y = −1.033).
The motivation of the proposed method comes from the similar characteristics and phenomena shared by a camera-and-sonar measurement system and an OI algorithm. First, a camera-and-sonar system can be considered an OI system that produces an optimal output through several groups of observations. Second, the OI algorithm has dimension-extensible and loose a priori characteristics that are attractive for a camera-and-sonar measurement system.
The following are given:

(i) a background field x_b, available in two or three dimensions;
(ii) a set of p observations x_obs, available at irregular positions.

The optimal estimation is described by

$$x_a = x_b + W(x_{\text{obs}} - x_b). \qquad (14)$$
Table 1: Initial state of camera before calibration.

Rotation angle   Pitch angle   Zoom
0                −15°          0

Table 2: Calibration results.

f_x      f_y      U_0      V_0
731.88   738.65   339.28   249.7
The errors are given by

$$\varepsilon_a = x_a - x_t, \qquad \varepsilon_b = x_b - x_t, \qquad \varepsilon_{\text{obs}} = x_{\text{obs}} - x_t. \qquad (15)$$

The optimal weight is then as follows:

$$W = \frac{(\sigma_b)^2}{(\sigma_{\text{obs}})^2 + (\sigma_b)^2}, \qquad (16)$$

where σ_obs and σ_b represent the standard deviations of ε_obs and ε_b. As the data from camera and sonar are unrelated, it is assumed that E(ε_b ε_obs) = 0.
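In scalar form, (14)-(16) amount to the familiar inverse-variance fusion; a minimal sketch with assumed error standard deviations (in mm):

```python
def oi_fuse(x_b, x_obs, sigma_b, sigma_obs):
    """Optimal-interpolation update of (14) with the scalar weight (16)."""
    W = sigma_b**2 / (sigma_obs**2 + sigma_b**2)
    return x_b + W * (x_obs - x_b)

# Sonar as the background (lower error), camera as the observation;
# the readings and standard deviations are assumed values.
x_a = oi_fuse(x_b=2000.0, x_obs=2100.0, sigma_b=10.0, sigma_obs=30.0)
# W = 100 / (900 + 100) = 0.1, so x_a is approximately 2010.0
```

The fused estimate stays between the two inputs and closer to the lower-variance source, which is exactly the behavior reported for the assimilated results in Section 5.3.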
5. Evaluations and Analysis

In this section we present the results of a set of physical experiments to demonstrate the performance of the algorithm proposed in Section 4. To validate the effectiveness of the proposed data rectification algorithm, we compare the results before and after a pose change with the truth. Moreover, we have conducted a set of experiments under different initial poses of the camera to verify the robustness of this method. Besides, comparative experiments have been designed to show the validity of the data assimilation approach.

Autonomous vehicles can be modeled as mobile robots, so we use the mobile robot Pioneer 3-DX (Figure 6) mounted with a camera for the experiments. To prepare for the experiments, the PTZ camera is first calibrated.

5.1. Camera Calibration. The grid size of the calibration board in the experiments is 30 mm × 30 mm. The picture resolution of the VCC50I is fixed at 640 × 480. To ensure error balance and calibration accuracy, a group of calibration images containing four images from various poses is collected at a distance interval of 10 cm. The calibration distance ranges from 1500 mm to 4000 mm. Considering the effects of the pitch and rotation angles as well as the zoom value, the camera state during calibration is fixed as given in Table 1. The internal parameters, which are crucial to distance measurement, are listed in Table 2.
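For intuition on how the calibrated intrinsics enter a monocular range estimate, here is one common pinhole ground-plane model. This is a sketch only: the paper's own range model is the one derived in Section 3, and the camera height, the 15° tilt (taken from the Table 1 pitch magnitude), and the decimal placement of the Table 2 values are all assumptions.

```python
import math

def ground_range(v, h=300.0, pitch=math.radians(15.0), fy=738.65, v0=249.7):
    """Range (mm) to a ground point imaged at pixel row v, for a camera at
    height h tilted down by `pitch`; fy, v0 follow Table 2 (decimals assumed)."""
    ray = math.atan((v - v0) / fy)      # ray angle below the optical axis
    return h / math.tan(pitch + ray)

# Rows farther below the principal point map to nearer ground points:
d_near, d_far = ground_range(400.0), ground_range(300.0)
```

The nonlinear tan(·) mapping here is the same kind of pixel-to-distance nonlinearity that motivates the rectification algorithm of Section 4.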
5.2. Performance Evaluation of the Data Rectification Algorithm. Angle variations are acquired by a three-axis angle sensor and act as the input of the rectification algorithm.
Figure 6: Pioneer 3-DX robot equipped with a Canon VCC50I PTZ camera.
Table 3: Different conditions set for measurement (degrees).

ψ      θ      φ
2.13   7.18   0.05
4.89   4.84   2.81
5.68   4.05   3.59
2.93   6.42   2.77
4.07   5.01   1.10
The module (MPU6050) has the advantages of low temperature dependency, high resolution, and low noise. Because of these advantages, this module is chosen as the tool to measure the Euler angles.

To validate the robustness of the algorithm, a target is set at different positions randomly. For each metric y/x, results under a set of camera poses are analyzed (the yaw, pitch, and roll angles are random for each metric, as set in Table 3). The initial readings of the angle sensor are as follows: θ (pitch) 9.65°, φ (yaw) −0.27°, and ψ (roll) −0.94°. The target position tuples are set as (1500, 100), (2000, 100), (2500, 100), and (3000, 100).
Figure 7(a) shows measurements from the initial camera state. Compared with the truth, it shows bias along both the horizontal and vertical directions.

The deviations of the VO results caused by the camera motions are rectified independently. Based on the analysis of the range model in Section 3, the pitch angle is an independent variable of the distance function. Therefore, we recalculate the pitch angle instead of performing inversion operations on the pixel coordinates. Figures 7(b)–7(f) show the results before and after rectification. In Figure 7(e), the distance error along the optical axis is almost as high as 50% using direct measurement; however, this value decreases to only 6% using the proposed algorithm. We can also see that the least improvement in accuracy in this direction is 10%, as shown in Figure 7(c). Accordingly, a much more remarkable effect can be seen from the results along the direction perpendicular to the optical axis. In the worst case, as shown in Figure 7(f), the measured distance along the X-axis is rectified from the measured −780 mm to the final 98 mm; the percentage gain of measuring precision approaches 87.8%. Even in a general situation this percentage can be close to 35%, as demonstrated in Figure 7(c). These figures also show that the range deviation becomes larger as the distance along the optical-axis direction increases. This is mainly because the ratio of physical distance to pixel unit increases along the optical axis.
5.3. Data Assimilation Evaluation. For generality, the assimilation results under different metric sizes (i.e., manipulating y/x) with a fixed camera pose are demonstrated. Data from the sonar sensors are set as the background field value, and those from the camera are set as the observation field value. In Figure 8(a), measurement results at some positions are missing, which indicates that blind zones exist when the sonar system alone is adopted. The range data in Figure 8(b) are the results of sensor assimilation. They demonstrate the accuracy improvement along both the X-axis and the Y-axis compared with measurements solely from the vision system or the sonar sensors. The assimilated results prove to be as much as 25% more accurate along the X-axis and 9% more accurate along the Y-axis compared with those acquired using a single type of sensor. This is mainly because new information is brought in to compensate the output of a single measuring system, that is, the wide FOV of the camera and the high measurement accuracy of the sonar sensors.
Figure 7: Comparisons between results after and before rectification; a set of camera poses is used to validate the robustness of the designed algorithm. Panel (a) shows the measurements from the initial camera state, θ = 9.65°, φ = −0.27°, ψ = −0.94°; panels (b)–(f) plot the camera measured value, the true value, and the rectified value over X (mm) and Y (mm): (b) θ = 7.18°, φ = 0.05°, ψ = 2.13°; (c) θ = 4.84°, φ = 2.81°, ψ = 4.89°; (d) θ = 4.05°, φ = 3.59°, ψ = 5.68°; (e) θ = 6.42°, φ = 2.77°, ψ = 2.93°; (f) θ = 5.01°, φ = 1.10°, ψ = 4.07°.

6. Conclusions

In this paper we have proposed an analytical measuring rectification algorithm for monocular range estimation under camera shaking conditions. Specifically, we have established a pose-range model, and the algebraic relation between the distance increment and the camera's pose variation has been formulated. We have also designed a data assimilation system to provide reliable range information using different types of transducer systems. Physical experiments were conducted to validate the effectiveness and robustness of the proposed algorithm. For future work, we will try to implement our algorithms on multiple-robot formations as well as swarm coordination applications.
Figure 8: Comparisons between results before and after assimilation. (a) Original data from the sonar and camera sensors; (b) results of assimilation using the Optimal Interpolation Algorithm. (Camera results are represented as Δ and sonar results as ∗; the symbol ∘ stands for the truth values, and the remaining markers are the assimilated results. Axes: distance along the X-axis (mm) versus distance along the Y-axis (mm).)
Competing Interests
The authors declare that they have no competing interests
Acknowledgments
This research is financially supported by the Natural Science Foundation of China (Grant no. 61571334) and the Natural Science Foundation of China (Grant no. 2014AA09A512).
References

[1] M. Mikhalsky and J. Sitte, "Active motion vision for distance estimation in autonomous mobile robots," in Proceedings of the IEEE Conference on Robotics, Automation and Mechatronics, vol. 2, pp. 1060–1065, December 2004.
[2] N. Yamaguti, S. Oe, and K. Terada, "A method of distance measurement by using monocular camera," in Proceedings of the 36th SICE Annual Conference (SICE '97), International Session Papers, pp. 1255–1260, IEEE, Tokushima, Japan, July 1997.
[3] X. Liu, T. Qin, W. Chen, and X. Yin, "Real-time distance measurement using a modified camera," in Proceedings of the IEEE Sensors Applications Symposium (SAS '10), pp. 54–58, IEEE, Limerick, Ireland, February 2010.
[4] U. Mudenagudi and S. Chaudhuri, "Depth estimation using defocused stereo image pairs," in Proceedings of the 7th IEEE International Conference on Computer Vision (ICCV '99), vol. 1, pp. 483–488, IEEE, September 1999.
[5] A. N. Rajagopalan, S. Chaudhuri, and U. Mudenagudi, "Depth estimation and image restoration using defocused stereo pairs," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 26, no. 11, pp. 1521–1525, 2004.
[6] M. N. A. Wahab, N. Sivadev, and K. Sundaraj, "Target distance estimation using monocular vision system for mobile robot," in Proceedings of the 2nd IEEE International Conference on Open Systems (ICOS '11), pp. 11–15, IEEE, Langkawi, Malaysia, September 2011.
[7] K. Umeda and T. Takahashi, "Subpixel stereo method: a new methodology of stereo vision," in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA '00), vol. 4, pp. 3215–3220, San Francisco, Calif, USA, April 2000.
[8] T.-H. Wang, C.-C. Hsu, C.-C. Chen, C.-W. Huang, and Y.-C. Lu, "Three-dimensional measurement of a remote object with a single CCD camera," in Proceedings of the 4th International Conference on Autonomous Robots and Agents (ICARA '09), pp. 187–192, IEEE, Wellington, New Zealand, February 2009.
[9] K. Nakamura, K. Ishigaki, T. Ogata, and S. Muramatsu, "Real-time monocular ranging by Bayesian triangulation," in Proceedings of the IEEE Intelligent Vehicles Symposium (IV '13), pp. 1368–1373, IEEE, Gold Coast, Australia, June 2013.
[10] A. Tsalatsanis, K. Valavanis, and N. Tsourveloudis, "Mobile robot navigation using sonar and range measurements from uncalibrated cameras," Journal of Intelligent and Robotic Systems, vol. 48, no. 2, pp. 253–284, 2007.
[11] V. Tucakov, M. Sahota, D. Murray, et al., "A stereoscopic visually guided mobile robot," in Proceedings of the Hawaii International Conference on Systems Sciences, Maui, Hawaii, USA, January 1997.
[12] B. Fidan, V. Gazi, S. Zhai, N. Cen, and E. Karatas, "Single-view distance-estimation-based formation control of robotic swarms," IEEE Transactions on Industrial Electronics, vol. 60, no. 12, pp. 5781–5791, 2013.
[13] X. Li and L. Wang, "A monocular distance estimation method used in video sequence," in Proceedings of the IEEE International Conference on Information and Automation (ICIA '12), pp. 390–394, Shenyang, China, June 2012.
[14] J. Yu, J. Song, X. Sun, and T. Zhang, "Design and experiment of distance measuring system with single camera for picking robot," in Proceedings of the IEEE World Automation Congress (WAC '10), pp. 445–448, Kobe, Japan, September 2010.
[15] L. Guo, Y. Xu, K. Li, et al., "Study on real-time distance detection based on monocular vision technique," Journal of Image and Graphics, vol. 11, no. 1, pp. 74–81, 2006.
[16] S. Derrouich, K. Izumida, and K. Shiiya, "A combination of monocular CCD camera and inertial-sensor for range estimation," in Proceedings of the 28th Annual Conference of the IEEE Industrial Electronics Society (IECON '02), pp. 2191–2196, IEEE, November 2002.
[17] D. Wang, W. Chen, Z. Zhao, and T. C. Ng, "Inter-vehicle separation measurement using monocular vision," in Proceedings of the 7th IEEE Conference on Industrial Electronics and Applications (ICIEA '12), pp. 258–263, Singapore, July 2012.
[18] Y. X. Han, Z. S. Zhang, and M. Dai, "Monocular vision system for distance measurement based on feature points," Optics and Precision Engineering, vol. 19, no. 5, pp. 1110–1117, 2011.
[19] E. Malis and P. Rives, "Robustness of image-based visual servoing with respect to depth distribution errors," in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA '03), vol. 1, pp. 1056–1061, IEEE, Taipei, Taiwan, September 2003.
[20] R. F. Stengel, Optimal Control and Estimation, Dover, New York, NY, USA, 1994.
[21] J. Kim, Y. Kim, and S. Kim, "An accurate localization for mobile robot using extended Kalman filter and sensor fusion," in Proceedings of the International Joint Conference on Neural Networks (IJCNN '08), pp. 2928–2933, IEEE World Congress on Computational Intelligence, Hong Kong, June 2008.
4 Journal of Sensors
For the convenience of discussion we assume that thecamera poses change mainly along the yaw angle direction
Denote Ψ(120579 120595 120593) = (11987711119909119908+ 11987712119910119908)(11987731119909119908+ 11987732119910119908)
using (3) and (4) we obtain
Ψ (120579 120595 120593) = cos 120597 + sin 120597 cot120573 (5)
where
120597 = arctan(119909119908
119910119908
) + arctan(119910119908
119909119908
)
120573 = 120593 minus arctan(119910119908
119909119908
)
(6)
Substituting (3) in (5) results in
Ψ (120579 120595 120593) = cot(120593 minus arctan(119910119908
119909119908
)) (7)
From Figure 3 we can see that the slope of curvestends to be constant within a sliding interval of independentvariable This interval becomes smaller when the ratio of119909119908to 119910119908increases The Chebyshev approximation method
has the characteristics of uniform approximation on selectedclosed-interval Inspired by this the nonlinear rate can beapproximated by linear polynomial and the deviation causedby poses change of a camera can be effectively compensatedThe second derivative is taken as
Ψ10158401015840(120593) = 2 cot(120593 minus arctan(
119910119908
119909119908
))
sdot (cot(120593 minus arctan(119910119908
119909119908
))
2
+ 1)
(8)
Considering the yaw angle variations of a PTZ (PanTiltZoom) camera caused by uneven pavement during practicalrobot motion a closed subinterval [119886 119887] ([119886 119887] sube [0 sim 12058712])is chosen for further deduction Since (8) is a continuousfunction and keeps consistency in sign the best consistentapproximation method can be used
Denote 119886lowast1= (Ψ(119887) minus Ψ(119886))(119887 minus 119886) using this to acquire
normal equation of approximation
Ψ1015840(120593) = 119886
lowast
1 (9)
Set the solution of (9) as 119909Δ Then the approximation
equation is written as follows
119901lowast
(119909) =119891 (119886) + 119891 (119909
Δ)
2+ 119886lowast
1sdot (119909 minus
119886 + 119909Δ
2) (10)
We explore the slope of line after linear approximation tostudy the function of different ratios of119909
119908to119910119908 Results show
that the slope converges to its limit uniformly Moreover thisconstant value is irrelevant to the ratio above
120581 =Ψ (119887) minus Ψ (119886)
119887 minus 119886 (11)
Yaw angle (rad)
Pixe
l (de
g)
XY = 05
XY = 02941
XY = 02
XY = 01
XY = 1e minus 6
2
18
16
14
12
1
08
06
04
02
00 01 02 03 04 05 06 07
Figure 3 Pixel coordinates along 119880-axis vary with the yaw anglesa set of 119910
119908119909119908values are considered (119909
119908 119910119908) is the coordinate of P
in the world coordinate system
Substituting (7) in (11) results in
120581 =1
119887 minus 119886sdot
1 + (119910119909)2
(tan (119887) minus 119910119909) (tan (119887) minus 119910119909)
sdot (tan (119886) minus tan (119887))
(12)
To demonstrate the convergence of 120581 we have also analyzedthe limit value given by
lim119910119909rarrinfin
119896 =tan (119886) minus tan (119887)
119887 minus 119886 (13)
Solid curves in Figure 4 are the results of actual slope andlinear approximation respectivelyThese two curves coincidewith each other well after a translation operation Thisindicates a high accuracy in slope using linear approximationFigure 5manifests the convergence of slope related to ametricof 119910119908119909119908 which is in good agreement with experimental
results Another important property that should be notedis that the function value rapidly reaches convergence aftera dramatic increase that is the measured range wouldvary with the metric 119910
119908119909119908
with high nonlinearity Thisalso implies that the measurement should be conducted onthe smooth interval to reduce the errors caused by camerashakings On the other hand it is impossible to compensatethe deviations when the metric is too small
43 Sonar and Camera Data Assimilation Model The Opti-mal Interpolation Algorithm is derived to generate the leastsquares results for vectors of observations and backgroundfields assuming ldquoa priorirdquo known statistical models for thebackground error covariance The Optimal InterpolationTechnique based on the minimization of variance esti-mation plays an important role in data assimilation Ituses several different real-world observations to produce acorrected output which is closer to the truth
Journal of Sensors 5
FunctionLinear proximity
YX = 300
minus05
minus1
minus15
minus2
minus25
minus3
minus350 01 02 03 04 05 06 07
Figure 4 Pixel values vary with the yaw angle our approach cansuccessfully approximate the actual changes
Slop
e
YX
minus1
minus105
minus11
minus115
minus12
minus125
minus13
minus135
minus14
minus145
minus1510008006004002000
Y minus1033
X 34
Figure 5 Slope converges to a stable value
The motivation of the proposed method comes from thesimilar characteristic and phenomena between a camera andsonar measurement system and an OI algorithm First acamera and sonar system can be considered to be an OIsystem that produces an optimal output through severalgroups of observations Second the OI algorithm has thedimension-extensible and loose a priori characteristics thatare attractive for camera and sonar measurement system
The following are given
(i) A background field 119909119887available in two or three
dimensions(ii) A set of 119901 observations 119909obs available at irregular
positions
The optimal estimation is described by
119909119886= 119909119887+119882(119909obs minus 119909119887) (14)
Table 1 Initial state of camera before calibration
Rotation angle Pitch angle Zoom0 minus15∘ 0
Table 2 Calibration results
119891119909
119891119910
1198800
1198810
73188 73865 33928 2497
The errors are given by
120576119886= 119909119886minus 119909119905
120576119887= 119909119887minus 119909119905
120576obs= 119909obs minus 119909119905
(15)
The optimal weight is then as follows
119882 =(120590119887)2
(120590obs)2
+ (120590119887)2 (16)
where 120590119886 and 120590119887 represent the mean value of 120576119886 and 120576119887 Asdata from camera and sonar are unrelated it is assumed that119864(120576119887120576obs) = 0
5 Evaluations and Analysis
In this section we present the results of a set of physicalexperiments to demonstrate the performance of the pro-posed algorithm in Section 4 To validate the effectivenessof the proposed data rectification algorithm we comparedthe results before and after a pose change with the truthMoreover we have conducted a set of experiments underdifferent initial poses of a camera to testify the robust-ness of this method Besides comparative experiments havebeen designed to show the validity of the data assimilationapproach
Autonomous vehicles can be modeled as mobile robots; we therefore use the mobile robot Pioneer 3-DX (Figure 6), mounted with a camera, for the experiments. To prepare for the experiments, the PTZ camera is first calibrated.
5.1. Camera Calibration. The grid size of the calibration board used in the experiments is 30 mm × 30 mm. The picture resolution of the VCC50I is fixed at 640 × 480. To ensure error balance and calibration accuracy, a group of calibration images containing four images from various poses is collected at a distance interval of 10 cm; the calibration distance ranges from 1500 mm to 4000 mm. Considering the effects of the pitch and rotation angles as well as the zoom value, the camera state during calibration is fixed as given in Table 1. The internal parameters, which are crucial to distance measurement, are listed in Table 2.
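As a sketch of how such intrinsics are used for ranging, the pinhole model maps a pixel (u, v) to a normalized viewing ray via the intrinsic matrix K. The numbers below assume the calibrated values are f_x ≈ 731.88, f_y ≈ 738.65 pixels with principal point (339.28, 249.7) — a reading of Table 2 under the 640 × 480 resolution, not an authoritative restatement of it:

```python
import numpy as np

# Intrinsic matrix assembled from the (assumed) Table 2 values, in pixels.
K = np.array([[731.88,   0.0, 339.28],
              [  0.0, 738.65, 249.70],
              [  0.0,   0.0,   1.0 ]])

def pixel_to_ray(u, v):
    """Back-project pixel (u, v) to the unit viewing ray in camera coordinates."""
    ray = np.linalg.solve(K, np.array([u, v, 1.0]))  # K^{-1} [u, v, 1]^T
    return ray / np.linalg.norm(ray)

# The ray through the principal point coincides with the optical axis (0, 0, 1).
ray = pixel_to_ray(339.28, 249.70)
```

Any metric range estimate then reduces to intersecting this ray with a known constraint (e.g., the ground plane), which is where the pose of the camera enters.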
5.2. Performance Evaluation of the Data Rectification Algorithm. Angle variations are acquired by a three-axis angle sensor, and they act as the input to the rectification algorithm.
6 Journal of Sensors
Figure 6: Pioneer 3-DX robot equipped with a Canon VCC50I PTZ camera.
Table 3: Different conditions set for measurement (angles in degrees).

ψ | θ | φ
2.13 | 7.18 | 0.05
4.89 | 4.84 | 2.81
5.68 | 4.05 | 3.59
2.93 | 6.42 | 2.77
4.07 | 5.01 | 1.10
The module (MPU6050) has the advantages of low temperature dependency, high resolution, and low noise; for these reasons it is chosen as the tool to measure the Euler angles.
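The roll, pitch, and yaw angles read from the sensor can be assembled into a camera rotation matrix. The sketch below uses the common Z-Y-X (yaw-pitch-roll) convention, which is an assumption here — the paper's pose-range model may fix a different axis order:

```python
import math

def euler_to_R(roll, pitch, yaw):
    """Rotation matrix for Z-Y-X (yaw * pitch * roll) Euler angles, in radians."""
    cr, sr = math.cos(roll), math.sin(roll)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cy, sy = math.cos(yaw), math.sin(yaw)
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp,     cp * sr,                cp * cr               ],
    ]

# Zero angles give the identity rotation; a shake is then the relative
# rotation R_after * R_before^T between two sensor readings.
R = euler_to_R(0.0, 0.0, 0.0)
```

This relative rotation between the pre-shake and post-shake readings is what drives the pixel-coordinate increment in the rectification model.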
To validate the robustness of the algorithm, a target is set at different positions randomly. For each metric y/x, results under a set of camera poses are analyzed (the yaw, pitch, and roll angles are random for each metric, as set in Table 3). The initial readings of the angle sensor are as follows: θ (pitch) 9.65°, φ (yaw) −0.27°, and ψ (roll) −0.94°. The target position tuples are set as (1500, 100), (2000, 100), (2500, 100), and (3000, 100).
Figure 7(a) shows measurements from the initial camera state. Compared with the truth, the measurements are biased along both the horizontal and vertical directions.
The deviations of the VO results caused by the camera motions are rectified independently. Based on the analysis of the range model in Section 3, the pitch angle is an independent variable of the distance function; therefore, we recalculate the pitch angle instead of performing inversion operations on the pixel coordinates. Figures 7(b)–7(f) show results before and after rectification. In Figure 7(e), the distance error along the optical axis is almost as high as 50% using direct measurement; this value decreases to only 6% using the proposed algorithm. The least improvement in accuracy in this direction is 10%, as shown in Figure 7(c). A much more remarkable effect can be seen in the results along the direction perpendicular to the optical axis. In the worst case, shown in Figure 7(f), the measured distance along the X-axis is rectified from −780 mm to 98 mm, a gain in measuring precision approaching 87.8%. Even in a general situation this gain can be close to 35%, as demonstrated in Figure 7(c). These figures also show that the range deviation grows as the distance along the optical-axis direction increases, mainly because the ratio of physical distance to pixel unit increases along the optical axis.
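The pitch-recalculation step can be illustrated with the standard ground-plane ranging formula. This is a hedged sketch: the camera height h and the exact form of the paper's pose-range model are assumptions for illustration, not values from the experiments:

```python
import math

def ground_range(h, pitch_deg, v, fy, V0):
    """Distance along the optical axis to a ground point imaged at pixel row v,
    for a pinhole camera at height h (same units as the result) whose optical
    axis is pitched down by pitch_deg."""
    # Total downward angle of the viewing ray = camera pitch + in-image offset.
    angle = math.radians(pitch_deg) + math.atan((v - V0) / fy)
    return h / math.tan(angle)

# Rectification idea: when a shake changes the pitch, re-evaluate the range
# with the pitch re-read from the angle sensor rather than inverting the
# pixel coordinates. Numbers below are illustrative (mm and pixels).
h, v, fy, V0 = 300.0, 400.0, 738.65, 249.7
biased    = ground_range(h, 9.65, v, fy, V0)  # range using the stale pitch
rectified = ground_range(h, 7.18, v, fy, V0)  # range using the measured pitch
```

A smaller downward pitch makes the same pixel row correspond to a farther ground point, which is the sign of the bias the rectification removes.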
5.3. Data Assimilation Evaluation. For generality, the assimilation results under different metric sizes (i.e., manipulating y/x) with a fixed camera pose are demonstrated. Data from the sonar sensors are set as the background field value, and those from the camera are set as the observation field value. In Figure 8(a), measurement results at some positions are missing, which indicates that blind zones exist when only the sonar system is adopted. The range data in Figure 8(b) are the results of sensor assimilation; they demonstrate the accuracy improvement along both the X-axis and the Y-axis compared with measurements solely from the vision system or the sonar sensors. The assimilated results proved to be as much as 25 percent more accurate along the X-axis and 9 percent more accurate along the Y-axis than those acquired using a single type of sensor. This is mainly because new information is brought in to compensate the output of a single measuring system, namely, the wide FOV of the camera and the high measurement accuracy of the sonar sensors.
6. Conclusions
In this paper, we have proposed an analytical measuring rectification algorithm for monocular range estimation under camera-shaking conditions. Specifically, we have established a pose-range model, and then the algebraic relation between
Figure 7: Comparisons between results after and before rectification; a set of camera poses is used to validate the robustness of the designed algorithm. Each panel plots Y (mm) against X (mm) for the camera-measured, true, and rectified values: (a) θ = 9.65°, φ = −0.27°, ψ = −0.94° (initial state); (b) θ = 7.18°, φ = 0.05°, ψ = 2.13°; (c) θ = 4.84°, φ = 2.81°, ψ = 4.89°; (d) θ = 4.05°, φ = 3.59°, ψ = 5.68°; (e) θ = 6.42°, φ = 2.77°, ψ = 2.93°; (f) θ = 5.01°, φ = 1.10°, ψ = 4.07°.
distance increment and a camera's pose variation has been formulated. We have also designed a data assimilation system to provide reliable range information using different types of transducer systems. Physical experiments were conducted to validate the effectiveness and robustness of the proposed algorithm. For future work, we will try to implement our algorithms in multiple-robot formation as well as swarm coordination applications.
Figure 8: Comparisons between results before and after assimilation: (a) results before assimilation (original data from the sonar and camera sensors); (b) results after assimilation using the Optimal Interpolation algorithm. Both panels plot distance along the Y-axis (mm) against distance along the X-axis (mm). Camera results are represented as Δ, sonar results as ∗, and truth values as ∘; the assimilated results in (b) are plotted with a separate marker.
Competing Interests
The authors declare that they have no competing interests
Acknowledgments
This research is financially supported by the Natural Science Foundation of China (Grants nos. 61571334 and 2014AA09A512).
References
[1] M. Mikhalsky and J. Sitte, "Active motion vision for distance estimation in autonomous mobile robots," in Proceedings of the IEEE Conference on Robotics, Automation and Mechatronics, vol. 2, pp. 1060–1065, December 2004.
[2] N. Yamaguti, S. Oe, and K. Terada, "A method of distance measurement by using monocular camera," in Proceedings of the 36th SICE Annual Conference (SICE '97), International Session Papers, pp. 1255–1260, IEEE, Tokushima, Japan, July 1997.
[3] X. Liu, T. Qin, W. Chen, and X. Yin, "Real-time distance measurement using a modified camera," in Proceedings of the IEEE Sensors Applications Symposium (SAS '10), pp. 54–58, IEEE, Limerick, Ireland, February 2010.
[4] U. Mudenagudi and S. Chaudhuri, "Depth estimation using defocused stereo image pairs," in Proceedings of the 7th IEEE International Conference on Computer Vision (ICCV '99), vol. 1, pp. 483–488, IEEE, September 1999.
[5] A. N. Rajagopalan, S. Chaudhuri, and U. Mudenagudi, "Depth estimation and image restoration using defocused stereo pairs," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 26, no. 11, pp. 1521–1525, 2004.
[6] M. N. A. Wahab, N. Sivadev, and K. Sundaraj, "Target distance estimation using monocular vision system for mobile robot," in Proceedings of the 2nd IEEE International Conference on Open Systems (ICOS '11), pp. 11–15, IEEE, Langkawi, Malaysia, September 2011.
[7] K. Umeda and T. Takahashi, "Subpixel stereo method: a new methodology of stereo vision," in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA '00), vol. 4, pp. 3215–3220, San Francisco, Calif, USA, April 2000.
[8] T.-H. Wang, C.-C. Hsu, C.-C. Chen, C.-W. Huang, and Y.-C. Lu, "Three-dimensional measurement of a remote object with a single CCD camera," in Proceedings of the 4th International Conference on Autonomous Robots and Agents (ICARA '09), pp. 187–192, IEEE, Wellington, New Zealand, February 2009.
[9] K. Nakamura, K. Ishigaki, T. Ogata, and S. Muramatsu, "Real-time monocular ranging by Bayesian triangulation," in Proceedings of the IEEE Intelligent Vehicles Symposium (IV '13), pp. 1368–1373, IEEE, Gold Coast, Australia, June 2013.
[10] A. Tsalatsanis, K. Valavanis, and N. Tsourveloudis, "Mobile robot navigation using sonar and range measurements from uncalibrated cameras," Journal of Intelligent and Robotic Systems, vol. 48, no. 2, pp. 253–284, 2007.
[11] V. Tucakov, M. Sahota, D. Murray et al., "A stereoscopic visually guided mobile robot," in Proceedings of the Hawaii International Conference on Systems Sciences, Maui, Hawaii, USA, January 1997.
[12] B. Fidan, V. Gazi, S. Zhai, N. Cen, and E. Karatas, "Single-view distance-estimation-based formation control of robotic swarms," IEEE Transactions on Industrial Electronics, vol. 60, no. 12, pp. 5781–5791, 2013.
[13] X. Li and L. Wang, "A monocular distance estimation method used in video sequence," in Proceedings of the IEEE International Conference on Information and Automation (ICIA '12), pp. 390–394, Shenyang, China, June 2012.
[14] J. Yu, J. Song, X. Sun, and T. Zhang, "Design and experiment of distance measuring system with single camera for picking robot," in Proceedings of the IEEE World Automation Congress (WAC '10), pp. 445–448, Kobe, Japan, September 2010.
[15] L. Guo, Y. Xu, K. Li et al., "Study on real-time distance detection based on monocular vision technique," Journal of Image and Graphics, vol. 11, no. 1, pp. 74–81, 2006.
[16] S. Derrouich, K. Izumida, and K. Shiiya, "A combination of monocular CCD camera and inertial-sensor for range estimation," in Proceedings of the 28th Annual Conference of the IEEE Industrial Electronics Society (IECON '02), pp. 2191–2196, IEEE, November 2002.
[17] D. Wang, W. Chen, Z. Zhao, and T. C. Ng, "Inter-vehicle separation measurement using monocular vision," in Proceedings of the 7th IEEE Conference on Industrial Electronics and Applications (ICIEA '12), pp. 258–263, Singapore, July 2012.
[18] Y. X. Han, Z. S. Zhang, and M. Dai, "Monocular vision system for distance measurement based on feature points," Optics and Precision Engineering, vol. 19, no. 5, pp. 1110–1117, 2011.
[19] E. Malis and P. Rives, "Robustness of image-based visual servoing with respect to depth distribution errors," in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA '03), vol. 1, pp. 1056–1061, IEEE, Taipei, Taiwan, September 2003.
[20] R. F. Stengel, Optimal Control and Estimation, Dover, New York, NY, USA, 1994.
[21] J. Kim, Y. Kim, and S. Kim, "An accurate localization for mobile robot using extended Kalman filter and sensor fusion," in Proceedings of the International Joint Conference on Neural Networks (IJCNN '08), pp. 2928–2933, IEEE World Congress on Computational Intelligence, Hong Kong, June 2008.
Journal of Sensors 5
FunctionLinear proximity
YX = 300
minus05
minus1
minus15
minus2
minus25
minus3
minus350 01 02 03 04 05 06 07
Figure 4 Pixel values vary with the yaw angle our approach cansuccessfully approximate the actual changes
Slop
e
YX
minus1
minus105
minus11
minus115
minus12
minus125
minus13
minus135
minus14
minus145
minus1510008006004002000
Y minus1033
X 34
Figure 5 Slope converges to a stable value
The motivation of the proposed method comes from thesimilar characteristic and phenomena between a camera andsonar measurement system and an OI algorithm First acamera and sonar system can be considered to be an OIsystem that produces an optimal output through severalgroups of observations Second the OI algorithm has thedimension-extensible and loose a priori characteristics thatare attractive for camera and sonar measurement system
The following are given
(i) A background field 119909119887available in two or three
dimensions(ii) A set of 119901 observations 119909obs available at irregular
positions
The optimal estimation is described by
119909119886= 119909119887+119882(119909obs minus 119909119887) (14)
Table 1 Initial state of camera before calibration
Rotation angle Pitch angle Zoom0 minus15∘ 0
Table 2 Calibration results
119891119909
119891119910
1198800
1198810
73188 73865 33928 2497
The errors are given by
120576119886= 119909119886minus 119909119905
120576119887= 119909119887minus 119909119905
120576obs= 119909obs minus 119909119905
(15)
The optimal weight is then as follows
119882 =(120590119887)2
(120590obs)2
+ (120590119887)2 (16)
where 120590119886 and 120590119887 represent the mean value of 120576119886 and 120576119887 Asdata from camera and sonar are unrelated it is assumed that119864(120576119887120576obs) = 0
5 Evaluations and Analysis
In this section we present the results of a set of physicalexperiments to demonstrate the performance of the pro-posed algorithm in Section 4 To validate the effectivenessof the proposed data rectification algorithm we comparedthe results before and after a pose change with the truthMoreover we have conducted a set of experiments underdifferent initial poses of a camera to testify the robust-ness of this method Besides comparative experiments havebeen designed to show the validity of the data assimilationapproach
Autonomous vehicles can be modeled as mobile robotsand then we use the mobile robot Pioneer 3-DX (Figure 6)mounted with a camera for experiments To prepare forexperiments the PTZ camera is firstly calibrated
51 Camera Calibration Grid size of the calibration boardin experiments is 30mm lowast 30mm Picture resolution ofVCC50I is fixed as 640 lowast 480 To ensure error balance andcalibration accuracy a group of calibrated images containingfour images from various poses are collected at a distanceinterval of 10 cm Calibration distance ranges from 1500mmto 4000mm Considering effects of pitch and rotation anglesas well as the zoom value the camera state during calibrationis fixed as given in Table 1 Internal parameters are listed inTable 2 which are crucial to distance measurement
52 Performance Evaluation of the Data Rectification Algo-rithm Angle variations are acquired by a three-axis anglesensor which act as the input of the rectification algorithm
6 Journal of Sensors
Figure 6 Pioneer 3-DX robot equipped with a Canon-VCC50I PTZ Camera
Table 3 Different conditions set for measurement
120595 120579 120593
213 718 005489 484 281568 405 359293 642 277407 501 110
The module (MPU6050) has advantages in low temperaturedependency high resolution and low noise Due to theseadvantages this module is chosen as a tool to measure theEuler angles
To validate the robustness of the algorithm a target is setat different positions randomly For each metric 119910119909 resultsunder a set of camera poses are analyzed (yaw pitch and rollangles are random for each metric as set in Table 3) Initialreadings of angle sensor are as follows 120579 (pitch) 965120593 (yaw)minus027∘ and 120595 (roll) minus094∘ The target position tuples are setas (1500 100) (2000 100) (2500 100) (3000 100)
Figure 7(a) shows measurements from the initial camerastate Compared with the truth it shows bias along bothhorizontal and vertical directions
The deviations of VO results caused by the cameramotions are rectified independently Based on the analysisof the range model in Section 3 the pitch angle is anindependent variable of distance function Therefore werecalculate the pitch angle instead of inversion operationson pixel coordinates Figures 7(b)ndash7(f) show results beforeand after rectification In Figure 7(e) the distance erroralong the optical axis is almost as high as 50 using directmeasurement However this value decreases to be only 6using the proposed algorithm We can also see that the leastimprovement in accuracy in this direction is 10 as shown in
Figure 7(c) Accordingly much more remarkable effect canbe seen from the results along the direction perpendicularto the optical axis In the worst case as shown in Fig-ure 7(f) the measured distance along119883-axis is rectified fromthe measured minus780mm to the final 98mm The percentagegains of measuring precision approach 878 Even in ageneral situation this percentage can be close to 35 asdemonstrated in Figure 7(c) These figures also show thatthe range deviation becomes larger as the distance along theoptical axis direction increases This is mainly because theratio of physical distance to pixel unit increases along theoptical axis
53 Data Assimilation Evaluation For generality the assim-ilation results under different metric size (ie manipulating119910119909) with a fixed camera pose are demonstrated Data fromsonar sensors are set as the background field value and thosefrom camera are set as the observation field value In Fig-ure 8(a) measurement results at some positions are missingwhich indicate that blind zones exist when sonar system onlyis adopted Range data in Figure 8(b) are the results of sensorassimilation It demonstrates the accuracy improvement fromboth 119883-axis and 119884-axis compared with measurements solelyfrom the vision system and sonar sensors Assimilated resultsproved to be as much as 25 percent accurate along119883-axis and9 percent accurate along 119884-axis compared to those acquiredusing a single type of sensor This is mainly because newinformation is brought in to compensate the output from asingle measuring system that is the wild FOV of camera andthe high measurement accuracy of sonar sensors
6 Conclusions
In this paper we have proposed an analytical measuringrectification algorithm formonocular range estimation undercamera shaking conditions Specifically we have establisheda pose-range model and then the algebraic relation between
Journal of Sensors 7
100 105 110 115 1201500
2000
2500
3000
3500
Y(m
m)
X (mm)
120579 = 965∘ 120601 = minus027∘ 120595 = minus094∘
(a)
95 100 105 110 115 1201000
1500
2000
2500
3000
3500
Camera measured valueTrue valueRectified value
Y(m
m)
X (mm)
120579 = 718∘ 120601 = 005∘ 120595 = 213∘
(b)
20 40 60 80 100 1201000
1500
2000
2500
3000
3500
Camera measured valueTrue valueRectified value
Y(m
m)
X (mm)
120579 = 484∘ 120601 = 281∘ 120595 = 489∘
(c)
0 20 40 60 80 100 1201000
1500
2000
2500
3000
3500
Camera measured valueTrue valueRectified value
minus20
Y(m
m)
X (mm)
120579 = 405∘ 120601 = 359∘ 120595 = 568∘
(d)
0 20 40 60 80 100 1201000
1500
2000
2500
3000
3500
Camera measured valueTrue valueRectified value
minus20
Y(m
m)
X (mm)
120579 = 642∘ 120601 = 277∘ 120595 = 293∘
(e)
0 2001000
1500
2000
2500
3000
Camera measured valueTrue valueRectified value
minus200minus400minus600minus800
Y(m
m)
X (mm)
120579 = 501∘ 120601 = 110∘ 120595 = 407∘
(f)
Figure 7 Comparisons between results after and before rectification a set of camera poses are set to validate the robustness of designedalgorithm
distance increment and a camerarsquos poses variation has beenformulatedWe have also designed a data assimilation systemto provide reliable range information using different typesof transducer systems Physical experiments are conducted
to validate the effectiveness and robustness of the proposedalgorithm For future work we will try to implement ouralgorithms on the multiple robots formations as well asswarm coordination applications
8 Journal of SensorsD
istan
ce al
ong
Y-a
xis (
mm
)
Distance along X-axis (mm)
4500
4000
3500
3000
2500
2000
1500
10008006004002000minus200minus400minus600minus800
(a) Results before assimilation
4500
4000
3500
3000
2500
2000
1500
1000
Dist
ance
alon
gY
-axi
s (m
m)
Distance along X-axis (mm)6005004003002001000minus100minus200minus300minus400
(b) Results after assimilation
Figure 8 Comparisons between results before and after assimilation (a) Original data from sensors of sonar and camera (b) Results ofassimilation using the Optimal Interpolation Algorithm (Results of camera are represented as Δ and those of sonar are described as lowast Thesymbol ∘ stands for truth values and are assimilated results)
Competing Interests
The authors declare that they have no competing interests
Acknowledgments
This research is financially supported by the Natural ScienceFoundation of China (Grant no 61571334) and the NaturalScience Foundation of China (Grant no 2014AA09A512)
References
[1] M Mikhalsky and J Sitte ldquoActive motion vision for distanceestimation in autonomous mobile robotsrdquo in Proceedings of theIEEEConference onRobotics Automation andMechatronics vol2 pp 1060ndash1065 December 2004
[2] N Yamaguti S Oe and K Terada ldquoA method of distancemeasurement by usingmonocular camerardquo in Proceedings of the36th SICE Annual Conference (SICE rsquo97) International SessionPapers pp 1255ndash1260 IEEE Tokushima Japan July 1997
[3] X Liu T Qin W Chen and X Yin ldquoReal-time distancemeasurement using a modified camerardquo in Proceedings of theIEEE Sensors Applications Symposium (SAS rsquo10) pp 54ndash58IEEE Limerick Ireland February 2010
[4] U Mudenagudi and S Chaudhuri ldquoDepth estimation usingdefocused stereo image pairsrdquo in Proceedings of the 7th IEEEInternational Conference on Computer Vision (ICCV rsquo99) vol1 pp 483ndash488 IEEE September 1999
[5] A N Rajagopalan S Chaudhuri and U Mudenagudi ldquoDepthestimation and image restoration using defocused stereo pairsrdquoIEEE Transactions on Pattern Analysis andMachine Intelligencevol 26 no 11 pp 1521ndash1525 2004
[6] M N A Wahab N Sivadev and K Sundaraj ldquoTarget distanceestimation using monocular vision system for mobile robotrdquoin Proceedings of the 2nd IEEE International Conference onOpen Systems (ICOS rsquo11) pp 11ndash15 IEEE Langkawi MalaysiaSeptember 2011
[7] K Umeda and T Takahashi ldquoSubpixel stereo method a newmethodology of stereo visionrdquo in Proceedings of the IEEEInternational Conference on Robotics and Automation (ICRA
rsquo00) vol 4 pp 3215ndash3220 San Francisco Calif USA April2000
[8] T-H Wang C-C Hsu C-C Chen C-W Huang and Y-CLu ldquoThree-dimensional measurement of a remote object witha single CCD camerardquo in Proceedings of the 4th InternationalConference on Autonomous Robots and Agents (ICARA rsquo09) pp187ndash192 IEEE Wellington New Zealand February 2009
[9] K Nakamura K Ishigaki T Ogata and S Muramatsu ldquoReal-timemonocular ranging by Bayesian triangulationrdquo in Proceed-ings of the IEEE Intelligent Vehicles Symposium (IV rsquo13) pp 1368ndash1373 IEEE Gold Coast Australia June 2013
[10] A Tsalatsanis K Valavanis and N Tsourveloudis ldquoMobilerobot navigation using sonar and range measurements fromuncalibrated camerasrdquo Journal of Intelligent and Robotic Sys-tems vol 48 no 2 pp 253ndash284 2007
[11] V Tucakov M Sahota D Murray et al ldquoA stereoscopic visuallyguidedmobile robotrdquo in Proceedings of the Hawaii InternationalConference on Systems Sciences Maui Hawaii USA January1997
[12] B Fidan V Gazi S Zhai N Cen and E Karatas ldquoSingle-view distance-estimation-based formation control of roboticswarmsrdquo IEEETransactions on Industrial Electronics vol 60 no12 pp 5781ndash5791 2013
[13] X Li and L Wang ldquoA monocular distance estimation methodused in video sequencerdquo inProceedings of the IEEE InternationalConference on Information and Automation (ICIA rsquo12) pp 390ndash394 Shenyang China June 2012
[14] J Yu J Song X Sun and T Zhang ldquoDesign and experimentof distance measuring system with single camera for pickingrobotrdquo in Proceedings of the IEEE World Automation Congress(WAC rsquo10) pp 445ndash448 Kobe Japan September 2010
[15] L Guo Y Xu K Li et al ldquoStudy on real-time distance detectionbased on monocular vision techniquerdquo Journal of Image andGraphics vol 11 no 1 pp 74ndash81 2006
[16] S Derrouich K Izumida and K Shiiya ldquoA combination ofmonocular CCD camera and inertial-sensor for range estima-tionrdquo in Proceedings of the 28th Annual Conference of the IEEEIndustrial Electronics Society (IECON rsquo02) pp 2191ndash2196 IEEENovember 2002
Journal of Sensors 9
[17] D Wang W Chen Z Zhao and T C Ng ldquoInter-vehicleseparation measurement using monocular visionrdquo in Proceed-ings of the 7th IEEE Conference on Industrial Electronics andApplications (ICIEA rsquo12) pp 258ndash263 Singapore July 2012
[18] Y X Han Z S Zhang and M Dai ldquoMonocular vision systemfor distance measurement based on feature pointsrdquo Optics andPrecision Engineering vol 19 no 5 pp 1110ndash1117 2011
[19] E Malis and P Rives ldquoRobustness of image-based visualservoing with respect to depth distribution errorsrdquo in Pro-ceedings of the IEEE International Conference on Robotics andAutomation (ICRA rsquo03) vol 1 pp 1056ndash1061 IEEE TaipeiTaiwan September 2003
[20] R F StengelOptimal Control and Estimation Dover NewYorkNY USA 1994
[21] J Kim Y Kim and S Kim ldquoAn accurate localization formobile robot using extended Kalman filter and sensor fusionrdquoin Proceedings of the International Joint Conference on NeuralNetworks (IJCNN rsquo08) pp 2928ndash2933 IEEEWorld Congress onComputational Intelligence Hong Kong June 2008
International Journal of
AerospaceEngineeringHindawi Publishing Corporationhttpwwwhindawicom Volume 2014
RoboticsJournal of
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Active and Passive Electronic Components
Control Scienceand Engineering
Journal of
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
International Journal of
RotatingMachinery
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Hindawi Publishing Corporation httpwwwhindawicom
Journal ofEngineeringVolume 2014
Submit your manuscripts athttpwwwhindawicom
VLSI Design
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Shock and Vibration
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Civil EngineeringAdvances in
Acoustics and VibrationAdvances in
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Electrical and Computer Engineering
Journal of
Advances inOptoElectronics
Hindawi Publishing Corporation httpwwwhindawicom
Volume 2014
The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014
SensorsJournal of
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Modelling amp Simulation in EngineeringHindawi Publishing Corporation httpwwwhindawicom Volume 2014
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Chemical EngineeringInternational Journal of Antennas and
Propagation
International Journal of
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Navigation and Observation
International Journal of
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
DistributedSensor Networks
International Journal of
6 Journal of Sensors
Figure 6 Pioneer 3-DX robot equipped with a Canon-VCC50I PTZ Camera
Table 3 Different conditions set for measurement
120595 120579 120593
213 718 005489 484 281568 405 359293 642 277407 501 110
The module (MPU6050) has advantages in low temperaturedependency high resolution and low noise Due to theseadvantages this module is chosen as a tool to measure theEuler angles
To validate the robustness of the algorithm a target is setat different positions randomly For each metric 119910119909 resultsunder a set of camera poses are analyzed (yaw pitch and rollangles are random for each metric as set in Table 3) Initialreadings of angle sensor are as follows 120579 (pitch) 965120593 (yaw)minus027∘ and 120595 (roll) minus094∘ The target position tuples are setas (1500 100) (2000 100) (2500 100) (3000 100)
Figure 7(a) shows measurements from the initial camerastate Compared with the truth it shows bias along bothhorizontal and vertical directions
The deviations of VO results caused by the cameramotions are rectified independently Based on the analysisof the range model in Section 3 the pitch angle is anindependent variable of distance function Therefore werecalculate the pitch angle instead of inversion operationson pixel coordinates Figures 7(b)ndash7(f) show results beforeand after rectification In Figure 7(e) the distance erroralong the optical axis is almost as high as 50 using directmeasurement However this value decreases to be only 6using the proposed algorithm We can also see that the leastimprovement in accuracy in this direction is 10 as shown in
Figure 7(c) Accordingly much more remarkable effect canbe seen from the results along the direction perpendicularto the optical axis In the worst case as shown in Fig-ure 7(f) the measured distance along119883-axis is rectified fromthe measured minus780mm to the final 98mm The percentagegains of measuring precision approach 878 Even in ageneral situation this percentage can be close to 35 asdemonstrated in Figure 7(c) These figures also show thatthe range deviation becomes larger as the distance along theoptical axis direction increases This is mainly because theratio of physical distance to pixel unit increases along theoptical axis
53 Data Assimilation Evaluation For generality the assim-ilation results under different metric size (ie manipulating119910119909) with a fixed camera pose are demonstrated Data fromsonar sensors are set as the background field value and thosefrom camera are set as the observation field value In Fig-ure 8(a) measurement results at some positions are missingwhich indicate that blind zones exist when sonar system onlyis adopted Range data in Figure 8(b) are the results of sensorassimilation It demonstrates the accuracy improvement fromboth 119883-axis and 119884-axis compared with measurements solelyfrom the vision system and sonar sensors Assimilated resultsproved to be as much as 25 percent accurate along119883-axis and9 percent accurate along 119884-axis compared to those acquiredusing a single type of sensor This is mainly because newinformation is brought in to compensate the output from asingle measuring system that is the wild FOV of camera andthe high measurement accuracy of sonar sensors
6. Conclusions
In this paper, we have proposed an analytical measuring rectification algorithm for monocular range estimation under camera-shaking conditions. Specifically, we have established a pose-range model and then formulated the algebraic relation between the distance increment and the variation of the camera's pose. We have also designed a data assimilation system to provide reliable range information using different types of transducer systems. Physical experiments were conducted to validate the effectiveness and robustness of the proposed algorithm. For future work, we will try to implement our algorithms in multiple-robot formations as well as swarm coordination applications.

Journal of Sensors 7

[Figure 7: Comparisons between results after and before rectification; a set of camera poses is set to validate the robustness of the designed algorithm. Each panel plots the camera-measured value, the true value, and the rectified value of Y (mm) against X (mm) for one camera pose: (a) θ = 9.65°, φ = −0.27°, ψ = −0.94°; (b) θ = 7.18°, φ = 0.05°, ψ = 2.13°; (c) θ = 4.84°, φ = 2.81°, ψ = 4.89°; (d) θ = 4.05°, φ = 3.59°, ψ = 5.68°; (e) θ = 6.42°, φ = 2.77°, ψ = 2.93°; (f) θ = 5.01°, φ = 1.10°, ψ = 4.07°.]
[Figure 8: Comparisons between results before and after assimilation, plotting distance along the Y-axis (mm) against distance along the X-axis (mm). (a) Original data from the sonar and camera sensors (results before assimilation). (b) Results of assimilation using the Optimal Interpolation Algorithm. Camera results are represented as Δ and sonar results as ∗; the symbol ∘ stands for truth values, and the remaining markers are the assimilated results.]
Competing Interests
The authors declare that they have no competing interests.
Acknowledgments
This research is financially supported by the Natural Science Foundation of China (Grants nos. 61571334 and 2014AA09A512).
References
[1] M. Mikhalsky and J. Sitte, "Active motion vision for distance estimation in autonomous mobile robots," in Proceedings of the IEEE Conference on Robotics, Automation and Mechatronics, vol. 2, pp. 1060–1065, December 2004.
[2] N. Yamaguti, S. Oe, and K. Terada, "A method of distance measurement by using monocular camera," in Proceedings of the 36th SICE Annual Conference (SICE '97), International Session Papers, pp. 1255–1260, IEEE, Tokushima, Japan, July 1997.
[3] X. Liu, T. Qin, W. Chen, and X. Yin, "Real-time distance measurement using a modified camera," in Proceedings of the IEEE Sensors Applications Symposium (SAS '10), pp. 54–58, IEEE, Limerick, Ireland, February 2010.
[4] U. Mudenagudi and S. Chaudhuri, "Depth estimation using defocused stereo image pairs," in Proceedings of the 7th IEEE International Conference on Computer Vision (ICCV '99), vol. 1, pp. 483–488, IEEE, September 1999.
[5] A. N. Rajagopalan, S. Chaudhuri, and U. Mudenagudi, "Depth estimation and image restoration using defocused stereo pairs," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 26, no. 11, pp. 1521–1525, 2004.
[6] M. N. A. Wahab, N. Sivadev, and K. Sundaraj, "Target distance estimation using monocular vision system for mobile robot," in Proceedings of the 2nd IEEE International Conference on Open Systems (ICOS '11), pp. 11–15, IEEE, Langkawi, Malaysia, September 2011.
[7] K. Umeda and T. Takahashi, "Subpixel stereo method: a new methodology of stereo vision," in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA '00), vol. 4, pp. 3215–3220, San Francisco, Calif, USA, April 2000.
[8] T.-H. Wang, C.-C. Hsu, C.-C. Chen, C.-W. Huang, and Y.-C. Lu, "Three-dimensional measurement of a remote object with a single CCD camera," in Proceedings of the 4th International Conference on Autonomous Robots and Agents (ICARA '09), pp. 187–192, IEEE, Wellington, New Zealand, February 2009.
[9] K. Nakamura, K. Ishigaki, T. Ogata, and S. Muramatsu, "Real-time monocular ranging by Bayesian triangulation," in Proceedings of the IEEE Intelligent Vehicles Symposium (IV '13), pp. 1368–1373, IEEE, Gold Coast, Australia, June 2013.
[10] A. Tsalatsanis, K. Valavanis, and N. Tsourveloudis, "Mobile robot navigation using sonar and range measurements from uncalibrated cameras," Journal of Intelligent and Robotic Systems, vol. 48, no. 2, pp. 253–284, 2007.
[11] V. Tucakov, M. Sahota, D. Murray et al., "A stereoscopic visually guided mobile robot," in Proceedings of the Hawaii International Conference on Systems Sciences, Maui, Hawaii, USA, January 1997.
[12] B. Fidan, V. Gazi, S. Zhai, N. Cen, and E. Karatas, "Single-view distance-estimation-based formation control of robotic swarms," IEEE Transactions on Industrial Electronics, vol. 60, no. 12, pp. 5781–5791, 2013.
[13] X. Li and L. Wang, "A monocular distance estimation method used in video sequence," in Proceedings of the IEEE International Conference on Information and Automation (ICIA '12), pp. 390–394, Shenyang, China, June 2012.
[14] J. Yu, J. Song, X. Sun, and T. Zhang, "Design and experiment of distance measuring system with single camera for picking robot," in Proceedings of the IEEE World Automation Congress (WAC '10), pp. 445–448, Kobe, Japan, September 2010.
[15] L. Guo, Y. Xu, K. Li et al., "Study on real-time distance detection based on monocular vision technique," Journal of Image and Graphics, vol. 11, no. 1, pp. 74–81, 2006.
[16] S. Derrouich, K. Izumida, and K. Shiiya, "A combination of monocular CCD camera and inertial-sensor for range estimation," in Proceedings of the 28th Annual Conference of the IEEE Industrial Electronics Society (IECON '02), pp. 2191–2196, IEEE, November 2002.
[17] D. Wang, W. Chen, Z. Zhao, and T. C. Ng, "Inter-vehicle separation measurement using monocular vision," in Proceedings of the 7th IEEE Conference on Industrial Electronics and Applications (ICIEA '12), pp. 258–263, Singapore, July 2012.
[18] Y. X. Han, Z. S. Zhang, and M. Dai, "Monocular vision system for distance measurement based on feature points," Optics and Precision Engineering, vol. 19, no. 5, pp. 1110–1117, 2011.
[19] E. Malis and P. Rives, "Robustness of image-based visual servoing with respect to depth distribution errors," in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA '03), vol. 1, pp. 1056–1061, IEEE, Taipei, Taiwan, September 2003.
[20] R. F. Stengel, Optimal Control and Estimation, Dover, New York, NY, USA, 1994.
[21] J. Kim, Y. Kim, and S. Kim, "An accurate localization for mobile robot using extended Kalman filter and sensor fusion," in Proceedings of the International Joint Conference on Neural Networks (IJCNN '08), pp. 2928–2933, IEEE World Congress on Computational Intelligence, Hong Kong, June 2008.