
Proceedings of the 1998 IEEE/RSJ Intl. Conference on Intelligent Robots and Systems, Victoria, B.C., Canada, October 1998

2D 1/2 visual servoing stability analysis with respect to camera calibration errors

Ezio Malis, François Chaumette
IRISA/INRIA, Campus de Beaulieu, 35042 Rennes, France

Sylvie Boudet
EDF/DER Chatou, 6 Quai Watier, 78401 Chatou, France

Abstract

In this paper, the robustness of a new visual servoing scheme with respect to camera calibration errors is analyzed. This scheme, called 2D 1/2 visual servoing, is based on the estimation of the partial camera displacement from the current to the desired camera pose at each iteration of the control law. Visual features and data extracted from the partial displacement allow us to design a decoupled control law controlling the six camera d.o.f. The necessary and sufficient conditions for local asymptotic stability in the presence of camera calibration errors are easily obtained. Then, thanks to the simple structure of the system, sufficient conditions for global asymptotic stability are proposed. Finally, experimental results show the validity of our approach and its robustness not only with respect to camera calibration errors but also to robot calibration errors.

1 Introduction

Vision feedback control loops were introduced in order to increase the flexibility and the accuracy of robot systems [6, 7]. Consider for example the classical positioning task of an eye-in-hand system with respect to a target. After the image corresponding to the desired camera position has been learned, and after the camera and/or the target has been moved, an error control vector can be extracted from the two views of the target. A zero error implies that the robot end-effector has reached the desired position, with an accuracy independent of calibration errors. However, these errors influence the way the system converges. In many cases, image features may leave the camera field of view during the servoing, which leads to its failure. For this reason, it is important to study the robustness of visual servoing with respect to calibration errors.

Position-based visual servoing requires a calibrated camera and the knowledge of a perfect geometric model of the target to obtain unbiased pose estimation from image features [11].

Even if a closed-loop control is used, which makes the convergence of the system possible in the presence of calibration errors, it is quite impossible to analyze the stability of such a system analytically. Image-based visual servoing is known to be robust with respect to calibration errors [3]. However, its convergence is theoretically ensured only in a region (quite impossible to determine analytically) around the desired position. Except in simple cases, the analysis of the stability robustness with respect to calibration errors is again quite impossible, since the system is coupled and non-linear [3].

It is shown in this paper that, contrary to the previous approaches, it is possible to obtain analytical results using the 2D 1/2 visual servoing approach [9]. 2D 1/2 visual servoing is based on the projective reconstruction of the target, using feature points extracted from two images (corresponding to the current and desired camera poses). Let us emphasize that this reconstruction does not require the knowledge of the 3D position of the target points, which makes a 3D model of the target unnecessary and increases the versatility and the application area of visual servoing.

If the camera intrinsic parameters are known, a scaled Euclidean reconstruction is obtained from the projective one, and the camera rotation between the two views is computed at each iteration. Consequently, the rotational and translational control loops can be decoupled, which allows the positioning task to converge in the whole task space. In order to control the three remaining camera d.o.f., we introduce the extended image coordinates of a reference point of the target. These are obtained from the classical normalized image coordinates by adding a third normalized coordinate z, which is measured from the Euclidean reconstruction. The interaction matrix linking the time derivative of the extended image coordinates to the camera velocity screw, called the extended image Jacobian matrix, has no singularity.



If the camera intrinsic parameters are not perfectly known, the estimated control vector can be analytically computed as a function of the camera calibration errors. Then, the necessary and sufficient conditions for local asymptotic stability in the presence of camera calibration errors are easily obtained. Moreover, thanks to the simple structure of the system, sufficient conditions for global asymptotic stability are presented.

2 Euclidean reconstruction

2.1 Projective homography estimation

Consider three 3D target points P_i belonging to a reference plane π (see Figure 1). It is well known that the resulting image points m_i in the current camera frame F (expressed in pixels) are related to the corresponding points m_i* in the desired camera frame F* by a projective homography H_π such that m_i = H_π m_i* (i = 1, 2, 3) [5]. If a supplementary point P_j does not belong to the reference plane (i.e. if the target is not planar), the relationship between its current and desired image points m_j and m_j* is no longer linear in the unknown elements of H_π. However, if n ≥ 5 such points are available, it is possible to estimate the homography matrix at video-rate, using for example the linearized algorithm presented in [8].
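For illustration, a homography between two views of coplanar points can be estimated with a standard DLT. The sketch below is not the linearized video-rate algorithm of [8] (which also exploits the non-planar points); the function name and interface are ours:

```python
import numpy as np

def estimate_homography(m_star, m):
    """Estimate H_pi such that m_i ~ H_pi m_i* (up to scale) from
    n >= 4 coplanar correspondences, using the standard DLT.
    m_star, m: (n, 2) arrays of pixel coordinates."""
    rows = []
    for (xs, ys), (x, y) in zip(np.asarray(m_star), np.asarray(m)):
        rows.append([xs, ys, 1, 0, 0, 0, -x * xs, -x * ys, -x])
        rows.append([0, 0, 0, xs, ys, 1, -y * xs, -y * ys, -y])
    # The homography is the right null vector of the stacked constraints.
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]  # fix the projective scale
```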

Figure 1: Camera displacement modeling

2.2 Known camera calibration

Let A be the matrix of the camera intrinsic parameters:

$$ A = \begin{bmatrix} \alpha_u & 0 & u_0 \\ 0 & \alpha_v & v_0 \\ 0 & 0 & 1 \end{bmatrix} \qquad (1) $$

where $\alpha_u$ and $\alpha_v$ are the pixel lengths and $(u_0, v_0)$ is the image center. This matrix is the transformation between the pixel coordinates and the normalized coordinates of an image point. Assuming that the camera calibration is known, the Euclidean homography is calculated as follows:

$$ H = A^{-1} H_\pi A \qquad (2) $$
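In code, going from pixel to normalized coordinates and forming the Euclidean homography of equation (2) is direct; a minimal sketch with illustrative intrinsic values (the numbers are placeholders, not those of the experiments):

```python
import numpy as np

# Illustrative intrinsics: pixel lengths (alpha_u, alpha_v) and
# image center (u_0, v_0).
A = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

def normalize(m, A):
    """Normalized coordinates p = A^{-1} m of a pixel point m = [u, v, 1]."""
    return np.linalg.solve(A, m)

def euclidean_homography(H_pi, A):
    """Euclidean homography H = A^{-1} H_pi A of equation (2)."""
    return np.linalg.solve(A, H_pi) @ A
```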

After H is computed, it can be decomposed as the sum of a rotation matrix and of a rank-1 matrix [5]:

$$ H = R + t_{d^*} n^{*T} \qquad (3) $$

where R is the rotation matrix between frames F and F* (i.e. the homography of the plane at infinity, $R = H_\infty$), $t_{d^*} = t/d^*$ is the ratio between the translation vector t and the distance d* of C* from π, and n* is the unit vector normal to π expressed in F*. From H and the image features, it is thus possible to determine these motion parameters and the structure of the reference plane. For example, the distances d and d* are unknown (where d is the current distance between frame F and π), but the ratio r = d/d* can easily be estimated. Indeed, noting $n = R n^*$ the vector normal to π expressed in F, we have:

$$ r = \frac{d}{d^*} = 1 + n^T t_{d^*} = \det(H) \qquad (4) $$

Furthermore, the ratio ρ between the unknown depth Z of a point lying on π and d* can be computed as:

$$ \rho = \frac{Z}{d^*} = \frac{r}{n^T p} \qquad (5) $$

These parameters are important since they are used in the design of our control scheme. In the following subsection, we show how it is possible, in the uncalibrated case, to obtain an analytical form of the estimated motion parameters as a function of the real motion parameters and of the camera calibration errors.
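Given the decomposition (3), the two ratios follow in a few lines; a sketch (the function name is ours):

```python
import numpy as np

def motion_ratios(H, R, n_star, p):
    """Ratios used by the control scheme.
    r   = d / d*  (equation (4)):  r = det(H)
    rho = Z / d*  (equation (5)):  rho = r / (n^T p)
    with n = R n* the plane normal expressed in the current frame
    and p the normalized coordinates of a point of the plane."""
    r = np.linalg.det(H)
    n = R @ n_star
    rho = r / float(n @ p)
    return r, rho
```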

2.3 Unknown camera calibration

If the camera is not perfectly calibrated and Â is used instead of A, the measured normalized image point p̂ can be written as a function of the real normalized point p as:

$$ \hat{p} = \delta A \, p \qquad (6) $$

where $\delta A = \hat{A}^{-1} A$. Then, in the presence of calibration errors, the estimated homography matrix is:

$$ \hat{H} = \delta A \, H \, \delta A^{-1} \qquad (7) $$

The estimated homography matrix can be decomposed as the sum of a matrix similar to a rotation matrix and of a rank-1 matrix:

$$ \hat{H} = \hat{H}_\infty + \hat{t}_{d^*} \hat{n}^{*T} \qquad (8) $$

where $\hat{H}_\infty = \delta A \, R \, \delta A^{-1}$, $\hat{t}_{d^*} = \|n^{*T} \delta A^{-1}\| \, \delta A \, t_{d^*}$ and $\hat{n}^* = \delta A^{-T} n^* / \|n^{*T} \delta A^{-1}\|$.


The eigenvalues of R depend on the angle of rotation θ, and the eigenvector corresponding to its unit eigenvalue is the axis of rotation u. The matrix $\hat{H}_\infty$ is not a rotation matrix, but it is similar to R, which implies that the two matrices have the same eigenvalues and that the eigenvectors of $\hat{H}_\infty$ are the eigenvectors of R multiplied by the matrix δA. The estimated rotation angle θ̂ and the estimated rotation axis û, extracted directly from $\hat{H}_\infty$, can thus be written as functions of the real parameters and of the calibration errors:

$$ \hat{\theta} = \theta \qquad \text{and} \qquad \hat{u} = \frac{\delta A \, u}{\|\delta A \, u\|} \qquad (9) $$

It must be emphasized that, like the rotation angle θ, the ratio r is computed without error:

$$ \hat{r} = \det(\hat{H}) = \det(H) = r \qquad (10) $$

Consequently, since $\hat{n} = \delta A^{-T} n / \|n^{*T} \delta A^{-1}\|$, the estimated ratio ρ̂ is:

$$ \hat{\rho} = \frac{\hat{r}}{\hat{n}^T \hat{p}} = \|n^{*T} \delta A^{-1}\| \, \rho \qquad (11) $$
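Relations (9) and (10) are easy to verify numerically: the angle of a matrix similar to a rotation can still be read from its trace, and the axis from the eigenvector of the unit eigenvalue. A sketch under these assumptions (the axis is recovered up to sign):

```python
import numpy as np

def rotation_params(M):
    """Angle and axis of a matrix similar to a rotation matrix.
    The trace is invariant under similarity, so the angle is exact;
    the returned axis is the unit-eigenvalue eigenvector (up to sign)."""
    theta = np.arccos((np.trace(M) - 1.0) / 2.0)
    w, v = np.linalg.eig(M)
    u = np.real(v[:, np.argmin(np.abs(w - 1.0))])
    return theta, u / np.linalg.norm(u)

# With delta_A = inv(A_hat) @ A and H_inf_hat = delta_A @ R @ inv(delta_A),
# rotation_params(H_inf_hat) returns theta unchanged and the axis
# delta_A u / ||delta_A u||, as stated in (9).
```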

3 2D 1/2 visual servoing

In this section, we present the design of our visual servoing scheme. In order to control the orientation of the camera, we naturally use the estimated 3D rotation R between F and F* (which has to reach the identity matrix). Let u be the rotation axis and θ the rotation angle obtained from R. The rotational velocity Ω of the camera is simply expressed as a function of the angular velocity θ̇ around the rotation axis u as:

$$ \Omega = u \, \dot{\theta} \qquad (12) $$

The control of the camera orientation is thus decoupled from the control of its position, since the former is directly available from the obtained partial pose. The position of the camera can be controlled in the image space and in the Cartesian space at the same time. In order to keep the target in the camera field of view, we introduced in [9] the use of two independent visual features, namely the image coordinates of a target point and the ratio r = d/d* computed from equation (4), which controls the depth between the camera and the target. However, it is possible to design a more elegant control vector, for which the stability analysis is simpler.

Consider a point P lying in the reference plane π. It is well known that the time derivative of its coordinates, expressed in the current camera frame, can be written as:

$$ \dot{P} = -V + [P]_{\times} \, \Omega \qquad (13) $$

where $[P]_{\times}$ is the skew-symmetric matrix associated with the vector P, and V and Ω define the camera velocity screw $T = [V^T \ \Omega^T]^T$. With a perspective projection model, P projects onto the image plane with normalized coordinates $p = [x \ y \ 1]^T = [X/Z \ Y/Z \ 1]^T$. Let us define the extended image coordinates $p_e$ as:

$$ p_e = [x \ y \ z]^T \qquad (14) $$

where z is a supplementary normalized coordinate which can be computed from equation (5) (for the chosen reference point, z = ρ). Then, the time variation of the extended coordinates can be written as:

$$ \dot{p}_e = \frac{1}{d^* \rho} \, \bar{L}_v V + \bar{L}_\omega \Omega \qquad (15) $$

where:

$$ \bar{L}_v = \begin{bmatrix} -1 & 0 & x \\ 0 & -1 & y \\ 0 & 0 & -z \end{bmatrix} \qquad (16) $$

Finally, the interaction matrix related to the extended image coordinates is easily obtained:

$$ L_e = \begin{bmatrix} \frac{1}{d^* \rho} \bar{L}_v & \bar{L}_\omega \end{bmatrix} \qquad (17) $$

where:

$$ \bar{L}_\omega = \begin{bmatrix} xy & -(1+x^2) & y \\ 1+y^2 & -xy & -x \\ -zy & zx & 0 \end{bmatrix} \qquad (18) $$

Equations (12) and (17) can be regrouped as follows:

$$ \begin{bmatrix} \dot{p}_e \\ \frac{d(u\theta)}{dt} \end{bmatrix} = L \, T \qquad (19) $$

where L is an upper triangular matrix which is always full rank. This particular form will allow us to design a control scheme with very interesting properties.
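A sketch of this triangular structure, using the translational and rotational blocks written above; the lower-right block is taken here as the identity (i.e. d(uθ)/dt is identified with Ω), a simplifying assumption of ours that preserves the upper triangular, full-rank form:

```python
import numpy as np

def interaction_matrix(x, y, z, d_star):
    """Upper triangular matrix L of equation (19), mapping the camera
    velocity screw T = (V, Omega) to (dp_e/dt, d(u theta)/dt)."""
    Lv = (1.0 / (d_star * z)) * np.array([   # z = rho = Z/d*
        [-1.0,  0.0,  x],
        [ 0.0, -1.0,  y],
        [ 0.0,  0.0, -z]])
    Lw = np.array([
        [    x * y, -(1 + x * x),    y],
        [1 + y * y,       -x * y,   -x],
        [   -z * y,        z * x,  0.0]])
    return np.block([[Lv, Lw],
                     [np.zeros((3, 3)), np.eye(3)]])
```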

4 Control law

The positioning task, which controls the 6 camera d.o.f., can be described as the regulation to zero of a task function [4]. In our case:

$$ e = \begin{bmatrix} (p_e - p_e^*)^T & (u\theta)^T \end{bmatrix}^T \qquad (20) $$

where $p_e^*$ can be obtained from the desired image acquired during the off-line learning step.

The exponential convergence of $p_e$ toward $p_e^*$ and of $u\theta$ toward 0 can be obtained by imposing $\dot{e} = -\lambda e$ (where λ tunes the convergence rate). Assuming the target is motionless (see [2, 1, 10] otherwise), the control law is given by:

$$ T = -\lambda L^{-1} e \qquad (21) $$

where T is the camera velocity screw sent to the robot controller.
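Per iteration, the control thus reduces to one linear solve; a minimal sketch (lam plays the role of λ):

```python
import numpy as np

def control_law(L, p_e, p_e_star, u_theta, lam=0.1):
    """Camera velocity screw T = -lambda L^{-1} e (equation (21)),
    with task function e = [p_e - p_e*, u theta] (equation (20))."""
    e = np.concatenate([p_e - p_e_star, u_theta])
    return -lam * np.linalg.solve(L, e)
```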


As usual, the camera velocity is considered as the control vector. In practice, small perturbations in the robot Jacobian and calibration errors in the hand-to-eye rigid transformation are compensated by the closed-loop scheme. More precisely, from (19) we have:

$$ V = -\lambda \, d^* \rho \, \bar{L}_v^{-1} \big( (p_e - p_e^*) - \bar{L}_\omega \, u\theta \big), \qquad \Omega = -\lambda \, u\theta \qquad (22) $$

Let us point out that the camera translational velocity is proportional to the desired distance d* between F* and π. An approximate value thus has to be chosen during the off-line learning stage. This value does not have to be precisely determined (it was set by hand in the experiments); it will be proven in the following section that very large errors on it do not compromise the stability of the system. Let us finally note that the rotational control loop is decoupled from the translational one. Such a decoupled system, together with the particular form of L, allows us to obtain convergence in the whole half space in front of π if an exact model and perfect measurements are assumed. However, a more realistic approach consists in generalizing the previous control law as:

$$ T = -\lambda \hat{L}^{-1} \hat{e} \qquad (23) $$

where the hats indicate that approximations are used, since the true terms are not perfectly known. The closed-loop system will then be stable if $L \hat{L}^{-1}$ is positive [4]. In the next section, after the demonstration of local asymptotic stability, sufficient conditions ensuring the positiveness of the matrix $L \hat{L}^{-1}$ are given.
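The positiveness of $L \hat{L}^{-1}$ can be probed numerically; a sketch, reading "positive" as positive definiteness of the symmetric part (our interpretation of the condition of [4]):

```python
import numpy as np

def is_positive(L, L_hat):
    """True if M = L @ inv(L_hat) is positive, i.e. if the eigenvalues
    of its symmetric part are all strictly positive."""
    M = L @ np.linalg.inv(L_hat)
    return bool(np.all(np.linalg.eigvalsh(0.5 * (M + M.T)) > 0.0))
```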

5 Stability analysis

Let us assume that the only possible errors are on the intrinsic camera parameters and on d*. The task function can be reconstructed as:

$$ \hat{e} = \begin{bmatrix} (\hat{p}_e - \hat{p}_e^*)^T & (\hat{u}\hat{\theta})^T \end{bmatrix}^T \qquad (24) $$

with, from (6), (9) and (11):

$$ \begin{bmatrix} \hat{x} \\ \hat{y} \end{bmatrix} = \delta A_{11} \begin{bmatrix} x \\ y \end{bmatrix} + \delta p_0, \qquad \hat{z} = \mu z, \qquad \hat{u}\hat{\theta} = \frac{\delta A \, u}{\|\delta A \, u\|} \, \theta \qquad (25) $$

where $\delta A_{11}$ is the (2 × 2) sub-matrix of δA containing the pixel lengths (see (1)), $\delta p_0$ is the (2 × 1) sub-vector of δA containing the error on the image center, and $\mu = \hat{\rho}/\rho = \|n^{*T} \delta A^{-1}\|$. The closed-loop system can then be written:

$$ \dot{e} = f(e) = -\lambda \, Q(e) \, e \qquad (26) $$

Noting $\nu = (d^*/\hat{d}^*) \, \|n^{*T} \delta A^{-1}\|$, the matrix Q keeps a block upper triangular form, function of ν, μ, δA and of the true and estimated interaction matrices (27).

Function f is a $C^\infty$ vector field defined on an open subset S of SE(3). It is easy to show the existence and uniqueness of the equilibrium point:

Proposition 1 The only point of equilibrium for f, i.e. a point $e^0 \in S$ such that $f(e^0) = 0$, is $e^0 = 0$.

Proof of Proposition 1 The existence of the equilibrium point is evident since, if e = 0, then $f(e) = -\lambda Q(e) e = 0$. This equilibrium point is unique if and only if $\det(Q(e)) \neq 0$, $\forall e \in S$. Since the matrix Q is upper triangular, its determinant is easily calculated:

$$ \det(Q) = \nu^3 \det(\bar{L}_v) \det(\hat{\bar{L}}_v^{-1}) \det(L_\omega) \det(\hat{L}_\omega^{-1}) \qquad (28) $$

Then $\det(Q) \neq 0$, $\forall e \in S$, since ν ≠ 0 and ρ ≠ 0. Therefore, the task has no singularity and, if the task function decreases, it decreases toward 0.

Theorem 1 (Local asymptotic stability) The differential system (26) is locally asymptotically stable around the equilibrium point $e^0$ if and only if:

$$ \nu > 0, \qquad \mu > 0, \qquad \alpha_u \hat{\alpha}_u > 0, \qquad \alpha_v \hat{\alpha}_v > 0 \qquad (29) $$

In practice, these conditions are easily verified.

Proof of Theorem 1 Consider the first-order Taylor expansion of the non-linear differential system (26) around the equilibrium point:

$$ \dot{e} \approx -\lambda \, Q(e^0)(e - e^0) \qquad (30) $$

The linearized system (30) is asymptotically stable if and only if the eigenvalues of $Q(e^0)$ are positive. They are given by:

$$ \lambda_1 = \nu \frac{\alpha_u}{\hat{\alpha}_u}, \quad \lambda_2 = \nu \frac{\alpha_v}{\hat{\alpha}_v}, \quad \lambda_3 = \nu, \quad \lambda_4 = \mu, \quad \lambda_5 = \mu \frac{\alpha_u}{\hat{\alpha}_u}, \quad \lambda_6 = \mu \frac{\alpha_v}{\hat{\alpha}_v} \qquad (31) $$

They are positive if and only if conditions (29) are verified. Since the linearized system is asymptotically stable, the local asymptotic stability of the non-linear system around the equilibrium point is proven.
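Checking the local stability conditions then reduces to a few sign tests; a sketch following the eigenvalue list (31):

```python
import numpy as np

def locally_stable(nu, mu, a_u, a_v, a_u_hat, a_v_hat):
    """True if all eigenvalues of Q(e0) in (31) are positive, i.e. if
    conditions (29) hold (nu > 0, mu > 0, and estimated pixel lengths
    of the same sign as the true ones)."""
    eigs = np.array([nu * a_u / a_u_hat, nu * a_v / a_v_hat, nu,
                     mu, mu * a_u / a_u_hat, mu * a_v / a_v_hat])
    return bool(np.all(eigs > 0.0))
```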


We now present sufficient conditions ensuring global asymptotic stability.

Theorem 2 (Global asymptotic stability) The differential system (26) is globally asymptotically stable only if conditions (29) are verified, and if the calibration errors satisfy the bound (32) given in [8], where $\gamma = \tan(\psi)$ is the tangent of the vision angle (ψ being its maximum value), and:

$$ \sigma_1 = \sigma + 1 - \sqrt{(\sigma - 1)^2 + \|I - \delta A\|^2 (1 + \gamma^2)} > 0 \qquad (35) $$

$$ \sigma_2 = \sigma + 1 - \sqrt{(\sigma - 1)^2 + \|\delta p_0\|^2} > 0 \qquad (36) $$

where σ is a positive scalar depending on the calibration errors. The proof of the theorem, together with the explicit form of condition (32), is given in [8]. This sufficient condition depends on many parameters and has to be used with some knowledge of the system geometry.

As an example, we can use condition (32) to obtain a sufficient condition when the hypothesis of exact measurement cannot be applied to the distance d*, since it is estimated by hand. In that case, where perfect camera calibration is assumed (δA = I, σ = σ1 = σ2 = 2, μ = 1 and ν = d*/d̂*), condition (32) reduces to an inequality in ν (37), whose solution is an interval of admissible ratios ν bounded below and above as functions of γ (38). The two bounds are plotted in Figure 2 versus γ.

Figure 2: Stability bounds for the estimated depth

This means that, if we consider for example a camera with a 20° vision angle (then γ = 0.364), the stability condition is verified if 0.24 < ν < 4.2. If the desired distance d* is 50 cm, the system will asymptotically converge for any initial position in the task space if d̂* is chosen between 12 and 211 cm. This result definitively validates the robustness of our control scheme. Moreover, similar results can be obtained by considering camera calibration errors. Let us finally note that condition (32) is only sufficient: convergence can occur even for larger errors. In the next section, we will see experimentally that 2D 1/2 visual servoing is also robust in the presence of errors in the robot Jacobian.


6 Experimental results

The camera rotation (in degrees) and the direction of translation, computed from the estimated homography using the initial and desired images, are given in Table 1 as a function of the camera calibration.

    Calibration | Rotation uθ (deg)  | Translation direction
    exact       | 28.1  -33.8  96.1  | -0.11  0.99  0.10
    coarse      | 33.0  -32.8  96.3  | -0.20  0.97  0.13
    bad         | 26.9  -26.6  99.0  | -0.25  0.96  0.04

Table 1: Motion parameters

The pose estimates obtained using the coarse and the bad camera calibrations are not very different (compare Figure 4(b) and Figure 6(b)). For example, the maximal rotational error is around 5 degrees for the coarse calibration and around 7 degrees for the bad calibration. However, the resulting trajectories of the feature points in the images (see Figure 5(b) and Figure 7(b)) show the influence of a bad calibration on the convergence of the system.

6.1 Coarse calibration

The error on the extended image coordinates of the reference point is plotted in Figure 4(a), and the estimated rotation uθ in Figure 4(b).

Figure 4: Error on the extended image coordinates (a), uθ (deg) (b), translational velocity V (cm/s) (c) and rotational velocity Ω (deg/s) (d) versus iteration number

The outputs of the control law are given in Figures 4(c) and 4(d). The obtained results are particularly stable and robust, and the error decreases exponentially to 0. Finally, the error on the image coordinates of each target point is given in Figure 5(a), and the corresponding trajectories in the image in Figure 5(b). The trajectory of the chosen reference point is easily identified, since it is close to a straight line in the image.

Figure 5: Error on the pixel coordinates (a) and trajectories in the image (b) of the target points

The convergence of the coordinates to their desired values demonstrates the correct realization of the positioning task. We can finally note that, owing to the large displacement between the initial and desired camera poses, image-based and position-based visual servoing both fail in this case.

6.2 Bad calibration

The obtained results are given in Figure 6 and Figure 7.

Figure 6: Error on the extended image coordinates (a), uθ (deg) (b), translational velocity V (cm/s) (c) and rotational velocity Ω (deg/s) (d) versus iteration number


As can be seen in Figure 6(a), the convergence of the error is no longer perfectly exponential. This is due to the bad calibration of the camera and to the rough approximation of d* (which has no influence when a coarse calibration is used). However, even in this worst case, we can note the stability and the robustness of the control law. Contrary to the previous experiment, the trajectory of the reference point in the image is no longer a straight line, since the camera is badly calibrated, as is the homogeneous transformation matrix between the camera and the robot end-effector frame. However, the convergence of the coordinates to their desired values demonstrates the correct realization of the task.

Figure 7: Error on the pixel coordinates (a) and trajectories in the image (b) of the target points

7 Conclusion

In this paper, we have proposed a new approach to vision-based robot control which presents many advantages with respect to classical position-based and image-based visual servoing. Thanks to its simple structure, analytical results on its robustness can be obtained. The necessary and sufficient conditions for local asymptotic stability, and sufficient conditions for global asymptotic stability, in the presence of camera calibration errors have been obtained. Experimental results show the validity of our approach and its robustness, not only with respect to camera calibration errors but also to robot calibration errors. Future work will be devoted to finding an adaptive control scheme in order to keep the target in the image even in the presence of very large calibration errors, and to the coupling of our control scheme with real objects and complex images.

Acknowledgments

This work was supported by the French national electric power company EDF. We are grateful to the team manager and the researchers of the Teleoperation/Robotics group at DER Chatou for their participation and help.

References

[1] P. K. Allen, A. Timcenko, B. Yoshimi, and P. Michelman. Automated tracking and grasping of a moving object with a robotic hand-eye system. IEEE Trans. on Robotics and Automation, 9(2):152-165, April 1993.

[2] F. Bensalah and F. Chaumette. Compensation of abrupt motion changes in target tracking by visual servoing. In IEEE/RSJ Int. Conf. on Intelligent Robots and Systems, IROS'95, volume 1, pages 181-187, Pittsburgh, Pennsylvania, August 1995.

[3] B. Espiau. Effect of camera calibration errors on visual servoing in robotics. In 3rd International Symposium on Experimental Robotics, Kyoto, Japan, October 1993.

[4] B. Espiau, F. Chaumette, and P. Rives. A new approach to visual servoing in robotics. IEEE Trans. on Robotics and Automation, 8(3):313-326, June 1992.

[5] O. Faugeras and F. Lustman. Motion and structure from motion in a piecewise planar environment. International Journal of Pattern Recognition and Artificial Intelligence, 2(3):485-508, 1988.

[6] K. Hashimoto, editor. Visual Servoing: Real Time Control of Robot Manipulators Based on Visual Sensory Feedback, volume 7 of World Scientific Series in Robotics and Automated Systems. World Scientific Press, Singapore, 1993.

[7] S. Hutchinson, G. D. Hager, and P. I. Corke. A tutorial on visual servo control. IEEE Trans. on Robotics and Automation, 12(5):651-670, October 1996.

[8] E. Malis, F. Chaumette, and S. Boudet. 2D 1/2 visual servoing. Technical Report 3387, INRIA, March 1998. Available at ftp://ftp.inria.fr/INRIA/publication/RR-3387.ps.gz.

[9] E. Malis, F. Chaumette, and S. Boudet. Positioning a coarse-calibrated camera with respect to an unknown object by 2D 1/2 visual servoing. In IEEE International Conference on Robotics and Automation, volume 2, pages 1352-1359, Leuven, Belgium, May 1998.

[10] N. P. Papanikolopoulos, P. K. Khosla, and T. Kanade. Visual tracking of a moving target by a camera mounted on a robot: a combination of control and vision. IEEE Trans. on Robotics and Automation, 9(1):14-35, February 1993.

[11] W. J. Wilson, C. C. W. Hulls, and G. S. Bell. Relative end-effector control using Cartesian position-based visual servoing. IEEE Trans. on Robotics and Automation, 12(5):684-696, October 1996.
