Implementing of Multi Fuzzy Controllers for Multi Sensors to Steer Wheeled Mobile Robot (3rd ICMSAO)


Implementing of Multi Fuzzy Controllers for Multi Sensors to Steer Wheeled Mobile Robot

Mohammed Majed Mohammed Al-Khalidy, Ph.D.*, Rami A. Mahir, Ph.D., and Mohammed Z. Al-Faiz, Ph.D.

* E-mail: [email protected]

Abstract: The paper addresses three important topics in mobile robotics: the path follower, the multi-sensors, and the steering fuzzy controllers for a wheeled mobile robot. This paper presents a novel computer-vision methodology for building a system capable of determining the presence of a path follower, tracking the objects on that path, and recognizing the objects' shapes, vertical or horizontal, in complex and dynamic environments. The mechanism of multi-sensors together with the steering fuzzy controllers is the basis for the robot's capability of following a certain path. The results obtained by this procedure provide an adequate basis for the robot to successfully perform the path-following task.

Keywords: Kinematics of nonholonomic wheeled mobile robot; Multi-Sensors; Multi Fuzzy Controllers; Computer Vision.

    1. Introduction

The field of mobile robot control has been the focus of active research in the past decades. Despite the apparent simplicity of the kinematic model of a Wheeled Mobile Robot (WMR), the existence of nonholonomic constraints turns the design of stabilizing control laws for those systems into a considerable challenge. When implementing mobile robots, there are problems related to the mathematical modeling of the kinematics and of the dynamics, difficulties in estimating the orientation and the position of the robot, and complexity in the control design and in planning a path to be tracked.

    2. Constructions of Wheeled Mobile Robot

In this paper a mobile robot made up of a rigid body and non-deforming wheels is considered. It is assumed that the vehicle moves on a plane without slipping, i.e., there is pure rolling contact between the wheels and the ground. The WMR, which we name Roc1, is illustrated in Figure (1).

Figure (1) Roc1 WMR: a) without cover; b) with cover

The free-body diagram for the new model of the nonholonomic WMR (Roc1) is illustrated in Figure (2).


    Figure (2) Free body diagram for WMR

The motion model of the new nonholonomic WMR, as proved in the dissertation [1], maps the right and left wheel angular velocities onto the pose rates:

\[
\begin{bmatrix} \dot{x} \\ \dot{y} \\ \dot{\theta} \end{bmatrix}
= J(\theta)
\begin{bmatrix} \omega_R \\ \omega_L \end{bmatrix}
\qquad (1)
\]

where the entries of the $3 \times 2$ matrix $J(\theta)$ are built from the terms $\frac{r}{2}\cos\theta$, $\frac{r}{2}\sin\theta$, $\frac{r}{2d}$ and correction terms of the form $\mathrm{sgn}[\sin(\cdot)]\sin(\cdot)$ involving the offsets $l_1$ and $l_2$, with $r$ the wheel radius and $d$ the axle parameter.
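The extended model in equation (1) builds on the standard differential-drive kinematics. As a point of reference, that base model can be sketched as follows; the function name and the parameter values are illustrative, and the sgn[sin(·)] correction terms with the offsets l1, l2 from the paper's model are omitted.

```python
import math

def diff_drive_kinematics(theta, omega_r, omega_l, r=0.05, d=0.15):
    """Standard differential-drive kinematics (the base model that the
    extended motion model in equation (1) builds on).

    theta   : heading angle of the robot [rad]
    omega_r : right wheel angular velocity [rad/s]
    omega_l : left wheel angular velocity [rad/s]
    r       : wheel radius [m]         (illustrative value)
    d       : half the axle length [m] (illustrative value)

    Returns (x_dot, y_dot, theta_dot).
    """
    v = r * (omega_r + omega_l) / 2.0          # forward speed
    x_dot = v * math.cos(theta)
    y_dot = v * math.sin(theta)
    theta_dot = r * (omega_r - omega_l) / (2.0 * d)
    return x_dot, y_dot, theta_dot

# Equal wheel speeds give straight-line motion with no rotation.
x_dot, y_dot, theta_dot = diff_drive_kinematics(0.0, 1.0, 1.0)
```

Unequal wheel speeds make theta_dot nonzero, which is what the steering commands of Section 4 exploit.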

The forward velocity for this model is:

\[
\begin{bmatrix} \dot{x}_1 \\ \dot{x}_2 \end{bmatrix}
= G(\theta)
\begin{bmatrix} \omega_R \\ \omega_L \end{bmatrix}
\qquad (2)
\]

where the entries of the matrix $G(\theta)$ combine $r$, $d$, $l_1$ and $l_2$ with products of $\sin\theta$, $\cos\theta$ and the same $\mathrm{sgn}[\sin(\cdot)]$ terms as in (1), together with a common factor of $\frac{1}{2}$.

    3. Robot Vision & Sensors

Two types of color camera are used: the first is a CMOS camera and the second is a CCD camera, mounted on the head of the WMR. Two magnetic sensors are fixed on the front sides of the robot, one for the left and the other for the right. A motion-detection sensor is also used, fixed at the front centre as shown in Figure (3).


    Figure (3) Construction of Roc1 WMR

The PC USB camera used is the PHILIPS cu 2001 camera. It is a color CMOS-technology camera with true 640 x 480 resolution, auto brightness, white-balance saturation and IrDA night-vision crystal light. The focus can be set manually and the orientation of the camera is easy to change. The vision in Roc1 depends on this camera: it is the eye of the robot, through which the robot recognizes its right path. The CCD camera used in this work is a wireless camera, model Lyd-208C.

The magnetic sensor is new in use with a WMR. It is believed that this is the first application of this type of sensor, and it has proved its efficiency in robotic technologies. As mentioned before, two of these sensors are used, one for the left and the other for the right. They assist the robot in generating the right steering commands in complement with the vision sensor; when the vision sensor fails for some reason, the magnetic sensors are good compensators.

The motion-detection sensor is familiar in robotic technologies, although its applications differ. In this work it is used to detect any motion crossing the robot track in front and to generate a signal that stops the robot for a few seconds. The detector coverage angle is 110°-120° and the maximum detection range is 12 m.

    4. Steering Fuzzy Controllers Design

In this work the fuzzy controllers were designed depending on the randomly distributed inputs from the sensors: two magnetic sensors, S_LM for the left and S_RM for the right, a forward motion-detection sensor S_Md, and a forward camera.

The camera image from the WMR is transmitted back to a computer; this computer has the necessary image-processing software and runs a program to perform the automatic lateral control, see Figure (4). The actual image processing obtains the position of the vehicle with respect to the path follower from the snapshot image. In the second step this information is used as the input to a control algorithm. The output of this algorithm is a steering angle that maintains the WMR in the desired position on the path. The control algorithm can be based on fuzzy controllers.

Figure (4) Block diagram of camera control
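The two-step lateral control described above can be sketched as follows. The function names, the image width, and the proportional mapping are illustrative assumptions; the paper replaces the crisp mapping with fuzzy controllers.

```python
def lateral_offset(path_column, image_width=640):
    """Signed, normalized offset of the detected path centre from the
    image centre, in [-1, 1].  The column of the path centre is assumed
    to come from the image-processing stage (hypothetical interface)."""
    centre = image_width / 2.0
    return (path_column - centre) / centre

def steering_angle(offset, max_angle_deg=30.0):
    """Simple proportional mapping from lateral offset to a steering
    angle that keeps the WMR in the desired position on the path."""
    return -offset * max_angle_deg

# Path centre detected to the right of the image centre -> steer left.
offset = lateral_offset(480)
angle = steering_angle(offset)
```

In the paper this second step is performed by the fuzzy controllers of Section 4 rather than a fixed gain.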


In general, since various pieces of information were taken from the snapshot images of the camera (threshold, line detection, edge detection, Euler number and number of objects), the information obtained was not always consistent, and some of it was affected by:

1. Blemishes or stains appearing in the image.
2. Inconstant illumination.
3. Indistinct lines and edges.
4. Exceeding the path limits because of differential-drive error.

Because of all that, and for greater certainty, it was found necessary to design a Precedence Fuzzy Controller (PFC) for the camera that prioritizes the information to be used as the input to the fuzzy controller (FC), as illustrated in Figure (5).

    Figure (5) Block diagram of PFC & FC

After testing and analyzing one hundred snapshot images of multiple paths, as illustrated in Figure (6) for the vertical and horizontal object positions, it was found that whenever the vertical edge detection by the Prewitt operator and the edge detection by the Roberts operator are computed for these images, the numbers of objects subtracted from the original binarized image are always proportional to the shape positions. Table (1) shows the object decision of this method.

Figure (6) Snapshot path images: a) Vertical objects; b) Horizontal objects

Table (1) Objects decision

Binarized object shape in image | No. of vertical edges detected by Prewitt (V) | No. of edges detected by Roberts (H) | No. of subtraction edge detections | Objects decision
bwh (horizontal image)          | V                                             | H                                    | bwh - H                            | the object is in horizontal position
bwv (vertical image)            | V                                             | H                                    | bwv - H                            | the object is in vertical position


To verify this method, practical statistical ratios for one hundred snapshot path images were obtained, as shown in Table (2). The output results of this method are used as an input to the PFC before the FC, as mentioned before, and the new matching algorithm becomes as illustrated in Figure (7).

Table (2) Practical statistics ratio

Conformity | Unknown | Nonconformity
   85%     |   5%    |     10%

Figure (7) Matching architecture algorithm (Start → scan the path → capture an image → determine the threshold → binarize the image → determine the number of objects and the Euler number → PFC1; determine vertical edge detection by Prewitt and edge detection by Roberts → PFC2; PFC1 and PFC2 → CPFC → object decision and number of objects)
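The image-processing stage feeding the precedence controllers can be sketched as below. This is a minimal, dependency-free approximation in the spirit of Table (1): it compares edge-pixel counts from the vertical Prewitt kernel and the Roberts kernels, whereas the paper subtracts object counts; the threshold, the tiny convolution helper and the tie-breaking rule are illustrative assumptions.

```python
import numpy as np

def convolve2d_valid(a, k):
    """Tiny 'valid'-mode 2-D correlation, kept dependency-free for the sketch."""
    kh, kw = k.shape
    h, w = a.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(a[i:i + kh, j:j + kw] * k)
    return out

def binarize(img, threshold=0.5):
    """Binarize a grayscale image in [0, 1] at the given threshold."""
    return (img > threshold).astype(np.uint8)

def prewitt_vertical(bw):
    """Response of the vertical Prewitt kernel (strong on vertical edges)."""
    k = np.array([[-1, 0, 1], [-1, 0, 1], [-1, 0, 1]])
    return convolve2d_valid(bw, k)

def roberts(bw):
    """Combined magnitude of the two Roberts cross kernels."""
    k1 = np.array([[1, 0], [0, -1]])
    k2 = np.array([[0, 1], [-1, 0]])
    return np.abs(convolve2d_valid(bw, k1)) + np.abs(convolve2d_valid(bw, k2))

def object_orientation(bw):
    """Decide vertical vs horizontal from edge-pixel counts, in the spirit
    of Table (1): a stronger vertical-Prewitt response -> vertical object."""
    v = np.count_nonzero(prewitt_vertical(bw))
    h = np.count_nonzero(roberts(bw))
    return "vertical" if v >= h else "horizontal"
```

A vertical bar in the image excites the Prewitt vertical kernel along its two long edges, while a horizontal bar produces no vertical-Prewitt response at all, which is what makes the count comparison work.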


The final block diagram, which describes the control of the WMR with its sensors, is shown in Figure (8).

Figure (8) Block diagram of sensors control

The design of each fuzzy controller mentioned in the previous section is presented as follows:

Design of Precedence Fuzzy Controller 1 (PFC1)
The primary control goal of this controller is to detect the direction and deliver a proper output to the second fuzzy controller (PFC2); this output is used for greater certainty. The two inputs of this controller are taken from the camera: the threshold, and the number of objects combined with the Euler number. The threshold universe of discourse range is [0-1], and that of the number of objects and Euler number (Nob&Euler) is [0-15].

It is known that in a fuzzy logic controller (FLC), the dynamic behavior of a fuzzy system is characterized by a set of linguistic description rules based on expert knowledge. Table (3) shows the rule base of the PFC1.

    Table (3) Rule base for the PFC1

                         threshold
    Nob&Euler       S1   R1   L1   F1
        F2           S    F    F    F
        L2           S    L    L    L
        R2           S    R    R    R
        S2           S    S    S    S

where the output linguistic labels are:
F = forward = 2
L = left = 1
R = right = 3
S = stop = 0
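The rule base of Table (3) can be encoded directly. This sketch only stores the rule table as a crisp lookup; the actual PFC1 fuzzifies the threshold (universe [0-1]) and Nob&Euler (universe [0-15]) inputs with membership functions before applying these rules, and the function name here is an assumption.

```python
# Crisp encoding of the PFC1 rule base in Table (3).
# Keys: (Nob&Euler label, threshold label) -> output label.
PFC1_RULES = {
    ("F2", "S1"): "S", ("F2", "R1"): "F", ("F2", "L1"): "F", ("F2", "F1"): "F",
    ("L2", "S1"): "S", ("L2", "R1"): "L", ("L2", "L1"): "L", ("L2", "F1"): "L",
    ("R2", "S1"): "S", ("R2", "R1"): "R", ("R2", "L1"): "R", ("R2", "F1"): "R",
    ("S2", "S1"): "S", ("S2", "R1"): "S", ("S2", "L1"): "S", ("S2", "F1"): "S",
}

# Output linguistic labels and their numeric values from the paper.
OUTPUT_VALUES = {"F": 2, "L": 1, "R": 3, "S": 0}

def pfc1(nob_euler_label, threshold_label):
    """Look up the PFC1 output label and its numeric value."""
    label = PFC1_RULES[(nob_euler_label, threshold_label)]
    return label, OUTPUT_VALUES[label]
```

Note how a stop condition on either input (S1 column or S2 row) dominates every rule, which matches the safety-first structure of the table.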


Design of Precedence Fuzzy Controller 2 (PFC2)
The main goal of this controller is to confirm the turn direction. The two inputs of this controller are taken from the camera after the image processing and the matching development algorithm mentioned before: V~Prewitt and H~Roberts. The V~Prewitt universe of discourse range is [0-15], and the H~Roberts range is [0-15].

    Table (4) shows the rule base of the PFC2.

    Table (4) Rule base for the PFC2

                         H~Roberts
    V~Prewitt       Sh   Rh   Lh   Fh
        Fv           S    S    S    S
        Lv           V    V    N    S
        Rv           V    N    H    S
        Sv           N    H    H    S

where the output linguistic labels are:
V = forward = 2
H = turn = 1
N = equal = 3
S = stop = 0

Design of Collector Precedence Fuzzy Controller (CPFC)
This controller collects the outputs of the two precedence fuzzy controllers (PFC1, PFC2) and uses them as its inputs. The output of the CPFC is the final decision obtained from the camera sensor. The PFC1 universe of discourse range is [0-3], and the PFC2 range is [0-3]. Table (5) shows the rule base of the CPFC.

    Table (5) Rule base for the CPFC

                         PFC2
    PFC1            S    N    H    V
        F           Sc   Fc   Fc   Fc
        L           Sc   Lc   Lc   Fc
        R           Sc   Rc   Rc   Fc
        S           Sc   Sc   Sc   Sc

where the output linguistic labels are:
Fc = forward = 2
Lc = left = 1
Rc = right = 3
Sc = stop = 0

Design of Fuzzy Controller (FC)
The fuzzy controller (FC) is the final stage of the command fuzzy processing. This controller combines two input signals: the signal from the sensors (the two magnetic sensors, S_LM for the left and S_RM for the right, together with the forward motion-detection sensor S_Md) and the output of the CPFC. The sensors' universe of discourse range is [0-15], and the CPFC's is [0-3]. Table (6) shows the rule base of the FC.


    Table (6) Rule base for the FC

                         CPFC
    Sensors         Sc   Rc   Lc   Fc
        Fs           F    F    F    F
        Ls           L    L    L    L
        Rs           R    R    R    R
        Ss           S    S    S    S

where the sensors' input signals are represented as:
Fs = forward = 2 or 14
Ls = left = 10
Rs = right = 6
Ss = stop = 3, 7, 11 or 15

The output linguistic labels, which represent the WMR direction steering commands, are:
F = forward = 2
L = left = 1
R = right = 3
S = stop = 0

    Figure (9) illustrates these directions.

Figure (9) WMR direction steering commands (F = forward, L = left, R = right, S = stop)
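The final stage can be sketched by combining the numeric sensor words with the rule base of Table (6). Read literally, every row of Table (6) resolves to the sensor command regardless of the CPFC column, so the sensors take priority; this crisp encoding (and the function name) is an illustrative assumption, as the actual FC performs fuzzy inference over the [0-15] sensor universe.

```python
# Numeric sensor words from the text mapped to the sensor linguistic labels.
SENSOR_LABELS = {2: "Fs", 14: "Fs",          # forward
                 10: "Ls",                   # left
                 6: "Rs",                    # right
                 3: "Ss", 7: "Ss", 11: "Ss", 15: "Ss"}  # stop

# Rule base of the final FC (Table 6): each sensor row yields the same
# command in every CPFC column, i.e. the sensors dominate.
FC_RULES = {"Fs": "F", "Ls": "L", "Rs": "R", "Ss": "S"}

# Steering command values from the paper.
STEERING = {"F": 2, "L": 1, "R": 3, "S": 0}

def fc(sensor_word, cpfc_label):
    """Final steering decision from a numeric sensor word and the CPFC
    output label ('Sc', 'Rc', 'Lc' or 'Fc').  The CPFC label is kept in
    the signature for completeness even though Table (6) resolves every
    column to the sensor row's command."""
    label = FC_RULES[SENSOR_LABELS[sensor_word]]
    return label, STEERING[label]
```

Keeping the camera decision as an input preserves the cascade PFC1/PFC2 → CPFC → FC shown in Figure (8) even though, in this table, the magnetic and motion sensors have the last word.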

    5. Practical Results

The WMR path follower was tested using the magnetic sensors together with the vision sensor. The results of this case are illustrated in Figure (10). The first picture shows the WMR reaching the natural magnet; the following pictures show the WMR detecting the left natural magnet, reaching the left turn, moving through the left turn, detecting the right natural magnet, reaching the right turn, moving through the right turn (back and front views), finishing the right turn, tracking the path, and finally reaching the end of the path.


    Figure (10) WMR follows the path

    6. Conclusion

The paper presents a novel methodology for computer vision-based robot navigation. A robust multi-sensor navigation loop is also required to provide high-accuracy, high-frequency vehicle pose estimates.

One of the key observations is that successful navigation systems result from the synergistic combination of a set of fundamental principles:

1. A synergistic combination of multiple sensors.
2. An appropriate path to follow.
3. An attention mechanism for handling the complexity of the perception process for the multi-sensor signals, thus allowing an efficient use of computational resources.
4. The behavior of the steering (action) controller makes it arduous to track the path-follower error; this fact drove the rapid enhancement of the developed fuzzy controller to provide smoother steering control.

The speed of the robot was set to 0.5 rad/sec in the live tests, but tests on saved images show that higher speeds would be handled successfully. The focus was not on creating a high-speed application; robustness was more important, and the robot can only drive at 0.5 rad/sec anyway. All in all, the path-following ability was high, and none of the design choices were regretted. The WMR path follower created in this project performed well in batch tests. The unmotivated stops in the tests were few, and those stops occurred in images with difficult conditions. Single paths were followed in a satisfactory manner. The robustness was high: the robot never tried to drive off the path, and it almost always stopped when it was supposed to.

The entire set of tests for the sensor types on the path and surroundings was handled successfully. The factors limiting the path width are the camera, its height, and the angle at which it is pointed towards the ground.


    7. References

[1] M. M. Khalidy, "Design and Implementation of Wheeled Mobile Robot Using Intelligent Controllers," Ph.D. dissertation, University of Technology, Baghdad, 2007.
[2] Evolution Robotics, "Introduction to Evolution Robotics Technologies," Evolution Robotics, Inc., ERSP, 2003.
[3] S. M. LaValle, Planning Algorithms, University of Illinois, August 31, 2005.
[4] O. L. Abdlkadir, "Improvement of an Image Processing System for Autonomous Mobile Robot Navigation," paper from Internet information, 2000.
[5] J. F. Reid, "Precision Guidance of Agricultural Vehicles," UILU-ENG-7031, 1998.
[6] K. H. Low and Y. P. Leow, "Kinematic modeling, mobility analysis and design of wheeled mobile robots," Advanced Robotics, Vol. 19, No. 1, pp. 73-99, VSP and Robotics Society of Japan, 2005.
[7] J. R. Asensio and L. Montano, "A Kinematic and Dynamic Model-Based Motion Controller for Mobile Robots," Department of Computer Science and Systems Engineering, University of Zaragoza, Spain, 2002.
[8] Z. P. Jiang, E. Lefeber and H. Nijmeijer, "Stabilization and Tracking of a Nonholonomic Mobile Robot With Saturating Actuators," 3rd Portuguese Conference on Automatic Control, Coimbra, Portugal, 9-11 September 1998.
[9] M. Bisgaard, D. Vinther, K. Østergaard and J. Bendtsen, "Simulation, Sensor Fusion and Model Verification for a Mobile Robot," Department of Control Engineering, Aalborg University, Fredrik Bajersvej 7C, DK-9220 Aalborg East, Denmark, 2003.
[10] R. Costantini and S. Süsstrunk, "Virtual Sensor Design," Audiovisual Communication Laboratory, EPFL, Lausanne, Switzerland, 2004.
[11] R. Madhavan and E. Messina, "Quantifying Uncertainty Towards Information-Centric Unmanned Navigation," National Institute of Standards and Technology (NIST), Gaithersburg, MD, 2003.
[12] Y. Guo, Z. Qu and J. Wang, "A New Performance-Based Motion Planner for Nonholonomic Mobile Robots," School of Electrical Engineering and Computer Science, University of Central Florida, Orlando, FL 32816-2450, 1998.
[13] K. Macek, I. Petrovic and R. Siegwart, "A Control Method for Stable and Smooth Path Following of Mobile Robots," Swiss Federal Institute of Technology, Lausanne, Switzerland, 2005.
[14] C. Zhou, "Fuzzy-Arithmetic-Based Lyapunov Synthesis in the Design of Stable Fuzzy Controllers: A Computing-With-Words Approach," Int. J. Appl. Math. Comput. Sci., Vol. 12, No. 3, pp. 411-421, 2002.
[15] J. Park, J. Kim and B. Ahn, "Evolutionary Learning for Fuzzy Path Planning of Shooting Action for Robot Soccer," Department of Mechatronics, Gwangju Institute of Science and Technology (GIST), 2004.
[16] A. Riid and E. Rüstern, "Fuzzy Logic in Control: Truck Backer-Upper Problem Revisited," Department of Computer Control, Tallinn Technical University, 2001.
[17] V. M. Olivera and R. Simmons, "Implementing Human-Acceptable Navigational Behavior and a Fuzzy Controller for an Autonomous Robot," Spanish Ministry of Science and Technology, PROFIT program (FIT-070000-2001-118), 2001.
[18] J. Pan, "Fuzzy-Nav: A Vision-Based Robot Navigation Architecture Using Fuzzy Inference for Uncertainty-Reasoning," Proceedings of the World Congress on Neural Networks, Vol. 2, pp. 602-607, Washington D.C., July 1995.
[19] T. Kyriacou, "Vision-Based Urban Navigation Procedures for Verbally Instructed Robots," Robotic Intelligence Laboratory, School of Computing, University of Plymouth, Plymouth, United Kingdom, 2002.
[20] Y. Amit, 2D Object Detection and Recognition, Massachusetts Institute of Technology, 2002.
[21] R. Van Dam and G. Briggs, "Enhanced Quagent Control with Vision-Based Object Recognition," Computer Science 242: Artificial Intelligence, April 27, 2004.
[22] R. Costantini and S. Süsstrunk, "Virtual Sensor Design," Audiovisual Communication Laboratory, EPFL, Lausanne, Switzerland, 2000.
[23] S. E. Umbaugh, Computer Vision and Image Processing, Prentice Hall PTR, Prentice-Hall, Inc., 1998.
[24] R. C. Gonzalez and R. E. Woods, Digital Image Processing, Addison-Wesley Publishing Company, Inc., 1992.
[25] R. Henderson, "CMOS Digital Camera Controller and Frame Capture Device," Oregon Institute of Technology, May 26, 2002.
[26] D. Litwiller, "CMOS vs. CCD," reprinted from Photonics Spectra, Laurin Publishing, August 2005.
[27] G. E. Healey and R. Kondepudy, "Radiometric CCD Camera Calibration and Noise Estimation," IEEE Trans. on Pattern Analysis and Machine Intelligence, Vol. 16, No. 3, March 1994.
[28] R. E. Cummings, "Intelligent Robot Vision Sensors in VLSI," Department of Electrical and Computer Engineering, Johns Hopkins University, Baltimore, MD 21218, 1998.
[29] P. Chandak, "Study and Implementation of Follow the Leader," M.Sc. thesis, University of Cincinnati, 2002.


[30] P. Ridao, J. Forest, L. Pacheco and X. Cufí, "Sensorial and Navigation Systems for a Mobile Robot (ROGER)," Computer Vision and Robotics Group, 1994.
[31] K. C. Ng and M. M. Trivedi, "A Neuro-Fuzzy Controller for Mobile Robot Navigation and Multirobot Convoying," IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, Vol. 28, No. 6, December 1998.
[32] N. M. Taalabi, "Design and Application of Fuzzy Logic for the Speed Control of a Direct-Drive DC Motor," BP 6207 Rabat Instituts, Rabat, Morocco, March 2002.
[33] H. R. Berenji, "Fuzzy Logic Controllers," in An Introduction to Fuzzy Logic Applications in Intelligent Systems, R. R. Yager and L. A. Zadeh, Eds., Boston, MA: Kluwer, 1992, pp. 69-96.
[34] MATLAB User's Guide, Fuzzy Logic Toolbox, Version 2, 2002.
[35] K. M. Passino and S. Yurkovich, Fuzzy Control, Addison Wesley Longman, Inc., 1998.
[36] L. Reznik, Fuzzy Controllers, Victoria University of Technology, Melbourne, Australia, 1997.
[37] J. Jantzen, "Design of Fuzzy Controllers," Technical University of Denmark, May 15, 1998.
[38] J. Mireles, "Kinematic Models of Mobile Robots," EE 5325/4315, Summer 2004.
[39] J.-P. Laumond, "Trajectories for mobile robots with kinematic and environment constraints," in Proc. of the International Conference on Intelligent Autonomous Systems, pp. 346-354, 1986.
[40] J.-P. Laumond, Robot Motion Planning and Control, Springer-Verlag, Berlin, 1998. Available online at http://www.laas.fr/jpl/book.html.
[41] Z. Li and J. F. Canny, Nonholonomic Motion Planning, Kluwer Academic Publishers, Boston, MA, 1993.
[42] University of Pennsylvania, "Mobile Robot Kinematics and Control," 2004.
[43] J. D. Bendtsen, P. Andersen and T. S. Pedersen, "Robust Feedback Linearization-based Control Design for a Wheeled Mobile Robot," paper from Internet information, 2000.
[44] A. Riid and E. Rüstern, "Fuzzy Logic in Control: Truck Backer-Upper Problem Revisited," Department of Computer Control, Tallinn Technical University, Ehitajate tee 5, Tallinn 19086, Estonia, 2001.
[45] J. Zhang, K. Huebner and A. Knoll, "Learning based situation recognition by sectoring omnidirectional images for robot localisation," Technical Computer Science, Faculty of Technology, 2001.
[46] O. M. C. Williams, "Guidance of a Mobile Robot using Computer Vision over a Distributed System," 2001/2002.
[47] C. M. Bergren, Anatomy of a Robot, McGraw-Hill Companies, Inc., 2003.
[48] P. E. Sandin, Robot Mechanisms and Mechanical Devices Illustrated, McGraw-Hill Companies, Inc., 2003.
[49] B. Sandler, Robotics: Designing the Mechanisms for Automated Machinery, Academic Press, Inc., 1999.
[50] J. Luecke, Analog and Digital Circuits for Electronic Control System Applications, Elsevier Inc., 2005.
[51] W. Kester, Data Conversion Handbook, Newnes, Elsevier, Analog Devices, Inc., 2005.
[52] Y. Q. Shi and H. Sun, Image and Video Compression for Multimedia Engineering, CRC Press LLC, 1999.
[53] E. R. Dougherty, Random Processes for Image and Signal Processing, The Society of Photo-Optical Instrumentation Engineers, 1999.
[54] D. Phillips, Image Processing in C, R&D Publications, Inc., 1994.
[55] R. C. Gonzalez and R. E. Woods, Digital Image Processing, Addison-Wesley Publishing Company, Inc., 1992.
[56] V. B. Rao and H. V. Rao, C++ Neural Networks and Fuzzy Logic, Management Information Source, Inc., 1993.
[57] J. Gomes and L. Velho, Image Processing for Computer Graphics, Springer-Verlag New York, Inc., 1997.
[58] H. Demuth and M. Beale, Neural Network Toolbox, The MathWorks, Inc., 1998.
[59] P. J. Gibson and C. H. Power, Introductory Remote Sensing: Digital Image Processing and Applications, Goldin-Rudahl Systems, Amherst, MA 01002, USA, 2000.
[60] T. B. Lauwers, G. A. Kantor and R. L. Hollis, "A Dynamically Stable Single-Wheeled Mobile Robot with Inverse Mouse-Ball Drive," Proc. IEEE Intl. Conf. on Robotics and Automation, Orlando, FL, May 15-19, 2006.
[61] D. C. Conner, H. Choset and A. A. Rizzi, "Towards Provable Navigation and Control of Nonholonomically Constrained Convex-Bodied Systems," Proceedings of the 2006 IEEE International Conference on Robotics and Automation, Orlando, Florida, May 2006.
[62] P. An, PC Interfacing Using Centronic, RS232 and Game Ports, Newnes, 1998.
