
Proceedings of the 2006 IEEE International Conference on Robotics and Biomimetics

December 17 - 20, 2006, Kunming, China

A Visual Guided 4 D.O.F Miniature Robot System for Micro-Handling Purpose

Yu Song, Mantian Li, Qingling Li, Lining Sun

Robotics Institute, Harbin Institute of Technology

Harbin, Heilongjiang Province, China
sy_touch7925@sina.com

Abstract - A visually guided 4 D.O.F miniature robot system is presented. The highlights of the robot are its macro/micro dual-driven motion mode, embedded hardware, all-time visual servoing, wireless power supply and wireless communication. In the macro motion mode, the robot is a typical wheeled mobile robot driven by two small-size brushless DC motors. In the micro motion mode, based on the inchworm principle, the robot can move freely with micron resolution. To guide the robot in accomplishing micro-handling tasks automatically, an external visual sensor system is designed. The visual system includes two parts: a global visual system and a local visual system. The global visual sensor is a CCD camera mounted vertically to detect the pose of the robot in image space and to guide the end-effector of the robot gripper into the view-field of the optical microscope. The local visual system, which consists of optical microscopes and CCD cameras, provides micron-resolution information for the robot. Experiments show that the proposed robot system is effective in performing micro-handling tasks.

Index Terms - Miniature robot, Dual driven, Micro handling, Visual servoing control

I. INTRODUCTION

Micro-handling assisted by miniature mobile robots has become a research hotspot in the robotics and micro-handling domains. In recent years, various micro-handling oriented miniature robot systems have been proposed, such as the MINIMAN system [1], the MINWALKER system [2], the NANOROBOTICS system [3] and the MICRON system [4]. These miniature robots represent the development trend of miniature robot research. To accomplish micro-handling tasks effectively, a miniature robot must have several important characteristics: (1) miniaturized and compact embedded hardware; (2) high positioning accuracy and motion resolution; (3) high motion speed; (4) onboard power supply and wireless communication, so that the robot can move freely without the influence of cables; (5) an intelligent sensor system.

Our group has developed a novel 4 D.O.F (degrees of freedom) macro/micro dual-driven wireless miniature robot system. Compared with other micro-handling oriented miniature robot systems [1-4], the main improvements of this design are the macro/micro dual-driven mode, embedded hardware, wireless power supply and wireless communication. Among these, the development of a mobile robot with dual motion modes is the first challenge and the main original contribution. In the macro motion mode, the robot is a typical wheeled mobile robot driven by two small-size brushless DC motors. In the micro motion mode, based on the inchworm principle, the robot can move freely with micron resolution. To guide the robot in accomplishing micro-handling tasks automatically, an intelligent external vision sensor system is designed for real-time visual feedback control of the robot. With the help of the external visual system, our robot is able to perform transportation and high-precision manipulation of micro parts automatically.

The remainder of this paper is organized as follows. Section II introduces our miniature mobile robot. Section III discusses the vision-based high-level control. Section IV presents experiments showing the effectiveness of the proposed robot system. Finally, Section V concludes the paper.

II. MINIATURE MOBILE ROBOT SYSTEM

Fig. 1 Prototype of macro/micro dual driven wireless miniature mobile robot

Fig. 1 shows the prototype of our 4 D.O.F macro/micro dual-driven wireless miniature robot. It is about 90 mm in length, 70 mm in width and 70 mm in height. Two rechargeable lithium batteries are used as the power supply. All of the hardware driving circuits and the low-level controller are integrated into the robot body. Communication between the high-level host computer and the low-level embedded control system is realized by a Bluetooth wireless

* This work is supported by the High Technology Research and Development Program of China (No. 2003AA404190)

1-4244-0571-8/06/$20.00 ©2006 IEEE


communication module (JINOU-3264, made in China), which connects through an RS232C interface.

A. Mobile Positioning Module

1) Macro Motion Function Unit

In the macro motion mode, the robot is a typical differential-drive wheeled mobile robot. As shown in Fig. 2, to satisfy the small-size requirement, two brushless DC motors (SMOOVY, made in Switzerland) with 1:125 planetary-gear speed reducers are used to drive the two wheels, and two supporting metal balls (6 mm in diameter) are used to enhance the motion stability of the robot.

2) Micro Motion Function Unit

This function unit realizes slow robot movement with micron resolution. As shown in Fig. 2, the unit is an XYθ 3-D.O.F mechanism driven by four piezoelectric actuators and four electromagnets based on the inchworm principle [2]. Each piezoelectric actuator is a stacked-type piezoelectric element of 5 mm x 5 mm x 10 mm (PI Company, Germany), pre-loaded by mounting it in an ellipse mechanism. The resistance of an electromagnet is about 80 Ω; the iron core has a diameter of 2 mm and the electrical wire a diameter of 0.1 mm. The electromagnet can be fixed stably on the worktable with a 100 mA driving current and no overheating.

3) Motion Mode Switching Function Unit

This unit switches the driving mode of the miniature robot between macro motion and micro motion by using a feed-screw-nut mechanism driven by a torque motor. The motor rotates at low speed, resulting in a relative movement between the macro motion unit and the micro motion unit.

Fig. 2 Mobile positioning module and its CAD model

B. Micromanipulator Module

As shown in Fig. 3, the micromanipulator is a steel ball placed on a manipulation rack comprised of three piezoelectric actuators. The three stacked-type piezoelectric actuators drive the micromanipulator based on the inertial impact principle [1, 5]. By applying a saw-tooth voltage to the piezoelectric actuators, the steel ball is able to rotate around three axes. In our project, only the pitch-motion D.O.F is available; the other motion D.O.F are disabled by software.

A micro-gripper driven by a stacked-type piezoelectric actuator of 5 mm x 5 mm x 10 mm size (PI Company, Germany) is integrated on the micromanipulator. It is a two-level amplifying mechanism based on flexure hinges.

Fig. 3 Prototype of micromanipulator module

C. Embedded Control Hardware Description

Designing the embedded control system is a main challenge for the miniature robot system. In our project, the control and driving objectives include:
- Two DC motors for macro motion of the robot.
- Four piezoelectric actuators and four electromagnets for inchworm micro motion of the robot.
- Four piezoelectric actuators for the micromanipulator and gripper.
- One torque motor for the motion mode switching unit.
- RS232C Bluetooth communication.

Fig.4 Embedded low-level control system

Fig. 4 shows the embedded control system. Two rechargeable lithium batteries, which generate 8.4 V for the whole embedded control system, are selected as the power supply source. The power source generation module generates proper, stable constant voltages for the embedded control system (3.3 V for the digital circuits and 5 V for the high-voltage amplification circuits). The high voltages for driving the piezoelectric actuators are obtained from ultra-small DC/DC converters (PICO Company, USA); each converter boosts 5 V up to 150 V with an 8 mA output-current capacity. PA140CC amplifiers (APEX Company, USA) are utilized as the high-voltage amplifiers. To avoid


overheating, all the high-voltage amplifiers are glued onto the radiator (shown in Fig. 1 and Fig. 5).

A 12-bit D/A converter embedded in the processor (C8051F040) is used to produce the primitive saw-tooth waveform for the micromanipulator. The operator can easily change the frequency or amplitude of the waveform through the corresponding control command. A signal-processing circuit produces positive and negative waveforms from the primitive waveform, so that different waveform combinations, after amplification by the high-voltage amplifiers, can be used to drive the manipulator motion.
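The primitive saw-tooth generation described above can be sketched as follows. This is a minimal illustration, not the firmware: the sample rate, frequency and 12-bit full-scale mapping are assumed values chosen for the example.

```python
def sawtooth_codes(freq_hz, sample_rate_hz, n_samples, full_scale=4095):
    """Primitive saw-tooth waveform as 12-bit DAC codes (0..4095).

    The slow linear ramp sticks the steel ball to the actuator tip; the
    abrupt fly-back slips underneath it (inertial impact principle).
    Frequency and amplitude are changed simply by recomputing the table.
    """
    period = sample_rate_hz / freq_hz  # samples per saw-tooth period
    return [round(full_scale * ((k % period) / period)) for k in range(n_samples)]

# illustrative parameters: 100 Hz saw-tooth sampled at 10 kHz
codes = sawtooth_codes(freq_hz=100.0, sample_rate_hz=10_000.0, n_samples=1000)
```

Reversing the ramp direction (fast rise, slow fall) would drive the steel ball in the opposite sense, which is how stick-slip actuators are typically commanded in both directions.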

Fig.5 Partial embedded hardware

The main performance figures of the proposed miniature robot are shown in Table I.

TABLE I
PERFORMANCE OF MINIATURE ROBOT

Classification        | Performance
Rechargeable battery  | Working time: ~3 hours
Macro movement        | Speed: ~50 mm/s; Resolution: ~100 µm
Micro movement        | Speed: ~200 µm/s; Resolution: ~4 µm
Manipulator           | Speed: ~0.9 degree/s; Resolution: ~0.001 degree
Gripper               | Workspace: 150 µm - 450 µm; Maximum gripping force: 0.6 N

The main characteristics of the external visual system are shown in Table II.

A: Global visual sensor  B: Stepper motor for auto-focus  C: Optical microscope  D: LED light  E: Micro assembly area  F: Micro part gripping area  G: Miniature robot

Fig. 6 Miniature robot based micro-handling station

TABLE II
CHARACTERISTICS OF EXTERNAL VISION SYSTEM

Global visual sensor | Focal length of object lens: ~12 mm; Constant vertical height of the global camera: ~600 mm; Resolution: 0.43 mm x 0.42 mm; Frame rate: PAL, 25 frames/s
Local visual sensor  | Magnification: 3x; View-field: 2 mm x 2 mm; Resolution: 2.87 µm x 2.77 µm; Frame rate: PAL, 25 frames/s
Auto-focus machine   | Resolution: 2 µm

III. VISION BASED HIGH-LEVEL CONTROL OF THE ROBOT

A. Miniature Robot Based Micro-Handling System

As shown in Fig. 6, a miniature robot based micro-handling station is designed to carry out micro handling under a light optical microscope. An external vision sensor system, including CCD cameras, optical microscopes, an auto-focusing machine and a host computer, is used for real-time visual feedback control of the robot. The visual system includes two parts: the global visual system and the local visual system. The global visual sensor is a CCD camera mounted vertically to guide the end-effector of the robot gripper into the view-field of the microscope (about 2 mm x 2 mm). An analogue CCD camera (WAT-902H, Japan) with PAL resolution is used as the global sensor. The local vision sensors are microscopes (V20-507, Japan) with analogue CCD cameras (WAT-902H, Japan) mounted on auto-focusing stepper motors (VEXTA C7214-9015, Japan); they provide the microscopic visual information for the robot.

(A) Global vision sub-GUI  (B) Microscopic vision sub-GUI

Fig. 7 Graphical user interface (GUI)

The role of the operator is to specify work tasks for the robot, such as telling the robot to grip a micro part, to convey a micro part, or to select a region of interest for microscope auto-focusing. As shown in Fig. 7, to offer an intuitive way to operate the robot, a graphical user interface (GUI) is developed with VC++ 6.0. The GUI includes two parts: the global visual control sub-GUI and the microscopic handling sub-GUI. The global visual control sub-GUI offers automatic visual guiding after the operator gives the


desired goal in the global visual image. The microscopic handling sub-GUI, which integrates a number of image-processing functions, provides micro-object identification, auto-focusing and automatic visual guiding of the end-effector for the robot to accomplish the micro-handling task.

B. Global Visual Control

The function of the global visual system is to guide the robot quickly to the micro-handling area in the macro motion mode. The main techniques of the global visual closed loop are real-time robot detection and the design of the visual controller.

1) Detection of Robot in Global Image Space

Fig.8 Definition of the robot posture in image space

As shown in Fig. 8, to reduce the image-processing time, three LED markers forming a non-isosceles right-angled triangle are placed on top of the robot. The orientation of the robot is defined as pointing from the right-angle LED A to LED C, which lies on the short right-angle side, and the robot's zero-radius rotation centre O in macro motion is defined as the robot position. The LED markers can easily be identified and tracked in the global image space because of their high brightness. Once the image coordinates of the LED markers are obtained, the right-angle point A is identified by:


K_i = \frac{(u_i - u_{i+1})(u_i - u_{i+2}) + (v_i - v_{i+1})(v_i - v_{i+2})}{\sqrt{(u_i - u_{i+1})^2 + (v_i - v_{i+1})^2}\,\sqrt{(u_i - u_{i+2})^2 + (v_i - v_{i+2})^2}}    (1)

where the index i samples LED A, B and C respectively (subscripts taken modulo 3), and (u_i, v_i), (u_{i+1}, v_{i+1}), (u_{i+2}, v_{i+2}) are the image coordinates of the three LED markers. K_i is the cosine of the angle at marker i, so the right-angle point is the marker whose K_i is closest to zero.
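Eq. (1) can be sketched as follows; the triangle coordinates at the bottom are illustrative, not measured LED positions.

```python
from math import hypot

def right_angle_index(pts):
    """Index of the LED at the right angle (Eq. 1).

    pts: list of three (u, v) LED image coordinates. For each vertex i,
    K_i is the cosine of the angle between the vectors from LED i to the
    other two LEDs; the right-angle vertex gives K_i closest to zero.
    """
    cosines = []
    for i in range(3):
        u, v = pts[i]
        u1, v1 = pts[(i + 1) % 3]
        u2, v2 = pts[(i + 2) % 3]
        num = (u - u1) * (u - u2) + (v - v1) * (v - v2)
        den = hypot(u - u1, v - v1) * hypot(u - u2, v - v2)
        cosines.append(num / den)
    return min(range(3), key=lambda i: abs(cosines[i]))

# non-isosceles right-angled triangle, right angle at the first marker
leds = [(0.0, 0.0), (4.0, 0.0), (0.0, 3.0)]
```

Because the triangle is non-isosceles, the two non-right-angle markers can afterwards be told apart by the lengths of their sides, which is what lets LED C (the short right-angle side) define the orientation.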


Fig. 9 Estimation of the robot position in image space

Once the right-angle point LED A is identified, the robot orientation vector can easily be obtained. To get the robot position, the geometric relationship between the robot position and the LED markers in the global image space must be built in advance. As shown in Fig. 9, the image coordinates of LED A and LED B are recorded first; the robot then rotates through a given angle about its zero-radius centre, and the current image coordinates of LED A' and LED B' are recorded. The zero-radius rotation centre O can be estimated as the intersection point of the two perpendicular bisectors of the segments AA' and BB'.

2) Image-based Robot 2D Global Visual Control

Fig. 9 Control parameters of the robot in global image space

The image-based visual control style [6] is implemented in our project. In the world coordinate frame, the velocity of the robot pose is (V, ω)^T, and the robot Jacobian mapping is expressed by:

\begin{bmatrix} V \\ \omega \end{bmatrix} = J_R \begin{bmatrix} \dot{\theta}_L \\ \dot{\theta}_R \end{bmatrix}, \qquad J_R = \begin{bmatrix} R/2 & R/2 \\ -R/L & R/L \end{bmatrix}    (2)

where \dot{\theta}_L and \dot{\theta}_R are the angular velocities of the two wheels, R is the radius of the wheels, and L is the distance between the two wheels. To simplify the problem, the camera frame is taken as the world frame; because the global visual sensor is mounted vertically to supervise the robot workspace, the image Jacobian mapping based on the pin-hole model is given by:

\begin{bmatrix} V_i \\ \omega_i \end{bmatrix} = J_{image} \begin{bmatrix} V \\ \omega \end{bmatrix}, \qquad J_{image} = \begin{bmatrix} K_V & 0 \\ 0 & K_\theta \end{bmatrix}, \qquad K_V = \frac{f}{Z_0}\sqrt{S_x^2 \cos^2\theta_i + S_y^2 \sin^2\theta_i}    (3)

where K_V and K_θ are the gain coefficients introduced by the optical visual system, f is the focal length, Z_0 represents the constant height of the camera's optical centre with respect to the robot motion plane, and S_x, S_y represent the camera's constant scale factors. Substituting (2) into (3) yields:

\begin{bmatrix} \dot{\theta}_L \\ \dot{\theta}_R \end{bmatrix} = (J_{image} J_R)^{-1} \begin{bmatrix} V_i \\ \omega_i \end{bmatrix}    (4)

with J = J_{image} J_R being the interaction matrix, which describes the relationship between the image-space pose error of the robot and the desired velocity of the robot pose.
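Numerically, Eq. (4) reduces to inverting a 2x2 matrix. The sketch below assumes a diagonal image Jacobian and a standard differential-drive robot Jacobian; the values of R, L, K_V and K_θ are illustrative placeholders, not the paper's calibrated parameters.

```python
def wheel_rates_from_image_velocity(v_image, R=0.01, L=0.06, K_V=500.0, K_theta=1.0):
    """Eq. (4): desired image-space velocity -> wheel angular velocities.

    J = J_image * J_robot maps (theta_dot_L, theta_dot_R) to the desired
    image-space velocities (V_i, omega_i); inverting the 2x2 product gives
    the wheel commands. R, L: wheel radius and wheel base [m] (assumed);
    K_V, K_theta: optical gain coefficients (assumed).
    """
    # rows of J_robot (Eq. 2) scaled by the image gains (Eq. 3)
    a, b = K_V * R / 2.0, K_V * R / 2.0        # V_i row
    c, d = -K_theta * R / L, K_theta * R / L   # omega_i row
    det = a * d - b * c
    Vi, Wi = v_image
    # closed-form 2x2 inverse applied to the desired image velocity
    return ((d * Vi - b * Wi) / det, (-c * Vi + a * Wi) / det)
```

In practice K_V depends on the robot's current image orientation θ_i (Eq. 3), so this matrix would be rebuilt at every control cycle rather than held constant.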

As shown in Fig.9, the control error vector is:


\begin{bmatrix} E_d \\ E_\theta \end{bmatrix} = \begin{bmatrix} \sqrt{(u_O - u_{goal})^2 + (v_O - v_{goal})^2} \\ \theta_g - \theta_i \end{bmatrix}    (5)

where (u_goal, v_goal) denotes the coordinates of the desired goal in image space, θ_g is the angle of the line connecting the robot centre and the goal with respect to the U-axis in image space, and θ_i is the robot orientation in image space.

The proportional visual control law is expressed by:

\begin{bmatrix} V_i \\ \omega_i \end{bmatrix} = \begin{cases} \Lambda \begin{bmatrix} E_d \cos(E_\theta) \\ E_\theta \end{bmatrix}, & \text{if } E_d \ge M \\[4pt] \begin{bmatrix} 0 \\ 0 \end{bmatrix}, & \text{if } E_d < M \end{cases}    (6)

where Λ is a constant positive diagonal gain matrix of dimension 2 x 2, and M is a given threshold determining the position error of the robot's point-to-point control.
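A minimal sketch of the point-to-point law of Eqs. (5)-(6) follows; the gains and threshold are illustrative assumptions, and the bearing-error convention (angle to goal minus robot orientation) is inferred from the E_d cos(E_θ) term.

```python
from math import atan2, cos, hypot, pi

def visual_control(robot_uv, theta_i, goal_uv, lam=(0.5, 1.0), M=5.0):
    """Proportional point-to-point visual control law (Eqs. 5-6).

    robot_uv: robot rotation centre O in image coordinates (pixels).
    theta_i:  robot orientation in image space (rad).
    goal_uv:  desired goal in image coordinates (pixels).
    Returns commanded image-space (forward, rotational) velocities;
    both drop to zero once the distance error E_d falls below M.
    """
    du, dv = goal_uv[0] - robot_uv[0], goal_uv[1] - robot_uv[1]
    E_d = hypot(du, dv)                       # distance error, Eq. (5)
    E_theta = atan2(dv, du) - theta_i         # bearing error
    E_theta = (E_theta + pi) % (2 * pi) - pi  # wrap to [-pi, pi)
    if E_d < M:
        return 0.0, 0.0
    return lam[0] * E_d * cos(E_theta), lam[1] * E_theta
```

The cos(E_θ) factor slows forward motion while the robot is badly misaligned with the goal, so the rotation term dominates first; this is a common shaping for differential-drive point-to-point control.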

When the robot arrives at the desired goal, it begins to adjust its orientation in low-speed zero-radius rotation mode until its pose reaches the desired pose.

C. Microscopic Visual Control System

The microscopic visual sensor provides micro-position information with micron resolution for the robot to accomplish the micro-handling task.

1) Search of End-effector

When the macro motion of the robot is accomplished, the end-effector of the robot gripper cannot be guaranteed to be inside the view-field of the optical microscope. Even if the end-effector is inside the view-field, it may move outside it after the motion mode switching, owing to the loss of positioning accuracy during switching. An end-effector search strategy [7] based on motion inspection is therefore designed to solve this problem. The strategy rests on the fact that, if the end-effector is inside the view-field of the microscope, its active motions can be observed in the microscopic image; the absence of motion information indicates that the end-effector is outside the microscopic view-field.

2) Auto-focusing and End-effector Depth-direction Control

To accomplish the micro-handling task with the help of microscopic visual information, a sharply focused microscopic image is necessary. In [8], the image sharpness criterion based on the Normalized Variance operator is shown to be the most suitable for optical microscope applications, so we utilize it in our auto-focusing strategy. After the ROI (Region of Interest) is selected interactively, the auto-focusing machine is adjusted along the microscopic optical axis until the evaluation function of microscopic image sharpness reaches a maximum. To optimize the search steps, a hill-climbing search strategy is utilized. Based on the auto-focusing method, depth information can also be estimated by Depth-from-Focus.

3) Target Detection

As shown in Fig. 10, the Canny edge detection operator is adopted as the pre-processing filter, and the Hough transform is implemented to extract the interesting features, such as straight lines and circles, from the resulting edge image. The position of the end-effector is defined as the midpoint of the interest points P1 and P2 (P1 and P2 being the midpoints of the end-effector's two tip lines, respectively). After detection of the target, a correlation-based image matching method is carried out for real-time tracking of the end-effector. First, two templates are extracted at the interest points P1 and P2, respectively. Then the correlation matches are carried out in the neighbourhoods of their previous positions.
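The correlation-tracking step can be sketched as follows. In a production pipeline one would typically use OpenCV (`cv2.Canny`, `cv2.HoughLinesP`, and `cv2.matchTemplate` with `TM_CCOEFF_NORMED`); this plain-Python version of normalised cross-correlation, run over a small search window around the previous position, is only illustrative, and the image data is synthetic.

```python
from math import sqrt
import random

def ncc_match(search, template):
    """Best (row, col) of `template` in `search` by normalised cross-correlation.

    Both arguments are 2-D lists of grey values. The window whose
    mean-subtracted correlation with the template is highest wins;
    a perfectly matching window scores 1.0 (Cauchy-Schwarz bound).
    """
    th, tw = len(template), len(template[0])
    tmean = sum(map(sum, template)) / (th * tw)
    t = [[template[r][c] - tmean for c in range(tw)] for r in range(th)]
    tnorm = sqrt(sum(x * x for row in t for x in row))
    best, best_pos = -2.0, (0, 0)
    for r in range(len(search) - th + 1):
        for c in range(len(search[0]) - tw + 1):
            win = [[search[r + i][c + j] for j in range(tw)] for i in range(th)]
            wmean = sum(map(sum, win)) / (th * tw)
            w = [[x - wmean for x in row] for row in win]
            wnorm = sqrt(sum(x * x for row in w for x in row))
            if wnorm == 0 or tnorm == 0:
                continue  # flat window: correlation undefined
            score = sum(w[i][j] * t[i][j]
                        for i in range(th) for j in range(tw)) / (wnorm * tnorm)
            if score > best:
                best, best_pos = score, (r, c)
    return best_pos

# synthetic image; the template is cut from a known location
random.seed(0)
img = [[random.random() for _ in range(40)] for _ in range(40)]
tpl = [row[20:26] for row in img[12:18]]
```

Restricting `search` to the neighbourhood of the previous position, as the text describes, keeps this search cheap enough for frame-rate tracking.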

(A) Detection of end-effector  (B) Detection of micro part

Fig. 10 Target detection in microscopic image

4) Microscopic Visual Control


Fig. 11 Estimation of robot orientation and optimal trajectory in microscopic image space

Unfortunately, from the two interest points P1 and P2 alone, the orientation of the robot cannot be observed. However, we can estimate the orientation and then obtain the optimal trajectory of the end-effector motion. First, the robot is given a sequence of pure forward movements, and the orientation is estimated by fitting a line to the positions of interest point P1 or P2 recorded in the sequence of images. Second, the optimal motion trajectory for the end-effector is determined as shown in Fig. 11: (1) the optimal trajectory is the line with the same slope as the orientation vector; (2) the optimal trajectory is the line passing through the target point.
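The orientation-and-trajectory estimation above can be sketched with a least-squares line fit; the recorded positions and target below are illustrative values, not experimental data.

```python
def estimate_trajectory(p1_positions, target):
    """Fit recorded P1 positions to a line Y = K*X + B (least squares),
    then shift the line through the target point: the optimal trajectory
    keeps the fitted slope K (robot orientation) and passes through the
    target (Fig. 11).
    """
    n = len(p1_positions)
    mx = sum(x for x, _ in p1_positions) / n
    my = sum(y for _, y in p1_positions) / n
    K = sum((x - mx) * (y - my) for x, y in p1_positions) / \
        sum((x - mx) ** 2 for x, _ in p1_positions)
    B = target[1] - K * target[0]  # intercept so the line hits the target
    return K, B

# P1 positions recorded over a sequence of pure forward movements
track = [(10.0, 21.0), (20.0, 41.0), (30.0, 61.0), (40.0, 81.0)]
K, B = estimate_trajectory(track, target=(100.0, 250.0))
```

A fit over several frames averages out the per-frame detection noise, which is why the text records a whole sequence of forward moves rather than a single displacement.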

The control error vector [E_x, E_y]^T is expressed by:

\begin{bmatrix} E_x \\ E_y \end{bmatrix} = \begin{bmatrix} X_T - X_E \\ (K X_E + B) - Y_E \end{bmatrix}    (7)

With Y=K X+B is the equation of optimal trajectory inmicroscopic image space. (XE,YE) is the image coordinate of


the defined end-effector position, and (X_T, Y_T) is the image coordinate of the target position.
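The error computation of Eq. (7) can be sketched as follows; the sign conventions are assumed (progress toward the target along the image X axis, plus lateral deviation from the optimal trajectory), and the coordinates are illustrative.

```python
def control_error(effector_xy, target_xy, K, B):
    """Error vector of Eq. (7) for microscopic visual control.

    E_x: remaining distance to the target along the image X axis.
    E_y: lateral deviation of the end-effector from the optimal
         trajectory Y = K*X + B (the line through the target).
    """
    X_E, Y_E = effector_xy
    X_T = target_xy[0]
    return (X_T - X_E, (K * X_E + B) - Y_E)

# illustrative values: target on the line Y = 2*X + 10
err = control_error(effector_xy=(10.0, 30.0), target_xy=(50.0, 110.0), K=2.0, B=10.0)
```

Driving E_y to zero keeps the end-effector on the estimated trajectory while E_x shrinks, so the approach to the micro part stays aligned with the robot's forward direction.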

IV. EXPERIMENTS

A. Experiment on Micro Part Gripping Handling

Fig. 12 shows selected images of the robot accomplishing a micro part-gripping task. (A) The robot is initially placed at an arbitrary position relative to the desired goal (the gripping operation area, shown in Fig. 6 and Fig. 12(A)); the robot then moves to the desired goal in macro motion mode, guided by the global visual sensor. (B) After the robot arrives at the goal, it is switched to micro motion mode through the motion mode switching unit. (C) The end-effector of the robot gripper is brought inside the view-field of the microscope by carrying out the end-effector search strategy. (D) By carrying out the auto-focusing strategy and the end-effector depth-direction control based on Depth-from-Focus, the end-effector and the micro part are imaged sharply. (E) The end-effector moves to the micro part automatically, guided by the microscopic visual sensor. (F) The robot grips the micro part successfully.

Fig. 12 Micro part gripping handling

B. Experiment on Micro Part Conveying and Assembly

Fig.13 Micro parts assembly operation

Fig. 13 shows selected images of the robot accomplishing a micro-parts assembly task. (A) Once the micro part is gripped, the robot is switched to macro motion mode through the motion mode switching unit, and moves to the assembly task area in macro motion mode, guided by the global visual sensor. (B) After the robot arrives at the goal, it is switched to micro motion mode through the motion mode switching. (C) The end-effector of the robot is brought inside the view-field of the microscope by carrying out the end-effector search strategy, and the micro parts are identified after carrying out the auto-focusing strategy and the end-effector depth-direction control based on Depth-from-Focus. (D)(E) The end-effector moves to the micro part, guided by the microscopic visual sensor. (F) Successful micro assembly.

V. CONCLUSION AND FUTURE WORK

In this paper, a novel 4 D.O.F macro/micro dual-driven miniature mobile robot based micro-handling system is presented. The robot moves with comparatively high speed in macro motion mode and positions with high precision in micro motion mode. To guide the robot in accomplishing micro-handling tasks automatically, an external visual system has been developed. The visual system includes two parts: the global visual system and the local visual system. The global visual sensor is a CCD camera mounted vertically to detect the pose of the robot. The local visual sensor provides the micro-position information for the robot to accomplish the micro-handling task. Experiments verify the validity of the proposed robot system.

Future work will focus on multi-robot cooperation for accomplishing more complicated micro-handling tasks, such as assembly of a miniaturized gear system and bio-cell microinjection.

ACKNOWLEDGMENT

The authors gratefully acknowledge the valuable contributions of the anonymous reviewers of the IEEE ROBIO 2006 conference for their helpful comments and suggestions towards improving the manuscript.

REFERENCES

[1] Stephan Fahlbusch, Sergej Fatikow, Joerg Seyfried, Axel Buerkle, "Flexible Microrobotic System MINIMAN: Design, Actuation Principle and Control," IEEE International Conference on Advanced Intelligent Mechatronics, pp. 156-161, 1999.

[2] Ohmi Fuchiwaki, Daigo Misaki and Hisayuki Aoyama, "Flexible microprocessing organized by versatile microrobots," Proc. of 7th Int. Conference on Mechatronics Technology, pp. 121-126, 2003.

[3] Ion Pappas, Alain Codourey, "Visual control of a microrobot operating under a microscope," IEEE International Conference on Intelligent Robots and Systems, pp. 993-1000, 1996.

[4] J. Brufau, M. Puig-Vidal, "MICRON: Small Autonomous Robot for Cell Manipulation Applications," IEEE International Conference on Robotics and Automation, pp. 856-861, 2005.

[5] J.-M. Breguet and R. Clavel, "Stick and slip actuators: design, control, performances and applications," IEEE International Symposium on Micromechatronics and Human Science, pp. 89-95, 1998.

[6] S. Hutchinson, G. Hager and P. Corke, "A tutorial on visual servo control," IEEE Trans. on Robotics and Automation, vol. 12, pp. 651-670, 1996.

[7] Yu Song, Mantian Li and Lining Sun, "Global visual servoing of miniature mobile robot inside a micro-assembly station," IEEE International Conference on Mechatronics and Automation, pp. 1586-1591, 2005.

[8] Yu Sun, Stefan Duthaler, Bradley J. Nelson, "Autofocusing algorithm selection in computer microscopy," IEEE International Conference on Intelligent Robots and Systems, pp. 419-425, 2005.
