
Augmented Reality Motion-Based Robotics Off-line Programming

Diana Araque, Ricardo Díaz, Byron Pérez-Gutiérrez
Davinci Research Group, Nueva Granada Mil. University
(Universidad Militar Nueva Granada, School of Engineering, Mechatronics Eng., VR Center Lab.; Carrera 11 # 101 - 80, Bogotá D.C., Colombia)

Alvaro Joffre Uribe
Mechanical Project Design Department
Campinas State University

ABSTRACT

Augmented reality allows simulating, designing, projecting and validating robotic workcells in industrial environments that are not equipped with real manipulators. This paper presents the study and implementation of a gesture-programmed robotic workcell based on augmented reality. The result is an interactive environment, built on a computer-based framework, in which the user can program an industrial robot through gestures.

Index Terms: H.5.1 [Information Interfaces and Presentation]: Multimedia Information Systems—Artificial, augmented, and virtual realities; H.5.2 [Information Interfaces and Presentation]: User Interfaces—Input devices and strategies

1 INTRODUCTION

Through Augmented Reality (AR), virtual devices can interact with real workspaces and the objects within them. This experience offers a greater sense of realism, as the interactions take place in the area where the device will actually work. In industrial robotics, various commercial and noncommercial virtual tools allow designing and simulating workcells according to the desired tasks. Although this process involves only computer-generated models based on real ones, once a scenario is augmented the concern for realism shifts to the virtual object that enhances the environment and its tasks.

AR allows developing applications for training, teaching and designing robotic workcells intended for academic, research or industry use, with or without the physical equipment. The most common activities when working with robotics involve modeling, kinematics analysis, task planning and dynamics. While industrial solutions such as RobotStudio from ABB (http://www.abb.com) or Delmia from Dassault Systèmes (http://www.delmia.com) let the user design, program, improve and evaluate robotic workcells, researchers have been developing custom applications to solve specific problems. Solutions for training [1] and education [4] have found a powerful partner in VR, as it complements their capabilities for task planning, teaching [7], and manipulation of hazardous materials [3].

Most developments use common HIDs such as the mouse, keyboard, joysticks or touch screens, and tend to maximize interaction by exploiting the dexterity and ergonomics of the human anatomy, more recently through haptic devices, motion image processing or gesture-interpreting devices. Driven by the momentum of this trend, computers, handhelds, mobile phones and videogames are changing how daily activities are performed. This has also been reflected in robotics, where devices based on accelerometers and gyroscopes are being used to transform motion or gestures into information for programming mobile robots, assisting exoskeletal orthoses in rehabilitation or commanding serial robots [5].

Considering these AR characteristics and an environment suitable for including a robotic device, this paper presents an approach to using AR as an academic tool that helps develop the understanding and design skills needed to use robots in a local or remote environment. The distinction from previous work is the use of arm and hand gestures for programming and controlling the robot, making the interaction between user and device comfortable and intuitive by taking advantage of the Wiimote's communication capabilities and embedded sensors.

2 MOTION-BASED ARCHITECTURE

As the objective of this work is to implement a scalable solution for use in various environments, a standard industrial robot with 5 Degrees of Freedom (DOF) is used as the test device. The main structure of the architecture is a computer framework for gestural interaction, in which the user operates a virtual robot through a physical HID. The proposed AR robotics system is composed of the elements shown in Fig. 1. The overlay of the virtual object on the real world is executed in the mixed environment loop, where the computer renders the 3D industrial robot in the real facility using the ARToolkit library ported to C# as the NyARToolkit library [6].

[Figure 1 diagram elements: gestural interaction device; IP camera; computer performing forward/inverse kinematics calculation, the real and virtual mixed environment, augmented reality visual feedback, and the robot program for offline programming; virtual industrial robot; real industrial robot; remote robotics workcell; LAN/Internet links.]

Figure 1: AR architecture.

2.1 Serial Robotics

An arm-type robot can be modeled and analyzed as a serially linked mechanism with any number of DOF. These devices can be programmed through rotational or translational inputs by solving the kinematic problem. Given known rotations for each joint, forward kinematics analysis can be used to calculate the position of each link in space; this is accomplished using homogeneous transformation matrices along with the Denavit-Hartenberg convention [2].
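To make the Denavit-Hartenberg step concrete, the following C# sketch chains the standard DH link transforms to obtain the end-effector pose from the joint angles. It is a minimal illustration, not code from the paper's framework: the 5-DOF parameter table and the joint values are hypothetical placeholders.

```csharp
using System;

// Minimal forward-kinematics sketch using the Denavit-Hartenberg convention.
// The DH table below is an illustrative placeholder, not the paper's robot.
static class DhForwardKinematics
{
    // Homogeneous transform from link i-1 to link i for DH parameters
    // (theta, d, a, alpha), angles in radians.
    static double[,] DhTransform(double theta, double d, double a, double alpha)
    {
        double ct = Math.Cos(theta), st = Math.Sin(theta);
        double ca = Math.Cos(alpha), sa = Math.Sin(alpha);
        return new double[,]
        {
            { ct, -st * ca,  st * sa, a * ct },
            { st,  ct * ca, -ct * sa, a * st },
            {  0,       sa,       ca,      d },
            {  0,        0,        0,      1 }
        };
    }

    // Plain 4x4 matrix product.
    static double[,] Multiply(double[,] A, double[,] B)
    {
        var C = new double[4, 4];
        for (int i = 0; i < 4; i++)
            for (int j = 0; j < 4; j++)
                for (int k = 0; k < 4; k++)
                    C[i, j] += A[i, k] * B[k, j];
        return C;
    }

    static void Main()
    {
        // Illustrative DH table (theta offset, d, a, alpha) for a 5-DOF arm.
        double[,] dh =
        {
            { 0.0, 0.30, 0.00, Math.PI / 2 },
            { 0.0, 0.00, 0.25, 0.0 },
            { 0.0, 0.00, 0.20, 0.0 },
            { 0.0, 0.00, 0.00, Math.PI / 2 },
            { 0.0, 0.10, 0.00, 0.0 }
        };
        double[] joints = { 0.1, -0.3, 0.5, 0.0, 0.2 }; // commanded joint angles

        // Chain the per-joint transforms: T = A1 * A2 * ... * A5.
        double[,] T = { { 1, 0, 0, 0 }, { 0, 1, 0, 0 }, { 0, 0, 1, 0 }, { 0, 0, 0, 1 } };
        for (int i = 0; i < joints.Length; i++)
            T = Multiply(T, DhTransform(dh[i, 0] + joints[i], dh[i, 1], dh[i, 2], dh[i, 3]));

        Console.WriteLine($"End-effector position: ({T[0, 3]:F3}, {T[1, 3]:F3}, {T[2, 3]:F3})");
    }
}
```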


2.2 Motion Interaction

Each joint position is calculated from input angles given by the Wiimote's motion when performing yaw, pitch and roll rotations. Interaction between the user and the AR environment occurs by moving the Wiimote to generate the subsequent robot trajectories. To create a suitable motion path, the user moves the Wiimote by hand to change what the accelerometers sense; only the X-axis acceleration is used, to simplify the gesture language. Fig. 2 shows the set of gestures, defined in six different ways, which are stored in a database for later comparison with the user input. When a gesture is detected, the corresponding kinematic values are rendered on the virtual robot through AR.

Figure 2: Defined Wiimote gestures: left, right, left-right-left, right-left-right, 2x left, 2x right.
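The paper does not detail its gesture matcher, so the C# fragment below is only a hedged illustration of the idea described above: the X-axis acceleration stream is segmented by a threshold into a sequence of left/right strokes and compared against the stored patterns of Fig. 2. The threshold value, the gesture dictionary and the sample data are assumptions, not the paper's implementation.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Illustrative matcher for the six Fig. 2 gestures; not the paper's actual code.
static class GestureMatcher
{
    // Hypothetical gesture database: sign pattern of X-axis acceleration strokes.
    static readonly Dictionary<string, int[]> Gestures = new Dictionary<string, int[]>
    {
        { "left",             new[] { -1 } },
        { "right",            new[] { +1 } },
        { "left-right-left",  new[] { -1, +1, -1 } },
        { "right-left-right", new[] { +1, -1, +1 } },
        { "2x left",          new[] { -1, -1 } },
        { "2x right",         new[] { +1, +1 } },
    };

    // Segment raw X-axis samples into stroke signs, ignoring values below an
    // assumed noise threshold (in g). An idle gap or a sign change ends a stroke.
    static int[] Segment(IEnumerable<double> accelX, double threshold = 0.5)
    {
        var sequence = new List<int>();
        int last = 0; // sign of the stroke currently in progress (0 = idle)
        foreach (double a in accelX)
        {
            if (Math.Abs(a) < threshold) { last = 0; continue; } // idle gap
            int sign = Math.Sign(a);
            if (sign != last) { sequence.Add(sign); last = sign; }
        }
        return sequence.ToArray();
    }

    // Return the gesture whose stored pattern equals the observed sequence, or null.
    static string Match(double[] accelX)
    {
        int[] observed = Segment(accelX);
        return Gestures.FirstOrDefault(g => g.Value.SequenceEqual(observed)).Key;
    }

    static void Main()
    {
        double[] samples = { 0.0, 0.9, 0.1, -0.8, -0.2, 0.7, 0.0 }; // fake X accel
        Console.WriteLine(Match(samples) ?? "no gesture"); // prints "right-left-right"
    }
}
```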

2.3 Augmented Reality

Having defined the components needed to implement the AR solution, Fig. 3 shows the flow diagram of how the computer framework and the developed application should behave upon user input through the HID and its six available DOF. One consideration to take into account is that the system assumes full visibility of the marker, as partial, obstructed or blurred visualization can prevent the virtual objects from being rendered and cause faulty interaction.

[Figure 3 flow: start; search for Wiimote and video stream device; devices found?; find marker in image; markers visible?; render robot over marker; move joints according to input; continue?; end.]

Figure 3: AR diagram.
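A hedged sketch of the control loop implied by Fig. 3 is given below. The delegates stand in for the Wiimote driver, NyARToolkit marker tracking and DirectX rendering that the paper actually uses; they are placeholders rather than real library APIs, and the joint-update mapping is illustrative only.

```csharp
using System;

// Hypothetical control loop following the Fig. 3 flow, with placeholder delegates.
static class ArLoop
{
    static void Run(
        Func<bool> devicesFound,                 // "Search wiimote and stream video device"
        Func<byte[]> grabFrame,                  // capture one camera frame
        Func<byte[], double[,]> findMarker,      // "Find marker in image" (null if not visible)
        Func<double> readAccelX,                 // Wiimote X-axis acceleration
        Action<double[,], double[]> renderRobot, // "Render robot over marker"
        Func<bool> continueRequested)            // "Continue?"
    {
        if (!devicesFound())                     // "Devices found?"
            return;

        var joints = new double[5];              // joint angles of the 5-DOF robot

        while (continueRequested())
        {
            byte[] frame = grabFrame();
            double[,] pose = findMarker(frame);  // "Markers visible?"
            if (pose == null)
                continue;                        // marker lost: skip this frame

            // "Move joints according to input": illustrative mapping only
            joints[0] += 0.05 * readAccelX();

            renderRobot(pose, joints);
        }
    }

    static void Main()
    {
        int frames = 0;
        // Dummy stand-ins so the sketch runs; real code would wire up the camera,
        // the NyARToolkit tracker, the Wiimote driver and the DirectX renderer.
        Run(devicesFound: () => true,
            grabFrame: () => new byte[640 * 480],
            findMarker: _ => new double[4, 4],
            readAccelX: () => 0.1,
            renderRobot: (pose, joints) => Console.WriteLine($"joint0 = {joints[0]:F2}"),
            continueRequested: () => frames++ < 3);
    }
}
```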

3 RESULTS

A computational prototype using Microsoft Visual C#, NyARToolkit and DirectX was implemented. The Graphical User Interface, shown in Fig. 4, has three main components: 1) video window, 2) status information about the Wiimote connection and 3) Wiimote accelerometers' graph.

Figure 4: Implemented GUI screen capture.

Through the development and implementation of a gesture-programmable robotic workcell using Augmented Reality, it has been possible to propose a scalable architecture that can be used with various scenarios and industrial robots. Moving the robot through gestures with a Wiimote controller gives the user an intuitive and interactive way to manipulate the device and increases programming comfort, since expressing motion through the arms is natural for most people.

4 CONCLUSION

The results from merging a virtual robot into real environments allow not only the validation of offline programming practices but also the evaluation of whether a given robot is up to the task in a given scenario or whether more robots are required. The foreseen impact of AR technologies lies in their capability to improve management decisions when choosing or simulating manipulators to fit particular environments.

ACKNOWLEDGEMENTS

The authors wish to thank the Integrated Automation and Robotics Laboratory of the Mechanical Engineering Faculty of the Campinas State University.

REFERENCES

[1] M. Bal, H. F. Manesh, and M. Hashemipour. Virtual-reality-based information requirements analysis tool for CIM system implementation: a case study in die-casting industry. Int. J. Comput. Integr. Manuf., 21:231–244, April 2008.

[2] J. Denavit and R. Hartenberg. A kinematic notation for lower-pair mechanisms based on matrices. ASME Journal of Applied Mechanics, pages 215–221, 1955.

[3] M. Fischer and G. Hirzinger. Fast planning of precision grasps for 3D objects. In Proceedings of the 1997 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS '97), 1997.

[4] J. C. Freire Jr., J. V. de Lima, R. A. Neves, and G. J. de Sena. A multimedia environment for supporting the teaching of robotics systems. In Proceedings of the 24th International Conference on Distributed Computing Systems Workshops - W7: EC (ICDCSW '04) - Volume 7, pages 280–285, Washington, DC, USA, 2004. IEEE Computer Society.

[5] J. C. Lee. Hacking the Nintendo Wii remote. IEEE Pervasive Computing, 7:39–45, 2008.

[6] Y. Sugiura, D. Sakamoto, A. Withana, M. Inami, and T. Igarashi. Cooking with robots: designing a household system working in open environments. In Proceedings of the 28th International Conference on Human Factors in Computing Systems, CHI '10, pages 2427–2430, New York, NY, USA, 2010. ACM.

[7] P. Zhang, K. Tanaka, E. Shimizu, and M. Ito. A teleoperating system for underwater manipulator to practice a time limit task. In 2003 IEEE International Symposium on Intelligent Control, 2003.
