

International Journal of Applied Engineering Research ISSN 0973-4562 Volume 11, Number 23 (2016) pp. 11299-11304 © Research India Publications. http://www.ripublication.com


Multi User Myographic Characterization for Robotic Arm Manipulation

Paula Useche Murillo Mechatronics Student, Department of Mechatronics Engineering,

Militar Nueva Granada University, Bogotá, Colombia.

Robinson Jimenez Moreno Assistant Professor, Technology on Electronics and Communication,

Militar Nueva Granada University, Bogotá, Colombia.

Mauricio Mauledeox M. Assistant Professor, Department of Mechatronics Engineering,

Militar Nueva Granada University, Bogotá, Colombia.

Abstract

This paper presents a test scenario that seeks to determine the level of precision achievable when commanding a 4-degree-of-freedom robotic arm with electromyographic signals. The signals are acquired from a group of users of different genders and ages during a controlled exercise of picking up and releasing an object, and are captured with the commercial Myo Armband wearable. The kinematic analysis of the manipulator is presented so that it can later be incorporated into the arm's control. For the set of tests, statistical results on the degree of accuracy and the execution time are reported, in order to draw conclusions about the versatility of the proposed remote robotic control mode.

Keywords: electromyographic signals, Myo armband, myographic control, robotic control.

INTRODUCTION

Reference [1] presents an analysis of the characteristics of the electromyographic signals extracted from the Myo sensor, with the objective of providing statistical information about the movements recognized by this sensor and opening the door to future research. Among the various investigations that have taken place, those in the field of medicine are especially important for rehabilitation systems; a specific case is presented in [2], where the analysis of the Myo signals with respect to the muscle movements of the arm and hand makes it possible to establish satisfaction metrics for a rehabilitation therapy of these muscles. At the same time, the signals provided by the Myo have been employed for the characterization of non-verbal language; for instance, [3] presents an application of this kind oriented to supporting communication between deaf people and those who can hear.

The characterization of human movements provides a control tool for different systems, a specific case being robot control for different applications. In this respect, [4] shows the acquisition of a user's arm movements through the Microsoft Kinect sensor to achieve interaction between that user and a robotic arm, in the development of collaborative robots for human-machine applications. In [5], the arm movements of a person are replicated in order to command a humanoid robot, in robotic imitation applications. These schemes for characterizing human movements can lead to real teleoperation applications through the Myo, as shown in [6]. In [7], a comparison between sensors such as the Kinect and the Myo for characterizing human arm movements is presented, so control systems in the field of robotics are already beginning to contemplate applications based on the electromyographic signals of the Myo. For example, [8] presents the development of algorithms that associate hand postures to control a robotic wheelchair through the Myo signals, and [9] exposes an application case for the navigation control of an omnidirectional mobile robot.

From the state of the art presented, and based on the researchers' previous work on tracking human movement for robot control, this article intends to validate the Myo's suitability for manipulating a robot arm in accuracy applications, where the particular user performing the task is indifferent, but where it is of interest to characterize the versatility of the application and the execution and manipulation times achieved through this sensor.

METHODS AND MATERIALS

Signal capture by Myo Armband

The Myo Armband is a wearable that uses biosensors to measure the electromyographic signals produced by the forearm muscles, recognizes hand motions from them, and sends the processed signals over Bluetooth, as control words, to other devices [10][11]; it also incorporates an accelerometer, a gyroscope and a magnetometer [12]. It requires direct skin contact, so it cannot be used over clothes [13][14]. The device is a portable, easy-to-acquire sensor, essentially plug-and-play in use, and is powered by a rechargeable lithium battery. As shown in Figure 1, the device has eight expandable casing segments connected by stretchable material that allows the Myo Armband to be adjusted to the dimensions of the user's arm, a USB charging port, the logo LED, which shows the sync state of the Myo (pulsing when the Myo is not synced and solid when it is synced and located on the arm), and a status LED, which shows the Myo state according to a color code [15].

Figure 1: Myo Armband components [15]

According to [20], the Myo provides two kinds of data to an application: spatial data and gestural data. The spatial data describes the orientation and movement of the user's arm. The orientation defines the direction in which the Myo points; it is delivered as a quaternion and can be converted into a rotation matrix or Euler angles. The movement, or acceleration vector, represents the Myo's acceleration at any given time and is delivered as a three-dimensional vector. The gestural data, on the other hand, indicates the user's hand motion; it is delivered as one of several preset poses, each of which represents a specific hand configuration. The Myo also reports on which arm it is being worn, whether it is placed on the arm, and which way it is oriented (toward the wrist or the elbow), and it can be made to vibrate for haptic and sound feedback. The Myo electrodes are located over a specific muscle group, as shown in Figure 2, so when the EMG signals are processed and evaluated, the movement is determined by the sensor with the largest measurement value. For this reason, the movements detected by the device are those that depend on a specific muscle group.

Figure 2: Location of the electrodes [21]
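As an illustration of the spatial data described above, the following is a self-contained C++ sketch of the standard quaternion-to-Euler-angles conversion; it is not taken from the Myo SDK, and all type and function names here are our own.

```cpp
// Minimal sketch: converting an orientation quaternion, such as the one the
// Myo reports, into roll/pitch/yaw Euler angles. Standalone C++17, no SDK.
#include <algorithm>
#include <cmath>
#include <cstdio>

struct Quaternion { double w, x, y, z; };
struct EulerAngles { double roll, pitch, yaw; };  // radians

EulerAngles quaternionToEuler(const Quaternion& q) {
    EulerAngles e;
    // Roll: rotation about the X axis.
    e.roll = std::atan2(2.0 * (q.w * q.x + q.y * q.z),
                        1.0 - 2.0 * (q.x * q.x + q.y * q.y));
    // Pitch: rotation about the Y axis; clamp to avoid NaN from rounding.
    double s = std::clamp(2.0 * (q.w * q.y - q.z * q.x), -1.0, 1.0);
    e.pitch = std::asin(s);
    // Yaw: rotation about the Z axis.
    e.yaw = std::atan2(2.0 * (q.w * q.z + q.x * q.y),
                       1.0 - 2.0 * (q.y * q.y + q.z * q.z));
    return e;
}

int main() {
    Quaternion q{0.7071, 0.0, 0.7071, 0.0};  // example: 90 degrees of pitch
    EulerAngles e = quaternionToEuler(q);
    std::printf("roll=%.3f pitch=%.3f yaw=%.3f rad\n", e.roll, e.pitch, e.yaw);
}
```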

However, the exact location of the muscles is different for each person, so a calibration process is necessary in order to improve gesture recognition.

For the developed application, it was necessary to connect the Myo to an Arduino to obtain the electromyographic signals and turn them into control movements for the manipulator. Some of the gestures captured by the Myo are represented in Figure 3: Rest, Fist, Wave In, Wave Out, Finger Spread and Double Tap.

Figure 3: Control gestures detected by Myo Armband [17]

The "MyoController" library was downloaded from the Myo Market in order to connect both devices. The library simplifies reading, obtaining and classifying the gestures recognized by the Myo, and creates a communication channel with the Arduino. According to the gesture recognized by the Myo, the program enters one of the movement cases and executes an action; Figure 4 illustrates the action generated to command each servomechanism of the robotic arm by means of a pulse-width-modulated (PWM) signal.

Figure 4: Upper, increase movement. Lower, decrease movement.
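As a hedged sketch of the control loop just described, the following Arduino program maps two recognized gestures to the increase/decrease movements of Figure 4 for a single joint. The `Pose` enum, the `readMyoPose()` helper, the pin number and the step size are illustrative assumptions, not the paper's values; the actual MyoController API may differ.

```cpp
// Sketch of the gesture-to-servo mapping, written as an Arduino program.
// Pose and readMyoPose() stand in for whatever the MyoController library
// actually provides; pin 9 and STEP are hypothetical.
#include <Servo.h>

enum Pose { REST, FIST, WAVE_IN, WAVE_OUT, FINGER_SPREAD, DOUBLE_TAP };

Servo joint;             // one joint of the manipulator
int angle = 90;          // current joint angle in degrees
const int STEP = 2;      // increment per recognized gesture

Pose readMyoPose() {
  // In the real application this would query the MyoController library over
  // the Myo's Bluetooth link; here it is left as a stub.
  return REST;
}

void setup() {
  joint.attach(9);       // assumed PWM pin
}

void loop() {
  switch (readMyoPose()) {
    case WAVE_OUT:       // increase movement (upper case in Figure 4)
      angle = min(angle + STEP, 180);
      break;
    case WAVE_IN:        // decrease movement (lower case in Figure 4)
      angle = max(angle - STEP, 0);
      break;
    default:             // Rest and the other poses leave the joint in place
      break;
  }
  joint.write(angle);    // servo position commanded as a PWM pulse
  delay(20);             // roughly one servo PWM period
}
```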

The robotic manipulator

Figure 5 shows the manipulator and the Arduino used in the application, along with the location of each degree of freedom.

Figure 5: Manipulator structure and degrees of freedom

Figure 6 shows the schematic representation of the manipulator, along with the coordinate system of each joint, from which the Denavit-Hartenberg parameter table shown in Table 1 was obtained.


Figure 6: Schematic manipulator representation

Table 1: Table of Denavit-Hartenberg parameters
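Under the standard Denavit-Hartenberg convention (an assumption here, since the table itself is given only as Table 1), each row $(\theta_i, d_i, a_i, \alpha_i)$ generates one homogeneous link transform, and the product of the four transforms gives the pose of the end effector:

$$
{}^{i-1}A_i=\begin{bmatrix}
\cos\theta_i & -\sin\theta_i\cos\alpha_i & \sin\theta_i\sin\alpha_i & a_i\cos\theta_i\\
\sin\theta_i & \cos\theta_i\cos\alpha_i & -\cos\theta_i\sin\alpha_i & a_i\sin\theta_i\\
0 & \sin\alpha_i & \cos\alpha_i & d_i\\
0 & 0 & 0 & 1
\end{bmatrix},
\qquad
{}^{0}T_4=A_1A_2A_3A_4 .
$$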

With Table 1, the transformation matrix that allows knowing the location of the final effector from the angles of each joint can be obtained.

Inverse Kinematics

Figure 7 shows the top view of the schematic manipulator representation, while Figure 8 shows the front view with respect to an auxiliary axis X' that rotates together with joint 1 (A1).

Figure 7: Manipulator top view (schematic)

Figure 8: Manipulator frontal view (schematic)

The angle of joint 1 and the X' component of the final point $P = (P_x, P_y, P_z)$ are obtained from the top view, as shown in the following equations:

$$\theta_1 = \operatorname{atan2}(P_y,\, P_x) \qquad (1)$$

$$P_{x'} = \sqrt{P_x^{2} + P_y^{2}} \qquad (2)$$

The angles of joints 2 and 3 are obtained from the front view, by finding the length d and the angles alpha and beta as shown below; here $L_1$ denotes the height of joint 2 above the base, and $L_2$ and $L_3$ the lengths of links 2 and 3. First, obtain d, the distance from joint 2 to the final point P:

$$d = \sqrt{P_{x'}^{2} + (P_z - L_1)^{2}} \qquad (3)$$

Then obtain the joint 3 angle from the law of cosines:

$$\cos\theta_3 = \frac{d^{2} - L_2^{2} - L_3^{2}}{2\,L_2 L_3} \qquad (4)$$

$$\sin\theta_3 = \pm\sqrt{1 - \cos^{2}\theta_3} \qquad (5)$$

$$\theta_3 = \operatorname{atan2}(\sin\theta_3,\, \cos\theta_3) \qquad (6)$$

The angles alpha and beta are calculated in order to obtain the joint 2 angle, given that the angle of the second joint (theta 2) can take two different values depending on whether the manipulator grasps with the elbow up (dotted in Figure 8) or the elbow down:

$$\alpha = \operatorname{atan2}(P_z - L_1,\, P_{x'}) \qquad (7)$$

$$\beta = \operatorname{atan2}(L_3\sin\theta_3,\, L_2 + L_3\cos\theta_3) \qquad (8)$$

$$\theta_2 = \alpha \pm \beta \qquad (9)$$

From the equations obtained above, it is possible to find the angles of each joint needed to reach the desired final position P.
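A minimal C++ sketch of equations (1) through (9) follows; the link lengths $L_1$, $L_2$, $L_3$ are illustrative placeholder values, not the real dimensions of the manipulator, and the elbow configuration is resolved through the sign of $\sin\theta_3$, selected with a flag.

```cpp
// Sketch of the inverse kinematics in equations (1)-(9). Link lengths are
// illustrative placeholders, not the manipulator's real dimensions.
#include <cmath>
#include <cstdio>

struct JointAngles { double t1, t2, t3; };  // radians

// L1: height of joint 2 above the base; L2, L3: link lengths (assumed values).
const double L1 = 0.10, L2 = 0.15, L3 = 0.12;

JointAngles inverseKinematics(double px, double py, double pz, bool elbowUp) {
    JointAngles j;
    j.t1 = std::atan2(py, px);                              // (1)
    double pxp = std::sqrt(px * px + py * py);              // (2) X' component
    double d2 = pxp * pxp + (pz - L1) * (pz - L1);          // (3) d squared
    double c3 = (d2 - L2 * L2 - L3 * L3) / (2 * L2 * L3);   // (4)
    if (c3 > 1.0) c3 = 1.0;                                 // clamp unreachable
    if (c3 < -1.0) c3 = -1.0;                               // targets
    double s3 = std::sqrt(1.0 - c3 * c3);                   // (5), + branch
    if (elbowUp) s3 = -s3;                                  // - branch: elbow up
    j.t3 = std::atan2(s3, c3);                              // (6)
    double alpha = std::atan2(pz - L1, pxp);                // (7)
    double beta = std::atan2(L3 * s3, L2 + L3 * c3);        // (8)
    j.t2 = alpha - beta;                                    // (9), one branch
    return j;
}

int main() {
    JointAngles j = inverseKinematics(0.15, 0.05, 0.12, true);
    std::printf("t1=%.3f t2=%.3f t3=%.3f rad\n", j.t1, j.t2, j.t3);
}
```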

ANALYSIS AND RESULTS

To determine the degree of precision of the application, a test was performed that consists of catching a small circular object and moving it to a certain location, using only the control gestures defined in Table 2 to command the manipulator.


Table 2: Control gestures

In order to achieve better results, the Myo was calibrated for each subject, and one or two preliminary trials were performed to let the user get accustomed to the control gestures, the delay times of the manipulator and the rotation speed of each motor. During the tests, control guidelines and support were given to the users to help them plan the movements needed to reach the object and move it to the final point, and to remind them which gestures produce each specific movement. Figure 9 shows the workspace, where the initial position of the object to move (starting point) and the desired final point are marked; concentric circles were drawn around the final point to determine the degree of error.

Figure 9: Workspace

The heights of the two locations are different, in such a way that the user has to manipulate the degrees of freedom (DOF) of the robot in order to perform two precision tasks: gripping the object and placing it at its final location. However, the circular rubber piece was set up in such a way that just shrinking the manipulator ("Wave In" in Control 1) allowed the user to achieve an effective grip, whereas the final point was located at a lower height so that the user had to manipulate the gripper link independently to achieve the objective. For the present paper, the precision was determined from the final location of the object. Each person performed the same test five times, and in each test the distance with respect to the final point and the execution time were registered. The data from a total of nine different people was collected and organized in Table 3.

Table 3: Results of precision tests

The time and distance of the five tests were averaged for each subject, giving the values in the lower part of Table 3. Then the times and distances of all subjects over all tests were averaged, producing the results in Table 4.
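A self-contained sketch of this kind of aggregation over hypothetical per-trial times (not the measured data of Table 3) is shown below; it also illustrates why a few long trials pull the mean above the median, a point discussed next.

```cpp
// Sketch of the mean/median comparison, using hypothetical trial times in
// seconds (NOT the paper's measured data).
#include <algorithm>
#include <cstdio>
#include <vector>

int main() {
    std::vector<double> times = {50, 90, 100, 110, 120, 130, 180, 270, 290};
    std::sort(times.begin(), times.end());
    double mean = 0;
    for (double t : times) mean += t;
    mean /= times.size();
    double median = times[times.size() / 2];  // odd-sized sample
    // The two long trials (270 s, 290 s) pull the mean above the median.
    std::printf("mean=%.1f s, median=%.1f s\n", mean, median);
}
```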

Table 4: Statistical results

From the results in Table 4 it is possible to state that the average test time is 2 min 10 s, with an error of approximately 2 cm with respect to the final point. The maximum time obtained during the tests was almost 5 min and the minimum was 45 s, with a maximum position error of 5 cm and a minimum of 0.5 cm. The median is close to 2 min and exactly 2 cm, so it is possible to say that the obtained averages are representative of the data collected, inasmuch as they are close to the values that divide the data exactly in half. In the case of time, the average exceeds 2 min because the times above the median are spread farther from it than the lower times; as can be seen in Table 3, there were more times around 3 min than below one minute. Figure 10 shows the distance data of Table 3 organized in a bar chart, while Figure 11 shows the time data of Table 3.


Figure 10: Position error

Figure 11: Execution time

Comparing both graphs, it can be observed that the largest position error was obtained during the first test, together with a long execution time compared to tests 2, 4 and 5, whereas the third test produced the best precision results but the longest execution time.

In the case of the first test, the long execution time was mainly due to gesture recognition difficulties, where confusion between the "Double Tap" gesture and other gestures caused the greatest manipulation problems: a control change could be triggered accidentally, without the user's consent, altering the movements the user wanted to make and changing a position that had just been meticulously adjusted. Conversely, when trying to trigger the control change with the "Double Tap" gesture, the Myo Armband occasionally recognized it as another movement, making the clamp open, the manipulator rotate or the gripper link move independently, which caused the object being moved (an earphone rubber) to fall, or wasted time re-accommodating the manipulator. In other cases, the execution time increased owing to the difficulty of grabbing the rubber, since the gripper had to reach a specific position in order to grab it without dropping it and without grabbing the box in which it sat. The user also had to consider the distance between the open-gripper position and the closing point, since the closure takes effect a few centimeters beyond the open-gripper location.

By the third test, the user had acquired some practice and confidence with the movements, and had already planned the movements needed for the trajectory, so it was only necessary to repeat what had been done before. After becoming accustomed to the Myo, the user began to focus on bringing the object close to the final point, which increased the execution time, and made fewer unnecessary movements because confusion with the control gestures decreased. In the subsequent tests, the user presented muscular fatigue from all the tests performed previously, so executing the control movements for the manipulator became more exhausting, making the user less dedicated to achieving a minimum position error and more inclined to finish quickly.

On the other hand, the precision was affected by factors beyond the user, such as the false detection of the "Finger Spread" gesture in Control 1, which made the gripper open sooner than desired. In other cases, while changing from one control to another, the manipulator moved in unwanted ways because the Myo detected a gesture different from "Double Tap" before detecting the control change. When performing the test with different users, it was observed that for some of them the Myo detected the gestures more easily than for others, making the test simpler, as happened with subjects 1, 3 and 7, who had the lowest average precision error. For other subjects, the recognition was not suitable even after calibrating the device several times, which introduced delays when changing a movement, changing the control type or trying to perform short movements.

The reaction time of the manipulator also affected the tests: a movement would be detected but take a while to execute, making the user believe that the gesture had not been detected and producing a long displacement before it could be stopped. To make a short displacement, the gesture had to be recognized over a short time span, but when the gesture was held until the robot moved, the displacement was too long to reach a nearby desired position. For other users, the recognition was suitable and there was almost no confusion between control gestures; however, they tended to need more time to grab the rubber because of the complexity of accommodating the gripper, either because they moved it independently at unfavorable angles or because they lowered the clamp too much and ended up holding the box. The rotation speed used in the program forced users to hold the control gesture for only a short period of time to keep the robot from moving too far, so short movements became complicated to perform. Finally, the motor of the second joint (A2 in Figure 7) did not have enough torque to lift the manipulator after it crouched to pick up or release the rubber, so the "Manipulator Rises" movement in Control 1 did not work properly and it was necessary to move the robotic arm manually so that the motor could generate the movement from a more favorable angle.

CONCLUSIONS

Suitable gesture recognition reduces the execution time and the precision error, because it allows a more efficient control of the system. Knowing the behavior of the system and being conscious of the movements needed to achieve the objective allows a better and faster test execution, since the remaining time can be used to adjust the gripper so as to leave the rubber at the final point, while the other movements become repetitive.


Knowing the delay times, and knowing when the system has recognized a control gesture, improves test performance, inasmuch as it lets the user know when it is necessary to hold a gesture for a long time and when it has to be repeated to be recognized. Taking breaks between tests and relaxing the arm is important to reduce muscle fatigue, and allows the user to perform the control gestures with the same intensity as at first, ensuring that the electromyographic signals remain similar across all tests. The recognition error that generated the most delays in the tests involved the "Double Tap" gesture in charge of the control changes: the user would adjust the manipulator to the desired position, and when the control-change gesture was not suitably recognized, the manipulator moved or dropped the rubber, undoing the effort the user had made up to that moment. Reducing the rotation speed of the manipulator can facilitate the control of precision movements; however, it requires more torque from the motors to produce small displacements despite the weight of the structure, and it reduces the displacement speed in large movements such as translating the object from the start zone to the final zone. Using higher-torque motors, so that the manipulator is able to rise without external help, would facilitate the robot control and the execution of more precise movements. Reaching a high precision level with the presented system is possible, but it requires suitable gesture recognition, a fast user reaction time, knowing in advance the movements to be performed, and not stressing the muscles more than necessary in order to prevent muscle fatigue.

ACKNOWLEDGEMENTS

The authors thank the Universidad Militar Nueva Granada for its support in the development of this work (project ING1830).

REFERENCES

[1] Z. Arief, I. A. Sulistijono and R. A. Ardiansyah, "Comparison of five time series EMG features extractions using Myo Armband," 2015 International Electronics Symposium (IES), Surabaya, 2015, pp. 11-14. doi: 10.1109/ELECSYM.2015.7380805.

[2] M. Sathiyanarayanan and S. Rajan, "MYO Armband for physiotherapy healthcare: A case study using gesture recognition application," 2016 8th International Conference on Communication Systems and Networks (COMSNETS), Bangalore, 2016, pp. 1-6. doi: 10.1109/COMSNETS.2016.7439933

[3] J. G. Abreu, J. M. Teixeira, L. S. Figueiredo and V. Teichrieb, "Evaluating Sign Language Recognition Using the Myo Armband," 2016 XVIII Symposium on Virtual and Augmented Reality (SVR), Gramado, 2016, pp. 64-70. doi: 10.1109/SVR.2016.21

[4] Jiménez Moreno Robinson, Aviles Oscar, Mauledeux Mauricio, "Path Optimization Planning for Human-Robot Interaction," International Journal of Applied Engineering Research (IJAER), ISSN: 0973-4562, vol. 11, no. 22, pp. 10822-10827, 2016.

[5] Espinosa V. Fabio, Jiménez M. Robinson, Amaya Dario, "Control de movimiento de un robot humanoide por medio de visión de máquina y réplica de movimientos humanos" [Motion control of a humanoid robot by means of machine vision and replication of human movements], IngeCuc, ISSN: 0122-6517, vol. 9, no. 2, pp. 44-51, 2013.

[6] Y. Xu, C. Yang, P. Liang, L. Zhao and Z. Li, "Development of a hybrid motion capture method using MYO armband with application to teleoperation," 2016 IEEE International Conference on Mechatronics and Automation, Harbin, 2016, pp. 1179-1184. doi: 10.1109/ICMA.2016.7558729

[7] R. Hosoya, T. Hasegawa, T. Naka, M. Yamada and S. Miyazaki, "A Study of Tracking the Human Arm Twist Motion," 2016 Nicograph International (NicoInt), Hangzhou, 2016, pp. 150-150. doi: 10.1109/NicoInt.2016.43

[8] A. Boyali, N. Hashimoto and O. Matsumoto, "Hand posture and gesture recognition using MYO armband and spectral collaborative representation based classification," 2015 IEEE 4th Global Conference on Consumer Electronics (GCCE), Osaka, 2015, pp. 200-201. doi: 10.1109/GCCE.2015.7398619.

[9] G. C. Luh, H. A. Lin, Y. H. Ma and C. J. Yen, "Intuitive muscle-gesture based robot navigation control using wearable gesture armband," 2015 International Conference on Machine Learning and Cybernetics (ICMLC), Guangzhou, 2015, pp. 389-395. doi: 10.1109/ICMLC.2015.7340953

[10] Brando, "Myo. ¿La banda mágica?" [Myo. The magic band?]. Accessed November 4, 2016. [Online]. Available: http://www.conexionbrando.com/1562490-myo-la-banda-magica

[11] Maleo, "Cómo funciona Myo: Una pulsera para todos" [How Myo works: a wristband for everyone]. Accessed November 4, 2016. [Online]. Available: http://como.hol.es/una-pulsera-para-todos/

[12] Guillermo Alegre, "Myo, el controlador de dispositivos mediante gestos" [Myo, the gesture-based device controller]. Accessed November 4, 2016. [Online]. Available: https://rincondelatecnologia.com/myo-controlador-dispositivos-mediante-gestos/

[13] José Andrade, "Controlan un AR.Drone con los brazaletes gestuales MYO" [An AR.Drone controlled with MYO gesture armbands]. Accessed November 4, 2016. [Online]. Available: http://es.engadget.com/2014/01/10/thalmic-lab-myo-gestual-ardrone/

[14] Alberto Revoredo, "MYO, todo el poder en la mano" [MYO, all the power in your hand]. Accessed November 4, 2016. [Online]. Available: http://larepublica.pe/05-10-2014/myo-todo-el-poder-en-la-mano

[15] Thalmic Labs™, "Myo SDK Manual: Getting Started". Accessed November 4, 2016. [Online]. Available: https://developer.thalmic.com/docs/api_reference/platform/getting-started.html

[16] P. G. Jung, G. Lim, S. Kim and K. Kong, "A Wearable Gesture Recognition Device for Detecting Muscular Activities Based on Air-Pressure Sensors," in IEEE Transactions on Industrial Informatics, vol. 11, no. 2, pp. 485-494, April 2015. doi: 10.1109/TII.2015.2405413

[17] Myo™, Tech Specs. Accessed October 14, 2016. [Online]. Available: https://www.myo.com/techspecs