Hand Tracking With Leap Motion Controller

Lin Shao
Abstract
The novel Leap Motion Controller provides an informative representation of hands. We use tracking data obtained through the Leap Motion Controller API to recognize hand movements and gestures. Our experiments show that our method, based on Leap Motion Controller tracking data, recognizes hand gestures accurately when no occlusion occurs.
Introduction
Gesturing is a natural part of human communication and is becoming increasingly important in AR/VR interaction. The Leap Motion Controller (LMC) is a new device developed for gesture interaction by Leap Motion (https://www.leapmotion.com/). The device is small, measuring 0.5 x 1.2 x 3 inches. To use the Leap Motion Controller, the user connects it to a computer via USB and then holds their hands above it. Figure 1 shows an example of the Leap Motion Controller in use.
Figure 1: Leap Motion usage. The small object in the middle is the Leap Motion Controller, connected to the Mac on the right. The hand above the Leap Motion is tracked and interacts with virtual objects. Picture source: https://www.leapmotion.com/product/desktop?lang=en
Tracking data from LMC
Figure 2: Example of tracking data, shown in red. Left: palm data. Right: finger data. Figure source: https://www.leapmotion.com/
Tracking data from the Leap Motion Controller API is accessed through a Frame object. Figure 2 highlights the tracking data we use to build hand-gesture features: the palm position P_pos, palm normal P_N, palm velocity P_v, and hand direction P_D; and the fingertip positions F_pos^i, directions F_D^i, and velocities F_v^i, where i ranges from 0 to 4, representing the thumb, index, middle, ring, and pinky respectively.
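The per-frame quantities above can be collected into a small record. The sketch below is a minimal, hypothetical mirror of the SDK's Frame data for one hand (the class and field names are ours for illustration, not the official Leap Motion API):

```python
from dataclasses import dataclass
from typing import List, Tuple

# A 3-D vector (x, y, z); the Leap Motion SDK reports positions in millimeters.
Vec3 = Tuple[float, float, float]

@dataclass
class FingerData:
    tip_position: Vec3   # F_pos^i
    direction: Vec3      # F_D^i
    tip_velocity: Vec3   # F_v^i

@dataclass
class HandFrame:
    palm_position: Vec3        # P_pos
    palm_normal: Vec3          # P_N
    palm_velocity: Vec3        # P_v
    direction: Vec3            # P_D
    fingers: List[FingerData]  # indices 0..4: thumb, index, middle, ring, pinky
```

In the real SDK these values would be copied out of each polled Frame; here they simply fix the notation used by the feature definitions below.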
Designed Features
Features are based on the geometric information of the hand.

Static Gesture
Features for static gestures are built mainly from the relative distances between the palm and fingers: the distances between the fingertips F_pos^i and the palm center P_pos, the distance between the thumb F_pos^0 and the index F_pos^1, and the distance between the index F_pos^1 and the middle F_pos^2.

Translation Feature
The translation feature indicates that the fingers and palm are moving together in a straight line without rotation. We compute the cross-correlation of the velocity vectors of the fingers F_v^i and the palm P_v.

Hand Rotation Feature
The palm rotation feature has two parts. One is the difference DP_N between the current palm normal P_N^t and the previous palm normal P_N^{t-1}. The other is the angle between DP_N and the hand direction P_D.

Index Tapping Features
The index tapping features have two parts. One is the magnitude of the index velocity F_v^1. The other is the angle between the index velocity F_v^1 and the hand direction P_D.

Index Circle Direction Features
The index circle direction features detect whether the index finger's circular motion is clockwise or counter-clockwise.
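As an illustration (a sketch of the geometric computations, not our exact implementation), the features above reduce to a few vector operations. The circle-direction test here uses the sign of the accumulated cross product of successive fingertip displacements, which is one common way to realize such a feature:

```python
import math

def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def norm(a):
    return math.sqrt(dot(a, a))

def distance(p, q):
    """Euclidean distance between two 3-D points (static-gesture features)."""
    return norm(sub(p, q))

def angle_between(a, b):
    """Angle in radians between two vectors (rotation / tapping features)."""
    return math.acos(max(-1.0, min(1.0, dot(a, b) / (norm(a) * norm(b)))))

def translation_score(finger_vels, palm_vel):
    """Mean normalized correlation of finger velocities with the palm velocity:
    close to 1 when fingers and palm translate together without rotation."""
    scores = [dot(v, palm_vel) / (norm(v) * norm(palm_vel))
              for v in finger_vels if norm(v) > 0 and norm(palm_vel) > 0]
    return sum(scores) / len(scores) if scores else 0.0

def circle_direction(tip_positions):
    """Classify an index-finger circle as clockwise or counter-clockwise from the
    sign of the accumulated z-component of cross products of successive
    fingertip displacements projected on the x-y plane (viewed from +z)."""
    total = 0.0
    for (x0, y0, _), (x1, y1, _), (x2, y2, _) in zip(
            tip_positions, tip_positions[1:], tip_positions[2:]):
        total += (x1 - x0) * (y2 - y1) - (y1 - y0) * (x2 - x1)
    return "ccw" if total > 0 else "cw"
```

The hand-rotation feature follows the same pattern: compute DP_N = sub(P_N_t, P_N_prev), then angle_between(DP_N, P_D).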
Gesture Navigation
We choose three different gestures to represent moving forward/backward, turning left/right, and rolling clockwise/counter-clockwise.
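A minimal sketch of such a navigation layer is a lookup from recognized gesture labels to commands (the labels and command names below are illustrative placeholders, not taken from our implementation):

```python
# Hypothetical gesture labels -> navigation commands for the three gesture pairs.
NAVIGATION_COMMANDS = {
    "index_tap_forward": "move_forward",
    "index_tap_backward": "move_backward",
    "palm_rotate_left": "turn_left",
    "palm_rotate_right": "turn_right",
    "index_circle_cw": "roll_clockwise",
    "index_circle_ccw": "roll_counter_clockwise",
}

def navigate(gesture):
    """Map a recognized gesture label to a navigation command; unknown or
    unrecognized gestures leave the viewer idle."""
    return NAVIGATION_COMMANDS.get(gesture, "idle")
```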
Experiment Result
We implemented the demo in Unity 5, using skeleton hands as the output. Figure 3 shows several examples of static gestures that our method recognizes effectively.
Figure 3: Examples of static gestures
Failure Case Analysis
We also analyzed the failure cases to identify the factors leading to misclassification.

Self-Occlusion
The Leap Motion Controller uses IR to gather hand information in space. When important fingers or regions are self-occluded by other parts of the hand, the quality of the tracking data is greatly reduced.

Detection Region
The detection region of the Leap Motion Controller is currently still small. Hand tracking data becomes unstable when the hands are near the boundaries of the detection region.

Parameters
In the feature descriptors above, some parameters depend on actual hand size. When the hand size and the corresponding parameters do not match, failures occur.

Error Accumulation
For hand movement gestures, we also use first-order differences. These values are less accurate and less robust due to error accumulation.
Conclusion and Question
• Leap Motion provides detailed descriptions of hand position and velocity, which can be used for hand gesture recognition.
• The hand gesture recognition results also depend on other factors, such as occlusion, which sometimes leads to misclassification.
Acknowledgements
We would like to thank the EE267 staff for their support during the quarter.