The Implementation of a Glove-Based User Interface
Chris Carey


Slide 1: The Implementation of a Glove-Based User Interface (Chris Carey)

Slide 2: Abstract
The advantages of multi-touch interfaces are being utilized in numerous applications. This project aims to go a step further: a glove-based interface provides the utility of a multi-touch interface without the restriction of physical proximity to the screen. A more natural style of human-computer interaction can also improve efficiency on complicated tasks.

Slide 3: Background
Why now?
  • Accessibility of technology
  • Increased application sophistication
  • Usage in restrictive environments
Why not?

Slide 4: Past and Current Systems
Glove systems:
  • Haptic gloves and VR systems
  • Full motion capture glove systems
  • Basic Wiimote glove systems
Non-glove systems:
  • Neural network hand gesture recognition
  • 3D model reconstruction gesture recognition

Slide 5: Project Goals
Focuses:
  • Speed
  • Accuracy
  • Task simplification
  • Improved user experience

Slide 6: Hardware Implementation
Logitech webcam:
  • IR-blocking filter removed
  • Visible-light-blocking filter added
IR LED glove:
  • 3 IR LEDs
  • 2 × 1.5 V button cell batteries

Slide 7: Software Implementation
Java and the Java Media Framework, with custom:
  • LED detection
  • LED tracking
  • Gesture recognition
  • Command execution

Slide 8: LED Detection
  • Binary rasterization
  • Brightness threshold determination
  • Blob comparison

Slide 9: LED Tracking
  • An LED object class records previous positions and velocities
  • Predicts the next position for faster location
  • Balances detected LEDs against tracked LEDs

Slide 10: Gesture Recognition
Static gestures:
  • Do not depend on absolute location
  • Performed and executed once
Dynamic gestures:
  • Do depend on absolute location
  • Performed and executed continuously

Slide 11: Static Gesture: Minimize
  • Recognized by a decreasing distance between the three LEDs
  • The java.awt.Robot class executes the keystrokes ALT+SPACE, then N

Slide 12: Dynamic Gesture: Mouse Pointer
  • Tracks the LED with the greatest y-value
  • Executed when no other gesture is recognized

Slide 13: Dynamic Gesture: Drag and Drop
  • Recognized when the distance between two LEDs decreases at a minimum velocity
  • DRAG: the minimum distance between the LEDs is maintained
  • DROP: the distance between the LEDs exceeds the minimum distance

Slide 14: Planned Gestures
Single-hand gestures:
  • Mouse click
  • Mouse scroll
  • Window maximize/restore
Two-handed gestures:
  • Window selection
  • Object resize, zoom, and rotate

Slide 15: Preliminary Analysis
Speed:
  • Modal processing time per iteration: 47 ms, which is slower than necessary
  • 24 fps (about 41 ms per frame) is required for motion to be perceived as continuous
Accuracy:
  • Poor LED detection has led to poor gesture recognition
  • Brighter LEDs or a more sensitive camera are necessary

Slide 16: Possible Solutions
  • Brighter IR LEDs
  • LED pulse-driving circuit
  • Webcam with night vision
  • IR narrow band-pass filter

Slide 17: Work Remaining
  • Improved hardware
  • Refined LED detection/tracking
  • Quicker processing
  • Increased gesture support
  • Application control

Slide 18: Conclusions
  • Speed and accuracy remain issues
  • The minimize static gesture simplifies the task compared to a mouse interface
  • The glove interface receives IR light constantly, so gestures can activate whenever the LEDs are lit
  • Multi-touch interfaces require direct contact, which ensures more consistent activation
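The detection pipeline on Slide 8 (binary rasterization against a brightness threshold, then blob comparison) could be sketched as below. This is a minimal illustration, assuming grayscale frames arrive as int arrays; the class name, method names, and threshold value are assumptions, not the project's actual code.

```java
// Sketch of Slide 8's LED detection: threshold a grayscale frame into
// a binary mask, then count 4-connected blobs of bright pixels.
class LedDetection {

    /** Binary rasterization: true where brightness exceeds the
     *  threshold, i.e. candidate IR LED pixels. */
    static boolean[][] rasterize(int[][] gray, int threshold) {
        boolean[][] mask = new boolean[gray.length][gray[0].length];
        for (int y = 0; y < gray.length; y++)
            for (int x = 0; x < gray[0].length; x++)
                mask[y][x] = gray[y][x] > threshold;
        return mask;
    }

    /** Blob comparison, in crude form: label 4-connected bright
     *  regions and return how many there are. Comparing the count
     *  against the expected three LEDs filters out noise frames. */
    static int countBlobs(boolean[][] mask) {
        int h = mask.length, w = mask[0].length, blobs = 0;
        boolean[][] seen = new boolean[h][w];
        java.util.ArrayDeque<int[]> stack = new java.util.ArrayDeque<>();
        for (int y = 0; y < h; y++)
            for (int x = 0; x < w; x++) {
                if (!mask[y][x] || seen[y][x]) continue;
                blobs++;                       // new blob found
                seen[y][x] = true;
                stack.push(new int[]{y, x});
                while (!stack.isEmpty()) {     // flood fill the blob
                    int[] p = stack.pop();
                    int[][] nbrs = {{p[0]-1,p[1]}, {p[0]+1,p[1]},
                                    {p[0],p[1]-1}, {p[0],p[1]+1}};
                    for (int[] n : nbrs)
                        if (n[0] >= 0 && n[0] < h && n[1] >= 0 && n[1] < w
                                && mask[n[0]][n[1]] && !seen[n[0]][n[1]]) {
                            seen[n[0]][n[1]] = true;
                            stack.push(n);
                        }
                }
            }
        return blobs;
    }
}
```

In a real frame the threshold would need to be determined dynamically (the "brightness threshold determination" step), since ambient IR varies between environments.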
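The LED object class on Slide 9, which records previous positions and velocities and predicts the next position, might look roughly like this. The field and method names are illustrative assumptions; only the idea (extrapolate from the last measured velocity to narrow the search window) comes from the slides.

```java
// Sketch of Slide 9's per-LED tracking record.
class TrackedLed {
    private double x, y;    // last confirmed position (pixels)
    private double vx, vy;  // last measured velocity (pixels/frame)

    TrackedLed(double x, double y) {
        this.x = x;
        this.y = y;
    }

    /** Update with a freshly detected position; the velocity is
     *  simply the frame-to-frame displacement. */
    void update(double nx, double ny) {
        vx = nx - x;
        vy = ny - y;
        x = nx;
        y = ny;
    }

    /** Predicted position for the next frame, used to start the
     *  search near where the LED is expected to be. */
    double[] predict() {
        return new double[]{x + vx, y + vy};
    }
}
```

The "balance between detected LEDs and tracked LEDs" mentioned on the slide would then be a matching step: each detected blob is assigned to the tracked LED whose predicted position is nearest.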
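The minimize gesture on Slide 11 (a shrinking distance between the three LEDs, followed by java.awt.Robot sending ALT+SPACE, then N) can be sketched as below. The spread measure and the recognition threshold are assumptions; only the three-LED pinch and the keystroke sequence come from the slides.

```java
import java.awt.AWTException;
import java.awt.Robot;
import java.awt.event.KeyEvent;

// Sketch of Slide 11's static minimize gesture.
class MinimizeGesture {

    /** Sum of pairwise distances between the three LED points
     *  (each point is {x, y}); a proxy for "how spread out" the
     *  fingers are. */
    static double spread(double[][] pts) {
        double s = 0;
        for (int i = 0; i < 3; i++)
            for (int j = i + 1; j < 3; j++)
                s += Math.hypot(pts[i][0] - pts[j][0],
                                pts[i][1] - pts[j][1]);
        return s;
    }

    /** A static gesture fires once: recognized when the spread drops
     *  by more than the threshold between consecutive frames. */
    static boolean recognized(double[][] prev, double[][] cur,
                              double threshold) {
        return spread(prev) - spread(cur) > threshold;
    }

    /** On recognition, synthesize ALT+SPACE (open the system menu)
     *  followed by N (minimize), as the slide describes. */
    static void minimizeActiveWindow() throws AWTException {
        Robot robot = new Robot();
        robot.keyPress(KeyEvent.VK_ALT);
        robot.keyPress(KeyEvent.VK_SPACE);
        robot.keyRelease(KeyEvent.VK_SPACE);
        robot.keyRelease(KeyEvent.VK_ALT);
        robot.keyPress(KeyEvent.VK_N);
        robot.keyRelease(KeyEvent.VK_N);
    }
}
```

Because the gesture does not depend on absolute location (Slide 10), only relative distances are compared; the hand can be anywhere in the camera's view.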
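The drag-and-drop gesture on Slide 13 is effectively a two-state machine driven by the distance between two LEDs: a fast-enough pinch starts the drag, and separating past a release distance drops. A sketch under those assumptions, with all threshold parameters being illustrative values rather than the project's actual ones:

```java
// Sketch of Slide 13's drag-and-drop state machine.
class DragDropGesture {
    enum State { IDLE, DRAGGING }

    private State state = State.IDLE;
    private double prevDist = Double.NaN;  // no previous frame yet

    /** Feed the two-LED distance once per frame.
     *  DRAG begins when the distance shrinks faster than closeSpeed
     *  and is below grabDist; DROP fires when it exceeds releaseDist. */
    State step(double dist, double closeSpeed,
               double grabDist, double releaseDist) {
        if (!Double.isNaN(prevDist)) {
            if (state == State.IDLE
                    && prevDist - dist > closeSpeed
                    && dist < grabDist) {
                state = State.DRAGGING;   // pinch detected: begin drag
            } else if (state == State.DRAGGING && dist > releaseDist) {
                state = State.IDLE;       // fingers apart: drop
            }
        }
        prevDist = dist;
        return state;
    }
}
```

Using a larger release distance than grab distance adds hysteresis, so small jitter in LED detection does not toggle the drag on and off.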