Can Wearable Haptics Enhance the User Interaction with Augmented Reality?

G. Salvietti, S. Scheggi, A. Ramos, D. Prattichizzo
Department of Information Engineering, University of Siena, Italy
Advanced Robotics Lab, Italian Institute of Technology, Genova, Italy

Background and contribution

• Haptic Augmented Reality (HAR) allows the user to perceive virtual objects superimposed on a real scene through haptic stimuli.
• We evaluate whether tactile stimulation can enhance user performance in simple HAR tasks.
• We use wearable devices and integrate multiple heterogeneous haptic interfaces.

Tactile finger-worn devices

• Previous work has demonstrated that the vertical and shearing forces generated by deforming the fingerpads can reproduce reliable weight sensations even when proprioceptive sensations on the wrist and arm are absent [1].
• Minamizawa et al. [2] designed a haptic display mechanism based on the principle shown in Fig. 1.

Fig. 1. Methods to generate vertical and shearing stress on the fingerpad (left) and placement on the hand (right). Two motors drive a belt fixed to the finger by a Velcro strap: rotating the motors in opposite directions generates a vertical force, while rotating them in the same direction generates a lateral force.

Shape and weight rendering

• The goal of this experiment is to let the user perceive the shape and the weight of a virtual object at the same time, using both hands [3] (see Figs. 2 and 3).
• Subjects were asked to interact with the cube with both devices on.
• The experiment was validated through a questionnaire: 83% of the subjects confirmed that the interaction was more intuitive when using the devices.

Fig. 2. Experiment setup: (left) the finger-worn devices on the user's left hand (2) let the user feel the virtual object's weight, while a Falcon (4) renders the object's shape. The real scene is captured by a fixed webcam (5), and a video eyewear system (1) displays the AR environment built on the marker (3); (right) fusion of the real and virtual worlds, with the Falcon avatar (blue sphere).

Fig. 3. Virtual object interaction: (left) when there is no interaction between the Falcon and the virtual objects, the finger-worn devices render only the object's weight (wg), since the marker's weight (wm) is already exerted on the hand; (center) when the Falcon interacts with the objects, the forces exerted by the user (wf) also contribute to the resulting force computation, with wf bounded by the radius of the friction cone; (right) the resulting force (w) is rendered through the finger-worn devices.

References

[1] K. Minamizawa, H. Kajimoto, N. Kawakami, and S. Tachi. A Wearable Haptic Display to Present the Gravity Sensation - Preliminary Observations and Device Design. In Proc. EuroHaptics Int. Conf., pages 133–138, 2007.
[2] K. Minamizawa, S. Fukamachi, N. Kawakami, and S. Tachi. Interactive Representation of Virtual Object in Hand-Held Box by Finger-Worn Haptic Display. In Proc. IEEE Int. Symp. on Haptic Interfaces for Virtual Environment and Teleoperator Systems, pages 367–368, 2008.
[3] S. Scheggi, G. Salvietti, and D. Prattichizzo. Shape and Weight Rendering for Haptic Augmented Reality. In Proc. IEEE Int. Symp. on Robot and Human Interactive Communication, pages 44–49, 2010.

Performance evaluation using a priori knowledge of the weight of an object

• We demonstrate that a priori knowledge of the weight of a virtual object, displayed through the tactile devices, improves the execution time of a manipulation task.

Fig. 4. Experiment setup: (left) the finger-worn devices on the user's left hand (3) let the user feel the virtual object's weight before starting the manipulation task with the Omega.7 (2). The real scene is captured by a fixed webcam (1), and a monitor displays the AR environment built on the marker (4); (right) fusion of the real and virtual worlds, with the Omega.7 avatars (white spheres).
• The manipulation task consists of grasping the virtual cube (solid red in Fig. 4) and lifting it until it coincides with the semi-transparent one.
• With tactile feedback, the task execution time decreases, as shown in Fig. 5.

Fig. 5. (left) Virtual cube displacement with (red) and without (blue) haptic feedback on the finger-worn devices, and the target zone (green). (right) Box plot showing median, lower and upper quartiles, extreme data points, and outliers.
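The two-motor belt principle of Fig. 1 (opposite rotations tighten the belt and press the fingerpad; same-direction rotations drag it sideways) can be sketched as a simple differential mapping. This is an illustrative assumption only: the gains, sign conventions, and the `belt_commands` interface are hypothetical and not taken from the poster.

```python
def belt_commands(normal_force, shear_force, k_n=1.0, k_s=1.0):
    """Map a desired fingerpad force to commands for the two belt motors.

    Rotating the motors in opposite directions tightens the belt and
    produces a vertical (normal) force on the fingerpad; rotating them
    in the same direction drags the belt sideways and produces a
    lateral (shear) force. k_n and k_s are hypothetical gains from
    force to motor rotation.
    """
    common = k_s * shear_force         # same-direction component -> shear force
    differential = k_n * normal_force  # opposite-direction component -> normal force
    left_motor = common + differential
    right_motor = common - differential
    return left_motor, right_motor

# Pure normal force: the motors rotate in opposite directions.
print(belt_commands(1.0, 0.0))  # (1.0, -1.0)
# Pure shear force: the motors rotate in the same direction.
print(belt_commands(0.0, 1.0))  # (1.0, 1.0)
```

Any combined normal/shear request decomposes into these two motor components, which is what lets a single belt render both stresses of Fig. 1.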
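The interaction model of Fig. 3 can be summarized in a minimal sketch of the force composed for the finger-worn devices: only the quantities wg, wm, and wf come from the poster, while the scalar (downward) formulation, the gravity constant, and the function interface are assumptions for illustration.

```python
GRAVITY = 9.81  # m/s^2

def rendered_force(object_mass, user_force=0.0, in_contact=False):
    """Scalar downward force (N) rendered by the finger-worn devices.

    When the Falcon is not touching the virtual object, only the
    object's weight wg is rendered; the marker's weight wm is already
    exerted on the hand holding it, so it is not re-rendered. When the
    Falcon presses on the object, the user's force wf also contributes
    to the resulting force w.
    """
    wg = object_mass * GRAVITY          # virtual object's own weight
    wf = user_force if in_contact else 0.0  # user's contribution via the Falcon
    return wg + wf

print(rendered_force(0.1))                                   # 0.981
print(rendered_force(0.1, user_force=0.5, in_contact=True))  # 1.481
```

The design choice worth noting is the subtraction of the marker's weight: rendering wm again would double the weight the hand already feels from the real marker.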