
“Left Arm Up!” Interactive Yoga Training in Virtual Environment

Zhiqiang Luo1, Weiting Yang1, Zhong Qiang Ding1, Lili Liu1, I-Ming Chen1, Song Huat Yeo1, Keck Voon Ling2, Henry Been-Lirn Duh3

1School of Mechanical & Aerospace Engineering, Nanyang Technological University, Singapore

2School of Electrical and Electronic Engineering, Nanyang Technological University, Singapore

3Department of Electrical and Computer Engineering, National University of Singapore, Singapore

ABSTRACT

This paper describes a Yoga training system built on a motion replication technique (MoRep), covering the hardware, virtual scenario, and feedback design. The motion replication technique proposed here determines the similarity between the Yoga master's and the student's postures and then provides feedback on the student's incorrect body posture through multimodal channels. The key innovations of the project are also discussed.

KEYWORDS: Motion Replication, Motion Capture, Multimodal Interaction, Virtual Environment.

INDEX TERMS: I.3.7 [Three-Dimensional Graphics and Realism]: Virtual Reality.

1 YOGA TRAINING SYSTEM

The Yoga training system includes the following key components: (1) one InterfaceSuit comprising 16 inertial measurement units (IMUs) and 6 tactors; (2) a TV screen connected to a PC; and (3) a VR application running on the PC.

Before running the VR application, the user puts on the InterfaceSuit and powers on all hardware. The interaction between the user and the application is similar to training in a studio environment with face-to-face contact. The user first registers in the system and chooses a virtual character (a virtual Yoga Student) to represent him or her in the VE. A virtual Yoga Master then appears to welcome the Student and leads the virtual Yoga Student to a virtual room to start the Yoga lesson.

The virtual Yoga Student's motion is controlled by the user's motion in real time. During the lesson, the user imitates the postures demonstrated by the virtual Yoga Master. With the MoRep technique, the user's wrong posture can be identified and corrected by the virtual Yoga Master through multimodal feedback in real time. Thus the physical experience of Yoga training in a real studio is replicated in the VE.

1.1 InterfaceSuit

The InterfaceSuit (Figure 1) includes 6 tactors and 16 inertial measurement units (IMUs), each of which contains a 3-axis accelerometer, a 3-axis gyroscope, and a 3-axis magnetometer.

Figure 1. InterfaceSuit contains sixteen IMUs and six tactors.

Figure 2. The virtual environment for Yoga.

Each IMU measures roll-pitch-yaw orientation, which is used to detect the orientation of the body part it is attached to. The tactors are attached to the four limbs and generate vibrations on the skin to provide haptic feedback to the user.
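As a rough illustration, the roll-pitch-yaw reading of one IMU can be converted into a rotation matrix describing the orientation of the body segment it is worn on. The sketch below assumes a Z-Y-X (yaw-pitch-roll) Euler convention, which the paper does not specify.

```python
import numpy as np

def rpy_to_rotation(roll, pitch, yaw):
    """Convert roll-pitch-yaw angles (degrees) to a 3x3 rotation matrix.

    Illustrative only: assumes a Z-Y-X Euler convention for the segment
    orientation in the world frame.
    """
    r, p, y = np.radians([roll, pitch, yaw])
    Rx = np.array([[1, 0, 0],
                   [0, np.cos(r), -np.sin(r)],
                   [0, np.sin(r),  np.cos(r)]])
    Ry = np.array([[ np.cos(p), 0, np.sin(p)],
                   [0, 1, 0],
                   [-np.sin(p), 0, np.cos(p)]])
    Rz = np.array([[np.cos(y), -np.sin(y), 0],
                   [np.sin(y),  np.cos(y), 0],
                   [0, 0, 1]])
    return Rz @ Ry @ Rx  # segment orientation in the world frame
```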

1.2 Virtual Environments

The virtual environment places the student in a room decorated with candles and mats on the ground (Figure 2). The environment is designed to resemble a real studio: quiet and peaceful, the circumstances in which Yoga should be practiced.

1,2 50 Nanyang Avenue, Singapore 639798. {zqluo, wtyang, zqding, lililiu, michen, myeosh, ekvling}@ntu.edu.sg. 3 21 Heng Mui Keng Terrace, Singapore 119613. eledbl@nus.edu.sg.


1.3 Motion Replication

Motion replication (MoRep) enhances the understanding of the user's motion by comparing the user's posture with the virtual instructor's posture, recognizing the body parts that are not replicating the corresponding ones, and then providing relevant, professional feedback.

The MoRep methodology determines the similarity of two postures by comparing the joint angles of fifteen joints, including the head, shoulders, elbows, wrists, chest, pelvis, hips, knees, and ankles, and by calculating the distance between the end points required in each Yoga posture.
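The comparison can be sketched as follows. The joint list, tolerances, and the exact combination of angle and end-point criteria are illustrative assumptions; the paper does not state them.

```python
import numpy as np

# Fifteen joints named in the text (label spelling is an assumption).
JOINTS = ["head", "l_shoulder", "r_shoulder", "l_elbow", "r_elbow",
          "l_wrist", "r_wrist", "chest", "pelvis", "l_hip", "r_hip",
          "l_knee", "r_knee", "l_ankle", "r_ankle"]

def mismatched_joints(student_angles, master_angles, angle_tol_deg=15.0):
    """Return the joints whose angle deviates from the master's by more
    than the tolerance. Inputs map joint name -> angle in degrees."""
    return [j for j in JOINTS
            if abs(student_angles[j] - master_angles[j]) > angle_tol_deg]

def end_point_error(student_pts, master_pts):
    """Mean Euclidean distance between corresponding end points
    (e.g. hands and feet), given as dicts of 3-D coordinates."""
    dists = [np.linalg.norm(np.asarray(student_pts[p]) - np.asarray(master_pts[p]))
             for p in master_pts]
    return float(np.mean(dists))

def postures_match(student_angles, master_angles,
                   student_pts, master_pts,
                   angle_tol_deg=15.0, dist_tol_m=0.10):
    """Postures match when no joint exceeds the angular tolerance and the
    average end-point distance stays below the distance tolerance."""
    return (not mismatched_joints(student_angles, master_angles, angle_tol_deg)
            and end_point_error(student_pts, master_pts) < dist_tol_m)
```

The joints flagged by a check like `mismatched_joints` are what the feedback stage below reports to the user.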

1.4 Multimodal Feedback

The multimodal feedback comprises relevant information sent through the audio, visual, and haptic channels. First, the Yoga instructor gives audio feedback (e.g., "Left Arm Up"). At the same time, visual instruction is shown on screen, including (1) text with the same content as the audio feedback and (2) red arrows pointing to the body parts mentioned in the audio feedback. Last, the tactors attached to the mentioned body parts are activated to generate vibrations that further correct the user's wrong posture.
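A minimal sketch of this feedback dispatch is shown below. The function names are hypothetical placeholders rather than the actual system API, and the channel back-ends are stubbed with print statements.

```python
def play_audio(msg):        print(f"[audio]  {msg}")            # stub: spoken instruction
def show_text(msg):         print(f"[screen] {msg}")            # stub: on-screen text overlay
def draw_arrow(part):       print(f"[screen] red arrow -> {part}")
def activate_tactor(part):  print(f"[haptic] vibrate tactor on {part}")

def send_feedback(body_part, correction):
    """Route one posture correction through all three feedback channels."""
    message = f"{body_part} {correction}"   # e.g. "Left Arm Up"
    play_audio(message)                     # audio channel
    show_text(message)                      # visual channel: same text as the audio
    draw_arrow(body_part)                   # visual channel: arrow at the body part
    activate_tactor(body_part)              # haptic channel: tactor on that limb

send_feedback("Left Arm", "Up")
```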

2 DISCUSSION

Compared to existing motion training systems in VEs, our system has the following key innovations.

First, the system employs the motion replication technique to assess the student's performance in Yoga practice. Specifically, the motion replication technique can determine 45-DOF full-body motion, comprehensively compare the motion performance of the student and the master, and provide detailed feedback to the user (the student).

Second, in addition to the visual and audio feedback that corrects the student's performance, haptic feedback is provided by the tactors attached to the body. The haptic feedback provides spatial information about human postures independently of the visual channel and is thus invaluable whenever continuous visual contact is not possible during training.

Third, the pace of the Yoga master's motion is tied to the pace of the student's learning. Only when the student's and master's postures match does the master continue to the next posture. The user can also pause or skip the current posture.
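This pacing logic can be sketched roughly as follows. The callback names and polling interval are illustrative assumptions, not the system's actual interface; `postures_match` stands in for the MoRep comparison sketched earlier.

```python
import time

def run_lesson(postures, get_student_posture, show_master_posture,
               postures_match, get_user_command, poll_s=0.1):
    """Advance the virtual master only when the student's posture matches,
    honoring pause/skip commands from the user."""
    for posture in postures:
        target = show_master_posture(posture)       # master demonstrates the pose
        while True:
            cmd = get_user_command()                # "pause", "skip", or None
            if cmd == "skip":
                break                               # user skips this posture
            if cmd == "pause":
                time.sleep(poll_s)
                continue                            # hold until unpaused
            if postures_match(get_student_posture(), target):
                break                               # matched: advance to next posture
            time.sleep(poll_s)                      # keep checking the live capture
```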

Last, the present system supports simultaneous multi-user Yoga training. Internet connectivity allows multiple students at home to attend the same Yoga training session at the same time.

3 CONCLUSION

A VR system for interactive Yoga training is described in this research demonstration. The VR system not only guides a user to imitate Yoga poses but also corrects the user's wrong poses by providing audio, visual, and haptic feedback.

4 ROBOTICS RESEARCH CENTRE AT NANYANG TECHNOLOGICAL UNIVERSITY

The Robotics Research Centre (RRC) at Nanyang Technological University (NTU), Singapore, was established in May 1994 with a start-up fund of S$1.68 million. The RRC is jointly managed and funded by the School of Computer Engineering (SCE), the School of Mechanical & Aerospace Engineering (MAE), and the School of Electrical and Electronic Engineering (EEE).

The main objectives of the RRC are:

• to consolidate, focus and accelerate robotic research activities within the University;

• to bring together researchers and academics in the area of robotics into a conducive environment equipped with state-of-the-art research facilities;

• to cooperate with industrial partners and government agencies in fields of strategic importance to robotics;

• to provide consultancy services to local industry in robotics and related areas.

The RRC vigorously promotes graduate research in the area of robotics and its related disciplines. The centre currently supports over 50 graduate students. The RRC provides a supportive and stimulating environment for all who participate in its research and development activities.

4.1 Research Directions

The Centre remains focused on intelligent robotic systems in specialized applications.

Under this classification there are two distinct areas: the first is industrially related robotic applications, and the other is highly specialized robots for medical and hazardous applications. A secondary aspect is to acquire and develop technological expertise in the areas of robotics. Investigation into relevant technologies in the areas of machine vision, sensors, actuators, and cybernetics continues at a vigorous pace.

4.2 Research Groups

The RRC is organized into research groups with staff drawn from the academic and research staff of three schools in the University: the School of Electrical and Electronic Engineering, the School of Mechanical and Aerospace Engineering, and the School of Computer Engineering. There are six research groups within the RRC, including the Mixed Multi-Agent System Research Group, the Underwater Robotics Research Group, the Biorobotics Research Group, the Adaptive Locomotion Research Group, and the Interactive Sensing and Robotics Group. The research areas are as follows:

• Modular Reconfigurable Robotic Systems

• Intelligent Vehicle Systems

• Underwater Robotics Inspection & Manipulation

• Medical Robotics

• Adaptive Locomotion

• Sensing and Laser Imaging

• Applied Optics

• Biomechanics

5 INTERACTIVE SENSING AND ROBOTICS GROUP

The Interactive Sensing and Robotics (ISR) Group is one of the active research groups in the RRC. The ISR group focuses on cutting-edge research and development in innovative sensor design, human-robot interaction in real and virtual worlds, novel actuator design, and other robotics topics.

The research projects within the ISR group include modular reconfigurable robots, robotic marionettes, ultrasonic motors, the Interactive Robotic Lion Dancing System, SmartSuit (wearable embedded human motion processing systems), and Replicating and Processing of Human Body Motion for Participatory Interaction in Co-Space.

The research results have been published widely in refereed journals, conferences, book chapters, books, patents, and students' theses. A brief summary of publications: 71 journal papers, more than one hundred conference papers, five book chapters, two books, two patents, and sixteen master's and doctoral theses.

ACKNOWLEDGMENTS

This work is supported by the Media Development Authority, Singapore, under NRF grant IDM004-005.
