MR Fitting Simulation with Virtual Camera Motions for Wheelchair Users

Ryosuke Ichikari∗ Masaki Onishi Takeshi Kurata

Center for Service Research, National Institute of Advanced Industrial Science and Technology

ABSTRACT

Wheelchair users frequently encounter difficulties such as the inconvenience of trying on clothes before purchase and the absence of items that suit them. We present a mixed reality fitting simulation system that enables virtual trial fitting without forcing users to get off the wheelchair. An RGB-D camera is utilized for measuring the body and other real objects without the need for physical contact. The system allows the user to check the result of the fitting in a three-dimensional computer-generated space.

Index Terms: H.5.1 [INFORMATION INTERFACES AND PRESENTATION (e.g., HCI)]: Multimedia Information Systems — Artificial, augmented, and virtual realities

1 INTRODUCTION

Recently, there has been an increased focus on the quality of life of disabled persons. Wheelchair users frequently encounter difficulties in daily activities, such as the inconvenience of trying on clothes before purchase and the lack of items that suit them. Current technologies used for manufacturing clothes, including the making of dummies and the measurement of body shapes, are based on standing positions. We participated in a project that aims to support wheelchair users' need to take care of their personal appearance by developing new measuring technologies and a new virtual fitting system. In this paper, we present a fitting simulation system developed as a key research topic in the project.

From conversations held with several wheelchair users, the requirements for the fitting simulation have been clarified as follows:

• The system should enable virtual fitting without the need to dismount from the wheelchair.

• The system should allow users to check their appearance from various viewpoints, including frontal viewpoints and those of a standing person.

• The cost of the system configuration should be moderate to keep it affordable.

To realize these requirements, we utilized mixed reality (MR) technology for virtually changing clothes. Virtual clothes can be prepared as three-dimensional computer-generated (3D-CG) models using existing CG-modeling methods and are then superimposed onto the images of the wheelchair users in real time. Instead of physically changing or adding viewpoints, we prefer to change the viewpoints virtually because of the cost requirement. The idea of using MR or AR (augmented reality) for virtual fitting is comparatively straightforward, and existing studies address this purpose [1, 3]. However, no studies focus on wheelchair users or users in seated positions.

Our study is conducted as part of a project that focuses on supporting wheelchair users by providing solutions for their demands. The research topics in the project are roughly divided into the following three stages (as shown in Figure 1):

• Contact/non-contact measurement of the body

∗e-mail: [email protected]

Figure 1: Research system. Non-contact measurement by RGB-D camera and contact-type measurement of the wheelchair user feed the MR fitting simulation (virtual fitting), which visualizes the measured results; the measured data and fitting results (size, personal preference, etc.) are utilized for actual clothes fabrication (pattern making by apparel CAD, design of the dedicated dummy, modeling/capturing of clothes), and data obtained during fabrication is fed back to the virtual fitting.

• Virtual fitting

• Actual clothes fabrication with a dedicated dummy

In the virtual fitting stage, the results of the contact and non-contact measurements can be visualized in a 3D-CG space onto which CG clothes are superimposed. The results of the measurement and the fitting can be utilized in the fabrication of the actual clothes. Additionally, the results of fabrication, such as data from apparel CAD software, can be visualized during the virtual fitting. This paper introduces the virtual fitting part and a prototype system of the MR fitting simulation.

2 PROTOTYPE SYSTEM OF MR FITTING SIMULATION

2.1 Approaches

The key idea for fulfilling the demands of the MR fitting simulation is to utilize an RGB-D camera for 3D reconstruction of the real scene. The RGB-D camera enables non-contact measurement of the real world while capturing an RGB video. In the MR fitting simulation, the RGB-D camera is used for visualizing real objects in a 3D-CG space in which the viewpoints can be easily changed. CG clothes to be superimposed on the user are created in advance.

The composition quality of the MR fitting simulation is very important because the system aims to be used for decision-making when purchasing clothes. Simulation results should be accurate in terms of size, position, and rotation, as well as in terms of the positional relation between the background and the superimposed clothes.

According to these requirements, we implemented the following functions in the prototype system:

• Virtual camera motions in MR space.

• Composition of CG clothes without visual paradox by multi-path rendering.

• Accurate semiautomatic registration of CG clothes.

In this prototype system, real-time tracking of the body and arms of wheelchair users was not implemented; this was to focus on the other functions. In this case, we assume that the wheelchair users do not move during the MR fitting.

ICAT 2013: The 23rd International Conference on Artificial Reality and Telexistence, Miraikan, Tokyo, Japan

2.2 Virtual Camera Motion in MR Space

Wheelchair users care about their appearance as captured from a standing person's viewpoint. To change viewpoints, everything in the real world is reconstructed in the 3D-CG space. In contrast with normal MR systems that have moving RGB cameras, our system adopts a fixed RGB-D camera for capturing real-world viewpoints. We adopt an Asus Xtion as the RGB-D camera. The RGB-D sequences are imported using the OpenNI library and are rendered as point clouds by OpenGL. In the 3D-CG space generated by OpenGL, virtual viewpoints can be interactively changed using the GUI. Virtual clothes are also rendered as 3D-CG models and superimposed onto the rendering result of the reconstructed real scene. The RGB-D camera's coverage area is limited by its field of view. To increase the sense of immersion, the system has a function to import background CG models that cover wide angles. As an example, we adopt CG models generated by our modeler [2].
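The reconstruction above amounts to back-projecting each depth pixel through the pinhole camera model into a metric point cloud. A minimal sketch of that step, assuming OpenNI-style depth in millimeters; the intrinsics `fx, fy, cx, cy` are placeholders for the camera's calibrated values, not the Xtion's actual parameters:

```python
import numpy as np

def depth_to_point_cloud(depth_mm, fx, fy, cx, cy):
    """Back-project a depth image (millimeters) into 3D points in meters.

    Each pixel (u, v) with depth z maps to
    X = (u - cx) * z / fx,  Y = (v - cy) * z / fy,  Z = z.
    """
    h, w = depth_mm.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_mm.astype(np.float64) / 1000.0  # mm -> m
    valid = z > 0                             # zero depth = no measurement
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    # (N, 3) array, ready to draw as GL_POINTS from any virtual viewpoint
    return np.stack([x[valid], y[valid], z[valid]], axis=1)
```

Once the scene is held as a metric point cloud, changing the virtual viewpoint is only a change of the view matrix in the renderer; no physical camera has to move.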

2.3 Rendering Pipeline

Geometric consistency, widely regarded as a topic of utmost importance in the AR/MR research field, is an area on which many researchers are currently working. For MR fitting simulations, the composition should be prepared while maintaining the geometric consistency between real and virtual objects. If a piece of clothing that is to be superimposed covers the entire upper body, the corresponding part of the user's original clothes should not be visible. Unfortunately, this is not guaranteed by the simple method for superimposing virtual objects in MR. We adopt a multi-path rendering method to maintain the geometric consistency. In our multi-path rendering method, each element of the scene is rendered separately and then combined on the basis of the positional relationships between the elements. During this process, areas that should not be visible are deleted. The multi-path rendering is executed using the steps shown below; the intermediate results are shown in Figures 2(c) and 2(d).

1. Obtain the surrounding areas of the silhouette of the clothes by performing off-screen rendering of the clothes and image processing (dilation and subtraction).

2. Obtain the protruding areas by calculating the logical product of the surrounding areas and the wheelchair user's silhouette, which is obtained by off-screen rendering of the point cloud.

3. Obtain a simple MR composition result by superimposing the off-screen rendering result of the clothes onto the off-screen rendering result of the entire real scene.

4. Delete the protruding areas from the MR composition result.

5. Fill the deleted areas and the occluded areas of the RGB-D camera with proper textures from pre-captured background images or using inpainting methods (as in diminished reality).
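Steps 1, 2, 4, and 5 above operate on binary silhouette masks. A minimal sketch of the mask logic with NumPy/SciPy, assuming the off-screen renders are already available as boolean masks and RGB arrays (in the actual system these come from OpenGL, and background fill stands in for inpainting):

```python
import numpy as np
from scipy.ndimage import binary_dilation

def protruding_mask(clothes_mask, user_mask, width=3):
    """Steps 1-2: band around the clothes silhouette = dilation minus silhouette;
    protruding areas = band AND the user's silhouette (original clothes sticking out)."""
    band = binary_dilation(clothes_mask, iterations=width) & ~clothes_mask
    return band & user_mask

def composite(real_rgb, clothes_rgb, clothes_mask, protrude, background_rgb):
    """Steps 3-5: overlay the clothes on the real scene, then replace the
    protruding areas with background texture."""
    out = real_rgb.copy()
    out[clothes_mask] = clothes_rgb[clothes_mask]  # step 3: simple MR composition
    out[protrude] = background_rgb[protrude]       # steps 4-5: delete and fill
    return out
```

With only mask operations involved, the whole clean-up runs per frame at image resolution, independently of the scene's geometric complexity.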

2.4 Semiautomatic Registration

To superimpose virtual clothes onto the correct position in the image of the user, registration in the world coordinate system is required. Fortunately, the scale of the coordinate system defined by the RGB-D camera is already based on the scale of the real world, without the need for any special calibration process. We assume that the reconstructed real world and the virtual clothes are both based on the scale of the real world.

For the initial registration of the clothing, we implemented a simple semiautomatic registration method. Four control points are predefined for each piece of virtual clothing. The user assigns the four corresponding points on the clothes by clicking on the screen. The position can then be calculated by matching the centers of gravity of the two point sets, and the rotation by minimizing the sum of the distances between the corresponding points.
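The registration described above (translation from matched centroids, rotation by least squares over corresponding points) is the classical Kabsch/Procrustes solution. A minimal sketch, assuming the clothing's predefined control points and the clicked points are given as matched (N, 3) arrays; this is an illustration of the standard algorithm, not the paper's exact implementation:

```python
import numpy as np

def register(src, dst):
    """Rigid transform (R, t) minimizing sum of ||R @ src_i + t - dst_i||^2.

    The translation aligns the centroids (centers of gravity); the rotation
    comes from the SVD of the cross-covariance of the centered points (Kabsch).
    """
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)     # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t
```

Applying (R, t) to the clothing model places it on the user; with only four control points, the clicks should be well spread over the garment to keep the rotation estimate stable.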

3 RESULTS AND FUTURE WORK

As shown in Figure 2, the proposed functions were successfully implemented. The effect of the clean-up operation can be seen by comparing (e) and (f). In these images, the occluded areas of the RGB-D camera are accurately filled with the background image.

Figure 2: Results (a: before composition, b: clothing, c: surrounding area, d: deleted area, e: simple composition, f: result with clean-up, g: another viewpoint, h: another piece of clothing)

Figure 3: Photograph of the demonstration

We demonstrated the proposed system during the open house of AIST in Tsukuba, Japan. The demonstration setup is shown in Figure 3. In addition to about 100 participants who were not wheelchair users, four wheelchair users experienced the demonstration themselves. A summary of the feedback received from the users is as follows:

• The consideration given to wheelchair users was appreciated.

• The removal of the protruding areas was not perfect because of the user's freedom of movement.

• Visualization from a standing position is desirable, and visualization from the rear is also desired.

For future work, we will refine the prototype system according to the feedback received, especially with respect to tracking of the body and arms, in order to improve the users' freedom of movement and enable visualization from behind the user.

ACKNOWLEDGEMENTS

This work was supported by JSPS KAKENHI (No. 24240095).

REFERENCES

[1] Virtual mirror. http://www.hhi.fraunhofer.de/fields-of-competence/image-processing/research-groups/computer-vision-graphics/research/virtual-mirror.html.

[2] T. Ishikawa, K. Thangamani, M. Kourogi, A. P. Gee, W. Mayol, K. Jung, and T. Kurata. In-situ 3D indoor modeler with a camera and self-contained sensors. In Proc. of Int'l Conf. on Human-Computer Interaction, pages 454–464, 2009.

[3] S. Hauswiesner, M. Straka, and G. Reitmayr. Image-based clothes transfer. In Proc. of IEEE Int'l Symp. on Mixed and Augmented Reality, pages 169–172, 2011.