
MetaCookie+
Takuji Narumi¹, Shinya Nishizaka², Takashi Kajinami², Tomohiro Tanikawa² and Michitaka Hirose²

¹ Graduate School of Engineering, the University of Tokyo / JSPS
² Graduate School of Information Science and Technology, the University of Tokyo

ABSTRACT
In our research demonstration, we show "MetaCookie+", a system that enables the user to experience various tastes without changing the chemical composition of the food by exploiting interactions between modalities. It is a pseudo-gustatory display that combines the Edible Marker system, which can detect the state [number/shape/6-DOF coordinate] of each piece of bitten or divided food in real time, with a "pseudo-gustation" method that changes the perceived taste of food by changing its appearance and scent.

KEYWORDS: Gustatory Display, Pseudo-gustation, Edible Marker, Cross-modal Integration, Augmented Reality

INDEX TERMS: H.5.1 [Information Interfaces and Presentation (I.7)]: Multimedia Information Systems—Artificial, augmented, and virtual realities, Evaluation/methodology

1 INTRODUCTION
Recently, it has become easy to manipulate visual and auditory information with computers. However, there have been few display systems that present gustatory information [1]. One reason is that taste sensation is affected by other factors such as vision, olfaction, thermal sensation, and memory. The complexity of the cognitive mechanism behind gustatory sensation therefore makes it difficult to build a gustatory display that can present a wide variety of gustatory information. Our hypothesis is that this complexity can instead be exploited to realize a pseudo-gustatory display that presents desired flavors by means of cross-modal effects [2]. In our demonstration, we show our system for flavor augmentation using visual and olfactory AR technologies.

2 PSEUDO-GUSTATORY DISPLAY: METACOOKIE+
"MetaCookie+" (Fig. 1) is a pseudo-gustatory display that changes the perceived taste of a cookie by overlaying visual and olfactory information onto a real cookie printed with a special AR marker pattern. "MetaCookie+" comprises four components: a plain cookie printed with a marker pattern, a cookie detection unit based on the "Edible Marker" system (Fig. 2), which can detect the state [number/shape/6-DOF coordinate] of each piece of bitten or divided food in real time, a unit for overlaying visual information, and an olfactory display. In this system, a user wears a head-mounted visual and olfactory display. The cookie detection unit detects the marker-pattern-printed cookie and calculates its state [6-DOF coordinate/occlusion/division/distance between the cookie and the user's nose] in real time. Based on this state, an image of a flavored cookie is overlaid onto the cookie. Moreover, the olfactory display generates the scent of a flavored cookie with an intensity determined by the calculated distance between the cookie and the user's nose. The user chooses the cookie s/he wants to eat from multiple types, and the appearance and scent of the selected flavored cookie are overlaid onto the plain cookie. This realistic visual and olfactory augmentation evokes cross-modal effects between vision, olfaction, and gustation, allowing users to experience the selected flavor from the plain cookie.

We performed an experiment investigating how people experience the flavor of a plain cookie when using our system. In 79.0% of the trials, the participants experienced a change in the cookie's taste. Moreover, in 72.6% of the trials, the participants perceived the selected taste, i.e., a participant who chose the chocolate taste felt the taste of a chocolate cookie. These results suggest that our system can change the perceived taste.
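To make the scent-control step above concrete, the following is a minimal sketch that maps the distance between each detected cookie piece and the user's nose to a 0-1 scent intensity. The CookiePiece type, the nose offset, and the linear fall-off distance are assumptions for illustration, not the implementation used in MetaCookie+.

```python
import numpy as np
from dataclasses import dataclass

@dataclass
class CookiePiece:
    """One detected piece of the marker-printed cookie (hypothetical type)."""
    pose: np.ndarray  # 4x4 pose of the piece in camera coordinates

    @property
    def position(self) -> np.ndarray:
        # Translation component of the pose, i.e. the piece centre in metres.
        return self.pose[:3, 3]

# Assumed constants for illustration only.
NOSE_POSITION = np.array([0.0, -0.05, 0.0])  # nose offset from the camera (m)
MAX_SCENT_DISTANCE = 0.30                    # distance at which the scent fades out (m)

def scent_intensity(distance: float, max_distance: float = MAX_SCENT_DISTANCE) -> float:
    """Map the cookie-to-nose distance to a 0..1 scent intensity (closer = stronger)."""
    return float(np.clip(1.0 - distance / max_distance, 0.0, 1.0))

def scent_commands(pieces):
    """For each detected piece, compute the intensity the olfactory display
    should emit this frame (the visual-overlay step is omitted here)."""
    intensities = []
    for piece in pieces:
        distance = float(np.linalg.norm(piece.position - NOSE_POSITION))
        intensities.append(scent_intensity(distance))
    return intensities

# Example: one whole cookie held 10 cm in front of the camera.
pose = np.eye(4)
pose[:3, 3] = [0.0, 0.0, 0.10]
print(scent_commands([CookiePiece(pose)]))  # ~[0.63]
```

In the full system, the same per-piece state would also drive the rendering of the flavored-cookie texture onto each piece before the scent command is issued.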

Fig. 1 MetaCookie+

Fig. 2 Processing steps of the occlusion-and-division-detectable "Edible Marker" system

*1 e-mail: [email protected]
*2 e-mail: {nshinya, kaji, tani, hirose}@cyber.t.u-tokyo.ac.jp


3 HIROSE TANIKAWA LAB, THE UNIVERSITY OF TOKYO
The Hirose Tanikawa Lab at the University of Tokyo focuses on building a high-level user interface that unites human and computer, which we call a Cybernetic Interface. Starting from virtual reality technology, we research and develop such interfaces in detail. Specifically, we work on multimodal interfaces, mixed reality, wearable computers, and ubiquitous computing. We also study information visualization for huge data sets, for example life logs and genome science. In addition, we run many application projects based on these technologies: digital public art, digital museums, and the preservation of the implicit knowledge of traditional crafts.

3.1 Virtual Reality and Contents Research

3.1.1 Ultra Realistic Communication Research
The next generation of stereoscopic high-resolution displays provides "existence" as well as "presence". While developing a physical 3D display, we research a novel high-presence tele-workspace. We also develop 2D-to-3D technology and have implemented a 3D world built from 2D photos. Figure 3 shows the reconstruction of a museum that has since closed.

Fig. 3 3D Reconstruction using multiple 2D photos

3.1.2 VR-based Preservation of City Landscapes
Since it is impossible to preserve large-scale objects such as city landscapes in their real form, VR techniques that convert and record them as realistic virtual objects are becoming a popular approach. In this research, we study how to preserve dynamic aspects of a city landscape, such as daily city life, as well as static aspects, such as urban spaces.

3.1.3 Five Senses (including Haptic, Olfactory, and Gustatory Senses) Research

Most conventional information interfaces are based on the visual and auditory senses. In this research, we study and develop a novel integrated display system that can address all five senses, including the haptic, olfactory, and gustatory senses, in order to realize more realistic "existence" and "presence".

3.2 Life Log and Wearables Research

3.2.1 Life Log Record and Analysis of Eating Behavior
A life log is a long-term record of a person's behavior. In this research, we study and develop an automatic system that captures the sections related to eating from a mass of life log data and then analyzes and verbalizes their contents.

3.2.2 Wearable Computer for Mixed Reality Systems
Mixed Reality (MR) merges the real world with the virtual world in real time. In this research, we study a novel wearable computer system specialized for outdoor MR experiences.

3.2.3 RFID Textile
We are developing a flexible cloth with implanted RFID tags. Using the tags, we can obtain position data and enable interactions, as sketched below.
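As a minimal sketch of this interaction idea, assuming a hypothetical tag layout and reader output (not the lab's actual hardware or API), tag IDs detected near the user's hand could be mapped back to positions woven into the cloth:

```python
# Hypothetical layout: each RFID tag ID is woven into a known (row, column)
# cell of the fabric. The IDs and grid are illustrative only.
TAG_LAYOUT = {
    "E200-0001": (0, 0),
    "E200-0002": (0, 1),
    "E200-0003": (1, 0),
    "E200-0004": (1, 1),
}

def touched_cells(read_tag_ids):
    """Translate tag IDs reported by a reader into fabric cell positions."""
    return [TAG_LAYOUT[tag] for tag in read_tag_ids if tag in TAG_LAYOUT]

# Example: the reader reports two known tags and one unknown one.
print(touched_cells(["E200-0002", "E200-0004", "UNKNOWN"]))  # [(0, 1), (1, 1)]
```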

3.3 Applications in the Art Field

3.3.1 Digital Public Art
We believe that media art is in transition, evolving from conventional closed gallery art to so-called open digital public art. The motivation of our digital public art project comes from this context: building a new relationship between technology and art by creating innovative digital public media art works. We held the Digital Public Art in Haneda Airport "Air Harbor" exhibition from October 9th to November 3rd, 2009, at the biggest airport in Japan, aiming to bring media technology into public space. We exhibited 19 artworks. One of the largest is "Constellation of Departure" (Fig. 4), which uses 3,000 LEDs on the ceiling of Haneda Airport [3].

Fig. 4 Constellation of Departure

3.3.2 Digital Museum Research
In this research, we explore several potential methods for realizing Mixed Reality exhibitions. Concretely, we research how to naturally combine real objects with virtual information objects, for example by using gallery-talk robots, and ultimately develop feasible systems and contents for a real museum.

3.3.3 VR-based Communication and Analysis of Traditional Art and Skill

In this research, we study and develop a recording system for traditional art and skill, which is often hard to verbalize, based on advanced media techniques including VR. Furthermore, we research a traditional art and skill simulation system, which can automatically structure those data and communicate them to the next generation.

REFERENCES
[1] H. Iwata, H. Yano, T. Uemura and T. Moriya. Food Simulator: A Haptic Interface for Biting. Proc. of IEEE VR 2004, pp. 51-57, 2004.
[2] P. Rozin. "Taste-smell confusion" and the duality of the olfactory sense. Perception and Psychophysics, Vol. 31, pp. 397-401, 1982.
[3] M. Sato, et al. Particle Display System - A Large Scale Display for Public Space. ICAT, 2009.


