
Affective Property Evaluation of Virtual Product Designs1

Sangyoon Lee*, Tian Chen**, Jongseo Kim*, Gerard J. Kim*, Sungho Han*, Zhi-geng Pan**

*Departments of CSE and IE

Pohang University of Science and Technology (POSTECH)

Pohang, Korea

{sylee, creator7, gkim, shan}@postech.ac.kr

**State Key Laboratory of CAD and CG

Zhejiang University

Hangzhou, China

{chentian, zgpan}@cad.zju.edu.cn

1 This work was carried out at the POSTECH VR Lab, where Tian Chen spent her postdoctoral fellowship.

Abstract

In addition to the intended functionality of the product,

its affective properties (or “Kansei”) have emerged as

important evaluation criteria for the successful marketing

of the product. Recently, “immersive” virtual reality

systems have been suggested as an ideal platform for

affective analysis of an evolving design because of, among

other things, the natural style of interaction they offer

when examining the product. In this paper, we compare

the effectiveness of three types of virtual environments for

evaluating the affective properties of mobile phones to

that of the real. Each virtual environment offers different

degrees of realism in terms of visual, aural, and tactile

aspects. Our experiment has shown that the virtual

affective evaluation results correlated very highly with

those of the real environment, but no statistically significant

difference could be found between the three systems. This

finding was contrary to our initial thought and the

conventional notion that the characteristics of immersive

virtual reality systems would contribute to making it a

better platform for virtual evaluation of product designs.

Thus, employing immersive systems is not necessarily a cost-effective solution for affective analysis of product designs (a desktop VR system suffices).

1. Introduction

Broadly speaking, product design is often viewed as an

iterative process of specifying/adjusting (user)

requirements, synthesizing/modifying solutions, and

evaluating them. Designers and engineers often have a

hard time trying to resolve requirement conflicts between

the affective or aesthetic properties and technical

feasibility (e.g. functionality and manufacturability). In

many cases, satisfying the intended functionalities of the

product is given a higher priority, because the affective or

aesthetic aspect can only be evaluated after the design is

complete through field-testing.

Nevertheless, a product’s affective properties (or “Kansei” in Japanese) have recently emerged as an important factor in the successful marketing of the product, and thus it has become necessary to equally emphasize the

affective properties as important evaluation criteria in the

design process, preferably in the early stages [1].

Affective properties refer to consumers’ psychological

feeling about a product, and studies have shown that they

can roughly be mapped into perceptual design elements

for possible design modification toward higher customer

satisfaction [2, 3, 4, 5]. Affective properties of products

“in design” (vs. after design) can partially be assessed

using the near photo-realistic graphic rendering feature of

the computer-aided design tools, or rapid prototyping tools

that can produce a physical mock-up. These models can

be, for example, presented to a particular (e.g. males in their 20s) subject group (representing the main

customers) to extract the mapping relationship between

the induced feeling and design features.

Recently, “immersive” virtual reality systems have

been proposed as a more effective platform for affective

analysis of an evolving design because of, among other

things, the natural style of interaction they offer when

examining the product, such as the use of direct and

proprioceptive interaction, head tracking and first person

viewpoint, and multimodality [6, 7], compared to a

desktop graphic rendering. Note that physical mock-ups,

the other alternative in product evaluation, are limited in

size and cannot simulate internal behavior (e.g. moving

parts, embedded software).

In this paper, we compare three types of virtual

environments (VE) for evaluating the affective properties

of mobile phones. Each virtual environment offers

different degrees of realism in terms of visual, aural, and

tactile aspects. The first is the desktop VR system in

which a realistic graphic rendering is shown through a

monitor and keyboard/mouse based interaction is used to

examine and simulate the behavior of the mobile phones.

The second environment uses an immersive display with


visual, aural and tactile feedback, and direct interaction

with a spatial context. The third environment is similar to

the second and only differs in that it shows the real hands

(instead of virtual hands), using a computer vision

technique. This VE configuration is included in the study

based on prior research results reporting that the user’s perception changes when realistic images of one’s own body parts are shown during direct interaction [8].

The purpose of this study is to evaluate the

effectiveness of these virtual environments, comparing

them to a ground case: the affective assessment using the

real product. Our initial hypothesis and intuition is that

the immersive virtual environments, especially the one

that shows the real hands (implemented in a wireless

fashion), would produce results that correlate very

highly with those of the real environment, and outperform the

desktop configuration by a wide margin.

2. Related work

Design is regarded as one of the most typical virtual

reality applications, because design is generally a 3D task.

Naturally, there have been many attempts to apply VR to

CAD and scene modeling [9, 10, 11, 12]. The main

research issue in these works is whether VR, or more

narrowly 3D multimodal and immersive (and even

stereoscopic) interfaces, would produce a faster, better and

more intuitive design tool. The results have been mixed:

using VR seems more productive only for specialized design applications (e.g. large-scale designs, designing at the scene), because conventional design still requires 2D tasks (like sketches) and alphanumeric data input, capabilities that VR systems still largely lack. Another

useful application of VR in the context of product

development is in simulation. For instance, for domains

involving human factors such as airplane and land vehicle

design [12], factory simulation [13], assembly planning

and reachability analysis [14], VR has proven its

usefulness over its desktop counterpart. Design

review has become one of the major productive

applications of VR. Accomplishing design evaluation

tasks in immersive virtual environments enables designers

to evaluate and iterate multiple alternative designs more

efficiently and cheaply.

Researchers at VTT Electronics and the University of

Oulu in Finland have developed a VR-based mobile phone

development and evaluation platform (coincidentally, in the same application domain as ours). Figure 1 on the Color Plate

shows the snapshot of the system in use in which the user

is evaluating the designed product on a stereoscopic

desktop display and using a haptic device [15]. The

objective of our study is to validate that such a system

would indeed elicit emotional responses similar to real

products.

In fact, the main objective of virtual reality is to fool

the user into believing that one is in another synthetic

environment, and this can be interpreted as recreating a

scene that elicits emotions or affective properties similar

to that of the real scene. This concept is often called

“Presence” and dubbed informally as the feeling of “being

there” [16]. Various studies on presence have indicated

the importance of interaction, multimodality, and context

of use [16, 17], which are distinguishing features of VR

from desktop computing environments. Likewise, we

believe that these are very important in affective

evaluation of virtual products.

On the other hand, Kansei engineering has been

applied (and its effect verified) to many different product

domains such as automobiles, electronics and appliances,

home and office supplies, etc., in order to establish a “Kansei” model that reflects the user’s feeling in the respective

product design as much as possible [1]. In the case of

mobile phones, Han et al. have developed a model that can

predict the affective satisfaction level based on the 56

design feature selections (e.g. color, key-pad style, overall

shape, location of the power button, etc.) [3, 5].

3. Features of virtual affective evaluation

systems

The purpose of this study is to compare the

effectiveness of different virtual product evaluation

systems and help one choose the most cost-effective solution for a given product. In this section, we stipulate the features that are deemed important for virtual design

evaluation systems to be as effective as the real. These

features are implemented in the three virtual evaluation

systems and tested to see whether they are indeed influential. We note that in this study, we have selected the mobile phone as the target design whose affective properties are sought, under the assumption that it represents a class of designs that are portable (small-sized) and interactive (e.g. MP3 players, PDAs, pagers, walkmans, palmtop computers).

This influenced the selection of important system features

to be examined in this study.

3.1. Requirements of virtual affective analysis

system

It goes without saying that the foremost requirement of

any virtual affective analysis system would be to provide

sufficient visual realism, especially for the product itself

(vs. the scene it is included in). The problem is that it is

difficult to quantitatively specify the amount of required

realism. While current computer graphic rendering and

modeling tools offer photo-realistic image qualities and

high geometric correspondences, there are other visual

cues such as texture, internal depth and concavity, that


such computer graphics cannot convey effectively. In this

study, all the tested platforms use the same graphic model

(See Figures 3, 4 and 5 on the Color Plate); we instead investigate whether other factors such as natural interaction and

tactility make any difference in judging the affective

properties of a (virtual) mobile phone. Another related

requirement is to have the virtual product match the real

one in terms of the size. We believe that this is especially

important for small sized products such as the mobile

phone.

Our second requirement for an effective affective

analysis platform is reconfigurability; that is, the system

should easily allow testing of many similar design

alternatives. Providing reconfigurability in terms of the

visual or aural feedback is relatively easy (just create a

new or modify an existing model and render; reprogram

the sound feedback); however, this is not so for tactile feedback (if this feature is indeed necessary). We will come back

to this issue later in the section. As part of

“reconfigurability”, we believe that the ease of changing the context is very important. The context refers to a

particular scenario or surrounding scene in which the

product is examined. For instance, different affective

responses may be obtained when the mobile phone is used

at home, in an office, outdoors, and under different

situations. Testing many real design mock-ups under

many different situations would be practically impossible.

Desktop CAD systems, compared to VR systems, are by design also not so amenable to creating different

product usage contexts.

The third probable requirement for effective affective

evaluation of small-sized hand-held products is the direct

interaction. A related requirement to direct interaction is

the provision of tactile/haptic modality in addition to the

usual and relatively easy to provide visual and aural

interaction. It has been generally established that

multimodality increases the effectiveness of the interaction

in virtual environments [18]. In addition, it has been

found that users strongly prefer multimodal interfaces to unimodal ones [19, 20, 21]. Haptic feedback, in particular, is one of the most important modalities when direct manipulation is required in the task [18].

However, current haptic/tactile devices and

technologies are still far short of providing practical

solutions to generating realistic force/texture feedback.

The popular ground-based systems, such as Sensable’s PHANToM manipulators, can only simulate forces and textured surfaces at a point contact [22, 23] (also shown in Figure 1 on the Color Plate). Exoskeleton glove devices such as the CyberGrasp system [24] are very expensive

and inconvenient to use.

An alternative approach is to use the “prop”. A prop is

an interaction device that represents the virtual object (to

be interacted with), and whose shape and/or appearance

match that of the actual physical object [25]. Props can be

spatially registered with virtual objects providing

inexpensive physical feedback to the user. Hoffman refers

to the technique as “tactile augmentation” [26]. Props

allow us to add inexpensive physical and tactile feedback,

significantly increasing presence for immersive

environments and establishing a common frame of

reference between the device and desktop 3D user

interfaces. The introduction of tactile augmentation

allows us to explicitly control the realism of virtual

environments. The disadvantage of props is that each prop represents only one object.

In the light of this, designing a prop (or interaction

device) that looks exactly like the actual phone (to be

tested) is not only restrictive in its applicability, but also

defeats the very purpose of using virtual products (that is,

we would like to eliminate the need of building physical

mock-ups or prototypes as much as possible). Our

proposal is to design a “reconfigurable” prop that

represents a family of products. For instance, as for the

mobile phones we are testing, the designed prop is just a

flat rectangular box (as most mobile phones are roughly

rectangular) with push button switches on it (this is of

course not what the user would see but only feel in one’s

hand). Figure 2 on the Color Plate shows the prop used in

our experiment to provide the tangibility and interactive

feeling of the mobile phones. Currently, we have placed only two switches for a few simple functions. Certainly, more

such event generators (like dials, buttons, etc.) can be

added to suit the particular purpose of the system. The

tracked prop appears in the virtual space as various mobile phones, and even though the size and shape of what appears on screen and of what is held in the hand do not match perfectly, we hypothesize that the benefit of having the tangibility outweighs this disadvantage.

The use of a “representative” prop (that is, one that

only represents the actual object, but is not the actual object)

necessitates the use of a head mounted display (rather than

a desktop or projection display) so that the virtual product

(or mobile phone) can be rendered at the position of the prop in the virtual space without distraction. This required use of an HMD does make the system less convenient for the user and more expensive overall.

There are HMDs with reasonable resolution that are quite affordable (~$2,000), comparable to the price of an LCD monitor. The HMD model used in our experiment, the SONY Glasstron, weighs only about 120 g and was designed for prolonged TV watching or game playing. We project that the narrow field of view (FOV) of these low-end HMDs will not be a significantly negative factor because the target product is small enough to fit within the screen (a rough estimate is given at the end of this paragraph). Yet,

another problem with the HMD is that, in the virtual space,

the users have to interact with the target product using

virtual hands, and this is expected to reduce the feeling of


directness and realism. Not knowing how significant this

factor will be, we propose two immersive VR platforms:

one that shows virtual hands (See Figure 3 on the Color

Plate) and another, the real hands, segmented out and

drawn by using a computer vision technique (See Figure 4

on the Color Plate). Thus, overall, we believe that the

benefit of having a passive haptic feedback outweighs the

disadvantage of having to resort to the HMD.
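As a rough check of the projection above that the product fits comfortably within the narrow FOV, consider a back-of-the-envelope estimate under assumed (not measured) values: a mobile phone about 10 cm tall, held roughly 40 cm from the eyes, subtends

    2 · arctan((0.10 m / 2) / 0.40 m) ≈ 14°,

which is well within the 31.5-degree diagonal FOV reported for the HMD in Section 3.2.2.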

To summarize, according to the presupposed

requirements of a VR-based affective analysis system, three experimental set-ups are proposed according to the

number of hypothetically desired system features. All of

them offer highly realistic visual and aural feedback, but

mostly differ in the style of interaction. The first is the

desktop VR system in which a realistic graphic rendering

is shown through a monitor and keyboard/mouse-based

interaction is used to examine and simulate the behavior of

the mobile phones. Both the second and third systems are

immersive VR-based and both offer visual, aural and

passive haptic (using the reconfigurable prop) modalities

in interacting with the product through an HMD, as well as a way to reconfigure their content to create various product usage scenarios and sceneries. They differ in that one

shows virtual hands (hence, referred to as Env. II), and the

other, real hands (hence, referred to as Env. III). The

image segmentation technique used to show the real hands

in the second immersive VR platform (Env. III) may introduce an incoherent image, because of the size and shape difference between the actual phone and the prop. That is, the user’s actual hands, which are segmented from the camera image, are holding the prop, not the actual phone; thus, when the virtual phone image is overlaid on the user’s hands, the fingers may look unnatural or a slight occlusion problem can occur (See

Figure 4 on the Color Plate). However, as already

mentioned, it has the advantage of inducing more

directness in terms of interacting with the target design.

The details of the system architectures are described in the

subsequent subsections.

3.2. The system architectures of the three tested

virtual platforms

In this section, we describe the system architecture and some implementation details of the three proposed VR systems for affective analysis of mobile

phones. The basic architecture of the three systems is the

same, except for the user interface part.

3.2.1. Env. I: Desktop VR. Figure 5 on the Color Plate

shows what the user sees and interacts with (using the keyboard and mouse) on the desktop monitor. The mobile phones to be evaluated are modeled using the 3D MAX modeling tool and look quite realistic, employing texture mapping techniques and lighting effects. The virtual model can be translated, rotated and scaled by keyboard presses, mouse clicks and drags. Simple behaviors like pressing the

buttons and opening and closing the phone can be invoked

and a sound feedback is provided (e.g. button press beeps,

sound effects for power on and off, etc.)
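As an illustration of this style of interaction, the sketch below maps keyboard/mouse events to model transforms and simple behaviors with sound feedback. It is a minimal, hedged example using pygame, not the actual implementation of Env. I; the key bindings and the "beep.wav" asset are assumptions.

    # A minimal sketch (not the authors' implementation) of Env. I-style
    # interaction: keyboard/mouse events drive the pose of the phone model,
    # and a button press triggers sound feedback.
    import pygame

    pygame.init()
    screen = pygame.display.set_mode((800, 600))
    beep = pygame.mixer.Sound("beep.wav")          # assumed sound asset

    yaw, pitch, scale = 0.0, 0.0, 1.0              # pose of the virtual phone
    phone_open = False

    running = True
    while running:
        for event in pygame.event.get():
            if event.type == pygame.QUIT:
                running = False
            elif event.type == pygame.KEYDOWN:
                if event.key == pygame.K_LEFT:     # rotate the model
                    yaw -= 5.0
                elif event.key == pygame.K_RIGHT:
                    yaw += 5.0
                elif event.key == pygame.K_UP:     # scale (zoom) the model
                    scale *= 1.1
                elif event.key == pygame.K_DOWN:
                    scale /= 1.1
                elif event.key == pygame.K_SPACE:  # open/close the phone
                    phone_open = not phone_open
                    beep.play()                    # sound feedback
            elif event.type == pygame.MOUSEMOTION and event.buttons[0]:
                dx, dy = event.rel                 # dragging rotates the model
                yaw += 0.3 * dx
                pitch += 0.3 * dy
        # (rendering of the textured phone model would go here)
        pygame.display.flip()

    pygame.quit()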

3.2.2. Env. II: Immersive VR platform 1. Figure 6

shows the architecture of the immersive VR platform

comprising the computer that processes the sensor input and renders the virtual scene with the virtual hands, the virtual mobile phone, sound effects, etc. One ultrasonic

sensor is used to track the viewpoint (i.e. head tracking)

and another magnetic sensor for the arm. The choice is

simply due to the operating range and accuracy. The

ultrasonic sensor is much more accurate but has only

limited operating range, while the magnetic sensor’s

accuracy is much affected by the surrounding environment.

We assumed that the head would not move much while

the hand needed a wider working space. Only one sensor

is used to track the hand and the prop. Based on the position and orientation of the hand/prop, the arm was drawn automatically from a simple inverse kinematics relation (See Figure 3 on the Color Plate).
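The paper does not spell out this inverse kinematic relation; the following is a minimal sketch of the kind of computation involved, reduced to a two-link planar arm (upper arm and forearm) anchored at a fixed shoulder. The link lengths and the shoulder position are illustrative assumptions, not values from the system.

    # A two-link analytic IK sketch: given the tracked hand/prop position,
    # compute shoulder and elbow angles so that the drawn arm reaches it.
    import math

    L1, L2 = 0.30, 0.28            # assumed upper-arm / forearm lengths (m)
    SHOULDER = (0.0, 0.0)          # assumed shoulder anchor in the arm plane

    def solve_arm(hand_x, hand_y):
        """Return (shoulder_angle, elbow_angle) in radians."""
        x, y = hand_x - SHOULDER[0], hand_y - SHOULDER[1]
        d = math.hypot(x, y)
        # Clamp the target into the reachable annulus to avoid domain errors.
        d = max(abs(L1 - L2) + 1e-6, min(L1 + L2 - 1e-6, d))
        # Elbow bend from the law of cosines.
        cos_elbow = (d * d - L1 * L1 - L2 * L2) / (2.0 * L1 * L2)
        elbow = math.acos(max(-1.0, min(1.0, cos_elbow)))
        # Shoulder angle: direction to the target minus the bend offset.
        shoulder = math.atan2(y, x) - math.atan2(L2 * math.sin(elbow),
                                                 L1 + L2 * math.cos(elbow))
        return shoulder, elbow

    def joint_positions(hand_x, hand_y):
        """Forward pass used for drawing: shoulder, elbow and wrist points."""
        s, e = solve_arm(hand_x, hand_y)
        elbow_pos = (SHOULDER[0] + L1 * math.cos(s), SHOULDER[1] + L1 * math.sin(s))
        wrist_pos = (elbow_pos[0] + L2 * math.cos(s + e),
                     elbow_pos[1] + L2 * math.sin(s + e))
        return SHOULDER, elbow_pos, wrist_pos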

Figure 6. The architecture for the Environment II, the immersive VR version

For both Env. II and III, an HMD, the i-Visor DH-4400 VPD, was used. It has a 31.5-degree diagonal FOV and an 832 x 632 LCOS 0.49-inch microdisplay panel.

As explained previously, wireless button devices are

attached on the prop to simulate the keys and buttons of

the actual mobile phones. Both systems use the Logitech

3D mouse for head tracking. However among the 6

degrees of freedom, the yaw angle was disabled for

convenience.

3.2.3. Env. III: Immersive VR platform 2 (Mixed

Reality). In the Env. III, a camera system, attached on the


HMD, is used to track the prop and segment the image of

the hand/fingers holding the prop. In order to track the

prop using computer vision, 10 yellow markers are put on

the blue colored rectangular prop (See Figure 2 on the

Color Plate). From the markers, the position and

orientation of the prop are computed and transformed into the world coordinate frame, in which the virtual mobile phone is drawn first (the camera parameters are known because the

pose of the camera relative to the head is fixed and known,

and the head is being tracked). An algorithm called the

ICP [27, 28] is used to continuously track the 10 markers

between successive frames. Meanwhile, the

hands/fingers are segmented out of the image using an

algorithm called the Pyramid Linking [29, 30] and

overlaid on the virtual image. In this way, the finger

image is not occluded by the mobile phone (See Figure 4

on the Color Plate).
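A minimal sketch of the two steps just described, not the authors' implementation: (1) chaining the fixed head-to-camera calibration and the head-tracker pose to express the marker-estimated prop pose in world coordinates, and (2) compositing the segmented hand pixels over the rendered frame so that the fingers are never occluded by the virtual phone. All matrices, image sizes and the mask are illustrative assumptions.

    import numpy as np

    def prop_pose_in_world(world_T_head, head_T_cam, cam_T_prop):
        """Chain 4x4 homogeneous transforms: world <- head <- camera <- prop."""
        return world_T_head @ head_T_cam @ cam_T_prop

    def composite_hands(rendered_rgb, camera_rgb, hand_mask):
        """Overlay segmented hand pixels onto the rendered frame.

        rendered_rgb : HxWx3 frame with the virtual phone drawn at the prop pose
        camera_rgb   : HxWx3 frame from the head-mounted camera
        hand_mask    : HxW boolean mask from the hand/finger segmenter
        """
        out = rendered_rgb.copy()
        out[hand_mask] = camera_rgb[hand_mask]     # hands are drawn on top
        return out

    # Toy usage with an identity calibration and synthetic images.
    if __name__ == "__main__":
        I = np.eye(4)
        cam_T_prop = np.eye(4)
        cam_T_prop[:3, 3] = [0.0, -0.1, 0.4]       # prop ~40 cm in front of the camera
        print(prop_pose_in_world(I, I, cam_T_prop)[:3, 3])

        h, w = 480, 640
        rendered = np.zeros((h, w, 3), dtype=np.uint8)
        camera = np.full((h, w, 3), 128, dtype=np.uint8)
        mask = np.zeros((h, w), dtype=bool)
        mask[200:300, 250:400] = True
        frame = composite_hands(rendered, camera, mask)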

3.3. Env. 0: The ground case (Real Platform)

Figure 7 on the Color Plate shows the real product

evaluation environment. It simply consists of a desk on

which the mobile phones are placed for the user, sitting on a chair, to pick up and examine. The same physical space is used for the subsequent experiments with Env. I, II, and III (for convenience’s sake).

Table 1. The definitions of the six affective satisfaction indices [31]

Affective Index   Definition
Texture           The image of a product developed by its texture
Attractiveness    Degree to which a product is pleasing, charming, and arousing interest
Granularity       Degree to which a product is worked out with great care and in fine detail
Harmoniousness    Feeling that the components of a product are well matched or in harmony
Simplicity        Degree to which the user feels a product to be simple or tangled
Rigidity          Feeling that a product looks stout, stable, and secure

4. Evaluation indices for affective properties

The affective analysis of (virtual) mobile phones was

carried out using an index scheme and scoring method

developed by the co-authors in a previous study [3, 4, 31].

In that study, six indices were introduced to express the

affective feelings of the users toward the product: texture,

attractiveness, granularity, harmoniousness, simplicity,

and rigidity. Similar to this methodology, the participants

in this study also evaluated and graded the virtual product

in the scale of 0 to 100 along the six representative

affective indices. The definitions of these six indices are

described in Table 1.

5. Experiment

5.1. Evaluation environments

We already roughly described the physical evaluation

environment used in the experiment. As seen in Figure 7

on the Color Plate, the environment is enclosed by a blue

curtain. Although one of the strengths and requirements

of a VR evaluation platform is the provision of context

and scenarios, this feature was deliberately left out so that the comparison with the desktop system would rest only on direct and tactile interaction. This simple

evaluation space is recreated, including the lighting effect,

in the virtual space when Env. I, II and III are used.

5.2. Experimental procedure

Thirty-two participants performed the task of

examining and operating the three different models of

mobile phones in the four test environments (that resulted

in 3x4 = 12 different test combinations). Each participant

visited our VR lab at an appointed time and experienced

all 12 combination trials in different orders (within

subject test). Trying out each combination lasted

approximately one hour, including the training, the task

itself, filling out the questionnaires, and the post briefing.

The age of the participants ranged from 21 years to 32

years. Among them were 10 females and 26 males.

Stage I. Preliminaries

A participant was first asked to read the Participant

Instructions which elaborated on the purpose of the

experiment, the definitions and related concepts of the

evaluation indices, points for attention during the

evaluation as well as what the participant would be doing.

The participant’s personal information, such as gender, age and vocation, was collected.

Stage II. Training session

To understand how to operate and carry out the

required task in a given system (and avoid misuses), the

participants were given a short period of training prior to

the actual experiment. The training involved carrying out

a task, similar to that of the experimental task, in all of the

three test environments, using yet another virtual mobile

phone model different from the ones used in the actual test.

Stage III. Session Evaluation

After experiencing each of the 12 test configurations,

the participants rated, on a scale of 0 to 100, the (virtual) mobile phone in terms of the six evaluation indices for

affective satisfaction described in Section 4. In order to


enhance the statistical reliability and avoid order effects, the order of the 12 trials was arranged in circular

combination for every session according to the Balanced

Latin Square Design Methodology [32]. Examining the

product to assess its affective qualities meant that the users zoomed in and out of the product (desktop case) or brought the product closer or farther away, rotated it, and played with the switches to turn the phone on or off and to open and close it.
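The balanced ordering mentioned above can be generated, for example, with the standard construction sketched below for an even number of conditions; this is one common way to build such a design and is not necessarily the exact procedure used in the study. With 12 conditions (3 phones x 4 environments), each row gives one participant's trial order.

    def balanced_latin_square(n):
        """Each condition appears once per row, and each condition follows
        every other condition equally often across rows (even n only)."""
        if n % 2 != 0:
            raise ValueError("this construction assumes an even number of conditions")
        # Standard zig-zag first row: 0, 1, n-1, 2, n-2, 3, ...
        first, lo, hi = [0], 1, n - 1
        while len(first) < n:
            first.append(lo)
            lo += 1
            if len(first) < n:
                first.append(hi)
                hi -= 1
        # Each subsequent row shifts every entry by +1 (mod n).
        return [[(c + r) % n for c in first] for r in range(n)]

    if __name__ == "__main__":
        conditions = [(phone, env) for phone in (1, 2, 3)
                      for env in ("0", "I", "II", "III")]
        square = balanced_latin_square(len(conditions))    # 12 x 12
        for participant, row in enumerate(square[:3]):     # first three orders
            print(participant, [conditions[i] for i in row])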

Stage IV. Post-briefing

At the end of the experiment (after trying out all 12

combinations), the subjects graded each of the three test

environments in the scale of 0 to 100, as to their

similarities to the real environment. They also answered

the five questions shown in Table 2.

Table 2. The comprehensive questionnaire

No.  Question
1    Have you ever observed the same or a similar mobile phone to the ones used in this experiment?
2    What was the most difficult affective factor to rate?
3    Did any abnormal incident occur to you?
4    Was there any inconvenience in carrying out the task?
5    How did you assess the size of the mobile phone as shown in the VR system?

Table 3. Significant factors revealed by the two-way within-subject ANOVA (α = 0.05; Subj: Subject, Prod: Product, Sys: System)

Factor          Indices with a significant effect
Subj            none
Prod            five of the six indices
Prod*Subj       none
Sys             Texture, Attractiveness, Granularity, Rigidity
Sys*Subj        none
Prod*Sys        Texture, Granularity
Prod*Sys*Subj   none

6. Results and discussion

We applied a two-way within-subject ANOVA on the

data collected in the evaluation session. Table 3 shows the

significant factors revealed by the analysis (indicated by

the circles). There were statistically significant

differences among the evaluation environments (systems)

for all of the affective properties except for

harmoniousness and simplicity. However, the SNK

(Student-Newman-Keuls) post hoc comparison revealed

that only the real environment (Env. 0) was in a different

group, that is, there was no significant difference among

the three virtual evaluation systems. This means that our

initial hypothesis is rejected. According to the result,

tactile augmentation and natural interaction have no

statistically significant effect on the affective evaluation.

For the affective evaluation, it seems that the visual factor

strongly dominates over the others.
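For readers who wish to reproduce this kind of analysis, the sketch below runs a two-way repeated-measures ANOVA on synthetic ratings using statsmodels; it is illustrative only (column names and data are made up), and since the SNK test is not available in statsmodels, a Tukey HSD comparison is shown as a stand-in for grouping the environments.

    import numpy as np
    import pandas as pd
    from statsmodels.stats.anova import AnovaRM
    from statsmodels.stats.multicomp import pairwise_tukeyhsd

    rng = np.random.default_rng(0)
    systems = ["Env0", "EnvI", "EnvII", "EnvIII"]

    # One "texture" rating (0-100) per subject x product x system cell.
    rows = [
        {"subject": s, "product": p, "system": e,
         "texture": float(np.clip(rng.normal(60 + 5 * p, 10), 0, 100))}
        for s in range(36) for p in (1, 2, 3) for e in systems
    ]
    df = pd.DataFrame(rows)

    # Two-way within-subject ANOVA: product and system as repeated factors.
    res = AnovaRM(df, depvar="texture", subject="subject",
                  within=["product", "system"]).fit()
    print(res.anova_table)

    # Post hoc grouping of the evaluation environments (stand-in for SNK).
    print(pairwise_tukeyhsd(df["texture"], df["system"], alpha=0.05))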

The average affective scores rated with the virtual

systems were significantly below the scores given in the

real environment (Env. 0). However, the relative ranking

among the virtual products (i.e. three mobile phones)

corresponded with the ranking among the real products.

This means the VR systems can be used for relative

comparisons among products.
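The correspondence of rankings can be checked, for instance, with a rank correlation of the per-product mean scores between each virtual environment and the real one; the sketch below uses Spearman's rho with made-up numbers, purely for illustration.

    from scipy.stats import spearmanr

    # Mean score per product (phones 1-3) in each environment; values are made up.
    mean_scores = {
        "Env0":   [72.0, 65.0, 58.0],   # real phones (ground case)
        "EnvI":   [61.0, 55.0, 50.0],   # desktop VR
        "EnvII":  [60.0, 54.0, 47.0],   # immersive VR, virtual hands
        "EnvIII": [63.0, 56.0, 49.0],   # immersive VR, real hands
    }

    real = mean_scores["Env0"]
    for env in ("EnvI", "EnvII", "EnvIII"):
        rho, p = spearmanr(real, mean_scores[env])
        print(f"{env}: Spearman rho = {rho:.2f} (p = {p:.2f})")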

Figure 8. Interaction graph between products and systems for the texture property (1: Env. 0, 2: Env. I, 3: Env. II, 4: Env. III)

Figure 9. Interaction graph between products and systems for the granularity property (1: Env. 0, 2: Env. I, 3: Env. II, 4: Env. III)

There were significant interactions between the

products and environment types for the affective

properties of texture and granularity. Figures 8 and 9 illustrate the interactions with the SNK groups depicted in dotted rectangles. For the texture property, the results after SNK grouping show that products 1 and 2 are in one


group and product 3 in another for Env. 0, Env. II and Env. III, but all of the products were in a single group for Env. I.

That is, there is no statistically significant difference

among the products in the desktop VR environment (See

Figure 8), and thus the tactile augmentation from the prop

does have an effect on the evaluation for the texture

property (but not so for other affective properties). The

post-briefing questionnaires revealed that 15 subjects

thought that texture had been the most difficult aspect to

evaluate. We believe that the tactile augmentation had a

positive effect for the evaluation of texture.

As for the granularity property, the SNK grouping in

Env. III (a mixed reality platform) is different from those

in the other environments (See Figure 9). We suspect that this is because, in Env. III, the users’ hand images were not exactly registered with the virtual products due to segmentation noise and the lack of depth consideration. Furthermore, there were occasional flickers in displaying the virtual products due to incorrect tracking when markers were occluded by the user’s hand.

In the post-briefing questionnaire, five subjects

complained of cyber-sickness. Thirty-two subjects

reported that it was somewhat difficult to move and orient

the virtual hands in Env. II, and four subjects noticed the

prop tracking problem due to the image flickers. Nineteen

subjects informed that they compared the virtual phones to

their (virtual) hands in order to assess the size of the

virtual products in Env. II and Env. III, and nine subjects

said that they did not consider the size at all. In this

experiment, the size perception of the products was not a

factor under evaluation. However, Env. III can be a good

candidate for evaluating size-related affective properties,

since the platform shows the actual hands in their actual

size.

Figure 10. The similarity scores

Figure 10 shows the similarity scores between the real

and the three virtual test environments. The one-way within-subject ANOVA revealed that there were statistically significant differences among the scores (F(2, 105) = 6.44, p < 0.01). The SNK grouping test showed that Env. III was in a different group from Env. I and Env. II (α = 0.05). Thus, according to the analysis, Env. III was the

environment most similar to the real one. Although its similarity score was high, this still did not have an effect on the affective evaluation. This means that the natural interaction of the simple tasks did not relate strongly to the affective evaluation.

7. Conclusions and future work

In this paper, we have compared the effectiveness of

three types of virtual environments, namely desktop VR,

immersive VR and immersive VR with real hands, for

evaluating the affective properties of mobile phones to

those of the real environment. Our experiment has shown that the virtual affective evaluation results correlated very highly with the real ones, and that no statistically significant difference could be found among the three virtual systems. This finding has

rejected our hypothesis on the positive effects of

immersion and natural 3D multimodal interaction.

We believe that this result is partially due to the fact

that the affective analysis task employed in the study

required mostly visual assessment, as most affective analyses do. Thus, in terms of cost, employing immersive systems is not necessarily a cost-effective solution for affective analysis of product designs (a desktop VR system suffices).

On the other hand, the analysis indicated the

importance of enhanced interaction for evaluating the

texture property. The task employed in the study was

probably too simplistic for the natural interaction of Env.

II and III to be clearly more effective than the desktop

counterpart. Moreover, this study did not consider the use

of environmental context, which is expected to be much more effective when using immersive VR. Our future

work is to further investigate these aspects, in relation to

the proposed requirements for VR-based affective analysis

systems.

Acknowledgement

This work was supported in part by the BK 21 program of

the Korean Ministry of Education, and the ITRC program

of the Korean Ministry of Information and

Communication.

References

[1] Nagamachi, M., “Kansei Engineering as a Powerful

Consumer-oriented Technology for Product Development”,

Applied Ergonomics, Vol. 33, No. 3, 2002, pp 289-294.

[2] Kim, K., Han S. H., Yun, M. H., and Kwahk, J., “A

Systematic Procedure for Modeling Usability Based on

Product Design Variables: A Case Study in Audiovisual


Consumer Electronic Products”, International Journal of

Occupational Safety and Ergonomics, Vol. 8, No. 3, 2002.

[3] Han, S. H., Kim, K., Kim, G. J., and Yun, M. H., “Usability

Evaluation Methodology for Home/Office Products”,

Technical Report (M1-9817-03-0003), Ministry of Science

and Technology of South Korea, 2002.

[4] Hong, S. W., Kim, K. J., and Han, S. H., “Optimal

Balancing of Product Design Features: A Case Study on

Mobile Phones”, in Proceedings of the Human Factors and

Ergonomics Society 46th Annual Meeting, Baltimore,

Maryland, U.S.A., 2002, pp. 700-704.

[5] Yun, M. H., Han, S. H., Kim, K., Kwahk, J.,

Hong, S. W., and Kim, J., “Incorporating User Satisfaction

into the Look-and-Feel of Wireless Phones”, in Proceedings

of the International Conference on Affective Human Factors

Design, Singapore, 2001, pp. 423-429.

[6] Vince, J., Virtual Reality Systems, Addison-Wesley Pub.

Co., 1995.

[7] Stuart, R., Design of Virtual Environments, Barricade

Books, 2001.

[8] Slater, M., and Usoh, M., “Body Centered Interaction in

Immersive Virtual Environments”, in N. Magnenat

Thalmann and D. Thalmann (eds.) Artificial Life and

Virtual Reality, John Wiley and Sons, 1994, pp. 125-148.

[9] Aukstakalnis, S., “Virtual Reality and Experiential

Prototyping of CAD Models”, DesignNet, Vol. 1, No. 1,

1992.

[10] Liang, J., and Green, M., “JDCAD: A Highly Interactive 3D

Modeling System”, Computer and Graphics, Vol. 18, No. 4,

1994.

[11] Murakami, H., Nishimura, Y., Impelluso, T. J., and Skelton,

R. E., “A Virtual Reality-based CAD System for Tensegrity

Structures”, in Proceedings of the 12th ASCE Engineering

Mechanics Conference, 1998.

[12] Davies, C. R., “Applications of Systems Design using

Virtual Environments”, (Online) Available at:

http://www.cours.polymtl.ca/inf6802/art/Systems_Design.pdf

[13] Waller, A. P., and Ladbrook, J., “Experiencing Virtual

Factories of the Future”, in Proceedings of the Winter

Simulation Conference, 2002.

[14] Steffan, R., Schull, U., and Kuhlen, T., “Integration of

Virtual Reality-based Assembly Simulation into CAD/CAM

Environments”, in Proceedings of 24th Annual Conference

of the IEEE Industrial Electronics Society (IECON'98),

1998.

[15] Kerttula, M., Salmela, M., and Heikkinen, M., “Virtual

Reality Prototyping – A Framework for the Development of

Electronics and Telecommunication Products”, in

Proceedings of 8th IEEE International Workshop on Rapid

System Prototyping, 1997.

[16] Witmer, B. G., and Singer, M. J., “Measuring Presence in

Virtual Environments: A Presence Questionnaire”,

Presence: Teleoperators and Virtual Environments, Vol. 7,

No. 3, 1998.

[17] Slater, M., Course 49: Understanding Virtual

Environments: Immersion, Presence, and Performance,

Siggraph 2002 Course Notes, 2002.

[18] Popescu G. V., Burdea, G. C., and Trefftz, H., “Multimodal

Interaction Modeling”, In K. M. Stanney (eds.) Handbook

of Virtual Environments: Design, Implementation, and

Application, Lawrence Erlbaum Associates, New Jersey,

2002, pp 435-454.

[19] Oviatt, S. L., DeAngeli, A., and Kuhn, K., “Integration and

Synchronization of Input Modes during Multimodal

Human-computer Interaction”. in Proceeding of Conference

on Human Factors in Computing Systems (CHI ’97), 1997.

[20] Oviatt, S. L., and Clow, L., “An Automated Tool for

Analysis of Multimodal System Performance”, in

Proceedings of the International Conference on Spoken

Language Processing, 1998.

[21] Oviatt, S. L., “Ten Myths of Multimodal Interaction”,

Communications of the ACM, Vol. 42, No. 11, 1999.

[22] Massie, T. H., and Salisbury, J. K., “The PHANToM Haptic

Interface: A Device for Probing Virtual Objects”, in

Proceedings of the ASME Winter Annual Meeting,

Symposium on Haptic Interfaces for Virtual Environment

and Teleoperator Systems, Chicago, IL, USA, Nov. 1994.

[23] Sensable Technologies, Inc., “The PHANToM Hardware

Devices”, (online) http://www.sensable.com.

[24] Immersion Co., “Cybergrasp: Groundbreaking Haptic

Interface for the Entire Hand”, (online)

http://www.immersion.com.

[25] Hinckley, K., Pausch, R., Goble, J. C., Kassell, N. F.,

“Passive Real-World Interface Props for Neurosurgical

Visualization”, in Proceeding of Conference on Human

Factors in Computing Systems (CHI ’94), 1994.

[26] Hoffman, H.G., “Physically Touching Virtual Objects using

Tactile Augmentation Enhances the Realism of Virtual

Environments”, in Proceedings of the IEEE Virtual Reality

Annual International Symposium (VRAIS '98), 1998.

[27] Besl, P. J., and McKay, N. D., “A Method for Registration

of 3-D Shapes”, IEEE Transactions on Pattern Analysis and

Machine Intelligence, Vol. 14, No. 2, Feb. 1992, pp.239-

256.

[28] Wunsch, P., and Hirzinger, G., “Registration of CAD-

Models to Images by Iterative Inverse Perspective

Matching”, in Proceedings of the 13th International

Conference on Pattern Recognition, Vienna, Austria, Aug.

1996, pp. 77-83.

[29] Burt, P. J., Hong, T. H., and Rosenfeld A., “Segmentation

and Estimation of Image Region Properties through

Cooperative Hierarchical Computation”, IEEE Transactions

on Systems, Man, and Cybernetics, Vol. 11, No.12, 1981,

pp. 802-809.

[30] Jahne, B., Digital Image Processing, Springer, New York,

1997.

[31] Han, S. H., Yun, M. H., Kim, K. W., Hong, S. W., and Kim,

J., “Identifying Mobile Phone Design Features Critical to

User Satisfaction”, in Proceedings of International

Conference on Computer-Aided Ergonomics and Safety,

Maui, Hawaii, USA., 2001.

[32] Montgomery, D. C., Design and Analysis of Experiments

(5th ed.), John Wiley & Sons, New York, 2001.


Color Plates

Figure 1. A desktop VR prototyping and evaluation system for mobile phones by researchers from VTT Electronics and the University of Oulu, Finland [15]

Figure 2. The prop used to replace actual mobile phones is simply a rectangular box with wireless event-generating switches on it (here labeled “Fold/Unfold” and “Power on/off”). The switches send push-button events to the base station through radio frequency (the black line on top is the antenna), and the yellow dots on the prop are markers for camera tracking.

Figure 3. Virtual space with a virtual mobile phone and virtual hands (Env. II)

Figure 4. The same virtual space of Figure 3, but with real hands (Env. III)

Figure 5. Mobile phone shown on a desktop monitor (Env. I)

Figure 7. The physical space at which the experiments are carried out (Env. 0)

