
J Multimodal User Interfaces
DOI 10.1007/s12193-013-0138-8

ORIGINAL PAPER

Designing and evaluating a workstation in real and virtual environment: toward virtual reality based ergonomic design sessions

Charles Pontonnier · Georges Dumont · Afshin Samani · Pascal Madeleine · Marwan Badawi

Received: 18 July 2013 / Accepted: 28 November 2013
© OpenInterface Association 2013

Abstract This paper addresses the issue of properly designing a digital mock-up (DMU) to be used in an experiment comparing simulated assembly tasks in both real and virtual environments. Motivations and specifications relative to the experiment are reported, and the design process of the DMU is described and illustrated. Recommendations are proposed, with a particular focus on specificities relative to the use of a DMU as a support for both manufacturing and virtual reality (3D visualisation and interaction). A subjective evaluation of the Real (RE), Virtual (VE), and Virtual with Force Feedback (VEF) environments is provided. Results indicate a real sensory and difficulty gap between RE and VEF, whereas a smaller difference was observed between RE and VE. In further improvements of scale-1 simulation (where the objects in VE have the same size as in the real environment), co-localized simulation using haptic devices is warranted to fill this gap. Results also highlight the impact of cognition

C. Pontonnier (B)
Ecoles de Saint-Cyr Coëtquidan, IRISA-INRIA Rennes,
Campus de Beaulieu, 35042 Rennes Cedex, France
e-mail: [email protected]

G. Dumont
ENS Cachan Antenne de Bretagne, IRISA-INRIA Rennes,
Campus de Beaulieu, 35042 Rennes Cedex, France
e-mail: [email protected]

A. Samani · P. Madeleine
Department of Health Science and Technology, Aalborg University,
Fredrik Bajers Vej 7 D-3, 9220 Aalborg, Denmark
e-mail: [email protected]

P. Madeleine
e-mail: [email protected]

M. Badawi
IRISA-INRIA Rennes, Campus de Beaulieu,
35042 Rennes Cedex, France
e-mail: [email protected]

and sensory feedback on the user's feeling and sensation of presence. Applications of such numerical designs are presented in the last section, with a particular focus on collaborative design sessions. Virtual reality based evaluation of newly designed workstations will, in the future, be a way to improve design and user learning processes.

Keywords Fidelity · Sensor-bridging · Sensor-sharing · Virtual reality · Interaction

1 Introduction

The use of Virtual Reality (VR) as a support for product [40] or workstation design [8,14] is likely to become widespread in the near future. Indeed, virtual prototyping (the act of evaluating a product by simulating its behavior and its interactions with humans and/or other components) has become increasingly relevant, especially for evaluating assembly tasks. At this point, the use of such tools becomes quite natural for evaluating the functionalities or ergonomic features of workstations [2,41], since it is more cost-effective and easier to edit a digital mock-up (DMU) than a real mock-up.

In order to evaluate ergonomic features, the user is immersed in a Virtual Environment (VE) mimicking the Real Environment (RE), and reproduces a more or less realistic metaphor of the real task. Motion capture, electromyographic (EMG) electrodes, force sensors, and subjective indicators provide an evaluation of sensory and motor aspects such as muscle fatigue and discomfort [23,25,26,30]. In the near future, one can expect a VR-based simulator allowing a remote intervention of ergonomists on a DMU and on the worker's gestures, in order to minimize muscle fatigue and avoid the potential appearance of Work-Related Musculoskeletal Disorders (WMSD).


Since a VE does not perfectly mimic reality in terms of visualization, simulation, and interaction, a work task performed in a VE differs deeply from one performed in the RE in terms of cognition, sensory feedback, and motor control [34]. Moreover, unfamiliar environments tend to affect the way we perform what we consider well-known tasks [19]. To reduce the gap between RE and VE, it is necessary to prepare the DMU with caution, as it will be used as a VE and manufactured at the same time. On the one hand, the DMU has to reach a compromise between performance and fidelity to be used as a VE. On the other hand, it has to ensure the functionalities and specificities mandatory for its fabrication.

Within the framework of the VISIONAIR project [16], the Trans-National Access project VR-GO, proposed by the Center for Sensory-Motor Interaction (SMI) in Aalborg (Denmark), took place in the IMMERSIA room in Rennes (France). The goals were to perform manual handling tasks in both RE and VE, using subjective and objective indicators to quantify discrepancies in terms of, e.g., discomfort and fatigue experience. The project aimed at answering the following question: how does a VE interfere with motion patterns and muscular activation? Assessing the objective and subjective fidelity of simulated assembly tasks in VE with regard to RE is crucial when seeking knowledge on how ergonomics can be evaluated in a virtual framework.

This article presents the design process of a DMU for simulated assembly tasks and an initial evaluation of the experimental protocol, focusing on the specificities of the numerical pipeline used to design the DMU. A subjective evaluation of fidelity is presented in order to understand the perception of the VE and of the virtual environment with force feedback (VEF) with regard to the RE. A first part on the virtual prototyping of assembly tasks introduces the main motivations and specifications relative to the current study. Then the motivations and specifications of the current DMU design, its implementation, and the realization of the mock-up are presented. A subjective evaluation of fidelity and the associated results are then presented and discussed, and a final section presents the applications of such a VR-based simulator, especially for collaborative ergonomic design sessions.

2 Virtual prototyping in ergonomics of assembly processes

Virtual prototyping has been widely adopted as a design and validation practice in several industrial sectors and, for a long time, companies have been moving from expensive physical prototypes to the more convenient use of DMUs. For example, Boeing [32], Volkswagen [7], Caterpillar [21] and General Motors [15] have all been evaluating the benefits of VR technology to reduce the number of physical prototypes by using simulations implemented on digital mock-ups in a VE.

Virtual prototyping of assembly tasks has been studied extensively because the assembly process represents a significant part of the cost of a product [4]. Commercial CAD software can be used in assembly process planning by manually selecting the mating surfaces, axes and/or edges to assemble the parts. Nevertheless, these interfaces do not reflect human interaction with complex parts, especially the accurate manipulation of the parts and the global attitude of the human during the task. Such computer-based systems are unable to address issues related to the ergonomics of the task, e.g. detecting awkward postures or peaks of muscular activity during assembly operations. This is the reason why haptics has seen an outstanding development in the past few years. Using haptic technology, engineers can touch and feel complex CAD models and interact with them using natural and intuitive human motions [38]. Moreover, collision and contact forces, computed in real time, can be transmitted to the operator as force feedback. The operator therefore experiences the physical contacts computed by the simulation during the assembly task. In [33], the authors proposed a physically-based modeling tool for simulating realistic part-to-part and hand-to-part interactions through a dual-handed haptic interface. A method is proposed in [36] for interactive assembly operations that applies both kinematic constraints and guiding virtual fixtures. The purpose of this method is to help the user perform the assembly task and to ensure a good assembly of CAD objects by disabling most of the physical interactions during the assembly task. From an assessment point of view, a quantitative analysis of the significance of haptic interactions in performing simulations of manual assemblies was carried out in [30,37].

The second main field of application for virtual prototyping is closely linked to the first one and concerns ergonomics. Studies in ergonomics can be separated into two categories [24]: the first concerns the user's point of view and his ability to use a product. The second concerns the worker's point of view and his capacity to perform, without additional risks, the assembly of the product as well as the tasks associated with his workstation. This has been studied from a cognitive point of view with a focus on the importance of human input to the design process [6,13]. Using haptics for ergonomic evaluations of assembly tasks has been demonstrated in [5,39] to have the potential to significantly increase both the speed and accuracy of human-computer interactions. It has also been shown that virtual prototyping can be used to minimize the physical risk factors involved in the appearance of musculoskeletal disorders [30].

In this paper, we focus on the efficient design and evaluation of a DMU used to perform an ergonomic intervention on a simulated assembly task in a VE.


3 Motivations and specifications

The definition of the experiment relied on two main points. The specifications came partially from the goal of performing ergonomic studies, and partially from technical purposes driven by the ability to efficiently build the real setup and to use a 3D model of this setup within a VE.

On the ergonomics evaluation side, the chosen assembly task had to involve elementary operations such as object manipulation, object sorting or target reaching, under conditions of repetitive movements in a standing posture. As the task can be performed by people of different morphologies and sizes, the design of both the real and virtual setups had to be flexible in terms of geometry definition and accessibility parameters. The environments and interaction types, i.e. real, virtual, and virtual plus force feedback, were chosen to test different multimodal interactions. The experiment was therefore designed to provide an evaluation of the usability of these interaction types and environments for performing ergonomic design sessions. Namely, the final experiment aims at evaluating the relevance of adding the haptic sense to enhance the system fidelity. Fidelity can be defined as the objective degree of exactness with which real-world experiences and effects are reproduced by a computing system [12].

On the technical side, the motion of the user's upper body had to be recorded synchronously with electromyography signals representing muscle activation. The switch between the three different types of interaction, namely the task in RE, the task in VE and the task in VEF, had to be done easily on a unique physical platform. Therefore the DMU had to be as flexible as possible, so that it could be manufactured and also used as a visually and interactively realistic VE.

To meet these specifications, the following experiment was defined: the environment comprised a work bench including a storage zone A and a disposal zone B, a holed box and twelve wooden objects. The holed box was located on a work plane W set at elbow height, as recommended for light work [18,29]. The storage and disposal zones were located 40 cm above the work plane and 16 cm left and 16 cm right of the holed box center, respectively. The holed box had different holes with different cross-sectional contours. This allowed some of the objects ("fitters") to pass through, while others ("non-fitters") were blocked. During a trial, the subject stood in front of the workstation and, after receiving a verbal let-go signal, took an object from the storage zone with his right hand. The subject had to pass fitters through a proper hole in the holed box while placing non-fitters in the disposal zone. There were six fitters and six non-fitters for each trial, either in RE, VE or VEF. The tasks were designed as simplified assembly tasks, including several elementary operations and conditions: target reaching, object manipulation, piece sorting, standing posture and repetitive motion. These specific features are well known to be involved in WMSD appearance [10]. The dimensions of the workstation were defined in relation to the capacities of the interaction devices, especially the haptic device, to enable a scale-1 manipulation of the objects.
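The layout above can be condensed into a small parameter set. The Python sketch below uses hypothetical names; only the 40 cm and 16 cm offsets and the elbow-height work plane come from the text, and the example elbow height is illustrative:

```python
from dataclasses import dataclass

@dataclass
class WorkstationLayout:
    """Geometry of the simulated assembly workstation (offsets from the text)."""
    work_plane_height_m: float          # set to the subject's elbow height
    shelf_offset_up_m: float = 0.40     # storage/disposal zones 40 cm above the work plane
    shelf_offset_side_m: float = 0.16   # 16 cm left (storage A) and right (disposal B)

    def zone_positions(self):
        """Return (lateral x, height z) of storage A, disposal B and the
        holed box, relative to the holed box centre."""
        h = self.work_plane_height_m
        return {
            "storage_A":  (-self.shelf_offset_side_m, h + self.shelf_offset_up_m),
            "disposal_B": (+self.shelf_offset_side_m, h + self.shelf_offset_up_m),
            "holed_box":  (0.0, h),
        }

layout = WorkstationLayout(work_plane_height_m=1.05)  # example elbow height
```

Expressing the geometry this way makes the subject-dependent adjustment a single parameter, matching the flexibility requirement stated above.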

Within-subject conditions were also mandatory to understand how differently they influenced the way the task was performed in RE and VE. These within-subject conditions were the timing regime and the complexity of the task. The timing regime had two levels, "as fast as possible" and "time managed"; complexity also had two levels, defined by the number of holes in the holed box: "2 holes" and "6 holes".
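The resulting design crosses the three environments with a 2 × 2 set of within-subject conditions. A minimal sketch (the helper names are hypothetical, and the per-subject shuffle is a simple stand-in for the balanced ordering actually used):

```python
from itertools import product
import random

ENVIRONMENTS = ["RE", "VE", "VEF"]
TIMING = ["as fast as possible", "time managed"]
COMPLEXITY = ["2 holes", "6 holes"]

# Full factorial of within-subject conditions: 2 x 2 = 4 cells per environment.
conditions = list(product(TIMING, COMPLEXITY))

def environment_order(subject_id: int):
    """Randomize environment order per subject to counter learning effects.

    A plain reproducible shuffle, used here only as an illustration of the
    randomization; the study describes a balanced design.
    """
    rng = random.Random(subject_id)  # seeded so the order is reproducible
    order = ENVIRONMENTS[:]
    rng.shuffle(order)
    return order
```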

Figure 1 summarizes the specifications defined prior to the experiment design. An "idea map" was used to enhance the communication between ergonomists and computer scientists, and to help the proper design by considering both ergonomic and technical specifications. The map exhibits several levels of detail and summarizes the main characteristics of the experiment (biomechanical quantities recorded, within-subject factors, interaction types tested, etc.). A similar idea map was initially used to define the mock-up specifications (type of material, dimensions in relation to the interaction device capacities and morphological data).

4 A DMU for designing RE, VE and VEF

This section deals with the design of a flexible, exportable and manufacturable DMU and its implementation within the frame of the experiment described in the previous section. The complete design pipeline is shown in Fig. 2. As the task in the virtual environment has to be as faithful as possible to the one in the real environment, we used the same DMU for both spaces. At the top of Fig. 2, views of the three main parts that were used during the experiment are presented:

• the work plane;
• the holed box;
• the different shapes, including "fitters" and "non-fitters": these shapes were directly taken from a children's holed-box game.

In the same figure (second box), the data is exported for the manufacturing part. Using the same DMU of the setup, simplified representations are exported, leading to a visual representation for VR and a physical representation on which the physics simulator relies.

4.1 Digital mock-up

The DMU needed to be flexible, in the sense that the workstation needed to be adaptable to each subject with respect to the task specifications. The height of the shelves had to be easily tuned. To build the workstation, we chose standard aluminium profiles and set squares in order to obtain easily adjustable parameters (work plane height and shelf height). Particular attention was paid to having a minimum number of parameters to set in the assembly of the Computer-Aided Design (CAD) parts. The holed box specifications were precisely respected in order to obtain high accuracy on hole shapes and contours.

Fig. 1 Experiment design and specifications
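Since the work plane was set at elbow height, the adjustable profiles could in principle be preset from subject stature. The helper below is purely illustrative: the 0.63 elbow-height-to-stature ratio is a common anthropometric rule of thumb, not a value from the paper, which tuned the height directly on the structure:

```python
def work_plane_height(stature_m: float, elbow_ratio: float = 0.63) -> float:
    """Estimate standing elbow height from stature (rounded to the cm).

    The 0.63 ratio is an assumed anthropometric approximation used only to
    show how the adjustable profiles could be preset per subject.
    """
    return round(stature_m * elbow_ratio, 2)
```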

4.2 Manufacturing

For the workstation, most of the parts comprising the structure were standard, and only an assembly had to be performed.

The holed box was composed of a plexiglas plane and hole planes that were manufactured using Computer Numerical Control (CNC) machines. Files exported directly from the CAD model were used to manufacture these parts.

4.3 Export to 3D scene

The VE simulation was defined to allow manipulation capabilities as natural as possible. The need for a 3D mesh model compatible with high visualization frame rates led to simplifying the mesh after the export. For haptic manipulations, a physics engine including contact management was mandatory.

The visualization framework was based on OpenMask [17]. To obtain a visually realistic scene and enhance the frame rate and performance of the software, leaves and curved zones of the DMU were replaced by chamfers or chamfer combinations. The initial export was too heavy to be easily simulated, thus a simplification of shapes was performed, minimizing the number of vertices on planes and filling in functional holes, as they were not visible to the subjects.
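As an illustration of the kind of cleanup such an export requires, the sketch below merges duplicate vertices, which CAD exporters often emit along shared edges, inflating the vertex count without changing the visible geometry. This is an assumption about typical export output, not the authors' actual simplification tool:

```python
def merge_duplicate_vertices(vertices, faces, tol=1e-6):
    """Merge vertices closer than `tol` and reindex faces accordingly.

    vertices: list of (x, y, z) tuples; faces: list of vertex-index tuples.
    Returns the deduplicated vertex list and the reindexed faces.
    """
    merged = []   # unique vertices kept
    remap = []    # old index -> new index
    for v in vertices:
        for i, u in enumerate(merged):
            if all(abs(a - b) <= tol for a, b in zip(v, u)):
                remap.append(i)   # duplicate: reuse existing index
                break
        else:
            remap.append(len(merged))
            merged.append(v)
    new_faces = [tuple(remap[i] for i in f) for f in faces]
    return merged, new_faces
```

The quadratic search is fine for mock-up-sized meshes; a spatial hash would be the usual choice for large exports.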

Physics of the scene were simulated using the Bullet Physics Library1. As the Bullet library does not handle non-convex meshes properly, the DMU was modified to obtain only convex objects. The virtual holed box was modified to reduce the computational cost of the physics simulation. Simplified holes were designed to keep their functionality (a hole can only let a unique fitter pass, in a unique position and orientation), whereas the visual aspect was unchanged, since the meshes of the physics and rendering engines were dissociated.

1 http://www.bulletphysics.org.

Fig. 2 Design pipeline from CAD model to fabrication and immersive room
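Bullet's restriction to convex collision shapes can be verified before export: a closed mesh is convex exactly when every vertex lies on the inner side of every face plane. The check below is an illustrative sketch (the authors' pipeline is not described at this level of detail):

```python
def is_convex(vertices, faces, tol=1e-9):
    """Check that a closed triangle mesh is convex.

    vertices: list of (x, y, z); faces: list of vertex-index triples whose
    winding gives outward-facing normals. Convex iff no vertex lies strictly
    outside the plane of any face.
    """
    def sub(a, b):
        return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

    def cross(a, b):
        return (a[1]*b[2] - a[2]*b[1],
                a[2]*b[0] - a[0]*b[2],
                a[0]*b[1] - a[1]*b[0])

    def dot(a, b):
        return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]

    for i, j, k in faces:
        # outward normal of the face plane
        n = cross(sub(vertices[j], vertices[i]), sub(vertices[k], vertices[i]))
        for v in vertices:
            if dot(n, sub(v, vertices[i])) > tol:  # vertex outside this plane
                return False
    return True
```

Non-convex parts that fail such a check would be split into convex pieces (or replaced by simplified convex proxies, as done for the holed box) before being handed to the physics engine.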

4.4 Implementation in IMMERSIA

Our team operates and maintains a VR platform called IMMERSIA2. The equipment is a large immersive L-shaped setup, which immerses the user in a high-quality visual and audible world. An overview of the room is shown in Fig. 3. The visual system uses eight Barco Galaxy NW12 projectors (BARCO Inc., USA) for the wall (resolution: 6,240 × 2,016 pixels) and three Barco Galaxy 7+ projectors for the floor (resolution: 3,500 × 1,050 pixels). Stereoscopic images from the Barco projectors are rendered on glass screens (9.6 m wide, 3 m high). A 360° tracking system with 16 ART infra-red cameras (Advanced Real time Tracking GmbH, Germany)3 enables the localization of real objects within the L-shape. Sound rendering, not used in this experiment, is provided by a Yamaha processor, linked either to Genelec speakers with 10.2-format sound or Beyer Dynamic headsets with 5.1 virtual-format sound, controlled by the user's position. Tracked 3D glasses (ActiveEyes Pro, Volfoni SAS, France) adapt the simulation to the user's point of view.

2 http://www.irisa.fr/immersia.
3 http://www.ar-tracking.com.

The experiment was set up using this VR system. The interaction was simulated with a Flystick2 (Advanced Real time Tracking GmbH, Germany) in the VE case and with a Virtuose 6D haptic device (HAPTION SAS, Soulgé sur Ouette, France)4 in the VEF case.

In the final study (currently being published [28]), the motion of the upper body was tracked using the ART localization system and muscle activities were recorded using an EMG amplifier. The data obtained from these recordings is not discussed in the current article.

The room was physically divided into three zones, each approximately 3 m long, in order to have the different environments (RE, VE and VEF) on a unique platform. The distributed architecture of the software driving the room was based on a collaborative framework [11].

5 Evaluation

5.1 Methods

An initial evaluation of the fidelity of both VE and VEF with regard to RE was performed as part of a more complete study not yet published. Ten subjects (age 28.1 ± 2.2 years, height 179.9 ± 7.1 cm, weight 72.0 ± 7.2 kg, VR experience on a 5-point scale rated from "Novice" to "Expert": 1.7 ± 0.9) performed two trials per environment. The complexity was set to "two holes", and both timing regime levels were tested ("as fast as possible", "time managed"). Environments were randomized to prevent a task learning effect (balanced design). After each session in an environment, subjects answered a short questionnaire, rating several items. At the end of the experiment, they were invited to rate both VE and VEF in terms of environment fidelity and interaction fidelity. Paired Wilcoxon tests were run on the results to find significant differences between the environments.
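The paired Wilcoxon signed-rank statistic underlying these tests can be computed from the paired ratings as sketched below. This pure-Python illustration covers only the W statistic; in practice a statistics package would also supply the p-value compared against the 0.05 threshold:

```python
def wilcoxon_w(xs, ys):
    """Paired Wilcoxon signed-rank statistic W: the smaller of the positive
    and negative rank sums of the non-zero paired differences.

    Zero differences are dropped; tied absolute differences share average
    ranks. Illustrative sketch, not the authors' analysis code.
    """
    diffs = [x - y for x, y in zip(xs, ys) if x != y]
    ranked = sorted(diffs, key=abs)

    # assign average ranks to runs of equal absolute difference
    ranks = {}
    i = 0
    while i < len(ranked):
        j = i
        while j < len(ranked) and abs(ranked[j]) == abs(ranked[i]):
            j += 1
        avg = (i + 1 + j) / 2       # mean of ranks i+1 .. j
        for k in range(i, j):
            ranks[k] = avg
        i = j

    w_pos = sum(r for k, r in ranks.items() if ranked[k] > 0)
    w_neg = sum(r for k, r in ranks.items() if ranked[k] < 0)
    return min(w_pos, w_neg)
```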

5.2 Results

Results of the questionnaire are shown in Table 1. Paired Wilcoxon tests raised the following points as significant results:

• the task was more difficult in VEF than in RE and VE;
• under the "time managed" condition, subjects did have enough time to sort and place pieces in RE and VE, whereas they did not in VEF;

4 http://www.haption.com.


Fig. 3 Overview of the experimental setup in the immersive room. From left to right: real environment, virtual environment with force feedback, virtual environment without force feedback

Table 1 Questionnaire results. Each item is evaluated on a 5-point scale; a p-value below 0.05 was considered significant

Item (1-Absolutely not / 5-Absolutely)                 RE Mean (SD)  VE Mean (SD)  VEF Mean (SD)  RE vs VE  RE vs VEF  VE vs VEF
Was the task easy to perform?                          4.9 (0.3)     4.7 (0.7)     2.9 (1.0)      >0.05     <0.05      <0.05
Time managed: enough time to succeed?                  5 (0)         5 (0)         2 (1.1)        >0.05     <0.05      <0.05
Was it easy to differentiate fitters and non-fitters?  5 (0)         4.8 (0.2)     5 (0)          >0.05     >0.05      >0.05
Do you think your motion was natural?                  4.5 (0.5)     3.8 (0.9)     2.4 (1.1)      <0.05     <0.05      <0.05
Was it difficult to access the different zones
of the workstation?                                    1.1 (0.3)     1.6 (0.7)     2.6 (1.1)      <0.05     <0.05      <0.05
Was it easy to pass the fitters through the holes?     4.9 (0.3)     4.0 (1.1)     3.0 (1.5)      <0.05     <0.05      <0.05

VE rating (1-Unrealistic / 5-Realistic)                VE Mean (SD)  VEF Mean (SD)  p-value
Environment fidelity                                   3.8 (0.8)     3.1 (1.2)      <0.05
Interaction fidelity                                   4.0 (0.8)     3.3 (1.4)      <0.05

• the motion of subjects tended to be less natural in VE than in RE, in VEF than in RE, and in VEF than in VE;
• accessibility to the different zones was more difficult in VE than in RE, in VEF than in RE, and in VEF than in VE;
• placing the fitters in the holes was more difficult in VE than in RE, in VEF than in RE, and in VEF than in VE.

The VE rating highlighted that VEF appeared globally less realistic than VE. The interaction also appeared less realistic in VEF than in VE.

5.3 Discussion

Considering these results, VE and RE were closer in terms of subjective fidelity than VEF and RE. This observation is mostly due to the mechanical limitations of the haptic device. The subject had to reposition the handle relative to the co-localized target in order to obtain a complete rotation, necessary in some cases to place the fitters in the holes. In other words, even though the workstation design took into account the working volume of the device, the manipulation of pieces remained unnatural, as additional gestures were necessary to complete the task. This resulted in increased difficulty, less natural motion, reduced accessibility and a longer task duration that would affect productivity in the long run.

In VE, motion became less natural than in RE. Even though the interaction in VE was closer to the one in RE than in VEF, this result is important. Given that the subjects were novices in VR (VR experience on a 5-point scale rated from "Novice" to "Expert": 1.7 ± 0.9), it could be due to their unfamiliarity with the VE. The modulation of sensory feedback, in line with [35], induced an alteration of motor control. The interfaces used to interact with the virtual scene, as well as the scene itself, differed from the real situation. Consequently, the sensory feedback channels (mostly touch and vision here) received altered signals that generated changes in the way the interaction was realized. This observation is in line with the fact that since cognition is affected by the environment, motor control is also altered [34]. It is also in accordance with the ACT-R theory [1], as altered cognition will lead to more effort and more muscle co-contraction to perform a task in an unfamiliar environment, due to an increased mental load and a longer process in the perception-action loop. It has also been noticed in [19] that the familiarity of the environment has an impact on performance: the more familiar the environment, the better the performance. Unfamiliarity also resulted in reduced accessibility and increased difficulty in passing fitters through the holes, even though no mechanical constraint was active in VE.

The rating of the environment confirms the previous remarks about cognition. Even though the environment was visually and physically identical in VE and VEF, subjects perceived the environment as less realistic in VEF than in VE. Moreover, the interaction was perceived as significantly less realistic in VEF than in VE. This indicates that, despite the addition of a physical sense to the scene via force feedback, the mechanical limitations of the haptic device lowered the sensation of presence. Despite these considerations, environment fidelity was rated at a reasonable level, confirming our design choices.

This evaluation requires further processing and concerns only a part of the experimental protocol. It is necessary to conduct complete experiments on a larger population, including objective measures of discomfort and muscle fatigue, e.g. postural scores, kinematic traces or muscular activity, to highlight the differences between the various environments. Nevertheless, as a first result, it appears clear that the use of a co-localized haptic device requires further improvements to be fully satisfying. Using 6-DoF devices for such applications seems convenient and promising, and mixing sensor-sharing and sensor-bridging communication [3] could be a solution to bypass the mechanical limitations: one can imagine that translation and rotation in the virtual scene could be scaled differently to maximize the manipulability of the virtual objects while keeping the scale-1 feeling. Moreover, using other feedback channels, e.g. using sound to emulate force feedback, could be a way to enhance the user experience. Paradoxically, such sensor-bridging communication may be more effective in terms of fidelity than trying to directly replicate natural interaction. This is one of the rationales behind sensor-bridging communication, and cognitive infocommunication channels in general.
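The differential scaling of translation and rotation suggested above can be sketched as a simple pose mapping. The function name and gain values below are hypothetical, chosen only to illustrate the idea:

```python
import math

def map_device_pose(pos_m, rot_rad, translation_gain=1.0, rotation_gain=1.5):
    """Map a haptic-device pose into the virtual scene with separate gains.

    translation_gain = 1.0 preserves the scale-1 positional feeling, while
    rotation_gain > 1 amplifies wrist rotation so a full part re-orientation
    fits inside the device's limited rotational workspace. Gains are
    illustrative, not values from the paper.
    """
    scaled_pos = tuple(c * translation_gain for c in pos_m)
    # wrap each amplified rotation angle back into (-pi, pi]
    scaled_rot = tuple(math.atan2(math.sin(a * rotation_gain),
                                  math.cos(a * rotation_gain))
                       for a in rot_rad)
    return scaled_pos, scaled_rot
```

In a real system the rotation would be handled as a quaternion rather than per-axis angles, but the per-axis form keeps the scaling idea visible.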

6 Toward VR-based ergonomic design sessions

In spite of the previous considerations and the multiple unsolved related challenges [31], one can imagine how such a simulator could be used in an industrial context. The first requirement for establishing VR-based simulators as efficient ergonomic design tools in industry is cost. Even if one can easily imagine the numerous advantages of such tools, e.g. reactivity, time savings, modularity, etc., a simulator still has to be more cost-effective than fabricating a scale-1 physical mock-up to evaluate the ergonomic features of a workstation. This requirement remains complicated to fulfil, as VR facilities are still relatively expensive. It implies that the initial investment has to be amortized over time, requiring an intensive and systematic use as an ergonomic design tool. This is one reason bringing to the foreground the issue of the numerical chain, which has been partially addressed in the current article. The numerical chain has to be standardized and straightforward so that it can be used easily, without additional treatment when going from one platform (CAD) to another (fabrication, VE).

An additional argument for adopting such tools is the promising possibility of animating remote ergonomic design sessions by sharing the same VE. Indeed, we plan to use a collaborative framework [9,11] and share it among the actors of product and workstation design to perform an evaluation. The idea, as shown in Fig. 4, is to compute and visualize, under several representations, the biomechanical risk factors involved in the appearance of musculoskeletal disorders at work (e.g. posture scores, kinematic features, muscle forces, etc.) [27]. To do so, the worker (main user) is immersed in a virtual representation of the workstation such as the one presented in the current article, and he/she and the collaborators (ergonomists, engineers) visualize biomechanical factors in real time during the realization of work tasks. Biomechanical factors could be represented either on a virtual manikin, enhancing the main user's proprioception [20], or as simple scores and curves for the ergonomists. Ergonomists can then interactively advise the main user to improve the way he/she performs the task (more comfortable postures, less awkward motions). Design engineers can, for their part, modify the work environment according to the ergonomists' and users' advice (feature positions, work cycle organization, etc.). This intervention should be done with respect to the functionalities and specifications of the workstation


Fig. 4 Collaborative framework for VR-based ergonomic design sessions

in terms of productivity and quality. Metaphors that could, for example, be inspired by the guiding techniques presented in [22] should be designed and tested for this purpose. Specific work on sensor-bridging and sensor-sharing information is needed here to enhance the user's cognition and comprehension of the task and of the recommendations proposed by the remote users. For example, an ergonomic recommendation, such as lowering the arm in a given posture, could be proposed by an ergonomist by animating a ghost avatar, whereas it would be shown to the main user as a metaphoric arrow indicating the direction of lowering on the current avatar, on which the region of interest (the arm) would be highlighted.
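As an example of the kind of biomechanical indicator that could be streamed to the ergonomists' view, a simplified upper-arm posture score can be sketched. The banding below is loosely inspired by posture-scoring methods such as RULA; the exact thresholds, function name, and modifiers are illustrative assumptions, not the scoring used in this work:

```python
def upper_arm_posture_score(flexion_deg, abducted=False, shoulder_raised=False):
    """Illustrative upper-arm posture score (RULA-style banding).

    flexion_deg: shoulder flexion/extension angle in degrees
                 (0 = arm hanging along the trunk).
    Returns an integer score; higher means a more awkward posture.
    """
    angle = abs(flexion_deg)
    if angle <= 20:
        score = 1
    elif angle <= 45:
        score = 2
    elif angle <= 90:
        score = 3
    else:
        score = 4
    # Modifiers, as in common posture-scoring schemes
    if abducted:
        score += 1
    if shoulder_raised:
        score += 1
    return score
```

Computed per frame from motion-capture data, such a score could be rendered as a colour on the virtual manikin's arm for the main user and as a time curve for the ergonomists.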

The strength of the framework proposed here is its adaptation to very different user interfaces, from high-end rendering systems with haptic interfaces to standard laptop rendering systems [9]. It enables very flexible use, which is especially interesting with regard to the arguments previously developed about the cost of VR-based ergonomic design evaluations. The framework could be extended in the future to several other activities, such as coaching, training, or rehabilitation.

7 Conclusion and perspectives

In this article, the definition and evaluation of a DMU in real and virtual environments have been presented. Particular attention has been

paid to defining guidelines for realizing a real mock-up and exporting a virtual one from a single initial DMU. Simulated assembly tasks performed on the designed workstation were compared in real (RE), virtual (VE), and virtual with force feedback (VEF) environments. Based on this evaluation, advice and recommendations were provided for further experiments of this type. An initial evaluation of the fidelity of the VEs relative to the real environment was performed.

Results of the evaluation highlight the importance of familiarity as a factor of cognition alteration. An increase in difficulty was reported in VEF compared with VE and RE. This point calls for improvements in the control and design of haptic devices and interfaces to enhance their applicability in scale-1, co-localized simulations. To bypass mechanical limitations in such applications, a solution could be to dissociate translation and rotation scales between the real and the virtual world. Moreover, as distortion between physical and visual perception of the scene is a way to enhance fidelity, future work can focus on this aspect in the design of workstation DMUs, enabling direct evaluation in a VE for ergonomic purposes.

Improving the fidelity and usability of such simulators by defining a straightforward numerical chain is a key point for generalizing the use of VR-based ergonomic design sessions. As explained in the last section of this article, the use of such simulators in a collaborative framework promises numerous efficient applications.


Acknowledgments We wish to thank Quentin Avril, Thierry Duval and Bruno Arnaldi, IRISA researchers, for their kind support. We also wish to thank Jérome Quesné, Benjamin Lollivier and Mazyar Yosofi, undergraduate students, for their active participation. This work was supported by the European Project VISIONAIR http://www.infra-visionair.eu [16] (VISIONAIR is under Grant agreement 262044) and the Danish Council for Independent Research | Technology and Production Sciences (FTP), Grant Number 10092821.

References

1. Anderson J, Bothell D, Byrne M, Douglass S, Lebiere C, Qin Y (2004) An integrated theory of the mind. Psychol Rev 111(4):1036

2. Backstrand G, Hogberg D, Vin LJD, Case K, Piamonte P (2007) Ergonomics analysis in a virtual environment. Int J Manuf Res 2:198–208

3. Baranyi P, Csapo A (2010) Cognitive infocommunications: coginfocom. In: Computational intelligence and informatics (CINTI), 2010 11th international symposium on, IEEE, pp 141–146

4. Boothroyd G, Dewhurst P (1989) Product design for assembly. McGraw-Hill, New York

5. Bordegoni M, Cugini U, Belluco P, Aliverti M (2009) Evaluation of a haptic-based interaction system for virtual manual assembly. In: Proceedings of the 3rd international conference on virtual and mixed reality, Springer, Berlin, pp 303–312

6. Brough JE, Schwartz M, Gupta SK, Anand DK, Kavetsky R, Pettersen R (2007) Towards the development of a virtual environment-based training system for mechanical assembly operations. Virtual Real 11:189–206

7. Dai F, Felger W, Frühauf T, Göbel M, Reiners D, Zachmann G (1996) Virtual prototyping examples for automotive industries. In: Virtual reality world

8. Du J, Duffy V (2007) A methodology for assessing industrial workstations using optical motion capture integrated with digital human models. Occup Ergonomics 7:11–25

9. Duval T, Nguyen H, Fleury C, Chauffaut A, Dumont G, Gouranton V (2012) Embedding the features of the users' physical environments to improve the feeling of presence in collaborative virtual environments. In: Cognitive infocommunications (CogInfoCom), 2012 IEEE 3rd international conference on, IEEE, pp 243–248

10. Eurofound (2010) Change over time—first findings from the fifth European working conditions survey. European Foundation for the Improvement of Living and Working Conditions

11. Fleury C, Chauffaut A, Duval T, Gouranton V, Arnaldi B (2010) A generic model for embedding users' physical workspaces into multi-scale collaborative virtual environments. In: ICAT 2010 (20th international conference on artificial reality and telexistence)

12. Gerathewohl SJ (1969) Fidelity of simulation and transfer of training: a review of the problem. Department of Transportation, Federal Aviation Administration, Office of Aviation Medicine

13. Holt PO, Ritchie JM, Day PN, Simmons JEL, Robinson G, Russell GT, Ng FM (2004) Immersive virtual reality in cable and pipe routing: design metaphors and cognitive ergonomics. J Comput Inform Sci Eng 4(3):161–170

14. Jayaram U, Jayaram S, Shaikh I, Kim Y, Palmer C (2006) Introducing quantitative analysis methods into virtual environments for real-time and continuous ergonomic evaluations. Comput Ind 57(3):283–296

15. Kobe G (1995) Virtual interiors. Automot Ind 175(5):52–54

16. Kopecki A, Wössner U, Mavrikios D, Rentzos L, Weidig C, Roucoules L, Ntofon OD, Reed M, Dumont G, Bündgens D, Milecki A, Baranyi P, Noel F, Masclet C, Attene M, Giannini F, Spagnuolo M (2011) VISIONAIR: VISION advanced infrastructure for research. SBC J 3D Interact Syst 2(2):40–43

17. Larrodé X, Chanclou B, Aguerreche L, Arnaldi B (2008) OpenMASK: an open-source platform for virtual reality. In: Software engineering and architectures for realtime interactive systems (SEARIS)

18. Sanders MS, McCormick EJ (1987) Human factors in engineering and design

19. McMahan R, Bowman D, Zielinski D, Brady R (2012) Evaluating display fidelity and interaction fidelity in a virtual reality game. IEEE Trans Vis Comput Graph 18(4):626–633. doi:10.1109/TVCG.2012.43

20. Mine MR, Brooks FP Jr, Sequin CH (1997) Moving objects in space: exploiting proprioception in virtual-environment interaction. In: Proceedings of the 24th annual conference on computer graphics and interactive techniques, ACM Press/Addison-Wesley Publishing Co., pp 19–26

21. Müller M, Dorsey J, McMillan L, Jagnow R, Cutler B (2002) Stable real-time deformations. In: Proceedings of the 2002 ACM SIGGRAPH/Eurographics symposium on computer animation, ACM, pp 49–54

22. Nguyen TTH, Fleury C, Duval T (2012) Collaborative exploration in a multi-scale shared virtual environment. In: 3D user interfaces (3DUI), 2012 IEEE symposium on, IEEE, pp 181–182

23. Pappas M, Karabatsou V, Mavrikios D, Chryssolouris G (2007) Ergonomic evaluation of virtual assembly tasks. In: Digital enterprise technology, pp 511–518

24. Picon F (2010) Interaction haptique pour la conception de formes en CAO immersive. Ph.D. thesis, Université Paris XI

25. Pontonnier C, Dumont G (2009) Inverse dynamics method using optimisation techniques for the estimation of muscle forces involved in the elbow motion. Int J Interact Design Manuf (IJIDeM) 3:227–235

26. Pontonnier C, Dumont G (2009) Motion analysis of the arm based on functional anatomy. In: 3D physiological human 2009. Lecture notes in computer science 5903, Springer

27. Pontonnier C, Dumont G (2010) From motion capture to muscle forces in the human elbow aimed at improving the ergonomics of workstations. Virtual Phys Prototyp 5:113–122

28. Pontonnier C, Samani A, Badawi M, Madeleine P, Dumont G (2013) Assessing the ability of a VR-based assembly task simulation to evaluate physical risk factors. IEEE Trans Vis Comput Graph (in press)

29. Pontonnier C, de Zee M, Samani A, Dumont G, Madeleine P (2011) Meat cutting tasks analysis using 3D instrumented knife and motion capture. In: 15th Nordic-Baltic conference on biomedical engineering and medical physics, IFMBE Proceedings, vol 34, pp 144–147

30. Pontonnier C, de Zee M, Samani A, Dumont G, Madeleine P (2012) Cutting force and EMG recordings for ergonomics assessment of meat cutting tasks: influence of the workbench height and the cutting direction on muscle activation levels. In: ASME 2012 11th biennial conference on engineering systems design and analysis (ESDA)

31. Rebelo F, Noriega P, Duarte E, Soares M (2012) Using virtual reality to assess user experience. Hum Factors J Hum Factors Ergonomics Soc 54(6):964–982

32. Schmitz B (1995) Great expectations—the future of virtual design. Comput Aided Eng 14(10):68

33. Seth A, Su HJ, Vance JM (2006) SHARP: a system for haptic assembly and realistic prototyping. In: ASME 2006 international design engineering technical conferences and computers and information in engineering conference

34. Stoffregen T, Bardy BG, Smart L, Pagulayan R (2003) On the nature and evaluation of fidelity in virtual environments. In: Virtual and adaptive environments: applications, implications, and human performance issues, pp 111–128


35. Svendsen J, Samani A, Mayntzhusen K, Madeleine P (2011) Muscle synergies and force variability during precision and tracking tasks. Hum Mov Sci 30:1039–1051

36. Tching L, Dumont G, Perret J (2010) Interactive simulation of CAD models assemblies using virtual constraint guidance. Int J Interact Design Manuf (IJIDeM) 4(2):95–102

37. Vo DM, Vance JM, Marasinghe MG (2009) Assessment of haptics-based interaction for assembly tasks in virtual reality. In: World haptics conference, pp 494–499

38. Volkov S, Vance JM (2001) Effectiveness of haptic sensation for the evaluation of virtual prototypes. ASME J Comput Inform Sci Eng 1(2):123–128

39. Wall S, Harwin W (2000) Quantification of the effects of haptic feedback during a motor skills task in a simulated environment. In: Proceedings of PHANToM user research symposium '00

40. Wang Z, Dumont G (2011) Interactive two-stage rendering technique of deformable part through haptic interface. ASME Conf Proc 2011(44328):133–143

41. Whitman LE, Jorgensen M, Hathiyari K, Malzahn D (2004) Virtual reality: its usefulness for ergonomic analysis. In: Proceedings of the 36th conference on winter simulation, winter simulation conference, pp 1740–1745
