
Computers & Graphics 23 (1999) 389–403

Technical Section

Virtual reality as a tool for verification of assembly and maintenance processes

Antonino Gomes de Sá^a,*, Gabriel Zachmann^b

^a BMW AG, Geometrical Integration, CAD/CAM, 80788 Munich, Germany
^b Fraunhofer Institute for Computer Graphics, Rundeturmstraße 6, 64283 Darmstadt, Germany

* Corresponding author. Tel.: +49-89-39-23-46-37. E-mail address: [email protected] (A. Gomes de Sá).

¹ After all, this seems only natural, since they have also been among the first who applied computer graphics.

² Actually, the term "virtual prototyping" is also used in other areas such as VLSI chip design.

0097-8493/99/$ - see front matter © 1999 Elsevier Science Ltd. All rights reserved. PII: S0097-8493(99)00047-3

Abstract

Business process re-engineering is becoming a main focus in today's efforts to overcome problems and deficits in the automotive and aerospace industries (e.g., integration in international markets, product complexity, increasing number of product variants, reduction in product development time and cost). In this paper, we investigate the steps needed to apply virtual reality (VR) for virtual prototyping (VP) to verify assembly and maintenance processes. After a review of today's business process in vehicle prototyping, we discuss CAD–VR data integration and identify new requirements for design quality. We present several new interaction paradigms so that engineers and designers can experiment naturally with the prototype. Finally, a user survey evaluates some of the paradigms and the acceptance and feasibility of virtual prototyping for our key process. The results show that VR will play an important role for VP in the near future. © 1999 Elsevier Science Ltd. All rights reserved.

Keywords: Virtual environments; Virtual prototyping; Digital mock-ups; Assembly and maintenance process; User acceptance; Direct manipulation

1. Introduction

In order to stay competitive on international markets, companies must deliver new products with higher quality in a shorter time with a broader variety of versions at minimum costs. Virtual prototyping is quickly becoming an interesting strategy for product development. Automotive industries seem to be among the leaders in applying virtual reality (VR) to real-world, non-trivial problems.¹ Although there are already several commercial 3D engineering tools for digital mock-up (and the number continues to grow), all of them lack one thing: intuitive direct manipulation of the digital mock-up by the human. Therefore, they are inherently inferior to VR for certain applications.

While some automotive companies have already begun to routinely use VR as a tool in styling and design reviews in the concept phase, it has not been clear that VR can be an efficient tool in assembly/disassembly simulations and maintenance verifications. Assembly simulations are much more difficult in that they involve a lot of interaction and real-time simulation. However, [1] revealed that the assembly process often drives the majority of the cost of a product, and [2] point out that up to 70% of the total life cycle costs of a product are committed by decisions made in the early stages of design.

1.1. Definitions of virtual prototyping

There seem to be two different understandings of what exactly virtual prototyping is: the "computer graphics" and the "manufacturing" point of view.² We will define the former as virtual prototyping, and the latter as digital mock-up (which is often confused with virtual prototyping).

By virtual prototyping (VP) we understand the application of virtual reality for prototyping physical mock-ups (PMUs) using product and process data. The VR system simulates and renders all characteristics relevant to the particular context as precisely and realistically as possible in an immersive environment [3]. Some examples are: verification of assembly and disassembly procedures, assessment of product characteristics (e.g., ergonomics), and visualization of simulation data. The idea is to replace, at least partly, physical mock-ups (PMUs) by software prototypes.

Digital mock-up (DMU) is a realistic computer simulation of a product, capable of all required functionalities from design/engineering, manufacturing, product service environment, maintenance, and product recycling; DMUs are used as a platform for product and process development, for communication, and for making decisions from a first conceptual layout [4]. This includes all kinds of geometrical, ergonomic, and functional simulations, whether or not involving humans.

So, immersive VP is one of many techniques for implementing DMU.

Assembly/disassembly verification has several goals. The final goal, of course, is the assertion that a part or component can be assembled by a human worker, and that it can be disassembled later on for service and maintenance. However, other questions need to be addressed, too: is it "difficult" or "easy" to assemble/disassemble a part? How long does it take? How stressful is it in terms of ergonomics? Is there enough room for tools?

2. Related work

A lot of development for utilizing VR for VP is being realized by automotive and aerospace companies. Many efforts, however, are still feasibility studies.

Practically all automotive companies investigate the use of VR for styling reviews and other mere walk-through applications. Some of them already employ it for daily work. Usually, the model is rendered on a large-screen stereo projection or in a cave. Variations can be compared on the fly with realistic colors and illumination effects [5]. At Daimler Benz the body of a car can be reviewed in an immersive virtual environment with the aid of zebra lighting [6].

Since VR provides an intuitive and immersive human–computer interface, it is perfectly suited for ergonomics studies. Consequently, many projects capitalize on this advantage of VR. Ford employs virtual prototypes with several proposed dashboard configurations to verify visibility and reachability of instruments.

Researchers at Caterpillar Inc. use VR to improve the design process for heavy equipment. Their system [7] allows them to quickly prototype wheel loader and backhoe loader designs and to perform visibility assessments of the new design in a collaborative virtual environment. Further, the engineers can simulate the operation of the equipment and evaluate visual obstructions.

Volkswagen has incorporated some useful applications in the vehicle development process. They have coupled a commercial human inverse kinematics package with VR to investigate different ergonomic features. They also visualize the results of FEA crash computations interactively in VR. The virtual product clinic avoids faulty developments and helps assess customers' wishes [8].

Chrysler launched a project to study the process of VP, to investigate the steps required for the creation of a virtual representation from CAD models, and for the subsequent use of the prototype in immersive VR [5].

A vision of VP was developed within the ESPRIT project AIT (Advanced Information Technologies in Design and Manufacturing). Project partners were many European automotive and aerospace companies, IT suppliers, and academia [4]. A lot of visionary prototypes have also been presented by [9].

A prototype virtual assembly system is described in [10]. Our approach integrates many more interaction paradigms, and we present the results of a user survey evaluating some of them. Although they do not describe the process of authoring a virtual environment, it seems to us that they pursue the toolbox approach, i.e., the system is a monolithic program on top of a set of libraries, while our approach is the scripting approach.

Systems for designing in VR are presented by [11,12]. Our approach is to use VR only for investigation and simulation. No geometry can be designed with our system, because we do not feel that this would take advantage of the strengths of VR. A factory simulation for immersive investigation has been presented by [13]. Although no direct manipulation of objects is possible, problems can be identified earlier than through the aid of charts and graphs produced by conventional simulations.

3. Assembly processes in the automotive business process

Today's computer-aided tools (CAx) for automotive and other industries can simulate a lot of the functions and operating conditions of a new product. In some cases, software simulations are as good as or even better than PMUs. However, they still do not meet all requirements to avoid PMUs completely. Certain functions of a new product cannot be simulated at all by current CAx tools, while others do not provide the results in an acceptable time.

Therefore, many PMUs are built during the development process to achieve a 100% verification of the geometry, the functions, and the processes of a new car project. Additionally, today's CAx tools do not provide a natural and intuitive man–machine interface that allows the user to feel and to get the spatial presence of the virtual product.

Fig. 1. Process chain for the vehicle prototype activities.

In order to "fill" these gaps, many automotive and other companies have established projects to investigate the use of VR technologies for verification of designs and processes [14].

3.1. Today's approach

The automotive business process chain comprises various key processes from the early concept phase through final service, maintenance and recycling. Those that will be highlighted in this paper are the assembly and maintenance processes. The verification process can be broken down into three sub-processes which are described in the following (see Fig. 1):

• Fast CA loops. CAx tools are used to quickly verify different design concepts and the assembly/disassembly of the design concepts. These verifications take place in-between the design and the CA prototype process (see Fig. 1). At the beginning of the business process chain the freedom to change concepts and the number of variations of a component is higher. Due to this fact, the number of CA verifications decreases over the course of the development process.

• PMU loops. For detailed verification of design concepts and assembly processes in some sections of a product, various PMUs are built. This sub-process can be identified in Fig. 1 between the design and the PMU process (see dashed line).

• PMU verification. Some complete PMUs of the final product (e.g., a car) are built to verify whether all the designed components fulfil all the requirements related to ergonomics, functions and processes. Before these full prototypes are built, a freeze of the styling and design processes occurs. In Fig. 1 these phases are marked by the deadlines SF and DF.

In the traditional process chain several problems arise due to the fact that verifications of the processes are made using PMUs and CAx tools:

• Parallel verification processes. Verifications are made with CAx tools and with PMUs (in this case obtained by, e.g., the use of rapid prototyping techniques and/or hand-built prototypes) concurrently. The correlation between these two verification processes is very hard to obtain.

• Not enough co-ordination. The handling, synchronization, correlation, and management of these processes is very difficult and in some cases impossible. In order to build a PMU, a design stage needs to be frozen. At this time, the building of the PMU starts and can take 6–12 weeks. Due to concurrent engineering, further changes of CAD parts (sometimes even significant ones) can be made during the build-time. Therefore, by the time results are obtained by the PMU verification they no longer have a direct correlation to the current design. Even if there have not been changes in the design, the "transfer" of the results of the PMU verification to the DMU is, in some cases, very difficult.

Fig. 2. Transition from physical to digital mock-up [6].

3.2. Vision

Most companies already define their products digitally (e.g., by CA methods) and manage the data with product data management (PDM) systems. However, the digital data are not used as the basis for the core business process. Instead, they are maintained in parallel to a more traditional process based on PMUs, more as an auxiliary or "support" of the PMU process or the building of PMUs.

The goal of DMU is to replace the traditional business process, based on PMUs, by one which makes full use of the DMU technologies available today and in the future. The visionary goal is a process with only a single PMU for a final verification, certification, and release to volume manufacturing (see Fig. 2).

The goal is to perform verifications as early as possible, i.e., front-loading of engineering, manufacturing, service, and recycling tasks to the concept phase [6]. We believe that by utilizing VR, DMUs can be evaluated in the concept phase.

3.3. Objectives of the verification of assembly processes

Objectives can be classified into two categories: strategic and operative.

Strategic objectives are global and involve the complete business process. The most important ones are: reduction of development costs, development time, and time-to-market; increase of product innovation, product quality, flexibility, and maturity at series start.

Operative objectives are more local, related to only one or a few key processes. The most important objectives which need to be fulfilled for assembly and maintenance are [15]:

• Service, inspection, and repair locations should be easily accessible;
• visibility should be ensured;
• exchange of components should be easy;
• use few and standard service and inspection tools;
• accessibility of service tools, and of the hand and arm of the worker;
• calculation and investigation of minimal distances to avoid collisions, e.g., during operating conditions;
• calculation of assembly/disassembly paths for off-line robot programming;
• calculation of the sweeping envelope of movable components for packaging investigations, e.g., for reservation of space in the engine bay.

Fig. 3. Data flow between CAD and VR system.

Additionally, these objectives must be verified with 1–20 mm precision, depending on the business process phase. They must be documented in digital form. These electronic reports should be managed by the PDM system together with the geometry and further administrative and technological data. As soon as a new version of an electronic report is created, the PDM system should inform the involved users that a new report is available for the assembly processes.

The electronic report contains information related to simulation and investigation results, proposals for changes of CAD components, assembly/disassembly paths, collision areas, sweeping envelopes, and the status of all verification processes.
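Purely as an illustration of the kind of record such a report could map to (the paper does not prescribe a data model, and all field names below are hypothetical), a sketch in Python:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ElectronicReport:
    """Hypothetical record for an assembly/maintenance verification report
    managed by the PDM system alongside geometry and administrative data."""
    part_ids: List[str]                  # parts covered by this verification
    results: str                         # summary of simulation/investigation results
    change_proposals: List[str] = field(default_factory=list)  # proposed CAD changes
    assembly_paths: List[str] = field(default_factory=list)    # references to stored paths
    collision_areas: List[str] = field(default_factory=list)   # documented collision regions
    sweep_envelopes: List[str] = field(default_factory=list)   # references to swept volumes
    status: str = "open"                 # status of the verification process
    version: int = 1                     # a new version triggers a PDM notification
```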

4. From CAD to VR

The complete data pipeline from the CAD system to the VR system has various modules. CAD systems are the source of most of the data. These data are stored in a PDM system, which also maintains administrative data together with the CAD data, such as ID, version, name, project code, etc. Via a retrieval and conversion tool these data can be converted, reduced, and prepared for use in a VR system (see Fig. 3).

Common problems, especially with carry-over data (i.e., CAD data designed for predecessor products, but re-used in new ones), are the orientation of normals, missing geometry, and the deletion of interior or other "unwanted" geometry. To our knowledge, there are no commercial tools available yet which can solve these problems automatically. So the process for preparing data for VR needs to access the CAD system interactively. We have tried to depict that in Fig. 3 by the arrow between CAD and preparation tool. Furthermore, the VR representation of CAD data and the configuration of the virtual environment need to be managed within the PDM systems (dashed arrow in the figure).
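The paper does not detail its preparation algorithms. As one illustration of the kind of repair step involved (here, making triangle winding, and hence normal orientation, consistent within each connected component of a manifold mesh), consider the following sketch; the function name and the representation of faces as index triples are our own assumptions.

```python
from collections import defaultdict, deque

def make_winding_consistent(faces):
    """Flood-fill pass that flips triangle winding so that adjacent faces
    traverse their shared edge in opposite directions (consistent normals
    within each connected component). `faces` is a list of (i, j, k) vertex
    index triples; assumes each edge is shared by at most two faces."""
    faces = [tuple(f) for f in faces]
    edge_to_faces = defaultdict(list)
    for fi, (a, b, c) in enumerate(faces):
        for e in ((a, b), (b, c), (c, a)):
            edge_to_faces[frozenset(e)].append(fi)

    def directed_edges(f):
        a, b, c = f
        return {(a, b), (b, c), (c, a)}

    fixed = list(faces)
    visited = [False] * len(faces)
    for seed in range(len(faces)):
        if visited[seed]:
            continue
        visited[seed] = True
        queue = deque([seed])
        while queue:
            fi = queue.popleft()
            for e in directed_edges(fixed[fi]):
                for fj in edge_to_faces[frozenset(e)]:
                    if fj == fi or visited[fj]:
                        continue
                    # A neighbour traversing the shared edge in the SAME
                    # direction has inconsistent winding: flip it.
                    if e in directed_edges(fixed[fj]):
                        a, b, c = fixed[fj]
                        fixed[fj] = (a, c, b)
                    visited[fj] = True
                    queue.append(fj)
    return fixed

# Example: two triangles sharing edge (1, 2); the second one is flipped.
print(make_winding_consistent([(0, 1, 2), (1, 2, 3)]))   # -> [(0, 1, 2), (1, 3, 2)]
```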

4.1. CAD data requirements

Design data available today in the manufacturing and other industries do not meet the geometric and non-geometric requirements that would allow them to be used as-is for a VR simulation. There are two ways to tackle the problems described in the previous section: new data must be designed with VP in mind; old data must be dealt with, either by redesigning (least preferred) or by semi-automatic conversion to representations suitable for VR.

It is commonly understood that design data have to have different representations depending on the phase in the business process (e.g., concept, engineering, etc.) and the key process that is to be verified [16]. For example, in the same CAD model there should be geometry and non-geometric information, like kinematic constraints, material properties, weight, etc., that will be used later on during a VR session.

So that designers do not have to become familiar with many different software tools, the number of interfaces must be kept low. To achieve this, the two worlds need to be integrated, at least to a higher degree than is present today. Ideally, a designer can create the CAD components and also perform assembly feasibility studies himself. Also, with better integration it will be easier to exchange data between CAD/PDM systems and VR systems.

5. Immersive verification

In this section, we will briefly explain the process of authoring VEs, present the two scenarios which have been chosen for our studies and developments, and finally describe the functionality needed for assembly investigations.

5.1. Authoring

In order to make VP an efficient tool to save time, it must be easy to "build" a virtual prototype, i.e., a virtual environment (VE) which represents part of a car and simulates part of its physical behavior. It must be at least as easy as designing with a CAD system.

We have developed a three-layer framework; each layer provides a certain level of abstraction and specialization. It has proven to be flexible and powerful (Figs. 4 and 5).

Fig. 4. We have identified three layers of increasing abstraction and specialization, on which authoring of a virtual environment takes place.

Fig. 5. The AEO framework. Note that actions are not "tied in" with graphical objects.

The bottom layer is the scene graph: it deals mostly with geometry and rendering optimizations. Some scene graph APIs, such as VRML 2.0 or Inventor, also provide very low-level scripting features (for instance routes and engines).

At the next level we have implemented the event-based scripting approach for building VEs [17]. It is a general framework based on the concept of objects, actions, and events, each of which provides higher-level, yet general, "story-board driven" functionality.

End-users working in a certain application domain (such as assembly simulation) will specify scenarios at the application layer, which provides a graphical user interface (see Fig. 6) and specialized, very high-level functionality (e.g., the user tells the system which objects are tools).
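The scripting layer itself is specified in [17]; purely to convey the flavour of wiring objects, actions, and events together (this is a toy sketch, not the actual Virtual Design II API), a minimal example:

```python
class Event:
    """A named condition; registered actions fire when it is triggered."""
    def __init__(self, name):
        self.name = name
        self._actions = []

    def connect(self, action):
        self._actions.append(action)

    def trigger(self, **kwargs):
        for action in self._actions:
            action(**kwargs)


class HighlightAction:
    """Example action: highlights a scene object when invoked by an event."""
    def __init__(self, scene_object):
        self.obj = scene_object

    def __call__(self, **kwargs):
        print(f"highlight {self.obj}")


# Story-board style wiring, independent of the geometry itself:
collision_event = Event("collision:lock_vs_door")
collision_event.connect(HighlightAction("door_lock"))
collision_event.trigger()   # e.g. raised by the collision-detection module
```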

5.1.1. Scenario templates

Another way to tackle the problem of authoring might be scenario templates. The idea is that, if parts had standard names, then a large portion of VEs could be derived from standard "scenario templates", e.g., "front door", "tail-light", "gear box", etc. So, for a VR session with a different geometry, a VE author would only have to modify one of those templates. To some extent this idea can be combined with the "manual" authoring described above.
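As a sketch of what such a template could look like (the part names, roles, and the binding function below are hypothetical, not part of the system described here):

```python
# Hypothetical scenario template: roles are bound to standardized part names,
# so only the geometry changes between vehicle projects.
TAIL_LIGHT_TEMPLATE = {
    "removable_part": "tail_light_cap",
    "fasteners": ["screw_1", "screw_2", "screw_3"],
    "tool": "standard_screwdriver",
    "access_cover": "trunk_covering",
}

def instantiate(template, geometry_lookup):
    """Bind template roles to concrete geometry nodes (names are illustrative)."""
    bound = {}
    for role, names in template.items():
        if isinstance(names, list):
            bound[role] = [geometry_lookup[n] for n in names]
        else:
            bound[role] = geometry_lookup[names]
    return bound

# Hypothetical lookup from standardized names to loaded scene-graph nodes.
nodes = {n: f"<node:{n}>" for n in
         ["tail_light_cap", "screw_1", "screw_2", "screw_3",
          "standard_screwdriver", "trunk_covering"]}
print(instantiate(TAIL_LIGHT_TEMPLATE, nodes)["removable_part"])
```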


Fig. 6. For every application domain there must be an application-specific authoring tool which provides the type of high-level functions needed in the particular domain.

Fig. 7. Overview of the tail-light scenario. The tail-light is to be removed.

Fig. 8. The door scenario. Two hands and several tools are necessary to perform the assembly.

However, it is not yet clear to us whether designers will ever design all the VR-relevant attributes. Some of them are geometric, like visible material, thickness of metal sheets, and the like. So far, a lot of authoring time is spent basically on specifying the non-geometric (semantic) attributes of parts, such as the function of objects (screw, tool, etc.), non-geometric materials (flexibility, smoothness), the order of tasks in the (dis-)assembly process, etc.

5.2. Scenarios

We have chosen two scenarios in order to assess a first set of functionalities needed for assembly tasks in VR; one of them is simple, the other is one of the most difficult. One of the scenarios (the door) was also used for the user survey.

5.2.1. The tail-light

The first scenario is the disassembly of the tail-light of the BMW 5 series (Fig. 7). First, the covering in the car trunk must be turned down in order to get access to the fastening of the lights (Fig. 11). To reach the screws fixing the tail-light, the fastening needs to be pulled out.

Then the tail-light itself can be unscrewed with a standard tool. After all screws are taken out, the tail-light cap can be disassembled by pulling it out from the outside.

5.2.2. The door

This scenario is much more complex and more difficult in that both hands and various tools must be utilized (Fig. 8).

The first task is to put the lock in its place in the door. This is quite difficult in the real world, because it is very cramped inside the door and the lock cannot be seen very well during assembly. Screws have to be fastened while the lock is held in its place (Fig. 12).

Next, the window regulator is to be installed (Fig. 13). This task needs both hands, because the window regulator consists of two parts connected to each other by flexible wires. After placing the bottom fixtures into slots, they must be turned upright, then the regulator screws can be fixed.


Fig. 9. Administrative data stored in the PDM about parts can be displayed during the VR session.

Finally, several wires must be laid out on the inner metal sheet, clipped into place, and connected to various parts. However, this part of the assembly was not performed in VR.

5.3. Interaction functionality

In this section, we will describe an array of techniques, most of which have proven to be helpful in the verification of assembly simulations.

5.3.1. Multi-modal interaction

It is important to create an efficient human–computer interface, because the tasks to be performed in VP can be quite complex. Many of the "classic" VR interaction techniques can be utilized, such as (static) gesture recognition, 3D menus, selection by beam casting, etc. Each technique must be implemented in a very robust and user-independent manner, otherwise users will become irritated and disapproving of VR.

While grasping two objects with both hands, a user must still be able to give commands to the computer. This can be achieved most intuitively by voice recognition. Also, multi-sensory feedback (see below) plays an important role in multi-modal interaction.

5.3.2. Getting help from the system

When the number of functions in the VR system becomes large, it happens that occasional users cannot remember a certain command. Similar to 2D applications, we additionally provide hierarchical 3D menus. They are invoked by a speech command. In our experience, 3D menus are to be considered only as an auxiliary interaction technique, since it is more difficult to select menu entries in VR than it is in 2D.

5.3.3. An on-line service manual

We believe that VR could eventually become an efficient means for training service personnel and creating an interactive service manual. Service manuals could be disseminated in the form of VRML environments, which can be viewed and interacted with on a PC-based "fish-tank" VR system. However, augmented reality-based systems might be necessary, especially in larger and more complex vehicles, such as aircraft and submarines.

In our environments we have implemented an interactive service manual as well as an interactive training session. First, a trainee learns by watching the service manual; this is basically an animation of the assembly process. While the animation is being played back, the trainee can move freely about the environment and watch from any viewpoint.

When the trainee is ready to learn by doing, he will perform the task step by step. After each step is completed the system will point him to the part or tool he will need for the next step and tell him what to do with it. For instance, after all screws for the door lock have been fastened, the system highlights the window regulator (by blinking) and instructs him how to assemble it. The instructions have been pre-recorded and are played back as sound files.

So far, the virtual service manual and the interactive training session are hand-crafted via manual scripting. However, it should be straightforward to extract them from a PDM system, if the process data are there in a standardized form.

5.3.4. Investigation tools

In order to make the correct decisions, it is important that the user can get information about the parts involved in the virtual prototype currently being investigated. Administrative information about parts can be displayed in a heads-up fashion by pointing at objects with a ray (see Fig. 9). Of course, any other selection paradigm can be used as well.

A tool which has been requested by designers is the clipping plane. It can help to inspect "problem areas" more closely. When activated, the user "wears" a plane on his hand; all geometry in front of that plane will be clipped away in real-time. Optionally, the part clipped away can be rendered transparent. Sometimes it may be necessary to restrict the motion of the plane so that it is always perpendicular to one of the world coordinate axes.
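How such a hand-mounted clipping plane might be derived from the tracked hand pose is sketched below; function and parameter names are ours, and the clipping itself would be done by the rendering API:

```python
import numpy as np

def hand_clip_plane(hand_position, hand_normal, axis_lock=None):
    """Compute plane coefficients (a, b, c, d) with a*x + b*y + c*z + d = 0
    from the tracked hand; geometry on the positive side would be clipped.
    If axis_lock is 0, 1, or 2, the plane is forced perpendicular to that
    world axis, as described in the text. Names are illustrative."""
    n = np.asarray(hand_normal, dtype=float)
    if axis_lock is not None:
        n = np.zeros(3)
        n[axis_lock] = 1.0
    n /= np.linalg.norm(n)
    d = -float(np.dot(n, hand_position))
    return (*n, d)

# Example: plane riding on the hand, locked perpendicular to the world x-axis.
print(hand_clip_plane([0.4, 1.1, 0.2], [0.0, 0.7, 0.7], axis_lock=0))
```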

Another tool to inspect assembly situations and the mechanical design is the user size. This parameter can be controlled by simple speech commands, which in turn affect all parameters by which a virtual human is represented, in particular navigation speed and the scale of position tracking. This way, a user can comfortably "stick his head" inside some narrow space.

Fig. 10. With the virtual yard-stick distances can be measured in the VE.

Fig. 11. Inverse kinematics is needed for "door-like" behavior of parts.

Fig. 12. Tools snap onto screws and are constrained. Also, they are placed automatically at an ergonomic position within the hand by the system.

In order to measure distances we have implemented two options: a user can select two objects, then the system will compute the minimal distance between the two and display it on the heads-up display. Or, the user can grab a virtual yard-stick (see Fig. 10). While grabbed, the yard-stick adjusts its length in both directions so that it just touches the closest geometry. Additionally, its length is shown on the heads-up display. Another way would be to select two points on the geometry and have the system display the length of that line.
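For the first option, a brute-force stand-in for the exact proximity query (which in the real system would run on collision-detection structures such as those of [18]) could look like this; names are illustrative:

```python
import numpy as np

def min_vertex_distance(points_a, points_b):
    """Brute-force minimal distance between the vertex sets of two selected
    parts (a crude stand-in for an exact mesh proximity query).
    Returns (distance, closest_point_a, closest_point_b)."""
    a = np.asarray(points_a, dtype=float)          # shape (n, 3)
    b = np.asarray(points_b, dtype=float)          # shape (m, 3)
    diff = a[:, None, :] - b[None, :, :]           # pairwise differences
    dist = np.linalg.norm(diff, axis=2)            # (n, m) distance matrix
    i, j = np.unravel_index(np.argmin(dist), dist.shape)
    return dist[i, j], a[i], b[j]

# Example with two tiny vertex sets; the minimal distance is 0.5.
print(min_vertex_distance([[0, 0, 0], [1, 0, 0]], [[3, 0, 0], [1.5, 0, 0]])[0])
```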

5.3.5. Physically-based simulation

Many mechanical components have some articulated parts. These could be simple "door-like" mechanisms (see Fig. 11), i.e., permanent joints with one rotational degree of freedom (DOF), such as hoods, lids, etc.; other very simple ones are sliding mechanisms (one translational DOF), for example the seat of a car. Inverse kinematics of these and other articulated chains can be simulated on-line.
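For a 1-DOF rotational joint, the on-line inverse kinematics essentially reduces to deriving the joint angle from the grabbed point; a minimal sketch of that computation (our own formulation, not the system's code):

```python
import numpy as np

def hinge_angle(grab_point, hinge_point, hinge_axis, rest_dir):
    """Angle (radians) of a door-like 1-DOF joint implied by the grabbed hand
    position: the grab point is projected into the plane perpendicular to the
    hinge axis and compared with the joint's rest direction."""
    axis = np.asarray(hinge_axis, float)
    axis /= np.linalg.norm(axis)
    v = np.asarray(grab_point, float) - np.asarray(hinge_point, float)
    v -= np.dot(v, axis) * axis                    # project into rotation plane
    r = np.asarray(rest_dir, float)
    r -= np.dot(r, axis) * axis
    # signed angle between the rest direction and the projected grab direction
    return np.arctan2(np.dot(np.cross(r, v), axis), np.dot(r, v))

# Example: hinge along z, rest direction along x, hand pulled toward +y -> 90 deg.
print(np.degrees(hinge_angle([0.0, 0.5, 0.2], [0, 0, 0], [0, 0, 1], [1, 0, 0])))
```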

For complicated kinematic simulations, such as the working conditions of a complete chassis, we have pursued a different approach: the VR system loads the results of an off-line simulation by a commercial package, such as Adams™. The user can then interactively steer the visualization, for example by turning the steering wheel or by speech commands.

A lot of the parts in a vehicle are flexible: wires, hoses, plastic tanks, etc. It is still a major challenge to simulate all these different types of flexible parts with reasonable precision and at interactive rates. In particular, simulation of the interaction of flexible objects with the surrounding environment and the user's hands by a general framework is, to our knowledge, still unsolved.

We have implemented hoses and wires in our VR system; the wires or hoses are attached at both ends to other, non-flexible parts, and they can be pushed or pulled by a user's hand.

5.3.6. Verification without force-feedback

In our experience, assembly tasks are more difficult in VR than in the real world, because in VR there is no force and haptic feedback (see also Section 6). Humans can perform even quite complicated tasks without seeing their hands or tools, merely based on auditory, haptic and kinaesthetic feedback. Therefore, we have provided a lot of interaction aids trying to compensate for the missing force feedback.

In order to help the user place parts, we have developed two kinds of snapping paradigms: the first one makes objects snap into place when they are released by the user and when they are sufficiently close to their final position. The second snapping paradigm makes tools snap onto screws when sufficiently close and while they are being utilized (see Fig. 12). The second paradigm is implemented by a 1-DOF rotational constraint which can be triggered by events.
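The first snapping paradigm amounts to a tolerance test on release; a sketch of such a test follows (the thresholds and names are placeholders, not the system's actual values):

```python
import numpy as np

def try_snap(release_pos, release_quat, target_pos, target_quat,
             pos_tol=0.01, ang_tol_deg=10.0):
    """On release, snap a part to its nominal target pose if it is within a
    positional and angular tolerance. Quaternions are unit (w, x, y, z)."""
    close = np.linalg.norm(np.subtract(release_pos, target_pos)) <= pos_tol
    # angle between the two orientations from the quaternion dot product
    dot = abs(float(np.dot(release_quat, target_quat)))
    angle = 2.0 * np.degrees(np.arccos(np.clip(dot, -1.0, 1.0)))
    if close and angle <= ang_tol_deg:
        return target_pos, target_quat        # snapped into place
    return release_pos, release_quat          # left where the user dropped it

# Released about 5 mm and 5 degrees away from the nominal pose: snaps.
print(try_snap([0.005, 0, 0], (0.999, 0.044, 0, 0), [0, 0, 0], (1, 0, 0, 0)))
```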


Fig. 13. The window regulator has to be installed with two hands; the "ghost" paradigm signals collisions.

Fig. 14. The object-on-the-lead paradigm allows assembly to be verified. The object is not linked rigidly to the hand.

Fig. 15. During assembly, the path of any part can be recorded, edited, and stored in the PDM system.

The major problem is: how can we verify that a part can be assembled by a human worker? A simple solution is to turn a part being grasped into what we call a ghost when it collides with other parts: the solid part itself stays at the last valid, i.e., collision-free, position, while the object attached to the user's hand turns wireframe (see Fig. 13).

However, invalid positions can be "tunneled". Therefore, we have developed the object-on-the-lead paradigm: the object is no longer attached rigidly to the virtual hand; instead, it "follows" the hand as far as it can go without penetrating any other parts (see Fig. 14). We have implemented a physically based simulation, so that the object can glide along other parts; in our earlier implementation, there was no gliding, which caused the object on the lead to get stuck in tight environments. So, at any time it can assume only valid positions. Of course, exact and fast collision detection is a prerequisite [18].

This is only a first step. A completely reliable verification will check the virtual hand for collisions as well. Also, the hand and/or part should slide along smooth rigid objects to make assembly easier for the user.
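The core idea of the object-on-the-lead paradigm can be illustrated by a stripped-down loop that advances the part toward the hand and keeps the last collision-free position; the gliding behaviour and the exact collision query of [18] are deliberately abstracted behind a placeholder predicate:

```python
def follow_on_lead(current_pos, hand_pos, collides, steps=32):
    """Advance the grasped object from its current (valid) position toward the
    hand position in small increments and stop just before the first colliding
    configuration. `collides(pos)` stands in for the exact collision query of
    the VR system; gliding along surfaces is omitted in this sketch."""
    valid = list(current_pos)
    for k in range(1, steps + 1):
        t = k / steps
        candidate = [c + t * (h - c) for c, h in zip(current_pos, hand_pos)]
        if collides(candidate):
            break                    # object stays at the last valid position
        valid = candidate
    return valid

# Toy usage: a wall at x = 1.0 blocks the motion; the object stops at x = 1.0.
print(follow_on_lead([0.0, 0.0, 0.0], [2.0, 0.0, 0.0],
                     collides=lambda p: p[0] > 1.0))
```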

5.3.7. Documentation

If a certain assembly task cannot be done, then the result of the verification session should be precise, and, furthermore, an intuitive understanding of why the assembly task cannot be done has to be achieved. A number of techniques have been implemented in order to investigate and document a possible failure of assembly.

During assembly/disassembly the path of any part can be recorded and edited in VR (see Fig. 15). Saved paths can then be stored in the PDM system.

While parts are being moved, the sweeping envelope can be traced out. It does not matter whether the part is moved interactively by the user or along an assembly path.

Problems can be annotated by placing 3D markers (we have chosen 3D arrows). Then, verbal annotations can be recorded and displayed textually next to the marker (see Fig. 16). Note that all interaction is done by the user via speech recognition, except for placing the marker.

5.3.8. Feedback to the user

Any VR system should be as responsive as possible, especially for occasional, non-expert users. The users targeted for immersive VP will probably not use VR every day. Therefore, multi-sensory feedback is important to make them feel comfortable and in control.

Therefore, the system acknowledges all commands, in particular those invoked via voice recognition. Currently, this is done by pre-recorded audio or speech. Eventually, we will utilize speech synthesis.

Fig. 16. Annotations can be put into the scene by voice commands.

Fig. 17. Violations of the security distance are highlighted in yellow, collisions in red.

During the assembly simulation, a variety of feedbacks can be combined, which will be given if the user tries to move an object to an invalid position: acoustic feedback, tactile feedback by a CyberTouch™ glove, and visual feedback. Visual feedback comes in several flavors: whole parts can be highlighted (see Fig. 17), or the polygons which would have intersected at the invalid position can be highlighted.

6. User survey

In order to evaluate the acceptance and the potential of VR for VP, a survey of prospective users has been performed at BMW.

We chose representatives from each of five groups involved with assembly and maintenance investigations.

• CA specialist (CA). These are engineers who have good CAx expertise and also some specific assembly/maintenance process knowledge.

• Skilled worker (SW). These are skilled mechanics who actually perform the physical prototype verifications. This group has no CAx knowledge.

• Interface specialist (IS). This group comprises mechanical technicians. They have mostly specific assembly/maintenance process knowledge, but they are starting to get familiar with CAx tools. This group mediates between group 1 and group 2.

• Managers (MA). They are the ones who co-ordinate all three groups in the vehicle prototype group.

• IT specialists (IT). In this group are very highly skilled engineers who do development and evaluation of new CAx methods and IT tools. They provide new technologies to the key-user departments (the above four groups are from the key-user department vehicle prototype).

Notice that none of the subjects had ever before been exposed to VR. This was because the base of subjects with VR experience is too small for a survey.

The average (AVG) of all groups will be presented in some figures.

The scenario used for the survey was the installation of the door lock, which is a very difficult task, because the space inside the door is very tight. Only subjects from group SW were to completely install the lock with their "virtual hand", because they are the only ones who really know how to do it. For all other groups, subjects were to focus on the "feel" of the technology, the I/O devices, and the interaction techniques and capabilities.

Each group consisted of 5–7 persons, which gives a total of approximately 30 persons. Each group had 3–4 h: one hour of introduction, a presentation of all the I/O devices and the VR functionality of the VR assembly application, and for each person 20–30 min to perform the following tasks:

(1) Navigate forward and backward with the data glove.
(2) Grasp the door lock (with the virtual hand) and put the lock near the door.
(3) Invoke display of the technological/administrative data of the door lock. To activate this function the user had to use voice input.
(4) Work with the interactive clipping plane. Again, voice input had to be used to invoke the function.
(5) Change the color of the door. The interaction steps to perform the task were: first, select the door, activate the color 3D menu, select a color, then deactivate the 3D menu. Interaction is implemented by pointing, rays, and voice input.
(6) Install the door lock in its final position while saving the assembly path.

While one user was performing the verification tasks, all other users in the group were watching on a large-screen stereo projection with shutter glasses.

Fig. 18. User satisfaction with object selection.

Fig. 19. Navigation in the VE with a data glove.

The following hardware and setup were used during the survey: SGI Onyx with 6 processors, 2 IR graphics pipes, 1 GB RAM, FS5 head-mounted display, CyberTouch™ data glove with tactile feedback, stereo projection, Ascension electromagnetic tracking system. The frame rate was about 18 and 14 Hz for the tail-light and the door scenario, respectively. Latency was reduced by a predictive filter. Tracking errors (which are immanent in electro-magnetic systems) were corrected by our method presented in [19].
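The predictive filter is not specified in the paper; the simplest variant, which merely extrapolates the last two tracker samples linearly over the expected latency, would look as follows (the sampling period and latency values in the example are made up):

```python
def predict_pose(p_prev, p_curr, dt_sample, dt_ahead):
    """Linear extrapolation of a tracked position to compensate latency:
    predicts the position `dt_ahead` seconds into the future from the last two
    samples taken `dt_sample` seconds apart. This is only the simplest
    conceivable predictor, not necessarily the one used in the system."""
    velocity = [(c - p) / dt_sample for p, c in zip(p_prev, p_curr)]
    return [c + v * dt_ahead for c, v in zip(p_curr, velocity)]

# Example: sensor sampled every 10 ms, render pipeline adds about 50 ms latency.
print(predict_pose([0.00, 1.00, 0.40], [0.01, 1.00, 0.42], 0.010, 0.050))
```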

6.1. Evaluation

The results that are presented below were obtained with a questionnaire that each subject filled out after the VR experience. Each question was to be answered by multiple choice with five levels: very good (= 100%), good, satisfactory, bad, and very bad (= 0%).

6.1.1. Interaction

To interact with the assembly application various paradigms were developed. We have evaluated three object selection paradigms: voice recognition plus selection with the data glove, 3D menu plus selection with the data glove, and keyboard selection by a second person (i.e., someone assisting the VR user).

From Fig. 18 it is clear that almost all users prefer the combination of voice input and data glove over the combination of 3D menu and glove or selection by someone else. However, there are some interesting differences among the groups: users who had no experience with CAx technologies or IT technologies in general (SW group) had no distinct preference. On the other hand, users with long experience in computers (IT group) clearly preferred voice input.

If we consider that the IT group already had the possibility to experiment with different input devices and had gained good skill in CAx, we think that voice input will be one of the most important interaction paradigms for this kind of VR application.

Although we have not done a formal survey about the "object-on-the-lead" paradigm, all users disliked it, because it was unnatural.

6.1.2. Navigation

The possibility to move in 3D space without having to deal with 3D coordinates of points and vectors was an impressive experience for all groups. The natural and intuitive way to walk like in the "real world" is an important aspect for the acceptance of this technology (see Fig. 19).

During the investigation subjects could navigate by point-and-fly. However, most of them disliked it, because it was unconstrained. They said: "In the real environment we have only the possibility to walk. If the car is too low or too high, we can lift the car." When we informed them that the system also provides point-and-fly constrained to eye-level, they rated this paradigm very highly.

Most users were missing precise movements of the viewpoint and exact positioning of parts in the VR system. Of course, we expected this; yet, we chose not to burden the session with too many tasks, although parts can be positioned exactly with the VR system by voice commands. In this survey, however, users were only shown the point-and-fly navigation and how to position parts by using the glove.

6.1.3. Voice input

As Fig. 20 indicates, almost all subjects preferred voice input for giving commands to the computer. We believe that this is due to the very natural way in which commands can be given, e.g., a user just says "selection on" or "switch selection on". Another advantage is that voice commands can be chosen in a very "descriptive" manner. Some tasks can be performed much more precisely with a binary trigger than by a virtual hand, for instance exact positioning.

Unfortunately, interaction by voice recognition has two shortcomings:

• We need to utilize user-independent speech recognition, because occasional users like mechanics will not accept having to train the speech recognition system prior to a VR session. Therefore, the recognition rate is a little bit lower than with a user-dependent system. However, even a recognition rate of 90% irritates occasional users a lot.

• Occasional users will not remember too many commands, so they often have to use 3D help menus in order to recall them.

This is why the user satisfaction with voice recognition reliability and command memorization is significantly less than the overall satisfaction.

Fig. 20. Although the reliability and ease of memorization got only a medium rating, users preferred voice input significantly for giving commands to the VR system.

Fig. 21. Evaluation of several collision feedbacks. Tactile feedback was given through the CyberTouch's vibrators.

Fig. 22. Ergonomics of VR devices.

6.1.4. Collision feedback

An interesting result of our survey is the response regarding feedback. In particular, we asked the users to evaluate the multi-modal feedback given by the system in case of a collision: visual, acoustic, and tactile. We use the CyberTouch™ to produce tactile feedback. Each finger's vibrator was controlled individually by the VR system.

The result of the survey can be found in Fig. 21. We believe that the visual feedback was not completely satisfactory to the users, because it highlighted the whole object instead of only the area of collision, which is what engineers are interested in.

Acoustic feedback plays a main role in two different ways: first, it provides feedback to the user about what is happening at the moment; second, it provides information about the material (e.g., metal, wood, plastic, etc.) that colliding components consist of.

Tactile feedback was evaluated as significantly less helpful than the other two. Although our subjects found it an exciting experience, it is nevertheless unnatural to them. After some discussion with them, we realized that what they really would have liked is force feedback. They reported that without force feedback some assembly tasks are almost impossible to do.

6.1.5. Ergonomics

The ergonomics of most VR devices is still a major problem for making VR a popular and widely accepted tool in design and engineering verifications (see Fig. 22).

Our conclusion is that HMDs and data gloves are not yet fit for daily work. Users have reported the following shortcomings, which we had expected anyway:

• The HMD is too heavy, and when the user looks down or up, the HMD can fall off. If the user makes quick movements, he has to hold the HMD with his hand, otherwise it shifts and the images get blurred or obscured.

• Unnatural sensation when the data glove starts to vibrate.


• Too many cables for I/O devices, which tether the user too much. Users do not have the freedom necessary to perform complicated assembly tasks.

³ A lesson learnt is that what seems to be interesting or important from the computer graphics point of view may not be important at all to end-users. Instead, it is more important that the end-users get the functionality common to their working field, instead of getting leading-edge functionality that they will not accept.

⁴ For instance, in order to create a complete VE for immersive assembly simulation we need 70% of the time to find and prepare the CAD data and only 30% for authoring the VE.

⁵ We believe that HMDs and gloves are necessary for assembly simulation in VR, at least for the interacting user, but display on a large-screen stereo projection is necessary, too, for review of the simulation results.

⁶ At present, there are a few optical on-line tracking systems, but they are still very expensive.

7. Conclusion

In this paper we have discussed the benefits of VR for VP in assembly and maintenance verification. Also, the integration of VR with a company's existing IT infrastructure has been discussed. Several problems have been addressed and solutions have been proposed.

We have presented several interaction paradigms and functionalities which a VR system must implement in order to be suitable for that area of application. They enable inexperienced users to work with virtual prototypes in an immersive environment and help them experiment efficiently with CAD data. Our three-layer framework for authoring VEs has proven to be powerful and flexible. All features and frameworks have been implemented in Fraunhofer IGD's VR system Virtual Design II, which has been successfully evaluated by BMW in a benchmark; it will be employed routinely by BMW for VP in the future.

Finally, we have reported on a user survey which has been performed at BMW with a representative group of key users.³ The result of our survey indicates that the use of VR for VP will play an important role in the near future in the automotive (and probably other) industries. In particular, the response of the surveyed users has been very encouraging and optimistic that VR/VP does have the potential to reduce the number of PMUs and improve overall product quality, especially in those processes of the business process chain where humans play an important role.

VR is the best tool (today) to obtain quick answers in an intuitive way in the concept phase of the business process of a product, because in that phase data change often and are available only in a "rough" preparation. However, we cannot provide a formal cost/benefit analysis at this time, since the technology is not yet integrated into the daily productive work environment of our business process.

However, VR will not become a wide-spread tool in the manufacturing industries before it is seamlessly and completely integrated into the existing CAx and IT infrastructure.⁴ This is not only a question of conversion and preparation tools: a lot of the data needed for a complete DMU are just not there yet, such as process information, material properties, etc. All automotive and aerospace companies have realized that and are working on implementing solutions. However, this does not only involve technical aspects of the design process but also a tremendous shift in corporate culture.

7.1. Future directions

Based on our user survey we feel that current VR I/O devices are still way too cumbersome, and not robust enough, to be used in CAD working environments or in workshops. Gloves must become easier to put on; HMDs must become much more light-weight while still getting a better hold on a person's head.⁵ The lack of wireless tracking devices, gloves, and HMDs is among the most annoying deficiencies of VR devices.⁶

Especially in virtual assembly and maintenance simulations, acoustic feedback turned out not to be sufficient to meet the demands of the users. A comfortable-to-use force feedback device would extend the immersion and usability, enabling the user to do assembly/disassembly tasks in narrow and complex virtual environments, particularly if the parts, tools, and/or hands cannot be seen during the task. Mechanics "see" with their hands; therefore, force feedback would add significantly to the degree of immersion, and it would provide natural and expected collision feedback, such as objects sliding on other rigid parts. Furthermore, it would prevent a deviation between the (real) hand and the grasped part.

Another problem is the way engineers are designing today. It is still quite difficult to prepare CAD data such that an interactive frame rate will be achieved by the VR system both in rendering and in simulation speed. This problem might be solved in a few years by faster hardware. However, a lot of the semantic (non-geometric) data needed in a virtual environment just do not exist. This can be solved only by a shift in the design process: design guidelines with VP in mind, which are partially established already, need to be implemented and integrated into the product development process, and some data just have to be modeled for VP either by CAD engineers or by "VP engineers".


Acknowledgements

We would like to thank the following persons for their support, ideas, and collaboration. At BMW: Hasmukh Jina, Peter Reindl, Peter Baacke, Rainer Klier, Christoph Rieder, and Hans-Jürgen Penzkofer. At IGD: Prof. Encarnação, Dr. Müller, Dr. Rix, Dirk Reiners, Christian Knöpfle, Andreas Flick, and Axel Feix.

References

[1] Boothroyd G, Dewhurst P. Design for assembly – a designer's handbook. Technical report, Department of Mechanical Engineering, University of Massachusetts at Amherst, 1983.

[2] Pratt MJ. Virtual prototypes and product models in mechanical engineering. In: Rix J, Haas S, Teixeira J. Virtual prototyping – virtual environments and the product design process. London: Chapman & Hall, 1995:113–28 [chapter 10].

[3] Gomes de Sá A, von Praun S. Virtual prototyping – a fundamental building block of the digital mock-up strategy. VR News Virtual Reality Worldwide 1998;7(7):15–9.

[4] Dai F, Reindl P. Enabling digital mock-up with virtual reality techniques – vision, concept, demonstrator. In: Proceedings of the 1996 ASME Design Engineering Technical Conference and Computers in Engineering, Irvine, California, August 1996. ASME.

[5] Flanagan WP, Earnshaw R. Applications: meeting the future at the University of Michigan Media Union. IEEE Computer Graphics and Applications 1997;17(3):15–9.

[6] Buck M. Immersive user interaction within industrial virtual environments. In: Dai F, editor. Virtual reality for industrial applications, Computer graphics: systems and applications. Berlin: Springer, 1998:39–59 [chapter 2].

[7] Lehner VD, DeFanti TA. Projects in VR: distributed virtual reality: supporting remote collaboration in vehicle design. IEEE Computer Graphics and Applications 1997;17(2):13–7.

[8] Purschke F, Rabätje R, Schulze M, Starke A, Symietz M, Zimmermann P. Virtual reality – new methods for improving and accelerating vehicle development. In: Dai F, editor. Virtual reality for industrial applications, Computer graphics: systems and applications. Berlin: Springer, 1998:103–22 [chapter 6].

[9] Astheimer P, Dai F, Felger W, Göbel M, Haase H, Müller S, Ziegler R. Virtual Design II – an advanced VR system for industrial applications. In: Proceedings of Virtual Reality World '95, Stuttgart, Germany, February 1995:337–63.

[10] Jayaram S, Connacher HI, Lyons KW. Virtual assembly using virtual reality techniques. Computer-Aided Design 1997;29(8):575–84.

[11] Chu C-CP, Dani TH, Gadh R. Multi-sensory user interface for a virtual-reality-based computer-aided design system. Computer-Aided Design 1997;29(10):709–25.

[12] Zeleznik RC, Herndon KP, Hughes JF. SKETCH: an interface for sketching 3D scenes. In: Rushmeier H, editor. SIGGRAPH '96 Conference Proceedings, Annual Conference Series, ACM SIGGRAPH, New Orleans, Louisiana, 04–09 August 1996. Reading, MA: Addison-Wesley, August 1996:163–70.

[13] Kelsick JJ, Vance JM. The VR factory: discrete event simulation implemented in a virtual environment. In: Proceedings of the 1998 ASME Design Engineering Technical Conferences/Design for Manufacturing, Atlanta, Georgia, September 1998.

[14] Gomes de Sá A, Baacke P. Experiences with virtual reality techniques in the prototyping process at BMW. In: Dai F, editor. Virtual reality for industrial applications, Computer graphics: systems and applications. Berlin: Springer, 1998:151–8 [chapter 9].

[15] Pahl G, Beitz W. Engineering design: a systematic approach, 2nd rev. ed. London: Springer, 1996.

[16] Spooner DL, Hardwick M. Using views for product data exchange. IEEE Computer Graphics & Applications 1997;17(5). ISSN 0272-1716.

[17] Zachmann G. A language for describing behavior of and interaction with virtual worlds. In: Proceedings of the ACM Conference VRST '96, Hong Kong, July 1996.

[18] Zachmann G. Rapid collision detection by dynamically aligned DOP-trees. In: Proceedings of the IEEE Virtual Reality Annual International Symposium, VRAIS '98, Atlanta, Georgia, March 1998.

[19] Zachmann G. Distortion correction of magnetic fields for position tracking. In: Proceedings of Computer Graphics International (CGI '97), Hasselt/Diepenbeek, Belgium, June 1997. Silver Spring, MD: IEEE Computer Society Press.
