

Applied Virtual Reality

Organizer
CAROLINA CRUZ-NEIRA, Iowa State University

Lecturers
RUDY DARKEN, Naval Postgraduate School
MARY LYNNE DITTMAR, Boeing Defense and Space Group
RICHARD GILLILAN, Cornell Theory Center
OLIVER RIEDEL, Fraunhofer Institute for Industrial Engineering
FRANK WOOD, Cornell Theory Center
JUDY VANCE, Iowa State University

COURSE 15 NOTES

SIGGRAPH 1997

24th International Conference on Computer Graphics and Interactive Techniques

Los Angeles Convention Center, 3-8 August 1997


Course Summary

This course addresses the field of virtual reality from the end-user's perspective. The course is focused on "what we can do" with VR technology, not "how to develop" the technology. The course provides attendees with criteria to identify whether or not VR technology could be a tool in their working environment. The course will cover several working VR applications in academia and industry along with discussions of their design processes.

The course objective is to provide an understanding of the unique features of virtual reality and how these features can be identified and used in developing useful applications. This tutorial will answer the questions: why do we need VR? What does VR have to offer that I can't already develop using existing three-dimensional interactive computer graphics techniques? This course examines the features of VR technology and relates these features to specific applications. The course concentrates on the applicability of VR technology, not on the development of hardware/software to control the various devices required in a virtual environment.


Course Schedule

10 minutes   Welcome and Course Overview (1-1), Carolina Cruz-Neira
20 minutes   Introduction to Virtual Reality (2-1), Carolina Cruz-Neira
1 hour       Reusability and User Interface Issues in Virtual Reality (3-1), Frank Wood
15 minutes   Break
1 hour       Design Issues in Virtual Environments (4-1), Judy Vance
1 hour       Navigation in Virtual Environments (5-1), Rudy Darken
90 minutes   Lunch Break
1 hour       Psychological and Physiological Effects of Immersive Environments (6-1), Mary Lynne Dittmar
30 minutes   Making Virtual Reality Useful (7-1), Carolina Cruz-Neira
15 minutes   Break
1 hour       Scientific Applications of Virtual Reality (8-1), Richard Gillilan
1 hour       Using Virtual Reality in Engineering Applications (9-1), Oliver Riedel
15 minutes   Wrap Up and Panel Discussion (10-1), All Speakers


Speaker Biographies

Dr. Carolina Cruz-Neira is a Litton Assistant Professor in the Department of Electrical and Computer Engineering and an Associate Scientist at the Iowa Center for Emerging Manufacturing Technology, both at Iowa State University. Dr. Cruz obtained a Ph.D. from the Electronic Visualization Laboratory (EVL) at the University of Illinois at Chicago in May of 1995. Her Ph.D. research involved the design and implementation of the CAVE virtual reality system and the development of paradigms to integrate high performance computing and communications with the CAVE for applications in computational science and engineering. She has consulted for IBM Wall Street, the Chicago Board of Trade, the National Center for Supercomputing Applications and Argonne National Laboratory. She received her Master's degree at EVL and a bachelor's degree in Systems Engineering at the Universidad Metropolitana in Caracas, Venezuela.

Dr. Cruz's main research area is the integration of virtual reality interfaces, high-speed networks and high-performance computing engines for the real-time steering of distributed simulations in science and engineering. She is currently performing collaborative research with scientists and engineers at the Cornell Theory Center, Argonne National Laboratory, John Deere Corporation, and Rockwell International Corporation.

Dr. Cruz grew up in Alicante, Spain, where she started studying classical ballet at the Music Conservatory of Alicante at the age of 5. She moved to Venezuela in 1981 to pursue her undergraduate degree and to continue her dance education as well. Before discovering virtual reality, she was part of several dance companies and performed in theaters in Spain and Venezuela.

Dr. Rudy Darken is an Assistant Professor of Computer Science at the Naval Postgraduate School in Monterey, California. He joined the department in July of 1996, having been at the Naval Research Laboratory in Washington, D.C. since 1991 as director and co-founder of the Tactical Electronic Warfare Division's Virtual Environment Laboratory.

Dr. Darken's research has been primarily focused on human factors in virtual environments with emphasis on navigation and wayfinding in large-scale virtual worlds. His background includes experience in interface design, collaborative computing, computer augmented training systems, team training systems, real-time visual simulation, computer graphics, and computer animation. Recent research initiatives include spatial audio in aircraft training and operations and wireless mobile computing, or more to the point, virtual environment technology applied to real world tasks.

He is a member of the editorial board of the PRESENCE Journal. He received his B.S. in Computer Science Engineering from the University of Illinois at Chicago in 1990 and his M.S. and D.Sc. degrees in Computer Science from The George Washington University in 1993 and 1995, respectively.


Dr. Mary Lynne Dittmar is currently the technical lead of the Visualization Laboratory and the Usability Laboratory of the Advanced Computing Group, Boeing Defense and Space Group, Huntsville, Alabama. She took her Ph.D. from the University of Cincinnati in 1989, with an emphasis in human perception and performance (Human Factors and Experimental Psychology). She was a Lecturer at the University of Cincinnati from 1984 to 1989, and a faculty member at The University of Alabama in Huntsville from 1989 to 1995, before leaving to start her own company, RAD Company (Research, Analysis & Design) of Huntsville, Alabama. Prior to joining Boeing in March of 1996, she served as a human factors consultant on a number of research projects, ranging from sustained operations and human performance to virtual reality applications at NASA Marshall Space Flight Center.

She is a member of the Human Factors and Ergonomics Society, the American Psychological Society, the American Association for the Advancement of Science, and Sigma Xi (National Honor Society in Science), as well as several other professional and civic organizations.

Dr. Richard Gillilan received his Ph.D. in Theoretical Chemistry from the University of Pennsylvania in 1988, studying nonlinear dynamical phenomena in surface diffusion. He continued his work at Cornell University as a postdoctoral associate in the Chemistry department. In 1990, Dr. Gillilan moved to the University of California, San Diego as a Postgraduate Research Chemist in the laboratory of Kent Wilson. While working on solution-phase reaction dynamics, reaction-path calculation strategies and quantum control theory, he developed an interest in scientific visualization and animation production.

Dr. Gillilan has been a Research Scientist and Visualization Specialist at the Cornell Theory Center since 1992. He specializes in animation production and virtual reality in chemistry, molecular biology and biophysics.

Mr. Oliver H. Riedel was born in Osnabrueck, Germany in 1965. After finishing senior high school and his military service requirements, he began studies in technical cybernetics at the University of Stuttgart in Germany. Because of his specific interests, he obtained a degree in biomedicine and digital machine processing. Before joining the Institute for Human Factors and Technology Management (IAT) at the University of Stuttgart, he was a staff member of the biomedical division at Hewlett-Packard. At the end of 1991 he received the assignment to build a virtual reality research laboratory at the Fraunhofer Institute for Industrial Engineering (IAO), one of the world's largest industrial research organizations. In 1995 he became the head of the Competence Center Virtual Reality at IAO. The lab now plays an important role in pushing the industrial and commercial capabilities of VR.

He is the author of many publications in the field of industrial applications of VR and human factors of immersive displays. Since 1994, Mr. Riedel has been a member of the German Research Society in the special research area "rapid prototyping of innovative products".


Dr. Judy Vance is an Assistant Professor of Mechanical Engineering at Iowa State University. After receiving her B.S. in Mechanical Engineering from Iowa State University, she worked as a manufacturing engineer at the John Deere Des Moines Works. She later returned to Iowa State and received her M.S. and Ph.D. in Mechanical Engineering.

Professor Vance's research interests are in the area of virtual environment applications for engineering design. She recently received the prestigious National Science Foundation CAREER award, which is based on her research and teaching record. She also has two other National Science Foundation grants supporting this research. Specific project areas include mesh decimation, virtual training, and virtual conceptual design. She has performed research for the Ford Motor Company and is currently developing a virtual environment for engineering design for John Deere.

Mr. Frank Wood recently received his BS in Computer Science from Cornell University. Mr. Wood was employed by Fujitsu and the University of Illinois at Chicago before coming to the Cornell Theory Center. He is currently working with industrial partners to develop real-time virtual simulation environments and continues to research distributed collaborative interfaces for immersive environments.


WELCOME TO

APPLIED VIRTUAL REALITY


Welcome!

At SIGGRAPH 93, we offered the first "Applied Virtual Reality" course, which was a big success with over 700 attendees. Since then, I have received numerous e-mails and comments from people who attended or had heard about the course, asking me to do an updated version for another SIGGRAPH conference. Most VR courses offered at SIGGRAPH are targeted towards virtual reality systems development, not towards how industry and research can use this technology to their advantage. These comments have motivated me to prepare this course for SIGGRAPH 97, bringing together an impressive team of speakers from different disciplines to share their ideas and practical experiences in the fascinating and quickly growing world of applied virtual reality.

Our objective is to provide an understanding of the unique features of virtual reality and how these features can be identified and utilized in developing useful applications. This course will address questions such as: why do we need VR? What does VR have to offer that I can't already develop using existing three-dimensional interactive computer graphics techniques? The answers will be provided by examining the features of VR technology and relating these features to specific applications.

The speakers' enthusiasm and support of the course is evident in the materials provided in these notes as well as in the presentations. We believe that this course will be very beneficial to those already working on VR, as well as to those beginning to be exposed to it and starting to think about how VR can be integrated in their workplace. We hope you find this course enjoyable, interesting and professionally beneficial.

Carolina Cruz-Neira
Course Organizer


INTRODUCTION TO

VIRTUAL REALITY

Carolina Cruz-Neira

Department of Electrical and Computer Engineering and

Iowa Center for Emerging Manufacturing Technology

Iowa State University


INTRODUCTION TO VIRTUAL REALITY

Carolina Cruz-Neira
Iowa State University

[email protected]

1. Introduction

Steve Bryson and Steve Feiner, in their opening remarks for the 1993 Symposium on Research Frontiers in Virtual Reality, wrote:

Virtual Reality (VR) refers to the use of three-dimensional displays and interaction devices to explore real-time computer-generated environments. It is a field that has generated a great deal of hype and excitement, yet one in which real progress has been relatively slow. (...) While virtual reality's vision of immersive, interactive three-dimensional environments is compelling and has attracted many adherents, few applications have left the research laboratory.

It is true that since its early days, the field of VR has been surrounded by a great deal of hype; the line separating the science of virtual reality from science fiction has not always been clearly defined. The idea of having a "natural interface" to interact with computers is so fascinating that it has had a very strong impact not only on research laboratories but also at many levels of our current society.

In the late 1980s and early 1990s, we experienced how the idea of VR was being distorted by Hollywood and mainstream sensationalist media. Movies such as Total Recall and Lawnmower Man presented VR as some sort of brain manipulation technology. Articles about brain implants, out-of-body experiences and techno-drugs were frequently published.

Rudy Darken, one of the speakers in this course, and I, in 1993, after attending a VR-related conference where hype was the main topic, developed the idea of "we" and "them" [1]. "We" means the scientists, engineers, and business people who are working hard on developing VR technology and constantly exploring its potential and benefits for a variety of fields, ranging from medicine to engineering to entertainment and so on. "We" provide the field with very strong theoretical and applied technical skills to accomplish the hard task of turning the idea of VR into a reality.

"Them" are the non-scientific people, with a great deal of imagination and enthusiasm about VR that leads them, in a variety of situations, to conceive bizarre and science-fiction applications for the technology. Although "them" have been responsible for a great deal of the hype and misconceptions surrounding the beginnings of virtual reality as a field, they have also provided a rich pool of ideas and visions about the field, supplementing "our" visions and ideas.

1. David Mizell came up independently with the same "theory" in 1995. See his foreword to the IEEE VRAIS '95 Conference Proceedings.

Now, in 1997, the hype surrounding VR is starting to fade and there is a true growing interest coming from industry about its applications and benefits. It is already accepted that VR technology has great potential in many disciplines. The challenge we (and I mean "we") are facing is how to apply virtual reality, and for that we have to know what virtual reality is independently of today's technology, we should be aware of the limitations of the currently available technology, and we should understand the design approaches that will lead to the creation of successful virtual experiences.

The materials in these course notes present a fresh look at the applicability of virtual reality technology. An overview of the VR hardware/software components followed by an in-depth discussion on application design opens the presentations. The course continues with the human factors issues that have an impact on the effectiveness of applications and with the key elements that differentiate VR from traditional interactive applications. It closes with a survey of current scientific and engineering applications.

2. Virtual Reality Components

Virtual reality refers to immersive, interactive, multi-sensory computer-generated experiences, or as Brenda Laurel [2] puts it, a synthetic representation that renders a sense of place. In order to create illusions of immersive experiences in virtual worlds, and to achieve the sense of presence, virtual reality currently requires the integration of a combination of several technologies that can be grouped under the following categories:

• Visual displays
• Tracking systems
• Input devices
• Sound systems
• Haptic devices
• Graphics & computing hardware

2.1. Visual displays

Visual displays are the devices that present the 3-D computer-generated world to the user's eyes. The degree of immersion given by a particular virtual reality system depends greatly on the visual display used as the interface. It is important to note that visual displays should not be confused with the graphics hardware used to generate virtual environments, which could be anything from a personal computer to a supercomputer.

2. Laurel, Brenda. The Art of Human-Computer Interface Design. Addison-Wesley. 1990.


Virtual reality displays are currently classified into four general categories:

• Desktop displays
• Head-mounted displays
• Head-coupled displays
• Projection-based displays

All of these systems are capable of producing wide-angle stereoscopic views of the scene, although, in some cases, monoscopic vision is also used. Generally, a head tracking device coupled with the display provides the location and direction of sight of the viewer, which is used to compute the correct perspective view of the virtual world.

Desktop displays are simply the computer monitor; they represent the most basic visual paradigm for VR. Head-mounted and head-coupled displays are the most popular visual interfaces; they consist of a pair of small displays, CRTs or LCDs, that cover the user's visual field. Projection-based displays present the virtual world on different arrangements and sizes of projection screens.

In addition to the four main categories, there are other systems, such as cab systems and mirrored reality environments, that are currently considered VR displays [3]. These systems are not examined in detail in this course, since our main focus is on viewer-centered, first-person systems.

2.1.1. Desktop Displays
The desktop paradigm has evolved from traditional computer graphics. Graphics workstations coupled with head and hand trackers are commonly known as "fish tank" or desktop VR environments. The visual feedback responds to the user's movements in front of the monitor, most commonly head and hand movements. Directly derived from traditional interactive computer graphics applications, the fish tank paradigm keeps the model of placing the observer outside the virtual world, using the monitor screen as a window to the virtual world, therefore providing a very limited sense of immersion. A good example of such a system was developed by Michael Deering at Sun Microsystems [4].

Advantages:
High-resolution display: These systems allow the full resolution of a computer screen, typically about 1280x1024 full-color pixels.
Familiar interface: Users are not confronted with a completely new device. They still see the familiar screen, mouse and keyboard and some simple additional control devices, such as a 6-degree-of-freedom tracker attached to the hand.
Built from commonly available computer graphics hardware: No sophisticated equipment is required. Any installation that has a graphics workstation can easily set up a fish tank VR system.
Relatively inexpensive: If the graphics workstation is already available, only the cost of a 6-degree-of-freedom tracker or a specialized device, such as a dataglove, is required.
Easy to set up: Desktop systems do not require any special conditions for proper operation, such as space or special lighting.

3. Pimentel, K. and Teixeira, K. Virtual Reality: Through the New Looking Glass. McGraw-Hill, 1995.
4. Deering, M.F. High-Resolution Virtual Reality. ACM Computer Graphics, Vol. 26. July 1992.


Easy to share by several users: Several users can see the virtual world by simply looking at the monitor. However, if one viewer is being tracked, the other viewers will perceive the world with an incorrect perspective from their location.

Shortcomings:
Low degree of immersion: In this paradigm, the computer screen still acts as a barrier between users and virtual worlds. Thus, although users' perception of the virtual world is enhanced through stereo, they do not have the feeling of "being there" with the virtual objects.
Small angle of view: On a monitor, the angle of view is limited by the size of the screen and the distance from the observer to the screen. It is not easy to incorporate additional optics to increase the field of view (FOV).
No surrounding view: Because the screen is in a fixed position in front of the user, and because its size is small, the user does not have surrounding vision while looking at the monitor.
Stereo frame violation: Objects that appear to be "outside" or in front of the workstation screen are subject to stereo frame violation because part of them can be clipped from the scene if they are close to the borders of the viewing frustum. In this case, users have conflicting cues about the location of an object. On one hand, stereo tells them that the object is in front of the screen. On the other hand, the object being clipped by the border of the screen communicates to users that the object should be behind the screen. These conflicting cues cause the whole sensation of stereo to collapse.
Small range of movement: Users' movements are restricted to at most one or two feet around the workstation screen because, at larger distances, they may not be able to clearly perceive the images on the monitor.

2.1.2. Head-Mounted Displays (HMDs)
HMDs are the most broadly used visual displays in virtual reality systems. These devices place a pair of display screens directly in front of the user's eyes. The screens are mounted on a helmet that viewers wear while in the virtual world. Currently, there are two commonly used display technologies for the screens: cathode ray tubes (CRTs) and liquid crystal displays (LCDs). CRTs provide a higher resolution and better display quality than LCDs, but have the disadvantage of being heavy. LCDs are much lighter and flatter than CRTs, which makes them easier to install on a helmet. However, they have the disadvantage of lower resolution and poor display quality due to problems with contrast and brightness. In HMDs the virtual world is displayed in stereo from the user's point of view as he or she explores the environment. This produces a high degree of immersion; users are completely surrounded by the virtual environment, which responds visually in a manner similar to what we are used to seeing in the real world. A variant of HMDs is the see-through head-mounted display. In these devices, each of the two screens is placed on one side of the user's head and the images are projected on half-silvered mirrors placed in front of the viewer's eyes. The user can not only see the virtual objects, but also the real world; the virtual objects seem to merge with the real surroundings.

Advantages:
Large FOV: HMDs use special optics in front of the CRTs or LCDs to produce large angles of view.
No stereo frame violation: In HMDs, the CRT or LCD screens are very close to the user's eyes (1-2 inches); all perceivable virtual objects are behind the screen, so any object clipping will appear to the user as being outside his FOV. A simple head rotation will bring the object into sight, as it would in the real world.
Provide a good sense of immersion: HMDs eliminate the barrier represented by the workstation screen by mounting the screens directly in front of the user's eyes and having them follow the user's head movements. This creates the sensation of having no screens at all.
Users can move in a larger space: The distance a user can physically walk is limited by the range of the tracker and the length of the cables attached to the helmet, which can be up to 18 feet. In addition, with navigation controls, users can explore much larger virtual spaces.
Low-end models are affordable: Several brands of HMDs can be connected to home personal computers, which makes it possible to have home-brewed VR systems.
Easy to set up: No special conditions, such as large space or lights, are required to use an HMD.

Shortcomings:
Invasive interface: For some people, it is very unnatural and uncomfortable to wear the display on one's head.
Distortions: The optics used to create large FOVs can cause distortions, especially in the peripheral areas of the displays.
Weight: Although the weight of HMDs has decreased from earlier models, it is still a concern, especially in situations when users are required to wear them for more than a few minutes.
Isolation from the real environment: The viewer, although immersed in the virtual world, is still aware of the real environment, fearing events in the real world such as tripping over a cable or running into a wall. This isolation and awareness of the real world can ruin the whole immersive experience.
Need computer graphics models for real-world objects: Real-world objects that participate in the experience, such as the user's hand, have to be recreated in the computer model, at a cost to performance.
Difficult to manipulate real objects: Manipulation of real-world controls is hard since they are not visible to the user. Instead, users see and manipulate computer models of the real objects. The registration between real and virtual objects is very hard and is still an ongoing area of research in virtual environments.
Not easy to share: It is not possible to share the experience with others in an HMD, unless each user has an HMD and they are all networked and tied to the actions of one user.
Hygiene: The hygiene of an HMD for public use is questionable.

2.1.3. Head-Coupled Displays (HCDs)
An alternative to HMDs, HCDs look like a pair of binoculars mounted on an articulated arm. The user looks at the virtual environment through the lenses, with his movements constrained by the arm's length and motion range. The most popular device of this kind is the Fakespace BOOM.

Advantages:
Resolution: HCDs have higher resolution than most common HMDs because of the use of CRTs.
Large FOV: As with HMDs, HCDs use special optics on the CRTs to create a large FOV.
Easy and quick entry and exit: Because HCDs are mounted on an articulated arm, entering or exiting the virtual world can be done by simply moving the device towards or away from the user's face.
Light weight: HCDs feel very light to users, because the weight is balanced in the articulated arm from which the display is suspended.
No lags in tracking position: Mechanical tracking mechanisms are used to measure the displacement and angle of the joints in the articulated arm. The capture of these measurements is much faster than with any other tracking system, such as electromagnetic sensors.
Easy to set up: HCDs require very little space, determined by the length of the mechanical arm, and do not require any other special conditions in the room.

Shortcomings:
Limited user movement: HCDs provide a very limited space that can be explored, bounded by the mechanical arm's length.
Requires at least one user arm dedicated to controlling it: Users need at least one arm to control the device's motion, which leaves only one hand free to interact with the virtual environment.
Inertia: Most HCDs suffer from inertia. Fast user movements can produce large inertial forces that require the user to use more force to control the device.
Hygiene: Hygiene is a concern, although it seems better than for HMDs.
Low feeling of immersion: HCDs can provide a lower sense of immersion than an HMD. The experience is like looking at the virtual environment through binoculars.

2.1.4. Projection-Based Systems
Projection-based displays by themselves are not new. Flight simulators have used projection-based technology since the early 1970s to provide visual feedback to pilots. The entertainment industry has also been using projection-based systems over the past 20 years. Imax and Omnimax theaters deliver very immersive experiences. A viewer's FOV is completely covered by the images, giving the impression of being part of the projected scene. Disney's Star Tours and Body Wars adventures are good examples of entertainment projection-based systems.

One of the most complete examples of projection-based systems for virtual reality is the CAVE(TM) virtual reality environment, which I developed as part of my Ph.D. dissertation [5]. In the CAVE, the illusion of immersion is created by projecting stereoscopic computer graphics into a 10x10x10-foot cube composed of display screens that completely surround the viewer. The viewer explores the virtual world by moving around inside the cube. The CAVE blends real and virtual objects naturally in the same space so that individuals have an unoccluded view of their bodies as they interact with virtual objects.

Another projection-based system, called the Virtual Portal, was demonstrated at SIGGRAPH 92 by Michael Deering from Sun Microsystems [6]. In the Virtual Portal, viewers were surrounded by three rear-projection walls, and their movements inside the room were restricted by a guard rail.

5. Cruz-Neira, C. Virtual Reality Based on Multiple Projection Screens: The CAVE and Its Applications to Computational Science and Engineering. Ph.D. Thesis, University of Illinois, May 1995.
6. Deering, M. Making Virtual Reality More Real: Experience with the Virtual Portal. Proceedings of Graphics Interface '93. May 1993.

Since these two pioneering systems, similar systems have recently been developed, such as the C2 (http://www.icemt.iastate.edu/Labs/se.html), the CyberStage (http://viswiz.gmd.de/tools/CyberStage.html) and the Cove (http://www.vrex.com:80/professional/vrcove.htm). More compact systems, such as the ImmersaDesk (http://www2.pyramidsystems.com/psi/idesk.html) and the Responsive Workbench (http://viswiz.gmd.de), use the tabletop metaphor and are good options for applications that require manipulation of objects directly located in front of the viewer.

Advantages:
High resolution: Projection system resolution is limited by the resolution given by the workstation; current resolutions can be up to 1280x1024 for each eye.
Large FOV: The projection planes are in a fixed position on the screens and they surround the viewer, thus providing a large FOV and a panoramic view of virtual worlds.
Non-invasive display: Users only have to wear light shutter or polarized glasses, no more cumbersome than sunglasses, in order to converge the stereo images.
Easy to share: Multiple participants can share the experience by simply wearing a pair of glasses and stepping into the environment. In current systems, one user is the active participant who controls the viewpoint and the other users are passive observers. For passive participants, the perspective is not correct because it is not calculated from their point of view.
No isolation from the real world: Users can see both the real and the virtual world, therefore they feel more comfortable in the environment and are not afraid to move around.
No need to recreate real objects: As a consequence of the previous point, real-world objects do not need to be recreated as computer graphics models. Users can see their hands and any other real object introduced in the virtual world. Real and virtual objects can be naturally mixed in the environment.
Slower renderings are possible: Techniques such as successive refinement are possible in projection systems, because the virtual world is projected onto screens. A head rotation only involves an adjustment to the stereo pairs, therefore users can better tolerate slower frame rates than in other VR systems.

Shortcomings:
Require large spaces to set up: The screens and projectors require a large space. For a 12x12-foot area of activity, the entire system needs a 30x30-foot area to set up the projectors and the screens.
Occlusion violation: Occlusion violation happens when a virtual object is between the user's hand and eyes. The hand will appear to be in front of the object, while it is actually behind it.
Require accurate adjustment of walls: It is important in multiple-screen systems that the screens match very well at the borders, otherwise the stereo illusion collapses at the points where the screens meet.
Require precise calibration of the projectors: These systems are extremely sensitive to small variations among the projectors in color, contrast, brightness, misalignment, positioning and several other factors. If any of these factors is not precisely calibrated across all the projectors, the sense of immersion is destroyed.
Require complex hardware and software controls to coordinate all the screens: In projection systems with more than one screen, all the screens have to be synchronized on the same frame and stereo image. Usually this has to be done in software, which makes programs very complicated and hard to debug (see the sketch after this list).
Require more graphics engines: Projection systems usually require more than one graphics engine to render the scene on several screens, which may considerably increase the total cost of the graphics workstation.
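To make the synchronization shortcoming concrete, the sketch below shows one frame-locking approach: one render thread per screen, with a barrier that no thread passes until every wall has finished drawing, so all walls swap to the same frame together. This is purely illustrative C++; renderWall and swapBuffers are placeholder names, not calls from any real VR library.

```cpp
#include <condition_variable>
#include <cstdio>
#include <mutex>
#include <thread>
#include <vector>

// Minimal reusable barrier: every thread waits until 'count' threads arrive.
class FrameBarrier {
public:
    explicit FrameBarrier(int count) : count_(count), waiting_(0), generation_(0) {}
    void arriveAndWait() {
        std::unique_lock<std::mutex> lock(mutex_);
        int gen = generation_;
        if (++waiting_ == count_) {           // last wall to finish this frame
            waiting_ = 0;
            ++generation_;
            cv_.notify_all();                 // release everybody for the swap
        } else {
            cv_.wait(lock, [&] { return gen != generation_; });
        }
    }
private:
    std::mutex mutex_;
    std::condition_variable cv_;
    int count_, waiting_, generation_;
};

// Placeholders for the real per-wall rendering and buffer swap.
void renderWall(int wall, int frame) { std::printf("wall %d rendered frame %d\n", wall, frame); }
void swapBuffers(int wall)           { std::printf("wall %d swapped\n", wall); }

int main() {
    const int kWalls = 4;                     // e.g. front, left, right, floor
    FrameBarrier barrier(kWalls);
    std::vector<std::thread> threads;
    for (int wall = 0; wall < kWalls; ++wall) {
        threads.emplace_back([&, wall] {
            for (int frame = 0; frame < 3; ++frame) {
                renderWall(wall, frame);      // draw this wall's view of the scene
                barrier.arriveAndWait();      // wait until every wall has finished
                swapBuffers(wall);            // now all walls flip to the same frame
            }
        });
    }
    for (auto& t : threads) t.join();
}
```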


2.2. Tracking systems
Tracking is a critical component of an immersive environment. A convincing virtual world should feel natural to the participants and the interaction should be as intuitive and transparent as possible. Accurately tracking the user's movements within the virtual space allows us to achieve these goals. Tracking a viewer usually involves tracking her head and hand. The measurement of the user's head position and orientation is particularly important because it allows the system to compute the correct perspective of the world from the user's point of view. Computing a viewer-centered perspective lets users explore virtual environments in the same way they would explore real environments. For example, to see what is behind a virtual object, users can simply move their head to either side, as they would do to see behind a real object. Usually, one or both of the user's hands are tracked to provide interaction. More sophisticated systems can track the user's fingers and even the whole body.
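As a concrete illustration of the viewer-centered perspective just described, the sketch below computes the asymmetric (off-axis) viewing frustum for a fixed screen, such as a monitor or a projection wall, from a tracked head position. It assumes the screen lies in the z = 0 plane of the tracker coordinate system with the viewer at z > 0; all names are illustrative, not from any particular library.

```cpp
#include <cstdio>

// Tracked head position in the same coordinate frame as the screen.
struct Vec3 { double x, y, z; };

// Physical extent of the display surface (metres), e.g. one CAVE wall
// or a desktop monitor.  Names here are for illustration only.
struct Screen { double left, right, bottom, top; };

struct Frustum { double left, right, bottom, top, nearD, farD; };

// Off-axis (asymmetric) frustum so that the view through the screen is
// correct from wherever the tracked head happens to be.
Frustum viewerCenteredFrustum(const Screen& s, const Vec3& head,
                              double nearD, double farD) {
    double scale = nearD / head.z;            // project screen edges onto near plane
    Frustum f;
    f.left   = (s.left   - head.x) * scale;
    f.right  = (s.right  - head.x) * scale;
    f.bottom = (s.bottom - head.y) * scale;
    f.top    = (s.top    - head.y) * scale;
    f.nearD  = nearD;
    f.farD   = farD;
    return f;   // could be fed to glFrustum(...) after translating the view by -head
}

int main() {
    Screen wall{-1.0, 1.0, -0.75, 0.75};       // a 2 m x 1.5 m screen
    Vec3 head{0.3, 0.1, 0.6};                  // head tracked 0.6 m in front, off-centre
    Frustum f = viewerCenteredFrustum(wall, head, 0.1, 100.0);
    std::printf("frustum l=%.3f r=%.3f b=%.3f t=%.3f\n",
                f.left, f.right, f.bottom, f.top);
}
```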

Choosing a tracking system is an issue that should not be taken lightly. The tracking system is the major source of lags and errors in a virtual experience, affecting its performance and therefore generating problems such as motion sickness.

Tracking systems can be classified into six technologies, based on the technique used to detect the position and orientation of a sensor in space:

Electromagnetic: These trackers have a transmitter or source that emits electromagnetic fields along three orthogonal axes, which are detected by one or more sensors. Complete information about the position and orientation of each sensor with respect to the source is reported. The Polhemus Fastrak and the Ascension Flock of Birds are the most popular magnetic trackers. Magnetic trackers have two major problems: latency and accuracy. Latency is the elapsed time between a change in position of a sensor and when that change is reported. Current magnetic trackers can have latency times up to 0.1 seconds. They are also extremely sensitive to the presence of metal, which makes their reported data unreliable. One major advantage of this kind of tracker is that it does not require a clear space between the transmitter and the sensors. The sensors can be moved freely within the volume range determined by the transmitter, operating even when there are objects between them and the transmitter, such as the user's body or part of the head gear.
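Because of that latency, many systems try to predict where a sensor will be by the time the frame is displayed. The fragment below is a deliberately simple sketch of such compensation, extrapolating the last two position reports forward by an assumed latency; real systems typically use Kalman-style filters and also predict orientation. The numbers are made up for illustration.

```cpp
#include <cstdio>

// One timestamped position report from the tracker.
struct Sample { double t; double x, y, z; };

// Linear extrapolation: estimate velocity from the last two reports and
// push the newest report forward by the expected latency.
Sample predict(const Sample& prev, const Sample& curr, double latency) {
    double dt = curr.t - prev.t;
    double vx = (curr.x - prev.x) / dt;
    double vy = (curr.y - prev.y) / dt;
    double vz = (curr.z - prev.z) / dt;
    return { curr.t + latency,
             curr.x + vx * latency,
             curr.y + vy * latency,
             curr.z + vz * latency };
}

int main() {
    Sample a{0.00, 0.00, 1.20, 0.50};
    Sample b{0.02, 0.01, 1.21, 0.50};          // report 20 ms later
    Sample p = predict(a, b, 0.05);            // assume ~50 ms tracker latency
    std::printf("predicted head x=%.3f y=%.3f z=%.3f\n", p.x, p.y, p.z);
}
```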

Mechanical: Mechanical trackers are a rigid structure with several joints. One end of the structure is fixed in place, while the other end is attached to the object to be tracked (usually the viewer's head or hand). The joint angles are measured to determine the position and orientation of the tracked end. The BOOM uses this technique to determine the display position in space. Mechanical trackers have very small latency and are very accurate due to the sensors at the joints, usually encoders. One disadvantage is that the motion is restricted by the range of the mechanical linkage; users cannot always move where they want. However, this type of tracking works very nicely for applications that do not require sudden changes in position.

Acoustic: Acoustic trackers use ultrasonic sound. In these devices, a source produces ultrasonic pulses that are received by a set of microphones, usually arranged in a triangular fashion. The position and orientation of the tracker are determined from the different times at which the pulse reaches each microphone. Acoustic trackers suffer from the same latency problems as the magnetic trackers. They are not affected by metal interference, but they are susceptible to obstructions between the source and the receivers, such as the user's arm.

Optical: These trackers use a combination of light-emitting diodes (LEDs), video cameras and image processing techniques. They can be used in two manners: the LEDs are placed on the object to be tracked (the user's head or hand) while cameras at fixed locations track the position of the LEDs, or, as in the University of North Carolina system, the cameras are placed on the user's head and an array of LEDs is mounted in a fixed location in the ceiling. Position and orientation are obtained by determining the position of the LEDs in the video images from the cameras using signal processing techniques. Like the acoustic trackers, optical trackers are susceptible to line-of-sight occlusion. The majority of these types of systems are still in the experimental stages.

Image processing: This type of technique uses video cameras to capture images of the users. Using image processing techniques, one can identify the position of various body parts, such as arms or legs, in the video frame.

Inertial: Inertial systems use gyroscopes to measure the three orientation angles of pitch, yaw and roll. They are based on the principle of conservation of angular momentum. They have the advantage over other systems of not needing a separate source, so their range is determined by the length of their cable. Other systems use accelerometers to detect the acceleration of the object to be tracked. From the object's acceleration, the position can be obtained by integration. This technology is still experimental and not commercially available.
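The double integration mentioned above can be sketched in a few lines; it also makes the drift problem easy to see, since any error in the acceleration samples accumulates in the computed position. The acceleration values here are made up for illustration.

```cpp
#include <cstdio>

// Recover position from accelerometer readings by numerical integration
// (acceleration -> velocity -> position), using simple Euler steps.
int main() {
    const double dt = 0.01;                        // 100 Hz samples
    double accel[5] = {0.0, 0.5, 0.5, 0.0, -0.5};  // made-up accelerations (m/s^2)
    double v = 0.0, x = 0.0;
    for (int i = 0; i < 5; ++i) {
        v += accel[i] * dt;                        // integrate acceleration once
        x += v * dt;                               // integrate velocity once
        std::printf("t=%.2f  v=%.4f m/s  x=%.6f m\n", (i + 1) * dt, v, x);
    }
}
```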

2.3. Input Devices
Virtual reality experiences can be passive, exploratory or interactive. In passive experiences, users look at the virtual world with very limited motion. In these applications, the display is the main component utilized. In exploratory applications, users not only see the virtual objects, but they can walk around them and explore the virtual surroundings. To achieve this, tracking is required. The third mode, interactive, is the most complex and immersive experience. Here users can interact with the virtual world. They can reach out and grab objects, change the state of the experience and perform many other interactive tasks.

Currently, there is a large variety of input devices for virtual reality. Perhaps the most popular one is the dataglove, a device designed to capture the motion of the user's hand and fingers. VPL's DataGlove from the late 1980s was one of the first to become commercially available. The DataGlove and its current successors, such as the CyberGlove (http://users.quake.net:80/~virtex), the Dextrous HandMaster (http://www.exos.com) and the 5DT Glove (http://www.vrweb.com/gloves.htm#Fifth), are devices that measure the bend angle at each finger joint and therefore capture the complete motion of a human hand.

A variation of the datagloves has recently been introduced with the Pinch(TM) Hand Gesture Interface (http://www.fakespace.com), a Fakespace product. The PinchGloves only detect contact among fingers. To determine the contact, each glove contains five sensors (one in each fingertip). Contact between any two or more digits completes a conductive path, and a complex variety of actions based on these simple "pinch" gestures can be defined by the application developer.
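A hypothetical mapping from pinch contacts to application actions might look like the sketch below; the specific finger combinations and action names are invented for the example and are not part of the PinchGlove product.

```cpp
#include <cstdio>

// Illustrative mapping from fingertip contacts to application actions.
// A real glove driver would report which fingertips are touching each frame.
enum Finger { THUMB = 0, INDEX, MIDDLE, RING, PINKY };

const char* actionForPinch(const bool touching[5]) {
    if (touching[THUMB] && touching[INDEX] && touching[MIDDLE]) return "drop";
    if (touching[THUMB] && touching[INDEX])                      return "pick";
    if (touching[THUMB] && touching[MIDDLE])                     return "fly";
    return "none";
}

int main() {
    bool contacts[5] = {true, true, false, false, false};  // thumb + index pinched
    std::printf("gesture -> %s\n", actionForPinch(contacts));
}
```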


Three-dimensional joysticks and wands are also commonly used, especially in public demonstrations of VR applications. They are basically a frame holding a 6-degree-of-freedom tracker and a group of switches. Functions such as "fly", "pick" and "drop" can be programmed into the switches.

Voice recognition systems are used as input devices for VR in applications where verbal commands are required.

There is a great deal of flexibility in the area of input devices for virtual reality. Custom-made devices, such as bicycles, car mock-ups, airplane cockpits, handles and so on, are constantly being developed. It all depends on the demands of the specific application.

2.4. Sound Systems
Audio technology is a very mature technology, since it has been applied to many areas other than virtual reality. Synthesizers to create, mix and reproduce high-quality sounds have been commercially available for a long time.

In VR, the main challenge presented to audio systems is localization, which is the ability of a sound system to generate three-dimensional sound. Localized sounds can be attached to objects or can be used to enhance the sense of immersion in the environment. Obviously, localized sound greatly enhances the sense of immersion provided by the visual displays, since humans use audio cues to acquire information about our surroundings beyond what is provided by the visual channels.

Another aspect of sound related to VR is sonification, or turning information into sound cues. Sonification can be used in two ways. It can be used as a mechanism for sensory substitution, in cases when it is necessary to give users some "feeling" of the objects around them. For example, it is very common to hear a sound when a virtual object is grabbed, to indicate to users that they are holding the object, even though they do not have any feeling in their hands. In a broader use, sonification is used to encode information about the virtual world as audio cues to avoid overloading the visual display. An example of this could be encoding as sound the energy level of a molecular system as the user manipulates the different molecular structures.
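A minimal sonification sketch along the lines of the molecular-energy example follows: a scalar value is mapped onto an audible frequency that a sound server would then play. The mapping range is arbitrary and purely illustrative.

```cpp
#include <cstdio>

// Map a scalar quantity (here, an energy value) onto an audible frequency.
// The clamped linear mapping and the pitch range are invented for the example.
double energyToPitchHz(double energy, double minE, double maxE) {
    if (energy < minE) energy = minE;
    if (energy > maxE) energy = maxE;
    double t = (energy - minE) / (maxE - minE);    // normalize to 0..1
    return 220.0 + t * (880.0 - 220.0);            // A3 .. A5
}

int main() {
    for (double e = 0.0; e <= 100.0; e += 25.0)
        std::printf("energy %5.1f -> %6.1f Hz\n", e, energyToPitchHz(e, 0.0, 100.0));
}
```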

2.5. Haptic devices
Visual displays combined with trackers, sound and input devices provide operators with a very advanced and powerful tool to explore and interact with virtual objects and spaces. However, to enable operators to interact with computer-generated environments as naturally as they do in the real world, operators need to "feel" the virtual environment. Feeling involves determining an object's properties, such as its shape and weight, and its features, such as texture and temperature. We humans perceive a great deal of information by using our hands to quantify and modify our environment.

Haptic devices are input and output devices that can measure the position of, and forces exerted by, the user's hand and other body parts when manipulating a virtual environment, and can apply forces back to the user to simulate the corresponding feel of the objects being manipulated.


Current haptic device technology for VR is still very new, but there are already several devices commercially available. Mark Yim [7] presents a categorization of current technology based on the techniques used to generate the force reflection:

• Ground-referenced
• Body-referenced
• Inertia-based
• Dermal

Ground-referenced Force Feedback: These devices generally have a physical link between the user's body and the ground, and the reaction forces are generated relative to the point of contact of the device with the ground. Good examples of this type of device are the SensAble Devices PHANToM (http://www.sensable.com) and the Sarcos Dexterous Arm Master (http://www.sarcos.com/Human_interfaces.html).

Body-referenced Force Feedback: In this type of interface, the device is anchored to a part of the body that is not intended to receive sensation, in order to generate the reaction forces. Examples are the Rutgers Master II (http://www.caip.rutgers.edu/~dgomez/rm2.html), the Virtual Technologies CyberGrasp (http://users.quake.net/~virtex) and the PERCRO Force-reflecting Hand Master.

Inertia-based Force/Tactile Feedback: Inertia-based devices use small vibrating transducers or some other sort of oscillatory technology to stimulate the user's tactile sense. Examples are the Virtual Technologies CyberTouch glove (http://users.quake.net/~virtex) and the Sandia National Laboratory Tactile Feedback Device.

Dermal Tactile Feedback: These devices stimulate the nerves and corpuscles of the skin of the user's fingertips. Examples are the TiNi/Xtensory tactors (http://www.xtensory.com), the Telesensory Systems Optacon II (http://www.telesensory.com) and the EXOS TouchMaster (http://www.exos.com).

2.6. Computation systems
Computation systems refer to the computer hardware used to control the overall operation of a virtual environment. The computer hardware has to handle several tasks: the generation of the graphics for the scene, the computation of the state of the environment, and the control of input and output devices. In a virtual reality environment, all these tasks have to be integrated and synchronized, usually every frame.
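The per-frame integration described above can be summarized as a skeleton main loop: read the trackers and input devices, update the state of the world, then render one view per display channel. All functions in the sketch below are placeholders, not the API of any particular VR library.

```cpp
#include <cstdio>

// Placeholder types and device functions for the sketch.
struct Pose { double x, y, z, yaw, pitch, roll; };

Pose readHeadTracker()              { return {0, 1.6, 0, 0, 0, 0}; }
bool readWandButton()               { return false; }
void updateWorld(double dt, bool b) { (void)dt; (void)b; }
void renderView(int channel, const Pose& head) {
    std::printf("render channel %d from head height %.2f\n", channel, head.y);
}

int main() {
    const int kChannels = 2;                    // e.g. left and right stereo views
    const double dt = 1.0 / 60.0;               // target frame time
    for (int frame = 0; frame < 3; ++frame) {   // a real loop runs until exit
        Pose head = readHeadTracker();          // input: head and hand tracking
        bool pressed = readWandButton();        // input: wand/glove state
        updateWorld(dt, pressed);               // behaviour/simulation step
        for (int c = 0; c < kChannels; ++c)     // graphics: one pass per channel
            renderView(c, head);
    }
}
```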

Several hardware configurations are currently used in virtual reality systems:

PC with graphics accelerators and add-ons: Current PC technology has matured to the point that PCs can be used in low-end VR applications. Graphics accelerators make it possible to produce good-quality images, and advances in processing power allow the integration of some VR peripherals. However, although all these advances are going in the right direction, PCs still impose severe limitations on virtual worlds, especially in the size of the database and the amount of detail that can be included. Also, the calculation of complex behaviors is limited, especially if several peripherals are used, since the processor will need to be shared across the different tasks. PC systems have the advantage that they are generally inexpensive and very popular. However, they have the limitation of requiring a great deal of development effort in new programming environments that have not been fully tested.

7. ACM SIGGRAPH 96 Course #37 Notes.

Graphics workstation with a single processor: This configuration is almost comparable to the configuration of a PC, although perhaps with higher capabilities for graphics and computation. The workstation has to divide the CPU time between the operating system and the application. Inside the application, the time also has to be divided among the several tasks of computation, rendering and control of the input devices. This type of setup is more appropriate for virtual environments that are oriented towards static walk-throughs and do not require complex behaviors of the individual objects.

Graphics workstation with several processors: This configuration is in general more adequate for VR applications. The operating system can reside in one of the processors, while the rest are dedicated to running the virtual reality application. In this case, one of the bottlenecks could be the graphics engine if a stereoscopic device is being used. The workstation will have to split its graphics capabilities into two separate displays (or views), which in some cases can degrade the performance by up to 50%.

Graphics workstation with several graphics engines and several processors: This configuration differs from the previous one in having the capability of dedicating two independent graphics engines to the rendering of each of the stereo images for the visual display.

Graphics workstation connected via a fast network to a supercomputer: A remote supercomputer is used to run applications that require intensive computations. The results are shipped to the graphics workstation for rendering. This type of environment is still under investigation [8, 9], although it has proven to be extremely useful for including complex behaviors in virtual environments.
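The sketch below illustrates the underlying pattern, with the "remote" computation reduced to a local thread for brevity: the simulation publishes whole results at its own rate, and the render loop always draws the newest complete result instead of waiting, so the frame rate is not tied to the computation time. In a networked setup the publish step would be a message arriving from the supercomputer; everything here is an illustrative assumption, not the author's system.

```cpp
#include <chrono>
#include <cstdio>
#include <mutex>
#include <thread>

// A whole simulation result that is swapped in atomically under a mutex.
struct SimResult { int step; double value; };

std::mutex resultMutex;
SimResult latest{0, 0.0};

void simulationLoop() {
    for (int step = 1; step <= 5; ++step) {
        std::this_thread::sleep_for(std::chrono::milliseconds(50)); // "slow" compute
        std::lock_guard<std::mutex> lock(resultMutex);
        latest = {step, step * 0.1};            // publish the completed result
    }
}

int main() {
    std::thread sim(simulationLoop);
    for (int frame = 0; frame < 10; ++frame) {  // render loop keeps its own rate
        SimResult r;
        { std::lock_guard<std::mutex> lock(resultMutex); r = latest; }
        std::printf("frame %d renders sim step %d (value %.1f)\n", frame, r.step, r.value);
        std::this_thread::sleep_for(std::chrono::milliseconds(16)); // ~60 Hz
    }
    sim.join();
}
```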

3. Useful Bibliography

3.1. Books
Aukstakalnis, S. and Blatner, D. Silicon Mirage: The Art and Science of Virtual Reality. Peachpit Press. 1992.

Barfield, W. and Furness, T. Virtual Environments and Advanced Interface Design. Oxford University Press. 1995.

Begault, D. 3-D Sound for Virtual Reality and Multimedia Applications. Academic Press. 1994.

8. http://www.iway.org
9. http://www.icemt.iastate.edu/Projects/C2/html/molecule.html


Benedikt, M. L. Cyberspace: First Steps. MIT Press. 1991.

Burdea, G. Force & Touch Feedback for Virtual Reality. John Wiley & Sons. 1996.

Committee on Virtual Reality Research and Development, National Research Council. Virtual Reality: Scientific and Technological Challenges. National Academy Press. 1994.

Earnshaw, R., Jones, H. and Gigante, M. Virtual Reality Systems. Academic Press. 1993.

Earnshaw, R.A., Vince, J.A. and Jones, H. Virtual Reality Applications. Academic Press. 1995.

Ellis, S. R. Pictorial Communication in Virtual and Real Environments. Taylor and Francis. 1991.

Kalawsky, R. The Science of Virtual Reality and Virtual Environments. Addison-Wesley. 1993.

Pimentel, K. and Teixeira, K. Virtual Reality: Through the New Looking Glass (2nd ed.). McGraw-Hill. 1994.

Wexelblat, A. Virtual Reality: Applications and Explorations. Academic Press. 1994.

Woolley, B. Virtual Worlds. Viking Penguin. 1994.

3.2. Journals, Conference Proceedings & Magazines
IEEE Virtual Reality Annual International Symposium (VRAIS) Conference Proceedings.

Computer Graphics, ACM SIGGRAPH.

Presence: Teleoperators and Virtual Environments. MIT Press. http://www-mitpress.mit.edu/jrnls-catalog/presence.html.

Human Factors. Journal of the Human Factors and Ergonomics Society. ISSN 0018-7208.

IEEE Computer Graphics and Applications. IEEE Computer Society.

Journal of Medicine and Virtual Reality. Virtual Reality Solutions, Inc.

IRIS Universe: The Magazine of Visual Computing. Silicon Graphics, Inc.

CyberEdge Journal: The World’s Leading Newsletter of Virtual Reality. The Delaney Companies.


3.3. Some Web Sites

HMD Technology
http://www.nvis.com/company.htm
http://www.virtualresearch.com//home.htm
http://www.keo.com/
http://www.vr-atlantis.com/vr_systems_guide/vr_systems_list2.html

HCDs & Pinch Gloves
http://www.fakespace.com

Projection-Based Technology
http://www-graphics.stanford.edu/projects/RWB/
http://www.icemt.iastate.edu/Labs/se.html
http://viswiz.gmd.de/tools/CyberStage.html
http://www.evl.uic.edu:80/EVL/VR/systems.html
http://vr.iao.fhg.de/vr/information/Equipment/CAVEEE/OVERVIEW-en.html
http://www.vrex.com:80/professional/vrcove.htm

Software
http://www.sense8.com
http://www.division.co.uk
http://www.multigen.com
http://www.sgi.com
http://www.vream.com
http://web.cs.ualberta.ca/~graphics/MRToolkit.html

Reference
http://www.hitl.washington.edu/projects/knowledge_base/irvr/irvr.html
http://marg.www.media.mit.edu/people/marg/haptics-bibliography.html
http://www.itl.nist.gov/div894/ovrt/hotvr.html
http://haptic.mech.nwu.edu


REUSABILITY AND USER INTERFACE ISSUES

IN VIRTUAL REALITY

Frank Wood

Cornell Theory Center



Objective:

• Sell the concept of reusable user interfaces for virtual reality

• Present the CTC virtual reality user interface design experience


Sell: Reusable VRUI

• Writing applications: CAVE/C2 library based; own library based
• Satisfy customers, the users and developers of VR applications: scientist, engineer, girl/boy scout


Sell: Reusable VRUI (cont.)

• Common components of VR applications: navigation, object selection, state variable change, object transformations
• Specifications: consistent interaction mechanism; simple, flexible, and reusable library


Sell: Reusable VRUI (cont.)

• Requirements: navigation; move/scale/rotate of objects; object selection; constant look-and-feel
• Benefits: higher-level programming interface; shields the application developer from common VR application (a rough sketch of such an interface follows below)
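The slides name the requirements of the reusable library but not its interface. As a rough, hypothetical illustration of what such a higher-level component interface could look like in C, the sketch below uses invented type and function names; it is not the actual CTC WorkSpace API. The callback fields are the "shielding" benefit: the library dispatches device input, and the application only supplies reactions.

    /* Hypothetical sketch of a reusable VR user-interface component.
     * All names are invented for illustration; this is not the
     * actual CTC WorkSpace API. */
    #include <stdio.h>

    typedef struct { float x, y, z; } Vec3;

    typedef struct VruiComponent VruiComponent;
    struct VruiComponent {
        const char     *name;
        VruiComponent  *parent;          /* parent-space parenting        */
        Vec3            position, scale; /* transform relative to parent  */
        Vec3            rotation;        /* Euler angles, degrees         */
        int             selected;
        /* Event callbacks supplied by the application; the library,
         * not the application, decides when they fire.                */
        void (*on_select)(VruiComponent *self);
        void (*on_move)(VruiComponent *self, Vec3 new_pos);
    };

    static void dashboard_selected(VruiComponent *c) {
        printf("selected %s\n", c->name);
    }

    int main(void) {
        VruiComponent cab  = { "cab", NULL, {0,0,0}, {1,1,1}, {0,0,0}, 0, NULL, NULL };
        VruiComponent dash = { "dashboard", &cab, {0,1,-0.5f}, {1,1,1}, {0,0,0}, 0,
                               dashboard_selected, NULL };
        /* Simulate the library delivering a selection event. */
        dash.selected = 1;
        if (dash.on_select) dash.on_select(&dash);
        return 0;
    }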


Present: CTC Design Experience

• What we believe a VR user interface environment should be like:


Biggest Problems and Solutions

• Unambiguous component selection
• Component parenting coordinate system
• Arbitrarily transformed components
• Event handling


Component Parenting Coordinate System

Choice of world/parent/combination

• World. Advantages: easy to code the library. Disadvantages: not very much reuse.
• Parent. Advantages: great reuse, an intuitive modeling environment, etc. Disadvantages: increases the complexity of the library (see the sketch below).
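A minimal sketch of why parent-space parenting buys reuse, assuming translations only (rotation and scale compose the same way, but with matrices). This is illustrative C, not the WorkSpace library: each component stores an offset from its parent, so moving the parent moves the whole sub-assembly without touching the children.

    /* Parent-space parenting, translations only, for brevity. */
    #include <stdio.h>

    typedef struct Comp Comp;
    struct Comp {
        const char *name;
        Comp       *parent;
        float       x, y, z;   /* offset relative to parent (world if no parent) */
    };

    /* Compose offsets up the parent chain to get a world position. */
    static void world_position(const Comp *c, float *wx, float *wy, float *wz) {
        *wx = *wy = *wz = 0.0f;
        for (; c != NULL; c = c->parent) {
            *wx += c->x; *wy += c->y; *wz += c->z;
        }
    }

    int main(void) {
        Comp panel  = { "panel",  NULL,   2.0f, 1.0f, 0.0f };
        Comp button = { "button", &panel, 0.1f, 0.2f, 0.0f };
        float x, y, z;

        world_position(&button, &x, &y, &z);
        printf("button before: %.1f %.1f %.1f\n", x, y, z);

        panel.x += 5.0f;                     /* move the parent ...           */
        world_position(&button, &x, &y, &z); /* ... the child follows for free */
        printf("button after:  %.1f %.1f %.1f\n", x, y, z);
        return 0;
    }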


Arbitrarily Transformed Components

• 2D: move, scale
• 3D: move, scale, rotate


Interaction modality

• Tool based

• Cursor based

• X, Win95, MacOS versus novel 3D components and interactors


Event handling

Consistently the bottleneck.

• Better late than never <--> callback <--> queue: button clicks, gestural input
• Better never than late <--> polled <--> shared memory: tracker positions (see the sketch below)
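A minimal C sketch of the two styles contrasted on this slide, with invented names (not the CAVE or WorkSpace API): discrete events such as button clicks go through a queue so none are lost, while the tracker is polled and only its most recent sample matters.

    /* Queued discrete events vs. polled continuous state; all names
     * are hypothetical.                                              */
    #include <stdio.h>

    #define QUEUE_LEN 16

    typedef struct { float x, y, z; } TrackerSample;

    static int           button_queue[QUEUE_LEN];
    static int           q_head = 0, q_tail = 0;
    static TrackerSample latest_tracker;            /* "shared memory" slot */

    /* Producer side (device driver / tracking daemon). */
    static void push_button(int id) {
        button_queue[q_tail] = id;
        q_tail = (q_tail + 1) % QUEUE_LEN;
    }
    static void update_tracker(float x, float y, float z) {
        latest_tracker.x = x; latest_tracker.y = y; latest_tracker.z = z;
    }

    /* Consumer side (application frame loop). */
    static int pop_button(int *id) {
        if (q_head == q_tail) return 0;             /* queue empty */
        *id = button_queue[q_head];
        q_head = (q_head + 1) % QUEUE_LEN;
        return 1;
    }

    int main(void) {
        int id;
        push_button(1); push_button(3);             /* two clicks arrive    */
        update_tracker(0.1f, 1.6f, 0.0f);           /* many samples arrive, */
        update_tracker(0.2f, 1.6f, 0.1f);           /* only the last kept   */

        while (pop_button(&id))                     /* better late than never */
            printf("handle click on button %d\n", id);
        printf("head at %.1f %.1f %.1f\n",          /* better never than late */
               latest_tracker.x, latest_tracker.y, latest_tracker.z);
        return 0;
    }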


Unambiguous Object Selection

• Select by: enter, point

• Component overlap problem.

• Component parent coincidence problem

• Two-level selection scheme: simple selection; Alias|Wavefront SBD


Unambiguous Object Selection


WorkSpace

• Version 1 (complete): queue-based event handling, simple object selection, world-space parenting, cursor-based interaction
• Version 2 (never released): polling event handling, improved object selection, parent-space parenting, cursor-based interaction
• Version 3 (incomplete): queue and polling event handling, two-level object selection, parent-space parenting, tool-based interaction


Questions and Discussion

http://www.tc.cornell.edu/~fwood/workspace



DESIGN ISSUES IN VIRTUAL ENVIRONMENTS

Judy Vance

Department of Mechanical Engineering and

Iowa Center for Emerging Manufacturing Technology

Iowa State University


Applications of Virtual Reality in Engineering

Judy M. Vance
Department of Mechanical Engineering

Iowa State University

Abstract

Virtual reality (VR) techniques offer us the ability to interact with computer images in a way never before experienced. The computer, and later computer graphics, revolutionized the way in which engineers design and manufacture products, and now virtual reality holds the potential to further revolutionize the engineering design cycle by providing engineers with a new window into the computer world. Up until recently, virtual reality was the domain of university researchers. With the appearance of commercial VR software and peripherals, companies have started building virtual reality applications to solve real engineering problems. The focus of this paper is to describe just how and where virtual reality fits into the creation and production of better products for all of us.

What do engineers really do?

Engineering is an iterative process whereby designs are created, analyzed, tested, then modified until the very best design emerges. Then the product is manufactured and sold. This process is inherently time consuming and expensive.

The need for iteration arises due to the fact that engineers can only approximate the behavior of materials and designs through mathematical modeling. For example, in designing a swingset, forces are transferred to the crossbrace when a child swings. These forces can be measured with instruments placed on an existing swing set, but, if a totally new swingset design is being analyzed, these forces have to be estimated based on previous knowledge of existing designs combined with a basic understanding of the laws of nature. Once the forces are obtained, the size of the crossbrace can be determined based on the strength of the material and the geometry of the crossbrace. If the geometry or the forces are complex, computer analysis is performed to determine the resultant stresses in the crossbrace and compare these stresses to the strength of the material. Embedded in this whole process are many assumptions. The reality is that engineers can only arrive at "best guesses" based on existing physical laws and principles. Before production of 200,000 swingsets can begin, some testing must occur. After the results of the testing are analyzed, design changes are made and the process repeats until an acceptable design is achieved.
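As a deliberately simplified illustration of the sizing step just described, the sketch below compares the stress produced by an estimated load with the allowable strength of the material. Every number in it is invented for illustration and does not come from any real design.

    /* Toy illustration of the sizing check described above: compare the
     * stress produced by an estimated load with the allowable strength.
     * All numbers are invented for illustration only.                   */
    #include <stdio.h>

    int main(void) {
        double force_N       = 2500.0;   /* estimated load on the crossbrace      */
        double area_m2       = 3.0e-4;   /* cross-sectional area of the brace     */
        double strength_Pa   = 250.0e6;  /* yield strength of the chosen material */
        double safety_factor = 3.0;      /* margin for all the embedded guesses   */

        double stress_Pa  = force_N / area_m2;           /* axial stress = F/A */
        double allowed_Pa = strength_Pa / safety_factor;

        printf("stress  = %.1f MPa\n", stress_Pa / 1.0e6);
        printf("allowed = %.1f MPa\n", allowed_Pa / 1.0e6);
        printf(stress_Pa <= allowed_Pa ? "design passes this check\n"
                                       : "redesign: section too small\n");
        return 0;
    }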

Engineers are not only concerned with the strength of a product, as illustrated with the swingset example, but they are often concerned with the use of that product. This is where human factors evaluations come into play. In designing a vehicle, visibility and reachability are concerns of the engineers. Questions such as "Can a small woman see over the steering wheel and beyond the


hood of this new car design?" and "Can a large man sit comfortably in the driver's seat without interfering with the steering wheel?" are critical to the design of an automobile. Typically, workstation-based software is used to examine these questions, followed by the building of a prototype for testing. Now, researchers are investigating the use of virtual reality for this evaluation.

What does virtual reality offer that existing engineering tools do not offer?

Virtual reality allows the engineer to evaluate early product designs directly using the computer data. If some of the critical evaluations can be performed on computer models, without the expense of building several physical models, then the cost of designing the product is reduced. In addition, the ability to perform multiple evaluations using computer data gives the engineer the freedom to freely investigate a multitude of design ideas without incurring prohibitive cost to the project.

Virtual reality, because of its immersive, first-person nature, presents a unique tool for the engineer. Instead of being on the outside of a design, looking through the computer workstation window onto the design, the engineer can experience inhabiting the same space as the design. This difference in perspective can be illustrated with a simple example. Say you are going to buy a new car. You arrive at the car dealer's lot and walk around looking at the vehicles in the lot (looking from the outside in). You spot the car you like, obtain the keys and enter the car. Now you are looking from the inside out. You realize that the car rides just a little too close to the ground for you, and you exit the car and begin to look for a larger vehicle to purchase. Until you entered that car, you could not experience the feeling of being "too close to the ground". Virtual reality can be used to give you that feeling while still avoiding building a real car for you to enter. A non-immersive application cannot provide the full sense of being in the driver's seat.

The applications of virtual reality in engineering can be divided into three main areas: virtual conceptual design, virtual design analysis, and virtual prototyping.

Virtual conceptual design

Conceptual design is the most creative process in the engineering design cycle. It is in this stage that the basic framework of the new product is developed. Brainstorming, idea creation, and creativity are all characteristics of this stage of the design cycle. The final result is a coarse description of the general geometry of the design and of the function of the design. In this stage of the design cycle, many ideas are generated and evaluated. The more ideas the better. Tools are needed to help the engineer specify geometry without requiring detailed accuracy and to specify or quickly verify the desired function of the product. Current tools consist of a sketchpad and pencil or perhaps a Computer-Aided Design (CAD) software program.

Three-dimensional conceptual design is currently performed using a computer monitor interface where the third dimension of depth is lost. Three-dimensional coordinates can be used to define an object, but these objects are displayed on a two-dimensional screen. Modern workstations have finely tuned graphics that allow the user to manipulate even complex objects in real time. What this interface doesn't provide, however, is the real experience of spatial relationships that VR can


provide. Everything on a computer monitor is scaled to fit the physical limitations of the monitor. A 30 foot high building is scaled to fit the 17 inch computer monitor. In virtual reality, since it is first-person based, everything can be scaled to real world sizes. In virtual reality you can stand next to a 30 foot building, look up, and get a feeling for how tall that building really is. Jeff Stevens, manager of design process engineering at GM's NAO Design Center, says,

"Everything is so small when you look at a 17" CAD monitor, that you don't get a sense of proportion. Once you look at VR, you can present a package to a manager or designer in the scale they would actually relate to a physical property." [1]

This real world scaling is very important in conceptual design and a key reason for building physical prototypes. Virtual reality technology provides this capability without the need for physical prototyping.

Because virtual reality allows the designer to inhabit the space where the design will be created, it provides a three-dimensional context for the creation of the design. Instead of trying to specify a point in 3-space using a two-dimensional mouse as the traditional human-computer interface, VR interaction devices allow the user to specify positions in true three dimensions. Often products or components are designed such that they must interact or connect to other components. Virtual reality allows the designer to be in the design space and investigate these interactions.

Virtual prototyping

Virtual prototyping is the most common application of virtual reality in engineering design. A prototype is a one-of-a-kind production of a conceptual design. Building real prototypes is inherently very expensive. Custom tooling and custom craftsmanship are needed to create a prototype of a design. Physical prototypes are built so that several aspects of the design can be verified. Do the parts fit together correctly? Can the customer use the product as intended? Are there safety issues that haven't been addressed? Can the customer see and reach all the required features of the design? All these are questions that the evaluation of the physical prototype seeks to answer. Often several prototypes are needed before the design is released into production.

Virtual reality is being used to reduce the number of real prototypes required in the design process. Because of the first-person, true-to-scale environment of virtual reality, many of the questions answered by building a prototype can be answered using a virtual prototype. In addition, changes can easily be made to the computer models and more evaluations performed in a shorter time period than with traditional prototyping.

Virtual analysis

Analysis is an engineering tool that is used to check a design for modes of failure. Often a component is required to carry a certain load without failure. In that case, finite element analysis may be used to determine the areas which are most likely to fail. Once these areas are identified, the

1. Kobe, Gerry. Virtual Interiors, Automotive Industries, May 1995, vol. 175, no. 5, pp. 52-54.


designer can make changes to those areas to improve the chance of survival of the design. There are several types of analysis performed on designs, including stress, vibration, sound, fluid, heat transfer, etc. Each of these analyses has as its basis a computer-generated geometric model. This model can be readily displayed in the virtual environment along with color contours or data points indicating the results of the analysis.

Virtual reality can be used to once again allow the designer to experiment with various designs in a three-dimensional environment and determine the best design. Here the analysis results can be displayed on the virtual object. The designer can visually inspect these results in the three-dimensional space and gain a more intuitive understanding of the significance of these results. Research is underway to allow the designer to change the design and see how the analysis results are affected.

Applications of virtual reality in the United States

Many companies have begun to investigate the use of virtual reality for design. This section will present a short description of the use of virtual reality by various U.S. companies. Although not fully comprehensive, this list is representative of the types of uses of virtual reality in the product design market sector.

Caterpillar, Inc., which manufactures construction, mining and agricultural machinery, has been using virtual reality techniques to eliminate expensive prototypes. [2] In particular, Caterpillar uses virtual prototypes to assess visibility from the cabs of their vehicles, the ease of manipulating the vehicle, and other performance issues. [3] They combine a real seat, steering wheel, and control console with a virtual representation of the rest of the cab and the working environment, which is displayed in a CAVE display system. An article in IEEE Spectrum relates the Caterpillar experience with virtual prototypes:

"The perspective that you get in [existing] 3-D visualization packages isn't really true," said design engineer Dave Stevenson of the Wheel Loader and Excavator Business Unit. Although you can rotate the image and even go inside it, "you can't do what you'd do in the vehicle," he said, such as lean a few centimeters to the side to get a better view of what is happening, as well as a better dynamic sense of the design model. With VR, that is a cinch. [4]

The article "Virtual Engineering" in the December 1996 issue of VR News focuses on the Caterpillar experience and presents the following summary that compares virtual engineering to traditional design.

2. Mahoney, Diana Phillips. Driving VR, Computer Graphics World, May 1995, vol. 18, no. 5, pp. 22-33.
3. Virtual Engineering, VR News, December 1996, vol. 5, no. 10, pp. 24-26.
4. Adam, John A., Virtual Reality is for real, IEEE Spectrum, October, 1993, pp. 22-29.

Page 46: Applied Virtual reality - USPlsi.usp.br/~mkzuffo/PSI5787/Texto_Aula_02.pdf · 1997. 6. 9. · 1 hour Navigation in Virtual Environments 5-1 Rudy Darken 90 minutes Lunch Break 1 hour

4-6

Engineers at Deere & Company, manufacturers of agricultural, construction and consumer equipment, are investigating the use of virtual reality for virtual product design and prototyping. They are developing a VR system to evaluate reachability of controls and visibility in their vehicle operator stations. In addition, they are developing a virtual design review environment where engineers and others on a design team can interactively move objects and annotate actions while in the virtual environment.

Table 1: Traditional Design versus Using Virtual Prototyping

Cost
  Traditional Design:
  • Detail designs first, then physically building mockups or other physical representation of the final object.
  • Changes require rework of models.
  • Limited number of iterations possible due to limited useful life of model.
  • Unable to simulate interactive results.
  Using Virtual Prototyping:
  • Avoids cost of building and modifying prototypes.
  • Infinite number of variations possible.
  • Infinite participation possible.

Time
  Traditional Design:
  • Build prototypes and change design based upon results experienced. Takes months to build prototypes.
  • Changes required models to be updated before next set of evaluations.
  Using Virtual Prototyping:
  • Can be "navigating" the simulation in one week or less.
  • Changes immediately produce a revised representation which can be judged and verified.
  • Design iteration completed in about 10 min.

Robustness
  Traditional Design:
  • Interpretation of drawings can lead to errors.
  Using Virtual Prototyping:
  • Solid and direct translation of CAD data to simulation. No reading of drawings or misinterpretation errors possible.

Scope
  Traditional Design:
  • Each iteration or new product required its own model.
  Using Virtual Prototyping:
  • Embedded data can be expanded or contracted to magnify size of product or allow minute examination of details.

Reusable Units
  Traditional Design:
  • Minimal reusability. Limited life practicality due to cost and time to keep current.
  Using Virtual Prototyping:
  • The technology and systems that evolve from this are applicable to most complex manufacturing products such as aircraft and automobiles.

Reconfigurability
  Traditional Design:
  • Reconfiguration of prototypes was limited to physically building modifications from drawings.
  Using Virtual Prototyping:
  • The virtual engineering model is instantly reconfigurable by downloading revised CAD data.

System Scalability
  Traditional Design:
  • Building additional scale models was the previous method used to convey proportions.
  Using Virtual Prototyping:
  • The virtual engineering simulation is easily scalable to any size configuration desired.


For several years they have used the human factors modeling software program called Jack(TM) to establish line-of-sight cones and comfort envelopes for ergonomics studies in relation to their cab design. [5] Now they are looking to extend that capability and provide first-person interaction with the virtual objects. Brian Rauch, project engineer from the John Deere Dubuque Works Advanced Research and Development Department, says:

"VR allows our engineers to evaluate the ergonomic options of a piece of equipment before it becomes steel," says Rauch. "Visibility and 'reachability' are crucial in large equipment like bulldozers. In R&D stages, VR gives engineers the ability to look at the way a hood is mounted over the engine and determine whether or not the driver still has good visibility." [6]

Engineers at the Ford Motor Company are investigating the use of virtual reality for human/vehicle interaction projects. [7] In an application which is similar to the Chrysler application, Ford uses virtual reality to assess various car dashboard configurations to verify instrument accessibility and driver visibility. The participant uses both voice commands and an input device to choose between various dashboard and steering wheel configurations. By placing themselves in the virtual driver's seat, the participant can evaluate instrument accessibility and driver visibility. Another application allows users to load luggage into the trunk of a car to determine the ease of loading certain luggage combinations and the amount of luggage that will fit into the trunk. Ford engineers are also interested in using virtual reality to assess analysis results. Airflow under the bumper and through the engine compartment is examined in a virtual environment in order to help engineers visualize the effects of engine layout on engine component cooling. [8]

Chrysler has been actively supporting research into virtual prototyping as well. One focus of their efforts is in the area of virtual prototyping of automobile interiors. [9] In one application, a designer sits in a physical automotive seat and wears an instrumented glove while looking through a BOOM display. A physical steering wheel is also present. The display shows computer models of the dashboard, steering wheel and all instruments. The user can evaluate the location and appearance of the car interior for accessibility, layout, and instrument visibility. In the process of developing this application, methods for easily transferring data from CAD/CAM models to the virtual world were developed. [10]

General Motors also is looking to virtual reality to reduce the number of physical prototypes that are required. Randy Smith, GM Staff Research Scientist, says

5. Puttre, Michael, Virtual Prototypes Move Alongside Their Physical Counterparts, Mechanical Engineering, August, 1992, pp. 59-61.
6. Website: http://www.division.com/1.div/c_news/deere.htm
7. Division software to drive VR projects for Ford Motor Co., Silicon Graphics World, Jan. 1996, vol. 6, no. 1, p. 11.
8. Mahoney, Diana Phillips. Driving VR, Computer Graphics World, May 1995, vol. 18, no. 5, pp. 22-33.
9. Flanagan, William P. Virtual Reality Lab Projects, IEEE Computer Graphics and Applications, vol. 17, no. 3, May-June, 1997, p. 17.
10. http://www-vrl.engin.umich.edu/automotive.html


"The focus of our effort is the reduction or elimination of physical prototypes. We have to cut a lot of time out of the vehicle development process. The goal is to use these tools to enhance communication and eliminate a lot of the iterations of physical models that are really just built and thrown away." [11]

GM has found that virtual reality provides an interface solution for some fundamental communication problems that sometimes exist in GM design teams. Often these teams are composed of people with widely varying skills, such as designers, engineers, and manufacturing engineers. They sometimes do not fully understand two-dimensional drawings of three-dimensional objects. Representing the objects in virtual reality, where they can see the three-dimensional nature of the product, helps to reduce the miscommunication between members of the design team and helps to foster creativity.

Probably one of the most well-known applications of virtual prototyping is from the Boeing Corp. In the design of the 777 plane, Boeing claims to have saved an estimated $100 million in development costs by avoiding building a multitude of physical prototypes. [12] This plane is the first to be designed without a full scale physical mockup. [13] In order to avoid building expensive physical mockups, Boeing needed the ability to visualize large-scale scenes composed of thousands of computer models. To answer this need, they developed their own software called FlyThru(TM). By the time the 777 rolled off the assembly line, about 40,000 models were needed along with 400,000 part definitions and 2 million part instances, yielding about 2 billion triangles when tessellated. Use of this software at Boeing yielded manufacturing savings resulting from faster assembly that was performed more efficiently and at less cost than previous Boeing airplanes.

Boeing is also investigating using augmented reality (computer images superimposed on the real world view) to guide workers in assembling wiring harnesses for their airplanes. [14] Currently wire harnesses are assembled by laying wires across pegs on plywood boards. Around 1,000 boards are required for a single plane. By using augmented reality, workers can work with a single board and the computer indicates where the wires are to be placed.

McDonnell Douglas is using virtual reality to verify maintenance tasks on its new designs. In one application engineers were instructed to remove a virtual engine from an aircraft to evaluate interferences. [15] Using virtual tools, the engineer removes the bolts from the engine, hoists it to a trailer, then moves it away from the aircraft. During the process, interference checking actively looks for places which might prevent the real engine from being removed from the airplane. McDonnell Douglas engineers have also used virtual reality to assemble two subsystems that were designed independently. During the assembly process, difficulties were uncovered, avoiding costly mistakes. McDonnell Douglas is looking to use this technology in other steps in the design cycle.

Lockheed Martin recently announced a joint project with IBM and Dassault Systemes of France to create a virtual design environment. [16] This environment will be used to simulate every aspect

11. Ellis, Grant, IRIS Universe, Fall 1996, no. 37, pp. 28-30.
12. Winter, Drew. Special Effects, WARD'S Auto World, November 1994, vol. 30, no. 11, pp. 31-35.
13. Adam, John A., Virtual Reality is for real, IEEE Spectrum, October, 1993, pp. 22-29.
14. Sprout, Alison L. Reality Boost, Fortune, March 21, 1994, vol. 129, p. 93.
15. Website: http://www.division.com/4.act/a_success/mcd1.htm


of airplane design, support and manufacture. The system will also provide a link between users at various Lockheed Martin sites throughout the United States. The first real test of the system will be the design of the Joint Strike Fighter (JSF).

Although this presentation is not all-inclusive of industry uses of virtual reality, it is representative of the extent of the applications in industry today. As time goes on, many more industries are turning to virtual reality to provide a computer environment for product development.

Challenges that remain

One of the key challenges that industry faces with respect to developing virtual reality applications is the need to integrate Computer-Aided Design (CAD) models with the virtual environment. Models created with Computer-Aided Design (CAD) software are the basis for most product designs. These models are inherently full of information and geometry. So much information is present that it often cannot be displayed rapidly enough to allow the user to navigate in the virtual environment at a reasonable speed. One of the current methods of dealing with these models is to reduce the complexity of the model for the virtual reality application. This amounts to approximating surfaces with simpler surfaces while trying not to lose critical geometry information. Methods are currently being investigated which will allow users to operate on the original geometric models without loss of resolution.
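One of the simplest ways to perform the complexity reduction described above is vertex clustering: snap every vertex to a coarse grid and discard triangles that collapse. The sketch below assumes the model is a plain triangle list already scaled into a unit cube; it illustrates the general idea of trading geometric detail for display speed, not any particular commercial tool.

    /* Sketch of vertex-clustering simplification.  Assumes coordinates
     * already lie in [0,1]; illustration of the idea only.            */
    #include <stdio.h>

    #define GRID 8                       /* coarser grid => fewer triangles */

    typedef struct { float x, y, z; } Vec3;
    typedef struct { Vec3 v[3]; } Tri;

    static long cell_of(Vec3 p) {        /* grid cell containing p */
        long ix = (long)(p.x * (GRID - 1));
        long iy = (long)(p.y * (GRID - 1));
        long iz = (long)(p.z * (GRID - 1));
        return (ix * GRID + iy) * GRID + iz;
    }

    static Vec3 cell_center(long c) {    /* representative vertex for a cell */
        Vec3 p;
        p.z = (float)(c % GRID) / (GRID - 1); c /= GRID;
        p.y = (float)(c % GRID) / (GRID - 1); c /= GRID;
        p.x = (float)c / (GRID - 1);
        return p;
    }

    /* Replace each vertex by its cell representative; keep a triangle
     * only if its vertices still fall in three different cells.        */
    static int simplify(const Tri *in, int n, Tri *out) {
        int i, kept = 0;
        for (i = 0; i < n; i++) {
            long c0 = cell_of(in[i].v[0]);
            long c1 = cell_of(in[i].v[1]);
            long c2 = cell_of(in[i].v[2]);
            if (c0 == c1 || c1 == c2 || c0 == c2) continue;  /* collapsed */
            out[kept].v[0] = cell_center(c0);
            out[kept].v[1] = cell_center(c1);
            out[kept].v[2] = cell_center(c2);
            kept++;
        }
        return kept;
    }

    int main(void) {
        Tri mesh[2] = {
            { { {0.05f,0.05f,0.0f}, {0.90f,0.10f,0.0f}, {0.50f,0.95f,0.0f} } },
            { { {0.05f,0.05f,0.0f}, {0.06f,0.07f,0.0f}, {0.08f,0.05f,0.0f} } }
        };
        Tri reduced[2];
        int kept = simplify(mesh, 2, reduced);
        printf("kept %d of 2 triangles\n", kept);  /* the tiny one collapses */
        return 0;
    }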

Another key challenge that industry faces is the need for haptic feedback. In walk-through applications the absence of haptic feedback is not all that apparent, but the moment a user is asked to interact with the computer models and the models do not "push" back, the illusion of immersion is compromised. Several of the ergonomic evaluation applications could be greatly enhanced through the addition of haptic feedback. Boeing, Ford and others are actively pursuing this area of research.

Conclusions

While the focus at the present time is on virtual prototyping, there are many more uses of virtual reality for engineering design to be investigated in the future. The areas of virtual conceptual design and virtual analysis are still in the research arena. In addition, several research institutions and companies are beginning to investigate networked VR as a communication tool for engineers. The vision is to be able to perform a design review with a team of engineers who may be widely dispersed geographically. The intent would be to provide a tool for each to interact with and evaluate the virtual design and communicate their findings to each other in real time.

16. http://www.clearlake.ibm.com/mfg/whats_new/lockheed.html


NAVIGATION IN VIRTUAL ENVIRONMENTS

Rudy Darken

Department of Computer Science

Naval Postgraduate School


Navigation Issues in the Design of Virtual Reality Systems

Rudy Darken

Naval Postgraduate School
Monterey, California



Active Navigation

• Involves both cognitive and motor elements of human performance

• Knowing where you're going (wayfinding), and getting there (locomotion)
• Teleportation is not the ultimate navigation problem solver
• Preservation of "place" and the experience of traversing space between "places"


Why is this such a difficult problem?

• We are all trying to mimic the real world because we don’t know what’s important to preserve -- so we preserve everything.

• Realism is easy to evaluate but impossible to achieve.
• But if we can achieve realism in terms of human performance and experience, we have reached our goal.

What elements of virtual worlds are important for navigation?

This section of the course will focus on navigation issues related to the design of effective virtual reality systems. Rarely is navigation the primary task when interacting with a VR system. However, problems associated with navigation often impede a user's ability to accomplish the primary task. The material presented here is a summary of research over the past three years at the Naval Postgraduate School and the Naval Research Laboratory.

Active navigation involves two fundamental components: the cognitive part (wayfinding), which involves knowing where you're going, and the motor part (locomotion), which involves actually getting there. Contrary to a common misconception, teleportation is not the ultimate navigation mechanism, because there are many tasks (training, for example) where getting there is an important part of the task. Effective spaces for navigation have several things in common. They all have "places" that are easily distinguishable from one another, and they have paths (one or more) between these places that are also easily identified.

It is easy to assume that if we make our virtual worlds look a lot like the real world, navigation problems (and every other problem, for that matter) will go away. The problem with this thinking is that, given current technological constraints, even the best virtual worlds contain only a tiny fraction of the stimuli of a real environment. So what are the important parts to preserve? Our approach is to evaluate systems not by the fidelity of the stimulus (how real it looks, sounds, feels) but rather by the performance of its users. With regard to navigation, we are not concerned with how real the environment is but rather with human performance on searching and other navigation tasks.


Key Factors of Navigation

• Method of locomotion
• A priori spatial knowledge
• Innate abilities (individual differences)
• Search strategy
• Environmental cues (inherent to the space)
• Navigation cues/tools (augmentations to the space)


A Framework for Navigation

CHI97 Workshop on Navigating Electronic Worlds

[Diagram: a navigation cycle of Goal Formation, Strategy Formation, Information Seeking, Scan, Assess, and Act, coupled to a Model (internal and external).]


Training Aids versus Performance Aids

• Training aids: Must show that their use improves performance in the real world task

• Performance aids: Must show that their use improves performance in the virtual world task only

• Important but oftentimes subtle difference

There seem to be several key factors to navigation that all interrelate in some way. Locomotion has to do with getting around. It involves both coarse movement (traversing space) and fine movement (maneuvering). A priori knowledge refers to what we know about an environment before we execute any navigation task there. Innate abilities refers mainly to our spatial abilities: mental rotation, visualization, etc. Each of us is different in this way. Search strategy refers to our plan of how to accomplish a navigation task. Environmental cues are what is in the environment itself. These are typically not explicitly for navigation but are extremely useful. Navigation cues/tools are those things that are explicitly for navigation.

A simple description of this graphic: You have to perform some navigation task. You first develop a goal; what are you looking for? You determine what strategy you will use to find it. You seek out information in the environment to help you. You scan the environment to extract information. You assess what you see. Is it what you are looking for? Does it give any clues as to where to look next? This information may be incorporated into an internal model (a.k.a. a cognitive map). This model may also be used to assess what you see. You then move in some way and the cycle repeats. At any time in this loop, you may determine to give up and form a new goal, change strategy, or seek out new information on the current goal.
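Read as pseudocode, the loop just described might be sketched as follows. This is only an interpretation of the framework diagram, not code from the course, and every name in it is invented.

    /* A loose code-form sketch of the navigation framework above:
     * goal -> strategy -> seek information -> scan -> assess -> act,
     * updating an internal model.  Purely illustrative.              */
    #include <stdio.h>

    typedef struct {
        int goal;           /* what the navigator is looking for         */
        int strategy;       /* e.g. exhaustive sweep, radial search, ... */
        int cognitive_map;  /* crude stand-in for the internal model     */
    } Navigator;

    static int  scan_environment(void)              { static int cue = 0; return cue++; }
    static int  assess(const Navigator *n, int cue) { return cue >= n->goal; }
    static void act(void)                           { printf("move toward next cue\n"); }

    int main(void) {
        Navigator nav = { 3, 0, 0 };        /* form a goal and a strategy */
        int found = 0, step;

        for (step = 0; step < 10 && !found; step++) {
            int cue = scan_environment();   /* information seeking + scan */
            nav.cognitive_map = cue;        /* fold what was seen into the
                                               internal model             */
            found = assess(&nav, cue);      /* target? where to look next? */
            if (!found) act();              /* locomote and repeat         */
        }
        printf(found ? "goal reached after %d steps\n"
                     : "gave up after %d steps\n", step);
        return 0;
    }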

An important distinction must be made between artifacts in an environment that are intended solely to improve user performance in the virtual world (performance aids) versus those that are intended to improve performance on a real world task that the virtual world is training them for (training aids). Training aids must be verified with a training transfer study to show that training has occurred, while performance aids do not. This is another case where we can be fooled into thinking that something is an aid just because it mimics the real world.


Navigation Performance Aids

• Organizational Elements
  - Clustering
  - Landmarks
• Map Usage
  - Frame of reference, transformation issues
  - Experience issues, gaming, etc.

(Darken & Sibert)


Disoriented Behavior

[Figures: left, a top-down view of the trail left by a disoriented subject, with the numbered search targets enlarged; right, the map the same subject drew after completing the task.]


Search Strategies

The left image here shows a typical top-down view of the trail left by a disoriented subject in the experiment. The targets this user was looking for have been greatly enlarged and numbered in this image. Note that the subject traverses the edges of the land masses, sometimes more than once. This subject also became disoriented several times and bumped into the lower edge. The right image shows the map this subject drew after completing the task. This particular example shows a very unusual left-right mirror reflection of the world.

Two example search strategies that were strongly influenced by the organization of the environment are shown here. The left image shows a back and forth exhaustive search strategy in an environment with no assistance given. Dead reckoning was often used here to maintain relative position. The right image shows a radial search strategy that is clearly influenced by a radial grid placed over the world in this treatment. The subject searched each wedge of the grid in turn in an exhaustive fashion until completed.

The first work we did (see Darken & Sibert, 1993, 1996a, 1996b) involved only performance aids for navigation in virtual worlds. We looked at how a world designer could organize a space such that it might be more easily navigable. We also looked at map use as a special case, since maps are such a powerful navigation aid. Issues such as how to orient the map with respect to the user, and experience with electronic dynamic maps such as those found in video games, were investigated.


Design Principles

• Organizational Principles
  - Divide the large world into distinct smaller parts.
  - Organize the parts under a simple organizational principle.
  - Provide frequent directional cues.
• Map Design Principles
  - Show all organizational elements and the organizational principle.
  - Always show the observer's position.
  - Orient the map with respect to the observer such that the map is aligned with the environment.


Task Execution

[Diagram: task execution flow. Start; acquire position and orientation; perform naive and primed searches (drawing on spatial memory); maintain position and orientation, with subtasks to reacquire them if disoriented; Finish.]


Results of Earlier Studies

• Real-world principles of wayfinding apply to virtual worlds

• Both novice and experienced navigators think in terms of paths

• Experienced navigators construct paths from a priori knowledge and on-the-fly improvisation.

• Novice navigators will cling to anything they can view as a path: handrails, catching features such as roads, rivers, edges, etc.

The principles we extracted from the real world for this first study are shown here. Organizational principles involve dividing large environments into much smaller, more easily navigable ones, possibly in a hierarchy. Directional cues are essential for maintaining orientation. If orientation is lost, navigation tasks cannot continue in any meaningful way. Map design principles involve orienting the map with respect to the user such that what is in front of the user is on top of the map (referred to as a "track-up" configuration). This is especially important for egocentric tasks such as searches. However, for geocentric tasks, a north-up map would be more appropriate.
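As a small numerical illustration of the track-up principle, the sketch below rotates a landmark's offset by the observer's heading so that whatever lies ahead is drawn toward the top of the map; a north-up map would simply skip the rotation. The numbers are invented for illustration.

    /* Track-up map orientation: rotate world offsets by the observer's
     * heading so that what is ahead is drawn toward the top of the map. */
    #include <stdio.h>
    #include <math.h>

    int main(void) {
        /* Observer faces east (heading 0 rad from the +x axis); a
           landmark lies 10 m directly ahead of the observer.        */
        double heading = 0.0;
        double dx = 10.0, dy = 0.0;        /* landmark minus observer */

        /* Rotate the offset by (pi/2 - heading) so the forward
           direction lands on the map's +y (top) axis.               */
        double a  = 3.14159265358979 / 2.0 - heading;
        double mx = dx * cos(a) - dy * sin(a);
        double my = dx * sin(a) + dy * cos(a);

        printf("track-up map position: (%.1f, %.1f)\n", mx, my); /* (0, 10): ahead is up */
        printf("north-up map position: (%.1f, %.1f)\n", dx, dy); /* unrotated            */
        return 0;
    }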

This is a simple chart of task execution in this experiment. The subject first orients and finds the start position in the world. Then several searches are performed (naive searches imply no a priori knowledge; primed searches imply a return to a known location). However, during task execution, if the subject becomes disoriented, a subtask must be completed to reacquire position and orientation. Also, some subjects performed small subtasks intended to maintain their orientation and position. Lastly, many subjects explicitly attempted to determine the configuration of the environment and to memorize where all targets were.

The high level objective of this first study was to determine if real world design principles for navigation will apply to virtual worlds. This was found to be true. However, the more interesting findings have to do with why this is the case, as this information will help us to design better virtual worlds. All users think in terms of paths rather than locations and directions only. But experienced navigators travel with greater confidence and are better able to use environmental cues than novices. A novice navigator will tend to travel point to point between landmarks in order to close in on a target. An experienced navigator creates "corridors" where relevant items are within view but are never actually targets.


Navigation Training Aids

• Both natural and man-made environments
• Training knowledge (specific to one space)
  - Exposure, practice
  - Maps (static, dynamic)
• Training skill (general to all spaces)
  - Perspective transformations
  - Landmarking ability
• No study has allowed enough exposure time per subject to allow configuration knowledge to develop.

(Darken & Durlach)


Locomotion Mechanisms and Metaphors

• Range of motion
  - Walking, running, turning, crawling, ...
  - Direction, velocity, acceleration
• Form of motion
  - Completely natural, approximated, fully abstracted
  - "Close" isn't good enough. Users detect this easily.


Locomotion Devices

• Uniport (Sarcos)
• Treadport (Sarcos)
• Virtual Perambulator (U. Tsukuba, JP)

The last part of this section involves locomotion mechanisms and metaphors. In order to move around in a VE, the navigator must be able to do the same sorts of things we all do in the real world: walk, run, turn, etc. We need to be able to specify directions, velocities, and accelerations. However, the form of this motion is largely what is at issue here. Do we choose a completely natural form (e.g. walking, running), an approximated form (e.g. walking in place), or an abstracted form (e.g. point-and-fly)?

There is a relatively short history of locomotion devices. These attempts have set a goal of completely natural motion. But completely natural motion takes up lots of space. So each of these uses some mechanism to simulate movement in large spaces while staying in relatively the same place. The Uniport is a unicycle-like device that turns with body twist. The Treadport is a single-direction treadmill with force feedback. The Virtual Perambulator is a "slippery floor" technique where the user has to hold on to a guide rail.

Our most recent research direction has focused on navigation training aids. The first part of this work involves training for specific spaces. Given an environment that is completely unknown to the navigator, what forms of training will best familiarize him with that environment? We are looking at exposure to maps and VEs as compared to the actual environment. If we can show that a training transfer does occur for VEs, the next step is to strip away the fidelity of the VE to determine its fundamental components essential for navigation. The other side of this work involves training navigation skills such as map reading and landmark selection.


The Omni-Directional Treadmill

Virtual Space Devices, Inc.


How it Works


The Centering Algorithm


The basic principle of omni-directional surface motion is shown in a detailed view of roller/belt interaction. The top belt is comprised of an array of freely rotating rollers over a second, orthogonally oriented belt also comprised of rollers. Top rollers translate in unison by the action of a servo motor that acts upon the entire belt. Top rollers rotate through contact with a moving lower belt. Lower belt rollers are housed in a cradle that permits rotation yet also permits resting the assembly on a low-friction support surface. Lower rollers are free to rotate and serve as a bearing surface for the linear motion of the top belt. Translation of the lower cradles is effected by a second servo motor that causes all the cradles and rollers of the lower continuous belt to move.

The top roller belt is actuated like a normal treadmill belt. But at the same time it is mechanically transparent to the motion of the lower belt; motion of the lower belt appears at the surface of the upper belt as rotary motion. In a top-down view, the user is walking towards the bottom left of the active surface. Diagonal movement is accomplished by simultaneous movement of both the lower roller belt and the upper roller belt. During diagonal motion, each top belt roller is both translating and rotating. Because both the top roller belt and the bottom roller belt are bi-directional, all planar vectors may be generated.
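The claim that all planar vectors may be generated follows because the two belts drive motion along perpendicular axes, so any desired surface velocity can simply be decomposed onto them. The small calculation below is only illustrative; it is not the actual ODT control software, and the numbers are invented.

    /* Illustration of decomposing a desired surface velocity onto two
     * orthogonal belt axes.  Not the actual ODT controller.            */
    #include <stdio.h>
    #include <math.h>

    int main(void) {
        /* Counteract a user walking 1.2 m/s along a 30-degree diagonal. */
        double speed = 1.2;
        double dir   = 30.0 * 3.14159265358979 / 180.0;

        double v_top   = speed * cos(dir);  /* top belt (translation) component   */
        double v_lower = speed * sin(dir);  /* lower belt (roller rotation) comp. */

        printf("top belt   : %.2f m/s\n", v_top);
        printf("lower belt : %.2f m/s\n", v_lower);
        printf("resultant  : %.2f m/s\n", sqrt(v_top*v_top + v_lower*v_lower));
        return 0;
    }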

The Omni-Directional Treadmill (or ODT) will be our main focus. It is one generation beyond the devices previously described. It allows motion in any direction. The left image shows a user on the ODT under the overhead boom. There are two connections to the user from the boom; one is a mechanical spatial tracker, and the other is a nylon safety strap connected to a kill switch. If the user should fall in any way, the nylon cord will support all weight and will shut down the device.


ODT Analysis

• Locomotion
  - Velocity: rest, walk, jog
  - Transitions: accelerate, decelerate
  - Movement direction: forward, backward
  - Direction change: straight, turn
• Maneuvering
  - Turning in place
  - Side-stepping
  - Bending

(video)


Locomotion is an Automatic Process

• Having to “learn” how to walk on a device is unacceptable

• Behavior adapts over time, but is the process automatic?

• Latencies due to tracking and dampening too great

• What is a reasonable approximation?
• Maneuvering techniques (NRL, Jim Templeman)
  - Walking in place
  - Intended movement signatures


Future Work

• Navigation Performance
  - Individual differences -- cues/tools
  - Virtual map usage
• Navigation Training
  - Natural environments
  - Urban/architectural environments
  - Skill/knowledge training
• ODT Analysis
  - More data collection on recent improvements

Our future plans include continued experimentation in performance aids for navigating complex virtual worlds. This includes a study of individual differences and map usage. We also plan a study on navigation aids for training. One study will take place in a building interior and another in a natural (outdoor) environment. Finally, we plan to take another look at the ODT to study recent modifications to the centering algorithm and determine if they have eliminated some of the problems we saw in our earlier work.

Problems arise when the user steps away from center in such a way that the vector of motion to bring the user back to center is not aligned with the direction of walking motion. We looked at users on the ODT in a variety of motions and transitions between motions, as well as small maneuvering tasks such as side-stepping. The ODT performed very well on tasks with no acceleration, deceleration, or turns during these transitions, or if these actions were relatively small. In cases where the user accelerates or decelerates very fast, the latency in tracking allows the centering motion to continue after the user's legs and feet have stopped moving.
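The notes do not give the centering algorithm itself; the sketch below is only one plausible reading of such a scheme, in which the surface cancels the user's estimated walking velocity and adds a small, dead-banded pull back toward the platform center. It is included to make the failure mode above (centering motion continuing after the feet stop) easier to picture, and all names and numbers are invented.

    /* One plausible reading of a centering scheme (not the actual ODT
     * algorithm).  If the walking estimate lags, the centering term
     * keeps acting after the feet have stopped, giving an unsteady feel. */
    #include <stdio.h>
    #include <math.h>

    int main(void) {
        double px = 0.4, py = 0.1;   /* user offset from platform center (m) */
        double wx = 0.0, wy = 0.0;   /* estimated walking velocity (m/s)     */
        double gain = 0.8, dead_zone = 0.05;

        double dist = sqrt(px*px + py*py);
        double cx = 0.0, cy = 0.0;
        if (dist > dead_zone) {      /* only recenter outside the dead zone  */
            cx = -gain * px;
            cy = -gain * py;
        }
        /* Surface velocity command = cancel walking + recentering pull. */
        double bx = -wx + cx;
        double by = -wy + cy;
        printf("belt command: (%.2f, %.2f) m/s\n", bx, by);
        return 0;
    }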

If the goal is to enable perfectly natural locomotion, then walking and running on the ODT must be an automatic process; the user must never have to think about it. Users should not have to learn how to walk specially for the ODT. The ODT should not inhibit some kinds of motions while allowing others; otherwise the user will have to remember what can and can't be done. We believe that the majority of the problems in this generation of the ODT are in the tracking mechanism and the residual movement of the treads after user motion has stopped. This gives the ODT an unsteady feel and does not allow walking to be completely automatic. A possible short-term solution might be an approximation (ongoing NRL research) such as walking in place.


Our Research Group


http://interact.nps.navy.mil

This research is sponsored by the Office of Naval Research, Cognitive and Neural Sciences Division

References

Bowman, D., Koller, D., & Hodges, L. (1997, March 1-5). Travel in Immersive Virtual Environments: An Evaluation of Viewpoint Motion Control Techniques. Paper presented at the Virtual Reality Annual International Symposium (VRAIS), Albuquerque, NM.

Darken, R. P., & Sibert, J. L. (1993). A Toolset for Navigation in Virtual Environments. Paper presented at the ACM Symposium on User Interface Software and Technology, Atlanta, GA. 157-165.

Darken, R. P., & Sibert, J. L. (1996). Navigating in Large Virtual Worlds. International Journal of Human-Computer Interaction, 8(1), pp. 49-72.

Darken, R. P., & Sibert, J. L. (1996). Wayfinding Strategies and Behaviors in Large Virtual Worlds. ACM SIGCHI 96, pp. 142-149. (See also the CHI 96 Conference video.)

Lynch, K. (1960). The Image of the City. Cambridge: MIT Press.

May, M., Péruch, P., & Savoyant, A. (1995). Navigating in a Virtual Environment With Map-Acquired Knowledge: Encoding and Alignment Effects. Ecological Psychology, 7(1), 21.

Passini, R. (1984). Wayfinding in Architecture. New York: Van Nostrand Reinhold Company Inc.

Péruch, P., Vercher, J., & Gauthier, G. M. (1995). Acquisition of Spatial Knowledge Through Visual Exploration of Simulated Environments. Ecological Psychology, 7(1), 1.

Satalich, G. (1995). Navigation and Wayfinding in Virtual Reality: Finding the Proper Tools and Cues to Enhance Navigational Awareness. Unpublished Masters thesis, University of Washington.

Witmer, B. G., Bailey, J. H., & Knerr, B. W. (1995). Training Dismounted Soldiers in Virtual Environments: Route Learning and Transfer (Technical Report 1022). U.S. Army Research Institute for the Behavioral and Social Sciences.

Papers on these topics can be found at this web site.


PSYCHOLOGICAL AND PHYSIOLOGICAL EFFECTS

OF IMMERSIVE ENVIRONMENTS

Mary Lynne Dittmar

Visualization Laboratory
Advanced Computing Group

Boeing Defense and Space Group


Psychophysiological Effects in Immersive Environments

Mary Lynne Dittmar, Ph.D.
Visualization Laboratory
Advanced Computing Group
Boeing Defense and Space Group
P. O. Box 240002, M/S JW-75
Huntsville, Alabama 35824

Abstract

Immersive virtual environments (IVEs) comprise a subset of technologies collectively referred to as "Virtual Reality" (VR). The continuing development of IVEs and related applications is dependent upon an understanding of the psychological and physiological effects of such environments. Such effects include discrepancies in visual mapping and distance estimation, Immersive Simulation Adaptation Syndrome, and changes in visual accommodation and convergence.

A review of the literature regarding these effects is undertaken here. Human factors concerns, including impacts upon human performance, design issues, and ethical concerns regarding simulation of "real" tasks, are addressed. Practical suggestions as to how to deal with these issues are presented. Finally, the potential for an improved understanding of human perceptual adaptation is discussed.

Introduction

Arguably, the class of Virtual Reality (VR) applications which has received the most public attention while being the least well understood is the Immersive Virtual Environment, or IVE. IVEs can be stereoscopic or not, fixed or mobile. For the purposes of this paper, the discussion will address stereoscopic, mobile computer-generated environments, generally delivered by a head-mounted display (HMD) which couples the human's head to the IVE by means of some type of "wrapped" configuration, maintaining the visual field at constant distance and orientation for as long as the viewer wears the device. Although various input technologies for the presentation of sensory feedback stemming from modalities other than vision (e.g., audition, taction) are currently being explored, the development of VR systems has focused in large part upon the human visual system, and it is toward that system and related peripheral devices that this paper is directed.

Virtual environments are designed to emulate the information presented to the human visual system by the "real world". Limitations in technology, as well as in our understanding of the workings of the visual system, mean that there are and will continue to be detectable discrepancies between IVEs and the real world that they simulate. Since our visual system developed over eons under the influences of the physical world we know as "real", it is closely coupled to those influences. For example, we orient to and locomote through the surrounding landscape under the


influences of gravity, visible light (associated primarily with the star circled by the Earth), the characteristics of a variety of surfaces, etc. Since IVEs do not provide exact matches to the physical characteristics of the world in which our species developed, nor, in many cases, do they map easily onto expectations for what constitutes a visual scene, differences between IVEs and the physical world (and our psychological representations thereof), which may appear slight to designers, can be expected to have considerable effects on perception.

We will begin by examining two of the most prevalent "mismatches" between the human visual system and IVEs, then discuss the interaction of those discrepancies with "psychological" expectancies and visual memory. Physiological correlates of the use of virtual technologies, specifically "simulator sickness", will be examined. Finally, practical human factors considerations, ethical issues, and the adaptive potential of the human perceptual system will be discussed.

Motion Effects

In the construction and presentation of IVEs, motion, like objects, is generally simulated. Liquid crystal displays (LCDs) and cathode ray displays (CRDs) are updated at a finite and continual rate. As a result, the simulation of motion is achieved by presenting a series of static images which are distributed along a vector, appearing at spatially distinct locations. Called "sampled motion" (Fahle & Poggio, 1981; Watson et al., 1986), this approach to simulating motion has been used in a variety of applications prior to VR, notably film and animation. In general, the human visual system responds to sampled motion by perceiving movement in a specific direction, as long as the sampling rate exceeds the requirements of the visual system for generating apparent motion. However, the perceptual coherence of sampled motion can be lost when the sampling rate of a motion sequence falls below that threshold, permitting the observer to detect spatially-discrete images. The consequences of this detection vary, depending upon the specific physical characteristics of the visual image, but may include reversal of the perceived direction, "jerky" motion, and the appearance of multiple, so-called "ghost" images, displaced along the trajectory of the target and appearing to linger behind it. This class of perceptual phenomena is referred to as "temporal aliasing", because it occurs when a temporally-based sampling method does not provide information to the visual system with sufficient refresh rates and/or consistency to permit the illusion of motion to be maintained.

Practical Issues Regarding Motion Effects

Several theoretical explanations for the extraction of motion from static images exist (see Adelson & Bergen, 1985; Kolers, 1972; Johansson, 1975; Ullman, 1979; Reichardt, 1957; van Santen & Sperling, 1985). Analysis of those models is beyond the scope of this paper. In general, however, temporal aliasing results in degraded images, absence of the perception of motion, disruption and/or disintegration of the visual field (particularly in cases where apparent motion is reversed or ghosting occurs), and the imposition of considerable mental workload upon the immersed human, the common result of which is failure to remain immersed. In general, these effects may be addressed by the following methods, collectively referred to as "anti-aliasing" techniques:

1. Increasing the update rate. This permits the presentation of more images along a given trajectory, resulting in less temporal disparity from image to image. Among other things, an increased update rate also permits simulation of higher velocities (increased visual angle between images is compensated for by greater sampling rates); a rough numeric sketch of this trade-off appears after this list. Higher refresh (sampling) rates also reduce the probability1 of jerky motion. However, increasing the update rate may also have the unfortunate consequence of creating multiple images (Allport, 1970).

2. Lowering image contrast. It has been suggested that lowering the brightness contrast of an image may reduce sampling artifacts (Bex et al., 1993). There are likely limits to the effectiveness of this technique as well, since reduction of brightness contrast beyond a certain threshold may result in a shift of attention on that object from foveal to peripheral locations, especially prior to an eye movement (Henderson, 1993). Peripheral vision is quite sensitive to motion, more so than foveal fixation under some conditions; thus contrast reduction might actually increase the probability of temporal aliasing in some IVEs.

3. Blurring images. Farrell et al. (1990) suggested that multiple imaging results from some type of neural decay, sluggish enough to give rise to ghosting when multiple presentations of a moving stimulus occur at a rate exceeding the decay interval. Blurring the images results in a filtering of the image, tuning it for motion detection by receptors which are tuned to larger, less-distinct images. Such receptors generally have larger receptive fields, and are therefore less likely to generate the distinct, well-bounded images which are associated with multiple imaging. Optically, this is achieved by viewing targets peripherally, which has been suggested above as an aid in reducing temporal aliasing. However, blurring is another way of achieving a marked reduction in the resolution of an image, and will likely result in user complaints that the image is "out-of-focus".

At this point, it should be clear that the specific requirements for the use of IVEs will dictate which anti-aliasing techniques are advised. Because of the complex interactions involved in these effects, the techniques which work well in one operational or research situation may not be applicable in others. Such difficulties are further compounded by another and far more intractable visual mechanism, that of the accommodation response.
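
To make the frame-rate trade-off in technique 1 concrete, the following sketch (added here; it is not part of the original course notes) estimates the angular jump a moving target makes between successive frames at several update rates. The 2-degree "smoothness" threshold and the 90 deg/s target velocity are illustrative assumptions only, not values drawn from the literature cited above.

# Illustrative sketch only: the threshold below is an assumed placeholder value.
def frame_step_deg(angular_velocity_deg_s, update_rate_hz):
    """Angular displacement of a moving target between successive frames."""
    return angular_velocity_deg_s / update_rate_hz

def motion_appears_smooth(angular_velocity_deg_s, update_rate_hz, max_step_deg=2.0):
    """Crude check: sampled motion tends to cohere while the per-frame jump
    stays below some (viewer- and stimulus-dependent) threshold."""
    return frame_step_deg(angular_velocity_deg_s, update_rate_hz) <= max_step_deg

if __name__ == "__main__":
    for rate_hz in (10, 15, 30, 60):          # candidate display update rates
        step = frame_step_deg(90.0, rate_hz)  # target sweeping 90 degrees per second
        print(f"{rate_hz:>3} Hz: {step:4.1f} deg/frame, "
              f"smooth = {motion_appears_smooth(90.0, rate_hz)}")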

Accommodation and IVEs

"Accommodation" refers to the change in the focusing power of the lens of the eye. In humans, this is achieved by the contraction and relaxation of the ciliary muscles in the eye. When the muscles relax, the lens flattens out and focusing power decreases; when the muscles contract, the lens becomes thicker and its resolving power increases (Dember & Warm, 1979). It appears that inputs from both the sympathetic and the parasympathetic pathways of the autonomic nervous system innervate the ciliary muscles (Gilmartin, 1986). This is important because it suggests that not only physical stimulus properties, such as blur, affect ciliary muscle response, but also "psychological" factors such as stress, or expectations about the appearance of stimuli, might affect the focus of the eye without conscious effort or awareness. Accommodation may be categorized as follows (Edgar & Bex, 1996):

1. "Probability" and "probabilities" are used throughout this discussion to denote the statistical nature of these occurrences. Both within-individual and between-individual differences (variabilities) are considerable with regard to these effects.


1. Reflex accommodation. Reflex accommodation occurs when a physical property of a stimulus being viewed suddenly changes, provoking a "reflexive" shift in focus. Although several characteristics of a stimulus give rise to reflex accommodation, the most powerful factor is blur, which results in frequent shifts in accommodation as the eye attempts to bring the blurred image into sharp focus (Fincham, 1951; Hubel, 1988). This is particularly salient in IVE displays, wherein visual acuity is usually less than optimal, and/or blurring may be invoked as an aid in lowering the probability of temporal aliasing effects. Frequent changes in reflex accommodation, then, are expected to be a fairly ubiquitous feature of VR displays for the foreseeable future.

2. Tonic accommodation. Tonic accommodation, sometimes called "passive" or "resting" accommodation, is the degree of focusing power extant in the lens when the eye is "at rest". Several investigations have found that, in the absence of visual stimulation, the eye usually focuses as though fixated on an object between 0.5 and 2.0 m away from the lens (Westheimer, 1957; Owens, 1984). A related effect, referred to as the "Mandelbaum effect" (Mandelbaum, 1960), refers to the tendency to focus on stimuli which are at or near tonic accommodation, even if that stimulus obstructs or partially obstructs the actual visual target (Owens, 1979). Thus, any object near tonic accommodation may affect the overall level of accommodation, as well as resulting in a shift of attention. With regard to VR displays, Edgar and Bex (1996) point out that this may be relevant to a well-known phenomenon referred to as "instrument myopia", which describes a type of "over-accommodation" occurring in some people who use optical instruments (see Wesner & Miller, 1986, for a review). If a stimulus is impoverished, perhaps by blur, the eye will shift back and forth (reflex accommodation), eventually moving toward resting or tonic accommodation as a "solution" for resolving the stimulus. Thus, the blur (poor spatial resolution) present in most IVEs may contribute to a form of instrument myopia.

3. Convergence. Called "convergence accommodation" by some, this refers to the change in the angle of convergence of the two eyes as an object moves closer to or further from the observer. As this movement takes place, the resolving power of the lens also changes (accommodation). Several types of convergence have been identified by researchers; however, the salient point here is that convergence and accommodation are linked; the degree of one affects the degree of the other. In VR displays, the mechanisms underlying this relationship are brought into conflict. Objects in an IVE appear to be distributed across the field of view, and appear to be "in depth", that is, at varying distances from the observer. Thus, as an individual looks from one object to another, those objects appearing closer (because the display utilizes both monocular and binocular cues, this is achieved by varying the size and location of objects displayed on the LCD/CRDs) will result in more nasal convergence, while those appearing farther away (smaller/displaced) will result in less. However - and this is critical - the accommodative requirement remains the same throughout, because the actual focusing power of the eye remains the same with regard to the LCD/CRDs. Thus, the relationship between convergence and accommodation is "dis-integrated" in IVEs (a small numeric sketch of this mismatch appears after this list). This breakdown of sensory mechanisms and the conflicting feedback resulting from it may underlie some reports that IVE systems are uncomfortable to use over relatively short durations (Neary et al., 1994). In addition, such an effect would not be at odds with mechanisms thought to underlie another discomfort - motion or "simulator" sickness, which will be addressed below. Finally, difficulties in reconciling these responses of the visual system to the demands of the "unreal", virtual environment result in poor distance judgments, particularly when objects are presented in different fields of view (FOVs), requiring head-movements to fixate the two objects (Hale & Dittmar, 1994).

4. Proximal accommodation. This type of accommodation is associated with memory ("I remember that the object was 20 yards away from me"), current knowledge ("I just passed it, that object is 10 feet to my right"), or expectancies ("I think/expect/surmise that the object is 2 feet away") regarding the relative positions of observer and stimulus. The evidence regarding proximal accommodation is more equivocal than in the previous three classes of the phenomenon, with several studies demonstrating that knowledge of distance, awareness of differences in room size, or even "thinking near" or "thinking far" can impact upon the physical process of accommodation (Dittmar & Hale, 1994; Jaschinski-Kruza, 1991; Malmstrom & Randle, 1976; Rosenfield & Ciuffreda, 1990). Thus, proximal accommodation is the most obviously "cognitive" of the four types of accommodation. In addition, several studies have suggested that mental workload or "cognitive demand" may be related to accommodation (Winn et al., 1991; see Edgar & Bex, 1996). Under ambiguous sensory circumstances, it may be that expectancies are at odds with the presentation of stimuli, creating conflict in muscular and cognitive accommodation feedback. In addition, ambiguity in stimulus configuration has been found to increase mental workload with 2-dimensional displays observed over time by human monitors (see Dittmar, 1989; Dittmar et al., 1993), which may in turn lead to further changes in accommodation.
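
The convergence-accommodation mismatch described in item 3 can be made concrete with a small sketch (added here for illustration; it is not from the course notes). Convergence follows the simulated distance of each object, while the accommodative demand stays pinned to the fixed optical distance of the display. The 63 mm interpupillary distance and the 1 m display focus are assumed example values.

import math

IPD_M = 0.063  # assumed average interpupillary distance (63 mm)

def vergence_angle_deg(distance_m):
    """Convergence angle of the two eyes when fixating a point at this distance."""
    return math.degrees(2.0 * math.atan((IPD_M / 2.0) / distance_m))

def accommodative_demand_diopters(distance_m):
    """Focusing demand on the lens, in diopters (1 / metres)."""
    return 1.0 / distance_m

display_focus_m = 1.0  # assumed fixed optical distance of the HMD screens
for simulated_m in (0.3, 1.0, 3.0, 10.0):
    print(f"object simulated at {simulated_m:>4} m: "
          f"vergence drive {vergence_angle_deg(simulated_m):5.2f} deg, "
          f"accommodation held at {accommodative_demand_diopters(display_focus_m):.2f} D")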

Practical Issues Regarding Accommodation

As mentioned, accommodation mismatches derived from discrepancies between the mechanics of the human visual system as it has evolved on Earth and the characteristics of computer-generated simulations of reality are difficult to surmount. Little empirical work has been done to quantify the perceptual consequences of the mismatches involved, beyond the identification of particularly problematic task requirements, such as the distance judgments in different FOVs mentioned above (Hale & Dittmar, 1994). Practical issues, then, must be formulated and addressed a priori, by means of analyses of the requirements of the user. What roles are IVEs intended for in each project, and what, specifically, is the user asked to do? What goals are to be met by the use of IVEs? For example, applications designed to utilize VR in hardware development, operations development and support, and facilities development will be valid only if a thorough human factors analysis is undertaken in advance to evaluate task components in terms of what a virtual reality application can and cannot provide. In addition, IVEs in particular generate some adverse, even aversive, consequences for some human beings, and so their utilization in lieu of other simulation methods must be considered in the light of those consequences. The most notable of these is "simulator sickness", so called because of its emergence first in flight simulators, but referred to here as "Immersive Simulation Adaptation Syndrome" (ISAS).


Immersive Simulation Adaptation Syndrome (ISAS)

The term Immersive Simulation Adaptation Syndrome is coined here and preferred because it suggests that the "sickness" (including symptoms of nausea, dizziness, spinning sensations, motor dyskinesia, pallor, sweating, and in extreme cases, vomiting (Graybiel et al., 1974; Graybiel et al., 1968)) occurs as a result of previous perceptual adaptation to the "real" physical world. Previous explanations of the syndrome have focused upon the discrepancies involved in motion perception (see earlier in the paper), because the symptoms are frequently reported immediately subsequent to exposure to visual or visual-motor simulations of motion (Connors et al., 1985; Toscano & Cowings, 1978; Levy et al., 1981).

The similarities between "simulator sickness" and "Space Adaptation Syndrome" (SAS) (space sickness) have been commented upon elsewhere (see Dittmar & Fuller, 1993). This author endorses the position of certain researchers investigating SAS who have postulated that the syndrome occurs as a result of psychophysiological conflict arising when patterns of sensory input to the brain are rearranged, at variance with each other, or disrupted relative to expectations of stimulus relationships in the physical world upon which our species developed (Connors et al., 1985). This view is in some contrast to the prevailing theory, which emphasizes the roles played by vestibular and proprioceptive feedback in generating SAS and simulator sickness (Reason & Brand, 1975).

I propose a broader, developmental/ecological perspective, which holds that expectancies regarding the visual properties of environmental stimuli, associated with congruent feedback from the other senses (including proprioception, vestibular orientation, tactile sensation, audition, etc.), serve to create a psychophysiological "set" for the perception of the environment and subsequent interaction with it. Developmentally, humans and other animals respond to the environment and their interaction with it by forging neural pathways (networks) which are commensurate with those experiences, and which lay the groundwork for further interaction with the world (see Hubel, 1988, for a review). However, when presented with sensory information which breaks the rules, in effect, by violating the psychophysiological underpinnings of the perceptual system, that system responds with noticeable dysfunction. What we describe here as "sickness" is, in fact, a consequence of the adaptive nature of the central and autonomic nervous systems, temporarily stymied by information which the human literally (though only temporarily) lacks the neural pathways to process. Thus, the term "Immersive Simulation Adaptation Syndrome" more accurately describes the cause and consequences of the mismatch between the perceptual system and the sensory information provided in IVEs1.

1. As mentioned, together with my co-author B. Scott Fuller, I began to develop this idea in 1993 (see references). However, some of its current form, specifically the clearer relationship between ISAS and SAS, emerged from a recent conversation with F. Story Musgrave, M.D., Astronaut, veteran of six Space Shuttle flights, and possessed of considerable conviction regarding the adaptability of the human species.

Practical and Ethical Issues Regarding ISAS

Elsewhere, it has been suggested that this perspective regarding ISAS is simply re-stating that the visual system "learns" to process information via interaction with the environment surrounding it (Gibson, 1979; 1966; 1950). Humans adapt to the physical characteristics of their world in part through the development of a perceptual system - "system" used here in the ecological sense, as a collection of capabilities and potentialities entwined in the physical world which gave rise to them. In addition, a system so capable of adaptation might be further adaptable - in short, it may be possible to train the visual system, in order to reduce the consequences of the perceptual mismatches which exist between real and virtual domains1. For example, there is some evidence that the accommodation response can be trained (Cornsweet & Crane, 1973; Trachtman, 1978), and that instrument myopia decreases with experience (Schober et al., 1970). However, training the perceptual systems of users to the requirements of utilizing IVEs is something of a conundrum; the virtual representation of the real world becomes more markedly a simulation - and a truly "virtual" world - if it requires psychophysiological adaptation in order to function there.

Additional considerations regarding the adaptive nature of human neurological and psychological mechanisms rest upon the well-established findings concerning plasticity of the brain. In the absence of binocular stimulation in infancy, the cells which facilitate processing of the difference in visual angles generated by gazing at a single object with two eyes will fail to differentiate, resulting in permanent absence or impairment of depth perception. Similarly, research exploring the plasticity of the cortex has found that altering the nature of interaction with the environment results in associated changes in the structure of the brain (see Dember & Warm, 1979, and Hubel, 1988, for reviews of this literature). Thus, the structure of the environment and the structure of the brain are intimately linked. Ethical issues pertaining to prolonged interaction with IVEs should, therefore, be taken seriously, and the development of research and utilization protocols for VR must be considered in light of the ability of the brain to modify itself, giving rise to a type of "cyber-adaptation". In other words, users are manipulating and possibly altering their neural structures as they learn to interact with a computer-generated world; the more frequent and prolonged the interaction, the more likely it is that physical modifications will occur.

While these concerns are serious ones, they also speak to the astonishing flexibility of the human perceptual system. As humans continue to develop technologies which have the effect of removing us - however temporarily - from the influence of physical systems in which we evolved, questions pertaining to changes in consciousness, perception, and even physical abilities will become more and more salient. It is true that technologies such as VR and environments such as IVEs present us with the opportunity to learn more about the limits of perception. Far more meaningfully, however, the perceptual and physiological adaptations and mis-adaptations evoked by IVEs and other environments may teach us anew how to define ourselves; not by what we perceive to be "real" at any given time and/or place, but by the degree to which we may adapt to whatever strange, new worlds we choose to explore.

1. Or, in the case of SAS, between gravity-based experiences and those which take place in microgravity, in space.


References

Adelson, E. H., & Bergen, J. R. (1985). Spatiotemporal energy models for the perception of motion. Journal of the Optical Society of America A, 2, 284-299.

Allport, D. A. (1970). Temporal summation and phenomenal simultaneity: experiments with the radius display. Quarterly Journal of Experimental Psychology, 22, 686-701.

Bex, P. J., Edgar, G. K., & Smith, A. T. (1993). Temporal aliasing: investigating multiple imaging. Ophthalmic and Physiological Optics, 13, 434.

Connors, M. M., Harrison, A. A., & Akins, F. R. (1985). Living aloft: Human requirements for extended spaceflight (pp. 19-58). NASA Publication No. SP-483. Washington, D.C.: NASA.

Cornsweet, T. N., & Crane, H. D. (1973). Training the visual accommodation system. Vision Research, 13, 713-715.

Dember, W. N., & Warm, J. S. (1979). The psychology of perception. New York: Wiley.

Dittmar, M. L. (1989). Sex differences in vigilance and mental workload. Unpublished doctoral dissertation, University of Cincinnati.

Dittmar, M. L., & Fuller, B. S. (1993). A proposal for the use of virtual reality technology in the alleviation of space adaptation syndrome (space sickness). In Proceedings of the 1993 Southeastern Simulation Conference (pp. 153-156). Huntsville, AL: Huntsville Association of Technical Societies.

Dittmar, M. L., & Hale, J. P. (1994). Virtual reality as a human factors design analysis tool for architectural spaces - control rooms to space stations II: Subjective measures. In Proceedings of the Human Factors and Ergonomics Society 38th Annual Meeting (pp. 280-284). Santa Monica, CA: Human Factors and Ergonomics Society.

Dittmar, M. L., Warm, J. S., Dember, W. N., & Ricks, D. F. (1993). Sex differences in vigilance performance and perceived workload. Journal of General Psychology, 120, 309-322.

Edgar, G. K., & Bex, P. J. (1996). Vision and displays. In K. Carr and R. England (Eds.), Simulated and Virtual Realities (pp. 85-101).

Fahle, M., & Poggio, T. (1981). Visual hyperacuity: spatio-temporal integration in human vision. Proceedings of the Royal Society of London, B213, 451-477.

Farrell, J. E., Pavel, M., & Sperling, G. (1990). The visible persistence of stimuli in stroboscopic motion. Vision Research, 30, 921-936.

Fincham, E. F. (1951). The accommodation reflex and its stimulus. British Journal of Ophthalmology, 35, 381-393.

Gibson, J. J. (1979). The ecological approach to visual perception. Boston, MA: Houghton Mifflin.

Gibson, J. J. (1966). The senses considered as perceptual systems. Boston, MA: Houghton Mifflin.

Gibson, J. J. (1950). The perception of the visual world. Boston, MA: Houghton Mifflin.

Gilmartin, B. (1986). A review of the role of sympathetic innervation of the ciliary muscle in ocular accommodation. Ophthalmic and Physiological Optics, 6, 23-27.

Graybiel, A., Miller, E. F., & Homick, J. (1974). Experiment M-131: Human vestibular function. In Proceedings of the Skylab Life Sciences Symposium, Vol. 1. Houston, TX: NASA Technical Memorandum TMX-58154.

Graybiel, A., Wood, C. D., Miller, E. F., & Cramer, D. B. (1968). Diagnostic criteria for grading the severity of acute motion sickness. Aerospace Medicine, 43, 1179-1189.

Hale, J. P., & Dittmar, M. L. (1994). Virtual reality as a human factors design analysis tool for architectural spaces - control rooms to space stations I: Objective measures. In Proceedings of the Human Factors and Ergonomics Society 38th Annual Meeting (pp. 275-279). Santa Monica, CA: Human Factors and Ergonomics Society.

Henderson, J. M. (1993). Visual attention and saccadic eye movements. In G. d'Ydewalle and J. Van Rensbergen (Eds.), Perception and cognition: Advances in eye movement research (pp. 37-50). Amsterdam: Elsevier.

Hubel, D. H. (1988). Eye, brain, and vision. Scientific American Library series (22). New York: Scientific American Library.

Jaschinski-Kruza, W. (1991). On proximal effects in objective and subjective testing of dark accommodation. Ophthalmic and Physiological Optics, 11, 328-334.

Johansson, G. (1975). Visual motion perception. Scientific American, 232, 76-88.

Kolers, P. A. (1972). Aspects of motion perception. Oxford: Pergamon Press.

Levy, R. A., Jones, D. R., & Carlson, E. H. (1981). Biofeedback rehabilitation of airsick aircrew. Aviation, Space, and Environmental Medicine, 52, 118-121.

Malmstrom, F. V., & Randle, R. J. (1976). Effects of visual imagery on the accommodation response. Perception and Psychophysics, 19, 450-453.

Mandelbaum, J. (1960). An accommodation phenomenon. Archives of Ophthalmology, 63, 923-926.

Neary, C., Fulford, K., Cook, M., & Williams, M. (1994). The effect of visuo-perceptual context on eye movements. Investigative Ophthalmology and Visual Science, 35, 2032.

Owens, D. A. (1984). The resting state of the eyes. American Scientist, 72, 378-387.

Owens, D. A. (1979). The Mandelbaum effect: Evidence for an accommodative bias towards intermediate viewing distances. Journal of the Optical Society of America, 69, 646-652.

Reason, J. T., & Brand, J. J. (1975). Motion sickness. London: Academic Press.

Reichardt, W. (1957). Autokorrelations-Auswertung als Funktionsprinzip des Zentralnervensystems. Zeitschrift für Naturforschung, 12b, 447-457.

Rosenfield, M., & Ciuffreda, K. J. (1990). Proximal and cognitively-induced accommodation. Ophthalmic and Physiological Optics, 10, 252-256.

Schober, H., Dehler, H., & Kassel, R. (1970). Accommodation during observations with optical instruments. Journal of the Optical Society of America, 60, 103-107.

Toscano, W. B., & Cowings, P. S. (1978). Transference of learned autonomic control for symptom suppression across opposite directions of coriolis acceleration. In Preprints of 1978 Scientific Program: Aerospace Medical Association (pp. 132-133). New Orleans, LA: Aerospace Medical Association.

Trachtman, J. N. (1978). Biofeedback of accommodation to reduce a functional myopia: A case report. American Journal of Physiological Optics, 55, 400-406.

Ullman, S. (1979). The interpretation of visual motion. Cambridge: MIT Press.

van Santen, J. P. H., & Sperling, G. (1985). A temporal covariance model of motion perception. Journal of the Optical Society of America A, 2, 300-321.

Watson, A. B., Ahumada, A. J., & Farrell, J. (1986). Window of visibility: psychophysical theory of fidelity in time-sampled visual motion displays. Journal of the Optical Society of America A, 2, 322-342.

Wesner, M. F., & Miller, R. J. (1986). Instrument myopia conceptions, misconceptions, and influencing factors. Documenta Ophthalmologica, 62, 281-308.

Westheimer, G. (1957). Accommodation measurements in empty visual field. Journal of the Optical Society of America, 47, 714-718.

Winn, B., Gilmartin, B., Mortimer, L. C., & Edwards, N. R. (1991). The effect of mental effort on open- and closed-loop accommodation. Ophthalmic and Physiological Optics, 11, 335-339.


MAKING VIRTUAL REALITY USEFUL

Carolina Cruz-Neira

Department of Electrical and Computer Engineering and

Iowa Center for Emerging Manufacturing Technology

Iowa State University


DESIGN ISSUES FOR VIRTUAL REALITY

Carolina Cruz-Neira
Iowa State University

[email protected]

The previous sessions identified the key features of VR that differentiate it from traditional interactive computer graphics applications and discussed the issues that application developers should consider to determine if VR is the adequate tool for the problem at hand. In this section we will cover the issues involved in designing effective VR applications.

1. Immersion

We are so accustomed to the everyday three-dimensional world in which we live that we are often not aware of it. Objects constantly surround us. We can move in the world, grab objects, and look around. We perform all these actions naturally and almost unconsciously. Even more so, we expect the world to be there and to behave and respond to our actions in a specific way. Traditional workstation graphics (with or without stereo) provide a paradigm for viewing computer-generated worlds as through a window. Viewers are constrained to a view of this world from the outside.

Virtual Reality enables a different visual paradigm: viewers are inside the world looking around them. This new paradigm is the key to achieving total immersion in virtual environments. The goal is to create the illusion for participants that they are completely surrounded by the virtual world. Immersion in a virtual world has to feel as natural as the real world that we live in. Immersion is still a subject of ongoing research.

We should note that providing a sense of immersion does not necessarily mean being isolated from the real world. In fact, for some applications, isolation from the real world can be highly intrusive and disorienting because, even though viewers cannot see the real world, they are still aware of the physical surroundings, fearing events such as tripping over a cable or running into a wall. This combination of isolation and awareness of the real world can ruin the immersive experience.

2. Degree of Intrusiveness of the Interface

Today's virtual reality systems require users to wear encumbering devices and attachments, such as head gear or cables from trackers and other input devices. These attachments can restrict users from walking more than a limited distance and from making fast changes in position, preventing participants from freely exploring the environment. One thing that really annoys me is the tug of a cable when I am in the middle of a very enjoyable virtual experience. Situations like this make users aware of the interface, which limits the enjoyment of the experience. Some interfaces are more intrusive than others. For example, HMDs tend to be more cumbersome than stereo glasses with a tracker. In general, all current virtual reality systems suffer from being intrusive.

3. Mixing virtual objects with real world objects

Certain virtual reality applications require the user to operate specialized devices. One example of this would be a car driving simulator, where a mock-up of the dashboard is required. Others might require the integration of the viewer's physical body into the virtual environment, such as an ergonomics application. For these types of applications, it is extremely helpful for viewers to see the devices they are operating, as well as their bodies, in relationship to the environment. It is up to the designer to identify these needs before deciding which system is most appropriate. See-through HMDs and projection systems currently provide an environment where blending of real and virtual objects occurs naturally. Other virtual reality display devices require the computer to provide a graphical representation of the real objects (like the user's hand) in the virtual world.

4. Audio feedback

Sound plays an important role in virtual environments. The interaction between visuals and sound is particularly important. Sound in immersive experiences is being applied in four areas:
Navigation: Using audio as an additional cue to orient users in the virtual space.
Localization: Associating a sound with an object or a location in space.
Sonification: Presenting information as sound.
Interaction: Using audio for input and output, such as voice recognition and speech synthesis.

5. Physical Feedback

Physical feedback (also known as haptics) can augment the visual and auditory information within a virtual world. In the real world, we are used to touching, feeling and manipulating objects around us in addition to seeing and hearing them. Informal studies have shown that virtual environments that synchronize a simple haptic interface with a visual display provide a much greater level of immersion than a high quality rendering of the visual display alone. In comparison to hearing and vision, the current understanding of physical feedback is limited. Although there are commercially available devices to simulate the effect of tactile and force feedback, as presented in earlier sections, physical feedback for VR is still in its early stages of development.

6. Paradigms to navigate and control virtual worlds

As was discussed in previous sections, virtual reality presents a new paradigm to interact with computer-simulated environments. We are dealing with new interfaces and new ways to communicate with computers. We are capable of creating large and complex virtual spaces that are not constrained by the physical laws of our real world, but only by our own imagination. This issue raises many questions: How should we navigate in virtual spaces? What orientation cues should be provided? How should virtual objects respond to our actions? How can viewers determine where they are in the virtual world? Several years ago, pull-down menus and mice were introduced as a more intuitive interface to computers than the command line and keyboard. Now, we have to design paradigms to navigate through virtual spaces and to control the objects and variables of virtual worlds.

Several methods are currently being used for navigation in virtual environments. The most common one is based on pointing and flying to the place we want to go. Another method is to tele-transport ourselves to a predetermined location. Other navigation tricks scale down the virtual world to fit in the viewer's hand, so they can pick where they want to go. Three-dimensional maps displayed in front of viewers can also be used to specify places the user wants to go.
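
As a rough illustration of the most common of these methods, the sketch below (added here; it is not part of the original notes) advances the viewer's position along the direction the tracked wand is pointing, once per frame. The function names, the fixed flying speed, and the assumed 15 Hz frame time are all illustrative choices.

def fly_step(position, wand_direction, speed_m_s, dt_s):
    """Point-and-fly navigation: move the viewer along the wand's pointing
    direction by speed * elapsed time. Vectors are (x, y, z) tuples."""
    length = sum(c * c for c in wand_direction) ** 0.5 or 1.0
    unit = tuple(c / length for c in wand_direction)   # normalize the pointing direction
    return tuple(p + u * speed_m_s * dt_s for p, u in zip(position, unit))

# One simulated frame: viewer at the origin (eye height of 1.7 m assumed),
# wand pointing mostly "into" the scene.
viewer = (0.0, 1.7, 0.0)
viewer = fly_step(viewer, (0.1, 0.0, -1.0), speed_m_s=2.0, dt_s=1.0 / 15)
print(viewer)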

There are other actions users can perform in virtual worlds that also require different control paradigms: manipulation of the position and orientation of objects in the environment, or manipulation of some of the variables that define the environment state, such as changing the lighting from day to night. Three-dimensional virtual pull-down menus have been used in combination with floating sliders and buttons to present available choices to the viewer. Input devices have been overloaded with control tasks. For example, a code of gestures has been defined for datagloves. In spite of all the methods now used, I still believe that completely new and different paradigms should be constructed to interact with immersive virtual environments. What those paradigms will be, we do not yet know. At the moment there are many researchers working on discovering them.

7. Successive refinement or level of detail in the images

Effective virtual reality systems need to perform at a rate of no less than 15 frames per second. This frame rate includes the times for computation, stereo rendering, and reading of the input devices. There is always a trade-off in virtual reality applications between how much computation can be done and the level of detail of the corresponding images. Most current virtual environments are characterized by poor image rendering quality. As computer graphics hardware improves, we are able to create virtual worlds with a great number of polygons, lighting models, and textures. Still, it is fair to say that in the near future we will not have enough computing power to create complex models and display them in a virtual reality environment in real time.

The technique of successive refinement trades off real-time response for image quality. In the successive refinement technique, images are frozen and detail is added in a progressive manner. This technique can be used when there is a need to explore large amounts of data or when a high level of detail is required to extract meaningful information from the virtual environment. Another application of successive refinement is to present viewers with a simplified version of the environment to explore in real time and, once they reach a place or find an object of interest, request a more detailed, non-real-time version.
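
A minimal sketch of this idea follows (not from the original notes): while the viewpoint is moving, only the cheapest representation is drawn so the 15 frames-per-second budget holds; once the viewpoint is frozen, each further pass swaps in a more detailed version. The level names and polygon counts are invented for illustration.

# Hypothetical level-of-detail chain for one object, coarse to fine (names invented).
LEVELS = ["bounding box", "500 polygons", "5,000 polygons", "50,000 polygons with textures"]

def render(level):
    print(f"rendering: {level}")      # stand-in for the real draw call

def draw_frame(viewer_is_moving, refinement_pass):
    """While moving, always draw the cheapest model; once the viewpoint is
    frozen, each successive frame adds one level of detail."""
    if viewer_is_moving:
        render(LEVELS[0])
        return 0                      # restart refinement the next time we stop
    level = min(refinement_pass, len(LEVELS) - 1)
    render(LEVELS[level])
    return refinement_pass + 1

# Simulated sequence: the viewer flies for two frames, then stops to inspect.
passes = 0
for moving in (True, True, False, False, False, False):
    passes = draw_frame(moving, passes)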


8. Shared virtual environments

Sharing a virtual environment simultaneously among more than one user is a much sought-after goal in virtual reality systems. Most research projects are done in teams, where scientists constantly exchange knowledge and ideas. Architects and designers usually work in groups, discussing design issues over drawings and floor plans. Teachers present their lectures to a group of students. Amusement park rides frequently involve more than one person. In order for virtual reality to be an effective medium in disciplines such as research, education and entertainment, it must allow several users to simultaneously interact and communicate with one another in the same environment.

Virtual environments can be shared locally or remotely. For local collaborations, if HMDs are used, one helmet for each additional user would be required. Projection systems offer the option of having one viewer as the active controller for the experience, while the other participants observe and go along for the ride. The observers wear non-tracked shutter glasses in order to see in stereo. Obviously, for remote collaborations a complete duplication of the VR hardware is required.

Another feature derived from simultaneously shared environments is the ability to offer guided tours in virtual worlds. An expert navigator, familiar with a particular virtual world, can introduce other viewers to the experience by controlling the path traversed through the environment. After an initial guided exploration, other viewers can explore the environment on their own.

There is another type of shared environment that is now emerging: high-performance distributed virtual environments. With the advances in network technology, several remote virtual reality systems (not necessarily using the same hardware) can be linked together to share the same numerical simulation. Users get the impression of being next to each other in the virtual world, even though they may be physically located at the two extreme poles of the planet.
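
One simple way such a distributed environment might keep participants in sync is for each site to periodically broadcast its tracked head pose so the other sites can draw an avatar in the right place. The sketch below is an illustration added here, not a description of any actual system mentioned in these notes; the packet format, multicast group, and port are assumptions.

import json
import socket

# Assumed multicast group and port for this example only.
GROUP, PORT = "239.0.0.1", 5007

def send_head_pose(user_id, position, orientation):
    """Broadcast one tracker update so remote sites can place this user's avatar."""
    packet = json.dumps({"user": user_id, "pos": position, "orient": orientation}).encode()
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP) as sock:
        sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 2)  # keep to nearby networks
        sock.sendto(packet, (GROUP, PORT))

send_head_pose("viewer-1", position=[0.0, 1.7, 0.5], orientation=[0.0, 0.0, 0.0, 1.0])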

9. Usability: General vs. Specific

There is no standard methodology for developing virtual reality interfaces. An effort to compile and standardize current research has recently been undertaken by many of the leading researchers in the field. Many of the current virtual reality systems are being developed to satisfy the specific requirements of a particular application. This situation may have led to the duplication of efforts in solving similar situations. Virtual reality system designers should be concerned with making their designs flexible enough to be used in a wide variety of applications.


10. Some applications

I have included several images and movies that show some of the research efforts at the Iowa Center for Emerging Manufacturing Technology in the field of virtual reality. Our objective is to introduce engineers and scientists to the use of virtual reality as a working tool. These projects and more can be visited at http://www.icemt.iastate.edu/Projects

Architectural walk-throughs: We are collaborating with the Iowa State University (ISU) architects to develop a complete virtual model of the current ISU campus. The environment was initially developed as a training testbed for VR researchers. It is currently being used for experimental interface development, architectural reviews, interactive student orientation, and real-time walkthroughs.

Virtual campus movie: movie01.mpg


We are also exploring techniques to create photo-realistic scenes in virtual worlds. The Maze is an application that demonstrates these techniques.

Dynamic Statistical Data Exploration: Dynamic statistical graphics enables data analysts in all fields to carry out visual investigations leading to insights into relationships in complex data that consists of many different variables. The data consists of multiple observations taken on the same object or measured at the same place. Dynamic statistical graphics involves methods for viewing data in the form of point clouds or modeled surfaces. Higher-dimensional data can be projected into 1-, 2- or 3-dimensional planes in a set of multiple views or as a continuous sequence of views which constitutes motion through the higher-dimensional space containing the data.
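
The projection step described above can be sketched in a few lines (illustrative only; this is not code from the ICEMT projects): observations with many variables are multiplied by a random orthonormal 3-D basis to obtain a point cloud that a VR renderer could display, and a slowly changing sequence of such bases gives the impression of motion through the higher-dimensional space. The synthetic data and the use of numpy are assumptions for the example.

import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(size=(200, 7))        # 200 synthetic observations of 7 variables

def random_projection_basis(n_vars, n_dims=3):
    """Random orthonormal basis spanning an n_dims-dimensional plane in n_vars-space."""
    q, _ = np.linalg.qr(rng.normal(size=(n_vars, n_dims)))
    return q

basis = random_projection_basis(data.shape[1])
point_cloud_3d = data @ basis           # 3-D coordinates to hand to the renderer
print(point_cloud_3d.shape)             # (200, 3)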


Dynamic Model of an Axial Piston Pump: Due to their inherent flexibility and high power-to-weight ratio, axial piston pumps have found increasing industrial use as the flow-generating member in hydrostatic transmissions. The increasing industrial use of hydrostatic pumps inside factories has also brought about an increased concern about the environmental conditions that exist due to the noise problem of these pumps. The goal of the analytical model was to understand and describe the noise and vibrational characteristics of the equipment under normal operating conditions. The mathematical model provided the basis for making design changes that minimize the noise produced by the unit.

Pump in motion: movie02.mpg

Interactive devices and force-feedback: We are currently working on developing a series of devices to enhance the interaction with virtual environments. The virtual bicycle was developed to navigate through ISU's campus, but it is now used in several other large database navigation applications.

Virtual bicycle movie: movie03.mpg

We are currently working on a robotic and magnetic interface for VR force interaction:

Robotic interface movie: movie04.mpg

Manufacturing: Several mock-ups of factory floors have been developed. The one shown in this movie is a prototype of a VR-based immersive environment as a tool for future factory design.

Factory movie: movie05.mpg

Page 78: Applied Virtual reality - USPlsi.usp.br/~mkzuffo/PSI5787/Texto_Aula_02.pdf · 1997. 6. 9. · 1 hour Navigation in Virtual Environments 5-1 Rudy Darken 90 minutes Lunch Break 1 hour

Page 9

Simulation: Several applications linking remote computing engines to our VR facility have been developed.

The real-time interaction with a dynamic simulation of the Space Shuttle Remote Manipulator System (RMS) is an application to illustrate the power of combining virtual reality with real-time dynamic simulations for mission analysis, planning, and training.

Shuttle movie: movie06.mpg

An interactive molecular modeling project used VR for the docking of a drug molecule to its molecular receptor using real-time molecular dynamics. A molecular biologist guides a drug molecule into the active site of a protein, receiving real-time feedback from a molecular dynamics simulation.

Molecular docking movie: movie07.mpg


SCIENTIFIC APPLICATIONS OF VIRTUAL REALITY

Richard Gillilan

Cornell Theory Center


Scientific Applications of Virtual Reality


In an ideal virtual world, both hardware and software are invisible. The researcher is concerned only with representing and interacting with the science.

✬ Hardware: how we create the effect of virtual reality in Cornell's Visual Insight Zone (VIZ).

How to best ignore hardware.

✬ Software: making virtual reality a productive place to work with the WorkSpace windowing toolkit. Building an application.

➔ interaction with simulations
➔ interaction with persons

✬ Science: getting the most out of virtual reality for scientific research and education. Predicting how and when molecules stick together.

Final Question: What does VR really contribute to Science?


http://www.icemt.iastate.edu/Labs/se.html

http://www.tc.cornell.edu/Visualization/Staff/richard/VROOM/VR.html

✬ stereo vision
✬ wide field of view
✬ 6D input (position and orientation)
✬ 3D sound (soon)


View from above

Objects appear to be present in the room with you at fixed locations as you move around. When you reach for an object with the 3D wand, it is where it appears to be.

Real space coincides with virtual space.

Problems:
• objects close to the operator's face will not converge for other viewers
• stereopsis is degraded for others when the operator turns or tilts the head
• objects do not appear in the exact same location to others

Advantages:
• stereo depth is enhanced for viewers behind the operator
• high resolution display
• more appropriate for a classroom setting

The person with the tracker sees the object floating in front of them; others may get a distorted view.


--- Software ---
WorkSpace VR Windowing Toolkit

See Frank Wood et al., IEEE Computer Graphics and Applications, Vol. 14 (4), July 1996.

• An application-building environment!
• Does for VR what windows did for 2D computer screens.

• Very similar to Java™ in structure and use.
• Device independent.
• Handles events, transformations and tools for the user.
• Standardized environment makes "look and feel" familiar.

[Class diagram: ISA relationships among the WorkSpace classes, including Yobject, Component, Container, ContainerLayout, FrameLayout, RowLayout, ColumnLayout, Frame, WindowMan, WindowServer, Scrollbar, Slider, Button, 3D Text, ToolChest, Tool, FlyTool, RotateTool, Navigate, Rotate, FileMan, and the Stack, Queue and Vector utility classes.]

Page 85: Applied Virtual reality - USPlsi.usp.br/~mkzuffo/PSI5787/Texto_Aula_02.pdf · 1997. 6. 9. · 1 hour Navigation in Virtual Environments 5-1 Rudy Darken 90 minutes Lunch Break 1 hour

Applied Virtual RealityAugust, 1997Richard E. Gillilan

8-7Cornell Theory Center

WorkSpace file browser and two running applications:

The left-hand 1x1 ft cube contains a virtual world model that can be expanded to surround the viewer. The right-hand cube is a movable video window.

[Screenshot: application cubes with move, resize and kill controls, shown at 1 ft and 8 ft scales.]

An application can expand to become a virtual environment in the same way a window can expand to fill the full screen.


--- Science ---
ChagasWorld

This virtual world tells the story of Chagas' disease and, at the same time, helps new users learn how to get around and manipulate objects in VR.

An unsuspecting victim.

A simple model of a Cafua (hut) typical of the kind encountered by Carlos Chagas in the early 1900's when the disease was characterized.

Triatoma infestans, the carrier of the parasite. Trypanosoma cruzi, the parasite found in the insect gut.


-- Effective use of virtual reality requires some training --

1. Choice of size, position and orientation of the object in relation to the viewer is very important and is the first step in visualizing any scientific data.

2. Learning to organize objects (such as controls) effectively in the 3-dimensional space around one's body. Escaping the bias of 2-dimensional computing.

3. Learning to walk to objects that are not within reach.

4. Paying attention to stereo vision when reaching for objects

5. Learning to operate multidimensional (3D-6D) controls: rotation and zooming.


A few interesting facts about Chagas’ disease

"At night I experienced an attack (for it deserves no less name) of the Benchuca, a species of Reduvius, the great black bug of the Pampas. It is most disgusting to feel soft wingless insects about one inch long crawling over one's body. Before sucking, they are quite thin but afterwards they become round and bloated with blood, and in this state they are easily crushed".

- Charles Darwin, March 25, 1835
-- from "Darwin's Illness", Nature, vol. 184, 1102 (1959) --

1909 Carlos Chagas describes disease (see http://www.dbbm.fiocruz.br/tropical/chagas/chapter.html)

• Protozoa (T. cruzi) living in insect gut
• Passed via insect droppings
• Chronic health problems leading to death
• Presently >18 million people infected, 90 million at risk

Recent Reviews

R. Heiner Schirmer, Joachim G. Muller, and R. Luise Krauth-Siegel, Angew. Chem. Int. Ed. Engl. 1995, 34, 141-154.

R. Luise Krauth-Siegel and Ralf Schoneck, FASEB J. 1995, 9, 1138-1146.


--- Drug Design Strategy ---

Trypanothione maintains a reducing environment in cells and protection against toxic by-products of metabolism: O2-.

Trypanothione reductase (a large protein molecule) maintains a high concentration of reduced trypanothione (breaks the S-S bond).

Drug target: find small drug molecules that stop Trypanothione reductase from doing its job!

Experimentally-determined molecular structures available:

• Bailey, Smith, Fairlamb and Hunter (Eur. J. Biochem. 213, 67-75, 1993)

• Lantwin, Schlichting, Kabsch, Pai, Krauth-Siegel (Proteins 18, 161, 1994)

• Kuriyan, Kong, Krishna, Sweet, Murgolo, Field, Cerami, Henderson (1991)


--- Simulating the Drug to Protein Interaction ---

Simulated Annealing/Monte Carlo

Multiple-Start method: Trevor N. Hart and Randy J. Read, Proteins: Structure, Function and Genetics 13:206-222 (1992)

Use an empirical forcefield to estimate the energy of interaction

Merck Molecular Force Field: Thomas A. Halgren, Journal of Computational Chemistry, Vol. 17, No. 5 & 6, 490-519 (1996).

• Make random translations and rotations

• Evaluate energy E and let δE = E_current - E_previous

• Accept move if δE < 0
• If δE > 0, accept with probability exp(-δE/kT)

V = Σ Vq + Σ Vε + Σ VT    (electrostatic + van der Waals + dihedrals)
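
The acceptance rule on this slide can be sketched in a few lines (an illustration added here; the toy energy function and move size are placeholders, not the Merck force field or the actual docking code): propose a random move, always accept it when the interaction energy drops, and otherwise accept it with probability exp(-δE/kT).

import math
import random

def metropolis_step(state, energy_fn, propose_move, kT):
    """One Monte Carlo step: random move, accept if δE < 0,
    otherwise accept with probability exp(-δE / kT)."""
    candidate = propose_move(state)
    delta_e = energy_fn(candidate) - energy_fn(state)
    if delta_e < 0 or random.random() < math.exp(-delta_e / kT):
        return candidate
    return state

# Toy stand-ins: a one-dimensional "ligand position" in a quadratic energy well.
energy = lambda x: (x - 3.0) ** 2
move = lambda x: x + random.uniform(-0.5, 0.5)   # random translation only (no rotations here)

x, temperature = 0.0, 1.0
for _ in range(2000):
    x = metropolis_step(x, energy, move, temperature)
    temperature *= 0.999                          # crude annealing schedule, for illustration
print(round(x, 2))                                # drifts toward the minimum near x = 3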


--- Flow of the Computation ---

The user interacts with the simulation by dynamically changing the annealing schedule and building a database of conformations that can later be refined with further annealing runs. Dynamic optimization of appropriate step sizes and temperatures by visual examination of the molecular motion is very effective.

[Flow diagram relating: new ligand, initial random configuration, floating grid run, simulated annealing schedule (temperatures, durations, step sizes), final configuration, database, and refinement.]


--- Realization of the Flow ---

• movable, resizable control box
• visual indicator of interaction energy and acceptance ratio
• sliders to set temperature and step sizes
• multifunctional buttons

Button functions can load/store the database, pause the simulation, start from a stored position, and control the display of stored structures.

[Screenshot labels: ligand, simulation control panel, protein surface.]


--- Previous Experiments ---

Video teleconferencing in a virtual environment

See Frank Wood et al., IEEE Computer Graphics and Applications, Vol. 14 (4), July 1996.

Virtual Reality Module for IBM Data Explorer


What does VR really give you? (beyond stereo depth cues)

✧ Visual advantage: The importance of peripheral vision for locating objects in a complex scene has long been overlooked. This type of vision is more sensitive to motion. Anyone who is losing that vision (glaucoma) is keenly aware of its importance in everyday navigation. We are constantly aware of all objects in our visual field and use that information to avoid collisions.

✧ Spatial advantage: By using the three-dimensional space around us, we can potentially organize and access a greater amount of information. The involvement of body motion may also improve the learning process.

The greatest impact virtual reality will have on science is how it will change our way of thinking. Virtual reality encourages viewers to be participants immersed in the data rather than passive observers watching from a distance. In addition, it is not the shape of objects that is emphasized so much as the shape of the space that they enclose and how that space is filled. By thinking of some scientific problems as environments for the first time, new insights will be gained.

8-17

This work would have been impossible without the hard labor and dedication of the following students:

Ryan Lilien, Jon Alferness, Jon Rosenberg, Paul Zimmons, Brian Joseph, Bob Amidon, Dan Brown

Technical collaborators:

Frank Wood* (Cornell Theory Center)
Carolina Cruz-Neira (Iowa State, ICEMT, USA)

Scientific collaborators:

Carlos H. Faerman (Cornell University, USA) Trevor N. Hart (University of Alberta, Canada)

* principal author of WorkSpace; joined the CTC staff in September 1996


9-1

USING VIRTUAL REALITY IN ENGINEERING APPLICATIONS

Oliver Riedel

Competence Center Virtual Reality
Fraunhofer Institute for Industrial Engineering


9-2

How to use Virtual Environments for Engineering Projects

Oliver H. Riedel, Ralf Breining, Holger R. Scharm
Fraunhofer Institute for Industrial Engineering
Competence Center Virtual Reality
Nobelstrasse 12c
D-70469 Stuttgart, Germany
phone: +49 (0)711 970 2088, fax: +49 (0)711 970 2213
oliver.riedel@iao.fhg.de

1. Abstract

Besides their actual technical content, most of today's engineering projects focus on two catchwords: time and cost. Improving time to market and reducing costs require innovative tools; one of them could be the use of Virtual Environments (VE). This paper describes:

- The demands from the engineering part of a product development project
- A classification of the use of VEs in engineering projects
- Offers from the VE part
- Examples from research projects

By taking advantage of a new process chain, the whole process of product development and product improvement can benefit from VE-based tools, as other fields of work already do:

- Reduction of the cost and time of building prototypes, which makes it possible to test and evaluate more variants,
- Faster testing and evaluation through virtual test beds,
- Scalable test persons,
- On-line changes of parameters,
- Enhanced quality and effectiveness of project drawing and design performance,
- Improved product quality, e.g. by integrating the customer earlier and getting better customer feedback,
- A better ecological balance by saving resources during the iteration cycles.

Two disadvantages of this new technique should not be hidden: at present the necessary hardware and software, especially the computers, are expensive, and in some companies the gap between the tools actually in use and the proposed approach is huge, sometimes perhaps insurmountable. However, in our opinion both obstacles will yield to the pressure of the market, leading to frequent use of virtual prototypes in VEs for better products, brought to market more quickly and at lower cost.


9-3

2. Demands from the Engineering Part

2.1. Aspects of engineering that require a virtual environment (VE)

Tools and methods which are used in virtual work environments and which permit information to be obtained on the product in early development phases offer attractive optimization potentials (time, cost, quality) in engineering. Here, among other things, product aspects and product features are in the foreground which essentially permit qualitative information to be obtained (e.g. formal aspects), along with product features which can primarily be assessed by evaluation (e.g. motion spaces in connection with the capability of assembly). For doing so, the interaction of man with the virtual prototypes, and man's perception and immersion in the virtual work environment, is decisive for handling the tasks at hand (Fig. 1). The verification of qualitative information has normally only been possible so far on the basis of real, physical prototypes.

2.2. Which steps of engineering projects can be solved in a VE?

The field of engineering, apart from planning tasks, handles above all complex contents of engineering and the natural sciences, which will in future be discussed in the Global Engineering Network on the basis of CSCW. Contrary to the planning tasks, the contents of engineering and the natural sciences can in many cases not be discussed exclusively on the basis of numerical values or texts; they frequently involve complex multi-dimensional problem scopes. Therefore, a discussion of these facts is supported by a spatial, multi-dimensional representation of the contents.

Fig. 1: Components of a virtual environment for engineering


9-4

The major weak point of the computer-assisted simulation and design tools available so far is that ideas of three-dimensional geometries and/or functions are handled by two-dimensional input/output media. This disadvantage is intensified with growing complexity and multi-dimensionality of the problem at hand. An extremely interdisciplinary discussion of the problem scope is aggravated by the existing tools, which in the majority of cases are very specialized, whereas the multi-sensory and immersive systems of Virtual Reality (VR) support a quick and interdisciplinary visualization of contents.

The use of conventional computer systems in such complex processes as the design of product shapes or the evaluation of assembly processes requires the user to have a great ability to think in abstract terms. To gain a more intuitive access to these problem scopes, the problems are to be handled in a virtual environment. The three-dimensional representation of shapes, tools and workplaces in real time and the use of man models opens up new perspectives.

The use of VE-based tools in a virtual, immersive work environment with real-time capability is to speed up the finding and review of concepts in the early stages of the development process. For the finding and review of concepts, product aspects and product features of prototypes can be examined in the virtual environment which so far have only been verifiable in real, physical prototypes (Fig. 2). For this purpose, product aspects and product features can be represented directly or indirectly by means of metaphors. In a passenger car, e.g., the shape or the room required by

Fig. 2: Virtual environments for engineering in the development process


9-5

individual units are directly representable product aspects and features, in contrast to the dynamic load of the chassis (indirectly representable by means of metaphors). Connecting these steps of early development with quality, time and cost management will achieve several advantages.

3. Classification of fields of application for VE in engineering

3.1. Virtual prototyping

Among other things, the duration of the development process is determined by the number of physical prototypes which are required for evaluation of the desired product properties. Virtual prototypes are defined as a computer-based simulation of a technical system or subsystem with a degree of functional behavior which is comparable to corresponding physical prototypes /23/.

The visualization of virtual prototypes calls for the processing of the data in dependence on the visualization techniques. Apart from purely photorealistic visualization, complexity-reducing models and symbolic visualization are used as well. Mixed models of both methods are most widely spread. Metaphors for the visualization of simulation results are used, for example, in FEM analysis (overlaying of paint leveling) or in the representation of paint coat thicknesses in robot simulation /24/.

Research activities for the evaluation of virtual prototypes with techniques of VR are encountered in the investigation of the flow response of flying objects /5/ and automobile chassis /3/, /8/. In so doing, the geometry model is overlaid by the results of flow simulation as iso-surfaces or as paint particles. The user has the possibility of freely moving cutting planes or virtual trails of smoke. The investigation of the capability of assembling components is also found in aircraft construction /7/ and in vehicle construction /3/, /8/, /9/, /25/. This includes the evaluation of the free construction room available, the accessibility and the ergonomic design of manual assembly workplaces.

Apart from the evaluation of the product properties, a further-reaching use of virtual prototypes also requires the production processes of the products to be considered. Research approaches can be seen in the investigation of industrial specification preparation processes /13/ and the simulation of product machining on machine tools /10/, /11/.

Aspects of asynchronous and synchronous object management in a distributed environment /9/, /19/ and the data models for the manipulation of lead time have been in the foreground so far in systems for the design of objects in a virtual work environment. Here, the data models are oriented by existing CAD standards or they focus on application protocols for the product data model STEP /26/, /27/, with these protocols still to be developed. These systems are also not fully immersive, but they are monitor-based in connection with relevant 3D input tools (e.g. space mouse, trackball), such as with the system ARCADE /20/, which, with respect to the workplace, is oriented by existing CAD workplaces. Handling aspects of virtual prototypes in an immersive VE can be found, for example, at the University of Wisconsin-Madison /21/, /22/.

3.2. 3D representation of complex data

The 3D representation of complex data is applied to the examination of comprehensive geometry data records, for example in architecture, or, in the Scientific Visualization field, to the representation of mathematically computed data. In the visualization of building data in architecture, the architect would like to walk through the virtual building together with the customer and detect design errors.


9-6

Research and development work is conducted at a large number of research institutions and in enterprises /2/, /3/, /4/. The visualization of mathematically computed data has the goal of rendering simulation results visible, audible or tactile. Outstanding work on this subject can be found at NASA Ames Research /5/. For the evaluation of the outside shape of flying objects, they simulate the flow response in a virtual wind tunnel.

3.3. Simulation of work sequences and activities

A near-reality perception (immersion) in a virtual environment is used to plan and investigate work sequences and activities. Immersion is achieved by three-dimensional representation of information in real time, integrating several human senses, with the user perceiving himself or herself as part of the scene. The interaction differs from input devices used so far by the possibility of direct intervention in the three-dimensional space /6/. The goal is to adapt the work environment to the response and physique of man. The simulation of the work environment and the interaction of man with his or her work surroundings is necessary to produce a near-reality perception. It is possible in these surroundings to conduct ergonomic investigations with subjects without a physical test setup.

3.4. Substitution of complex physical prototypes and tools

Physical prototypes are used for testing and evaluation of product properties in the early stages of product development. The use of VR permits testing and evaluation to be conducted on virtual prototypes. The objective is to avoid physical prototypes, if possible. Fields of research are currently found in the automotive /3/, /8/, /9/ and aerospace industry /1/. Apart from purely design-oriented tasks, efforts are also made to conduct physical materials tests, for example crash tests, on virtual prototypes /8/, /9/. Further-going approaches deal with the production-engineering view of prototypes and products /10/ up to the planning of production facilities /11/ and production plants /13/. These also include approaches to the substitution of complex resources for the support of production, for example in the production of cable harnesses in the aerospace industry /14/.

3.5. Modeling and control of business and production processes

Apart from product and production-oriented spheres of application, VR technologies are also used in product modeling and control /12/. Research approaches can also be encountered in information technology, for the analysis and maintenance of communication networks /1/, and in the modeling of production-engineering processes /13/.

3.6. Co-operation in virtual work environments

Apart from the spoken language, human communication is based on the expression of the body (gestures) and the face (facial expressions). The field of research of near-reality virtual work environments investigates human communication in immersive VR environments. The contents of this work are Computer-Supported Cooperative Work (CSCW) /15/, /16/, the mapping of human gestures and movement /17/ and the realization of synthetic behavior of virtual human beings /18/.


9-7

4. Offers from the VE Part

In recent years, VR has developed from an experimental technology at large-scale research institutions and in big enterprises into a strategic basic technology /1/. The use of these tools in product development is already being investigated in research and development. VR permits the user to immerse in computer-generated worlds and to investigate the concepts of new products or factories as a forerunner to their physical realization. VR also has close links to related technologies (e.g. 3D animation, CAD and multimedia), however with the advantage that the user is given the impression of immersion, of direct participation and integration in an intuitive manner.

The interaction of man with computers will in all probability be effected in the next millennium via user interfaces such as those which have already partly been implemented and studied under virtual reality (VR). The diffusion of these user interfaces into a wide variety of fields of application is speeded up by the complexity and interdisciplinary nature of the tasks at hand. The general goal of VR-based user interfaces is to increase efficiency in the solution of complex and/or abstract problems.

VR is already used very realistically for 3D architecture visualization; it is also used at the research stage for the interactive presentation of different products. The state of the art of the components relevant to this project is as follows:

- Models up to approx. 400,000 polygons on an SGI ONYX with approx. 10 different picture scenes per second,
- Stereoscopic visualization up to 1280 x 1024 pixels (BOOM, n-vision HMD),
- Electromagnetic tracking systems with resolutions of 0.5 mm to 2 mm,
- 3D scanning systems for geometry and texture with an accuracy of 0.05 mm to 0.5 mm.

The path from the first draft model to a product description fit for serial production is accompanied by many iteration steps. The validation of the state of development requires the generation of models of shape, function and/or production. The classical realization of design representations, e.g. by folding of paper, plastic shaping of modeling materials or the handling of model foam, is very intuitive and supports spontaneous creativity. The difficulty here is the making of variants, the reproduction of the models and the common drafting with persons involved who are locally separated. However, these aspects (formation of variants, reproduction, distributed drafting) are advantages of CAD/CAM drafting. Creative design which integrates computer-based tools into the approach in very early phases, however, is viewed critically by many designers at the present time. A reduction of the handicraft work normally associated with it leads to a critical distance to the model. In a virtual environment, such tasks can be handled intuitively on the basis of virtual prototypes.


9-8

4.1. How to migrate data from existing applications

The following table gives a brief overview of migration paths from existing applications. Because a migration strategy depends very strongly on the available products, we show here only a very abstract view of the migration. For current information about converters and other tools, refer to our web page.

Tools/Technologies: Benefits and Problems

3D Realtime Graphics
  Benefits: free choice of viewpoint; visualization of surface properties; simulation of different lighting conditions; depth impression
  Problems: limited bandwidth; no raytracing algorithms; linear geometry/color runs; simulator disease

Augmented Reality
  Benefits: mixing real and virtual world
  Problems: low immersion; data repair / reduction; surface reconstruction; discontinuities

3D Interaction
  Benefits: intuitive HCI; manipulation in 3D
  Problems: problems at low framerates; needs accurate calibration

3D Sound Rendering
  Benefits: locating sound sources; sound simulation
  Problems: time consumption; iterative rendering

3D Scanning
  Benefits: virtualization of real objects
  Problems: large data sets

Realtime Simulation
  Benefits: physics (inertia, gravity); asynchronous in realtime; replay of keyframe animations
  Problems: needs high numerical speed

Online Database Access
  Benefits: access to actual information; knowledge base; inventories
  Problems: geometry management; representing information

Table 1: Benefits and problems of the tools / technologies


9-9

4.2. Virtualization of real prototypes/original models

Nowadays, product development in the majority of cases does not start on the computer but with the making of "real" models. Most designers are probably right in their opinion that the available 3D models are still an obstacle to creativity rather than promoting it. It is, however, essential for a short development time that the design process and the engineering process can run in parallel to the greatest possible extent, and this calls for the availability of a virtual model at an early stage. However, the existence of initial surfaces in CAD and VE permits a further-going detailed itemization of design characteristics. It is another advantage of the first virtual model that a visualization of the design draft is possible at any place and time.

Type of Application: Type of Data and Migration

3D CAD / Modeler
  Type of Data: freeform surfaces; parametric geometries; trim curves; surface/material properties; assembly information
  Migration: tesselation (static/dynamic); data reduction (LOD); surface properties -> textures, materials, reflection mapping

Prototypes
  Type of Data: 3D scanning data sets; surface reconstruction -> CAD data sets
  Migration: data repair; data reduction; color information -> texture

Simulation (FEM/GEM, ...)
  Type of Data: geometry; additional simulation data; temporal changes
  Migration: geometrical representation of simulation data; intelligent structures

Functionality
  Type of Data: interaction man-model; operating/behaviour
  Migration: description language; feedback; intuitive user interaction

Table 2: Migration of data from existing models
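As one concrete illustration of the first row of Table 2 (tessellated CAD geometry reduced to several levels of detail, with surface properties carried over into the material record used for texturing), here is a minimal Python sketch. The VEMesh record, the migrate_part function and the keep-every-n-th-triangle "decimation" are hypothetical stand-ins for real tessellation and mesh-reduction tools, chosen only to show the shape of such a migration step, not the converters actually used.

    from dataclasses import dataclass, field
    from typing import List, Tuple

    Vertex = Tuple[float, float, float]

    @dataclass
    class VEMesh:
        # triangle mesh plus a simple material record for the VE renderer
        vertices: List[Vertex]
        triangles: List[Tuple[int, int, int]]
        material: dict = field(default_factory=dict)

    def migrate_part(vertices, triangles, surface_props, lod_ratios=(1.0, 0.5, 0.25)):
        # build several levels of detail from one tessellated CAD part; the
        # "decimation" just keeps every n-th triangle as a placeholder for a
        # real mesh-reduction algorithm, and copies the surface properties
        # (colour, reflectance) into the material used for texture/reflection mapping
        lods = []
        for ratio in lod_ratios:
            keep_every = max(1, round(1.0 / ratio))
            reduced = triangles[::keep_every]
            lods.append(VEMesh(vertices, reduced, dict(surface_props)))
        return lods

    # hypothetical usage: a single quad split into two triangles
    verts = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (1.0, 1.0, 0.0), (0.0, 1.0, 0.0)]
    tris = [(0, 1, 2), (0, 2, 3)]
    lods = migrate_part(verts, tris, {"colour": (0.7, 0.7, 0.8), "reflectance": 0.3})
    print([len(m.triangles) for m in lods])   # triangle counts per LOD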


9-10

5. Examples from actual research projects

Some interesting projects will be described in the following which constitute important steps on the way to the goal: "Virtual product design".

5.1. VirtualANTHROPOS ... virtual man in cyberspace

The presence or telepresence of the user in the virtual environment is decisive not only for distributed applications; the majority of interactions at least require the presence of a part of the user's body in order to be able to manipulate the objects (e.g. virtual prototypes) directly. The Fraunhofer Institute for Industrial Engineering developed an anthropometric 3D man model with real-time capability for virtual environments - VirtualANTHROPOS™ - which can be directly manipulated by the user by means of an electromagnetic tracking system. A decisive VR system component is thus available with which the presence of the user in the virtual environment is represented on the basis of an anthropometric database.

VirtualANTHROPOS™ is a software module capable of realistically representing human bodies in virtual environments. Contrary to existing man models, VirtualANTHROPOS™ has real-time capability, i.e., the computation and representation of the human body is done in a split second. At the same time, VirtualANTHROPOS™ is fully movable: the degrees of freedom of all human joints are part of an anthropometric database which serves as a basis for the kinematic simulation. VirtualANTHROPOS™ permits three applications of virtual reality:

Virtual ergonomic prototyping

The goal of virtual prototyping is the substitution of real prototypes by virtual prototypes, in order to achieve advantages in time, cost and quality in product development. Contrary to real prototypes, virtual prototypes permit a judgment of ergonomic quality only conditionally without a man model. VirtualANTHROPOS™ supports the testing of ergonomic parameters at a very early time in planning.

Tactile-based sensors
  Precision: 1-50 µm
  Sampling rates: 0.1-100 points/sec
  Advantages: simple, robust technology; short training phase
  Disadvantages: radius of the sensor; tactile force > 0.1 N

Optical-based sensors
  Precision: 25-500 µm
  Sampling rates: 1,000-50,000 points/sec
  Advantages: digitalization of flexible surfaces; non-tactile sensor
  Disadvantages: influenced by the optical characteristics of the surface

Table 3: Characteristics of digitalization


9-11

Motion capturing

VirtualANTHROPOS™ permits body postures and motions of man to be captured, stored and manipulated by means of only six measuring points. The data can then be used for computer animations, for ergonomic analysis or for the control of autonomous virtual man models.

Telepresence

Networked virtual reality means the meeting of several users in a virtual environment. To permit the users to observe each other even at great distances, they are represented by VirtualANTHROPOS™ (Fig. 3).

VirtualANTHROPOS™ was presented to the public for the first time at the Munich Systems '96 and was given the "Best-of-Systems" award by the US magazine BYTE. VirtualANTHROPOS™ is a joint development by IST GmbH and the Fraunhofer Institute for Industrial Engineering. Virtual assembly planning, the strategic self-research project (SEF), is a first field of application. In this project, the assembly of products in virtual reality is planned on the basis of CAD data. The VIRTASS project described in the following already translates first results of it into industrial application. The objective of the Production Management and Virtual Reality Competence Center, which is responsible for the "virtual assembly planning" project, is the development of new methods for the effective planning of assembly operations with the assistance of virtual reality. A first prototype of a virtual assembly planning system was presented at the Systems 1996 and was given the Best-of-Systems Award by the magazine BYTE in conjunction with the man model VirtualANTHROPOS™.

Fig. 3: Telepresence with VirtualANTHROPOS™


9-12

The SEF project is conducted under the Virtual Reality Demonstration Center of the Fraunhofer Society, in which, apart from the Fraunhofer Institute for Industrial Engineering, the Fraunhofer Institutes for Building Physics (IBP), Computer Graphics (IGD) and Production Engineering and Automation (IPA) are participating, too.

5.2. VIRTASS (Use of virtual reality for optimization of the development, manufacturing and assembly process)

The VIRTASS project is a KTI research project (Commission for Technology and Innovation of the Swiss Department of Economy). Project partners are the Institute for Technology Transfer of the Engineering School of Bern, ASCOM Business Systems AG (Switzerland) and Silicon Graphics (Switzerland).

Object of the project

Virtual Reality (VR) is regarded today as one of the future man-machine interfaces. VR is a novel user interface intended to permit real three-dimensional access to computer data.

The requirements made on VIRTASS are governed by the needs of ASCOM Business Systems AG. ASCOM develops and manufactures cordless telephones. From the viewpoint of 'rapid product development', the product development time and the product starting time are to be shortened. For optimization of the various development phases, the technology of virtual reality (VR) is to be used and primarily applied to the following points:

Optimization of the cordless design phase, i.e., quicker assessment of the design and, thus, earlier start of actual development; optimization regarding the ability of assembly; simulation of operation and functionality. The number of physical prototypes can be reduced by means of virtual prototypes.

Fig. 4: Presence of the user with VirtualANTHROPOS™ for interaction


9-13

Optimization of the manufacturing and assembly process by using simulation and VR. Integration of the virtual cordless into virtual manufacturing. Virtual simulation and assessment of various assembly variants.

Early, selective training and introduction of workers; virtual training.

A coupling of the VR kernel Lightning, developed at the Fraunhofer Institute for Industrial Engineering, to production planning tools like ERGOPlan™ from Delta Industrie Informatik is designed and realized. Workplaces planned with ERGOMas™, for example, lend themselves to direct visualization in VR.

By adopting CAD product data and by using a virtual man model, it will be possible in connection with a VE to evaluate real assembly processes at virtual workplaces. A special challenge in this project is the development of an intuitive and effective 3D user interface with which the user will be able to control the VR system from his or her visual workplace.

Fig. 5: Virtual representation of assembly workplaces


9-14

5.3. Development of innovative products - Virtual Reality as a Tool for Design, Evaluation and Co-operation

The new special research area which the German Research Society (DFG) has chosen to occupy itself with is the topic of "rapid prototyping." Within the limits of this project, IAO has realized, and continues to research, the transformation of a multimedia or hypermedia working environment into a virtual working environment. The virtual working environment is applicable, for the duration of the proposal, to the tasks of designing freeform surfaces and evaluating product features. The basic knowledge for the aforementioned tasks will be offered through the hypermedia information structure from another sub-project. In the definition of the information structure, the following items must be taken into consideration: the requests VR makes on the product data, the exchange of information in communication, as well as the further presentation of knowledge within the active semantic network. The theme-integrating development of products, as well as the increasing complexity of the cause-action chains, requires new human-machine interfaces; these interfaces must have a high degree of clarity and allow interaction in the form of human-like behavior through computer-aided tools.

For the sake of a quick development process, the definition of geometry must in principle be closely linked to the evaluation of particular product features that are influenced by the geometry. For this reason, design and evaluation form iteration loops which reach a higher level of design of the prototype after each iteration step. The tools necessary for the execution of the described tasks in the virtual environment are being researched, conceived, and developed within the scope of this sub-project. The fundamentals of the representation of the actor are also researched. The representation model, which is based on the above-mentioned fundamentals, is integrated into the virtual working environment. The tools for design and evaluation represent the first basis for an interdisciplinary project of experts on the basis of CSCW in a virtual working environment. A user guidance system, adapted to this environment, has to be developed for system control.

The virtual design tools for generation, modeling, and manipulation of complex geometries have to combine the advantages of manual tools with the advantages of conventional computer-aided data management and data representation (Fig. 6). The project steps of the manual and computer-aided work processes are analyzed before the development of the virtual design tools. The result of the analysis is an assessment of the individual project steps to allow an agreement on which working processes, from manual as well as computer-aided work, can be transferred to VR.

The generation of the outer contours, on a real model or in a computer-aided procedure such as CAD, can occur manually in an additive as well as a subtractive way. The sensor-motoric capabilities of the human being are optimally applied in the manual production of the geometry. The visual control of the working results occurs directly during the respective project steps. The digitalization of the model geometry data by means of 3D coordinate measuring techniques for the computer-aided evaluation of the product characteristics additionally demands a higher time expenditure. Up to now, the iteration between modeling and simulation, as well as evaluation, has run sequentially, requiring this higher time expenditure. The computer-aided generation of geometry is already accomplished on the basis of computer geometry data. The computer-aided design tools presently available on a CAD basis force a strongly abstracted working method. This disadvantage becomes particularly apparent in the generation and modeling of freeform surfaces, which are produced through such representations as non-uniform rational B-splines (NURBS).


9-15

In addition to the already existing working processes, further processes are to be derived through heuristic procedures. These processes result from the additional possibilities of a virtual working environment. The result of this first part of the project is a definition of the work process for the design of freeform surfaces in VR. The basic principles of the individual virtual tools are also defined along with the working process. Thereby, a classification system relating the tools to the various input and output devices is created. In the following, the requirements regarding the input and output devices must be precisely specified and analyzed, considering the possibility of fulfilling such requirements. This occurs on the basis of an efficiency analysis. After the conclusion of the general analysis, the project steps of the design process ascertained as ideal can be related to the corresponding input/output tools.

Besides the tools for design and evaluation, communication as well as cooperation in the virtual environment is a primary requirement; the representation of the actor, the CSCW basis for an interdisciplinary project of experts, and the adapted user guidance system described above serve this purpose as well.

A scenario of "Corporate Virtual Workspaces" has already been demonstrated (Fig. 7). This context includes several approaches that allow a distribution of work on the same object through VR. Restricted views of the object, which are dependent on the performed task, as well as the locking mechanisms for simultaneous work on the same part of the object, are outlined. The coupling of several virtual working environments and the resulting demand for suitable communication structures are reached with the conversion of the results from other sub-areas.

A transition from a multimedia or a hypermedia computer-aided working environment into a virtual working environment (the hardware and software environment of a VR system) is accomplished through the described system for co-operative, virtual planning and design. To reach shorter iteration cycles and to optimize the communication between design and evaluation, the Computer Supported Cooperative Work concept (CSCW concept) will be carried over into VR. VR user interfaces

Fig. 6: Manipulation of complex surfaces


9-16

are integrated to enable user control. To extend the two components - visualization and interaction - an acoustic, tactile and haptic simulation of the objects can be used in addition, to improve the scope of experience with the object to be developed at an early development stage. The user can thus enter the virtual worlds generated by means of computer-assisted methods and walk through, experience and interact with the incorporated objects in three-dimensional space.

The major goal of the IAO in this field of research is the development of a new process. This process connects virtual and physical prototypes by intelligent cooperation and communication systems for the validation of design and the evaluation of product properties.

6. Conclusion and Future Work

In future, the development of products will mainly be based on virtual prototypes. Many of the virtual prototype characteristics and aspects can only be evaluated in a virtual environment, where the developer or customer will have an adequate perception of them. Changes to the virtual prototype can be quickly reflected in the digital model without physical effort - a big advantage for the engineering process. This same model can be used in different ways to get feedback on various aspects of the design and its performance. VE-based tools are being used as a vehicle for clear communication and understanding, e.g. for seeing assemblies and how they will fit together without a real physical prototype.

The research still to be done focuses on two main areas: the structures and principles of VR systems (real-time interaction, data structures, databases, data exchange) and the development of new applications (functionality, human-computer interfaces). The Fraunhofer IAO will be represented in both areas.

7. Acknowledgments

This work is the effort of many very talented people over the last three years; we mention here only a subset, but express our gratitude to all involved.

Special thanks, in alphabetical order, to: Roland Blach, the head of the software development group "Lightning"; Matthias Bues, for his coding work; Jürgen Landauer, for all the Tcl/Tk experiments; Angela Rösch, for some of the underlying concepts of Lightning; Andreas Rössler, for

Fig. 7: Demonstrator for manipulating an object in VR


9-17

giving birth to VirtualANTHROPOS™; Andreas Simon, for the enhancements of the rendering; Elke Stimpfig, for her work in the DECADE projects and the hints from the industrial design view.

We would also like to thank the German Research Society for the funding of the Special Research Area 374 and the Fraunhofer Society for the funding of the Demonstration Center for Virtual Reality, in which most of this work was done.

8. Literature and URL

/1/ Leston, J.; Ring, K.; Kyral, E.: Virtual Reality: Business Applications, Markets and Opportunities. London: OVUM Reports, Ovum Ltd., 1996.

/2/ Fritze, T.; Riedel, O.: Simulation of complex architectural issues in virtual environments. In: IGD conferences & seminars; Mecklermedia; Fraunhofer-Institut für Produktionstechnik und Automatisierung (IPA); Fraunhofer-Institut für Arbeitswirtschaft und Organisation (IAO): Virtual Reality World '96: Conference Documentation; Stuttgart, 13-15 February 1996. München: Computerwoche Verlag, 1996, pp 1-7.

/3/ May, F.; Stahs, T.: Virtual Reality für Entwicklungs- und Produktionsprozesse bei Daimler-Benz. In: Forschungszentrum Karlsruhe; Heinz Nixdorf Institut: Fachgesprächsunterlagen; Paderborn, 25./26. Februar 1997. Paderborn: Heinz Nixdorf Institut, 1997, o.Z.

/4/ Schmitt, G.: Virtual Reality in Architecture. In: IGD conferences & seminars; Mecklermedia; Fraunhofer-Institut für Produktionstechnik und Automatisierung (IPA); Fraunhofer-Institut für Arbeitswirtschaft und Organisation (IAO): Virtual Reality World '95: Conference Documentation; Stuttgart, 21-23 February 1995. München: Computerwoche Verlag, 1995, pp 113-124.

/5/ Bryson, S.; Johan, S.; Schlecht, L.: An Extensible Interactive Visualization Framework for the Virtual Windtunnel. In: IEEE Computer Society Press, The Institute of Electrical and Electronics Engineers Inc.: Virtual Reality Annual International Symposium '97: Conference Documentation; Albuquerque, 1-5 March 1997. Washington, Brussels, Tokyo: IEEE Computer Society Press, 1997, pp 106-113.

/6/ Durlach, N.: VR Technology and Basic Research on Humans. In: IEEE Computer Society Press, The Institute of Electrical and Electronics Engineers Inc.: Virtual Reality Annual International Symposium '97: Conference Documentation; Albuquerque, 1-5 March 1997. Washington, Brussels, Tokyo: IEEE Computer Society Press, 1997, pp 2-3.


9-18

/7/ Grimsdale, C.: VR Applications, Technologies and Business Benefits. In: IGD conferences & seminars; Mecklermedia; Fraunhofer IPA; Fraunhofer IAO: Virtual Reality World '95: Conference Documentation; Stuttgart, 21-23 February 1995. München: Computerwoche Verlag, 1995, pp 431-434.

/8/ Purschke, F.; et al.: Virtual Reality - Accelerating Vehicle Development. In: Gale, L. (Editor): Prototyping Technology International '97. New Malden (UK): Williamson's, 1997, pp 74-79.

/9/ Kress, H.: Integration Aspects within a Virtual Prototyping Environment. In: Rolf D. Schraft (Hrsg.); M. Munir Ahmad (Hrsg.); William G. Sullivan (Hrsg.); Fraunhofer-Institut für Produktionstechnik und Automatisierung (IPA); CIM System Research Center; Department of Industrial and System Engineering: Flexible Automation and Intelligent Manufacturing 1995: Proceedings of the 5th International FAIM Conference, June 28-30, 1995, Stuttgart. New York, USA; Wallingford, UK: Begell House, 1995, pp 326-337.

/10/ Angster, S.; Jayaram, S.: Open Architecture Framework for Integrated Virtual Product Development Systems. In: Bergamasco (ed.): FIVE '96, Framework for Immersive Virtual Environments: Proceedings of the Conference of the FIVE Working Group; 19-20 December 1996. Pisa: PERCRO, Scuola Superiore S. Anna, 1996, pp 84-95.

/11/ Spath, D.; Osmers, U.; Guinand, P.: Virtual Engineering: 3D-Projektierung und Simulation komplexer Produktionssysteme am Beispiel SPS-gesteuerter Anlagen. In: Industrie Management, Innovative Strategien für die Produktion (1997) Nr. 1/97, pp 38-51.

/12/ Scheer, A.-W.; Leinenbach, S.: Gesch... Virtual Reality? In: Informations Management (1996) Nr. 4/96, pp 73-76.

/13/ Gausemeier, J.; Bohuszewicz, O. v.; Ebbesmeyer, P.; Grafe, M.: Gestaltung industrieller Leistungserstellungsprozesse mit Virtual Reality. In: Industrie Management, Innovative Strategien für die Produktion (1997) Nr. 1/97, pp 33-37.

/14/ Ellis, S. R.; Breant, F.; Menges, B.; Jacoby, R.; Adelstein, B. D.: Factors Influencing Operator Interaction with Virtual Objects Viewed via Head-mounted See-through Displays. In: IEEE Computer Society Press, The Institute of Electrical and Electronics Engineers Inc.: Virtual Reality Annual International Symposium '97: Conference Documentation; Albuquerque, 1-5 March 1997. Washington, Brussels, Tokyo: IEEE Computer Society


9-19

Press, 1997, pp 138-145.

/15/ Odegard, O.: Social Interaction in Televirtuality. In: IGD conferences & seminars; Mecklermedia; Fraunhofer-Institut für Produktionstechnik und Automatisierung (IPA); Fraunhofer-Institut für Arbeitswirtschaft und Organisation (IAO): Virtual Reality World '96: Conference Documentation; Stuttgart, 13-15 February 1996. München: Computerwoche Verlag, 1996, pp 1-7.

/16/ Pandzic, I. S.; Capin, T. K.; Magnenat Thalmann, N.; Thalmann, D.: Towards Natural Communication in Networked Collaborative Virtual Environment. In: Bergamasco (ed.): FIVE '96, Framework for Immersive Virtual Environments: Proceedings of the Conference of the FIVE Working Group; 19-20 December 1996. Pisa: PERCRO, Scuola Superiore S. Anna, 1996, pp 37-47.

/17/ Capin, T. K.; Pandzic, I. S.; Magnenat Thalmann, N.; Thalmann, D.: Virtual Humans for Representing Participants in Immersive Virtual Environment. In: Mel Slater (ed.): FIVE '95, Framework for Immersive Virtual Environments: Proceedings of the Conference of the FIVE Working Group; 18-19 December 1995. London: QMW University, 1995, pp 135-150.

/18/ Shawver, D. M.: Virtual Actors and Avatars in a Flexible User-Determined-Scenario Environment. In: IEEE Computer Society Press, The Institute of Electrical and Electronics Engineers Inc.: Virtual Reality Annual International Symposium '97: Conference Documentation; Albuquerque, 1-5 March 1997. Washington, Brussels, Tokyo: IEEE Computer Society Press, 1997, pp 170-179.

/19/ N. N.: Management Overview of the First Project Phase. AIT Consortium (Editor): ESPRIT Project 7704 Advanced Information Technology in Design and Manufacturing, 1994.

/20/ Kress, H.; Jasnoch, U.; Stork, A.; Quester, R.: Eine plattformübergreifende Umgebung zur kooperativen Produktentwicklung. In: Proceedings zur GI-Fachtagung CAD '96 "Verteilte und intelligente CAD-Systeme", Kaiserslautern, 7./8. März 1996.

/21/ Dani, T. H.; Gadh, R.: COVIRDS: A Framework for Conceptual Shape Design in a Virtual Environment. NSF Grantees Symposium 1996. http://smartcad.me.wisc.edu/~tushar/nsf/nsf.html

/22/ Lüddemann, J.: Virtuelle Tonmodellierung für durchgängiges Industriedesign. In: Proceedings zum


9-20

Fachgespräch ... der industriellen Produktion. Paderborn, 25./26. Feb. 1997, o.Z.

/23/ Haug, E. J.; Kuhl, J. G.; Tsai, F. F.: Virtual Prototyping for Mechanical System Concurrent Engineering. In: Concurrent Engineering: Tools and Technologies for Mechanical System Design, Haug, E. J. (Ed.), 1993.

/24/ Brown, R. G.: An Overview of Virtual Manufacturing Technology. In: Advisory Group for Aerospace Research & Development: Proceedings of AGARD; May 6-10, 1996. Sesimbra, Portugal: AGARD, 1996, pp 1-11.

/25/ Dai, F.; Felger, W.; Frühauf, T.; et al.: Virtual Prototyping, Examples for Automotive Industries. In: IGD conferences & seminars; Mecklermedia; Fraunhofer-Institut für Produktionstechnik und Automatisierung (IPA); Fraunhofer-Institut für Arbeitswirtschaft und Organisation (IAO): Virtual Reality World '96: Conference Documentation; Stuttgart, 13-15 February 1996. München: Computerwoche Verlag, 1996, pp 1-7.

/26/ N. N.: ISO 10303 STEP - Standard for the Exchange of Product Model Data.

/27/ Grabowski, H.; Anderl, R.; Erb, J.; Polly, A.: STEP - Grundlage der Produktdatentechnologie - Die Anwendungsprotokolle. In: CIM Management, Produkte, Strategien, Entscheidungshilfen (1994), Nr. 6/94, pp 45-49.

You can find this paper and many links to WWW sources at:

http://vr.iao.fhg.de/vr/information/Publications/SIGGRAPH97