EUROGRAPHICS '99 / P. Brunet and R. Scopigno (Guest Editors)    Volume 18 (1999), Number 3

The Hybrid World of Virtual Environments

Shamus Smith and David Duke
HCI Group, Department of Computer Science, The University of York, Heslington, YO10 5DD, York, United Kingdom
(shamus, duke)@cs.york.ac.uk

Mieke Massink
CNR - Istituto CNUCE, Via S. Maria 36, I-56126 Pisa, Italy
[email protected]

Abstract

Much of the work concerned with virtual environments has addressed the development of new rendering technologies or interaction techniques. As the technology matures and becomes adopted in a wider range of applications, there is, however, a need to better understand how this technology can be accommodated in software engineering practice. A particular challenge presented by virtual environments is the complexity of the interaction that is supported, and sometimes necessary, for a particular task. Methods such as finite-state automata which are used to represent and design dialogue components for more conventional interfaces, e.g. using direct manipulation within a desktop model, do not seem to capture adequately the style of interaction that is afforded by richer input devices and graphical models. In this paper, we suggest that virtual environments are, fundamentally, what are known as hybrid systems. Building on this insight, we demonstrate how techniques developed for modelling hybrid systems can be used to represent and understand virtual interaction in a way that can be used in the specification and design phases of software development, and which have the potential to support prototyping and analysis of virtual interfaces.

Keywords: Virtual environments, hybrid systems, interaction techniques, VE design, HyNet.

© The Eurographics Association and Blackwell Publishers 1999. Published by Blackwell Publishers, 108 Cowley Road, Oxford OX4 1JF, UK and 350 Main Street, Malden, MA 02148, USA.

1. Introduction

Advances in graphics hardware and software 1, 11, 23 have been a major factor in the development of virtual environments (VEs), and have led to innovative systems and novel interaction techniques. Although there are a number of impediments, e.g. available input devices, it is not unreasonable to expect the technology of virtual environments to diffuse into a wider range of software products. This process is being enabled in part by the development of 'generic' virtual reality (VR) toolkits such as dVise and SuperScape. It is difficult to find reports that detail the process used to develop virtual environments, but given the maturity of the technology it would seem reasonable to suggest that prototyping and exploratory development play a significant role. However, if or when the technology of virtual environments becomes adopted in mainstream software systems and products, exploratory approaches become rather less attractive. Software developers must be concerned with making use of the most appropriate technology in a way that meets the requirements of the client, including quality criteria such as usability, robustness, maintainability, error-tolerance, etc. For an overview of software engineering methods, see 26 for example. Our view is that the interface technology and complexity of interaction is one feature of virtual environments that will be particularly problematic for software developers. Another is how the presentation of the environment, i.e. the 'virtual world' itself, should be designed to, for example, support user embodiment and engagement. Other aspects of virtual reality, for example those concerned with rendering, are of course non-trivial. However, VR toolkits do provide pre-built rendering capabilities, and where rendering has to be realised through bespoke software, the actual design problem is within the bounds of existing software engineering methods.

This paper is intended as a step towards understanding how software engineering practice can be adapted to deal with the complexities of interaction in virtual environments. While a tight coupling between input and feedback is an important aspect of virtual interfaces, we are concerned here mainly with representing the 'flow' of interaction between user and application. This work is being carried out as part of the INQUISITIVE project 6, a three year research effort funded by the UK Engineering and Physical Sciences Research Council between groups at The University of York and the CLRC Rutherford Appleton Laboratory (RAL).

The aim of the project is to develop methods and principles that can be used to improve the design of interfaces for virtual environments. We see this as progress towards bridging the gap between the requirements (domain, user, etc.) of a system and the final implementation of a VR application. We are particularly interested in looking at how end-user requirements on the interface can be implemented via a VR toolkit. Recently, several toolkits for virtual environments have been developed, for example 18, 3, 14. These provide a toolkit layer to insulate the VE application designer from the low-end VE implementation issues.

We describe the abstraction of interaction models for VE design and show how these models can then be subsequently specified to increasingly detailed descriptions, thus moving the design closer to implementation. At an abstract level, the interaction models can be refined in step with components of a toolkit and can be used as generic interaction examples for what an interaction toolkit can provide.

The work reported in this paper concerns the 'middle' of this mapping from requirements to implementation, a way of describing and understanding what particular interaction techniques support, and consequently how or whether they are appropriate for supporting given tasks in an application. An overview of this development process can be seen in Figure 1.

The advent of direct manipulation gave rise to design tools such as state-transition systems and event models to represent dialogue. These however seem difficult to apply to the richer class of interaction techniques that are supported by virtual environments. Instead, descriptions of virtual environments are typically informal, and all too often incomplete. Where they are given, descriptions are either entirely text based, using natural language, or at best are augmented with ad-hoc high-level diagrams. This in part reflects the exploratory nature of VR design, mentioned earlier, and in part the intrinsic difficulty of describing interaction. Although informal descriptions may be appropriate for general descriptions of environments, systematic implementation and analysis of systems requires more detailed and better defined specifications. As Jacob 7 notes in his work on a visual language for non-WIMP user interfaces, the prototype-driven approach that is used at present can make it difficult to develop, share and reuse the interface of a virtual environment.

Figure 1: Virtual environment development overview.

The very nature of virtual environments contributes to the difficulty of describing and modelling interaction. Typically, they are a collection of static and dynamic components 7, 25, are extensively visually based and have a non-linear process and control flow. Also, there are the considerations of the separation between what is system based and what is user based, and how the timing of operations in the environment is to be handled. The suggestion in this paper is that VR calls for a more powerful model of interaction based on hybrid models, in which a system has both discrete and continuous components. A more detailed study of continuous interaction techniques, and in particular the cognitive aspects of the use of these techniques, is in parallel being investigated in the European TMR project TACIT 27 in which two of the authors are involved (DD, MM).

Continuous interaction techniques are techniques in which the interaction, or part thereof, evolves smoothly in time, such as for example in audio and video based techniques. Research on a proper graphical formalism that can assist interface designers during many phases of the design process is one of the topics of TACIT. Such a formalism should ideally be suitable to express both system oriented aspects as well as user oriented and cognitive aspects of the interface. The formalisms and specifications discussed in this paper show the use of two notations developed within the area of hybrid systems theory.

The remainder of the paper is organised as follows. In Section 2 we consider briefly the meaning of 'interaction' in the context of virtual environments and address the importance of modelling interaction, and in Section 3 we investigate how this modelling can be done using a hybrid systems approach. The first of two example interaction techniques is then introduced, together with a description of a VE modelling technique based on a hybrid systems approach. Issues involved in the definition of user and system based models are considered and an approach to a more detailed VE description is described in outline.

2. Modelling interaction in virtual environments

Research into virtual reality and virtual environments has been predominantly led by the development of new technologies. Although many of these technologies have matured, there are still many issues about virtual environments that remain unanswered.

One of these issues is how the discrete event-based elements and the continuous visual components of interaction can be expressed. Virtual environments provide the user with an interaction environment which is fundamentally different to traditional computer systems. In VEs, the user is an active participant within the system. Traditional computer systems reinforce the man/machine barrier where the application is within the machine and the users can only interact with it from the outside. VEs provide a new level of interactivity as the separation of the user and the system becomes less clear. In immersive systems, the user is surrounded by the environment and can interact directly with components within the environment. This type of interaction requires not only new technologies, in the form of novel input devices (for example spaceballs, 3D mice, datagloves and vision tracking), but descriptions of interaction techniques (for example Head-Butt Zoom 17, Go-Go interaction 21, Head Crusher Select 20 and Two Handed Flying 17) which are to be mapped onto these devices.

There are many factors that make precise interaction difficult in the virtual world. Mine 15 notes that many virtual worlds lack haptic feedback (something we take for granted in the real world). While haptic rendering is becoming practical for some tasks, it is still far from practical for general tasks in virtual environments, such as navigation. Mine also observes that current alphanumeric input techniques (what we use for precise interaction in the computer world) are ineffective for virtual worlds. He suggests that we must learn how to interact with information and controls distributed about the user instead of focused on a terminal in front of him/her. If natural forms of interaction can be identified, and possibly extended in the VE, then more usable VE interfaces can be constructed 15.

Modelling interaction is important from the perspective of both the user and the designer of VEs. However, users and designers have different requirements on such models. The users require interaction techniques which allow them to complete interaction tasks in a particular application, and designers wish to build systems that make the required interaction possible. In the context of research tools, designers and users are closely bound. Typically, system designers are either the main users or closely associated with, or working in the same domain as, the target users. However, as VR technologies reach a larger audience, new ways of capturing user requirements explicitly for designers will be needed.

Several problems have been identified when trying to describe interaction techniques. Firstly, the informal description of interaction techniques means that there are no obvious ways to evaluate whether two interaction techniques are the same. This can lead to every new design 're-inventing' the same basic interaction techniques. Secondly, the use of informal descriptions can result in a lack of consistency between interaction techniques. Also, without a firm design specification, consistency between techniques may be lost after technique modification during an implementation. Typically, informal descriptions leave room for ambiguity and require extensive customisation before they are in an implementable form. If there is no access to the original design team, then different implementors may make their own arbitrary design decisions. This would be highly undesirable if consistency and standardisation are required between applications.

Thirdly, vague descriptions in natural language do not lend themselves to rigorous analysis, and the comparison of different techniques may be impossible. With no way to judge and compare techniques, how is the designer of a VE to make an informed decision on what interaction techniques are appropriate for any given task or application? If the descriptions are left informal, there is no guarantee that the final implemented system will be close to the original design.

At an initial stage of design it would be desirable if there was a useful way of 'sketching' the flow of an interaction at a high level of abstraction, for requirements and specification. This would provide a basis for pre-implementation evaluation of the environment and could then be developed into a more detailed model for mapping onto an implementation model.


3. Why use hybrid systems?

The description of virtual environments is a non-trivial task. VEs are dynamic environments and, due to their continuous and highly visual nature, defining salient and useful aspects of them is extremely difficult. By continuous, we mean the user's view of the virtual environment. The system's behaviour may be able to be broken down into discrete modes of interaction, but the user is engaged with a continuous view of the environment.

From a designer's/implementor's point of view, there are several aspects of VEs which require explicit definition: namely, the virtual environment, the user interface, the interaction processes, the physical interaction devices and the user's cognitive model. These elements of VEs make up a complex model of discrete and continuous processes. Also, each individual element can be thought of as a mixed model in its own right. Thus, traditional discrete modelling techniques are not appropriate for VEs. Trying to force the models into pure continuous descriptions is also undesirable since VEs, like other digital systems, tend to exhibit discrete 'moding' behaviour. A model needs to make explicit those aspects of a system that are important to the modeller. For VEs, these aspects are both discrete and continuous.

Systems which combine discrete and continuous elements have been objects of study for a long time, first within the systems engineering community, and more recently, through the growth of embedded systems and process control, the computer science community. In both communities, the term 'hybrid systems' is used for systems consisting of a mixture of discrete and continuous components. Typically, hybrid systems are interactive systems of continuous devices and digital control programs 9.
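For readers unfamiliar with the term, one standard formulation (not taken from the notations used later in this paper, and included here only as an illustration) is the hybrid automaton that underlies tools such as HyTech 5:

\[
H = (Q,\; X,\; \mathit{Init},\; f,\; \mathit{Inv},\; E,\; G,\; R)
\]

where \(Q\) is a finite set of discrete modes, \(X \subseteq \mathbb{R}^n\) is the continuous state space, \(\mathit{Init}\) the set of initial states, \(f : Q \times X \to \mathbb{R}^n\) defines the flow \(\dot{x} = f(q, x)\) followed while the invariant \(\mathit{Inv}(q)\) holds, and each edge \(e \in E \subseteq Q \times Q\) carries a guard \(G(e) \subseteq X\) and a reset relation \(R(e)\) describing the discrete jumps. The discrete modes and jumps correspond to the 'moding' behaviour discussed above, while the flows capture the continuously evolving parts of the interaction.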

What is proposed here is that a hybrid of different modelling components be used to mirror the hybrid nature of VEs. The current work follows present trends in the hybrid systems literature 4, 28, 30. We have developed a semiformal notation to aid in the description of virtual environments 25. The consideration of virtual environments as hybrid systems seems to be a natural step towards a more detailed VE specification 25. However, the intention of the current work is not to define and champion yet another notation for modelling, but to select basic features common to notations which seem particularly relevant to virtual environment description.

Research into this unknown territory requires careful consideration. We feel that beginning research at an abstract modelling level is appropriate as it allows us to initially consider VE modelling at an informal level of rigour. In the context of the current project, we wish to be able to describe what is required, but at a level that is independent of any particular VR toolkit. This will hopefully lead to the development of portable modelling techniques which can then be specified at higher levels of rigour if required. In this way, the development of VEs can be considered a refinement process as the design of a system is specified at increasingly formal levels of rigour until it can be implemented through a VR toolkit (see Figure 1).

For the remainder of this paper, we describe the use of a hybrid systems approach to two example interaction techniques. The first technique is from an immersive VE while the second technique is based in a desktop VR and is described both in a graphical flow notation and in HyNet, a recently developed hybrid extension of High-level Petri Nets 8, 24. We present two hybrid system based notations as an example of how moving from a semiformal high-level notation to the start of a formal description can be achieved. This is an initial step towards more formal descriptions for virtual environments.

4. From flying hands to flying cameras

Navigation, object selection and object manipulation are three fundamental tasks within virtual environments. Within the VE literature there are many novel interaction techniques that have been developed to support these standard VE features. Many of these techniques have been based around new technology in an attempt to capture the imagination of the users and potential users. The use of head mounted displays (HMDs) and glove based input devices have become synonymous with many descriptions of VEs.

Flying is a navigation technique which has been popular in many virtual environment implementations because it avoids the need to take surface terrain into account. Typically, this technique is based around a flight vector calculated by the angle between the user's hand and head positions. However, there are several disadvantages to this technique. Like other gesture based interaction techniques it can cause arm fatigue with continuous use. User disorientation can also come through misunderstanding the relationship between hand orientation and flying direction 15.

4.1. Flying hands

Two Handed Flying (THF) 17 is a specialized type of flying which exploits proprioception, the person's sense of the position and orientation of their body and limbs. The direction of flight is defined by the vector between the user's two hands and the flight speed is specified by the distance between the user's hands. Flight is stopped by moving the hands into a dead zone, a minimum hand separation.
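To make the mapping concrete, the following is a minimal sketch of one flight update step under this scheme. It is our own illustration, not code from the CHIMP system; the time step, gain and dead-zone values are assumed parameters (the 0.1 m dead zone follows the description quoted below).

    import numpy as np

    def thf_update(position, left_hand, right_hand, dt, dead_zone=0.1, gain=1.0):
        """One step of Two Handed Flying: flight direction is the vector between
        the hands, speed is proportional to hand separation, and separations at
        or below the dead zone stop the flight.  dead_zone, gain and dt are
        assumed values used for illustration only."""
        v = np.asarray(right_hand, dtype=float) - np.asarray(left_hand, dtype=float)
        d = float(np.linalg.norm(v))
        if d <= dead_zone:                  # hands together: not flying
            return np.asarray(position, dtype=float)
        direction = v / d                   # unit flight vector between the hands
        speed = gain * d                    # speed proportional to hand separation
        return np.asarray(position, dtype=float) + speed * dt * direction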

Two Handed Flying was developed by Mine, Brooks and Sequin 17 to exploit user proprioception in virtual environment interaction at UNC-Chapel Hill using the Chapel Hill Immersive Modeling Program (CHIMP). CHIMP is a virtual environment application for the preliminary phases of architectural design. It includes both one and two-handed interaction techniques, tries to minimize unnecessary user interaction and takes advantage of the head-tracking and immersion provided by the virtual environment system 16. Two Handed Flying is one of several techniques that were used to show how intuitive gestures could be augmented to take advantage of proprioception. The following is Mine, Brooks and Sequin's description of Two Handed Flying.

We have found two-handed flying an effective technique for controlled locomotion. The direction of flight is specified by the vector between the user's two hands, and the speed is proportional to the user's hand separation. A dead zone (some minimum hand separation, e.g. 0.1 metres) enables users to stop their current motion quickly by bringing their hands together (a quick and easy gesture). Two-handed flying exploits proprioception for judging flying direction and speed 17.

Although this description provides a high level view of the interaction technique, it does not provide the level of detail which is required to develop an implementable specification or a model that is amenable to rigorous usability analysis.

A graphical notation has been proposed by Smith and Duke 25 which provides a small set of powerful operators that can be used to define a concise and expressive representation of interaction in VEs. This notation is based on the event/process structure that is used in Petri Nets 19. However, the Petri Net notation has been extended to provide for the definition of discrete and continuous components. This is important for VE descriptions as one of the features of VEs is that they are comprised of discrete and continuous components 7, 25. The user's position/embodiment in the environment, the updating of the viewpoint and input from VE input devices are all examples of continuous flows of information which are required to be modelled in VE specification. The current mode of an interaction, internal environment events and external I/O events are all possible discrete triggers for both discrete and continuous processes within VEs.

Figure 2 shows a representation of the Two Handed Flying technique in the graphical flow notation. A more detailed description of this notation is given elsewhere 25; here the notation is illustrated through this example.

In this model there are three external plugs (the hand positions plug is repeated on the diagram for clarity). These are the continuous flow from the hand positions and the boolean control arcs from the technique's enable and disable. A control arc signals a control dependency in the model. Initially, the user triggers the interaction by some, unspecified, enable mechanism (1) which is part of the application or the environment in which THF is used. This enables the start transition. This transition also has an inhibitor arc so that the interaction cannot get restarted while the user is currently flying. The start transition passes a token to the not flying state. The user will remain in this state until their hands are moved outside the THF dead zone. This condition is detected by a sensor on the hand positions flow (2). The sensor spans the flow and acts as a function from the flow content to a boolean.

Figure 2: Two Handed Flying hybrid model.

Once the user's hands are moved outside the THF dead zone, the active token is passed to the flying state (3). In this state a flow control is activated. The flow control acts as a 'valve' on the continuous loop for transforming the user's current position and speed. The continuous loop in this example is comprised of three components: the flow control (3), a transformer (4) and a store. A transformer applies a transformation to a flow to yield a modified content. In Figure 2 the update position, speed transformer takes the current values from the continuous flow and updates them with the current value on the hand positions flow (4). This is then passed to the store. A store is a source and repository for information that is consumed or produced by a continuous flow.

If the user's hands are moved back into the THF dead zone, a sensor on the user's hand positions would trigger a transition (5) back to the stationary position. Finally, while in either state, if the user wishes to exit the technique, a disable control arc can be triggered (6) which de-activates all states.

The diagram highlights the modes/states of the interaction and the events that cause the transitions between modes. Also, there is a clear separation of the discrete (the control processes in the bottom of the diagram) and the continuous (the continuous loop in the top of the diagram) processes.
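Operationally, the diagram can be read as a small program: a discrete mode variable switched by the sensors, and a continuous update that is only let through while the flying state holds a token. The sketch below is our own reading of Figure 2 (the tick length, dead zone and gain are assumed values), not something generated from the notation.

    import numpy as np
    from enum import Enum

    class Mode(Enum):
        NOT_FLYING = 0
        FLYING = 1

    def simulate_thf(hand_stream, position, dt=0.02, min_sep=0.1, gain=1.0):
        """Discrete moding (bottom half of Figure 2) gating the continuous
        position update (top half).  `hand_stream` yields one
        (left_hand, right_hand, disable) tuple per clock tick."""
        mode = Mode.NOT_FLYING                          # start transition (1)
        position = np.asarray(position, dtype=float)
        for left, right, disable in hand_stream:
            if disable:                                 # disable control arc (6)
                break
            v = np.asarray(right, dtype=float) - np.asarray(left, dtype=float)
            d = float(np.linalg.norm(v))
            if mode is Mode.NOT_FLYING and d > min_sep:     # sensor (2)
                mode = Mode.FLYING
            elif mode is Mode.FLYING and d <= min_sep:      # sensor (5)
                mode = Mode.NOT_FLYING
            if mode is Mode.FLYING:                     # flow control 'valve' (3)
                position += gain * v * dt               # transformer + store (4)
        return position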

One of the aims of the current project is to produce models of interaction techniques which can be reused in alternative VE descriptions as part of a VE development toolkit. In the next section, an interaction technique for a PC based desktop VE for a virtual television camera is defined and compared to the immersive Two Handed Flying description.

4.2. Flying cameras

The simulation of a television studio for training television set designers and camera crew is an environment which has the potential to take advantage of many of the features of VR. PC based desktop VR systems are currently commercially available 2 and provide a varied degree of interaction with their environment. Along with prop interaction in the virtual studio, there needs to be some way of navigating the current view of the environment. In this domain, this would involve the implementation of a virtual camera. The user should be able to position the camera to experiment with different views and explore alternative camera angles.

The use of an interaction technique like Two Handed Flying to implement the virtual camera may at first seem infeasible on a desktop platform as this particular technique was developed with proprioception as a fundamental property. Desktop VR systems are typically limited to an external user embodiment and a mouse and keyboard for user input. The mouse can be mapped onto an onscreen cursor to provide one 'virtual hand' position. To provide the second required hand position, a virtual object is required. This must be enabled and disabled. This toggling is typically done using mouse button combinations in PC based desktop VR. The following is a high level description of a virtual flying camera for a desktop VR simulation of a television studio 2.

The user initiates the flying camera mode by pressing the middle mouse button. When the mode is activated, a square (1cm by 1cm) appears at the current mouse pointer position. While the pointer remains within the square, the view remains stationary. Once the pointer is moved outside the square, the user's movement and speed is directly proportional to the angle and distance between the current pointer and the center of the square respectively. The flying mode is deactivated by a second press on the middle mouse button. Alternative movement is obtained by use of the other mouse buttons. For example, with no buttons down, up/down/left/right with the mouse is mapped to forward/back/pan left/pan right while with the left button down it is mapped to up/down/crab left/crab right.
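Read as pseudo-code, the description above amounts to a mode toggle plus a dead-zone mapping from the pointer-to-square offset to a view velocity. The sketch below is our own paraphrase of that text (the box size and gain are assumed values), not the ColtVR implementation.

    def camera_mode(left_down: bool) -> tuple:
        """Map the button state to the (horizontal, vertical) movement pair
        given in the description: no buttons -> pan left/right and forward/back,
        left button down -> crab left/right and up/down."""
        if left_down:
            return ("crab left/right", "up/down")
        return ("pan left/right", "forward/back")

    def camera_velocity(pointer, square_centre, half_size=0.005, gain=1.0):
        """Velocity of the view: zero while the pointer stays inside the square,
        otherwise proportional to the offset from the square's centre (direction
        from the angle of the offset, speed from its length)."""
        dx = pointer[0] - square_centre[0]
        dy = pointer[1] - square_centre[1]
        if abs(dx) <= half_size and abs(dy) <= half_size:   # inside the square
            return (0.0, 0.0)
        return (gain * dx, gain * dy)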

Figure 3 shows this description as a hybrid model.

There are two unique external plugs in Figure 3, the mouse and the square events. They are repeated on the diagram for clarity. The mouse flow provides the current position of the mouse and any button events. The square events flow provides any events which are associated with the flying square when it is onscreen. In this example, the continuous flow loop for updating the user's position is on the top of the diagram while the mode control within the technique is on the lower portion of Figure 3.

Figure 3: Flying camera hybrid model.

4.3. Discussion

The two presented examples are from two distinct domains and are implemented on completely different platforms, but are they that different? By modelling them in our notation it is possible to identify several common features that are shared between them.

By focusing on the separation of the discrete and continuous components of both diagrams it is possible to see that each has a continuous updating of the user's position which is managed by a discrete control structure. Each of these control structures depends on, firstly, an initial enablement and, secondly, a sensor to switch the mode of the interaction between the distinct states. Either the user is moving or the user is stationary. When the user is moving, this is the cue for updating the user's continuous position in the environment. This updating is done by a transformer based on the current position in space and the continuous flow of information from the user's current position (either the user's hand positions in THF or the mouse pointer to square position in the flying camera example).

Using abstract descriptions of the two techniques allows us to remove their superficial flesh and provides us with a 'skeleton' form which can be used to directly compare the techniques. In this case we can see that the two techniques have a similar structure. Although this is useful, it does not mean that the superficial flesh is unimportant, as it is what the user perceives.

This descriptive way to represent VE definitions allows questions to be asked about the design, and possible deficiencies in the design may be identified and fixed at an early stage. One of the main benefits is that the requirements and resources that are needed from and to the components of the VE are mapped out. For example, interaction triggers and mode enablement/disablement can be clearly defined on the diagram. The triggers for events (transitions) can be described for the discrete components of the diagram while the transformer element allows the nature of continuous flows to be specified.

Although the hybrid approach we have taken provides a clearer view of the interaction, it is only a step towards a description which can be mapped directly onto an implementation. Our approach provides a user specification for an interaction. What is also required is a system model of the same processes. For example, in the flying camera example, the user has two main modes: either they are moving the camera (flying) or they are stationary. When they are moving, the view is being transformed by both the current position of the mouse pointer and the status of the mouse buttons. In Figure 3 this is represented by the update position, speed and orientation transformer. However, in an actual system, the status of the mouse buttons represents discrete mode changes within the moving mode. Although these modes are transparent to the user, the system description will require this specification.

The hybrid model can be seen as a user specification of the interaction. It describes what the user is doing in the interaction and how their input affects the interaction. This is a design sketch or storyboard type model which provides a clear view of the user's expectations of the interaction. What is also needed is a system based approach which can be developed at a more detailed level and mapped directly onto an implementable model.

5. From design sketch to implementable model

The need for formalisms that can provide for a more detailed, system oriented description has been argued for in 10.

The use of formal models in the early phases of the software design cycle allows the production of a better first design than is possible with approaches that allow less systematic and precise analysis of a design before it is implemented. There is a need for better design methods as it has been empirically found that 40% of software errors are 'introduced' in the design phase, whereas with current approaches only 5% of those errors are discovered during the design phase 12. This leads to high costs as errors in the latter stages of the software development cycle are more expensive and difficult to correct 26.

However, the use of formalisms and models should not be seen as an alternative to approaches such as rapid prototyping for the development of interfaces, but rather as a complementary activity. This activity allows for the early discovery of particular problems in the interface and helps select the most promising design options that can consequently be used to develop a prototype.

5.1. Hybrid High-level Petri Nets

In this section we illustrate the use of HyNet, extended High-level Petri Nets 8, 24, as a modelling language. HyNet combines three promising concepts that we believe will be necessary for the description of hybrid interfaces.

- A graphical notation to define discrete and continuous parallel behaviour.
- The availability of object oriented concepts.
- A high-level hierarchical description that allows the specification of more complex systems.

In order to accommodate the description of processes whose behaviour evolves in time in a continuous way, the formalism provides the use of differential algebraic equations. Sets of differential algebraic equations are commonly used in fields like physics to describe continuous change. The underlying concept of time is that of discrete time, i.e. time evolves in small discrete time units.

Object-oriented concepts such as inheritance, polymorphism and dynamic binding provide means for a more compact and clear structuring of the specification of a complex system. A detailed description of the (formal) semantics of Hybrid High-level Petri Nets can be found in 29, 28.

Two small examples, showing most of HyNet's features, are given in Figure 4. As with standard Petri Nets 19, specifications consist of places and transitions connected by arcs. Besides standard arcs, HyNet has enabling and inhibitor arcs that respectively enable and inhibit a transition when a token resides on the place connected to the arc. The number and type of tokens that can reside on a place at any moment is defined in an inscription label.

Tokens in HyNet can be atomic tokens. More significantly for the longer term aim of this work, they can also be complex objects with object oriented features similar to those found in the C++ language. The structure of such objects is given in a class description and is defined separately from the net definition.

Transitions are labeled by inscriptions which define their characteristics. For continuous transitions these are the activation condition and the set of differential equations. Usually these are written within the box denoting the transition. Discrete transitions have more complicated inscriptions, defining also the possible delay, firing time and firing capacity. For continuous transitions these have default values: the delay and firing time are zero and the capacity is infinite.


Figure 4 shows an example of a discrete (dt) and a continuous (ct) transition. The discrete transition is enabled when all incident places have a token (except the one connected by an inhibitor arc), the activation condition (AC) is fulfilled and the firing capacity is not exceeded. At that point the delay time starts and the transition is fired when the delay has passed without change of the enabling conditions. The duration of firing of the transition is given by the firing time (FT). When dt fires the variable y gets the Boolean value corresponding to the evaluation of the expression x.at(2) > x.at(1). The continuous transition fires when there are tokens on the incident places that are not connected to an inhibitor arc and the activation condition is fulfilled (y < 4). It continuously, i.e. at every clock tick, updates the variables x and y as defined by the set of differential equations in the body of the transition. The full specification is described in 13.
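Read operationally, ct amounts to a clock-tick integration of its differential equations while its activation condition holds, and dt to a guarded event that must remain enabled for its delay before it fires. The following is a minimal sketch of that reading for the transitions of Figure 4 (the tick length h is assumed); it is an illustration only, not an implementation of the HyNet semantics defined in 29, 28.

    def run_ct(x, y, z, h=0.01, ticks=500):
        """Continuous transition ct of Figure 4: while the activation condition
        y < 4 holds (and assuming the incident places are marked), advance the
        differential equations y' = -1 and x' = 0.5 * z by one step h per tick."""
        for _ in range(ticks):
            if not (y < 4):            # activation condition no longer holds
                break
            y += h * (-1.0)            # y' = -1
            x += h * (0.5 * z)         # x' = 0.5 * z
        return x, y

    def step_dt(places_marked, inhibitor_marked, ac_holds, time_enabled, delay, firing_time):
        """Discrete transition dt: enabled when the input places are marked, the
        place on the inhibitor arc is empty and the activation condition holds;
        it fires only once the delay has elapsed with the enabling unchanged,
        and the firing itself then lasts `firing_time` time units."""
        enabled = places_marked and not inhibitor_marked and ac_holds
        if enabled and time_enabled >= delay:
            return "firing", firing_time
        return "waiting", None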

Figure 4: Example of a discrete transition (dt) and continuous transition (ct) in HyNet with a) inhibitor arc, b) enabling arc, c) standard arc, d) continuous arc.

5.2. Flying cameras revisited

In this section we give a short introduction to Hybrid High-level Petri Nets and illustrate their use for the specification of the mouse-based navigation technique described in Section 4.2.

Figure 5 gives us a first abstract outline of a HyNet specification of the flying camera navigation technique. The shaded boxes represent abstract transitions that are specified as separate HyNet subnets. Transition refinement provides a hierarchical way to deal with the complexity of detailed specifications.

Figure 5 shows the objects that are involved in the interaction technique and the transitions to which they are related. In the following sections, we discuss some of the abstract transitions in more detail.

Figure 5: Abstract HyNet model of the navigation interface.

Mouse operation

Figure 6 shows the subnet dealing with the mouse operation. The mouse is directly operated by a human user who can, in principle, decide at any time the velocity of the mouse movement, the direction in which to move the mouse and the pressing and releasing of the three mouse buttons.

Figure 6: Mouse movement subnet.

The mouse itself is modelled as an object of the class Mouse. Its attributes are the relative position of the mouse, the velocity of the mouse in x and y direction modelled as two Real numbers (vx and vy) that can be updated in a continuous way, and three bits (l, m, r) that indicate the status of the mouse buttons.

The movement of the mouse is modelled by a continuous transition that uses the values for the velocity (i.e. displacement per time unit) of the Mouse object. This continuous transition is not restricted by an activation condition, so the position of the mouse is continuously updated at every clock-tick.
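A direct transliteration of this part of the net into code might look as follows; it is only an illustration of the Mouse token and the unconditional continuous transition of Figure 6, with the clock-tick length h an assumed value.

    from dataclasses import dataclass

    @dataclass
    class Mouse:
        """Token of class Mouse in Figure 6: position (x, y), velocities
        (vx, vy) and the three button bits (l, m, r)."""
        x: float = 0.0
        y: float = 0.0
        vx: float = 0.0
        vy: float = 0.0
        l: int = 0
        m: int = 0
        r: int = 0

    def move_mouse(mouse: Mouse, h: float = 0.01) -> None:
        """The 'move mouse' continuous transition: x' = vx and y' = vy,
        integrated at every clock tick of (assumed) length h."""
        mouse.x += h * mouse.vx
        mouse.y += h * mouse.vy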


The change of the values for the velocity and the pressing and releasing of buttons is modelled by a discrete transition that a user can activate 'whenever the user likes'. This condition is left informal in this specification of the interface, but it could be refined into a model of user behaviour that one would like to analyse in combination with the specification of the interface.

That even simple models of user behaviour may reveal important design problems that without the use of models would be hard to find has been illustrated by Rushby in 22. He showed that the cause of mode confusion, which was observed in the use of autopilots in aircraft, could be successfully traced by using a formal model of the interface and pilot behaviour in combination with model checking tools.

The method chg models the change in velocity and direction of mouse displacement that the user performs in operating the mouse. The method switch models the operation of the buttons by the user. This method gives the value that models the position of a button, where 1 stands for the button being pressed and 0 for the button being released.

View point control

A more complex part of the specification is the control of the view point with respect to the three dimensional scene through which the user is supposed to navigate.

Depending on which buttons are pressed by the user, the navigation technique operates in different modes. The view point control transition can be further refined by the abstract subnet given in Figure 7, which introduces a place for an object modelling the mode.

Figure 7: View point control subnet, abstract.

The refinements of the two new abstract transitions are given in Figure 8 and Figure 9.

The mode is an object of class Mode and has as attributes a number of variables. Two variables of type Real, distx and disty, contain the distance between the cursor and the square on the screen. This distance is calculated by method p and updated continuously at any time when the navigation technique is activated. Therefore this transition is enabled when there is an object of type Square on place Square.

When the user presses the middle mouse button a Mode object is created on place Mode and the distances are initialised to 0. The initial mode of operation in the horizontal direction corresponds to the observation of the scene with a camera that is turning left or right. The vertical direction gives a view equal to moving the camera forward and backward. The modes for the horizontal and vertical directions are changed when the left and/or right button of the mouse are toggled.

The increase and decrease of the distance between the cursor and the square is calculated directly as a function of the displacement of the mouse per time unit. It is therefore modelled as a continuous transition.

Note the difference in the use of the mouse buttons. The middle mouse button works with a 'clicking' principle: each mode change requires a press-and-release sequence. The left and right mouse buttons work with a 'status' principle: the status, up or down, of the button at any time defines the way of navigation.

Figure 8: View point control subnet, part 1.
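The mode-selection methods xmd and ymd in Figure 8 test the left button before the right one, and the method p implements the dead zone around the square. A minimal sketch of that logic is given below; the mode constants for the right-button case are not fully legible in our copy of the figure and are therefore assumptions, while the remaining assignments follow the textual description in Section 4.2.

    def xmd(l: bool, r: bool) -> str:
        """Horizontal movement mode as a function of the left/right buttons;
        testing l first gives the left button priority, as in Figure 8."""
        if l:
            return "crab_lr"     # left down: crab left / crab right
        if r:
            return "crab_lr"     # assumed mode for the right button alone
        return "look_lr"         # no buttons: pan (look) left / right

    def ymd(l: bool, r: bool) -> str:
        """Vertical movement mode, with the same left-over-right priority."""
        if l:
            return "rise_fall"   # left down: up / down
        if r:
            return "look_ud"     # assumed mode for the right button alone
        return "fw_bw"           # no buttons: forward / back

    def p(r: float, c: float = 1.0, boxsize: float = 0.005) -> float:
        """Method p of class Mode: scaled distances smaller than the box size
        contribute nothing (the view stays still inside the square)."""
        return 0.0 if c * r < boxsize else c * r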

In Figure 9 the updating of the parameters of the view matrix is specified. The activation conditions of the continuous transitions specify at any moment which of the transformations are active. Whenever the navigation technique is activated, two modes of transformation are active, one corresponding to the functionality in the horizontal direction, the other corresponding to that in the vertical direction. The amount of change of the view matrix parameters is a function of the value of the distance attribute of the Mode.

The object View is an object of classMatrx7 denoting

c8

TheEurographicsAssociationandBlackwellPublishers1999.

Page 10: The Hybrid World of Virtual Environments

Smithet al. / TheHybrid World of Virtual Environments

a 9;:<9 matrix. On this matrix basic transformationsaredefinedasmethods,suchasrotationin x direction(rotvwx= ,rotation in y-direction (rotvwy) and translation of viewpoint in x, y andz direction(trnsvwx, trnsvwyand trnsvwzrespectively). Further there is a function that createstheinitial view matrix (create).

Figure 9: View point control subnet, part 2.
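In code, the net of Figure 9 amounts to a dispatch from the active modes to the basic Matrx4 transformations, scaled by the corresponding cursor-to-square distance. The pairing below is our own reading of the figure, and therefore an assumption, shown only to make the dispatch explicit.

    def active_transforms(xmode: str, ymode: str, distx: float, disty: float):
        """Return the (method name, amount) pair selected by each of the two
        active modes; the named methods are those of class Matrx4 in Figure 9."""
        horizontal = {"look_lr": ("rotvwx", distx),    # turn left / right
                      "crab_lr": ("trnsvwx", distx)}   # crab left / right
        vertical = {"fw_bw": ("trnsvwz", disty),       # move forward / back
                    "look_ud": ("rotvwy", disty),      # look up / down
                    "rise_fall": ("trnsvwy", disty)}   # rise / fall
        return horizontal[xmode], vertical[ymode]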

5.3. Discussion

The HyNet specification presented in this section is more detailed than the specifications presented in the previous sections and can be seen as a refinement of those. The combination of object oriented concepts with the expression of continuous transformation in this specification experiment seems to fit quite well with what is needed to describe VE interfaces.

A detailed analysis of the interworking of the continuous and discrete aspects is in principle possible. From the specification we can, for example, derive which combinations of navigation are possible and which are excluded. A closer look also reveals that when the left mouse button is in the down position it has priority over the right mouse button (see Figure 8). With the left mouse button down, the specification shows that also pushing the right mouse button does not change the navigation mode. This might be found surprising by the user, who expects a change in navigation mode whenever the position of the left or right mouse button is changed, as is the case most of the time.
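With the mode-selection sketch from Section 5.2 above, this observation can be checked directly (again only against our reading of the figure, not against the HyNet semantics):

    # With the left button held down, toggling the right button changes nothing.
    assert xmd(l=True, r=False) == xmd(l=True, r=True)
    assert ymd(l=True, r=False) == ymd(l=True, r=True)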

There are a number of other interesting aspects that can be derived from the formal specification, but what we intend to illustrate here is only the fact that such observations can be made and that this provides a way for a more systematic comparison and evaluation of VE interfaces.

Unfortunately, no automatic tools for the analysis of HyNet specifications, such as model checkers or simulation tools, are currently available. Such tools would make it possible to automatically check properties of a specification, such as the ones mentioned above, and would allow for a more systematic and formal analysis while demanding less effort from the designer in doing so than would otherwise be the case.

In this context we feel it is worthwhile to continue the specification experiments with formalisms that are supported by model checking tools, such as the toolset HyTech for the specification and verification of a subset of hybrid systems based on hybrid automata 5.

6. Conclusions

Virtual environments are a mix of continuous and discrete components. The use of modelling techniques from hybrid systems research is a step towards more accurately defined descriptions of virtual environments and their components. What has been presented here is some initial research based on this premise and focused on the modelling of interaction techniques within virtual environments.

The current models show the behaviour of the interaction techniques within a system. A hybrid based notation has been presented which provides a user based 'sketch' of the interaction in a virtual environment. This can be used as the basis for a system based description with the intent of mapping this description onto an implementation, or for further formal modelling that is amenable to computer aided analysis.

However, when describing interaction, consideration of the cognitive processes of the user as they interact with the system is also required. If interaction is to be successful, the user's cognitive model will also have to be factored into the overall hybrid model. The construction of a user's cognitive framework and the way they utilise resources within a virtual environment system are some of the next areas of research for both the INQUISITIVE and TACIT projects.

7. Acknowledgements

This work was supported in part by the UK EPSRC INQUISITIVE project, Grant GR/L53199, and the TACIT network under the European Union TMR programme, contract ERB FMRX CT970133.

References

1. R. Balakrishnan, T. Baudel, G. Kurtenbach, and G. Fitzmaurice. The Rockin'Mouse: Integral 3D manipulation on a plane. In CHI'97: Human Factors in Computing Systems, pages 311-318. ACM, 1997.

2. ColtVR. Virtual Production Planner. BBC/Colt International, 1997.

3. Distributed interactive virtual environment (DIVE). 1997. SICS, Swedish Institute of Computer Science, Sweden.


4. J-M. Flaus and G. Ollagnon. Hybrid flow nets for hybrid processes modelling and control. In Hybrid and Real-Time Systems, pages 213-227. Springer, 1997.

5. T. A. Henzinger, P. H. Ho, and H. Wong-Toi. HyTech: A model checker for hybrid systems. In Proceedings of the Ninth International Conference on Computer Aided Verification, volume 1254 of LNCS, pages 110-122. Springer, 1997.

6. INQUISITIVE home page, 1999. http://www.cs.york.ac.uk/~shamus/inquisitive/.

7. R. J. K. Jacob. A visual language for non-WIMP user interfaces. In IEEE Symposium on Visual Languages, pages 231-238. IEEE Computer Society Press, 1996.

8. K. Jensen and G. Rozenberg, editors. High-level Petri Nets - Theory and Applications. Springer-Verlag, 1991.

9. H. Jifeng. From CSP to hybrid systems. In A Classical Mind: Essays in Honour of C.A.R. Hoare, pages 171-189. Prentice Hall, 1994.

10. S. Jones and J. Sapsford. The role of informal representations in early design. In P. Markopoulos and P. Johnson, editors, Design, Specification and Verification of Interactive Systems, pages 117-133. Springer, 1998.

11. S. B. Kang. Hands-free navigation in VR environments by tracking the head. International Journal of Human-Computer Studies, 48:247-266, 1998.

12. P. Liggesmeyer, M. Rothfelder, M. Rettelbach, and T. Ackermann. Qualitätssicherung software-basierter technischer Systeme - Problembereiche und Lösungsansätze. Informatik-Spektrum, 21:249-258, 1998.

13. M. Massink and D. Duke. Specification of hybrid interfaces in HyNet. Technical report, C.N.R. Ist. CNUCE, 1999.

14. GNU MAVERIK. 1999. AIG Group, University of Manchester, UK.

15. M. R. Mine. Virtual environment interaction techniques. Technical Report TR95-018, University of North Carolina, 1995.

16. M. R. Mine. Working in a virtual world: Interaction techniques used in the Chapel Hill Immersive Modeling Program. Technical Report TR96-029, University of North Carolina, 1996.

17. M. R. Mine, F. P. Brooks Jr, and C. H. Sequin. Moving objects in space: Exploiting proprioception in virtual-environment interaction. In SIGGRAPH 97, pages 19-26. ACM SIGGRAPH, 1997.

18. MR (Minimal Reality) toolkit. 1995. University of Alberta, Canada.

19. C. A. Petri. Kommunikation mit Automaten. Schriften des IIM Nr. 2, Institut für Instrumentelle Mathematik, 1962. English translation: Technical Report RADC-TR-65-377, Griffiss Air Base, New York, Vol. 1, Suppl. 1, 1966.

20. J. S. Pierce, A. Forsberg, M. J. Conway, S. Hong, and R. Zeleznik. Image plane interaction techniques in 3D immersive environments. In 1997 Symposium on Interactive 3D Graphics, pages 39-44, 1997.

21. I. Poupyrev, M. Billinghurst, S. Weghorst, and T. Ichikawa. The Go-Go interaction technique: Non-linear mapping for direct manipulation in VR. In UIST 96, pages 79-80, 1996.

22. J. Rushby. Using model checking to help discover mode confusions and other automation surprises. In Draft for comments. Computer Science Laboratory, SRI International, Menlo Park, USA, 1998.

23. J. K. Salisbury and M. A. Srinivasan. Proceedings of the second PHANToM users group workshop. Technical report, AI Technical Report 1617, MIT, 1997.

24. S. Schöf, M. Sonnenschein, and R. Wieting. Efficient simulation of THOR nets. In G. De Michelis and M. Diaz, editors, Applications and Theory of Petri Nets, 16th International Conference, volume 935 of LNCS. Springer, 1995.

25. S. Smith and D. Duke. Virtual environments as hybrid systems. In EG-UK '99, 1999. Eurographics UK Chapter.

26. I. Sommerville. Software Engineering. Addison-Wesley, fifth edition, 1996.

27. TACIT home page, 1999. http://kazan.cnuce.cnr.it/TACIT/.

28. R. Wieting. Hybrid high-level nets. In Winter Simulation Conference, pages 848-855, 1996. Coronado, California.

29. R. Wieting and M. Sonnenschein. Extending high-level Petri nets for modeling hybrid systems. In A. Sydow, editor, Proceedings of the IMACS Symposium on Systems Analysis and Simulation, pages 259-262. Gordon and Breach Publishers, 1995. Berlin, Germany.

30. Y. Y. Yang, D. A. Linkens, and S. P. Banks. Modelling of hybrid systems based on extended coloured Petri nets. In Hybrid Systems II, pages 509-528. Springer, 1995.
