Mixed Reality Systems – Lab V: Interaction and Collaboration
Christoph Anthes
Mixed Reality Systems Lab – Slide 2
Overview
• Introduction
• Medieval Town with an application base
• Interaction
  • Recombination of interaction techniques
  • Development of a custom input device
  • Development of a custom cursor transformation model
• Network Communication
  • Creating custom events
  • Handling events
• Concurrent Object Manipulation
  • Using mergers
Introduction
• If you take a look at your code and run the example, you will find yourself in the Medieval Town application
• The code base has become significantly smaller since the application makes use of the OpenSGApplicationBase
• Skybox, height and collision maps, as well as avatars are additionally registered; the rest of the code is generic
• This could be a good starting point for your own application
• Let’s inspect the application-base version of the Medieval Town
Introduction
• The class is derived from OpenSGApplicationBase
• Required variables are initialised; constructor and destructor are defined
Introduction
• The initialize method
  • Establishes the connection to the scenegraph interface
  • Loads the height maps, or generates them if they are not found
  • Sets up the skybox
Introduction
• A root node is set
• It is filled with an anonymous group core
• The data from the world database and the skybox are attached to the node
• The starting transformation is taken from the world database and set as the initial transformation for the player
Introduction
• Camera and avatar are requested from the local user object of the user database
• Local pointers to these objects are established if the system reports that they are available
• The display of the avatar is set to false
Introduction
• In the display method the skybox is updated according to the current camera transformation
• An updateAvatar() method is implemented in order to update avatar transformations based on tracking data
• An empty cleanup method is provided, since it is mandatory to implement one
Introduction
• Callbacks register the required modifiers, the avatar, and support for VRPN and trackD input devices
Introduction
• Main creates an application object
• It simply triggers the start function afterwards
• This results in a subsequent initialisation process
• Once the application returns, it is deleted
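The create–start–delete sequence above can be sketched with a stub standing in for OpenSGApplicationBase; the class names and the log output below are illustrative assumptions, only the lifecycle pattern itself is taken from the slide:

```cpp
#include <cassert>
#include <string>

// Minimal stand-in for OpenSGApplicationBase; the real base class drives
// initialisation, the render loop, and cleanup from inside start().
class ApplicationBaseSketch {
public:
    virtual ~ApplicationBaseSketch() {}
    // start() triggers initialisation and blocks until the application exits
    std::string start() {
        log += "init;";
        log += "run;";
        log += "cleanup;";
        return log;
    }
protected:
    std::string log;
};

class MedievalTownSketch : public ApplicationBaseSketch {};

// Mirrors the pattern on the slide: main creates the application object,
// triggers start(), and deletes the object once start() returns.
std::string launchApplication() {
    MedievalTownSketch* app = new MedievalTownSketch();
    std::string result = app->start();
    delete app;
    return result;
}
```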
Interaction
• Development of custom interaction techniques
  • Often highly desirable
  • Can be very specific depending on input and output devices
  • Can be closely related to the scene which is represented in the application
• Several approaches are possible
  • Development from scratch
  • Redesign of the inVRs interaction module
  • Use of the existing interaction module and recombination and extension of available techniques
  • Implementation of new techniques by creating transition and action models
Interaction
• Let us recall the application from our first inVRs lab, the Medieval Town
• In a first step we want to modify the interaction of the Medieval Town
• Let’s have a look at the interaction state machine of inVRs again
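As a rough reminder, the state machine can be sketched as three states whose transitions are driven by the change, confirmation and termination models named in the configuration. The state names and trigger semantics below are a simplified assumption for illustration, not the exact inVRs implementation:

```cpp
#include <cassert>

// Simplified sketch of the inVRs interaction state machine: idle, selection
// and manipulation states, with transitions evaluated by the models named in
// interaction.xml. State names and triggers are illustrative assumptions.
enum class InteractionState { Idle, Selection, Manipulation };

struct InteractionMachine {
    InteractionState state = InteractionState::Idle;

    // selectionChangeModel: cursor reaches an object -> enter selection
    void onSelectionChange(bool objectUnderCursor) {
        if (state == InteractionState::Idle && objectUnderCursor)
            state = InteractionState::Selection;
    }
    // unselectionChangeModel: cursor leaves the object -> back to idle
    void onUnselectionChange(bool objectUnderCursor) {
        if (state == InteractionState::Selection && !objectUnderCursor)
            state = InteractionState::Idle;
    }
    // manipulationConfirmationModel: a button press confirms manipulation
    void onButtonPressed() {
        if (state == InteractionState::Selection)
            state = InteractionState::Manipulation;
    }
    // manipulationTerminationModel: a button press terminates manipulation
    void onButtonReleased() {
        if (state == InteractionState::Manipulation)
            state = InteractionState::Idle;
    }
};
```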
Interaction
• Recombination of interaction techniques
  • Our first approach is to use different predefined parts of interaction techniques
  • We first switch from the HOMER interaction technique to a virtual hand interaction technique
  • For this we have to alter two configuration files
• Let’s start with the interaction.xml stored under config/modules/interaction
• These models have to be exchanged:
  • manipulationActionModel
  • selectionChangeModel
  • unselectionChangeModel
• These models can stay the same:
  • selectionActionModel – objects should still be highlighted when selected
  • manipulationConfirmationModel – confirmation is still triggered by a button
  • manipulationTerminationModel – termination is likewise triggered by a button
Interaction
• Recombination of interaction techniques
  • The setup of an interaction technique is stored inside the interaction.xml file in config/modules/interaction
  • We have to replace the manipulationActionModel inside the stateActionModels tag with the following snippet
  • And we have to replace the selectionChangeModel and the unselectionChangeModel inside the stateTransitionModel tag with the following snippet
  • To use our newly configured models we need sensor input as well; thus we have to change the abstract input device of the input interface
Snippet 1-1
Snippet 1-2
Interaction
• By inserting the following snippet into controller.xml in the directory config/inputinterface/controllermanager we replace the MouseKeybController with the MouseKeybSensorController
• Now we can use the emulation of a sensor as previously introduced in the Going Immersive tutorial
• We additionally have to insert a cursor transformation model which can work with sensors
• Since sensor emulation is a rather poor input method, we have to come up with an alternative
Snippet 1-3
Snippet 1-4
Interaction
• In this step we actually implement a new input device, as described in the Going Immersive part of the inVRs tutorial
• This device will use ARToolKit as its tracking library
• It shall provide the sensor transformation and the presence or absence of markers
  • The marker orientation as provided by ARToolKit is set as the sensor orientation
  • The sensor translation is implemented by an additional emulator function in a cursor transformation model
  • If a marker is visible, the button of the device is set to true; if it is not detected, it is set to false
• Details
  • We inherit from the InputDevice class
  • We implement the required functions using parts of the code provided in the Augmented Reality lab
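The visibility rule described above can be sketched as follows; the Quaternion and SensorState types are illustrative stand-ins, not inVRs classes:

```cpp
#include <cassert>

// Sketch of the device-state rule: marker visible -> button true and the
// marker orientation taken over; marker lost -> button false, with the last
// known orientation kept. Type and field names are illustrative.
struct Quaternion { float x = 0, y = 0, z = 0, w = 1; };

struct SensorState {
    Quaternion orientation;  // fed from the ARToolKit marker pose
    bool button = false;     // true while the marker is detected
};

void updateFromMarker(SensorState& sensor, bool markerVisible,
                      const Quaternion& markerOrientation) {
    sensor.button = markerVisible;
    if (markerVisible)
        sensor.orientation = markerOrientation;  // translation is supplied by
                                                 // the cursor model's emulator
}
```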
Interaction
• Which methods are we going to use?
  • virtual void update(); – called from inVRs to update the device state
  • void initARToolkit(std::string cameraCalibrationFile, std::string videoConfigurationFile); – initialisation
  • void loadMarker(std::string markersConfigurationFile); – loads markers
  • void addMarker(const XmlElement* marker); – adds a marker
  • void startCapturing(); – wrapper to start capturing the video
  • bool captureFrame(); – captures a single frame from the video stream
  • void detectAndHandleMarker(); – detects markers and updates sensor information from the most recently captured image
  • void cleanupARToolkit(); – cleans up the device
Interaction
• Which variables and helper classes are we going to use?
• Variables
  • static XmlConfigurationLoader xmlConfigLoader;
  • bool initialized – indicates whether initialisation has completed
  • ARUint8 *imageData – the captured image
  • float threshold – a threshold for binarisation
  • std::vector<MarkerData*> markers – a list of markers
• Helpers
  • MarkerData – used to store data about a single marker
  • ARToolkitInputDeviceFactory – used to help with the creation of ARToolKitDevice objects
Interaction
• Let’s start with the constructor
  • It takes configuration arguments, triggers ARToolKit initialisation, and loads the marker definitions
  • Afterwards it starts the capturing process
Interaction
• The destructor calls the ARToolKit cleanup and empties the marker list
• In the update method the frame capture is triggered and a subsequent method for marker detection and handling is called
Interaction
• The startCapturing method validates that everything is initialised and triggers the ARToolKit call for starting the video capture stream
• In the captureFrame method the video image is stored inside an image data structure which is used for subsequent processing and detection
• The method returns whether the capture was successful
Interaction
• This is the most important function for device processing and updating
• The first step is to detect markers and build a detected-marker structure
• The ARToolKit marker processing is performed here
Interaction
• The transformations of the marker are set on the device
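ARToolKit delivers a marker pose as a 3x4 matrix: the rotation in the left 3x3 block and the translation (in millimetres) in the last column. A minimal sketch of copying such a matrix onto a sensor could look like this; the Sensor struct is an assumed stand-in, not the inVRs sensor type:

```cpp
#include <cassert>

// Copies an ARToolKit-style 3x4 marker transformation onto a sensor:
// rotation from the left 3x3 block, position from the last column.
struct Sensor {
    double rotation[3][3];
    double position[3];
};

void setSensorFromMarkerMatrix(Sensor& sensor, const double trans[3][4]) {
    for (int r = 0; r < 3; ++r) {
        for (int c = 0; c < 3; ++c)
            sensor.rotation[r][c] = trans[r][c];
        sensor.position[r] = trans[r][3];  // millimetres in ARToolKit
    }
}
```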
Interaction
• The cleanup method, which is called in the destructor:
  • Empties the currently captured image data structure
  • Stops video capturing
  • Closes the video stream
• Finally we create a factory which allows us to generate our device
  • The factory takes a set of configuration parameters as well as the name of the device
  • These configuration parameters are passed to the constructor of the device, which is then returned as a generic input device
Interaction
• This is the implementation of the device factory used for the creation of ARToolkitInputDevice objects
Interaction
• Cursor Transformation Models
  • Cursor transformation models belong to the user object in the user database
  • They are used to determine the user’s cursor, which can be relevant for object selection
  • inVRs provides three standard cursor models
    • VirtualHandCursorModel
    • HomerCursorModel
    • GoGoCursorModel
  • We have to implement a cursor transformation model as well
  • We only want to take the orientation of our marker into account for object manipulation
Interaction
• This is the header file of our cursor transformation model
Interaction
• The model is derived from a generic cursor transformation model
  • The constructor is kept empty and passes the configuration data up to the superclass
  • The getName() function returns the name of the model
Interaction
• The cursor transformation is calculated in the generateCursorTransformation method
  • Position, orientation and scale of the received sensor data can be taken into account
  • To emulate a normal virtual hand model, all of these values would have to be set to 1
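The idea of per-model factors can be sketched as follows: a position factor of 1 reproduces a plain virtual hand, while 0 ignores the sensor translation, as in our orientation-only marker model. All struct and parameter names below are illustrative assumptions, not the inVRs API:

```cpp
#include <cassert>

// Sketch of a generateCursorTransformation step: weight the sensor position
// by a factor and either take over or discard the sensor orientation.
struct Vec3 { float x, y, z; };
struct Quat { float x, y, z, w; };

struct TransformationSketch {
    Vec3 position;
    Quat orientation;
};

TransformationSketch generateCursorTransformation(
        const TransformationSketch& sensor,
        float positionFactor, float orientationFactor) {
    TransformationSketch cursor;
    cursor.position = Vec3{ sensor.position.x * positionFactor,
                            sensor.position.y * positionFactor,
                            sensor.position.z * positionFactor };
    // Orientation is either taken from the sensor or left as identity; a
    // fuller model could interpolate for factors between 0 and 1.
    cursor.orientation = (orientationFactor > 0.0f)
                             ? sensor.orientation
                             : Quat{0.0f, 0.0f, 0.0f, 1.0f};  // identity
    return cursor;
}
```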
Interaction
• As with many inVRs components, a factory is needed for the generation of cursor transformation models
• An argument vector can be passed to the factory, which returns an initial model
Interaction
• Now we have to configure our ARToolKit device
  • These parameters are going to be used by our device
  • A calibration file is passed, describing the correction of the intrinsic camera parameters
  • A camera and a marker configuration file are passed
  • A threshold is passed for binarisation
  • The coordinate systems are mapped onto the inVRs coordinate systems
Interaction
• And finally we have to switch the controller again in the controller.xml file, set the paths and update the cursor
• Besides the includes, we only have to alter two lines in our application
  • In the UserDatabase we have to register our cursor transformation model
  • In the InputInterface we have to register our device
• If we compile and execute our application now, we will be able to manipulate the scene with our markers
Snippet 2-1
Snippet 2-2
Snippet 2-3
Snippet 2-4
Snippet 2-5
Network Communication
• In the recent Medieval Town tutorial we implemented our own animation, which was executed only locally
• This can be resolved by transmitting information over the network
• Network communication can be developed in several ways
  • Definition of custom messages
  • Definition of custom events and modifiers
  • Rewrite of the inVRs network module
• The most common approach is sending messages or writing custom events
• We will now have a look at writing custom events
Network Communication
• Writing custom events
  • For communication we often have to develop our own events
  • Let’s take a look at the implementation of such an event
Network Communication
• Writing custom events
  • First we implement the constructor and the destructor
  • One option is an empty constructor which automatically sets an empty payload
  • The second option uses the constructor of the superclass and takes a string as the message payload
Network Communication
• Writing custom events
  • Then we have to implement three functions
  • Two are used for serialisation, deserialisation and network communication
  • The third is for execution at the event’s final location
Network Communication
• To use network communication we have to connect with the network module
• The event type has to be registered as a callback at the initialisation of the event manager
• An event pipe has to be defined
• It is initialised with empty values
Snippet 3-1
Snippet 3-2
Snippet 3-3
Snippet 3-4
Network Communication
• The events have to be polled at constant intervals
  • The pipe to the text module which has been defined has to be requested
  • If the pipe is not empty, the current events have to be removed from the front
  • Once fetched, their execute method is called
  • Afterwards the event is deleted
Snippet 3-5
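The polling loop described above can be sketched like this; the event and pipe types are stand-ins for the inVRs event manager API, and the log string merely makes the execution order observable:

```cpp
#include <cassert>
#include <deque>
#include <string>

// Sketch of per-frame event polling: pop events from the front of the pipe
// while it is non-empty, call execute() on each, and delete it afterwards.
struct EventSketch {
    std::string payload;
    void execute(std::string& log) const { log += payload + ";"; }
};

using EventPipe = std::deque<EventSketch*>;

std::string pollEvents(EventPipe& pipe) {
    std::string log;
    while (!pipe.empty()) {
        EventSketch* ev = pipe.front();   // fetch the oldest event first
        pipe.pop_front();
        ev->execute(log);                 // run it at its final location
        delete ev;                        // the poller owns fetched events
    }
    return log;
}
```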
Network Communication
• This event polling has to be triggered at a given location
  • Usually, and in our case, it is done once per frame in the display method
• The issuing of the event has to be triggered; thus a GLUT callback has to be defined, which is automatically registered
• If you compile and execute your code now, you should be able to send events by pressing the key ‘e’
• Try to interconnect with a remote user by passing a server IP + port
  • Console output at the interconnected remote participants will be produced once you press ‘e’
Snippet 3-6
Snippet 3-7
Concurrent Object Manipulation
• Only few MR applications provide the possibility to manipulate objects concurrently by multiple users
• inVRs is the only MR framework which supports this type of interaction as an out-of-the-box feature
• Concurrent object manipulation is implemented by the use of so-called mergers
• These mergers are implemented as modifiers in the transformation manager
• Once a pipe is opened on an object (e.g. an entity) which already has an open pipe, and a merger is defined for such behaviour, the merger is executed and processes the data from both pipes
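Conceptually, a merger combines the transformations that two pipes write for the same object into one result. The plain averaging of positions below is purely illustrative; the actual inVRs mergers implement their own combination strategies:

```cpp
#include <cassert>

// Illustrative merging strategy: average the positions coming from the
// local and the remote interaction pipe for the same entity.
struct Vec3 { float x, y, z; };

Vec3 mergePositions(const Vec3& fromLocalPipe, const Vec3& fromRemotePipe) {
    return Vec3{ (fromLocalPipe.x + fromRemotePipe.x) * 0.5f,
                 (fromLocalPipe.y + fromRemotePipe.y) * 0.5f,
                 (fromLocalPipe.z + fromRemotePipe.z) * 0.5f };
}
```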
Concurrent Object Manipulation
• Mergers
  • As with any other modifier, mergers have to have a source id as well as a destination id
  • Each merger contains these as well as the identifiers for pipeType, objectClass, objectID, and fromNetwork
  • These key attributes have to be used for an input and an output pipe
  • Each merger is equipped with an id which will be used later on to establish it in the appropriate pipe sections
Snippet 4-1
Concurrent Object Manipulation
• Mergers
  • To finally install the merger we have to activate it in our desired pipes
  • In our case the merger should affect concurrent interaction
  • Thus the following snippet has to be inserted twice: in the local interaction pipe as well as in the remote interaction pipe (fromNetwork = “0” and fromNetwork = “1”)
• When we execute our application now, it should be possible for two networked users to collaboratively manipulate the same entity
Snippet 4-2
Things to do at home
• Thing to do now!!! – Add another marker and map the orientation of the marker onto the sails of the windmill
• Extend the OpenSGApplicationBase to an ARToolKitOpenSGApplicationBase and combine the rendering of the video stream with your input device
• Develop your own interaction technique
• Try to implement custom events in an application, for example to change the colour of objects on pressing a mouse button
• Improve your merging strategies and have a look at the implementation of the merger
Useful Links
• ARToolKit
  • http://artoolkit.sourceforge.net/ – SourceForge entry page
  • http://www.hitl.washington.edu/artoolkit/ – GPL version
  • http://www.artoolworks.com/ – commercial version
• ARToolKitPlus web page
  • http://studierstube.icg.tu-graz.ac.at/handheld_ar/artoolkitplus.php
• OpenSG web page
  • http://www.opensg.org/
• inVRs web page
  • http://www.invrs.org/
  • http://doxygen.invrs.org/ – inVRs Doxygen page
Thank You!