
2D human-computer interaction techniques in immersive virtual environments


Computer Networks and ISDN Systems 29 (1997) 1685-1693

2D human-computer interaction techniques in immersive virtual environments

K. Coninx *, F. Van Reeth ¹, E. Flerackers ²
LUC - Expertise Centre for Digital Media, Wetenschapspark 2, B-3590 Diepenbeek, Belgium

Abstract

Menus and dialog boxes are useful enhancements for the direct manipulation based interaction style of Virtual Environments. In this paper an implementation of 2D HCI techniques in immersive Virtual Environments is discussed. The applicability of 2D techniques for interaction with the environment is illustrated in the case of a BOOM for immersive viewing. A Pinch Glove is used as the input device to interact with the menus and dialog boxes. A programming approach to make Motif based interfacing widgets suitable for stereoscopic display is discussed. Some examples demonstrate the use of 2D HCI techniques in immersive Virtual Environments. © 1997 Elsevier Science B.V.

Keywords: Virtual environments; Human-computer interaction; User interfacing; Immersion; Stereoscopy

1. Introduction

Direct manipulation often is the dominating interaction style in Virtual Environments. In direct graphical manipulation the user interacts with the virtual objects in a graphical way. Classical examples of direct graphical manipulation in the context of Virtual Environments are moving an object (by grabbing or selecting the object and moving the input device so that the object follows the path of the device through the environment) or scaling an object (by manipulating control points on the vertices of the object). In addition to direct graphical manipulation there is a need for other interaction techniques in Virtual Environments. There still is a place for menu

* Corresponding author. Email: [email protected]. ¹ Email: [email protected]. ² Email: [email protected].

or dialogue style interaction, even in immersive Virtual Environments. This kind of interaction is useful for a number of situations where direct graphical interaction is not straightforward. Selection of a working mode of the environment, for instance switching between navigation in the virtual world by means of a flight metaphor and manipulation of the individual objects in the world, can be communicated to the system by means of menu or dialogue style interaction. Environment control parameters can be adjusted in dialogues with the user. Direct non-graphical manipulation of objects can be realized by interaction controls in a dialog box, e.g. fancy dials or slider widgets to control attributes of the selected object.

Considering the applicability of menu and dialogue style interaction in immersive Virtual Environments, the designer of the environment has a number of alternatives at his/her disposal to implement the interaction style.

0169-7552/97/$17.00 © 1997 Elsevier Science B.V. All rights reserved. PII S0169-7552(97)00083-4


A number of approaches are seen in present-day Virtual Environments. Several systems extend the 3D nature of the scenes of the virtual world to the interaction techniques. Besides using 3D input devices, these systems give menus and dialog boxes (interaction objects) their own virtual representation in the world. Depending on the needs of the Virtual Environment, the interaction objects are placed in front of the user's current view on the world or somewhere between the other 3D virtual representations of the objects. 3D input devices such as 3D mice or gloves are used to control the interaction objects by means of 3D mouse events or interpreted glove gestures (pointing or ray casting). The object-oriented approach is integrated in some environments by associating menus with the individual objects in the world, so that each selected object displays its own context sensitive list of possible actions.

One of the disadvantages of 3D interaction objects is the limited number of control objects supported in standard libraries, if there are any at all. This places the burden of implementing a control widget library on the developer.

In this paper we discuss the use of 2D interaction techniques, based on X-Windows widgets, in 3D immersive Virtual Environments. This allows the developer to exploit the large number of interface widgets that have been developed for X-Windows based applications. Most users or players of Virtual Environments are familiar with 2D interfacing techniques. From the designer's point of view, 2D interaction techniques can be used as a quick method to realize the minimal interaction capabilities.

Feiner et al. [3] describe the use of 2D X windows on a see-through head-mounted display in the context of 3D Augmented Reality. The authors of [8] anchor 2D menus to a Virtual Tricorder to select interaction techniques in an immersive environment. The Tricorder input device can be used to adjust the distance of the menus to the user's eyes. Menus in Virtual Environments indeed suffer from poor readability if the designer does not explicitly address this topic. [6] applies several menu systems (2D based and 3D) in the CHIMP environment.

The next section gives an overview of the Virtual Reality system that has been used to demonstrate the use of 2D interaction techniques in immersive Virtual Environments. A following section elaborates on the motivation to use 2D techniques to interact with the objects in the 3D world of an immersive application. Some issues concerning the implementation of 2D interaction techniques for stereoscopic viewing are discussed. Attention is also given to the use of a 3D input device to control 2D interaction objects. Some sample interaction objects illustrate the approach. Finally, conclusions are drawn from the experiment and ideas for future work are mentioned.

2. System overview

The main components of the Virtual Reality system in question are a Silicon Graphics Onyx computer, a BOOM (Binocular Omni-Orientational Monitor) allowing high-resolution stereo viewing, and a Pinch Glove as the input device.

The computer used is a Silicon Graphics Onyx rackmount system with two RealityEngine graphics pipes, each equipped with 4 RM4 raster managers (thus delivering 4 Mbytes of texture memory per pipe). Each of the two pipes drives one of the displays of the BOOM. The Onyx hosts 8 MIPS R4400 processors, 512 MBytes of RAM and several SCSI interfaces driving in total 20 GBytes of disk space.

A Binocular Omni-Orientational Monitor type Fakespace BOOM3C is used in the current implementation. A BOOM is a combination of a high resolution CRT based display and a mechanical tracking system. The counterbalanced CRT display is generally used as a high resolution stereoscopic display with up to 1280 × 960 pixels per eye. A left eye view is shown in the left display and a right eye view in the right display. The mechanical tracking system measures the user's head position and orientation. Typically the BOOM reports raw data from each joint of its mechanical linkage by means of a standard RS232 serial line back to the host computer. A software package on the host computer converts the raw data into a number of convenient forms including standard rotation matrices. The BOOM3C used in the described implementation is equipped with two handles, but single handle systems are available too. Buttons on the handles can be used to


Fig. 1. BOOM and pinch glove.

navigate through the virtual world or can be mapped to control more complicated features.

One Fakespace Pinch Glove is used as the input device. A Pinch Glove is a glove with fingertip pads to sense contact between the thumb and any of the four fingers. The base unit of the Pinch Glove system translates the fingertip contact data into RS232 output that is interfaced with the host computer. Fingertip contact data is used in this work to simulate 2D mouse events to control the 2D interfacing techniques. A Polhemus type tracker is mounted on the back of the glove to provide hand position data to the host computer. Normally the Pinch Glove system comprises a pair of gloves, but in the implementation described in this paper only one glove is used. This leaves the user with one free hand to control the BOOM viewing head. Fig. 1 shows the BOOM and the Pinch Glove input device.

On the software level Motif has been used to illustrate 2D interfacing techniques in immersive Virtual Environments.

It should be emphasized that the system description in this section corresponds to the experimental setup used to implement the 2D interfacing techniques in immersive Virtual Environments. Any component of the system can be replaced by an alternative, e.g. a head-mounted display for immersive stereoscopic viewing, a workstation display with shutter glasses for non-immersive stereoscopic viewing, a 2D mouse as the input device, etc. Any X-Windows based interfacing library is usable on the software side of the system.

3. 2D interfacing techniques in 3D

3.1. Motivation to use 2D HCI techniques in immersive virtual environments

Virtual Environments provide the user with an interactive participatory experience. HCI designers strive to make the interaction with the environment as intuitive and natural as possible. It is important to stress that immersive environments are the context of this paper. Direct graphical manipulation for communication with the environment is not the only area for special attention during the interaction design. The design of other components of the interaction paradigm, namely menus and dialog boxes, deserves enough attention too, especially in immersive environments, because the interaction paradigm is considerably influenced by the immersion. The designer must take into account that the user is immersed in the environment. Indeed, besides using projection on large screens surrounding the user, immersion in the Virtual Environment is often realized through the use of wide field of view stereoscopic displays such as head-mounted displays and BOOM systems. As natural interaction with the objects is one of the basic principles of Human-Computer Interaction design, whole-hand input by means of tracked gloves is applied in many contemporary Virtual Environments. Two-handed input has been investigated for several years, e.g. [1]. In [7] the author distinguishes between three categories of features that determine the appropriateness of whole-hand input in an application: naturalness, adaptability and dexterity. A survey of design issues in spatial input can be found in [4].

The chosen combination of input and output devices for the immersive Virtual Environment has implications for the realization of menu and dialogue style interaction. In fact, the menus and dialog boxes are viewed on the same display and used by means of the same input device. Once immersed in the interactive experience, the user cannot escape from the environment merely to interact with menus and dialog boxes by throwing off his head-mounted device, setting aside the BOOM, or taking off the gloves.

An interaction style is characterized by the visual appearance of the interaction objects and by the way in which the user interacts with them. When applied to menus and dialog boxes in immersive environments, this gives the designer the task of defining a visual representation and interaction commands to use these interface objects when immersed in the environment. The research path that is often chosen in this context is attributing menus and dialog boxes with their own 3D virtual representation, and integrating them in between the other objects of the virtual world. This approach allows the user to interact with the interfacing objects in a way similar to how he interacts with the other objects of the world [5].

Even if these 3D interface objects are very useful, they are hard to realize for the developer. The dVISE Toolbox provides some support to realize 3D interface objects [2]. Most libraries provide few or no standard 3D interfacing widgets. This was an important reason for us to explore the usability of 2D interfacing techniques as menus and dialog boxes in the context of immersive 3D environments. From the designer's point of view, 2D interaction techniques can be used as a quick method to realize the minimal interaction capabilities. The designer can gradually replace 2D interaction techniques with other techniques (e.g. direct graphical manipulation) if desired. Most people entering an immersive Virtual Environment are skilled computer users and probably have experience in using some 2D windowing system or Graphical User Interface, so we believe the users benefit from the use of 2D HCI techniques for specific tasks in immersive systems. Indeed, HCI research in general, and especially in the context of Virtual Reality, strives to improve the performance of users by lowering the cognitive load in the completion of a task. Easy to use 2D menus and dialog boxes certainly contribute to this aim. The combination of 2D HCI techniques with 3D direct graphical manipulation of the objects can use the best of both worlds.

3.2. Design of 2D HCI techniques in immersive virtual environments

Once convinced to explore 2D HCI techniques in immersive Virtual Environments, the visual representation of and the interaction with the interface objects had to be decided upon. Motif was chosen as the interface library, so part of the work consisted of making the widgets usable for stereoscopic viewing through a BOOM. Early experiments provided evidence that showing the interface widgets to one eye only was not usable. The information in the interface widgets was hard to read and users ended up with painful eyes. The solution to present the same visual information to both eyes is based on the principle of widget cloning, and is described in the next section.

The other part of the work consisted of enabling the user to interact with the interface widgets, e.g. designing how to press a button, how to use dials, etc. In order to be consistent with the skills of the users to work with 2D Graphical User Interfaces, the interaction paradigm should (1) allow movement of a pointer over the widgets by moving the input device and (2) use concepts such as clicking and dragging. It was decided to use a Pinch Glove as the input device. Instrumented gloves are useful for a number of applications in which virtual objects must be directly manipulated. Besides that, the Pinch Glove allows a direct mapping between pressing the fingertip pads of the glove and button press events. The next section describes the simulation of 2D mouse events with a Pinch Glove.

An advantage of using a Pinch Glove in this context is that the input device is convenient for interaction with objects in the world through direct manipulation, but also for interaction with menus and dialog boxes. The Pinch Glove is a lightweight input device, which makes it comfortable for the user to work with. Also, the user is not forced to make gestures that might cause painful hands after a while. There is almost no learning curve for the user to become familiar with using the Pinch Glove for interacting with menus, dialog boxes, controls, and so on.

The combination of cloned Motif widgets, allowing stereoscopic viewing through a BOOM, and a Pinch Glove as the input device, makes 2D HCI techniques worthwhile in immersive Virtual Environments.

4. Implementation

The proposed concept for 2D interfacing techniques in immersive Virtual Environments is implemented in Motif. The realization intensively uses the event driven programming paradigm of X-Windows.


In this section the widget cloning principle and the simulation of 2D mouse events when interacting with a Pinch Glove are explained.

4.1. Widget cloning

The main idea in the implementation is to clone the interface widgets to provide each eye with its own image of the interface widgets. It does not matter for the user which eye looks at the original widgets and which eye looks at the cloned ones. The original widget and its copy are visually identical and show the same behavior. For instance, the visual feedback of a button being pressed is displayed to each eye. However, the action corresponding to the button must be initiated only once. So the callback function associated with an interface widget can be called for the original widget or for its clone, but not for both widgets.
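The "callback fires only once" rule can be sketched with a toy model. The Widget struct below is a hypothetical stand-in for the real Motif widget, not the paper's actual data structure: when a press reaches a widget it always updates its own visual state, but only the original twin initiates the application action.

```c
#include <stdbool.h>

/* Toy stand-in for a Motif widget paired with its clone. */
typedef struct Widget {
    bool is_original;    /* only the original may run the callback  */
    int  pressed_count;  /* visual feedback: both twins get presses */
} Widget;

static int actions_run = 0;     /* how often the application action ran */

static void run_action(void) { actions_run++; }

/* A press event is delivered once to each twin (one per eye): each copy
   shows the pressed state, but the callback fires only for the original. */
void on_button_press(Widget *w)
{
    w->pressed_count++;   /* visual feedback for the eye viewing this copy */
    if (w->is_original)
        run_action();     /* initiate the action exactly once */
}
```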

The fragment of C code in Fig. 2 contains the main event loop of the HCI process.

The interface widgets are created as they would be in any other application. This establishes the data structure for the so-called original widgets. After initialization of the code module for stereoscopic interfacing the original widgets are cloned.

From now on, two data structures are available for the interface objects, one for each eye.

The stereoscopic interfacing module maintains a structure stereoInterface. Essential in this data structure are two lists, one containing the original widgets and one the cloned widgets.

The widget cloning function recursively duplicates the attributes of the widget and its children. It also registers the original and the cloned widgets in the widget lists of the stereoInterface data structure.
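A recursive clone over a widget tree can be sketched as follows. The Node struct is a hypothetical stand-in for an Xt widget record; the real implementation would duplicate Motif widgets with their resources and register each original/clone pair in the widget lists.

```c
#include <stdlib.h>
#include <string.h>

/* Hypothetical stand-in for a widget record: a name plus a child list
   (first child, next sibling), as in a typical widget hierarchy. */
typedef struct Node {
    char name[32];
    struct Node *child, *sibling;
} Node;

static Node *make_node(const char *name)
{
    Node *n = calloc(1, sizeof *n);
    strncpy(n->name, name, sizeof n->name - 1);
    return n;
}

/* Recursively duplicate a widget subtree: copy this node's attributes,
   then clone its children and siblings in turn. */
Node *clone_tree(const Node *src)
{
    if (src == NULL)
        return NULL;
    Node *copy = make_node(src->name);
    copy->child   = clone_tree(src->child);
    copy->sibling = clone_tree(src->sibling);
    return copy;
}
```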

The stereoscopic interfacing module preprocesses the events in the main event loop (Fig. 2). The event handler of the stereoscopic interface module (Fig. 3) peeks at the next event in the queue and applies a preprocessing function if one is defined for this type of event.

The majority of the events are handled by changing the window pointer in a copy of the event. If the widget that is addressed by the event is an original one, the window pointer is replaced by the clone (and vice versa) before the event is dispatched.

So the event handler of the stereoscopic interface module virtually duplicates the events in the queue, and dispatches that event to the widget presented to one eye (e.g. the original widget). The event that was originally in the queue is dispatched in the main event loop to the widget presented to the other eye (e.g. cloned widget).
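The window-pointer swap can be modelled without Xlib as follows. Window and Event here are simplified stand-ins for the Xlib types; the real code would retarget an XEvent's window field.

```c
/* Simplified stand-ins for the Xlib Window and XEvent types. */
typedef int Window;
typedef struct { Window window; int type; } Event;

#define MAX_PAIRS 8
static Window originals[MAX_PAIRS], clones[MAX_PAIRS];
static int n_pairs = 0;

/* Return the twin of `w`: the clone of an original, or vice versa. */
static Window twin_of(Window w)
{
    for (int i = 0; i < n_pairs; i++) {
        if (originals[i] == w) return clones[i];
        if (clones[i] == w)    return originals[i];
    }
    return w;   /* not a cloned widget: leave the event untouched */
}

static Window dispatched[16];
static int n_dispatched = 0;
static void dispatch(const Event *e) { dispatched[n_dispatched++] = e->window; }

/* Preprocess one queued event: dispatch a copy retargeted at the twin
   window, so both eyes' widgets receive the (virtually duplicated) event.
   The event itself stays in the queue for the main loop to dispatch. */
void preprocess_event(const Event *queued)
{
    Event copy = *queued;
    copy.window = twin_of(copy.window);
    dispatch(&copy);
}
```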

It is obvious from the code fragments that the approach of cloning the widgets and preprocessing the events is a general solution: it works for any Motif widget definition. The simplicity of the solution contributes to its elegance and makes it extremely usable for developers of immersive Virtual Environments who want quick access to existing menus and dialog boxes.

    main()
    {
        Widget shell_widget;
        StereoInterface *stereo_interface;
        XEvent event;
        ...
        shell_widget = createInterface(...);      /* create the interface widgets */
        stereo_interface = siface_initialize(...);
                      /* initialize the stereoscopic interface module */
        siface_copyWidget(stereo_interface, shell_widget);  /* clone the widget */

        while (TRUE) {
            siface_eventHandler(stereo_interface);
                      /* preprocess the event for the stereoscopic interface */
            XtAppNextEvent(..., &event);  /* get the next event from the queue... */
            XtDispatchEvent(&event);      /* ...and dispatch it */
        }
    }

Fig. 2. Code fragment showing the main event loop and event preprocessing.

    siface_eventHandler(StereoInterface *stereo_interface)
    {
        XEvent event;

        XPeekEvent(..., &event);  /* peek at the event in the queue */

        /* call the preprocessing function for the event, if any */
        if (eventtypes[event.type].processfunc(stereo_interface, &event))
            XtDispatchEvent(&event);  /* dispatch the event to the correct window */
    }

Fig. 3. Code fragment showing event preprocessing (event duplication and dispatching).

4.2. Simulating mouse events with a Pinch Glove

The Pinch Glove and the tracker mounted on top of it replace the 2D mouse that is traditionally used to control the Motif menus and dialog boxes. We chose the Pinch Glove because it is useful in a number of immersive Virtual Environments to interact directly with the objects in the world. In order to use the same device to steer the menus and dialog boxes, glove and tracker data are converted to X-Windows type events.

The tracker provides information for the (x, y) position of the 2D pointer (e.g. arrow) shown to interact with the Motif widgets. Data that is reported by the tracker concerning the third dimension of its coordinate system is not used.
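A possible projection from tracker coordinates to the 2D pointer position is sketched below. The working volume of [-1, 1] tracker units and the screen size are illustrative assumptions, not Polhemus or BOOM specifications.

```c
/* Tracker position in (assumed) normalized tracker units. */
typedef struct { double x, y, z; } TrackerPos;

/* Map a tracker position onto 2D pointer pixel coordinates,
   clamping to the screen and discarding the depth axis entirely. */
void tracker_to_pointer(TrackerPos p, int screen_w, int screen_h,
                        int *px, int *py)
{
    double nx = (p.x + 1.0) / 2.0;   /* normalize [-1, 1] -> [0, 1] */
    double ny = (p.y + 1.0) / 2.0;
    if (nx < 0.0) nx = 0.0;
    if (nx > 1.0) nx = 1.0;
    if (ny < 0.0) ny = 0.0;
    if (ny > 1.0) ny = 1.0;
    *px = (int)(nx * (screen_w - 1));
    *py = (int)((1.0 - ny) * (screen_h - 1));  /* screen y grows downward */
    /* p.z is intentionally ignored (third dimension unused) */
}
```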

Pinch Glove data is used to generate X events of the types ButtonPress and ButtonRelease. Interpretation of Pinch Glove commands is based on a correspondence between pressing two fingertip pads against each other and pressing a mouse button. For instance, pressing the thumb against the index finger is interpreted as pressing the left mouse button.

Fig. 4. Sample interfacing widgets and virtual keyboard.

Fig. 5. Sample interfacing object depicting the hierarchical structure of the scene in the background.
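The pinch-to-button mapping can be sketched as below. Only the thumb-index = left-button correspondence comes from the text; mapping the middle and ring fingers to the other buttons is an illustrative assumption, as is the pad encoding.

```c
/* Fingertip pads of the Pinch Glove (assumed encoding). */
enum { PAD_THUMB, PAD_INDEX, PAD_MIDDLE, PAD_RING, PAD_PINKY };

/* Simulated mouse buttons. */
enum { BTN_NONE, BTN_LEFT, BTN_MIDDLE, BTN_RIGHT };

/* Map a fingertip contact onto a mouse button: thumb against the index
   finger is the left button (as in the text); middle and ring are
   assumed here to give the middle and right buttons. */
int pinch_to_button(int pad_a, int pad_b)
{
    int finger;
    if (pad_a == PAD_THUMB)      finger = pad_b;
    else if (pad_b == PAD_THUMB) finger = pad_a;
    else return BTN_NONE;        /* only thumb-to-finger pinches count */

    switch (finger) {
    case PAD_INDEX:  return BTN_LEFT;
    case PAD_MIDDLE: return BTN_MIDDLE;
    case PAD_RING:   return BTN_RIGHT;
    default:         return BTN_NONE;
    }
}
```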

5. Examples

The examples in this section provide evidence that the proposed concept to use 2D HCI techniques in immersive Virtual Environments places a vast amount of Motif style interfacing widgets at the disposal of the designer (Fig. 4).

However, it remains the responsibility of the designer to ensure the quality of the interface. Readability of the information in the interface widgets is an important issue in this context. This explains why most interfacing widgets are centralized on top of the scene for stereoscopic viewing through the BOOM. If a display system with a lower resolution than the BOOM is used, readability issues can place restrictions on usable widgets.

Input of numbers and text traditionally deserves special attention in immersive Virtual Environments. The user's hands are not free to use the keyboard when wearing gloves! Most systems choose the option to let the user interact with virtual keyboards by means of the same interaction techniques that are used to interact with other objects: for instance pointing, ray casting, or interpreted gestures with the glove.

Obviously the use of 2D interfacing techniques in immersive environments suffers from the same problem. Standard editable text fields in dialog boxes are not usable. Besides investigating special purpose 2D interfacing widgets, virtual keyboard widgets are a possible solution (Fig. 4).


The designer of the 2D interface widgets must take into account that the user will interact by means of a glove. Moving a hand freely in space is not as accurate as with constrained movements (e.g. 2D mouse movement is constrained by the table it is moved on). Therefore the designed widget should not be too small.

This explains why the two arrow widgets to increment or decrement a number in Fig. 4 are not as small, and not vertically aligned, as they usually are.

Fig. 5 shows another example of a 2D interfacing object.

6. Conclusions and future work

The possibility to use 2D HCI techniques in immersive Virtual Environments is discussed in this paper. An experimental setup using a BOOM and a Pinch Glove as peripherals is described. 2D HCI techniques open perspectives for the user as well as for the designer of immersive Virtual Environments. The user can fall back on his skills in using common 2D Graphical User Interfaces. The chosen approach puts a vast number of standard interfacing objects at the developer's disposal. This is exploited in the presented work through the use of the X-Windows platform. A programming solution to make X-Windows widgets usable for stereoscopic viewing is elaborated upon.

Future work will evaluate the usability of the proposed concept in immersive Virtual Environments, e.g. in the context of immersive modeling. The affordance of 2D HCI techniques for tasks in the immersive environment emphasizing focused or divided attention can be studied. Other directions to be investigated include the application of the widget cloning technique with regard to distributed Virtual Environments.

Acknowledgements

Part of the work presented in this paper is subsidized by the Flemish Government and EFRO (European Fund for Regional Development).

We also acknowledge the work of B. Rassaerts and D. Nouls who were instrumental in the programming work for the research experiments.

References

[1] W. Buxton, B.A. Myers, A study in two-handed input, in: Proc. CHI'86, Human Factors in Computer Systems, 1986, pp. 321-326.

[2] dVISE User Guide and dVISE Developer Guide, Division, 1995.

[3] S. Feiner, B. MacIntyre, M. Haupt, E. Solomon, Windows on the world: 2D windows for 3D augmented reality, in: Proc. ACM Symp. on User Interface Software and Technology (UIST'93), Atlanta, GA, November 3-5, 1993, pp. 145-155.

[4] K. Hinckley, R. Pausch, J.C. Goble, N.F. Kassell, A survey of design issues in spatial input, in: Proc. ACM Symp. on User Interface Software and Technology (UIST'94), Marina del Rey, CA, November 2-4, 1994, pp. 213-222.

[5] R.J. Jacoby, S.R. Ellis, Using virtual menus in a virtual environment, in: Proc. SPIE'92: Visual Data Interpretation, Vol. 1668, 1992, pp. 39-48.

[6] M. Mine, Working in a virtual world: Interaction techniques used in the Chapel Hill immersive modeling program, Technical Report TR96-029, Dept. of Computer Science, UNC Chapel Hill, 1996.

[7] D.J. Sturman, Using the whole hand in the human-computer interface in communicating with virtual worlds, in: N. Magnenat-Thalmann, D. Thalmann (Eds.), 1993, pp. 14-28.

[8] M.M. Wloka, E. Greenfield, The virtual tricorder: A uniform interface for virtual reality, in: Proc. ACM Symp. on User Interface Software and Technology (UIST'95), Pittsburgh, PA, November 14-17, 1995, pp. 39-40.

Karin Coninx got her Ph.D. in computer science in 1997 at the Limburg University Centre (LUC), Diepenbeek (Belgium). She is a research assistant at the Expertise Centre for Digital Media at the same university. Her research interests include human-computer interaction, virtual reality, multimedia and telematics. She is a member of the IEEE Computer Society and of the Virtual Reality Society.

Frank Van Reeth got his M.Sc. in computer science at the Free University of Brussels in 1987, and his Ph.D. in computer science at the Limburg University Centre (LUC), Diepenbeek (Belgium). He is currently full professor of computer science at LUC and is managing the R&D activities of the Expertise Centre for Digital Media at the same university. His current research interests include computer graphics and animation, virtual reality, multimedia technology and telematics. He is a member of the ACM, CGS, Eurographics, IEEE, and the VRS.


Eddy Flerackers is currently full professor of computer science at the Limburg University Centre in Diepenbeek (Belgium). He is chairman of the informatics department at the Limburg University Centre and also director of the Expertise Centre for Digital Media (a LUC research centre). He is consultant to government and industry and has refereed a number of European research projects. He is also project leader of a large number of industrial research projects. Eddy Flerackers is cofounder of three spin-off companies working in the multimedia and computer graphics field. He is executive editor of the Virtual Reality Journal and member of ACM, IEEE, the Computer Graphics Society, Eurographics, and Fellow of the Virtual Reality Society. His research activities are in the area of computer graphics, computer animation, virtual reality, multimedia and telematics.
