
[IEEE Comput. Soc Virtual Reality - Houston, TX, USA (13-17 March 1999)] Proceedings IEEE Virtual Reality (Cat. No. 99CB36316) - Distributed PC-based haptic, visual and acoustic telepresence



Distributed PC-Based Haptic, Visual and Acoustic Telepresence System
— Experiments in Virtual and Remote Environments —

H. Baier M. Buss F. Freyberger J. Hoogen P. Kammermeier G. Schmidt

Institute of Automatic Control Engineering, Technische Universität München, D-80290 München, Germany
E-mail: [email protected] [email protected]

Abstract

In this paper we present a distributed PC-based multi-modal (haptic, visual and acoustic) telepresence and virtual presence system. Two desktop kinesthetic devices (DeKiFeD3 and DeKiTop3) with 3 degrees-of-freedom have been developed for multi-modal telepresence. Feedback to the human modalities of the visual, auditory, kinesthetic, tactile and temperature senses is generated using appropriate actuator hardware. We present several applications in virtual presence and teleoperation in physical remote environments.

1. Introduction

In recent years the interest in multi-modal¹ telepresence technology — and especially combined with Virtual Reality (VR) implementations in the human-system interface² (HSI) — has been continuously increasing. There are many important application areas, to name only a few: teleoperation in hazardous environments (space/nuclear), long-distance telemaintenance and teleservice, rapid prototyping, CAD systems for manufacturing and design, telemedicine, teleshopping, virtual shopping, etc. For many of these application domains a VR approach can help to cope with communication time-delay problems by local prediction. Another important aspect of VR is the training possibility: before actually performing telemanipulation (e.g. telesurgery), the human operator can learn and practice the task in the VR with realistic multi-modal feedback.

Our main interests in the area of multi-modal HSI based telepresence are multi-modal control loops — possibly closed via a communication network — including a human operator, an executing teleoperator and multi-modal local feedback from VR models. For corresponding experimental investigations we have been developing and building a distributed PC-based multi-modal — haptic (kinesthetic, vibrotactile, temperature), visual and acoustic — telepresence and teleaction system in our laboratory. The system includes a multi-modal HSI with a newly developed high-performance Desktop-Kinesthetic-Feedback-Device (DeKiFeD3) with 3 active degrees-of-freedom (DOF) and 1 additional passive DOF. The DeKiFeD3, being part of the human operator site, is shown in Fig. 1(a). The main differences between the DeKiFeD3 and other available force feedback hardware (e.g. PHANToM [11]) are the relatively large workspace of approximately 25 × 30 × 80 cm³ and the high force capability of up to 60 N in the 3 Cartesian directions, with high-fidelity force control supported by a 6-axis JR3 force/torque sensor.

¹Multi-modality in this paper comprises the visual, auditory and haptic modalities, where we use the working definition of haptics to include the human kinesthetic (proprioceptive), touch and temperature senses.

²The term human-system interface (HSI) is meant to cover all other commonly used terms such as human-machine interface, human-robot interface, etc.

(a) DeKiFeD3 (b) DeKiTop3

Figure 1. Photos of the DeKiFeD3 and DeKiTop3 systems.

The kinesthetic feedback is augmented with vibrotactile, temperature, visual and acoustic feedback. The tactile display uses piezoelectric vibration-generating elements together with a Peltier element for temperature feedback, both adapted from a design of a tactile actuated glove [4]. In the HSI the DeKiFeD3 is equipped with a handle easy to grasp by the human hand. The vibrotactile and temperature actuators are attached so as to bring the sensations to particularly sensitive areas close to the tip of the index finger. The DeKiFeD3 is used as the multi-modal HSI in the VR applications and experiments presented in this paper.


By duplicating the DeKiFeD3 and mounting a sensor tip as the end-effector (EE) instead of the handle for the human hand, we have built a Desktop-Kinesthetic-Teleoperator (DeKiTop3) system, also with 3 DOF; see Fig. 1(b). This system includes vibrotactile and temperature sensors at the sensor tip as well as a 6-axis JR3 force/torque sensor. The kinematic structures of the DeKiFeD3 and DeKiTop3 systems are equivalent. Both systems are connected by a communication network.

The developed system provides the possibility of multi-modal VR presence or telepresence in a real (physical) remote environment, see Fig. 2. Feedback to the operator is multi-modal, including the visual, auditory and haptic — kinesthetic, vibrotactile and temperature — senses. The multi-modal VR engine computes multi-modal feedback signals resulting from operator action in the virtual world.

[Figure 2 diagram: the Human Operator exchanges Multi-Modal Commands and Multi-Modal Feedback through the Multi-Modal Human-System Interface with either the Multi-Modal VE Engine or, across a barrier, the Remote Environment (Teleoperator).]

Figure 2. Operation modes of multi-modal virtual presence and remote telepresence.

The paper is organized as follows: In Section 2 we outline details of the overall system architecture of the developed distributed PC-based multi-modal telepresence and teleaction system. Section 3 describes the newly developed DeKiFeD3 hardware. It also presents evaluation experiments with a stiff virtual wall. The vibrotactile and temperature feedback aspects are briefly discussed in Section 4. Section 5 presents four VR presence applications and experiments: haptic exploration, virtual drilling, ping pong as an entertainment application and haptic interaction with deformable objects. Telemanipulation experiments are reported in Section 6.

2. Telepresence System Architecture

2.1. Hardware Subsystems

The overall system architecture of the distributed experimental telepresence environment is shown in Fig. 3. In the following we briefly discuss some subsystem details, the computational environment, controller implementation, communication infrastructure and graphics capabilities of the developed PC-based telepresence system. Multi-modal presence of the operator can be realized both in a VR and in a real (physical) remote environment, see Fig. 2.

The DeKiFeD3 haptic feedback device can be coupled with the already mentioned symmetric teleoperator DeKiTop3, see the upper part of Fig. 3. Using these two subsystems, multi-modal telepresence in a real remote environment located in a neighboring building is realized. The visual, auditory, kinesthetic and tactile modalities are implemented. The operator and teleoperator are connected by a nearly ideal communication infrastructure. The actual performance characteristics and the construction of the haptic display DeKiFeD3 and the teleoperator DeKiTop3 are discussed in greater detail in Sections 3 and 4.

A second teleaction subsystem, with emphasis on the kinesthetic modality, is used to analyze the behavior of haptic control loops in communication infrastructures with significant time-delay. The system has 1 rotational DOF (force paddle), see Fig. 4(a), on the operator site and 1 linear DOF (linear axis) on the teleoperator site. A drilling machine is mounted on the teleoperator, see Fig. 4(b), allowing the human operator to perform the simple telemanipulation task of teledrilling holes through an object [2].

Both systems, the DeKiFeD3 and the force paddle, can also be used as haptic interfaces to a virtual world. When connected to the VR engine, the feedback data for all modalities are generated by simulation. The generation of 3D graphics of virtual objects is well established, and for the kinesthetic modality many methods have been reported to compute object interaction forces in real-time. The generation of realistic artificial acoustic, tactile and temperature sensations is less common. In this paper we present VR experiments in which multi-modal artificial data are modeled and generated for the kinesthetic, tactile, temperature and auditory modalities.

2.2. Information Processing

Fig. 3 presents the distributed computing architecture of the developed telepresence system. The system consists of 7 Pentium or Pentium-II PCs and 1 DEC alpha PC. The system is distributed logically, by placing subprocesses on several PCs, and physically, because the operator is situated in one building while the remote site is in a neighboring building 150 m apart.

The four mechatronic subsystems — 1. force paddle, 2. linear axis/drilling machine, 3. DeKiFeD3, 4. DeKiTop3 — are each controlled by a PC with expansion boards for D/A- and A/D-conversion, counters for the optical encoders and 2 ISA DSP-cards for processing the JR3 force/torque sensor data. The DC motors are current (torque) controlled by standard PWM amplifiers with 25 kHz modulation frequency. The temperature, tactile and paddle force measurements are fed into the A/D-converters.

For multi-modal communication between the operator and remote sites there are several connections (see Fig. 3):



[Figure 3 diagram. Operator site (building I, 3rd floor): the 1 DOF force paddle with HP incremental encoder, force sensor and PWM & DC motor on a Pentium 133 with BurrBrown A/D-D/A and plug-in counter boards; the DeKiFeD3 on a Pentium 200 with Meilhaus A/D-D/A, JR3 ISA DSP card, tactile actors and incremental encoders; the VR engine on a Pentium II 233 with two Voodoo II accelerators; a DEC alpha 533; a Pentium 166 videoconference PC with video monitor and speaker. Remote site (building II, basement): the linear axis with drilling machine, force sensor, incremental encoder and Snazzi MPEG encoder on Pentium 133/200 PCs; the DeKiTop3 on a Pentium 200 with JR3 force/torque sensor, tactile sensors, incremental encoders and PWM & DC motors. The sites are linked over 150 m by 10 MBit switches, 100 MBit hubs, fibre, a router, the LAB LAN and the CAMPUS LAN; the upper part realizes kinesthetic teleaction with 1 DOF and time-delay, the lower part multi-modal telepresence and teleaction in 3 DOF.]

Figure 3. System architecture of the developed PC-based distributed telepresence system.

1. two analog connections by coaxial cable; 2. two optical fibres used for a dedicated 100 Mbit/s connection via 2 hubs in the laboratories; 3. via the campus LAN (shared with 8 other laboratories) using 10 Mbit/s.

The visual and auditory modalities are transmitted via a videoconferencing system using a Snazzi MPEG hardware encoder on the remote site and software decoding on the operator site (with reduced image quality and frame rate), or via the 2 analog cables as a high-quality connection.

Figure 4. Photos of the force paddle (a) and the linear axis with drilling machine (b).

All PCs controlling hardware systems run Linux with the real-time patch applied; see the Real-Time Linux project via http://www.linux.org for details. All of the force and motion controllers run at 1 to 2 ms sampling time. The videoconferencing PCs run Windows 95.

The multi-modal VR engine of the system is located on the operator site, see Fig. 3. One Pentium II PC equipped with two Voodoo II dedicated 3D graphics accelerators assures high graphical performance for the implementation of fairly complex VRs. Additionally, a sound adapter in the videoconferencing PC is used for the auditory modality. Acoustic data comes from the remote site or is synthesized (see Section 5.2). The DEC alpha PC is used for haptic rendering of deformable objects (see Section 5.4).

3. DeKiFeD3 – Kinesthetic Feedback Device

Fig. 5(a) shows a more detailed photograph of the Desktop-Kinesthetic-Feedback-Device (DeKiFeD3) with 3 active degrees-of-freedom (DOF). The kinematic structure shown in Fig. 5(b) allows the end-effector (EE) to be positioned in 3 Cartesian DOFs. The mechanical design of the third joint is such that the orientation of the EE does not depend on the joint angles. The rotation around the vertical axis resulting from the SCARA joints is eliminated using a passive rotational DOF.

Each active DOF is actuated by a high-performance DC motor (Faulhaber) with planetary gears, optical encoders and standard PWM power amplifiers. The chosen motors have an exceptionally good ratio of maximum holding torque (1 Nm without gear) to weight (less than 1 kg including gear and encoder), with a maximum output power of 220 W. The gear ratio is 66, 43 and 66 in the 1st, 2nd and 3rd joint, respectively. The mechanical structure of the links is designed for maximum mechanical stiffness, ease of manufacturing, low weight, low inertia and low cost.

The EE consists of a holding handle with additional vibrotactile and temperature actuators, see Fig. 5(a), all mounted on a JR3 6-axis force/torque sensor operating at a maximum sampling rate of 8 kHz and a measurement range of 100 N/6.3 Nm.

[Figure 5 diagram: photo of the DeKiFeD3 (a) and its kinematic structure (b) with Denavit-Hartenberg frames (x0, y0, z0) through (xE, yE, zE), joint angles q1, q2, q3 and link lengths l1, l2, l3.]

Figure 5. Photo of the DeKiFeD3 (a) and its kinematic structure (b).

3.1. Kinematics and Force Control

The kinematic configuration of the DeKiFeD3 is shown in Fig. 5(b) with Denavit-Hartenberg coordinate systems attached. The first two joints form a SCARA-type configuration and the 3rd joint permits a motion of the EE in the vertical z-direction. The forward kinematics solution is given by

x = −l3 c3 c12 − l2 s12 + l1 c1
y = −l3 c3 s12 + l2 c12 + l1 s1        (1)
z = l3 s3

with l1 = l2 = l3 = 0.2 m and the usual abbreviations c12 = cos(q1 + q2), etc. The determinant of the Jacobian of (1), i.e. det J = l1 l3 c3 (l2 c2 − l3 s2 c3), reveals singularities at q3 = ±π/2 and at l2 c2 = l3 s2 c3. Two of these are impossible to reach because of the mechanical design. The third singularity can be avoided by controlling the elbow to point in the outward direction. A collection of some discrete points of the useful kinematic workspace (projected for q3 = 0) is shown in Fig. 6 together with a sketch of the human operator. To complete the DeKiFeD3 model, a reasonable estimate for the mass of each of the three links is 1.6 kg.
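Eq. (1) and the singularity condition can be sketched in a few lines; the link lengths and expressions follow the equations above, while the function names and the sanity checks are ours.

```python
import math

# Link lengths of the DeKiFeD3 (from the paper: l1 = l2 = l3 = 0.2 m).
L1 = L2 = L3 = 0.2

def forward_kinematics(q1, q2, q3):
    """End-effector position (x, y, z) per Eq. (1); joint angles in rad."""
    c1, s1 = math.cos(q1), math.sin(q1)
    c12, s12 = math.cos(q1 + q2), math.sin(q1 + q2)
    c3, s3 = math.cos(q3), math.sin(q3)
    x = -L3 * c3 * c12 - L2 * s12 + L1 * c1
    y = -L3 * c3 * s12 + L2 * c12 + L1 * s1
    z = L3 * s3
    return x, y, z

def jacobian_determinant(q1, q2, q3):
    """det J = l1*l3*c3*(l2*c2 - l3*s2*c3); zero at the singularities."""
    c2, s2 = math.cos(q2), math.sin(q2)
    c3 = math.cos(q3)
    return L1 * L3 * c3 * (L2 * c2 - L3 * s2 * c3)
```

Evaluating the determinant at q3 = π/2 confirms the first singularity condition numerically.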

Using the 6-axis JR3 force/torque sensor in the DeKiFeD3 has the advantage of high-fidelity control of the actual forces between the device and the human operator. A standard force controller implemented in joint space was extended by feedforward action and adaptive (contact-force-dependent) gains. This gives better steady-state accuracy and very stable behavior, both while moving effortlessly in free space and when contact with hard surfaces is established.
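A one-axis sketch of such a force law, combining feedforward of the desired force with an error term whose gain grows with the measured contact force; the gain values are illustrative placeholders, not the controller parameters of the DeKiFeD3.

```python
def force_control_output(f_des, f_meas, k0=0.5, k_adapt=0.01):
    """Force set-point tracking for one axis (illustrative gains).

    The first term feeds the desired force forward; the second corrects
    the error with a gain that adapts to the measured contact force.
    """
    k = k0 + k_adapt * abs(f_meas)       # contact-force-dependent gain
    return f_des + k * (f_des - f_meas)  # feedforward + proportional action
```

In free space (zero desired and measured force) the output vanishes, while in hard contact the stiffer gain improves steady-state accuracy.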

Figure 6. DeKiFeD3 workspace projection together with a human operator.

Figure 7. Experimental result when repeatedly hitting the virtual wall.

3.2. Performance Evaluation

To evaluate the performance of the kinesthetic feedback capabilities of the DeKiFeD3 we have conducted several experiments, e.g. with a stiff virtual wall. Virtual walls have also been reported and used as benchmark experiments by other researchers [5, 12, 13]. The location of the virtual wall in the workspace is shown in Fig. 6.

The DeKiFeD3 is force controlled to zero force in free space. When the y-position of the EE enters the virtual wall at y < yw = 0.22 m, a repelling force F is generated as the set-point for the force controller according to the virtual wall model

F = K (yw − y) + B ẏ        (2)

with K = 10000 N/m and B = 50 Ns/m.

Fig. 7 shows an experimental result. The repeated entering of the virtual wall in y-direction can be seen, while the desired and actual forces generated to the human are shown below. Due to the limited actuator bandwidth (13 Hz) and actuator saturation (maximum exertable force 60 N) the desired force cannot be achieved exactly. However, given the exerted forces of up to 50-60 N and the low penetration depth of the virtual wall (≈ 1 cm), the DeKiFeD3 generates a rather stiff subjective impression of hard surfaces with the human operator. This is not an exact psychophysical measurement, but the result of questioning external expert users. Other multi-modal experimental results using the DeKiFeD3 are shown subsequently.
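The wall model of Eq. (2), including the 60 N actuator saturation mentioned above, reduces to a few lines; the clamping to non-negative values is our addition, so the set-point never pulls the hand into the wall.

```python
def wall_force(y, y_dot, y_w=0.22, K=10000.0, B=50.0, f_max=60.0):
    """Repelling force set-point for the virtual wall, Eq. (2).

    Outside the wall (y >= y_w) the device is force controlled to zero;
    inside, a spring-damper force is commanded and clamped to the 60 N
    actuator limit.
    """
    if y >= y_w:
        return 0.0
    f = K * (y_w - y) + B * y_dot
    return min(max(f, 0.0), f_max)
```

For a static penetration of 5 mm the model commands 50 N, close to the saturation limit, which matches the stiff impression reported above.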

4. Tactile Feedback and Exploration

Vibrotactile and temperature actuators in the DeKiFeD3 augment the kinesthetic capabilities, realizing a haptic display. The tactile sensors mounted on the DeKiTop3 and the actuators on the DeKiFeD3 are shown in Fig. 8.

Figure 8. Tactile sensors and actuators.

We are investigating modeling paradigms for the human tactile sense from a systems-theory point of view [10], aiming at a framework to describe the processes of human perception in general using mathematical mappings. The human sense of touch, its physical stimulation and information processing, is one part of this model. Deriving unambiguous correspondences between the physical stimuli displayed by the tactile actuators and the resulting sensory impressions/perceptions of the human operator is a first step towards a detailed formulation of mappings for human touch perception.

Here we can only briefly demonstrate how teletactile impressions are realized in the presented system, using an exemplary tactile exploration scenario. A prototypical experimental setup for tactile exploration is shown in Fig. 9(a), with four objects differing significantly with respect to texture and the resulting tactile impression. Fig. 9(b) shows typical trajectories during tactile object exploration.

5. Experiments in Virtual Environments

As typical scenarios of multi-modal VR presence, this section outlines the applications of haptic exploration, virtual drilling, entertainment and force feedback resulting from the deformation of soft objects.

Figure 9. Tactile exploration of four objects (a); corresponding vibrotactile sensor/actuator signals (b).

5.1. Haptic Exploration

The DeKiFeD3 can be used to haptically explore a virtual environment and the objects therein. An example world is shown in Fig. 10. The relatively flat object on the left is hot, with a soft but rough skin and substantial friction, whereas the box-type object on the right is cold, rigid, smooth and frictionless. The half-sphere on the ceiling is rigid and frictionless, like all the walls of the shown virtual world. The borders of the room are inside the workspace of the DeKiFeD3 as shown in Fig. 6.

Figure 10. Virtual world for haptic exploration.

The small sphere in the middle of the picture is the virtual probe, which can be moved around using the DeKiFeD3. It can interact with the other objects in the VR; in case of contact with one of them, appropriate feedback forces, tactile feedback stimuli and temperature set-points are generated to display the characteristics of the touched object. The underlying algorithms for modeling and haptic rendering of the VR are, for nondeformable objects, similar to [3, 15], but extended by an approach to model and display effects like friction forces, roughness and temperature; see [9] for details. Because deformation is neglected and the object shapes are simple, this rendering algorithm is very fast (≈ 100 µs), which makes a controller sampling time of 1 ms possible.

Tactile feedback is generated depending on parameters representing object characteristics such as surface texture, edge type, temperature and thermal conductivity. For VR experiments the tactile data can either be taken from physical measurements, see Fig. 9(b), or generated by simulations using functional models for touched shape primitives, fingertip deformation and skin receptors. Alternatively, for ease of implementation, deterministic (e.g. peaks, square-wave) or stochastic (white noise) signals can be used.
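As a sketch of the deterministic and stochastic alternatives just mentioned, a vibrotactile drive signal could be selected by a texture parameter; the texture names, amplitude and the 250 Hz frequency are hypothetical, not values from the system.

```python
import math
import random

def vibrotactile_signal(t, texture, amp=1.0, freq=250.0):
    """Synthetic vibrotactile drive signal for a given surface texture.

    'ridged' uses a deterministic square wave, 'rough' stochastic white
    noise; all parameters are illustrative, not measured values.
    """
    if texture == "smooth":
        return 0.0
    if texture == "ridged":   # deterministic square-wave texture
        return amp if math.sin(2.0 * math.pi * freq * t) >= 0.0 else -amp
    if texture == "rough":    # stochastic white-noise texture
        return amp * random.uniform(-1.0, 1.0)
    raise ValueError(f"unknown texture: {texture}")
```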

Fig. 11 shows typical trajectories when a human operator explores the VR shown in Fig. 10, moving in negative x-direction from the left, making contact and following the shape of the left object, gliding over the ground to the right, and finally moving across the right object. The height of the virtual fingertip corresponding to the flat and the higher box can easily be seen in the first plot (z-position) of Fig. 11. Fz and Fx indicate the detected impedance and friction forces when moving across the vertical and horizontal surfaces. U shows the activity of the vibrotactile actuator, while T is the output for the temperature set-point.

The fact that haptic, vibrotactile and temperature feedback is generated in this exploration scenario adds a dimension to operator immersion quality. Physical object characteristics like shape, corners, edges, friction, roughness (texture) and temperature can be felt very realistically.

Figure 11. Typical trajectories of a human operator exploring the VR shown in Fig. 10.

5.2. Virtual Drilling Experiments

For investigations concerning the evaluation of multi-modal presence quality in visual-haptic telepresence systems, we have set up a virtual teledrilling experiment to emulate the real teledrilling application, see Fig. 12(a). We conducted subjective and objective measurements of presence quality depending on visual frame rate, intermodal time-delay and consistency, as well as the importance of each modality, by putting a questionnaire to 25 test subjects [14]. The main motivation for realizing the drilling system in VR was that we had to perform several hundred evaluation experiments. Another reason is that the tuning of modal quality characteristics like frame rate or time-delay is much easier to achieve in the VR realization.

By augmenting synthesized sound to the virtual drillingmachine the presence quality was improved significantly.The idea to generate synthetic auditory feedback is simi-lar to the work reported in [7]. We analyzed the sound ofthe real drilling machine by a Fourier analysis for the twocases of free motion and when drilling an object. The char-acteristic features of these Fourier transforms were used tosynthesize sounds similar to the natural sound, see [14] fordetails and detailed quantitative results of the questionnaire.Just to mention one result, which is obvious in retrospect:intermodal consistency is extremely important, because ifone modality is realized with particularly bad quality (e.g.time-delay larger than other modalities), human operatorswill completely ignore it and rely on the other modalitieswith better quality.

(a) virtual drill (b) ping pong

Figure 12. VR applications.

5.3. Ping Pong

As an exemplary entertainment application we have implemented a simple ping pong game, see Fig. 12(b). The human player uses the DeKiFeD3 to move the racket, hit the ball and make it bounce in the vertical direction. Typical trajectories of the ball, the racket, and the desired and measured DeKiFeD3 feedback forces to the player are plotted in Fig. 13.
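The vertical bounce can be sketched with a semi-implicit Euler step and a restitution coefficient; all constants here are illustrative, not the values used in the game.

```python
def step_ball(z, v, z_racket, dt=0.001, g=9.81, restitution=0.9):
    """One integration step of the vertically bouncing ball.

    When the ball reaches the racket height while falling, its velocity is
    reflected and scaled by the restitution coefficient; the impact at that
    instant is what the haptic device would feed back to the player.
    """
    v -= g * dt               # gravity
    z += v * dt               # semi-implicit Euler position update
    if z <= z_racket and v < 0.0:
        z = z_racket          # resolve penetration
        v = -restitution * v  # bounce with energy loss
    return z, v
```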

5.4. Deformable Objects

In addition to the nondeformable virtual world of Section 5.1, we implemented an environment with deformable objects, see Fig. 14. These objects consist of 100-300 mass nodes which are connected to each other by spring-damper systems [6]. This approach allows a very realistic simulation of soft objects with deformable shape. The drawback of nodal models lies in the required numerical integration time. In our implementation the integration time is ≈ 10 ms for haptic rendering. Details about the implementation of the deformable VR can be found in [8].

Figure 13. Trajectories during the ping pong game.

This virtual world can also be explored with the DeKiFeD3. Some results of object deformations are shown in Fig. 14, and a projection to the x/z-plane of the generated forces is indicated in Fig. 15.
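A minimal 1-D version of such a nodal model: a chain of mass nodes coupled by spring-damper elements, with the last node anchored to ground and the first node displaced by the probe. Node count, stiffness, damping and masses are illustrative, not the simulation's values.

```python
def simulate_chain(n=5, k=100.0, b=1.0, m=0.01, push=0.01,
                   steps=200, dt=0.001):
    """Semi-implicit Euler integration of a 1-D mass-spring-damper chain.

    x[0] is held at `push` (the probe indentation); the last node is also
    coupled to ground by a spring. Returns the node displacements after
    `steps` integration steps.
    """
    x = [0.0] * n
    v = [0.0] * n
    x[0] = push  # probe indentation, held fixed
    for _ in range(steps):
        for i in range(1, n):
            right = x[i + 1] if i + 1 < n else 0.0  # ground anchor at end
            f = k * (x[i - 1] - x[i]) + k * (right - x[i]) - b * v[i]
            v[i] += (f / m) * dt
            x[i] += v[i] * dt
    return x
```

After the transients decay, the displacements settle to a linear profile between the probe and the ground anchor; the reaction force on the probe, k·(x[0] − x[1]), is the quantity a haptic renderer would feed back.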

Figure 14. Graphics of sphere, cylinder and box objects, panels (a)-(f), based on nodal models in nominal and deformed configuration.

6. Experiments in Real Remote Environments

6.1. Object Manipulation

Figure 15. Plot of the forces generated due to object deformation (projection of the sphere to the x/z-plane); labeled are the xz-forces, the deformable object, the ground and the probe.

An example of a telemanipulation task is shown in Fig. 16, where the 3 triangular parts have to be manipulated such that they fit together forming a square. Fig. 16(a) shows the DeKiTop3 pushing a smaller triangle. The corresponding force measurements during this telepushing operation are shown in Fig. 16(c). The teleoperator may also lift the part by putting the hook into the ring. This is shown in the photo of Fig. 16(b), and the corresponding measured force trajectory is shown in Fig. 16(d).

Figure 16. Telemanipulation of triangular objects and measured forces, panels (a)-(d).

6.2. Other Telepresence Experiments

The teledrilling system, see Fig. 4 and the 1 DOF subsystem in Fig. 3, is used to investigate haptic telecontrol in communication infrastructures with time-delay. The control strategy relies on a hybrid system model for switching control modes depending on the task situation. In each of the control modes, lossless (passive) communication is guaranteed and master-slave impedance matching is realized [2]. One of the goals of current investigations is to generalize the theory of lossless communication [1] to the other modalities of multi-modal telecontrol systems.


One important goal of realistic telepresence systems is that the human operator gets an exact feeling of the remote object impedance when it is touched with the teleoperator. The DeKiFeD3/DeKiTop3 telemanipulation system achieves very good performance here for soft objects like the sponge shown in Fig. 1(b). Experimental results cannot be reported here because of paper length limitations.

7. Conclusions

We have presented a distributed PC-based multi-modal telepresence system developed and manufactured in our laboratory. The system relies on standard PC hardware with standard expansion boards for 3D graphics and analog signal processing. The overall system is distributed across two laboratories in two neighboring buildings, emphasizing the requirements for telepresence technologies. The main feature of the telepresence system is the multi-modal (kinesthetic and tactile) HSI DeKiFeD3 and the teleoperator DeKiTop3 with similar structure. Incorporating sensors and actuators for vibrotactile impressions and temperature allows haptic telepresence in real (physical) remote environments or virtual worlds.

To demonstrate the capabilities and the variety of potential applications of the developed multi-modal telepresence system, we have presented various VR and telemanipulation experimental results. In particular, experimental results on tactile and haptic exploration of real and virtual environments, virtual drilling, ping pong as an entertainment application and haptic interaction with deformable objects were reported.

Acknowledgments

The implementation of the on-line deformable object simulation and its interaction with the DeKiFeD3 was mainly programmed by A. Hall, S. Kolssouzidis and T. Schickinger. The authors greatly appreciate the support by J. Gradl and H. Kubick of the laboratory workshop and the electronic engineers T. Stoeber and T. Lowitz during construction of the force paddle, teledrilling machine, DeKiFeD3/DeKiTop3 experimental hardware and control systems. The support by W. Jaschik in integrating the presented system into the laboratory LAN is also appreciated.

This research was partly funded by the German National Science Foundation (DFG) within the interdisciplinary research project on “Telepresence and Teleaction” (SFB 453).

References

[1] R. Anderson and M. Spong. Bilateral Control of Teleoperators with Time Delay. IEEE Transactions on Automatic Control, 34:494–501, 1989.

[2] H. Baier, M. Buss, and G. Schmidt. Control Mode Switching for Teledrilling Based on a Hybrid System Model. In Proceedings of the IEEE/ASME International Conference on Advanced Intelligent Mechatronics AIM'97, Tokyo, Japan, Paper No. 50, 1997.

[3] C. Basdogan, C. Ho, and M. Srinivasan. Haptic Rendering: Point- and Ray-Based Interactions. In Proceedings of the Second PHANToM Users Group Workshop, Dedham, MA, 1997.

[4] D. Caldwell, S. Lawther, and A. Wardle. Multi-Modal Cutaneous Tactile Feedback. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems IROS, pages 465–472, Osaka, Japan, 1996.

[5] J. Colgate, P. Grafing, M. Stanley, and G. Schenkel. Implementation of stiff virtual walls in force-reflecting interfaces. In Proceedings of the IEEE Virtual Reality Annual International Symposium VRAIS, pages 20–26, 1993.

[6] O. Deussen and Kuhn. Echtzeitsimulation deformierbarer Objekte über nodale Modelle (Real-time simulation of deformable objects via nodal models). In Proc. Integration von Bild, Modell und Text, ASIM Mitteilungen No. 46, Universität Magdeburg, 1995.

[7] K. Doel and D. Pai. The Sounds of Physical Shapes. Presence, 7(4):382–395, August 1998.

[8] A. Hall, S. Kolssouzidis, and T. Schickinger. Modeling of Deformable Objects in Virtual Worlds. Internal Report, Institute of Automatic Control Engineering, Technical University of Munich, June 1998.

[9] J. Hoogen. Development of an Experimental System for Kinesthetic Feedback in Telepresence Systems. Internal Report, Institute of Automatic Control Engineering, Technical University of Munich, May 1998.

[10] P. Kammermeier. Development of a Haptic Display for Tactile Feedback in Telepresence Systems. Internal Report, Institute of Automatic Control Engineering, Technical University of Munich, April 1997.

[11] T. Massie and J. K. Salisbury. The PHANToM Haptic Interface: A Device for Probing Virtual Objects. In Proc. of the ASME Winter Annual Meeting, Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, Chicago, 1994.

[12] L. Rosenberg and B. Adelstein. Perceptual decomposition of virtual haptic surfaces. In Proceedings of the IEEE Symposium on Research Frontiers in Virtual Reality, pages 46–53, San Jose, CA, 1993.

[13] S. Salcudean and T. Vlaar. On the emulation of stiff walls and static friction with a magnetically levitated input/output device. In Proceedings of ASME WAM, pages 303–309, New York, 1994.

[14] S. Wermuth. Quantitative Investigations of Multi-Modal Telepresence Quality in Visual-Haptic Telepresence Systems. Internal Report, Institute of Automatic Control Engineering, Technical University of Munich, May 1998.

[15] C. Zilles and J. Salisbury. A Constraint-based God-object Method for Haptic Display. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems IROS, 1995.