COSC 426 lect. 4: AR Interaction


Lecture 4: AR Interaction

Mark Billinghurst (mark.billinghurst@hitlabnz.org)

Gun Lee (gun.lee@hitlabnz.org)

Aug 2011

COSC 426: Augmented Reality

Building Compelling AR Experiences

experiences

applications: Interaction

tools: Authoring

components: Tracking, Display

Sony CSL © 2004

AR Interaction

Designing an AR system = interface design: using different input and output technologies

The objective is a high-quality user experience: ease of use and learning, performance and satisfaction

Interaction Tasks

2D (from [Foley]): Selection, Text Entry, Quantify, Position

3D (from [Bowman]): Navigation (Travel/Wayfinding), Selection, Manipulation, System Control/Data Input

AR: 2D + 3D tasks, and... more specific tasks?

[Foley] Foley, J. D., V. Wallace & P. Chan. "The Human Factors of Computer Graphics Interaction Techniques." IEEE Computer Graphics and Applications (Nov.): 13-48, 1984.
[Bowman] Bowman, D., E. Kruijff, J. LaViola, I. Poupyrev. 3D User Interfaces: Theory and Practice. Addison-Wesley, 2005.

AR Interfaces as Data Browsers

2D/3D virtual objects are registered in 3D: "VR in the real world"

Interaction: 2D/3D virtual viewpoint control

Applications: visualization, training

AR Information Browsers

Information is registered to real-world context

Handheld AR displays

Interaction: manipulation of a window into information space

Applications: context-aware information displays

Rekimoto, et al. 1997

Architecture

Current AR Information Browsers

Mobile AR

GPS + compass

Many applications: Layar, Wikitude, Acrossair, PressLite, Yelp, AR Car Finder, ...

Junaio

AR browser from Metaio: http://www.junaio.com/

AR browsing: GPS + compass, 2D/3D object placement, photos/live video, community viewing

Advantages and Disadvantages

Important class of AR interfaces: wearable computers; AR simulation, training

Limited interactivity: modification of virtual content is difficult

Rekimoto, et al. 1997

3D AR Interfaces

Virtual objects displayed in 3D physical space and manipulated

HMDs and 6DOF head tracking; 6DOF hand trackers for input

Interaction: viewpoint control; traditional 3D user interface interaction: manipulation, selection, etc. (Kiyokawa, et al. 2000)

AR 3D Interaction

AR Graffiti

www.nextwall.net

Advantages and Disadvantages

Important class of AR interfaces: entertainment, design, training

Entertainment, design, training

Advantages:
- User can interact with 3D virtual objects everywhere in space
- Natural, familiar interaction

Disadvantages:
- Usually no tactile feedback
- User has to use different devices for virtual and physical objects

Oshima, et al. 2000

Augmented Surfaces and Tangible Interfaces

Basic principles:
- Virtual objects are projected on a surface
- Physical objects are used as controls for virtual objects
- Support for collaboration

Augmented Surfaces

Rekimoto, et al. 1998
Front projection, marker-based tracking, multiple projection surfaces

Tangible User Interfaces (Ishii 97)

Create digital shadows for physical objects

Foreground: graspable UI

Background: ambient interfaces

Tangible Interfaces - Ambient

Dangling String (Jeremijenko 1995): ambient Ethernet monitor; relies on peripheral cues

Ambient Fixtures (Dahley, Wisneski, Ishii 1998): use natural material qualities for information display

Tangible Interface: ARgroove

Collaborative instrument exploring physically based interaction

Map physical actions to MIDI output:
- Translation, rotation
- Tilt, shake

ARgroove in Use

Visual Feedback

Continuous visual feedback is key

A single virtual image provides: rotation, tilt, height

i/O Brush (Ryokai, Marti, Ishii)

Other Examples

Triangles (Gorbert 1998): triangle-based storytelling

ActiveCube (Kitamura 2000-): cubes with sensors

Lessons from Tangible Interfaces

Physical objects make us smart (Norman's "Things that Make Us Smart"): encode affordances, constraints

Objects aid collaboration: establish shared meaning

Objects increase understanding: serve as cognitive artifacts

TUI Limitations

Difficult to change object properties: can't tell the state of digital data

Limited display capabilities: projection screen = 2D; dependent on physical display surface

Separation between object and display (e.g., ARgroove)

Advantages and Disadvantages

Advantages:
- Natural: users' hands are used for interacting with both virtual and real objects
- No need for special-purpose input devices

Disadvantages:
- Spatial gap: interaction is limited to the 2D surface; full 3D interaction and manipulation is difficult
- Separation between interaction object and display

Orthogonal Nature of AR Interfaces

Back to the Real World

AR overcomes limitations of TUIs: enhance display possibilities, merge task/display space, provide public and private views

TUI + AR = Tangible AR: apply TUI methods to AR interface design

Space vs. Time - Multiplexed

Space-multiplexed:
- Many devices, each with one function
- Quicker to use, more intuitive, clutter
- e.g., a real toolbox

Time-multiplexed:
- One device with many functions
- Space efficient
- e.g., the mouse

Tangible AR: Tiles (Space Multiplexed)

Tiles semantics: data tiles, operation tiles

Operations on tiles: proximity, spatial arrangements

Space-multiplexed
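The proximity rule behind space-multiplexed tile operations can be sketched as follows. This is a hypothetical illustration, not code from the Tiles system; the tile roles and the reach threshold are invented for the example. An operation tile fires on a data tile when the two tracked tiles come within reach of each other.

```cpp
#include <cmath>
#include <string>

// Hypothetical sketch (names invented): each tile is tracked in space,
// and bringing an operation tile close to a data tile triggers its operation.
struct Tile {
    std::string role;  // "data", "copy", "delete", ...
    double x, y, z;    // tracked position (mm)
};

// True if the two tiles are closer than reachMM.
bool withinReach(const Tile& a, const Tile& b, double reachMM = 50.0) {
    double dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx*dx + dy*dy + dz*dz) < reachMM;
}

// Returns the operation to apply to the data tile, or "" if none fires.
std::string tileOperation(const Tile& opTile, const Tile& dataTile) {
    if (dataTile.role == "data" && opTile.role != "data" &&
        withinReach(opTile, dataTile))
        return opTile.role;
    return "";
}
```

The same pattern generalizes to the spatial-arrangement operations: instead of a single distance test, the trigger examines the relative positions of several tiles at once.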

Space-multiplexed Interface

Data authoring in Tiles

Proximity-based Interaction

Object-Based Interaction: MagicCup

Intuitive virtual object manipulation on a table-top workspace

Time-multiplexed
- Multiple markers: robust tracking
- Tangible user interface: intuitive manipulation
- Stereo display: good presence

MagicCup system

Main table, Menu table, Cup interface

Tangible AR: Time-multiplexed Interaction

Use of natural physical object manipulations to control virtual objects

VOMAR Demo

Catalog book:
- Turn over the page

Paddle operation:
- Push, shake, incline, hit, scoop

VOMAR Interface

Advantages and Disadvantages

Advantages:
- Natural interaction with virtual and physical tools
- No need for special-purpose input devices
- Spatial interaction with virtual objects: 3D manipulation with virtual objects anywhere in physical space

Disadvantages:
- Requires a head-mounted display

Wrap-up

Browsing interfaces: simple (conceptually!), unobtrusive

3D AR interfaces: expressive, creative, require attention

Tangible interfaces: embedded into conventional environments

Tangible AR: combines TUI input + AR display

Designing AR Applications

Interface Design Path

1/ Prototype demonstration

2/ Adoption of interaction techniques from other interface metaphors

3/ Development of new interface metaphors appropriate to the medium

4/ Development of formal theoretical models for predicting and modeling user actions

(Examples along this path: Desktop WIMP, Virtual Reality, Augmented Reality)

AR Design Space

Reality - Augmented Reality - Virtual Reality
Physical Design - Virtual Design

AR is a mixture of physical affordance and virtual affordance:
- Physical: tangible controllers and objects
- Virtual: virtual graphics and audio

AR Design Principles

Interface components:
- Physical components
- Display elements: visual/audio
- Interaction metaphors

(Diagram: Physical Elements -> [Input] -> Interaction Metaphor -> [Output] -> Display Elements)

Tangible AR Metaphor

AR overcomes limitations of TUIs: enhance display possibilities, merge task/display space, provide public and private views

TUI + AR = Tangible AR: apply TUI methods to AR interface design

Tangible AR Design Principles

Tangible AR interfaces use TUI principles:
- Physical controllers for moving virtual content
- Support for spatial 3D interaction techniques
- Support for multi-handed interaction
- Match object affordances to task requirements
- Support parallel activity with multiple objects
- Allow collaboration between multiple users

Case Study 1: 3D AR Lens

Goal: Develop a lens based AR interface

MagicLenses
Developed at Xerox PARC in 1993
View a region of the workspace differently to the rest
Overlap MagicLenses to create composite effects

3D MagicLenses

MagicLenses extended to 3D (Veiga et al. 96): volumetric and flat lenses

AR Lens Design Principles

Physical components: lens handle - virtual lens attached to a real object

Display elements: lens view - reveal layers in the dataset

Interaction metaphor: physically holding the lens

3D AR Lenses: Model Viewer
Displays models made up of multiple parts
Each part can be shown or hidden through the lens
Allows the user to peer inside the model
Maintains focus + context

AR Lens Demo

AR FlexiLens

Real handles/controllers with flexible AR lens

Techniques based on AR Lenses

Object selection: select objects by targeting them with the lens

Information filtering: show different representations through the lens; hide certain content to reduce clutter, look inside things
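Both techniques reduce to the same geometric test, which can be sketched as follows. This is a hypothetical illustration, not code from the AR Lens system: an object is selected (or a hidden layer revealed) when its position, projected into the lens plane, falls inside the lens's circular footprint.

```cpp
// Hypothetical sketch: 2D point-in-circle test on the lens plane.
struct Vec2 { double x, y; };

// True if the projected object position lies within the lens footprint.
bool insideLens(const Vec2& objectOnPlane, const Vec2& lensCenter, double lensRadius) {
    double dx = objectOnPlane.x - lensCenter.x;
    double dy = objectOnPlane.y - lensCenter.y;
    return dx*dx + dy*dy <= lensRadius * lensRadius;
}
```

For object selection the test drives a "selected" flag; for information filtering it decides which representation of the content to render.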

Case Study 2 : LevelHead

Block-based game

Case Study 2: LevelHead

Physical components: real blocks

Display elements: virtual person and rooms

Interaction metaphor: blocks are rooms

Case Study 3: AR Chemistry (Fjeld 2002)

Tangible AR chemistry education

Goal: an AR application to test molecular structure in chemistry

Physical components: real book, rotation cube, scoop, tracking markers

Display elements: AR atoms and molecules

Interaction metaphor: build your own molecule

AR Chemistry Input Devices

Case Study 4: Transitional Interfaces

Goal: an AR interface supporting transitions from reality to virtual reality

Physical components: real book

Display elements: AR and VR content

Interaction metaphor: book pages hold virtual scenes

Milgram’s Continuum (1994)

Mixed Reality (MR) spans the continuum: Reality - Augmented Reality (AR) - Augmented Virtuality (AV) - Virtuality

(Tangible Interfaces sit toward the Reality end; Virtual Reality toward the Virtuality end)

Central Hypothesis
The next generation of interfaces will support transitions along the Reality-Virtuality continuum

Transitions

Interfaces of the future will need to support transitions along the RV continuum

Augmented Reality is preferred for: co-located collaboration

Immersive Virtual Reality is preferred for: experiencing the world immersively (egocentric), sharing views, remote collaboration

The MagicBook

Design goals:
- Allow the user to move smoothly between reality and virtual reality
- Support collaboration

MagicBook Metaphor

Features

Seamless transition between Reality and Virtuality: reliance on the real decreases as the virtual increases

Supports egocentric and exocentric views: user can pick the appropriate view

Computer becomes invisible: consistent interface metaphors; virtual content seems real

Supports collaboration

Design alternatives for common user tasks in Tangible AR

User Tasks -> Interface Design

Viewpoint control: camera on HMD; fixed camera (top view, front view, mirroring); handheld camera

Selection: statically paired virtual and physical objects; dynamically paired virtual and physical objects (paddle, pointer)

3D manipulation: direct mapping of whole 6DOF; filtered/distorted mapping (snapping, non-linear mapping); multiplexed mapping (rotation from one, position from another); location- and pose-based (proximity, spatial configuration); gestures with props (tilt, shake)

Event & command: menus, buttons; keyboard & mouse; 2D/3D GUI with tracked objects as pointers; occlusion-based interaction; custom hardware devices

Design Tips for Tangible AR
- Use metaphors from the real world
- Take advantage of parallel interactions
- Use timers to prevent accidents
- Interaction volume: user, tracking, whitespace
- What happens when tracking gets lost or the object is out of view?
- Problems in visualization: field of view, occlusions

OSGART: From Registration to Interaction

Keyboard and Mouse Interaction

Traditional input techniques

OSG provides a framework for handling keyboard and mouse input events (osgGA):
1. Subclass osgGA::GUIEventHandler
2. Handle events:
   - Mouse up / down / move / drag / scroll-wheel
   - Key up / down
3. Add an instance of the new handler to the viewer

Keyboard and Mouse Interaction

Create your own event handler class:

class KeyboardMouseEventHandler : public osgGA::GUIEventHandler {
public:
    KeyboardMouseEventHandler() : osgGA::GUIEventHandler() { }

    virtual bool handle(const osgGA::GUIEventAdapter& ea,
                        osgGA::GUIActionAdapter& aa,
                        osg::Object* obj, osg::NodeVisitor* nv) {
        switch (ea.getEventType()) {
            // Possible events we can handle
            case osgGA::GUIEventAdapter::PUSH: break;
            case osgGA::GUIEventAdapter::RELEASE: break;
            case osgGA::GUIEventAdapter::MOVE: break;
            case osgGA::GUIEventAdapter::DRAG: break;
            case osgGA::GUIEventAdapter::SCROLL: break;
            case osgGA::GUIEventAdapter::KEYUP: break;
            case osgGA::GUIEventAdapter::KEYDOWN: break;
        }
        return false;
    }
};

Add it to the viewer to receive events:

viewer.addEventHandler(new KeyboardMouseEventHandler());

Keyboard Interaction

Handle W, A, S, D keys to move an object:

case osgGA::GUIEventAdapter::KEYDOWN: {
    switch (ea.getKey()) {
        case 'w': // Move forward 5mm
            localTransform->preMult(osg::Matrix::translate(0, -5, 0));
            return true;
        case 's': // Move back 5mm
            localTransform->preMult(osg::Matrix::translate(0, 5, 0));
            return true;
        case 'a': // Rotate 10 degrees left
            localTransform->preMult(osg::Matrix::rotate(
                osg::DegreesToRadians(10.0f), osg::Z_AXIS));
            return true;
        case 'd': // Rotate 10 degrees right
            localTransform->preMult(osg::Matrix::rotate(
                osg::DegreesToRadians(-10.0f), osg::Z_AXIS));
            return true;
        case ' ': // Reset the transformation
            localTransform->setMatrix(osg::Matrix::identity());
            return true;
    }
    break;
}

Setting up the transform node:

localTransform = new osg::MatrixTransform();
localTransform->addChild(osgDB::readNodeFile("media/car.ive"));
arTransform->addChild(localTransform.get());

Keyboard Interaction Demo

Mouse Interaction

The mouse is a pointing device... use the mouse to select objects in an AR scene

OSG provides methods for ray-casting and intersection testing; returns an osg::NodePath (the path from the hit node all the way back to the root)

(Diagram: ray from the projection plane (screen) into the scene)

Mouse Interaction

Compute the list of nodes under the clicked position; invoke an action on nodes that are hit, e.g. select, delete:

case osgGA::GUIEventAdapter::PUSH:

    osgViewer::View* view = dynamic_cast<osgViewer::View*>(&aa);
    osgUtil::LineSegmentIntersector::Intersections intersections;

    // Clear previous selections
    for (unsigned int i = 0; i < targets.size(); i++) {
        targets[i]->setSelected(false);
    }

    // Find new selection based on click position
    if (view && view->computeIntersections(ea.getX(), ea.getY(), intersections)) {
        for (osgUtil::LineSegmentIntersector::Intersections::iterator iter =
                 intersections.begin(); iter != intersections.end(); iter++) {
            if (Target* target = dynamic_cast<Target*>(iter->nodePath.back())) {
                std::cout << "HIT!" << std::endl;
                target->setSelected(true);
                return true;
            }
        }
    }

    break;

Mouse Interaction Demo

Proximity Techniques

Interaction based on:
- the distance between a marker and the camera
- the distance between multiple markers

Single Marker Techniques: Proximity

Use distance from camera to marker as an input parameter, e.g. lean in close to examine

Can use the osg::LOD class to show different content at different depth ranges (Image: OpenSG Consortium)

Single Marker Techniques: Proximity

// Load some models
osg::ref_ptr<osg::Node> farNode = osgDB::readNodeFile("media/far.osg");
osg::ref_ptr<osg::Node> closerNode = osgDB::readNodeFile("media/closer.osg");
osg::ref_ptr<osg::Node> nearNode = osgDB::readNodeFile("media/near.osg");

// Use a Level-Of-Detail node to show each model at different distance ranges.
osg::ref_ptr<osg::LOD> lod = new osg::LOD();
lod->addChild(farNode.get(), 500.0f, 10000.0f);  // Show the "far" node from 50cm to 10m away
lod->addChild(closerNode.get(), 200.0f, 500.0f); // Show the "closer" node from 20cm to 50cm away
lod->addChild(nearNode.get(), 0.0f, 200.0f);     // Show the "near" node from 0cm to 20cm away

arTransform->addChild(lod.get());

Define depth ranges for each node; add as many as you want; ranges can overlap

Single Marker Proximity Demo

Multiple Marker Concepts

Interaction based on the relationship between markers, e.g. when the distance between two markers decreases below a threshold, invoke an action (tangible user interface)

Applications: memory card games, file operations
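The switching rule behind this kind of interaction can be sketched in plain C++ (this is a minimal, hypothetical illustration, independent of the scene-graph version that follows; the child indices match the convention of a two-child switch node, "far" model at index 0 and "near" model at index 1):

```cpp
#include <cmath>

// Marker position in camera or world coordinates (mm).
struct Vec3 { double x, y, z; };

double distance(const Vec3& a, const Vec3& b) {
    double dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx*dx + dy*dy + dz*dz);
}

// Returns the child index each marker's switch node should enable:
// 1 ("near" model) when the markers are within the threshold, else 0.
int proximityChild(const Vec3& markerA, const Vec3& markerB, double threshold) {
    return distance(markerA, markerB) <= threshold ? 1 : 0;
}
```

In a memory card game, for example, index 1 could show the matched pair animation while index 0 shows the face-down card.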

Multiple Marker Proximity

(Scene graph: the virtual camera holds Transform A and Transform B, one per marker; each transform holds a switch node with two models: Switch A -> Model A1, Model A2 and Switch B -> Model B1, Model B2. While Distance > Threshold, each switch shows its first model, A1 and B1.)

Multiple Marker Proximity

(Same scene graph: when Distance <= Threshold, each switch changes to its second model, A2 and B2.)

Multiple Marker Proximity

Use a node callback to test for proximity and update the relevant nodes:

virtual void operator()(osg::Node* node, osg::NodeVisitor* nv) {

    if (mMarkerA != NULL && mMarkerB != NULL &&
        mSwitchA != NULL && mSwitchB != NULL) {

        if (mMarkerA->valid() && mMarkerB->valid()) {

            osg::Vec3 posA = mMarkerA->getTransform().getTrans();
            osg::Vec3 posB = mMarkerB->getTransform().getTrans();
            osg::Vec3 offset = posA - posB;
            float distance = offset.length();

            if (distance <= mThreshold) {
                if (mSwitchA->getNumChildren() > 1) mSwitchA->setSingleChildOn(1);
                if (mSwitchB->getNumChildren() > 1) mSwitchB->setSingleChildOn(1);
            } else {
                if (mSwitchA->getNumChildren() > 0) mSwitchA->setSingleChildOn(0);
                if (mSwitchB->getNumChildren() > 0) mSwitchB->setSingleChildOn(0);
            }
        }
    }

    traverse(node, nv);
}

Multiple Marker Proximity

Paddle Interaction

Use one marker as a tool for selecting and manipulating objects (tangible user interface)

Another marker provides a frame of reference

A grid of markers can alleviate problems with occlusion

MagicCup (Kato et al.), VOMAR (Kato et al.)

Paddle Interaction

Often useful to adopt a local coordinate system:
- Allows the camera to move without disrupting Tlocal
- Places the paddle in the same coordinate system as the content on the grid
- Simplifies interaction

osgART computes Tlocal using the osgART::LocalTransformationCallback

Tilt and Shake Interaction

Detect types of paddle movement:
- Tilt: gradual change in orientation
- Shake: short, sudden changes in translation
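These two definitions can be sketched as simple classifiers over a window of recent tracking samples. This is a hypothetical illustration, not the osgART implementation: the sample struct, window logic, and thresholds are invented for the example. A shake is several large frame-to-frame position jumps; a tilt is a sustained orientation change without such jumps.

```cpp
#include <cmath>
#include <vector>

// One tracked paddle pose per frame (position in mm, tilt in degrees).
struct PaddleSample {
    double x, y, z;
    double tiltDeg;
};

// Shake: at least minJumps frame-to-frame moves larger than jumpMM.
bool isShake(const std::vector<PaddleSample>& s,
             double jumpMM = 30.0, int minJumps = 3) {
    int jumps = 0;
    for (size_t i = 1; i < s.size(); ++i) {
        double dx = s[i].x - s[i-1].x;
        double dy = s[i].y - s[i-1].y;
        double dz = s[i].z - s[i-1].z;
        if (std::sqrt(dx*dx + dy*dy + dz*dz) > jumpMM) ++jumps;
    }
    return jumps >= minJumps;
}

// Tilt: orientation changed past a threshold over the window,
// with no shake-like position jumps.
bool isTilt(const std::vector<PaddleSample>& s, double minDeltaDeg = 20.0) {
    if (s.size() < 2) return false;
    return std::fabs(s.back().tiltDeg - s.front().tiltDeg) > minDeltaDeg &&
           !isShake(s);
}
```

In practice the window length and thresholds would be tuned to the tracker's frame rate and noise level.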

More Information
- Mark Billinghurst: mark.billinghurst@hitlabnz.org
- Gun Lee: gun.lee@hitlabnz.org
- Websites: www.hitlabnz.org

Building Tangible AR Interfaces with ARToolKit

Required Code
- Calculating camera position: range to marker
- Loading multiple patterns/models
- Interaction between objects: proximity, relative position/orientation
- Occlusion: stencil buffering, multi-marker tracking

Tangible AR Coordinate Frames

Local vs. Global Interactions

Local:
- Actions determined from a single camera-to-marker transform
- shaking, appearance, relative position, range

Global:
- Actions determined from two relationships: marker-to-camera and world-to-camera coords
- Marker transform determined in world coordinates
- object tilt, absolute position, absolute rotation, hitting

Range-based Interaction
Sample File: RangeTest.c

/* get the camera transformation */
arGetTransMat(&marker_info[k], marker_center,
              marker_width, marker_trans);

/* find the range */
Xpos = marker_trans[0][3];
Ypos = marker_trans[1][3];
Zpos = marker_trans[2][3];
range = sqrt(Xpos*Xpos + Ypos*Ypos + Zpos*Zpos);

Loading Multiple Patterns
Sample File: LoadMulti.c
Uses object.c to load

Object structure:

typedef struct {
    char   name[256];
    int    id;
    int    visible;
    double marker_coord[4][2];
    double trans[3][4];
    double marker_width;
    double marker_center[2];
} ObjectData_T;

Finding Multiple Transforms

Create the object list:

ObjectData_T *object;

Read in objects - in init():

read_ObjData( char *name, int *objectnum );

Find transforms - in mainLoop():

for( i = 0; i < objectnum; i++ ) {
    /* check patterns */
    /* find transforms for each marker */
}

Drawing Multiple Objects

Send the object list to the draw function:

draw( object, objectnum );

Draw each object individually:

for( i = 0; i < objectnum; i++ ) {
    if( object[i].visible == 0 ) continue;
    argConvGlpara(object[i].trans, gl_para);
    draw_object( object[i].id, gl_para);
}

Proximity Based Interaction

Sample File – CollideTest.c
Detect the distance between markers:
checkCollisions(object[0], object[1], DIST)
If distance < collide distance, then change the model / perform the interaction
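The distance test behind a checkCollisions-style function can be sketched as follows. This is a hypothetical illustration, not the code from CollideTest.c: it relies only on the fact, used in RangeTest.c above, that the last column of an ARToolKit 3x4 marker transform holds the translation.

```cpp
#include <cmath>

// Hypothetical sketch: two markers "collide" when the distance between
// the translation columns of their 3x4 pose matrices drops below threshold.
bool markersCollide(const double transA[3][4], const double transB[3][4],
                    double threshold) {
    double dx = transA[0][3] - transB[0][3];
    double dy = transA[1][3] - transB[1][3];
    double dz = transA[2][3] - transB[2][3];
    return std::sqrt(dx*dx + dy*dy + dz*dz) < threshold;
}
```

The threshold (DIST in the sample) is in the same units as the marker transforms, i.e. millimeters.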

Multi-marker Tracking

Sample File – multiTest.c
Multiple markers establish a single coordinate frame:
- Reading in a configuration file
- Tracking from sets of markers
- Careful camera calibration

MultiMarker Configuration File
Sample File - Data/multi/marker.dat

Contains a list of all the patterns and their exact positions:

#the number of patterns to be recognized
6

#marker 1
Data/multi/patt.a              <- pattern file
40.0                           <- pattern width
0.0 0.0                        <- coordinate origin
1.0000 0.0000 0.0000 -100.0000 <- pattern transform relative
0.0000 1.0000 0.0000   50.0000    to the global origin
0.0000 0.0000 1.0000    0.0000
...

Camera Transform Calculation
Include <AR/arMulti.h>; link to libARMulti.lib

In mainLoop(), detect markers as usual:

arDetectMarkerLite(dataPtr, thresh, &marker_info, &marker_num)

Then use the MultiMarker function:

if( (err = arMultiGetTransMat(marker_info,
                              marker_num, config)) < 0 ) {
    argSwapBuffers();
    return;
}

Paddle-based Interaction

Tracking a single marker relative to the multi-marker set
- the paddle contains a single marker

Paddle Interaction Code
Sample File – PaddleDemo.c

Get the paddle marker location + draw the paddle before drawing the background model:

paddleGetTrans(paddleInfo, marker_info,
               marker_flag, marker_num, &cparam);

/* draw the paddle */
if( paddleInfo->active ){
    draw_paddle( paddleInfo);
}

draw_paddle uses a stencil buffer to increase realism

Paddle Interaction Code II

Sample File – paddleDrawDemo.c

Finds the paddle position relative to the global coordinate frame:
setBlobTrans(Num, paddle_trans[3][4], base_trans[3][4])

Sample File – paddleTouch.c
Finds the paddle position:
findPaddlePos(&curPadPos, paddleInfo->trans, config->trans);

Checks for collisions:
checkCollision(&curPaddlePos, myTarget[i].pos, 20.0)

General Tangible AR Library
command_sub.c, command_sub.h

Contains functions for recognizing a range of different paddle motions:

int check_shake( );
int check_punch( );
int check_incline( );
int check_pickup( );
int check_push( );

E.g., to check the angle between paddle and base:
check_incline(paddle->trans, base->trans, &ang)
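The geometry behind an incline check like this can be sketched as follows. This is a hypothetical illustration, not the code from command_sub.c: it assumes only that, in a 3x4 marker transform, the third column is the marker's local z-axis (its surface normal), so the incline angle is the angle between the paddle's and the base's z-columns.

```cpp
#include <cmath>

// Hypothetical sketch: angle (degrees) between the z-axis columns of
// two 3x4 pose matrices, i.e. between the two marker normals.
double inclineAngleDeg(const double paddle[3][4], const double base[3][4]) {
    const double PI = std::acos(-1.0);
    double dot = 0.0, lenP = 0.0, lenB = 0.0;
    for (int r = 0; r < 3; ++r) {
        dot  += paddle[r][2] * base[r][2];   // dot product of z columns
        lenP += paddle[r][2] * paddle[r][2];
        lenB += base[r][2] * base[r][2];
    }
    return std::acos(dot / std::sqrt(lenP * lenB)) * 180.0 / PI;
}
```

A paddle-tipping gesture would then fire when this angle exceeds some threshold, e.g. 45 degrees.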
