COSC 426 Lect. 8: AR Research Directions


A lecture on research directions in Augmented Reality as part of the COSC 426 class on AR. Taught by Mark Billinghurst from the HIT Lab NZ at the University of Canterbury.


Lecture 8: Research Directions
Mark Billinghurst
HIT Lab NZ, University of Canterbury

Looking to the Future

The Future is with us
It takes at least 20 years for new technologies to go from the lab to the lounge.

“The technologies that will significantly affect our lives over the next 10 years have been around for a decade. The future is with us. The trick is learning how to spot it. The commercialization of research, in other words, is far more about prospecting than alchemy.”

Bill Buxton, Oct 11th 2004

Research Directions

- Experiences: Usability
- Applications: Interaction
- Tools: Authoring
- Components: Tracking, Display

Sony CSL © 2004

Research Directions
- Components: markerless tracking, hybrid tracking; displays, input devices
- Tools: authoring tools, user-generated content
- Applications: interaction techniques/metaphors
- Experiences: user evaluation, novel AR/MR experiences

HMD Design

Occlusion with See-through HMDs
The problem:
- Occluding real objects with virtual imagery
- Occluding virtual objects with real imagery

(Figure: real scene vs. current see-through HMD)

ELMO (Kiyokawa 2001)

- Occlusive see-through HMD
- Masking LCD
- Real-time range finding

ELMO Demo

ELMO Design
(Figure: virtual images from an LCD pass through an optical combiner; an LCD mask with depth sensing sits between the combiner and the real world)
- Use the LCD mask to block the real world
- Use depth sensing to occlude virtual images
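The per-pixel logic behind this design can be sketched as follows. This is an illustrative sketch, not code from the ELMO system: the function name and data layout are assumptions, and a real implementation would run on depth maps from the range finder and the renderer's depth buffer.

```python
def occlusion_mask(real_depth, virtual_depth):
    """Sketch of ELMO-style per-pixel occlusion.

    real_depth:    2D list of distances (metres) from the range finder
    virtual_depth: 2D list of distances for rendered virtual pixels,
                   float("inf") where nothing virtual is drawn
    Returns a 2D boolean mask: True where the virtual pixel is nearer
    than the real surface, so the LCD blocks incoming real light there
    and the virtual pixel is displayed; False where the virtual pixel
    is hidden behind a real object.
    """
    return [
        [v < r for r, v in zip(real_row, virt_row)]
        for real_row, virt_row in zip(real_depth, virtual_depth)
    ]

# A virtual object at 2 m is shown in front of a wall at 3 m (True),
# but occluded by a real object at 1 m (False).
real = [[1.0, 3.0]]
virt = [[2.0, 2.0]]
print(occlusion_mask(real, virt))  # [[False, True]]
```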

ELMO Results

Tools

BuildAR
http://www.hitlabnz.org/wiki/BuildAR
- Stand-alone application
- Visual interface for an AR model viewing application
- Enables non-programmers to build AR scenes

Mobile BuildAR
- Ideal authoring tool
- Develop on PC, deploy on handheld

(Diagram: a desktop PC authoring tool exports the AR scene as BuildAR XML; the AR Player on the mobile phone loads it, built on Edgelib, stbES, Symbian/WM, and Python)
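The "author on PC, play on phone" split hinges on the XML scene file being the interchange format. As a hedged sketch of that idea, the snippet below parses a hypothetical scene description into a marker-to-model mapping; the element and attribute names are illustrative, not the real BuildAR schema.

```python
import xml.etree.ElementTree as ET

# Hypothetical scene file as the desktop tool might export it.
SCENE_XML = """
<scene>
  <object marker="hiro" model="chair.obj" scale="0.5"/>
  <object marker="kanji" model="lamp.obj" scale="1.0"/>
</scene>
"""

def load_scene(xml_text):
    """Parse the scene XML into {marker_id: {model, scale}} for the player."""
    root = ET.fromstring(xml_text)
    return {
        obj.get("marker"): {
            "model": obj.get("model"),
            "scale": float(obj.get("scale", "1.0")),
        }
        for obj in root.findall("object")
    }

scene = load_scene(SCENE_XML)
print(scene["hiro"]["model"])  # chair.obj
```

A mobile player would then look up each detected marker in this mapping and render the associated model at the given scale.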

Applications

Future Directions

Interaction Techniques
- Input techniques: 3D vs. 2D input; pen/buttons/gestures
- Collaboration techniques: simultaneous access to AR content
- User studies…

Flexible Displays
- Flexible lens surface
- Bimanual interaction
- Digital paper analogy

(Red Planet, 2000)

Sony CSL © 2004

Tangible User Interfaces (TUIs)
GUMMI bendable display prototype (reproduced by permission of Sony CSL)


Lucid Touch
Microsoft Research & Mitsubishi Electric Research Labs
Wigdor, D., Forlines, C., Baudisch, P., Barnwell, J., Shen, C. LucidTouch: A See-Through Mobile Device. In Proceedings of UIST 2007, Newport, Rhode Island, October 7-10, 2007, pp. 269–278.

Auditory Modalities
- Auditory icons
- Earcons
- Speech synthesis/recognition

Nomadic Radio (Sawhney)
- Combines spatialized audio
- Auditory cues
- Speech synthesis/recognition

Gestural Interfaces
1. Micro-gestures (Unistroke, SmartPad)
2. Device-based gestures (tilt-based examples)
3. Embodied interaction (EyeToy)

Haptic Modalities
Haptic interfaces
- Simple uses in mobiles (vibration instead of ringtone)
- Sony's TouchEngine: physiological experiments show you can perceive two stimuli 5 ms apart, with displacements as small as 0.2 microns

(Figure: TouchEngine actuator layer structure, n layers, 28 μm and 4 μm)

Haptic Input

AR Haptic Workbench (CSIRO 2003, Adcock et al.)

AR Haptic Interface
- PHANToM, ARToolKit, Magellan

Multimodal Input
Combining speech and gesture builds on the strengths of each:
- Speech: mode selection, group selection
- Gesture: direct manipulation

Key problem: command disambiguation
- “Move that chair” – which chair?

Use statistical methods for disambiguation:
- Speech and gesture recognition provide multiple possibilities; need to look for the most probable
- SenseShapes detect the object of interest (Olwal 2003)
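The statistical-disambiguation step above can be sketched in a few lines. This is a minimal illustration of the idea, not the SenseShapes algorithm: each recogniser returns candidate targets with confidences, and the joint probability picks the winner. All names and probability values here are made up.

```python
def disambiguate(speech_hyps, gesture_hyps):
    """Pick the most probable joint target.

    speech_hyps, gesture_hyps: dicts mapping object id -> confidence.
    Multiplies the independent confidences per candidate and returns
    the object with the highest joint score.
    """
    joint = {
        obj: p * gesture_hyps.get(obj, 0.0)
        for obj, p in speech_hyps.items()
    }
    return max(joint, key=joint.get)

# "Move that chair" matches two chairs equally well; the pointing
# gesture favours chair_2, which wins the joint score.
speech = {"chair_1": 0.5, "chair_2": 0.5}
gesture = {"chair_1": 0.2, "chair_2": 0.7, "table": 0.1}
print(disambiguate(speech, gesture))  # chair_2
```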

Experiences

Experiences
- Crossing boundaries
- Ubiquitous VR/AR
- Massive AR
- AR + social networking
- Usability

Crossing Boundaries

Jun Rekimoto, Sony CSL

Invisible Interfaces

Jun Rekimoto, Sony CSL

Milgram's Reality-Virtuality (RV) Continuum

Real Environment → Augmented Reality (AR) → Augmented Virtuality (AV) → Virtual Environment
(everything between the two extremes is Mixed Reality)

The MagicBook
Moves across the whole continuum: Reality → Augmented Reality (AR) → Augmented Virtuality (AV) → Virtuality


Example: Visualizing Sensor Networks
Rauhala et al. 2007 (Linköping)
- Network of humidity sensors
- ZigBee wireless communication
- Use mobile AR to visualize humidity

Example: Sensor Input for AR Interaction
- UbiComp sensor: light, temperature, motion, sound; RF connection
- AR software plug-in: sensor input interacting with AR applications
- Hardware: uPart sensor, USB bridge, Particle (http://particle.teco.edu), 16-hour idle battery life
- AR responds to changes in light levels
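The plug-in idea above, where a sensor reading drives the AR scene, reduces to a small callback. This sketch is purely illustrative: the function name, threshold, and scene representation are assumptions, not the uPart/Particle API.

```python
LIGHT_THRESHOLD = 0.3  # assumed normalised light level (0 = dark, 1 = bright)

def on_sensor_update(light_level, scene):
    """Toggle a virtual lamp model when the room darkens.

    light_level: latest reading from the light sensor
    scene:       mutable dict standing in for AR application state
    """
    scene["lamp_on"] = light_level < LIGHT_THRESHOLD
    return scene

scene = {"lamp_on": False}
scene = on_sensor_update(0.1, scene)  # room goes dark
print(scene["lamp_on"])  # True
```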


UbiVR – CAMAR
- CAMAR Controller
- CAMAR Viewer
- CAMAR Companion
(GIST, Korea)

ubiHome @ GIST
(Diagram: media services, light service, and MR window in a smart home; ubiTrack senses where/when; Tag-it, couch sensor, door sensor, and a PDA with ubiKey supply who/what/when/how context) ©ubiHome

CAMAR – GIST
(CAMAR: Context-Aware Mobile Augmented Reality)

UCAM: Architecture
(Diagram: in wear-UCAM, wearable sensors (BioSensor, IR receiver) feed a sensor service (Integrator, Manager, Interpreter, ServiceProvider) through a context interface; UserProfileManager and UserConditionalContext services connect over BAN/PAN (Bluetooth) and TCP/IP (discovery, control, event) to ubi-UCAM services in the home, including media services, light service, ubiTV, MR window, and ubiTrack, plus vr-UCAM; sensors such as the couch sensor, Tag-it, and door sensor supply who/what/when/where/how context)

Ubiquitous VR/AR
(Diagram: two axes, Weiser's terminal-to-ubiquitous and Milgram's reality-to-virtual-reality, positioning Desktop AR, Mobile AR, UbiComp, Ubi AR, Ubi VR, and VR)
From: Joe Newman

Future Directions

Massive Multiuser
Handheld AR for the first time allows extremely high numbers of AR users.
Requires:
- New types of applications/games
- New infrastructure (server/client/peer-to-peer)
- Content distribution
- …

(Diagram: axes from single user to massive multiuser and from terminal to ubiquitous, spanning Reality to VR)

Massive Multiuser
2D applications
- MSN: 29 million
- Skype: 10 million
- Facebook: up to 70 million

3D/VR applications
- Second Life: > 50K
- Stereo projection: < 500

Augmented Reality
- Shared Space (1999): 4
- Invisible Train (2004): 8

(Figure: basic view vs. personal view)

Augmented Reality 2.0 Infrastructure
Leveraging Web 2.0
- Content retrieval using HTTP
- XML-encoded meta information
- KML placemarks + extensions
- Queries based on location (from GPS, image recognition) or situation (barcode markers)
- Queries also deliver tracking feature databases
- Everybody can set up an AR 2.0 server

Syndication:
- Community servers for end-user content
- Tagging
- AR client subscribes to an arbitrary number of feeds
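A minimal sketch of the client side of this pipeline: query a server for KML placemarks near a GPS fix and turn them into AR annotations. The server, its query parameters, and the sample data are hypothetical, so a canned KML response stands in for the live HTTP request; only the KML 2.2 namespace and element names are real.

```python
import xml.etree.ElementTree as ET

KML_NS = "{http://www.opengis.net/kml/2.2}"

# Canned stand-in for what an AR 2.0 server might return over HTTP.
SAMPLE_RESPONSE = """<?xml version="1.0"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Placemark>
    <name>HIT Lab NZ</name>
    <Point><coordinates>172.583,-43.522,0</coordinates></Point>
  </Placemark>
</kml>"""

def parse_placemarks(kml_text):
    """Extract {name, lat, lon} annotations from a KML document."""
    root = ET.fromstring(kml_text)
    marks = []
    for pm in root.iter(KML_NS + "Placemark"):
        name = pm.find(KML_NS + "name").text
        coords = pm.find(KML_NS + "Point/" + KML_NS + "coordinates").text
        lon, lat, alt = (float(v) for v in coords.split(","))
        marks.append({"name": name, "lat": lat, "lon": lon})
    return marks

marks = parse_placemarks(SAMPLE_RESPONSE)
print(marks[0]["name"])  # HIT Lab NZ
```

Note that KML orders coordinates longitude-first, which is easy to get backwards when placing annotations.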

Content creation and delivery
- Content creation pipeline
- Delivering previously unknown content
- Streaming of data (objects, multimedia) and applications

Distribution:
- How do users learn about all that content?
- How do they access it?

Twitter 360

http://www.twitter-360.com
- Uses AR to geo-locate tweets around you
- Better than Google Maps?

Scaling Up

AR on a city scale
- Using mobile phones as ubiquitous sensors
- MIT SENSEable City Lab: http://senseable.mit.edu/
- WikiCity Rome (SENSEable City Lab, MIT)

More Information
- Mark Billinghurst: mark.billinghurst@hitlabnz.org
- Websites:
  - http://www.hitlabnz.org/
  - http://artoolkit.sourceforge.net/
  - http://www.osgart.org/
  - http://www.hitlabnz.org/wiki/buildAR/
