Siggraph 2014: The Glass Class - Designing Wearable Interfaces


DESCRIPTION

A course on designing wearable interfaces taught by Mark Billinghurst at SIGGRAPH 2014, presented on August 10th, 2014, from 10:45am to 12:15pm. The course focuses mainly on design guidelines and rapid prototyping tools for Google Glass.


The Glass Class: Designing Wearable Interfaces Mark Billinghurst The HIT Lab NZ, University of Canterbury

The 41st International Conference and Exhibition on Computer Graphics and Interactive Techniques

INTRODUCTION

Mark Billinghurst ▪  Director of The HIT Lab NZ, University of Canterbury

▪  PhD Univ. Washington

▪  Research on AR, mobile HCI, Collaborative Interfaces, Wearables

▪  Joined Glass team at Google [x] in 2013

How do you Design for this?

Course Goals In this course you will learn

▪  Introduction to head mounted wearable computers

▪  Understanding of current wearable technology

▪  Key design principles/interface metaphors

▪  Rapid prototyping tools

▪  Areas for future research

What You Won’t Learn ▪  Who the companies/universities in this space are ▪  See the SIGGRAPH exhibit floor

▪  Designing for non-HMD based interfaces ▪  Watches, fitness bands, etc

▪  How to develop wearable hardware ▪  optics, sensor assembly, etc

▪  Evaluation methods ▪  Experimental design, statistics, etc

Schedule
•  10:45 am Introduction
•  10:55 am Technology Overview
•  11:05 am Design Guidelines
•  11:25 am Prototyping Tools
•  11:55 am Example Applications
•  12:05 pm Research Directions/Resources

A Brief History of Computing

Trend ▪  Smaller, cheaper, faster, more intimate ▪  Moving from fixed to handheld and onto the body

[Timeline images: computing in the 1950s, 1980s, and 1990s]

Wearable Computing ▪  Computer on the body that is: ▪  Always on ▪  Always accessible ▪  Always connected

▪  Other attributes ▪  Augmenting user actions ▪  Aware of user and surroundings

[Progression: Desk → Lap → Hand → Head]

The Ideal Wearable
▪  Persists and Provides Constant Access: Designed for everyday and continuous use over a lifetime.
▪  Senses and Models Context: Observes and models the user's environment, mental state, and its own state.
▪  Augments and Mediates: Information support for the user in both the physical and virtual realities.
▪  Interacts Seamlessly: Adapts its input and output modalities to those most appropriate at the time.

Starner, T. E. (1999). Wearable computing and contextual awareness (Doctoral dissertation, Massachusetts Institute of Technology).

History of Wearables ▪  1960-90: Early Exploration ▪  Gamblers and custom-built devices

▪  1990 - 2000: Academic, Military Research ▪  MIT, CMU, Georgia Tech, EPFL, etc ▪  1997: ISWC conference starts

▪  1995 – 2005+: First Commercial Uses ▪  Niche industry applications, Military

▪  2010 - : Second Wave of Wearables

Origins - The Gamblers

•  Thorp and Shannon (1961) –  Wearable timing device for roulette prediction

•  Keith Taft (1972) –  Wearable computer for blackjack card counting

[Photos: belt computer, shoe input, glasses display]

Steve Mann (1980s - )

http://wearcomp.org/

Thad Starner (1993 - )

MIT Wearable Computing (1993-)

http://www.media.mit.edu/wearables/

CMU Wearables (1991–2000) ▪  Industry focused wearables ▪  Maintenance, repair

▪  Custom designed interface ▪  Dial/button input

▪  Rapid prototyping approach ▪  Industrial designed, ergonomic

http://www.cs.cmu.edu/afs/cs/project/vuman/www/frontpage.html

Prototype Applications ▪  Remembrance Agent ▪  Rhodes (97)

▪  Augmented Reality ▪  Feiner (97), Thomas (98)

▪  Remote Collaboration ▪  Garner (97), Kraut (96)

■  Maintenance ■  Feiner (93), Caudell (92)

Mobile AR: Touring Machine (1997) ▪  Columbia University ▪  Feiner, MacIntyre, Höllerer, Webster

▪  Combined ▪  See through head mounted display ▪  GPS tracking, Orientation sensor ▪  Backpack PC (custom) ▪  Tablet input

Feiner, S., MacIntyre, B., Höllerer, T., & Webster, A. (1997). A touring machine: Prototyping 3D mobile augmented reality systems for exploring the urban environment. Personal Technologies, 1(4), 208-217.

Touring Machine View

▪  Virtual tags overlaid on the real world ▪  “Information in place”

Early Commercial Systems ▪  Xybernaut (1996 - 2007) ▪  Belt worn, HMD, 200 MHz

▪  ViA (1996 – 2001) ▪  Belt worn, Audio Interface ▪  700 MHz Crusoe

■  Symbol (1998 – 2006) ■  Wrist worn computer ■  Finger scanner

Google Glass (2011 - )

The Second Wave of Wearables ▪  Vuzix M-100 ▪  $999, professional

▪  Recon Jet ▪  $600, more sensors, sports

▪  Optinvent ▪  500 Euro, multi-view mode

▪  Motorola Golden-i ▪  Rugged, remote assistance

Projected Market

Summary
▪  Wearables are a new class of computing: intimate, persistent, aware, accessible
▪  Evolution over a 50 year history: from backpack to head worn, from custom developed to consumer ready devices
▪  Enables new applications: collaboration, memory, AR, industry, etc
▪  Many head worn wearables are coming

TECHNOLOGY


Enabling Technologies (1989-) ▪  Private Eye Display (Reflection Technologies) ▪  720 x 280 display ▪  Vibrating mirror

▪  Twiddler (Handykey) ▪  Chording keypad ▪  Mouse emulation

Tin Lizzy (Platt, Starner, 1993) ▪  General Purpose Wearable ▪  150 MHz Pentium CPU ▪  32-64 Mb RAM, 6 GB HDD ▪  VGA display ▪  2 PCMCIA slots ▪  Cellular modem

http://www.media.mit.edu/wearables/lizzy/lizzy/index.html


Google Glass Specs

▪  Hardware ▪  CPU TI OMAP 4430 – 1 GHz ▪  16 GB SanDisk Flash, 2 GB RAM

▪  Input ▪  5 MP camera, 720p recording, microphone ▪  InvenSense MPU-9150 inertial sensor

▪  Output ▪  Bone conducting speaker ▪  640x360 micro-projector display

Glass Display

View Through Google Glass

Always available peripheral information display, combining computing, communications and content capture

Google Glass Demo

Google Glass User Interface


Timeline Metaphor

User Experience •  Truly Wearable Computing

–  Less than 46 grams

•  Hands-free Information Access –  Voice interaction, Ego-vision camera

•  Intuitive User Interface –  Touch, Gesture, Speech, Head Motion

•  Access to all Google Services –  Map, Search, Location, Messaging, Email, etc

Types of Head Mounted Displays: Occluded, See-through, Multiplexed

Multiplexed Displays ▪  Above or below line of sight ▪  Strengths ▪  User has unobstructed view of real world ▪  Simple optics/cheap

▪  Weaknesses ▪  Direct information overlay difficult ▪  Display/camera offset from eyeline

▪  Wide FOV difficult

Vuzix M-100

▪  Monocular multiplexed display ($1000) ▪  852 x 480 LCD display, 15 deg. FOV ▪  5 MP camera, HD video ▪  GPS, gyro, accelerometer

Optical see-through HMD

[Diagram: virtual images from monitors are combined with the direct view of the real world by optical combiners]

Epson Moverio BT-200

▪  Stereo see-through display ($700) ▪  960 x 540 pixels, 23 degree FOV, 60Hz, 88g ▪  Android Powered, separate controller ▪  VGA camera, GPS, gyro, accelerometer

Strengths of optical see-through ▪  Simpler (cheaper) ▪  Direct view of real world ▪  Full resolution, no time delay (for real world) ▪  Safety ▪  Lower distortion

▪  No eye displacement ▪  see directly through display

Video see-through HMD

[Diagram: video cameras capture the real world; the video is merged with graphics in a combiner and shown on monitors]

Vuzix Wrap 1200DXAR

▪  Stereo video see-through display ($1500) ■ Twin 852 x 480 LCD displays, 35 deg. FOV ■ Stereo VGA cameras ■ 3 DOF head tracking

Strengths of Video See-Through ▪  True occlusion ▪  Block image of real world

▪  Digitized image of real world ▪  Flexibility in composition, match time delays ▪  More registration, calibration strategies

▪  Wide FOV is easier to support ▪  wide FOV camera

Input Options ▪  Physical Devices ▪  Keyboard, Pointer, Stylus

▪  Natural Input ▪  Speech, Gesture

▪  Other ▪  Physiological sensors

Twiddler Input

▪  Chording or multi-tap input ▪  Possible to achieve 40 - 60 wpm after 30+ hours ▪  cf 20 wpm on T9, or 60+ wpm for QWERTY

Lyons, K., Starner, T., Plaisted, D., Fusia, J., Lyons, A., Drew, A., & Looney, E. W. (2004). Twiddler typing: One-handed chording text entry for mobile phones. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 671-678). ACM.

Virtual Keyboards

▪  In air text input ▪  Virtual QWERTY keyboard up to 20 wpm ▪  Word Gesture up to 28 wpm

▪  Handwriting around 20-30 wpm

Markussen, A., et al. (2014). Vulture: A Mid-Air Word-Gesture Keyboard. In Proceedings of CHI 2014.

Unobtrusive Input Devices

▪  GestureWrist ▪  Capacitive sensing, changes with hand shape

Rekimoto, J. (2001). Gesturewrist and gesturepad: Unobtrusive wearable interaction devices. In Wearable Computers, 2001. Proceedings. Fifth International Symposium on (pp. 21-27). IEEE.

Unobtrusive Input Devices

▪  GesturePad ▪  Capacitive multilayered touchpads ▪  Supports interactive clothing

Skinput

Using EMG to detect muscle activity

Tan, D., Morris, D., & Saponas, T. S. (2010). Interfaces on the go. XRDS: Crossroads, The ACM Magazine for Students, 16(4), 30-34.

Issues to Consider ▪  Fatigue ▪  “Gorilla arm” from free-hand input

▪  Comfort ▪  People prefer to make small gestures near the waist

▪  Interaction on the go ▪  Can input be done while moving?

DESIGN GUIDELINES

INTERACTION DESIGN

Design For the Device

•  Simple, relevant information •  Complement existing devices

[Design space: Last year, Last week, Now, Forever]

The Now machine: Focus on location, contextual and timely information, and communication.

Don’t design an app

Glass OS uses a time-based model, not an app model.

The world is the experience

Get the interface and interactions out of the way.

It's like a rear view mirror

Don't overload the user. Stick to the absolutely essential; avoid long interactions.

Be explicit.

Micro Interactions

The position of the display and limited input ability make longer interactions less comfortable.

Using it shouldn't take longer than taking out your phone.

Micro-Interactions

On mobile devices people split their attention between the display and the real world

[Chart: time spent looking at the screen]

Oulasvirta, A. (2005). The fragmentation of attention in mobile interaction, and what to do with it. interactions, 12(6), 16-18.

Design for Micro-Interactions ▪  Design interactions that last less than a few seconds

–  Tiny bursts of interaction –  One task per interaction –  One input per interaction

▪  Benefits –  Use limited input –  Minimize interruptions –  Reduce attention fragmentation

Make it Glanceable

•  Seek to rigorously reduce information density. •  Design for recognition, not reading.

[Examples: bad vs. good glanceable design]

Reduce the Number of Info Chunks

•  You are designing for recognition, not reading. •  Reducing the total number of information chunks will greatly increase the glanceability of your design.

[Annotated example screens: one design with information chunks 1-3, another with chunks 1-5 (6)]

Design single interactions < 4 s

Eye movements (design with 5-6 chunks): For 1: 1 fixation, 230ms. For 2: 1, 230ms. For 3: 1, 230ms. For 4: 3, 690ms. For 5: 2, 460ms. Total ~1,840ms.

Eye movements (design with 3 chunks): For 1: 1-2 fixations, 460ms. For 2: 1, 230ms. For 3: 1, 230ms. Total ~920ms.


Test the glanceability of your design

Don’t Get in the Way

•  Enhance, not replace, real world interaction

Design for Interruptions

▪  Gradually increase engagement and attention load ▪  Respond to user engagement

Receiving SMS on Glass

[Interaction flow: Glass plays a “Bing” notification → the user looks up → tap to show the message → swipe to start a reply → say the reply]

Do one thing at a time

Keep it Relevant

•  Information at the right time and place

Design for Context

Avoid the Unexpected

•  Don’t send unexpected content at wrong times •  Make it clear to users what your application does

Build for People

•  Use imagery, voice interaction, natural gestures •  Focus on a fire-and-forget interaction model

VISUAL DESIGN

Transparent displays are tricky

Colors are funny and inconsistent. You can only add light to a scene, not cover anything up.

Motion can be disorienting. Clarity, contrast, brightness, visual field and attention are important.

White is your new black

Establish hierarchy with color

White is your <h1> and grey is your <h2> or <h3>. Footer text - establishing time, attribution, or distance - is the only place with smaller font size.

Use brand-specific typography

Test your design indoors + outdoors

EXAMPLE APPLICATIONS

•  https://glass.google.com/glassware

Glassware Applications

Virtual Exercise Companion

•  GlassFitGames –  http://www.glassfitgames.com

Vipaar Telemedicine

•  Vipaar + UAB - http://www.vipaar.com •  Endoscopic view streamed remotely •  Remote expert adds hands – viewed in Glass

CityViewAR

•  Using AR to visualize Christchurch city buildings – 3D models of buildings, 2D images, text, panoramas – AR View, Map view, List view – Available on Android/iOS market

CityViewAR on Glass

•  AR overlay of virtual buildings in Christchurch

PROTOTYPING TOOLS

How can we quickly prototype wearable experiences with little or no coding?

Why Prototype? ▪  Quick visual design ▪  Capture key interactions ▪  Focus on user experience ▪  Communicate design ideas ▪  “Learn by doing/experiencing”

Prototyping Tools ▪  Static/Low fidelity ▪  Sketching ▪  User interface templates ▪  Storyboards/Application flows ▪  Screen sharing

▪  Interactive/High fidelity ▪  Wireframing tools ▪  Mobile prototyping ▪  Native Coding

Important Note ▪  Most current wearables run Android OS ▪  eg Glass, Vuzix, Atheer, Epson, etc

▪  So many tools for prototyping on Android mobile devices will work for wearables

▪  If you want to learn to code, learn ▪  Java, Android, Javascript/PHP

Typical Development Steps ▪  Sketching ▪  Storyboards ▪  UI Mockups ▪  Interaction Flows ▪  Video Prototypes ▪  Interactive Prototypes ▪  Final Native Application

Increased Fidelity & Interactivity

Low Fidelity Tools •  Sketching •  GlassSim •  UI Templates •  Storyboards •  GlassWare flow designer •  Android Design Preview •  Video sketches

High Fidelity Tools •  UXPin/Proto.io •  JustinMind •  Processing •  WearScript •  Unity3D •  Native Coding

Sketched Interfaces

▪  Sketch + Powerpoint/Photoshop/Illustrator

GlassSim – http://glasssim.com/

▪  Simulate the view through Google Glass ▪  Multiple card templates

GlassSim Card Builder ▪  Use HTML for card details ▪  Multiple templates ▪  Change background ▪  Own image ▪  Camera view

GlassSim Samples

Glass UI Templates

▪  Google Glass Photoshop Templates ▪  http://glass-ui.com/ ▪  http://dsky9.com/glassfaq/the-google-glass-psd-template/

Application Storyboard

▪  http://dsky9.com/glassfaq/google-glass-storyboard-template-download/

Glassware Flow Designer •  Features

–  Design using common patterns and layouts –  Specify interactions and card flow –  Share with other designers

•  Available from: –  https://developers.google.com/glass/tools-downloads/glassware-flow-designer

Example Flow


Screen Sharing

▪  Android Design Preview
–  Tool for sharing screen content onto Glass
–  https://github.com/romannurik/AndroidDesignPreview/releases

[Diagram: Mac screen mirrored to Glass]

Video Sketching

▪ Series of still photos in a movie format. ▪ Demonstrates the experience of the product. ▪ Discover where the concept needs fleshing out. ▪ Communicate the experience and interface. ▪ Use whatever tools you like, from Flash to iMovie.

See https://vine.co/v/bgIaLHIpFTB

Example: Glass Vine UI

Limitations ▪  Positives ▪  Good for documenting screens ▪  Can show application flow

▪  Negatives ▪  No interactivity/transitions ▪  Can’t be used for testing ▪  Can’t deploy on wearable ▪  Can be time consuming to create

Interactive Wireframing ▪  Developing interactive interfaces/wireframes

▪  Transitions, user feedback, interface design

▪  Web based tools ▪  UXpin - http://www.uxpin.com/ ▪  proto.io - http://www.proto.io/

▪  Native tools ▪  Justinmind - http://www.justinmind.com/ ▪  Axure - http://www.axure.com/

UXpin - www.uxpin.com

▪  Web based wireframing tool ▪  Mobile/Desktop applications ▪  Glass templates, run in browser

Proto.io - http://www.proto.io/ ▪  Web based mobile prototyping tool ▪  Features ▪  Prototype for multiple devices ▪  Gesture input, touch events, animations ▪  Share with collaborators ▪  Test on device

Proto.io - Interface

Demo: Building a Simple Flow

Gesture Flow

[Flow diagram: screens Scr1-Scr6 linked by Tap and Swipe transitions from the start screen]

Justinmind ▪  Native wireframing tool ▪  Build mobile apps without programming ▪  drag and drop, interface templates ▪  web based simulation ▪  test on mobile devices ▪  collaborative project sharing

▪  Templates for Glass, custom templates

User Interface - Glass Templates

Web Simulation Tool

Wireframe Limitations ▪  Can’t deploy on Glass ▪  No access to sensor data ▪  Camera, orientation sensor

▪  No multimedia playback ▪  Audio, video

▪  Simple transitions ▪  No conditional logic

Processing ▪  Programming tool for Artists/Designers ▪  http://processing.org ▪  Easy to code, Free, Open source, Java based ▪  2D, 3D, audio/video support

▪  Processing For Android ▪  http://wiki.processing.org/w/Android ▪  Strong Android support, builds .apk file

Basic Processing Sketch

/* Notes comment */

// set up global variables
float moveX = 50;

// Initialize the sketch
void setup() {
}

// draw every frame
void draw() {
}
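As an aside, here is a minimal sketch (illustrative only, not from the course notes) showing how the setup()/draw() skeleton above is typically used, in this case animating a circle with the moveX variable:

float moveX = 50;                       // global x position of the circle

void setup() {
  size(640, 360);                       // roughly the Glass display aspect ratio
}

void draw() {
  background(0);                        // clear the previous frame
  ellipse(moveX, height/2, 40, 40);     // draw the circle at its current position
  moveX = (moveX + 2) % width;          // move right and wrap at the edge
}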

Importing Libraries ▪  Can add functionality by Importing Libraries ▪  java archives - .jar files

▪  Include the import code: import processing.opengl.*;

▪  Popular Libraries ▪  Minim - audio library, OCD - 3D camera views ▪  bluetoothDesktop - bluetooth networking

Processing and Glass ▪  One of the easiest ways to build rich interactive wearable applications ▪  focus on interactivity, not coding

▪  Collects all sensor input ▪  camera, accelerometer, touch

▪  Can build native Android .apk files ▪  Side load onto Glass

Hello World Image

PImage img;  // Create an image variable

void setup() {
  size(640, 360);
  // load the ok glass home screen image
  img = loadImage("okGlass.jpg");  // Load the image into the program
}

void draw() {
  // Displays the image at its actual size at point (0,0)
  image(img, 0, 0);
}

Demo

Touch Pad Input

▪  Tap recognized as DPAD input

void keyPressed() {
  if (key == CODED) {
    if (keyCode == DPAD) {   // a Glass tap arrives as a D-pad center key event
      // Do something ..
    }
  }
}

▪  Java code to capture rich motion events
▪  import android.view.MotionEvent;

Motion Event

// Glass Touch Events - reads from touch pad
public boolean dispatchGenericMotionEvent(MotionEvent event) {
  float x = event.getX();                 // get x/y coords
  float y = event.getY();
  int action = event.getActionMasked();   // get code for action

  switch (action) {                       // let us know which action code shows up
    case MotionEvent.ACTION_MOVE:
      touchEvent = "MOVE";
      xpos = myScreenWidth - x * touchPadScaleX;
      ypos = y * touchPadScaleY;
      break;
    // ... other action codes (ACTION_DOWN, ACTION_UP, etc.) handled similarly
  }
  return super.dispatchGenericMotionEvent(event);
}

Demo

Sensors ▪  Ketai Library for Processing ▪  https://code.google.com/p/ketai/

▪  Support all phone sensors ▪  GPS, Compass, Light, Camera, etc

▪  Include Ketai Library ▪  import ketai.sensors.*; ▪  KetaiSensor sensor;

Using Sensors

▪  Setup in the setup() function
sensor = new KetaiSensor(this);
sensor.start();
sensor.list();

▪  Event based sensor reading
void onAccelerometerEvent(float x, float y, float z) {
  accelerometer.set(x, y, z);
}
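Putting these fragments together, a minimal working sketch might look like the following (an illustrative sketch assuming the Ketai library is installed in Android mode; the on-screen text layout is arbitrary):

import ketai.sensors.*;

KetaiSensor sensor;
PVector accelerometer = new PVector();   // latest accelerometer reading

void setup() {
  size(640, 360);
  sensor = new KetaiSensor(this);
  sensor.start();                        // begin listening to the device sensors
  textSize(36);
}

void draw() {
  background(0);
  // display the current acceleration values
  text("x: " + nf(accelerometer.x, 1, 2) +
       "\ny: " + nf(accelerometer.y, 1, 2) +
       "\nz: " + nf(accelerometer.z, 1, 2), 20, 60);
}

// called by Ketai whenever a new accelerometer reading arrives
void onAccelerometerEvent(float x, float y, float z) {
  accelerometer.set(x, y, z);
}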

Sensor Demo

Using the Camera

▪  Import the camera library
import ketai.camera.*;
KetaiCamera cam;

▪  Setup in the setup() function
cam = new KetaiCamera(this, 640, 480, 15);

▪  Draw the camera image
void draw() {
  // draw the camera image
  image(cam, width/2, height/2);
}
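For reference, a complete minimal camera sketch might look like this (again an illustrative sketch assuming the Ketai library; the resolution and frame rate may need tuning for Glass):

import ketai.camera.*;

KetaiCamera cam;

void setup() {
  size(640, 360);
  imageMode(CENTER);                          // draw images centered on the given point
  cam = new KetaiCamera(this, 640, 480, 15);  // width, height, frame rate
  cam.start();                                // begin the camera preview
}

void draw() {
  background(0);
  // draw the most recent camera frame in the middle of the screen
  image(cam, width/2, height/2);
}

// called by Ketai whenever a new preview frame is available
void onCameraPreviewEvent() {
  cam.read();                                 // copy the frame into the image buffer
}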

Camera Demo

Native Coding ▪  For best performance need native coding ▪  Low level algorithms etc

▪  Most current wearables based on Android OS ▪  Need Java/Android skills

▪  Many devices have a custom API/SDK ▪  Vuzix M-100: Vuzix SDK ▪  Glass: Mirror API, Glass Development Kit (GDK)

Glassware Development ▪  Mirror API ▪  Server programming, online/web application ▪  Static cards / timeline management

▪  GDK ▪  Android programming, Java (+ C/C++) ▪  Live cards

▪  See: https://developers.google.com/glass/

Mirror API

▪  REST API ▪  Java servlet, PHP, Go, Python, Ruby, .NET
▪  Timeline based apps
▪  Static cards
-  Text, HTML, media attachment (image & video)
▪  Manage timeline
-  Subscribe to timeline notifications, contacts
-  Location based services
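As a rough illustration of the Mirror API's REST model, the sketch below inserts a static text card into the wearer's timeline by POSTing JSON to the timeline endpoint. This is a minimal illustrative sketch in plain Java; it assumes you already hold a valid OAuth 2.0 access token with the Glass timeline scope (the HelloMirrorCard class name and the card text are made up for the example):

import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class HelloMirrorCard {
    public static void main(String[] args) throws Exception {
        String accessToken = "YOUR_OAUTH2_ACCESS_TOKEN";  // assumed: obtained via the Google OAuth 2.0 flow

        // Mirror API endpoint for timeline items (static cards)
        URL url = new URL("https://www.googleapis.com/mirror/v1/timeline");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Authorization", "Bearer " + accessToken);
        conn.setRequestProperty("Content-Type", "application/json");
        conn.setDoOutput(true);

        // A simple static card with text and a default notification
        String card = "{ \"text\": \"Hello from the Glass Class\", "
                    + "\"notification\": { \"level\": \"DEFAULT\" } }";

        try (OutputStream out = conn.getOutputStream()) {
            out.write(card.getBytes("UTF-8"));
        }
        System.out.println("Mirror API response: " + conn.getResponseCode());
    }
}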

GDK ▪  Glass Development Kit ▪  Android 4.0.3 ICS + Glass specific APIs ▪  Use standard Android Development Tools

▪  GDK add-on features ▪  Timeline and cards ▪  Menu and UI ▪  Touch pad and gesture ▪  Media (sound, camera and voice input)

GDK
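For contrast with the Mirror API path, here is a hedged sketch of a minimal GDK "hello" screen using the CardBuilder widget from the XE16-era GDK (the activity name is illustrative, and a real Glassware would normally also declare a voice trigger in its manifest):

import android.app.Activity;
import android.os.Bundle;
import android.view.View;
import com.google.android.glass.widget.CardBuilder;

public class HelloGlassActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        // Build a Glass-styled card with main text and a footnote
        View cardView = new CardBuilder(this, CardBuilder.Layout.TEXT)
                .setText("Hello from the GDK")
                .setFootnote("The Glass Class")
                .getView();

        setContentView(cardView);
    }
}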

Glass Summary ▪  Use Mirror API if you need ... ▪  Use GDK if you need ... ▪  Or use both

Hardware Prototyping

Build Your Own Wearable

▪  MyVu display + phone + sensors

Beady-i

▪  http://www.instructables.com/id/DIY-Google-Glasses-AKA-the-Beady-i/

Raspberry Pi Glasses

▪  Modify video glasses, connect to a Raspberry Pi ▪  $200 - $300 in parts, simple assembly ▪  https://learn.adafruit.com/diy-wearable-pi-near-eye-kopin-video-glasses

Physical Input Devices ▪  Can we develop unobtrusive input devices ? ▪  Reduce need for speech, touch pad input ▪  Socially more acceptable

▪  Examples ▪  Ring, pendant, ▪  bracelet, gloves, etc

Prototyping Platform

[Diagram: Arduino kit + Bluetooth shield + Google Glass]

Example: Glove Input

▪ Buttons on fingertips ▪ Map touches to commands

Example: Ring Input

▪ Touch strip, button, accelerometer ▪ Tap, swipe, flick actions

How it works

[Photos: bracelet, armband, and glove prototypes and their values/output]

Other Tools ▪  Wireframing ▪  Pidoco, FluidUI

▪  Rapid Development ▪  Phone Gap, AppMachine

▪  Interactive ▪  App Inventor, Unity3D, WearScript

WearScript

▪  JavaScript development for Glass ▪  http://www.wearscript.com/en/

▪  Script directory ▪  http://weariverse.com/

WearScript Features •  Community of Developers •  Easy development of Glass Applications

–  GDK card format –  Support for all sensor input

•  Support for advanced features –  Augmented Reality –  Eye tracking –  Arduino input

WearScript Playground

•  Test code and run on Glass –  https://api.wearscript.com/

Summary ▪  Prototyping for wearables is similar to mobiles ▪  Tools for UI design, storyboarding, wireframing

▪  Android tools to create interactive prototypes ▪  App Inventor, Processing, etc

▪  Arduino can be used for hardware prototypes ▪  Once prototyped, native apps can be built

RESEARCH DIRECTIONS

Challenges for the Future (2001) ▪  Privacy ▪  Power use ▪  Networking ▪  Collaboration ▪  Heat dissipation ▪  Interface design ▪  Intellectual tools ▪  Augmented Reality systems

Starner, T. (2001). The challenges of wearable computing: Part 1. IEEE Micro, 21(4), 44-52. Starner, T. (2001). The challenges of wearable computing: Part 2. IEEE Micro, 21(4), 54-67.

Interface Design

Gesture Interaction

Gesture Interaction With Glass ▪  3 Gear Systems ▪  Hand tracking

▪  Hand data sent to Glass ▪  WiFi networking ▪  Hand joint positions ▪  AR application rendering ▪  Vuforia tracking

Capturing Behaviours

▪  3 Gear Systems ▪  Kinect/Primesense Sensor ▪  Two hand tracking ▪  http://www.threegear.com

Performance

▪  Full 3d hand model input ▪  10 - 15 fps tracking, 1 cm fingertip resolution

Meta Gesture Interaction

▪ Depth sensor + Stereo see-through ▪ https://www.spaceglasses.com/

Collaboration

Social Panoramas

Ego-Vision Collaboration

▪  Wearable computer ▪  camera + processing + display + connectivity

Current Collaboration

▪  First person remote conferencing/hangouts ▪  Limitations

-  Single POV, no spatial cues, no annotations, etc

Social Panoramas

▪  Capture and share social spaces in real time ▪  Enable remote people to feel like they’re with you

Key Technology

▪  Google Glass ▪  Capture live panorama (compass + camera) ▪  Capture spatial audio, live video

▪  Remote device (desktop, tablet) ▪  Immersive viewing, live annotation

Awareness Cues

▪  Where is my partner looking? ▪  Enhanced radar display, Context compass

Interaction

▪  Glass Touchpad Input/Tablet Input ▪  Shared pointers, Shared drawing

Cognitive Models

Modeling Cognitive Processes
•  Model cognitive processes
–  Based on cognitive psychology
•  Use the model to:
–  Identify opportunities for wearables
–  Predict the user's cognitive load

Typical Cognitive Model
1.  Functional Modularity: the cognitive system is divided into functionally separate systems
2.  Parallel Module Operation: cognitive modules operate in parallel, independent of each other
3.  Limited Capacity: cognitive modules are limited in capacity with respect to time or content
4.  Serial Central Operation: central coordination of modules (e.g. monitoring) is serial

Cognitive Interference
▪  Structural interference ▪  Two or more tasks compete for the limited resources of a peripheral system
-  e.g. two cognitive processes needing vision
▪  Capacity interference ▪  Total available central processing is overwhelmed by multiple concurrent tasks
-  e.g. trying to add and count at the same time

Example: Going to work ..

Which is the most cognitively demanding?

Cognitive Resources & Limitations


Application of Cognitive Model

Busy street > Escalator > Café > Laboratory. But what if you made Wayfinding, Path Planning, Estimating Time to Target, and Collision Avoidance easier?

Social Perception

How is the User Perceived?

GlassHoles

TAT Augmented ID

The Future of Wearables

RESOURCES

Online Wearables Exhibit

Online at http://wcc.gatech.edu/exhibition

Glass Developer Resources ▪  Main Developer Website ▪  https://developers.google.com/glass/

▪  Glass Apps Developer Site ▪  http://glass-apps.org/glass-developer

▪  Google Design Guidelines Site ▪  https://developers.google.com/glass/design/index?utm_source=tuicool

Other Resources ▪  AR for Glass Website ▪  http://www.arforglass.org/

▪  Vandrico Database of wearable devices ▪  http://vandrico.com/database

Glass UI Design Guidelines

•  More guidelines –  https://developers.google.com/glass/design/index

Books ▪  Programming Google Glass ▪  Eric Redmond

▪  Rapid Android Development: Build Rich, Sensor-Based Applications with Processing ▪  Daniel Sauter

•  Beginning Google Glass Development by Jeff Tang

•  Microinteractions: Designing with Details – Dan Saffer – http://microinteractions.com/

Conclusions
•  Wearable computing represents a fourth generation of computing devices
•  Google Glass is the first consumer wearable
–  Lightweight, usable, etc
•  A range of wearables will appear in 2014
–  Ecosystem of devices
•  Significant research opportunities exist
–  User interaction, displays, social impact

Contact Details Mark Billinghurst ▪  email: mark.billinghurst@hitlabnz.org ▪  twitter: @marknb00

Feedback + followup form ▪  goo.gl/6SdgzA
