Off the desktop: Pen and Touch Advanced HCI
IAT351
Week 11 Lecture 1 14.11.2012
Lyn Bartram [email protected]
Off the desktop? (NUI)
Pen and Touch | IAT 351 | 14.11.2012
New paradigms
• Freeform “open-space” interaction
• Remote / sensed position and acceleration
• Whole body: movement and gesture
• Surface computing
  • Pen
  • Touch
Today’s agenda
• Pen- and touch-sensitive display devices
• Interaction paradigm shift from keyboard & mouse to pen and touch
• New data types: ink, gesture
• Challenges of data entry on touch devices
  • Human factors
  • Recognition
• Related: gestures on surfaces
  • iPhone, MS Surface
  • Technology sometimes similar to pens
  • Related issues with recognition
Pen and Touch
Darwin’s tree of life
– Which is more evocative?
Pen-Based Interaction Paradigms
Pen Computing
• Use of pens has been around a long time
• The light pen was used by Sutherland before Engelbart introduced the mouse
• Resurgence in the 90s
• GoPad • Much maligned Newton
• Types of “pens” • Passive (same as using a finger) • Active (pen provides some signal)
Example Pen Technology
• Passive • Touchscreen (e.g., Nokia phones, some tablets) • Contact closure • Vision techniques (like MS Surface) • Capacitive sensing (like iPhone)
• Active • Pen emits signal(s) • e.g. IR + ultrasonic
• Where is the sensing? In the surface or in the pen?
Surfaces (passive)
It all depends on the display sensors:
– Electromagnetic (tablets)
– Capacitive (iPhones, iPads)
– Camera – most tables and large displays
Electromagnetic
• The stylus/pen has an electromagnetic coil in the pen barrel
  • Can be sensed close to the surface
  • Provides an active cursor
• Barrel buttons for right-click and eraser
  • Hard to use
  • Inconsistent across manufacturers
• Pressure sensing • Pen tilt sensing
Capacitive
• Designed for touch input
• Most modern phones use capacitive screens
• Capacitive pens are simply something that carries a current from the user to the screen
• Pen tips must be ~5 mm wide to be sensed
  • Bad for drawing/writing
Camera/Image
• Cameras are usually infrared • Camera placement is the main difference
• Under the display, best and most expensive • Technically difficult to detect through the display
• Microsoft Surface
• Above the display, costs vary
• Easy to obscure object of interest with hand/body • Many research tables are of this style
• In the bezel, cheapest • Multi-touch limited • Limited accuracy and ghost touches
• Desktop touch screens
Active : Anoto pen
• Reads dot pattern on paper
• Turns into strokes • Transmits via Bluetooth
www.anoto.com
Active: mimio
• Active pens • IR + ultrasonic
• Portable sensor
• Converts any whiteboard to an input surface
• Can chain these to create a big surface
• http://www.mimio.com
Interacting with pen/touch
• A traditional keyboard & mouse has 100’s of unique combinations
• Replacing the keyboard and mouse with one pen or a couple of touch points creates some interaction challenges!
Pen input
1. Free-form ink (mostly uninterpreted) • Tablet PC applications, digital notebooks, etc.
2. Soft keyboards (text, symbols) • high-accuracy (although slow) mechanism for inputting
machine-interpretable text
3. Recognition systems • Recognition of content • Recognition of commands
New data type: free form ink
• ink as data: when uninterpreted, the easiest option to implement • humans can interpret • time-stamping perhaps (to support rollback, undo) • implicit object detection (figure out groupings, crossings,
etc.)
• special-purpose “domain” objects (add a little bit of interpretation to some on-screen objects) • E.g., Newton: draw a horizontal line across the screen to start a
new page
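The time-stamping idea above can be sketched as a small data structure. This is an illustrative Python sketch (the `InkStore` name and API are invented, not from any real ink toolkit): time stamps let us support rollback and undo over otherwise uninterpreted ink.

```python
import time

class InkStore:
    """Minimal timestamped free-form ink store (illustrative, invented API)."""
    def __init__(self):
        self.strokes = []  # list of (timestamp, list-of-points) pairs

    def add_stroke(self, points, timestamp=None):
        self.strokes.append((time.time() if timestamp is None else timestamp, points))

    def undo_last(self):
        """Roll back the most recent stroke."""
        return self.strokes.pop() if self.strokes else None

    def strokes_since(self, t):
        """Time-stamping lets us replay or roll back to any moment."""
        return [pts for (ts, pts) in self.strokes if ts >= t]

store = InkStore()
store.add_stroke([(0, 0), (10, 10)], timestamp=1.0)
store.add_stroke([(5, 5), (5, 20)], timestamp=2.0)
store.undo_last()
print(len(store.strokes))  # 1
```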
Annotation Tools – ink over documents
• http://www.cs.auckland.ac.nz/research/hci/digital_ink/annotationtools/penmarked.shtml
Diagramming Tools
• Support early design • Paper like
• Quick • No decisions • No rules
InkKit
Intelligent editing support
Parallel visualizations
Discrete versus continuous data
• Key press or mouse click
  • Unambiguous
  • Defined event
• Gesture
  • Requires a recognizer
  • Multiple interpretations
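A gesture recognizer at its simplest compares a resampled stroke against stored templates and picks the nearest match. This Python sketch (in the spirit of template matchers like the $1 recognizer, greatly simplified: no rotation or scale normalization) shows why multiple interpretations arise — the recognizer always has to choose the least-bad match:

```python
import math

def resample(points, n=16):
    """Resample a stroke to n evenly spaced points along its path."""
    total = sum(math.dist(points[i], points[i + 1]) for i in range(len(points) - 1))
    if total == 0:
        return [points[0]] * n
    step = total / (n - 1)
    out, acc, pts, i = [points[0]], 0.0, list(points), 1
    while i < len(pts):
        d = math.dist(pts[i - 1], pts[i])
        if acc + d >= step and d > 0:
            t = (step - acc) / d
            q = (pts[i - 1][0] + t * (pts[i][0] - pts[i - 1][0]),
                 pts[i - 1][1] + t * (pts[i][1] - pts[i - 1][1]))
            out.append(q)
            pts.insert(i, q)   # continue measuring from the inserted point
            acc = 0.0
        else:
            acc += d
        i += 1
    while len(out) < n:        # guard against floating-point shortfall
        out.append(pts[-1])
    return out[:n]

def recognize(stroke, templates):
    """Return the template name with the smallest mean point distance."""
    s = resample(stroke)
    best, best_d = None, float("inf")
    for name, tpl in templates.items():
        d = sum(math.dist(a, b) for a, b in zip(s, resample(tpl))) / len(s)
        if d < best_d:
            best, best_d = name, d
    return best

templates = {
    "horizontal": [(0, 0), (100, 0)],
    "vertical": [(0, 0), (0, 100)],
}
print(recognize([(0, 2), (50, -1), (101, 1)], templates))  # horizontal
```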
2. Soft Keyboards
• Make “recognition” problem easier by forcing users to hit specialized on-screen targets
• (Sometimes a blurry line between what’s “recognition” and what’s a “soft keyboard”)
• common on small mobile devices • many varieties • tapping interfaces • Key layout (QWERTY, alphabetical, … ) • learnability vs. efficiency
Virtual Keyboards
• Virtual keyboards
  • Lack tactile feedback
  • High error rates
• Swipe keyboards
  • Need dictionary support
• Pie menus
  • Never took off
3. Recognizing pen input
• Unlike soft keyboards, recognize more “natural” pen strokes
• Can be used for both content and commands • Some are less natural than others: Graffiti
• unistroke alphabet • Other ink recognizers
• for commands (Stanford flow menus; PARC Tivoli implicit objects)
• measure features of strokes (Rubine, Long)
• usually no good for “complex” strokes
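Feature-based recognizers like Rubine’s classify a stroke from a vector of simple geometric measurements rather than template matching. A hedged sketch of a few of those measurements in Python (a subset only — Rubine’s full method computes 13 features and trains a linear classifier over them):

```python
import math

def rubine_features(points):
    """A few of Rubine's 13 stroke features (illustrative subset)."""
    (x0, y0) = points[0]
    (x2, y2) = points[2] if len(points) > 2 else points[-1]
    d0 = math.hypot(x2 - x0, y2 - y0) or 1.0
    xs, ys = [p[0] for p in points], [p[1] for p in points]
    bb_diag = math.hypot(max(xs) - min(xs), max(ys) - min(ys))
    length = sum(math.dist(points[i], points[i + 1]) for i in range(len(points) - 1))
    return {
        "f1_cos_initial": (x2 - x0) / d0,   # cosine of the initial angle
        "f2_sin_initial": (y2 - y0) / d0,   # sine of the initial angle
        "f3_bbox_diagonal": bb_diag,        # bounding-box diagonal length
        "f8_total_length": length,          # total stroke length
    }

feats = rubine_features([(0, 0), (1, 0), (2, 0), (10, 0)])
print(feats["f8_total_length"])  # 10.0
```

Measuring features rather than matching whole shapes is why such recognizers work well for short command strokes but, as noted above, are usually no good for “complex” strokes.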
Abstract writing
• Enter text with specialized, stroke based recognition • Optimized for automatic recognition • Not human readable • Character based or word based
© Richard Anderson, HCI for Pen-Based Computing, 2007
Cirrus (Georgia Tech)
Quikwrite [Perlin, NYU]
Write helloworld
Graffiti (Palm)
Graffiti
• Mostly single stroke • Close to standard alphabet (learnability) • Write only • Location written for additional meaning
Handwriting Recognition: Identify the following words
Recognition results
Flow Menu
• Use movement through octants for control information
• Relies on crossing defined objects
(figure: FlowMenu – octants for Item, Text, Shape, Move, Zoom, Highlight; zoom submenu with Custom and 25%–800% steps)
Mixing modes of pen use
• Users need to specify two types of things:
  • Content
  • Commands
• How to switch between them?
  • (1 mode) recognize which applies: contextual commands, a la Tivoli, Teddy, etc.
  • (2 modes) visible mode switch: Graffiti (make special command gesture)
  • (1.5 modes) special pen action switches: temporary or transient mode, e.g., Wacom tablet pens
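The three switch styles can be sketched as a tiny state machine; this Python sketch is purely illustrative (the class and method names are invented), contrasting a persistent 2-mode toggle with the 1.5-mode transient switch that holds only while a barrel button is down:

```python
class PenModes:
    """Sketch of persistent (2-mode) vs. transient (1.5-mode) switching."""
    def __init__(self):
        self.mode = "ink"       # persistent mode, survives strokes
        self.transient = None   # transient mode, held only while button is down

    def current(self):
        return self.transient or self.mode

    def toggle_command_mode(self):
        """(2 modes) explicit, visible mode switch."""
        self.mode = "command" if self.mode == "ink" else "ink"

    def barrel_button(self, down):
        """(1.5 modes) transient mode while the barrel button is held."""
        self.transient = "erase" if down else None

pen = PenModes()
pen.barrel_button(True)
print(pen.current())   # erase
pen.barrel_button(False)
print(pen.current())   # ink
```

The transient variant sidesteps part of the mode problem below: releasing the button restores the prior mode, so there is no mode to forget.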
Mode Problem
• Cognitive difficulties in remembering / keeping track of modes • Which mode? • Remapping operations • Retaining mode across context switch
• But modes are very useful • Efficient use of limited input controls
• Not all modes are the same • Shift key vs. Caps Lock • Mouse move vs. mouse drag • Pen color
What kinds of things have modes?
• A system has modes if it has states where the controls have different functions.
• Do cars have modes? If so, give an example • Do phones have modes? Examples
• How do these affect the experience?
• Does your touch device/game/computational artefact have modes?
Pen mode solutions
• Problem: How do you allow different operations with a pen • Ink vs. erasing
• Explicit modes • Ink vs. gesture
• Recognition of gesture overrides ink • Ink vs. recognition vs. control
• Area based modes
General Issues – Pen input
• Initial training required • Learning time to become proficient • Speed of use • Generality/flexibility/power • Special skills • Screen space required • Computational resources required • No defined set of gestures
Evolution: Mouse-Controlled GUIs
• Mice, Touchpads, & Trackballs are each unique: – Mouse/Touchpad operate on a 2D plane – Mouse/Trackball operate on “rolling” concept – Touchpad is most like pen interface, without the direct feedback – Multiple buttons (for most)
• …But they share this in common: – Keeping the cursor steady is easy (hands off!) – Separation between movement and display (hand down here,
screen up there)
Problems with Mouse to Pen Mappings
• What happens when we layer pen computing on top of a mouse-based GUI? • Double clicking: rather difficult because the cursor tends to “wiggle” (hard to tap the same spot twice)
• Control primitives were different
Problems with Mouse to Pen Mappings
• Fitts’s Law: “the time to acquire a target is a function of the distance to and size of the target.” • What are the 5 easiest targets to reach?
• four corners of the screen, and under the cursor • The edges and corners are great targets, because you can
whip your mouse as far as you want and the cursor will be stuck on an edge
• This is why scrollbars that are flush to the screen edge are great, or an application menu stuck in the upper or lower left corner
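Fitts’s law can be computed directly. In this Python sketch the constants `a` and `b` are made up (they are device-dependent and must be fit empirically), but it shows why edge targets are fast: the clamped cursor makes them effectively very wide along the throw direction, which collapses the index of difficulty:

```python
import math

def fitts_mt(distance, width, a=0.1, b=0.15):
    """Shannon form of Fitts's law: MT = a + b * log2(D/W + 1).
    a and b are device-dependent constants (values here are invented)."""
    return a + b * math.log2(distance / width + 1)

# A small 5 px target 800 px away vs. an edge target, which cursor
# clamping makes effectively very wide along the movement direction:
print(fitts_mt(800, 5) > fitts_mt(800, 400))  # True
```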
Problems with Mouse to Pen Mappings
• Fitts’s Law: “the time to acquire a target is a function of the distance to and size of the target.” • But what about using a pen instead of a mouse? • Whoops. • Distance is no longer an issue (Ren & Moriya, 2001)
But what about using a pen instead of a mouse?
• Fitts’s Law: “the time to acquire a target is a function of the distance to and size of the target.”
• Direction to target is more important (hard to go to the area obscured by the hand)
• Location of target is also important (again, the opposite direction of where the hand obscures the screen)
• Edges and corners are very hard to touch • Size of target is important, but in a different way;
• small targets (< 5px) are very bad, because of the small offset between digitizer surface and screen surface
• But super-large targets don’t really offer any great improvement, because distance and precision are measured differently than with a mouse
Problems with Mouse to Pen Mappings
• A further problem is that pen use often precludes the use of the keyboard (it’s hard to use both at the same time)
• Hand occlusion is an issue • New library of control primitives • Began with pen movement • Evolved to touch/gesture operations
Pen-Based Paradigms
• General paradigm: Goal-Crossing • Design incentive: “clicking” with a pen is an odd gesture; pens
are usually used for strokes, not dots • But clicking is central to mouse-based interfaces (buttons, radio
buttons, checkboxes, list selection, menu selection, icon selection)
• What happens when we replace a “click” with a pen stroke?
Pen-Based Paradigms
• General paradigm: Goal-Crossing • Events in goal-crossing are triggered by drawing a line across a
target area on screen • Instead of 2D areas (like buttons), targets become lines or
barriers to draw across • Example: pie menu
Basic pen operation
• Crossing • Operation triggered by a stroke crossing a line segment
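The crossing trigger reduces to a segment-intersection test: fire when any segment of the pen stroke crosses the goal line. A minimal Python sketch using the standard counter-clockwise test:

```python
def ccw(a, b, c):
    """Positive if a -> b -> c turns counter-clockwise."""
    return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])

def segments_cross(p1, p2, q1, q2):
    """True if segment p1p2 strictly crosses segment q1q2."""
    return (ccw(p1, p2, q1) * ccw(p1, p2, q2) < 0 and
            ccw(q1, q2, p1) * ccw(q1, q2, p2) < 0)

def stroke_crosses_goal(stroke, goal):
    """Goal-crossing: fire when any stroke segment crosses the goal line."""
    g1, g2 = goal
    return any(segments_cross(stroke[i], stroke[i + 1], g1, g2)
               for i in range(len(stroke) - 1))

goal = ((50, 0), (50, 100))                             # a vertical goal line
print(stroke_crosses_goal([(0, 50), (100, 50)], goal))  # True
print(stroke_crosses_goal([(0, 50), (40, 50)], goal))   # False
```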
CrossY UI: • Specify operations by
drawing through
Pen-Based Paradigms
• General paradigm: Goal-Crossing • More examples (from Lapizco-Encinas & Rodriguez, 2003,
CrossEd)
Hierarchical crossing
• Principle – multiple commands without lifting the pen
(figure: hierarchical FlowMenu traversal – Item, Text, Shape, Move, Zoom, Highlight; zoom submenu with Custom and 25%–800% steps)
Discrete selection
• Choose from a finite set • Command from a menu • Character from an alphabet
• Repeated selection from finite sets • Hierarchical menus • Commands with arguments • Sequences of characters
• Words • Multi-digit numbers
Selection mechanisms
• Crossing • Pointing • Writing • Tapping • Pressure
Pen-Based Selection Paradigms (Ren and Moriya)
• Direct On: like a mouse click; touching the target selects it
• Direct Off: can slide into the target; lifting the pen over the target selects it
• Space On: Like Direct On, but the target highlights when the pen hovers over it before touching
• Slide Touch: target is selected when the pen crosses its border
• Slide Off: target is selected after the pen crosses its border and is lifted off the surface
• Space Touch: target is highlighted by hovering over it, then selected by tapping anywhere
Selection problem
• Identify one or more graphical elements from a domain • Mechanisms
• Bounding region • Geometric region defined by a stroke • Distance from cursor
If the red circle is a selection tool, what is selected?
Bubble cursor (Grossman, 2005)
• Selection radius depends on object proximity
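The bubble cursor’s rule can be sketched in a few lines: the effective selection radius expands until it captures whichever target is closest, so something is always selected. A simplified Python sketch (circular targets only; the target representation is invented for illustration):

```python
import math

def bubble_select(cursor, targets):
    """Bubble cursor, simplified: pick the target whose edge is
    nearest the cursor, i.e. distance to center minus its radius."""
    return min(targets, key=lambda t: math.dist(cursor, t["center"]) - t["radius"])

targets = [
    {"name": "A", "center": (0, 0), "radius": 5},
    {"name": "B", "center": (40, 0), "radius": 5},
]
print(bubble_select((15, 0), targets)["name"])  # A
```

Because the radius depends on object proximity, sparse targets get large effective sizes, which is the Fitts’s-law win the technique is known for.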
Recognition UIs
• UIs based on attaching meaning to ink • Gestures • Diagram recognition • Handwriting recognition
• Free form • Constrained recognition
Gestures
• Commands issued with a single stroke • May be drawn or invisible • Requires support from SDK
• Register gestures to be recognized • UI Issues
• Similar to keyboard shortcuts • Speed-up for experts • Hard to learn / remember
Gestures
• Ambiguity • Distinction between gestures • Distinction between gesture and other ink
• Robustness • Handling misrecognized gestures
• False positive • False negative
• Gesture initiated actions should be undoable
Diagram recognition
• Challenges to recognition • Even simple shapes are hard! • Variation in drawing • Ink artifacts
Recognition scenarios
• What level of error is tolerable? • How is feedback on recognition provided to the user? • How does the user specify corrections?
Obstructions and handedness
• Hand blocks the screen
• Accommodate left- and right-handedness
  • Menu direction
  • Context menus
• Difficulties at the edge of the screen
• Directions of gesture?
Screen orientation
• Landscape vs. portrait mode
• Surprisingly big difference in the feel of applications
• Tablet PCs require rapid orientation switches
• Many standard desktop apps are not designed for portrait mode
Ink
• Ink based development tools
• Ink on browsers using Silverlight
A Foundation in Pen-based Computing
Resource
Sarah Sood, Christine Alvarado, and Zach Dodds
Windows Presentation Foundation is Microsoft’s API for accessing students’ own hand-drawn strokes, including time-stamp information and pen status. To help focus on implementing algorithms, the first HW asks students to build a Windows-Journal-like pen-based application with WPF.
Results (figures): a journal result, and one with autocorrection
Silverlight
• Silverlight is the only Web technology to support collection and display of high-quality pen input from Tablet PC devices
Silverlight
• About Microsoft Silverlight • cross-browser, cross-platform • plug-in for delivering the next generation of media
experiences and rich interactive applications (RIAs) • Incorporate video, animation, interactivity, and
stunning user interfaces. • Silverlight (previously codenamed "WPF/E") is
a lightweight subset of XAML for building rich media experiences on the web.
Ink in Silverlight
• The ink in Silverlight is made up of a StrokeCollection object • StrokeCollection object is made up of individual Stroke objects • A Stroke can be a dot, a straight line, or a curving line. • Each Stroke is made up of a StylusPointCollection object,
which, in turn, is made up of individual StylusPoint objects. • StylusPoint objects are collected when the pen moves while it
is in contact with the digitizer. The ink also contains characteristics such as color, width, and outline color, all of which are contained in the DrawingAttributes class.
Ink Presenter
• The InkPresenter element is a Canvas that displays ink.
• InkPresenter is generally the same as the Canvas element, but also has a StrokeCollection
• When you add Stroke objects to the StrokeCollection, the InkPresenter automatically renders them.
Collecting ink
• Collect ink by retrieving StylusPoint objects when the pen moves while in contact with the screen.
• Then add a Stroke object, which consists of the collected StylusPoint objects, to the StrokeCollection that is associated with the InkPresenter.
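The StylusPoint/Stroke/StrokeCollection model and the collect-on-pen-move flow described above can be mirrored in a small sketch. This is a Python analogue for illustration only, not the actual Silverlight API: pen-down starts a Stroke, pen-move appends StylusPoints, and pen-up commits the Stroke to the collection.

```python
class StylusPoint:
    """One sampled pen position (analogue of Silverlight's StylusPoint)."""
    def __init__(self, x, y):
        self.x, self.y = x, y

class Stroke:
    """Analogue of Silverlight's Stroke: a list of StylusPoints plus
    drawing attributes such as color and width."""
    def __init__(self, color="black", width=2):
        self.stylus_points = []
        self.color, self.width = color, width

class InkCollector:
    """Pen-event flow: down starts a Stroke, move appends points,
    up commits it to the stroke collection (which an InkPresenter
    analogue would then render)."""
    def __init__(self):
        self.stroke_collection = []
        self._current = None

    def pen_down(self, x, y):
        self._current = Stroke()
        self._current.stylus_points.append(StylusPoint(x, y))

    def pen_move(self, x, y):
        if self._current:  # only collect while in contact with the digitizer
            self._current.stylus_points.append(StylusPoint(x, y))

    def pen_up(self):
        if self._current:
            self.stroke_collection.append(self._current)
            self._current = None

ink = InkCollector()
ink.pen_down(0, 0); ink.pen_move(3, 4); ink.pen_up()
print(len(ink.stroke_collection))  # 1
```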
Ink on Browsers
• Ink on browsers is now possible
• ASP.NET 2.0 server controls; onClientClick simplifies moving ink from the web page to the web server
  • Can tie a client-side event to a server-side event
• AJAX allows ink to remain in the page during a refresh event
• Steps to build an ink-enabled application in a browser:
  • Add the Microsoft.Ink API to the project
  • Add an InkOverlay component to the control
  • Add color buttons and other controls, and enable them by writing a click method for each
  • Add public methods so that the web page’s client-side code can interact with the controls
    • Return the current color name, load data into the control, return ink data as a base-64 encoded string
  • Create a web page to host the ink controls
    • Enable partial postbacks with ASP.NET AJAX updates
    • Send the ink control data to an image file
    • Retrieve ink data from an image file and load it into the ink control for additional drawings
  • Get the page to interact with the controls
    • Communicate with the control to send/receive ink data
    • JavaScript can enable this
  • Get the ink data to the server side
    • Client side – get the ink data from the control
    • Client side – store it in a hidden control that is accessible from the server side
    • Server side – access the data in the hidden control
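The base-64 hand-off through a hidden control can be sketched as a simple round trip. The JSON stroke format here is invented for illustration (the real Microsoft.Ink control has its own serialization); the point is only that encoded ink is a plain string, safe to store in a hidden form field:

```python
import base64
import json

def ink_to_base64(strokes):
    """Serialize stroke data to a base-64 string for a hidden form field.
    The JSON list-of-point-lists format is illustrative only."""
    return base64.b64encode(json.dumps(strokes).encode("utf-8")).decode("ascii")

def ink_from_base64(data):
    """Server side: decode the hidden-field string back into stroke data."""
    return json.loads(base64.b64decode(data))

strokes = [[[0, 0], [10, 12]], [[5, 5], [5, 20]]]
encoded = ink_to_base64(strokes)            # safe to store in a hidden <input>
print(ink_from_base64(encoded) == strokes)  # True
```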
Introduced new interaction paradigm
• Pen → touch
• Similar techniques are used for pen-based UIs
• Recognisers are largely the same
• Gesture-based computing
Touch Control primitives
• Hover • Tap • Double Tap • Press-and-hold • Hold-through • Drag • Hold-drag
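Several of the primitives above differ only in timing. This sketch shows how a toolkit might discriminate them from a touch’s duration, movement, and the time since the previous tap; all threshold values are made up for illustration:

```python
def classify_touch(down_t, up_t, moved, last_tap_t=None,
                   hold_ms=500, double_ms=300):
    """Classify a touch from its timing (thresholds are invented):
    drag if it moved, press-and-hold if held long enough, double-tap
    if it follows a recent tap, otherwise a plain tap."""
    duration = up_t - down_t
    if moved:
        return "drag"
    if duration >= hold_ms:
        return "press-and-hold"
    if last_tap_t is not None and down_t - last_tap_t <= double_ms:
        return "double-tap"
    return "tap"

print(classify_touch(0, 100, moved=False))                    # tap
print(classify_touch(200, 300, moved=False, last_tap_t=100))  # double-tap
print(classify_touch(0, 800, moved=False))                    # press-and-hold
print(classify_touch(0, 50, moved=True))                      # drag
```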
Gesture-Based Computing
• Different devices can recognize a variety of gesture-based inputs.
• Touch screens and surfaces recognize
  • Touch
  • Taps
  • Swipes
  • Multiple fingers/touches
• Handheld devices respond when they are
  • Tilted
  • Shaken
  • Moved in space
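Multi-finger gestures often reduce to simple arithmetic on touch points; for example, a pinch-zoom scale is just the ratio of current to initial finger spread. A minimal Python sketch:

```python
import math

def pinch_scale(p1_start, p2_start, p1_now, p2_now):
    """Pinch-zoom: scale factor is the ratio of current finger spread
    to the initial spread; > 1 zooms in, < 1 zooms out."""
    d0 = math.dist(p1_start, p2_start)
    d1 = math.dist(p1_now, p2_now)
    return d1 / d0 if d0 else 1.0

print(pinch_scale((0, 0), (100, 0), (-50, 0), (150, 0)))  # 2.0
```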
Gesture-Based Computing
• Gesture-based Computing refers to the ability to interface with devices through “natural” human movement.
• Gesture-based computing is already used widely today: the iPod touch, smartphones, tablets, the Nintendo Wii, Kinect, PlayStation Move, and SMART Boards all respond to “natural” human gestures.
Changing Metaphors
• A new idea: the touch metaphor
• Users touch the content directly on smartphones, tablets, and other mobile devices
• Example: the Cover Flow mechanism for scanning through a list, using a sweeping motion of the pointer
Common multi-touch gestures
Touch Metaphor Gestures
Toolkits
• Gesture and Activity Recognition Toolkit (GART), Georgia Tech • Linux shell script • Java libraries
• Gesture toolkit • Silverlight, WPF, .NET
• JavaFX 2.2 • Newest release, multi-touch “standard” gestures
Examples
• iPhone / iPad • Children’s Demo • Control Surgical Device • Hands-Free Information Access • Marketing Technique • Add a Sixth Sense – Information Access • The Future – BCI