MobAppDev Touchscreens, MotionEvents, Development & Consumption of Touch Gestures, Remote Robot Control on Android Smartphones Vladimir Kulyukin www.vkedco.blogspot.com

MobAppDev (Fall 2014): Touchscreens & Motion Events; Capturing Sequences of Motion Events; Design & Consumption of Touchscreen Gestures; Remote Robot Control on Android Smartphones


Page 1:

MobAppDev

Touchscreens, MotionEvents, Development & Consumption of Touch Gestures,

Remote Robot Control on Android Smartphones

Vladimir Kulyukin

www.vkedco.blogspot.com

Page 2:

Outline
● Touchscreens
● Motion Events
● Capturing Motion Event Sequences
● Development & Consumption of Touch Gestures
● Remote Robot Control on Android Smartphones

Page 3:

Touchscreens

Page 4:

Motivation
● Touch gestures are a viable input alternative, especially when traveling
● A touchscreen is a touch gesture consumption surface
● Recognition accuracy for many (not all) users is higher for touch gestures than for speech
● Unlike speech utterances, touch gestures preserve privacy

Page 5:

Touchscreen
● A touchscreen is a surface (e.g., an electronic visual display) for detecting touches within a given area
● A touchscreen is made of special materials that capture pressure inputs and translate them into digital data
● The digital data (x, y, pressure, etc.) are passed to software that processes them
● Some touchscreens require the use of a stylus
● Touchscreens are becoming a must for many personal digital assistants, ATMs, and video games

Page 6:

Bits of History: Air Traffic & Music

● The concept of the touchscreen was described by E.A. Johnson in 1965
● A potential application in air traffic control systems was proposed in 1968
● The first application was developed at CERN (European Center for Nuclear Research) in 1973
● In the 1980s, several musical sampling and synthesis systems (Fairlight CMI, Fairlight CMI IIx) used light pen technology
● The HP-150, built in 1983, was one of the first touchscreen computers

Page 7:

Touchscreen Technologies
● Resistive:
  – the touchscreen consists of two thin electric layers separated by a thin space
  – voltage is applied to the top layer and sensed by the bottom one
● Surface acoustic wave (SAW):
  – ultrasonic waves pass over the touchscreen surface
  – when the surface is touched, a portion of the wave is absorbed
  – changes in the waves register touch positions and send them to the touch processor
● Capacitive:
  – capacitance is the ability of a body to store electrical charge
  – capacitive touchscreens consist of insulators (e.g., glass) coated with transparent conductors
  – since the human hand stores electrical charge, touching the surface distorts the surface's electrostatic field; the distortions are used to capture the touch's data

Page 8:

Motion Events

Page 9:

MotionEvent

● On Android, digital data from touchscreens are captured as MotionEvent objects

● MotionEvent objects are created when the user touches the device's touchscreen

● MotionEvent objects are handled by the View.onTouchEvent() method

Page 10:

MotionEvent Sequences

● When the user places a finger on the screen, moves it (without lifting it), and then lifts it, a sequence of MotionEvent objects is generated
● Such sequences can be captured and used in touch gesture recognition
● Each MotionEvent object contains information on: 1) the type of action captured (e.g., MotionEvent.ACTION_DOWN, MotionEvent.ACTION_UP, etc.); 2) the pressure value; 3) the x and y coordinates; 4) the time of the event

Page 11:

MotionEvent
● On Android, digital data from touchscreens are captured as MotionEvent objects
● These objects are created when the user touches the device's touchscreen
● At a bare minimum, each MotionEvent object contains the x and y coordinates of the touch
● MotionEvent objects are handled by the View.onTouchEvent() method

Page 12:

MotionEvent.getAction()
● MotionEvent.getAction() can be used to retrieve the type of action
● Examples:
  – getAction() returns ACTION_DOWN when the user touches the screen
  – getAction() returns ACTION_MOVE when the user moves his/her finger across the screen
  – getAction() returns ACTION_UP when the user lifts his/her finger
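The action types above are small integer constants. A minimal pure-Java sketch of the mapping from action code to name; the constants mirror the documented values of android.view.MotionEvent (ACTION_DOWN = 0, ACTION_UP = 1, ACTION_MOVE = 2) so the snippet compiles without the Android SDK:

```java
// Sketch: map MotionEvent action codes to their names.
// The constants below mirror android.view.MotionEvent's documented
// values so this compiles outside an Android project.
public class ActionNames {
    static final int ACTION_DOWN = 0; // finger touches the screen
    static final int ACTION_UP   = 1; // finger is lifted
    static final int ACTION_MOVE = 2; // finger moves without lifting

    static String nameOf(int action) {
        switch (action) {
            case ACTION_DOWN: return "ACTION_DOWN";
            case ACTION_UP:   return "ACTION_UP";
            case ACTION_MOVE: return "ACTION_MOVE";
            default:          return "ACTION_UNKNOWN";
        }
    }
}
```

In real Android code you would, of course, switch on MotionEvent.getAction() and the MotionEvent constants directly.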

Page 13:

Interface View.OnTouchListener

● Classes that receive touchscreen events must implement View.OnTouchListener

● Two main methods to handle MotionEvents are onTouch(View, MotionEvent) and onTouchEvent(MotionEvent)

● onTouch(View, MotionEvent) is used when one OnTouchListener handles MotionEvents from multiple views

Page 14:

Interface View.OnTouchListener

● If onTouch() or onTouchEvent() consumes a received MotionEvent and no other component should know about it, the method should return true

● If onTouch() or onTouchEvent() does not/cannot consume a received MotionEvent, it should return false
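As a minimal sketch of this convention (assuming, for illustration, a listener that only cares about presses):

```java
// Sketch of the consumption convention: return true only for
// events this listener actually handles.
@Override
public boolean onTouch(View v, MotionEvent ev) {
    if (ev.getAction() == MotionEvent.ACTION_DOWN) {
        // ... handle the press here ...
        return true;   // consumed: no other component sees the event
    }
    return false;      // not consumed: let the event propagate
}
```

Note that if ACTION_DOWN is not consumed, Android does not deliver the rest of that gesture's events (ACTION_MOVE, ACTION_UP) to the view, so a listener that wants the whole sequence should return true for the initial down event.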

Page 15:

Capturing MotionEvent Sequences

Page 16:

Problem: Motion Event Sequence Capture

Write an Android application that captures sequences of MotionEvents and outputs the description of each MotionEvent into LogCat.

Source of MotionEventSequenceCapture is here

Page 17:

Main Layout

<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:id="@+id/my_rel_layout"
    android:tag="My Relative Layout"
    android:layout_width="match_parent"
    android:layout_height="wrap_content">

    <TextView
        android:id="@+id/message"
        android:tag="my text view"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:text="Touch screen and look at LogCat"/>

</RelativeLayout>

Page 18:

Implementing OnTouchListener

public class MotionEventSequenceCaptureAct extends Activity
    implements OnTouchListener
{
    public boolean onTouch(View v, MotionEvent ev) {
        String myTag = v.getTag().toString();
        Log.v(myTag, "----------------------");
        Log.v(myTag, "Current view in onTouch() is " + myTag);
        Log.v(myTag, describeEvent(ev));
        return true;
    }
}
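The slide does not show how the activity registers itself as the touch listener. A hedged sketch of the missing onCreate() wiring (the view ids come from the layout on the previous slide; the actual source may wire things differently):

```java
// Hypothetical onCreate() for MotionEventSequenceCaptureAct:
// register this activity as the OnTouchListener of the layout's views,
// so onTouch() above starts receiving their MotionEvents.
@Override
public void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.main);

    View layout  = findViewById(R.id.my_rel_layout);
    View message = findViewById(R.id.message);
    layout.setOnTouchListener(this);
    message.setOnTouchListener(this);
}
```

Each view's android:tag from the layout is what onTouch() reads back with v.getTag() to label its LogCat output.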

Page 19:

Describing Motion Events

static String describeEvent(MotionEvent mot_ev) {
    StringBuilder result = new StringBuilder(500);
    int action_type = mot_ev.getAction();
    String action_name = null;
    switch ( action_type ) {
        case MotionEvent.ACTION_DOWN: action_name = "ACTION_DOWN"; break;
        case MotionEvent.ACTION_UP:   action_name = "ACTION_UP";   break;
        case MotionEvent.ACTION_MOVE: action_name = "ACTION_MOVE"; break;
        default: action_name = "ACTION_UNKNOWN"; break;
    }
    result.append("Action: ").append(action_name).append("\n");
    result.append("Downtime: ").append(mot_ev.getDownTime()).append("ms\n");
    result.append("Event time: ").append(mot_ev.getEventTime()).append("ms");
    result.append(" Elapsed: ").append(mot_ev.getEventTime() - mot_ev.getDownTime());
    result.append(" ms\n");
    return result.toString();
}

Page 20:

Development & Consumption of

Touch Gestures

Page 21:

Development of Touch Gestures
● You can create your own gesture recognizer that captures and interprets a library of touch gestures
● You can also use the Android GestureBuilder application to develop a library of touch gestures
● The built gestures are saved on the sdcard
● Each gesture receives a name and can be subsequently deleted or refined

Page 22:

Problem

Use the Android GestureBuilder application to develop gestures for handwritten letters a, b, and c. Write an application that consumes the built gestures and toasts the names of recognized gestures.

Source of BuiltGesturesConsumer is here

Page 23:

Develop & Save Touch Gestures

1. Start GestureBuilder 2. Draw a Gesture 3. Save it

Page 24:

Consumption of Touch Gestures
● On my emulator, the gestures are saved in /storage/sdcard/gestures
● To implement a touch gesture consumer (BuiltGesturesConsumer):
  – Create a directory /res/raw
  – Place the gestures file from /storage/sdcard/gestures (this is the file you built with the Android GestureBuilder) into /res/raw
  – The main layout of the gesture consumer must be android.gesture.GestureOverlayView
  – The class that consumes gestures must implement OnGesturePerformedListener

Page 25:

BuiltGesturesConsumer: Main Layout

<?xml version="1.0" encoding="UTF-8"?>
<android.gesture.GestureOverlayView
    xmlns:android="http://schemas.android.com/apk/res/android"
    android:id="@+id/gestures"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:orientation="vertical">

    <!-- Your widgets go in here -->

</android.gesture.GestureOverlayView>

Page 26:

BuiltGesturesConsumer: Gesture Library

// Class must implement OnGesturePerformedListener
public class BuiltGesturesConsumerAct extends Activity
    implements OnGesturePerformedListener {

    // Gestures are saved in a GestureLibrary
    private GestureLibrary mLibrary;
}

Page 27:

BuiltGesturesConsumer: Gesture Library

@Override
public void onCreate(Bundle icicle) {
    super.onCreate(icicle);
    this.setContentView(R.layout.main);
    // Load the gesture library from /res/raw/gestures
    mLibrary = GestureLibraries.fromRawResource(this, R.raw.gestures);
    if (!mLibrary.load()) { finish(); }
    GestureOverlayView gestures = (GestureOverlayView) findViewById(R.id.gestures);
    // Register this class as the gesture listener
    gestures.addOnGesturePerformedListener(this);
}

Page 28:

BuiltGesturesConsumer: Gesture Consumption

@Override
public void onGesturePerformed(GestureOverlayView overlay, Gesture gesture) {
    // 1. get the predictions
    ArrayList<Prediction> predictions = mLibrary.recognize(gesture);
    // 2. make sure that there is at least one prediction
    if (predictions.size() > 0) {
        // 3. get the best prediction
        Prediction prediction = (Prediction) predictions.get(0);
        // 4. make sure the confidence score is > 1.0
        if (prediction.score > 1.0) {
            // Show the letter
            Toast.makeText(this, prediction.name, Toast.LENGTH_SHORT).show();
        }
    }
}

Page 29:

Remote Control of Mobile Robots on

Android Smartphones

Page 30:

Problem

Smartphones can be used to control robots remotely. This problem becomes increasingly important as more and more service robots enter our everyday lives: vacuum cleaners, wheelchairs, various robotic guides, smart environments, etc.

Page 31:

Example: Robot Control UI

Robot Position on a Map in MobileSim

Remote UI to control robot motion

Page 32:

Example: Robot Control Touch Gestures

We can build the following touch gestures to control the robot

Page 33:

References

● http://en.wikipedia.org/wiki/Touchscreen

● http://developer.android.com/reference/android/view/MotionEvent.html

● http://developer.android.com/reference/android/view/View.OnTouchListener.html