
Multi Touch Display System

The project selected by our group is the "Multi Touch Display System". The project was completed at the Engineering College, AJMER. This project report covers the theoretical aspects as well as the analysis and figures associated with the project.

Due to recent innovations, multi-touch technology has become affordable. For this project a camera based multi-touch device has been constructed. To perform multi-touch point tracking we used Touchlib, a free open source multi-touch framework. To demonstrate the possibilities of multi-touch input technology we created new rich media applications which are controlled by gestures. Multi-touch systems are often stand-alone systems that do not have external input devices attached. In order to simplify common tasks, a gesture recognition engine has been designed (Gesturelib).

Through a set of experiments we evaluate how multi-touch input performs on tasks compared to conventional mouse input. Unlike interaction on a desktop computer, multi-touch allows multiple users to interact with the same device at the same time. We present measurements that show how collaboration on a multi-touch table can improve the performance for specific tasks.

Contents

1. INTRODUCTION


2. REQUIREMENTS
   I. Software requirement
   II. Hardware requirement

3. DESIGN AND CONSTRUCTION OF MULTITOUCH DEVICE
   I. Camera based multi touch technique
      • Front-side Illumination
   II. Camera devices
   III. Hardware description

4. MULTITOUCH DETECTION AND PROCESSING
   I. Touchlib
      • Video image processing
      • Blob detection and tracking
   II. Programming language interfaces
      • TUIO/OSC protocol
   III. Gesture based interaction
      • Gesture classification
      • Gesture recognition models
      • Gesturelib

5. MULTITOUCH APPLICATIONS
   I. Software architecture
   II. Multi-touch example
   III. Multi-touch demo applications

6. DIFFUSER MATERIAL

7. TOUCHLIB REFERENCE
   I. Project Details
   II. Touchlib config.xml
   III. Touchlib filters
   IV. Touchlib calibration

8. COST BENEFIT ANALYSIS
9. CONCLUSION
10. FUTURE PROSPECTS
11. BIBLIOGRAPHY


Introduction

Multi-touch displays are interactive graphics devices that combine camera and tactile technologies for direct on-screen manipulation. Multi-touch technology is not entirely new; it has been available in different forms since the 1970s. Due to the improvement in processing power of modern desktop computers, multi-touch technology no longer requires expensive equipment.


Modern computer interaction consists of a monitor, a keyboard and a mouse. Limited by the operating system, it allows us to control only a single pointer on the screen. With multi-touch, multiple input pointers are introduced to the system, which can all be controlled independently. Depending on the size of the display, multi-touch allows multiple persons to interact with the same display at the same time.

Overview of this work

Recent publications have shown that multi-touch displays have the potential to revolutionize human-computer interaction in that they allow intuitive interaction with applications through direct manipulation of graphical constructs. Several research groups have demonstrated that the performance of simple tasks on multi-touch displays shows great improvement over conventional methods. Unfortunately, most interactive graphical applications that exist today are not able to exploit this potential because they are designed for a single pointer device such as the mouse. This research project has been divided into four parts:

The design and construction of a camera based multi-touch device:

In order to perform the required experiments it is necessary to design and construct a multi-touch device from scratch.

The design and implementation of a gestural interaction library:

In order to use gestures on a multi-touch device it is required to design and implement a multi-touch gestural interaction library. The library retrieves information from the multi-touch tracking software and processes it into gestural primitives. Detected gestures are exposed to application developers through an easy to use application programming interface (API).

Implementation of test case applications:

In order to demonstrate the input capabilities and benefits of a multi-touch device, a set of (example) applications will be developed.


The applications should show examples of collaborative tasks and gesture based interaction on the device.

Performance evaluation of multi-touch enabled tasks:

To compare the performance of the multi-touch input device in the field of human-computer interaction, a set of tasks will be designed and developed.

Requirements

Software Requirements:


‐ Flash Player 9 (only required if running swf Flash Demos)
‐ Java runtime 1.6+ (only required if running swf Flash Demos)
‐ Visual Studio 2005 SP1 x86 redistributable package

1. Download the latest version of the MTmini Package from http://ssandler.wordpress.com/MTmini
2. Open the zip file and extract the contents to your preferred location.

Hardware Requirements:

The following components are needed to build a multi-touch table:

• Wooden box
• Transparent/clear material
• Paper
• Webcam

The design and construction of a multi-touch device


Camera based multi-touch techniques:

Camera based multi-touch devices share the same concept of processing and filtering captured images for patterns. In general the interaction can be described as the pipeline in Figure 2.1. The pipeline begins when the user views the scene on the panel and decides to interact. To interact with the device, the user touches the device panel. On the hardware level the camera registers the touch. Because the captured frame might not only include the contact points but also a (static) background, it is required to perform image processing on each captured frame. The captured frame is converted to a grayscale image and the static background is removed by subtracting a reference frame from the current frame. As a result the frame only shows white contours (blobs), which are the points of contact. By tracking these blobs the system becomes capable of sensing touch. In order to track the blobs, their positions are transmitted to the blob tracker, which matches blobs from earlier frames with the current frame. After processing, events are triggered which can be used in a multi-touch capable application. The application modifies objects in the current scene according to the new blob positions. The result is returned to the user through the display.

Figure 2.1: A graphical overview of the multi-touch interaction pipeline. The indicated sections contain additional information on each part.
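The image processing part of this pipeline can be sketched with a few OpenCV calls. The snippet below is only an illustration of the idea (Touchlib implements these steps internally); the variable names and the threshold value of 30 are assumptions chosen for the example.

#include <opencv2/opencv.hpp>
#include <vector>

int main()
{
    cv::VideoCapture camera(0);            // camera watching the panel
    cv::Mat frame, gray, reference, diff, blobs;

    // Grab one frame of the static background as the reference image.
    camera >> frame;
    cv::cvtColor(frame, reference, cv::COLOR_BGR2GRAY);

    while (camera.read(frame)) {
        cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);

        // Remove the static background by subtracting the reference frame.
        cv::absdiff(gray, reference, diff);

        // Keep only bright contact points (blobs); 30 is an arbitrary threshold.
        cv::threshold(diff, blobs, 30, 255, cv::THRESH_BINARY);

        // Each white contour corresponds to a point of contact.
        std::vector<std::vector<cv::Point>> contours;
        cv::findContours(blobs, contours, cv::RETR_EXTERNAL, cv::CHAIN_APPROX_SIMPLE);

        // ... pass contour centroids to the blob tracker / application ...
    }
    return 0;
}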

The performance of a camera based multi-touch device depends on the hardware and software used. When a user touches a multi-touch device, the user expects the device to respond directly. The responsiveness of the device depends on the time it needs to process the user's input and present the user a result through the display. In the interaction pipeline two levels are important: the hardware level and the software level. Using a camera which is capable of 30 frames per second allows smooth interaction with the system. However, this requires a system that can handle image processing and blob tracking in 1/30 of a second.

Front-side Illumination:

The technique based on diffused illumination is called Front-side Illumination (FI). Like the RI technique it is based on light being diffused. However, instead of using infrared light sources, it depends on the ambient light from the environment. With FI the diffuser is attached to the front side of the display. The ambient light illuminates the diffuser, which, from the camera's point of view, results in an evenly colored rectangle. By touching the surface, shadows appear underneath the fingertips because the ambient light cannot reach them.

Figure 2.5: Schematic view of the multi-touch panel using front illumination.

Figure 2.5 shows a basic setup for FI. Because the touch library used requires touched spots to be colored white, a simple invert filter is applied. FI can be seen as the cheapest way of creating a camera based multi-touch capable device. However, due to its dependency on evenly distributed light it is less reliable and precise than FTIR and RI.

Camera devices:


A requirement when using FI based techniques is that the camera sensor is capable of detecting blobs in the captured frame. When the camera captures blobs, the display often shows them as bright white spots with a blue/dark purple glow. Whilst high-end consumer USB cameras are capable of transmitting images of VGA resolution (640×480 pixels) at reasonable frame rates, they often introduce latency. Because this reduces the responsiveness of the multi-touch device, it is recommended to use a FireWire based camera instead.

Depending on the size of the display and the projected image, it is recommended to use a camera running at least VGA resolution for precision. The frame rate should be at least 30 frames per second to allow smooth interaction.

Hardware description:

The host computer consists of two hyper-threaded Intel Xeon processors running at 3.0 GHz. The computer contains 4.0 GB of memory and uses Windows XP or Windows 7 as operating system. Since a closed box is required for the front-side illumination (FI) method, the dimensions of the table are 12 inch × 12 inch × 80 inch (L × W × H). The diffuser (butter paper) is wrapped onto a glass sheet. In order to illuminate the diffuser, several bulbs are placed on the front panel of the table. To provide the ambient light environment for the FI method, a regulator is used to control the illumination of the bulbs.

Multi-touch detection and processing

To perform camera based multi-touch detection and processing, several frameworks are available. Here we describe the multi-touch framework used, how a multi-touch framework connects to a multi-touch application and the different types of gestures.


…perform blob tracking. By using the OpenCV function cvFindContours() we obtain a list of contours found in the current frame. All found contours (and their properties) are stored in a contours list. Each contour is checked on whether it is a fiducial marker or a touch. On each contour Touchlib tries to fit a polygon which matches the outline of the contour. If the fitted polygon is built from four points it might be a possible fiducial marker. Next it checks whether the angle of each corner is approximately 90 degrees. If so, the contour is considered a fiducial and its position (center) is calculated. It is also assigned a unique tag identifier based on the pattern found within the square. If the polygon is more complex than four points, it is assumed to be a touch. Touchlib fits an ellipse on the contour. The properties of the ellipse are used to determine the position, orientation and size of the blob. If the size of the blob fits the minimum and maximum requirements on height and width, the blob is added to the blob list. A sketch of this classification step is shown below.
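The sketch below illustrates the classification step using OpenCV's C++ interface rather than the C function cvFindContours() used by Touchlib; the function name classifyContour and all tolerance values are assumptions for illustration, not Touchlib code.

#include <opencv2/opencv.hpp>
#include <cmath>
#include <vector>

// Classify a single contour as either a fiducial candidate or a touch blob.
void classifyContour(const std::vector<cv::Point>& contour)
{
    // Fit a polygon that approximates the contour outline.
    std::vector<cv::Point> poly;
    cv::approxPolyDP(contour, poly, 0.02 * cv::arcLength(contour, true), true);

    if (poly.size() == 4 && cv::isContourConvex(poly)) {
        // Four corners: possible fiducial marker. Check for roughly 90 degree angles.
        bool rightAngles = true;
        for (int i = 0; i < 4; ++i) {
            cv::Point a = poly[i] - poly[(i + 1) % 4];
            cv::Point b = poly[(i + 2) % 4] - poly[(i + 1) % 4];
            double cosine = (a.x * b.x + a.y * b.y) /
                            (std::hypot(a.x, a.y) * std::hypot(b.x, b.y));
            if (std::fabs(cosine) > 0.2)          // more than ~12 degrees off
                rightAngles = false;
        }
        if (rightAngles) {
            // Treat as fiducial: its centre gives the marker position, and the
            // pattern inside the square would yield the tag identifier.
            cv::Moments m = cv::moments(poly);
            cv::Point2f centre(m.m10 / m.m00, m.m01 / m.m00);
            // ... decode the tag pattern around 'centre' ...
            return;
        }
    }

    if (contour.size() >= 5) {
        // More complex shape: assume a touch and fit an ellipse to obtain the
        // position, orientation and size of the blob.
        cv::RotatedRect blob = cv::fitEllipse(contour);
        // ... accept the blob if blob.size fits the min/max width and height ...
    }
}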

• Blob tracking:
In order to track blobs it is required to have at least two data sets that contain blobs in different states. For our example we first define the two data sets. The first data set contains the blob list from a previous frame and is defined as follows:

p1, p2, p3, ..., pn

where n is the number of active blobs in the previous frame. The second set contains the blob list of the current frame and is defined as follows:

c1, c2, c3, ..., cm

where m is the number of active blobs in the current frame. After each frame, the data from set p is replaced by set c. Set c is then filled with the new blob list.

Example: We define an example with two states, represented in Figure 3.1.


The data set of the previous frame:

p1 : (1, 4), p2 : (3, 1)  ⇒  n = 2

The data set of the current frame:

c1 : (3, 4), c2 : (2, 3), c3 : (1, 1)  ⇒  m = 3

Figure 3.1: An example of two frames containing different blob states.

In order to match blobs in the previous and current states it is required to create a matrix that contains the possible conversion states. The number of possible states can be calculated using the following equation:

where x = m − n. In our example x = m − n = 1. According to this equation the number of possible states is:


For each possible state an entry is added into the transition matrix (Table 3.1).

Table 3.1: The transition matrix showing all possible transition states.

Touchlib's blob tracker tries to find the state with the lowest distance value. This value is calculated by comparing the distances between the blobs of set p and set c. In our example set c contains more values than set p, which means that the current frame contains a new blob. When calculating the distance value, the new blob is assigned a value of zero. Based on the transition matrix we can calculate the following distance values:


Figure 3.2: Final state blob tracker.

From these results Touchlib chooses the row with the lowest distance value. In our example this is s1, which corresponds to Figure 3.2. It is possible that multiple rows share the same distance value; in this case Touchlib chooses the first row containing this value (if this value is considered the lowest distance).
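The search over transition states can be sketched as a brute-force matching, as below. This is an illustration of the idea rather than Touchlib's actual tracker; the Blob struct, the function name and the assumption that the current frame has at least as many blobs as the previous one (m ≥ n, as in the example) are ours.

#include <algorithm>
#include <cmath>
#include <numeric>
#include <vector>

struct Blob { float x, y; int id; };

// Match the blobs of the previous frame (p) to blobs of the current frame (c)
// by trying every assignment and keeping the one with the lowest total distance.
std::vector<int> matchBlobs(const std::vector<Blob>& p, const std::vector<Blob>& c)
{
    std::vector<int> order(c.size());
    std::iota(order.begin(), order.end(), 0);   // candidate assignment of c-indices

    std::vector<int> best;                      // best[i] = index in c matched to p[i]
    double bestDistance = 1e30;

    do {
        double distance = 0.0;
        for (size_t i = 0; i < p.size(); ++i) {
            const Blob& prev = p[i];
            const Blob& curr = c[order[i]];
            distance += std::hypot(curr.x - prev.x, curr.y - prev.y);
        }
        // Unmatched current blobs are new touches and contribute a distance of zero.
        if (distance < bestDistance) {
            bestDistance = distance;
            best.assign(order.begin(), order.begin() + p.size());
        }
    } while (std::next_permutation(order.begin(), order.end()));

    return best;   // current blobs not listed in 'best' are treated as new
}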

Programming language interfaces:
In order to use Touchlib with languages other than C++, several wrappers are available which allow applications to receive data from Touchlib.

TUIO/OSC protocol:
By default Touchlib comes with a wrapper which sends TUIO events over the commonly used Open Sound Control (OSC) protocol. For many modern programming languages such as C#, Adobe Flash (ActionScript 3), Java, Max/MSP, Processing, Pure Data, Python and Visual Basic, OSC libraries are available. When using Flash it is required to convert UDP packets to TCP. This can be done by using a tool called Flosc which acts as a proxy (Figure 3.3). When an application uses the OSC protocol, it will only be able to receive events containing properties of the detected blobs. It is not possible to adjust the settings of Touchlib from the application. Since OSC uses the UDP network protocol to transfer data, it is possible to create a setup in which a dedicated system provides blob tracking and transfers the data to another system which provides the visualization.

Figure 3.3: Schematic view of sending TUIO events over the OSC protocol. Flosc converts the UDP packets to TCP packets.
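In rough terms, the data an application receives looks like the sketch below. The TuioCursor struct and the onCursorSet callback are assumptions for illustration; a real client would rely on one of the OSC/TUIO libraries mentioned above. The field layout follows the TUIO 2D cursor profile, whose "set" messages carry a session id, a normalised position, a velocity and a motion acceleration.

#include <map>
#include <utility>

// One cursor as carried by a TUIO "/tuio/2Dcur set" message.
struct TuioCursor {
    int   sessionId;
    float x, y;       // position, normalised to 0..1 on the surface
    float vx, vy;     // velocity
    float accel;      // motion acceleration
};

// Hypothetical callback invoked by an OSC/TUIO client library for each "set"
// message; it stores the cursor and returns its position in screen pixels.
std::pair<float, float> onCursorSet(const TuioCursor& cursor,
                                    std::map<int, TuioCursor>& cursors,
                                    int screenWidth, int screenHeight)
{
    cursors[cursor.sessionId] = cursor;                        // track by session id
    return { cursor.x * screenWidth, cursor.y * screenHeight };
}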

Gesture based interaction:
In comparison with traditional desktop computing with mouse and keyboard, a multi-touch device provides additional input methods. Instead of working as a point and click device, it can be improved with gesture based interaction.

Gesture classification:
In order to distinguish different gestures, the following gesture types are defined: direct gestures and symbolic gestures.

• Direct gestures:
In the definition of direct gestures we describe gesture patterns that allow a user to manipulate objects directly. The manipulation can be performed with multiple pointers. For example, when using a multi-touch system it is very common to use a pinching motion in order to change the size of an object. Depending on the distance between two fingers, the object increases or decreases in size. Examples of commonly used direct manipulation gestures are shown in Figure 3.4; a sketch of a pinch gesture follows the figure.


Figure 3.4: A commonly used direct manipulation gesture set for object manipulation.
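As an illustration of such a direct gesture, a pinch-to-zoom factor can be derived from the change in distance between the two touch points. The snippet below is a minimal sketch with invented names; it is not part of Touchlib or Gesturelib.

#include <cmath>

struct Touch { float x, y; };

// Scale factor for a pinch gesture: the ratio between the current and the
// previous distance of the two fingers. > 1 enlarges the object, < 1 shrinks it.
float pinchScale(const Touch& prevA, const Touch& prevB,
                 const Touch& currA, const Touch& currB)
{
    float prevDist = std::hypot(prevB.x - prevA.x, prevB.y - prevA.y);
    float currDist = std::hypot(currB.x - currA.x, currB.y - currA.y);
    if (prevDist <= 0.0f)
        return 1.0f;                      // avoid division by zero
    return currDist / prevDist;
}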

• Symbolic gestures:

Symbolic gestures are patterns based on the gesture location and trajectory. Patterns can be made in any shape such as triangles, circles or even text characters. Symbolic gestures are commonly used in applications to simplify tasks.

Gesture recognition models:
In order to perform gesture recognition, different recognition models are available. We discuss the most commonly used models. Based on their advantages and disadvantages we select a technique that will be used for Gesturelib.

• Region based:
A popular gesture recognition program for the X Window System is Xstroke. Xstroke is a full-screen gesture recognition program which allows users to bind commands to a gesture. The technique which Xstroke uses to recognize gestures is region based. Internally Xstroke holds a 3×3 grid which is numbered from 1 to 9. The grid is centered on the gesture and scales to the extents of the obtained gesture path. For example, if the user draws an N-shaped gesture, the line will pass the following regions: 7, 4, 1, 5, 9, 6, 3. Xstroke stores the values of the passed grid cells in a string. By comparing the result with regular expressions, the gesture can be matched against known patterns (a sketch of the region step follows below).
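The region-based idea can be sketched as follows: centre a 3×3 grid on the bounding box of the gesture, map every point of the path to a cell numbered 1–9 (from the top-left, row by row, in screen coordinates where y grows downwards) and record a cell only when it changes. This is an illustrative reconstruction, not Xstroke's source code. For the N-shaped example above it yields the string "7415963".

#include <algorithm>
#include <string>
#include <vector>

struct Point { float x, y; };

// Convert a gesture path into a string of 3x3 grid cells, numbered 1..9
// from the top-left, scaled to the bounding box of the gesture.
std::string regionString(const std::vector<Point>& path)
{
    if (path.empty()) return "";

    float minX = path[0].x, maxX = path[0].x;
    float minY = path[0].y, maxY = path[0].y;
    for (const Point& p : path) {
        minX = std::min(minX, p.x); maxX = std::max(maxX, p.x);
        minY = std::min(minY, p.y); maxY = std::max(maxY, p.y);
    }
    float width  = std::max(maxX - minX, 1e-6f);
    float height = std::max(maxY - minY, 1e-6f);

    std::string cells;
    for (const Point& p : path) {
        int col = std::min(2, int((p.x - minX) / width  * 3));
        int row = std::min(2, int((p.y - minY) / height * 3));
        char cell = char('1' + row * 3 + col);
        if (cells.empty() || cells.back() != cell)   // record cell changes only
            cells.push_back(cell);
    }
    return cells;
}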


    }

    std::map<int, TouchData> touch_list;
    gesturelib glib;
};

Listing 3.1: Adding Gesturelib hooks to the ITouchListener object.

On initialization Gesturelib reads the file mtgesture.xml. This file contains settings such as screen dimensions and tolerance settings for Gesturelib. By default it requires a minimum gesture size (path) of sixteen points. The file also contains a list of known gesture patterns. Each gesture entry contains an identifier for the recognized gesture and the string based on the path direction. When a gesture is being created, Gesturelib will keep recording the path until glib.TouchUp is called. When this function is called, Gesturelib will start processing the path. If enough path data is available, it will start calculating the path directions and compare the result with the gesture pattern database. The result is returned to the main application. The application then decides how to process this result. An overview of the event data pipeline is shown in Figure 3.5.

Figure 3.5: Schematic view of the events pipeline using Touchlib and Gesturelib in an application.
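A plausible form of the "path directions" comparison described above is sketched below: each step of the recorded path is quantised into one of four compass directions, repeated directions are collapsed, and the resulting string is compared with the patterns loaded from mtgesture.xml. This is an assumption about the approach, not Gesturelib's actual code.

#include <cmath>
#include <string>
#include <vector>

struct Point { float x, y; };

// Quantise a gesture path into a string of compass directions (N, E, S, W),
// keeping a direction only when it changes.
std::string directionString(const std::vector<Point>& path)
{
    std::string dirs;
    for (size_t i = 1; i < path.size(); ++i) {
        float dx = path[i].x - path[i - 1].x;
        float dy = path[i].y - path[i - 1].y;
        char d = (std::fabs(dx) > std::fabs(dy))
                     ? (dx > 0 ? 'E' : 'W')
                     : (dy > 0 ? 'S' : 'N');   // screen coordinates: y grows downwards
        if (dirs.empty() || dirs.back() != d)
            dirs.push_back(d);
    }
    return dirs;
}

// Compare against a gesture pattern loaded from mtgesture.xml.
bool matchesPattern(const std::string& dirs, const std::string& pattern)
{
    return dirs == pattern;   // a real implementation would allow some tolerance
}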


Multi-touch applications

Multi-touch technology allows users to interact with a computer in a new way. When using a multi-touch device, the available interaction methods depend on the application. Current operating systems do not support multi-touch natively. Therefore it is required to implement multi-touch callbacks in the application code manually. This chapter describes how multi-touch applications are created and how existing (single touch) applications can be improved with multi-touch.

Software architecture:
Depending on the software language used, different methods are used to implement a multi-touch capable application. Figure 4.1 presents the software architecture of C++, C# and Flash based applications. In order to use multi-touch in a C# application, the touchlibCOMwrapper is used, which provides a two-way interface with Touchlib. When using Flash, it is only possible to receive blob positions from Flosc. It is not possible to control Touchlib from the Flash application or Flosc.

Figure 4.1: A block diagram presenting the software architecture of C++, C# and Flash based applications.


Multi-touch example:
To demonstrate the structure of a multi-touch application we start out with a basic C++ framework of a graphical single touch application (Listing 4.1).

int main()
{
    // ... Set up scene and initial object positions ...
    do {
        get_keyboard_mouse_input();
        set_object_positions();
        display_scene();
    } while (running);
}

Listing 4.1: An example of a single touch application.

This example contains a basic application framework with a program loop. The program loop includes a function which requests the state of the input devices, a function which sets the object positions and a function which displays the objects on the screen. Since we want to be able to manipulate the object with multi-touch, we add multi-touch support by using the Touchlib framework. We add two objects to the example application, namely ITouchScreen and ITouchListener. The ITouchScreen object handles the image processing and blob tracking. The ITouchListener object handles the events from ITouchScreen. This object contains three mandatory functions (fingerDown, fingerUpdate and fingerUp); an example is given in Listing 4.2. If it is required to access the touch information from functions that are not in this class, we use a variable such as touch_list that keeps track of the currently active blobs.

class ExampleApp : public ITouchListener
{
public:
    ExampleApp();
    ~ExampleApp();

    virtual void fingerDown(TouchData data) {
        touch_list[data.ID] = data;       // Store touch data
    }

    virtual void fingerUpdate(TouchData data) {
        touch_list[data.ID] = data;       // Update touch data
    }

    virtual void fingerUp(TouchData data) {
        touch_list.erase(data.ID);        // Remove touch data
    }

    std::map<int, TouchData> touch_list;
};

Listing 4.2: Implementing the ITouchListener object.

Next the ITouchScreen needs to be initialized. If available, it will read the camera settings and the filter pipeline from a configuration file. If the configuration file is not available it will use default settings. In order to connect the ITouchScreen with the application, we register the ITouchListener to the ITouchScreen object. Each time an update is requested by the application (with getEvents) the ITouchScreen object will call the ITouchListener with event updates. After completing initialization we can start the image processing pipeline and the blob tracker. In the program loop we add getEvents, which requests updates from Touchlib. Depending on the required interaction methods, object positions can be modified according to the positions of the blobs stored in the touch list. If it is desired to manipulate objects by direct manipulation, this can be done by checking the number of blobs that are active on an object. By storing the blob data and identity we can compare the positions of the current state with the previous state. Depending on the change of position and distance, the object can be manipulated. The final result of the example application is in Listing 4.3.

int main()
{
    // ... Set up scene and initial object positions ...
    ITouchScreen *screen;   // ITouchScreen object
    ExampleApp myapp;       // ITouchListener object

    // Set up Touchlib
    load_config_file();
    register_itouchlistener();
    start_image_processing();
    start_blob_tracker();

    do {
        get_keyboard_mouse_input();
        screen->getEvents();
        set_object_positions();
        display_scene();
    } while (running);
}

Listing 4.3: The example application including additions to process multi-touch events.
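To make the last step of the loop concrete, a minimal (and entirely invented) set_object_positions() could move an object by the displacement of the blob touching it, using the touch_list kept by ExampleApp. The Object struct and the TouchData fields X and Y shown here are stand-ins for illustration, not Touchlib's definitions.

#include <map>

struct Object    { float x, y, width, height; };
struct TouchData { int ID; float X, Y; };      // stand-in for Touchlib's TouchData

// Drag: if a blob lies on the object, move the object with the blob.
void set_object_positions(Object& obj,
                          const std::map<int, TouchData>& touch_list,
                          std::map<int, TouchData>& previous)
{
    for (const auto& entry : touch_list) {
        const TouchData& t = entry.second;
        bool onObject = t.X >= obj.x && t.X <= obj.x + obj.width &&
                        t.Y >= obj.y && t.Y <= obj.y + obj.height;
        auto prev = previous.find(t.ID);
        if (onObject && prev != previous.end()) {
            obj.x += t.X - prev->second.X;     // move by the blob's displacement
            obj.y += t.Y - prev->second.Y;
        }
    }
    previous = touch_list;   // remember the current blobs for the next frame
}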

Multi-touch demo applications:
Here we run various Flash applications, such as puzzles and Flash games, on the multi-touch table via Flosc, which acts as a proxy. When using a Flash based application, Touchlib detects the blob positions and Flosc forwards them to the Flash application for further processing.

Snapshot of a Flash demo application


Diffuser materials

On the front side of the display a diffuser material is placed for two reasons. First, it prevents the camera from seeing objects behind the screen, providing a cleaner, more stable input signal. Secondly, it functions as a projection screen for the DLP projector. Various types of diffuser material can be used.

Tested diffuser materials:
• Calqueer 90
• Cromatico extra white
• Polyester film
• Tracing paper
• Translucent pearl

Figure 5.1: Camera test results using tracing paper.


Touchlib Reference

Project details
Project site: www.touchlib.com
SVN: http://code.google.com/p/touchlib/
SVN (repository): http://touchlib.googlecode.com/svn/multitouch/

Touchlib config.xml
The configuration file consists of the following parts:

1. XML version definition
   example: <?xml version="1.0" ?>

2. Tracker configuration
   example:
   <blobconfig distanceThreshold="250" minDimension="2" maxDimension="250"
               ghostFrames="0" minDisplacementThreshold="2.000000" />
   These are the tolerance settings of the blob tracker. The distanceThreshold specifies how many pixels a blob can travel. The minDimension and maxDimension variables specify how small or large a contour can be to be detected as a touch. The value of ghostFrames specifies the number of extra frames Touchlib should use for the blob tracker. The minDisplacementThreshold specifies how many pixels a contour needs to move before the update event is called.

3. Bounding box
   example:
   <bbox ulX="0.000000" ulY="0.000000" lrX="1.000000" lrY="1.000000" />
   Specifies in which region blob detection should be applied.

4. Screen calibration points
   example: <screen>...[points]...</screen>
   These values will be filled in when running the configapp.

5. The filtergraph


   example: <filtergraph>...[filters]...</filtergraph>
   Filters are explained in the next section.

Touchlib filters:
All filters need to be placed between the filtergraph tags:

<filtergraph> ... </filtergraph>

The first filter in the filtergraph needs to be a video capture filter; the last one must be the rectify filter.

Video processing filters:

dsvlcapture
Description: This is an implementation which uses DirectShow to capture a video stream.
Usage:

<dsvlcapture label="dsvlcapture" />

Mono filter
Description: Touchlib requires an 8-bit grayscale source image. This filter is only required if the video capture filter is not capable of delivering the right image format.
Usage:

<mono label="monofilter" />

Invert Filter
Description: The filter inverts a grayscale image. This filter is only required for FI.
Usage:

<invert label="invert" />

Background Filter
Description: This filter removes the background by creating a reference image on initialization and subtracting it from the current active frame.
Usage:
<backgroundremove label="backgroundfilter">
  <threshold value="20" />
</backgroundremove>


The threshold value has a range of 0 to 255.

Highpass Filter (Simple)
Description: Same purpose as the previous filter, however it uses a simpler algorithm and therefore performs faster than the default HighpassFilter. The filter contains two different methods to amplify the source image; currently it is recommended to set the noiseMethod to 1.
Usage:
<simplehighpass label="simplehighpass">
  <blur value="13" />
  <noise value="3" />
  <noiseMethod value="1" />
</simplehighpass>

Scaler Filter
Description: If the previously used filter gives a weak output, the scaler filter is used to amplify the current image.
Usage:
<scaler label="scaler">
  <level value="70" />
</scaler>

Rectify Filter
Description: This is the final filter of the image processing pipeline. The threshold is set to a value at which blobs are visible but noise is ignored. This image will be used to perform blob detection on.
Usage:
<rectify label="rectify">
  <level value="75" />
</rectify>

The level value has a range of 0 to 255.


Touchlib calibration:
In order to calibrate Touchlib it is required to have a fully functioning multi-touch table (this includes a camera and projector). It is recommended to look at the example XML files to create a config.xml configuration file. Every multi-touch technique requires a different filter chain. When the filter chain has been set up, the configuration application can be used by starting configapp.exe from the Touchlib bin directory (or ./configapp from the /src directory under Linux). It displays the filters you have entered in the config.xml file. Adjust the sliders in the filters until the rectify filter only shows clear blobs when being touched.


COST BENEFIT ANALYSIS

Cost benefit analysis is a technique which helps us determine the economic feasibility of the proposed system. The primary objective of cost benefit analysis is to find out whether the proposed system is economically worthwhile to invest in.

Cost benefit analysis should be done before preparing the proposed system. It is performed by first identifying all the costs associated with the project. Conceptually, the cost of the project represents all the items of outlay associated with the project which are associated with long-term funds. It is the sum of the outlays for the following:

LAND AND SITE DEVELOPMENT: No extra space will be allotted, so the cost of land and site development is nil in our project.

SOFTWARE COST: The required software is downloaded from the internet, so its cost is also nil.

MACHINERY: The cost of the machinery includes the cost of the hardware plus the cost of connecting wires, the wooden box and bulbs.


Conclusion

With the construction of our own camera based multi-touch table we have demonstrated that multi-touch technology has become affordable. The precision, reliability and responsiveness of a multi-touch table depend on the hardware used. Because a camera based multi-touch technology is used, ambient light has a large influence on the tracking performance.

In our set of demo applications we have demonstrated the power of multi-touch devices. Compared to desktop applications, users were able to manipulate objects in a natural way by touch.

We have evaluated the performance of our multi-touch table through a set of experiments. The results show that the high-latency projector in our current hardware influenced our test results.

While multi-touch is capable of performing some tasks faster than a mouse, it should not be considered a replacement.

When tasks require precision, the multi-touch device shows longer task completion times with higher error rates. Multi-touch devices, however, encourage collaboration.

Our test results show significant improvement when the number of users is increased.


Future Prospects

In order to extend this project and make it grow into a major project, we will use this multi-touch interface for device control.

For device control purposes we will create a virtual panel through which switching of devices will be done. This will eliminate the need for manually operated switches.

Switching will be carried out using touch events registered on the touch panel. Devices will be triggered using the multi-touch interface built in the minor project, and the number of applications will also be increased.


Bibliography

Touchlib
http://www.touchlib.com

Latest version of SVN (Subversion)
http://code.google.com/p/touchlib/

NUIgroup
http://www.nuigroup.com

Multigesture
http://www.multigesture.net

Section Computational Science
http://www.science.uva.nl/research/scs/

MTmini Package
http://ssandler.wordpress.com/MTmini
