
Exploring Alternative Interfaces for Controlling Robotic Water Cannons

Albin Hübsch

Albin Hübsch
VT 2016
Examensarbete, 30 hp
Supervisor: Ulrik Söderström
External Supervisor: Rickard Gunnargård
Examiner: Thomas Mejtoft
Civilingenjör Interaktion & Design, 300 hp

Abstract

In this thesis we explore the possibilities of using new and modern interfaces to control industrial robotic hardware, specifically a single water cannon system called TARGA. Using an iterative Design-Build-Test cycle we have built three different prototype interfaces for controlling this robotic water cannon: one touch based, one gesture based and one gamepad interface have been designed, built and tested on users. We found that with modern interfaces such as touch, gestures and gamepads we are able to create user experiences that are easier to use than the standard physical water cannon joystick. Although our user tests showed that the standard joystick was still the preferred interface, we did find that these modern interfaces can act as great complements to the existing interface for increased usability.

Acknowledgements

Our thankful greetings to Unifire AB, who not only gave us the possibility to execute this project in cooperation with them but also gave us access to top-of-the-market hardware that was used during development and testing. We also want to thank the supervisor and examiner of this paper for their support throughout the project, and not to forget the peer reviewers for their kind and patient feedback.

Contents

1 Introduction
1.1 Objective
1.2 Outline
1.3 Unifire AB

2 TARGA System
2.1 TARGA Overview
2.2 Technical Description

3 Background
3.1 Related Research
3.2 Leap Motion Interface
3.3 Gamepad Interface

4 Method
4.1 Introduction of Method
4.2 Research
4.3 Internal Training
4.4 Development of Prototypes
4.4.1 Design: Goal Establishment & Generating Alternatives
4.4.2 Build: Implementation of Prototypes
4.4.3 Test: Testing & Evaluating
4.5 Presentations

5 Results
5.1 Introduction of Results
5.2 Results from Literature Research
5.3 Design Phase Results
5.4 Prototypes
5.4.1 Low Fidelity Prototypes
5.4.2 High Fidelity Prototypes
5.5 User Test Results
5.5.1 Touch Screen Interface
5.5.2 Leap Motion Interface
5.5.3 Gamepad Interface

6 Discussion
6.1 Conclusions
6.1.1 General
6.1.2 Touch Interface
6.1.3 Gesture Interface
6.1.4 Gamepad Interface
6.2 Future Work
6.2.1 Further Development
6.3 Restrictions & Limitations
6.4 Summary

References


1 Introduction

Today, many different interfaces are available for interacting with computers and technology: touch screens, keyboards, mice and many others. A wide selection of different types of sensors, like light and sound sensors, gyros, accelerometers and even molecule sensors, are also available for interaction. All of these are used, at least in part, to give humans the best and most natural way of interacting with technology possible.

As the amount of technology used in our daily lives increases [1], the need for great interfaces and interaction design will become even more important [1, 2]. Interaction designers with a great set of tools and knowledge about different ways of interacting with technology, including knowledge about how and when to use them, will most probably be better able to meet the future requirements of interaction design. And as interaction design is often described as the shaping of digital things for people's use [3], it is a knowledge worth mastering.

Before the emergence of personal computers in the late 1970s, computers were mainly used by information technology professionals and dedicated hobbyists [3]. Interacting with these computers was done through a command line interface, and all interaction was done by directly programming the computer. A few years later the mouse and the graphical window interface were invented, which completely revolutionized the whole computer industry. But since then, over 20 years ago, interaction with computers and technology has looked almost the same [1]; keyboards and mice are still the standard interface method, a method proven to work well with personal computers and machines through the years. Today, however, a lot of other well working alternative interfaces exist, such as touch screens, voice control and even gesture based interfaces like the Leap Motion [4], to mention one. In this report we explore the possibilities of using these modern user interfaces with technology otherwise strongly associated with very traditional interfaces. Usability will be the focus when evaluating these new possibilities.

Human Computer Interaction (HCI) is often defined around the concept of usability [3], a concept containing qualities like fun, efficiency, flow, aesthetic tension and many similar. Today a lot of research is done in the field of HCI, stretching from psychology and ergonomics to computer science and philosophy. HCI is an interdisciplinary field of study that has gained increasingly more attention since the birth of personal computers [3, 5], and it looks like the research will not stop in the near future. The HCI field will keep evolving at a great pace for many years to come [5].

Unifire AB is a small Swedish company that builds and develops robotic water cannon control systems. These robotic water cannon systems are mainly used in fire fighting installations, dust control applications and as pirate defense systems (figure 1). Water cannons are still today, and have for a very long time been, manually maneuvered with some kind of physical joystick or pointer (figure 3) [6]. But as fully automated systems, functional AI and other new modern technology are slowly starting to enter this industry (and every other industry as well), old habits of human interaction are being challenged. As robotic water cannons often are used in challenging environments with harsh conditions, researched and evaluated alternatives for controlling them are of great interest to the robotic water cannon industry, and future interaction designers working with robotics in general may also have great use of them. In addition, while a lot of research in HCI and HUI (human user interface) exists, research comparing different interfaces in the context of robotic water cannons does not yet exist.

1.1 Objective

The aim of this thesis is to research, implement and explore different methods of interacting with a robotic water cannon system. This includes a literature study, researching different ways of interacting with hardware and simple robotics and their respective pros and cons. The research later culminates in a conceptual implementation of various modern interfaces that can control the robotic water cannon. These concepts are then user tested and evaluated, with help of the background research, to see if they have the potential to be of any good use when interacting with robotics, and specifically robotic water cannons. Ideally, we want to answer the question: can we increase usability with these new interfaces?

The objective of the thesis is to gain knowledge in the field of modern interaction and to contribute by exploring the possibilities of using this knowledge and modern interfaces together with robotics and robotic water cannons. The goal is to develop three different conceptual interfaces for controlling the robotic water cannon system that might also serve as guidelines in the future development of robotic interaction.

1.2 Outline

To get the most out of this report it is important to understand its structure. The report is structured in six chapters, starting with a background explaining the Unifire TARGA system used in this thesis project to test different interaction methods. The background also introduces the research area. Following the background is the method chapter, describing in depth how everything was done during the thesis work: how the prototypes were designed based on the research and then how the testing was done. The results of the research, prototypes and testing are presented in the results chapter. A discussion together with a future work section is found in the chapter at the end of this paper.

1.3 Unifire AB

Unifire AB is a small industry and engineering company based outside Gothenburg, Sweden. Unifire AB sells robotic water cannons and its proprietary control system to customers all around the world. The robotic control system engineered and built by Unifire is unique on the market and puts them in a position far ahead of their competitors. Their system is used in fire fighting, dust control, fountain installations and as pirate defense on cargo vessels (figure 1) as a non-lethal defense alternative.

Figure 1: A cargo vessel pirate defense system using a non-lethal water cannon system. Several cannons can be mounted around the vessel to give full protection against pirates and/or other intruders.


2 TARGA System

This chapter contains a brief introduction to the TARGA, a CAN bus system designed and developed by Unifire AB. The goal is to give an overall understanding of the TARGA and a somewhat in depth technical description of the system.

2.1 TARGA Overview

TARGA (figure 2) is the name of a robotic control system developed by Unifire AB. It is a CAN bus (Controller Area Network) system designed to control robotic water cannons, nozzles, valves, pumps and similar hardware. The system is extremely modular and fully customizable in order to fulfill different customers' requirements. The core of the system is the uniquely designed PLC (Programmable Logic Controller) that can be programmed for specific requirements and unique installations.

Figure 2: The TARGA system. Inside is a PLC and a Raspberry Pi communicating through one of two CAN bus interfaces. The TARGA has an on board screen for motor calibration and logical settings. Physical connections for motors, ethernet, HUIs and a USB port for PLC programming can be found on the outside.

The TARGA can be physically customized to provide different numbers of motor contacts and/or HUI interfaces such as joysticks or pointers (figure 3). These connections also allow for connecting other hardware such as nozzles, pumps or valves. Unifire AB's most used controller is the TARGA PI Joystick. The PI Joystick has the capability to control a cannon's position, its nozzle and other functions in the system, such as opening valves and/or recording a move pattern.

(a) TARGA Pointer. The pointer sets absolute positions for the cannon.

(b) TARGA Joystick. The joystick gives the cannon motors a rotational speed on the x and y axes.

(c) TARGA PI Joystick. A joystick combined with hard button functions such as open valve, take control etc.

Figure 3: Three different types of human user interfaces compatible with the TARGA system.

2.2 Technical Description

Inside the TARGA sits a Raspberry Pi 2 Model B¹ that is able to communicate over one of the two available CAN bus interfaces. This Raspberry Pi can easily be connected to either a local or public network through the Ethernet connection and provide robotic control over a simple web connection; see figure 4 for a simplified system overview. The current implementation on the Raspberry Pi provides connection through web socket technology, a relatively new standard in web development². The web socket enables full duplex, open communication between the Raspberry Pi and the client/user, i.e. a smart phone or any other web socket capable device. The goal is to provide a platform that is both modern and easily modified but also extremely extensible and available on many devices and hardware. This gives the opportunity to build more user friendly applications and installations and to explore new ways of interacting with the robotic water cannons that were not possible before.

1 More information: https://www.raspberrypi.org/products/raspberry-pi-2-model-b/
2 More information: https://developer.mozilla.org/en-US/docs/Web/API/WebSocket
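To make the communication model concrete, below is a minimal sketch of a browser client talking to the TARGA's Raspberry Pi over a web socket. The host name, port and JSON message format are our own illustrative assumptions; the actual TARGA protocol is proprietary and not documented here.

// Minimal sketch: a browser client connecting to the TARGA's Raspberry Pi.
// Host, port and message format are assumptions for illustration only.
const socket = new WebSocket('ws://targa.local:8080');

socket.addEventListener('open', () => {
  // Hypothetical command: take control of the cannon before sending moves.
  socket.send(JSON.stringify({ cmd: 'takeControl' }));
});

socket.addEventListener('message', (event) => {
  // The TARGA reports its state (position, valve, nozzle) back to the client.
  const state = JSON.parse(event.data);
  console.log('cannon state:', state);
});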


Figure 4: A simplified version of the TARGA system. The Raspberry Pi is connected to the PLC over a CAN bus interface. The Raspberry Pi itself can be hooked up with a router or similar to provide network access.


3 Background

To firmly understand the objective of this thesis, some background information is necessary. In this chapter we present different human input interfaces, like the Leap Motion and its technology, and go through earlier and related work that has been the motivation and seed for this thesis work.

3.1 Related Research

We have reviewed research about different human interaction methods such as multi-touch screens, hardware interfaces (joysticks, controllers) and gesture based interaction. Good technology for gesture based interfaces has quite recently appeared on the consumer market. The Leap Motion and Microsoft Kinect are both high quality real time gesture tracking devices with high accuracy, which should make them ideal for interacting with technology.

The main learning from the papers is that traditional interfaces such as buttons, keyboards, mice and joysticks can in many situations be replaced or accompanied by new modern interface technology like multi-touch screens without any degradation in usability or performance. Touch screens also have the advantage of being able to change their content over time to best fit the needs of the user in certain situations. Gesture based interfaces, where the user uses his/her hands or full body to interact, have also been shown to work well with robotics, although a lot of work remains to compare their performance against more established interfaces.

Performance of Touch Screens

Lee et al. [7] studied the performance of touch screen soft buttons. Three different experiments were performed and a total of 13 subjects participated. The first experiment explored and compared the operational differences between soft buttons, soft buttons with feedback (auditory and/or tactile) and hard buttons. Their findings show that soft buttons together with some feedback, either tactile or auditory, performed equally well as hard buttons. If no feedback was used, the soft buttons performed worse than hard buttons in terms of corrections needed. They continue the discussion, arguing that the lack of natural feedback is the biggest disadvantage of soft buttons. In their second and third experiments they investigated the influence of soft button sizes and the differences between capacitive and resistive touch screens. These experiments showed that button size impacts performance; buttons smaller than 10 mm particularly decreased performance. In the case of capacitive and resistive sensors there was only a marginal difference in performance between them, although capacitive sensors are preferred today as they are more sensitive to overall touch.


Research investigating the performance of different on-screen gamepads for smartphones has also been done [8]. In their work, Baldauf et al. compared four different kinds of gamepad designs implemented on a touch screen: versions with directional buttons, an 8-way d-pad supporting swiping, a virtual joystick and a gestural tilt control. All versions performed well, and the joystick showed to be very promising; in some cases it performed better than the alternatives, although not significantly better. The tilt version, however, performed significantly worse in some tests. Oshita et al. have also demonstrated good touch screen performance [9]. In their work comparing a physical gamepad and a touch based interface, the touch based interface performed significantly better, mostly thanks to the flexibility of displays giving the ability to set custom labels on buttons, making it easier for users to press the correct ones.

Touch Based Robotic Interaction

Mark John Micire made extensive work and research [10] in the field of multi-touch robotic interaction and control. In his work, Micire compares physical joystick interfaces with multi-touch interfaces through extensive user testing and research. His findings are that a multi-touch interface does not degrade performance compared to the physical joystick interface. Further on, he implements a multi-touch joystick emulation controller called the "DREAM controller". This controller is later tested and shown to perform significantly better than the physical joystick interface. The tests also show that the DREAM controller is faster and easier to learn for the users than the joystick. The DREAM controller also completely adapts to each individual user by measuring and adjusting itself to the hands currently using it. The findings in this report are very interesting since they show that touch interfaces can replace hardware interfaces without introducing any performance degradation.

The DREAM controller is a multi-touch controller inspired by the design of gamepads and joysticks. It is a dynamically adjusting interface, both in size and position, making it a perfect interface for a wide range of hand sizes and shapes. The controller follows the user's hand while it moves around the screen, making it easy for the user to reach functions at all times without needing to move the hand [10].

A. Singh et al. also show in their article "An Interface for Remote Robotic Manipulator Control That Reduces Task Load and Fatigue" [11] that with a touch interface they are able to increase user performance when controlling a remote robotic arm compared to a physical industrial interface. 73% of the users that participated in their tests also mentioned that they preferred the touch based experience.

One of the greatest advantages of robots is the ability to put them in remote and hazardous locations. Instead of sending a human to disarm a bomb, we can use robots and control them from a remote position, safe from any harm to humans. This of course puts great trust in the operator and his/her controllers; they have to be both responsive and precise in manipulating the robot. Singh et al. investigated and developed a touch based interface for remote robotic manipulation [11]. Their implementation improved both ease of operation and task efficiency compared to a standard physical industrial controller in that niche.


Gesture Based Interaction

Gesture based interaction is becoming more and more accessible to the public with new technology like the Leap Motion and Microsoft Kinect. It is a relatively new approach to interacting with technology, but plenty of research has been made on this method, investigating its future possibilities and use cases. Szymanski et al. developed an interface fully controlled by hands and gestures. They found that even though it had a quite high error rate during the user tests, it still has great potential [12]. Prochazka et al. found this as well in their article "Mainstreaming Gesture Based Interfaces" [13], where they tested a simple gesture based slide navigation system. An interesting conclusion in their findings is that gestures have to be non-complex in order to offer a great user experience. They also found that delay in software or feedback is something that should be prevented at all costs; equally, Szymanski et al. found that lag has a big negative effect on the interface and user experience. Lopez et al. also investigated and user tested an interface where users interact with augmented objects through a Leap Motion [14]. Their findings were similar to the others.

Prochazka et al. also point out that the gestures used in a gesture based interface should not be created by the developer. Instead, all gestures used should be designed by the targeted user group itself; a user-centered design approach is preferred [13].

Robotic Control Using Leap Motion

Chen et al. [15, 16, 17] show that robotic control or manipulation is fully possible using the Leap Motion¹. They show that gesture based interaction with a robotic arm is technically possible with the Leap Motion and Microsoft Kinect. It also appeared to be a very intuitive way to interact, as their test subjects described it as very easy to learn compared to traditional joysticks. It is also discussed as being especially beneficial in tele-operated applications where the environment is a severe risk factor.

Gamepad Based Interaction

Gamepads are a well known type of interface for many people. Modern gamepads often give the user two thumb sticks that together provide four degrees of freedom, making them ideal for robot control [10]. A survey [18] measuring the user experience of gamepads has shown that they are phenomenal in providing comfort for long sessions and a great button layout with everything in easy reach. Compatibility was also one of the most positive aspects appearing in this survey [18].

3.2 Leap Motion Interface

The Leap Motion is a real time hand tracking USB device that connects to any personal computer running either Microsoft Windows² or Mac OS X³. The Leap Motion is capable of tracking two hands at a resolution of up to 200 frames per second with a reported accuracy of below 0.7 millimeters [19, 20, 21]. The Leap Motion uses two infrared cameras and three infrared LEDs. Together they work at a wavelength of 850 nanometers, which is outside the spectrum of light visible to humans. The 3D space that the Leap Motion provides for interaction extends to approximately 60 cm around the whole device but is limited to 150° on the sides and 120° in depth; see figure 5 for a visualization of the 3D space.

1 Read more: http://www.leapmotion.com
2 More information: https://www.microsoft.com/en-us/windows
3 More information: http://www.apple.com/osx/

Figure 5: Visualization of the 3D interaction space that the Leap Motion provides. It extends to approximately 60 cm above, 60 cm on each side (150°) and 60 cm deep on both sides (120°) from the controller.

Leap Motion also provides a complete set of development tools for a wide variety of programming languages. A programming friendly API⁴ is at the moment available in Python, JavaScript, C++, C#, Java, Unity, Objective-C and Unreal.

Figure 6: The Leap Motion USB device is a small hand tracking device with great accuracy and performance. It is compatible with all major personal computer operating systems after installing additional drivers provided by Leap Motion.

The Leap Motion API provides a set of easy to use classes that return detailed data about the user's hands and arms. This data contains in depth information such as finger bone lengths, a hand's spherical center and radius etc.

4 More information: https://developer.leapmotion.com/documentation/python/api/Leap_Classes.html
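As an illustration of how this data is consumed from JavaScript, the following minimal sketch prints basic hand data once per tracking frame using the leapjs library. The calls shown (Leap.loop, frame.hands, palmPosition, roll(), pitch()) follow the documented JavaScript API; the logging is our own.

// Minimal sketch using the leapjs library: Leap.loop delivers one tracking
// frame per update; roll() and pitch() return radians.
Leap.loop((frame) => {
  if (frame.hands.length === 0) return;  // no hands above the sensor
  const hand = frame.hands[0];
  console.log('palm position (mm):', hand.palmPosition);  // [x, y, z]
  console.log('roll:', hand.roll(), 'pitch:', hand.pitch());
});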

3.3 Gamepad Interface

Gamepads can be found in a wide variety of alternatives. They are mostly used when playing video games, and video game consoles have been one of the driving forces behind the development of gamepads. Among the most popular gamepads at the time of this thesis are the Sony PlayStation DualShock 4 and Xbox One controllers (figure 7) [22]. These controllers are usually referred to as part of the eighth generation of gamepad controllers.

The Sony PlayStation DualShock 4 and Xbox One controllers are quite similar despite their general design differences. They both have two analog thumb sticks, one for each hand, one 4-way digital pad on the left hand side and 4 action buttons on the right hand side. Each of them also has 4 shoulder buttons and 3 more buttons in the center for additional functions, usually used for accessing in-game menus and system menus etc.

Even though gamepads have mostly been used with game consoles, they are also found together with personal computers, usually with a USB connection. Even though the operating system usually does not take advantage of the controller, 3rd party programs can use it, PC games being one obvious example. Lately, web browsers have also started to implement support for USB gamepads, allowing websites and web based games to take advantage of the controllers.

(a) An Xbox One controller used together with the Xbox One.

(b) A PlayStation DualShock 4 controller used together with the PlayStation 4.

Figure 7: Two popular gamepad interfaces, the Xbox One controller and the PlayStation DualShock 4 controller, each coming together with its respective game console.


4 Method

In this chapter all methods used during the thesis work are presented: how the research was done by searching and reading previous work in the field, and how our prototypes were implemented and evaluated.

4.1 Introduction of Method

During the work of this master thesis we divided the work into several stages to get a graspable overview. We began with research on the topic and internal training at Unifire AB. We then went on to develop our prototypes in a design-build-test cycle. Lastly, we concluded our findings in this report.

The method chapter is divided into several sections and subsections and may need some clarification. In our prototyping we used a method called design-build-test, which is a cyclic method (figure 8). We have divided the sections describing our work according to the stages of the cycle, and each section includes descriptions of each cycle of that stage.

4.2 Research

To get an understanding of the current situation of available interfaces, both on the consumer market and in the research field, extensive investigation was done by reading articles, mostly found through Umeå University's online article database, which has access to a majority of all online publications. Readings on recent trends in robotics, personal computing and interaction design also provided additional contributions to the thesis.

4.3 Internal Training

To gain the required knowledge about the TARGA system and robotic water cannons, Unifire AB offered internal training at their facilities, giving in depth information about the system architecture, usage and other necessary information. Unifire AB also provided us with a TARGA unit, some motors and a joystick (figure 3) to use during the thesis work and future work. The internal training consisted of several informal meetings, both at Unifire AB's headquarters in Gothenburg and over phone, Skype and email conversations with their lead engineer.

Thanks to the openness of the TARGA system, it could be used to experiment with and test different configurations. It also provided easy access to all software for inspection and examination, making it easy to get an understanding of how it is built and designed.

4.4 Development of Prototypes

To be able to evaluate and determine the potential of different interfaces for controlling a robotic water cannon, we decided to develop prototypes and test them. Research has shown that prototypes and user tests are great tools for detecting early whether an idea is worth spending time and money on [23]. Low-fidelity prototypes can preferably be used with big advantages, like ease of iteration and low cost, without any loss of information compared to their high-fidelity counterparts [23, 24].

Throughout the prototyping stage we applied an iterative problem solving cycle model called design-build-test. We used a slightly modified version of the model presented in the article "Accelerating the Design-build-test Cycle for Effective Product Development" [25] by Steven C. Wheelwright and Kim B. Clark. The model focuses entirely on rapid development of prototypes or production line products. The model starts with the design process, best described as the stage where product goals are established and the problem is framed in a graspable manner. Before entering the next stage, alternatives for solving the problems are generated, favorably in brainstorming sessions and/or workshops. All alternatives then go into the build phase, where they are implemented as prototypes or models. This phase can easily become very time consuming if many alternatives are to be implemented. Third and last is the test phase, the stage where all prototypes are tested and evaluated, either by running simulations or experiments like user tests. This cycle is then iterated until one or several alternatives meet the earlier established goals. We performed a total of two iterations: in the first iteration low-fidelity prototypes were built, and in the second iteration high fidelity prototypes. In figure 8 the whole design-build-test cycle is described visually.


Figure 8: The iterative problem solving cycle, Design-Build-Test, presented by Steven C. Wheelwright and Kim B. Clark [25]. Start by establishing the goals and generating alternatives. Move on by building prototypes of each alternative and end the cycle by testing and evaluating each prototype. Iterate this process until a solution that meets the goals is reached.

4.4.1 Design: Goal Establishment & Generating Alternatives

The design phase is the first of three phases in the design-build-test cycle (figure 8). To establish our goals we took help from Unifire AB (the industry) and from academia by reading literature on the topic. We also searched the web for trends in interaction design and interfaces. As we knew that involving real end users of the water cannons would be impossible, Unifire AB and its staff were considered end users instead, as they have close relations to their customers and extensive experience with water cannons and how to interact with them.

When all goals were established, the ideation of different interfaces started. This was a process we mainly performed together with Unifire AB. Several small brainstorming sessions were held at different times, both on location and over Skype or phone; notes were taken afterwards summarizing each session. All brainstorming sessions were held internally with staff at Unifire only. A workshop was also held together with staff at Unifire AB where we discussed, tested and played with different interfaces such as the Leap Motion, different touch screens and joysticks.

Our established goals and selected alternatives that made it into the first implementationphase are presented in chapter 5.

4.4.2 Build: Implementation of Prototypes

The established goals and generated alternatives from the design phase led us to implement three different interfaces for controlling the water cannon. All implementations use web sockets for communication with the TARGA but are slightly different in their general structure and underlying method of implementation. Below we describe how each one of the prototypes was implemented.


Implementation of Touch Screen Interface

Our touch screen based interface prototype was in the first iteration implemented using paper and pencil, sketched as a low-fidelity prototype. In the second iteration it was implemented using the latest web technologies: HTML5, CSS3 and JavaScript. The choice to develop with web technologies was mainly made thanks to the fast iteration cycles and wide platform support they provide. JavaScript also fulfilled all our requirements on functionality, as we did not require any OS specific support. A web based interface was also ideal as it could be hosted on and served from the TARGA's built in Raspberry Pi, letting anyone with an internet browser, connected to the network, access the interface. All HTML, CSS and JavaScript was written from scratch to best fit the TARGA protocol.

To get good positioning of all touch buttons and elements, we took a closer look at thumb reach [26] and the problems that appear with smart phones and their functional area.
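The sketch below shows the core idea of such a touch joystick: touch offsets from the joystick center are normalized and sent to the TARGA as rotational speeds, mirroring how the physical joystick works. The element id, endpoint and message format are hypothetical.

// Core of a web based touch joystick (assumed ids, endpoint and protocol).
const pad = document.getElementById('joystick');        // hypothetical element
const socket = new WebSocket('ws://targa.local:8080');  // assumed endpoint

pad.addEventListener('touchmove', (event) => {
  event.preventDefault();
  const rect = pad.getBoundingClientRect();
  const touch = event.touches[0];
  // Normalize the touch position relative to the pad center to [-1, 1].
  const x = ((touch.clientX - rect.left) / rect.width) * 2 - 1;
  const y = ((touch.clientY - rect.top) / rect.height) * 2 - 1;
  // Screen y grows downward; the receiving side is assumed to handle orientation.
  socket.send(JSON.stringify({ cmd: 'move', x, y }));
});

pad.addEventListener('touchend', () => {
  // Releasing the joystick stops the cannon, like a stick springing back to center.
  socket.send(JSON.stringify({ cmd: 'move', x: 0, y: 0 }));
});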

Implementation of Leap Motion Interface

The Leap Motion low-fidelity prototype was implemented using just a cold wired Leap Motion. The high-fidelity prototype was implemented as a web app using HTML5, CSS3 and JavaScript, with the Leap Motion JavaScript API used together with the modern standard web technologies. The implementation was done following an object oriented programming methodology [27, 28, 29], simply to be able to add more features in the future. The Leap Motion JavaScript API requires a modern web browser, preferably a new version of Google Chrome or Firefox. The implementation uses a web socket to communicate with the TARGA. The Leap Motion JavaScript API also requires that the proprietary Leap Motion drivers are installed on the computer being used.

Implementation of USB Gamepad Interface

The USB game controller interface low-fidelity prototype was implemented using a cold wired Trust GXT 540 USB game controller. In the second iteration we implemented a high-fidelity prototype capable of controlling the robotic water cannon. The implementation is a web based solution using JavaScript and the HTML5 Gamepad API¹. The Gamepad API is, at the time of this report, at an experimental stage, but all major web browsers, such as Chrome (version 35 and above) and Firefox (version 29.0 and above), support it and it works. Thanks to the Gamepad API and the web based prototype there are no system requirements other than a modern web browser and operating system. The prototype can even be hosted directly on the TARGA's Raspberry Pi to eliminate any additional installations before getting it up and running. Our implementation follows an object oriented programming methodology [27, 28, 29] to easily be able to reuse and extend it in the future.

1 More information: https://developer.mozilla.org/en-US/docs/Web/API/Gamepad_API/Using_the_Gamepad_API
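The following sketch shows the heart of such an implementation: the pad is polled once per animation frame with navigator.getGamepads() and stick/button state is translated into (assumed) TARGA commands. Button indices follow the Gamepad API's "standard" mapping; the endpoint and message format are hypothetical.

// Polling loop built on the HTML5 Gamepad API.
const socket = new WebSocket('ws://targa.local:8080');  // assumed endpoint
let valveWasPressed = false;

function poll() {
  const pad = navigator.getGamepads()[0];
  if (pad) {
    const [x, y] = pad.axes;  // axes 0 and 1: the left thumb stick
    socket.send(JSON.stringify({ cmd: 'move', x, y }));

    const valvePressed = pad.buttons[0].pressed;  // bottom action ("A") button
    if (valvePressed && !valveWasPressed) {
      // Toggle only on the press edge, not on every polled frame.
      socket.send(JSON.stringify({ cmd: 'toggleValve' }));
    }
    valveWasPressed = valvePressed;
  }
  requestAnimationFrame(poll);  // stay in sync with the browser's render loop
}

window.addEventListener('gamepadconnected', () => requestAnimationFrame(poll));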

4.4.3 Test: Testing & Evaluating

As it was known from the start that communication and testing with real end users of the water cannon system would not be possible during this thesis work, the staff at Unifire AB was used as a source of prototype feedback and input. They were also used as test subjects for prototypes in the first iteration. Other randomly picked people, not related to the water cannon industry, were used for testing the prototypes in the second iteration. In total we performed eight tests: three in the first iteration and five in the second iteration. No one participated in both the first and second iterations of tests.

The first iteration of testing was made as "open interviews" [30]. As the low fidelity prototypes did not functionally work with the TARGA, we asked the participants to imagine that they were controlling a robotic water cannon, and asked questions about how they thought it would work to control a robotic water cannon using the given low fidelity prototype. The following questions were asked for each low fidelity prototype.

• Using the given interface, how would you take control of the robotic water cannon?

• Using the given interface, how would you move the cannon to aim?

• Using the given interface, how would you open & close the valve (start/stop water flow)?

• Using the given interface, how would you change the nozzle water spread?

These questions were asked so that we could implement a user driven design approach in our second iteration of prototypes.

Questions regarding other functionality, like "record", "play" and "home/park", were not asked as these functions do not affect the basic configuration of the TARGA system.

Second iteration testing was based on our high fidelity prototypes. We tested the gesture based Leap Motion interface on three users, and the gamepad and touch interfaces with one user each. During the tests, the prototypes were connected to a TARGA controlling the motors of a water cannon in real time. However, as the system is mostly installed as a remote system, the participants were not able to see the physical aim of the cannon; instead they had to rely on feedback presented in the prototype [11].

All participants were asked to complete the following predefined tasks, one at a time.

• Take control of the water cannon

• Move the water cannon and aim it at an arbitrary position

• Open and close the valve

• Change nozzle spread (not with the Leap Motion prototype)

After the prototype test, each participant got to try the TARGA PI joystick and perform the same tasks with it as with our prototype. We did this so we could ask the participant two questions that would measure our prototype against a reference point. The questions we asked were:

• What interface was easiest to use when performing the tasks?

• What interface would you prefer to use?


Notes were taken by the supervisor during each test, for example if the participant had problems finding a specific function or problems maneuvering the interface or the water cannon. The participants were also asked to think out loud while doing the test [31]. This method, called "Thinking Aloud", has several advantages, perhaps the largest being the ability to capture preference and performance simultaneously without needing to remember to ask questions about this later [32]. The supervisor took notes on everything the participant said. To gain additional data, each participant was asked directly after the test what they thought about each prototype.

Test Findings & Prototype Evaluation

To quickly ascertain the problem hot spots within each prototype, a preliminary analysis was done immediately after each user test [32]. This analysis was based on the supervisor's notes and objective observations during the test, and was a short summary describing each prototype's problems and/or benefits. This allows for fast iterations [32]. Evaluation was then made by matching this analysis against the goals established in the design phase (phase 1).

4.5 Presentations

At the end of the thesis work two concluding presentations will be held. The first one is a formal presentation focusing on the research and the thesis as a whole. The second presentation will be an informal, invite-only presentation of the results and prototypes from this thesis. Presentation slides will be used to assist during the presentations.


5 Results

This chapter presents all results obtained during the thesis work, which includes resultsfrom our literature study, our prototypes and all outcomes from user testing the prototypes.

5.1 Introduction of Results

Our results consist of several sub results. We have designed three different interface prototypes for controlling a water cannon. We have tested the prototypes with users and compared them to the standard controller used by Unifire AB today (the TARGA PI Joystick). We will first present some findings from the literature research and then present what we found during the design phase. Next we present all our prototypes, both low-fidelity and high-fidelity. Finally, our user test results are presented. Conclusions and further discussion regarding the results can be found in chapter 6.

5.2 Results from Literature Research

We found that, according to several articles [7, 9, 11], touch screen interfaces belong to the future and users are comfortable with using them as a daily method of interacting with technology. However, the lack of physical feedback can still be a problem when the user is not able to see the screen. But today the pros often outweigh the cons with touch screen interfaces in most daily use case situations.

Gesture based interfaces, on the other hand, are not yet as commonly accepted as touch interfaces, although initial research shows that they have great potential to become a more mainstream method of interaction in the near future. Several articles have shown that a gesture based interface can be very intuitive, easy to learn and especially beneficial in tele-operated applications.

5.3 Design Phase Results

Together with Unifire AB we established our prototype goals during the brainstorming sessions. Our main established goal is the following rule: each prototype shall possess the basic functionality needed to control a system with one robotic water cannon. Together with Unifire AB we defined basic functionality as:

• The ability to take control of the robotic water cannon.

• The ability to move the robotic water cannon along both the x and y axes.


• The ability to open and close a valve.

• The ability to change nozzle spread.

Other functionality like "record", "play" and "home/park" was added only as sub goals and considered a bonus if implemented.

A second established goal was that each prototype must be able to visualize the current state of the cannon. What this means is basically that when a user is controlling the water cannon remotely, he/she has to be able to see where the water cannon is aiming and how it is acting. In order to meet this requirement we had to accompany the Leap Motion interface and the USB gamepad interface with a screen and a visual user interface, as neither the Leap nor the gamepad has the ability to give the user any feedback of the required sort.

During the design phase we also held a small workshop together with staff at Unifire AB. The main goal of this workshop was to find three different interfaces to develop prototypes for. The interfaces should be modern and available to the general public. During the workshop, four main interfaces were discussed as being of extra interest. These four were:

• Touch based

• Gesture based

• Thumb stick based (gamepads)

• Gyro based

From our background research we knew that a gyro based interface was not a recommended method from a user performance perspective. As the staff at Unifire were considered end users, we let them decide on the touch, gesture and thumb stick/gamepad interfaces, as they said those were by far the most interesting ones.

5.4 Prototypes

In this section all our prototypes are presented, both the low fidelity and the high fidelity ones.

5.4.1 Low Fidelity Prototypes

Our low fidelity prototypes are the result of the initial interviews with Unifire AB and the research made. They are sketches and high level user flow descriptions and proposals, mostly designed to give the users in the first user test a good starting point and to test some initial ideas. You can find our high fidelity prototypes in section 5.4.2.

Touch Screen Interface

Our low fidelity touch screen interface consists of sketches proposing a touch based userinterface with functions according to the basic functionality goal.


(a) The little arrow at the nozzle slider indicates the nozzle's current position, while the slider is where the user wants to set it. The cannon position is presented with an x and y axis close to the joystick.

(b) The nozzle has been changed and the cannon position has also changed.

Figure 9: Our low fidelity touch prototype sketches.

Our prototype has a joystick-like imitation for changing the cannon position at its lower part, within easy reach for any hand or thumb. The decision to use a joystick imitation was based on users' responses to this technique in the article "Investigating On-Screen Gamepad Designs for Smartphone-Controlled Video Games" [8]. We placed a valve button and a slider for changing nozzle spread in the middle. We chose to make the control button a switch and to place it at the absolute top of the app to differentiate it from the other functions, due to the fact that this function is only intended to be called once, when the app launches and you want to take control of the cannon. We placed a triangular indicator on the nozzle slider to indicate the current state reported by the TARGA. We also put gauges close to the joystick to visualize the current position of the cannon.

Gesture Based Interface

In our low fidelity gesture based prototype we developed a proposed setting with the Leap Motion controller cold wired to a computer and a screen. We propose that on screen instructions are given during the interaction with the Leap. This way the user gets instant feedback on every interaction and can correct failures and follow up on success. The following flow of interaction is proposed for the gesture based prototype. This proposal was not intended to be shown to the user, as we wanted a user driven design; the setting was only for us to see how close our proposal would be to the users'.

1. Place both hands above the Leap Motion sensor to send a take control command to the TARGA.

2. Open the valve by giving a left thumbs up. Close it by not giving a thumbs up.

3. Start moving the cannon by making a fist with the right hand once, then change the roll and pitch of the hand to give speed and direction to the cannon.

4. Change nozzle spread by rolling the left hand.

Gamepad Interface

Our low fidelity gamepad interface prototype proposes a setting with one cold wired USB gamepad (Trust GXT 540) connected to a computer with a screen that shows feedback directly from the TARGA, such as current position and activated functions. We designed a proposed mapping of TARGA functions to a USB controller, based on a two analog thumb stick controller layout. This mapping was only intended to be compared with the users' proposals in the first user test, and to act as a backup if the user did not have any proposals.

• Left analog thumb stick controls the position of the cannon.

• Middle right system button (usually a "select/start" button) takes control over the cannon.

• Lower shoulder buttons control nozzle spread.

• Lower right action button (usually the "A" button) toggles the valve.

This fulfills the goal about basic functionality.

5.4.2 High Fidelity Prototypes

In the sections below you will find our three high fidelity prototypes. Several screenshots are attached for each prototype, together with a text describing how each one works.

Touch Screen Interface

Our high fidelity prototype solution consists of a single page mobile web application with a menu allowing for some minor app customization. The app (figure 10) meets the basic functionality goal by allowing the user to take control over the robotic water cannon with a switch in the top right corner. The user can move the cannon around with the touch based joystick placed in the bottom part of the app. The valve can easily be opened and closed by toggling the valve button found in the middle of the app. Nozzle spread is changed by dragging the nozzle slider, also placed in the middle. The app also visualizes the cannon's x and y position around the joystick. Beside the basic functionality support, the prototype also provides four additional buttons for extended functionality; in this case we had them mapped to system power, park/go to home, record pattern and play a recorded pattern. These functions exist on the TARGA PI joystick.

The joystick, being one of the most important parts of the touch based interface, was placed in the lower part of the app, close and easy for either the left or right thumb to reach. Other buttons, such as "take control", were placed more "out of reach" as they are not as frequently used.

We also put some extra effort into the touch prototype to make it feel like a real app. A side menu was added to provide some extra functionality, such as status messages for the socket connection, an auto take control function that automatically sends a take control message before every other call, and a safe delay function forcing the user to hold a button for two seconds in order to send the command. These functions were only implemented for testing purposes and are not included in the user testing or the basic functionality requirement.


(a) The touch implementation supports position control and feedback, nozzle spread, valve control and feedback, and some additional features.

(b) The implementation also provides a simple menu allowing the user to change some of the most basic app behaviors.

Figure 10: Our touch screen joystick interface. You have full control over position, nozzle spread, valve and other functions.

Gesture Based, Leap Motion Interface

As mentioned earlier, we had to accompany the Leap Motion and USB gamepad interfaces with a screen containing the necessary system feedback. With the Leap interface we chose to do this in the browser, as everything else was done with web technologies. The greatest challenge in designing an interface with the Leap Motion was by far designing a good interaction flow with gestures; there are simply too many possibilities to try and test them all. In our prototype we chose the following flow of interaction with the water cannon, based on the results from our first iteration user tests (figure 11); a sketch of the gesture logic follows after the list.

1. The interface starts in a disarmed state. To arm the interface, and also send a "take control" command to the TARGA, both hands should be placed over the Leap Motion. We did this to get the user's full attention.

2. When the interface is armed, the user gets two choices: either open the valve by extending the index finger and the thumb (figure 11b), or move the cannon around by making a fist with the right hand and opening it again to activate scene three.

3. When scene three is active, the user can move the cannon around by simply changing the roll and pitch of the right hand. The user can stop manipulating the movement by making a fist with the right hand again.

4. If the Leap Motion ever loses sight of both hands it will automatically be disarmed and all actions will be canceled. As long as it sees at least one hand it will stay armed and fully operational. If the right hand is ever lost during a move, movement will automatically pause and the user will be put back in scene two.
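The sketch below outlines how steps 2-4 could be detected with the leapjs library. hand.type, hand.grabStrength and the roll()/pitch() methods are part of the JavaScript API; the sendMove helper and the 0.9 threshold are our own hypothetical choices.

// Fist-then-open on the right hand toggles movement; roll and pitch of the
// open hand then steer the cannon.
let moveActive = false;
let wasFist = false;

Leap.loop((frame) => {
  const right = frame.hands.find((h) => h.type === 'right');
  if (!right) {
    moveActive = false;  // right hand lost: pause movement (step 4)
    return;
  }
  const isFist = right.grabStrength > 0.9;  // 0 = open hand, 1 = closed fist
  if (wasFist && !isFist) moveActive = !moveActive;  // fist released: toggle
  wasFist = isFist;

  if (moveActive) {
    // Hypothetical helper sending speed and direction to the TARGA.
    sendMove(right.roll(), right.pitch());
  }
});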

During the development of the gesture based interface we were not able to implement the user proposed function for nozzle control. To implement this feature, modifications to the TARGA firmware and its configuration would have to be made, and doing so would, with the version of the TARGA software current at the time of this thesis, have destroyed the other interfaces' capability to control the nozzle. Therefore we stripped the functionality down to only include valve and movement control; nozzle control was left out. This, on the other hand, does not match the basic functionality requirement, so this prototype would not make it into a third round of the design-build-test cycle unless the established goals were changed.


(a) The first screen in the Leap Motion interface gives the user instructions on how to begin controlling the cannon using the interface.

(b) When the user has taken control over the cannon, basic instructions are presented on how to perform a move action or open the valve.

(c) If the user starts moving the cannon, a 2D plane is presented visualizing the current position of the cannon and an indicator representing the hand input, i.e. speed and direction for the cannon. When the white hand indicator is placed in the middle, the cannon is steady.

Figure 11: Our Leap Motion user interface. It consists of three stages, where the first one is the disarmed stage; in this stage nothing is sent to the TARGA. The user can arm the Leap by putting both hands above the sensor itself. When armed, the user can open the valve by extending the left hand index finger and thumb (like a gun). The user can also start moving the cannon by clenching the right hand to a fist and then releasing it again; the interface will indicate that moving is active and a dot will move according to the hand, as well as a dot representing the cannon's actual position.

USB Gamepad Interface

As with the Leap Motion interface, we had to accompany the USB gamepad interface with a screen presenting status feedback from the TARGA, in order to meet the requirements set in the goal establishment phase. From our first iteration interviews we found that the following distribution of functions among the buttons was preferred; figure 12 gives a visual explanation, and a small mapping sketch follows after the list.

• The water cannon's position is controlled with the left thumb stick.

• Control is taken by pressing the triangle start button.

• Nozzle spread is changed with the two lower shoulder buttons.

• The valve can be opened and closed by pressing the A-button.
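Expressed in the web prototype, this layout boils down to a small mapping table like the hypothetical one below. The indices follow the Gamepad API's "standard" mapping and may differ between controllers; the Trust GXT 540 indices are an assumption.

// Hypothetical mapping of TARGA functions to standard-mapping gamepad indices.
const TARGA_MAPPING = {
  moveAxes: [0, 1],   // left thumb stick: x and y axes
  takeControl: 9,     // "start" button in the standard mapping
  nozzleWider: 6,     // lower left shoulder button (trigger)
  nozzleNarrower: 7,  // lower right shoulder button (trigger)
  valveToggle: 0,     // bottom action button ("A")
};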

Figure 12: A function-to-button map explaining which button corresponds to which function on the TARGA. As can be seen, the USB gamepad interface supports all functions that can be found on a TARGA PI joystick controller.

Earlier findings showed that it can be hard to remember which button corresponds to which function. Therefore we included the mappings in the supporting screen interface, making it easier for new users to adapt to the interface and limiting possible human errors regarding memory. The screen interface can be seen in figure 13.


Figure 13: The accompanying screen for the USB gamepad interface, instructing the user on how to use the actual gamepad to control the cannon. It also shows real time feedback from the TARGA, including cannon position, nozzle spread, valve status and control status. Extra functions such as play, record, home/park and power are also visualized.

5.5 User Test Results

We tested all our prototypes with users. The low fidelity prototypes were tested with staff at Unifire AB and the high fidelity prototypes were tested with people not related to the industry. Testing the low fidelity prototypes was done to gain knowledge about our progress and what direction to take with our prototypes into the next iteration. The high fidelity user tests were mainly done to measure our results.

5.5.1 Touch Screen Interface

Our results from the user testing of our touch screen interface are presented below.

Low Fidelity Prototype

The low fidelity prototype consisted of sketches of a proposed interface. During the test the user answered our predefined low fidelity test questions by pointing at buttons in the sketches. The user was able to point at all our pre-drawn functions and had no further comments.

High Fidelity Prototype

To begin with, the user struggled a bit with the first task, taking control. The user first spent some time looking at the application and then tried to move the touch joystick around without success. The user then found the control switch in the upper right corner and successfully took control over the cannon. The following tasks were performed without any noted issues; the user acted directly on each task without hesitation.

As with the touch prototype, the user struggled a bit with the first task when using the TARGA PI joystick. After a little time the user found the on button and successfully took control. When the user was asked to change nozzle spread, the user could not find the desired function; the supervisor had to show the knob on the joystick, and only after that help was the user able to complete the task. All other tasks were completed without any issues.

To our questions the user answered that the touch screen interface was the easiest one to use, thanks to clear buttons and informational texts. The TARGA PI joystick was nevertheless chosen as the preferred interface due to its precision. The user mentioned that once you have tried the TARGA PI and memorized all buttons and functions, that one is the easiest to use, and therefore the preferred one.

5.5.2 Leap Motion Interface

This section presents our results from the user test of the Leap Motion interface.

Low Fidelity Prototype

In our low fidelity prototype with the Leap Motion we had no predefined interaction pattern, only our defined interface and a proposed setting. We let our user answer the following questions and propose what gestures should be used.

• How would you take control of the cannon? By holding one hand over the sensor.

• How would you move the cannon? Rolling and pitching the right hand.

• How would you open and close the valve? Extending only my left index finger.

• How would you change nozzle spread? Hmm, by raising and lowering the left hand in relation to the sensor.

After this we could compare the user's answers to our proposed setting and adjust it accordingly. As we wanted a user driven design, we went with the user's proposals. However, we had to make one exception for safety reasons. The user proposed taking control over the cannon by holding one hand over the sensor; this was adjusted to both hands, since we want the user to be fully concentrated on the task. The user thought this was a good idea and approved it. We also needed to add a lock on the manipulation of cannon movement. The user proposed a fist-like interaction with the right hand: close and open the right hand to start manipulating movement.


High Fidelity Prototype

The Leap Motion interface turned out to be quite intuitive to use. The users only struggled a bit at the stage of activating the movement manipulation of the cannon: they did not at first realize that the hand should be opened again after making a fist to take control of the steering. The users just needed some extra time to figure this out; one user needed a little more time than the others. After this the users had no problems interacting with the cannon. It should be noted, though, that the Leap Motion sometimes lost track of the hands and interrupted the users; in these cases they had to start over. It should also be noted that, due to the incomplete nozzle implementation, the users were not able to adjust nozzle spread.

The visualization of the position of the cannon and the hand indicator was also easily understandable by the users.

All users asked thought the TARGA PI controller was both the easiest and the preferred interface to use compared to the Leap Motion. Their choices were based on the fact that the Leap Motion felt a bit unreliable and not as precise as the TARGA PI.

5.5.3 Gamepad Interface

In the following sections we present the results from our user tests of the low and high fidelity gamepad interface prototypes.

Low Fidelity Prototype

The user testing the low fidelity gamepad interface gave the following answers to our low fidelity prototype questions.

• How would you take control of the cannon? By pressing the start button.

• How would you move the cannon? By using the left thumb stick.

• How would you open and close the valve? By pressing the “A” button.

• How would you change nozzle spread? Using the two big shoulder buttons.

This fits well with what we proposed in our suggested design. What the supervisor did note, aside from this, was that the user had a hard time deciding which function should be mapped to which button. Indications that the mapping might be hard to remember were also noted. Therefore, in the high fidelity prototype, we added a picture of the controller with all mappings to the screen.

High Fidelity Prototype

Our user who tested the high fidelity gamepad interface prototype had average previous experience with gamepads. The user easily completed each task without any complications at all.

When the user got to test the TARGA PI joystick, problems appeared at the task of changing nozzle spread. The user had problems locating the desired function on the TARGA PI. After a while the user got a hint from the supervisor and then found it immediately. The nozzle spread is changed with a rotating knob on the top of the joystick.

To our questions the user answered that the gamepad interface was the easiest to use, although he said he would prefer the TARGA PI due to the feeling of greater precision in movement control.


6 Discussion

In this chapter we discuss our proposed prototypes and the user test results. We add our findings and opinions to the discussion started by the authors of related research. We also go through future work and further development of the prototypes.

6.1 Conclusions

In this section we present our conclusions from the different parts of our work. We go through general conclusions as well as conclusions regarding the touch based, gesture based and gamepad interfaces.

6.1.1 General

We have succeeded in developing three different types of interfaces, all capable of controlling a water cannon. Our gesture based interface had somewhat degraded performance in terms of the established design goals, due to a limitation in the TARGA configuration at the time of this thesis. Our user tests confirmed this, and also showed that we created interfaces possibly easier to use than the old, well established standard physical joystick, the TARGA PI. The tests also showed, however, that the TARGA PI is still the better performer and the preferred interface to use. We can conclude from this that the TARGA PI has a steeper learning curve but better performance, while our interfaces have a shorter learning curve but lack the same performance. They therefore complement each other well.

6.1.2 Touch Interface

Touch screens are today widely accepted as a human interface. According to research, they have been shown to perform as well as physical interfaces, and even better on factors like learnability. This was also shown in our user tests, where the touch interface was chosen as the easiest interface to use and learn compared to the physical TARGA PI joystick. Although this strengthens the claim that a touch based interface can be easier to learn, it does not mean it will be the best performing interface in a specific task. Our post-test questionnaire showed that the TARGA PI, once it was fully understood by the user, was chosen as the preferred interface for the tasks. Users mentioned “precision” as an argument for the TARGA PI; our guess is that physical feedback is the deciding factor. As mentioned earlier, the lack of physical feedback is one of the biggest disadvantages of touch interfaces [7]. It cannot be replaced, but its negative effects can be reduced by introducing other feedback such as sound and vibration [7].

In general, touch interfaces have several obvious benefits over physical ones. The ability to change the visuals on the fly, either to fit a specific user's needs or based on a specific interaction, is simply great.

6.1.3 Gesture Interface

Technology today, and solutions like the Leap Motion, is very precise at collecting data from our world, although the margin of error for users is still a bit too high for it to be good for use in a production state. It should also be mentioned that these gesture based interfaces often make the users' arms and shoulders tired after a short time of use, making them a poor choice for longer sessions. This was something some of our users mentioned during the workshop.

An interesting finding during our work with the gesture based interface was the actual hardware performance of the Leap Motion. In our background research we found several articles mentioning that lag and slow response times were among the major problems and drawbacks of gesture based technology. During our work, lag was extremely rare and we did not encounter it on a regular basis; we had only a few occurrences, happening inconsistently. This indicates that the technology is moving in the right direction, removing one obstacle at a time. Hopefully the technology will be completely ready in just a few years from now.

What we also noticed during our time with the Leap Motion is what we call “the lack of physical boundaries”. For applications like ours this is a deal breaking flaw. Our users showed tendencies to leave the Leap Motion's vision area several times, mostly without noticing it, because they were paying attention to the instructions and the task rather than to where they held their hands. And of course this is how it should be: the users should not need to make sure the hand is correctly positioned above the sensor. Introducing some kind of physical boundary would probably decrease the amount of tracking loss for the Leap Motion.
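A purely software-based mitigation is also conceivable: warning the user before a hand reaches the edge of the tracking volume. Below is a minimal sketch using the interaction box of the Leap Motion Python SDK (v2-style API); the warn_user function is a hypothetical placeholder that could, for example, play a sound or flash the accompanying screen interface.

    import time
    import Leap

    # Hypothetical placeholder; could play a sound or flash the screen interface.
    def warn_user():
        print("Hand is close to the edge of the tracking area!")

    MARGIN = 0.15  # warn within 15% of the tracking volume's edge

    def check_bounds(frame):
        box = frame.interaction_box
        for hand in frame.hands:
            # Normalize the palm position to [0, 1] inside the tracking volume.
            p = box.normalize_point(hand.palm_position, False)
            if not all(MARGIN < c < 1.0 - MARGIN for c in (p.x, p.y, p.z)):
                warn_user()

    controller = Leap.Controller()
    while True:
        check_bounds(controller.frame())
        time.sleep(0.05)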

In a production state we would not recommend a gesture based interface, for the simple reason that we think the disadvantages outweigh the benefits.

6.1.4 Gamepad Interface

Gamepad interfaces are great. They provide a large number of available buttons and functions, and they have a design and button layout that many users are familiar with and find comfortable. During our user tests with the gamepad controller we also found that they are very easy to use when a clear button layout is presented, since gamepads usually do not support dynamically changing labels, which is of course one of the downsides they share with all physical controllers. Our users thought the gamepad interface was easier to use but preferred the TARGA PI as the more precise controller.

Gamepads can be used in many applications, and they are easy to access and to program on any computer. We think that a gamepad interface is a good complement to the standard TARGA PI joystick.


6.2 Future Work

Our work is by no means complete. There are several areas open for improvement and continuation.

During the thesis work we were not able to get in touch with real end users of the water cannon system, forcing us to rely on staff at Unifire and on people not related to the industry as test persons. Our hope is to change this in the future and to have closer collaboration with real end users. This is, however, quite a big challenge due to the large number of use cases and applications for the system. The client base is geographically spread out, and many of Unifire's clients are not based in Sweden.

In this thesis we explored three different modern interfaces. This is far from every possible interface that could potentially be useful. Examples of other interfaces and areas we would like to see investigated are:

• High end gyros can today be found in almost every modern smartphone. We want to test whether these can complement the prototypes in any way.

• Control by standard mouse and keyboard. This could be especially interesting for operators in remote locations who are already using desktop or laptop computers.

• Voice control is gaining more and more ground, especially in the smartphone market. This might be a very convenient way to complement a touch based interface.

• Introduce some sort of physical boundary for the gesture based interface. This could be a vibrating bracelet or a hollow framed box, to mention two proposals.

6.2.1 Further Development

As a follow-up to this thesis work, Unifire AB will continue to explore different methods of interacting with their system. The development of the touch screen interface will, from this point, be continued by Unifire AB; the plan is to develop a fully working and stable touch screen interface with support for multiple languages, color schemes, TARGA controllers etc. Unifire will also continue to use the Leap Motion interface as a proof of concept demo product presented at fairs; it will most likely not be used in production in the near future. The gamepad interface will, like the gesture based interface, most likely not be used in production either. However, the gamepad interface has turned out to be a great tool for Unifire staff when debugging TARGA installations from a remote position. Therefore the gamepad interface will be further developed and stabilized to work with more gamepad brands and models.

6.3 Restrictions & Limitations

This master thesis covers 30 hp (Swedish academic credits), which corresponds to 20 weeks of full time studies. The project was performed together with Unifire AB at a distance, which added some limitations to the communication between the student and Unifire, but constant updates through different communication channels helped to minimize this limitation.


We have chosen not to include any larger industrial computer or robotic systems in this project due to availability and time limitations. The hope is nevertheless that the outcome of this thesis can offer some guidance when designing for large systems similar to this one as well.

During the thesis we had no access to real end users of the TARGA system. Instead, staff at Unifire AB were used as test persons, together with users not related to the industry or the field.

6.4 Summary

In this master thesis we have shown that using modern human interfaces such as touch, gesture based interfaces and gamepads together with industrial water cannon applications is possible, and that they could be a very good complement to the traditional physical interfaces commonly used in these systems today.

We have tested three modern interfaces and compared them to a traditional water cannon joystick interface. All our users preferred the old, established and well developed joystick interface, but in all cases the users said that our new interfaces were easier to learn and understand, with the exception of the gesture based interface, which the users thought was a bit hard to use and unreliable.

During our research we found that touch based interfaces are today good enough to perform as well as a physical interface if used and implemented correctly. It was also found that gesture based interfaces can be very intuitive and give the user a very natural way of interacting. Gamepads are a familiar type of interface for many users; they have shown to be comfortable, to have great button layouts and, not to forget, a very decent number of buttons and functions.


References

[1] M. Porta, “Human–computer input and output techniques: an analysis of current research and promising applications,” Artificial Intelligence Review, vol. 28, no. 3, pp. 197–226, 2007.

[2] A. Dix, Human-computer interaction. Springer, 2009.

[3] M. Soegaard and R. F. Dam, “The encyclopedia of human-computer interaction,” The Encyclopedia of Human-Computer Interaction, 2012.

[4] S. Sandqvist and L. Stenmark, “Att navigera med gester: Gestbaserad teknik för framtiden,” 2014.

[5] J. Canny, “The future of human-computer interaction,” ACM Queue, vol. 4, no. 6, 2006.

[6] Unifire AB, “Delivery statistics and sales data.” http://unifire.com, 2015.

[7] S. Lee and S. Zhai, “The performance of touch screen soft buttons,” in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 309–318, ACM, 2009.

[8] M. Baldauf, P. Fröhlich, F. Adegeye, and S. Suette, “Investigating on-screen gamepad designs for smartphone-controlled video games,” ACM Transactions on Multimedia Computing, Communications, and Applications (TOMM), vol. 12, no. 1s, p. 22, 2015.

[9] M. Oshita and H. Ishikawa, “Gamepad vs. touchscreen: a comparison of action selection interfaces in computer games,” in Proceedings of the Workshop at SIGGRAPH Asia, pp. 27–31, ACM, 2012.

[10] M. J. Micire, “Multi-touch interaction for robot command and control,” tech. rep., University of Massachusetts Lowell, 2010.

[11] A. Singh, S. H. Seo, Y. Hashish, M. Nakane, J. E. Young, and A. Bunt, “An interface for remote robotic manipulator control that reduces task load and fatigue,” in RO-MAN, 2013 IEEE, pp. 738–743, IEEE, 2013.

[12] J. M. Szymanski, J. Sobecki, and P. Chynał, “Action tracking in gesture based information systems,” in Proceedings of the 2014 Multimedia, Interaction, Design and Innovation International Conference on Multimedia, Interaction, Design and Innovation, pp. 1–7, ACM, 2014.

[13] D. Prochazka, J. Landa, T. Koubek, V. Ondrousek, et al., “Mainstreaming gesture based interfaces,” Acta Universitatis Agriculturae et Silviculturae Mendelianae Brunensis, vol. 61, no. 7, pp. 2655–2660, 2013.

[14] G. Lopez, L. Quesada, and L. A. Guerrero, “A gesture-based interaction approach for manipulating augmented objects using leap motion,” in International Workshop on Ambient Assisted Living, pp. 231–243, Springer, 2015.

[15] S. Chen, H. Ma, C. Yang, and M. Fu, “Hand gesture based robot control system using leap motion,” in Intelligent Robotics and Applications, pp. 581–591, Springer, 2015.

[16] D. Bassily, C. Georgoulas, J. Guettler, T. Linner, and T. Bock, “Intuitive and adaptive robotic arm manipulation using the leap motion controller,” in ISR/Robotik 2014; 41st International Symposium on Robotics; Proceedings of, pp. 1–7, VDE, 2014.

[17] T. Grzejszczak, M. Mikulski, T. Szkodny, and K. Jedrasiak, “Gesture based robot control,” in Computer Vision and Graphics, pp. 407–413, Springer, 2012.

[18] B. Merdenyan and H. Petrie, “User reviews of gamepad controllers: A source of user requirements and user experience,” in Proceedings of the 2015 Annual Symposium on Computer-Human Interaction in Play, pp. 643–648, ACM, 2015.

[19] F. Weichert, D. Bachmann, B. Rudak, and D. Fisseler, “Analysis of the accuracy and robustness of the leap motion controller,” Sensors, vol. 13, no. 5, pp. 6380–6393, 2013.

[20] A. Colgan, “How does the leap motion controller work?.” http://blog.leapmotion.com/hardware-to-software-how-does-the-leap-motion-controller-work/, 2014.

[21] A. Chan, T. Halevi, and N. Memon, “Leap motion controller for authentication via hand geometry and gestures,” in Human Aspects of Information Security, Privacy, and Trust, pp. 13–22, Springer, 2015.

[22] A. Melcon, “Best pc game controllers 2016.” http://www.tomsguide.com/us/best-pc-game-controllers,review-2776.html, 2016.

[23] M. Walker, L. Takayama, and J. A. Landay, “High-fidelity or low-fidelity, paper or computer? choosing attributes when testing web prototypes,” in Proceedings of the Human Factors and Ergonomics Society Annual Meeting, vol. 46, pp. 661–665, SAGE Publications, 2002.

[24] C. Boothe, L. Strawderman, and E. Hosea, “The effects of prototype medium on usability testing,” Applied Ergonomics, vol. 44, no. 6, pp. 1033–1038, 2013.

[25] S. C. Wheelwright and K. B. Clark, “Accelerating the design-build-test cycle for effective product development,” International Marketing Review, vol. 11, no. 1, pp. 32–46, 1994.

[26] J. Bergström-Lehtovirta and A. Oulasvirta, “Modeling the functional area of the thumb on mobile touchscreen surfaces,” in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 1991–2000, ACM, 2014.

[27] J. G. Allen, R. Y. Xu, and J. S. Jin, “Object tracking using camshift algorithm and multiple quantized feature spaces,” in Proceedings of the Pan-Sydney Area Workshop on Visual Information Processing, pp. 3–7, Australian Computer Society, Inc., 2004.

[28] IBM Object-Oriented Technology Center, Developing object-oriented software: an experience-based approach. Prentice Hall, 1997.

[29] C. S. Horstmann, Big Java Early Objects. Wiley, 2014.

[30] E. A. Hoffmann, “Open-ended interviews, power, and emotional labor,” Journal of Contemporary Ethnography, vol. 36, no. 3, pp. 318–346, 2007.

[31] C. M. Barnum, Usability testing essentials: ready, set... test! Elsevier, 2010.

[32] J. Rubin and D. Chisnell, Handbook of usability testing: how to plan, design and conduct effective tests. John Wiley & Sons, 2008.