
Keyword Base Framework for DLNA Acceptance Testing

Bruno Filipe Guia de [email protected]

Instituto Superior Tecnico, Lisboa, Portugal

November 2015

Abstract

A successful acceptance test represents the minimum operational functionality required by the customer to accept a software delivery, along with validation by use case studies. So far, acceptance frameworks have focused on automating this type of testing. We propose an acceptance testing framework based on a keyword model, whereby each device in the network is mapped into a state machine and acceptance testing scenarios are translated into a unique code, replacing manual testing with an automatic approach. Results showed that the proposed framework can reduce the demand for manual testing in the acceptance process, hence eliminating part of the limitations associated with the human operations so far required for regular testing. However, the framework itself does not produce a pass or fail result; that evaluation derives from the level of functioning achieved in the final user implementation test.

Keywords: Testing framework, Acceptance testing, DLNA, Keyword-driven model

1. Introduction

Testing is an expensive activity, accounting for up to 55% of a project's cost [5]. In fact, failure to detect and correct defects in the early stages of a project can considerably increase its cost [3] and compromise the company's reputation. Therefore, optimization of testing activities can lead to important savings on the overall project cost. Cost is reduced in two interlinked ways: by reducing the cost of the testing activities as a whole, and by increasing the rate of early bug detection, hence avoiding costly solutions for later issues. The core of the testing activities is the test case, which corresponds to a set of instructions and an expected result. Combining several test cases produces a test suite and ultimately a test plan, which defines the set of test cases to be implemented. A test plan that must be performed several times per month, or even per week, is a good candidate for optimization. Manual execution of such a plan can be tedious for the tester and time consuming for the company, resulting in increased costs and a higher potential for human error. Test automation is a desirable solution, as it overcomes the limitations of manual testing through operational optimization. Most of the work and research was conducted at the ACCESS Europe GmbH1 offices in Oberhausen, Germany, as part of my Software Tester position in the Quality Assurance department. The primary focus of the

1http://eu.access-company.com/

present work was to improve the acceptance testing procedure by developing a new automation framework, easily extensible and able to perform both simple and complex workflow testing without overlapping with existing tools. Currently, most of the internal testing performed by the quality assurance team is based on a so-called End-To-End (E2E) approach. In order to apply distributed testing based on end-user interactions, a manual setup of a test environment would be required. In such an environment, and prior to testing, it is necessary to set up the network configuration and deploy the System Under Test (SUT) and Device Under Test (DUT) while ensuring that everything is correctly configured. Testing operations involve one or two testers who perform the test cases that form a test plan and report the results. Interaction between software implementing the DLNA standard and digital media is one of the company's main products. Currently, no automatic solution exists that controls all the different devices potentially involved, drives their interactions at a human level, and simulates a wide range of end-user scenarios. The present work aimed to develop a solution applicable not only to testing between network devices but also to additional scenarios and products.

1.1. Objectives

The main objective underpinning this thesis was to develop an automatic testing framework that can save time and resources in the long run by


creating templates, reusable code, and an easy-to-follow method for extending the solution. A successful testing framework would allow the release of resources from manual testing to other activities, such as the testing of features not yet automated.

2. Background

Digital Living Network Alliance (DLNA) is a protocol for accessing and sharing multimedia content between devices, developed by an organization of the same name. DLNA categorizes products into device classes, whereby each class has its own set of rules and characteristics. The main device classes are the following:

• DMS: corresponds to the servers where content, including audio, video or images, is stored and sent on request to another device class.

• DMR: devices where the content from the server is displayed, such as a TV.

• DMC: a controller that enables the user to see a list of available servers and renderers.

• DMP: corresponds to a DMC with an embedded DMR.

• DMPr: a printer using the DLNA protocol, able to print images.

2.0.1 Use Cases

To ensure interoperability between devices, which may come from different manufacturers, a certification is issued by DLNA after verifying each product through a program partially described in Section 3. The certification is attributed only when there is proof that the necessary requirements are met. DLNA provides a certification program, including tests against specific tools and tests with other, already certified devices. Typical use cases of the DLNA protocol can be summarized in three types:

1. Content playback: content is streamed from a server (DMS) to a renderer (DMR).

2. Content upload or download: clients can upload content to and download content from a DMS.

3. Content printing: printers can be connected to the DLNA ecosystem and print content sent by a DMC or DMP.

2.1. Test Automation Techniques

According to various authors and studies [4], test automation techniques can be divided into four main types: Capture-Replay Testing, Script-Based Testing, Keyword-Driven Testing, and Model-Based Testing. Those relevant to this work are described in detail in the next subsections.

2.1.1 Script Based Testing

Script-based testing is built on the idea of decomposing testing into small units and creating a small testing script for each unit. Once the scripts are implemented, execution is done automatically. Scripts can be combined among themselves, and new tests can be created by changing the script execution order. The abstraction and quality of script-based testing depend in large part on the implementation used. Coverage is not easily extracted and needs to be compared manually against the requirements.

2.1.2 Keyword Driven Testing

Keyword-based scripts are built from keywords plus additional information, where each keyword is mapped to the scripts and libraries that execute the testing. Adaptation between the script and the SUT can be achieved at this mapping layer, increasing the tool's abstraction. Keyword-driven testing allows a high level of abstraction and can be executed automatically; however, coverage is difficult to measure.
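The mapping just described can be sketched in a few lines of Python. This is an illustrative minimum, not GenFramework's actual API: the keyword names and library functions below are invented for the example.

```python
# Minimal keyword-driven sketch: each keyword from a test table maps to a
# library function that performs the corresponding action on the SUT.

def start_app(name):
    # Placeholder action; a real library would launch the application.
    return f"started {name}"

def play(device, content):
    # Placeholder action; a real library would drive the device.
    return f"{device} playing {content}"

KEYWORDS = {"StartApp": start_app, "Play": play}

def run_row(keyword, *args):
    """Look up the keyword and execute the mapped library function."""
    return KEYWORDS[keyword](*args)

print(run_row("StartApp", "myRenderer"))
print(run_row("Play", "myRenderer", "video.mp4"))
```

Because the test table only refers to keywords, the same table can be replayed against a different SUT by swapping the mapped functions, which is the source of the abstraction mentioned above.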

2.1.3 Model Base Testing

Model-based testing consists in representing the behavior of the SUT, or a subset of it, in a written model. Testing then follows the model to ensure the correct behavior of the device. The tests are generated by a test module generator and later executed with the support of test scripts. Some authors indicate that the main advantage of model-based testing is the awareness it provides of the system and of coverage [1].
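A toy sketch of this idea, under the assumption that the model is a simple transition table (this is not the thesis' actual tool chain): test sequences are derived mechanically by walking the model.

```python
# The SUT behavior written down as a transition model: state -> event -> state.
MODEL = {
    "stopped": {"play": "playing"},
    "playing": {"pause": "paused", "stop": "stopped"},
    "paused": {"play": "playing", "stop": "stopped"},
}

def generate_tests(model, state, depth):
    """Enumerate all event sequences of the given length from `state`."""
    if depth == 0:
        return [[]]
    sequences = []
    for event, target in model[state].items():
        for tail in generate_tests(model, target, depth - 1):
            sequences.append([event] + tail)
    return sequences

print(generate_tests(MODEL, "stopped", 2))  # [['play', 'pause'], ['play', 'stop']]
```

Every sequence the generator emits is a valid path through the model, which is why coverage of the modeled behavior can be reasoned about directly, as the cited authors point out.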

2.2. Continuous integration

Continuous integration is based on the idea of continuously fetching the latest version of the source code and compiling it to ensure it works properly, on top of which tests can be executed. This practice ensures that source quality is tested at an early stage and that broken source code can be detected and fixed early in the project. According to several surveys, Jenkins is currently the most used continuous integration tool; it is also the one used by ACCESS GmbH, making it the tool we are required to understand well operationally.

3. Implementation

Proposed Testing Framework

The main result and outcome of the present thesis is a testing framework called GenFramework. Because this thesis was conducted in a business environment, cost and time to market are both critical criteria. If the solution's development costs are higher than others available on the


market, the present research would have limited interest and application. Concerning time to market, given that the stakeholders (testers, programmers and managers) can be considered the main end users of this framework, the shorter the time to deliver results the better; otherwise the tool would only have internal application. Negative feedback or delays in presenting results may result in project cancellation. GenFramework is required to provide easy integration and support for new hardware and operating systems, given that new operating systems and mobile devices may arise. New elements should be integrated as transparently as possible, with the SUT's hardware and software operating with minimum change, preferably invisibly. Once setup and configuration are done, the development and maintenance of test cases should be executable by a tester without programming skills. This approach releases the programmers from a time-consuming task. However, changes in the environment, such as new platforms, still require programmer support. Once support is added, interaction between similar scripts should be applicable. In a distributed system such as DLNA, operations are not always sequential. A common example is server access, where different devices can perform operations on the server simultaneously. GenFramework should therefore be able to control multiple devices simultaneously. When executing a test case, all SUTs present in the system need to be monitored and controlled. A failure may occur in any of the SUTs, and GenFramework needs to be able to detect individual SUT failures.

3.1. Solution overview

The idea behind GenFramework is to use script files like the one in Listing 1. Each script file represents one test script to be executed. The language used is high level and is not bound to any specific programming language. The technical requirements are the following:

• Web socket modules: pip offers many suitable modules.

• Communication with Appium

• Communication with Selenium

• Multi-platform support: Python is supported on Windows, macOS and Linux.

• Integration with Jenkins

• Good community support (a subjective criterion)

• Support of reflection and dynamic properties

Listing 1: Simple script example

1 # Starting application
2 DMC myControl linux
3 DMS myServer windows
4 DMR myRenderer Android
5 # start interactions
6 myControl play myServer myRenderer video

Using a keyword-driven framework approach, testers only need to write scripts similar to the example in Listing 1. Lines 2, 3 and 4 represent the initiation of devices: a DMC, a DMS and a DMR are created. In line 6 testing starts, with the created DMC playing a video from the DMS on the DMR. To make this possible, the framework needs to perform a large number of operations in the background, and more information is needed: the DMS needs to have the content, the DMC needs to locate the content, and the DMR needs to be in a ready state to receive input. To improve testing quality, the framework should have knowledge of the current state of all SUTs. This allows it to better decide whether some behavior or action is correct. State Chart XML (SCXML) is an XML language created by the W3C [2] to represent event-based state machines. DLNA device classes mainly react to events, from other devices or from users. Being an event-based XML language makes SCXML a good candidate for modeling DLNA components. The goal is only to use SCXML to help model the DUT; compatibility with SCXML will not be complete but only inspired by it. This approach helps save time by avoiding the implementation of instructions that are not needed in this context. It is important to mention that we do not aim to fully support SCXML or generic implementations, but instead only use the concepts and operations from SCXML needed to better model DLNA systems. In the next subsections these tags and other SCXML concepts are explained in more detail.

An event can have multiple transitions; however, only one can be selected. Every time an event matches, its condition is evaluated, and the transition is executed only when the condition holds. When no transition matches the event by name and condition, a generic event called "*" can still be executed: any event name matches "*", although its condition also needs to hold. Transitions are evaluated in a top-down approach, stopping at the first executable one. Non-accessible transitions should not occur; however, they are possible if a previous transition is always invoked. SCXML allows compound states, which means that internally a state can contain more states. It also allows parallel states, which are
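The transition selection rules just described can be sketched as follows. The data layout is hypothetical (the framework's internal representation is not specified here): transitions are tried top-down, and the first one whose event name matches (or is the wildcard "*") and whose condition holds is taken.

```python
def select_transition(transitions, event, context):
    """Return the target of the first executable transition, or None."""
    for t in transitions:
        name_ok = t["event"] in (event, "*")
        cond_ok = t.get("cond", lambda ctx: True)(context)
        if name_ok and cond_ok:
            return t["target"]
    return None  # no transition fires

transitions = [
    {"event": "play", "cond": lambda ctx: ctx["ready"], "target": "playing"},
    {"event": "*", "target": "error"},  # generic fallback event
]
print(select_transition(transitions, "play", {"ready": True}))  # playing
print(select_transition(transitions, "stop", {"ready": True}))  # error
```

Note how the wildcard transition catches the unknown "stop" event, and how a "play" event whose condition fails would also fall through to it, matching the behavior described above.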


very similar to compound states: a group of state tags is preceded by a parallel tag, and all children are activated simultaneously. All onEntry and onExit handlers are executed. For simplification, in the framework's implementation the path of a parallel state needs to be specified. In DLNA, content upload and download are two examples of functionality inside a parallel state called the streaming state.

SCXML supports in its specification a data model embedded in the SCXML machines. However, we decided not to use it. Instead, each SCXML file has an associated data model with its own specification. This option keeps the SCXML cleaner and focused on device functionality only; testing-specific information is stored in different files and locations.

3.1.1 New State Machine

When a generic line is read in the script, an instance needs to be created. A new process with all relevant information is created; this process manages the binary to deploy (on request). The relevant information, such as the location of the delivery, is passed to the process. The binary is not running yet: it waits for an onStart() instruction from GenFramework to start.

3.2. Solution Architecture and Implementation

For a better understanding of the framework, specific parts are explained in more detail below. To achieve a framework that is as generic as possible, some options had to be more complex than expected.

3.2.1 TestRunner

TestRunner is the center of the framework; it is from this module that the whole framework is controlled. Once the framework is invoked, TestRunner receives a notification with the information about the script under test.

Every SUT has its own separate process. This allows the framework to control an unlimited number of devices and to control each one independently.
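A minimal sketch of this one-process-per-SUT design, using Python's multiprocessing module. The worker function and device names are hypothetical stand-ins, not GenFramework's real modules.

```python
import multiprocessing as mp

def sut_worker(name, queue):
    # A real worker would run the device's state machine and adapter calls;
    # here it only reports that it came up.
    queue.put(f"{name}: started")

if __name__ == "__main__":
    queue = mp.Queue()
    procs = [mp.Process(target=sut_worker, args=(n, queue))
             for n in ("myServer", "myRenderer")]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
    print(sorted(queue.get() for _ in procs))
```

Because each device lives in its own OS process, a crash or hang in one SUT can be detected and handled without taking the others down.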

3.2.2 State Machines

This module creates a state machine from SCXML files. To keep the solution generic, state machines need to be constructed at run time. The basic idea is to rely on lists and dictionaries to map the state machines. The state machine module is divided into two main blocks: a generic block, where a state machine is constructed, and a concrete block, where a state machine is initialized and used by other modules. To construct

the generic machine, the SCXML files should be located in a folder defined in the configuration; for example, a DMS.scxml file will be the state machine for a DMS machine.

3.3. Adapters

Adapters are the components created in the framework that connect with the SUT in order to interact with it. The adapters are structured in the following way: inside the folder "adapters", one folder per device class should exist, with the device class names taken from the state machine names. Inside each device class folder, one folder per implemented platform is expected (linux, windows, and so on) plus a Generic folder. Inside each of these folders an events.py file is expected. By default, the framework always tries the platform under test first; if the wanted function does not exist, the framework then tries the generic platform.
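The platform-first, Generic-fallback lookup can be sketched as below. The folder contents are modeled here as plain dictionaries of functions; the actual framework would load them from the events.py files, so this is an assumption-laden illustration rather than its real loader.

```python
def resolve(adapters, device_class, platform, func_name):
    """Prefer the platform-specific events module, fall back to Generic."""
    for plat in (platform, "Generic"):
        functions = adapters.get(device_class, {}).get(plat, {})
        if func_name in functions:
            return functions[func_name]
    raise AttributeError(f"{func_name} not found for {device_class}")

adapters = {
    "DMS": {
        "linux": {"start": lambda: "linux-specific start"},
        "Generic": {"stop": lambda: "generic stop"},
    }
}
print(resolve(adapters, "DMS", "linux", "start")())  # platform version wins
print(resolve(adapters, "DMS", "linux", "stop")())   # falls back to Generic
```

This layout lets a new platform be supported by adding only the functions that differ from the generic behavior.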

3.3.1 Scripts

Scripts are .txt files containing instructions about how tests should be executed. Ideally, instructions are mapped from already existing manual test cases. The two main types of instructions supported are generic and concrete instructions. Generic instructions create a new object, representing a new instance of a concrete object under test. The syntax is "DeviceClass Name Param", where:

• DeviceClass: corresponds to an existing state machine, of which a concrete instance will be created.

• Name: the name by which the concrete instance of the state machine will be referred to.

• Param: extra parameters; each device class, or even external conditions, dictates the parameters passed. An unlimited number of parameters is supported.

Once devices are created they can be invoked; this is referred to as a concrete operation. To invoke a device, its given name should be used. This makes the script more readable and helps the user know which device is interacting. The syntax for concrete operations is "Name Operation Param", where:

• Name: the name of the concrete state machine instance to invoke.

• Operation: corresponds to an existing operation supported by the invoked device.

• Param: extra parameters; each device and operation has its own parameters, and the tool supports an unlimited number of them.
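Distinguishing the two instruction types can be done by looking at the first token of each line, as in this rough sketch (the device class set comes from the classes named earlier; the returned dictionary shape is an invented convenience, not the framework's internal format):

```python
DEVICE_CLASSES = {"DMS", "DMR", "DMC", "DMP"}

def parse_line(line):
    """Classify a script line as a generic or a concrete instruction."""
    tokens = line.split()
    if tokens[0] in DEVICE_CLASSES:
        # Generic instruction: create a new instance of a device class.
        return {"type": "generic", "cls": tokens[0],
                "name": tokens[1], "params": tokens[2:]}
    # Concrete instruction: invoke an operation on an existing instance.
    return {"type": "concrete", "name": tokens[0],
            "operation": tokens[1], "params": tokens[2:]}

print(parse_line("DMS myServer windows"))
print(parse_line("myControl play myServer myRenderer video"))
```

Both lines of the example come directly from Listing 1, showing how the same parser handles device creation and device invocation.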


Generic and concrete instructions can be mixed throughout the script; however, a concrete instruction can only be performed after the corresponding generic instruction, otherwise there will be no device to perform the testing activities. A third set of instructions can also exist in a script: configuration options, which define device-specific rules in case of abortion or what to do when some event occurs.

When a line of the script fails, the execution is aborted and the exception and/or error message is returned, also indicating the line of the script where the error occurred.
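The abort-and-report behavior can be sketched as follows; the executor below is a stand-in, since GenFramework's real dispatch is more involved.

```python
def run_script(lines, execute):
    """Execute lines in order; abort and report the failing line number."""
    for number, line in enumerate(lines, start=1):
        try:
            execute(line)
        except Exception as exc:
            raise RuntimeError(f"script aborted at line {number}: {exc}")

def toy_execute(line):
    # Stand-in executor: fail on anything it does not recognize.
    if "bad" in line:
        raise ValueError("unknown instruction")

try:
    run_script(["DMS myServer windows", "bad line"], toy_execute)
except RuntimeError as err:
    print(err)  # script aborted at line 2: unknown instruction
```

Reporting the script line number rather than a Python stack trace keeps the failure actionable for testers who only work at the script level.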

3.3.2 DataModel

To run, scripts need information. An instruction such as "DMS myDMS linux" by itself does not provide all the information needed, beyond the fact that it is a DMS running on a Linux machine. Extra information is needed; in this case it would be at least the IP of the machine. Information can be given directly in the script: "DMS myDMS linux ip=192.168.12.3". To keep the scripts more generic and avoid modifications at each test run or environment change, a Data Model module was created. For any parameter added in a generic or concrete instruction without an assignment (i.e., without the a=b form), the Data Model is consulted and the string is replaced by the correct and complete information.
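A minimal sketch of this lookup, assuming the data model is a per-instance dictionary loaded from a separate file (the file format itself is not specified here):

```python
# Would be loaded from the Data Model files; contents are illustrative.
DATA_MODEL = {"myDMS": {"ip": "192.168.12.3"}}

def resolve_params(name, params):
    """Resolve script parameters: inline a=b wins, else consult the model."""
    resolved = {}
    for p in params:
        if "=" in p:
            key, value = p.split("=", 1)  # value given inline in the script
            resolved[key] = value
        else:
            resolved[p] = DATA_MODEL[name][p]  # consult the data model
    return resolved

print(resolve_params("myDMS", ["ip"]))           # from the data model
print(resolve_params("myDMS", ["ip=10.0.0.5"]))  # inline override
```

Keeping the environment-specific values in the data model means the same script runs unmodified against a different test network.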

3.4. Communication between processes and GenFramework

Having each SUT controlled by a state machine running in its own process may create communication issues, since an action in one SUT may have consequences in another SUT. For example, using a DMC to play a video on a DMR changes the DMR's status, so communication is required to update the DMR's state machine. In some cases, like this video-playing example, the state machine only has a passive behaviour: it should update its state and, if a verification function is implemented, it may check whether the DMR is actually playing content, reporting ok or fail to GenFramework. Several features were added to solve the communication issue. The first one is shared memory. Taking advantage of Python's multiprocessing library, a Manager from the multiprocessing module is created and instantiated with a dictionary. This dictionary is not iterable, so whoever accesses it needs to know what it is searching for. Each concrete instance should add an entry to the dictionary under its instance name. The procedure is not strict, but the expected convention is that the new entry, keyed by the concrete name of an instance, is itself a new dictionary. Returning to the previous example, the DMC that wants to play a video on the selected DMR can now search this dictionary for the concrete name of the DMR and is expected to find all the information needed to interact with it.
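The shared dictionary can be sketched with Python's multiprocessing.Manager as described; the instance name and fields below are illustrative.

```python
import multiprocessing as mp

def register(shared, name, info):
    # Each concrete instance adds an entry under its own name; by
    # convention the entry is itself a dictionary.
    shared[name] = info

if __name__ == "__main__":
    manager = mp.Manager()
    shared = manager.dict()
    register(shared, "myRenderer", {"ip": "192.168.12.7", "state": "ready"})
    # Another process (e.g. the DMC) can look the DMR up by name:
    print(shared["myRenderer"]["state"])
```

The Manager proxies the dictionary across process boundaries, so the DMC's process reads the entry the DMR's process wrote without any explicit message passing.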

3.5. Modification to ACCESS DLNA Testing

Once a generic framework was obtained, it was important to adapt it to enable its use at ACCESS. The adaptation includes modifying the framework to integrate with the current infrastructure and creating the links to interact with the SUT. In order to evaluate the effectiveness of the proposed framework for acceptance testing in a DLNA context, a wide range of scripts was created. The selected tests incorporate features relevant to acceptance testing and have well-defined test cases. Where possible, only mature features were used; at this testing stage, only well-defined specifications that had already been tested manually were considered. This facilitates the evaluation before deciding whether the tool is applicable to new and more complex cases where no manual test was performed. From analysis of the existing test cases and plans, a list of candidates was selected. A key list of features, together with their test plans, was identified:

• Parental Guidance: restricts the visibility of content to certain devices.

• Download/Upload: download and upload of content to a DMC or DMS.

• Content Playback: multimedia operations (play, pause, stop, and so on) on audio, video or images.

• Discovery Services: visibility of devices on the network.

3.6. ACCESS DLNA Testing Architecture

GenFramework must be incorporated into the current ACCESS testing architecture, not added as a new system running separately, which would increase the cost and resources needed to assure its functionality. The current architecture for testing products is based on continuous integration using a Jenkins solution. As can be seen in Figure 1, the Jenkins setup is controlled at the highest level by a master Jenkins, which coordinates all the slaves where the specific binaries - Windows, Linux, iOS and Android - are compiled.

Figure 1: Jenkins setup and organization


Source code is stored in GitHub and fetched by Jenkins for compilation. Once the compilation process is completed, testing can start. Pytest2 is the testing framework selected by ACCESS. The pytest test code is stored together with the source code fetched from GitHub. Once invoked by Jenkins, a specific folder is analyzed recursively and all Python files with a test tag identifiable by pytest are run.

For DMS testing, a server with content was added to the infrastructure to provide content to the servers on request. An API set was created to interact with all builds in order to decrease individual interaction times. GenFramework does not interact directly with the builds created by Jenkins but only through a proxy API. This abstraction level eases GenFramework's integration in Jenkins, hiding interaction and implementation details. The API enables device creation, specifies which devices are to be created, and lets them interact with other devices.

As an alternative, an API to the back end of the devices is also available, with communication following a procedure similar to that of the DLNA protocols. This keeps communication simple by allowing the back end to expose information similar to the information sent over the network. This internal API is called the Application Helper Library (AHL).

When using Jenkins and pytest, the internal procedure does not interact through the DLNA specification events that the DUT sends and receives. The main reason for this decision was that using DLNA specification events, either by listening or by sending commands, could interfere with testing in unexpected ways - for instance, adding more noise to the network, whereby more events would have to be processed - which could lead to issues not directly related to the features undergoing testing.

3.7. GenFramework Integration with Jenkins

To integrate GenFramework with Jenkins testing, an interaction was made with pytest. Pytest is a testing framework written in Python that generates reports and, if needed, can be integrated with JIRA to automatically create bug reports. In order to make GenFramework compatible, only a configuration change removing the report generation flag is required, and invocation is done as in Listing 2.

Given that the test run is performed by pytest, the report module was removed from the framework itself so as to clean up the code. Each test is run individually by pytest: test instances are created at the beginning of each execution and deleted at the end. Pytest simply runs all the tests with the specific tags, i.e., all Python test files with a test function inside. Hence, the tests in the repository require

2http://pytest.org

Listing 2: Modification to run from py.test

import AutomationFramework.MainClass as MainClass

def test_my_iot():
    mc = MainClass.MainClass()
    mc.run('IOT Script')

modification to ensure that only the right tests are run by pytest. This was achieved within Jenkins by creating specific jobs for each feature to be tested, ensuring that the script would only run inside folders where tests were required.

Figure 2: Modification in the architecture to support Jenkins and pytest.

3.8. Test Bed

GenFramework only partially ensures the test bed, through the integration module that obtains the testing deliveries and ensures the proper configuration. This module depends on the implementation specifics and may rely on external tools such as GitHub to obtain source code or Jenkins to compile and generate a testing executable. Within DLNA testing, if unexpected devices are present on the network, GenFramework may fail to detect them, since this is out of its scope, but such devices may interfere with testing and cause false failures. In the scenario where GenFramework runs from Jenkins, no test bed setup is needed: the test bed is a task performed by Jenkins and pytest, leaving GenFramework free to execute.

Originally, the framework had a module called Simulated. For DLNA testing, this element would allow creating any device class to simulate alongside the devices under test. However, during implementation it was concluded that this behavior could be encapsulated in the interaction module. Nonconforming test results due to test bed errors (the same test repeated multiple times giving different results) are mitigated by Jenkins and the structured approach.

3.9. Implementation Workflow

To achieve a valid implementation, a set of steps is required. First, the DUTs need to be identified in


order to create the state machines. DMS, DMR, DMC and DMP were identified. The Linux and mobile DMPs have different features and therefore need to be distinct: in Linux, the download and upload folders can be specified and the DMS needs to specify for each folder whether uploads are accepted, while the mobile version uses only predefined folders. However, at an early stage the DMC was discontinued and only DMP equivalents were incorporated into testing. Another relevant aspect is that the mobile versions of DMP have more options than the equivalent DMP for Linux. To accommodate this, the DMP state machines were split in two, one for mobile and another for Linux. Besides the continuous integration setup, where the framework is integrated with pytest and runs directly from the Jenkins server, it should also work as a stand-alone application to run tests in isolated networks where pytest and Jenkins are not available. Parental guidance testing may use a DMS installed in a router when a connection to the outside is unavailable. In this case the binary is already installed, since the test is performed in an ad hoc way; binary installation is kept outside of the tool because of the effort involved in creating a solution that keeps the test isolated yet connected to the outside in order to obtain the needed binaries or even content for the servers.

3.10. Modification on the framework

ACCESS has its own test automation system in place, so adaptations are required to apply GenFramework to it. Two key scenarios were identified: integration with the ACCESS testing framework, and ad hoc testing.

Figure 3: state updates between devices

3.11. Parallel operations

The current test setup, based on unit and integration tests run by Jenkins, or on testing using tools provided by DLNA, focuses on testing at the system level, where all functions are tested. My aim in this testing was to focus on parallel and interoperability operations. Supporting parallel operations offers a unique advantage: it makes it worthwhile to consider concurrent operations between devices, and not simply one-to-one testing.

Listing 3: Example of a JSON call to MIDD

software="Android"
Hardware="Nexus"
Real Device="Yes"
SUT=DMC
Jenkins location="link"

3.11.1 Middleware Interactor for DLNA Devices

For testing purposes, DLNA product software must be deployed on a wide range of hardware and software. The mobile operating systems Android, iOS and Windows Phone each have their own specifications, software and rules about how to start and deploy software. The enabling framework, called middleware interactor for DLNA devices (MIDD), runs as a server. MIDD was implemented outside of GenFramework so that it can be used by any tool; this type of operation is required not only by GenFramework but by any other framework. The implementation and architecture of this software are out of the scope of this work, which focuses on how GenFramework interacts with MIDD.

Testing on mobile (iOS, Android and Windows Phone) presents important challenges due to the limitations and specifications of these operating systems. New instances of DMS, DMC, DMP or DMR cannot simply be launched: only one instance can run per device. Interacting with so many different interfaces can be challenging because of the different code bases and tools required. In addition, the mobile interface of our applications evolves fast, making it difficult to keep the tests feasible. This problem was solved by developing internally a new framework outside GenFramework to unify the interactions with mobile devices. If a device cannot be launched, the error "No Hardware available" is displayed and the test execution aborts. Some of the fields in a device request are optional; hardware in particular can be omitted when any device able to run the test is acceptable rather than a specific model. Hardware and software from other manufacturers are frequently connected to the test environment: TVs with DLNA DMRs, and boxes with DMR and DMS devices. For these external devices, the middleware framework allows obtaining the device IP and a limited set of requested information. The abstraction level introduced by MIDD optimizes GenFramework development time by presenting simplified interfaces. Once testing is completed, a release instruction for the hardware should be sent to MIDD; otherwise the device remains marked as in use and cannot be used for testing by another instance of GenFramework or by other frameworks.
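The request/release cycle described above can be sketched as follows. This is a hypothetical in-memory stand-in for the MIDD allocation server, not its real API: the class name, field names and method signatures are assumptions for illustration only; the real MIDD runs as a separate server.

```python
class MiddClient:
    """Illustrative stand-in for the MIDD device-allocation server."""

    def __init__(self, inventory):
        # inventory: list of dicts describing the available devices
        self._inventory = inventory
        self._in_use = set()

    def request_device(self, software, sut, hardware=None):
        """Return a matching free device, or raise if none is available.

        `hardware` is optional: when omitted, any device able to run
        the test is acceptable (mirroring the optional request fields).
        """
        for idx, dev in enumerate(self._inventory):
            if idx in self._in_use:
                continue  # marked as in use by another framework instance
            if dev["software"] != software or dev["sut"] != sut:
                continue
            if hardware is not None and dev["hardware"] != hardware:
                continue
            self._in_use.add(idx)
            return idx, dev
        # Mirrors the "No Hardware available" error that aborts the run
        raise RuntimeError("No Hardware available")

    def release_device(self, handle):
        """Release the device so other GenFramework instances can use it."""
        self._in_use.discard(handle)
```

A forgotten `release_device` call leaves the device marked as in use, which is exactly the failure mode the text warns about.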



3.12. Parental Guidance

Parental guidance is a feature restricting content visibility for individual devices. The restriction levels and their names vary by region of the globe; from now on we will use the American levels, ranging from G (General Audience) to NC-17, where viewers under 17 are not allowed.

Parental guidance is enforced in three different ways:

1. By default content levels, whereby each DMS defines default levels for added devices and for added content. All new content added to the server, independently of its type, is given a default level.

2. By default device level, applied when a new device in the network is detected by the DMS. The added device does not need to interact with the DMS to be added to the list. DMRs are not subject to parental guidance levels because they do not navigate through content; the parental guidance level of a DMR is therefore equal to the level of the DMC.

3. By DMS modification of the device and/or content levels. Inside the DMS, folder levels can be altered, and changes are recursively propagated whenever a given level is less restrictive than the new folder level. For instance, if a folder is changed to level R, all G and PG-13 levels inside this folder and its subfolders will update to level R, but NC-17 will remain unchanged, because it is stricter than the enclosing folder's level, which is acceptable and the expected result.
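The recursive propagation rule in point 3 can be sketched as below. The folder representation (nested dicts) and the numeric ranking of levels are assumptions for illustration; the behavior mirrors the R/NC-17 example in the text.

```python
# Numeric ranking of the American parental-guidance levels: higher is stricter.
RANK = {"G": 0, "PG": 1, "PG-13": 2, "R": 3, "NC-17": 4}

def set_folder_level(folder, new_level):
    """Set a folder's level and recursively propagate it to subfolders."""
    folder["level"] = new_level
    for child in folder.get("children", []):
        _propagate(child, new_level)

def _propagate(folder, new_level):
    # Raise a level only if the current one is LESS restrictive;
    # stricter levels (e.g. NC-17 under an R folder) stay unchanged.
    if RANK[folder["level"]] < RANK[new_level]:
        folder["level"] = new_level
    for child in folder.get("children", []):
        _propagate(child, new_level)
```

Changing a folder to R therefore raises G and PG-13 descendants to R while leaving NC-17 descendants untouched, which is the expected result.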

3.13. Download/Upload

Besides streaming, which is the default option for broadcast content, the user can also opt to download or upload content. Through a DMC the user can download a file from a DMS to its local device, or upload a file to a DMS. From the technical perspective there is only one flag of difference on the DMS side: the flags used to identify the operation in the protocol are different, and the DMS may not even authorize the operation.
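A minimal sketch of that single-flag distinction, including the possibility that the DMS refuses authorization. The field names and helper functions are assumptions for illustration, not the real protocol encoding.

```python
def build_transfer_request(file_id, direction):
    """Build a DMC-to-DMS transfer request; only the direction flag differs."""
    if direction not in ("download", "upload"):
        raise ValueError("unknown direction: %s" % direction)
    return {"file_id": file_id, "direction": direction}

def authorize_transfer(request, allowed_directions):
    """DMS-side check: the DMS may decline to authorize the operation."""
    return request["direction"] in allowed_directions
```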

3.14. Remote Operations

When testing in simulated home networks, outside Jenkins where the infrastructure manages the interactions, GenFramework can run on a specific device; this issue is not handled in the Jenkins environment. Testing is only applied when using web services and browsing navigation, and when available in MIDD. Other types of communication are allowed but should be avoided to decrease the chances of testing interference.

Figure 4: Presentation page

3.15. External Devices

An external device is any device not manufactured by ACCESS GmbH and/or without access to a special API. Attempting to incorporate these devices into the GenFramework context brings new challenges to the interactions and to the validation of results: external devices work purely in a black-box testing context. Typically these devices are already on the market, so during the development of GenFramework it is reasonable to assume that any issue, until proved otherwise, results from our products under development. In Jenkins testing, external devices are obtained through MIDD: a request is made and the device is used once MIDD replies with OK.

3.16. Negative testing

Negative testing consists of doing something that is expected to fail in order to pass the test. This allows testing limit situations or unexpected input to analyze the DUT behavior. Consider the following sequence of actions:

1. DMC1 play audio DMR1 DMS1;
2. DMC1 stop DMR1;
3. DMC1 pause DMR1.

In a normal test, this would be considered a failed test, since DMR1 reported failure to change state. However, the behavior was the expected result: DMR1 was in a state in which a pause cannot be an issue, as no content is available. By knowing the status of the DMR1 state machine, the framework could identify that in the current state of DMR1 a pause cannot be emitted. But this approach may not always guarantee an expected result: due to the implementation of parallel operations explained in chapter ??, it is not possible to ensure which operation will be executed first, and hence the expected result. Testers constructing the test cases should avoid this ambiguity. A negative tag is added as the second parameter, and the keyword negative becomes a reserved word in the framework; it should not be used to define anything else in the scripts, as doing so can potentially trigger unstable behavior.
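The effect of the negative tag on a verdict can be sketched as below. The step representation and tag handling are assumptions, not GenFramework's real script syntax; the point is only that a step tagged negative passes when it fails.

```python
def evaluate_step(step_passed, tags):
    """Return the final verdict for a keyword step.

    `step_passed` is the raw outcome reported by the device; a step
    carrying the reserved "negative" tag inverts that outcome, so an
    expected failure counts as a pass.
    """
    if "negative" in tags:
        return not step_passed
    return step_passed
```

Under this rule, the `DMC1 pause DMR1` step above, tagged negative, passes precisely because DMR1 reports a failure to change state.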

3.17. Setting up the Framework and Adapters

The framework development and configuration processes were very time consuming. Framework development was the most time consuming task, taking 12 weeks. The development stage also included configuring the necessary adapters and the testing framework code required to accommodate new features. The framework development timeline was also affected by external factors such as feature requirements and changes, difficulties in the API communication with GenFramework, and requirements that were sometimes not defined until the middle or end of the project.

4. Results

During the development of acceptance testing for specific features, time was tracked for further analysis. In order to evaluate the efficacy of the proposed framework for acceptance testing in the DLNA context, a wide range of scripts was created. The selected tests should cover features relevant to acceptance testing with very well defined test cases. Where possible, the features used should be mature and, at this testing stage, should reflect well defined specifications already tested manually. This procedure facilitates the tool evaluation before deciding on its further application to new and more complex cases where no manual test was previously performed.

4.1. Testing

To verify the usability of the tool by testers without programming skills, the tool was briefly explained and the testers were then asked to create test cases. Deployment attempts were made (as explained previously); however, the configuration changes were too heavy to apply on every test run. During the development of acceptance testing for specific features, time was tracked for further analysis. The subsection below contains the analysis of the results by category. Due to time restrictions, GenFramework was only applied at full scale in the Jenkins environment; all the other implementations in isolated networks were just experiments not carried forward.

4.2. Discussion

The parental guidance results show that the number of tests increases when executed automatically, since the majority of the tests could not be executed manually (simultaneous download/upload, and operations while a download or upload is in progress). The number of bugs was smaller when executed manually. The difference is due to bugs being related not to functionality but to interface imperfections reported by testers while testing this functionality. Manual testing is finding different bugs in these features.

To catch the same bugs as manual testing, the interface would need to be tested as well, and at present that is not done.

Download and upload showed an increase in bug findings when compared to manual testing. The main reason for this is that downloads and uploads can be done in parallel. Manual tests are by default executed by a single tester, and parallel testing is hard, when not impossible, to execute manually. This also explains the higher number of test cases compared to manual testing, as the number of combinations is now much higher. A successful download or upload can be independently verified by analyzing the file after the operation concludes.
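The independent verification mentioned above can be sketched as a checksum comparison between source and transferred file. The function name and chunked-read approach are illustrative choices, assuming the framework has filesystem access to both copies.

```python
import hashlib

def files_match(source_path, target_path, chunk_size=65536):
    """Verify a completed download/upload by comparing SHA-256 digests.

    Files are read in chunks so that large media files do not have to
    fit in memory.
    """
    def digest(path):
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(chunk_size), b""):
                h.update(chunk)
        return h.hexdigest()

    return digest(source_path) == digest(target_path)
```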

Discovery services testing in GenFramework differs from the manual testing execution. Automatic testing focuses on the detection of devices after they connect and disconnect without notification. Manual testing, besides this, also covers switching the Ethernet connection on and off, router disconnects, and other operations not yet implemented by the framework. Testing automatically allows combinations such as verifying whether all devices are listed, and launching or stopping a large number of devices.

Content playback, when automated, is one of the hardest features to test, because validating that playback is actually working normally requires human intervention. The current validation is done through back-end API calls, meaning the GenFramework software asks for the current state and trusts the quality of the response.
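The back-end validation described above amounts to polling the device API for its reported state. A minimal sketch, assuming a `get_state` callable that queries the device; the state names and timings are illustrative.

```python
import time

def wait_for_state(get_state, expected, timeout=5.0, interval=0.1):
    """Poll get_state() until it returns `expected` or the timeout expires.

    Returns True when the device reports the expected state in time.
    Note this trusts the API response: the reported state may say
    "Playing" even if the rendered video is not actually advancing,
    which is exactly the accuracy limitation discussed in the text.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if get_state() == expected:
            return True
        time.sleep(interval)
    return False
```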

4.2.1 ACCESS Testing Overview

Content playback is the most difficult element to test automatically because it is difficult to know when it works. MCVT, the official DLNA development tool, has the same issue, solving it with a request for manual import or by requesting a predefined server port for the output. The API works; however, interface issues were not identified when the internal state changed from Play to Pause but the video continued to play. As seen in table ??, the range of testing increases significantly in this situation. We suggest that automatic executions run regularly and that manual testing be performed occasionally, as a quality control that verifies the match between expected and real results. During testing, the parental guidance rules had to be changed, forcing a framework readjustment of the PG levels. The number of combinations and the complexity involved were exponential and almost impossible to handle manually; however, the automatic enforcement worked quite well. Discovery services testing, when done automatically, achieves more quantity, but manual discovery testing is more focused on quality: automatic testing repeats the same scenarios with small differences, while manual testing focuses on areas of discovery services not yet covered by the framework.

Content playback testing still requires improvements in result validation. The current API approach does not offer enough accuracy and can lead to false negative or false positive results, undermining the trust in GenFramework. Across the testing of different functionalities it became clear that the interface itself needs to be tested, either with interface testing as a category of its own or by improving current tests to also cover interfaces.

5. Conclusions

It is important to keep in mind that the goal of the solution was not to replace manual testing completely, as activities such as verifying look and feel, reaction times, quality of items, and free testing are all tasks that are potentially difficult to automate. From a user point of view, an automatic test passing an E2E test case does not mean that a manual user could not identify issues undetected by the automatic test. However, automation ensures on a regular basis that, at least to some extent, the test case still works and the solution is not damaged. The proposed goal of applying an automatic framework to map acceptance testing was accomplished; however, it does not replace manual testing completely.

The results from the use of GenFramework by testers without prior knowledge were discouraging, since the testers were confused and did not know what to do. Better documentation of the framework is required: the tool's principles and operational guidelines should be made very clear to the tester. Unfortunately, due to the lack of resources, testers could not use the framework for longer periods, and therefore the test scripts were developed together with the tool development.

6. Achievements

The overall number of bugs found was larger in automatic than in manual testing. However, it is important to register that some automatic test categories reported fewer issues than the manual execution, an issue that requires deeper analysis. When considering mobile testing, the automatic testing focuses on behavior only, declaring issues when some functionality does not produce the expected result. During the same process, manual testing will also report glitches in the interface, sometimes even reporting entirely unrelated issues found because, for unidentified reasons, the tester executed further actions leading to a bug. When considering only the functionality and not the interface or secondary aspects of testing, manual testing in general found as many or fewer bugs than automatic testing.

6.1. Future Work

For some well-defined types of applications, such as mobile (iOS, Android) or web pages, static analysis could be applied. From the information inferred from the state machine, the data model and testing scripts would be generated automatically and run through GenFramework.

Some aspects of the framework can be improved in the future, mainly extending the SCXML-supported languages, automatic generation of test cases, and new interfaces. During the development of the code to interact with ACCESS solutions, some impediments were identified that should be taken into consideration in future work, mainly the lack of a graphical interface and of IP configuration. The documentation still needs to go one level up to improve the usability of the tool by testers without programming skills.

References

[1] L. Apfelbaum and J. Doyle. Model based testing. In Software Quality Week Conference, pages 296–300, 1997.

[2] J. Barnett, R. Akolkar, R. Auburn, M. Bodell, D. C. Burnett, J. Carter, S. McGlashan, T. Lager, M. Helbing, R. Hosn, T. Raman, K. Reifenrath, N. Rosenthal, and J. Roxendal. State Chart XML (SCXML): State machine notation for control abstraction. 2015.

[3] B. Boehm and P. Papaccio. Understanding and controlling software costs. IEEE Transactions on Software Engineering, 14(10):1462–1477, Oct 1988.

[4] N. Javvaji, A. Sathiyaseelan, and U. M. Selvan. Data driven automation testing of web application using Selenium. In STEP-AUTO, pages 32–37, 2011.

[5] L. Y. Laporte, N. Berrhouma, M. Doucet, and E. Palza-Vargas. Measuring the cost of software quality of a large software project at Bombardier Transportation: A case study. ASQ Software Quality Professional, 14(3), 2012.
