Keyword Base Framework for DLNA Acceptance Testing


    Bruno Filipe Guia de

    Instituto Superior Tecnico, Lisboa, Portugal

    November 2015


    A successful acceptance represents the minimum operational functionality required by the customer to accept a software delivery, along with validation by use-case studies. So far, acceptance frameworks have focused on automating this type of testing. We propose an acceptance testing framework based on a keyword model, whereby each device is mapped into the network through a state machine and acceptance testing scenarios are translated into unique code, replacing manual testing with an automatic approach. Results showed that the proposed framework can reduce the demand for manual testing in the acceptance process, hence eliminating part of the limitations associated with the human operations that testing has so far required on a regular basis. However, the framework itself does not produce a pass or fail result; that evaluation derives from the level of functioning achieved in the final user implementation test.

    Keywords: Testing framework, Acceptance testing, DLNA, keyword-driven model

    1. Introduction

    Testing is an expensive activity, accounting for up to 55% of a project's cost [5]. In fact, failure to detect and correct defects in the early stages of a project can considerably increase its cost [3] and compromise the company's reputation. Therefore, optimization of testing activities can lead to important savings on the overall project cost. Cost is reduced in two interlinked ways: by reducing the cost of the testing activities as a whole, and by finding bugs earlier, hence avoiding costly solutions for issues discovered late. The core of the testing activities is the test case, where a test case corresponds to a set of instructions and an expected result. Combining several test cases produces a test suite and, ultimately, a test plan, which defines the set of test cases to be implemented. A test plan that must be performed several times per month, or even per week, is a good candidate for optimization. Manual execution of such a plan is tedious for the tester and time-consuming for the company, resulting in increased cost and greater potential for human error. Test automation is a desirable solution, as it overcomes the limitations of manual testing through operational optimization. Most of the work and research was conducted at the ACCESS Europe GmbH offices in Oberhausen, Germany, as part of my Software Tester position in the Quality Assurance department. The primary focus for the


    present work was to improve the acceptance testing procedure by developing a new automation framework, easily extensible and able to perform both simple and complex workflow testing without overlapping with existing tools. Currently, most of the internal testing performed by the quality assurance team is based on a so-called End-To-End (E2E) approach. In order to apply distributed testing based on end-user interactions, a manual setup of a test environment would be required. In a test environment, and prior to testing, it is necessary to set up the network configuration and deploy the System Under Test (SUT) and Device Under Test (DUT), while ensuring that all settings are correct. Testing operations involve one or two testers, who perform the test cases of a test plan and report the results. Interaction between software implementing the DLNA standard and digital media is one of the main products. Currently, there is no automatic solution that controls all the different devices potentially involved, and their interactions occurring at a human level, while simulating a wide range of end-user scenarios. The present work aimed to develop a solution applicable not only to testing between network devices but also to additional scenarios and products.

    1.1. Objectives

    The main objective underpinning this thesis was to develop an automatic testing framework solution that can save time and resources in the long run by creating templates, reusable code and an easy-to-follow method for extending the solution. A successful testing framework would free resources from manual testing for other activities, such as testing features that have not been automated.

    2. Background

    Digital Living Network Alliance (DLNA) is a protocol for accessing and sharing multimedia content between devices, developed by an organization of the same name. DLNA categorizes products into device classes, whereby each class has its own set of rules and characteristics. The main device classes are the following:

    DMS: corresponds to the servers where the content, including audio, video or images, is stored and sent on request to another device class.

    DMR: devices where the content from the server is displayed, such as a TV.

    DMC: a controller that enables the user to see a list of available servers and renderers.

    DMP: corresponds to a DMC with an embedded DMR.

    DMPr: a printer using the DLNA protocol, able to print images.

    2.0.1 Use Cases

    To ensure interoperability between devices, which may come from different manufacturers, a certification is issued by DLNA after verifying each product through a program partially described in Section 3. The certification is attributed only when there is proof of meeting the necessary requirements. DLNA provides a certification program, including tests against specific tools and testing with other, already certified, devices. Typical use cases of the DLNA protocol can be summarized in three types:

    1. Content playback from the server to a renderer: a controller selects content stored on a DMS and has it displayed on a DMR.

    2. Upload or download of content from the server: clients can upload content to, and download content from, a DMS.

    3. Printing content: printers can be connected to the DLNA ecosystem and print content sent by a DMC or DMP.
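The three use-case types above can be sketched as interactions between the device classes of Section 2. The sketch below is illustrative only: the class and method names are hypothetical, not part of the DLNA specification.

```python
class DMS:
    """Digital Media Server: stores content and serves it on request."""
    def __init__(self):
        self.library = {}

    def upload(self, name, data):
        # use case 2: a client uploads content to the server
        self.library[name] = data

    def fetch(self, name):
        # content is sent on request to another device class
        return self.library[name]


class DMR:
    """Digital Media Renderer: displays content pushed to it."""
    def play(self, data):
        return f"rendering {len(data)} bytes"


class DMC:
    """Digital Media Controller: browses servers and drives renderers."""
    def play_on(self, server, renderer, name):
        # use case 1: content playback from the server to a renderer
        return renderer.play(server.fetch(name))


server, renderer, controller = DMS(), DMR(), DMC()
server.upload("clip.mp4", b"\x00" * 1024)
print(controller.play_on(server, renderer, "clip.mp4"))  # rendering 1024 bytes
```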

    2.1. Test Automation Techniques

    According to various authors and studies [4], test automation techniques can be divided into four broad types: Capture-Replay Testing; Script-Based Testing; Keyword-Driven Testing; and Model-Based Testing. These are described in detail in the next subsections.

    2.1.1 Script Based Testing

    Script-based testing is founded on the idea of decomposing testing into small units and creating a small testing script for each unit. Once the scripts are implemented, execution is done automatically. New tests can be created by combining scripts and by changing the script execution order. The abstraction and quality of script-based testing depend in large part on the implementation used. Coverage is not easily extracted and must be compared manually against the requirements.
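A minimal sketch of this idea, with hypothetical step names: each unit is a small script (here a function), and new tests are formed simply by selecting and reordering the scripts.

```python
# Each small script performs one unit of work on a shared state
# and asserts its own precondition. Names are illustrative.
def power_on(state):
    state["on"] = True
    return state

def connect(state):
    assert state["on"]        # precondition: device powered on
    state["net"] = True
    return state

def play(state):
    assert state.get("net")   # precondition: network connected
    state["playing"] = True
    return state

def run_script(steps):
    """Execute a list of unit scripts in order, threading the state."""
    state = {}
    for step in steps:
        state = step(state)
    return state

# New tests are created just by combining scripts in a new order.
smoke_test = [power_on, connect]
playback_test = [power_on, connect, play]

result = run_script(playback_test)
print(result["playing"])  # True
```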

    2.1.2 Keyword Driven Testing

    Keyword-based scripts are built on the idea of having keywords, plus additional information, where each keyword is mapped to the scripts and libraries that execute the test. Adaptation between the script and the SUT can be achieved, hence increasing the tool's abstraction. Keyword-driven testing allows a high level of abstraction and can be executed automatically; however, coverage is difficult to measure.
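The mapping can be sketched as follows: a test case is a table of (keyword, arguments) rows, and a library maps each keyword to the script implementing it. The keywords and actions below are illustrative, not taken from any real tool.

```python
# Hypothetical action scripts backing the keywords.
def start_server(name):
    return f"server {name} started"

def browse(server):
    return f"listing content on {server}"

def play(item, renderer):
    return f"playing {item} on {renderer}"

# The keyword library: maps human-readable keywords to scripts.
KEYWORDS = {
    "Start Server": start_server,
    "Browse": browse,
    "Play": play,
}

# A test case written by a tester, with no programming required:
test_case = [
    ("Start Server", ["dms-1"]),
    ("Browse", ["dms-1"]),
    ("Play", ["clip.mp4", "tv-1"]),
]

log = [KEYWORDS[kw](*args) for kw, args in test_case]
print(log[-1])  # playing clip.mp4 on tv-1
```

Adapting the framework to a new SUT then only requires swapping the scripts behind the keywords; the test-case tables stay unchanged.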

    2.1.3 Model Base Testing

    Model-based testing consists in representing the behaviour of the SUT, or a subset of that behaviour, in a written form. Testing then follows the model to ensure the correct behaviour of the device. The tests are generated by a test-model generator and later executed with the support of test scripts. Some authors indicate that the main advantage of model-based testing is awareness of the system and of coverage [1].
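As a sketch, the SUT behaviour can be written down as a state machine and test sequences generated by walking it; the states and transitions below are illustrative, not a real device model.

```python
# Behaviour model of a hypothetical renderer as a transition table:
# (state, event) -> next state.
MODEL = {
    ("off", "power_on"): "idle",
    ("idle", "play"): "playing",
    ("playing", "stop"): "idle",
    ("idle", "power_off"): "off",
}

def generate_tests(start, depth):
    """Enumerate every event sequence of length `depth` the model allows."""
    paths = [([], start)]
    for _ in range(depth):
        paths = [
            (seq + [event], MODEL[(state, event)])
            for seq, state in paths
            for (s, event) in MODEL
            if s == state
        ]
    return [seq for seq, _ in paths]

# Each generated sequence becomes one test to run against the SUT.
for test in generate_tests("off", 3):
    print(" -> ".join(test))
```

Because the tests are derived from the model, coverage of the modelled behaviour can be reasoned about directly, which matches the advantage cited above.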

    2.2. Continuous integration

    Continuous integration is based on the idea of continuously fetching the latest version of the source code and compiling it to ensure that it works, on top of which tests can be executed. This practice ensures that source quality is tested in the early stages, and that broken source code can be detected and fixed early in the project. According to several websites, Jenkins is currently the most used continuous integration tool; it is also the one used by ACCESS GmbH, making it the tool of which we are required to have a good operational understanding.

    3. Implementation

    Proposed Testing Framework

    The main result and outcome of the present thesis is a testing framework called GenFramework. Because this thesis was conducted in a business environment, cost and time to market are both critical criteria. If the solution's development costs are higher than those of others available on the


    market, the present research would have limited interest and application. Concerning time to market, given that stakeholders (testers, programmers and managers) can be considered the main end-users of this framework, the shorter the time to deliver results with the tool the better; otherwise this tool would only have internal application. Negative feedback, or delay in presenting results, may result in cancellation of the project. GenFramework is required to provide easy integration and support of new hardware and operating system needs, given that new operating systems and mobile devices may arise. Even though the hardware and software running the SUT operate with minimum change, and preferably in an invisible mode, any new element should be integrated in the most transparent way possible. Once setup and configuration are done, the development and maintenance of test cases should be executable by a tester without programming skills. This approach releases the programmers from this time-consuming task. However, changes in the environment, such as new platforms, still require programmer support. After support is added, interaction between similar scripts should be applicable. In a distributed system such as DLNA, operations are not always sequential. A common