
Transcript
Page 1: SOFTWARE TESTING

May/June 2009, Volume 2

CHAPTER 1: A SOA VIEW OF TESTING

CHAPTER 2: AGILE TESTING STRATEGIES: EVOLVING WITH THE PRODUCT

SOFTWARE TESTING

Testing strategies for complex environments: Agile, SOA


EDITOR’S LETTER

UNLESS YOU LIVE in a sci-fi novel, there's one rule of thumb for any new place you go: some things are different; some are the same. Usually, success, fun or survival in that new place depends upon how well you handle the differences. In this issue of SearchSoftwareQuality.com's Software Testing E-Zine, the new "places" are a service-oriented architecture (SOA) and an agile software development environment.

Examining the ins and outs of software testing in SOA environments in "A SOA view of testing," consultant Mike Kelly focuses on the "subtle differences." What's the same is that testers here must start with the basics. In SOA, the basics are connectivity, basic capacity, and authorization and authentication, not that different from other environments. The level of complexity of data models, however, is very different from that of other architectures and requires some new methods of testing.

While agile development requires different testing methods than the waterfall model, those differences lie as much in human behavior as in technology, according to consultant Karen N. Johnson in "Agile testing strategies: Evolving with the product." Agile is a more collaborative process and calls for seizing "iterations as a chance to evolve test ideas," Johnson writes. Fundamental testing tasks, like exploratory and investigative testing, stay the same, but the productivity of completing those jobs can increase.

Johnson and Kelly discuss the finer points of testing in agile and SOA, respectively, in this issue's articles. They also describe revelations they've had while working in those environments and best practices they've learned and continue to use.

The theme of connectivity runs through these discussions of SOA and agile testing, as the former focuses on system connectivity and the latter on human collaboration. Both approaches, when done well, can help development and IT organizations achieve lower costs and better software.

JAN STAFFORD
Executive Editor, [email protected]


Testing in the new worlds of SOA and agile


CHAPTER 1

A SOA view of testing

The complexity of a service-oriented architecture can complicate testing and make choosing and implementing the right testing tools more difficult. BY MICHAEL KELLY

SERVICE-ORIENTED architectures (SOAs) are stealthily becoming ubiquitous. Large and small enterprises leverage them to integrate wide arrays of disparate systems into a cohesive whole. Most companies and project teams that implement SOA do so to reduce the cost, risk and difficulty of replacing legacy systems, acquiring new businesses or extending the life of existing systems.

Testing a SOA requires a solid understanding of the SOA design and underlying technology. Yet SOA's complexity makes figuring out the functional purpose of the SOA difficult, and choosing and implementing the right testing tools and techniques even more so.

SOA is not simply Web services. According to iTKO (the makers of the TechTarget award-winning SOA testing tool LISA), nine out of 10 Web services also involve some other type of technology. In addition, they state that most testing for SOA isn't done at the user interface level. It's done using either specialized tools or as part of an integration or end-to-end test. This means people testing SOA need to be comfortable with varied and changing technology, need to understand the various models for SOA, and should be familiar with the risks common to SOA.

WHY SERVICE-ORIENTED ARCHITECTURES?

Large companies typically turn to SOA because their existing systems can't change fast enough to keep up with the business. Because business operations don't exist in discrete or finite units, there are a lot of dependencies built into existing systems. SOA is an attempt to separate the system from the business operations it supports. This allows companies to incrementally build or decommission systems and minimizes the impact of changes in one system on another.

Implementing a SOA removes the requirement of direct integration, which results in business operations that are easier to understand. That translates into a more testable system overall. Over time, this makes it easier to support a broader technology base, which in turn makes it easier to acquire and integrate new systems, extend the life of existing systems and replace legacy systems.

There are a number of different model architectures for implementing SOA, including publish and subscribe (often called "pub/sub"), request and reply, and synchronous versus asynchronous messaging. In addition, they can be implemented in a broad range of technologies, including message-oriented middleware (MOM), simple HTTP posts with DNS routing, Web services, MQ Series, JMS (or some other type of queuing system) and even files passed in batch processes.

If you're testing a SOA, you need to understand the implementation model and technologies involved, because these typically combine to give you some common features of a SOA. These features often become a focal point for your testing. They commonly include:

• Transformation and mapping: As data moves through a SOA, it's often transformed between data formats and mapped to various standard or custom data schemes.

• Routing: The path information takes as it moves through a SOA is often based on business rules embedded at different levels of the architecture.

• Security: SOAs often use standards such as application-oriented networking (AON), WS-Security, SAML, WS-Trust, WS-SecurityPolicy, or other custom security schemes for data security as well as authorization and authentication.

• Load balancing: SOAs are often designed to spread work between resources to optimize resource utilization, availability, reliability and performance.

• Translation: As data moves through a SOA, it's often converted or changed based on business rules or reference data.

• Logging: For monitoring and auditing, SOAs often implement multiple forms of logging.

• Notification: Based on the model of SOA implemented, different notifications may happen at different times.

• Adapters: Adapters (both custom and commercial) provide APIs to access data and common functions within a SOA.
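To make the transformation-and-mapping point concrete, here is a minimal sketch of a component-level mapping check. The field names, the canonical-to-target mapping and the helper `map_order` are invented for illustration; real tests would exercise the actual transformation component against its mapping document.

```python
# Sketch of a component-level transformation-and-mapping test.
# map_order() stands in for a real transformation step; the field
# names and target schema are hypothetical.
def map_order(canonical):
    """Map a canonical order record to a (made-up) target schema."""
    return {
        "OrderId": canonical["order_id"],
        "Total": f'{canonical["amount_cents"] / 100:.2f}',  # cents -> "dollars.cents"
        "Currency": canonical.get("currency", "USD"),       # default per mapping doc
    }

# Each case pairs an input record with the output the mapping document
# says we should get; this is the shape of a data-focused regression
# bed, just much smaller.
cases = [
    ({"order_id": "A-100", "amount_cents": 1999},
     {"OrderId": "A-100", "Total": "19.99", "Currency": "USD"}),
    ({"order_id": "A-101", "amount_cents": 500, "currency": "EUR"},
     {"OrderId": "A-101", "Total": "5.00", "Currency": "EUR"}),
]

for record, expected in cases:
    assert map_order(record) == expected, (record, map_order(record))
print("all mapping cases passed")
```

Because the cases live in plain data structures, growing the table as the mapping document grows is cheap, and a failure points directly at the record that mapped incorrectly.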

FIGURING OUT WHAT TO TEST

The first thing you need to do when you start figuring out what to test for your SOA is to develop (or start developing) your test strategy. When I'm faced with a new project and I'm not sure where to start, I normally pull out the Satisfice Heuristic Test Strategy Model and start there. Each section of the Satisfice Heuristic Test Strategy Model contains things you'll need to consider as you think about your testing:

• Test techniques: What types of testing techniques will you be doing? What would you need to support those types of testing in terms of people, processes, tools, environments or data? Are there specific things you need to test (like security, transformation and mapping, or load balancing) that may require specialized techniques?

• Product elements: What product elements will you be covering in your testing? What's your scope? To what extent will different data elements be covered in each stage? Which business rules? Will you be testing the target implementation environment? How much testing will you be doing around timing issues? How will you measure your coverage against those elements, track it and manage it from a documentation and configuration management perspective?

• Quality criteria: What types of risks will you be looking for while testing? Will you focus more on the business problem being solved or the risks related to the implementation model or technology being used? Will you look at performance at each level? What about security? How will you need to build environments, integrate with service providers, or find tools to do all the different types of testing you'll need to do?

• Project environment: What factors will be critical to your success and the success of the in-house team as you take over this work? How will you take in new work, structure your release schedules, or move code between teams, phases or environments? What are the business drivers for moving to a SOA, and how will those come into play while you're testing?

Recognize that as you think of these questions, it's a matrix of concerns. A decision (or lack of decision) in each of these categories affects the scope of the decisions in the other three. Therefore, you will most likely find yourself approaching the problem from different perspectives at different times.

Once you have an understanding of what features you'll be testing and you know which quality criteria you'll be focused on, you're ready to dig in. Don't be surprised if in many ways your testing looks like it does on a non-SOA project. SOA isn't magic, and it doesn't change things all that much. However, I've found that there are some subtle differences in where my focus is when I'm working on a SOA project.

COVER THE BASICS AS SOON (AND AS OFTEN) AS YOU CAN

The first area to focus on is typically connectivity. Establishing a successful round-trip test is a big step early in a SOA implementation. The first time connecting is often the most difficult.
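A round-trip connectivity check can be sketched with nothing but the Python standard library. The `/service/ping` path and the envelope contents below are invented, and a local stub server stands in for the real service so the sketch is self-contained; in a real project the check would point at an actual service endpoint.

```python
# Sketch of a round-trip connectivity check using only the standard
# library. A local stub plays the role of the remote service.
import http.client
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class StubService(BaseHTTPRequestHandler):
    """Stand-in for the remote end of the round trip."""
    def do_POST(self):
        self.rfile.read(int(self.headers["Content-Length"]))
        self.send_response(200)
        self.send_header("Content-Type", "text/xml")
        self.end_headers()
        self.wfile.write(b"<Envelope><Body><PingResponse/></Body></Envelope>")

    def log_message(self, *args):
        pass  # keep the check's output quiet

def check_connectivity(host, port):
    """Return True if a request makes it to the service and back."""
    conn = http.client.HTTPConnection(host, port, timeout=5)
    conn.request(
        "POST", "/service/ping",
        body="<Envelope><Body><Ping/></Body></Envelope>",
        headers={"Content-Type": "text/xml"},
    )
    response = conn.getresponse()
    ok = response.status == 200 and b"PingResponse" in response.read()
    conn.close()
    return ok

server = HTTPServer(("127.0.0.1", 0), StubService)
threading.Thread(target=server.serve_forever, daemon=True).start()
ok = check_connectivity("127.0.0.1", server.server_address[1])
print("round trip ok:", ok)
```

A check this small is cheap enough to run on every build, which is exactly the point of the regression advice that follows.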


Once you have that connectivity, don't lose it. Build out regression tests for basic connectivity that you can run on a regular basis. While a SOA evolves, it is easy to break connectivity without knowing it. If you break it, you'll want to find out as soon as possible so it's easier to debug any issues.

Next you may want to look at basic capacity. Capacity at this level can be anything from basic storage to connection pooling to service throughput. I've been on a couple of projects where these elements weren't looked at until the end, only to find that we had to requisition new hardware at the last minute or spend days changing configuration settings on network devices and Web servers. Figure out what you think you'll need early on, and run tests to confirm those numbers are accurate and consistent throughout the project.

In addition, don't put off testing for authorization and authentication until you get into the target environment. I've worked on several teams where development and test environments had different security controls than production, which led to confusion and rework once we deployed and tried to run using an untested permissions scheme. If you use LDAP in production, use it when testing. If you're handling authorization in your SOAP request, do it consistently through your testing, even if you think it's easier to disable it while doing "functional" testing.

FOCUS ON THE DATA

Service-oriented architectures aren't just cleaner interfaces between systems; they are also complex data models. As data moves through a SOA it's translated, transformed, reformatted and processed. That means you'll have a lot of tests focused on testing data. These tests typically make up the bulk of the regression test beds I've seen on projects implementing a SOA.

Tests focused on data in a SOA typically happen at the component level and leverage stubs and harnesses to allow for testing as close as possible to where the change takes place. This simplifies debugging and allows for more granular tracking of test coverage (based on a schema, mapping document or set of business rules). SOA teams are constantly thinking about regression testing for these tests. Small changes can have a large impact on the data, so these tests become critical when refactoring or adding new features.

For many teams, these tests can be part of the continuous integration process. While building these regression test beds can be very easy, once you have libraries of XML test cases lying around, maintaining them can be very time consuming. That makes a team's unit testing strategy a critical part of the overall testing strategy. What will be tested at the unit level, component level, integration level and end-to-end? Starting as close to the code as possible, automating there, and running those tests as often as possible is often the best choice for data-focused testing.
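One recurring chore in those XML regression beds is comparing a service's actual output to a stored expected document without failing on formatting noise. A sketch using the standard library's `xml.etree` follows; the order documents are inline stand-ins for files in a test-case library.

```python
# Sketch of a data-focused regression check: compare a service's XML
# output to a stored expected document, ignoring whitespace differences.
import xml.etree.ElementTree as ET

def normalize(elem):
    """Reduce an element tree to comparable (tag, attrs, text, children) tuples."""
    return (
        elem.tag,
        dict(elem.attrib),
        (elem.text or "").strip(),
        [normalize(child) for child in elem],
    )

expected_xml = "<order id='A-100'><total>19.99</total></order>"
actual_xml = """<order id='A-100'>
                  <total>19.99</total>
                </order>"""  # same data, different formatting

same = normalize(ET.fromstring(expected_xml)) == normalize(ET.fromstring(actual_xml))
print(same)
```

Normalizing before comparing keeps the test bed focused on the data, so a reformatted response doesn't produce a wall of false failures.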

UNDERSTAND YOUR USAGE MODELS

When I test a SOA, I develop detailed usage models. I originally used usage models solely when doing performance testing, but I've also found them helpful when testing SOAs. When I build usage models, I use the user community modeling language (UCML) developed by Scott Barber. UCML allows you to visually depict complex workloads and performance scenarios. I've used UCML diagrams to help me plan my SOA testing, to document the tests I executed, and to help elicit various performance, security and capacity requirements. UCML isn't the only way you can do that; if you have another modeling technique you prefer, use that one.

Usage models allow us to create realistic (or reasonably realistic) testing scenarios. The power behind a modeling approach like this is that it's intuitive to developers, users, managers and testers alike. That means faster communication, clearer requirements and better tests. At some level, every SOA implementation needs to think about performance, availability, reliability and capacity. Usage models help focus the testing dialogue on those topics (as opposed to just focusing on functionality and data).

When testing for SOA performance, I focus on speed, load, concurrency and latency. My availability and reliability scenarios for SOA often focus on failover, application and infrastructure stress, fault tolerance and operations scenarios. And conversations around capacity often begin with a basic scaling strategy and spiral out from there (taking in hardware, software, licensing and network concerns as appropriate). Because the scope is so broad, effective usage modeling requires balancing creativity and reality.
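The core of any usage model, UCML or otherwise, is a weighted mix of scenarios that can be translated into expected load. A toy sketch, with user types, shares and call counts invented for illustration:

```python
# A toy usage model in the spirit of UCML-style workload modeling.
# The scenarios, shares and per-session call counts are invented.
workload = {
    # scenario: (share of sessions, service calls per session)
    "submit_order":   (0.20, 12),
    "check_status":   (0.50, 3),
    "update_profile": (0.05, 7),
    "browse_catalog": (0.25, 5),
}

def calls_per_hour(total_sessions_per_hour, model):
    """Translate a usage mix into expected service-call volumes."""
    return {
        scenario: round(total_sessions_per_hour * share * calls)
        for scenario, (share, calls) in model.items()
    }

volumes = calls_per_hour(1000, workload)
print(volumes)
# e.g. submit_order: 1000 sessions * 0.20 share * 12 calls = 2400 calls/hour
```

Even a table this small gives performance, capacity and availability conversations something concrete to argue about: change a share or a rate and the downstream call volumes update immediately.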

DEMONSTRATING YOU CAN IMPLEMENT BUSINESS PROCESSES

At the end of your SOA implementation, you will have delivered something that implements or supports a business process. If the bulk of your testing up to that point has focused on various combinations of service-level processes and data functions, then your end-to-end acceptance tests will focus on those final implemented business processes. These tests are often manual, utilize the end systems (often user interface systems), and are slow and costly. All the testing outlined up to this point has been focused on making this testing more efficient and effective.

If you've successfully covered the basics, early and often, then you should have few integration surprises while doing your end-to-end testing. When you deploy the end-to-end solution, you don't want to be debugging basic connectivity or authorization/authentication issues. If you've successfully covered the data at the unit and service level, then you won't have thousands of tests focused on subtle variations of similar data. Instead, you'll have tests targeted at high-risk transactions, or tests based on sampling strategies of what you've already tested. If you've done a good job of facilitating discussions around performance, availability, reliability and capacity, then you've likely already reduced most of the risk around those quality criteria and have clear plans for migration to production.

If you're testing your first SOA implementation, take some time to learn more about the underlying technologies and implementation models. Once you're comfortable with what's being done, start looking at the tools that are available to support your testing needs. There are many great commercial and open source tools available, and there is always the option to create your own tools as needed. Just make sure you're selecting tools to fit the types of testing you want to do. Many tools will do more than you need, or may force you to do something in a way that's not optimal for your project.

Finally, remember that whatever you're building now will change over time. Make sure you're working closely with the rest of the team to ensure regression capabilities have been implemented in the most effective and cost-effective ways possible. When you come back a year later to make a small change, you don't want to have to re-create all the work you did the first time around.


ABOUT THE AUTHOR:

MICHAEL KELLY is currently the director of application development for Interactions. He also writes and speaks about topics in software testing. Kelly is a board member for the Association for Software Testing and a co-founder of the Indianapolis Workshops on Software Testing, a series of ongoing meetings on topics in software testing. You can find most of his articles and blog on his website, www.MichaelDKelly.com.



CHAPTER 2

Agile testing strategies: Evolving with the product

Software testing in an agile environment is the art of seizing iterations as a chance to learn and evolve test ideas. BY KAREN JOHNSON

YOU CAN ACHIEVE successful testing in agile if you seize upon iterations as a chance to learn and evolve test ideas. This is the art of software testing strategy in agile. The challenge: How can testing be planned and executed in the ever-evolving world of the iterative agile methodology? This article discusses some of the philosophies, pros, cons and realities of software testing in the agile iterative process.

One aspect of an agile software development project that I believe benefits testers is working with iterations. Iterations provide a preview opportunity to see new features and a chance to watch features evolve. The early iterations are such sweet times to see functionality flourish. I feel as though I'm able to watch the process, to see the growth and the maturity of a feature and a product. I feel more a part of the process of development with an iterative approach to building than I do when I'm forced to wait through a full waterfall lifecycle to see an allegedly more polished version of an application.

But there is an opposing point of view that sees the iterative approach as frustrating. Early iterations could be viewed as a hassle; the fact that features will evolve and morph is potentially maddening if a tester thinks their test planning could become obsolete. What if the features morph in such a way as to crush the test planning and even crush the spirit of a waterfall-oriented tester? I think it's a matter of making a mind shift.

Early iterations of functionality provide time to learn and explore. Learning is opportunity. While I'm learning the product, I'm devising more ideas of how to test and what to test. Cycles of continuous integration are chances to pick up speed with manual testing and fine-tune test automation.

The iterative process also gives me time to build test data. I'm able to take advantage of the time. I also see my learning grow in parallel with the maturity of a feature. As I'm learning, and the functionality is settling down, I can't help but feel that both the product as a whole and I as a tester are becoming more mature, robust and effective. I'm learning and devising more ideas of how to test and what to test and potentially building test data to test. We're growing together.

Cem Kaner outlined the approach of investigatory testing, and Scott Ambler discussed it in an article called "Agile Testing Strategies." Having been through a couple of agile projects before reading Scott's article, it was interesting to find myself nodding in agreement throughout. One of Kaner's ideas on investigative testing that I especially like and have experienced is finding both "big picture" and "little picture" issues.

I enjoy the early iterations to stomp about an application so that I can understand it. I like seeing my knowledge escalate to the point where I'm asking good, solid questions that help the development process move along. I especially enjoy asking development a question and receiving a pensive look from a developer, telling me I've just found a bug in code that I've not even seen or that hasn't even been built; talk about finding your bugs upstream. Frankly, there's also a greater chance that my own ideas will be incorporated, since agile gives me the opportunity to provide feedback about features early enough that my ideas can actually make it into a product. This is far more likely with agile than if the product was fully baked before I had my first whiff.

I've long loathed the idea that a fat stack of requirements could give me everything I needed. "Here you go," I've been told. "Now you can write those test cases you testers build, because if you only test to requirements, the requirements and specifications are all you need." I want to holler, "No, it isn't so!" An hour at the keyboard has more value to me than a day reading specifications. I want time with an application, I want time to play, and I want time to learn. And the agile iterative process gives me that.


PUTTING STRATEGY INTO PRACTICE: OUTLINE TEST IDEAS

Review the user stories created for upcoming features. Read each story closely, envisioning the functionality. Write questions alongside the user stories; if possible, record your questions on a wiki where your comments and the user stories can be viewed by the team. Sharing your early thoughts and questions on a team wiki gives the entire team a chance to see the questions being asked. Raising more questions and conversations helps the entire team's product knowledge evolve. Also record your test ideas. This gives the developers a chance to see the test ideas that you have. Miscommunications have an early chance for clarification.

When I write my early test ideas, they're usually formatted as a bulleted list; nothing glamorous. I raise my questions with the team and the developers on team calls or at times I arrange for that purpose. I openly share my test ideas and hunches at any possible time that I can. I tell every developer I work with that my work is not and does not need to be mysterious to them. I share as much as anyone is willing to listen to or review as early and as often as I can. To be honest, often the developers are busy and ask for less than I would prefer, but that's OK.

I sometimes mix how I record my test ideas, but the recording isn't the point. The idea is to be constantly thinking of new and inventive ways to find a break in the product. Find a way to record test ideas. Be prepared to record an idea at any time by having a notebook, index cards, a voice recorder; whatever the medium, make the process easy.

GAIN A SENSE OF PRIORITY

After reviewing all of the user stories, develop a sense of priority within each story and then across all the stories. What do you want to test first? What have you been waiting to see? And what test idea is most likely to evoke the greatest issue? These are the first tests.

The theory behind the first-strike attack is the old informal risk analysis. Risk analysis doesn't go away with agile.

When the next iteration comes, I'm not in pause mode. I know where I want to be; I know which tests I want to run. I don't have any long test scripts to rework because of changes. I expect change. As I test these early releases, I take notes. I adjust my ideas as I learn the application. I find that some early test ideas no longer make sense, no longer apply. I generate new ideas as I learn with each iteration.
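One lightweight way to make that informal risk analysis explicit is to score each test idea and sort. The ideas and scores below are invented; the point is the shape of the exercise, not the numbers.

```python
# Hypothetical test ideas scored by an informal risk analysis:
# likelihood and impact on a 1-5 scale; higher product = test it sooner.
ideas = [
    ("checkout with expired card", 4, 5),
    ("rename a saved report", 2, 2),
    ("concurrent edits to one record", 3, 4),
    ("profile photo upload", 2, 3),
]

prioritized = sorted(ideas, key=lambda item: item[1] * item[2], reverse=True)
for name, likelihood, impact in prioritized:
    print(f"{likelihood * impact:>2}  {name}")
```

The scores matter less than the conversation they force: arguing about whether an idea is a 2 or a 4 surfaces what the team actually fears breaking.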

USE SWEEP TESTING

In 2007, I blogged about my approach to exploratory testing through a concept I call sweep testing. The practice is one of accumulating test ideas, often using Excel. I could say I build test ideas into a central repository, but the term "repository" has such a heavy sound to it, when what I really mean is that I take the time to collect those scraps of test ideas and early hunches and pool them together in one place. As the early iterations continue and my test ideas come to me at random, unplanned times, I've jotted down ideas on index cards in my car, as voice recordings on my BlackBerry or as notes on the team wiki.

As the project continues, early test ideas fall by the wayside and new ideas come into place. Taking the time to list my ideas in one place helps me rethink ideas and gain that revised, better informed sense of order and importance. Once I've built my list and my list sweeps across all the functionality, I'm ready. I feel prepared, and I can launch into testing rapidly once the build hits.

I function as the only tester on projects fairly often, so I'm able to use my concept of sweep testing without needing to share, do version control or address the issues that come into play when I'm not testing alone. On projects where I have had other testers, I have used the concept of sweep testing to assign different worksheets or sections of a sheet from the same Excel file and had no issues with the divide-and-conquer approach. I write in bulleted lists. The lists are ideas, not steps or details.

Now it's not just the product that has evolved through the iterations; I have evolved as a tester as well.
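The "pool them together in one place" step can be as simple as merging and deduplicating the scattered idea lists; a sketch with invented notes:

```python
# Pool test ideas captured in different places into one deduplicated
# sweep list. The sources and ideas are invented for illustration.
index_cards = ["search with empty query", "import a 10MB file"]
voice_notes = ["import a 10MB file", "cancel mid-upload"]
wiki_notes = ["search with empty query", "expired session on save"]

sweep = []
for source in (index_cards, voice_notes, wiki_notes):
    for idea in source:
        if idea not in sweep:      # keep first occurrence, preserve order
            sweep.append(idea)

print(sweep)
# ['search with empty query', 'import a 10MB file',
#  'cancel mid-upload', 'expired session on save']
```

Whether the pool lives in Excel, a wiki or a script, the value is the same: one ordered list that sweeps across all the functionality.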

COLLABORATIVE SPIRIT: PEOPLE WORKING TOGETHER

The shortage of people, money and time, and the twisted situations each of these can present: agile doesn't cure all the blues. So it's best not to be too naïve, to think a process change or a change in approach is going to suddenly resolve all the murky events and behaviors that can happen on a project.

I still believe that the most powerful advantage on any project is people working together who want to work together. This is one of the core principles of the context-driven school of testing.

Bret Pettichord discusses several ways to build a collaborative spirit in his presentation "Agile Testing Practices." The concepts of pairing together and no one developing specialized knowledge are two constructs that I find especially insightful. I just seem to work alone more often than not and have had fewer chances personally to put some of his collaborative test team concepts into practice. But as the solo tester, I can certainly relate to and find his thoughts on testing "half-baked" code highly practical.

A small benefit I've seen in working with scrum has been a side effect of the daily scrum meeting that I hadn't expected. The frequency of the meetings helps to drive a friendly familiarity. Given the short duration of the meetings, people seem less inclined to open a laptop and ignore each other as they check emails and the daily news. The forced frequency seems to help. A more practical aspect is hearing from team members what activities people have been pulled into, how time has been spent or misspent. People have a place to explain how hours have been burnt over disruptive, unexpected and expensive tasks. These daily meetings bring people together; they build a sense of team and, for that reason alone, add great value.

In his "Agile Test Strategies and Experiences" article, Fran O'Hara talks about "independent testers" needing to be integrated with the team. This concept resonates deeply with me; I have long felt it is the team and not a single stakeholder who will know what value I have brought to the project. As an independent consultant, this has been interesting; I often have a stakeholder who owns my contract, but it is the team that knows what I deliver.

CONCLUSION

While some of the test practices in agile may differ from those on a waterfall project, the fundamental ideas in testing still apply. Exploratory and investigative testing produce more bugs that matter, in a timelier manner, than detailed test scripts can. Risk-based thinking helps steer testing toward the defects that matter most. Testing will always be affected by the reality that time at the end game is tight, and the best way to cope with tight time is to be flexible. And people who want to work together can produce great products, so finding ways to build healthy, constructive relationships is likely the most valuable strategy of all.
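The risk-based thinking described in the conclusion can be made concrete with a small prioritization model. Below is a minimal sketch, not from the article itself; the feature areas, the 1–5 scores and the `risk_score` helper are all hypothetical examples of one common approach: rank test areas by likelihood of failure times impact, so that tight end-game time goes to the riskiest areas first.

```python
# Illustrative risk-based test prioritization (hypothetical data).
# Score each feature area on likelihood of failure and impact of failure,
# both on a 1 (low) to 5 (high) scale, then test the riskiest areas first.

def risk_score(likelihood: int, impact: int) -> int:
    """Simple multiplicative risk model: higher score = test sooner."""
    return likelihood * impact

areas = [
    # (feature area,               likelihood, impact)
    ("checkout payment flow",      4,          5),
    ("report PDF export",          3,          2),
    ("user profile page",          2,          2),
    ("login / session handling",   3,          5),
]

# Order the test plan from riskiest to least risky.
prioritized = sorted(areas, key=lambda a: risk_score(a[1], a[2]), reverse=True)

for name, likelihood, impact in prioritized:
    print(f"{risk_score(likelihood, impact):>2}  {name}")
```

A real assessment would weigh more factors, such as recent code churn, defect history and business criticality, but even a rough ordering like this helps a tester decide what to exercise first when time is short.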

16 SOFTWARE TESTING MAY/JUNE 2009


ABOUT THE AUTHOR:

KAREN JOHNSON is an independent software test consultant with 14 years of experience in software testing and software test management. She views software testing as an intellectual challenge and believes in the context-driven school of testing. Karen frequently speaks at software testing conferences and participates in software testing workshops, in addition to authoring articles in software testing publications.

Page 17: SOFTWARE TESTING

q Learn More and Download TestComplete Free

About AutomatedQA: AutomatedQA has been making award-winning software products for quality assurance and software development worldwide since 1999.

AutomatedQA's flagship product is TestComplete, the automated software testing solution that's easy to use and affordable for any size team. TestComplete makes testing fast and simple for all Windows software, including Web, .NET, Java, Windows Desktop, Flash/Flex, Ajax and Silverlight.

TestComplete's easy script-free keyword testing and unified interface enable testers to learn just one product and automate the full range of test types, including functional testing, load testing, distributed client/server testing and unit testing for Windows and Web.



FROM OUR SPONSOR

Page 18: SOFTWARE TESTING

q Improved Software Testing Using McCabe IQ Coverage Analysis

qMore Complex = Less Secure: Miss a Test Path and You Could Get Hacked

q Using Code Quality Metrics in Management of Outsourced Development and Maintenance

About McCabe Software, Inc.: McCabe Software provides Software Quality Management and Software Change & Configuration Management solutions worldwide. "McCabe IQ" is used to analyze and visualize the quality and test coverage of mission-, life-, and business-critical applications, utilizing a comprehensive set of advanced software metrics including the McCabe-authored Cyclomatic Complexity metric. Our configuration management solution, "McCabe CM", utilizes exclusive Integrated Difference technology to manage software changes faster and more efficiently, ensuring quality throughout the Application Lifecycle. McCabe Software has offices in the United States, distribution worldwide, and can be found online at www.mccabe.com.


Page 19: SOFTWARE TESTING

q iTKO Whitepaper: Service Virtualization in Enterprise Application Development

q iTKO Whitepaper: Minimize IT Outsourcing Risk with Collaborative Quality

q Free Analyst Report from Butler Group: LISA 4.6 Technology Audit

About iTKO: iTKO helps our customers transform the software development and testing lifecycle for greater quality and agility in an environment of constant change. iTKO's award-winning LISA(tm) product suite can dramatically lower quality assurance costs, shorten release cycles, reduce risks, and eliminate critical development and testing constraints by virtualizing IT resources to provide accessibility, capacity and security across interdependent teams. LISA enables test, validation, and virtualization solutions optimized for distributed, multi-tier applications that leverage SOA, BPM, integration suites, and ESBs. iTKO customers include eBay, American Airlines, Allstate, Time Warner, Swiss Re, Bank of America and the U.S. Department of Defense. Visit http://www.itko.com.

