


Software Testing Analysis & Review

The greatest software testing conference in the universe (we think!)

Choose from a full week of learning, networking, and more

Sunday: Software Tester Certification—Foundation Level Training (3 days), Using Visual Studio® 2010 Ultimate to Improve Software Quality (3 days)

Monday – Tuesday: 24 In-depth half- and full-day tutorials

Wednesday – Thursday: 5 Keynotes, 35 Concurrent Sessions, the EXPO, Networking Events, Receptions, Bonus Sessions, and more

Friday: Testing & Quality Leadership Summit

April 25–30, 2010 • Orlando, Florida • Rosen Shingle Creek

Register by March 26, 2010 and save up to $200. Groups of 2 save even more!

www.sqe.com/stareast


Contents • Top Ten Reasons to Attend

Who’s Behind the Conference?
Software Quality Engineering assists professionals interested in improving software practices. Five conferences are hosted annually—the STAR conference series, the Better Software Conference, and the Agile Development Practices series. Software Quality Engineering also delivers software training, publications, and research. www.sqe.com

Better Software magazine brings you the hands-on facts you need to run smarter projects and to deliver better products that win in the marketplace. www.betterSoftware.com

StickyMinds.com is a complete online resource to help you produce better software. It offers original articles from industry experts, technical papers, industry news, a tools guide, forums, and much more. www.StickyMinds.com

4 Conference-at-a-Glance: Build Your Own Conference!
6 Speaker Index
7 Networking and Special Events
8 Software Tester Certification Training
9 Using Visual Studio® Ultimate Training
10 24 In-Depth Tutorials
16 5 Keynote Presentations
18 35 Concurrent Sessions
25 Attendee Reviews and Get Connected
26 Testing & Quality Leadership Summit
28 Bonus Sessions
29 The EXPO, Conference Sponsors & Exhibitors
30 Ways to Save
31 Registration Details and Event Location

Call 888.268.8770 or 904.278.0524 to register • www.sqe.com/sereg

www.sqe.com

1. Pick from dozens of learning options: tutorials, training classes, keynotes, concurrent sessions, bonus sessions, the Leadership Summit, and more!

2. Get in-depth learning on the subjects of your choice with pre-conference training classes beginning on Sunday and the highly popular half- and full-day tutorials on Monday and Tuesday.

3. Network with hundreds of your peers to problem solve, collaborate, and gain fresh ideas.

4. Enjoy opportunities to meet with the speakers throughout the week.

5. Gather ideas that will help you communicate the value of testing to others in your organization.

6. Visit the testing EXPO and explore the solutions that are right for you and your organization.

7. Multiply your learning by bringing the team! Groups of two or more save big.

8. Work toward internationally recognized certification with Software Tester Certification training Sunday – Tuesday.

9. Get inspired and motivated by keynote speakers who are top professionals in their industries.

10. Engage with summit participants in thoughtful discussions about leadership at the Testing & Quality Leadership Summit.



Who Should Attend?

Software and test managers, QA managers and analysts, test practitioners and engineers, IT directors, CTOs, development managers, developers, and all managers and professionals who are interested in people, processes, and technologies to test and evaluate software-intensive systems


Conference Schedule

Build your own conference—tutorials, training classes, keynote presentations, concurrent sessions, summit sessions, and more—packed with information covering the latest technologies, trends, and practices in software testing.

Sunday: Software Tester Certification—Foundation Level Training (3 days); Using Visual Studio® 2010 Ultimate to Improve Software Quality (3 days); Bonus Session—Assessing Your Readiness for the ISTQB® Foundation Exam

Monday – Tuesday: 24 In-depth half- and full-day tutorials

Wednesday – Thursday: 5 Keynote Presentations, 35 Concurrent Sessions, the EXPO, Special Events, Bonus Sessions…and More!

Friday: Testing & Quality Leadership Summit
Add a fifth day to your conference event by attending the Testing & Quality Leadership Summit Thursday evening and Friday. Join senior leaders from the industry to gain new perspectives and share ideas on today’s software testing issues. See page 26 for more information on the Testing & Quality Leadership Summit.

Buy One, Get One Half Off

Register two people at the same time and save half off the second registration.

The 50% savings will be taken off the lower of the two registration amounts.

To take advantage of this offer, please call the Client Support Group at 888.268.8770 or 904.278.0524 or email them at [email protected] and reference promo code BOGO.


Software Tester Certification—Foundation Level Training (3 days) (8:30 a.m. – 5:00 p.m.)

Lunch (12:00 p.m. – 1:00 p.m.)


Conference-at-a-Glance

Tutorial Sessions (8:30 a.m. – 12:00 p.m.)

Sunday, April 25 • Monday, April 26 • Tuesday, April 27

Monday Full-day Tutorials
MA Fundamental Test Practices with STEP™: A Risk-based Process — Dale Perry, Software Quality Engineering
MB Test Process Improvement with the TPI® Model — Martin Pol and Ruud Teunissen, POLTEQ IT Services BV
MC Becoming an Influential Test Team Leader — Randy Rice, Rice Consulting Services
MD Key Test Design Techniques — Lee Copeland, Software Quality Engineering

Monday Morning Tutorials
ME Becoming a Trusted Advisor to Senior Management — Lloyd Roden, Grove Consultants
MF The Craft of Bug Investigation (NEW) — Jon Bach, Quardev, Inc.
MG Exploratory Software Testing Interactive — Jonathan Kohl, Kohl Concepts, Inc.
MH A Test Leader’s Guide to Going Agile (NEW) — Bob Galen, iContact

Monday Afternoon Tutorials
MI Discovery, Collaboration, and Learning: Essential Tester Skills (NEW) — Mukesh Mulchandani and Krishna Iyer, ZenTEST Labs
MJ Testing Rich Internet Applications (NEW) — Paco Hope, Cigital, Inc.
MK Risk-based Testing: Focusing Your Scarce Resources — Julie Gardiner, Grove Consultants
ML Measurement and Metrics for Test Managers — Rick Craig, Software Quality Engineering

Tuesday Full-day Tutorials
TA Test Automation: The Smart Way — Dorothy Graham, Software Testing Consultant
TB Essential Test Management and Planning — Rick Craig, Software Quality Engineering
TC Critical Thinking for Testers (NEW) — James Bach, Satisfice, Inc.
TD Software Performance Testing: Planning, Executing, and Reporting — Dale Perry, Software Quality Engineering

Tuesday Morning Tutorials
TE Finding Ambiguities in Requirements — Richard Bender, BenderRBT, Inc.
TF Whittaker: On Testing — James Whittaker, Google
TG Exploratory Testing: Now in Session (NEW) — Jon Bach, Quardev, Inc.
TH Reliable Test Effort Estimation — Ruud Teunissen, POLTEQ IT Services BV

Tuesday Afternoon Tutorials
TI Using Visual Models for Test Case Design (NEW) — Rob Sabourin, AmiBug.com
TJ Cause-Effect Graphing — Richard Bender, BenderRBT, Inc.
TK Planning Your Agile Testing: A Practical Guide (NEW) — Janet Gregory, DragonFire, Inc.
TL Making Test Automation Work in Agile Projects — Lisa Crispin, ePlan Services, Inc.

Lunch

Tutorial Sessions (1:00 p.m. – 4:30 p.m.)

Welcome Reception (4:30 p.m. – 5:30 p.m.)

Bonus Session—Speaking 101: Tips and Tricks (5:30 p.m. – 7:00 p.m.)

Using Visual Studio® 2010 Ultimate to Improve Software Quality (3 days) (8:30 a.m. – 5:00 p.m.)

Bonus Session—Assessing Your Readiness for the ISTQB® Foundation Exam (8:30 a.m. – 5:00 p.m.)


Tracks: Test Management • Test Techniques • Test Automation • The New Wave • Special Topics

Networking Break • Visit the EXPO (10:30 a.m. – 2:00 p.m.)

Lunch • Visit the EXPO • Meet the Speakers

Networking Break • Visit the EXPO (3:30 p.m. – 6:30 p.m.)

Stop Guessing About How Customers Use Your Software — Alan Page, Microsoft


The Buccaneer Tester: Winning Your Reputation — James Bach, Satisfice, Inc.


Reception in the EXPO Hall (5:30 p.m. – 6:30 p.m.)

Bonus Session: A Panel Discussion—The Reality of Testing (6:30 p.m. – 7:30 p.m.)




Friday, April 30

Tracks: Test Management • Test Techniques • Performance Testing • Agile Testing • Special Topics

Tracks: Test Management • Test Techniques • Security Testing • Agile Testing • Testing Mobile Applications


Networking Break • Visit the EXPO (10:30 a.m. – 3:00 p.m.)

Lunch • Visit the EXPO • Meet the Speakers

Networking Break • Visit the EXPO

Lessons Learned from 20,000 Testers on the Open Source Mozilla Project — Tim Riley, Mozilla Corporation


The Myths of Rigor — James Bach, Satisfice, Inc.

Patterns of Testability — Alan Myrvold, Microsoft

Using Test Automation Frameworks — Andrew Pollner, ALP International

Testing AJAX: What Does It Take? — Joachim Herschmann, Borland (a Micro Focus company)

The Many Hats of a Tester — Adam Goucher, Consultant

The Top Ten Challenges—or Opportunities—We Face Today — Lloyd Roden, Grove Consultants

Automated Test Case Generation Using Classification Trees — Peter M. Kruse and Magdalena Luniak, Berner & Mattner Systemtechnik GmbH

Test Automation Success: Choosing the Right People and Processes — Kiran Pyneni, Aetna, Inc.

Virtual Test Labs in the Cloud — Ravi Gururaj, VMLogix

Chartering the Course: Guiding Exploratory Testing — Rob Sabourin, AmiBug.com

Service-driven Test Management — Martin Pol, POLTEQ IT Services BV

Meet “Ellen”: Improving Software Quality through Personas — Euan Garden, Microsoft

Automated Testing with Domain Specific Test Languages — Martin Gijsen, DeAnalist.nl

Virtualizing Overutilized Systems to Eliminate Testing Constraints — Rajeev Gupta, iTKO

The Elusive Tester-Developer Ratio — Randy Rice, Rice Consulting Services

A Test Odyssey: Building a High Performance, Distributed Team — Matt Heusser, Socialtext

I Wouldn’t Have Seen It If I Hadn’t Believed It: Confirmation Bias in Testing — Michael Bolton, DevelopSense

Performance Testing Throughout the Life Cycle — Chris Patterson, Microsoft

Implementing Agile Testing — Robert Reff, Thomson Reuters

How Google Tested Chrome — James Whittaker and the Google Chrome Test Team, Google

Heuristics for Rapid Test Management — Jon Bach, Quardev, Inc.

Test Environments: The Weakest Link in Your Testing Chain — Julie Gardiner, Grove Consultants

Performance Testing SQL-based Applications — Alim Sharif, Ultimate Software

Avoid Failure with Acceptance Test-driven Development — C.V. Narayanan, Sonata Software Limited

Creating Crucial Test Conversations — Bob Galen, iContact

Proving Our Worth: Quantifying the Value of Testing — Lee Copeland, Software Quality Engineering

Focusing Test Efforts with System Usage Patterns — Dan Craig, Coveros, Inc.

Tour-based Testing: The Hacker’s Landmark Tour — Rafal Los, Hewlett-Packard

Enable Agile Testing through Continuous Integration — Sean Stolberg, Pacific Northwest National Laboratory

Creating the Right Environment for Mobile Applications Testing — Nat Couture, Professional Quality Assurance Ltd.

The Power of Risk — Erik Boelen, qa consult

Keys to a Successful Beta Testing Program — Rob Swoboda, HomeShore Solutions

Web Security Testing with Ruby — James Knowlton, McAfee, Inc.

Taming Bug Reports and Defects: The Agile Way — Lisa Crispin, ePlan Services, Inc.

Crowdsourced Testing of Mobile Applications — Doron Reuveni, uTest

You Can’t Test Quality into Your Systems — Jeff Payne, Coveros, Inc.

Agile Testing: Uncertainty, Risk, and How It All Works — Elisabeth Hendrickson, Quality Tree Software, Inc.


Wednesday, April 28 • Thursday, April 29


Conference Speakers

Keynote Speaker • Tutorial Speaker • Class Speaker • Summit Speaker

James Bach — Satisfice, Inc.
Jane Fraser — Electronic Arts
Elisabeth Hendrickson — Quality Tree Software, Inc.
Rafal Los — Hewlett-Packard
Andrew Pollner — ALP International
Rob Swoboda — HomeShore Solutions
Jon Bach — Quardev, Inc.
Bob Galen — iContact
Joachim Herschmann — Borland (a Micro Focus company)
Magdalena Luniak — Berner & Mattner Systemtechnik GmbH
Kiran Pyneni — Aetna, Inc.
Ruud Teunissen — POLTEQ IT Services BV
Richard Bender — BenderRBT, Inc.
Euan Garden — Microsoft
Matthew Heusser — Socialtext
Mukesh Mulchandani — ZenTEST Labs
Robert Reff — Thomson Reuters
James Whittaker — Google
Erik Boelen — qa consult
Julie Gardiner — Grove Consultants
Paco Hope — Cigital, Inc.
Alan Myrvold — Microsoft
Doron Reuveni — uTest
Michael Bolton — DevelopSense
Martin Gijsen — DeAnalist.nl
Krishna Iyer — ZenTEST Labs
C.V. Narayanan — Sonata Software Limited
Randy Rice — Rice Consulting Services
Goranka Bjedov — Google
Adam Goucher — Consultant
Andy Kaufman — Institute for Leadership Excellence & Development, Inc.
Alan Page — Microsoft
Tim Riley — Mozilla Corporation
Nat Couture — Professional Quality Assurance Ltd.
Dorothy Graham — Software Testing Consultant
James Knowlton — McAfee, Inc.
Chris Patterson — Microsoft
Lloyd Roden — Grove Consultants
Dan Craig — Coveros, Inc.
Janet Gregory — DragonFire, Inc.
Jonathan Kohl — Kohl Concepts, Inc.
Jeff Payne — Coveros, Inc.
Rob Sabourin — AmiBug.com
Rick Craig — Software Quality Engineering
Rajeev Gupta — iTKO
Peter M. Kruse — Berner & Mattner Systemtechnik GmbH
Dale Perry — Software Quality Engineering
Alim Sharif — Ultimate Software
Lisa Crispin — ePlan Services, Inc.
Ravi Gururaj — VMLogix
Miles Lewitt — Intuit
Martin Pol — POLTEQ IT Services BV
Sean Stolberg — Pacific Northwest National Laboratory

Program Chair
Lee Copeland — Software Quality Engineering





Networking and Special Events

Welcome Reception
Tuesday, April 27, 2010 • 4:30 p.m. – 5:30 p.m.

Kick off the STAREAST conference with a welcome reception! Mingle with experts and colleagues, and enjoy complimentary food and beverages.

Bookstore and Speaker Book Signings
Tuesday through Thursday, purchase popular industry books—many authored by STAREAST speakers—from BreakPoint Books. Authors are available for questions and book signings during session breaks and EXPO hours.

EXPO Reception
Wednesday, April 28, 2010 • 5:30 p.m. – 6:30 p.m.

Network with peers at the EXPO reception and enjoy complimentary food and beverages. Be sure to play the Passport game for your chance to win great prizes!

Meet the Speakers at Lunch
Wednesday, April 28 – Thursday, April 29, 2010 • During Lunch

Meet with industry experts for open discussions in key areas of software testing. Wednesday’s designated tables will be organized by topic of interest, and Thursday’s will be organized by your industry area. Come pose your toughest questions!

STAR Presenter One-on-One
Offered Wednesday and Thursday

STAREAST offers the unique opportunity to schedule a 15-minute, one-on-one session with a STAREAST presenter. Our speakers have years of industry experience and are ready to share their insight with you. Bring your toughest challenge, your testing plans, or whatever’s on your mind. Leave with fresh ideas on how to approach your testing challenges. You’ll have the chance to sign up during the conference and get some free consulting!

The Testing & Quality Leadership Summit
Friday, April 30, 2010

Join senior industry leaders in a highly interactive day focused on finding innovative ways to improve software quality and testing productivity.



Combine in-depth training with your conference and save an additional $500!

Software Tester Certification—Foundation Level Training
Sunday, April 25 – Tuesday, April 27, 2010 • 8:30 a.m. – 5:00 p.m.

About the Speaker
Dawn Haynes is a Software Quality Specialist and Trainer for Security Innovation, Inc. She has more than eleven years of experience in manual, functional, and performance testing of software systems on both Windows and UNIX platforms, and more than nine years of technical training experience, including course development and training management. Her career has included technical positions at companies like Xerox, IBM Rational Software, SoftBridge Microsystems, Ipswitch, Inc., John Hancock Mutual Life Insurance Company, and New England Medical Center. She is a contributing author of the book Quality Web Systems: Performance, Security & Usability. Dawn holds a BSBA in MIS from Northeastern University.

Sunday – Tuesday: Software Tester Certification—Foundation Level Training
Wednesday – Thursday: Keynote Presentations, Concurrent Classes, Networking EXPO, Bonus Sessions, Special Events…and More!
Friday: Testing & Quality Leadership Summit

Get certified while you’re at STAREAST 2010! Are you looking for internationally recognized certification in software testing? Delivered by top experts in the testing industry, Software Tester Certification—Foundation Level is an accredited training course designed to help prepare you for the ISTQB™ Certified Tester—Foundation Level exam. This certification program, accredited by the ISTQB™ through its network of National Boards, is the only internationally accepted certification for software testing. The ISTQB™, a non-proprietary and nonprofit organization, has granted more than 115,000 certifications in more than 46 countries around the world. This course is most appropriate for individuals who recently entered the testing field and those currently seeking ISTQB™ certification in software testing.

In the Software Tester Certification—Foundation Level Training you will learn:

• Fundamentals of software testing—Concepts and context, risk analysis, goals, process, and psychology

• Lifecycle testing—How testing relates to development, including models, verification and validation, and types of tests

• Static testing—Reviews, inspections, and static tools

• Test design techniques—Black-box test methods, white-box techniques, error guessing, and exploratory testing

• Test management—Team organization, key roles and responsibilities, test strategy and planning, configuration management, defect classification and management

• Testing tools—Tool selection, benefits, risks, and classifications

Register Early—Space Is Limited!
This course is conveniently located with the STAREAST conference. Register for the Software Tester Certification—Foundation Level training for $1,995* or combine this training with your conference registration and SAVE an additional $500. Call the Client Support Group at 888.268.8770 or 904.278.0524 or email [email protected] for more information.
*There is an additional $250 fee for the ISTQB™ exam.

Build your week of learning to include Software Tester Certification—Foundation Level Training and benefit from all STAREAST has to offer. Plus—if you stay through Friday, you can attend the Testing & Quality Leadership Summit. Stay for three days or maximize your experience by attending the conference while you’re in Orlando. See the week’s schedule below.



Combine in-depth training with your conference and save an additional $500!

Using Visual Studio® 2010 Ultimate to Improve Software QualitySunday, April 25 – Tuesday, April 27, 2010 • 8:30 a.m. – 5:00 p.m.

Get Practical Experience with Visual Studio® 2010 Testing at STAREAST 2010!
This tester training course provides students with the knowledge and skills to use the latest testing tools provided by Visual Studio® 2010 to improve their software quality. Test case creation and management will be covered, as well as test execution and automation practices. Creating and managing virtual lab environments using Visual Studio Lab Management 2010 will be discussed within the context of test planning and execution. By the end of the course, students are equipped to begin planning the implementation of Visual Studio® 2010 for improving testing practices within their organizations. This course is currently taught using Beta 2 of Visual Studio® 2010 Ultimate.

Hands-on Experience with Visual Studio® 2010 Testing Tools
This course provides hands-on experience with the detailed Visual Studio® 2010 testing functions and new features, including work item tracking, version control, automated tests, Microsoft Test Manager 2010, and automated builds.

In Using Visual Studio® 2010 Ultimate to Improve Software Quality training you will:

• Explore the testing components of Visual Studio® 2010 and how these are used to improve software quality
• Understand integrated application lifecycle management (ALM) and how Visual Studio® 2010 aids the ALM process
• Understand the work management tools available in Visual Studio® 2010
• Explore the tester and test manager’s tasks in Visual Studio® 2010
• Create test plans and define configurations for testing
• Write and maintain test cases
• Execute tests and collect video captures and system information for filing rich bugs
• Discover the purpose and value of a virtual lab environment

Register Early—Space Is Limited!
This course is conveniently located with the STAREAST conference. Register for Using Visual Studio® 2010 Ultimate to Improve Software Quality training for $1,995 or combine this training with your conference registration and SAVE an additional $500. Call the Client Support Group at 888.268.8770 or 904.278.0524 or email [email protected] for more information.

Build your week of learning to include Using Visual Studio® 2010 Ultimate to Improve Software Quality training and benefit from all STAREAST has to offer. Plus—if you stay through Friday, you can attend the Testing & Quality Leadership Summit. Stay for three days or maximize your experience by attending the conference while you’re in Orlando. See the week’s schedule below.

About the Speaker
Chris Menegay is a Principal Consultant for Notion Solutions, Inc. He has been helping clients develop business applications for more than ten years. Chris works with customers to help with Team System adoption, deployment, customization, and learning. In his role with Notion Solutions, Chris has written Team System training for Microsoft that was used to train customers using the beta versions of Team System. Chris holds MCSD.NET and MCT certifications and is a member of the Microsoft South Central District Developer Guidance Council. Chris is a Team System MVP, a Microsoft Regional Director, and a member of the INETA speakers bureau.

Sunday – Tuesday: Using Visual Studio® 2010 Ultimate to Improve Software Quality
Wednesday – Thursday: Keynote Presentations, Concurrent Classes, Networking EXPO, Bonus Sessions, Special Events…and More!
Friday: Testing & Quality Leadership Summit



Monday, April 26, 8:30–4:30 (Full Day)

Tutorials

Fundamental Test Practices with STEP™: A Risk-based Process
Dale Perry, Software Quality Engineering

Whether you are new to testing or looking for a better way to organize your test practices and process, the Systematic Test and Evaluation Process (STEP™) offers a flexible approach to help you and your team succeed. Dale Perry describes this risk-based framework—applicable to any development lifecycle model—to help you make critical testing decisions earlier and with more confidence. The STEP™ approach helps you decide how to focus your testing effort, what elements and areas to test, and how to organize test designs and documentation. Learn the fundamentals of test analysis and how to develop an inventory of test objectives to help prioritize your testing efforts. Find out how to translate these objectives into a concrete strategy for designing and developing tests. With a prioritized inventory and focused test architecture, you will be able to create test cases, execute the resulting tests, and accurately report on the effectiveness of your testing. Take back a proven approach to organize your testing efforts and new ways to add more value to your project and organization.
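The core idea behind risk-based prioritization of a test-objective inventory can be sketched in a few lines. This is an illustrative example, not STEP™ course material: each objective gets a likelihood and impact score, and the inventory is executed in descending risk order. The objective names and the 1–5 scoring scale are assumptions for the sketch.

```python
# Illustrative sketch of risk-based test prioritization (not STEP(TM) course
# material): score each test objective by failure likelihood and impact,
# then order the inventory so the riskiest objectives are tested first.
from dataclasses import dataclass

@dataclass
class TestObjective:
    name: str
    likelihood: int  # 1 (rare) .. 5 (almost certain)
    impact: int      # 1 (cosmetic) .. 5 (critical)

    @property
    def risk(self) -> int:
        # Simple risk exposure: likelihood x impact
        return self.likelihood * self.impact

def prioritize(inventory):
    """Return the inventory sorted so the riskiest objectives run first."""
    return sorted(inventory, key=lambda o: o.risk, reverse=True)

# Hypothetical inventory of test objectives
inventory = [
    TestObjective("login flow", likelihood=2, impact=5),
    TestObjective("report formatting", likelihood=4, impact=1),
    TestObjective("payment processing", likelihood=3, impact=5),
]

for obj in prioritize(inventory):
    print(f"{obj.name}: risk={obj.risk}")
```

With a list like this, the team spends its limited time on "payment processing" and "login flow" before cosmetic issues, which is the essence of focusing effort by risk.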

Test Process Improvement with the TPI® Model
Martin Pol and Ruud Teunissen, POLTEQ IT Services BV

What is the maturity of your testing process? How do you compare to other organizations and to industry standards? To find out, join Martin Pol and Ruud Teunissen for an introduction to the Test Process Improvement (TPI®) model, an industry standard for testing maturity assessments. Although many organizations want to improve testing, they lack the foundation required for success. Improving your testing requires three things: (1) understanding key test process areas, (2) knowing your current position in each of these areas, and (3) having the tools and skills to implement needed improvements. Rather than guessing at what to do, begin with the TPI® model as your guide. Using examples of real-world TPI® assessments they have performed, Martin and Ruud describe a practical assessment approach that is suitable for both smaller, informal organizations and larger, formal companies. Take back valuable references, templates, examples, and web links to start your improvement program.

TPI® is a registered trademark of Sogeti USA LLC.

Becoming an Influential Test Team Leader
Randy Rice, Rice Consulting Services

Have you been thrust into the role of test team leader? Are you in this role now and want to hone your leadership skills? Test team leadership has many unique challenges, and many test team leaders—especially new ones—find themselves ill-equipped to deal with the problems they face. The test team leader must motivate and support the team while keeping testing on track, within time and budget constraints. Randy Rice focuses on how you can grow as a leader, influence your team and those around you, and positively impact those outside your team. Learn how to become a person of influence, deal with interpersonal issues, and help your team build their skills and value to the team and the organization. Discover how to communicate your team’s value to management, how to stand firm when asked to compromise principles, and how to learn from your successes and failures. Develop your own action plan to become an influential test team leader.

Key Test Design Techniques
Lee Copeland, Software Quality Engineering

All testers know that we can identify many more test cases than we will ever have time to design and execute. The major problem in testing is choosing a small, “smart” subset from the almost infinite number of possibilities available. Join Lee Copeland to discover how to design test cases using formal black-box techniques, including equivalence class and boundary value testing, decision tables, state-transition diagrams, and all-pairs testing. Explore white-box techniques with their associated coverage metrics. Evaluate more informal approaches, such as random and hunch-based testing, and learn the importance of using exploratory testing to enhance your testing ability. Choose the right test case design approaches for your projects. Use the test results to evaluate the quality of both your products and your test designs.

With more than thirty years of experience in information technology, Dale Perry has been a programmer/analyst, database administrator, project manager, development manager, tester, and test manager. Dale’s project experience includes large systems development and conversions, distributed systems, and online applications, both client/server and web-based. He has been a professional instructor for more than fifteen years and has presented at numerous industry conferences on development and testing. With Software Quality Engineering for thirteen years, Dale has specialized in training and consulting on testing, inspections and reviews, and other testing and quality related topics.


Lee Copeland has more than thirty-five years of experience as a consultant, instructor, author, and information systems professional. He has held a number of technical and managerial positions with commercial and non-profit organizations in the areas of applications development, software testing, and software development process improvement. Lee frequently speaks at software conferences both in the US and internationally and currently serves as Program Chair for the Better Software conference, the STAR testing conferences, and Software Quality Engineering’s Agile Development Practices conference. Lee is the author of A Practitioner’s Guide to Software Test Design, a compendium of the most effective methods of test case design.

A leading author, speaker, and consultant with more than thirty years of experience in the field of software testing and software quality, Randy Rice has worked with organizations worldwide to improve the quality of their information systems and optimize their testing processes. He is co-author with William E. Perry of Surviving the Top Ten Challenges of Software Testing. Randy is an officer of the American Software Testing Qualifications Board (ASTQB). Founder, principal consultant, and trainer at Rice Consulting Services, Randy can be contacted at www.riceconsulting.com, where he publishes articles, newsletters, and other content about software testing and software quality. Visit Randy’s blog at randallrice.blogspot.com.

An international test consultant at POLTEQ IT Services BV, Ruud Teunissen has performed several test functions in a large number of IT projects: tester, test specialist, test consultant, and test manager. He participated in the development of the structured testing methodology TMap®—Test Management Approach. Together with Martin Pol and Erik van Veenendaal, Ruud is co-author of several books on structured testing, including Software Testing: A Guide to the TMap® Approach.

Martin Pol has played a significant role in helping to raise the awareness and improve the performance of testing worldwide. Martin provides international testing consulting services through POLTEQ IT Services BV. He’s gained experience by managing testing processes and implementing and improving structured testing in many organizations around the world. A co-author of Test Process Improvement, a classic text on models for improving testing, Martin has developed approaches to successfully manage test outsourcing services.


MONDAY, APRIL 26, 8:30-12:00 (HALF DAY - AM)

TUTORIALS

Becoming a Trusted Advisor to Senior Management
Lloyd Roden, Grove Consultants

Testing generates a huge amount of raw data, which must be analyzed, processed, summarized, and presented to management so that effective decisions can be made quickly. As a test manager or tester, how can you present information about your test results so that decision-makers receive the correct message? Using his experiences as a test manager and consultant, Lloyd Roden shares ways to communicate with and disseminate information to management. Develop your skills so you become a “trusted advisor” to senior management rather than the classic “bearer of bad news.” Discover innovative ways to keep the information flowing to and from management and avoid losing control of the test process, particularly near the delivery date. Learn how to deal effectively with various controversies that often prevent senior managers from taking you seriously.

The Craft of Bug Investigation NEW

Jon Bach, Quardev, Inc.

Although many training classes and conference presentations describe processes and techniques meant to help you find bugs, few explain what to do when you find a good one. How do you know what the underlying problem is? What do you do when you find a bug, and the developer wants you to provide more information? How do you reproduce those pesky, intermittent bugs that come in from customer land? In this hands-on class, Jon Bach helps you practice your investigation and analysis skills—questioning, conjecturing, branching, and backtracking. For those of you who have ever had to tell the story about the big bug that got away, Jon offers up new techniques that may trap it next time so you can earn more credibility, respect, and accolades from stakeholders. Because collaboration and participation are encouraged in this class, bring your mental tester toolkit, tester’s notebook, and an open mind.

Laptop required.

Exploratory Software Testing INTERACTIVE
Jonathan Kohl, Kohl Concepts, Inc.

Exploratory testing is an approach to testing that emphasizes the freedom and responsibility of the tester to continually optimize the value of his work. It is the process of three mutually supportive activities performed in parallel—learning, test design, and test execution. With skill and practice, exploratory testers typically uncover considerably more problems than when the same amount of effort is spent on scripted testing. All testers conduct exploratory testing in one way or another, but few know how to do it systematically to obtain the greatest benefits. Even fewer testers can articulate the process. Jonathan Kohl describes specific heuristics and techniques of exploratory testing to help you get the most from this highly productive approach. Jonathan focuses on the skills and dynamics of exploratory testing itself and how it can be combined with scripted approaches.

Laptop required. This is a hands-on course. A laptop—preferably with Microsoft Windows capability—is required for some of the exercises.

A Test Leader’s Guide to Going Agile NEW

Bob Galen, iContact

Much of the work of moving traditional test teams toward agile methods is focused on the individual tester and agile methods. Often, the roles of test director, test manager, test team leader, and test-centric project manager are marginalized—but not in this workshop where we’ll focus on agile testing from the test leader’s perspective. Join experienced agile test leader and long-time coach Bob Galen to explore the central leadership challenges associated with agile adoption: how to transform your team’s skills toward agile practices, how to hire agile testers, how to create a “whole-team” view toward quality by focusing on executable requirements, and how to create powerful done-ness criteria. Beyond the tactical leadership issues, Bob explores strategies for becoming a partner in agile adoption pilot projects, changes to test automation strategies, and how to reinvent your traditional planning and metrics for more agile-centric approaches that engage stakeholders.

With more than twenty-five years in the software industry, Lloyd Roden has worked as a developer, managed an independent test group within a software house, and joined UK-based Grove Consultants in 1999. Lloyd has been a speaker at STAREAST, STARWEST, EuroSTAR, AsiaSTAR, Software Test Automation, Test Congress, and Unicom conferences as well as Special Interest Groups in software testing in several countries. He was Program Chair for both the tenth and eleventh EuroSTAR conferences.

Currently a managing consultant for Seattle-based test lab Quardev, Inc., Jon Bach has been in testing for fourteen years, twelve as a manager. His experience includes managing teams at Microsoft, HP, and LexisNexis. The co-inventor (with his brother James) of Session-Based Test Management, Jon frequently speaks about test management and exploratory testing. Jon is co-author of Microsoft’s Patterns and Practices book on acceptance testing (freely available online) and has written articles for testing magazines. Find him on Facebook, Twitter, or his many presentations, articles, and his blog at http://jonbox.wordpress.com/



CALL 888.268.8770 OR 904.278.0524 TO REGISTER • www.sqe.com/sereg

Bob Galen is the director of R&D at iContact and president of RGCG, LLC, a North Carolina-based firm specializing in strategy development, coaching, and training teams making the shift to Scrum and other agile practices. Bob regularly speaks at international conferences and professional groups on topics related to software development, project management, software testing, and team leadership. He is a Certified Scrum Practitioner (CSP), Certified Scrum Product Owner (CSPO), and an active member of the Agile Alliance and Scrum Alliance. In 2009, he published Scrum Product Ownership–Balancing Value from the Inside Out, which addresses the gap in guidance toward effective agile product management. You can reach Bob at [email protected] or www.rgalen.com.

Jonathan Kohl is the founder and principal software testing consultant with Kohl Concepts, Inc., based in Calgary, Alberta, Canada. A noted testing thinker, Jonathan is recognized as a leader in the exploratory testing community. He is a popular author and speaker who believes that testing is a challenging, intellectual craft. Jonathan’s blog on software development and testing issues is one of the most often-read testing blogs in the industry. Jonathan is also a regular contributor to Better Software magazine.



MONDAY, APRIL 26, 1:00-4:30 (HALF DAY - PM)

TUTORIALS

Discovery, Collaboration, and Learning: Essential Tester Skills NEW

Mukesh Mulchandani and Krishna Iyer, ZenTEST Labs

Today, focusing solely on testers’ domain knowledge and technical excellence is not enough. From their experiences training more than 2,000 testers, Mukesh Mulchandani and Krishna Iyer describe non-technical skills they have found essential to hone the mind of the tester—collaboration, discovery, and learning. Join Mukesh and Krishna to explore ways to collaborate effectively with others in your organization, improve your innate discovery skills, and unleash the higher productivity inside you and your test team. Their focus is on developing your thinking skills, becoming an effective human engineer, and doing better discovery during your everyday tasks. Learn about some of the latest research in cognitive thinking that can be applied to testing and participate in exercises that testers and test managers can take back to teach these skills to other testers in your organization.

Testing Rich Internet Applications NEW

Paco Hope, Cigital, Inc.

Rich Internet applications (RIAs) use technologies such as AJAX (Asynchronous JavaScript and XML), Flex (based on Flash), and Microsoft’s Silverlight to deliver web applications. These technologies allow a web-based application to look and feel much like a desktop or client/server system. However, RIAs pose unique testing challenges because so much of the application’s logic runs inside the web browser. To thoroughly test RIAs, you need to understand the technology, adopt unique testing approaches, and employ special testing tools. Paco Hope introduces dynamic HTML, JSON, and the core technologies that make RIAs possible. Then, he explores the approaches required to adequately test these applications—from the outside-in and the inside-out. Examine an AJAX application and uncover different test strategies while discovering the trade-offs of different testing tools—some open source and some commercial—to interactively and automatically test RIAs.
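Because so much RIA logic rides on JSON exchanged with the server, one inside-out tactic is to check those payloads directly, below the browser layer. A minimal sketch (the payload shape and checks are hypothetical illustrations, not material from the tutorial):

```python
# Checking an RIA service layer directly: parse and validate a JSON
# payload without driving the browser. The payload below is a
# hypothetical example, not from any specific application.
import json

raw = '{"status": "ok", "items": [{"id": 1, "price": 9.99}]}'
payload = json.loads(raw)

# The kinds of assertions a tester might automate against the server side.
assert payload["status"] == "ok"
assert all(item["price"] >= 0 for item in payload["items"])
print("payload checks passed")
```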

Risk-Based Testing: Focusing Your Scarce Resources
Julie Gardiner, Grove Consultants

Risks are endemic in every phase of every project. One key to project success is to identify, understand, and manage these risks effectively. However, risk management is not the sole domain of the project manager, particularly with regard to product quality. It is here that the effective tester can significantly influence the project outcome. Julie Gardiner explains how risk-based testing can shape the quality of the delivered product in spite of tight time constraints. Join Julie as she reveals how you can apply product risk management to a variety of organizational, technological, project, and skills challenges. Through interactive exercises, you will get practical advice on how to apply risk management techniques throughout the testing lifecycle—from planning through execution and reporting. Take back a practical process and the tools you need to apply risk analysis to testing in your organization.
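One common convention for product risk analysis (an illustration of the general idea, not necessarily the exact technique Julie teaches) scores each test area on likelihood and impact, then spends scarce test effort on the highest exposure first:

```python
# Rank test areas by risk exposure = likelihood x impact (1-5 scales).
# The areas and scores are hypothetical examples.
areas = {
    "payment processing": (4, 5),   # (likelihood, impact)
    "report layout":      (3, 2),
    "user login":         (2, 5),
    "help pages":         (2, 1),
}

# Highest exposure first: test these areas earliest and most thoroughly.
ranked = sorted(areas.items(), key=lambda kv: kv[1][0] * kv[1][1], reverse=True)
for name, (likelihood, impact) in ranked:
    print(f"{name}: exposure {likelihood * impact}")
```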

Measurement and Metrics for Test Managers
Rick Craig, Software Quality Engineering

To be most effective, test managers must develop and use metrics to help direct the testing effort and make informed recommendations about the software’s release readiness and associated risks. Because one important testing activity is to “measure” the quality of the software, test managers must measure the results of both the development and testing processes. Collecting, analyzing, and using metrics are complicated because many developers and testers are concerned that the metrics will be used against them. Join Rick Craig as he addresses common metrics—measures of product quality, defect removal efficiency, defect density, defect arrival rate, and testing status. Learn the guidelines for developing a test measurement program, rules of thumb for collecting data, and ways to avoid “metrics dysfunction.” Rick identifies several metrics paradigms, including Goal-Question-Metric, and discusses the pros and cons of each. Participants are urged to bring their metrics problems and issues for use as discussion points.
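Two of the metrics mentioned above reduce to simple ratios. As a hedged sketch (the formulas follow common industry usage and the figures are invented, not from the tutorial):

```python
# Illustrative test metrics; figures below are hypothetical.

def defect_removal_efficiency(found_in_test: int, found_after_release: int) -> float:
    """Fraction of total known defects caught before release."""
    total = found_in_test + found_after_release
    return found_in_test / total if total else 0.0

def defect_density(defects: int, size_kloc: float) -> float:
    """Defects per thousand lines of code (KLOC)."""
    return defects / size_kloc

# Example: 90 defects found in test, 10 escaped to production; 20 KLOC.
dre = defect_removal_efficiency(90, 10)
density = defect_density(100, 20.0)
print(f"DRE={dre:.0%}, density={density} defects/KLOC")
```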

A technical manager with Cigital, Inc., Paco Hope has twelve years of experience in application security at the software and operating system level. He has focused on analyzing the security of web-based applications and embedded systems—online gaming, lottery systems, cell phones, casino gaming devices, and smart cards. He is a frequent speaker on software security, security testing, and web application security. Paco is co-author of the Web Security Testing Cookbook and is a subject matter expert for the Certified Secure Software Lifecycle Professional (CSSLP) certification.


A consultant, lecturer, author, and test manager, Rick Craig has led numerous teams of testers on both large and small projects. In his twenty-five years of consulting worldwide, Rick has advised and supported a diverse group of organizations on many testing and test management issues. From large insurance providers and telecommunications companies to smaller software services companies, he has mentored senior software managers and helped test teams improve their effectiveness. Rick is co-author of Systematic Software Testing and is a frequent speaker at testing conferences, including every STAR conference since its inception.

In the IT industry for nearly twenty years, Julie Gardiner has spent time as an analyst programmer, Oracle DBA, and project manager. She has first-hand experience as a test analyst, test team leader, test consultant, and test manager. At UK-based Grove Consultants, Julie provides consultancy and training in all aspects of testing, specializing in risk-based testing, agile testing, test management, and people issues. She is a certified ScrumMaster. Julie won best presentation at STAREAST; best presentation at BCS SIGiST; and best tutorial at EuroSTAR. She has been a keynote speaker at STARWEST, Innovate Test Management, and STANZ.

CEO of ZenTEST Labs, Krishna Iyer is a young entrepreneur, a prolific speaker, and author. Prior to ZenTEST Labs, Krishna was a quality manager at Kanbay where he worked with clients such as CitiFinancial, HSBC, IBM, and GE. Krishna shapes ZenTEST Labs’s strategy using his financial background, improves its operations using his rich IT and process consulting experience, and transforms its culture using his expertise as a behavioral trainer. He is a chartered accountant and holds international certifications in software quality. Krishna is a regular presenter at testing and quality conferences including STARWEST and STAREAST.

CTO of ZenTEST Labs, Mukesh Mulchandani is the architect behind the organization’s various testing solutions and is responsible for establishing ZenTEST Labs as a key player in the software-testing domain. He has eight years of experience in the information technology industry, most spent in the banking and financial services sector. Before joining ZenTEST Labs, Mukesh worked with Kanbay, Capgemini Consulting, and Fortune 500 clients. He has played a major role in designing functional automation processes at the organizational level. A Certified Software Test Engineer and a Certified Product Consultant for WinRunner and QTP, Mukesh is a regular presenter at STARWEST and STAREAST. Contact Mukesh at [email protected].



TUESDAY, APRIL 27, 8:30-4:30 (FULL DAY)

TUTORIALS


Test Automation: The Smart Way
Dorothy Graham, Software Testing Consultant

Many organizations never achieve the significant benefits that are promised from automated test execution tools. What are the secrets to test automation success? There are no secrets, but the paths to success are not commonly understood. Dorothy Graham describes the most important automation issues that you must address, both management and technical, and helps you understand and choose the smartest approaches for your organization—no matter which automation tools you use. If you don’t begin with legitimate objectives for your automation, you will set yourself up for failure later. For example, if “find more bugs” is your goal, automating regression tests will not achieve it. Even objectives that seem sensible, such as “run tests overnight” or “automate x% of tests,” can be counterproductive. Join Dorothy to learn how to assess your current automation maturity, identify achievable and realistic objectives for automation, build testware architecture for future scalability, and devise an effective automation strategy.

Essential Test Management and Planning
Rick Craig, Software Quality Engineering

The key to successful testing is effective and timely planning. Rick Craig introduces proven test planning methods and techniques, including the Master Test Plan and level-specific test plans for acceptance, system, integration, and unit testing. Rick explains how to customize an IEEE-829-style test plan and test summary report to fit your organization’s needs. Learn how to manage test activities, estimate test efforts, and achieve buy-in. Discover a practical risk analysis technique to prioritize your testing and become more effective with limited resources. Rick offers test measurement and reporting recommendations for monitoring the testing process. Discover new methods and develop renewed energy for taking test management to the next level in your organization.

Critical Thinking for Testers NEW

James Bach, Satisfice, Inc.

Critical thinking is the kind of thinking that specifically looks for problems and mistakes. Regular people don’t do a lot of it. However, if you want to be a great tester, you need to be a great critical thinker, too. Critically thinking testers save projects from dangerous assumptions and ultimately from disasters. The good news is that critical thinking is not just innate intelligence or a talent—it’s a learnable and improvable skill you can master. James Bach shares the specific techniques and heuristics of critical thinking and presents realistic testing puzzles that help you practice and increase your thinking skills. Critical thinking begins with just three questions—Huh? Really? and So?—that kick start your brain to analyze specifications, risks, causes, effects, project plans, and anything else that puzzles you. Join this interactive, hands-on session and practice your critical thinking skills. Study and analyze product behaviors and experience new ways to identify, isolate, and characterize bugs.

Software Performance Testing: Planning, Executing, and Reporting
Dale Perry, Software Quality Engineering

What does it take to properly plan, implement, and report the results of a performance test? What factors need to be considered? What is your performance test tool telling you? Do you really need a performance test? Is it worth the cost? These questions plague all performance testers. In addition, many performance tests do not appear to be worth the time it takes to run them, and the results never seem to resemble—let alone predict—production system behavior. Performance tests are some of the most difficult tests to create and run, and most organizations don’t fully appreciate the time and effort required to properly perform them. Dale Perry discusses the key issues and realities of performance testing—what can and cannot be done with a performance test, what is required to do a performance test, and how to present what the test “really” tells you.

Dorothy Graham has been in testing for more than thirty years and is co-author of Software Inspection, Software Test Automation, and Foundations of Software Testing. She helped start testing qualifications in the UK. Dorothy holds the European Excellence Award in Software Testing and was Programme Chair for EuroSTAR in 1993 and 2009.


A consultant, lecturer, author, and test manager, Rick Craig has led numerous teams of testers on both large and small projects. In his twenty-five years of consulting worldwide, Rick has advised and supported a diverse group of organizations on many testing and test management issues. From large insurance providers and telecommunications companies to smaller software services companies, he has mentored senior software managers and helped test teams improve their effectiveness. Rick is co-author of Systematic Software Testing and is a frequent speaker at testing conferences, including every STAR conference since its inception.

James Bach is founder and principal consultant of Satisfice, Inc., a software testing and quality assurance company. In the eighties, James cut his teeth as a programmer, tester, and SQA manager in Silicon Valley in the world of market-driven software development. For ten years, he has traveled the world teaching rapid software testing skills and serving as an expert witness on court cases involving software testing. James is the author of Lessons Learned in Software Testing and the recently published Secrets of a Buccaneer-Scholar: How Self-Education and the Pursuit of Passion Can Lead to a Lifetime of Success.

With more than thirty years of experience in information technology, Dale Perry has been a programmer/analyst, database administrator, project manager, development manager, tester, and test manager. Dale’s project experience includes large systems development and conversions, distributed systems, and online applications, both client/server and web-based. He has been a professional instructor for more than fifteen years and has presented at numerous industry conferences on development and testing. With Software Quality Engineering for thirteen years, Dale has specialized in training and consulting on testing, inspections and reviews, and other testing and quality-related topics.


TUESDAY, APRIL 27, 8:30-12:00 (HALF DAY - AM)

TUTORIALS


Finding Ambiguities in Requirements
Richard Bender, BenderRBT, Inc.

Through the years, studies have shown that poor requirements are one of the most significant contributors to project failure—and that half of all defects have their origin in bad requirements. We know that the earlier a defect is found, the cheaper it is to fix. Our experience tells us that if specifications are ambiguous, there is nearly a 100% chance that there will be one or more defects in the corresponding code. Richard Bender explains how to review specifications quickly and quantitatively to identify what is unclear about them. Learn how your feedback can lead to early defect detection and future defect avoidance. Discover how applying these review techniques can reduce the ambiguity rate by 95% on subsequent specifications and how that translates into a significant reduction in the number of defects in the code even before testing begins. Join Richard to learn how this process also can be applied to design specifications, user manuals, training materials, and online help, as well as agreements and contracts, ensuring clarity of communications.

Whittaker: On Testing
James Whittaker, Google

Here is your chance to enjoy an educational and entertaining session with expert tester, teacher, and author James Whittaker as he discusses the testing topics that are challenging test managers and testers everywhere. James is a master of hot topics and reads the industry tea leaves for new trends and technologies that will stand the test of time. Topics will include discussions on insourcing vs. outsourcing vs. crowdsourcing, manual vs. automated testing, and developer vs. tester as the owners of quality. James also expounds on technical topics, including exploratory testing techniques and test automation, and less technical topics like managing your testing career, becoming a better tester, and making your organization more quality focused. Although James has prepared material based on these testing topics, the real value of this session is engaging in a conversation that will take your understanding to the next level. This could be a unique opportunity to impact your career and help your organization achieve its goals.

Exploratory Testing: Now in Session NEW

Jon Bach, Quardev, Inc.

The nature of exploration, coupled with the ability of testers to rapidly apply their skills and experience, makes exploratory testing a widely used test approach—especially when time is short. Unfortunately, exploratory testing is often dismissed by project managers who assume that it is not reproducible, measurable, or accountable. If you have these concerns, you may find a solution in a technique called Session-Based Test Management (SBTM), developed by Jon Bach and his brother James to specifically address these issues. In SBTM, testers are assigned areas of a product to explore, and testing is time-boxed in “sessions” that have mission statements called “charters” to create a meaningful and countable unit of work. Jon discusses—and you practice—the skills of exploration using the SBTM approach. Jon demonstrates a freely available, open source tool to help manage your exploration and prepares you to implement SBTM in your test organization.

Laptop required.
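The “countable unit of work” idea behind SBTM can be pictured with a simple session record. The field names and figures below are illustrative assumptions, not the official SBTM session report format:

```python
# Sketch of session-based test management bookkeeping; fields are
# illustrative, not the official SBTM session report format.
from dataclasses import dataclass, field

@dataclass
class Session:
    charter: str                  # mission statement guiding the session
    tester: str
    duration_min: int             # time-box length in minutes
    bugs: list = field(default_factory=list)

sessions = [
    Session("Explore checkout error handling", "alice", 90, ["BUG-101"]),
    Session("Explore search result filters", "bob", 60, []),
]

# Because sessions are countable, simple accountability metrics fall out.
total_minutes = sum(s.duration_min for s in sessions)
total_bugs = sum(len(s.bugs) for s in sessions)
print(f"{len(sessions)} sessions, {total_minutes} min, {total_bugs} bugs")
```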

Reliable Test Effort Estimation
Ruud Teunissen, POLTEQ IT Services BV

How do you estimate your test effort? And how reliable is that estimate? Ruud Teunissen presents a practical and useful test estimation technique related directly to the maturity of your test and development process. A reliable effort estimation approach requires five basic elements: (1) Strategy – Determine what to test (performance, functionality, etc.) and how thoroughly it must be tested. (2) Size – Yes, it does matter—not only the size of the system but also the scope of your tests. (3) Expected Quality – What factors have been established to define quality? (4) Infrastructure and Tools – Define how fast you can test. Without the proper organizational support and the necessary tools, you’ll need time you may not have. (5) Productivity – How experienced and efficient is your team? Join Ruud to improve your test estimations and achieve more realistic goal setting and test strategies.
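A crude way to see how the five elements interact is a factor-adjusted base estimate. This is an invented illustrative model with made-up numbers, not the technique Ruud teaches:

```python
# Illustrative factor-based test effort estimate (hypothetical model,
# not the formal estimation technique from the tutorial).

def estimate_effort_hours(base_hours: float, strategy: float, size: float,
                          quality: float, infrastructure: float,
                          productivity: float) -> float:
    """Adjust a base estimate by multiplicative factors (1.0 = neutral);
    a more productive team divides the effort down."""
    return base_hours * strategy * size * quality * infrastructure / productivity

# Example: 200 base hours; thorough strategy (+20%), large scope (+50%),
# high quality bar (+10%), weak tooling (+25%), experienced team (1.1).
hours = estimate_effort_hours(200, 1.2, 1.5, 1.1, 1.25, 1.1)
print(round(hours))  # roughly 450 hours
```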

James Whittaker has spent his career in software testing. He was an early thought leader in model-based testing where his Ph.D. dissertation became a standard reference on the subject. While a professor at the Florida Institute of Technology, James founded the world’s largest academic software testing research center and helped make testing a degree track for undergraduates. While at FIT, he wrote How to Break Software and the series follow-ups How to Break Software Security (with Hugh Thompson) and How to Break Web Software (with Mike Andrews). As a software architect at Microsoft, James transformed many of his testing ideas into tools and techniques for developers and testers, and wrote the book Exploratory Software Testing. He is currently the Test Engineering Director for the Kirkland and Seattle offices of Google where he’s busy forging a future in which software just works.

Richard Bender has more than forty years of experience in software with a primary focus on quality assurance and testing. He has consulted internationally to large and small corporations, government agencies, and the military. He has been involved in establishing industry standards for software quality, serving as the Technical Lead for the International Y2K Test Certification Standards and assisting the U.S. Food and Drug Administration in defining their Software Quality Guidelines. He was one of the first programmers ever awarded IBM’s Outstanding Invention Award, for his breakthroughs on code-based testing.


Currently a managing consultant for Seattle-based test lab Quardev, Inc., Jon Bach has been in testing for fourteen years, twelve as a manager. His experience includes managing teams at Microsoft, HP, and LexisNexis. The co-inventor (with his brother James) of Session-Based Test Management, Jon frequently speaks about test management and exploratory testing. Jon is co-author of Microsoft’s Patterns and Practices book on acceptance testing (freely available online) and has written articles for testing magazines. Find him on Facebook, Twitter, or his many presentations, articles, and his blog at http://jonbox.wordpress.com

An international test consultant at POLTEQ IT Services BV, Ruud Teunissen has performed several test functions in a large number of IT projects: tester, test specialist, test consultant, and test manager. He participated in the development of the structured testing methodology TMap®—Test Management Approach. Together with Martin Pol and Erik van Veenendaal, Ruud is co-author of several books on structured testing, including Software Testing: A Guide to the TMap® Approach.



TUESDAY, APRIL 27, 1:00-4:30 (HALF DAY - PM)

TUTORIALS


Using Visual Models for Test Case Design NEW

Rob Sabourin, AmiBug.com

Designing test cases is a fundamental skill that all testers should master. Rob Sabourin shares graphical techniques he employs to design powerful test cases that will surface important bugs quickly. These skills can be used in exploratory, agile, or engineered contexts—anytime you are having problems designing a test. Rob illustrates how you can use Mindmaps to visualize test designs and better understand variables being tested, one-at-a-time and in complex combinations with other variables. Through a series of interactive exercises, he presents the Application-Input-Memory (AIM) heuristic. You’ll use FreeMind, a widely available free, open-source tool, to help implement great test cases and focus testing on what matters to quickly isolate critical bugs. If you are new to testing, these techniques will remove some of the mystery of good test case design. If you’re a veteran tester, these techniques will sharpen your skills and give you some new test design approaches.

Laptop required. Participants are requested to bring an MS-Windows notebook computer to participate in classroom exercises.

Cause-Effect Graphing
Richard Bender, BenderRBT, Inc.

Cause-Effect Graphing is the most rigorous of all software testing approaches. It identifies missing requirements and logical inconsistencies in the specifications and is the only software testing technique that addresses the problem of defect observability—multiple defects can cancel each other out in a test execution, or something going right on one part of the path can hide something going wrong elsewhere. Richard Bender demonstrates how test cases generated from the graphs are not only highly optimized but also guarantee that if there is a defect anywhere in the logic, it will show up at an observable point. All other test design techniques focus only on reducing the test set to a manageable number without a claim of completeness. The graphing technique allows you to design 90% of the functional tests needed for the project (there are still design-dependent and coding-dependent issues to address). Join Richard to learn how this process moves most of the test design effort early in the project, where it is most effective and efficient.
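At its core, cause-effect graphing works from Boolean relations between input conditions (causes) and outcomes (effects). A deliberately simplified sketch (real cause-effect graphing derives an optimized subset from a constrained graph rather than brute-force enumeration; the business rule below is hypothetical):

```python
# Minimal illustration of a cause-effect relation. Real cause-effect
# graphing selects an optimized test subset; this just enumerates.
from itertools import product

# Hypothetical effect: "discount applied" if (member AND big order) OR coupon.
def discount_applied(member: bool, big_order: bool, coupon: bool) -> bool:
    return (member and big_order) or coupon

# Enumerate every cause combination and record the resulting effect.
cases = [(m, b, c, discount_applied(m, b, c))
         for m, b, c in product([False, True], repeat=3)]
triggering = [case for case in cases if case[3]]
print(f"{len(triggering)} of {len(cases)} combinations apply the discount")
```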

Planning Your Agile Testing: A Practical Guide NEW

Janet Gregory, DragonFire, Inc.

Traditional test plans are incompatible with agile software development because we don’t know all the details about all the requirements up front. However, in an agile software release, you still must decide what types of testing activities will be required—and when you need to schedule them. Janet Gregory explains how to use the Agile Testing Quadrants, a model identifying the different purposes of testing, to help your team understand your testing needs as you plan the next release. Janet introduces you to alternative lightweight test planning tools that allow you to plan and communicate your big picture testing needs and risks. Learn how to decide who does what testing—and when. Determine what types of testing to consider when planning an agile release, the infrastructure and environments needed for testing, what goes into an agile “test plan,” how to plan for acquiring test data, and lightweight approaches for documenting your tests and recording test results.

Making Test Automation Work in Agile Projects
Lisa Crispin, ePlan Services, Inc.

Agile teams must deliver production-ready software every four-, two-, or one-week iteration—or possibly every day! This goal can’t be achieved without automated tests. However, many teams just can’t seem to get traction on test automation. The challenge of automating all regression tests strikes fear into the hearts of many testers. How do we succeed when we have to release so often? By combining a collaborative team approach with an appropriate mix of tools designed for agile teams, you can, over time, automate your regression tests and continue to automate new tests during each programming iteration. Lisa Crispin describes what tests should be automated, some common barriers to test automation, and ways to overcome those barriers. Learn how to create data for tests, evaluate automated test tools, implement test automation, and evaluate your automation efforts. An agile approach to test automation even helps if you’re a tester on a more traditional project without the support of programmers on your team.

Richard Bender has more than forty years experience in software with a primary focus on quality assurance and testing. He has consulted internationally to large and small corporations, government agencies, and the military.

He has been involved in establishing industry standards for software quality, serving as the Technical Lead for the International Y2K Test Certification Standards and assisting the U.S. Food and Drug Administration in defining their Software Quality Guidelines. He was one of the first programmers ever awarded IBM’s Outstanding Invention Award. This was for his breakthroughs on code-based testing.

Robert Sabourin, P. Eng., has more than twenty-five years of management experience leading teams of software development professionals. A well-respected member of the software engineering community, Robert has managed, trained, mentored, and coached hundreds of top professionals in the field. He frequently speaks at conferences and writes on software engineering, SQA, testing, management, and internationalization. Robert wrote I am a Bug!, the popular software testing children’s book; works as an adjunct professor of software engineering at McGill University; and serves as the principal consultant (and president/janitor) of AmiBug.Com, Inc. Contact Robert at [email protected].


The co-author of Agile Testing: A Practical Guide for Agile Testers and Teams, Janet Gregory is a consultant who specializes in helping teams build quality systems using agile methods. Based in Calgary, Canada, Janet’s greatest passion is promoting agile quality processes. As tester or coach she has helped introduce agile development practices into companies and has successfully transitioned several traditional test teams into the agile world. Her focus is working with business users and testers to understand their roles in agile projects. Janet teaches courses on agile testing and is a frequent speaker at agile and testing software conferences around the world.

An agile testing coach and practitioner, Lisa Crispin has co-authored Agile Testing: A Practical Guide for Testers and Agile Teams (with Janet Gregory) and Testing Extreme Programming (with Tip House). Lisa specializes in showing agile teams how testers can add value and guide development with business-facing tests. For the past ten years, Lisa has worked as a tester on agile teams developing web applications in Java and .NET. Lisa regularly contributes articles to Better Software magazine, IEEE Software, and Methods and Tools. For more about Lisa’s work, visit www.lisacrispin.com.

Laptop required


TESTING EXPERTS SHARE INSIGHT


KEYNOTE PRESENTATIONS

You Can’t Test Quality into Your Systems
Jeff Payne, Coveros, Inc.

Many organizations refer to their test teams and testers as QA departments and QA engineers. However, because errant systems can damage—even destroy—products and businesses, software quality must be the responsibility of the entire development team and every stakeholder. As the ones who find and report defects, and sometimes carry the “quality assurance” moniker, the test community has a unique opportunity to take up the cause of error prevention as a priority. Jeff Payne paints a picture of team- and organization-wide quality assurance that is not the process-wonky, touchy-feely QA of the past that no one respects. Rather, it’s tirelessly evaluating the software development artifacts beyond code; it’s measuring robustness, reliability, security, and other attributes that focus on product quality rather than process quality; it’s using risk management to drive business decisions around quality; and more. Join Jeff as he explores ways that senior-level test managers and test engineers can lead their organizations—one step at a time—to make incremental progress toward a culture that not only finds defects early but actually prevents them.

Jeff Payne is CEO and founder of Coveros, Inc., where he has led the startup and growth of the company. Prior to Coveros, Jeff was Chairman of the Board, CEO, and co-founder of Cigital, Inc. Under his direction, Cigital became a leader in software security and software quality solutions, helping clients mitigate the business risks associated with failed software. Jeff is a recognized software expert and speaks to companies nationwide about the business risks of software failure. He has been a keynote and featured speaker at CIO and business technology conferences, and frequently testifies before Congress on issues of national importance, including intellectual property rights, cyber-terrorism, and software quality.

Agile Testing: Uncertainty, Risk, and How It All Works
Elisabeth Hendrickson, Quality Tree Software, Inc.

Teams that succeed with agile methods reliably deliver releasable software at frequent intervals and at a sustainable pace. At the same time, they can readily adapt to the changing needs and requirements of the business. Unfortunately, not all teams are successful in their attempt to transition to agile and, instead, end up with a “frAgile” process. The difference between an agile and a frAgile process is usually in the degree to which the organization embraces the disciplined engineering practices that support agility. Teams that succeed are often the ones adopting specific practices: acceptance test-driven development, automated regression testing, continuous integration, and more. Why do these practices make such a big difference? Elisabeth Hendrickson details essential agile testing practices and explains how they mitigate common project risks related to uncertainty, ambiguity, assumptions, dependencies, and capacity. Join Elisabeth to explore the role of testing and testers in today’s agile development teams.

Elisabeth Hendrickson is the founder and president of Quality Tree Software, Inc., a consulting and training company dedicated to helping software teams deliver working solutions consistently and sustainably. Elisabeth wrote her first line of code in 1980. Moments later, she found her first bug. Since then Elisabeth has held positions as a tester, developer, manager, and quality engineering director in a variety of companies ranging from small startups to multi-national enterprises. A member of the agile community since 2003, Elisabeth has served on the board of directors of the Agile Alliance and is one of the co-organizers of the Agile Alliance Functional Testing Tools program. These days, Elisabeth splits her time between teaching, speaking, writing, and working on agile teams with test-infected programmers who value her obsession with testing. She blogs at testobsessed.com. You can also find her on Twitter as @testobsessed.

Stop Guessing About How Customers Use Your Software
Alan Page, Microsoft

What features of your software do customers use the most? What parts of the software do they find frustrating or completely useless? Wouldn’t you like to target these critical areas in your testing? Most organizations get feedback—much later than anyone would like—from customer complaints, product reviews, and online discussion forums. Microsoft employs proactive approaches to gather detailed customer usage data from both beta tests and released products, achieving greater understanding of the experience of its millions of users. Product teams analyze this data to guide improvement efforts, including test planning, throughout the product cycle. Alan Page shares the inner workings of Microsoft’s methods for gathering customer data, including how to know what features are used, when they are used, where crashes are occurring, and when customers are feeling pain. Learn how your organization can employ similar strategies to make better decisions throughout your development life cycle. Alan shares approaches for gathering customer data that can work for any software team—and improve the customer experience for everyone.

A tester since 1993, Alan Page joined Microsoft in 1995 and currently is the Director of Test Excellence, where he oversees the technical training program for testers and other activities focused on improving testers, test tools, and testing across Microsoft. At Microsoft, Alan has worked on various versions of Windows, Internet Explorer, and Windows CE. He is the lead author of How We Test Software at Microsoft (http://www.hwtsam.com), writes about testing on his blog (http://blogs.msdn.com/alanpa), and recently contributed a chapter to Beautiful Testing. Alan is a board member of the Seattle Area Software Quality Assurance Group (SASQAG) and speaks frequently about software testing and careers for software testers.

Wednesday, April 28, 8:30 a.m.

Wednesday, April 28, 4:30 p.m.


Wednesday, April 28, 10:00 a.m.



CALL 888.268.8770 OR 904.278.0524 TO REGISTER • WWW.SQE.COM/SEREG






The Buccaneer Tester: Winning Your Reputation
James Bach, Satisfice, Inc.

Who drives your career as a tester or test leader? Hopefully, not the company for which you work. It’s you—you must be the driver. Because the craft of testing is still relatively free and open, there is no authority structure that defines or controls our industry. There are no generally accepted and standardized credentials that will admit you to the upper tier of income and respect as a tester. There are no universities that offer degrees in testing—although certificates and certifications abound. What we do have is a pastiche of communities, proprietary methodologies, schools of thought—together with ambitious individuals who write articles, teach, argue with each other, and speak at conferences. James Bach, who has made his own way in his twenty-three-year testing career, describes how you can develop your own personal portfolio and reputation to stand out as a senior tester or leader who is indispensable to your current—or any—organization. Join James for an insightful look at what you can do to develop and win your reputation to help your team, project, and company excel.

James Bach is founder and principal consultant of Satisfice, Inc., a software testing and quality assurance company. In the eighties, James cut his teeth as a programmer, tester, and SQA manager in Silicon Valley in the world of market-driven software development. For ten years, he has traveled the world teaching rapid software testing skills and serving as an expert witness on court cases involving software testing. James is the author of Lessons Learned in Software Testing and the recently published Secrets of a Buccaneer-Scholar: How Self-Education and the Pursuit of Passion Can Lead to a Lifetime of Success.

Lessons Learned from 20,000 Testers on the Open Source Mozilla Project
Tim Riley, Mozilla Corporation

Open source, community-based software development can be extremely wild and woolly. Testing in this environment is even more so, given that it is often less structured than software design and coding activities. What are the differences between testing open source and commercial or corporate applications? What can you learn from the open source community? Take a peek into the open source testing world with Tim Riley as he describes how the Mozilla Project develops and tests the Firefox browser. Tim describes how they monitor new builds, how people all around the world engage in testing, and how anomalies quickly bubble up to the release team. Although some of the tools they use may look familiar, how the Mozilla Project applies them will give you a fresh perspective. Find out how to apply the lessons learned at Mozilla to your projects and unleash the creative power of really smart people inside and outside your organization. Now, more than ever, we need all the help we can get!

Director of Quality Assurance at Mozilla, Tim Riley has tested everything from spacecraft simulators, ground control systems, high-security operating systems, language platforms, application servers, and hosted services to open source web applications. He has managed international software-testing teams from startups to large corporations. Tim holds a software patent for a test execution framework that matches test suites to available test systems. His new book Beautiful Testing is a collaboration with twenty-seven leading testers and developers. In addition to his interest in software testing, Tim enjoys live/studio sound engineering and being a breeder-caretaker for Canine Companions for Independence (cci.org).

Thursday, April 29, 8:30 a.m.

Thursday, April 29, 4:15 p.m.




Wednesday, April 28, 11:30 a.m.

CONCURRENT SESSIONS


W1 TEST MANAGEMENT

The Myths of Rigor
James Bach, Satisfice, Inc.

We hear that more rigor means good testing and, conversely, that less rigor means bad testing. Some managers—who’ve never studied testing, done testing, or even “seen” testing up close—insist that testing be rigorously planned in advance and fully documented, perhaps with tidy metrics thrown in to make it look more scientific. However, sometimes measurement, documentation, and planning don’t help. In those cases, rigor may require us not to do them. As part of winning court cases, James Bach has done some of the most rigorous testing any tester will do in a career. James shows that rigor is at least as dangerous as it is useful and that we must apply care and judgment. He describes the struggle in our craft, not just over how rigorous our processes should be, but what kind of rigor matters and when rigor should be applied.

W2 TEST TECHNIQUES

Patterns of Testability
Alan Myrvold, Microsoft

Testability requires interfaces for observing and controlling software, either built into the software itself or provided by the software ecosystem. Observability exposes the input and output data of components, as well as monitoring execution flow. Controllability provides the ability to change data and drive actions through the component interface. Without testability interfaces, defects are harder to find, reproduce, and fix. Manual testing can be improved by access to the information these interfaces provide, while all automated testing requires them. Alan Myrvold shares software component diagrams that show patterns of testability. These patterns will help you architect and evaluate the observability and controllability of your system. Apply these testability patterns to describe and document your own testability interfaces. Learn new ways to think and talk about the testability of your software and create interfaces for a more robust test effort.
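The observability/controllability split can be sketched in a few lines of code. The component and its names below are invented for illustration, not from the session: the event log is the observability interface, and the injectable clock is the controllability interface.

```python
# Sketch of a testability interface (hypothetical component): the class
# exposes hooks for observing internal state and controlling behavior.

class PaymentProcessor:
    """Component with built-in observability and controllability."""

    def __init__(self, clock=None):
        # Controllability: the clock is injectable, so tests can drive
        # time-dependent behavior deterministically.
        self._clock = clock or (lambda: 0)
        self._events = []  # Observability: execution-flow log

    def process(self, amount):
        self._events.append(("process", amount, self._clock()))
        if amount <= 0:
            self._events.append(("rejected", amount))
            return False
        self._events.append(("accepted", amount))
        return True

    # Observability interface: tests and monitoring read state,
    # rather than inferring it from external behavior alone.
    def event_log(self):
        return list(self._events)

# A test controls the clock and observes the flow directly.
fake_time = iter(range(100))
p = PaymentProcessor(clock=lambda: next(fake_time))
assert p.process(50) is True
assert ("accepted", 50) in p.event_log()
```

Without the log and the injectable clock, a failing `process` call could only be diagnosed from its return value—exactly the "harder to find, reproduce, and fix" situation the abstract describes.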

W3 TEST AUTOMATION

Using Test Automation Frameworks
Andrew Pollner, ALP International

As you embark on implementing or improving automation within your testing process, you’ll want to avoid the “Just Do It” attitude some have taken. Perhaps you’ve heard the term “test automation framework” and wondered what it means, what it does for testing, and if you need one. Andrew Pollner, who has developed automated testing frameworks for more than fifteen years, outlines how frameworks have grown up around test automation tools. Regardless of which automation tool you use, the concepts of a framework are similar. Andrew answers many of your questions: Why build a framework? What benefit does it provide? What does it cost to build a framework? What ROI can I expect when using a framework? Explore the different approaches to framework development and identify problems to watch out for to ensure the approach you take will provide years of productivity. Leave with a better understanding of automated frameworks and the confidence to build your own or improve the one you have.

W4 THE NEW WAVE

Testing AJAX: What Does It Take?
Joachim Herschmann, Borland (a Micro Focus company)

Using AJAX technologies, Web 2.0 applications execute much of the application functionality directly in the browser. While creating a richer user experience, these technologies pose significant new challenges for testers. Joachim Herschmann describes the factors that are critical in testing Web 2.0 applications and what it takes to master these challenges. After presenting an overview of typical Web 2.0 application technologies, Joachim explains why object recognition, synchronization, and speed are the pillars of a truly robust and reliable AJAX test automation approach. He shows how to architect testability directly into AJAX applications, including examples of how to instrument applications to provide the data that testing tools require. Joachim shares his experiences at Micro Focus’s Linz development lab and describes how they overcame the challenges of testing their modern AJAX applications. Join Joachim to learn how testers and developers can collaborate to create more testable Web 2.0 applications.
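The synchronization pillar reduces to one core idea: poll for an asynchronous condition instead of sleeping a fixed time. The helper below is a generic sketch of that pattern, not taken from any specific AJAX test tool; the names are illustrative.

```python
import time

def wait_until(condition, timeout=5.0, interval=0.05):
    """Poll `condition` until it returns truthy or the timeout expires.
    Fixed sleeps either waste time or flake; polling does neither."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        result = condition()
        if result:
            return result
        time.sleep(interval)
    raise TimeoutError("condition not met within %.1fs" % timeout)

# Usage: wait for a simulated asynchronous update rather than guessing
# a delay. In a real AJAX test, the browser would set this state.
state = {"loaded": False}
state["loaded"] = True
assert wait_until(lambda: state["loaded"]) is True
```

The same shape underlies the explicit-wait facilities of browser automation tools: a predicate, a timeout, and a polling interval.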

W5 SPECIAL TOPICS

The Many Hats of a Tester
Adam Goucher, Consultant

As testers, we must wear many hats to do our job effectively. Quite often, it is the pith helmet of an explorer, hacking through the vines and darkness of the unknown; or the baseball cap of the crime scene investigator, determining how the failure occurred. To make things even more interesting, the hats we need often differ from project to project and organization to organization. Adam Goucher begins with a general discussion of some hats testers typically wear and when they are appropriate or inappropriate. He then leads an “Art Show” exercise—a brainstorming process resulting in lots of “art” on the walls—illustrating the hats we all may wear in our daily testing activities. Through the Art Show process, you’ll take away new insights into what hats you and other testers need, tips for wearing the beautiful ones with success, and how to avoid putting on the ugly ones.

“Wonderful, fabulous, and EXTREMELY EDUCATIONAL.”— Denise Cormier, Business Analyst



Wednesday, April 28, 1:45 p.m.



W6 TEST MANAGEMENT

The Top Ten Challenges—or Opportunities—We Face Today
Lloyd Roden, Grove Consultants

Some people thrive on challenges; others struggle to deal with them. Handled well, challenges can make us stronger in our passion, drive, and determination. Lloyd Roden describes the software testing challenges we face today and how we can respond in a positive, constructive manner. One challenge Lloyd often sees is identifying and eliminating metrics that lie. While we (hopefully) do not set out to deceive, we must endeavor to employ metrics that have significance, integrity, and operational value. Another challenge test leaders face is providing estimates that have clarity, accuracy, and meaning. Often when developing test estimates, we omit a vital ingredient—the quality required in the product. A third challenge is convincing test managers to actually test regularly to attain credibility and respect with the team they are leading. Join Lloyd as he delivers passionate and compelling arguments for turning these and other challenges into opportunities for the test team.

W7 TEST TECHNIQUES

Automated Test Case Generation Using Classification Trees
Peter M. Kruse and Magdalena Luniak, Berner & Mattner Systemtechnik GmbH

The basic problem in software testing is choosing a subset from the near-infinite number of possible test cases. Testers must select test cases to design, create, and then execute. Often, test resources are limited—but you still want to select the best possible set of tests. Peter M. Kruse and Magdalena Luniak share their experiences designing test cases with the Classification-Tree Editor (CTE XL), the most popular tool for systematic black-box test case design of classification tree-based tests. Peter and Magdalena show how to integrate weighting factors into classification trees and automatically obtain prioritized test suites. In addition to “classical” approaches such as minimal combination and pair-wise, they share new generation rules and demonstrate the upcoming version of CTE XL that supports prioritization by occurrence probability, error probability, or risk. If your test case design methods are stuck in the last century, this session is for you.
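The “pair-wise” generation rule mentioned above can be illustrated with a toy greedy algorithm: keep choosing the test case that covers the most uncovered value pairs until every pair appears in some test. The parameters are invented for illustration, and real tools such as CTE XL use far more sophisticated generation and prioritization.

```python
from itertools import combinations, product

def pairwise_cover(parameters):
    """Greedy all-pairs selection over a dict of name -> value list."""
    names = list(parameters)
    all_combos = [dict(zip(names, vs))
                  for vs in product(*(parameters[n] for n in names))]

    def pairs_of(case):
        return {((a, case[a]), (b, case[b]))
                for a, b in combinations(names, 2)}

    uncovered = set().union(*(pairs_of(c) for c in all_combos))
    tests = []
    while uncovered:
        # Pick the combination that covers the most still-uncovered pairs.
        best = max(all_combos, key=lambda c: len(pairs_of(c) & uncovered))
        tests.append(best)
        uncovered -= pairs_of(best)
    return tests

params = {"browser": ["FF", "IE"], "os": ["Win", "Mac"], "lang": ["en", "de"]}
suite = pairwise_cover(params)
print(len(suite), "tests instead of", 2 * 2 * 2)
```

Even on this tiny example the suite shrinks by half while every browser/OS, browser/language, and OS/language pair still appears in at least one test.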

W8 TEST AUTOMATION

Test Automation Success: Choosing the Right People and Processes
Kiran Pyneni, Aetna, Inc.

Many testing organizations mistakenly declare success when they first introduce test automation into an application or system. However, the true measure of success is sustaining and growing the automation suite over time. You need to develop and implement a flexible process, and engage knowledgeable testers and automation engineers. Kiran Pyneni describes Aetna’s two-team automation structure, the functions that each group performs, and how their collaborative efforts provide for the most efficient test automation. Kiran explains how to seamlessly integrate your test automation lifecycle with your software development lifecycle. He shares specific details on how Aetna’s automation lifecycle benefits their entire IT department and organization, and the measurements they use to track and report progress. Take back an organization structure, an automation process, and a set of metrics to showcase how test automation can add value to your entire development organization.

W9 THE NEW WAVE

Virtual Test Labs in the Cloud
Ravi Gururaj, VMLogix

In most software engineering organizations, development and test labs demand regular computer, storage, and networking infrastructure upgrades and continuous support. Lab administrators have moved toward server consolidation powered by virtualization platforms from vendors such as Citrix, Microsoft, and VMware, often accompanied by a management layer called virtual lab automation (VLA). Together, virtualization and VLA enable the lab to operate as a private, on-premise cloud. While this solves some problems, there are still other challenges to consider. Some test labs now leverage public cloud infrastructures such as Amazon Web Services. Ravi Gururaj reviews virtual labs enabled in private, public, and hybrid clouds, and explains how they improve development, build, and test processes. Learn about the key areas of consideration for organizations adopting or evaluating this approach, including security, public cloud infrastructure, operational flexibility, user access models, and compliance.

W10 SPECIAL TOPICS

Chartering the Course: Guiding Exploratory Testing
Rob Sabourin, AmiBug.com

Charters help you guide and focus exploratory testing. Well-formed charters help testers find defects that matter and provide vital information to stakeholders about the quality and state of the software under test. Rob Sabourin shares his experiences defining different exploratory testing charters for a diverse group of test projects. For example, reconnaissance charters focus on discovering application features, functions, and capabilities; failure mode charters explore what happens to applications when something goes wrong. In addition, you can base charters on what systems do for users, what users do with systems, or simply the requirements, design, or code. Rob reviews the key elements of a well-formed testing charter—its mission, purpose, focus, understanding, and scope. Learn how to evolve a test idea into an exploratory charter using examples from systems testing, Scrum story testing, and developer unit testing.

“Overall, one fantastic experience that inspires improvement and learning! I would recommend to others.”— Kathleen Greenburg, Application Test Coordinator


Wednesday, April 28, 3:00 p.m.



W11 TEST MANAGEMENT

Service-Driven Test Management
Martin Pol, POLTEQ IT Services BV

Over the years, the test manager’s role has evolved from “struggling to get involved early” to today’s more common “indispensable partner in project success.” In the past, when “us vs. them” thinking was common, it was easy to complain that the testing effort could not be carried out as planned due to insufficient specs, not enough people, late and incomplete delivery, no appropriate environments, no tools, tremendous time pressure, etc. Martin Pol explains how today’s test managers must focus on providing a high level of performance. By using a service-driven test management approach, test managers support and enhance product development, enabling the project team to improve overall quality and find solutions for any testing problem that could negatively impact the project’s success. Learn how to manage in the modern testing scene by anticipating from the beginning, organizing alternatives and shortcuts, enabling the project team to deliver quality, and taking ownership as a full partner in the project.

W12 TEST TECHNIQUES

Meet “Ellen”: Improving Software Quality through Personas
Euan Garden, Microsoft

Users are the ultimate judge of the software we deliver because it is critical to their success and the success of their business. However, as a tester, do you really understand their tasks, skills, motivation, and work style? Are you delivering software that matches their needs and capabilities—or yours? Personas are a way to define user roles—imaginary characters—that represent common sets of characteristics of different users. Euan Garden shares how his team at Microsoft defined and used one persona named “Ellen” to help them design, develop, and test the first version of a new product. Euan shares before-Ellen and after-Ellen examples of the product, showing how the product changed when Ellen joined the team. See examples of the robust test cases and acceptance scenarios they defined from unique insights that Ellen provided. Discover how to create and utilize an “Ellen” on your next project or on the one you’re struggling with today.

W13 TEST AUTOMATION

Automated Testing with Domain-Specific Test Languages
Martin Gijsen, DeAnalist.nl

Although testers are good at test analysis and design, few are expert developers who can create good automated tests. This often leaves automated test development in a kind of no man’s land between the disciplines of testing and development. Martin Gijsen shows how using a high-level, domain-specific test language (DSTL) solves this dilemma. Applicable to any system or application with either waterfall or agile development, this advanced form of keyword-driven testing allows testers to focus on testing issues and developers to focus on programming by separating “what” to test from “how” to test. Using DSTLs, automated tests require no programming skills to write and maintain. Martin demonstrates how developers can implement DSTL keywords using any suitable technology, including free open source software, and how testers can benefit from effective automated testing as systems evolve.
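The “what” vs. “how” separation can be shown with a minimal keyword-driven interpreter: testers write plain-text steps, developers map each keyword to code. The keywords and the shopping-cart system below are invented for illustration, not from Gijsen's session.

```python
# Developers implement the "how": one function per keyword.
def login(session, user):
    session["user"] = user

def add_item(session, item):
    session.setdefault("cart", []).append(item)

def assert_cart_size(session, expected):
    assert len(session.get("cart", [])) == int(expected), "cart size mismatch"

KEYWORDS = {"login": login, "add item": add_item, "cart size is": assert_cart_size}

def run_test(script):
    """Testers write the "what": one '<keyword> | <argument>' step per line.
    No programming skill is needed to write or maintain a script."""
    session = {}
    for line in script.strip().splitlines():
        keyword, _, arg = (part.strip() for part in line.partition("|"))
        KEYWORDS[keyword](session, arg)
    return session

result = run_test("""
login | alice
add item | book
add item | pen
cart size is | 2
""")
print("PASS:", result["user"], result["cart"])
```

When the system changes, only the keyword implementations move; the scripts—the domain-level test intent—stay readable and stable.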

W14 THE NEW WAVE

Virtualizing Overutilized Systems to Eliminate Testing Constraints
Rajeev Gupta, iTKO

Organizations currently are using virtualization in the test lab to consolidate underutilized systems such as physical computers and software. So why not virtualize the costly, overutilized, or completely unavailable elements of the software architecture that have serious access and data issues for testing? These elements required for realistic end-to-end testing—mainframe computers, production systems of record, and computing services hosted by other companies—are often difficult or expensive to access for testing. Rajeev Gupta explains how virtualizing these overutilized systems can make the constraints of capacity, test data, and availability for testing a distant memory. Discover how service virtualization, employed as an adjunct to hardware lab virtualization, eliminates the bottlenecks and data management efforts that stymie many test and development teams. Enhance your test lab’s availability and capacity and lower costs at the same time with virtualization services.
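Service virtualization in miniature: stand in for an overutilized backend with a stub that replays recorded request/response pairs, so tests run without touching the real mainframe or partner service. The endpoint and recorded data below are invented for illustration; commercial tools record real traffic and model far richer behavior.

```python
import http.server
import json
import threading
import urllib.request

# "Recordings" of the real backend's responses (hypothetical data).
RECORDED = {"/account/42": {"balance": 100, "status": "active"}}

class VirtualService(http.server.BaseHTTPRequestHandler):
    """Replays recorded responses instead of hitting the real system."""

    def do_GET(self):
        known = self.path in RECORDED
        body = json.dumps(RECORDED.get(self.path, {"error": "not recorded"})).encode()
        self.send_response(200 if known else 404)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep test output quiet
        pass

server = http.server.HTTPServer(("127.0.0.1", 0), VirtualService)
threading.Thread(target=server.serve_forever, daemon=True).start()

# The system under test talks to the stub, not production.
url = "http://127.0.0.1:%d/account/42" % server.server_port
data = json.loads(urllib.request.urlopen(url).read())
print(data)
server.shutdown()
```

Because the stub is cheap and always available, capacity, scheduling, and test-data constraints on the real backend stop gating the test run.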

W15 SPECIAL TOPICS

The Elusive Tester-Developer Ratio
Randy Rice, Rice Consulting Services

Perhaps the most sought after and least understood metric in software testing is the ratio of testers to developers. Many people are interested in learning the standard industry ratio so that they can determine the proper size of their test organization. Randy Rice presents the results of his recent research on this metric and explores the wide range of tester-developer ratios in organizations worldwide. Learn why this metric is almost never the best way to determine your test organization’s staffing levels and how to understand and apply it in more helpful ways. Find out how different tester-developer ratios relate to test effectiveness. Take away a new appreciation of your own tester-developer ratio and ways to meaningfully convey this metric to management to help rightsize your test team and improve the ROI of testing. Determine the “right ratio” of testers to developers in your team and company.

“I walked away with a handful of items to take back to our business to improve our testing. I couldn’t have asked for anything more.”— John Kimpel, Software Quality Manager



Thursday, April 29, 9:45 a.m.



T1 TEST MANAGEMENT

A Test Odyssey: Building a High-Performance, Distributed Team
Matt Heusser, Socialtext

It seemed simple enough—hire the best available technical staff that would work from home to build some great software. Along the way, the team encountered the usual problems: time zone differences, communication headaches, and a surprising regression test monster. Matt Heusser describes how Socialtext built their high-performance development and test team, got the right people on the bus, built a culture of “assume good intent and then just do it,” created the infrastructure to enable remote work, and employed a lightweight yet accountable process. Of course, the story has the impossible deadlines, conflicting expectations, unclear roles, and everything you’d get in many development projects. Matt shares how the team cut through the noise, including building a test framework integrated into the product, to achieve their product and quality aims. Take away a list of technologies that make remote work possible, cultural ideas to make it effective, and some things to try on Monday—plus, some apparently good ideas that you definitely want to avoid!

T2 TEST TECHNIQUES

I Wouldn’t Have Seen It If I Hadn’t Believed It: Confirmation Bias in Testing
Michael Bolton, DevelopSense

“It ain’t what we don’t know that gives us trouble; it’s what we know that ain’t so.” Will Rogers was talking about confirmation bias—the tendency to feel secure in our beliefs rather than to seek evidence that might challenge them. In testing, confirmation bias prompts us to stop a test too early, to choose tests that conform too closely to the happy path, or to ignore results that confound our expectations. As a result, defects have a chance to hide in our self-induced blind spots. We can’t eliminate confirmation bias, but we can manage and control it by diversifying our models, our techniques, and our test teams. In this hands-on and eyes-on session, Michael Bolton presents a set of exercises, videos, and conversations that show testing biases in action. Discover some new tricks that can help you defend yourself and your testing clients from being too sure, too soon, and later … sorry.

Laptop Required. Participants are encouraged to bring a laptop computer on which to perform exercises.

T3 PERFORMANCE TESTING

Performance Testing Throughout the Life Cycle
Chris Patterson, Microsoft

Even though it is easy to say that you should continuously test your application for performance during development, how do you really do it? What are the processes for testing performance early and often? What kinds of problems will you find at the different stages? Chris Patterson shares the tools and techniques he recently used during the development of a highly concurrent and highly scalable server that is shipping soon. Chris explores how developers and testers used common tools and frameworks to accelerate the start of performance testing during product development. Explore the challenges they faced while testing a version 1 product, including defining appropriate performance and scale goals, simulating concurrent user access patterns, and generating a real-world data set. Learn from his team’s mistakes and their successes as Chris shares both the good and the bad of the process and results.

T4 AGILE TESTING

Implementing Agile Testing
Robert Reff, Thomson Reuters

Once the company decides to move to an agile development methodology, questions invariably arise: How should we implement this methodology? What are the expected benefits and pitfalls? How does testing fit into this new approach? Join Robert Reff as he describes real-world experiences that helped his test team move from the design-code-test approach to a test-driven, agile development philosophy. Robert offers concrete advice on how to integrate testing, what testing activities to include or drop, and what to expect from both automation and exploratory testing. He describes possible practices, focus, and pitfalls, rather than the all-or-nothing approach often recommended by well-meaning experts. Take home the basis of a plan to implement agile testing without falling into common traps such as lack of training, demoralizing your team, sacrificing a project to the methodology gods, and—worst of all—abandoning processes that work while ignoring needed improvements.

T5 SPECIAL TOPICS

How Google Tested Chrome
James Whittaker and the Google Chrome Test Team, Google

Ever wish you could peek inside a big, high-tech company and see how they actually do testing? Well, now you can. Led by James Whittaker, Google’s Chrome Test Team will detail everything they have done to test Google Chrome—both the browser and the netbook operating system—beginning with their process for test planning and how they design test automation. James and his team share their initial plans, automation efforts, and actual results in what is likely to be the most candid and honest assessment of internal testing practices ever presented. Learn what worked, what didn’t work, and how they’d proceed if they had it all to do over again. Take away copies of Google’s actual test artifacts and learn how to apply Google’s test techniques on the product you are currently testing.

“I really liked the variety of track sessions that were offered. Really did a great job at covering all testing possibilities”— Alex Pappas, Test Engineer

Laptop Required


Thursday, April 29, 11:15 a.m.



T6 TEST MANAGEMENT

Heuristics for Rapid Test Management
Jon Bach, Quardev, Inc.

Whether you are a tester or a test manager, Jon Bach believes you have little time to do the things you want to do. Even the things on your “absolutely must do” list are competing for your limited time. Jon has a list of what he calls “half-baked” ideas on how to cope. That is, these ideas are still in the oven—still being tested. In his role as a tester and manager, Jon has learned that it’s not about time management; it’s really about energy management—where you focus your personal energy and direct your team’s energy. Jon shares ideas that have worked for him and some that have failed: Open-Book Testing, Dawn Patrols, Tester Show-and-Tell, Test Team Feud, and Color-Aided Design. Learn how these ideas may solve your problems with test execution, reporting, measurement, and management—all at low or no cost and relatively easy to implement.

T7 TEST TECHNIQUES

Test Environments: The Weakest Link in Your Testing Chain
Julie Gardiner, Grove Consultants

Test environments are an important part of our testing portfolio, yet often we seem to spend very little time planning, creating, and maintaining them. Julie Gardiner explains the reasons we fail to build test environments that are realistic, reliable, representative, and have integrity. As a result, they become the weakest link in our testing process. Julie provides examples of environments—good, bad, and sometimes ugly—and shares why the ugly are often a symptom of an organization’s disregard for testing. She offers practical advice for transforming your current test environment from the weakest into the strongest link of your testing. Julie’s specific advice includes identifying early signs that the environment will cause problems, convincing management that extra resources are required, and obtaining tool support—for test data preparation and test execution—to assist in creating an excellent environment.

T8 PERFORMANCE TESTING

Performance Testing SQL-Based Applications
Alim Sharif, Ultimate Software

Often, we discover the “real” software performance issues only after deploying the product in a production environment. Even though performance, scalability, stability, and reliability are standards of today’s software development, organizations often wait until the end of the development life cycle to discover these limitations, resulting in late deliveries and even chaos. Alim Sharif embraces agile development’s philosophies to explain how performance testers can identify and resolve software performance issues early and continue performance testing throughout the development process. Learn how to optimize the use of performance tuning tools such as SQL Profiler and MS PerfMon to identify and fix MS SQL Server, application, and web server performance issues. Institute agile methods in your performance testing efforts to avoid that “Oh, no!” moment when the system goes live.

T9 AGILE TESTING

Avoid Failure with Acceptance Test-Driven Development
C.V. Narayanan, Sonata Software Limited

One of the major challenges confronting traditional testers in agile environments is that requirements are incrementally defined rather than specified at the start. Testers must adapt to this new reality to survive and excel in agile development. C.V. Narayanan explains the Acceptance Test-Driven Development (ATDD) process that helps testers tackle this challenge. He describes how to create acceptance test checkpoints, develop regression tests for these checkpoints, and identify ways to mitigate risks with ATDD. Learn to map acceptance test cases against requirements in an incremental fashion and validate releases against acceptance checkpoints. See how to handle risks such as requirements churn and requirements that overflow into the next iteration. Using ATDD as the basis, learn new collaboration techniques that help unite testing and development toward the common goal of delivering high-quality systems.

T10 SPECIAL TOPICS

Creating Crucial Test Conversations
Bob Galen, iContact

Many test leaders believe that development, business, and management don’t understand, support, or properly value our contributions. You know what—these test leaders are probably right! So, why do they feel that way? Bob Galen believes it’s our inability and ineffectiveness in communicating—selling—ourselves, our abilities, our contributions, and our value to the organization. As testers, we believe that the work speaks for itself. Wrong! We must work harder to create the crucial conversations that communicate our value and impact. Bob shares specific techniques for holding context-based conversations, producing informative status reports, conducting attention-getting quality assessments, and delivering solid defect reports. Learn how to improve your communication skills so that key partners understand your role, value, and contributions. Discover why improving your cross-team communications and feedback skills is a key to creating highly effective test teams. Come prepared to engage and begin developing your own crucial conversations.

CONFERENCE BONUS! One-Year Digital Subscription to Better Software Magazine!

STAREAST 2010 conference attendees receive a one-year digital subscription (six issues) to Better Software magazine—the only magazine delivering relevant, timely information so you can tackle the challenges of building better quality software, regardless of your role in the software development lifecycle. www.betterSoftware.com

If you are a current subscriber, your subscription will be extended for an additional six digital issues.




Thursday, April 29, 1:30 p.m.



T11 TEST MANAGEMENT

Proving Our Worth: Quantifying the Value of Testing
Lee Copeland, Software Quality Engineering

Over the years, experts have defined testing as a process of checking, a process of exploring, a process of evaluating, a process of measuring, and a process of improving. For a quarter of a century, we have been focused on the internal process of testing, while generally disregarding its real purpose—creating information that others on the project can use to improve product quality. Join Lee Copeland as he discusses why quantifying the value of testing is difficult work. Perhaps that’s why we concentrate so much on test process; it is much easier to explain. Lee identifies stakeholders for the information we create and presents a three-step approach to creating the information they need to make critical decisions. He shares key attributes of this information—accuracy, timeliness, completeness, relevancy, and more. Learn how to turn vague impressions and a multitude of data into concise, actionable information that will improve your products and support your organization.

T12 TEST TECHNIQUES

Focusing Test Efforts with System Usage Patterns
Dan Craig, Coveros, Inc.

Faced with the reality of tight deadlines and limited resources, many software delivery teams turn to risk-based test planning to ensure that the most critical components of the software are production ready. Although this strategy can prove effective, it is only as good as your underlying risk analysis. Unfortunately, understanding where risk lies within a product is difficult, and the analysis often results in little more than an “educated guess.” These risk-based testing exercises can lead to uneven test coverage and the uneasy feeling that the team has neglected to test what is really important. Dan Craig describes how to employ system usage patterns and production defect reports to identify the real risks in a system. Walk with Dan as he creates a risk-based test plan, explores which production usage patterns provide the most value, and describes how his teams leveraged agile practices to quickly account for unexpected gaps in test coverage.

T13 SECURITY TESTING

Tour-Based Testing: The Hacker’s Landmark Tour
Rafal Los, Hewlett-Packard

Growing application complexity, coupled with the exploding increase in application surface area, has resulted in new quality challenges for testers. Some test teams are adopting a tour-based testing methodology because it’s incredibly good at breaking down testing into manageable chunks. However, hackers are paying close attention to systems and developing new targeted attacks to stay one step ahead. Rafal Los takes you inside the hacker’s world, identifying the landmarks hackers target within applications and showing you how to identify the defects they seek out. Learn what “landmarks” are, how to identify them from functional specifications, and how to tailor negative testing strategies to different landmark categories. Test teams, already choked for time and resources and now saddled with security testing, will learn how to pinpoint the defect—from the mountains of vulnerabilities often uncovered in security testing—that could compromise the entire application.

T14 AGILE TESTING

Enable Agile Testing through Continuous Integration
Sean Stolberg, Pacific Northwest National Laboratory

Continuous integration is one of the key processes that support an agile software development and testing environment. Sean Stolberg describes how a traditional software tester—transitioning to an agile development environment—put a continuous integration infrastructure in place. In doing so, he helped improve development practices and made possible his team’s transition to agile testing. Sean discusses his team’s initial motivations for adopting agile development practices and dives into the nuts-and-bolts implementation details. He shares their post-assessment of the implementation using Martin Fowler’s “Practices of Continuous Integration” and concludes with a retrospective on implementing and promoting continuous integration within the context of agile testing. Find out how continuous integration can help improve your testing results and the quality of the software your team delivers.

T15 TESTING MOBILE APPLICATIONS

Creating the Right Environment for Mobile Applications Testing
Nat Couture, Professional Quality Assurance Ltd.

Is your organization releasing applications that target multiple mobile devices, platforms, or browsers? If so, you have faced—or soon will face—the challenge of choosing and setting up a test environment for these devices and platforms. Nat Couture shows how to develop a cost-effective application test environment to mitigate the risks associated with deploying mobile applications. He shares his latest research on mobile devices, mobile platforms, and mobile browser usage, and explains in detail what you need to consider when choosing a test environment. Learn how to select a winning combination of device-specific simulation, platform-specific simulation, and browser-specific simulation—coupled with tests on the actual devices. Build a mobile device testing program that reduces cost, increases coverage, and helps achieve the level of confidence you need to release mobile applications into production.

“Variety of speakers—some with very different viewpoints. Great that it’s not homogenized to one view. By the way, SQE ROCKS!!”— Jim Peak, Test Manager


Thursday, April 29, 3:00 p.m.



T16 TEST MANAGEMENT

The Power of Risk
Erik Boelen, QA Consult

Erik Boelen starts his risk-based testing where most others stop. Too often, risk-based test strategies are defined in the initial test plan and are never looked at or used again. Erik explores how a dynamic, living risk-based testing strategy gives testers a vital tool to manage and control testing activities and identify the infrastructure they need to perform these activities. Find out how to use your risk-based testing strategy as a tool for negotiations among the different stakeholders. Take on the important role of risk mediator for all of the parties in the project. The risk-based test strategy is a tool you can use to defend testing’s need for time and resources, especially when late delivery is possible. Use your risk-based strategy to drive and manage exploratory testing sessions. Take back the tools and skills you need to develop and maintain a comprehensive and up-to-date risk-based test plan to guide you and all stakeholders throughout the project.

T17 TEST TECHNIQUES

Keys to a Successful Beta Testing Program
Rob Swoboda, HomeShore Solutions

Your company is ready to launch its new product. How will it perform under real-world conditions? Will it meet the needs and expectations of the users? Will it operate on all the platforms and configurations for which it was designed? With the future of the product, your company, and perhaps your job depending on the answers, beta testing is a great way to maximize your chances of success. Beta testing provides empirical metrics that prove or disprove that your product will meet clients’ expectations, providing you with input for necessary course corrections in the product. Rob Swoboda explains the process of beta testing as well as the key concepts needed to plan, execute, and evaluate a successful beta testing effort. Rob shares his insights into the practices he employs to design and manage high-priority beta test efforts and offers you the keys to succeed in your own beta test program.

T18 SECURITY TESTING

Web Security Testing with Ruby
James Knowlton, McAfee, Inc.

To ensure the quality and safety of web applications, security testing is a necessity. So, how do you cover all the different threats—SQL injection, cross-site scripting, buffer overflow, and others? James Knowlton explains how Ruby combined with Watir—both freely available—makes a great toolset for testing web application security. Testing many common security vulnerabilities requires posting data to a web server via a client, exactly what Watir does. The Ruby side of Watir, a full-function programming language, provides the tools for querying the database, checking audit logs, and other test-related processing. For example, you can use Ruby to generate random data or large datasets to throw at a web application. James describes common security attacks and demonstrates step-by-step examples of testing these attack types with Ruby and Watir. Leave this session with a new set of free, open source tools for security and other testing needs.

T19 AGILE TESTING

Taming Bug Reports and Defects: The Agile Way
Lisa Crispin, ePlan Services, Inc.

Software defects bug everyone. If your organization is like most and you have a large queue of defects waiting to be fixed, this session is for you. It’s probably not realistic to think we’ll get around to fixing all of these bugs; so, we need to consider another approach. Lisa Crispin explains how agile teams address defects and how you can apply an agile approach to defects whether or not your development approach is “agile.” Explore with Lisa ways to deal with a giant pile—or database—of old bug reports and which of the many available defect tracking systems to consider—if you need one at all. See examples of alternatives to traditional bug reporting and how to shift your team’s mindset toward preventing bugs in the first place. Get new ideas for taming your backlog of defects and discover ways your team can work together to minimize or eliminate bug reports altogether.

T20 TESTING MOBILE APPLICATIONS

Crowdsourced Testing of Mobile Applications
Doron Reuveni, uTest

With new mobile applications for BlackBerry, iPhone, and Android battling for media attention and consumer dollars, the pressure to get applications built, tested, and launched has never been greater. Getting high-quality apps to market quickly can make or break a product or company. However, the testing methods that work for web and desktop apps (e.g., in-house QA, outsourcing, emulators/simulators, and beta testers) do not meet the extreme testing needs of mobile apps. Companies must test across many handset makers and models, wireless carriers, operating systems, browsers, and locations. This calls for a new approach—crowdsourcing. Doron Reuveni provides insight into the growing trend of crowdsourced testing for mobile applications and addresses both the benefits and challenges of this new testing model. Learn how, by tapping into a diverse crowd of testers operating outside the lab environment, companies can build a virtual test team that meets their coverage requirements.


See what a STAR experience can do for you and your team!



“Eye-opening. Anyone who wishes to lead a test team effectively must attend...”— Nawajish Noman, PhD., Product Engineer/Test Lead

“Renowned speakers shaping their thoughts and giving better insight into testing”— Shital Shisode, Test Lead

“Excellent throughout – very well organized high quality presentations” — Rich Brahm, Test Automation Engineer

“Networking was very helpful. We were able to brainstorm new testing ideas based on lessons learned from the seminar”— STAR Delegate

“This conference was highly recommended to me by some consultants who were doing testing work in my group. I’ve come away with some ideas that will really help my organization”— Jim Peak, Tester

Explore ways to network with your peers before, during, and after the event. Get real-time updates, connect with colleagues, get help with issues, and more!

Get up-to-the-minute STAREAST updates on Twitter. Start following us today! twitter.com/stareast

Join the STAR Testing Conferences page on Facebook. Learn about ways to save, stay connected, and invite others to join. Log on to Facebook and search STAR Testing Conferences. www.facebook.com

RSVP to attend this event and invite others on LinkedIn. Check for periodic updates on the event. Log on to LinkedIn, click on your profile page, go to your Events section, and under the Find Events tab search STAREAST 2010. www.linkedin.com

Get Connected!


Friday, April 30

Thursday, April 29

The Testing & Quality Leadership Summit
Friday, April 30, 2010


As the economy stabilizes and begins to improve, what should software leaders—test managers, development managers, IT directors, software executives, and CIOs—focus on to ensure that testing adds the maximum value to the organization? The Testing & Quality Leadership Summit is a unique opportunity for software leaders to share ideas and gain new insights into today’s software testing issues.

Program chairs Ross Collard and Rob Sabourin invite you to join senior leaders from the industry—Jane Fraser, Electronic Arts; Miles Lewitt, Intuit; Andy Kaufman, Institute for Leadership Excellence & Development; and Goranka Bjedov, Google—to share your experiences and ideas as you engage with summit participants in thoughtful discussions about the pressing testing and quality issues we all face.

The Software Testing & Quality Leadership Summit provides the perfect opportunity for you to:

• Meet and network with your peers in the industry
• Participate in insightful and informative sessions focusing on testing and quality solutions
• Join in the “think tank” discussion on test leadership issues
• Develop new ideas and action plans for innovation within your test organization

Who Should Attend?

The Software Testing & Quality Leadership Summit is for testing, software, and IT leaders who are looking for innovative ways to lead their organizations toward better software quality and improved testing productivity. Whether you are experienced in managing software testing and QA or new to the field, the Testing & Quality Leadership Summit offers you a unique opportunity to start or to continue your work toward being a great test leader.

8:00 Registration and Breakfast

8:30 Navigating Rough Waters
Jane Fraser, QA Director, Electronic Arts

9:30 Networking Break

9:45 Systematic Innovation
Miles Lewitt, VP Advanced Technology, Intuit

10:45 Networking Break

11:00 Think Tank Session: Leadership Solution Brainstorm and Discussion

12:30 Networking Lunch Buffet

1:30 The Dirty Little Secret of Business
Andy Kaufman, Institute for Leadership Excellence & Development, Inc.

2:30 Networking Break

2:45 Testing Quality vs. Productivity: A View from the Trenches
Goranka Bjedov, Senior Test Engineer, Google

3:45 Wrap-Up and Ongoing Informal Discussions with Speakers and Attendees

5:00 Welcome Reception—Think Tank Issues Identification: As a Leader, What Is Keeping You Up at Night?
Rob Sabourin, Amibug.com


Testing & Quality Leadership Summit Sessions


8:30 a.m.

Navigating Rough Waters
Jane Fraser, QA Director, Electronic Arts

When doing more with less is all too often the norm, retaining your team’s capabilities and a positive outlook is tough. Uncertainties are many: downsizing, offshoring, outsourcing, rumors, mergers, acquisitions, budget cuts, overexpansion, jostling for advantageous position, extreme competition for staff, and more. These challenges can result in all sorts of stresses on your team—and on you personally. Jane Fraser, who has lived through many of these issues herself, explores how stresses can impact your team and how you—as their leader—must work to keep everyone focused on the right goals without stressing out. Find out how, by managing emotions and information, the team can learn to direct itself and determine what it needs to do to get the job done. Discover new processes and approaches to mitigate stress and work through the inevitable uncertainties that crop up at work. Don’t let the rough waters undermine your team dynamics. Instead, take charge and lead your team to a new place where—together as one—they are empowered to manage difficult challenges.

9:45 a.m.

Systematic Innovation
Miles Lewitt, VP Advanced Technology, Intuit

Why is innovation on the agenda at the STAR Testing and Quality Leadership Summit? If you link quality to both delighting your customers and improving business performance, the relationship to innovation becomes clear. To quote Peter Drucker, “The business enterprise has two—and only these two—basic functions: marketing and innovation. [These] produce results; all the rest are ‘costs.’” As the leader of a quality group and a member of the leadership team, you are personally responsible for innovation within your organization. Join Miles Lewitt and explore ways to encourage and enable innovation at all levels of your organization. Examine how to balance test and other development resources among mature, rapidly growing, and new projects; free up resources to pursue innovative opportunities; enable your staff to develop and leverage their innovative capabilities; employ purposeful disruption to drive business growth; and connect the dots between emerging technologies and your customers’ needs. Most importantly, find out what it takes to act on the ensuing insights and turn your cost center into an innovation center.

1:30 p.m.

The Dirty Little Secret of Business
Andy Kaufman, Institute for Leadership Excellence & Development, Inc.

Today, how do you deal with difficult people who have the power to impede your ability to deliver? How can you persuade a disinterested party to act on your needs when you don’t have direct authority? How can you find time to build rapport when you are overwhelmed with day-to-day work and struggling to survive? How do some people appear to succeed naturally, almost effortlessly, when you seem to struggle? The secret is simple—it’s all about relationships. In this dynamic session, Andy Kaufman explores the relationship-building skills you must develop to advance your projects—and your career. Through group interaction, you’ll discover new insights into building mutual respect, practice empathetic listening techniques, and deepen your understanding of different personality types. Learn what networking is—and isn’t—and how to increase the effectiveness of your networks and influence. Return to work with new tools for developing your personal effectiveness at work—and at home.

2:45 p.m.

Testing Quality vs. Productivity: A View from the Trenches
Goranka Bjedov, Senior Test Engineer, Google

Quality tests, which focus on predicting a system’s readiness for delivery, provide the project team, senior management, and client/user constituents with important information while finding late-stage defects. Usually, quality tests apply to the system as a whole rather than at the component level. Productivity tests, which run before code is checked into the code base, increase confidence that a developer’s additions and updates work correctly and improve productivity by enabling developers to move on to new assignments more quickly. Most unit tests, integration tests, and micro performance benchmarks fall into the productivity category. Frequently, proponents feel productivity testing is the only kind of testing needed on a project today. So, do we still need testing for quality? Is it possible that the testing approaches of the past few decades are now passé? Goranka Bjedov engages summit participants in speculating on the future of both quality and productivity testing. Find out what Goranka and your peers think the next few years will bring as businesses continue to focus on maximizing value across the organization.

As an industry veteran with more than fourteen years of experience, Jane Fraser brought her expertise from the e-commerce and telecom industries to the online gaming world when she joined Pogo in 2004. In her role as QA Director, Jane oversees the QA Department, which includes Pogo.com, Club Pogo, and the downloadable business. She has successfully launched more than sixty games in six territories, including Scrabble and Battleship. During her time with Pogo, Jane has provided leadership, established testing processes, and managed a team of testers, which she has grown from six to a robust team of sixty-eight in four countries. Prior to joining Pogo, Jane worked for companies including Corel, Vodafone, and Bigwords.

A vice president at Intuit since 2001, Miles Lewitt manages a corporate development group and specializes in leading change that produces dramatic improvements. A proven senior engineering leader with a passion to deliver for customers, he takes teams on journeys of innovation and continuous improvement, and leads projects where the organization wants to do something significantly new. Early in his career, Miles managed Intel’s software technology lab and the development of many hardware and software products, including the Product of the Year as selected by Electronic Products magazine. He launched and managed Intel’s first overseas software development group.

Andy Kaufman works worldwide with people who want to improve how they lead teams and deliver results. He helps professionals and managers focus on the most important issues and proactively solve them. Andy’s workshops and coaching services have reached tens of thousands of people—from hundreds of companies—helping them become more confident leaders and achieve the results they desire while maintaining balanced lives. Andy is the author of Navigating the Winds of Change: Staying on Course in Business & in Life, Shining the Light on The Secret, and the e-book How to Organize Your Inbox & Get Rid of E-Mail Clutter. He is a certified Project Management Professional (PMP®) and host of The People and Projects Podcast.

Known for her technical achievements and her refreshing wit, Goranka Bjedov has worked at Google for the past five years, where she has planned and implemented product testing and has been responsible for risk analysis and assessment. Her current testing interests include both server- and client-side performance, robustness, and scalability testing for web-based applications. Before Google, Goranka held senior test engineering positions at Network Appliance and AT&T Labs. Previously, she was an Associate Professor in the School of Engineering at Purdue University. A speaker at numerous testing and performance conferences, Goranka has also authored many papers, presentations, and two textbooks.

Friday, April 30



Call 888.268.8770 or 904.278.0524 to register • www.sqe.com/sereg

Assessing Your Readiness for the ISTQB® Foundation Exam
Dale Perry, Software Quality Engineering

Sunday, April 25, 2010—8:30 a.m. - 5:00 p.m.

You’ve studied the ISTQB® Foundation syllabus. You’ve read supplementary material. You may have taken an exam preparation course, but you are not sure you are ready to take the exam. This limited-enrollment, one-day review of the ISTQB® Foundation syllabus will help you assess your readiness for the exam. This is not an in-depth course but an overview of key terms and concepts for those who have already studied. For many people, a key concern is testing terminology. As testers, many of the concepts presented in the syllabus are familiar to us, and we have used many of them in our work. However, we may have used different terms in our organizations. For example, the word “review” is used in many different ways but has a very specific definition in the syllabus. Together, we review each section of the syllabus, focusing on those areas that tend to cause the most confusion. This is an opportunity to evaluate your knowledge prior to sitting for the ISTQB® Foundation exam.

THIS BONUS SESSION IS NOT ACCREDITED BY THE ISTQB® OR ANY NATIONAL BOARD.

* Limited seats available. Reserve your seat by contacting the Client Support Group at 888.268.8770 or 904.278.0524 or email them at [email protected]

FULL DAY SUNDAY BONUS SESSION

Bonus sessions

Speaking 101: Tips and Tricks
Lee Copeland, Software Quality Engineering

Tuesday, April 27, 2010 • 5:30 p.m. – 7:00 p.m.

Are you a new STAR speaker or aspiring to be one in the future? Join us at this workshop on making effective conference presentations. Learn the secrets of developing content, identifying the Big Message, preparing slides with just the right words and images, presenting your message, handling questions from the audience, and being ready when things go wrong. Lee Copeland, a professional speaker since birth, shares ideas that will help you be a better speaker, no matter what the occasion.

A Panel Discussion: The Reality of Testing
Ross Collard, Collard and Company

Wednesday, April 28, 2010 • 6:30 p.m. – 7:30 p.m.

At STAREAST 2010, a panel of seasoned testers will debate the theme “The Reality of Testing.” To paraphrase Dickens, some testers and observers believe that this is the best of times; others believe it is the worst of times. Arguably, there is no reality, only our perspectives and perceptions of reality. Before we can understand where we are likely to be going, or where we want to be going, it helps to have a realistic sense of where we are today. In this panel discussion, we will take an opinionated tour of today’s thinking and practices in software, testing, and quality. The panelists will bring a variety of perspectives, and the discussion will be shaped by the results of a survey conducted beforehand.



The EXPO
April 28–29, 2010

Visit Top Industry Providers Offering the Latest in Testing Solutions
Looking for answers? Take time to explore this one-of-a-kind EXPO, designed to bring you the latest solutions in testing technologies, software, and tools. To support your software testing efforts, participate in technical presentations and demonstrations conducted throughout the EXPO. Meet one-on-one with representatives from some of today’s most progressive and innovative organizations.

EXPO Hours

Wednesday, April 28
10:30 a.m. – 2:00 p.m.
3:30 p.m. – 6:30 p.m.
Reception: 5:30 p.m. – 6:30 p.m. All attendees are invited to the EXPO reception for complimentary food and beverages.

Thursday, April 29
10:30 a.m. – 3:00 p.m.

For Sponsor/Exhibitor news and updates, visit www.sqe.com/STAREAST.

INDUSTRY SPONSORS

STAREAST 2010 sponsors are listed in bold. *This list only includes those that signed up before December 31, 2009.


Acmqueue

ASTQB

Better Software Magazine

Checkpoint Technologies, Inc.

CoDe Magazine

Cognizant

Configuration Management, Inc.

CPT Global

Delasoft, Inc.

froglogic GmbH

HP Software & Solutions

IT Today

iTKO

LDRA

Methods and Tools

Microsoft

SearchSoftwareQuality.com

Software Quality Engineering

Sonata Software

SQE Training

StickyMinds.com

Toolbox.com

Cognizant (NASDAQ: CTSH) provides IT services focused on delivering strategic solutions that address clients’ complex business needs. As the largest provider of Independent Verification & Validation services, Cognizant Testing Services provides comprehensive solutions, including functional testing, shared services, specialized testing services, test consulting, and enterprise test services. For additional information, visit www.cognizant.com/html/solutions/services/testing/landingPage.asp

CPT Global is a leading provider of testing services, test environment & data management, performance tuning, mainframe & distributed capacity planning, and cost reduction services. Established in 1993, CPT is a publicly traded company servicing over 70 Fortune 500 companies in 28 countries in North America, Europe, and Asia Pacific.

HP Software & Solutions helps companies optimize business outcomes and extract more value from infrastructures, applications, information, and people. Our solutions help companies align IT and the business, raise service quality and availability, and use information to drive competition and reduce risk. www.hp.com/go/software

Sonata Software is a global IT consulting and services company with a deep focus on software testing. Sonata has a dedicated testing practice that provides end-to-end testing services, including functional, non-functional, and specialized services such as SOA and Agile testing. Sonata’s marquee client base includes TUI, RBS, Dell, and Fidelity. www.sonata-software.com

MEDIA SPONSORS

PLUS See These Exhibitors and Sponsors at the Testing EXPO (Apr. 28–29)
Please note, these are the sponsors and exhibitors as of the brochure printing.* Visit the Web site below for the most up-to-date information.



Early Bird Offer
Receive up to $200 off the regular conference registration fees if payment is received on or before March 26, 2010.

Buy One Get One Half Off
Register two people at the same time and save half off the second registration. The 50% savings will be taken off the lower of the two registration amounts. To take advantage of this offer, please call the Client Support Group at 888.268.8770 or 904.278.0524 or email them at [email protected] and reference promo code BOGO.

Groups of 3 or More Save 25%
Register a group of three or more at the same time and save 25% off each registration. To take advantage of this offer, please call the Client Support Group at 888.268.8770 or 904.278.0524 or email them at [email protected] and reference promo code GRP3.

PowerPass Discount
PowerPass holders receive an additional $100 off their registration fee. Not a PowerPass member? Learn more at www.stickyminds.com/powerpass.

Alumni Discount
STAREAST alumni receive up to an additional $200 discount off their registration fee.

Software Tester Certification—Foundation Level Training + Conference
If you attend the Software Tester Certification—Foundation Level Training course and the conference, you save an additional $500.

Using Visual Studio® 2010 Ultimate to Improve Software Quality + Conference
If you attend the Using Visual Studio® 2010 Ultimate to Improve Software Quality course and the conference, you save an additional $500.

Please Note—We will always provide the highest possible discount and allow you to use the two largest discounts that apply to your registration.


WAYS TO SAVE ON YOUR CONFERENCE REGISTRATION


Best Value!

STAREAST 2010 Registration Information
April 25–30, 2010 • Orlando, Florida, USA

Registration Fees:*  (On or Before March 26 / After March 26)

o VIP Package (Monday–Friday): $2,595 / $2,795
  Includes 2 days of Pre-conference Tutorials, 2 Conference Days, and the Testing & Quality Leadership Summit

o Conference + 2 Tutorial Days: $2,345 / $2,495

o Conference + 1 Tutorial Day: $2,145 / $2,295

o Conference Only (Wed.–Thurs.): $1,895 / $1,995

o 2 Tutorial Days: $1,745 / $1,795

o 1 Tutorial Day or the Testing & Quality Leadership Summit: $945 / $995

o Software Tester Certification—Foundation Level Training + Conference**: $3,540 / $3,740

o Using Visual Studio® 2010 Ultimate to Improve Software Quality + Conference: $3,290 / $3,490

o Add Testing & Quality Leadership Summit (Friday) to any Conference package: $395 / $395

Online: www.sqe.com/sereg

Email: [email protected]

Phone: 888.268.8770 or 904.278.0524

Easy to Register
CONFERENCE PRICING


PAYMENT INFORMATION
The following forms of payment are accepted: Visa, MasterCard, American Express, check, or company purchase order. Payment must be received before the registration is confirmed. Make all checks payable to Software Quality Engineering. You will receive a confirmation packet upon payment by check, credit card, or company purchase order. Payment must be received at Software Quality Engineering on or before March 26, 2010, to take advantage of the Early Bird conference rates listed above.

HOTEL RESERVATIONS
Take advantage of the discounted conference rate at the Rosen Shingle Creek in Orlando, Florida. To make a reservation, visit www.sqe.com/go?SEHotel or call 904.278.0524 or 888.268.8770 and mention you are attending the STAREAST conference to receive your discount. Cancellations on a guaranteed reservation must occur more than five days prior to the specified arrival time to ensure a refund. If you need special facilities or services, please specify at the time of reservation.

CANCELLATION POLICY
Conference registrations cancelled after April 5, 2010, are subject to a 20% cancellation fee. No cancellations or refunds may be made after April 12, 2010. Substitutions may be made at any time before the first day of the program. Call the Client Support Group at 904.278.0524 or 888.268.8770 to obtain a cancellation code. All valid cancellations require a cancellation code.

SATISFACTION GUARANTEE
Software Quality Engineering is proud to offer a 100% satisfaction guarantee. If we are unable to satisfy you, we will gladly refund your registration fee in full.

MEDIA RELEASE
From time to time we use photographs, video, and audio of conference participants in our promotional and publishing materials. By virtue of your attendance at the STAREAST conference, you acknowledge that Software Quality Engineering, Inc. reserves the right to use your likeness in such materials.

* Your registration includes a one-year digital subscription (6 issues) to Better Software magazine. If you are a current subscriber, your subscription will be extended an additional six digital issues.

** $250 fee.

Special Early Bird Offer!
Receive up to $200 off the regular conference registration fee if payment is received on or before March 26, 2010. See discounted pricing information at left.

EVENT LOCATION
Rosen Shingle Creek is nestled on a 230-acre site along Shingle Creek just off Universal Boulevard, east of the Orange County Convention Center North/South expansion and just ten minutes from the Orlando International Airport. This ideal location is just a short distance from a variety of Orlando’s best attractions, restaurants, shopping, and entertainment venues.

SPECIAL HOTEL RATES FOR STAREAST ATTENDEES!
Book your reservation for your stay at the Rosen Shingle Creek at the discounted conference rate. To make a reservation, visit www.sqe.com/go?SEHotel or call 904.278.0524 or 888.268.8770. If you need special facilities or services, please specify at the time of reservation.

Cancellations on a guaranteed reservation must occur more than five days prior to the specified arrival time to ensure a refund.

ONLINE ACCESS AT THE CONFERENCE
There will be a WiFi café located near the STAREAST registration desk for use during the conference hours.

An exam is included in the course registration price.


330 Corporate Way, Suite 300
Orange Park, FL 32073

IF ADDRESSEE IS NO LONGER EMPLOYED:
Re-route to Director of Software Development

The EXPO
April 28–29, 2010

Visit Top Industry Providers Offering the Latest in Testing Solutions

TOOLS • SERVICES • TECHNIQUES • DEMOS

Conference Sponsors: Media Sponsors:

www.sqe.com/STAREAST
REGISTER BY MARCH 26, 2010, AND SAVE UP TO $200!

FRIDAY, APRIL 30, 2010
Testing & Quality Leadership Summit
Add a fifth day to your conference!

PRESORTED STANDARD
U.S. POSTAGE PAID
GAINESVILLE, FL
PERMIT NO. 726

Want to go green? Email us at [email protected] with “Green” in the subject line to change your preferences to receive email communications only.

Software Testing Analysis & Review
April 25–30, 2010 • Orlando, Florida • Rosen Shingle Creek

GROUP DISCOUNTS AVAILABLE

99% of 2009 Attendees Recommend STAR to Others in the Industry