Testing Trends and Developments
ACS Branch Forum – July 2008

Todd Pasley, Principal Consultant
K. J. Ross & Associates Pty. Ltd.
PO Box 131, West Burleigh, 4219
Ph: 07 5522 5131 Fax: 07 5522 5232
http://www.kjross.com.au
About K. J. Ross & Associates
• Specialise in Software Testing
  – Testing Services
  – Consulting
  – Mentoring
  – Training
• Established 1 July 1997
• ~40 Staff
• Tool agnostic
• NATA accredited software testing laboratory (ISO 17025)
• Developers of the Certified Software Test Professional® programme
• Offices in: Brisbane, Gold Coast, Melbourne, Sydney, Canberra
• Strong focus on R&D. Joint initiatives with:
  – University of Queensland
  – Griffith University
  – Bond University
  – Software Quality Institute
  – NICTA
  – ISO 29119:2010 – Software Testing
Presentation Overview
• Testing Trends
• Critical Challenges and Solutions
• Testing Techniques and Initiatives
• Typical improvement opportunities
Testing as a proportion of project budget
• 2 years ago: 21%
• Now: 25%
• 2 years on: 28%
• Preferred: 33%
(Bar chart; vertical axis 0–40%.)
Spend in Software Testing
Growth Drivers
• Greater visibility and exposure of IT problems
• Quality is no longer a differentiator
• Significant technical challenges (increased integration complexity)
• Greater dependency on IT systems
Proportion of Test Effort
(Bar chart over Integration test, System test, Acceptance test and Other test; vertical axis 0–45%.)
Certification
• Foundation: 19% (2 years ago), 39% (now), 56% (2 years on)
• Advanced: 7% (2 years ago), 12% (now), 33% (2 years on)
(Bar chart; vertical axis 0–60%.)
Top 5 – Adoption Growth (Testing Techniques)
In the last two years:
1. Model Based Testing (up 41%)
2. Unit testing (up 22%)
3. Use of formal test design techniques (up 17%)
4. Test automation (up 13%)
5. Load and performance testing (up 11%)
Others:
• Test Governance
• Risk Based Testing
• Usability Testing
Show me the money – Permanents ($'000 per annum)
Presentation Overview
• Testing Trends
• Critical Challenges and Solutions
• Testing Techniques and Initiatives
• Typical improvement opportunities
Test Managers Forum
• Commenced in 2003
• Workshop based
• Restricted to 40 participants
• Primarily Test Managers
• Focus on lessons learned and information sharing
TMF Participants
Critical Challenges Workshop
• Participants are divided into 6 groups
• Individual test managers propose challenges to their team
• Each team agrees on challenges and presents its list
• All challenges are collected from the teams
• Challenges are consolidated, then surveyed with the entire group, rating impact from 1 (low) to 5 (high)
• Impact is statistically analysed
• In total, 47 critical challenges were identified
Critical Challenges Overview
Year | Top Risk | Key Themes
2003 | Return on Investment | ROI and Schedule
2004 | Infrastructure Management | Environments, Resources & Skills, Requirements
2005 | Sufficient Skilled Staff | Resources & Skills
2006 | Environment and data configuration | Test Environments, Communication and Process Improvement
2008 | Obtaining Clear Requirements | Resources & Skills, Schedule and Requirements
Critical Challenges 2008
Ranking | Challenge | Rating | Area
1 | Obtaining clear requirements | 3.850 | Requirements
2 | Identifying suitably skilled resources and hiring them | 3.850 | People
3 | Building and developing skills in the team | 3.825 | People
4 | Squeezed timeframes at end of project | 3.775 | Schedule
5 | Constraints of resources and time to deliver | 3.750 | Schedule
6 | Environments - appropriate environments | 3.675 | Environments
7 | Managing expectations of testing scope | 3.650 | Requirements
8 | Consistent governance and testing across multiple projects | 3.625 | Governance
9 | Dealing with compressed test schedules | 3.625 | Schedule
10 | Adequate recruitment processes to find the right people | 3.600 | People
10 | Building complex distributed environments | 3.600 | Environments
10 | Cost and availability of providing test environments | 3.600 | Environments
10 | Maintaining staffing levels at appropriate times | 3.600 | People
10 | Obtaining testable requirements | 3.600 | Requirements
10 | Recognition for testing | 3.600 | People
Requirements – Challenges
Ranking | Challenge | Rating
1 | Getting clear requirements | 3.85
7 | Managing expectations of testing scope | 3.65
15 | Obtaining testable requirements | 3.60
18 | Getting commitment from users regarding requirements | 3.55
26 | Changing requirements | 3.35
• Getting requirements for test planning and design appears to be significantly challenging
• Getting feedback from users and managing changes to requirements is less of a challenge
Requirements – Recommendations
• Requirements analysis
• Involve the business in test design review
• Demonstrate cost with root cause analysis
• Participation in change control meetings
• Perform UAT planning and design earlier
• Review and walk through requirements for testability, and provide specific feedback
• Shift methodology towards an agile / prototyping approach for business areas that don't know what they want
• Make the testing team a formal part of requirements sign-off
• Involve appropriate resources to assist with non-functional requirements elicitation
People, Skills and Resourcing – Challenges
Ranking | Challenge | Rating
2 | Identifying suitably skilled resources and hiring them | 3.850
3 | Building and developing skills in the team | 3.825
11 | Maintaining staffing levels at appropriate times | 3.600
12 | Adequate recruitment processes to find the right people | 3.600
14 | Recognition for testing | 3.600
16 | Obtaining the right resource in a buoyant market | 3.575
21 | Retaining staff | 3.525
24 | Morale and recognition for testers | 3.450
38 | Lack of professional test teams and seconding from development | 2.950
42 | Attitude with new starters - Gen Y alignment | 2.900
47 | Resourcing non-sexy test projects - legacy | 2.275
• Managing people is a significant challenge
• The main challenges were finding and recruiting skilled testers, and building skills within the team
• Supporting the variable workload with different staffing and sourcing levels is a challenge
• Getting recognition for testing (perhaps to attract support and staff) is challenging, but recognition for the testers themselves had less impact
• Retaining staff and keeping them motivated seemed less impacting
People, Skills and Resourcing – Recommendations
• Finding skilled staff
  – Develop relationships with universities
  – Invest in skill development to leverage existing resources
  – Separate skill needs
  – Collect metrics to better understand resourcing needs
  – Support test processes, procedures and infrastructure
• Developing staff skills
  – General training courses and certification (e.g. ISEB/ISTQB and CSTP)
  – Specific training courses based on skill need
  – Mentoring, coaching, buddy system
  – Role and department rotation
  – Professional development plans
People, Skills and Resourcing – Recommendations (continued)
• Balancing resources
  – Utilise historical estimates to improve planning
  – Adopt a risk-based approach to prioritise testing
  – Consider test outsourcing
  – Increase skill versatility
  – Centralise testing resources
Schedule – Challenges
Ranking | Challenge | Rating
4 | Squeezed timeframes at end of project | 3.775
5 | Constraints of resources and time to deliver | 3.750
9 | Dealing with compressed test schedules | 3.625
22 | Time pressure driven by external demands | 3.500
Tight timeframes for testing create a significant challenge, perhaps leading managers to seek out novel approaches to delivering on tight schedules.
Schedule – Recommendations
• Prioritise test effort (e.g. risk-based testing)
• Involvement in planning activities up front
• Perform testing activities and phases earlier in the SDLC
• Iterative test cycles
• Exploratory testing
• Utilise historical estimation to justify the test schedule
• Ramp up resourcing:
  – Centralised test unit
  – Short-term contractors
  – Outsourcing
• Clear visibility into progress, risks and issues
Environments – Challenges
Ranking | Challenge | Rating
6 | Environments - appropriate environments | 3.675
10 | Cost and availability of providing test environments | 3.600
13 | Building complex distributed environments | 3.600
20 | Environments - timeliness of providing environment | 3.525
25 | Getting integration working between application and environments | 3.400
43 | Building stable distributed systems (J2EE) environments | 2.821
44 | Understanding value of virtualisation | 2.769
• Gaining appropriate environments is challenging, especially given the constraints of cost and infrastructure; this is particularly the case with distributed environments involving significant integration
• Providing stable and timely environments is less of an issue
• Similarly, how virtualisation is incorporated into environments has less of an impact
Environments – Recommendations
• Plan for test environment and data needs
• Document test environment requirements
• Work with infrastructure teams and other environment stakeholders
• Develop a test environments team
• Consider virtualisation where appropriate
• Consider outsourcing environments
Governance – Challenges
Ranking | Challenge | Rating
8 | Consistent governance and testing across multiple projects | 3.625
19 | Adopting a governance framework | 3.525
30 | Alignment of process and skills across different groups | 3.250
31 | Timely governance to ensure legislative compliance is met | 3.200
• Achieving consistent governance and testing across the organisation has become significant, implying that projects tend to take differing approaches
• Other areas of governance are considered to have less impact
Governance – Recommendations
• Define a suitable software test lifecycle, processes and procedures
• Customise to suit
• Regular audits
• Allow flexibility
• Provide supporting tools
• Continuously review to ensure usability
• Collect metrics for measuring improvement
Presentation Overview
• Testing Trends
• Critical Challenges and Solutions
• Testing Techniques and Initiatives
• Typical improvement opportunities
Risk-Based Testing
What?
1. Justify which testing activities are to be performed
  – Product risks (or weaknesses) are proposed by all relevant stakeholders in workshop format (ISO 9126)
  – Activities are proposed to mitigate high-priority risks
  – Budgets and schedules are calculated
  – Testing activities are proposed based on risk coverage versus costs and schedule
2. Prioritise test effort based on risks of product failure
  – Assign risk levels to 'Features to be Tested'
  – Prioritise test execution based on risk
Risk-Based Testing
Why?
• Opportunity for non-testing stakeholders to contribute to priorities and scope
• Ownership and buy-in
• Justify testing activities selected based on ROI
• Justify scope within testing activities
• Maximise schedule
Considerations
• Can be difficult to secure time of relevant stakeholders
• As requirements and scope change, risks need to be revisited
• Traceability and measurement against risks
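The "assign risk levels and prioritise execution" step can be sketched in a few lines. This is a minimal illustration only, not the specific method used in practice; the feature names and the 1–5 likelihood/impact ratings below are hypothetical examples of what stakeholders might agree in a risk workshop.

```python
# Sketch of risk-based test prioritisation: score each 'Feature to be
# Tested' as likelihood x impact (each rated 1-5 by stakeholders), then
# order test execution by descending risk exposure.

def risk_score(likelihood: int, impact: int) -> int:
    """Simple risk exposure: likelihood x impact."""
    return likelihood * impact

def prioritise(features):
    """Return features sorted by risk, highest first."""
    return sorted(features,
                  key=lambda f: risk_score(f["likelihood"], f["impact"]),
                  reverse=True)

# Hypothetical workshop output:
features = [
    {"name": "Payment processing", "likelihood": 4, "impact": 5},
    {"name": "Report export",      "likelihood": 2, "impact": 2},
    {"name": "User login",         "likelihood": 3, "impact": 5},
]

for f in prioritise(features):
    print(f["name"], risk_score(f["likelihood"], f["impact"]))
```

With a scored list like this, a squeezed schedule simply truncates from the bottom, so the least risky features are the ones dropped first.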
Requirements Analysis
What?
• Reviewing system requirements for defects:
  – Incompleteness
  – Inconsistency
  – Ambiguity
  – Redundancy
  – Inaccuracy
Why?
• As much as 60% of all defects in a system's lifetime originate from deficient requirements
• Reworking requirements defects on most software development projects costs more than 40% of total project effort
• Requirements defects may cost between 10 and 200 times as much if detected in a production system, or 10 times as much if detected during testing, compared to detection at the requirements stage
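The cost multipliers quoted above can be made concrete with some simple arithmetic. The $100 baseline below is an assumed figure purely for illustration; only the 10x and 10–200x multipliers come from the slide.

```python
# Illustrative arithmetic for the defect-cost multipliers: cost of fixing
# a requirements defect, relative to catching it at the requirements
# stage. The baseline dollar figure is an assumption.

requirements_fix = 100                         # assumed baseline cost
testing_fix = requirements_fix * 10            # ~10x if found during testing
production_fix_low = requirements_fix * 10     # 10x lower bound in production
production_fix_high = requirements_fix * 200   # 200x upper bound in production

print(testing_fix, production_fix_low, production_fix_high)  # 1000 1000 20000
```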
Requirements Analysis – Behaviour Tree Analysis
• A method of analysing and documenting the requirements and behaviour of large and complex systems
• Focuses on and formalises the detail in each requirement, one at a time
• Aliases are resolved and then requirements are integrated
• Builds a graphical representation of the system, used for communication and verification of requirements
• Has been shown to find 15-30% more defects than other techniques
• Reported to save ~10% of development costs on Defence projects
Integrated Behaviour Tree
(Diagram: satellite control system modelled from 23 pages of requirements text; problems/issues highlighted – yellow for implied behaviour, red for missing behaviour.)
Requirements Analysis – Behaviour Tree Analysis
Considerations
• Insufficient skilled resources and training courses available for BTA
• Tool support is required for large systems
• Can be challenging to get involved at the requirements phase
• Any formal requirements evaluation technique will provide value
Certification
What?
• Industry-recognised training programs aimed at building testing competence
Why?
• Baseline team competence
• Leverage existing resources
• Cheaper than hiring
• Forms part of professional development
• Reward for staff
Certification
Options:
1. ISEB/ISTQB
  • Quicker
  • Run in many countries
  • Supported by international experts
2. CSTP
  • Updated more quickly
  • Significant practical focus
  • Developed and presented by practitioners
Certification
Considerations
• Implementation of learned skills
• Time to perform training
• Feedback
• In-house vs public courses
• Course customisation
• Other training options
Test Process Assessment
What?
• Benchmark process capability using a reference model to identify improvement opportunities
Why?
• Establish a long-term improvement plan
• Cannot measure improvements without a baseline
• Don't know what you don't know
• Used to support the business case for improvement
Test Process Assessment
Common test process improvement reference models:
• Testing Maturity Model (TMM)
• Test Process Improvement Model (TPI)
• Test Improvement Model (TIM)
• Minimal Test Practice Framework (MTPF)
Other:
• Test Organisation Maturity Model (TOM)
Test Process Improvement (TPI®)
• Key Areas (KAs): 20
• Levels: 58
• Checkpoints: > 200
• Improvement suggestions: many
The key areas and their levels are plotted in the Test Maturity Matrix.
Test Maturity Matrix
Key Area (maturity levels):
1 Test Strategy: A–D
2 Life-cycle model: A–B
3 Moment of involvement: A–D
4 Estimating and planning: A–B
5 Test specification techniques: A–B
6 Static test techniques: A–B
7 Metrics: A–D
8 Test tools: A–C
9 Test environment: A–C
10 Office environment: A
11 Commitment and motivation: A–C
12 Test functions and training: A–C
13 Scope of methodology: A–C
14 Communication: A–C
15 Reporting: A–D
16 Defect management: A–C
17 Testware management: A–D
18 Test process management: A–C
19 Evaluation: A–B
20 Low-level testing: A–C
(In the matrix, each level is positioned on a 0–13 scale, grouped into Controlled, Efficient and Optimising bands.)
Our Approach
Inputs:
• TOM Surveys
• Organisation Documentation
• Project Documentation
• Interviews
• Tool Demonstrations
• Application Demonstrations
These feed into the TPI Assessment, which produces the TPI Report.
Test Automation
Test automation is the use of software to control test execution.
Why perform test automation?
• Shorten duration of testing
• Greater coverage
• Improve productivity of scarce resources
• Improves morale
• Reduces tester error and oversights
• Supports unattended testing
• Provides a detailed test log and audit trail
• Test scripts are an asset
Automation maturity levels:
1. Capture Playback
2. Scripted
3. Parameterised
4. Table-Driven
5. Keyword-Driven
6. Model-Based
Test Automation
Considerations
• Time and effort to set up tests initially
• Maintenance of test cases
• Learning curve
• Doesn't eliminate manual testing
• Easy to focus on tests that are simple to automate
• Investment cost
Test Automation – Effort Comparison

Typical effort ratios (time units):
Activity | Manual | Automated
Initial Test Preparation | 1 | 2 - 3
Test Execution | 1 | .1 - .2
Test Maintenance | 1 | 2 - 3
Re-Test | 1 | .1 - .2
Test Results Checking | 1 | .1 - .2
Test Documentation | 1 | .1 - .2

Example (minutes):
Activity | Manual | Automated
Initial Test Preparation | 20 | 60
Test Execution (1st time) | 30 | 6
Test Maintenance (3 times) | 5 x 3 | 15 x 3
Re-Test (3 times) | 30 x 3 | 6 x 3
Test Results Checking (all 4 tests) | 10 x 4 | 5 x 4
Test Documentation | 20 | 4
Total | 215 | 157
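The break-even logic behind these ratios can be sketched directly: automation costs more up front but less per run, so it pays off after enough re-test cycles. The ratio values used below (2.5x setup, 0.15x execution) are illustrative mid-points of the ranges above, not figures from the deck.

```python
# Minimal break-even sketch for test automation: find the smallest number
# of test runs after which the automated approach is cheaper overall.

import math

def break_even_runs(manual_setup, manual_run, auto_setup, auto_run):
    """Smallest run count where automation's total cost drops below manual."""
    extra_setup = auto_setup - manual_setup      # automation's up-front penalty
    saving_per_run = manual_run - auto_run       # automation's per-run saving
    return math.ceil(extra_setup / saving_per_run)

# Manual: 1 unit setup, 1 unit per run; automated: 2.5 units setup, 0.15 per run.
print(break_even_runs(1.0, 1.0, 2.5, 0.15))  # 2
```

In other words, under these assumed ratios automation already wins by the second execution of a stable test; heavy maintenance (the 2–3x row) is what pushes the break-even point out in practice.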
Keyword-Driven Automation (Hans Buwalda)
What?
• Separates the technical script implementation from the test steps and data
• Individual user actions are implemented as keywords
• Keywords are grouped together to form test scenarios
Why?
• Isolates skill needs, making it easier to find suitable resources
• Non-technical testers can contribute to automation by defining new test scenarios
• The framework reduces maintenance overheads as the application changes
• Automation scenarios are quicker to develop than in less mature approaches
• Scenarios read like a test design, serving a dual role as test documentation
• A change in tool set does not result in changes to test scenarios
Keyword-Driven Automation
(Framework diagram.)
Keyword-Driven Automation
Considerations
• Time to set up the initial framework
• Requires technical support to build and maintain library routines for each keyword
• Determine test utility suitability early
• Plan for maintenance
Model Based Testing
What?
• Model-based testing involves the use of a model of system behaviour to generate test sequences
Why?
• Can generate very high volumes of test scenarios
• Coverage can be traced back to the requirements from which the model was derived
• Significant reuse as business rules change
State Machine
(Diagram: state machine for file operations – Create, Delete, Select All, Invert Selection – over 2 files.)
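Test generation from a state machine like this can be sketched as a walk of the transition graph until every transition is covered. The two-state model below is a hypothetical reduction of the Create/Delete example to keep the sketch short; it is not the actual model from the slide.

```python
# Minimal model-based generation sketch: walk a state machine greedily
# until every (state, action) transition has been exercised at least once.

MODEL = {  # state -> {action: next_state}
    "no_file":  {"create": "one_file"},
    "one_file": {"delete": "no_file", "select_all": "one_file", "invert": "one_file"},
}

def all_transitions_walk(start):
    """Return an action sequence covering every transition in MODEL."""
    total = sum(len(edges) for edges in MODEL.values())
    covered, walk, state = set(), [], start
    while len(covered) < total:
        for action, nxt in MODEL[state].items():
            if (state, action) not in covered:
                covered.add((state, action))
                walk.append(action)
                state = nxt
                break
        else:
            # every outgoing edge already covered: take any edge to move on
            action, state = next(iter(MODEL[state].items()))
            walk.append(action)
    return walk

print(all_transitions_walk("no_file"))
```

Even this toy model shows why generated volumes grow quickly: adding the second file roughly squares the state space, which is where tool support becomes essential.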
Chinese Postman Walk
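A Chinese Postman walk traverses every edge of the model with minimum repetition. In the special case where the directed model graph is already Eulerian (every state has equal in- and out-degree), the postman walk reduces to an Euler circuit, which Hierholzer's algorithm finds; the general case additionally duplicates some edges first. The toy graph below is an illustrative assumption, not a model from the slides.

```python
# Hierholzer's algorithm on a directed Eulerian multigraph: follow unused
# edges until stuck, backtrack, and stitch the sub-circuits together.

def euler_circuit(graph, start):
    """Return a node sequence using every directed edge exactly once."""
    remaining = {node: list(targets) for node, targets in graph.items()}
    stack, circuit = [start], []
    while stack:
        node = stack[-1]
        if remaining[node]:
            stack.append(remaining[node].pop())  # follow an unused edge
        else:
            circuit.append(stack.pop())          # dead end: record and backtrack
    return circuit[::-1]

# Toy Eulerian graph: edges A->B, A->C, B->C and C->A (twice).
print(euler_circuit({"A": ["B", "C"], "B": ["C"], "C": ["A", "A"]}, "A"))
```

Applied to a test model, each edge is a user action, so the circuit is a single test run that exercises every transition without the redundancy of restarting per transition.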
Markov Models
(Diagram: telephone model with states IDLE, TONE, RINGING and TALKING; transitions pick-up, put-down, incoming call and caller puts-down, each edge carrying a weight – 0.05, 0.95, 0.9, 0.1, 0.6, 0.4, 1.0 – for random selection.)
• Simulation according to user behaviour
• Weighting for each edge for random selection
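Markov-model test generation can be sketched as a weighted random walk: each state's outgoing transitions carry a probability reflecting expected user behaviour, and a test sequence is one simulated path. Since the diagram does not survive in text form, the exact edge-to-weight mapping below is an assumption; only the state and transition names come from the slide.

```python
# Weighted random-walk sketch of the telephone Markov model. The
# per-edge probabilities are assumed; each state's weights sum to 1.

import random

MODEL = {  # state -> [(action, next_state, probability)]
    "IDLE":    [("pick-up", "TONE", 0.95), ("incoming call", "RINGING", 0.05)],
    "TONE":    [("put-down", "IDLE", 1.0)],
    "RINGING": [("pick-up", "TALKING", 0.6), ("caller puts-down", "IDLE", 0.4)],
    "TALKING": [("put-down", "IDLE", 0.9), ("caller puts-down", "TONE", 0.1)],
}

def generate_walk(steps, seed=None):
    """Simulate one usage-weighted test sequence starting from IDLE."""
    rng = random.Random(seed)
    state, walk = "IDLE", []
    for _ in range(steps):
        edges = MODEL[state]
        i = rng.choices(range(len(edges)), weights=[p for _, _, p in edges])[0]
        action, state, _ = edges[i]
        walk.append(action)
    return walk

print(generate_walk(10, seed=1))
```

Because common paths are sampled in proportion to their weights, long simulations exercise the system the way users do, while rare transitions (the 0.05 incoming call) still appear occasionally.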
Model Based Testing
Considerations
• Models can rapidly become quite complex
• Modelling requires significant expertise
• Very little tool support exists
• Can generate many meaningless tests
• Up-front costs are much higher
• Careful test data planning is required
Test Environment Virtualisation
Virtualisation
• Shares the resources of a single computer across multiple environments
For example:
• UAT, SFT, training and development environments may all share the same physical environment
Why?
• Reduce costs:
  – Hardware
  – Licensing
  – Space
  – Maintenance
• Increased agility and flexibility
• Quicker to deploy environments
• Ability to roll back to known states and reuse environments earlier
Test Environment Virtualisation
Tool options
• Citrix XenServer
• VMware
• Microsoft Virtual Server
Considerations
• Virtualisation support for your operating system
• Load and performance test environments
• Virtualisation costs
• Peripherals
• Low-level device drivers
• Outsourcing environments
Presentation Overview
• Testing Trends
• Critical Challenges and Solutions
• Testing Techniques and Initiatives
• Typical improvement opportunities
Typical Improvement Opportunities
1. Prioritise testing effort (e.g. risk-based testing)
2. Skill development
3. Earlier involvement in the SDLC
4. Test automation (appropriate adoption and maintenance)
5. Estimation and metrics
Questions?
For further information: Todd Pasley