Feedback-Focussed Process Improvement
Neil Thompson, Thompson Information Systems Consulting Ltd, UK
14th European Conference on Software Testing, Analysis & Review (EuroSTAR)
4-7 December 2006: Manchester, England, UK
Track Session T6
Can Process Improvement for Information Systems learn from these?
[Photos: Toyota Celica GT4; Toyota Prius]
Contents of presentation
1. Traditional process improvement in STAR
2. "New" methods in manufacturing: how Toyota has been so successful; comparison with Goldratt's Theory of Constraints
3. How that new paradigm translates into IT/IS:
   • Agile methods
   • Lessons for process improvement in general…
4. Example: extending TPI® (using Strengths, Weaknesses, Opportunities & Threats)
5. Thinking tools & feedback loops
6. Tipping points
7. Other ways to apply the above (eg TMM℠, TOM™, DIY)
8. Integrating Goldratt-Dettmer Thinking Tools into Systems Thinking
9. Implications for Time-Cost-Quality-Scope
"Traditional" Process Improvement in Information Systems Review & Testing

Sources: TMM℠: http://www.stsc.hill.af.mil/crosstalk/1996/09/developi.asp; TPI®: based on http://www.sogeti.nl/images/TPI_Scoring_Tool_v1_98_tcm6-30254.xls; TOM™: based on http://www.evolutif.co.uk/tom/tom200.pdf, as interpreted by Reid, S. Test Process Improvement: An Empirical Study (EuroSTAR 2003)

[Diagram: the three models (Test Maturity Model℠, Test Process Improvement®, Test Organisation Maturity™) compared on predefined subject areas, maturity levels, medical analogies and degree of flexibility.]
How Toyota progressed through Quality to global dominance, and now Innovation
• Quality (top reputation):
  – has dominated the JD Power satisfaction survey for a decade
  – Toyota Production System (TPS): 14 principles across Philosophy, Problem-Solving, Process, and People & Partners
• Global dominance:
  – market value > GM, Chrysler & Ford combined
  – on track to become (2006) the world's largest-volume car manufacturer
• Innovation (fast):
  – Lexus: invaded the "quality" market and won
  – Prius: not evolutionary but revolutionary, yet launched 2 months early and sold above expectations
  – Toyota Product Development System (TPDS): 13 (!) principles across Process, People, and Tools & Technology
Toyota's TPS & TPDS merged with the Balanced Scorecard
(Balanced-Scorecard quality dimensions: USER Quality, FINANCIAL Quality, PRODUCT Quality)

PHILOSOPHY (Long-Term):
1. …even at short-term expense

PROCESS (Eliminate Waste):
2. Continuous process flow to surface problems
3. Pull to avoid over-production
4. Level workload
5. Stop-to-fix: "right first time"
6. Standardised tasks for continuous improvement & empowerment
7. Visual control to see problems
8. Reliable tested technology that serves Process & People

PEOPLE & PARTNERS (Respect, Challenge & Grow):
9. Leaders
10. People & Teams
11a. Partners
11b. Suppliers

PROBLEM-SOLVING (Continuous Learning & Improvement):
12. See for yourself to thoroughly understand
13. Decide slowly (all options) by consensus
14. Learning organisation via reflection & improvement

[The diagram also lists TPDS elements: functional expertise & cross-functional integration; build a learning culture; tools.]
The "new" paradigm in manufacturing: value flow, pull not push, problem-solving

GOLDRATT: Drum-Buffer-Rope; maximise throughput; Critical Chain management; monitoring buffers; cause-effect trees; conflict resolution diagrams; identify the constraint, "elevate" & iterate

TOYOTA: Takt (rhythm); low inventory ("lean"); Just-In-Time; minimise waste; Andon (stop and fix); Kanban cards; tagging slow movers; one-page metrics; chain of five "why"s; Plan-Do-Check-Act
Related TPS principles: 2. Continuous process flow to surface problems; 3. Pull to avoid over-production; 4. Level workload; 7. Visual control to see problems; 12. See for yourself to thoroughly understand; 13. Decide slowly (all options) by consensus; 14. Learning organisation via reflection & improvement

• Customer-defined value (to separate value-added from waste)
• And now these principles have been successfully applied beyond actual manufacturing, into product development
• But what about development of software?...
The new paradigm in IS development: agile methods
• Alistair Cockburn:
  – increasing feedback & communication reduces the need for intermediate deliverables
  – efficiency is expendable in non-bottleneck activities
• Mary & Tom Poppendieck:
  – map the value stream to eliminate waste
  – Critical Chain project management
  – decide as late as possible
• David J Anderson:
  – throughput of value through the stages of specification, development & test
  – "stratagrams" (my term)
• But whether or not we are using agile methods, we can use new-paradigm principles to improve processes…

Sources: Cockburn, A. Agile Software Development (Addison-Wesley Pearson Education 2002); Poppendieck, M. & T. Lean Software Development (Addison-Wesley 2003); Anderson, David J. Agile Management for Software Engineering (Prentice Hall 2004)
Extending the new paradigm to "STAR": by rearranging TPI's key areas…
…we can begin to see cause-effect trees…

1. Test strategy; 2. Lifecycle model; 3. Moment of involvement; 4. Estimating & planning; 5. Test spec techniques; 6. Static techniques; 7. Metrics; 8. Test automation; 9. Test environment; 10. Office environment; 11. Commitment & motivation; 12. Test functions & training; 13. Scope of methodology; 14. Communication; 15. Reporting; 16. Defect management; 17. Testware management; 18. Test process management; 19. Evaluation; 20. Low-level testing
Cause-effect trees: can start with TPI's inbuilt dependencies
…eg for getting to at least level A throughout (slightly simplified)

[Diagram: the 20 key areas with the level targets that the dependencies link, eg 1. Test strategy: A: single high-level test; 2. Lifecycle model: A: plan, spec, exec; 3. Moment of involvement: A: completion of test basis; 4. Estimating & planning: A: substantiated; 5. Test spec techniques: A: informal techniques, B: formal techniques; 7. Metrics: A: product, for project; 9. Test environment: A: managed-controlled; 11. Commitment & motivation: A: budget & time, B: testing integrated in project organisation; 12. Test functions & training: A: testers & Test Manager; 13. Scope of methodology: A: project-specific; 14. Communication: A: internal; 15. Reporting: A: defects, B: progress, activities, prioritised defects; 16. Defect management: A: internal; 18. Test process management: A: planning & execution, B: + monitoring & adjustment. See the code sketch below.]
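To make such dependency trees concrete, a dependency graph plus a topological sort gives an improvement order in which each key area is tackled only after its prerequisites. A minimal sketch in Python; the particular edges below are an illustrative assumption for this sketch, not TPI's official dependency matrix:

```python
from graphlib import TopologicalSorter, CycleError  # stdlib, Python 3.9+

# Illustrative subset of level-A dependencies between TPI key areas
# (assumed for this sketch): each area maps to its prerequisite areas.
depends_on = {
    "1. Test strategy": ["3. Moment of involvement"],
    "4. Estimating & planning": ["1. Test strategy", "2. Lifecycle model"],
    "15. Reporting": ["16. Defect management"],
    "18. Test process management": ["2. Lifecycle model", "15. Reporting"],
    "2. Lifecycle model": [],
    "3. Moment of involvement": [],
    "16. Defect management": [],
}

try:
    # A valid improvement order: every area appears after its prerequisites.
    for step, area in enumerate(TopologicalSorter(depends_on).static_order(), 1):
        print(f"step {step}: work towards level A in {area}")
except CycleError as exc:
    # If the tree has "re-rooted" into a loop (see a later slide), no simple
    # ordering exists and the loop must be broken first.
    print("dependency loop detected:", exc.args[1])
```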
Can add extra "key areas", lifecycle inputs & outputs, general categories
…eg TPI / TMap's four "cornerstones"

[Diagram: the 20 key areas (as on the previous slide) grouped under the four cornerstones (Lifecycle in general; Techniques in general; Organisation in general; Infrastructure in general), with added areas "+ Test data" and "+ Risk-Based STAR", framed by "Inputs & Influences on STAR" and "Outputs from STAR".]
Can go beyond the fixed questions: SWOT each subject area
(small Post-it® notes are good for this)
Note: solid borders denote items as in TPI; dashed borders denote additional items.

SWOT for "4. Estimating & planning":
• STRENGTHS: Monitored, and adjustments made if needed
• WEAKNESSES: Not substantiated, just "we did it as in the previous project"; too busy for well-considered estimating & planning

SWOT for "Inputs & Influences on STAR":
• WEAKNESSES: System requirements are agreed too late; system specs & designs are defective, just timeboxed; system specs are heavy text documents
• OPPORTUNITIES: Some managers are considering agile methods; business analysts may be motivated by UML training
• THREATS: The most experienced business analysts are leaving, and more may follow; release dates are fixed; can't recruit more staff; the squeeze on testing is likely to worsen

(A data-structure sketch of this follows.)
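For readers who prefer something executable to Post-it notes, the same idea fits a simple data structure, which also supports the "nested" aggregation mentioned on a later slide. A minimal sketch in Python; the sample entries come from this slide, while the class itself is just an assumption about how one might record them:

```python
from dataclasses import dataclass, field

@dataclass
class Swot:
    """SWOT notes for one subject area, eg a TPI key area."""
    area: str
    strengths: list[str] = field(default_factory=list)
    weaknesses: list[str] = field(default_factory=list)
    opportunities: list[str] = field(default_factory=list)
    threats: list[str] = field(default_factory=list)

    def merge(self, other: "Swot") -> "Swot":
        # Aggregating upwards (eg from key areas to the whole lifecycle)
        # is just concatenation, one quadrant at a time.
        return Swot(
            area=f"{self.area} + {other.area}",
            strengths=self.strengths + other.strengths,
            weaknesses=self.weaknesses + other.weaknesses,
            opportunities=self.opportunities + other.opportunities,
            threats=self.threats + other.threats,
        )

estimating = Swot(
    "4. Estimating & planning",
    strengths=["Monitored, and adjustments made if needed"],
    weaknesses=["Not substantiated, just 'as in the previous project'"],
)
inputs = Swot(
    "Inputs & Influences on STAR",
    opportunities=["Some managers are considering agile methods"],
    threats=["Release dates are fixed", "Can't recruit more staff"],
)
print(estimating.merge(inputs))
```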
Some cause-effect trees may "re-root" to form loops
Note: solid borders denote items as in TPI; dashed borders denote additional; items without borders have been added after considering the SWOT.

[Diagram: a cause-effect loop spanning "Inputs & Influences on STAR", "(STAR) Lifecycle in general", "Outputs from STAR", "+ Risk-Based STAR" and key areas 1, 3, 4, 5, 6, 19 & 20, eg: System specs are heavy text documents (and the culture of our testers is to prefer large text documents to diagrams) → text-only format makes defect detection less likely → system specs & designs are defective, just timeboxed ("if they can timebox, so can we") → live systems are buggy → testers spend much time helping diagnose, then retest → management demand inquests, even more time "wasted" (or an opportunity for causal analysis?) → key testers are still involved in "firefighting" the previous release → testing starts later than when the test basis is complete → too busy for well-considered estimating & planning (and also too busy to do Risk-Based STAR) → not substantiated, just "we did it as in the previous project" → work takes longer than "planned" → various adverse effects (+ other areas, + knock-ons) → …and round again. See the sketch below.]
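Such "re-rooting" can be detected mechanically: a cause-effect graph is a digraph, and a depth-first search finds any cycle. A minimal sketch in Python, with the node names condensed from the diagram above:

```python
# "Cause -> effects" edges, condensed from the loop above.
EDGES = {
    "specs are heavy text": ["defect detection less likely"],
    "defect detection less likely": ["specs defective, just timeboxed"],
    "specs defective, just timeboxed": ["live systems are buggy"],
    "live systems are buggy": ["testers diagnose & retest"],
    "testers diagnose & retest": ["firefighting previous release"],
    "firefighting previous release": ["too busy for estimating & planning"],
    "too busy for estimating & planning": ["estimates not substantiated"],
    "estimates not substantiated": ["work takes longer than planned"],
    "work takes longer than planned": ["too busy for estimating & planning"],
}

def find_cycle(edges):
    """Return one cycle as a list of nodes, or None if the digraph has none."""
    on_path, visited, path = set(), set(), []

    def dfs(node):
        on_path.add(node)
        path.append(node)
        for nxt in edges.get(node, []):
            if nxt in on_path:                   # back-edge: the tree re-rooted
                return path[path.index(nxt):] + [nxt]
            if nxt not in visited:
                cycle = dfs(nxt)
                if cycle:
                    return cycle
        on_path.discard(node)
        visited.add(node)
        path.pop()
        return None

    for start in list(edges):
        if start not in visited:
            cycle = dfs(start)
            if cycle:
                return cycle
    return None

print(" -> ".join(find_cycle(EDGES)))
```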
A second example: Test Design

[Diagram: another vicious loop, eg: Too many failures in Live → too busy "firefighting" → no time to attend courses → not trained in formal test techniques (and "informal" techniques are not really defined; formal techniques are difficult even after training) → don't know how to define & use test coverage → don't know actual test coverage → write numerous tests, "to be safe" → tests probably include low-value conditions & cases → test specs are large & "texty" → test specifications are difficult to review → increased likelihood of coverage omissions & overlaps escaping detection and rectification → omissions & overlaps; no time to tabulate coverage from existing scripts, and unable to tabulate coverage for new tests → difficult to select regression tests → test execution takes a long time; when execution time runs short, unsure which tests are safest to omit → add even more tests → … → too many failures in Live.]

Test Design does appear in TPI, TMM & TOM, but does this loop indicate it deserves more prominence?
New paradigm problem-solving: the Goldratt-Dettmer* "Thinking Tools"

What to change (1):
• CURRENT REALITY tree: core problem + (other) root causes → intermediate effects → undesirable effects

What to change to (2, 3):
• CONFLICT RESOLUTION diagram (2): prerequisites + conflicts → requirements + INJECTIONS → objective
• FUTURE REALITY tree (3): current reality + injections → intermediate effects → desired effects

How to change (4, 5):
• PRE-REQUISITES tree (4): obstacles → intermediate objectives → objective
• TRANSITION tree (5): needs + specific actions → intermediate effects → objective

* very slightly paraphrased here
Sources: Dettmer, W. Goldratt's Theory of Constraints (ASQ 1997); Thompson, N. "Best Practices" & Context-Driven: Building a Bridge (STAREast 2003)
Applying the Thinking Tools to information from SWOT analysis
The SWOT method can be "nested", eg aggregate up from individual subject areas to the whole lifecycle. (Use Strengths to help amplify opportunities; use Threats to help identify obstacles.)

Using extracts from both the 1st & 2nd examples:
• CURRENT REALITY: system specs are heavy text documents; the SDLC method does not encourage diagrams; the culture of our testers is to prefer large text documents to diagrams; test specs are large & "texty"; test coverage omissions & overlaps; too many failures in Live.
• CONFLICT RESOLUTION → FUTURE REALITY: can still improve coverage at the macro level with informal techniques (80/20); some managers are considering agile methods; business analysts may be motivated by UML training.
• PRE-REQUISITES: anticipating & overcoming obstacles.
• TRANSITION: action planning. STRATEGIC: improve the SDLC method. TACTICAL: address culture by worked examples of diagrams; include tables & diagrams in test specifications. ONGOING: techniques training & coaching.
A third example: Defect fixing & retesting
Source: Ennis, M. Managing the End Game of a Software Project (STARWest 2000): "understand relationship between metrics".

Then the "risk spider" might be renamed a "correlation amoeba". Let us reverse two labels, to make all five quantities bad things: code turmoil, test failure rate, test completion slowness, defect detection rate, defect backlog. Its shape, changing over time, could show us whether we have a feedback loop, either bad…
[Vicious loop (+): tests fail if they expose defects → failing tests slow completion → failing tests need fixes → too many bad fixes, knock-on faults → more defects add to the backlog.]
…or good:
[Virtuous loop (−): code fixes are good, fixing faults → fewer defects means fewer failing tests → fewer fixes needed → less slowness → fewer defects, so the backlog can reduce.]
(A code sketch of tracking these correlations follows.)
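The "correlation amoeba" can be prototyped by tracking pairwise correlations between the metrics over a sliding window: a link that stays positive suggests the bad loop, one that turns negative suggests the good one. A minimal sketch with numpy; the weekly series are invented purely to exercise the code:

```python
import numpy as np

rng = np.random.default_rng(0)
weeks = 30

# Invented data: in real use these come from build and defect records.
code_turmoil = rng.normal(10.0, 2.0, weeks)
defect_detection_rate = np.concatenate([
    1.5 * code_turmoil[:15] + rng.normal(0, 1, 15),   # early: turmoil breeds defects
    -1.5 * code_turmoil[15:] + rng.normal(0, 1, 15),  # later: fixes are good
])

window = 8
for start in range(weeks - window + 1):
    # Pearson correlation of the two metrics within this window.
    r = np.corrcoef(code_turmoil[start:start + window],
                    defect_detection_rate[start:start + window])[0, 1]
    verdict = "bad loop?" if r > 0 else "good loop?"
    print(f"weeks {start:2d}-{start + window - 1:2d}: r = {r:+.2f}  {verdict}")
```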
Tipping Points
• Reversing the vicious loop into a virtuous loop:
  – the example on the previous slide required only one relationship change…
  – ie the effect of code turmoil on defect detection rate tipping from positive to negative
• Examples of Tipping Points:
  – removing New York graffiti reduced the serious crime rate
  – Stanford "Prison" Experiment: spiral of vindictiveness
• Achieving a Tipping Point involves:
  – concentrating resources on a few key areas
  – a way to make a lot out of a little
  – fresh thinking, sometimes counterintuitive
Note the similarities with Goldratt's Theory of Constraints, and more generally the Pareto principle (80-20). (See the sketch below.)
Source: Gladwell, M. The Tipping Point (Abacus 2000)
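One way to make "only one relationship change" concrete: a loop's polarity is the product of its link signs (equivalent to the odd/even rule stated on a later slide). A minimal sketch in Python; the three-link cycle and its signs are read off the previous example, and whether a reinforcing loop runs viciously or virtuously depends on the direction of travel:

```python
from math import prod

# One cycle from the defect-fixing example; +1 = same-direction link.
links = {
    "defect detection rate -> test failure rate": +1,  # tests fail if they expose defects
    "test failure rate -> code turmoil": +1,           # failing tests need fixes
    "code turmoil -> defect detection rate": +1,       # bad fixes, knock-on faults
}

def polarity(signs):
    # Product of signs: +1 = reinforcing (vicious or virtuous),
    # -1 = balancing (an odd number of opposing links).
    return "reinforcing" if prod(signs) > 0 else "balancing"

print("before the tip:", polarity(links.values()))
# The tipping point: the effect of code turmoil on defect detection rate
# flips from positive to negative (fixes now mostly remove defects).
links["code turmoil -> defect detection rate"] = -1
print("after the tip: ", polarity(links.values()))
```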
Fourth example: Documentation
• Documentation can also become a vicious loop:
  – documentation tends to get out of date
  – "do you want us to fix the documentation or deliver software?"
  – it can then become unclear whether tests passed or failed: what is the system meant to do?
  – no-one reads the documentation any more, so defects in it are no longer detected
  – so it gets even further from reality…
Documentation and flow of value

[Diagram: levels of documentation, pushed by specifiers (Requirements; + Functional Spec; + Technical Design; + Unit/Component specifications; + Test Specifications), set against the flow of fully-working software, pulled by customer demand (Unit/Component-tested → Integrated → System-tested → Accepted → WORKING SOFTWARE).]
Difficulties in problem-solving: conflict resolution (eg for documentation)
Objective: documentation is to help build and maintain a fit-for-purpose system, by knowing and agreeing what is built and what is tested.

CONFLICT: "We need more documentation" vs "We need less documentation".

For more documentation:
• "Reviews are powerful at finding defects early, but it's difficult to review just speech"
• "If it's not written, it can't be signed off"
• "People never read any documentation"… "They will when their memories have faded or when there's a contract dispute"
• "Will the live system be maintained by its developers?" No! Documentation is still needed for maintenance after go-live
• Our users cannot be on-site with the project throughout
• "What documentation is needed for contractual reasons? Still time to negotiate?" Yes!

For less documentation:
• "Signed-off requirements are counterproductive to systems meeting real user needs now"
• "Documented test plans are counterproductive to the best testing"
• Specifications are like inventory: no end value
• "Test reports need to be formal documents"? "Sign-off can be by agreed meeting outcomes"… but "are there few enough people to make frequent widespread meetings practical?" No!
• "Are test analysts writing tests for others to run?" No! Can mix exploratory & scripted testing

Injections to resolve the conflict:
• Documentation varies: need to distinguish necessary from unnecessary, and the quality of documentation, not just the quantity
• Agree in a workshop what documentation is needed
• Documentation doesn't have to be paper: use wikis etc
• Make maximum use of tables & diagrams

Developed further from Daich, GT. Software Documentation Superstitions (STAREast 2002). See also Rüping, A. Agile Documentation (Wiley 2003).
Resolving the "conflict" between agile and outsourcing

CONFLICT: "Agile is the way to go" ("fast delivery is the most important factor for us") vs "Outsourcing / offshoring is the way to go" ("low cost is the most important factor for us").

Objective: the methodology is to help build and maintain a system to an appropriate time-cost-scope-quality/risk balance.
• "Does the time-cost-scope-quality/risk pyramid always apply?" No!

Questions that help resolve the conflict:
• "How much documentation will we need to control the project & products?"
• "How confident are we in knowing requirements?"
• "What are the regulatory requirements?"
• "How will we communicate across inter-company / international boundaries?"
• etc…

Inspired by Guckenheimer, S. with Perez, JJ. Software Engineering with Microsoft Visual Studio Team System (Addison-Wesley Pearson Education 2006)
Can use these principles to focus TPI, TMM, TOM etc…
• TPI: (as in the earlier slides)
  – use "off-question" information directly via SWOT
  – use relationships between key areas to identify causes-effects and loops, and hence the biggest-payback improvements
• TMM:
  – choose between staged & continuous
  – consider moving selected items between levels, up or down
• TOM:
  – the "low = 1" scores are weaknesses (but you may be able to think of others); similarly the "high = 5" scores are strengths
  – look for additional symptoms (via SWOT)
• Generally:
  – look for Tipping Point improvements from among the many candidates
  – seek Tipping Point insights into the change management challenges (Connectors, Mavens & Sellers)
…or invent your own methodology…

[Diagram: a home-grown methodology pairing Risk management (insurance; V-model: what testing is against; list & evaluate risks; use Risk-Based STAR; tailor risks & priorities etc to context factors) with Quality management (assurance; W-model; define & detect errors (UT, IT, ST); give confidence (AT); do reviews & analysis). Supporting practices: refine test specifications progressively (plan based on priorities & constraints; design flexible tests to fit; allow appropriate script format(s); use synthetic + lifelike data); allow & assess for coverage changes; document execution & management procedures; distinguish problems from change requests; prioritise urgency & importance; distinguish retesting from regression testing; use handover & acceptance criteria; define & measure test coverage; measure progress & problem significance; be pragmatic over quality targets; quantify residual risks & confidence; decide process targets & improve over time; define & use metrics; assess where errors were originally made; define & agree roles & responsibilities; use an appropriate skills mix; use independent system & acceptance testers; use appropriate techniques & patterns; plan early, then rehearse-run, acceptance tests; use appropriate tools; improve the efficiency of STAR; build lessons learned into checklists. All feeding: Effective STAR.]

Modified after Thompson, N. "Best Practices" & Context-Driven: Building a Bridge (STAREast 2003)
…or just do simple lifecycle-focussed process improvement
…eg the "7 Habits of Adequately Effective STAR-ers":
(a) MANAGE STAR END-TO-END (whole systems … whole lifecycle; architecture as part of QM/QA)
(b) USE RISK-BASED STAR
(c) USE STRUCTURED REVIEWS (involving roles & checklists)
(d) MANAGE COVERAGE OF TESTS
(e) USE FORMAL TECHNIQUES
(f) USE METRICS AND ACT ACCORDINGLY (including positive feedback)
(g) USE TOOLS APPROPRIATELY

[Diagram maps these habits onto the 20 TPI key areas.]
Process Improvement is itself a reinforcing feedback loop…
…but first we may need to reverse the negative loop of process stagnation (through its Tipping Point):
• "Can't do structured reviews because not trained in risk analysis" → "Now we're trained in risk analysis, we can do structured reviews even better"
• "Can't do test coverage because not trained in formal techniques" → "Now we're trained in formal techniques, we can do test coverage even better"
• "No point in these improvements because not responsible end-to-end" → "Now we've got end-to-end responsibility, these improvements will be even more effective"
• "No point in writing checklists because people know they're just wishful thinking" → "Now we've got root cause analysis, we can make our checklists even better"
Systems thinking: a more holistic view, eg people issues
Is this a vicious or a virtuous feedback loop?

[Diagram: long working hours bring money now (overtime), money later (promotion prospects) and a "Pizza Hero" culture, but also fatigue, health damage, psychological damage (guilt, feelings of inadequacy) and abandoning healthy shopping, regular exercise & social activities. Higher "efficiency" (per week)? So what, there are more hours, but lower effectiveness (per hour) and overall merely "adequate" work: more mistakes, defects, faults and failures in dev & test…, more remedial work needed, more work absorbed, and even more work wanted by management ("stretch goals"; "Mad but Exploitable" culture; "Must need a Time Management Course" culture; "Pizza Parasite" culture). "Zombie" working, alternating illness & hard work, lower output (per year). The project is "delivered" on time, with a tolerable degree of live failures, or too many? Expert, or replaceable? Both? Give up!]
Systems Thinking notation
Each author's notation seems to vary; this is Neil Thompson's, incorporating elements of Gerald Weinberg's & Dennis Sherwood's.

[Diagram: duration of working hours linked to short-term financial reward, long-term financial reward, peer-approval reward, management-approval reward, efficiency (per hour), effectiveness (per week), "coping" mechanisms for fatigue etc, health, degree of live failures and overall acceptability of projects, forming one REINFORCING LOOP and two BALANCING LOOPS (1: quality targets; 2: capacity of staff).]

A loop is balancing if it contains an odd number of opposing links; else it is reinforcing.
Connections between & beyond feedback loops
• Goldratt's Theory of Constraints: sometimes cause-effect trees can form loops
• Systems Thinking: loops can have "dangles"
• So let's fit the two together…! (See the sketch below.)

[Diagram: the defect-fixing loop (code turmoil, test failure rate, test completion slowness, defect detection rate, defect backlog; tests fail if they expose defects; failing tests slow completion; failing tests need fixes; too many bad fixes, knock-on faults; more defects add to the backlog; all links +) with dangles attached: system documentation is too "texty" and everyone is too busy to produce concise overviews & distil expertise → difficult to diagnose faults; everyone too busy to arrange a CM team → no dedicated configuration management team → CM not properly coordinated → mistakes made in CM → don't know whether problems are bad fixes or CM mistakes; difficult to select regression tests.]
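Fitting the two notations together is, in graph terms, separating the strongly connected part (the loop) from whatever hangs off it (the dangles). A minimal sketch using the third-party networkx library, with node names condensed from the diagram above:

```python
import networkx as nx  # third-party: pip install networkx

g = nx.DiGraph()
# The Theory-of-Constraints tree that has closed into a loop:
g.add_edges_from([
    ("defect detection rate", "test failure rate"),
    ("test failure rate", "test completion slowness"),
    ("test failure rate", "code turmoil"),
    ("code turmoil", "defect detection rate"),
    ("code turmoil", "defect backlog"),
])
# Systems-Thinking dangles feeding in or hanging off:
g.add_edges_from([
    ("no dedicated CM team", "CM not properly coordinated"),
    ("CM not properly coordinated", "mistakes made in CM"),
    ("mistakes made in CM", "code turmoil"),
    ("system documentation too texty", "difficult to diagnose faults"),
    ("difficult to diagnose faults", "test completion slowness"),
])

# Strongly connected components bigger than one node are the loops;
# every remaining node belongs to a dangle.
loops = [c for c in nx.strongly_connected_components(g) if len(c) > 1]
dangles = set(g) - set().union(*loops) if loops else set(g)
for loop in loops:
    print("loop:   ", sorted(loop))
print("dangles:", sorted(dangles))
```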
What the new paradigm means for the traditional Time-Cost-Quality-Scope pyramid

OLD PARADIGM: the pyramid of Time, Cost, Scope and Quality/Risk (reprised from Gerrard, P. & Thompson, N. Risk-Based Testing, EuroSTAR 2002).

NEW PARADIGM: a loop of Quality → Speed → Low-cost → Scope, analysed with Current Reality Trees (START HERE), a Conflict Resolution Diagram and Future Reality Trees:
• quality inbuilt not imposed, less waste (eg rework), working software is iterated…
• Mythical Man-Month: small teams can be surprisingly productive
Note: this loop has an even number of opposing links, so it should be reinforcing: but could it be vicious or virtuous?

Inspired by Guckenheimer, S. with Perez, JJ. Software Engineering with Microsoft Visual Studio Team System (Addison-Wesley Pearson Education 2006)
Summary
• Traditional process improvement in testing may be improved by building in principles which made Toyota (and others) global successes
• This involves thinking beyond just STAR: need to consider whole lifecycle and quality regime
• You can either fine-tune an existing method (eg TPI, TMM, TOM) or build your own method
• The principles are not proprietary and require no special training
• Focussing on feedback loops is an example of the Pareto principle (80-20)
Way forward & where to find out more
• Try it for yourself:
  – Strengths, Weaknesses, Opportunities & Threats on Post-it™ notes
  – draw pencil connectors for current & future causes & effects
  – move items around, look for loops
  – what would it take to balance / reverse vicious loops?
• Toyota: (see slide 5)
• Agile methods: (see slide 8)
• Selected reading on Goldratt:
  – Goldratt: mostly "novels" (everyone quotes The Goal), so try this…
  – William Dettmer: the 1st of his 2 books on the Thinking Tools…
• For Systems Thinking:
  – Gerald Weinberg: IT/IS context…
  – Peter Senge: business management context…
  – Dennis Sherwood: many examples of loops…
• Practical application of some new-paradigm principles:
  – Sam Guckenheimer
Thanks for listening!
• Contact details:
Neil Thompson
Thompson Information Systems Consulting Ltd
www.TiSCL.com
[email protected]
Oast House Crescent, Farnham, Surrey, GU9 0NP, England, UK
+44 (0)7000 NeilTh (634584)
• Questions?