Selecting an Agile Process:Comparing the Leading Alternatives
Presented at SQuAD, October 15, 2002
By Mike Cohn
All slides copyright 2002, Mountain Goat Software
Presenter background
Spent much of the last 15 years consulting and running contract development projects:
Viacom, Procter & Gamble, NBC, United Nations, Citibank, other smaller companies
Have periodically taken full-time positions: Genomica, McKesson, Arthur Andersen
Diverse background across: internal software vs. shrinkwrap products, web vs. client-server, Java vs. Microsoft languages
Master's degrees in CS and Economics
Background, cont.
Been managing projects since 1987 but remain a programmer at heart
Author or lead author of three books on Java and one on C++ database programming; articles in STQE and CUJ
Today's agenda
What is agility?
Leading agile processes:
FDD
Scrum
Extreme Programming
XBreed
Crystal
DSDM
Final comparisons
A Defined Process
Every task must be completely understood.
When given a well-defined set of inputs, the same outputs are generated every time.
Software development: a defined process?
Is every task completely understood? Are we even getting closer?
Given the exact same inputs (including people):
Will we get the same results every time?
Can we even have the exact same inputs?
Project Noise Level
[Stacey matrix: the requirements axis runs from close to agreement to far from agreement; the technology axis runs from close to certainty to far from certainty. The resulting regions are labeled Simple, Complicated, Complex, and Anarchy.]
Source: Strategic Management and Organizational Dynamics by Ralph Stacey, in Agile Software Development with Scrum by Ken Schwaber and Mike Beedle.
Empirical model of process control
Useful when:
The process cannot be sufficiently described to ensure repeatability
There is so much complexity or noise that the process leads to different outcomes
Expects the unexpected
Exercises control through frequent inspection and adaptation
Empirical model
[Diagram: inputs (requirements, technology, team) feed the process, which operates under controls and produces outputs: incremental product changes.]
Adapted from Agile Software Development with Scrum by Ken Schwaber and Mike Beedle.
Defined vs. Empirical
“It is typical to adopt the defined (theoretical) modeling approach when the underlying mechanisms by which a process operates are reasonably well understood. When the process is too complicated for the defined approach, the empirical approach is the appropriate choice.”
Process Dynamics, Modeling, and Control, Ogunnaike and Ray, Oxford University Press, 1992
The Agile Manifesto
We have come to value:
Individuals and interactions over processes and tools
Working software over comprehensive documentation
Customer collaboration over contract negotiation
Responding to change over following a plan
Individuals and interactions
Individuals and Interactions over Process and Tools:
Adaptive, empowered, self-organizing teams
Absence of phases
Use of minimal planning
Scalable
Continuous process refinement
Adapted from: “Will the Real Agile Processes Please Stand Up”, Ken Schwaber, Cutter Consortium E-Project Management Advisory Service, v. 2, no. 8.
Working software
Working Software over Comprehensive Documentation:
Iterative and incremental
Working software is the primary measure of progress
Artifacts minimized
Customer collaboration
Customer Collaboration over Contract Negotiation:
Customer involvement throughout
Adaptive, empirical customer relationship
Responding to Change
Responding to Change over Following a Plan:
Emergent requirements
Frequent inspections
Feature-Driven Development
Originates in Java Modeling in Color with UML by Coad, Lefebvre, and De Luca (1999)
Peter Coad:
Founder of Togethersoft
Well-known OO methodologist
UML modeler
Palmer and Felsing book in 2002
Features
Serve as the primary unit of work
Similar to XP stories or Scrum backlog items
Small enough to do in two weeks
Feature Set: a collection of features, assigned to a Chief Programmer and her team
Major Feature Set: a domain area; one or more Feature Sets
Example features
A short description of an action of value to users of the system:
Estimate the closing price of a stock.
Change the password for a user.
Calculate the total cost of an order.
Retrieve the room number of a guest.
Format: <action> the <result> <by|for|of|to> a(n) <object>
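The naming template is mechanical enough to check in code. A minimal sketch, using a simplified regular expression of my own; the class and the pattern are illustrative, not part of FDD:

```java
import java.util.regex.Pattern;

// Hypothetical checker for the FDD feature-naming template:
//   <action> the <result> <by|for|of|to> a(n) <object>
class FeatureNameChecker {
    // One or more action words, "the", the result words, a connecting
    // preposition, an article ("a" or "an"), the object, optional period.
    private static final Pattern TEMPLATE = Pattern.compile(
        "\\w+(\\s+\\w+)*\\s+the\\s+\\w+(\\s+\\w+)*\\s+(by|for|of|to)\\s+an?\\s+\\w+\\.?");

    static boolean matches(String feature) {
        return TEMPLATE.matcher(feature.toLowerCase()).matches();
    }

    public static void main(String[] args) {
        System.out.println(matches("Calculate the total cost of an order.")); // true
        System.out.println(matches("Fix some bugs"));                         // false
    }
}
```

All four example features above fit this pattern; a vague work item like "Fix some bugs" does not, which is the point of the template.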
Eight "Best Practices"
Need all 8 to be FDD:
Domain Object Modeling
Developing by Feature
Individual Class Ownership
Feature Teams
Inspections
Regular Builds
Configuration Management
Visible Progress
Five processes
Develop an Overall Model
Build a Features List
Plan by Feature
Design by Feature
Build by Feature
Process characteristics
The first three processes are done sequentially
The remaining two are iterative
Focus is on modeling (UML)
Multiple small teams spin off and work on "feature sets"
Develop an Overall Model
Entry criteria: Chief Architect, Chief Programmers, and Domain Experts selected
Tasks:
Form the modeling team
Conduct a domain walkthrough
Study documents (optional)
Develop small group models
Develop a team model
Refine the overall object model
Write model notes
Verification: Domain Experts provide ongoing evaluation throughout the process
Exit criteria: the Chief Architect is satisfied with the object model
Build a Features List
Entry criteria: an overall model has been developed
Tasks:
Form the Features List team
Build the Features List
Verification: self-assessment by modelers on the Features List team; external verification by Domain Experts as necessary
Exit criteria: Project Manager and Development Manager are satisfied with the Features List
Sample Features List
Making a reservation
Reserve a room for a guest
Look up a rate for a guest
…
Reporting
Calculate RevPAR for a hotel
Calculate RevPAR for a Competitive Set
…
Plan by Feature
Entry criteria: the Features List has been created
Tasks:
Form the planning team
Determine the development sequence
Assign Feature Sets to Chief Programmers
Assign classes to developers
Verification: self-assessment by the Project Manager, Development Manager, and Chief Programmers
Exit criteria: Project Manager and Development Manager are satisfied with the Development Plan
Sample Development Plan

Date      | Chief Programmer | Feature                                         | Feature Set         | Major Feature Set
Aug 2002  | Andrew           | View Internet rates for a hotel                 | Rates               | Reporting
Sept 2002 | James            | View future reservations for a competitive set | Future Reservations | Reporting
Sept 2002 | Tod              | View future reservations for a hotel           | Future Reservations | Reporting
Sept 2002 | Chris            | Update a reservation for a guest               | Reservations        | Interfacing
Aug 2002  | Chris            | Cancel a reservation for a guest               | Reservations        | Interfacing
Aug 2002  | Chris            | Make a reservation for a guest                 | Reservations        | Interfacing
…
Design by Feature
Entry criteria: the Development Plan has been completed
Tasks:
Form a feature team
Conduct a domain walkthrough (optional)
Study the referenced documents (optional)
Develop the sequence diagrams
Refine the object model
Write class and method prologues
Design inspection
Verification: the design inspection
Exit criteria: a successful design inspection
Build by Feature
Entry criteria: the Design by Feature process has been completed for the selected features
Tasks:
Implement classes and methods
Conduct a code inspection
Unit test
Promote to the build
Verification: a successful code inspection and passing the unit tests
Exit criteria: completion of at least one feature that is of value to (visible to) the client
Six Key Roles
Project Manager
Chief Architect
Development Manager
Chief Programmer
Class Owner
Domain Expert
Key roles
Project Manager:
Administrative lead
Reports progress
Manages budgets
Creates and maintains a productive environment
Shields the team from distractions
Ultimate decision-maker on scope, schedule, and resources
Chief Architect:
Responsible for overall system design
Runs collaborative sessions with other designers
Highly technical but also a facilitator
May be split into Domain Architect and Technical Architect roles
Key roles, continued
Development Manager:
Leads day-to-day development activities
Requires good technical skills
Solves problems among Chief Programmers
Responsible for developer resource conflicts
May be combined with Project Manager or Chief Architect
Chief Programmer:
Experienced developer
Participates in analysis and design activities
Leads a team of 3-6 developers
Key roles, continued
Class Owner:
A developer on a team working under a Chief Programmer
Designs, codes, tests, and documents classes
Domain Expert:
Users or analysts with domain knowledge
Go-to resources for developers
Supporting roles
Domain Manager
Release Manager
Language Guru
Build Engineer
Toolsmith
System Administrator
Supporting roles
Domain Manager: leads the Domain Experts (on large projects)
Release Manager:
Tracks items released into new builds
An assistant to the Project Manager
Language Guru:
Knows all aspects of the programming language
Responsible for ensuring correct use of the language
May be a consultant, if needed at all
Supporting roles
Build Engineer: maintains the version control system and build processes
Toolsmith:
Creates tools needed by other individuals
May be a centralized IT team
System Administrator:
Keeps the network and servers running
Supports specialized development tools and equipment
Typically involved in system deployment
Additional roles
Testers
Deployers
Technical Writers
Additional roles
Testers:
Independently verify that the system meets requirements
May be part of the project or a separate group
Deployers:
Plan and carry out physical deployment of the new system
Convert data from the old system
May be part of the project or separate
Technical Writers:
Write online and printed documentation
May be part of the project or separate
Tracking progress
[Each Feature Set is shown as a small status box listing the name of the Feature Set, the number of features in the set, the name of the Chief Programmer, the percentage complete, and the target completion month (e.g., Reservations (3), Tod, 65%, Aug 2002). Color indicates state: Work in Progress, Completed, Attention, or Not Yet Started.]
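The "percentage complete" on such a status box is a roll-up over the set's features. As a sketch, here is an equal-weight average; note that FDD in practice weights defined milestones, so the equal weighting is a simplification of my own:

```java
// Hypothetical roll-up for an FDD-style progress box: a Feature Set's
// percentage complete computed as the average of its features'
// completion percentages. (Real FDD weights specific milestones;
// equal weighting here is an illustrative simplification.)
class FeatureSetProgress {
    static int percentComplete(int[] featurePercents) {
        if (featurePercents.length == 0) return 0;
        int sum = 0;
        for (int p : featurePercents) sum += p;
        return sum / featurePercents.length;
    }

    public static void main(String[] args) {
        // Reservations (3): three features at varying completion
        System.out.println(percentComplete(new int[] {100, 60, 35})); // prints 65
    }
}
```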
So where's the testing?
Testing is conspicuous by its absence. Why?
The FDD authors thought most organizations already have good test practices
Do they? Are they complementary to FDD?
They wanted to address "core development processes"
Isn't testing "core"?
Why else? Testing doesn't sell UML tools
Unit testing
The "Build by Feature" process does require unit testing
The approach is left up to the Chief Programmers
It can be very different on projects with multiple Chief Programmers
FDD requires "regular" builds
Not necessarily continuous builds
Design inspections
Held during the "Design by Feature" process for each feature set
The full team (of one Chief Programmer) participates
Other Chief Programmers may be invited
Code inspections
Not necessarily Fagan inspections
The approach is up to each Chief Programmer
So multiple approaches may be used on the same project
While FDD says code inspections are required, it also says they're not necessary for all code
Done after unit testing is complete
Integration testing
Testing by feature: the Chief Programmer is responsible for end-to-end testing of his feature
This leads to problems ("Do I test this or do you?") on teams with multiple Chief Programmers
Assign a tester to work with the feature team
Traceability and ownership
Traceability: test cases come from the Features List
Testers own complete Feature Sets, not just individual features
How agile is FDD?
Individuals and Interactions:
Adaptive, empowered, self-organizing teams: Not really
Absence of phases: No
Use of minimal planning: No
Scalable: Yes
Continuous process refinement: Not emphasized
Working Software:
Iterative and incremental: Mostly
Working software is the primary measure of progress: No
Artifacts are minimized: Somewhat
How agile is FDD?
Customer Collaboration:
Customer involvement throughout: Yes, but not emphasized
Adaptive, empirical customer relationship: Yes
Responding to Change:
Emergent requirements: No
Frequent inspection: Yes
Scrum
From "The New New Product Development Game," Harvard Business Review, 1986:
“The… ‘relay race’ approach to product development…may conflict with the goals of maximum speed and flexibility. Instead a holistic or ‘rugby’ approach—where a team tries to go the distance as a unit, passing the ball back and forth—may better serve today’s competitive requirements.”
Wicked Problems, Righteous Solutions by DeGrace and Stahl, 1990
This is where Scrum was first mentioned in a software context.
Scrum origins
Jeff Sutherland:
Initial Scrums at Easel Corp in 1993
IDX and nearly 600 people doing Scrum
Not just for trivial projects: FDA-approved, life-critical software for x-rays and MRIs
Ken Schwaber:
ADM
Initial definitions of Scrum at OOPSLA 96 with Sutherland
Mike Beedle:
Scrum patterns in PLOPD4
Characteristics
Self-organizing teams
Product progresses in a series of month-long "sprints"
Requirements are captured as items in a "product backlog" list
No specific engineering practices prescribed
Uses generative rules to create an agile environment for delivering projects
Overview
[Diagram: the Product Backlog, as prioritized by the Product Owner, feeds the Sprint Backlog of tasks expanded by the team. The team works in 30-day sprints, synchronizing every 24 hours at the Daily Scrum Meeting, and each sprint delivers demonstrable new functionality.]
Source: adapted from Agile Software Development with Scrum by Ken Schwaber and Mike Beedle.
The Scrum Master
Represents management to the project
Typically filled by a Project Manager or Team Leader
Responsible for enacting Scrum values and practices
Main job is to remove impediments
The Scrum Team
Typically 5-10 people
Cross-functional: QA, programmers, UI designers, etc.
Members should be full-time
There may be exceptions (e.g., a system administrator)
Teams are self-organizing
What do you do if a team self-organizes someone off the team?
No titles
Membership can change only between sprints
Sprints
Scrum projects make progress in a series of "sprints"
Analogous to XP iterations
Target duration is one month, +/- a week or two
The product is designed, coded, and tested during the sprint
Sequential vs. Overlapping Development
Source: “The New New Product Development Game”, Hirotaka Takeuchi and Ikujiro Nonaka, Harvard Business Review, January 1986.
No changes during the sprint
[Diagram: inputs enter the sprint and tested code comes out; change is deflected away from the sprint while it runs.]
Plan sprint durations around how long you can commit to keeping change out of the sprint
Product Backlog
A list of all desired work on the project
Usually a combination of:
Story-based work ("let user search and replace")
Task-based work ("improve exception handling")
The list is prioritized by the Product Owner
Typically a Product Manager, Marketing, an internal customer, etc.
Sample Product Backlog
Sprint Planning Meeting
[Diagram: the Product Owner, Scrum Team, management, and customers bring the Product Backlog, team capabilities, business conditions, technology, and the current product into the Sprint Planning Meeting; out of it come the Sprint Goal and the Sprint Backlog.]
The Sprint Goal
A short "theme" for the sprint:
Database application: "Make the application run on SQL Server in addition to Oracle."
Life sciences: "Support features necessary for population genetics studies."
Financial services: "Support more technical indicators than company ABC with real-time, streaming data."
From Sprint Goal to Sprint Backlog
The Scrum team takes the Sprint Goal and decides what tasks are necessary
The team self-organizes around how they'll meet the Sprint Goal
The manager doesn't assign tasks to individuals
Managers don't make decisions for the team
The Sprint Backlog is created
Sample Sprint Backlog
Sprint Backlog during the Sprint
Changes:
The team adds new tasks whenever necessary to meet the Sprint Goal
The team can remove unnecessary tasks
But the Sprint Backlog can only be updated by the team
Estimates are updated whenever there's new information
Sprint Burndown Chart
[Chart: remaining effort in hours (y-axis, 0 to 900) plotted by date across the sprint (5/3/2002 to 5/31/2002). The values fall from 752 toward 0 (e.g., 752, 762, 664, 619, 304, 264, 180, 104, 20, 0), with an early uptick as new work is discovered.]
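Each point on a burndown chart is a simple quantity: the sum of the remaining task estimates on the Sprint Backlog on that day. A minimal sketch (the class and method names are my own):

```java
import java.util.List;

// Hypothetical burndown calculator: each day, sum the remaining
// estimates on the Sprint Backlog and plot that total against time.
class Burndown {
    // Remaining hours across all Sprint Backlog tasks for one day
    static int remainingHours(List<Integer> taskEstimates) {
        int total = 0;
        for (int hours : taskEstimates) {
            total += hours;
        }
        return total;
    }

    public static void main(String[] args) {
        // Day 1: three backlog tasks still open
        System.out.println(remainingHours(List.of(16, 8, 12))); // prints 36
        // Later: one task done, another re-estimated upward,
        // which is how the chart can tick up mid-sprint
        System.out.println(remainingHours(List.of(0, 14, 12))); // prints 26
    }
}
```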
Daily Scrum meetings
Parameters: daily, 15 minutes, stand-up, not for problem solving
Three questions:
1. What did you do yesterday?
2. What will you do today?
3. What obstacles are in your way?
Chickens and pigs are invited
Helps avoid other unnecessary meetings
Only pigs can talk
Questions about Scrum meetings?
Why daily? "How does a project get to be a year late?"
"One day at a time." (Fred Brooks, The Mythical Man-Month)
Can Scrum meetings be replaced by emailed status reports? No:
The entire team sees the whole picture every day
They create peer pressure to do what you say you'll do
Constraints
A complete list of constraints put on the team during a Sprint:
<end of list>
Sprint Review Meeting
The team presents what it accomplished during the sprint
Typically takes the form of a demo of new features or underlying architecture
Informal: 2-hour prep time rule
Participants: customers, management, Product Owner, other engineers
Testing & Scrum
Scrum doesn't specify any particular engineering practices
However, each sprint is required to produce ready-to-use code
Heavy in-sprint testing is usually applied
Some teams have dedicated testers; others have programmers test everything
Other engineering practices are up to you: automation, code inspection, pair programming, static analysis tools, etc.
Stabilization Sprints
The team focuses entirely on defects
Prepares a product for release
Useful during:
Active beta periods
When transitioning a team to Scrum
If quality isn't quite where it should be on an initial release
Not a part of standard Scrum, just something I've found useful
Without: Sprint 1 | Sprint 2 | Sprint 3 | Sprint 4
With: Sprint 1 | Sprint 2 | Sprint 3 | Stabilization Sprint
Scalability of Scrum
The typical Scrum team is 5-10 people
Sutherland used Scrum in groups of 600+
I've used it in groups of 100+
Scrum of Scrums / Meta-Scrum
How agile is Scrum?
Individuals and Interactions:
Adaptive, empowered, self-organizing teams: Yes
Absence of phases: Yes
Use of minimal planning: Yes
Scalable: Yes
Continuous process refinement: Yes
Working Software:
Iterative and incremental: Yes
Working software is the primary measure of progress: Yes
Artifacts are minimized: Yes
How agile is Scrum?
Customer Collaboration:
Customer involvement throughout: Yes
Adaptive, empirical customer relationship: Yes
Responding to Change:
Emergent requirements: Yes
Frequent inspection: Yes
Extreme Programming (XP)
The Three Extremos:
Kent Beck
Ward Cunningham
Ron Jeffries
The C3 project
Characteristics
"Turning all the dials up to 10"
1-3 week iterations
Stories
On-site customer
Heavy, heavy emphasis on unit testing
Do the simplest thing possible
You Aren't Gonna Need It (YAGNI)
Core values
Communication: many problems can be traced back to communication
Simplicity: what is the simplest [design/code/test/etc.] that will work in this situation? Focus on the known needs of today instead of planning for hypothetical future needs
Feedback: feedback from the system comes through tests and continuously integrated code; customers get feedback through frequent iterations
Courage: the courage to openly say what you believe, and to pursue design and code changes
12 13 Practices
Whole Team (on-site customer)
Small Releases
The Planning Game
Simple Design
Pair Programming
Test-Driven Development
Customer Tests
Refactoring (Design Improvement)
Collective Code Ownership
Coding Standard
Continuous Integration
Metaphor
Sustainable Pace
Practice 1
Whole Team / On-site Customer
Everyone sits together in one room
A real customer sits with the development team
May be a customer proxy when a real customer isn't available (e.g., at an ISV)
If the business can't spare a customer, is the project worth doing?
The customer:
Writes stories
Writes acceptance tests
Stories
The method for expressing functionality in XP
Analogous to use cases or requirements
Also used for tracking progress
Track preferences: keep track of the types of hotel (e.g., Marriott, 4-star) that a customer stays at.
View an existing reservation: present the customer with a list of reservations he's made.
Sort hotels: allow the customer to sort hotels by various attributes (e.g., class, price, name).
Practice 2
Small Releases
Plan only as far in advance as you can see
Adjust the plan as necessary
Each release is as small as possible while still delivering something of value
Typically 1-3 weeks
Releases do not need to be deployed
Practice 3
The Planning Game
[Diagram: business people bring scope, priority, release dates, and release composition; technical people bring estimates, consequences, detailed scheduling, and process. The game repeats across iterations 1, 2, 3, …]
The Cost of Change
"The error [is] typically 100 times more expensive to correct in the maintenance phase than in the requirements phase."
Software Engineering Economics, Barry Boehm, 1981, p. 40
The Cost of Change
[Chart: cost of change vs. project duration. Under a traditional process the cost curve rises steeply over time; XP aims to keep it nearly flat.]
Practice 4
Simple Design
Design only for today
If the future is uncertain, don't code for it today
Do not add infrastructure in this iteration for stories coming in future iterations
Upcoming stories could be cancelled or lowered in priority
YAGNI
Do the simplest thing that could possibly work
Practice 5
Pair Programming
Two programmers at one computer
The driver has the keyboard and focuses on the tactical aspects of writing the code
The partner watches the forest, not the trees: missing tests, integration issues, etc.
They keep each other "honest" (a lot of XP requires great discipline)
Programming is far more than typing
Pairs constantly shift
Practice 6
Test-Driven Development (TDD)
Write the unit tests first, then write the code
"Any program feature without an automated test simply doesn't exist." —Kent Beck
JUnit
A framework for automated unit testing
Programmers write tests in their Java code
JUnit executes TestCases and TestSuites
Provides instant feedback on whether the code works
If each programmer writes JUnit TestCases…
Details are at: www.junit.org
Other xUnit test frameworks exist (VB, HTTP, etc.)
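JUnit itself ships as a separate jar, but the xUnit pattern it popularized (a fixture of test methods plus assertions that report pass or fail) can be sketched in plain Java. The classes below are an illustrative imitation, not JUnit's real API:

```java
// A toy imitation of the xUnit pattern JUnit popularized.
// (Illustrative only: real JUnit 3 tests extend junit.framework.TestCase
// and are executed by JUnit's own TestRunner.)
class MoneyTest {
    static int passed = 0;
    static int failed = 0;

    // xUnit-style assertion: record a failure instead of crashing
    static void assertEquals(int expected, int actual) {
        if (expected == actual) {
            passed++;
        } else {
            failed++;
            System.out.println("FAIL: expected " + expected + " but was " + actual);
        }
    }

    // The production code under test
    static int add(int cents, int moreCents) {
        return cents + moreCents;
    }

    // Test methods: written first under TDD, then code is made to pass
    static void testSimpleAdd() { assertEquals(300, add(100, 200)); }
    static void testAddZero()   { assertEquals(100, add(100, 0)); }

    public static void main(String[] args) {
        testSimpleAdd();
        testAddZero();
        System.out.println(passed + " passed, " + failed + " failed");
    }
}
```

Real JUnit adds what this sketch lacks: automatic discovery of test methods, setUp/tearDown fixtures, and the instant red/green feedback the slide describes.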
Practice 7
Customer Tests
While programmers are programming:
The customer writes an acceptance test for each story
Ideally, a tester is available to automate the test
Sample story card (front): View an existing reservation. Present the customer with a list of reservations he's made.
(Back): 1) Test with a customer with one reservation in the past and two in the future. 2) Test with a customer with no reservations.
Practice 8
Refactoring (Design Improvement)
Refactoring: simplifying or improving the code without changing its behavior
Automated unit tests ensure nothing breaks
This allows programmers to refactor with confidence
"Always leave the code cleaner than you found it."
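A tiny illustration of behavior-preserving refactoring (my own example, not from the deck): an inline discount rule is extracted into a named method, and the tests confirm the observable results are unchanged for every input tried.

```java
// Extract Method refactoring: before and after compute identical
// totals, which the checks in main() verify. The pricing rule itself
// (10% off for stays of 7+ nights) is a made-up example.
class Pricing {
    // Before: the discount decision is buried inline
    static double roomTotalBefore(double rate, int nights) {
        if (nights >= 7) return rate * nights * 0.9; // weekly discount
        return rate * nights;
    }

    // After: the discount decision has a name of its own
    static double roomTotalAfter(double rate, int nights) {
        return rate * nights * discountFactor(nights);
    }

    static double discountFactor(int nights) {
        return nights >= 7 ? 0.9 : 1.0;
    }

    public static void main(String[] args) {
        // The safety net: both versions must agree for every input we try
        for (int nights = 1; nights <= 10; nights++) {
            double before = roomTotalBefore(100.0, nights);
            double after = roomTotalAfter(100.0, nights);
            if (before != after) {
                throw new AssertionError("behavior changed at " + nights + " nights");
            }
        }
        System.out.println("refactoring preserved behavior");
    }
}
```

This is exactly the role the slide gives automated unit tests: they are what makes the "after" version safe to ship.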
Practices 9-11
Collective Code Ownership
Anyone can change any code
In fact, you're required to if you see a better way
Coding Standards
Necessary to support collective ownership and refactoring
Continuous Integration
Integration builds happen at least daily
Ideally (and usually) continuously
Practices 12 and 13
Metaphor
Establish a metaphor for the system
Helps establish a common lexicon and vision
Replaces "architecture" descriptions
Sustainable Pace
Teams work at a pace they can sustain over the long haul
Work overtime only when it's needed and effective
Practices support each other
XP works only because the strengths of one practice shore up the weaknesses of another
Example: refactoring would be too risky if not for:
Collective code ownership
Coding standards
Pair programming
Simple design
Automated unit tests
Continuous integration
40-hour weeks
How agile is XP?
Individuals and Interactions:
Adaptive, empowered, self-organizing teams: Yes
Absence of phases: Yes
Use of minimal planning: Yes
Scalable: Yes
Continuous process refinement: Somewhat
Working Software:
Iterative and incremental: Yes
Working software is the primary measure of progress: Yes
Artifacts are minimized: Yes
How agile is XP?
Customer Collaboration:
Customer involvement throughout: Yes
Adaptive, empirical customer relationship: Yes
Responding to Change:
Emergent requirements: Yes
Frequent inspection: Yes
XBreed
Mike Beedle's combination of Scrum, XP, and patterns
[Diagram: XBreed sits at the intersection of Scrum, XP, and patterns.]
XBreed practices
Scrum of Scrums for team leaders
Planning Game replaced by Scrum
Some YAGNI but not as much as pure XP
Generally use CRC cards for stories but also use cases for complex stories
XBreed practices
Architect role is defined
Strong emphasis on patterns
Weekly technology workshops
A Shared Services team once a second application is started
Crystal
Alistair Cockburn:
A project anthropologist
Interviews project teams around the world
"Software development is a cooperative game of invention and communication." —Alistair Cockburn
Two values
People- and communication-centric
Tools, artifacts, and processes exist only to support the people on the project
Highly tolerant
High or low ceremony; high or low discipline
Two rules
The project must use incremental development
Increments cannot exceed four months
The team must hold pre- and post-increment workshops
Reflect on successes and failures of the process
Mid-increment workshops are encouraged as well
Additional characteristics
- Only for collocated teams
- Different projects need to be run differently
  - There can never be one process
- Use heavier methodologies for larger teams
- Fiddling with the process is a Critical Success Factor
- Two most important CSFs:
  - Communication
  - Community
The Crystal family

(Grid of Crystal variants: team size grows left to right, Clear (up to ~6), Yellow (~20), Orange (~40), Red (~80); criticality grows bottom to top, Comfort (C), Discretionary Money (D), Essential Money (E), Life (L). Cells are labeled accordingly: C6, D6, E6, L6 through C80, D80, E80, L80.)
Where Cockburn thinks agile works

(The same criticality-by-team-size grid, shaded to show the region where Cockburn believes agile processes work.)
Techniques and Artifacts

Techniques
- Engineering techniques are undefined
  - Similar to Scrum
- XP techniques can be added in

Artifacts
- No specific templates defined
- Artifacts suggested, but customize to your own needs
Crystal Clear
- Targeted at D6
  - But works up to E8 or D10
- One team, one office
- Roles
  - Sponsor
  - Senior Designer / Programmer
  - Designer / Programmer
  - User (possibly part-time)
Crystal Clear—Policy Standards

- Software is delivered incrementally
- Progress is measured by code or major decisions
- Automated regression testing
- Some level of user involvement
- Two user demos per release
Crystal Clear—Typical Artifacts
- Annotated Use Cases or Feature Descriptions
- Design Sketches or Notes
- Screen Drafts
- Object Model
- Running Code
- Test Cases
- User's Manual
Crystal Orange
- 10-40 people
- Project duration of 1-2 years
- Time-to-market is critical
- Project is not life-critical
- Desire to communicate with future staff
  - But while minimizing time and cost of doing so
Crystal Orange—Roles
- Sponsor
- Business Expert
- Usage Expert
- Technical Facilitator
- Business Analyst/Designer
- Project Manager
- Architect
- Tester
- Design Mentor
- Lead Designer/Programmer
- Other Designers/Programmers
- UI Designer
- Reuse Point
- Writer
Crystal Orange—Typical Artifacts
- Requirements
- UI Designs
- Release Sequence
- Common Object Model
- Schedule
- Inter-team Specs
- Status Reports
- User's Guide
- Running Code
- Test Cases
- Migration Code
So how do I “do Crystal?”
- Hold a two-day workshop to develop policy statements for your project
- Start with one of the documented variants
  - Crystal Clear, Orange, and Orange-Web
- Do 2-4 month increments
- Constantly adjust the process to be "barely sufficient"
- Reflect at middle and end of each increment
Testing in Crystal
- Product is built in increments (1-4 months)
- In general, testing occurs during the increments
- Automated regression testing is emphasized
  - However, it's an "embellishment"
- Do whatever works for your team & project:
  - Level of formality / documentation
  - Amount of ceremony
  - Timing
How agile is Crystal?
Working Software
- Iterative and incremental: Yes
- Working software is primary measure of progress: Yes
- Artifacts are minimized: Mostly

Individuals and Interactions
- Adaptive, empowered, self-organizing teams: Somewhat
- Absence of phases: Yes
- Use of minimal planning: Yes
- Scalable: Yes
- Continuous process refinement: Yes
How agile is Crystal?
Responding to Change
- Emergent requirements: For C and D projects; less so for E and no for L
- Frequent inspection: Yes

Customer Collaboration
- Customer involvement throughout: Yes
- Adaptive, empirical customer relationship: Yes
DSDM
Dynamic Systems Development Method
Origins
- James Martin's Rapid Application Development book in 1991
  - RAD phases: Requirements Planning, User Design, Construction, Cutover
- DSDM Consortium formed in 1994
  - Put out a collection of best practices that hadn't yet been tried together
  - 220 organizations in Europe
Characteristics
- Highly iterative
- Strong emphasis on prototyping
- Uses timeboxes to control scope
- Strong focus on business value
Current State
- DSDM 4.1 is currently released
- DSDM 4.2 anticipated November/December
- Members "own" the process
  - Must join the consortium and can then vote
Principles

Principle 1: Active user involvement is imperative.

(Chart: user involvement over time. DSDM sustains steady user involvement, avoiding the "spiky sofa" curve of traditional approaches. Adapted from Stapleton.)

Source: Dynamic Systems Development Method, Jennifer Stapleton.
Principles

Principle 2: Teams must be empowered to make decisions.

Principle 3: The focus is on frequent delivery of products.
Principles

Principle 4: Fitness for business purpose is the essential criterion for acceptance of deliverables.

Principle 5: Iterative and incremental development is necessary to converge on an accurate business solution.
Principles

Principle 6: All changes during development are reversible.

Principle 7: Requirements are baselined at a high level.
Principles

Principle 8: Testing is integrated throughout the lifecycle.

Principle 9: A collaborative and cooperative approach between all stakeholders is essential.
Three pizzas and a cheese

(The DSDM lifecycle diagram: a Feasibility Study and a Business Study feed three iterative cycles. Functional Model Iteration: Identify Functional Prototype → Agree Schedule → Create Functional Prototype → Review Prototype. Design & Build Iteration: Identify Design Prototype → Agree Schedule → Create Design Prototype → Review Design Prototype. Implementation: Implement → Train Users → User Approval & User Guidelines → Review Business.)
Sequence of phases
- Feasibility and Business Study are done sequentially
- Can iterate back and forth through the other phases as desired
Feasibility Study
- Done to make sure DSDM is the right approach for the project
  - Is the project urgent?
  - Is the project UI-intensive?
  - Are specs incomplete?
  - Are the users up for it?
- Produces
  - Outline Plan for Development
  - Prototype, if needed
Business Study
- Gain an understanding of business processes
  - ER or class diagrams or ?
- Uses facilitated workshops to gain consensus
- Identify users who will participate throughout the project
- Outline Plan is created
Functional Model & Design and Build Iterations
- Repetitive cycles of: Identify, Agree, Do, Review
- Functional Model
  - Non-production-quality code
  - Analysis artifacts
- Design and Build
  - Production-quality code
An idealized timebox

(Diagram: after a kick-off meeting, the timebox runs through Investigate (15%), Refine (70%), and Consolidate (15%), each part opening with an identify & plan step and closing with a review.)
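Those idealized proportions translate directly into a schedule. A minimal sketch (the day counts are hypothetical; the 15/70/15 split is the idealized one from the slide):

```python
# Split a timebox into DSDM's three parts using the idealized
# 15% / 70% / 15% proportions for Investigate / Refine / Consolidate.

def split_timebox(total_days, proportions=(0.15, 0.70, 0.15)):
    """Return the days allotted to each part of the timebox."""
    parts = ("Investigate", "Refine", "Consolidate")
    return {name: round(total_days * p, 1)
            for name, p in zip(parts, proportions)}

# A 20-day timebox: 3 days investigating, 14 refining, 3 consolidating.
print(split_timebox(20))
```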
Timeboxing requires prioritization: MoSCoW Rules

- Must have
  - Fundamental to the system
- Should have
  - Important requirement with a short-term workaround; would normally be mandatory on a less time-constrained project
- Could have
  - Can be left out of this increment
- Want to have but won't have this time
  - Would like to have this increment, but it can wait for a future increment
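The MoSCoW scheme lends itself to a simple selection rule: fill the timebox in priority order and stop when capacity runs out. A minimal sketch (the requirement names and estimates are hypothetical, not from DSDM's own materials):

```python
# Order requirements by MoSCoW priority and fill a timebox to capacity.
# "Won't have" items are deferred by definition.
PRIORITY_ORDER = {"Must": 0, "Should": 1, "Could": 2, "Won't": 3}

def plan_timebox(requirements, capacity_days):
    """requirements: list of (name, moscow_priority, estimate_days) tuples.
    Returns the names selected for this timebox, Musts first."""
    selected, used = [], 0
    for name, priority, estimate in sorted(
            requirements, key=lambda r: PRIORITY_ORDER[r[1]]):
        if priority == "Won't":
            continue  # waits for a future increment
        if used + estimate <= capacity_days:
            selected.append(name)
            used += estimate
    return selected

reqs = [("login", "Must", 5), ("audit trail", "Could", 4),
        ("password reset", "Should", 3), ("themes", "Won't", 2)]
print(plan_timebox(reqs, 10))  # → ['login', 'password reset']
```

Because the Musts are placed first, an expiring timebox drops Coulds and Shoulds, never the fundamentals.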
Testing during Functional Model Iterations
- Continuous testing
- Items are tested as they are produced
- Heavy focus on usability testing; perhaps even with an HF group
- Usually little emphasis on non-functional aspects
Testing during Design & Build Iterations
- Testing continues
- Components are driven to releasable quality
- Non-functional testing (scalability, performance, stress, etc.) occurs
Implementation Phase
Deployment of actual system into production environment
(Implementation cycle: Implement → Train Users → User Approval & User Guidelines → Review Business)
At end of Implementation Phase

1. Done
2. New business needs are discovered
   - Back to Business Study
3. Low-priority work was skipped
   - Back to Functional Model Iteration
4. A non-functional requirement was only partially fulfilled
   - Back to Design and Build Iteration
When to use DSDM
- Interactive, UI-intensive systems
- Clearly defined user group
- Either small projects, or projects that can be made small by decomposing them
- Strong time constraints
- Requirements can be prioritized
- Requirements are not clear or change frequently
Roles in DSDM
- Business Roles: Executive Sponsor, Visionary, Facilitator, Scribe
- User Roles: Ambassador User, Advisor User
- Technical Roles: Technical Coordinator, Team Leader, Developer, Tester
- Project Manager
(Organization chart:)
- Project Steering Committee: Executive Sponsor, Senior Management
- Project Roles: Project Manager, Technical Coordinator, Visionary
- Team Roles (Development Project Teams): Team Leader, Ambassador User, Developer, Scribe, Tester
- End users (including Advisor Users), User Management, Operations
Testing principles

- Validation: Test at all stages to ensure the system is fit for its intended business purpose.
- Benefit-Directed Testing: Test in priority order. Test the parts that deliver key value first.
- Error-Centric Testing: Remember that tests are run to find errors. Build confidence by finding errors and then having them fixed.
Testing principles

- Test throughout the lifecycle: Test all products throughout all stages. No "test phase." Testing must be planned as an integral activity; testers and users test small parts iteratively and incrementally.
- Independent testing: Testing should be done by someone other than the creator.
- Repeatable testing: Make all tests repeatable. Some tests become obsolete as prototypes evolve; archive tests with extinct prototypes in case they come back to life.
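In modern terms, the repeatable-testing principle amounts to an automated regression suite: the same checks, re-run unchanged after every change. A minimal sketch in Python (the function under test is a hypothetical stand-in, not part of DSDM):

```python
# A tiny automated regression suite. The checks are repeatable by design:
# same inputs, asserted outputs, no manual steps.

def apply_discount(price, percent):
    """Return price reduced by percent, rounded to cents."""
    if not (0 <= percent <= 100):
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100.0), 2)

def run_regression_suite():
    """Re-run the whole suite after every change; archive obsolete checks."""
    assert apply_discount(200.0, 25) == 150.0
    assert apply_discount(99.99, 0) == 99.99
    try:
        apply_discount(10.0, 150)
    except ValueError:
        pass  # out-of-range input is correctly rejected
    else:
        raise AssertionError("out-of-range percent should be rejected")
    return "all regression checks passed"

print(run_regression_suite())
```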
Testing against business goals
- Testing is against a hierarchy of business goals
  - Not truly against requirements
- Each requirement supports one or more business goals to a greater or lesser degree
Risk-based testing
- Typical project constraints force testing to be skipped in some areas
  - Time is critical, so apply test time wisely, not necessarily evenly
- RBT says to plan for this upfront by identifying areas you can skip or test lightly
- Identify – Assess – Plan – Reduce Risk
- Done within each timebox, so if the timebox expires, the most important tests have been performed
- Unit testing performed system-wide
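The Identify, Assess, Plan steps can be sketched with the common risk-exposure formulation (exposure = likelihood x impact); the area names and scores below are hypothetical, not from DSDM's materials:

```python
# Risk-based test ordering: score each area by likelihood x impact and
# test in descending order, so that if the timebox expires, the riskiest
# areas have already been covered.

def order_by_risk(areas):
    """areas: list of (name, likelihood 1-5, impact 1-5) tuples."""
    return sorted(areas, key=lambda a: a[1] * a[2], reverse=True)

def fit_into_timebox(ordered_areas, hours_available, hours_per_area):
    """Take as many of the riskiest areas as the timebox allows."""
    limit = hours_available // hours_per_area
    return [name for name, _, _ in ordered_areas[:limit]]

areas = [
    ("report layout", 2, 1),
    ("payment posting", 4, 5),
    ("login", 3, 4),
    ("help text", 1, 1),
]
ranked = order_by_risk(areas)
print(fit_into_timebox(ranked, 16, 8))  # → ['payment posting', 'login']
```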
Testing
- Level of testing formality is reduced
  - Normally no step-by-step test cases
  - Instead, a list of test conditions
  - Predicted results not listed; rely on the tester's judgment
- A final system test (by the technical team and business users) does occur
- Use of static code analyzers and dynamic analysis tools strongly encouraged
  - e.g., Jtest, BoundsChecker, etc.
How agile is DSDM?
Working Software
- Iterative and incremental: Yes
- Working software is primary measure of progress: Yes
- Artifacts are minimized: Partially

Individuals and Interactions
- Adaptive, empowered, self-organizing teams: Partially
- Absence of phases: No
- Use of minimal planning: Partially
- Scalable: Somewhat
- Continuous process refinement: Yes
How agile is DSDM?
Responding to Change
- Emergent requirements: Yes
- Frequent inspection: Yes

Customer Collaboration
- Customer involvement throughout: Yes
- Adaptive, empirical customer relationship: Yes
Summary
- The most agile processes are: XP, Scrum, XBreed, Crystal
- Less so: DSDM, FDD
But…

- "Being agile" is not necessarily the goal
- Delivering working software is the goal
- Add your own sub-goals about:
  - Speed
  - Quality
  - Schedule predictability
  - Fun
  - Etc.
The Hawthorne Effect
- Western Electric Company, 1927-1932
- Impact of lighting on productivity:
  - With more lighting, productivity went up
  - With less lighting, productivity went up
  - With the same lighting, productivity went up
- "The team gave itself wholeheartedly and spontaneously to cooperation in the experiment."
- On important projects, the team owns the process.
Source: The Social Problems of an Industrial Civilization, Mayo, 1945.
Objections to agile
- It only works with talented people
  - No, but you do need one "level three" developer
  - Can a project with no level 3 developers work with ANY process?

Cockburn's three skill levels:
1. Person learns to follow precise directions and get predictable results.
2. Person learns that there are multiple techniques.
3. Skill is assimilated; the person can move between techniques without conscious thought.
Source: Agile Software Development, Alistair Cockburn, p. 14.
Objections to agile
- It only works on trivial projects
  - IDX
  - Caterpillar
  - We don't yet know what is possible
- It's not appropriate for all projects
  - OK, use it when you can
Objections to agile
- Agile is hacking
  - More emphasis on unit testing in XP than in any other process I've seen
  - Most importantly, programmers will do it
  - Planning is still part of the process
- "Don't confuse more exact with better." —Brian Marick
What to learn from agile
- Communication is key
  - On-site customer, programmers in shared space
  - Communicate in person, not via documents
- Rapid feedback
- Cut out bureaucracy
- "Barely sufficient"
- Short increments
  - 1 week to 3 months
What to learn from agile
- Measure progress only by working code
- Customize the process
- Acknowledge the rapidly decreasing precision of plans
- You Aren't Gonna Need It (YAGNI)
  - Programmers won't need all the architecture they design
  - Customers don't need all the features
- Measure success with ROI, not KLOC
Where to go next?
- General
  - www.agilealliance.com
  - www.mountaingoatsoftware.com
- Crystal
  - alistair.cockburn.us
  - Agile Software Development and Surviving Object-Oriented Projects by Alistair Cockburn
- DSDM
  - na.dsdm.org
Where to go next?
Scrumwww.mountaingoatsoftware.com/[email protected] Software Development with Scrum
Ken Schwaber and Mike BeedleTesting
Where to go next?
- XP
  - www.xprogramming.com
  - http://c2.com/cgi/wiki?ExtremeProgrammingRoadmap
  - extremeprogramming@[email protected]
  - http://www.extremeprogramming.org/
  - Addison-Wesley's XP Series of books
  - A Practical Guide to Extreme Programming by David Astels, Granville Miller, Miroslav Novak
- XBreed
  - www.xbreed.org
My contact information
- Websites
  - www.mountaingoatsoftware.com
  - www.userstories.com