Agile Requirements Methods
CSSE 371 Software Requirements and Specification
Mark Ardis, Rose-Hulman Institute
October 26, 2004



Agile Requirements Methods
CSSE 371 Software Requirements and Specification

Mark Ardis, Rose-Hulman Institute

October 26, 2004


Outline

I. Origin of Agile Methods

II. Extreme Programming

III. How could this work?


I. Origin of Agile Methods


Cartoon of the Day


Spectrum of Methods

Source: "Get ready for agile methods, with care" by Barry Boehm, IEEE Computer, January 2002.


Boehm's Risk Exposure Profile

Source: "Get ready for agile methods, with care" by Barry Boehm, IEEE Computer, January 2002.

Black curve: Inadequate plans

Red curve: Market share erosion


Safety-Critical Profile

Source: "Get ready for agile methods, with care" by Barry Boehm, IEEE Computer, January 2002.

Black curve: Inadequate plans

Red curve: Market share erosion


Agile Profile

Source: "Get ready for agile methods, with care" by Barry Boehm, IEEE Computer, January 2002.

Black curve: Inadequate plans

Red curve: Market share erosion


Agile Manifesto

• We are uncovering better ways of developing software by doing it and helping others do it. Through this work we have come to value:
  – Individuals and interactions over processes and tools

– Working software over comprehensive documentation

– Customer collaboration over contract negotiation

– Responding to change over following a plan

• That is, while there is value in the items on the right, we value the items on the left more.


II. Extreme Programming


Twelve Practices

1. The Planning Game

2. Small releases

3. Metaphor

4. Simple design

5. Testing

6. Refactoring

7. Pair programming

8. Collective ownership

9. Continuous integration

10. 40-hour week

11. On-site customer

12. Coding standards


1. The Planning Game

• Business people decide:
  – scope
  – priority
  – release dates

• Technical people decide:
  – estimates of effort
  – technical consequences
  – process
  – detailed scheduling


The Planning Game

• Collect User Stories on cards

• Stories are written by customers

• Stories generate tests


Estimating

• Be concrete

• No imposed estimates

• Feedback: compare actuals to estimates

• Re-estimate periodically


Scheduling

• Each story gets an estimate of effort

• Customers decide which stories are most important

• Programmers calculate how long each release will take
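The arithmetic behind this slide can be sketched in a few lines. The story names, effort estimates (in ideal days), and team velocity below are illustrative assumptions, not figures from the lecture:

```python
# Hypothetical stories the customer has prioritized for the next release:
# (name, customer priority, programmer's effort estimate in ideal days).
stories = [
    ("login", 1, 3),
    ("search", 2, 5),
    ("reports", 3, 8),
]

VELOCITY = 4  # assumed ideal days of work the team finishes per week

def release_length_weeks(stories, velocity=VELOCITY):
    """Programmers compute release length from the summed estimates."""
    total = sum(estimate for _, _, estimate in stories)
    # Round up to whole weeks.
    return -(-total // velocity)

print(release_length_weeks(stories))  # 16 ideal days at 4 days/week -> 4
```

Customers then trade stories in and out until the computed release length fits the release date they want.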


2. Small Releases

• Every release should be as small as possible

• Every release has to completely implement its new features


Waterfall to XP Evolution

Source: "Embracing change with extreme programming" by Kent Beck, IEEE Computer, October 1999.


3. Metaphor

• Each XP project has its own metaphor, e.g.:
  – the naive metaphor (the system described in its own domain terms)
  – "the system is a spreadsheet"

• Metaphor replaces architecture as the view from 10,000 feet

• Metaphor replaces vision statement


5. Testing

• Any feature without an automated test does not exist.

• Programmers need confidence in correct operation

• Customers need confidence in correct operation

• Develop test first, before code


Tools for Testing

• Test harnesses for various programming languages

• Simplify job of creating and running the tests
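As one concrete illustration, Python's built-in unittest module is such a harness. The discount story and the apply_discount function below are hypothetical; in test-first style the test is written before the code it exercises:

```python
import unittest

# Test written first, from a hypothetical user story card:
# "As a customer I can apply a 10% discount to an order."
class DiscountTest(unittest.TestCase):
    def test_ten_percent_discount(self):
        self.assertEqual(apply_discount(100.0, 0.10), 90.0)

def apply_discount(total, rate):
    """Minimal code written after the test: just enough to make it pass."""
    return round(total * (1 - rate), 2)
```

Running `python -m unittest` discovers and runs the test; the harness, not the programmer, does the bookkeeping of collecting tests and reporting failures.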


9. Continuous Integration

• Integrate and test every few hours, at least once per day

• All tests must pass

• Easy to tell who broke the code
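The integrate-and-test rule can be sketched as a gate that accepts a change only when the whole shared suite passes; everything here (the function names, the toy suite) is an illustrative assumption, not XP tooling:

```python
# Hypothetical integration gate: a pair's change is accepted only if
# every test in the shared suite still passes after merging it.

def run_suite(tests):
    """Run (name, callable) tests; a test passes by returning True."""
    return [name for name, test in tests if not test()]

def integrate(change_author, tests):
    failures = run_suite(tests)
    if failures:
        # All tests must pass; the latest change is the prime suspect,
        # which is what makes it easy to tell who broke the code.
        return f"rejected: {change_author} broke {failures}"
    return "integrated"

suite = [("adds", lambda: 1 + 1 == 2), ("concats", lambda: "a" + "b" == "ab")]
print(integrate("pair-1", suite))  # integrated
```

Integrating every few hours keeps each merge small, so a failing gate points at one recent change rather than a week of accumulated work.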


11. On-Site Customer

• Real customer will use the finished system

• Programmers need to ask questions of a real customer

• Customer can get some other work done while sitting with programmers


III. How could this work?


1. The Planning Game

• You couldn't start with only a rough plan

• Unless:
  – customers updated the plan based on the programmers' estimates
  – short releases (2) revealed any mistakes in the plan
  – the customer was sitting with programmers (11) to spot trouble


2. Small Releases

• You couldn't release new versions so quickly

• Unless:
  – the Planning Game (1) helped you work on the most valuable stories
  – you were integrating continuously (9)
  – testing (5) reduced the defect rate


3. Metaphor

• You couldn't start with just a metaphor

• Unless:
  – you got feedback on whether the metaphor was working
  – your customer was comfortable talking about the system in terms of the metaphor
  – you refactored continually (6) to refine understanding


5. Testing

• You couldn't write all those tests

• Unless:
  – the design was as simple as possible (4)
  – you were programming with a partner (7)
  – you felt good seeing all those tests running
  – your customer felt good seeing all those tests running


9. Continuous Integration

• You couldn't integrate every few hours

• Unless:
  – you could run tests quickly (5)
  – you programmed in pairs (7)
  – you refactored (6)


11. On-Site Customer

• You couldn't have a real customer sitting by the programmers full-time

• Unless:
  – they could produce value by writing functional tests
  – they could produce value by making small-scale priority and scope decisions