We are doing it wrong. 10th international TYPO3 conference Robert Weißgraeber, aexea


"We are doing it wrong" - Aspects of modern Software Development Project Management.


Page 1: "We are doing it wrong."

We are doing it wrong. 10th international TYPO3 conference. Robert Weißgraeber, aexea

Page 2: "We are doing it wrong."

/me Robert Weißgraeber

Principal @ aexea

Page 3: "We are doing it wrong."

what we do:

Diesel EXPOSURE LOW - Sneaker - black I love jeans! These black EXPOSURE LOW sneakers by Diesel are avowed fans of denim and therefore come in a trendy jeans look, as reflected by the traditional denim elements, for instance the coin pocket, the typical rivets and a deep pocket. A declaration of love for the popular denim look: Diesel EXPOSURE LOW sneakers in black.

x10000x!


Page 4: "We are doing it wrong."


@robert_we #foodkoma

Page 5: "We are doing it wrong."

#NoEstimates

http://www.pleacher.com/mp/mgifs/gifs16/imp.jpg

Page 6: "We are doing it wrong."

“We are doing it wrong.”

This talk may contain subversive ideas. You have been warned. Do try this at home.

Page 7: "We are doing it wrong."

!

Aspects of project management in modern software development.

“We are doing it wrong.”

Page 8: "We are doing it wrong."

№? Why do we need to change something?

Page 9: "We are doing it wrong."

The technology adoption rate is getting stellar. Graphic: technology adoption rate in the US (consumer technologies, 10–90%).

Adoption Rates of Consumer Technologies: commercekitchen.com

Page 10: "We are doing it wrong."

Technology life cycles are getting shorter. Graphic: duration of technologies as growth pulses (consumer technologies, 10–90%).

Adoption Rates of Consumer Technologies: commercekitchen.com

Page 11: "We are doing it wrong."

Can you deliver fast enough? Past? Present? Future?

Maersk Agile Journey.

Page 12: "We are doing it wrong."

Rethink the role of software in your business.

• Separation of software & business?

• Software in a support role or as a solution in its own right?

• Adoption of mega-trends? Secure?

• Software as game changer?

Page 13: "We are doing it wrong."

“Today every company is a software company. That includes John Deere and Nordstrom.” – @barry_crist (CEO @ Chef)

Page 14: "We are doing it wrong."

There will be no 4th Industrial Revolution. The evolution is already here.

And it's not coming with a version number attached anymore.

Industry 4.0 graphic: http://www.siemens.com/annual/13/en/company-report/report-industry-solutions/strategic-context/img/130E_StrategieUSA_E_Grafiken02_%5BWeb%5D.png
Jenkins logo adaption: http://www.praqma.com/sites/default/files/cool-jenkins2x3.png

Page 15: "We are doing it wrong."

Not every system is mission critical. (If you don't care about your brand or prospective employees.)

Page 16: "We are doing it wrong."

http://mkhmarketing.files.wordpress.com/2013/01/orange-keep-calm-sign.png

Page 17: "We are doing it wrong."

№1 Metaphors. Our analogies are wrong.

Page 18: "We are doing it wrong."

hdwallpapersinn.com

Page 19: "We are doing it wrong."

http://www.maurerunited.nl/wp-content/uploads/2009/07/090731_day_1024x768.jpg

Page 20: "We are doing it wrong."

Create the impossible. You are not limited by physical boundaries.

http://www.mcescher.com/gallery/impossible-constructions/waterfall/

Page 21: "We are doing it wrong."

other analogies & metaphors

• building a bridge

• comparing software to art

• sports metaphors

• gardening, landscaping

Page 22: "We are doing it wrong."

№2 Extensive Process Governance

Page 23: "We are doing it wrong."

What we do:

• We want predictable and reliable software releases
• Long-term plan/backlog of features
• High inventory of ideas & work in progress
• Extensive prioritization and contracts
• Risk aversion by agents
• Linear process with multiple sign-offs

Page 24: "We are doing it wrong."

№3 Optimizing for Resource Utilization. (*: resource = human beings)

Page 25: "We are doing it wrong."

Resource Planning!

• Backlog of work / orders
• Long-term commitment
• Committed salaries/resources
• Resource planning as a system process

*: resource = human beings

Page 26: "We are doing it wrong."

Client/Business
• fast implementation
• costs
• problem-fitting software („the right one“)
• „the way we work together has to integrate with how our company works“

Dev Shop/Agency
• getting paid
• averting risks
• mid/long-term commitment
• „we have more problems finding developers than fitting projects“

Dev Team
• making a difference by delivering meaningful software
• working with experts
• „craftsmanship“, clean code…
• salary, job security, …
• life is too short for bad jobs

Conflict of interest?

Page 27: "We are doing it wrong."

Stefan Roock

Page 28: "We are doing it wrong."

Little's law (*: resource = human beings)

Lead Time = Work in Progress / Throughput

Page 29: "We are doing it wrong."

Little's law

Work in Progress = Throughput × Lead Time
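Little's law from the two slides above can be sanity-checked in a few lines of Python (the numbers are illustrative, not from the talk):

```python
# Little's law: Work in Progress = Throughput x Lead Time,
# equivalently Lead Time = Work in Progress / Throughput.

def lead_time(wip: float, throughput: float) -> float:
    """Average lead time, given WIP items and throughput (items per day)."""
    return wip / throughput

def work_in_progress(throughput: float, lead_time_days: float) -> float:
    """Average work in progress, given throughput and lead time."""
    return throughput * lead_time_days

# Example: a team finishes 2 features per day and keeps 10 features in flight.
print(lead_time(wip=10, throughput=2))   # 5.0 days from start to done
# Halving WIP at the same throughput halves the lead time:
print(lead_time(wip=5, throughput=2))    # 2.5 days
```

This is the arithmetic behind the WIP limits advocated a few slides later: with throughput fixed, the only way to shorten lead time is to carry less work in progress.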

Page 30: "We are doing it wrong."

Cycle Time as a Function of Utilization (and Batch Size)

Graphic: cycle time grows slowly up to roughly 80% resource utilization, then rises steeply (up to ~30x) as utilization approaches 100%. Source: leanessays.com
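The shape of that curve is standard queueing behavior: in a simple M/M/1 model, average time in the system scales with 1/(1 − utilization). A minimal sketch (the unit service time is an assumption for illustration, not a number from the slide):

```python
# M/M/1 queue: average time in system = service_time / (1 - utilization).
# As utilization approaches 100%, cycle time grows without bound.

def cycle_time(service_time: float, utilization: float) -> float:
    if not 0 <= utilization < 1:
        raise ValueError("utilization must be in [0, 1)")
    return service_time / (1.0 - utilization)

for u in (0.40, 0.60, 0.80, 0.95):
    print(f"{u:.0%} utilized -> cycle time {cycle_time(1.0, u):.1f}x the work itself")
```

At 80% utilization a work item already spends five times its actual effort in the system; at 95% it is twenty times. That is why optimizing for full utilization destroys flow.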

Page 31: "We are doing it wrong."

Reduced Lead Time
• delivering fast is valuable
• increases throughput
• faster payout
• enables faster delivery of value (testable for fit)
• reduced risk
• shorter feedback cycles (e.g. cash flow) can replace planning processes

Needed Changes
• minimize work in progress (WIP limits, fewer process steps)
• focus on flow instead of utilization
• smaller batch sizes

Where is the $$?

Page 32: "We are doing it wrong."

№4 Heijunka. Choosing a non-adaptive technique for non-standardized activities.

Page 33: "We are doing it wrong."

Value chain and work in progress.

Work in Progress = Throughput × Lead Time

Page 34: "We are doing it wrong."

Software Development Value Stream (softwarecreation.org)

Needs → Analysis / Requirements → Architecture / Design → Programming → Testing / Acceptance → Deployment / Delivery → Live Feature

Typical waste along the way:
• Customer not available
• Business approval
• Excessive up-front architecture
• Design approval meetings
• Developers' involvement in other projects
• Over-engineering
• Redoing because of incorrect requirements and stress
• Fixing bugs
• Deferred integration debt

Page 35: "We are doing it wrong."

Software Development Flow

Cut back on:
• sitting idle
• waiting for somebody to work on it

Fast exchange of information:
• move design, code and information fast
• link processes and people together to make problems surface right away
• remove linear sequences

Page 36: "We are doing it wrong."

“One-Piece Continuous Flow.” Citrix Online applied this pattern at the enterprise level and reduced release cycles from 42 months to less than 10 months, resulting in substantial gains in market share for their products.

scrumplop.org

Page 37: "We are doing it wrong."

“All the brilliant minds working on the same thing… at the same time… in the same space… on the same computer… just like a real mob.” – Woody Zuill

It exists. It's called „Mob Programming“.

Page 38: "We are doing it wrong."

№5 Projects. Moving #BeyondProjects.

Start thinking small: diseconomies of scale.

Page 39: "We are doing it wrong."

a Project
• a temporary organization
• to achieve a predefined result
• at a pre-specified time
• using predetermined resources.

Success Criteria
• on schedule
• on budget
• on quality (~ features)

Assumptions
• the value is knowable (at the start)
• there is no value in flexibility.

Intrinsic Properties

Page 40: "We are doing it wrong."

“Delivering on schedule, budget & features is a sign of failure, not success.”

Allan Kelly, #NoProjects, @allankellynet

Page 41: "We are doing it wrong."

We make decisions when we know the least.

Graphic: decision impact is highest early in the project, while knowledge (learned stuff) only grows over time in the project.

Page 42: "We are doing it wrong."

Projects are where software goes to die. Successful software does not end.

http://www.dorkly.com/post/1243/pacman-graves

Page 43: "We are doing it wrong."

Software is state, not result.

• Treat everything as service creation & service delivery
• Don't limit your options by long-term determination

Page 44: "We are doing it wrong."

Bring work to stable teams

• Create stable performing teams

• Close to the business

• Bring the work to the team

• Manage different teams as queues with capacity, not via queue switching.

Page 45: "We are doing it wrong."

The Value Distribution of Requirements in a Project is not linear.


requirements. Based on this discussion, software managers prioritize the requirements and decide which will actually be implemented. They can also use the information to develop strategies for release planning.

CASE STUDY 1: THE RAN PROJECT

Since 1992, Ericsson Radio Systems AB and the Department of Computer and Information Science at Linköping University have been involved in a joint research program to identify, apply, and evaluate ways to improve the early phases of the software engineering process. As part of this collaboration, in January 1994 we were invited in to use the industry-as-laboratory approach,8 performing in-depth case studies in an industrial environment. For the first study, we selected Ericsson's Radio Access Network project.

The goal of the RAN project was to identify and specify requirements for a system that would give managers information about mobile telephony system operation.9 The project started small, with a staff of five, but as a result of our study it grew considerably and is now an umbrella for a portfolio of both research and development projects.

First steps. We identified 14 high-level requirements (services) that covered the main system functionality. These high-level requirements were intended to give managers information about issues such as capacity, coverage, and quality in a mobile communications system. Once we'd defined the 14 requirements, the project members reviewed and agreed on them. The prioritizing technique in use at that time was to rank-order the requirements on an ordinal scale ranging from 1 to 3, where 1 denotes highest priority. In practice, the requirements belonging to category 1 were then implemented and the rest discarded or postponed to future releases. Because we'd used this technique before and found it far from optimal,6 we decided to prioritize RAN requirements

using the cost–value approach.

We asked a group of experienced project members to represent customers' views and carefully instructed them on prioritizing requirements, making pairwise comparisons, choosing the scale to be used, and deciding how many comparisons would be needed. We also explained the importance of carrying out the pairwise comparisons carefully.

To begin, the project manager explained each candidate requirement and discussed it with project participants. He did this to make the requirements more clear and reduce subsequent misinterpretations. We distributed sheets outlining the 91 unique pairs of requirements, including the fundamental scale (as shown in Table A in “The Analytic Hierarchy Process” on page 70). Participants then performed pairwise comparisons of the candidate requirements, first according to value and later, in a separate session, according to the estimated implementation cost.

We let the participants work with the requirement pairs in any order they chose, allowing for retraction during the comparison process. The session was not moderated and participants worked at their own pace. Discussions were allowed, though in fact there were very few. Completing the cost–value approach took about an hour. When all 14 requirements had been pairwise compared, we calculated the value distribution and the cost distribution, as well as the consistency indices and ratios of the pairwise comparisons. There were some judgmental errors, since the consistency ratios for both value and cost were computed as 0.18. Based on the resulting distributions, we outlined the candidate requirements in a cost–value diagram and presented the results to the project members.

Requirements’ value. Each requirement’sdetermined value is relative and based on


Figure 3. Estimated cost of requirements implementation in the RAN project. Requirements 6, 10, 13, and 14 constitute 56 percent of the total implementation costs.


Figure 2. The value distribution of the 14 requirements in the RAN project.



a ratio scale. This means that a requirement whose determined value is 0.10 is twice as valuable as a requirement with a determined value of 0.05. Moreover, the sum of all requirements' value measures is always 1. Thus, a requirement with a determined value of 0.10 represents 10 percent of the total value of the requirements set. Figure 2 shows the value distribution of the 14 requirements in the RAN project. As the figure shows, the value of individual requirements can vary by orders of magnitude. The four most valuable requirements (1, 5, 6, and 13) constitute 61 percent of the total value; the four least valuable requirements (7, 8, 11, and 12) contribute a mere 8 percent. At the extremes, requirement 13 is about 20 times as valuable as requirement number 11.

Requirements' cost. A requirement's estimated cost is also relative and based on a ratio scale; the sum of all costs is again 1. Figure 3 shows the estimated cost of RAN's 14 requirements, which can again vary by orders of magnitude. The four most expensive requirements (6, 10, 13, and 14) constitute 56 percent of the total cost; the four least expensive requirements (1, 2, 3, and 11) account for only 10 percent. Looking again at the extreme values, requirement number 13 is about 20 times as expensive to implement as requirement number 1.

Requirements cost–value analysis. Figure 4 shows the cost–value diagram of the 14 requirements. For discussion purposes, we divide cost–value diagrams into three distinct areas:

♦ requirements with a high ratio of value to cost (a value–cost ratio exceeding 2),
♦ requirements with a medium ratio of value to cost (a value–cost ratio between 0.5 and 2), and
♦ requirements with a low ratio of value to cost (a value–cost ratio lower than 0.5).

As such, we infer that requirements 1, 2, and 5 fall into the high ratio category; requirements 3, 4, 6, 7, and 13 into the medium ratio category; and requirements 8, 9, 10, 11, 12, and 14 into the low ratio category. Based on these categories, the software managers were able to effectively and accurately prioritize their requirements.

The cost–value diagram clearly facilitates requirements selection. If, hypothetically, you chose to implement all requirements except numbers 10, 11, and 12, the software system's value for its customers would be 94 percent of the possible maximum, while the cost would be reduced to 78 percent of the cost for implementing all requirements. In general, by not implementing the requirements that contribute little to stakeholder satisfaction, you can significantly reduce the cost and duration of development.

CASE STUDY 2: THE PMR PROJECT

Encouraged by the apparent usefulness and effectiveness of the cost–value approach in the RAN project, we undertook a second case study. We picked a project that was developing a fourth release: the Performance Management Traffic Recording project. PMR is a software system that enables recording and analysis of mobile telecommunications traffic. The project began in 1992 with a full-time staff of 15 people, and has delivered 10 releases of varying sizes thus far.

At the time we joined the project, the system's third release was installed and running at customer sites. Many new requirements had emerged that had to be taken into account in planning the next release. We divided these new requirements into three categories: those demanding traditional defect correction, those requiring performance enhancement, and those suggesting added functionality. We decided to prioritize only the last category because both the project managers and the customers agreed that all defects had to be corrected and performance had to be enhanced. However, the exact functions to be added were up for negotiation.

The 11 high-level functional requirements dealt with issues such as presentation, sorting, and structuring new types of information. To prioritize these requirements each project member had to complete 55 pairwise comparisons for each criterion using the cost–value approach…


Figure 4. Cost–value diagram for the RAN project requirements. By not implementing the requirements that contribute little to stakeholder satisfaction, such as 10, 11, and 12, you can significantly reduce the cost and duration of development.

Karlsson/Ryan: „A Cost–Value Approach for Prioritizing Requirements“, IEEE Software, 1997
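The classification rule quoted in the Karlsson/Ryan excerpt (value–cost ratio above 2 = high priority, between 0.5 and 2 = medium, below 0.5 = low) takes only a few lines to apply. The requirement names and percentages below are invented for illustration; they are not the RAN data:

```python
# Classify requirements by value-cost ratio, as in the cost-value approach:
# ratio > 2 -> "high", 0.5 <= ratio <= 2 -> "medium", ratio < 0.5 -> "low".

def classify(value_pct: float, cost_pct: float) -> str:
    ratio = value_pct / cost_pct
    if ratio > 2:
        return "high"
    if ratio >= 0.5:
        return "medium"
    return "low"

# Hypothetical (value %, cost %) estimates for five requirements:
requirements = {
    "R1": (12, 1),   # cheap and very valuable
    "R2": (6, 2),
    "R3": (5, 3),
    "R4": (1, 11),   # expensive, almost no value
    "R5": (3, 12),
}

for name, (value, cost) in requirements.items():
    print(name, classify(value, cost))
```

Because both value and cost vary by orders of magnitude across requirements, the ratio separates the portfolio sharply, which is exactly the non-linearity the slide points at.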

Page 46: "We are doing it wrong."

Reduce Batch Size

• „elephant carpaccio“, „hamburger method“
• identify non-linear value distributions
• early, provable value delivery
• enables options by selection
• create a portfolio of options

• Deliver features, not projects.

Page 47: "We are doing it wrong."

№6 „Investment Decisions“

Page 48: "We are doing it wrong."

#NoEstimates

Estimating software is fed by our belief in the omnipotence of project managers.

Page 49: "We are doing it wrong."

“Software development is a learning process. Working code is a side effect. Learning is non-linear.”
– Alberto Brandolini

http://www.astro.princeton.edu/~jstone/images/sp.gif

Page 50: "We are doing it wrong."

Cost of Delay. End dates are bad. Deadlines are good.
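Cost of delay makes the difference concrete: a feature that would earn a known amount once live loses that amount for every week it is not released, and CD3 (cost of delay divided by duration) is a common heuristic for sequencing work. A sketch with invented numbers (the feature names and euro values are assumptions, not from the talk):

```python
# CD3 (cost of delay divided by duration): schedule the item with the
# highest delay cost per week of work first.

def cd3(cost_of_delay_per_week: float, duration_weeks: float) -> float:
    return cost_of_delay_per_week / duration_weeks

features = {
    "quick win":   (2_000, 1),   # (cost of delay in EUR/week, duration in weeks)
    "big rewrite": (5_000, 10),
}

order = sorted(features, key=lambda f: cd3(*features[f]), reverse=True)
print(order)  # the quick win goes first: 2000/1 > 5000/10
```

Note that this needs no estimate of total value delivered on a fixed end date, only a rough sense of what each week of delay costs, which is why it pairs well with the #NoEstimates argument above.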

Page 51: "We are doing it wrong."
Page 52: "We are doing it wrong."

“Technical debt is not like debt with the bank. It's debt with the mob.”
– Alberto Brandolini (@ziobrando)

Page 53: "We are doing it wrong."

№7 Doing things that are technically possible.

Page 54: "We are doing it wrong."

What is „Done“? Software that's out in production doing its job.

Anything else is just self-delusion.

Page 55: "We are doing it wrong."
Page 56: "We are doing it wrong."

blog.endpoint.com stackoverflow.com blog.spearce.org

Page 57: "We are doing it wrong."

Continuous Delivery should be a mutual interest of everyone.

http://electric-cloud.com/blog/2014/10/the-big-bang-and-why-we-are-here/

Page 58: "We are doing it wrong."

Released software starts delivering value.

Graphic: two charts of value over time/releases; frequent releases start accruing value earlier than one big release.

Page 59: "We are doing it wrong."

The software is ready for release. Every single moment.

Page 60: "We are doing it wrong."

Continuous Delivery requires automation.

wikipedia.com

Page 61: "We are doing it wrong."

ポカヨケ(Poka Yoke)

Page 62: "We are doing it wrong."

“If Google has no branches, why do you think you might need them?”

Amazon releases to production every 11.6 seconds.

(Number from May 2011)

Page 63: "We are doing it wrong."

Summary

Page 64: "We are doing it wrong."

Work with aligned interests.

Trust in working together. Make everything transparent.

Page 65: "We are doing it wrong."

Build valuable software.

Meaningful software is about far more than a few percentage points. Change the game.

Page 66: "We are doing it wrong."

Deliver early and continuously.

Deliver value and make it provable.

Page 67: "We are doing it wrong."

Focus on flow and service state.

Flow, throughput, velocity, flexibility.

Page 68: "We are doing it wrong."

Thank you.

Robert Weißgraeber, @robert_we