Noah Sussman ([email protected], @noahsussman). How Continuous Delivery is changing Quality Assurance. Continuous Improvement, GroupOn, Palo Alto, January 15, 2013.

Continuous Improvement (GroupOn, Palo Alto 2013)


Page 1: Continuous Improvement (GroupOn, Palo Alto 2013)

Noah Sussman / ns@noahsussman.com / @noahsussman

How Continuous Delivery is changing Quality Assurance
Continuous Improvement
GroupOn, Palo Alto
January 15, 2013

film still from The Lord of the Rings

The canonical Agile release cycle

Cocento Tecnologia on Flickr

Sprints of two or more weeks in length

TheMasonDixon on Etsy

Start deployment once the sprint is over

SisterDimension on Flickr

QA is part of the release process

film still from The Lord of the Rings

QA sign-off is required before going live

evoo73 on Flickr

The Continuous release cycle

Travis S on Flickr

Minimum viable feature set

Releasing a feature is decoupled from deploying code

David E Smith on Flickr

An airport without an air traffic controller

— Chad Dickerson

Etsy

Real-time data on how releases impact revenue

Default to open access

Constant tweaks to live features

dogpose on Flickr

Large features are deployed piecemeal over time

NASA

Every feature is part of an A/B campaign

Joy and Jon
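The A/B-campaign idea can be sketched as deterministic hash-based bucketing. This is a minimal illustration under my own naming, not the talk's actual implementation: hashing the (experiment, user) pair means each user sees the same variant on every visit, with no per-user state to store.

```python
import hashlib

def ab_variant(user_id: str, experiment: str) -> str:
    """Deterministically assign a user to variant 'A' or 'B'."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    # An even hash value maps to A, odd to B: a stable 50/50 split.
    return "A" if int(digest, 16) % 2 == 0 else "B"
```

Because the assignment is a pure function of its inputs, the same user lands in the same arm across page loads and across servers.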

Dark launches

NASA

Opt-in experiments

NASA

Partial rollouts

Wikipedia
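A partial rollout can be gated the same way: hash each user into one of 100 stable buckets and compare against a ramp percentage. A sketch under assumed names (the talk does not specify a mechanism):

```python
import zlib

def in_rollout(user_id: str, feature: str, percent: int) -> bool:
    """True if this user falls inside the rollout percentage.

    The hash is stable across requests, so a user who is "in" at 5%
    stays in as the ramp climbs to 10%, 50%, and eventually 100%.
    """
    bucket = zlib.crc32(f"{feature}:{user_id}".encode()) % 100
    return bucket < percent
```

Ramping up is then just a config change to `percent`, with no code deploy.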

Config Flags

Joe Thomissen on Flickr

Wire-Offs

// wire-off: the search backend is selected by a config flag
if ($cfg['new_search']) {
    // new hotness
    $resp = search_solr();
} else {
    // old busted
    $resp = search_grep();
}

$cfg = array(
    'checkout'   => true,
    'homepage'   => true,
    'profiles'   => true,
    'new_search' => false,
);

NASA

There is no “done done”

NASA

Observed Behavior Of Complex Systems

NASA

Emergent behaviors require unplanned responses

NASA

Improvements are discovered rather than designed

NASA

Users of the system have complex expectations

NASA

Complex systems are never “complete”

NASA

QA Happens When

NASA

First of all, what is “Quality Assurance”?

NASA

QA: assuring that there are no defects?

NASA

It is impossible to prove the absence of defects

Lukjonis

There will always be bugs in production

Testing is everyone's job

Library of Congress

The Jargon File

Myths About Bug Detection

Myth: there are a finite number of bugs

niscratz on Flickr

Myth: there are a finite number of detectable bugs

NASA

Myth: all severity-one bugs can be found before release

Fred Brooks at Etsy

Myth: software is built to specifications

Myth: at some point, software is finished

Myth: most bugs have complex, unpredictable causes

The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmer's own creation.

— Edsger Dijkstra

The whole time I'm programming, I'm constantly checking my assumptions.

— Rasmus Lerdorf

loriabys

As you're about to add a comment, ask yourself, “How can I improve the code so that this comment isn't needed?” Improve the code, and then document it to make it even clearer.

— Steve McConnell

Debugging is twice as hard as writing the code in the first place. Therefore, if you write the code as cleverly as possible, you are, by definition, not smart enough to debug it.

— Brian Kernighan

No blame

Many Small Anomalies Combined

An organization's defenses against failure are a series of barriers, represented as slices of Swiss cheese. The holes in the cheese represent weaknesses in individual parts of the system. Failures occur when a hazard passes through all of the holes in all of the defenses.

— Wikipedia

John Allspaw

Prioritize the elimination of small errors

John Allspaw

Focus less on mitigation of large catastrophic failures

Optimize for recovery rather than failure prevention

Failure is inevitable

Richard Avedon

Unit testing is great for preventing small errors

John Allspaw

Resilience, Not “Quality”

John Allspaw

NASA

Readable code

NASA

Reasonable test coverage

NASA

Sane architecture

NASA

Good debugging tools

NASA

An engineering culture that values refactoring

NASA

Measurable goals

NASA

Manual Testing (but probably not the kind you're thinking of)

NASA

Real-Time Monitoring is the new face of testing

Etsy

Etsy

Etsy

Anomaly detection is hard

Greg and Tim Hildebrandt

NASA

Watching the graphs

NASA

As of 2012, Etsy collected well over a quarter million real-time metrics

NASA

Deciding which metrics matter is a human problem

NASA

Everyone watches some subset of the graphs

NASA

Human vision is an excellent tool for anomaly detection

NASA
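The "watching the graphs" discipline can be partially automated. A minimal sketch of threshold-based anomaly detection (my own illustration, not Etsy's actual tooling): flag any point that sits far outside the spread of its recent history.

```python
from statistics import mean, stdev

def anomalies(series, window=10, threshold=3.0):
    """Return indices of points more than `threshold` sample standard
    deviations away from the mean of the preceding `window` points."""
    flagged = []
    for i in range(window, len(series)):
        history = series[i - window:i]
        mu, sigma = mean(history), stdev(history)
        # Skip flat history (sigma == 0) to avoid dividing meaning by noise.
        if sigma > 0 and abs(series[i] - mu) > threshold * sigma:
            flagged.append(i)
    return flagged
```

Real metric streams need seasonality handling and tuning that this sketch omits, which is why the deck stresses that deciding which metrics matter, and eyeballing the graphs, remain human problems.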

QA happens when

NASA

Exploratory testing can be performed at any time

NASA

Rigorous scientific approach

NASA

Focus on customer satisfaction

NASA

Less focus on product specifications

NASA

Exploratory Testing is equally useful before or after a release

NASA

Just Quality

NASA

“Assurance” is a terrible word. Let's discard it.

NASA

Quality exists, but it's tricky to assure or prove that

NASA

There's no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

“Customer Experience” is a better term of art than “Quality”

NASA

Customer Experience (though there's no formal proof for that, either)

NASA

Exploratory Testing addresses areas that Developer Testing doesn't

NASA

Developer Testing validates assumptions

NASA

The Independent Tester's job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

Most bugs, most of the time, are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level.

— Eric S. Raymond

NASA

Source diffs, logs. If your QA Analysts don't look at these — teach them.

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interfaces with users as individuals, rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Efficiency-Thoroughness Trade-Off

NASA

Rapid release cycles have different risks than slower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks, services, and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state. Always roll forward. When it's desirable to revert a previous change, do that as part of a new commit.

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of “last stable release”

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further Reading

“How Google Tests Software”, James Whittaker

“Look At Your Data”, John Rauser

“Optimizing For Developer Happiness”, Chad Dickerson

“Outages, Postmortems, and Human Error”, John Allspaw

http://en.wikipedia.org/wiki/Swiss_cheese_model

“What Is Exploratory Testing?”, James Bach

Questions?

@noahsussman / ns@noahsussman.com / infiniteundo.com

Page 2: Continuous Improvement (GroupOn, Palo Alto 2013)

film still from The Lord of the Rings

The canonical Agile release cycle

Cocento Tecnologia on Flickr

Sprints of two or more weeks in length

TheMasonDixon on Etsy

Start deployment once the sprint is over

SisterDimension on Flickr

QA is part of the release process

film still from The Lord of the Rings

QA sign-off is required before going live

evoo73 on Flickr

The Continuous release cycle

Travis S on Flickr

Minimum viable feature set

Releasing a feature is decoupled from deploying code

David E Smith on Flickr

An airport without an air traffic controller

mdashChad Dickerson

Etsy

Real-time data on how releases impact revenue

Default to open access

Constant tweaks to live features

dogpose on Flickr

Large features are deployed piecemeal over time

NASA

Every feature is part of an AB campaign

Joy and Jon

Dark launches

NASA

Opt-in experiments

NASA

Partial rollouts

Wikipedia

Config Flags

Joe Thomissen on Flickr

Wire-Offs

if ($cfg[new_search]) new hotness$resp = search_solr()

else old busted$resp = search_grep()

$cfg = array( checkout =gt true homepage =gt true profiles =gt true new_search =gt false)

NASA

There is no ldquodone donerdquo

NASA

Observed Behavior Of Complex Systems

NASA

Emergent behaviors require unplanned responses

NASA

Improvements are discovered rather than designed

NASA

Users of the system have complex expectations

NASA

Complex systems are never ldquocompleterdquo

NASA

QA Happens When

NASA

First of all what is ldquoQuality Assurancerdquo

NASA

QA assuring that there are no defects

NASA

It is impossible to prove the absence of defects

Lukjonis

There will always be bugs in production

Testing is everyonersquos job

Library of Congress

The Jargon File

Myths About Bug Detection

Myth there are a finite number of bugs

niscratz on Flickr

Myth here are a finite number of detectable bugs

NASA

Myth all severity one bugs can be found before release

Fred Brooks at Etsy

Myth software is built to specifications

Myth at some point software is finished

Myth most bugs have complex unpredictable causes

The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmers own creation

mdash Edsger Dijkstra

The whole time Irsquom programming Irsquom constantly checking my assumptions

mdashRasmus Lerdorf

loriabys

As youre about to add a comment ask yourself ldquoHow can I improve the code so that this comment isnt neededrdquo Improve the code and then document it to make it even clearer

mdash Steve McConnell

Debugging is twice as hard as writing the code in the first place Therefore if you write the code as cleverly as possible you are by definition not smart enough to debug it

mdashBrian Kernighan

No blame

Many Small Anomalies Combined

An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses

mdash Wikipedia

John Allspaw

Prioritize the elimination of small errors

John Allspaw

Focus less on mitigation of large catastrophic failures

Optimize for recovery rather than failure prevention

Failure is inevitable

Richard Avedon

Unit testing is great for preventing small errors

John Allspaw

Resilience Not ldquoQualityrdquo

John Allspaw

NASA

Readable code

NASA

Reasonable test coverage

NASA

Sane architecture

NASA

Good debugging tools

NASA

An engineering culture that values refactoring

NASA

Measurable goals

NASA

Manual TestingBut probably not the kind yoursquore thinking of

NASA

Real-Time Monitoring is the new face of testing

Etsy

Etsy

Etsy

Anomaly detection is hard

Greg and Tim Hildebrandt

NASA

Watching the graphs

NASA

As of 2012 Etsy collected well over a quarter million real-time metrics

NASA

Deciding which metrics matter is a human problem

NASA

Everyone watches some subset of the graphs

NASA

Human vision is an excellent tool for anomaly detection

NASA

QA happens when

NASA

Exploratory testing can be performed at any time

NASA

Rigorous scientific approach

NASA

Focus on customer satisfaction

NASA

Less focus on product specifications

NASA

Exploratory Testing is equally useful before or after a release

NASA

Just Quality

NASA

ldquoAssurancerdquo is a terrible word Letrsquos discard it

NASA

Quality exists but itrsquos tricky to assure or prove that

NASA

Therersquos no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo

NASA

Customer ExperienceThough therersquos no formal proof for that either

NASA

Exploratory Testing addresses areas that Developer Testing doesnrsquot

NASA

Developer Testing validates assumptions

NASA

The Independent Testerrsquos job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 3: Continuous Improvement (GroupOn, Palo Alto 2013)

Cocento Tecnologia on Flickr

Sprints of two or more weeks in length

TheMasonDixon on Etsy

Start deployment once the sprint is over

SisterDimension on Flickr

QA is part of the release process

film still from The Lord of the Rings

QA sign-off is required before going live

evoo73 on Flickr

The Continuous release cycle

Travis S on Flickr

Minimum viable feature set

Releasing a feature is decoupled from deploying code

David E Smith on Flickr

An airport without an air traffic controller

mdashChad Dickerson

Etsy

Real-time data on how releases impact revenue

Default to open access

Constant tweaks to live features

dogpose on Flickr

Large features are deployed piecemeal over time

NASA

Every feature is part of an AB campaign

Joy and Jon

Dark launches

NASA

Opt-in experiments

NASA

Partial rollouts

Wikipedia

Config Flags

Joe Thomissen on Flickr

Wire-Offs

if ($cfg[new_search]) new hotness$resp = search_solr()

else old busted$resp = search_grep()

$cfg = array( checkout =gt true homepage =gt true profiles =gt true new_search =gt false)

NASA

There is no ldquodone donerdquo

NASA

Observed Behavior Of Complex Systems

NASA

Emergent behaviors require unplanned responses

NASA

Improvements are discovered rather than designed

NASA

Users of the system have complex expectations

NASA

Complex systems are never ldquocompleterdquo

NASA

QA Happens When

NASA

First of all what is ldquoQuality Assurancerdquo

NASA

QA assuring that there are no defects

NASA

It is impossible to prove the absence of defects

Lukjonis

There will always be bugs in production

Testing is everyonersquos job

Library of Congress

The Jargon File

Myths About Bug Detection

Myth there are a finite number of bugs

niscratz on Flickr

Myth here are a finite number of detectable bugs

NASA

Myth all severity one bugs can be found before release

Fred Brooks at Etsy

Myth software is built to specifications

Myth at some point software is finished

Myth most bugs have complex unpredictable causes

The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmers own creation

mdash Edsger Dijkstra

The whole time Irsquom programming Irsquom constantly checking my assumptions

mdashRasmus Lerdorf

loriabys

As youre about to add a comment ask yourself ldquoHow can I improve the code so that this comment isnt neededrdquo Improve the code and then document it to make it even clearer

mdash Steve McConnell

Debugging is twice as hard as writing the code in the first place Therefore if you write the code as cleverly as possible you are by definition not smart enough to debug it

mdashBrian Kernighan

No blame

Many Small Anomalies Combined

An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses

mdash Wikipedia

John Allspaw

Prioritize the elimination of small errors

John Allspaw

Focus less on mitigation of large catastrophic failures

Optimize for recovery rather than failure prevention

Failure is inevitable

Richard Avedon

Unit testing is great for preventing small errors

John Allspaw

Resilience Not ldquoQualityrdquo

John Allspaw

NASA

Readable code

NASA

Reasonable test coverage

NASA

Sane architecture

NASA

Good debugging tools

NASA

An engineering culture that values refactoring

NASA

Measurable goals

NASA

Manual TestingBut probably not the kind yoursquore thinking of

NASA

Real-Time Monitoring is the new face of testing

Etsy

Etsy

Etsy

Anomaly detection is hard

Greg and Tim Hildebrandt

NASA

Watching the graphs

NASA

As of 2012 Etsy collected well over a quarter million real-time metrics

NASA

Deciding which metrics matter is a human problem

NASA

Everyone watches some subset of the graphs

NASA

Human vision is an excellent tool for anomaly detection

NASA

QA happens when

NASA

Exploratory testing can be performed at any time

NASA

Rigorous scientific approach

NASA

Focus on customer satisfaction

NASA

Less focus on product specifications

NASA

Exploratory Testing is equally useful before or after a release

NASA

Just Quality

NASA

ldquoAssurancerdquo is a terrible word Letrsquos discard it

NASA

Quality exists but itrsquos tricky to assure or prove that

NASA

Therersquos no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo

NASA

Customer ExperienceThough therersquos no formal proof for that either

NASA

Exploratory Testing addresses areas that Developer Testing doesnrsquot

NASA

Developer Testing validates assumptions

NASA

The Independent Testerrsquos job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 4: Continuous Improvement (GroupOn, Palo Alto 2013)

TheMasonDixon on Etsy

Start deployment once the sprint is over

SisterDimension on Flickr

QA is part of the release process

film still from The Lord of the Rings

QA sign-off is required before going live

evoo73 on Flickr

The Continuous release cycle

Travis S on Flickr

Minimum viable feature set

Releasing a feature is decoupled from deploying code

David E Smith on Flickr

An airport without an air traffic controller

mdashChad Dickerson

Etsy

Real-time data on how releases impact revenue

Default to open access

Constant tweaks to live features

dogpose on Flickr

Large features are deployed piecemeal over time

NASA

Every feature is part of an AB campaign

Joy and Jon

Dark launches

NASA

Opt-in experiments

NASA

Partial rollouts

Wikipedia

Config Flags

Joe Thomissen on Flickr

Wire-Offs

if ($cfg[new_search]) new hotness$resp = search_solr()

else old busted$resp = search_grep()

$cfg = array( checkout =gt true homepage =gt true profiles =gt true new_search =gt false)

NASA

There is no ldquodone donerdquo

NASA

Observed Behavior Of Complex Systems

NASA

Emergent behaviors require unplanned responses

NASA

Improvements are discovered rather than designed

NASA

Users of the system have complex expectations

NASA

Complex systems are never ldquocompleterdquo

NASA

QA Happens When

NASA

First of all what is ldquoQuality Assurancerdquo

NASA

QA assuring that there are no defects

NASA

It is impossible to prove the absence of defects

Lukjonis

There will always be bugs in production

Testing is everyonersquos job

Library of Congress

The Jargon File

Myths About Bug Detection

Myth there are a finite number of bugs

niscratz on Flickr

Myth here are a finite number of detectable bugs

NASA

Myth all severity one bugs can be found before release

Fred Brooks at Etsy

Myth software is built to specifications

Myth at some point software is finished

Myth most bugs have complex unpredictable causes

The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmers own creation

mdash Edsger Dijkstra

The whole time Irsquom programming Irsquom constantly checking my assumptions

mdashRasmus Lerdorf

loriabys

As youre about to add a comment ask yourself ldquoHow can I improve the code so that this comment isnt neededrdquo Improve the code and then document it to make it even clearer

mdash Steve McConnell

Debugging is twice as hard as writing the code in the first place Therefore if you write the code as cleverly as possible you are by definition not smart enough to debug it

mdashBrian Kernighan

No blame

Many Small Anomalies Combined

An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses

mdash Wikipedia

John Allspaw

Prioritize the elimination of small errors

John Allspaw

Focus less on mitigation of large catastrophic failures

Optimize for recovery rather than failure prevention

Failure is inevitable

Richard Avedon

Unit testing is great for preventing small errors

John Allspaw

Resilience Not ldquoQualityrdquo

John Allspaw

NASA

Readable code

NASA

Reasonable test coverage

NASA

Sane architecture

NASA

Good debugging tools

NASA

An engineering culture that values refactoring

NASA

Measurable goals

NASA

Manual TestingBut probably not the kind yoursquore thinking of

NASA

Real-Time Monitoring is the new face of testing

Etsy

Etsy

Etsy

Anomaly detection is hard

Greg and Tim Hildebrandt

NASA

Watching the graphs

NASA

As of 2012 Etsy collected well over a quarter million real-time metrics

NASA

Deciding which metrics matter is a human problem

NASA

Everyone watches some subset of the graphs

NASA

Human vision is an excellent tool for anomaly detection

NASA

QA happens when

NASA

Exploratory testing can be performed at any time

NASA

Rigorous scientific approach

NASA

Focus on customer satisfaction

NASA

Less focus on product specifications

NASA

Exploratory Testing is equally useful before or after a release

NASA

Just Quality

NASA

ldquoAssurancerdquo is a terrible word Letrsquos discard it

NASA

Quality exists but itrsquos tricky to assure or prove that

NASA

Therersquos no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo

NASA

Customer ExperienceThough therersquos no formal proof for that either

NASA

Exploratory Testing addresses areas that Developer Testing doesnrsquot

NASA

Developer Testing validates assumptions

NASA

The Independent Testerrsquos job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 5: Continuous Improvement (GroupOn, Palo Alto 2013)

SisterDimension on Flickr

QA is part of the release process

film still from The Lord of the Rings

QA sign-off is required before going live

evoo73 on Flickr

The Continuous release cycle

Travis S on Flickr

Minimum viable feature set

Releasing a feature is decoupled from deploying code

David E Smith on Flickr

An airport without an air traffic controller

mdashChad Dickerson

Etsy

Real-time data on how releases impact revenue

Default to open access

Constant tweaks to live features

dogpose on Flickr

Large features are deployed piecemeal over time

NASA

Every feature is part of an AB campaign

Joy and Jon

Dark launches

NASA

Opt-in experiments

NASA

Partial rollouts

Wikipedia

Config Flags

Joe Thomissen on Flickr

Wire-Offs

if ($cfg[new_search]) new hotness$resp = search_solr()

else old busted$resp = search_grep()

$cfg = array( checkout =gt true homepage =gt true profiles =gt true new_search =gt false)

NASA

There is no ldquodone donerdquo

NASA

Observed Behavior Of Complex Systems

NASA

Emergent behaviors require unplanned responses

NASA

Improvements are discovered rather than designed

NASA

Users of the system have complex expectations

NASA

Complex systems are never ldquocompleterdquo

NASA

QA Happens When

NASA

First of all what is ldquoQuality Assurancerdquo

NASA

QA assuring that there are no defects

NASA

It is impossible to prove the absence of defects

Lukjonis

There will always be bugs in production

Testing is everyonersquos job

Library of Congress

The Jargon File

Myths About Bug Detection

Myth there are a finite number of bugs

niscratz on Flickr

Myth here are a finite number of detectable bugs

NASA

Myth all severity one bugs can be found before release

Fred Brooks at Etsy

Myth software is built to specifications

Myth at some point software is finished

Myth most bugs have complex unpredictable causes

The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmers own creation

— Edsger Dijkstra

The whole time I'm programming, I'm constantly checking my assumptions.

— Rasmus Lerdorf

loriabys

As you're about to add a comment, ask yourself, "How can I improve the code so that this comment isn't needed?" Improve the code, and then document it to make it even clearer.

— Steve McConnell

Debugging is twice as hard as writing the code in the first place. Therefore, if you write the code as cleverly as possible, you are, by definition, not smart enough to debug it.

— Brian Kernighan

No blame

Many Small Anomalies Combined

An organization's defenses against failure are a series of barriers, represented as slices of Swiss cheese. The holes in the cheese represent weaknesses in individual parts of the system. Failures occur when a hazard passes through all of the holes in all of the defenses.

— Wikipedia
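The Swiss-cheese model can be made concrete with a toy calculation (a sketch with assumed numbers, not from the deck): if each of four independent defense layers (say, code review, unit tests, CI checks, and monitoring) misses a given hazard 10% of the time, the chance of the hazard passing through all of them is the product of the per-layer miss rates.

```python
import math

# Per-layer probability that a hazard slips through (assumed values).
layers = {
    "code review": 0.10,
    "unit tests": 0.10,
    "CI checks": 0.10,
    "monitoring": 0.10,
}

# Independent layers: miss probabilities multiply.
p_pass_all = math.prod(layers.values())
print(f"hazard reaches production with p = {p_pass_all:.4f}")
```

Shrinking any single hole multiplies through the whole product, which is one way to read the deck's advice to prioritize the elimination of small errors.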

John Allspaw

Prioritize the elimination of small errors

John Allspaw

Focus less on mitigation of large catastrophic failures

Optimize for recovery rather than failure prevention

Failure is inevitable

Richard Avedon

Unit testing is great for preventing small errors

John Allspaw

Resilience, Not "Quality"

John Allspaw

NASA

Readable code

NASA

Reasonable test coverage

NASA

Sane architecture

NASA

Good debugging tools

NASA

An engineering culture that values refactoring

NASA

Measurable goals

NASA

Manual Testing (but probably not the kind you're thinking of)

NASA

Real-Time Monitoring is the new face of testing

Etsy

Etsy

Etsy

Anomaly detection is hard

Greg and Tim Hildebrandt

NASA

Watching the graphs

NASA

As of 2012, Etsy collected well over a quarter-million real-time metrics

NASA

Deciding which metrics matter is a human problem

NASA

Everyone watches some subset of the graphs

NASA

Human vision is an excellent tool for anomaly detection
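Human eyes scale surprisingly well, but simple automated checks can help triage a quarter-million metrics. A minimal sketch of trailing-window threshold detection; the window size and threshold are illustrative assumptions, not any particular monitoring product:

```python
import statistics

def anomalies(series, window=20, threshold=3.0):
    """Return indices of points far from the trailing-window mean."""
    flagged = []
    for i in range(window, len(series)):
        tail = series[i - window:i]
        mean = statistics.fmean(tail)
        stdev = statistics.pstdev(tail)
        if stdev == 0:
            if series[i] != mean:  # any change in a flat series is notable
                flagged.append(i)
        elif abs(series[i] - mean) > threshold * stdev:
            flagged.append(i)
    return flagged

# A flat request-rate metric with one spike:
metric = [100.0] * 30 + [500.0] + [100.0] * 5
print(anomalies(metric))  # flags the spike at index 30
```

Real anomaly detection is much harder than this (seasonality, slow drifts, correlated metrics), which is why deciding which metrics matter remains a human problem.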

NASA

QA happens when

NASA

Exploratory testing can be performed at any time

NASA

Rigorous scientific approach

NASA

Focus on customer satisfaction

NASA

Less focus on product specifications

NASA

Exploratory Testing is equally useful before or after a release

NASA

Just Quality

NASA

"Assurance" is a terrible word. Let's discard it.

NASA

Quality exists, but it's tricky to assure or prove that

NASA

There's no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

"Customer Experience" is a better term of art than "Quality"

NASA

Customer Experience (though there's no formal proof for that, either)

NASA

Exploratory Testing addresses areas that Developer Testing doesn't

NASA

Developer Testing validates assumptions

NASA

The Independent Tester's job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

Most bugs, most of the time, are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level.

— Eric S. Raymond

NASA

Source diffs, logs. If your QA Analysts don't look at these — teach them.

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interfaces with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Efficiency-Thoroughness Trade-Off

NASA

Rapid release cycles have different risks than slower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks, services, and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state. Always roll forward. When it's desirable to revert a previous change, do that as part of a new commit.

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of "last stable release"

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further Reading

"How Google Tests Software," James Whittaker

"Look At Your Data," John Rauser

"Optimizing For Developer Happiness," Chad Dickerson

"Outages, Postmortems, and Human Error," John Allspaw

http://en.wikipedia.org/wiki/Swiss_cheese_model

"What Is Exploratory Testing?" James Bach

Questions?

Noah Sussman
[email protected]
infiniteundo.com


else old busted$resp = search_grep()

$cfg = array( checkout =gt true homepage =gt true profiles =gt true new_search =gt false)

NASA

There is no ldquodone donerdquo

NASA

Observed Behavior Of Complex Systems

NASA

Emergent behaviors require unplanned responses

NASA

Improvements are discovered rather than designed

NASA

Users of the system have complex expectations

NASA

Complex systems are never ldquocompleterdquo

NASA

QA Happens When

NASA

First of all what is ldquoQuality Assurancerdquo

NASA

QA assuring that there are no defects

NASA

It is impossible to prove the absence of defects

Lukjonis

There will always be bugs in production

Testing is everyone's job

Library of Congress

The Jargon File

Myths About Bug Detection

Myth: there are a finite number of bugs

niscratz on Flickr

Myth: there are a finite number of detectable bugs

NASA

Myth: all severity-one bugs can be found before release

Fred Brooks at Etsy

Myth: software is built to specifications

Myth: at some point, software is finished

Myth: most bugs have complex, unpredictable causes

"The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmer's own creation."

— Edsger Dijkstra

"The whole time I'm programming, I'm constantly checking my assumptions."

— Rasmus Lerdorf

loriabys

"As you're about to add a comment, ask yourself, 'How can I improve the code so that this comment isn't needed?' Improve the code and then document it to make it even clearer."

— Steve McConnell

"Debugging is twice as hard as writing the code in the first place. Therefore, if you write the code as cleverly as possible, you are, by definition, not smart enough to debug it."

— Brian Kernighan

No blame

Many Small Anomalies Combined

"An organization's defenses against failure are a series of barriers, represented as slices of Swiss cheese. The holes in the cheese represent weaknesses in individual parts of the system. Failures occur when a hazard passes through all of the holes in all of the defenses."

— Wikipedia
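The Swiss-cheese model can be sketched as a toy simulation. This is purely illustrative: the layer count and hole probability are invented for the example, not taken from the talk.

```python
import random

def incident_occurs(layers=4, hole_probability=0.05, rng=random):
    # A hazard becomes an incident only if it finds a hole
    # in every defensive layer at once.
    return all(rng.random() < hole_probability for _ in range(layers))

def incident_rate(trials=20000, **kwargs):
    # Estimate how often a hazard slips through all defenses.
    return sum(incident_occurs(**kwargs) for _ in range(trials)) / trials
```

Four independent layers, each with a 5% chance of a hole lining up, let a hazard through only about 0.05**4 of the time. Shrinking any one small weakness shrinks the whole product, which is why the next slides prioritize small errors.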

John Allspaw

Prioritize the elimination of small errors

John Allspaw

Focus less on mitigation of large catastrophic failures

Optimize for recovery rather than failure prevention

Failure is inevitable

Richard Avedon

Unit testing is great for preventing small errors

John Allspaw
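For example, a few fast checks around a small pure function catch exactly this class of small error. The `apply_discount` helper below is hypothetical, invented for illustration:

```python
def apply_discount(price_cents, percent):
    # Return the discounted price in integer cents, rounding down.
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return price_cents * (100 - percent) // 100

def test_apply_discount():
    assert apply_discount(1000, 10) == 900
    assert apply_discount(999, 50) == 499  # integer math rounds down
    assert apply_discount(1000, 0) == 1000
```

Checks like these run in milliseconds on every commit, which is what makes them cheap insurance against off-by-one and rounding slips.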

Resilience, Not "Quality"

John Allspaw

NASA

Readable code

NASA

Reasonable test coverage

NASA

Sane architecture

NASA

Good debugging tools

NASA

An engineering culture that values refactoring

NASA

Measurable goals

NASA

Manual Testing. But probably not the kind you're thinking of.

NASA

Real-Time Monitoring is the new face of testing

Etsy

Anomaly detection is hard

Greg and Tim Hildebrandt

NASA

Watching the graphs

NASA

As of 2012 Etsy collected well over a quarter million real-time metrics

NASA
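Metrics at this scale famously flowed through statsd, which Etsy open-sourced: instrumentation is a fire-and-forget UDP datagram, so it never slows the request path. A minimal sketch, assuming the conventional localhost defaults:

```python
import socket

def format_metric(name, value, metric_type="c"):
    # statsd line protocol: "name:value|type", where type is
    # "c" (counter), "ms" (timer), or "g" (gauge).
    return f"{name}:{value}|{metric_type}"

def emit_metric(name, value, metric_type="c", host="127.0.0.1", port=8125):
    # UDP send is allowed to fail silently: losing one metric
    # is better than slowing down a page for a customer.
    payload = format_metric(name, value, metric_type).encode("ascii")
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(payload, (host, port))
```

`emit_metric("checkout.success", 1)` increments a counter; `emit_metric("search.latency", 87, "ms")` records a timing. The metric names here are made up for the example.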

Deciding which metrics matter is a human problem

NASA

Everyone watches some subset of the graphs

NASA

Human vision is an excellent tool for anomaly detection

NASA
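Automated checks can still pre-filter candidates for a human to eyeball. A crude rolling z-score sketch, where the window size and threshold are arbitrary choices for illustration:

```python
from statistics import mean, stdev

def anomalies(series, window=10, threshold=3.0):
    # Flag points that sit far outside the recent window.
    # This is a first-pass filter only; a person watching the
    # graph still decides whether the spike actually matters.
    for i in range(window, len(series)):
        recent = series[i - window:i]
        spread = stdev(recent) or 1.0  # avoid divide-by-zero on flat data
        if abs(series[i] - mean(recent)) > threshold * spread:
            yield i, series[i]
```

A detector this simple pages on deploys and holidays alike, which is exactly why deciding what matters stays a human problem.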

QA happens when

NASA

Exploratory testing can be performed at any time

NASA

Rigorous scientific approach

NASA

Focus on customer satisfaction

NASA

Less focus on product specifications

NASA

Exploratory Testing is equally useful before or after a release

NASA

Just Quality

NASA

"Assurance" is a terrible word. Let's discard it.

NASA

Quality exists, but it's tricky to assure or prove that.

NASA

There's no such thing as a formal proof of quality.

NASA

Most of us would agree that quality exists

NASA

"Customer Experience" is a better term of art than "Quality."

NASA

Customer Experience. Though there's no formal proof for that, either.

NASA

Exploratory Testing addresses areas that Developer Testing doesn't.

NASA

Developer Testing validates assumptions

NASA

The Independent Tester's job is to invalidate assumptions.

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

"Most bugs, most of the time, are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level."

— Eric S. Raymond

NASA

Source, diffs, logs. If your QA Analysts don't look at these, teach them.

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interfaces with users as individuals, rather than as aggregate data.

Keep the feedback loop short

Manage Your Culture

NASA

Efficiency-Thoroughness Trade-Off

NASA

Rapid release cycles have different risks than slower release cycles.

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks, services, and people are always in flux.

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state. Always roll forward. When it's desirable to revert a previous change, do that as part of a new commit.

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of "last stable release."

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further Reading

"How Google Tests Software," James Whittaker

"Look At Your Data," John Rauser

"Optimizing For Developer Happiness," Chad Dickerson

"Outages, Postmortems, and Human Error," John Allspaw

http://en.wikipedia.org/wiki/Swiss_cheese_model

"What Is Exploratory Testing?," James Bach

Questions?

Noah Sussman
ns@noahsussman.com
infiniteundo.com

Page 11: Continuous Improvement (GroupOn, Palo Alto 2013)

Etsy

Real-time data on how releases impact revenue

Default to open access

Constant tweaks to live features

dogpose on Flickr

Large features are deployed piecemeal over time

NASA

Every feature is part of an AB campaign

Joy and Jon

Dark launches

NASA

Opt-in experiments

NASA

Partial rollouts

Wikipedia

Config Flags

Joe Thomissen on Flickr

Wire-Offs

if ($cfg[new_search]) new hotness$resp = search_solr()

else old busted$resp = search_grep()

$cfg = array( checkout =gt true homepage =gt true profiles =gt true new_search =gt false)

NASA

There is no ldquodone donerdquo

NASA

Observed Behavior Of Complex Systems

NASA

Emergent behaviors require unplanned responses

NASA

Improvements are discovered rather than designed

NASA

Users of the system have complex expectations

NASA

Complex systems are never ldquocompleterdquo

NASA

QA Happens When

NASA

First of all what is ldquoQuality Assurancerdquo

NASA

QA assuring that there are no defects

NASA

It is impossible to prove the absence of defects

Lukjonis

There will always be bugs in production

Testing is everyonersquos job

Library of Congress

The Jargon File

Myths About Bug Detection

Myth there are a finite number of bugs

niscratz on Flickr

Myth here are a finite number of detectable bugs

NASA

Myth all severity one bugs can be found before release

Fred Brooks at Etsy

Myth software is built to specifications

Myth at some point software is finished

Myth most bugs have complex unpredictable causes

The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmers own creation

mdash Edsger Dijkstra

The whole time Irsquom programming Irsquom constantly checking my assumptions

mdashRasmus Lerdorf

loriabys

As youre about to add a comment ask yourself ldquoHow can I improve the code so that this comment isnt neededrdquo Improve the code and then document it to make it even clearer

mdash Steve McConnell

Debugging is twice as hard as writing the code in the first place Therefore if you write the code as cleverly as possible you are by definition not smart enough to debug it

mdashBrian Kernighan

No blame

Many Small Anomalies Combined

An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses

mdash Wikipedia

John Allspaw

Prioritize the elimination of small errors

John Allspaw

Focus less on mitigation of large catastrophic failures

Optimize for recovery rather than failure prevention

Failure is inevitable

Richard Avedon

Unit testing is great for preventing small errors

John Allspaw

Resilience Not ldquoQualityrdquo

John Allspaw

NASA

Readable code

NASA

Reasonable test coverage

NASA

Sane architecture

NASA

Good debugging tools

NASA

An engineering culture that values refactoring

NASA

Measurable goals

NASA

Manual TestingBut probably not the kind yoursquore thinking of

NASA

Real-Time Monitoring is the new face of testing

Etsy

Etsy

Etsy

Anomaly detection is hard

Greg and Tim Hildebrandt

NASA

Watching the graphs

NASA

As of 2012 Etsy collected well over a quarter million real-time metrics

NASA

Deciding which metrics matter is a human problem

NASA

Everyone watches some subset of the graphs

NASA

Human vision is an excellent tool for anomaly detection

NASA

QA happens when

NASA

Exploratory testing can be performed at any time

NASA

Rigorous scientific approach

NASA

Focus on customer satisfaction

NASA

Less focus on product specifications

NASA

Exploratory Testing is equally useful before or after a release

NASA

Just Quality

NASA

ldquoAssurancerdquo is a terrible word Letrsquos discard it

NASA

Quality exists but itrsquos tricky to assure or prove that

NASA

Therersquos no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo

NASA

Customer ExperienceThough therersquos no formal proof for that either

NASA

Exploratory Testing addresses areas that Developer Testing doesnrsquot

NASA

Developer Testing validates assumptions

NASA

The Independent Testerrsquos job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 12: Continuous Improvement (GroupOn, Palo Alto 2013)

Default to open access

Constant tweaks to live features

dogpose on Flickr

Large features are deployed piecemeal over time

NASA

Every feature is part of an AB campaign

Joy and Jon

Dark launches

NASA

Opt-in experiments

NASA

Partial rollouts

Wikipedia

Config Flags

Joe Thomissen on Flickr

Wire-Offs

if ($cfg[new_search]) new hotness$resp = search_solr()

else old busted$resp = search_grep()

$cfg = array( checkout =gt true homepage =gt true profiles =gt true new_search =gt false)

NASA

There is no ldquodone donerdquo

NASA

Observed Behavior Of Complex Systems

NASA

Emergent behaviors require unplanned responses

NASA

Improvements are discovered rather than designed

NASA

Users of the system have complex expectations

NASA

Complex systems are never ldquocompleterdquo

NASA

QA Happens When

NASA

First of all what is ldquoQuality Assurancerdquo

NASA

QA assuring that there are no defects

NASA

It is impossible to prove the absence of defects

Lukjonis

There will always be bugs in production

Testing is everyonersquos job

Library of Congress

The Jargon File

Myths About Bug Detection

Myth there are a finite number of bugs

niscratz on Flickr

Myth here are a finite number of detectable bugs

NASA

Myth all severity one bugs can be found before release

Fred Brooks at Etsy

Myth software is built to specifications

Myth at some point software is finished

Myth most bugs have complex unpredictable causes

The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmers own creation

mdash Edsger Dijkstra

The whole time Irsquom programming Irsquom constantly checking my assumptions

mdashRasmus Lerdorf

loriabys

As youre about to add a comment ask yourself ldquoHow can I improve the code so that this comment isnt neededrdquo Improve the code and then document it to make it even clearer

mdash Steve McConnell

Debugging is twice as hard as writing the code in the first place Therefore if you write the code as cleverly as possible you are by definition not smart enough to debug it

mdashBrian Kernighan

No blame

Many Small Anomalies Combined

An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses

mdash Wikipedia

John Allspaw

Prioritize the elimination of small errors

John Allspaw

Focus less on mitigation of large catastrophic failures

Optimize for recovery rather than failure prevention

Failure is inevitable

Richard Avedon

Unit testing is great for preventing small errors

John Allspaw

Resilience Not ldquoQualityrdquo

John Allspaw

NASA

Readable code

NASA

Reasonable test coverage

NASA

Sane architecture

NASA

Good debugging tools

NASA

An engineering culture that values refactoring

NASA

Measurable goals

NASA

Manual TestingBut probably not the kind yoursquore thinking of

NASA

Real-Time Monitoring is the new face of testing

Etsy

Etsy

Etsy

Anomaly detection is hard

Greg and Tim Hildebrandt

NASA

Watching the graphs

NASA

As of 2012 Etsy collected well over a quarter million real-time metrics

NASA

Deciding which metrics matter is a human problem

NASA

Everyone watches some subset of the graphs

NASA

Human vision is an excellent tool for anomaly detection

NASA

QA happens when

NASA

Exploratory testing can be performed at any time

NASA

Rigorous scientific approach

NASA

Focus on customer satisfaction

NASA

Less focus on product specifications

NASA

Exploratory Testing is equally useful before or after a release

NASA

Just Quality

NASA

ldquoAssurancerdquo is a terrible word Letrsquos discard it

NASA

Quality exists but itrsquos tricky to assure or prove that

NASA

Therersquos no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo

NASA

Customer ExperienceThough therersquos no formal proof for that either

NASA

Exploratory Testing addresses areas that Developer Testing doesnrsquot

NASA

Developer Testing validates assumptions

NASA

The Independent Testerrsquos job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 13: Continuous Improvement (GroupOn, Palo Alto 2013)

Constant tweaks to live features

dogpose on Flickr

Large features are deployed piecemeal over time

NASA

Every feature is part of an AB campaign

Joy and Jon

Dark launches

NASA

Opt-in experiments

NASA

Partial rollouts

Wikipedia

Config Flags

Joe Thomissen on Flickr

Wire-Offs

if ($cfg[new_search]) new hotness$resp = search_solr()

else old busted$resp = search_grep()

$cfg = array( checkout =gt true homepage =gt true profiles =gt true new_search =gt false)

NASA

There is no ldquodone donerdquo

NASA

Observed Behavior Of Complex Systems

NASA

Emergent behaviors require unplanned responses

NASA

Improvements are discovered rather than designed

NASA

Users of the system have complex expectations

NASA

Complex systems are never ldquocompleterdquo

NASA

QA Happens When

NASA

First of all what is ldquoQuality Assurancerdquo

NASA

QA assuring that there are no defects

NASA

It is impossible to prove the absence of defects

Lukjonis

There will always be bugs in production

Testing is everyonersquos job

Library of Congress

The Jargon File

Myths About Bug Detection

Myth there are a finite number of bugs

niscratz on Flickr

Myth here are a finite number of detectable bugs

NASA

Myth all severity one bugs can be found before release

Fred Brooks at Etsy

Myth software is built to specifications

Myth at some point software is finished

Myth most bugs have complex unpredictable causes

The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmers own creation

mdash Edsger Dijkstra

The whole time Irsquom programming Irsquom constantly checking my assumptions

mdashRasmus Lerdorf

loriabys

As youre about to add a comment ask yourself ldquoHow can I improve the code so that this comment isnt neededrdquo Improve the code and then document it to make it even clearer

mdash Steve McConnell

Debugging is twice as hard as writing the code in the first place Therefore if you write the code as cleverly as possible you are by definition not smart enough to debug it

mdashBrian Kernighan

No blame

Many Small Anomalies Combined

An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses

mdash Wikipedia

John Allspaw

Prioritize the elimination of small errors

John Allspaw

Focus less on mitigation of large catastrophic failures

Optimize for recovery rather than failure prevention

Failure is inevitable

Richard Avedon

Unit testing is great for preventing small errors

John Allspaw

Resilience Not ldquoQualityrdquo

John Allspaw

NASA

Readable code

NASA

Reasonable test coverage

NASA

Sane architecture

NASA

Good debugging tools

NASA

An engineering culture that values refactoring

NASA

Measurable goals

NASA

Manual TestingBut probably not the kind yoursquore thinking of

NASA

Real-Time Monitoring is the new face of testing

Etsy

Etsy

Etsy

Anomaly detection is hard

Greg and Tim Hildebrandt

NASA

Watching the graphs

NASA

As of 2012 Etsy collected well over a quarter million real-time metrics

NASA

Deciding which metrics matter is a human problem

NASA

Everyone watches some subset of the graphs

NASA

Human vision is an excellent tool for anomaly detection

NASA

QA happens when

NASA

Exploratory testing can be performed at any time

NASA

Rigorous scientific approach

NASA

Focus on customer satisfaction

NASA

Less focus on product specifications

NASA

Exploratory Testing is equally useful before or after a release

NASA

Just Quality

NASA

ldquoAssurancerdquo is a terrible word Letrsquos discard it

NASA

Quality exists but itrsquos tricky to assure or prove that

NASA

Therersquos no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo

NASA

Customer ExperienceThough therersquos no formal proof for that either

NASA

Exploratory Testing addresses areas that Developer Testing doesnrsquot

NASA

Developer Testing validates assumptions

NASA

The Independent Testerrsquos job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 14: Continuous Improvement (GroupOn, Palo Alto 2013)

dogpose on Flickr

Large features are deployed piecemeal over time

NASA

Every feature is part of an AB campaign

Joy and Jon

Dark launches

NASA

Opt-in experiments

NASA

Partial rollouts

Wikipedia

Config Flags

Joe Thomissen on Flickr

Wire-Offs

if ($cfg[new_search]) new hotness$resp = search_solr()

else old busted$resp = search_grep()

$cfg = array( checkout =gt true homepage =gt true profiles =gt true new_search =gt false)

NASA

There is no ldquodone donerdquo

NASA

Observed Behavior Of Complex Systems

NASA

Emergent behaviors require unplanned responses

NASA

Improvements are discovered rather than designed

NASA

Users of the system have complex expectations

NASA

Complex systems are never ldquocompleterdquo

NASA

QA Happens When

NASA

First of all what is ldquoQuality Assurancerdquo

NASA

QA assuring that there are no defects

NASA

It is impossible to prove the absence of defects

Lukjonis

There will always be bugs in production

Testing is everyonersquos job

Library of Congress

The Jargon File

Myths About Bug Detection

Myth there are a finite number of bugs

niscratz on Flickr

Myth here are a finite number of detectable bugs

NASA

Myth all severity one bugs can be found before release

Fred Brooks at Etsy

Myth software is built to specifications

Myth at some point software is finished

Myth most bugs have complex unpredictable causes

The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmers own creation

mdash Edsger Dijkstra

The whole time Irsquom programming Irsquom constantly checking my assumptions

mdashRasmus Lerdorf

loriabys

As youre about to add a comment ask yourself ldquoHow can I improve the code so that this comment isnt neededrdquo Improve the code and then document it to make it even clearer

mdash Steve McConnell

Debugging is twice as hard as writing the code in the first place Therefore if you write the code as cleverly as possible you are by definition not smart enough to debug it

mdashBrian Kernighan

No blame

Many Small Anomalies Combined

An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses

mdash Wikipedia

John Allspaw

Prioritize the elimination of small errors

John Allspaw

Focus less on mitigation of large catastrophic failures

Optimize for recovery rather than failure prevention

Failure is inevitable

Richard Avedon

Unit testing is great for preventing small errors

John Allspaw

Resilience Not ldquoQualityrdquo

John Allspaw

NASA

Readable code

NASA

Reasonable test coverage

NASA

Sane architecture

NASA

Good debugging tools

NASA

An engineering culture that values refactoring

NASA

Measurable goals

NASA

Manual TestingBut probably not the kind yoursquore thinking of

NASA

Real-Time Monitoring is the new face of testing

Etsy

Etsy

Etsy

Anomaly detection is hard

Greg and Tim Hildebrandt

NASA

Watching the graphs

NASA

As of 2012, Etsy collected well over a quarter million real-time metrics

NASA

Deciding which metrics matter is a human problem

NASA

Everyone watches some subset of the graphs

NASA

Human vision is an excellent tool for anomaly detection

NASA
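A crude version of what the graph-watchers do can be automated: flag samples that deviate far from a trailing moving average. This is only a sketch of the idea under simplified assumptions; as the deck says, real anomaly detection is hard.

```python
# Minimal anomaly flagging: mark samples that deviate from a trailing
# moving average by more than `threshold` standard deviations of the
# window. A sketch only -- real metric streams need sturdier methods.
from statistics import mean, stdev

def anomalies(series, window=5, threshold=3.0):
    flagged = []
    for i in range(window, len(series)):
        recent = series[i - window:i]
        mu, sigma = mean(recent), stdev(recent)
        if sigma and abs(series[i] - mu) > threshold * sigma:
            flagged.append(i)
    return flagged

requests_per_sec = [100, 102, 99, 101, 100, 98, 250, 101, 99]
print(anomalies(requests_per_sec))  # the spike at index 6 is flagged
```

Note what the threshold cannot decide: which metrics matter, and whether a flagged spike is a deploy, a sale, or an outage. That remains the human problem the next slides describe.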

QA happens when

NASA

Exploratory testing can be performed at any time

NASA

Rigorous scientific approach

NASA

Focus on customer satisfaction

NASA

Less focus on product specifications

NASA

Exploratory Testing is equally useful before or after a release

NASA

Just Quality

NASA

"Assurance" is a terrible word. Let's discard it.

NASA

Quality exists, but it's tricky to assure (or prove) that

NASA

Therersquos no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

"Customer Experience" is a better term of art than "Quality"

NASA

Customer Experience. Though there's no formal proof for that either.

NASA

Exploratory Testing addresses areas that Developer Testing doesn't

NASA

Developer Testing validates assumptions

NASA

The Independent Tester's job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

"Most bugs, most of the time, are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level."

— Eric S. Raymond

NASA

Source diffs, logs. If your QA Analysts don't look at these, teach them.

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interfaces with users as individuals, rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Efficiency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks than slower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks, services, and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state. Always roll forward. When it's desirable to revert a previous change, do that as part of a new commit.

NASA

Instead of rolling back fix the problem and move on

NASA
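The idea that "an SCM revert is a changeset" can be shown with a toy model: history is an append-only log, and undoing a bad change means appending its inverse as a new commit, never rewinding. The dictionaries and keys below are illustrative only.

```python
# Toy model of fail-forward: the revert of a bad deploy is itself a
# new commit appended to history, not a rewind of HEAD.
# The "search" key and its values are hypothetical examples.
history = []   # append-only commit log
state = {}     # currently deployed state

def commit(delta):
    """Apply a changeset and record it in history."""
    history.append(delta)
    state.update(delta)

commit({"search": "grep"})   # old busted
commit({"search": "solr"})   # bad deploy
commit({"search": "grep"})   # the revert is just another changeset

print(state["search"], len(history))  # back to grep; history grew to 3
```

In git terms this is the difference between `git revert` (a new commit) and resetting to an earlier revision; the deployed state goes back, but the history only ever moves forward.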

Let go of the idea of "last stable release"

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further Reading

"How Google Tests Software," James Whittaker

"Look At Your Data," John Rauser

"Optimizing For Developer Happiness," Chad Dickerson

"Outages, Postmortems, and Human Error," John Allspaw

http://en.wikipedia.org/wiki/Swiss_cheese_model

"What Is Exploratory Testing?," James Bach

Questions?

noahsussman · [email protected] · infiniteundo.com

Page 15: Continuous Improvement (GroupOn, Palo Alto 2013)

NASA

Every feature is part of an AB campaign

Joy and Jon

Dark launches

NASA

Opt-in experiments

NASA

Partial rollouts

Wikipedia

Config Flags

Joe Thomissen on Flickr

Wire-Offs

if ($cfg[new_search]) new hotness$resp = search_solr()

else old busted$resp = search_grep()

$cfg = array( checkout =gt true homepage =gt true profiles =gt true new_search =gt false)

NASA

There is no ldquodone donerdquo

NASA

Observed Behavior Of Complex Systems

NASA

Emergent behaviors require unplanned responses

NASA

Improvements are discovered rather than designed

NASA

Users of the system have complex expectations

NASA

Complex systems are never ldquocompleterdquo

NASA

QA Happens When

NASA

First of all what is ldquoQuality Assurancerdquo

NASA

QA assuring that there are no defects

NASA

It is impossible to prove the absence of defects

Lukjonis

There will always be bugs in production

Testing is everyonersquos job

Library of Congress

The Jargon File

Myths About Bug Detection

Myth there are a finite number of bugs

niscratz on Flickr

Myth here are a finite number of detectable bugs

NASA

Myth all severity one bugs can be found before release

Fred Brooks at Etsy

Myth software is built to specifications

Myth at some point software is finished

Myth most bugs have complex unpredictable causes

The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmers own creation

mdash Edsger Dijkstra

The whole time Irsquom programming Irsquom constantly checking my assumptions

mdashRasmus Lerdorf

loriabys

As youre about to add a comment ask yourself ldquoHow can I improve the code so that this comment isnt neededrdquo Improve the code and then document it to make it even clearer

mdash Steve McConnell

Debugging is twice as hard as writing the code in the first place Therefore if you write the code as cleverly as possible you are by definition not smart enough to debug it

mdashBrian Kernighan

No blame

Many Small Anomalies Combined

An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses

mdash Wikipedia

John Allspaw

Prioritize the elimination of small errors

John Allspaw

Focus less on mitigation of large catastrophic failures

Optimize for recovery rather than failure prevention

Failure is inevitable

Richard Avedon

Unit testing is great for preventing small errors

John Allspaw

Resilience Not ldquoQualityrdquo

John Allspaw

NASA

Readable code

NASA

Reasonable test coverage

NASA

Sane architecture

NASA

Good debugging tools

NASA

An engineering culture that values refactoring

NASA

Measurable goals

NASA

Manual TestingBut probably not the kind yoursquore thinking of

NASA

Real-Time Monitoring is the new face of testing

Etsy

Etsy

Etsy

Anomaly detection is hard

Greg and Tim Hildebrandt

NASA

Watching the graphs

NASA

As of 2012 Etsy collected well over a quarter million real-time metrics

NASA

Deciding which metrics matter is a human problem

NASA

Everyone watches some subset of the graphs

NASA

Human vision is an excellent tool for anomaly detection

NASA

QA happens when

NASA

Exploratory testing can be performed at any time

NASA

Rigorous scientific approach

NASA

Focus on customer satisfaction

NASA

Less focus on product specifications

NASA

Exploratory Testing is equally useful before or after a release

NASA

Just Quality

NASA

ldquoAssurancerdquo is a terrible word Letrsquos discard it

NASA

Quality exists but itrsquos tricky to assure or prove that

NASA

Therersquos no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo

NASA

Customer ExperienceThough therersquos no formal proof for that either

NASA

Exploratory Testing addresses areas that Developer Testing doesnrsquot

NASA

Developer Testing validates assumptions

NASA

The Independent Testerrsquos job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 16: Continuous Improvement (GroupOn, Palo Alto 2013)

Joy and Jon

Dark launches

NASA

Opt-in experiments

NASA

Partial rollouts

Wikipedia

Config Flags

Joe Thomissen on Flickr

Wire-Offs

if ($cfg[new_search]) new hotness$resp = search_solr()

else old busted$resp = search_grep()

$cfg = array( checkout =gt true homepage =gt true profiles =gt true new_search =gt false)

NASA

There is no ldquodone donerdquo

NASA

Observed Behavior Of Complex Systems

NASA

Emergent behaviors require unplanned responses

NASA

Improvements are discovered rather than designed

NASA

Users of the system have complex expectations

NASA

Complex systems are never ldquocompleterdquo

NASA

QA Happens When

NASA

First of all what is ldquoQuality Assurancerdquo

NASA

QA assuring that there are no defects

NASA

It is impossible to prove the absence of defects

Lukjonis

There will always be bugs in production

Testing is everyonersquos job

Library of Congress

The Jargon File

Myths About Bug Detection

Myth there are a finite number of bugs

niscratz on Flickr

Myth here are a finite number of detectable bugs

NASA

Myth all severity one bugs can be found before release

Fred Brooks at Etsy

Myth software is built to specifications

Myth at some point software is finished

Myth most bugs have complex unpredictable causes

The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmers own creation

mdash Edsger Dijkstra

The whole time Irsquom programming Irsquom constantly checking my assumptions

mdashRasmus Lerdorf

loriabys

As youre about to add a comment ask yourself ldquoHow can I improve the code so that this comment isnt neededrdquo Improve the code and then document it to make it even clearer

mdash Steve McConnell

Debugging is twice as hard as writing the code in the first place Therefore if you write the code as cleverly as possible you are by definition not smart enough to debug it

mdashBrian Kernighan

No blame

Many Small Anomalies Combined

An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses

mdash Wikipedia

John Allspaw

Prioritize the elimination of small errors

John Allspaw

Focus less on mitigation of large catastrophic failures

Optimize for recovery rather than failure prevention

Failure is inevitable

Richard Avedon

Unit testing is great for preventing small errors

John Allspaw

Resilience Not ldquoQualityrdquo

John Allspaw

NASA

Readable code

NASA

Reasonable test coverage

NASA

Sane architecture

NASA

Good debugging tools

NASA

An engineering culture that values refactoring

NASA

Measurable goals

NASA

Manual TestingBut probably not the kind yoursquore thinking of

NASA

Real-Time Monitoring is the new face of testing

Etsy

Etsy

Etsy

Anomaly detection is hard

Greg and Tim Hildebrandt

NASA

Watching the graphs

NASA

As of 2012 Etsy collected well over a quarter million real-time metrics

NASA

Deciding which metrics matter is a human problem

NASA

Everyone watches some subset of the graphs

NASA

Human vision is an excellent tool for anomaly detection

NASA

QA happens when

NASA

Exploratory testing can be performed at any time

NASA

Rigorous scientific approach

NASA

Focus on customer satisfaction

NASA

Less focus on product specifications

NASA

Exploratory Testing is equally useful before or after a release

NASA

Just Quality

NASA

ldquoAssurancerdquo is a terrible word Letrsquos discard it

NASA

Quality exists but itrsquos tricky to assure or prove that

NASA

Therersquos no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo

NASA

Customer ExperienceThough therersquos no formal proof for that either

NASA

Exploratory Testing addresses areas that Developer Testing doesnrsquot

NASA

Developer Testing validates assumptions

NASA

The Independent Testerrsquos job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 17: Continuous Improvement (GroupOn, Palo Alto 2013)

NASA

Opt-in experiments

NASA

Partial rollouts

Wikipedia

Config Flags

Joe Thomissen on Flickr

Wire-Offs

if ($cfg[new_search]) new hotness$resp = search_solr()

else old busted$resp = search_grep()

$cfg = array( checkout =gt true homepage =gt true profiles =gt true new_search =gt false)

NASA

There is no ldquodone donerdquo

NASA

Observed Behavior Of Complex Systems

NASA

Emergent behaviors require unplanned responses

NASA

Improvements are discovered rather than designed

NASA

Users of the system have complex expectations

NASA

Complex systems are never ldquocompleterdquo

NASA

QA Happens When

NASA

First of all what is ldquoQuality Assurancerdquo

NASA

QA assuring that there are no defects

NASA

It is impossible to prove the absence of defects

Lukjonis

There will always be bugs in production

Testing is everyonersquos job

Library of Congress

The Jargon File

Myths About Bug Detection

Myth there are a finite number of bugs

niscratz on Flickr

Myth here are a finite number of detectable bugs

NASA

Myth all severity one bugs can be found before release

Fred Brooks at Etsy

Myth software is built to specifications

Myth at some point software is finished

Myth most bugs have complex unpredictable causes

The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmers own creation

mdash Edsger Dijkstra

The whole time Irsquom programming Irsquom constantly checking my assumptions

mdashRasmus Lerdorf

loriabys

As youre about to add a comment ask yourself ldquoHow can I improve the code so that this comment isnt neededrdquo Improve the code and then document it to make it even clearer

mdash Steve McConnell

Debugging is twice as hard as writing the code in the first place Therefore if you write the code as cleverly as possible you are by definition not smart enough to debug it

mdashBrian Kernighan

No blame

Many Small Anomalies Combined

An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses

mdash Wikipedia

John Allspaw

Prioritize the elimination of small errors

John Allspaw

Focus less on mitigation of large catastrophic failures

Optimize for recovery rather than failure prevention

Failure is inevitable

Richard Avedon

Unit testing is great for preventing small errors

John Allspaw

Resilience Not ldquoQualityrdquo

John Allspaw

NASA

Readable code

NASA

Reasonable test coverage

NASA

Sane architecture

NASA

Good debugging tools

NASA

An engineering culture that values refactoring

NASA

Measurable goals

NASA

Manual TestingBut probably not the kind yoursquore thinking of

NASA

Real-Time Monitoring is the new face of testing

Etsy

Etsy

Etsy

Anomaly detection is hard

Greg and Tim Hildebrandt

NASA

Watching the graphs

NASA

As of 2012 Etsy collected well over a quarter million real-time metrics

NASA

Deciding which metrics matter is a human problem

NASA

Everyone watches some subset of the graphs

NASA

Human vision is an excellent tool for anomaly detection

NASA

QA happens when

NASA

Exploratory testing can be performed at any time

NASA

Rigorous scientific approach

NASA

Focus on customer satisfaction

NASA

Less focus on product specifications

NASA

Exploratory Testing is equally useful before or after a release

NASA

Just Quality

NASA

ldquoAssurancerdquo is a terrible word Letrsquos discard it

NASA

Quality exists but itrsquos tricky to assure or prove that

NASA

Therersquos no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo

NASA

Customer ExperienceThough therersquos no formal proof for that either

NASA

Exploratory Testing addresses areas that Developer Testing doesnrsquot

NASA

Developer Testing validates assumptions

NASA

The Independent Testerrsquos job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 18: Continuous Improvement (GroupOn, Palo Alto 2013)

NASA

Partial rollouts

Wikipedia

Config Flags

Joe Thomissen on Flickr

Wire-Offs

if ($cfg[new_search]) new hotness$resp = search_solr()

else old busted$resp = search_grep()

$cfg = array( checkout =gt true homepage =gt true profiles =gt true new_search =gt false)

NASA

There is no ldquodone donerdquo

NASA

Observed Behavior Of Complex Systems

NASA

Emergent behaviors require unplanned responses

NASA

Improvements are discovered rather than designed

NASA

Users of the system have complex expectations

NASA

Complex systems are never ldquocompleterdquo

NASA

QA Happens When

NASA

First of all what is ldquoQuality Assurancerdquo

NASA

QA assuring that there are no defects

NASA

It is impossible to prove the absence of defects

Lukjonis

There will always be bugs in production

Testing is everyonersquos job

Library of Congress

The Jargon File

Myths About Bug Detection

Myth there are a finite number of bugs

niscratz on Flickr

Myth here are a finite number of detectable bugs

NASA

Myth all severity one bugs can be found before release

Fred Brooks at Etsy

Myth software is built to specifications

Myth at some point software is finished

Myth most bugs have complex unpredictable causes

The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmers own creation

mdash Edsger Dijkstra

The whole time Irsquom programming Irsquom constantly checking my assumptions

mdashRasmus Lerdorf

loriabys

As youre about to add a comment ask yourself ldquoHow can I improve the code so that this comment isnt neededrdquo Improve the code and then document it to make it even clearer

mdash Steve McConnell

Debugging is twice as hard as writing the code in the first place Therefore if you write the code as cleverly as possible you are by definition not smart enough to debug it

mdashBrian Kernighan

No blame

Many Small Anomalies Combined

An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses

mdash Wikipedia

John Allspaw

Prioritize the elimination of small errors

John Allspaw

Focus less on mitigation of large catastrophic failures

Optimize for recovery rather than failure prevention

Failure is inevitable

Richard Avedon

Unit testing is great for preventing small errors

John Allspaw

Resilience Not ldquoQualityrdquo

John Allspaw

NASA

Readable code

NASA

Reasonable test coverage

NASA

Sane architecture

NASA

Good debugging tools

NASA

An engineering culture that values refactoring

NASA

Measurable goals

NASA

Manual TestingBut probably not the kind yoursquore thinking of

NASA

Real-Time Monitoring is the new face of testing

Etsy

Etsy

Etsy

Anomaly detection is hard

Greg and Tim Hildebrandt

NASA

Watching the graphs

NASA

As of 2012 Etsy collected well over a quarter million real-time metrics

NASA

Deciding which metrics matter is a human problem

NASA

Everyone watches some subset of the graphs

NASA

Human vision is an excellent tool for anomaly detection

NASA

QA happens when

NASA

Exploratory testing can be performed at any time

NASA

Rigorous scientific approach

NASA

Focus on customer satisfaction

NASA

Less focus on product specifications

NASA

Exploratory Testing is equally useful before or after a release

NASA

Just Quality

NASA

ldquoAssurancerdquo is a terrible word Letrsquos discard it

NASA

Quality exists but itrsquos tricky to assure or prove that

NASA

Therersquos no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo

NASA

Customer ExperienceThough therersquos no formal proof for that either

NASA

Exploratory Testing addresses areas that Developer Testing doesnrsquot

NASA

Developer Testing validates assumptions

NASA

The Independent Testerrsquos job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 19: Continuous Improvement (GroupOn, Palo Alto 2013)

Wikipedia

Config Flags

Joe Thomissen on Flickr

Wire-Offs

if ($cfg[new_search]) new hotness$resp = search_solr()

else old busted$resp = search_grep()

$cfg = array( checkout =gt true homepage =gt true profiles =gt true new_search =gt false)

NASA

There is no ldquodone donerdquo

NASA

Observed Behavior Of Complex Systems

NASA

Emergent behaviors require unplanned responses

NASA

Improvements are discovered rather than designed

NASA

Users of the system have complex expectations

NASA

Complex systems are never ldquocompleterdquo

NASA

QA Happens When

NASA

First of all what is ldquoQuality Assurancerdquo

NASA

QA assuring that there are no defects

NASA

It is impossible to prove the absence of defects

Lukjonis

There will always be bugs in production

Testing is everyonersquos job

Library of Congress

The Jargon File

Myths About Bug Detection

Myth there are a finite number of bugs

niscratz on Flickr

Myth here are a finite number of detectable bugs

NASA

Myth all severity one bugs can be found before release

Fred Brooks at Etsy

Myth software is built to specifications

Myth at some point software is finished

Myth most bugs have complex unpredictable causes

The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmers own creation

mdash Edsger Dijkstra

The whole time Irsquom programming Irsquom constantly checking my assumptions

mdashRasmus Lerdorf

loriabys

As youre about to add a comment ask yourself ldquoHow can I improve the code so that this comment isnt neededrdquo Improve the code and then document it to make it even clearer

mdash Steve McConnell

Debugging is twice as hard as writing the code in the first place Therefore if you write the code as cleverly as possible you are by definition not smart enough to debug it

— Brian Kernighan

No blame

Many Small Anomalies Combined

An organization's defenses against failure are a series of barriers, represented as slices of Swiss cheese. The holes in the cheese represent weaknesses in individual parts of the system. Failures occur when a hazard passes through all of the holes in all of the defenses.

— Wikipedia

John Allspaw

Prioritize the elimination of small errors

John Allspaw

Focus less on mitigation of large catastrophic failures

Optimize for recovery rather than failure prevention

Failure is inevitable

Richard Avedon

Unit testing is great for preventing small errors
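A minimal illustration of that point (in Python; `normalize_tag` is a hypothetical helper, not code from the deck): a couple of one-line assertions pin down exactly the kind of small error that would otherwise slip into production.

```python
def normalize_tag(tag):
    """Trim whitespace and lowercase a user-entered tag."""
    return tag.strip().lower()

# Small, cheap assertions catch small errors before they ship.
assert normalize_tag("  Vintage ") == "vintage"
assert normalize_tag("HANDMADE") == "handmade"
```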

John Allspaw

Resilience, Not "Quality"

John Allspaw

NASA

Readable code

NASA

Reasonable test coverage

NASA

Sane architecture

NASA

Good debugging tools

NASA

An engineering culture that values refactoring

NASA

Measurable goals

NASA

Manual Testing. But probably not the kind you're thinking of.

NASA

Real-Time Monitoring is the new face of testing

Etsy

Etsy

Etsy

Anomaly detection is hard

Greg and Tim Hildebrandt

NASA

Watching the graphs

NASA

As of 2012, Etsy collected well over a quarter million real-time metrics.

NASA

Deciding which metrics matter is a human problem

NASA

Everyone watches some subset of the graphs

NASA

Human vision is an excellent tool for anomaly detection
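Simple statistical checks can help decide which graphs deserve a human look. A minimal sketch (illustrative only, not Etsy's monitoring pipeline): flag any point that sits more than three standard deviations from a rolling baseline.

```python
from statistics import mean, stdev

def anomalies(series, window=10, threshold=3.0):
    """Flag indices whose value is more than `threshold` standard
    deviations from the mean of the preceding `window` points."""
    flagged = []
    for i in range(window, len(series)):
        base = series[i - window:i]
        m, s = mean(base), stdev(base)
        if s > 0 and abs(series[i] - m) / s > threshold:
            flagged.append(i)
    return flagged

# A steady metric with one spike: only the spike is flagged.
metric = [100, 101, 99, 100, 102, 98, 100, 101, 99, 100, 500, 100]
print(anomalies(metric))  # [10]
```

Checks like this triage the graphs; the judgment about whether a flagged spike actually matters is still the human part of the problem.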

NASA

QA happens when

NASA

Exploratory testing can be performed at any time

NASA

Rigorous scientific approach

NASA

Focus on customer satisfaction

NASA

Less focus on product specifications

NASA

Exploratory Testing is equally useful before or after a release

NASA

Just Quality

NASA

"Assurance" is a terrible word. Let's discard it.

NASA

Quality exists, but it's tricky to assure or prove that.

NASA

There's no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

"Customer Experience" is a better term of art than "Quality"

NASA

Customer Experience. Though there's no formal proof for that either.

NASA

Exploratory Testing addresses areas that Developer Testing doesn't

NASA

Developer Testing validates assumptions

NASA

The Independent Tester's job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

"Most bugs, most of the time, are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level."

— Eric S. Raymond

NASA

Source diffs, logs. If your QA Analysts don't look at these, teach them.

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interfaces with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Efficiency-Thoroughness Trade-Off

NASA

Rapid release cycles have different risks than slower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks, services, and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state. Always roll forward. When it's desirable to revert a previous change, do that as part of a new commit.

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of "last stable release"

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further Reading: "How Google Tests Software," James Whittaker

"Look At Your Data," John Rauser

"Optimizing For Developer Happiness," Chad Dickerson

"Outages, Postmortems, and Human Error," John Allspaw

http://en.wikipedia.org/wiki/Swiss_cheese_model

"What Is Exploratory Testing," James Bach

Questions?

Noah Sussman | noahsussman.com | infiniteundo.com

Page 20: Continuous Improvement (GroupOn, Palo Alto 2013)

Joe Thomissen on Flickr

Wire-Offs

if ($cfg[new_search]) new hotness$resp = search_solr()

else old busted$resp = search_grep()

$cfg = array( checkout =gt true homepage =gt true profiles =gt true new_search =gt false)

NASA

There is no ldquodone donerdquo

NASA

Observed Behavior Of Complex Systems

NASA

Emergent behaviors require unplanned responses

NASA

Improvements are discovered rather than designed

NASA

Users of the system have complex expectations

NASA

Complex systems are never ldquocompleterdquo

NASA

QA Happens When

NASA

First of all what is ldquoQuality Assurancerdquo

NASA

QA assuring that there are no defects

NASA

It is impossible to prove the absence of defects

Lukjonis

There will always be bugs in production

Testing is everyonersquos job

Library of Congress

The Jargon File

Myths About Bug Detection

Myth there are a finite number of bugs

niscratz on Flickr

Myth here are a finite number of detectable bugs

NASA

Myth all severity one bugs can be found before release

Fred Brooks at Etsy

Myth software is built to specifications

Myth at some point software is finished

Myth most bugs have complex unpredictable causes

The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmers own creation

mdash Edsger Dijkstra

The whole time Irsquom programming Irsquom constantly checking my assumptions

mdashRasmus Lerdorf

loriabys

As youre about to add a comment ask yourself ldquoHow can I improve the code so that this comment isnt neededrdquo Improve the code and then document it to make it even clearer

mdash Steve McConnell

Debugging is twice as hard as writing the code in the first place Therefore if you write the code as cleverly as possible you are by definition not smart enough to debug it

mdashBrian Kernighan

No blame

Many Small Anomalies Combined

An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses

mdash Wikipedia

John Allspaw

Prioritize the elimination of small errors

John Allspaw

Focus less on mitigation of large catastrophic failures

Optimize for recovery rather than failure prevention

Failure is inevitable

Richard Avedon

Unit testing is great for preventing small errors

John Allspaw

Resilience Not ldquoQualityrdquo

John Allspaw

NASA

Readable code

NASA

Reasonable test coverage

NASA

Sane architecture

NASA

Good debugging tools

NASA

An engineering culture that values refactoring

NASA

Measurable goals

NASA

Manual TestingBut probably not the kind yoursquore thinking of

NASA

Real-Time Monitoring is the new face of testing

Etsy

Etsy

Etsy

Anomaly detection is hard

Greg and Tim Hildebrandt

NASA

Watching the graphs

NASA

As of 2012 Etsy collected well over a quarter million real-time metrics

NASA

Deciding which metrics matter is a human problem

NASA

Everyone watches some subset of the graphs

NASA

Human vision is an excellent tool for anomaly detection

NASA

QA happens when

NASA

Exploratory testing can be performed at any time

NASA

Rigorous scientific approach

NASA

Focus on customer satisfaction

NASA

Less focus on product specifications

NASA

Exploratory Testing is equally useful before or after a release

NASA

Just Quality

NASA

ldquoAssurancerdquo is a terrible word Letrsquos discard it

NASA

Quality exists but itrsquos tricky to assure or prove that

NASA

Therersquos no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo

NASA

Customer ExperienceThough therersquos no formal proof for that either

NASA

Exploratory Testing addresses areas that Developer Testing doesnrsquot

NASA

Developer Testing validates assumptions

NASA

The Independent Testerrsquos job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 21: Continuous Improvement (GroupOn, Palo Alto 2013)

if ($cfg[new_search]) new hotness$resp = search_solr()

else old busted$resp = search_grep()

$cfg = array( checkout =gt true homepage =gt true profiles =gt true new_search =gt false)

NASA

There is no ldquodone donerdquo

NASA

Observed Behavior Of Complex Systems

NASA

Emergent behaviors require unplanned responses

NASA

Improvements are discovered rather than designed

NASA

Users of the system have complex expectations

NASA

Complex systems are never ldquocompleterdquo

NASA

QA Happens When

NASA

First of all what is ldquoQuality Assurancerdquo

NASA

QA assuring that there are no defects

NASA

It is impossible to prove the absence of defects

Lukjonis

There will always be bugs in production

Testing is everyonersquos job

Library of Congress

The Jargon File

Myths About Bug Detection

Myth there are a finite number of bugs

niscratz on Flickr

Myth here are a finite number of detectable bugs

NASA

Myth all severity one bugs can be found before release

Fred Brooks at Etsy

Myth software is built to specifications

Myth at some point software is finished

Myth most bugs have complex unpredictable causes

The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmers own creation

mdash Edsger Dijkstra

The whole time Irsquom programming Irsquom constantly checking my assumptions

mdashRasmus Lerdorf

loriabys

As youre about to add a comment ask yourself ldquoHow can I improve the code so that this comment isnt neededrdquo Improve the code and then document it to make it even clearer

mdash Steve McConnell

Debugging is twice as hard as writing the code in the first place Therefore if you write the code as cleverly as possible you are by definition not smart enough to debug it

mdashBrian Kernighan

No blame

Many Small Anomalies Combined

An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses

mdash Wikipedia

John Allspaw

Prioritize the elimination of small errors

John Allspaw

Focus less on mitigation of large catastrophic failures

Optimize for recovery rather than failure prevention

Failure is inevitable

Richard Avedon

Unit testing is great for preventing small errors

John Allspaw

Resilience Not ldquoQualityrdquo

John Allspaw

NASA

Readable code

NASA

Reasonable test coverage

NASA

Sane architecture

NASA

Good debugging tools

NASA

An engineering culture that values refactoring

NASA

Measurable goals

NASA

Manual TestingBut probably not the kind yoursquore thinking of

NASA

Real-Time Monitoring is the new face of testing

Etsy

Etsy

Etsy

Anomaly detection is hard

Greg and Tim Hildebrandt

NASA

Watching the graphs

NASA

As of 2012 Etsy collected well over a quarter million real-time metrics

NASA

Deciding which metrics matter is a human problem

NASA

Everyone watches some subset of the graphs

NASA

Human vision is an excellent tool for anomaly detection

NASA

QA happens when

NASA

Exploratory testing can be performed at any time

NASA

Rigorous scientific approach

NASA

Focus on customer satisfaction

NASA

Less focus on product specifications

NASA

Exploratory Testing is equally useful before or after a release

NASA

Just Quality

NASA

ldquoAssurancerdquo is a terrible word Letrsquos discard it

NASA

Quality exists but itrsquos tricky to assure or prove that

NASA

Therersquos no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo

NASA

Customer ExperienceThough therersquos no formal proof for that either

NASA

Exploratory Testing addresses areas that Developer Testing doesnrsquot

NASA

Developer Testing validates assumptions

NASA

The Independent Testerrsquos job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 22: Continuous Improvement (GroupOn, Palo Alto 2013)

$cfg = array( checkout =gt true homepage =gt true profiles =gt true new_search =gt false)

NASA

There is no ldquodone donerdquo

NASA

Observed Behavior Of Complex Systems

NASA

Emergent behaviors require unplanned responses

NASA

Improvements are discovered rather than designed

NASA

Users of the system have complex expectations

NASA

Complex systems are never ldquocompleterdquo

NASA

QA Happens When

NASA

First of all what is ldquoQuality Assurancerdquo

NASA

QA assuring that there are no defects

NASA

It is impossible to prove the absence of defects

Lukjonis

There will always be bugs in production

Testing is everyonersquos job

Library of Congress

The Jargon File

Myths About Bug Detection

Myth there are a finite number of bugs

niscratz on Flickr

Myth here are a finite number of detectable bugs

NASA

Myth all severity one bugs can be found before release

Fred Brooks at Etsy

Myth software is built to specifications

Myth at some point software is finished

Myth most bugs have complex unpredictable causes

The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmers own creation

mdash Edsger Dijkstra

The whole time Irsquom programming Irsquom constantly checking my assumptions

mdashRasmus Lerdorf

loriabys

As youre about to add a comment ask yourself ldquoHow can I improve the code so that this comment isnt neededrdquo Improve the code and then document it to make it even clearer

mdash Steve McConnell

Debugging is twice as hard as writing the code in the first place Therefore if you write the code as cleverly as possible you are by definition not smart enough to debug it

mdashBrian Kernighan

No blame

Many Small Anomalies Combined

An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses

mdash Wikipedia

John Allspaw

Prioritize the elimination of small errors

John Allspaw

Focus less on mitigation of large catastrophic failures

Optimize for recovery rather than failure prevention

Failure is inevitable

Richard Avedon

Unit testing is great for preventing small errors

John Allspaw

Resilience Not ldquoQualityrdquo

John Allspaw

NASA

Readable code

NASA

Reasonable test coverage

NASA

Sane architecture

NASA

Good debugging tools

NASA

An engineering culture that values refactoring

NASA

Measurable goals

NASA

Manual TestingBut probably not the kind yoursquore thinking of

NASA

Real-Time Monitoring is the new face of testing

Etsy

Etsy

Etsy

Anomaly detection is hard

Greg and Tim Hildebrandt

NASA

Watching the graphs

NASA

As of 2012 Etsy collected well over a quarter million real-time metrics

NASA

Deciding which metrics matter is a human problem

NASA

Everyone watches some subset of the graphs

NASA

Human vision is an excellent tool for anomaly detection

NASA

QA happens when

NASA

Exploratory testing can be performed at any time

NASA

Rigorous scientific approach

NASA

Focus on customer satisfaction

NASA

Less focus on product specifications

NASA

Exploratory Testing is equally useful before or after a release

NASA

Just Quality

NASA

ldquoAssurancerdquo is a terrible word Letrsquos discard it

NASA

Quality exists but itrsquos tricky to assure or prove that

NASA

Therersquos no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo

NASA

Customer ExperienceThough therersquos no formal proof for that either

NASA

Exploratory Testing addresses areas that Developer Testing doesnrsquot

NASA

Developer Testing validates assumptions

NASA

The Independent Testerrsquos job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 23: Continuous Improvement (GroupOn, Palo Alto 2013)

NASA

There is no ldquodone donerdquo

NASA

Observed Behavior Of Complex Systems

NASA

Emergent behaviors require unplanned responses

NASA

Improvements are discovered rather than designed

NASA

Users of the system have complex expectations

NASA

Complex systems are never ldquocompleterdquo

NASA

QA Happens When

NASA

First of all what is ldquoQuality Assurancerdquo

NASA

QA assuring that there are no defects

NASA

It is impossible to prove the absence of defects

Lukjonis

There will always be bugs in production

Testing is everyonersquos job

Library of Congress

The Jargon File

Myths About Bug Detection

Myth there are a finite number of bugs

niscratz on Flickr

Myth here are a finite number of detectable bugs

NASA

Myth all severity one bugs can be found before release

Fred Brooks at Etsy

Myth software is built to specifications

Myth at some point software is finished

Myth most bugs have complex unpredictable causes

The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmers own creation

mdash Edsger Dijkstra

The whole time Irsquom programming Irsquom constantly checking my assumptions

mdashRasmus Lerdorf

loriabys

As youre about to add a comment ask yourself ldquoHow can I improve the code so that this comment isnt neededrdquo Improve the code and then document it to make it even clearer

mdash Steve McConnell

Debugging is twice as hard as writing the code in the first place Therefore if you write the code as cleverly as possible you are by definition not smart enough to debug it

mdashBrian Kernighan

No blame

Many Small Anomalies Combined

An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses

mdash Wikipedia

John Allspaw

Prioritize the elimination of small errors

John Allspaw

Focus less on mitigation of large catastrophic failures

Optimize for recovery rather than failure prevention

Failure is inevitable

Richard Avedon

Unit testing is great for preventing small errors

John Allspaw

Resilience Not ldquoQualityrdquo

John Allspaw

NASA

Readable code

NASA

Reasonable test coverage

NASA

Sane architecture

NASA

Good debugging tools

NASA

An engineering culture that values refactoring

NASA

Measurable goals

NASA

Manual TestingBut probably not the kind yoursquore thinking of

NASA

Real-Time Monitoring is the new face of testing

Etsy

Etsy

Etsy

Anomaly detection is hard

Greg and Tim Hildebrandt

NASA

Watching the graphs

NASA

As of 2012 Etsy collected well over a quarter million real-time metrics

NASA

Deciding which metrics matter is a human problem

NASA

Everyone watches some subset of the graphs

NASA

Human vision is an excellent tool for anomaly detection

NASA

QA happens when

NASA

Exploratory testing can be performed at any time

NASA

Rigorous scientific approach

NASA

Focus on customer satisfaction

NASA

Less focus on product specifications

NASA

Exploratory Testing is equally useful before or after a release

NASA

Just Quality

NASA

ldquoAssurancerdquo is a terrible word Letrsquos discard it

NASA

Quality exists but itrsquos tricky to assure or prove that

NASA

Therersquos no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo

NASA

Customer ExperienceThough therersquos no formal proof for that either

NASA

Exploratory Testing addresses areas that Developer Testing doesnrsquot

NASA

Developer Testing validates assumptions

NASA

The Independent Testerrsquos job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 24: Continuous Improvement (GroupOn, Palo Alto 2013)

NASA

Observed Behavior Of Complex Systems

NASA

Emergent behaviors require unplanned responses

NASA

Improvements are discovered rather than designed

NASA

Users of the system have complex expectations

NASA

Complex systems are never ldquocompleterdquo

NASA

QA Happens When

NASA

First of all what is ldquoQuality Assurancerdquo

NASA

QA assuring that there are no defects

NASA

It is impossible to prove the absence of defects

Lukjonis

There will always be bugs in production

Testing is everyonersquos job

Library of Congress

The Jargon File

Myths About Bug Detection

Myth there are a finite number of bugs

niscratz on Flickr

Myth here are a finite number of detectable bugs

NASA

Myth all severity one bugs can be found before release

Fred Brooks at Etsy

Myth software is built to specifications

Myth at some point software is finished

Myth most bugs have complex unpredictable causes

The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmers own creation

mdash Edsger Dijkstra

"The whole time I'm programming, I'm constantly checking my assumptions."

— Rasmus Lerdorf

"As you're about to add a comment, ask yourself, 'How can I improve the code so that this comment isn't needed?' Improve the code and then document it to make it even clearer."

— Steve McConnell

"Debugging is twice as hard as writing the code in the first place. Therefore, if you write the code as cleverly as possible, you are, by definition, not smart enough to debug it."

— Brian Kernighan

No blame

Many Small Anomalies, Combined

"An organization's defenses against failure are a series of barriers, represented as slices of Swiss cheese. The holes in the cheese represent weaknesses in individual parts of the system. Failures occur when a hazard passes through all of the holes in all of the defenses."

— Wikipedia

Prioritize the elimination of small errors

Focus less on mitigation of large, catastrophic failures

Optimize for recovery rather than failure prevention

Failure is inevitable

Unit testing is great for preventing small errors

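The point about small errors can be made concrete with a unit test. A minimal sketch in Python (the `apply_discount` function and its values are hypothetical, not from the talk):

```python
def apply_discount(price_cents, percent):
    """Return the discounted price in cents, rounding down."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return price_cents * (100 - percent) // 100

# Small errors live at the boundaries; pin them down one assertion at a time.
assert apply_discount(1000, 25) == 750
assert apply_discount(999, 10) == 899    # integer math: rounds down, no float drift
assert apply_discount(1000, 0) == 1000
assert apply_discount(1000, 100) == 0
```

Boundary conditions, rounding, and bad input are exactly the small, cheap-to-catch mistakes that unit tests are good at preventing; they do much less for the large emergent failures discussed above.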
Resilience, Not "Quality"

Readable code

Reasonable test coverage

Sane architecture

Good debugging tools

An engineering culture that values refactoring

Measurable goals

Manual Testing (but probably not the kind you're thinking of)

Real-Time Monitoring is the new face of testing

Anomaly detection is hard

Watching the graphs

As of 2012, Etsy collected well over a quarter million real-time metrics

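Collecting metrics at that volume works because instrumentation is nearly free. StatsD, the metrics daemon Etsy open-sourced, accepts metrics as one-line UDP datagrams in the form `name:value|type`, fire-and-forget. A minimal client sketch (the metric names are made up; 8125 is StatsD's conventional port):

```python
import socket

def emit(metric, value, metric_type="c", host="127.0.0.1", port=8125):
    """Send one metric in the StatsD wire format: 'name:value|type'.
    UDP is fire-and-forget: if nothing is listening, the app is unaffected."""
    payload = f"{metric}:{value}|{metric_type}".encode()
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(payload, (host, port))
    return payload  # returned only so the sketch is easy to inspect

emit("checkout.completed", 1)         # a counter increment
emit("search.response_ms", 87, "ms")  # a timer sample
```

Because sending is non-blocking and lossy by design, engineers can sprinkle counters and timers through the codebase without worrying about the monitoring system slowing down production.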
Deciding which metrics matter is a human problem

Everyone watches some subset of the graphs

Human vision is an excellent tool for anomaly detection

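Why is human vision still ahead of the algorithms? The textbook automated approach, a trailing-window z-score, illustrates the problem: it catches a sharp spike, but the spike then inflates its own baseline so the detector goes blind right afterward, and it misses slow drifts while paging on every legitimate step change (a deploy, a sale). A sketch with synthetic data:

```python
from statistics import mean, stdev

def zscore_alerts(series, window=20, threshold=3.0):
    """Flag points more than `threshold` standard deviations
    from the trailing window's mean."""
    alerts = []
    for i in range(window, len(series)):
        tail = series[i - window:i]
        mu, sigma = mean(tail), stdev(tail)
        if sigma > 0 and abs(series[i] - mu) > threshold * sigma:
            alerts.append(i)
    return alerts

# Mildly noisy baseline with one sharp spike at index 25.
base = [100.0 + (i % 5) for i in range(30)]
spiked = base[:25] + [500.0] + base[25:]

print(zscore_alerts(spiked))  # only the spike is flagged; the window after it is blind
```

A person glancing at the graph sees all of this at once, which is why "everyone watches some subset of the graphs" is a workable strategy.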
QA happens when?

Exploratory testing can be performed at any time

Rigorous scientific approach

Focus on customer satisfaction

Less focus on product specifications

Exploratory Testing is equally useful before or after a release

Just Quality

"Assurance" is a terrible word. Let's discard it

Quality exists, but it's tricky to assure or prove that

There's no such thing as a formal proof of quality

Most of us would agree that quality exists

"Customer Experience" is a better term of art than "Quality"

Customer Experience (though there's no formal proof for that either)

Exploratory Testing addresses areas that Developer Testing doesn't

Developer Testing validates assumptions

The Independent Tester's job is to invalidate assumptions

Technology Informs Customer Experience

Exploratory Testing requires an understanding of the whole system

Exploratory Testing requires understanding how the system serves a community of users

Customer Experience is as much about technology as it is about product requirements

"Most bugs, most of the time, are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level."

— Eric S. Raymond

Source, diffs, logs. If your QA Analysts don't look at these, teach them

Customer Support

Your customer support operators spend more time talking to your users than anyone else

Customer Support interfaces with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

Efficiency-Thoroughness Trade-Off

Rapid release cycles have different risks than slower release cycles

Continuous Delivery does not alter the fundamental nature of risk

Test in both dev and prod

Detectable errors should be caught in dev

Undetectable errors must be worked out in production

Software exists in context

Networks, services, and people are always in flux

Small changesets are easier to debug

An SCM revert is a changeset

Large changesets are riskier and harder to debug

Fail Forward

Always deploy the HEAD revision of trunk

Never roll back to an earlier state. Always roll forward. When it's desirable to revert a previous change, do that as part of a new commit

Instead of rolling back, fix the problem and move on

Let go of the idea of "last stable release"

Focus less on satisfying the requirements

Watch the graphs

Listen to your customers

Build a culture of shared responsibility

Low-Ceremony Process

Iteratively improve your product

Further Reading

"How Google Tests Software" by James Whittaker

"Look At Your Data" by John Rauser

"Optimizing For Developer Happiness" by Chad Dickerson

"Outages, Postmortems, and Human Error" by John Allspaw

http://en.wikipedia.org/wiki/Swiss_cheese_model

"What Is Exploratory Testing?" by James Bach

Questions?

Noah Sussman · ns@noahsussman.com · infiniteundo.com

Page 25: Continuous Improvement (GroupOn, Palo Alto 2013)

NASA

Emergent behaviors require unplanned responses

NASA

Improvements are discovered rather than designed

NASA

Users of the system have complex expectations

NASA

Complex systems are never ldquocompleterdquo

NASA

QA Happens When

NASA

First of all what is ldquoQuality Assurancerdquo

NASA

QA assuring that there are no defects

NASA

It is impossible to prove the absence of defects

Lukjonis

There will always be bugs in production

Testing is everyonersquos job

Library of Congress

The Jargon File

Myths About Bug Detection

Myth there are a finite number of bugs

niscratz on Flickr

Myth here are a finite number of detectable bugs

NASA

Myth all severity one bugs can be found before release

Fred Brooks at Etsy

Myth software is built to specifications

Myth at some point software is finished

Myth most bugs have complex unpredictable causes

The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmers own creation

mdash Edsger Dijkstra

The whole time Irsquom programming Irsquom constantly checking my assumptions

mdashRasmus Lerdorf

loriabys

As youre about to add a comment ask yourself ldquoHow can I improve the code so that this comment isnt neededrdquo Improve the code and then document it to make it even clearer

mdash Steve McConnell

Debugging is twice as hard as writing the code in the first place Therefore if you write the code as cleverly as possible you are by definition not smart enough to debug it

mdashBrian Kernighan

No blame

Many Small Anomalies Combined

An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses

mdash Wikipedia

John Allspaw

Prioritize the elimination of small errors

John Allspaw

Focus less on mitigation of large catastrophic failures

Optimize for recovery rather than failure prevention

Failure is inevitable

Richard Avedon

Unit testing is great for preventing small errors

John Allspaw

Resilience Not ldquoQualityrdquo

John Allspaw

NASA

Readable code

NASA

Reasonable test coverage

NASA

Sane architecture

NASA

Good debugging tools

NASA

An engineering culture that values refactoring

NASA

Measurable goals

NASA

Manual TestingBut probably not the kind yoursquore thinking of

NASA

Real-Time Monitoring is the new face of testing

Etsy

Etsy

Etsy

Anomaly detection is hard

Greg and Tim Hildebrandt

NASA

Watching the graphs

NASA

As of 2012 Etsy collected well over a quarter million real-time metrics

NASA

Deciding which metrics matter is a human problem

NASA

Everyone watches some subset of the graphs

NASA

Human vision is an excellent tool for anomaly detection

NASA

QA happens when

NASA

Exploratory testing can be performed at any time

NASA

Rigorous scientific approach

NASA

Focus on customer satisfaction

NASA

Less focus on product specifications

NASA

Exploratory Testing is equally useful before or after a release

NASA

Just Quality

NASA

ldquoAssurancerdquo is a terrible word Letrsquos discard it

NASA

Quality exists but itrsquos tricky to assure or prove that

NASA

Therersquos no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo

NASA

Customer ExperienceThough therersquos no formal proof for that either

NASA

Exploratory Testing addresses areas that Developer Testing doesnrsquot

NASA

Developer Testing validates assumptions

NASA

The Independent Testerrsquos job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 26: Continuous Improvement (GroupOn, Palo Alto 2013)

NASA

Improvements are discovered rather than designed

NASA

Users of the system have complex expectations

NASA

Complex systems are never ldquocompleterdquo

NASA

QA Happens When

NASA

First of all what is ldquoQuality Assurancerdquo

NASA

QA assuring that there are no defects

NASA

It is impossible to prove the absence of defects

Lukjonis

There will always be bugs in production

Testing is everyonersquos job

Library of Congress

The Jargon File

Myths About Bug Detection

Myth there are a finite number of bugs

niscratz on Flickr

Myth here are a finite number of detectable bugs

NASA

Myth all severity one bugs can be found before release

Fred Brooks at Etsy

Myth software is built to specifications

Myth at some point software is finished

Myth most bugs have complex unpredictable causes

The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmers own creation

mdash Edsger Dijkstra

The whole time Irsquom programming Irsquom constantly checking my assumptions

mdashRasmus Lerdorf

loriabys

As youre about to add a comment ask yourself ldquoHow can I improve the code so that this comment isnt neededrdquo Improve the code and then document it to make it even clearer

mdash Steve McConnell

Debugging is twice as hard as writing the code in the first place Therefore if you write the code as cleverly as possible you are by definition not smart enough to debug it

mdashBrian Kernighan

No blame

Many Small Anomalies Combined

An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses

mdash Wikipedia

John Allspaw

Prioritize the elimination of small errors

John Allspaw

Focus less on mitigation of large catastrophic failures

Optimize for recovery rather than failure prevention

Failure is inevitable

Richard Avedon

Unit testing is great for preventing small errors

John Allspaw

Resilience Not ldquoQualityrdquo

John Allspaw

NASA

Readable code

NASA

Reasonable test coverage

NASA

Sane architecture

NASA

Good debugging tools

NASA

An engineering culture that values refactoring

NASA

Measurable goals

NASA

Manual TestingBut probably not the kind yoursquore thinking of

NASA

Real-Time Monitoring is the new face of testing

Etsy

Etsy

Etsy

Anomaly detection is hard

Greg and Tim Hildebrandt

NASA

Watching the graphs

NASA

As of 2012 Etsy collected well over a quarter million real-time metrics

NASA

Deciding which metrics matter is a human problem

NASA

Everyone watches some subset of the graphs

NASA

Human vision is an excellent tool for anomaly detection

NASA

QA happens when

NASA

Exploratory testing can be performed at any time

NASA

Rigorous scientific approach

NASA

Focus on customer satisfaction

NASA

Less focus on product specifications

NASA

Exploratory Testing is equally useful before or after a release

NASA

Just Quality

NASA

ldquoAssurancerdquo is a terrible word Letrsquos discard it

NASA

Quality exists but itrsquos tricky to assure or prove that

NASA

Therersquos no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo

NASA

Customer ExperienceThough therersquos no formal proof for that either

NASA

Exploratory Testing addresses areas that Developer Testing doesnrsquot

NASA

Developer Testing validates assumptions

NASA

The Independent Testerrsquos job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 27: Continuous Improvement (GroupOn, Palo Alto 2013)

NASA

Users of the system have complex expectations

NASA

Complex systems are never ldquocompleterdquo

NASA

QA Happens When

NASA

First of all what is ldquoQuality Assurancerdquo

NASA

QA assuring that there are no defects

NASA

It is impossible to prove the absence of defects

Lukjonis

There will always be bugs in production

Testing is everyonersquos job

Library of Congress

The Jargon File

Myths About Bug Detection

Myth there are a finite number of bugs

niscratz on Flickr

Myth here are a finite number of detectable bugs

NASA

Myth all severity one bugs can be found before release

Fred Brooks at Etsy

Myth software is built to specifications

Myth at some point software is finished

Myth most bugs have complex unpredictable causes

The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmers own creation

mdash Edsger Dijkstra

The whole time Irsquom programming Irsquom constantly checking my assumptions

mdashRasmus Lerdorf

loriabys

As youre about to add a comment ask yourself ldquoHow can I improve the code so that this comment isnt neededrdquo Improve the code and then document it to make it even clearer

mdash Steve McConnell

Debugging is twice as hard as writing the code in the first place Therefore if you write the code as cleverly as possible you are by definition not smart enough to debug it

mdashBrian Kernighan

No blame

Many Small Anomalies Combined

An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses

mdash Wikipedia

John Allspaw

Prioritize the elimination of small errors

John Allspaw

Focus less on mitigation of large catastrophic failures

Optimize for recovery rather than failure prevention

Failure is inevitable

Richard Avedon

Unit testing is great for preventing small errors

John Allspaw

Resilience Not ldquoQualityrdquo

John Allspaw

NASA

Readable code

NASA

Reasonable test coverage

NASA

Sane architecture

NASA

Good debugging tools

NASA

An engineering culture that values refactoring

NASA

Measurable goals

NASA

Manual TestingBut probably not the kind yoursquore thinking of

NASA

Real-Time Monitoring is the new face of testing

Etsy

Etsy

Etsy

Anomaly detection is hard

Greg and Tim Hildebrandt

NASA

Watching the graphs

NASA

As of 2012 Etsy collected well over a quarter million real-time metrics

NASA

Deciding which metrics matter is a human problem

NASA

Everyone watches some subset of the graphs

NASA

Human vision is an excellent tool for anomaly detection

NASA

QA happens when

NASA

Exploratory testing can be performed at any time

NASA

Rigorous scientific approach

NASA

Focus on customer satisfaction

NASA

Less focus on product specifications

NASA

Exploratory Testing is equally useful before or after a release

NASA

Just Quality

NASA

ldquoAssurancerdquo is a terrible word Letrsquos discard it

NASA

Quality exists but itrsquos tricky to assure or prove that

NASA

Therersquos no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo

NASA

Customer ExperienceThough therersquos no formal proof for that either

NASA

Exploratory Testing addresses areas that Developer Testing doesnrsquot

NASA

Developer Testing validates assumptions

NASA

The Independent Testerrsquos job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 28: Continuous Improvement (GroupOn, Palo Alto 2013)

NASA

Complex systems are never ldquocompleterdquo

NASA

QA Happens When

NASA

First of all what is ldquoQuality Assurancerdquo

NASA

QA assuring that there are no defects

NASA

It is impossible to prove the absence of defects

Lukjonis

There will always be bugs in production

Testing is everyonersquos job

Library of Congress

The Jargon File

Myths About Bug Detection

Myth there are a finite number of bugs

niscratz on Flickr

Myth here are a finite number of detectable bugs

NASA

Myth all severity one bugs can be found before release

Fred Brooks at Etsy

Myth software is built to specifications

Myth at some point software is finished

Myth most bugs have complex unpredictable causes

The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmers own creation

mdash Edsger Dijkstra

The whole time Irsquom programming Irsquom constantly checking my assumptions

mdashRasmus Lerdorf

loriabys

As youre about to add a comment ask yourself ldquoHow can I improve the code so that this comment isnt neededrdquo Improve the code and then document it to make it even clearer

mdash Steve McConnell

Debugging is twice as hard as writing the code in the first place Therefore if you write the code as cleverly as possible you are by definition not smart enough to debug it

mdashBrian Kernighan

No blame

Many Small Anomalies Combined

An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses

mdash Wikipedia

John Allspaw

Prioritize the elimination of small errors

John Allspaw

Focus less on mitigation of large catastrophic failures

Optimize for recovery rather than failure prevention

Failure is inevitable

Richard Avedon

Unit testing is great for preventing small errors

John Allspaw

Resilience Not ldquoQualityrdquo

John Allspaw

NASA

Readable code

NASA

Reasonable test coverage

NASA

Sane architecture

NASA

Good debugging tools

NASA

An engineering culture that values refactoring

NASA

Measurable goals

NASA

Manual TestingBut probably not the kind yoursquore thinking of

NASA

Real-Time Monitoring is the new face of testing

Etsy

Etsy

Etsy

Anomaly detection is hard

Greg and Tim Hildebrandt

NASA

Watching the graphs

NASA

As of 2012 Etsy collected well over a quarter million real-time metrics

NASA

Deciding which metrics matter is a human problem

NASA

Everyone watches some subset of the graphs

NASA

Human vision is an excellent tool for anomaly detection

NASA

QA happens when

NASA

Exploratory testing can be performed at any time

NASA

Rigorous scientific approach

NASA

Focus on customer satisfaction

NASA

Less focus on product specifications

NASA

Exploratory Testing is equally useful before or after a release

NASA

Just Quality

NASA

ldquoAssurancerdquo is a terrible word Letrsquos discard it

NASA

Quality exists but itrsquos tricky to assure or prove that

NASA

Therersquos no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo

NASA

Customer ExperienceThough therersquos no formal proof for that either

NASA

Exploratory Testing addresses areas that Developer Testing doesnrsquot

NASA

Developer Testing validates assumptions

NASA

The Independent Testerrsquos job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 29: Continuous Improvement (GroupOn, Palo Alto 2013)

NASA

QA Happens When

NASA

First of all what is ldquoQuality Assurancerdquo

NASA

QA assuring that there are no defects

NASA

It is impossible to prove the absence of defects

Lukjonis

There will always be bugs in production

Testing is everyonersquos job

Library of Congress

The Jargon File

Myths About Bug Detection

Myth there are a finite number of bugs

niscratz on Flickr

Myth here are a finite number of detectable bugs

NASA

Myth all severity one bugs can be found before release

Fred Brooks at Etsy

Myth software is built to specifications

Myth at some point software is finished

Myth most bugs have complex unpredictable causes

The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmers own creation

mdash Edsger Dijkstra

The whole time Irsquom programming Irsquom constantly checking my assumptions

mdashRasmus Lerdorf

loriabys

As youre about to add a comment ask yourself ldquoHow can I improve the code so that this comment isnt neededrdquo Improve the code and then document it to make it even clearer

mdash Steve McConnell

Debugging is twice as hard as writing the code in the first place Therefore if you write the code as cleverly as possible you are by definition not smart enough to debug it

mdashBrian Kernighan

No blame

Many Small Anomalies Combined

An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses

mdash Wikipedia

John Allspaw

Prioritize the elimination of small errors

John Allspaw

Focus less on mitigation of large catastrophic failures

Optimize for recovery rather than failure prevention

Failure is inevitable

Richard Avedon

Unit testing is great for preventing small errors

John Allspaw

Resilience Not ldquoQualityrdquo

John Allspaw

NASA

Readable code

NASA

Reasonable test coverage

NASA

Sane architecture

NASA

Good debugging tools

NASA

An engineering culture that values refactoring

NASA

Measurable goals

NASA

Manual TestingBut probably not the kind yoursquore thinking of

NASA

Real-Time Monitoring is the new face of testing

Etsy

Etsy

Etsy

Anomaly detection is hard

Greg and Tim Hildebrandt

NASA

Watching the graphs

NASA

As of 2012 Etsy collected well over a quarter million real-time metrics

NASA

Deciding which metrics matter is a human problem

NASA

Everyone watches some subset of the graphs

NASA

Human vision is an excellent tool for anomaly detection

NASA

QA happens when

NASA

Exploratory testing can be performed at any time

NASA

Rigorous scientific approach

NASA

Focus on customer satisfaction

NASA

Less focus on product specifications

NASA

Exploratory Testing is equally useful before or after a release

NASA

Just Quality

NASA

"Assurance" is a terrible word. Let's discard it.

NASA

Quality exists, but it's tricky to assure or prove that

NASA

There's no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

"Customer Experience" is a better term of art than "Quality"

NASA

Customer Experience (though there's no formal proof for that, either)

NASA

Exploratory Testing addresses areas that Developer Testing doesn't

NASA

Developer Testing validates assumptions

NASA

The Independent Tester's job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

— Eric S. Raymond

Most bugs, most of the time, are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level.

NASA

Source diffs, logs: if your QA Analysts don't look at these, teach them.

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interfaces with users as individuals, rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Efficiency-Thoroughness Trade-Off

NASA

Rapid release cycles have different risks than slower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks, services, and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state. Always roll forward. When it's desirable to revert a previous change, do that as part of a new commit.

NASA

Instead of rolling back fix the problem and move on
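In git terms, failing forward means undoing a bad change with a new commit on trunk and deploying HEAD again, never resetting history. This is a sketch of the idea, not commands from the talk; the file name and commit messages are made up.

```shell
# Fail forward: the fix for a bad commit is another commit.
cd "$(mktemp -d)"
git init -q .
git config user.email demo@example.com   # throwaway identity for the demo
git config user.name  "Demo"
git commit -q --allow-empty -m "known good state"
echo "broken search" > search_flag.txt   # a bad change lands on trunk...
git add search_flag.txt
git commit -q -m "enable new search"
git revert --no-edit HEAD                # ...so revert it forward
git log --oneline                        # history records the change AND the fix
```

The revert is itself a small, reviewable changeset that deploys like any other, which is exactly why there is no need for a "last stable release" to roll back to.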

NASA

Let go of the idea of "last stable release"

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further Reading: "How Google Tests Software," James Whittaker

"Look At Your Data," John Rauser

"Optimizing For Developer Happiness," Chad Dickerson

"Outages, Postmortems, and Human Error," John Allspaw

http://en.wikipedia.org/wiki/Swiss_cheese_model

"What Is Exploratory Testing?" James Bach

Questions?

@noahsussman | ns@noahsussman.com | infiniteundo.com

Page 30: Continuous Improvement (GroupOn, Palo Alto 2013)

NASA

First of all what is ldquoQuality Assurancerdquo

NASA

QA assuring that there are no defects

NASA

It is impossible to prove the absence of defects

Lukjonis

There will always be bugs in production

Testing is everyonersquos job

Library of Congress

The Jargon File

Myths About Bug Detection

Myth there are a finite number of bugs

niscratz on Flickr

Myth here are a finite number of detectable bugs

NASA

Myth all severity one bugs can be found before release

Fred Brooks at Etsy

Myth software is built to specifications

Myth at some point software is finished

Myth most bugs have complex unpredictable causes

The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmers own creation

mdash Edsger Dijkstra

The whole time Irsquom programming Irsquom constantly checking my assumptions

mdashRasmus Lerdorf

loriabys

As youre about to add a comment ask yourself ldquoHow can I improve the code so that this comment isnt neededrdquo Improve the code and then document it to make it even clearer

mdash Steve McConnell

Debugging is twice as hard as writing the code in the first place Therefore if you write the code as cleverly as possible you are by definition not smart enough to debug it

mdashBrian Kernighan

No blame

Many Small Anomalies Combined

An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses

mdash Wikipedia

John Allspaw

Prioritize the elimination of small errors

John Allspaw

Focus less on mitigation of large catastrophic failures

Optimize for recovery rather than failure prevention

Failure is inevitable

Richard Avedon

Unit testing is great for preventing small errors

John Allspaw

Resilience Not ldquoQualityrdquo

John Allspaw

NASA

Readable code

NASA

Reasonable test coverage

NASA

Sane architecture

NASA

Good debugging tools

NASA

An engineering culture that values refactoring

NASA

Measurable goals

NASA

Manual TestingBut probably not the kind yoursquore thinking of

NASA

Real-Time Monitoring is the new face of testing

Etsy

Etsy

Etsy

Anomaly detection is hard

Greg and Tim Hildebrandt

NASA

Watching the graphs

NASA

As of 2012 Etsy collected well over a quarter million real-time metrics

NASA

Deciding which metrics matter is a human problem

NASA

Everyone watches some subset of the graphs

NASA

Human vision is an excellent tool for anomaly detection

NASA

QA happens when

NASA

Exploratory testing can be performed at any time

NASA

Rigorous scientific approach

NASA

Focus on customer satisfaction

NASA

Less focus on product specifications

NASA

Exploratory Testing is equally useful before or after a release

NASA

Just Quality

NASA

ldquoAssurancerdquo is a terrible word Letrsquos discard it

NASA

Quality exists but itrsquos tricky to assure or prove that

NASA

Therersquos no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo

NASA

Customer ExperienceThough therersquos no formal proof for that either

NASA

Exploratory Testing addresses areas that Developer Testing doesnrsquot

NASA

Developer Testing validates assumptions

NASA

The Independent Testerrsquos job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 31: Continuous Improvement (GroupOn, Palo Alto 2013)

NASA

QA assuring that there are no defects

NASA

It is impossible to prove the absence of defects

Lukjonis

There will always be bugs in production

Testing is everyonersquos job

Library of Congress

The Jargon File

Myths About Bug Detection

Myth there are a finite number of bugs

niscratz on Flickr

Myth here are a finite number of detectable bugs

NASA

Myth all severity one bugs can be found before release

Fred Brooks at Etsy

Myth software is built to specifications

Myth at some point software is finished

Myth most bugs have complex unpredictable causes

The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmers own creation

mdash Edsger Dijkstra

The whole time Irsquom programming Irsquom constantly checking my assumptions

mdashRasmus Lerdorf

loriabys

As youre about to add a comment ask yourself ldquoHow can I improve the code so that this comment isnt neededrdquo Improve the code and then document it to make it even clearer

mdash Steve McConnell

Debugging is twice as hard as writing the code in the first place Therefore if you write the code as cleverly as possible you are by definition not smart enough to debug it

mdashBrian Kernighan

No blame

Many Small Anomalies Combined

An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses

mdash Wikipedia

John Allspaw

Prioritize the elimination of small errors

John Allspaw

Focus less on mitigation of large catastrophic failures

Optimize for recovery rather than failure prevention

Failure is inevitable

Richard Avedon

Unit testing is great for preventing small errors

John Allspaw

Resilience Not ldquoQualityrdquo

John Allspaw

NASA

Readable code

NASA

Reasonable test coverage

NASA

Sane architecture

NASA

Good debugging tools

NASA

An engineering culture that values refactoring

NASA

Measurable goals

NASA

Manual TestingBut probably not the kind yoursquore thinking of

NASA

Real-Time Monitoring is the new face of testing

Etsy

Etsy

Etsy

Anomaly detection is hard

Greg and Tim Hildebrandt

NASA

Watching the graphs

NASA

As of 2012 Etsy collected well over a quarter million real-time metrics

NASA

Deciding which metrics matter is a human problem

NASA

Everyone watches some subset of the graphs

NASA

Human vision is an excellent tool for anomaly detection

NASA

QA happens when

NASA

Exploratory testing can be performed at any time

NASA

Rigorous scientific approach

NASA

Focus on customer satisfaction

NASA

Less focus on product specifications

NASA

Exploratory Testing is equally useful before or after a release

NASA

Just Quality

NASA

ldquoAssurancerdquo is a terrible word Letrsquos discard it

NASA

Quality exists but itrsquos tricky to assure or prove that

NASA

Therersquos no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo

NASA

Customer ExperienceThough therersquos no formal proof for that either

NASA

Exploratory Testing addresses areas that Developer Testing doesnrsquot

NASA

Developer Testing validates assumptions

NASA

The Independent Testerrsquos job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 32: Continuous Improvement (GroupOn, Palo Alto 2013)

NASA

It is impossible to prove the absence of defects

Lukjonis

There will always be bugs in production

Testing is everyonersquos job

Library of Congress

The Jargon File

Myths About Bug Detection

Myth there are a finite number of bugs

niscratz on Flickr

Myth here are a finite number of detectable bugs

NASA

Myth all severity one bugs can be found before release

Fred Brooks at Etsy

Myth software is built to specifications

Myth at some point software is finished

Myth most bugs have complex unpredictable causes

The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmers own creation

mdash Edsger Dijkstra

The whole time Irsquom programming Irsquom constantly checking my assumptions

mdashRasmus Lerdorf

loriabys

As youre about to add a comment ask yourself ldquoHow can I improve the code so that this comment isnt neededrdquo Improve the code and then document it to make it even clearer

mdash Steve McConnell

Debugging is twice as hard as writing the code in the first place Therefore if you write the code as cleverly as possible you are by definition not smart enough to debug it

mdashBrian Kernighan

No blame

Many Small Anomalies Combined

An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses

mdash Wikipedia

John Allspaw

Prioritize the elimination of small errors

John Allspaw

Focus less on mitigation of large catastrophic failures

Optimize for recovery rather than failure prevention

Failure is inevitable

Richard Avedon

Unit testing is great for preventing small errors

John Allspaw

Resilience Not ldquoQualityrdquo

John Allspaw

NASA

Readable code

NASA

Reasonable test coverage

NASA

Sane architecture

NASA

Good debugging tools

NASA

An engineering culture that values refactoring

NASA

Measurable goals

NASA

Manual TestingBut probably not the kind yoursquore thinking of

NASA

Real-Time Monitoring is the new face of testing

Etsy

Etsy

Etsy

Anomaly detection is hard

Greg and Tim Hildebrandt

NASA

Watching the graphs

NASA

As of 2012 Etsy collected well over a quarter million real-time metrics

NASA

Deciding which metrics matter is a human problem

NASA

Everyone watches some subset of the graphs

NASA

Human vision is an excellent tool for anomaly detection

NASA

QA happens when

NASA

Exploratory testing can be performed at any time

NASA

Rigorous scientific approach

NASA

Focus on customer satisfaction

NASA

Less focus on product specifications

NASA

Exploratory Testing is equally useful before or after a release

NASA

Just Quality

NASA

ldquoAssurancerdquo is a terrible word Letrsquos discard it

NASA

Quality exists but itrsquos tricky to assure or prove that

NASA

Therersquos no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo

NASA

Customer ExperienceThough therersquos no formal proof for that either

NASA

Exploratory Testing addresses areas that Developer Testing doesnrsquot

NASA

Developer Testing validates assumptions

NASA

The Independent Testerrsquos job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 33: Continuous Improvement (GroupOn, Palo Alto 2013)

Lukjonis

There will always be bugs in production

Testing is everyonersquos job

Library of Congress

The Jargon File

Myths About Bug Detection

Myth there are a finite number of bugs

niscratz on Flickr

Myth here are a finite number of detectable bugs

NASA

Myth all severity one bugs can be found before release

Fred Brooks at Etsy

Myth software is built to specifications

Myth at some point software is finished

Myth most bugs have complex unpredictable causes

The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmers own creation

mdash Edsger Dijkstra

The whole time Irsquom programming Irsquom constantly checking my assumptions

mdashRasmus Lerdorf

loriabys

As youre about to add a comment ask yourself ldquoHow can I improve the code so that this comment isnt neededrdquo Improve the code and then document it to make it even clearer

mdash Steve McConnell

Debugging is twice as hard as writing the code in the first place Therefore if you write the code as cleverly as possible you are by definition not smart enough to debug it

mdashBrian Kernighan

No blame

Many Small Anomalies Combined

An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses

mdash Wikipedia

John Allspaw

Prioritize the elimination of small errors

John Allspaw

Focus less on mitigation of large catastrophic failures

Optimize for recovery rather than failure prevention

Failure is inevitable

Richard Avedon

Unit testing is great for preventing small errors

John Allspaw

Resilience Not ldquoQualityrdquo

John Allspaw

NASA

Readable code

NASA

Reasonable test coverage

NASA

Sane architecture

NASA

Good debugging tools

NASA

An engineering culture that values refactoring

NASA

Measurable goals

NASA

Manual TestingBut probably not the kind yoursquore thinking of

NASA

Real-Time Monitoring is the new face of testing

Etsy

Etsy

Etsy

Anomaly detection is hard

Greg and Tim Hildebrandt

NASA

Watching the graphs

NASA

As of 2012 Etsy collected well over a quarter million real-time metrics

NASA

Deciding which metrics matter is a human problem

NASA

Everyone watches some subset of the graphs

NASA

Human vision is an excellent tool for anomaly detection

NASA

QA happens when

NASA

Exploratory testing can be performed at any time

NASA

Rigorous scientific approach

NASA

Focus on customer satisfaction

NASA

Less focus on product specifications

NASA

Exploratory Testing is equally useful before or after a release

NASA

Just Quality

NASA

ldquoAssurancerdquo is a terrible word Letrsquos discard it

NASA

Quality exists but itrsquos tricky to assure or prove that

NASA

Therersquos no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo

NASA

Customer ExperienceThough therersquos no formal proof for that either

NASA

Exploratory Testing addresses areas that Developer Testing doesnrsquot

NASA

Developer Testing validates assumptions

NASA

The Independent Testerrsquos job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 34: Continuous Improvement (GroupOn, Palo Alto 2013)

Testing is everyonersquos job

Library of Congress

The Jargon File

Myths About Bug Detection

Myth there are a finite number of bugs

niscratz on Flickr

Myth here are a finite number of detectable bugs

NASA

Myth all severity one bugs can be found before release

Fred Brooks at Etsy

Myth software is built to specifications

Myth at some point software is finished

Myth most bugs have complex unpredictable causes

The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmers own creation

mdash Edsger Dijkstra

The whole time Irsquom programming Irsquom constantly checking my assumptions

mdashRasmus Lerdorf

loriabys

As youre about to add a comment ask yourself ldquoHow can I improve the code so that this comment isnt neededrdquo Improve the code and then document it to make it even clearer

mdash Steve McConnell

Debugging is twice as hard as writing the code in the first place Therefore if you write the code as cleverly as possible you are by definition not smart enough to debug it

mdashBrian Kernighan

No blame

Many Small Anomalies Combined

An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses

mdash Wikipedia

John Allspaw

Prioritize the elimination of small errors

John Allspaw

Focus less on mitigation of large catastrophic failures

Optimize for recovery rather than failure prevention

Failure is inevitable

Richard Avedon

Unit testing is great for preventing small errors

John Allspaw

Resilience Not ldquoQualityrdquo

John Allspaw

NASA

Readable code

NASA

Reasonable test coverage

NASA

Sane architecture

NASA

Good debugging tools

NASA

An engineering culture that values refactoring

NASA

Measurable goals

NASA

Manual TestingBut probably not the kind yoursquore thinking of

NASA

Real-Time Monitoring is the new face of testing

Etsy

Etsy

Etsy

Anomaly detection is hard

Greg and Tim Hildebrandt

NASA

Watching the graphs

NASA

As of 2012 Etsy collected well over a quarter million real-time metrics

NASA

Deciding which metrics matter is a human problem

NASA

Everyone watches some subset of the graphs

NASA

Human vision is an excellent tool for anomaly detection

NASA

QA happens when

NASA

Exploratory testing can be performed at any time

NASA

Rigorous scientific approach

NASA

Focus on customer satisfaction

NASA

Less focus on product specifications

NASA

Exploratory Testing is equally useful before or after a release

NASA

Just Quality

NASA

ldquoAssurancerdquo is a terrible word Letrsquos discard it

NASA

Quality exists but itrsquos tricky to assure or prove that

NASA

Therersquos no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo

NASA

Myths About Bug Detection

Myth: there are a finite number of bugs

Myth: there are a finite number of detectable bugs

Myth: all severity-one bugs can be found before release

Myth: software is built to specifications

Myth: at some point, software is finished

Myth: most bugs have complex, unpredictable causes

“The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmer’s own creation.”

— Edsger Dijkstra

“The whole time I’m programming, I’m constantly checking my assumptions.”

— Rasmus Lerdorf

“As you’re about to add a comment, ask yourself, ‘How can I improve the code so that this comment isn’t needed?’ Improve the code and then document it to make it even clearer.”

— Steve McConnell

“Debugging is twice as hard as writing the code in the first place. Therefore, if you write the code as cleverly as possible, you are, by definition, not smart enough to debug it.”

— Brian Kernighan

No blame

Many Small Anomalies, Combined

“An organization’s defenses against failure are a series of barriers, represented as slices of Swiss cheese. The holes in the cheese represent weaknesses in individual parts of the system. Failures occur when a hazard passes through all of the holes in all of the defenses.”

— Wikipedia
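The Swiss cheese model can be sketched in a few lines of code. This is a toy illustration (not from the talk): each defensive layer is modeled as the set of weaknesses — “holes” — it fails to catch, and a hazard becomes a failure only when it finds a hole in every layer.

```python
# Toy sketch of the Swiss cheese model (illustrative only).
# Each defense layer is the set of hazards it would miss; a hazard
# causes a failure only if every layer has a matching hole.

def hazard_becomes_failure(hazard, layers):
    """Return True when every defensive layer has a hole matching the hazard."""
    return all(hazard in layer_holes for layer_holes in layers)

defenses = [
    {"bad config", "race condition"},   # holes in code review
    {"bad config", "slow query"},       # holes in the CI test suite
    {"race condition", "bad config"},   # holes in the staging environment
]

print(hazard_becomes_failure("bad config", defenses))      # True: a hole in every layer
print(hazard_becomes_failure("race condition", defenses))  # False: CI would catch it
```

The point of the model survives even this crude version: no single layer has to be perfect, and no single layer is to blame when all of them are pierced at once.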

Prioritize the elimination of small errors

Focus less on mitigation of large, catastrophic failures

Optimize for recovery rather than failure prevention

Failure is inevitable

Unit testing is great for preventing small errors

Resilience, Not “Quality”

Readable code

Reasonable test coverage

Sane architecture

Good debugging tools

An engineering culture that values refactoring

Measurable goals

Manual Testing (but probably not the kind you’re thinking of)

Real-Time Monitoring is the new face of testing

Anomaly detection is hard

Watching the graphs

As of 2012, Etsy collected well over a quarter million real-time metrics

Deciding which metrics matter is a human problem

Everyone watches some subset of the graphs

Human vision is an excellent tool for anomaly detection
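Human vision fills the gap precisely because programmatic anomaly detection is hard. As a baseline for comparison — my sketch, not Etsy’s actual tooling — here is about the simplest automated check: flag any point that sits far outside the mean and spread of the preceding window.

```python
# Minimal rolling anomaly detector (illustrative sketch, not production tooling).
from statistics import mean, stdev

def flag_anomalies(series, window=5, threshold=3.0):
    """Flag indices whose value is more than `threshold` standard
    deviations from the mean of the preceding `window` points."""
    flagged = []
    for i in range(window, len(series)):
        recent = series[i - window:i]
        mu, sigma = mean(recent), stdev(recent)
        if sigma > 0 and abs(series[i] - mu) / sigma > threshold:
            flagged.append(i)
    return flagged

# A steady requests-per-second metric with one spike at index 8:
rps = [100, 102, 99, 101, 100, 98, 101, 100, 450, 101]
print(flag_anomalies(rps))  # [8]
```

Note what this naive detector cannot do: it misses slow drifts, fires on legitimate daily cycles, and needs a hand-tuned threshold per metric — which is why a human glancing at a graph still outperforms it.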

QA happens when?

Exploratory testing can be performed at any time

Rigorous, scientific approach

Focus on customer satisfaction

Less focus on product specifications

Exploratory Testing is equally useful before or after a release

Just Quality

“Assurance” is a terrible word. Let’s discard it.

Quality exists, but it’s tricky to assure or prove that

There’s no such thing as a formal proof of quality

Most of us would agree that quality exists

“Customer Experience” is a better term of art than “Quality”

Customer Experience (though there’s no formal proof for that, either)

Exploratory Testing addresses areas that Developer Testing doesn’t

Developer Testing validates assumptions

The Independent Tester’s job is to invalidate assumptions
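One concrete way to read the last two slides (my illustration, not from the deck): the developer’s test confirms the case the author had in mind, while the independent tester probes inputs the author never considered. A hypothetical `slugify` helper makes the contrast visible.

```python
# `slugify` is a hypothetical helper invented for this example.
import re

def slugify(title):
    """Turn a page title into a URL slug (naive implementation)."""
    return re.sub(r"[^a-z0-9]+", "-", title.lower()).strip("-")

# Developer Testing validates the assumption the author had in mind:
print(slugify("Hello World"))     # "hello-world" — the expected happy path

# Independent Testing tries to invalidate assumptions, probing inputs
# the author didn't plan for:
print(slugify("???"))             # "" — a punctuation-only title vanishes entirely
print(slugify("  Déjà Vu  "))     # "d-j-vu" — accented letters are silently dropped
```

Both probes “pass” in the sense that nothing crashes — and both reveal behavior the happy-path test would never surface. That is the independent tester’s contribution.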

Technology Informs Customer Experience

Exploratory Testing requires an understanding of the whole system

Exploratory Testing requires understanding how the system serves a community of users

Customer Experience is as much about technology as it is about product requirements

“Most bugs, most of the time, are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level.”

— Eric S. Raymond

Source diffs, logs: if your QA Analysts don’t look at these — teach them

Customer Support

Your customer support operators spend more time talking to your users than anyone else

Customer Support interfaces with users as individuals, rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

Efficiency To Thoroughness Trade-Off

Rapid release cycles have different risks than slower release cycles

Continuous Delivery does not alter the fundamental nature of risk

Test in both dev and prod

Detectable errors should be caught in dev

Undetectable errors must be worked out in production

Software exists in context

Networks, services, and people are always in flux

Small changesets are easier to debug

An SCM revert is a changeset

Large changesets are riskier and harder to debug

Fail Forward

Always deploy the HEAD revision of trunk

Never roll back to an earlier state. Always roll forward. When it’s desirable to revert a previous change, do that as part of a new commit.
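The roll-forward rule can be sketched with a toy append-only history — an illustration of the idea, not Etsy’s deploy tooling. A revert is just another changeset appended to the log; history is never truncated, and the deployed state is always whatever HEAD evaluates to.

```python
# Toy model of "fail forward": history is append-only, and a revert is
# itself a changeset. The deployed state is a replay of the whole log.

history = []  # list of (key, value) changesets, oldest first

def commit(key, value):
    history.append((key, value))

def revert(index):
    """Undo changeset `index` by appending a new changeset that restores
    the key's previous value. History is never rewritten."""
    key, _ = history[index]
    prior = next((v for k, v in reversed(history[:index]) if k == key), None)
    commit(key, prior)

def head_state():
    """The deployed state: replay every changeset in order."""
    state = {}
    for key, value in history:
        state[key] = value
    return state

commit("search", "grep")
commit("search", "solr")   # the new hotness ships...
revert(1)                  # ...misbehaves; roll forward to the old value
print(head_state())        # {'search': 'grep'}
print(len(history))        # 3 — the revert is itself a changeset
```

In Git terms this is `git revert` rather than `git reset --hard`: the fix travels through the same pipeline as any other change, so there is never a deployed state that history cannot explain.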

Instead of rolling back, fix the problem and move on

Let go of the idea of “last stable release”

Focus less on satisfying the requirements

Watch the graphs

Listen to your customers

Build a culture of shared responsibility

Low-Ceremony Process

Iteratively improve your product

Further Reading

“How Google Tests Software,” James Whittaker

“Look At Your Data,” John Rauser

“Optimizing For Developer Happiness,” Chad Dickerson

“Outages, Postmortems and Human Error,” John Allspaw

http://en.wikipedia.org/wiki/Swiss_cheese_model

“What Is Exploratory Testing?,” James Bach

Questions?

Noah Sussman / [email protected] / infiniteundo.com

Page 36: Continuous Improvement (GroupOn, Palo Alto 2013)

Myth there are a finite number of bugs

niscratz on Flickr

Myth here are a finite number of detectable bugs

NASA

Myth all severity one bugs can be found before release

Fred Brooks at Etsy

Myth software is built to specifications

Myth at some point software is finished

Myth most bugs have complex unpredictable causes

The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmers own creation

mdash Edsger Dijkstra

The whole time Irsquom programming Irsquom constantly checking my assumptions

mdashRasmus Lerdorf

loriabys

As youre about to add a comment ask yourself ldquoHow can I improve the code so that this comment isnt neededrdquo Improve the code and then document it to make it even clearer

mdash Steve McConnell

Debugging is twice as hard as writing the code in the first place Therefore if you write the code as cleverly as possible you are by definition not smart enough to debug it

mdashBrian Kernighan

No blame

Many Small Anomalies Combined

An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses

mdash Wikipedia

John Allspaw

Prioritize the elimination of small errors

John Allspaw

Focus less on mitigation of large catastrophic failures

Optimize for recovery rather than failure prevention

Failure is inevitable

Richard Avedon

Unit testing is great for preventing small errors

John Allspaw

Resilience Not ldquoQualityrdquo

John Allspaw

NASA

Readable code

NASA

Reasonable test coverage

NASA

Sane architecture

NASA

Good debugging tools

NASA

An engineering culture that values refactoring

NASA

Measurable goals

NASA

Manual TestingBut probably not the kind yoursquore thinking of

NASA

Real-Time Monitoring is the new face of testing

Etsy

Etsy

Etsy

Anomaly detection is hard

Greg and Tim Hildebrandt

NASA

Watching the graphs

NASA

As of 2012 Etsy collected well over a quarter million real-time metrics

NASA

Deciding which metrics matter is a human problem

NASA

Everyone watches some subset of the graphs

NASA

Human vision is an excellent tool for anomaly detection

NASA

QA happens when

NASA

Exploratory testing can be performed at any time

NASA

Rigorous scientific approach

NASA

Focus on customer satisfaction

NASA

Less focus on product specifications

NASA

Exploratory Testing is equally useful before or after a release

NASA

Just Quality

NASA

ldquoAssurancerdquo is a terrible word Letrsquos discard it

NASA

Quality exists but itrsquos tricky to assure or prove that

NASA

Therersquos no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo

NASA

Customer ExperienceThough therersquos no formal proof for that either

NASA

Exploratory Testing addresses areas that Developer Testing doesnrsquot

NASA

Developer Testing validates assumptions

NASA

The Independent Testerrsquos job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 37: Continuous Improvement (GroupOn, Palo Alto 2013)

niscratz on Flickr

Myth here are a finite number of detectable bugs

NASA

Myth all severity one bugs can be found before release

Fred Brooks at Etsy

Myth software is built to specifications

Myth at some point software is finished

Myth most bugs have complex unpredictable causes

The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmers own creation

mdash Edsger Dijkstra

The whole time Irsquom programming Irsquom constantly checking my assumptions

mdashRasmus Lerdorf

loriabys

As youre about to add a comment ask yourself ldquoHow can I improve the code so that this comment isnt neededrdquo Improve the code and then document it to make it even clearer

mdash Steve McConnell

Debugging is twice as hard as writing the code in the first place Therefore if you write the code as cleverly as possible you are by definition not smart enough to debug it

mdashBrian Kernighan

No blame

Many Small Anomalies Combined

An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses

mdash Wikipedia

John Allspaw

Prioritize the elimination of small errors

John Allspaw

Focus less on mitigation of large catastrophic failures

Optimize for recovery rather than failure prevention

Failure is inevitable

Richard Avedon

Unit testing is great for preventing small errors

John Allspaw

Resilience Not ldquoQualityrdquo

John Allspaw

NASA

Readable code

NASA

Reasonable test coverage

NASA

Sane architecture

NASA

Good debugging tools

NASA

An engineering culture that values refactoring

NASA

Measurable goals

NASA

Manual TestingBut probably not the kind yoursquore thinking of

NASA

Real-Time Monitoring is the new face of testing

Etsy

Etsy

Etsy

Anomaly detection is hard

Greg and Tim Hildebrandt

NASA

Watching the graphs

NASA

As of 2012 Etsy collected well over a quarter million real-time metrics

NASA

Deciding which metrics matter is a human problem

NASA

Everyone watches some subset of the graphs

NASA

Human vision is an excellent tool for anomaly detection

NASA

QA happens when

NASA

Exploratory testing can be performed at any time

NASA

Rigorous scientific approach

NASA

Focus on customer satisfaction

NASA

Less focus on product specifications

NASA

Exploratory Testing is equally useful before or after a release

NASA

Just Quality

NASA

ldquoAssurancerdquo is a terrible word Letrsquos discard it

NASA

Quality exists but itrsquos tricky to assure or prove that

NASA

Therersquos no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo

NASA

Customer ExperienceThough therersquos no formal proof for that either

NASA

Exploratory Testing addresses areas that Developer Testing doesnrsquot

NASA

Developer Testing validates assumptions

NASA

The Independent Testerrsquos job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 38: Continuous Improvement (GroupOn, Palo Alto 2013)

NASA

Myth all severity one bugs can be found before release

Fred Brooks at Etsy

Myth software is built to specifications

Myth at some point software is finished

Myth most bugs have complex unpredictable causes

The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmers own creation

mdash Edsger Dijkstra

The whole time Irsquom programming Irsquom constantly checking my assumptions

mdashRasmus Lerdorf

loriabys

As youre about to add a comment ask yourself ldquoHow can I improve the code so that this comment isnt neededrdquo Improve the code and then document it to make it even clearer

mdash Steve McConnell

Debugging is twice as hard as writing the code in the first place Therefore if you write the code as cleverly as possible you are by definition not smart enough to debug it

mdashBrian Kernighan

No blame

Many Small Anomalies Combined

An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses

mdash Wikipedia

John Allspaw

Prioritize the elimination of small errors

John Allspaw

Focus less on mitigation of large catastrophic failures

Optimize for recovery rather than failure prevention

Failure is inevitable

Richard Avedon

Unit testing is great for preventing small errors

John Allspaw

Resilience Not ldquoQualityrdquo

John Allspaw

NASA

Readable code

NASA

Reasonable test coverage

NASA

Sane architecture

NASA

Good debugging tools

NASA

An engineering culture that values refactoring

NASA

Measurable goals

NASA

Manual TestingBut probably not the kind yoursquore thinking of

NASA

Real-Time Monitoring is the new face of testing

Etsy

Etsy

Etsy

Anomaly detection is hard

Greg and Tim Hildebrandt

NASA

Watching the graphs

NASA

As of 2012 Etsy collected well over a quarter million real-time metrics

NASA

Deciding which metrics matter is a human problem

NASA

Everyone watches some subset of the graphs

NASA

Human vision is an excellent tool for anomaly detection

NASA

QA happens when

NASA

Exploratory testing can be performed at any time

NASA

Rigorous scientific approach

NASA

Focus on customer satisfaction

NASA

Less focus on product specifications

NASA

Exploratory Testing is equally useful before or after a release

NASA

Just Quality

NASA

ldquoAssurancerdquo is a terrible word Letrsquos discard it

NASA

Quality exists but itrsquos tricky to assure or prove that

NASA

Therersquos no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo

NASA

Customer ExperienceThough therersquos no formal proof for that either

NASA

Exploratory Testing addresses areas that Developer Testing doesnrsquot

NASA

Developer Testing validates assumptions

NASA

The Independent Testerrsquos job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 39: Continuous Improvement (GroupOn, Palo Alto 2013)

Fred Brooks at Etsy

Myth software is built to specifications

Myth at some point software is finished

Myth most bugs have complex unpredictable causes

The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmers own creation

mdash Edsger Dijkstra

The whole time Irsquom programming Irsquom constantly checking my assumptions

mdashRasmus Lerdorf

loriabys

As youre about to add a comment ask yourself ldquoHow can I improve the code so that this comment isnt neededrdquo Improve the code and then document it to make it even clearer

mdash Steve McConnell

Debugging is twice as hard as writing the code in the first place Therefore if you write the code as cleverly as possible you are by definition not smart enough to debug it

mdashBrian Kernighan

No blame

Many Small Anomalies Combined

An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses

mdash Wikipedia

John Allspaw

Prioritize the elimination of small errors

John Allspaw

Focus less on mitigation of large catastrophic failures

Optimize for recovery rather than failure prevention

Failure is inevitable

Richard Avedon

Unit testing is great for preventing small errors

John Allspaw

Resilience Not ldquoQualityrdquo

John Allspaw

NASA

Readable code

NASA

Reasonable test coverage

NASA

Sane architecture

NASA

Good debugging tools

NASA

An engineering culture that values refactoring

NASA

Measurable goals

NASA

Manual TestingBut probably not the kind yoursquore thinking of

NASA

Real-Time Monitoring is the new face of testing

Etsy

Etsy

Etsy

Anomaly detection is hard

Greg and Tim Hildebrandt

NASA

Watching the graphs

NASA

As of 2012 Etsy collected well over a quarter million real-time metrics

NASA

Deciding which metrics matter is a human problem

NASA

Everyone watches some subset of the graphs

NASA

Human vision is an excellent tool for anomaly detection

NASA

QA happens when

NASA

Exploratory testing can be performed at any time

NASA

Rigorous scientific approach

NASA

Focus on customer satisfaction

NASA

Less focus on product specifications

NASA

Exploratory Testing is equally useful before or after a release

NASA

Just Quality

NASA

ldquoAssurancerdquo is a terrible word Letrsquos discard it

NASA

Quality exists but itrsquos tricky to assure or prove that

NASA

Therersquos no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo

NASA

Customer ExperienceThough therersquos no formal proof for that either

NASA

Exploratory Testing addresses areas that Developer Testing doesnrsquot

NASA

Developer Testing validates assumptions

NASA

The Independent Testerrsquos job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 40: Continuous Improvement (GroupOn, Palo Alto 2013)

Myth at some point software is finished

Myth most bugs have complex unpredictable causes

The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmers own creation

mdash Edsger Dijkstra

The whole time Irsquom programming Irsquom constantly checking my assumptions

mdashRasmus Lerdorf

loriabys

As youre about to add a comment ask yourself ldquoHow can I improve the code so that this comment isnt neededrdquo Improve the code and then document it to make it even clearer

mdash Steve McConnell

Debugging is twice as hard as writing the code in the first place Therefore if you write the code as cleverly as possible you are by definition not smart enough to debug it

mdashBrian Kernighan

No blame

Many Small Anomalies Combined

An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses

mdash Wikipedia

John Allspaw

Prioritize the elimination of small errors

John Allspaw

Focus less on mitigation of large catastrophic failures

Optimize for recovery rather than failure prevention

Failure is inevitable

Richard Avedon

Unit testing is great for preventing small errors

John Allspaw

Resilience Not ldquoQualityrdquo

John Allspaw

NASA

Readable code

NASA

Reasonable test coverage

NASA

Sane architecture

NASA

Good debugging tools

NASA

An engineering culture that values refactoring

NASA

Measurable goals

NASA

Manual TestingBut probably not the kind yoursquore thinking of

NASA

Real-Time Monitoring is the new face of testing

Etsy

Etsy

Etsy

Anomaly detection is hard

Greg and Tim Hildebrandt

NASA

Watching the graphs

NASA

As of 2012 Etsy collected well over a quarter million real-time metrics

NASA

Deciding which metrics matter is a human problem

NASA

Everyone watches some subset of the graphs

NASA

Human vision is an excellent tool for anomaly detection

NASA

QA happens when

NASA

Exploratory testing can be performed at any time

NASA

Rigorous scientific approach

NASA

Focus on customer satisfaction

NASA

Less focus on product specifications

NASA

Exploratory Testing is equally useful before or after a release

NASA

Just Quality

NASA

ldquoAssurancerdquo is a terrible word Letrsquos discard it

NASA

Quality exists but itrsquos tricky to assure or prove that

NASA

Therersquos no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo

NASA

Customer ExperienceThough therersquos no formal proof for that either

NASA

Exploratory Testing addresses areas that Developer Testing doesnrsquot

NASA

Developer Testing validates assumptions

NASA

The Independent Testerrsquos job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

— Eric S. Raymond

Most bugs, most of the time, are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level.

NASA

Source diffs. Logs. If your QA Analysts don’t look at these, teach them.

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interfaces with users as individuals, rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Efficiency-Thoroughness Trade-Off

NASA

Rapid release cycles have different risks than slower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state. Always roll forward. When it’s desirable to revert a previous change, do that as part of a new commit.

NASA

Instead of rolling back fix the problem and move on
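The fail-forward idea can be sketched in Git terms (a throwaway demo repository; file names and commit messages are invented for illustration): the revert is itself a new changeset on trunk, so history is never rewritten and HEAD stays deployable.

```shell
set -e
# "Fail forward": undo a bad change as a NEW commit, never by rewinding.
repo=$(mktemp -d) && cd "$repo"
git init -q
git config user.email "[email protected]" && git config user.name "Demo"

echo "old busted" > search.txt
git add search.txt && git commit -qm "old search"

echo "new hotness" > search.txt
git commit -qam "new search (turns out to be broken)"

bad=$(git rev-parse HEAD)
git revert --no-edit "$bad"   # roll forward: a third commit undoing the bad one

cat search.txt                # back to "old busted"
git rev-list --count HEAD     # 3 commits: nothing was rewritten
```

Because the revert is an ordinary commit, "always deploy the HEAD revision of trunk" still holds — there is no separate rollback path to maintain.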

NASA

Let go of the idea of a “last stable release”

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further Reading

“How Google Tests Software,” James Whittaker

“Look at Your Data,” John Rauser

“Optimizing for Developer Happiness,” Chad Dickerson

“Outages, Postmortems, and Human Error,” John Allspaw

http://en.wikipedia.org/wiki/Swiss_cheese_model

“What Is Exploratory Testing?,” James Bach

Questions?

[email protected] · infiniteundo.com

Page 41: Continuous Improvement (GroupOn, Palo Alto 2013)

Myth most bugs have complex unpredictable causes

The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmers own creation

mdash Edsger Dijkstra

The whole time Irsquom programming Irsquom constantly checking my assumptions

mdashRasmus Lerdorf

loriabys

As youre about to add a comment ask yourself ldquoHow can I improve the code so that this comment isnt neededrdquo Improve the code and then document it to make it even clearer

mdash Steve McConnell

Debugging is twice as hard as writing the code in the first place Therefore if you write the code as cleverly as possible you are by definition not smart enough to debug it

mdashBrian Kernighan

No blame

Many Small Anomalies Combined

An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses

mdash Wikipedia

John Allspaw

Prioritize the elimination of small errors

John Allspaw

Focus less on mitigation of large catastrophic failures

Optimize for recovery rather than failure prevention

Failure is inevitable

Richard Avedon

Unit testing is great for preventing small errors

John Allspaw

Resilience Not ldquoQualityrdquo

John Allspaw

NASA

Readable code

NASA

Reasonable test coverage

NASA

Sane architecture

NASA

Good debugging tools

NASA

An engineering culture that values refactoring

NASA

Measurable goals

NASA

Manual TestingBut probably not the kind yoursquore thinking of

NASA

Real-Time Monitoring is the new face of testing

Etsy

Etsy

Etsy

Anomaly detection is hard

Greg and Tim Hildebrandt

NASA

Watching the graphs

NASA

As of 2012 Etsy collected well over a quarter million real-time metrics

NASA

Deciding which metrics matter is a human problem

NASA

Everyone watches some subset of the graphs

NASA

Human vision is an excellent tool for anomaly detection

NASA

QA happens when

NASA

Exploratory testing can be performed at any time

NASA

Rigorous scientific approach

NASA

Focus on customer satisfaction

NASA

Less focus on product specifications

NASA

Exploratory Testing is equally useful before or after a release

NASA

Just Quality

NASA

ldquoAssurancerdquo is a terrible word Letrsquos discard it

NASA

Quality exists but itrsquos tricky to assure or prove that

NASA

Therersquos no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo

NASA

Customer ExperienceThough therersquos no formal proof for that either

NASA

Exploratory Testing addresses areas that Developer Testing doesnrsquot

NASA

Developer Testing validates assumptions

NASA

The Independent Testerrsquos job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 42: Continuous Improvement (GroupOn, Palo Alto 2013)

The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmers own creation

mdash Edsger Dijkstra

The whole time Irsquom programming Irsquom constantly checking my assumptions

mdashRasmus Lerdorf

loriabys

As youre about to add a comment ask yourself ldquoHow can I improve the code so that this comment isnt neededrdquo Improve the code and then document it to make it even clearer

mdash Steve McConnell

Debugging is twice as hard as writing the code in the first place Therefore if you write the code as cleverly as possible you are by definition not smart enough to debug it

mdashBrian Kernighan

No blame

Many Small Anomalies Combined

An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses

mdash Wikipedia

John Allspaw

Prioritize the elimination of small errors

John Allspaw

Focus less on mitigation of large catastrophic failures

Optimize for recovery rather than failure prevention

Failure is inevitable

Richard Avedon

Unit testing is great for preventing small errors

John Allspaw

Resilience Not ldquoQualityrdquo

John Allspaw

NASA

Readable code

NASA

Reasonable test coverage

NASA

Sane architecture

NASA

Good debugging tools

NASA

An engineering culture that values refactoring

NASA

Measurable goals

NASA

Manual TestingBut probably not the kind yoursquore thinking of

NASA

Real-Time Monitoring is the new face of testing

Etsy

Etsy

Etsy

Anomaly detection is hard

Greg and Tim Hildebrandt

NASA

Watching the graphs

NASA

As of 2012 Etsy collected well over a quarter million real-time metrics

NASA

Deciding which metrics matter is a human problem

NASA

Everyone watches some subset of the graphs

NASA

Human vision is an excellent tool for anomaly detection

NASA

QA happens when

NASA

Exploratory testing can be performed at any time

NASA

Rigorous scientific approach

NASA

Focus on customer satisfaction

NASA

Less focus on product specifications

NASA

Exploratory Testing is equally useful before or after a release

NASA

Just Quality

NASA

ldquoAssurancerdquo is a terrible word Letrsquos discard it

NASA

Quality exists but itrsquos tricky to assure or prove that

NASA

Therersquos no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo

NASA

Customer ExperienceThough therersquos no formal proof for that either

NASA

Exploratory Testing addresses areas that Developer Testing doesnrsquot

NASA

Developer Testing validates assumptions

NASA

The Independent Testerrsquos job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 43: Continuous Improvement (GroupOn, Palo Alto 2013)

The whole time Irsquom programming Irsquom constantly checking my assumptions

mdashRasmus Lerdorf

loriabys

As youre about to add a comment ask yourself ldquoHow can I improve the code so that this comment isnt neededrdquo Improve the code and then document it to make it even clearer

mdash Steve McConnell

Debugging is twice as hard as writing the code in the first place Therefore if you write the code as cleverly as possible you are by definition not smart enough to debug it

mdashBrian Kernighan

No blame

Many Small Anomalies Combined

An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses

mdash Wikipedia

John Allspaw

Prioritize the elimination of small errors

John Allspaw

Focus less on mitigation of large catastrophic failures

Optimize for recovery rather than failure prevention

Failure is inevitable

Richard Avedon

Unit testing is great for preventing small errors

John Allspaw

Resilience Not ldquoQualityrdquo

John Allspaw

NASA

Readable code

NASA

Reasonable test coverage

NASA

Sane architecture

NASA

Good debugging tools

NASA

An engineering culture that values refactoring

NASA

Measurable goals

NASA

Manual TestingBut probably not the kind yoursquore thinking of

NASA

Real-Time Monitoring is the new face of testing

Etsy

Etsy

Etsy

Anomaly detection is hard

Greg and Tim Hildebrandt

NASA

Watching the graphs

NASA

As of 2012 Etsy collected well over a quarter million real-time metrics

NASA

Deciding which metrics matter is a human problem

NASA

Everyone watches some subset of the graphs

NASA

Human vision is an excellent tool for anomaly detection

NASA

QA happens when

NASA

Exploratory testing can be performed at any time

NASA

Rigorous scientific approach

NASA

Focus on customer satisfaction

NASA

Less focus on product specifications

NASA

Exploratory Testing is equally useful before or after a release

NASA

Just Quality

NASA

ldquoAssurancerdquo is a terrible word Letrsquos discard it

NASA

Quality exists but itrsquos tricky to assure or prove that

NASA

Therersquos no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo

NASA

Customer ExperienceThough therersquos no formal proof for that either

NASA

Exploratory Testing addresses areas that Developer Testing doesnrsquot

NASA

Developer Testing validates assumptions

NASA

The Independent Testerrsquos job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 44: Continuous Improvement (GroupOn, Palo Alto 2013)

As youre about to add a comment ask yourself ldquoHow can I improve the code so that this comment isnt neededrdquo Improve the code and then document it to make it even clearer

mdash Steve McConnell

Debugging is twice as hard as writing the code in the first place Therefore if you write the code as cleverly as possible you are by definition not smart enough to debug it

mdashBrian Kernighan

No blame

Many Small Anomalies Combined

An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses

mdash Wikipedia

John Allspaw

Prioritize the elimination of small errors

John Allspaw

Focus less on mitigation of large catastrophic failures

Optimize for recovery rather than failure prevention

Failure is inevitable

Richard Avedon

Unit testing is great for preventing small errors

John Allspaw

Resilience Not ldquoQualityrdquo

John Allspaw

NASA

Readable code

NASA

Reasonable test coverage

NASA

Sane architecture

NASA

Good debugging tools

NASA

An engineering culture that values refactoring

NASA

Measurable goals

NASA

Manual TestingBut probably not the kind yoursquore thinking of

NASA

Real-Time Monitoring is the new face of testing

Etsy

Etsy

Etsy

Anomaly detection is hard

Greg and Tim Hildebrandt

NASA

Watching the graphs

NASA

As of 2012 Etsy collected well over a quarter million real-time metrics

NASA

Deciding which metrics matter is a human problem

NASA

Everyone watches some subset of the graphs

NASA

Human vision is an excellent tool for anomaly detection

NASA

QA happens when

NASA

Exploratory testing can be performed at any time

NASA

Rigorous scientific approach

NASA

Focus on customer satisfaction

NASA

Less focus on product specifications

NASA

Exploratory Testing is equally useful before or after a release

NASA

Just Quality

NASA

ldquoAssurancerdquo is a terrible word Letrsquos discard it

NASA

Quality exists but itrsquos tricky to assure or prove that

NASA

Therersquos no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo

NASA

Customer ExperienceThough therersquos no formal proof for that either

NASA

Exploratory Testing addresses areas that Developer Testing doesnrsquot

NASA

Developer Testing validates assumptions

NASA

The Independent Testerrsquos job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 45: Continuous Improvement (GroupOn, Palo Alto 2013)

Debugging is twice as hard as writing the code in the first place Therefore if you write the code as cleverly as possible you are by definition not smart enough to debug it

mdashBrian Kernighan

No blame

Many Small Anomalies Combined

An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses

mdash Wikipedia

John Allspaw

Prioritize the elimination of small errors

John Allspaw

Focus less on mitigation of large catastrophic failures

Optimize for recovery rather than failure prevention

Failure is inevitable

Richard Avedon

Unit testing is great for preventing small errors

John Allspaw

Resilience Not ldquoQualityrdquo

John Allspaw

NASA

Readable code

NASA

Reasonable test coverage

NASA

Sane architecture

NASA

Good debugging tools

NASA

An engineering culture that values refactoring

NASA

Measurable goals

NASA

Manual TestingBut probably not the kind yoursquore thinking of

NASA

Real-Time Monitoring is the new face of testing

Etsy

Etsy

Etsy

Anomaly detection is hard

Greg and Tim Hildebrandt

NASA

Watching the graphs

NASA

As of 2012 Etsy collected well over a quarter million real-time metrics

NASA

Deciding which metrics matter is a human problem

NASA

Everyone watches some subset of the graphs

NASA

Human vision is an excellent tool for anomaly detection

NASA

QA happens when

NASA

Exploratory testing can be performed at any time

NASA

Rigorous scientific approach

NASA

Focus on customer satisfaction

NASA

Less focus on product specifications

NASA

Exploratory Testing is equally useful before or after a release

NASA

Just Quality

NASA

ldquoAssurancerdquo is a terrible word Letrsquos discard it

NASA

Quality exists but itrsquos tricky to assure or prove that

NASA

Therersquos no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo

NASA

Customer ExperienceThough therersquos no formal proof for that either

NASA

Exploratory Testing addresses areas that Developer Testing doesnrsquot

NASA

Developer Testing validates assumptions

NASA

The Independent Testerrsquos job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 46: Continuous Improvement (GroupOn, Palo Alto 2013)

No blame

Many Small Anomalies Combined

An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses

mdash Wikipedia

John Allspaw

Prioritize the elimination of small errors

John Allspaw

Focus less on mitigation of large catastrophic failures

Optimize for recovery rather than failure prevention

Failure is inevitable

Richard Avedon

Unit testing is great for preventing small errors

John Allspaw

Resilience Not ldquoQualityrdquo

John Allspaw

NASA

Readable code

NASA

Reasonable test coverage

NASA

Sane architecture

NASA

Good debugging tools

NASA

An engineering culture that values refactoring

NASA

Measurable goals

NASA

Manual TestingBut probably not the kind yoursquore thinking of

NASA

Real-Time Monitoring is the new face of testing

Etsy

Etsy

Etsy

Anomaly detection is hard

Greg and Tim Hildebrandt

NASA

Watching the graphs

NASA

As of 2012 Etsy collected well over a quarter million real-time metrics

NASA

Deciding which metrics matter is a human problem

NASA

Everyone watches some subset of the graphs

NASA

Human vision is an excellent tool for anomaly detection

NASA

QA happens when

NASA

Exploratory testing can be performed at any time

NASA

Rigorous scientific approach

NASA

Focus on customer satisfaction

NASA

Less focus on product specifications

NASA

Exploratory Testing is equally useful before or after a release

NASA

Just Quality

NASA

ldquoAssurancerdquo is a terrible word Letrsquos discard it

NASA

Quality exists but itrsquos tricky to assure or prove that

NASA

Therersquos no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

Many Small Anomalies Combined

An organization's defenses against failure are a series of barriers, represented as slices of Swiss cheese. The holes in the cheese represent weaknesses in individual parts of the system. Failures occur when a hazard passes through all of the holes in all of the defenses.

— Wikipedia

John Allspaw

Prioritize the elimination of small errors

John Allspaw

Focus less on mitigation of large catastrophic failures

Optimize for recovery rather than failure prevention

Failure is inevitable

Richard Avedon

Unit testing is great for preventing small errors

John Allspaw
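As a concrete illustration of the kind of small error unit tests are good at preventing, consider an off-by-one in pagination math. This is a hypothetical sketch (the function name and scenario are made up, not from the talk):

```python
# Sketch: the "small error" a unit test pins down early.
# A naive total_items // per_page silently drops the final partial page.
def page_count(total_items: int, per_page: int) -> int:
    """Number of pages needed to show total_items, per_page at a time."""
    if per_page <= 0:
        raise ValueError("per_page must be positive")
    # Ceiling division: rounds up so a partial final page still counts.
    return -(-total_items // per_page)

# The tests that catch the off-by-one before it ever reaches production:
assert page_count(0, 10) == 0
assert page_count(10, 10) == 1
assert page_count(11, 10) == 2   # the case integer division gets wrong
```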

Resilience Not ldquoQualityrdquo

John Allspaw

NASA

Readable code

NASA

Reasonable test coverage

NASA

Sane architecture

NASA

Good debugging tools

NASA

An engineering culture that values refactoring

NASA

Measurable goals

NASA

Manual Testing: but probably not the kind you’re thinking of

NASA

Real-Time Monitoring is the new face of testing

Etsy

Etsy

Etsy

Anomaly detection is hard

Greg and Tim Hildebrandt

NASA

Watching the graphs

NASA

As of 2012, Etsy collected well over a quarter million real-time metrics

NASA
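Those metrics flow into the graphs through Etsy's open-source StatsD, whose wire format is a plain "name:value|type" line sent over UDP. A minimal sketch (the host, port default, and metric name here are illustrative, not from the talk):

```python
# Sketch: emitting one StatsD counter datagram.
# StatsD's line protocol is "name:value|c" for counters; delivery is
# fire-and-forget UDP, so instrumentation never blocks the application.
import socket

def emit_counter(name: str, value: int = 1,
                 host: str = "127.0.0.1", port: int = 8125) -> bytes:
    """Format and send a StatsD counter; returns the payload sent."""
    payload = f"{name}:{value}|c".encode("ascii")
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(payload, (host, port))  # no ack: lost packets are acceptable
    sock.close()
    return payload

# e.g. count every checkout, so a dip shows up on the graph within seconds
emit_counter("checkout.completed")
```

Because UDP sends cannot fail loudly, graphing stays cheap enough to instrument everything — which is how a site ends up with hundreds of thousands of metrics.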

Deciding which metrics matter is a human problem

NASA

Everyone watches some subset of the graphs

NASA

Human vision is an excellent tool for anomaly detection

NASA

QA happens when

NASA

Exploratory testing can be performed at any time

NASA

Rigorous scientific approach

NASA

Focus on customer satisfaction

NASA

Less focus on product specifications

NASA

Exploratory Testing is equally useful before or after a release

NASA

Just Quality

NASA

“Assurance” is a terrible word. Let’s discard it

NASA

Quality exists, but it’s tricky to assure or prove that

NASA

There’s no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

“Customer Experience” is a better term of art than “Quality”

NASA

Customer Experience: though there’s no formal proof for that either

NASA

Exploratory Testing addresses areas that Developer Testing doesn’t

NASA

Developer Testing validates assumptions

NASA

The Independent Tester’s job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

— Eric S. Raymond

Most bugs, most of the time, are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level.

NASA

Source diffs, logs: if your QA Analysts don’t look at these, teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interfaces with users as individuals, rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Efficiency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks than slower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks, services, and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state. Always roll forward. When it’s desirable to revert a previous change, do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA
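In Git terms, rolling forward is exactly what `git revert` does: the undo is recorded as a brand-new commit, so history never rewinds. A throwaway sketch (the repo, file name, and commit messages are invented for illustration):

```shell
# Sketch: fail-forward with git revert in a disposable repo.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email "[email protected]"
git config user.name "Demo"

echo "old busted"  > search.txt; git add .; git commit -q -m "old search"
echo "new hotness" > search.txt; git add .; git commit -q -m "new search"

# The new search misbehaves in production. Instead of resetting HEAD
# backwards, record the undo as a new commit on top of trunk:
git revert --no-edit HEAD

cat search.txt             # back to the old behavior
git log --oneline | wc -l  # three commits: history only moved forward
```

Deploying always from the head of trunk then stays safe: the revert is just another changeset, as deployable (and as debuggable) as any other.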

Let go of the idea of “last stable release”

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further Reading

“How Google Tests Software”, James Whittaker

“Look At Your Data”, John Rauser

“Optimizing For Developer Happiness”, Chad Dickerson

“Outages, Postmortems, and Human Error”, John Allspaw

http://en.wikipedia.org/wiki/Swiss_cheese_model

“What Is Exploratory Testing?”, James Bach

Questions?

Noah Sussman
[email protected]
infiniteundo.com

Page 48: Continuous Improvement (GroupOn, Palo Alto 2013)

An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses

mdash Wikipedia

John Allspaw

Prioritize the elimination of small errors

John Allspaw

Focus less on mitigation of large catastrophic failures

Optimize for recovery rather than failure prevention

Failure is inevitable

Richard Avedon

Unit testing is great for preventing small errors

John Allspaw

Resilience Not ldquoQualityrdquo

John Allspaw

NASA

Readable code

NASA

Reasonable test coverage

NASA

Sane architecture

NASA

Good debugging tools

NASA

An engineering culture that values refactoring

NASA

Measurable goals

NASA

Manual TestingBut probably not the kind yoursquore thinking of

NASA

Real-Time Monitoring is the new face of testing

Etsy

Etsy

Etsy

Anomaly detection is hard

Greg and Tim Hildebrandt

NASA

Watching the graphs

NASA

As of 2012 Etsy collected well over a quarter million real-time metrics

NASA

Deciding which metrics matter is a human problem

NASA

Everyone watches some subset of the graphs

NASA

Human vision is an excellent tool for anomaly detection

NASA

QA happens when

NASA

Exploratory testing can be performed at any time

NASA

Rigorous scientific approach

NASA

Focus on customer satisfaction

NASA

Less focus on product specifications

NASA

Exploratory Testing is equally useful before or after a release

NASA

Just Quality

NASA

ldquoAssurancerdquo is a terrible word Letrsquos discard it

NASA

Quality exists but itrsquos tricky to assure or prove that

NASA

Therersquos no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo

NASA

Customer ExperienceThough therersquos no formal proof for that either

NASA

Exploratory Testing addresses areas that Developer Testing doesnrsquot

NASA

Developer Testing validates assumptions

NASA

The Independent Testerrsquos job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 49: Continuous Improvement (GroupOn, Palo Alto 2013)

John Allspaw

Prioritize the elimination of small errors

John Allspaw

Focus less on mitigation of large catastrophic failures

Optimize for recovery rather than failure prevention

Failure is inevitable

Richard Avedon

Unit testing is great for preventing small errors

John Allspaw

Resilience Not ldquoQualityrdquo

John Allspaw

NASA

Readable code

NASA

Reasonable test coverage

NASA

Sane architecture

NASA

Good debugging tools

NASA

An engineering culture that values refactoring

NASA

Measurable goals

NASA

Manual TestingBut probably not the kind yoursquore thinking of

NASA

Real-Time Monitoring is the new face of testing

Etsy

Etsy

Etsy

Anomaly detection is hard

Greg and Tim Hildebrandt

NASA

Watching the graphs

NASA

As of 2012 Etsy collected well over a quarter million real-time metrics

NASA

Deciding which metrics matter is a human problem

NASA

Everyone watches some subset of the graphs

NASA

Human vision is an excellent tool for anomaly detection

NASA

QA happens when

NASA

Exploratory testing can be performed at any time

NASA

Rigorous scientific approach

NASA

Focus on customer satisfaction

NASA

Less focus on product specifications

NASA

Exploratory Testing is equally useful before or after a release

NASA

Just Quality

NASA

ldquoAssurancerdquo is a terrible word Letrsquos discard it

NASA

Quality exists but itrsquos tricky to assure or prove that

NASA

Therersquos no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo

NASA

Customer ExperienceThough therersquos no formal proof for that either

NASA

Exploratory Testing addresses areas that Developer Testing doesnrsquot

NASA

Developer Testing validates assumptions

NASA

The Independent Testerrsquos job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 50: Continuous Improvement (GroupOn, Palo Alto 2013)

Prioritize the elimination of small errors

John Allspaw

Focus less on mitigation of large catastrophic failures

Optimize for recovery rather than failure prevention

Failure is inevitable

Richard Avedon

Unit testing is great for preventing small errors

John Allspaw

Resilience Not ldquoQualityrdquo

John Allspaw

NASA

Readable code

NASA

Reasonable test coverage

NASA

Sane architecture

NASA

Good debugging tools

NASA

An engineering culture that values refactoring

NASA

Measurable goals

NASA

Manual TestingBut probably not the kind yoursquore thinking of

NASA

Real-Time Monitoring is the new face of testing

Etsy

Etsy

Etsy

Anomaly detection is hard

Greg and Tim Hildebrandt

NASA

Watching the graphs

NASA

As of 2012 Etsy collected well over a quarter million real-time metrics

NASA

Deciding which metrics matter is a human problem

NASA

Everyone watches some subset of the graphs

NASA

Human vision is an excellent tool for anomaly detection

NASA

QA happens when

NASA

Exploratory testing can be performed at any time

NASA

Rigorous scientific approach

NASA

Focus on customer satisfaction

NASA

Less focus on product specifications

NASA

Exploratory Testing is equally useful before or after a release

NASA

Just Quality

NASA

ldquoAssurancerdquo is a terrible word Letrsquos discard it

NASA

Quality exists but itrsquos tricky to assure or prove that

NASA

Therersquos no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo

NASA

Customer ExperienceThough therersquos no formal proof for that either

NASA

Exploratory Testing addresses areas that Developer Testing doesnrsquot

NASA

Developer Testing validates assumptions

NASA

The Independent Testerrsquos job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 51: Continuous Improvement (GroupOn, Palo Alto 2013)

Focus less on mitigation of large catastrophic failures

Optimize for recovery rather than failure prevention

Failure is inevitable

Richard Avedon

Unit testing is great for preventing small errors

John Allspaw

Resilience Not ldquoQualityrdquo

John Allspaw

NASA

Readable code

NASA

Reasonable test coverage

NASA

Sane architecture

NASA

Good debugging tools

NASA

An engineering culture that values refactoring

NASA

Measurable goals

NASA

Manual TestingBut probably not the kind yoursquore thinking of

NASA

Real-Time Monitoring is the new face of testing

Etsy

Etsy

Etsy

Anomaly detection is hard

Greg and Tim Hildebrandt

NASA

Watching the graphs

NASA

As of 2012 Etsy collected well over a quarter million real-time metrics

NASA

Deciding which metrics matter is a human problem

NASA

Everyone watches some subset of the graphs

NASA

Human vision is an excellent tool for anomaly detection

NASA

QA happens when

NASA

Exploratory testing can be performed at any time

NASA

Rigorous scientific approach

NASA

Focus on customer satisfaction

NASA

Less focus on product specifications

NASA

Exploratory Testing is equally useful before or after a release

NASA

Just Quality

NASA

ldquoAssurancerdquo is a terrible word Letrsquos discard it

NASA

Quality exists but itrsquos tricky to assure or prove that

NASA

Therersquos no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo

NASA

Customer ExperienceThough therersquos no formal proof for that either

NASA

Exploratory Testing addresses areas that Developer Testing doesnrsquot

NASA

Developer Testing validates assumptions

NASA

The Independent Testerrsquos job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 52: Continuous Improvement (GroupOn, Palo Alto 2013)

Optimize for recovery rather than failure prevention

Failure is inevitable

Richard Avedon

Unit testing is great for preventing small errors

John Allspaw

Resilience Not ldquoQualityrdquo

John Allspaw

NASA

Readable code

NASA

Reasonable test coverage

NASA

Sane architecture

NASA

Good debugging tools

NASA

An engineering culture that values refactoring

NASA

Measurable goals

NASA

Manual TestingBut probably not the kind yoursquore thinking of

NASA

Real-Time Monitoring is the new face of testing

Etsy

Etsy

Etsy

Anomaly detection is hard

Greg and Tim Hildebrandt

NASA

Watching the graphs

NASA

As of 2012 Etsy collected well over a quarter million real-time metrics

NASA

Deciding which metrics matter is a human problem

NASA

Everyone watches some subset of the graphs

NASA

Human vision is an excellent tool for anomaly detection

NASA

QA happens when

NASA

Exploratory testing can be performed at any time

NASA

Rigorous scientific approach

NASA

Focus on customer satisfaction

NASA

Less focus on product specifications

NASA

Exploratory Testing is equally useful before or after a release

NASA

Just Quality

NASA

ldquoAssurancerdquo is a terrible word Letrsquos discard it

NASA

Quality exists but itrsquos tricky to assure or prove that

NASA

Therersquos no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo

NASA

Customer ExperienceThough therersquos no formal proof for that either

NASA

Exploratory Testing addresses areas that Developer Testing doesnrsquot

NASA

Developer Testing validates assumptions

NASA

The Independent Testerrsquos job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 53: Continuous Improvement (GroupOn, Palo Alto 2013)

Unit testing is great for preventing small errors

John Allspaw

Resilience Not ldquoQualityrdquo

John Allspaw

NASA

Readable code

NASA

Reasonable test coverage

NASA

Sane architecture

NASA

Good debugging tools

NASA

An engineering culture that values refactoring

NASA

Measurable goals

NASA

Manual TestingBut probably not the kind yoursquore thinking of

NASA

Real-Time Monitoring is the new face of testing

Etsy

Etsy

Etsy

Anomaly detection is hard

Greg and Tim Hildebrandt

NASA

Watching the graphs

NASA

As of 2012 Etsy collected well over a quarter million real-time metrics

NASA

Deciding which metrics matter is a human problem

NASA

Everyone watches some subset of the graphs

NASA

Human vision is an excellent tool for anomaly detection

NASA

QA happens when

NASA

Exploratory testing can be performed at any time

NASA

Rigorous scientific approach

NASA

Focus on customer satisfaction

NASA

Less focus on product specifications

NASA

Exploratory Testing is equally useful before or after a release

NASA

Just Quality

NASA

ldquoAssurancerdquo is a terrible word Letrsquos discard it

NASA

Quality exists but itrsquos tricky to assure or prove that

NASA

Therersquos no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo

NASA

Customer ExperienceThough therersquos no formal proof for that either

NASA

Exploratory Testing addresses areas that Developer Testing doesnrsquot

NASA

Developer Testing validates assumptions

NASA

The Independent Testerrsquos job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 54: Continuous Improvement (GroupOn, Palo Alto 2013)

Resilience Not ldquoQualityrdquo

John Allspaw

NASA

Readable code

NASA

Reasonable test coverage

NASA

Sane architecture

NASA

Good debugging tools

NASA

An engineering culture that values refactoring

NASA

Measurable goals

NASA

Manual TestingBut probably not the kind yoursquore thinking of

NASA

Real-Time Monitoring is the new face of testing

Etsy

Etsy

Etsy

Anomaly detection is hard

Greg and Tim Hildebrandt

NASA

Watching the graphs

NASA

As of 2012 Etsy collected well over a quarter million real-time metrics

NASA

Deciding which metrics matter is a human problem

NASA

Everyone watches some subset of the graphs

NASA

Human vision is an excellent tool for anomaly detection

NASA

QA happens when

NASA

Exploratory testing can be performed at any time

NASA

Rigorous scientific approach

NASA

Focus on customer satisfaction

NASA

Less focus on product specifications

NASA

Exploratory Testing is equally useful before or after a release

NASA

Just Quality

NASA

ldquoAssurancerdquo is a terrible word Letrsquos discard it

NASA

Quality exists but itrsquos tricky to assure or prove that

NASA

Therersquos no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo

NASA

Customer ExperienceThough therersquos no formal proof for that either

NASA

Exploratory Testing addresses areas that Developer Testing doesnrsquot

NASA

Developer Testing validates assumptions

NASA

The Independent Testerrsquos job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 55: Continuous Improvement (GroupOn, Palo Alto 2013)

NASA

Readable code

NASA

Reasonable test coverage

NASA

Sane architecture

NASA

Good debugging tools

NASA

An engineering culture that values refactoring

NASA

Measurable goals

NASA

Manual TestingBut probably not the kind yoursquore thinking of

NASA

Real-Time Monitoring is the new face of testing

Etsy

Etsy

Etsy

Anomaly detection is hard

Greg and Tim Hildebrandt

NASA

Watching the graphs

NASA

As of 2012 Etsy collected well over a quarter million real-time metrics

NASA

Deciding which metrics matter is a human problem

NASA

Everyone watches some subset of the graphs

NASA

Human vision is an excellent tool for anomaly detection

NASA

QA happens when

NASA

Exploratory testing can be performed at any time

NASA

Rigorous scientific approach

NASA

Focus on customer satisfaction

NASA

Less focus on product specifications

NASA

Exploratory Testing is equally useful before or after a release

NASA

Just Quality

NASA

ldquoAssurancerdquo is a terrible word Letrsquos discard it

NASA

Quality exists but itrsquos tricky to assure or prove that

NASA

Therersquos no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo

NASA

Customer ExperienceThough therersquos no formal proof for that either

NASA

Exploratory Testing addresses areas that Developer Testing doesnrsquot

NASA

Developer Testing validates assumptions

NASA

The Independent Testerrsquos job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 56: Continuous Improvement (GroupOn, Palo Alto 2013)

NASA

Reasonable test coverage

NASA

Sane architecture

NASA

Good debugging tools

NASA

An engineering culture that values refactoring

NASA

Measurable goals

NASA

Manual TestingBut probably not the kind yoursquore thinking of

NASA

Real-Time Monitoring is the new face of testing

Etsy

Etsy

Etsy

Anomaly detection is hard

Greg and Tim Hildebrandt

NASA

Watching the graphs

NASA

As of 2012 Etsy collected well over a quarter million real-time metrics

NASA

Deciding which metrics matter is a human problem

NASA

Everyone watches some subset of the graphs

NASA

Human vision is an excellent tool for anomaly detection

NASA

QA happens when

NASA

Exploratory testing can be performed at any time

NASA

Rigorous scientific approach

NASA

Focus on customer satisfaction

NASA

Less focus on product specifications

NASA

Exploratory Testing is equally useful before or after a release

NASA

Just Quality

NASA

ldquoAssurancerdquo is a terrible word Letrsquos discard it

NASA

Quality exists but itrsquos tricky to assure or prove that

NASA

Therersquos no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo

NASA

Customer ExperienceThough therersquos no formal proof for that either

NASA

Exploratory Testing addresses areas that Developer Testing doesnrsquot

NASA

Developer Testing validates assumptions

NASA

The Independent Testerrsquos job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 57: Continuous Improvement (GroupOn, Palo Alto 2013)

NASA

Sane architecture

NASA

Good debugging tools

NASA

An engineering culture that values refactoring

NASA

Measurable goals

NASA

Manual TestingBut probably not the kind yoursquore thinking of

NASA

Real-Time Monitoring is the new face of testing

Etsy

Etsy

Etsy

Anomaly detection is hard

Greg and Tim Hildebrandt

NASA

Watching the graphs

NASA

As of 2012 Etsy collected well over a quarter million real-time metrics

NASA

Deciding which metrics matter is a human problem

NASA

Everyone watches some subset of the graphs

NASA

Human vision is an excellent tool for anomaly detection

NASA

QA happens when

NASA

Exploratory testing can be performed at any time

NASA

Rigorous scientific approach

NASA

Focus on customer satisfaction

NASA

Less focus on product specifications

NASA

Exploratory Testing is equally useful before or after a release

NASA

Just Quality

NASA

ldquoAssurancerdquo is a terrible word Letrsquos discard it

NASA

Quality exists but itrsquos tricky to assure or prove that

NASA

Therersquos no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo

NASA

Customer ExperienceThough therersquos no formal proof for that either

NASA

Exploratory Testing addresses areas that Developer Testing doesnrsquot

NASA

Developer Testing validates assumptions

NASA

The Independent Testerrsquos job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 58: Continuous Improvement (GroupOn, Palo Alto 2013)

NASA

Good debugging tools

NASA

An engineering culture that values refactoring

NASA

Measurable goals

NASA

Manual TestingBut probably not the kind yoursquore thinking of

NASA

Real-Time Monitoring is the new face of testing

Etsy

Etsy

Etsy

Anomaly detection is hard

Greg and Tim Hildebrandt

NASA

Watching the graphs

NASA

As of 2012 Etsy collected well over a quarter million real-time metrics

NASA

Deciding which metrics matter is a human problem

NASA

Everyone watches some subset of the graphs

NASA

Human vision is an excellent tool for anomaly detection

NASA

QA happens when

NASA

Exploratory testing can be performed at any time

NASA

Rigorous scientific approach

NASA

Focus on customer satisfaction

NASA

Less focus on product specifications

NASA

Exploratory Testing is equally useful before or after a release

NASA

Just Quality

NASA

ldquoAssurancerdquo is a terrible word Letrsquos discard it

NASA

Quality exists but itrsquos tricky to assure or prove that

NASA

Therersquos no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo

NASA

Customer ExperienceThough therersquos no formal proof for that either

NASA

Exploratory Testing addresses areas that Developer Testing doesnrsquot

NASA

Developer Testing validates assumptions

NASA

The Independent Testerrsquos job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 59: Continuous Improvement (GroupOn, Palo Alto 2013)

NASA

An engineering culture that values refactoring

NASA

Measurable goals

NASA

Manual TestingBut probably not the kind yoursquore thinking of

NASA

Real-Time Monitoring is the new face of testing

Etsy

Etsy

Etsy

Anomaly detection is hard

Greg and Tim Hildebrandt

NASA

Watching the graphs

NASA

As of 2012 Etsy collected well over a quarter million real-time metrics

NASA

Deciding which metrics matter is a human problem

NASA

Everyone watches some subset of the graphs

NASA

Human vision is an excellent tool for anomaly detection

NASA

QA happens when

NASA

Exploratory testing can be performed at any time

NASA

Rigorous scientific approach

NASA

Focus on customer satisfaction

NASA

Less focus on product specifications

NASA

Exploratory Testing is equally useful before or after a release

NASA

Just Quality

NASA

ldquoAssurancerdquo is a terrible word Letrsquos discard it

NASA

Quality exists but itrsquos tricky to assure or prove that

NASA

Therersquos no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo

NASA

Customer ExperienceThough therersquos no formal proof for that either

NASA

Exploratory Testing addresses areas that Developer Testing doesnrsquot

NASA

Developer Testing validates assumptions

NASA

The Independent Testerrsquos job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 60: Continuous Improvement (GroupOn, Palo Alto 2013)

NASA

Measurable goals

NASA

Manual TestingBut probably not the kind yoursquore thinking of

NASA

Real-Time Monitoring is the new face of testing

Etsy

Etsy

Etsy

Anomaly detection is hard

Greg and Tim Hildebrandt

NASA

Watching the graphs

NASA

As of 2012 Etsy collected well over a quarter million real-time metrics

NASA

Deciding which metrics matter is a human problem

NASA

Everyone watches some subset of the graphs

NASA

Human vision is an excellent tool for anomaly detection

NASA

QA happens when

NASA

Exploratory testing can be performed at any time

NASA

Rigorous scientific approach

NASA

Focus on customer satisfaction

NASA

Less focus on product specifications

NASA

Exploratory Testing is equally useful before or after a release

NASA

Just Quality

NASA

ldquoAssurancerdquo is a terrible word Letrsquos discard it

NASA

Quality exists but itrsquos tricky to assure or prove that

NASA

Therersquos no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo

NASA

Customer ExperienceThough therersquos no formal proof for that either

NASA

Exploratory Testing addresses areas that Developer Testing doesnrsquot

NASA

Developer Testing validates assumptions

NASA

The Independent Testerrsquos job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 61: Continuous Improvement (GroupOn, Palo Alto 2013)

NASA

Manual TestingBut probably not the kind yoursquore thinking of

NASA

Real-Time Monitoring is the new face of testing

Etsy

Etsy

Etsy

Anomaly detection is hard

Greg and Tim Hildebrandt

NASA

Watching the graphs

NASA

As of 2012 Etsy collected well over a quarter million real-time metrics

NASA

Deciding which metrics matter is a human problem

NASA

Everyone watches some subset of the graphs

NASA

Human vision is an excellent tool for anomaly detection

NASA

QA happens when

NASA

Exploratory testing can be performed at any time

NASA

Rigorous scientific approach

NASA

Focus on customer satisfaction

NASA

Less focus on product specifications

NASA

Exploratory Testing is equally useful before or after a release

NASA

Just Quality

NASA

ldquoAssurancerdquo is a terrible word Letrsquos discard it

NASA

Quality exists but itrsquos tricky to assure or prove that

NASA

Therersquos no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo

NASA

Customer ExperienceThough therersquos no formal proof for that either

NASA

Exploratory Testing addresses areas that Developer Testing doesnrsquot

NASA

Developer Testing validates assumptions

NASA

The Independent Testerrsquos job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 62: Continuous Improvement (GroupOn, Palo Alto 2013)

NASA

Real-Time Monitoring is the new face of testing

Etsy

Etsy

Etsy

Anomaly detection is hard

Greg and Tim Hildebrandt

NASA

Watching the graphs

NASA

As of 2012 Etsy collected well over a quarter million real-time metrics

NASA

Deciding which metrics matter is a human problem

NASA

Everyone watches some subset of the graphs

NASA

Human vision is an excellent tool for anomaly detection

NASA

QA happens when

NASA

Exploratory testing can be performed at any time

NASA

Rigorous scientific approach

NASA

Focus on customer satisfaction

NASA

Less focus on product specifications

NASA

Exploratory Testing is equally useful before or after a release

NASA

Just Quality

NASA

ldquoAssurancerdquo is a terrible word Letrsquos discard it

NASA

Quality exists but itrsquos tricky to assure or prove that

NASA

Therersquos no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo

NASA

Customer ExperienceThough therersquos no formal proof for that either

NASA

Exploratory Testing addresses areas that Developer Testing doesnrsquot

NASA

Developer Testing validates assumptions

NASA

The Independent Tester’s job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

“Most bugs, most of the time, are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level.”

– Eric S. Raymond

NASA

Source, diffs, logs. If your QA Analysts don’t look at these, teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interfaces with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Efficiency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks than slower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state. Always roll forward. When it’s desirable to revert a previous change, do that as part of a new commit

NASA

Instead of rolling back, fix the problem and move on
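Rolling forward works because an undo is itself just another change: in Git terms, `git revert <sha>` records a new commit that inverts an old one, where `git reset --hard` would erase history. A toy model of deploy state as an append-only log of changesets (the `apply_changeset`/`invert` helpers are hypothetical, for illustration only):

```python
def apply_changeset(state: dict, changeset: dict) -> dict:
    # A changeset maps keys to new values; None deletes the key.
    new = dict(state)
    for key, value in changeset.items():
        if value is None:
            new.pop(key, None)
        else:
            new[key] = value
    return new

def invert(changeset: dict, before: dict) -> dict:
    # The inverse changeset restores each touched key to its prior value.
    return {key: before.get(key) for key in changeset}

# History only ever grows: deploy two changesets...
history, state = [], {}
for cs in [{"new_search": True}, {"new_search": None, "checkout": True}]:
    inverse = invert(cs, state)
    history.append(cs)
    state = apply_changeset(state, cs)

# ...then "undo" the bad one by committing its inverse as a third changeset.
history.append(inverse)
state = apply_changeset(state, inverse)
```

The log never shrinks, so “the last stable release” stops being a place you can return to; it is just an earlier entry in an ever-growing history.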

NASA

Let go of the idea of “last stable release”

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further Reading

“How Google Tests Software,” James Whittaker

“Look At Your Data,” John Rauser

“Optimizing For Developer Happiness,” Chad Dickerson

“Outages, Postmortems, and Human Error,” John Allspaw

http://en.wikipedia.org/wiki/Swiss_cheese_model

“What Is Exploratory Testing?,” James Bach

Questions

@noahsussman · ns@noahsussman.com · infiniteundo.com

Page 63: Continuous Improvement (GroupOn, Palo Alto 2013)

Etsy

Etsy

Etsy

Anomaly detection is hard

Greg and Tim Hildebrandt

NASA

Watching the graphs

NASA

As of 2012 Etsy collected well over a quarter million real-time metrics

NASA

Deciding which metrics matter is a human problem

NASA

Everyone watches some subset of the graphs

NASA

Human vision is an excellent tool for anomaly detection

NASA

QA happens when

NASA

Exploratory testing can be performed at any time

NASA

Rigorous scientific approach

NASA

Focus on customer satisfaction

NASA

Less focus on product specifications

NASA

Exploratory Testing is equally useful before or after a release

NASA

Just Quality

NASA

ldquoAssurancerdquo is a terrible word Letrsquos discard it

NASA

Quality exists but itrsquos tricky to assure or prove that

NASA

Therersquos no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo

NASA

Customer ExperienceThough therersquos no formal proof for that either

NASA

Exploratory Testing addresses areas that Developer Testing doesnrsquot

NASA

Developer Testing validates assumptions

NASA

The Independent Testerrsquos job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 64: Continuous Improvement (GroupOn, Palo Alto 2013)

Etsy

Etsy

Anomaly detection is hard

Greg and Tim Hildebrandt

NASA

Watching the graphs

NASA

As of 2012 Etsy collected well over a quarter million real-time metrics

NASA

Deciding which metrics matter is a human problem

NASA

Everyone watches some subset of the graphs

NASA

Human vision is an excellent tool for anomaly detection

NASA

QA happens when

NASA

Exploratory testing can be performed at any time

NASA

Rigorous scientific approach

NASA

Focus on customer satisfaction

NASA

Less focus on product specifications

NASA

Exploratory Testing is equally useful before or after a release

NASA

Just Quality

NASA

ldquoAssurancerdquo is a terrible word Letrsquos discard it

NASA

Quality exists but itrsquos tricky to assure or prove that

NASA

Therersquos no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo

NASA

Customer ExperienceThough therersquos no formal proof for that either

NASA

Exploratory Testing addresses areas that Developer Testing doesnrsquot

NASA

Developer Testing validates assumptions

NASA

The Independent Testerrsquos job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 65: Continuous Improvement (GroupOn, Palo Alto 2013)

Etsy

Anomaly detection is hard

Greg and Tim Hildebrandt

NASA

Watching the graphs

NASA

As of 2012 Etsy collected well over a quarter million real-time metrics

NASA

Deciding which metrics matter is a human problem

NASA

Everyone watches some subset of the graphs

NASA

Human vision is an excellent tool for anomaly detection

NASA

QA happens when

NASA

Exploratory testing can be performed at any time

NASA

Rigorous scientific approach

NASA

Focus on customer satisfaction

NASA

Less focus on product specifications

NASA

Exploratory Testing is equally useful before or after a release

NASA

Just Quality

NASA

ldquoAssurancerdquo is a terrible word Letrsquos discard it

NASA

Quality exists but itrsquos tricky to assure or prove that

NASA

Therersquos no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo

NASA

Customer ExperienceThough therersquos no formal proof for that either

NASA

Exploratory Testing addresses areas that Developer Testing doesnrsquot

NASA

Developer Testing validates assumptions

NASA

The Independent Testerrsquos job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 66: Continuous Improvement (GroupOn, Palo Alto 2013)

Anomaly detection is hard

Greg and Tim Hildebrandt

NASA

Watching the graphs

NASA

As of 2012 Etsy collected well over a quarter million real-time metrics

NASA

Deciding which metrics matter is a human problem

NASA

Everyone watches some subset of the graphs

NASA

Human vision is an excellent tool for anomaly detection

NASA

QA happens when

NASA

Exploratory testing can be performed at any time

NASA

Rigorous scientific approach

NASA

Focus on customer satisfaction

NASA

Less focus on product specifications

NASA

Exploratory Testing is equally useful before or after a release

NASA

Just Quality

NASA

ldquoAssurancerdquo is a terrible word Letrsquos discard it

NASA

Quality exists but itrsquos tricky to assure or prove that

NASA

Therersquos no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo

NASA

Customer ExperienceThough therersquos no formal proof for that either

NASA

Exploratory Testing addresses areas that Developer Testing doesnrsquot

NASA

Developer Testing validates assumptions

NASA

The Independent Testerrsquos job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 67: Continuous Improvement (GroupOn, Palo Alto 2013)

NASA

Watching the graphs

NASA

As of 2012 Etsy collected well over a quarter million real-time metrics

NASA

Deciding which metrics matter is a human problem

NASA

Everyone watches some subset of the graphs

NASA

Human vision is an excellent tool for anomaly detection

NASA

QA happens when

NASA

Exploratory testing can be performed at any time

NASA

Rigorous scientific approach

NASA

Focus on customer satisfaction

NASA

Less focus on product specifications

NASA

Exploratory Testing is equally useful before or after a release

NASA

Just Quality

NASA

ldquoAssurancerdquo is a terrible word Letrsquos discard it

NASA

Quality exists but itrsquos tricky to assure or prove that

NASA

Therersquos no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo

NASA

Customer ExperienceThough therersquos no formal proof for that either

NASA

Exploratory Testing addresses areas that Developer Testing doesnrsquot

NASA

Developer Testing validates assumptions

NASA

The Independent Testerrsquos job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 68: Continuous Improvement (GroupOn, Palo Alto 2013)

NASA

As of 2012 Etsy collected well over a quarter million real-time metrics

NASA

Deciding which metrics matter is a human problem

NASA

Everyone watches some subset of the graphs

NASA

Human vision is an excellent tool for anomaly detection

NASA

QA happens when

NASA

Exploratory testing can be performed at any time

NASA

Rigorous scientific approach

NASA

Focus on customer satisfaction

NASA

Less focus on product specifications

NASA

Exploratory Testing is equally useful before or after a release

NASA

Just Quality

NASA

ldquoAssurancerdquo is a terrible word Letrsquos discard it

NASA

Quality exists but itrsquos tricky to assure or prove that

NASA

Therersquos no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo

NASA

Customer ExperienceThough therersquos no formal proof for that either

NASA

Exploratory Testing addresses areas that Developer Testing doesnrsquot

NASA

Developer Testing validates assumptions

NASA

The Independent Testerrsquos job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 69: Continuous Improvement (GroupOn, Palo Alto 2013)

NASA

Deciding which metrics matter is a human problem

NASA

Everyone watches some subset of the graphs

NASA

Human vision is an excellent tool for anomaly detection

NASA

QA happens when

NASA

Exploratory testing can be performed at any time

NASA

Rigorous scientific approach

NASA

Focus on customer satisfaction

NASA

Less focus on product specifications

NASA

Exploratory Testing is equally useful before or after a release

NASA

Just Quality

NASA

ldquoAssurancerdquo is a terrible word Letrsquos discard it

NASA

Quality exists but itrsquos tricky to assure or prove that

NASA

Therersquos no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo

NASA

Customer ExperienceThough therersquos no formal proof for that either

NASA

Exploratory Testing addresses areas that Developer Testing doesnrsquot

NASA

Developer Testing validates assumptions

NASA

The Independent Testerrsquos job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 70: Continuous Improvement (GroupOn, Palo Alto 2013)

NASA

Everyone watches some subset of the graphs

NASA

Human vision is an excellent tool for anomaly detection

NASA

QA happens when

NASA

Exploratory testing can be performed at any time

NASA

Rigorous scientific approach

NASA

Focus on customer satisfaction

NASA

Less focus on product specifications

NASA

Exploratory Testing is equally useful before or after a release

NASA

Just Quality

NASA

ldquoAssurancerdquo is a terrible word Letrsquos discard it

NASA

Quality exists but itrsquos tricky to assure or prove that

NASA

Therersquos no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo

NASA

Customer ExperienceThough therersquos no formal proof for that either

NASA

Exploratory Testing addresses areas that Developer Testing doesnrsquot

NASA

Developer Testing validates assumptions

NASA

The Independent Testerrsquos job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 71: Continuous Improvement (GroupOn, Palo Alto 2013)

NASA

Human vision is an excellent tool for anomaly detection

NASA

QA happens when

NASA

Exploratory testing can be performed at any time

NASA

Rigorous scientific approach

NASA

Focus on customer satisfaction

NASA

Less focus on product specifications

NASA

Exploratory Testing is equally useful before or after a release

NASA

Just Quality

NASA

ldquoAssurancerdquo is a terrible word Letrsquos discard it

NASA

Quality exists, but it's tricky to assure or prove that

NASA

There's no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

"Customer Experience" is a better term of art than "Quality"

NASA

Customer Experience: though there's no formal proof for that either

NASA

Exploratory Testing addresses areas that Developer Testing doesn't

NASA

Developer Testing validates assumptions

NASA

The Independent Tester's job is to invalidate assumptions
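One way to read the validate/invalidate split in code (a sketch; the function under test is hypothetical): the developer's test confirms the stated assumption, while the independent tester probes inputs nobody planned for:

```python
def normalize_username(name: str) -> str:
    """Toy function under test (hypothetical)."""
    return name.strip().lower()

# Developer Testing: validates the assumption the author had in mind.
assert normalize_username("  Alice ") == "alice"

# Independent-tester mindset: try to invalidate the assumptions with
# empty, all-whitespace, non-ASCII-whitespace, and oversized inputs.
for hostile in ["", "   ", "ALICE\u00a0", "a" * 10_000]:
    result = normalize_username(hostile)
    # the invariant should survive even inputs the author never considered
    assert result == result.strip().lower()
```

The point is the mindset, not the mechanics: one test suite asks "does it do what I meant?", the other asks "what did I fail to mean?"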

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

— Eric S. Raymond

"Most bugs, most of the time, are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level."

NASA

Source diffs, logs. If your QA Analysts don't look at these — teach them.

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interfaces with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Efficiency-Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 72: Continuous Improvement (GroupOn, Palo Alto 2013)

NASA

QA happens when

NASA

Exploratory testing can be performed at any time

NASA

Rigorous scientific approach

NASA

Focus on customer satisfaction

NASA

Less focus on product specifications

NASA

Exploratory Testing is equally useful before or after a release

NASA

Just Quality

NASA

ldquoAssurancerdquo is a terrible word Letrsquos discard it

NASA

Quality exists but itrsquos tricky to assure or prove that

NASA

Therersquos no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo

NASA

Customer ExperienceThough therersquos no formal proof for that either

NASA

Exploratory Testing addresses areas that Developer Testing doesnrsquot

NASA

Developer Testing validates assumptions

NASA

The Independent Testerrsquos job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 73: Continuous Improvement (GroupOn, Palo Alto 2013)

NASA

Exploratory testing can be performed at any time

NASA

Rigorous scientific approach

NASA

Focus on customer satisfaction

NASA

Less focus on product specifications

NASA

Exploratory Testing is equally useful before or after a release

NASA

Just Quality

NASA

ldquoAssurancerdquo is a terrible word Letrsquos discard it

NASA

Quality exists but itrsquos tricky to assure or prove that

NASA

Therersquos no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo

NASA

Customer ExperienceThough therersquos no formal proof for that either

NASA

Exploratory Testing addresses areas that Developer Testing doesnrsquot

NASA

Developer Testing validates assumptions

NASA

The Independent Testerrsquos job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 74: Continuous Improvement (GroupOn, Palo Alto 2013)

NASA

Rigorous scientific approach

NASA

Focus on customer satisfaction

NASA

Less focus on product specifications

NASA

Exploratory Testing is equally useful before or after a release

NASA

Just Quality

NASA

ldquoAssurancerdquo is a terrible word Letrsquos discard it

NASA

Quality exists but itrsquos tricky to assure or prove that

NASA

Therersquos no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo

NASA

Customer ExperienceThough therersquos no formal proof for that either

NASA

Exploratory Testing addresses areas that Developer Testing doesnrsquot

NASA

Developer Testing validates assumptions

NASA

The Independent Testerrsquos job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 75: Continuous Improvement (GroupOn, Palo Alto 2013)

NASA

Focus on customer satisfaction

NASA

Less focus on product specifications

NASA

Exploratory Testing is equally useful before or after a release

NASA

Just Quality

NASA

ldquoAssurancerdquo is a terrible word Letrsquos discard it

NASA

Quality exists but itrsquos tricky to assure or prove that

NASA

Therersquos no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo

NASA

Customer ExperienceThough therersquos no formal proof for that either

NASA

Exploratory Testing addresses areas that Developer Testing doesnrsquot

NASA

Developer Testing validates assumptions

NASA

The Independent Testerrsquos job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 76: Continuous Improvement (GroupOn, Palo Alto 2013)

NASA

Less focus on product specifications

NASA

Exploratory Testing is equally useful before or after a release

NASA

Just Quality

NASA

ldquoAssurancerdquo is a terrible word Letrsquos discard it

NASA

Quality exists but itrsquos tricky to assure or prove that

NASA

Therersquos no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo

NASA

Customer ExperienceThough therersquos no formal proof for that either

NASA

Exploratory Testing addresses areas that Developer Testing doesnrsquot

NASA

Developer Testing validates assumptions

NASA

The Independent Testerrsquos job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 77: Continuous Improvement (GroupOn, Palo Alto 2013)

NASA

Exploratory Testing is equally useful before or after a release

NASA

Just Quality

NASA

ldquoAssurancerdquo is a terrible word Letrsquos discard it

NASA

Quality exists but itrsquos tricky to assure or prove that

NASA

Therersquos no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo

NASA

Customer ExperienceThough therersquos no formal proof for that either

NASA

Exploratory Testing addresses areas that Developer Testing doesnrsquot

NASA

Developer Testing validates assumptions

NASA

The Independent Testerrsquos job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 78: Continuous Improvement (GroupOn, Palo Alto 2013)

NASA

Just Quality

NASA

ldquoAssurancerdquo is a terrible word Letrsquos discard it

NASA

Quality exists but itrsquos tricky to assure or prove that

NASA

Therersquos no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo

NASA

Customer ExperienceThough therersquos no formal proof for that either

NASA

Exploratory Testing addresses areas that Developer Testing doesnrsquot

NASA

Developer Testing validates assumptions

NASA

The Independent Testerrsquos job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 79: Continuous Improvement (GroupOn, Palo Alto 2013)

NASA

ldquoAssurancerdquo is a terrible word Letrsquos discard it

NASA

Quality exists but itrsquos tricky to assure or prove that

NASA

Therersquos no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo

NASA

Customer ExperienceThough therersquos no formal proof for that either

NASA

Exploratory Testing addresses areas that Developer Testing doesnrsquot

NASA

Developer Testing validates assumptions

NASA

The Independent Testerrsquos job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 80: Continuous Improvement (GroupOn, Palo Alto 2013)

NASA

Quality exists but itrsquos tricky to assure or prove that

NASA

Therersquos no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo

NASA

Customer ExperienceThough therersquos no formal proof for that either

NASA

Exploratory Testing addresses areas that Developer Testing doesnrsquot

NASA

Developer Testing validates assumptions

NASA

The Independent Testerrsquos job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 81: Continuous Improvement (GroupOn, Palo Alto 2013)

NASA

Therersquos no such thing as a formal proof of quality

NASA

Most of us would agree that quality exists

NASA

ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo

NASA

Customer ExperienceThough therersquos no formal proof for that either

NASA

Exploratory Testing addresses areas that Developer Testing doesn’t

NASA

Developer Testing validates assumptions

NASA

The Independent Tester’s job is to invalidate assumptions
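The contrast can be sketched in a few lines of Python (a hedged illustration; `normalize_username` and its contract are hypothetical, not from the talk): a developer test confirms the behavior the author already assumed, while a tester's probes feed in inputs meant to break that assumption.

```python
# Hedged sketch (names are hypothetical, not from the talk):
# developer tests validate assumptions; an independent tester
# probes with inputs meant to invalidate them.

def normalize_username(name: str) -> str:
    """Assumed contract: any input comes back trimmed and lowercased."""
    return name.strip().lower()

# Developer test: confirms the behavior the author already expects.
assert normalize_username("  Alice ") == "alice"

# Tester's probes: inputs the author may not have planned for.
for probe in ["", "   ", "Ünïcode", "a" * 10_000]:
    result = normalize_username(probe)
    assert result == result.strip().lower()  # does the contract survive?
```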

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

“Most bugs, most of the time, are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level.”

— Eric S. Raymond

NASA

Source, diffs, logs: if your QA Analysts don’t look at these, teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interfaces with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Efficiency-Thoroughness Trade-Off

NASA

Rapid release cycles have different risks than slower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks, services, and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state. Always roll forward. When it’s desirable to revert a previous change, do that as part of a new commit.

NASA

Instead of rolling back fix the problem and move on
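The fail-forward idea above can be sketched as a toy model (hypothetical names, not a real SCM): reverting means appending one more changeset on top of HEAD, never rewinding history to an earlier state.

```python
# Toy model of "fail forward" (hypothetical names, not a real SCM):
# a revert is appended to history as one more changeset, rather than
# rewinding history to an earlier state.

history = ["add checkout", "add profiles", "new search"]

def revert(changes: list[str], bad_change: str) -> None:
    # Never pop old changesets; deploy HEAD, which now contains the fix.
    changes.append(f"revert '{bad_change}'")

revert(history, "new search")
assert history[-1] == "revert 'new search'"
assert len(history) == 4  # history only ever grows
```

This is also why an SCM revert is "just a changeset": it keeps the always-deploy-HEAD rule intact.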

NASA

Let go of the idea of “last stable release”

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs
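One minimal sketch of what "watch the graphs" might mean in code (the `looks_anomalous` helper and numbers are hypothetical; real deploy dashboards track many metrics): flag a metric that deviates sharply from its recent baseline after a release.

```python
# Minimal sketch of "watch the graphs" (hypothetical helper; real
# deploy dashboards track many metrics): flag a value that deviates
# sharply from its recent baseline.
from statistics import mean, stdev

def looks_anomalous(recent: list[float], latest: float, threshold: float = 3.0) -> bool:
    m, s = mean(recent), stdev(recent)
    return s > 0 and abs(latest - m) > threshold * s

baseline = [120, 118, 121, 119, 122, 120, 117, 121]  # e.g. checkouts/minute
assert not looks_anomalous(baseline, 123)  # ordinary fluctuation
assert looks_anomalous(baseline, 60)       # graph falls off a cliff
```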

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further Reading

“How Google Tests Software,” James Whittaker

“Look At Your Data,” John Rauser

“Optimizing For Developer Happiness,” Chad Dickerson

“Outages, Postmortems, and Human Error,” John Allspaw

http://en.wikipedia.org/wiki/Swiss_cheese_model

“What Is Exploratory Testing?” James Bach

Questions?

noahsussman · ns@noahsussman.com · infiniteundo.com

Page 82: Continuous Improvement (GroupOn, Palo Alto 2013)

NASA

Most of us would agree that quality exists

NASA

ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo

NASA

Customer ExperienceThough therersquos no formal proof for that either

NASA

Exploratory Testing addresses areas that Developer Testing doesnrsquot

NASA

Developer Testing validates assumptions

NASA

The Independent Testerrsquos job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 83: Continuous Improvement (GroupOn, Palo Alto 2013)

NASA

ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo

NASA

Customer ExperienceThough therersquos no formal proof for that either

NASA

Exploratory Testing addresses areas that Developer Testing doesnrsquot

NASA

Developer Testing validates assumptions

NASA

The Independent Testerrsquos job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 84: Continuous Improvement (GroupOn, Palo Alto 2013)

NASA

Customer ExperienceThough therersquos no formal proof for that either

NASA

Exploratory Testing addresses areas that Developer Testing doesnrsquot

NASA

Developer Testing validates assumptions

NASA

The Independent Testerrsquos job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 85: Continuous Improvement (GroupOn, Palo Alto 2013)

NASA

Exploratory Testing addresses areas that Developer Testing doesnrsquot

NASA

Developer Testing validates assumptions

NASA

The Independent Testerrsquos job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 86: Continuous Improvement (GroupOn, Palo Alto 2013)

NASA

Developer Testing validates assumptions

NASA

The Independent Testerrsquos job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 87: Continuous Improvement (GroupOn, Palo Alto 2013)

NASA

The Independent Testerrsquos job is to invalidate assumptions

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 88: Continuous Improvement (GroupOn, Palo Alto 2013)

NASA

Technology Informs Customer Experience

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 89: Continuous Improvement (GroupOn, Palo Alto 2013)

NASA

Exploratory Testing requires an understanding of the whole system

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 90: Continuous Improvement (GroupOn, Palo Alto 2013)

NASA

Exploratory Testing requires understanding how the system serves a community of users

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 91: Continuous Improvement (GroupOn, Palo Alto 2013)

NASA

Customer Experience is as much about technology as it is about product requirements

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 92: Continuous Improvement (GroupOn, Palo Alto 2013)

mdash Eric S Raymond

Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker

ldquoLook At Your Datardquo John Rausser

ldquoOptimizing For Developer Happinessrdquo Chad Dickerson

ldquoOutages Postmortems and Human Errorrdquo John Allspaw

httpenwikipediaorgwikiSwiss_cheese_model

ldquoWhat Is Exploratory Testingrdquo James Bach

Questions

noahsussmannsnoahsussmancominfiniteundocom

Page 93: Continuous Improvement (GroupOn, Palo Alto 2013)

NASA

Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them

NASA

Customer Support

NASA

Your customer support operators spend more time talking to your users than anyone else

NASA

Customer Support interface with users as individuals rather than as aggregate data

Keep the feedback loop short

Manage Your Culture

NASA

Effeciency To Thoroughness Trade-Off

NASA

Rapid release cycles have different risks thanslower release cycles

NASA

Continuous Delivery does not alter the fundamental nature of risk

NASA

Test in both dev and prod

NASA

Detectable errors should be caught in dev

NASA

Undetectable errors must be worked out in production

NASA

Software exists in context

NASA

Networks services and people are always in flux

Small changesets are easier to debug

NASA

An SCM revert is a changeset

NASA

Large changesets are riskier and harder to debug

NASA

Fail Forward

scrapnow on Etsy

Always deploy the HEAD revision of trunk

Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit

NASA

Instead of rolling back fix the problem and move on

NASA

Let go of the idea of ldquolast stable releaserdquo

Scott Holloway

Focus less on satisfying the requirements

NASA

Watch the graphs

NASA

Listen to your customers

Kirsten Dunst on the set of Marie Antoinette

Build a culture of shared responsibility

Kirsten Dunst on the set of Marie Antoinette

Low-Ceremony Process

WSHS Science blog

Iteratively improve your product

Further Reading

“How Google Tests Software”, James Whittaker

“Look At Your Data”, John Rauser

“Optimizing For Developer Happiness”, Chad Dickerson

“Outages, Postmortems and Human Error”, John Allspaw

http://en.wikipedia.org/wiki/Swiss_cheese_model

“What Is Exploratory Testing?”, James Bach

Questions?

@noahsussman · noahsussman.com · infiniteundo.com
