

This paper was presented at the 7ICSQ.

ABSTRACT

Inspections are widely accepted as an effective mechanism to find errors in software products early in the development phases. This paper presents a model useful to evaluate the effectiveness of both a software organization's inspection process and its deployment.

The paper briefly describes and contrasts the different types of peer reviews used within industry. Benefits of Formal Inspections are presented. Next we describe the Formal Inspection Best-in-Class (FIBIC) model and discuss how it was developed and validated. The paper then describes how the model was used to assess and improve the Peer Review Process of one of Motorola's largest software development organizations. Finally, the paper suggests how the model can be further optimized to provide additional insight into defining and implementing Formal Inspections.

INTRODUCTION

Software engineering technology and processes have changed significantly since the first software programs were created over 50 years ago. However, one aspect of software development has not kept pace with other engineering disciplines: software development


remains a manual, human-intensive task. As such, software continues its susceptibility to human errors. A nearly universal shortcoming among us humans is the lack of ability to see our own mistakes.

"0 wad some power the giftie gie us to see oursels as idlers see liS." - Robei1 Bums

The software engineering solution to these human problems is the use of a class of activities including walkthroughs, peer reviews, and inspections. There is often misunderstanding on the differences between peer reviews, walkthroughs, and inspections, with activities that range from an informal desk check to the highly structured formal software inspection, or Fagan Inspection. Reviews and walkthroughs are typically peer group discussion activities that focus on learning and gaining consensus. An inspection's purpose is to identify errors in the work product and to correct them as near to the point of origin as possible. We dwell on the terminology because some organizations do not distinguish the process differences among the various forms, yet expect results consistent with the more formal inspection paradigm. The Software Engineering Institute (SEI) and our own internal software training have not helped here. Both refer to


American Society for Quality


Software Division


BY JAYESH DALAL

Greetings! On July 1 Sharon Miller (treasurer), Tim Surratt (secretary), Linda Westfall (chair-elect), and I started our two-year term as the officers of the division. We are excited about this opportunity and will focus on growing the value you receive from your division membership. ASQ has a new strategic plan and I strongly urge you to read the May/June 1998 issue of On Q, pp. 8-9. At the annual meeting of the division in October the council will review the strategic plan for the division. We'll align our plan and activities with those of ASQ, and I invite you to give us your suggestions for activities on which the division should focus its attention. Remember that the council is more likely to have valuable activities for you if it has your input and does not have to guess!

Sue McGrath, immediate past chair, has handed us a healthy division of over 5,000 members and I want you to join me in thanking her and the other outgoing council members for their dedicated service to the division. By their accomplishments, they have set a high standard for the new council to maintain.

The 8th International Conference on Software Quality is our next major event. It is the annual conference of the division and will be held jointly with the 16th Pacific Northwest Software Quality Conference. Details are provided in this newsletter. The conference has an exciting program and location, and I hope you're planning to attend.

The ASQ staff is actively deploying marketing plans for the division's new journal, Software Quality Professional. We need to exceed the subscription threshold set by ASQ to make the journal launch a reality. Taz Daughtrey, journal editor, has articles lined up for several issues of the journal. Now it is up to us! Do purchase a personal subscription for the journal (it's only $30 per year for members!) and solicit corporate subscriptions from your employer organization. Remember that this will be the first journal focused on you, the software quality professional!

The next CSQE exam application deadline is January 8. Call ASQ at 800-248-1946 for an application packet.

I invite you to volunteer for any division activity that interests you. If you are interested in taking a more active role in the division, contact any other council member or me. We are far from having too many volunteers!

I look forward to serving as the division chair. I see it both as an opportunity and a challenge to position the division for continued success in the next millennium!


peer review processes, yet describe activities more in alignment with inspections. This paper will focus on the inspection process. The Formal Inspection Best-in-Class model presented in this paper provides a tool by which organizations can measure the potential effectiveness of their inspection process no matter what they name it.

ERROR PREVENTION OR DETECTION TECHNIQUE | PROBABILITY OF FINDING DEFECTS
Design standards    | 28.7
Coding standards    | 26.3
Design inspections  | 57.7
Code inspections    | 62.7
Unit test           | 72.9
Algorithm test      | 8.3
Integration test    | 46.1
Requirements test   | 45.7

Table 1: Error prevention or detection probabilities - TRW Corporation

INSPECTION BENEFITS

The point at which errors are found in the software development life cycle will impact the cost to correct them. Studies have shown that coding errors found during integration, test, and maintenance can cost from 10 to 100 times more to fix than those found prior to or during coding. In addition, changes made during system testing and maintenance tend to degrade the architecture and reliability of the system over time. Schedule pressures close to delivery and urgency to repair during operation often establish goals to fix problems quickly, sometimes compromising sound software engineering practices.

Inspection benefits have been documented in a wide variety of industrial software environments. Table 1 shows the results of a study conducted by TRW Corporation [1]. In their study, they found that inspections were second only to unit testing in terms of their effectiveness in identifying defects. In 1985 HP found inspections to be three to five times more efficient in terms of defects found per person-hour of effort than all types of testing [2]. Another


Property            | Formal inspection                                  | Walkthrough                                    | Peer review
Objective           | To find errors                                     | Training, resolution of implementation issues  | Identify issues and gain consensus
Entry/Exit criteria | Explicit entry/exit criteria for each work product | Entry criteria                                 | No explicit entry/exit criteria
Training            | Formal training                                    | No training                                    | No training
Roles               | Defined roles                                      | No defined roles                               | No defined roles
Inspection leader   | Moderator                                          | Author of work product                         | Author or project leader
Checklist           | Used to specify properties of the work product     | Not used                                       | Not used
Follow-up           | Defects recorded and tracked to closure; process and product data recorded | No formal follow-up   | Issues normally recorded, no formal follow-up

Table 2: Major Differences Between Formal Inspections, Walkthroughs, and Peer Reviews

company reported an average of 10 to 50 runs for debugging prior to instituting software inspections; after instituting inspections, only one run was normally required. Finally, another study found 55% of single-line maintenance changes were in error prior to instituting a formal inspection process. This dropped to 2% with inspections [1].

Most Motorola organizations use inspections, and have reported significant cycle time reductions and quality improvements. For example, one large program reported a 40 to 1 return on investment for requirements inspections, which saved about 32 person-years of rework and added development.

In addition to finding defects, inspections can provide a reliable means of measuring completeness of the product at any point of its development for project management purposes. Effective inspections provide unambiguous, explicit, and universally accepted exit criteria for the work products of each software development phase. They ensure that the progress reported during project status reviews is real.

FORMAL INSPECTIONS

Michael Fagan created the formal inspection process in 1972 while at IBM for the dual purpose of improving software and increasing programmer productivity. In 1976 he published his landmark paper, "Design and code inspections to reduce errors in program development" [3]. Formal Technical Reviews (FTR) or Formal Inspections are generic names often used for inspections, and represent a class of process evolving from peer reviews. Since Fagan published his paper, leading software engineering practitioners, including Fagan, have evolved this process in similar but not


identical ways. Some notable contributors are Gilb and Ebenau [4][5].

There are three essential requirements for the implementation of Formal Inspections:

• Definition of the development process in terms of phases, work products, and their entry and exit criteria,

• Proper definition of the inspection process, with defined roles for participants (moderator, recorder, reader, author, inspector), and

• Disciplined execution of the inspection process

Formal Inspections differ from other types of peer reviews in that they follow a defined, repeatable, and proven process for evaluating, in detail, the work product with the expressed purpose of finding defects. They are conducted by individuals from development, test, and quality assurance, and may include the customer. Formal Inspections are more rigorous than walkthroughs and peer review processes, and significantly more effective. Inspections do not take the place of milestone reviews, status reviews, or testing. Table 2

illustrates some fundamental differences between Formal Inspections, Walkthroughs, and Peer Reviews.

QUALITY IMPROVEMENT STUDY

In early 1995, the management of one of Motorola's largest software development organizations was not satisfied with their organization's quality improvement results. Success stories from the use of Fagan Inspections by other Motorola organizations raised their interest regarding the use of Michael Fagan to train their software engineers. Considering the training costs and three days off the job for engineers, the decision to enlist Fagan's help had significant budget implications. The Fagan training program is directed at not just inspection training, but an entire process evolution, incorporating inspections as a major component, then leading to effective defect prevention over the long term.

One major difference in the subject organization from other organizations that had previously implemented Fagan Inspections was that the subject organization had already achieved SEI level 3, whereas the other organizations were significantly less mature when they enlisted help. The following questions were raised:

• As a level 3 organization, would the use of Fagan training be cost effective?

• How "good" is the current peer review process?

• What are the alternative solutions to improve the organization's software quality?

The Motorola Corporate Software Center's Software Solution Team, of which both authors were members at the time of the study, was asked to help answer these questions. Recognizing the wide-ranging value in these questions, the authors adopted an engineering approach to this challenge. We reasoned that a model that represented a best-in-class view of quality processes was needed. The inspection process was the clear choice in terms of fully documented benefits. But just what flavor of inspections? Fagan, Gilb, and Ebenau all tout their approaches as "best," and have supporting data to prove it. In addition, the existing Motorola software review training contained many characteristics of software inspections.

The authors went into research and brainstorming mode, and reasoned that the problem required both selecting the appropriate inspection process and addressing how the process is institutionalized. Further, results needed to be quantified, so that decisions could be made in an objective manner.

The model needed to be credible, so after development of the model, a number of activities were planned. They included both reviews by experts and calibration of the model against effective inspection practices and results within Motorola.

FORMAL INSPECTION BEST-IN-CLASS MODEL

The Formal Inspection Best-In-Class (FIBIC) model is a compilation of the best practices utilized by the many organizations that successfully implemented and evolved formal inspections since they were first introduced. It is developed from industry literature


on formal inspections, discussions with software process experts, and, of course, it reflects heavily the engineering and inspection experience of the authors. The model is also strongly influenced by the SEI CMM model [8].

The model is divided into two components: Effective Process Design and Effective Process Implementation. Process design defines the major characteristics of a Best-In-Class Formal Inspection process, while process implementation defines the key management and organizational characteristics required to support and sustain successful execution of an effective formal inspection process.

The model contains a total of 57 main questions. Some questions have multiple subquestions. Each subquestion counts as an equal fraction of the main question. Scoring is accomplished by summing the number of "yes" answers to the main questions for each part of the model. Each main question is weighted equally. The results are then transformed into a percent-yes score.
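As a minimal illustration of this scoring scheme (a sketch only; the question structure and answers below are hypothetical and not taken from the published model), the percent-yes calculation can be expressed as:

```python
# Sketch of the FIBIC percent-yes scoring described above.
# Only the arithmetic follows the paper: each main question carries equal
# weight, each subquestion counts as an equal fraction of its main question,
# and the total is expressed as a percent-yes score.
# The example answers are hypothetical.

def score_main_question(sub_answers):
    """A main question scores the fraction of its (sub)answers that are 'yes'."""
    return sum(sub_answers) / len(sub_answers)

def fibic_score(main_questions):
    """main_questions: one list of booleans per main question."""
    total = sum(score_main_question(subs) for subs in main_questions)
    return 100.0 * total / len(main_questions)

# Hypothetical example: four main questions, one with three subquestions.
answers = [
    [True],               # single-part question answered yes
    [True, False, True],  # three subquestions -> counts 2/3 of a question
    [False],
    [True],
]
print(f"Score: {fibic_score(answers):.0f}%")  # (1 + 2/3 + 0 + 1) / 4 = 67%
```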

MODEL OUTLINE

Details of the model are presented in the

appendix. The following outline of the model shows the model elements and the number of associated questions. 1.0 Effective Inspection Process Design

1.1 Policy and Guidelines (3 questions)
1.2 Well-defined Inspection Steps

- Planning (9 questions) - Pre-Inspection (3 questions) - Inspection Meeting (5 questions) - Post-Inspection (4 questions)

1.3 Formal collection and use of process and product data (4 questions)

2.0 Effective Inspection Process Implementation
2.1 Commitment to perform (5 questions)
2.2 Ability to perform (9 questions)

CATEGORY                              | HIGH PROCESS MATURITY ORGANIZATION | FAGAN INSPECTION ORGANIZATION | MOTOROLA SOFTWARE REVIEWS TRAINING
Policy and Guidelines                 | 3/3         | 3/3         | 1/3
Defined Steps - 1. Planning           | 8/9         | 9/9         | 6/9
Defined Steps - 2. Pre-Inspection     | 1/3         | 2/3         | 2/3
Defined Steps - 3. Inspection Meeting | 3/4         | 2/4         | 4/4
Defined Steps - 4. Post-Inspection    | 3/3         | 2/3         | 3/3
Formal Data                           | 2/3         | 3/3         | 1.5/3
TOTALS                                | 20/25 = 80% | 21/25 = 84% | 17.5/25 = 70%

Table 3: Calibration scores for Effective Process Design


2.3 Activities performed (5 questions)
2.4 Measurement, analysis, and improvement (7 questions)
2.5 Verifying implementation (3 questions)

MODEL VALIDATION The model validation process included

reviews by experts and scoring against existing Motorola inspection processes.

Model reviewers included Ron Radice and Stewart Crawford. Radice worked with Michael Fagan in developing the Fagan Inspection method, and formerly served as director of the SEI process program. Radice currently consults on inspections and other software process topics. Stewart Crawford worked in Bell Labs' software quality division where he worked with Bob Ebenau and others in institutionalizing inspections. He currently consults in software process and technology.

Both reviewers were impressed with the model, and stated no significant changes were needed. Radice suggested the model be incorporated into the next release of the SEI CMM.

For the Effective Process Design component of the model, the process utilized by one of Motorola's leading software process maturity organizations was applied to the model and scored. In addition, a Motorola organization that had successfully utilized Fagan Inspections since 1992 was likewise scored against the model, along with Motorola's own software reviews training. Results of this scoring are shown in Table 3.

Table 3 entries show the ratio of yes answers to total questions for each section of the Effective Process Design portion of the model. It should be noted that some of the questions in the model were not present at calibration, and thus there is a small difference in the total question count in the current model and Table 3. Questions were added in the Formal Data component of the Process Design section, and Verifying Implementation questions were added to the Process Implementation model.

Both the Motorola benchmark organizations shown in Table 3 have published impressive


benefits from the use of inspections. One organization has reported on the order of $15 million cost avoidance as a result of code inspections, and a similar number for inspection of requirements, design, and other documents [6]. The other provided the authors with 1994 data on completed projects demonstrating that over 80% of faults are identified through their inspections, with significant cycle-time improvements (inspections took five times less cycle time than testing) [7]. These data, coupled with the high scores achieved by the Motorola benchmark organizations against our inspection model, strongly suggest that organizations effectively using inspection processes scoring 80% or better in this model will yield significant quality, productivity, and cycle-time benefits.

MODEL APPLICATION Application of the model in the subject

organization was done in two activities. First, the review sections of their software development process documents were scored against the model to determine the effectiveness of their process as defined. Second, a series of focus group sessions was held with members of the organization. Five independent focus groups were conducted with teams of managers, software engineers, and process engineers. The Effective Process Implementation component of our model was used as the catalyst for dialog during sessions, stimulating discussion on each section. All group members were provided the model in advance to prepare for the meeting. Based on the results of each focus group session, the authors scored yes/no for each question in the model.

Review yield, efficiency, phase containment, and other quality data were collected from projects. These data were compared to the inspection process definition and execution scores from the model. We found high consistency among the results. Although not statistically validated, the low scores from the model were reflected in the lack of quality improvement results.

These results made it possible for the authors to answer the process effectiveness and improvement approach questions posed in the Quality Improvement Study section above. Based on the model-driven results, the study team provided recommendations including modifying the reviews section of the organization's software process, developing an effective inspection training program for managers, inspection leaders, and practitioners, and engaging a software inspection expert to assist in this effort. Findings and recommendations were presented to the organization's senior management. As a result of this activity, a significantly improved software inspection process was put in place by the organization. In addition, the Motorola software training


was updated to address the deficiencies highlighted by the FIBIC model.

FUTURE PLANS The authors continue to use the model in

their process improvement activities within Motorola. Additional information is being gathered to calibrate and further substantiate the validity of the model. In addition, the model was used by Motorola's Software Quality Council to help drive and measure the use of formal inspections throughout Motorola.

Additional future activities may include statistical validation of the model and experimentation with weighting factors on the questions.

CONCLUSIONS

A best-in-class model to measure effective inspection processes has been presented. This model is validated based on the inspection processes and results of several Motorola organizations. The model provides an excellent basis to conduct an objective engineering study on review and inspection processes in any software environment.

THE 8TH INTERNATIONAL CONFERENCE ON SOFTWARE QUALITY will be held jointly with the Pacific Northwest Software Quality Conference on Oct. 12-15, 1998.

The authors believe this model can be applied quickly and effectively by any software development organization striving to achieve excellent software quality results. It can serve as a tool to identify potential "root cause" problems in software quality processes. And it can be an effective model to assure a "qualified" result in the SEI CMM Peer Review key process area.

ACKNOWLEDGMENTS

The authors wish to acknowledge the support received from numerous individuals within Motorola in conducting this study. Ken Biss, software quality manager, was instrumental in supporting this effort. He planned and coordinated the study's activities within the target organization, including planning and executing the focus group sessions, collecting data, and providing the project visibility and communication with the study team and senior management. The authors also acknowledge the strong support by two Motorola senior executives, Dennis Sester and Dan Coombes, who endorsed the study and fostered actions to make positive changes based on results.

REFERENCES
1. D. Freedman and Gerald Weinberg, Handbook of Walkthroughs, Inspections, and Technical Reviews: Evaluating Programs and Projects, Little, Brown and Company, 1982.
2. Tillson and Walicki, HP Software Engineering Productivity Conference Proceedings, April 1985.
3. Michael Fagan, "Design and code inspections to reduce errors in program development," IBM Systems Journal, No. 3, 1976.
4. Tom Gilb and Dorothy Graham, Software Inspection, Addison-Wesley, 1993.
5. Robert G. Ebenau and Susan H. Strauss, Software Inspection Process, McGraw-Hill, 1994.
6. Steve Hooczko, "Taking Inspections to the Limit," Fourth International Conference on Software Testing Analysis and Review, May 1995.
7. Sarala Ravishankar, "Cost of Quality System for Software Development," First World Congress for Software Quality, San Francisco, June 1995.
8. Paulk, Curtis, Chrissis, and Weber, Capability Maturity Model for Software, Version 1.1, Software Engineering Institute, Carnegie Mellon University, Pittsburgh, PA, 1993.

APPENDIX: FORMAL INSPECTION BEST-IN-CLASS MODEL

1.0 EFFECTIVE INSPECTION PROCESS DESIGN

1.1 Policy and Guidelines
Do policies exist in the project plan or process documentation that clearly define what work items are required to be inspected?
Do guidelines exist in the project plan or process documentation for inspection preparation and execution rates?
Do policies and guidelines exist in the process documentation for tailoring the inspection process?

1.2 Well-defined Inspection Steps

Planning
Is the inspection process documented and are the documents easily accessible and up-to-date?
Is the role of each inspection team member clearly defined in the process documentation? Do inspection team roles include the following:
• Leader or Moderator - senior-level person responsible for ensuring that the inspection material is properly prepared, sufficient preparation time is available, reviewers are prepared for the inspection, and the inspection meeting is successful.
• Recorder - responsible for clearly and accurately recording all defects during the inspection and publishing the inspection report.
• Presenter - someone other than the author responsible for presenting the inspection material during the inspection.
• Reviewers - subject matter experts responsible for finding defects.
Are all inspection leaders required to complete formal inspection leader training prior to leading an inspection?
Is inspection training required for all team members?
Are checklists defined for each type of inspection to help identify typical defects?
Are inspection entrance and exit criteria clearly defined in the process documentation for each inspection type?


ATTENTION! SOFTWARE QUALITY ENGINEERS

Looking for software quality engineering courses to help IMPROVE your job performance and the quality of your company's software products? ASQ has two courses to help you BUILD a solid foundation in the key knowledge and skills needed to ensure software quality.

American Society for Quality

ASQ

SOFTWARE QUALITY/August 1998

Building Software Quality Skills
Oct. 7-9, 1998 - Greensboro, NC - #98219
Oct. 16-17, 1998 - Portland, OR - #98248C

Software Quality Engineering
Sept. 28-Oct. 2, 1998 - Milwaukee, WI - #98215
Nov. 9-13, 1998 - Houston, TX - #98238

CALL ASQ TODAY TO REGISTER OR FOR MORE DETAILED INFORMATION! 800-248-1946 Priority Code PDPSSD7


Is the inspection leader responsible to ensure that inspection material is complete, team members are prepared, and the inspection is successful?

Do guidelines exist in the process documentation for supporting work items required for each inspection type?

Are inspection document formats clearly defined and sample templates available in the process documentation for the following:
• Inspection meeting notice?
• Pre-inspection report?
• Inspection error/comment logs?
• Inspection reports?

Pre-Inspection
Is inspection material required to be distributed far enough in advance to allow the inspection team sufficient preparation time?
Are inspection team members required to complete a pre-inspection log that summarizes defects found and preparation time prior to inspections?
Is responsibility assigned to ensure inspection entry and exit criteria are satisfied prior to conducting the inspection meeting?

Inspection Meeting
Do guidelines exist for the number of inspection team participants and functions needed for each type of inspection?
Are inspections limited to finding defects and not evaluating solutions?
Is the inspection material presented by someone other than the author?
Are all issues recorded by the recorder and agreed to by the team?
Are individuals responsible for inspection team members' performance appraisals excluded from inspections?

Post-Inspection
Do inspection exit criteria exist to guide the final disposition of the inspection?
Is a post-mortem performed to classify defects by type and source?
Is responsibility assigned for publishing a report to document the inspection results?
Is someone other than the author responsible to schedule and ensure that all defects are resolved?

1.3 Formal collection and use of process and product data


Is collection of the following inspection data required?
• Preparation time?
• Inspection time?
• Number of attendees?
• Inspection found defects?
• Work item size (pages, NCLS)?
• Phase defect detected?
• Phase defect introduced?
• Defect severity classification?
• Defect type classification?

Are inspection process monitoring metrics required?
• Average size of inspection team
• Inspection Rate - Pages/Hr. or NCLS/Hr.
• Preparation Rate - Pages/Hr. or NCLS/Hr.

VISIT THE SOFTWARE DIVISION WEB SITE: http://www.asq.org/sd/swqweb.html

Are inspection effectiveness metrics required? (An illustrative computation of these metrics follows the section 1.3 question list.)

• Inspection Efficiency - Inspection Found Defects / (Inspection + Preparation Effort)
• Inspection Yield - Inspection Found Defects / Total Defects
• Defect Phase Containment - Phase Detected vs. Phase Introduced
• Cost to fix

Are inspection improvement metrics defined?
• Escaped defect frequency by defect type
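The effectiveness metrics above reduce to simple ratios. As an illustrative sketch only (the function and field names and the sample figures are hypothetical; the formulas follow the bullet definitions in section 1.3):

```python
# Illustrative computation of the inspection metrics defined in section 1.3.
# Field names and sample values are hypothetical.

def inspection_rate(pages, inspection_hours):
    """Inspection Rate: pages inspected per meeting hour."""
    return pages / inspection_hours

def inspection_efficiency(found_defects, inspection_hours, preparation_hours):
    """Inspection Efficiency: defects found per hour of inspection plus preparation effort."""
    return found_defects / (inspection_hours + preparation_hours)

def inspection_yield(found_defects, total_defects):
    """Inspection Yield: fraction of all known defects found by the inspection."""
    return found_defects / total_defects

def phase_containment(defects):
    """Fraction of defects detected in the phase in which they were introduced."""
    contained = sum(1 for d in defects if d["detected"] == d["introduced"])
    return contained / len(defects)

# Hypothetical inspection record
print(inspection_rate(pages=24, inspection_hours=2))         # 12.0 pages/hour
print(inspection_efficiency(18, inspection_hours=2,
                            preparation_hours=4))            # 3.0 defects/hour
print(inspection_yield(found_defects=18, total_defects=30))  # 0.6
print(phase_containment([
    {"detected": "design", "introduced": "design"},
    {"detected": "test", "introduced": "design"},
]))                                                          # 0.5
```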

2.0 EFFECTIVE INSPECTION PROCESS IMPLEMENTATION

2.1 Commitment to perform
Is someone responsible for monitoring and improving the inspection process?
Does the project plan identify specific work products to be inspected? Criteria? Goals?
Do inspection leaders cancel inspections if the inspection team is not adequately prepared or the review material is incomplete, unsatisfactory, or does not meet entrance requirements?
Is sufficient time allocated for inspections in project schedules?
Are inspection issues supported in performance appraisals of managers and technical staff? I.e., are managers and technical staff members rewarded for the effectiveness of inspections?

2.2 Ability to perform
Do all inspection team members receive training in the inspection process?
Do inspection leaders receive training in how to lead inspections?
Do all software managers receive inspection training?
Are checklists used to guide inspections?
Are checklists updated based on inspection results and escaped defect root cause analysis?
Is sufficient time allowed to prepare for inspections?
Are sufficient facilities available (meeting rooms, etc.) for inspections?
Are tools available for collecting and reporting inspection metrics?
Does a list of certified inspection leaders exist?

2.3 Activities performed
Are inspection results reported to and reviewed by project managers?
Are pre-inspection logs completed prior to inspections?
Does the inspection leader or designee follow up to ensure all identified defects are resolved?
Are inspections used for all relevant work products and phases of software development?
Are the entry/exit criteria adhered to?

2.4 Measurement, analysis, and improvement
Is someone assigned responsibility for collecting, analyzing, and reporting inspection data/metrics?
Are inspection metrics collected?
• Process monitoring metrics?
• Process effectiveness metrics?
• Process improvement metrics?
Are inspection metrics used to establish inspection process improvement goals?
Are inspection metrics used to establish criteria to determine inspection meeting disposition (i.e., prep rate, inspection rate, etc.)?
Are improvements made to the inspection process based upon inspection metrics?
Are improvements solicited from technical staff and management involved in the inspection process?
Are inspection trends shared with inspection teams and managers?

2.5 Verifying implementation
Does SQA ensure that inspections are held and conform to the inspection process guidelines?
Does SQA ensure that entry/exit criteria are adhered to?
Does SQA review inspection results?

Have you visited http://www.asq.org/sd/swqweb.html? It is the Software Division Web site. I'd like your input into this page. What do you need as an on-line reference to the division and software quality activities? What are good links? What information can't you find?

We have special areas for metrics, methods, standards, and certification. There are links to information about 8ICSQ and the Software Quality Professional journal. Our most visited area remains Career Opportunities. What else would you like to see on the site?

I'll be working on the design of the Web and adding content as it is submitted. Hope you can become an active partner in our Web page. With your help it will continue to expand with useful information. Send e-mail to [email protected].

The Software Division is continuing to provide copies of study materials for the Certified Software Quality Engineer (CSQE) exam that were used by a Washington, DC-area study group that has the highest pass rate (82%) in

the country. Not all topics in the body of

knowledge are covered (copyrighted materials


Subscribe to Software Quality Professional and You'll Never Have to Find Out.

Reserve your premiere issue of ASQ's newest journal by subscribing for a full year.

800-248-1946


8th International Conference on Software Quality / 16th Pacific Northwest Software Quality Conference

Sponsored by: ASQ's Software Division and PNSQC
October 12-15, 1998, Oregon Convention Center - Portland, OR

The full conference package includes a choice of full and half-day tutorials on Monday, October 12, and Thursday, October 15. On October 13-14 the conference will feature a variety of technical papers on software quality issues.

Keynote Presenters: Tom DeMarco, "Characteristics of the Learning Organization"

Tom Gilb "Process Out, Quality In" http://www.result-planning.com

Why You Should Attend:
• Get the facts about what is happening today in software quality.
• Network with your peers to compare what works and what doesn't.
• Participate in hands-on tutorials and learn in-depth methods.
• Take tools and techniques back to your office that will increase your productivity.

Conference Pricing            | Price | After Oct. 1
Conference only               | $295  | $345
Conference + 1 workshop       | $475  | $525
Conference + 2 workshops      | $675  |
1 workshop only               | $225  | $275
2 workshops only              | $450  | $500

Registration: The conference registration fee includes admission to all sessions and exhibits, lunches, reception, and the conference proceedings. Workshop registration includes lunch and a workbook. Monday's workshop price is for either 1 full-day or 2 half-day workshops. See registration form to sign up for specific sessions and workshops.

Cancellation Policy: If you cancel your registration for the conference or workshops by October 1, we will refund $100 for each registration and send you a copy of the proceedings. No portion of the fee will be returned after October 1.

PNSQC Tax Number: 911-084-551

SOFTWARE QUALITY I August 1 998

October 12-15, 1998

Conference Registration Information: Contact Pacific Agenda at 503-223-8633 (ask for Terri Moore) or visit our Web site www.pnsqc.org.

Hotel Information: DoubleTree Hotel, Portland Lloyd Center, 1000 NE Multnomah, Portland, OR 97232. The direct number is 503-281-6111. Reservations can be made at 800-222-TREE and must be made by Sept. 1998. Ask for the PNSQC conference rate.

Airline Discounts: Discounts on airfares to the PNSQC/8ICSQ in Portland, OR, from the United States and Canada are being offered by American Airlines. Please call Olson Travel Service at 800-847-5921 or 414-784-1060 from 9:00 am to 6:30 pm Eastern Time, Monday through Friday. Ask for Dawn Anderson or Nancy Jebavy and identify yourself as a Software Quality Conference attendee.

Or call American Airlines at 800-433-1790 from 6:00 am - 1:00 am Eastern Time, daily. Refer to Star File number 2108UU.

Car Rentals: Avis car rental is offering discounts on automobile rentals for conference attendees. To make your reservations, call Olson Travel Service at 800-847-5921 or Avis reservations at 800-331-1600 and refer to ID number J095146.

Post-Conference: ASQ offers "Building Software Quality Skills," course item number 98248C, on Oct. 16-17, 1998. For more information call 800-248-1946.

Preliminary Program

WORKSHOPS

Monday Full Day - 8:30 am - 5:30 pm
Evolutionary Project Management - Tom Gilb, Results Planning Limited
Software Reuse: Positive or Negative - Ronald Leach, Howard University
Managing Business Requirements - Brian Lawrence, Neal Reizer, Compaq Computers

Monday Half Day - 8:30 am - 12:30 pm
A Quick Introduction to Software Reliability - Jarrett Rosenberg, Sun Microsystems
A Methodology for Writing High Quality Requirement Specifications and for Evaluating Existing Ones - Linda Rosenberg, Unisys
Statistical Process Control for Software - Mark Paulk, Software Engineering Institute

Monday Half Day - 1:30 pm - 5:30 pm
Using the Software CMM With Judgment: Small Projects & Small Organizations - Mark Paulk, Software Engineering Institute
Riding the Wild Spaghetti - Addressing the Proliferation of Standards for Software Engineering - William J. Deibler II, Robert Bamford, SSQC
Leading an Effective Post-mortem, Postpartum, or Post-Project Review - Norm Kerth, Elite Systems

Thursday pm
Practical Software Test Case Design - Karen Bishop-Stone, Testware Associates, Inc.
Quantitative Techniques for CMM Level 4 - David Card, Software Productivity Consortium
21 Ways to Make Sure Requirements Are Right - Robin Goldsmith
The Cost of Software Quality (CoSQ): Turning Inspiration Into Action - Herb Krasner, Krasner and Associates
Quick Start for Project Risk Management - Joyce Statz, TeraQuest
Requirements Driven Management: A Complete Advanced Approach to Software and Systems Engineering - Tom Gilb, Results Planning Limited


8th International Conference on Software Quality / 16th Pacific Northwest Software Quality Conference

October 12-15, 1998

Preliminary Program

Tuesday, October 13, 1998

Time

10:30 am

1:30 pm

3:30 pm

Management Track | Metrics Track | Testing Track

Critical Chain - Doing Development Selecting and Designing Metrics by Defining Test Policies & Standards Faster With Quality Example: Two Successful Paradigms Invited Speaker Cheryl Moore, FedEx Richard E. Zultner, Zultner & Company Claire Lohr, Lohr Systems Corporation Avoiding Litigation: Reflections of an Experiences Implementing Software A Testing Strategy for Outsourced Expert Witness Measurement Multi-Platform Software Solutions

Brian Lawrence, Coyote Valley Software; Beth A. Layman and Sharon L. Rohde, Lockheed Martin Mission Systems

A Cost Estimation Based Approach to Testing, Requirements, and Metrics Quantify Software Risk Evaluation L. Rosenberg, L. Huffman, Unisys Results

Peter Hantos, Xerox

From VICTIM to VICTOR: The Trials and Tribulations of a Neophyte Software Testing Manager

Elizabeth A. Adams, Louisiana Workers' Compensation Corporation

T. Hammer, NASA

Management and Technical Opportunities and Barriers to Applying Statistical Continuous Improvement to Software Quality

Mervin E. Muller, Ohio State University

Transferring Software Development Metrics and the Financial Health of Best Known Methods Between Software

Kersti Nogeste, SMS Consulting Group Pty. Ltd., Australia

Toward Quality Programming in the Automated Testing of Client/Server Applications

Huey-Der Chu, National Defense Management College, Taiwan

John Dobson, University of Newcastle upon Tyne, U.K.

The Automated Build Verify Test: A Valuable Time Saver

Bruce Kovalsky, Fidelity Technology Solutions

An Excel-Based Test Harness for Windows Software

Generational Product Lines

James R. Bindas, Intel Corporation

Anticipating and Mitigating the Professional Challenge to Independent Verification & Validation

Paul Doherty, Oregon Graduate Institute; Robert Bales, Tektronix, Inc.

James B. Dabney, Intermetrics, Inc.

James D. Arthur, Virginia Tech.

Don Springer, University of Portland Assessment of COTS Products From

Successful Strategies for an Operating Systems Perspective

Implementing Software Metrics at Ronald J. Leach, Howard University Intel

Invited Speakers Paul Dittman, Jeanne Yuen-Hum, Intel Corporation

Wednesday, October 14, 1998

Time

10:30 am

Process 1 Track

Strategic Planning Process - How to plan Improvement

Mary Sakry, Neil Potter, The Process Group

Putting the PSP to Practice

Jeff S. Holmes, Motorola Cellular Infrastructure Group

Bonnie E. Melhart, Texas Christian University

Process 2 Track

Using the Software CMM With Judgment: Small Projects & Small Organizations

Invited Speaker Mark Paulk, Software Engineering Institute

Improved Software Quality by Adopting Control Charts

Anders Subotic and Niclas Ohlsson, Linkoping University, Sweden

Testing Track

Hierarchical Organization of Test Cases

Michael Ensminger, Panes Corporation

So You Think You Know Objects?

Invited Speaker Ray Lischner, Tempest Software

New Track

Making Test Cases From Use Cases Automatically

Invited Speaker Robert M. Poston, Aonix

Panel discussion including mystery guests. See Web site for details.

Invited Speakers

Process 3 Track

Developing a Software Quality Model to Improve Supplier Performance

Craig Smith, Motorola Cellular Infrastructure Group

Software Excellence Award Winner Presentation


8th International Conference on Software Quality / 16th Pacific Northwest Software Quality Conference

October 12-15, 1998

Wednesday, October 14, 1998

Time

1:00 pm

3:00 pm

Process 1 Track

Successful Implementation of ISO 9001/TickIT in a Software Development Company

Cecilia M. Yourshaw, VLS Inc.

A Problem-Based Approach to Software Process Improvement: A Case Study

Johanna Rothman, Rothman Consulting Group, Inc.

I've Been Asked to Review This Specification: Now What Do I Do?

Preliminary Program

Process 2 Track | Testing Track

Process Synergy: Using ISO 9000 and Improving the Test Process Using the CMM in a Small Development New Technology Environment

Sharon E. Miller, Northstar Consulting Group

An Integrated Software Audit Process Model to Drive Continuous Improvement

Neda L. Gutowski, Motorola Cellular Infrastructure Group

JUSE Panel Discussion: Evaluation and Future of Process Innovation

Fareed K. Shaikh, Lawrence E. Niech, Automatic Data Processing

Experience Report: Comparing an Automated Conformance Test Development Approach With a Traditional Development Approach

Alan Goldfine, Gary Fishel, Lynne Rosenthal, NIST

Nuts 'n Bolts Experiences in Code Coverage Analysis

Process 3 Track

Stepwise Improvement of Your Test Processes Using TPI

Invited Speaker Martin Pol, IQUIP Informatica B.V., The Netherlands

Karen Bishop-Stone, Testware Associates; Ryuzo Kaneko, Mitsuru Ishio, NEC; Pemmaraju S. Rao, Richard Vireday, Intel Corporation

The Effects of the Year 2000 Problem on the Wintel Duopoly's Control of Internet Commerce in the International Marketplace for Multimedia Mindshare - How Do We Get These Damn Things to Work?

Inc. Corporation

IEEE/EIA 12207 as the Foundation for Yukihiko Akasaka, NTT Data Enterprise Software Processes Corporation

Beyond Coverage Analysis - Time Optimization of Regression Testing

Invited Speaker Lincoln Spector, Computer Journalist, Columnist, Humorist

James W. Moore, The MITRE Corporation; Koji Yoshizaki, Topcon Corp.; Roy Trammell, Performance Tools Group

For the most up-to-date conference information, visit our Web site at: www.pnsqc.org

were removed), but titles of the additional materials they used are included. The price of these materials is $35 (our reproduction and shipping costs) for U.S. addresses and more for foreign locations (based on the country). To order a set, provide your mailing address to Claire Lohr (e-mail [email protected] or phone at 703-391-9007). The materials will be shipped with an invoice and payment instructions enclosed (no prepayment required - we trust you). If you have any study materials you would like to share with other Software Division members, please contact Claire Lohr.

The following are some useful tips about taking the CSQE exam: Take lots of references (do not get embarrassed by the luggage cart you need for hauling them all). I personally


took lots of references, and wished I had brought more. And yes, I did use them - mostly to look up the "right" answer when I

was pretty sure I knew what it was, but why take a chance? Put index tabs in the references (you KNOW where you are weak, don't you?). Take breaks each hour. I was the only one doing this (the adult attention span is 45 minutes), and I was also the only one who got done in three hours (and yes, I DID pass it). Trust me, you are going to need ALL of your brain cells, including the already tired ones, for some of THOSE questions. Do a "per hour" budget for how many questions you need to have done, so you can take a break without worrying about how far along you are. Keep telling yourself that you can miss 30% (don't get too excited when you draw a total blank on some of them - there are lots more questions still left to get right). And remember - when you pass, they DON'T

give you your score (so neither you nor anyone else has to know for sure just how close it

was). Conventional wisdom on tests is that your first reaction is your best. It is therefore recommended that you NOT go back and review your answers. Supposedly when people do this, they change more right answers to wrong than vice versa. So, give it your best shot the first time around. Go out to lunch with your fellow examinees afterward to do a post-mortem together.


What Does the Course Comprise?

The course is the official Department of Trade and Industry (DTI) TickIT Auditors' Training Course, presented by registered TickIT auditors. It is a combination of lectures, realistic IT case studies, workshop exercises, and practical role playing. The course complies with the standards for TickIT auditor training courses set by the IQA International Register of Certificated Auditors (IRCA). Delegate participation, in the form of case studies, classroom presentations, and role play exercises, comprises approximately 40% of the course time.

Typical Course Timetable

Day 1 - Monday: Quality Concepts; The Role of the Auditor; Quality Management

Day 2 - Tuesday: Managing Quality; Software Quality Systems; The TickIT Scheme; The Audit System; The Relationship Between ...

Certificate of Achievement

The course is formally registered with the IQA International Register of Certificated Auditors of Quality Systems under the TickIT Scheme. Delegates who demonstrate satisfactory performance throughout the course and who pass the final examination are awarded a Certificate of Achievement. This certificate demonstrates the necessary training requirement has been met, and such delegates are eligible to apply to the IQA International Register of Certificated Auditors for registration as TickIT Auditors.

Course Format and Content

At the start of the course, each delegate is supplied with a comprehensive course manual. Copies of relevant ISO standards and the TickIT Guidance Handbook are available to each delegate during the course.

Day 3 - Wednesday | Day 4 - Thursday | Day 5 - Friday

Audit Responsibilities | Software Review | Validation and Verification

The Closing Meeting | Audit Reporting | Corrective Action and Follow-up

IT Services and In-house Facilities

Internal Audit | Third-Party Assessment and Surveillance Systems | Standards and Guides | Quality Management Systems Documentation | ISO 9001 and ISO 9000-3

Audit Planning and Preparation | Checklists

Audit Investigation | Evaluating Effectiveness | Configuration Management, Security, and Archiving

Recording Noncompliances

Accreditation and TickIT Auditors | Course Review

Examination

The Opening Meeting

Demonstrate your interest in maturing the Canadian software industry for competition around the world. Attend the TickIT Course yourself and send staff as a KEY STEP in the TickIT accreditation process. For more information please contact Stephen White at 613-839-1836 ([email protected]) or Larry Jones at 613-837-8823 ([email protected]).

For more information about the ASQ and its activities, contact Larry Jones.

SPICE Assessor Training Course #2 - ISO 15504
Hosted by: ASQ Ottawa Valley Section 407
November 23-27, 1998
Ottawa, Ontario

OBJECTIVES

In this workshop on Software Process Improvement and Capability dEtermination (SPICE):

• You will gain up-to-date information and an in-depth understanding about the emerging standard on Software Process Assessment.

• The training will prepare you to participate as an effective member of a SPICE assessment team.

• This course will also satisfy the training requirements leading to a QAI Certified SPICE Assessor (CSA).

You will also benefit:

• from learning how SPICE could be designed to fit into your corporate TQM program,
• from improving your learning about best practice in software process improvements and capability determination
• from the experience of using a SPICE Assessment Tool


SUMMARY DESCRIPTION OF COURSE

This course will provide comprehensive and practical training in SPICE Software Process Assessment, Process Improvement, and Capability Determination.

Some of the topics that will be covered include:
• SPICE - Background, Principles, Architecture, and Model
• Process Assessment
• Capability Determination
• Process Improvement
• Process Categories: Customer-Supplier, Engineering, Management, Supporting, Organization
• SPICE Assessment: Preparation, Conduct, Using Tool and Assessment Instrument, Determining Ratings, Validating Ratings
• Case Studies & Role Play

THIS COURSE IS BASED ON THE SPICE EMBEDDED MODEL.

FOR MORE INFORMATION, CONTACT THE ASQ OTTAWA VALLEY SECTION 407 VIA EITHER:

Larry Jones, Education Chair, 613-837-8823 ([email protected]) or Stephen White, Software Focus Chair, 613-839-1836 ([email protected])


DIVISION COUNCIL MEMBERS

Jayesh Dalal - Chair 732-970-0859 [email protected]

Linda Westfall - Chair-elect 972-519-3779 [email protected]

Sue McGrath - Past Chair, Internet Liaison 919-677-8000 ext. 7032 [email protected]

Sharon Miller-Treasurer 732-933-9834 [email protected]

G. Timothy Surratt-Secretary 630-713-5522 [email protected]

Robin Fralick-Standards 610-838-8279 [email protected]

Carol Grangel-Parker-Standards 703-288-4245 [email protected]

Doug Hamilton-Certification 847-714-3306 [email protected]

Claire L. Lohr - Education & Training 703-391-9007 [email protected]

Barbara Frank - Publications 972-519-3297 [email protected]

Rick Biehl - Strategic Planning 407-296-6900 ext. 703 [email protected]

Taz Daughtrey - Liaison, Journal Editor 804-522-5137 [email protected]

Terry Deupree - 8ICSQ 972-519-4762 [email protected]

David Zubrow - Metrics, Measurement, and Analytical Methods 412-268-5243 [email protected]

Kim Cobler-Methods 801-363-7770 [email protected]

John Pustaver-Awards 508-443-4254 [email protected]

Pam Case-Membership 301-261-3805 [email protected]

4 = Canada

** Includes Mexico

REGIONAL COUNCILORS

Region 1 - John Pustaver 508-443-4254 [email protected]

Region 2 - Jean Burns 607-779-7868 [email protected]

Region 3 - Bill Folsom 203-385-4339 [email protected]

Region 4 - Stephen White 613-763-6326 [email protected]

Region 5 - Joel Glazer 410-765-4567 [email protected]

Region 6 - M.P. Kress 425-266-0545

Region 7 - Harry Ohls 626-798-1160 [email protected]

Region 8 - Brenda Eldridge 614-213-0216 [email protected]

Region 9 - John Lowe 937-429-6458

Region 10 - Open

Region 11 - Lynn Loftin 919-… …@computer.org

Region 12 - Bob Colby 630-979-6783 [email protected]

Region 13 - Paul Caracciolo 303-424-4262 ext. 204 pcara…

Region 14 - Mike Epner 972-727-6391 [email protected]

Region 15 - Armin Torres 305-789-2639 [email protected]

Region 25 - Martin Swafford 214-363-6181 ext. 411 [email protected]


CSQE Exam
March 6, 1999
Application deadline is Jan. 8, 1999
Call ASQ for information.

8ICSQ & PNSQC
Oregon Convention Center
October 12-15, 1998
Portland, OR
See p. 7

Software Division Council Meetings
DoubleTree Hotel, Portland Lloyd Center

Strategic Planning - October 10

Council Meeting - October 11
Open Meeting - October 11

Submit articles for the next issue of Software Quality by Oct. 20, 1998.

Barbara Frank
Phone: 972-519-3297
Fax: 972-519-5164
e-mail: [email protected]

ENHANCE your career and self-esteem. ADVANCE within your organization.

EDITOR
BARBARA L. FRANK, MS QRA05, 1000 Coit Rd., Plano, TX 75075
voice: 972-423-5989 (home); 972-519-3297 (business)
fax: 972-519-5164
[email protected]

EDITORIAL REVIEW BOARD
JAYESH G. DALAL, Chair
LINDA WESTFALL, Chair-Elect
BARBARA FRANK, Publications Chair

EDITORIAL POLICY
Unless otherwise stated, bylined articles, editorial commentary, and product and service descriptions reflect the author's or firm's opinion. Inclusion in Software Quality does not constitute endorsement by ASQ or the Software Division.

ADVERTISING
FULL PAGE - $500 per issue
1/2 PAGE - $250
1/4 PAGE - $125
Discounts apply - Contact Barbara Frank, 972-519-3297

REALIZE your salary goals. AFFIRM your commitment to quality.

REMINDER: The next Certified Software Quality Engineer exam is March 6, 1999 (application deadline is Jan. 8, 1999). Call 800-248-1946 and request item B0110.

Priority Code CTAPSD8
