Surviving Performance Improvement "Solutions": Aligning Performance Improvement Interventions
Mariano L. Bernardez
First, do no harm.
—Hippocrates
According to a 1981 study, approximately one-third of patients' illnesses in a university hospital were caused by treatment (Steel, German, Crescenzi, & Anderson, 1981). With approximately 225,000 deaths per year, treatment-caused, or iatrogenic,¹ factors are the third leading cause of death in the United States, following heart disease and cancer (Starfield, 2000).
According to statistics and research from Weingart, Ship, and Aronson (2000), treatment-caused deaths break down according to these leading causes:

- Unnecessary surgery: 12,000
- Medication errors in hospitals: 7,000
- Other errors in hospitals: 20,000
- Infections in hospitals: 80,000²
- Nonerror, negative effects of drugs: 106,000
The most common causes of treatment-caused deaths are (1) misdiagnosis, (2) drug interaction, (3) "nosocomial"³ infections, and (4) incorrect procedures.
Solutions-caused problems are not limited to health care centers. Those familiar with home renovation can relate their experiences to the classic film Mr. Blandings Builds His Dream House (Potter, 1948),⁴ where a New York
PERFORMANCE IMPROVEMENT QUARTERLY, 22(2), pp. 111-127
© 2009 International Society for Performance Improvement
Published online in Wiley InterScience (www.interscience.wiley.com). DOI: 10.1002/piq.20057
How can organizations avoid the negative, sometimes chaotic, effects of multiple, poorly coordinated performance improvement interventions? How can we avoid punishing our external clients or staff with the side effects of solutions that might benefit our bottom line or internal efficiency at the expense of the value received or perceived by clients and investors and our shared world, a world we all live in and depend on? Facing a multibillion-dollar consulting industry pushing new "solutions" every year that might end up causing new and unexpected problems, serious performance consultants and managers know that blaming the law of unintended consequences will not prevent clients from leaving, staff turnover, general organizational turmoil, and even national and international consequences associated with change. Because change is also vital, this article introduces a systemic, multilevel framework to align performance improvement interventions; avoid systemic disruption; and measure and eliminate overcosts, rework, and negative side effects of change.
publicist and his wife, longing to buy a country house in nearby Connecticut to escape Manhattan's crowded apartments, go through an ordeal of overcosts, rework, and conflicts with uncoordinated contractors, engineers, and architects; the couple ends up tearing down and rebuilding their new place in twice the expected time and at several times the original budget.
Unlike classic Hollywood happy endings (the Blandings do get their dream house and live there happily ever after), organizations engaging in ambitious organizational change or performance improvement (PI) programs often end up experiencing recurrent, complex, systemic hangovers caused by multiple, uncoordinated, and sometimes even conflicting solutions.
Did the CFO launch a management by objectives (MBO) initiative while the COO instituted a total quality management (TQM) program? Did anybody read Deming's 11th commandment of TQM?⁵ Did the CFO and COO know that those two solutions could be strongly antagonistic? Did they have the time or the tools to check the compatibility of such complex solutions? Did the IT department purchase a costly and promising enterprise resource planning (ERP) system without knowing that the CEO had just signed off on an ambitious merger agreement with a former rival with a noncompatible IT architecture? Are several departments enthusiastically engaged in time-consuming, meeting-based training and organizational development initiatives to increase employees' engagement and improve the climate?
These and others are early warnings that our organization is descending into some sort of "transformational chaos" that will sooner or later become a larger problem on its own.
Changes in strategic plans, management, mergers and acquisitions, and departmental initiatives, combined with the multiple models, solutions-trained specialists, and fads of a multibillion-dollar consulting industry, usually end up producing a level of systemic chaos that Gloria Gery (1992) aptly characterized as "organizational flagellation" and that is, in our experience as well as probably in yours, one of the major sources of employee turnover, demotivation, and resistance to participation in so-called performance improvement initiatives.
Examine our checklist of early signs of solutions-caused, iatrogenic problems in Table 1. Consider each yes a flag for a potentially serious problem.
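As a minimal sketch of how such a checklist might be scored (the indicator wordings below are abbreviated from Table 1, and the flag-counting logic is our assumption; the article simply treats each yes as a flag):

```python
# Hypothetical scoring of the Table 1 early-warning checklist:
# each "yes" answer counts as one flag for a potential alignment problem.

checklist = {
    "Solution-named needs assessment underway": True,
    "Several functional PI programs running simultaneously": True,
    "No metrics defined for vision and mission": False,
    "Balanced scorecard built by adding up functional goals": True,
    "Complaints about too many improvement meetings": False,
}

flags = [indicator for indicator, answer in checklist.items() if answer]
print(f"{len(flags)} warning flag(s) raised")
for indicator in flags:
    print(" -", indicator)
```

Even one flag suggests that interventions may be organized around solutions rather than organizational results, and warrants a closer look.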
As solutions vendors (deceitfully self-introduced as solution consultants⁶) push to sell their solutions to companies' functional areas (human resources, IT, finance, marketing, and sales being the most prolific buyers) that in turn pull from inside for their functional priorities (their climate
surveys, ERPs, MBOs, and CRMs), a flurry of unconnected and frequently conflicting change or improvement initiatives takes place in the organization.⁷
Confusing performance improvement with solutions implementation often occurs because this approach takes for granted that the mere lack of a given solution or resource (MBO, ERP, CRM) is a genuine organizational need or gap in organizational results. If the systemic factors that cause the original problem are ignored, solutions consultants will operate like the subcontractors on the Blandings' house project, creating newly bred problems and causing systemic chaos. Furthermore, the solutions approach increases the chance of focusing needs assessment and interventions on optimizing subsystems, such as sales or financial performance, at the expense of organizational performance and external clients', investors', or societal interests. Without a shared focus on organizational and external clients' results (Kaufman, 1972) rather than functional efficiencies, every new functional solution can become a nightmare for some other area, or for the entire organization.
Think of IT defining spam, or purely administrative criteria setting automatic Internet utilization filters, at a university with thousands of online educational users. Conversely, think of an enthusiastic distance education
TABLE 1 "Solutions" Alignment Problems: Early Warning Signals Checklist

Indicators (answer yes or no to each):
- We are implementing a "[name of solution]" needs assessment
- We are improving a "[name of function]" performance improvement (PI) program
- We are improving several "[function]" performance improvement programs simultaneously
- Complaints about too many meetings for improvement initiatives
- PI projects at the functional level
- Executive compensation tied to functional performance
- Vision and mission are general, rather philosophical statements, with only soft implementation
- No metrics are defined for vision and mission
- Each department or function has a written vision and mission statement
- Functions have strategic plans
- There is no common definition of what is strategic (other than what the [superior-level official] wants)
- Balanced scorecard is based on adding up functional goals
- Budget is based on adding up functional budgets and plans
- People complain about too much time spent in improvement or change projects at the expense of getting work done
department in the same university launching a new, bandwidth-hungry virtual classroom on the standard IT network at peak time.
Function-focused solutions also fail because they treat a systemic problem (such as organizational performance or behavior) with partial fixes that ignore systemic connections and interactions between subsystems, at their own peril.
An MBO-based incentives program focused on improving individual results might end up rewarding behaviors and decisions that produce losses to the organization, such as maximizing mortgage sales at the expense of credit risk. A unilateral effort in maximizing bank tellers' courtesy and cross-selling efforts might end up with a torrent of client complaints about slow service. An equally partial emphasis on fast service might also end up causing the bank to send clients who are looking for financial advice straight to the nearest competitor.
The root cause of most solutions-caused problems is the lack of a systemic, companywide, comprehensive model for planning and managing interventions. Left to solutions vendors, performance improvement interventions tend to run out of control, like the contractors on the Blandings' dream house: increasing costs and rework, reducing the chances for actual improvement, and, last but not least, making life miserable for the organization's staff and clients.
Systemic Analysis Breakthrough: Performance Improvement Models
From early origins in the work of Kaufman (Kaufman, Corrigan, & Johnson, 1969; Kaufman, 1972), Brethower and Rummler (Brethower, 1972; Rummler & Brache, 1995), and Gilbert (1978), those in the performance improvement and performance system⁸ fields developed a unique focus on systemic analysis and solution that emphasizes:
(1) Considering performance and behavior as functions of a larger context or performance system (Brethower, 1972, 2007)

(2) Defining need as a gap between current and desired results, not as a lack of resources or as a subjective want (Kaufman, 2006)

(3) Analyzing how all the factors interacting in a performance system affect performance and performer, and one another, instead of blaming the performer (Gilbert, 1978; Rummler & Brache, 1995)

(4) Considering not just the individual, job-level factors (Gilbert, 1978) but also processes, organization (Rummler, 2004), and societal context (Kaufman, 2006)
This is the good news. The bad news is that because all PI/HPT (human performance technology) models were developed independently and successively in response to the challenges of different performance
levels (individual, organizational, strategic, societal performance), they do not fit very well and tend to be used as alternative approaches rather than complementarily.
Like the sages in the Sufi tale of the blind men examining an elephant, those using a single model usually fail to get the whole picture and reduce their chances of coming up with a comprehensive solution; they fall into the trap of multiple, disconnected, and finally dysfunctional initiatives whose success is at best temporary.
Integrating Three Performance Levels
Although there are multiple performance improvement models (more than 46, according to a number of studies: Bernardez, 2006; Dean & Ripley, 1997; ISPI, 2006; Kaufman, Thiagarajan, & MacGillis, 1997), they can be classified into three main categories according to their focus and scope:
1. Individual performance models: Gilbert's Six Boxes (1978); Mager's performance analysis algorithm (Mager & Pipe, 1983); Spitzer's context of work (Spitzer, 1986, 1995)

2. Organizational performance models: Rummler's Anatomy of Performance, or AOP (Rummler & Brache, 1995); Brethower's Total Performance System, or TPS (Brethower, 1972); Tosti and Carleton's Organizational SCAN (Vanguard Consulting, 1996; Carleton & Lineberry, 2004); Langdon's Language of Work (1995)

3. Strategic, societal performance models: Kaufman's Organizational Elements Model, or OEM (Kaufman, Corrigan, & Johnson, 1969; Kaufman, 2006)
Individual Performance Models. Individual performance models, such as Gilbert's classic Behavior Engineering Model, or BEM (shown in Table 2), are quite helpful in understanding and optimizing performance at the job level, that of the individual worker.
Organizational Performance Models. A few years later, Gilbert's former business partners, Geary Rummler and Dale Brethower, took the entire approach to performance analysis and improvement several steps further in the systemic direction, noticing that use of Gilbert's BEM model frequently led to optimizing individual workers' performance at the expense of process and organizational performance.
If every worker were allowed to "improve" his or her own activities at the job level on the basis of Gilbert's "six boxes," regardless of the other coworkers working ahead of them, behind them, or collaborating with them in a shared work process,
their collective performance would experience a noticeable setback, as would happen if each rower in a coxed four were to row at his or her own pace and rhythm.
Rummler and Brethower's⁹ Anatomy of Performance, or AOP, model started by envisioning three levels of performance nested one into and under the other: job level, process level, and function level (see Figure 1).
Rummler's AOP model (see Table 3) analyzes performance at three levels (job, process, and organization) and three levels of "performance needs" (goals, design, and management), considered from a performance management perspective. Rummler and Brethower's matrix includes at the lowest level all the key elements of Gilbert's BEM model, though not organized in six boxes.¹⁰
Strategic, Societal Performance Models. Although Rummler and Brethower's AOP and Tosti and Carleton's SCAN include references to the societal context, considered as the "suprasystem," their models do not give as intense attention to societal performance as does Roger Kaufman's Organizational Elements Model, or OEM.
TABLE 2 The Behavior Engineering Model

E: Environmental supports
- Data (S-D: Information): Relevant and frequent feedback about the adequacy of performance; descriptions of what is expected of performance; clear and relevant guides to adequate performance
- Instruments (R: Instrumentation): Tools and materials of work designed scientifically to match human factors
- Incentives (Motivation): Adequate financial incentives made contingent upon performance; nonmonetary incentives made available; career-development opportunities

P: Person's repertory of behavior
- Knowledge: Scientifically designed training that matches the requirements of exemplary performance; placement
- Capacity: Flexible scheduling of performance to match peak capacity; prosthesis; physical shaping; adaptation; selection
- Motives: Assessment of people's motives to work; recruitment of people to match the realities of the situation

Source: Gilbert (1978).
Kaufman's model, later reframed as part of his Mega planning methodology, focuses on the planning process, particularly on differentiating the truly strategic part (represented by Mega-level societal results driven by a minimal ideal vision, or MIV,¹¹ of the future) from tactical levels such as benefits for the organization (macro-level goals such as revenue, market share, and profit) and the operational, or micro, level (which for Kaufman starts at the outputs of products and services) and includes activities¹² and resources, defined as inputs for activities (see Figure 2).
Kaufman's model focuses on establishing vertical alignment among strategic, tactical, and operational results. Such alignment is, according to
FIGURE 1. THREE LEVELS OF PERFORMANCE (RUMMLER & BRACHE, 1995)
TABLE 3 Geary Rummler's Organizational "Nine Boxes"

Performance needs, by performance level: Goals | Design | Management

Organization level:
- Goals: Organization objectives and indicators (macro and micro)
- Design: Organization design (macro and micro)
- Management: Organization management (macro and micro)

Process level:
- Goals: Process objectives and indicators
- Design: Process design
- Management: Process management

Job level:
- Goals: Job and task objectives and indicators; resource levels and requirements
- Design: Job and task design; resource allocation system
- Management: Job and task management; resource management

Source: Rummler & Brache (1995).
Kaufman, the only way to guarantee delivering actual value to external stakeholders, keeping the organization useful and focused.
Kaufman's OEM model is a uniquely helpful guide to aligning all internal elements (after all, organizations are not ends unto themselves but means to achieve results and produce value) and to making sure that our sequence of definitions follows an outside-in, top-down order rather than the other way around. Using Kaufman's OEM as a "builder's plumb line" (to continue with the home improvement analogy), we can keep our performance improvement efforts honest and aligned with results and actual value for external stakeholders as well as the survival of the organization.
Integrating Kaufman's OEM, Rummler and Brethower's AOP, and Gilbert's BEM as shown in Table 4, we can connect and align the three levels of PI models and get a complete, systemic blueprint of what is involved in change and its probable impact on the organization's overall performance.
The "Falling Water" Path to Establishing PI Intervention Sequence. Applying principle number one of all performance improvement models (adopting a systemic view), we must start by positioning each intervention in one or more of Table 4's 12 cells, according to the levels and steps where the intervention operates.
Following a "falling water" path, those PI interventions that operate at higher levels of the framework (from upper left corner to lower right corner, all across the table) have wider impact and determine the effectiveness of those at lower levels. A change in our client definition, for example, affects our strategic programs. Changes in our strategic programs subsequently affect strategic goals and rules, which in turn may trickle down to the organizational goals, or macro level, in our P&L statement.
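To illustrate, the "falling water" rule can be sketched as a sort over framework coordinates. The level and step names, and the sample interventions and their cell placements, are illustrative assumptions of ours, not prescriptions from the article:

```python
# Hypothetical sketch of the "falling water" sequencing rule:
# interventions positioned in the multilevel framework (Table 4)
# are ordered top-left to bottom-right, so higher levels run first
# and, within a level, objectives precede design precede management.

# Framework rows (levels), highest impact first; names are illustrative
LEVELS = ["mega", "organization", "process", "job"]
# Framework columns (steps)
STEPS = ["objectives", "design", "management"]

def falling_water_order(interventions):
    """Sort interventions by the (level, step) cell they occupy."""
    def key(item):
        name, level, step = item
        return (LEVELS.index(level), STEPS.index(step))
    return [name for name, *_ in sorted(interventions, key=key)]

plan = [
    ("Process reengineering", "process", "design"),
    ("Vision and mission", "mega", "objectives"),
    ("IT infrastructure", "job", "objectives"),
    ("Market development plan", "mega", "design"),
]
print(falling_water_order(plan))
# Vision and mission comes first; IT infrastructure comes last
```

The sort key simply encodes the table's upper-left-to-lower-right reading order, which is why a Mega-level objective always precedes any process- or job-level initiative.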
When facing multiple performance improvement interventions, Table 4 helps us (1) make sure that we start by implementing those interventions with the highest and broadest systemic impact first, (2) establish a more effective sequence of PI interventions, (3) explore the
FIGURE 2. ORGANIZATIONAL ELEMENTS MODEL (OEM). Source: Kaufman (2006).
downstream effects of each PI intervention on other non-PI programs, and (4) keep our organizational budget under control.
Conversely, if we start our performance improvement initiatives at lower levels, we may face unplanned uphill battles against overpowering, upper-level resistance factors that will increase the cost and effort required to achieve success and the risk of systemic failure; an example is a costly IT system at the process level that ignores the strategic priorities of our external clients, wasting their time or delaying our organization's effective response to their demands.
The guiding questions summarized in Table 5 help detect and prevent potential misalignment between levels (vertical) and steps (horizontal).
Sequencing Change Interventions. We are not done yet in avoiding nasty collisions between solutions. According to Davis and McCallon (1974), "All change is disorienting. Too much change in too little time is destructive."
TABLE 4 Aligning Models: Multilevel Framework

Columns: Objectives (goals, standards, and indicators) | Design (how-to, programs) | Management (implementation, control)

Societal and external (Mega) level (community, clients, market, suppliers, value chain):
- Objectives: Mega objectives and indicators
- Design: Social and organizational plan; strategic directions related to community, market, people, and suppliers
- Management: Social and regional management; market policies and regulations

Organization level (macro: organizational results; micro: products):
- Objectives: Organization objectives and indicators (macro and micro)
- Design: Organization design (macro and micro)
- Management: Organization management (macro and micro)

Process level (internal services):
- Objectives: Process objectives and indicators
- Design: Process design
- Management: Process management

People and resources level ("six boxes," individual performer):
- Objectives: Job and task objectives and indicators; resource levels and requirements
- Design: Job and task design; resource allocation system
- Management: Job and task management; resource management
TABLE 5 Aligning Performance Levels: Key Questions (columns: Objectives, Design, Management)

- Is our organization adequate to meet the demands of new realities, clients, and markets?
- Is organizational design aligned with strategic vision and mission?
- Is management focused on developing and serving future clients?
- Do we have a clear value proposition for the clients and markets that are the key to the future?
- Is there a design of the future or desired organization?
- Does management communicate fluently and effectively with clients, market, suppliers, and community?
- Is our organization adding measurable value to the clients, market, and communities it serves?
- Are adequate indicators defined and in place to measure accomplishment and progress in achieving the strategic vision and mission?
- Does management invest enough time in exploring future trends and changes in the market, client base, supply chain, and community?
- Are the vision and future strategy articulated adequately to be communicated and to guide implementation?
- Are strategies correctly aligned and communicated top-down and inside-out?
- Are all key organizational processes and functions clearly defined and implemented?
- Are each function's goals clearly defined and coordinated with strategy and other functions?
- Are our strategies compatible with our SWOTs?
- Are all current functions adequate and coordinated?
- Is relevant function performance measured?
- Do we have clearly defined products and standards for each level of results defined in our strategy?
- Are the products and services that link all functions consistent and adequate?
- Are resources adequately assigned?
- Do we have defined measurable results and standards at the Mega, macro, and micro levels?
- Do the current organizational structure and functions adequately support the organizational strategy and performance?
- Are interfaces between functions adequately coordinated?
- Are those results aligned and compatible with each other?
- Are our Mega results enough to support and sustain our macro goals?
- Are clear goals and standards defined for all key processes?
- Are the current processes the most effective and efficient to achieve goals and meet performance standards?
- Are goals and standards for all key processes and subprocesses clearly defined?
- Are those goals and standards aligned with the organization's and clients' requirements?
- Is process performance adequately measured?
- Are resources adequate for each key process?
Even combining all three performance levels (external, organizational, individual), we might discover that, like multiple contractors without a shared blueprint and plan, each improvement intervention cancels, reverts, or forces us to redo previous ones, increasing costs (as when carpets installed before piping or electric wiring must be removed and reinstalled).
Practical Application in a University. Let's put our integration framework to use in establishing a sequence for change and performance improvement interventions. We will use as an example a real case we solved using the tool.
Our client, a 12,000-student university, was implementing multiple change and performance improvement initiatives, most started at the functional level, and they were bringing the entire organization to a grinding halt. People complained about change at every level while fighting bitterly to give their own initiatives priority.
Over a 4-year period, a total of 10 performance improvement initiatives were implemented, in this order:
1. Each department launched new educational programs based on its experts' assessment of the most valuable specialties and technological careers for the next decade.

2. A 10-year IT infrastructure "master plan" was launched to "unify the response" and systematize multiple departments' requirements, eliminating redundancies and "setting common standards."
TABLE 5 Aligning Performance Levels: Key Questions (continued)

- Are process interfaces adequately coordinated?
- Are job goals and standards clearly defined and communicated to performers?
- Are process requirements adequately supported by the jobs and tasks involved?
- Do performers know and understand standards?
- Are job goals and standards adequately aligned with process requirements?
- Are tasks and job steps adequately sequenced?
- Are resources and job design adequate?
- Are adequate policies and procedures in place?
- Are incentives adequate for meeting standards?
- Are layout and technology adequate to support tasks and jobs?
- Do performers know when they reach goals?
- Are they competent?
- Is the job environment or context adequate?
- Do performers have the required capacities?
3. Because of interdepartmental conflicts during the first year of the IT plan, the university launched a cross-functional communications and team-building initiative.

4. As a result of the findings of the communications initiative, the university redesigned a common value chain linking educational programs with common goals.

5. To ensure alignment between academic and administrative functions, both departments defined balanced scorecards (BSCs) and strategy maps, following Kaplan's methodology (Kaplan & Norton, 1996, 2004).

6. On the basis of the goals defined by the BSC process, the university launched a market development plan to assess needs for future educational programs.

7. As a consequence of the conflict between the new educational programs (already designed in step 1 of this list) and the market development study findings unveiled in step 6, all university departments decided to "update" their shared strategic vision and mission on the basis of Kaufman's OEM concepts.

8. Responding to evidence of insufficient exchange of knowledge and know-how across disciplines, the university started an interdisciplinary knowledge management initiative.

9. From the findings of both the BSC and communications programs, a cross-functional change management initiative was launched to facilitate migration from the traditional culture to a new one.

10. Responding to recommendations from the change management initiative, the university launched a process reengineering program in order to transition from a "functional" to a "process management" methodology.
The consequences of all this zigzag decision making, driven by reactions and fixes to the unexpected consequences of other improvement steps, were dire. At the time of our first assessment, most research professors were investing more time in multiple performance improvement initiatives than in their primary research or teaching jobs.
Morale was sinking, complaints were mounting, and the department heads and career directors felt confused and frustrated. One of the directors summarized the change management team's feelings about the overall performance improvement process at that time: "We're deep into Alice in Wonderland's rabbit hole."¹³
Like homeowners lost in a vast renovation project without blueprints, the multiple rabbit holes created by each performance improvement step were delivering new systemic emergencies to address, with no end in sight.
Back to Reality: The PI Interventions Alignment Tool. On the basis of the university's newly defined vision and mission and its specific, measurable indicators focused on the university's external clients' priorities, we helped the university management get out of their "rabbit hole" by revising and
resequencing their multiple performance improvement initiatives. We asked the team to organize the 10 interventions using our multilevel PI model alignment framework (see Table 4), with the results shown in Table 6.
From this new sequence, we estimated the gap between the trial-and-error sequence followed originally and what would be a logical top-down sequence. We asked the evaluation team to estimate the consequences of each gap, monetizing the cost wherever possible. Table 7 shows the results.
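The gap arithmetic behind Table 7 can be reproduced in a few lines. The dollar figures are the article's estimates; the gap convention (ideal rank minus actual rank) is our reading of the table:

```python
# Sketch of the Table 7 gap calculation: gap = ideal rank - actual rank.
# A positive gap means the intervention ran too early relative to its
# ideal place in the sequence; a negative gap means it ran too late.

initiatives = {
    # name: (actual order, ideal order, estimated cost of the gap in $)
    "New educational programs": (1, 10, 50_000),
    "IT infrastructure": (2, 8, 50_000),
    "Communications and team building": (3, 9, 75_000),  # 40,000 + 35,000
    "Value chain redesign": (4, 3, 30_000),
    "Balanced scorecard": (5, 4, 25_000),
    "Market development plan": (6, 2, 50_000),
    "Vision and mission": (7, 1, 0),  # cost subsumed by the rows above
    "Knowledge management": (8, 7, 12_000),
    "Change management program": (9, 6, 20_000),
    "Process reengineering": (10, 5, 30_000),
}

gaps = {name: ideal - actual
        for name, (actual, ideal, _) in initiatives.items()}
total_cost = sum(cost for _, _, cost in initiatives.values())

print(gaps["New educational programs"])  # gap of 9: run far too early
print(total_cost)
```

Summing the monetized gaps reproduces the table's total of $342,000 in potential savings.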
Colleges and other professional organizations established around specialized professional disciplines have a known tendency to create parallel, redundant planning and organization structures. University departments tend to operate in relative isolation from each other because, according to Daniels and Mathers (1997), "professionals go to their tasks alone; they gain skill from their own experience and the sharing of that experience with fellow professionals." This is often the reason academics fail to understand
TABLE 6 Aligning Performance Improvement Interventions: Example

Social and external (Mega) level (community, clients, market, suppliers, value chain):
- Objectives: Vision and mission (1)
- Design: Market development plan (2)

Organization level (macro: organizational results; micro: products):
- Objectives: Value chain redesign (3)
- Design: Balanced scorecard (4)

Process level:
- Design: Process reengineering (5)
- Management: Change management (6)

Internal services:
- Knowledge management (7)

People and resources ("six boxes," individual performer):
- IT infrastructure (8); communications and team building (9)
- New educational programs (10)
that "reality," as Roger Kaufman (2006) used to say, "is not divided in[to] disciplines or departments" (p. 127).
Such a tendency, however, only aggravated the alignment problems: the root cause of the conflicts, rework, and frustration that ended in a $342,000 loss and, more important, increased resistance to change was the lack of an integrated, multilevel model to organize and prioritize a number of PI interventions, putting the organization's results (its "health") ahead of those of the functional areas and their solutions consultants ("the medicine").
Furthermore, using this comprehensive, multilevel intervention organizer as a common framework allowed all change advocates championing specific solutions to identify and prevent systemic problems and to reorganize the existing PI programs effectively, instead of competing and fighting with one another.
TABLE 7 PI Intervention Misalignment: Calculating the Costs of "Nonquality"

Initiative | How it happened (solutions-focused) | How it should have been prioritized | Gap | Impact and cost of the gap
New educational programs | 1 | 10 | +9 | Redesign, low enrollment ($50,000)
IT infrastructure (5 years) | 2 | 8 | +6 | Repurchase, retrofit ($50,000)
Communications and team building | 3 | 9 | +6 | Downtime ($40,000), repetition (overcost $35,000)
Value chain redesign | 4 | 3 | -1 | Lost contracts ($30,000)
Balanced scorecard (BSC) | 5 | 4 | -1 | Rework ($25,000)
Market development plan | 6 | 2 | -4 | Opportunity cost ($50,000)
Vision and mission | 7 | 1 | -6 | All of the above, misalignment
Knowledge management | 8 | 7 | -1 | Loss of information ($12,000)
Change management program | 9 | 6 | -3 | Resistance, conflicts ($20,000)
Process reengineering | 10 | 5 | -5 | Rework ($30,000)

Potential savings using the PI intervention alignment framework: ($342,000)

Note. A + (plus) gap implies that the intervention was incorrectly run ahead of others; a - (minus) gap implies the intervention was incorrectly delayed.
Conclusion
Improving organizational performance is too important to be left to multiple performance improvement models or solutions consultants.
Failing to align and integrate multiple PI solutions has not only immediate, measurable, and costly consequences but also the long-lasting effect of creating or reinforcing change aversion in the organization and its internal and external stakeholders. Using a multilevel, organization- and external-client-focused framework may help organizations and consultants achieve better results with less pain and effort.
This, by the way, is what performance improvement is all about.
Notes
1. The term iatrogenesis (from the Greek iatros, physician) refers to adverse effects or complications caused by or resulting from medical treatment or advice.
2. 2008 statistics show a growth in treatment-caused deaths and illnesses caused by hospital-bred "superbugs" resistant to antibiotics (Landro, 2008).
3. Nosocomial: related to or acquired in a hospital or treatment center.
4. This classic comedy, starring Cary Grant and Myrna Loy, became a cult film, generating "tours" to the Blandings house and a sequel, The Money Pit (1986), with Tom Hanks.
5. One of the 14 points for TQM implementation, by W. E. Deming, TQM's founder: "Eliminate arbitrary numerical targets: Eliminate work standards that prescribe quotas for the work force and numerical goals for people in management" (Deming, 2000, p. 45).
6. "Consultant" has increasingly become a code word for a salesperson of a solution rather than what it meant in the management or business associated with the likes of Peter Drucker, Roger Kaufman, C. K. Prahalad, or Geary Rummler. Rummler appropriately titled his latest book Serious Performance Consulting as an indictment of such unethical practices (Rummler, 2004).
7. In a not-infrequent case in our experience studying dysfunctional performance improvement initiatives, two functional areas launched their own balanced scorecard initiatives, limited to their functional silos, and had their own separate visions and missions, as if they were separate organizations. Interestingly, they hired the same solutions vendor, which did not find the situation abnormal or unethical.
8. Also called more recently (and controversially) human performance technology by the International Society for Performance Improvement (ISPI).
9. According to their own report, Geary Rummler and Dale Brethower started expanding and questioning the primitive BEM model during their years of research together and, after parting for decades (Rummler to consulting, Brethower to academia), developed two models, AOP and TPS, that were in essence variations of a common one. They rebaptized it Anatomy of Performance and have been working in recent years at ITSON with AOP.
10. Although Gilbert's BEM formulation separates "environmental control" factors (see Table 2), these are considered part of the job context, as in a job description. The AOP model goes much further by differentiating job conditions such as those from process and organizational levels.
11. For a complete description of Kaufman's MIV, see Change, Choices, and Consequences (Kaufman, 2006) or Bernardez's Tecnologia del Desempeno Humano (2006).
12. Kaufman's activities are a more general equivalent to what AOP defines as organization and process levels.
13. Reference to the character in Lewis Carroll's novel Alice's Adventures in Wonderland, who in chasing a rabbit down a hole falls into a parallel world (Wonderland), a magic kingdom whose inhabitants (the Mad Hatter, the Queen of Hearts, and others) turn logic upside down (Carroll, 2000).
References
Bernardez, M. (2006). Tecnologia del desempeno humano [Human performance technology]. Chicago: Global Business Press.
Brethower, D. (1972). Behavioral analysis in business and industry: A total performance
system. Kalamazoo, MI: Behaviordelia.
Brethower, D. (2007). Performance analysis: Knowing what to do and how. Amherst, MA:
HRD Press.
Carleton, J. R., & Lineberry, C. (2004). Achieving post-merger success. San Francisco: Wiley.
Carroll, L. (2000). Alice’s adventures in Wonderland. London, UK: Signet Classics. (Originally
published in 1865)
Daniels, W. R., & Mathers, J. G. (1997). Change-ABLE organization: Key management
practices for speed and flexibility. Mill Valley, CA: American Consulting and Training.
Davis, L. N., & McCallon, E. (1974). Planning, conducting and evaluating workshops. Austin,
TX: Learning Concepts.
Dean, P., & Ripley, D. E. (1997). Performance improvement pathfinders: Models for
organizational learning systems. Washington, DC: ISPI.
Deming, W. E. (2000). Out of the crisis. Cambridge, MA: MIT Press.
Gery, G. (1992). Organizational flagellation. CBT Directions, 23–30.
Gilbert, T. F. (1978). Human competence: Engineering worthy performance. Amherst, MA:
HRD Press.
ISPI. (2006). Handbook of human performance technology (3rd ed.) (J. A. Pershing, Ed.). San
Francisco: Wiley.
Kaplan, R. S., & Norton, D. P. (1996). Translating strategy into action: The balanced scorecard.
Boston: Harvard Business Press.
Kaplan, R. S., & Norton, D. P. (2004). Strategy maps: Converting intangible assets into tangible
outcomes. Boston: Harvard Business Press.
Kaufman, R. (1972). Educational system planning. Upper Saddle River, NJ: Prentice-Hall.
Kaufman, R. (2006). Change, choices and consequences: A guide to Mega thinking and
planning. Amherst, MA: HRD Press.
Kaufman, R., Corrigan, R., & Johnson, D. (1969). Towards educational responsiveness to society's needs: A tentative utility model. Socio-Economic Planning Sciences, 3(1), 151–157.
Kaufman, R., Thiagarajan, S., & MacGillis, P. (1997). The guidebook for performance
improvement: Working with individuals and organizations. San Francisco: Jossey-
Bass/Pfeiffer.
Landro, L. (2008, September 17). Rising foe defies hospital wars on ‘‘superbugs.’’ Wall
Street Journal, D1–D6.
126 DOI: 10.1002/piq Performance Improvement Quarterly
Langdon, D. (1995). The new language of work. Amherst, MA: HRD Press.
Mager, R., & Pipe, P. (1983). Analyzing performance problems—Or you really oughta wanna.
Atlanta: Center for Effective Performance.
Potter, H. (Director). (1948). Mr. Blandings Builds His Dream House [Motion picture]. United
States: RKO Pictures.
Rummler, G. (2004). Serious performance consulting. Silver Spring, MD: ISPI/ASTD.
Rummler, G., & Brache, A. P. (1995). Improving performance: How to manage the white space
in the organization chart. San Francisco: Jossey-Bass.
Spitzer, D. (1986). Improving individual performance. Englewood Cliffs, NJ: Educational
Technology.
Spitzer, D. (1995). Super-motivation: A blueprint for energizing your organization from top to
bottom. New York: AMACOM.
Starfield, B. (2000). Is U.S. health really the best in the world? JAMA, 284(4), 483–485.
Steel, K., German, P. M., Crescenzi, C., & Anderson, J. (1981). Iatrogenic illness in a general
medical service at a university hospital. New England Journal of Medicine, 304(11),
638–642.
Vanguard Consulting. (1996). Organizational SCAN. Unpublished manuscript. San Diego,
CA: Author.
Weingart, S. N., Ship, A. N., & Aronson, M. D. (2000). Confidential clinician-reported surveillance of adverse events among medical inpatients. Journal of General Internal Medicine, 15(7), 470–477.
MARIANO L. BERNARDEZ
Mariano L. Bernardez is an international consultant specializing in developing new businesses. During a 30-year career as a consultant to Arthur Andersen, Andersen Consulting, and the United Nations Development Program, and as a manager and CEO in charge of running new startups and consulting firms in Latin America, Europe, and the United States, he has helped Fortune 500 companies, small businesses, multinational corporations, governments, and NGOs bring about new businesses and organizations. He is research professor and director of the International Institute for Performance Improvement at the Sonora Institute of Technology (ITSON) in Mexico, a Ph.D. and MBA program focused on improving social and organizational performance by developing new companies. He is the author of four books on the specialty and of multiple articles in peer-reviewed and professional publications. He has been a board director at the International Society for Performance Improvement (ISPI) and is a frequent presenter at ISPI, ASTD, IFTDO, and AMA international conferences. E-mail: mbernardez@expert2business.com