ACADEMIC EMERGENCY MEDICINE November 2000, Volume 7, Number 11 1223
The Cognitive Imperative: Thinking about How We Think
PAT CROSKERRY, MD, PHD
Abstract. There are three domains of expertise required for consistently effective performance in emergency medicine (EM): procedural, affective, and cognitive. Most of the activity is performed in the cognitive domain. Studies in the cognitive sciences have focused on a number of common and predictable biases in the thinking process, many of which are relevant to the practice of EM. It is important to understand these biases and how they might influence clinical decision-making behavior. Among the specialties, EM provides a unique clinical milieu of inconstancy, uncertainty, variety, and complexity. Injury and illness are seen within narrow time windows, often under pressured ambient conditions. These operating characteristics force practitioners to adopt a distinctive blend of thinking strategies. Principal among them is the use of heuristics, a form of abbreviated thinking that often leads to successful outcomes but that occasionally may result in error. A number of opportunities exist to overcome interdisciplinary, linguistic, and other historical obstacles to develop a sound approach to understanding how we think in EM. This will lead to a better awareness of our cognitive processes, an improved capacity to teach effectively about cognitive strategies, and, ultimately, the minimization or avoidance of clinical error. Key words: emergency medicine; cognition; errors; decision making; heuristics. ACADEMIC EMERGENCY MEDICINE 2000; 7:1223–1231
THERE are three major skill sets in the performance repertoire of emergency physicians (EPs): procedural, affective, and cognitive.1 It may appear to outside observers, as well as to many within the profession, that emergency medicine (EM) is predominantly action-oriented and that procedural skills are, therefore, the most important of the three. Indeed, during training, a considerable emphasis is directed at the acquisition and retention of skills such as intubation, wound repair, the insertion of chest tubes and central lines, diagnostic peritoneal lavage, lumbar puncture, and cast application. This procedural skill set is tangible, well defined, and teachable. While procedural skills are important and integral to an effective performance in the emergency department (ED), they comprise a relatively small part of the overall activity of EPs. Most of our time, in fact, is engaged in cognitive behavior; it is the preponderant substance of EM. Surprisingly, it has attracted relatively little attention, perhaps because of an undervalued regard for the importance of exactly how we think. In a recent report, medical educators concluded that EM health care workers were often ". . . not consciously aware of how they evaluate evidence and cope with decision complexity."2

From the Department of Emergency Medicine, Dalhousie University, Queen Elizabeth II Health Sciences Center, Halifax, Nova Scotia, and the Department of Emergency Medicine, Dartmouth General Hospital, Dartmouth, Nova Scotia, Canada (PC). Received February 28, 2000; revision received March 29, 2000; accepted July 6, 2000. Supported in part by a grant from the Department of Emergency Medicine at Dalhousie University, Halifax, Nova Scotia, Canada. Address for correspondence and reprints: Pat Croskerry, MD, PhD, Department of Emergency Medicine, Dartmouth General Hospital, 325 Pleasant Street, Dartmouth, Nova Scotia, Canada B2Y 4G8. Fax: 902-465-8579; e-mail: email@example.com firstname.lastname@example.org
Proficiency in the cognitive domain, compared with that in procedural skills, is less easily defined, involves a much broader range of possibilities, and would appear to be less easily taught. Kassirer, in 1995, recognized the problem very clearly:
Research in the cognitive aspects of clinical problem-solving has ebbed, and a comprehensive theory of diagnostic and therapeutic problem-solving is not yet at hand. No matter how advanced our technology becomes, and no matter how far our computer systems evolve, the cognitive tactics and strategies of the clinician-problem-solver are not likely to be replaced in the foreseeable future. Those who are responsible for teaching students and residents these skills should try to identify clearly, separate, and then extract these critically important cognitive tasks from courses that encompass myriad unrelated skills and knowledge. . . . They should recognize that these critical cognitive skills constitute a specific body of knowledge and should find imaginative ways of teaching them.3
The problem that faces us involves thinking about the ways we think. We need to develop more awareness of, and insight into, the cognitive processes involved in decision making. If we understood more about these processes, we might better understand how cognitive errors occur, and how we might best teach others to minimize or avoid them. The argument is proposed here that the special milieu and prevailing conditions of EM impose a different style of thinking, with a unique blend of cognitive strategies, on those who work there. Importantly, there is a need to understand what cognitive science has to offer EM.
The remaining skill set is in the domain of affective behavior, which again is ill defined and less tangible. Emergency physicians and nurses have to deal with a broad range of emotions shown by colleagues, patients, and their relatives and friends. Many of these interactions are probably mediated by important experiential factors involving social transference and countertransference phenomena4: our likes and dislikes of others may be irrationally based on significant exemplars in our own past. Inappropriate affective responses toward others may also arise through causal attribution, the process by which we make value judgments about the behavior of others on the basis of situational or dispositional factors.5 More emphasis should be placed on the cognitive management of situations engendering a range of emotions that may run the full gamut from empathy to hostility. Emotional factors clearly impact on decision making and can lead to error. We do not make good decisions when our viscera are aroused. Again, as for cognitive behavior, we need to be thinking about how we feel, and understand the impact this may have on clinical decision making.
Cognitive and affective behaviors are largely covert, and involve a most critical activity, that of making decisions about patient diagnosis, management, and disposition. Despite a comprehensive literature on medical decision making, this area has yet to be systematically addressed within the discipline of EM. Historically, a greater emphasis has been placed on what we do rather than on what, or how, we think. All three components of performance influence each other; affective experiences clearly influence what we think,6 and cognition ultimately determines what we will actually do. For good calibration of performance7 in the ED, physicians must be knowledgeable and competent in all three domains.
Developing a perceptual awareness of how we think and feel is a necessary first step in understanding our cognitive behavior. William James emphasized that in order to perceive something we must not only be conscious of it but also be paying attention. For example, once a visual illusion has been explained to us, we can focus attention on the critical aspects of the signal and avoid being influenced to the same extent by distracting visual "noise."8 Perceptual accuracy, therefore, requires attention. We need to be thinking about how we think and feel.
THEORY AND PRACTICE
An argument might reasonably be made that there are enough distinctive features of EM to set it aside from mainstream medicine, and that both its theory and practice merit special treatment. A major difficulty in developing an approach to the cognitive behavior that underlies clinical decision making in EM is that we do not yet have a distinct theoretical framework for the discipline. First, there is a need to develop an epistemological foundation, i.e., a theoretical basis for the method and grounds through which knowledge will be acquired. On this we can build a sound cognitive theory that will be uniquely adapted to the specialized requirements of EM. The final step will be the development of specific cognitive skill sets that are easily understood and teachable. Importantly, the theory must complement practice.
Schein9 described a hierarchical order of prior-ities in strategies for the acquisition of knowledge.At the topmost end are epistemological considera-tions that require theorizing about the processes,methods, and grounds upon which knowledge isdeveloped, what he referred to as the underlyingdiscipline or basic science component.
The next level has been termed the "applied science," "engineering,"9 or "normative"10 component. In the cognitive domain, this refers to the empirical theory upon which practitioners base their reasoning, and that underlies the clinical decision-making process. Earlier models of clinical decision making, now referred to as classical decision theory, aimed at providing a formal, axiomatic methodology through which a clear endpoint could be reached. This process was based on utility theory and probability theory,11 with the Bayesian rule allowing an algebraic computation of diagnostic probabilities.12 Some form of probabilistic reasoning, though less mathematical, underlies the various approaches others have taken toward clinical decision making, such as the classic hypothetico-deductive strategy,13 pattern recognition,14 the algorithmic method,15 and the exhaustive strategy.16
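To make the algebra concrete, a minimal sketch of the Bayesian computation follows. The numbers (pretest probability, sensitivity, specificity) are hypothetical, chosen only to illustrate the mechanics of the update; they are not drawn from this article or any study.

```python
# Sketch of a Bayesian diagnostic update: the probability of disease
# after a positive test, from a pretest probability and the test's
# operating characteristics. All figures below are illustrative only.

def posterior_probability(prior, sensitivity, specificity):
    """Bayes' rule: P(disease | positive test)."""
    true_positive = prior * sensitivity
    false_positive = (1 - prior) * (1 - specificity)
    return true_positive / (true_positive + false_positive)

# Hypothetical pretest probability of 10%, with a test that is
# 90% sensitive and 80% specific.
post = posterior_probability(prior=0.10, sensitivity=0.90, specificity=0.80)
print(f"Post-test probability: {post:.2f}")  # 0.33
```

A single positive result moves the estimate from 10% to about 33%; the same algebra applies iteratively as further findings accumulate.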
The lower level of Schein's hierarchy, but that which has the most significance for EPs and nurses, is characterized as "descriptive,"10 reflecting the "skills and attitudinal" component,9 or what actually happens in clinical practice. Traditionally, these skills were acquired toward the end of formal medical training,17 although newer, innovative programs introduce them at earlier stages.18
A historical problem for physicians in general,19 and EPs in particular,20 is that the upper, theoretical levels of this hierarchy have been perceived as somewhat ethereal. They have been characterized as "in vitro," the concern lying more with the need to perform at an "in vivo" level.21 Schön describes this as the "rigor or relevance" dilemma,17 and Reason sees it as the cognitive reality departing from the formalized ideal.22 Elsewhere, somewhat more harshly, it is referred to as the difference between "clinical reality" and "abstract imagery."19 The message is fairly clear: for any theory of clinical decision analysis to be acceptable to EPs and nurses, it must be practical and comprehensible, and must enjoy some features of common sense.
In his excellent critique of this dichotomy between the mathematical approach toward decision making and the science of clinical reality, Feinstein makes the important objection that Bayesian clinical logic drives toward a diagnostic endpoint.19 A similar criticism is implicit in Rasmussen's interpretation of medical diagnosis as a dynamic process involving a choice of possible actions rather than as an isolated endpoint.23 These viewpoints represent a subtle but important change of emphasis in the theoretical basis of clinical decision making because EPs are often more concerned with management or therapeutic action than with achieving diagnostic closure. There are some obvious pitfalls associated with the latter.1,8
More recently, the classical or normative theory has come under renewed attack from proponents of the naturalistic model of decision making (NDM). This model has considerable appeal because many of its features and properties are congruent with the unique operating characteristics of EDs: the model avoids rigorous analytical strategies and emphasizes economy of thought and action; it accepts that decision making is more driven by situational variables, may be influenced by resource limitations, and temporally evolves; it acknowledges the important role of perceptual processing, varying cognitive strategies, and their cost, as well as the influence of past experience and habit; and importantly, it emphasizes competence, not failure, and it can accommodate a teamwork approach.24 It may well be what EPs are looking for. It more closely approximates their working environment, and provides a realistic and practical framework in which cognitive processes and clinical reasoning can be examined.
The goal, then, will be to find effective ways in which advances made at the theoretical levels are effectively communicated to the grassroots level amidst the variable topography of the EM landscape. This is the place that Schön refers to as the "swampy lowlands,"17 where "messy and indeterminate" problems will be encountered. This problem is not unique to EM. Others have identified similar gaps between what has converged into accepted, theoretical dogma and the demands of real-world practice.17 Emergency physicians frequently demonstrate their ability to use available convergent knowledge in a particular area and tailor it to the unique demands of a specific situation. Schein9 refers to this as divergent thinking, the divergence amounting to cognitive adaptation where theoretical dogma has failed. Thus, EPs and nurses will be interested particularly in the flexibility and comprehensiveness of the theoretical developments they are offered, and how they will translate into practice on the front line. Such theory should properly have its basis in cognitive science. To begin the process, we should give some consideration to the ways in which we think, and how we might reduce errors in our thinking.
There are three basic reasoning strategies, moral, deductive, and inductive,25 and all have application in the ED. We cannot escape moral reasoning and judgmental behavior in the ED. Everyone will have some moral stance toward how patients ought to be treated, involving what he or she might believe are right or wrong treatments, regardless of the consensus of experts. This is well illustrated in the management of psychiatric patients in the ED. Some studies have shown that emergency personnel may actually increase the risk of suicide in depressed patients,26,27 such iatrogenic contributions to suicide probably having their origin in countertransference.28 Moral reasoning clearly has no logical validity, depending, as it does, on individual and prevailing societal and cultural values. It is distinguished from judgments of taste, where, for example, we express opinions about what we like and dislike, often as a result of attribution, the process through which we might make erroneous inferences about patients' personal qualities on the basis of their behavior,29 or, again, through transference and countertransference feelings toward patients.4 There are many examples of these biases in the ED. A classic example is the patient with borderline personality disorder who might be diagnosed solely on the basis of the consistent antipathy he or she generates among health care providers.30 Similar problems often occur in our approach to other categories of patients such as alcohol or drug abusers, "frequent flyers," and those whom we perceive as misusing the ED. These feelings toward patients are often unavoidable, but awareness of such visceral biases may lessen their impact on our clinical judgment. As with moral reasoning, there is no logical validity in such taste preferences, and neither is there any justification for acting on them.
Deductive reasoning, performed correctly, does offer the possibility of logical validity. If the premises are valid in a deductive argument, then the conclusion logically drawn from these premises must be true. Emergency physicians make frequent use of pure deductive reasoning, but the context is often prosaic, involving simple decisions about whether a laboratory value, radiograph, or electrocardiogram is normal. While there is little sense of challenge or accomplishment in these routine, deductive decisions, they are the only instances in which we can say our thinking is truly valid. This point often goes unrecognized. Even the traditional hypothetico-deductive scientific method originated by Popper,13 and now classically incorporated into formal clinical decision making,12 is a misnomer in its current use in that there is never any guarantee, only an inductive inference of varying degrees of certainty, that the rejection of competing hypotheses on the differential diagnosis is logically valid.31 Making a particular choice from the differential diagnosis list often has more to do with clinical acumen than any rules of logic. Similar considerations apply to rigorously developed clinical decision rules32 in that, while they impart a quasideductive flavor, they are ultimately statements of probability, and are only as good as their attached sensitivities and specificities. While their originators are aware of this limitation, those who use them may not be. They remain inductive.
Most of our thinking in clinical practice is of the inductive type, and we should understand its nature and limitations. Inductive thinking is the logic of experience. If all the patients with acute myocardial infarcts we have ever seen have severe, retrosternal crushing chest pain, then the next one to come along with those symptoms will be similarly diagnosed, and a thoracic aortic dissection may be missed. Novices in the field are especially vulnerable to this inductive error, having a misplaced faith in the "law of small numbers."33 Experienced clinicians, who function more comfortably with less prototypical presentations of disease, have learned to live with the vagaries and deceptions of inductive reasoning, favoring instead the "law of large numbers."34 This holds that the more presentations that are seen, the more readily one will accept and correctly diagnose the atypical variant. Those who have learned well from experience are said to have clinical acumen, and while there is no substitute for experience, we might shorten the road by teaching some of the basic flaws and biases known to be present in everyday thinking. Importantly, there is never any guarantee that the conclusions drawn from an inductive argument are logically valid, and the humility that often characterizes experienced EPs testifies to that.
The special milieu of the ED is dominated by heuristic thinking, a cognitive process that simplifies our clinical decision-making operations.34 It flourishes under the uncertainty arising from the requirement to assess patients with whom the physician is usually unfamiliar, within narrow time frames, and often with limited resources. Heuristics are shortcuts, rules of thumb, maxims, or any strategy that achieves abbreviation and avoids the laborious working through of all known options in the course of problem solving. A classic example of a heuristic is Sutton's law, or going for "where the money is"12: if a 40-year-old man, previously well, presents to the ED with flank pain, nausea and vomiting, and hematuria, we move almost immediately to a working diagnosis of ureteral colic. We do not paralyze our decision making by systematically working through all the possibilities that this combination of symptoms and signs generates. These shortcuts considerably reduce the costs of search,35 and, in the vast majority of cases, we will be right.
Heuristics are highly inductive, but essential for achieving the economy of thought and action crucial to a well-calibrated performance in the ED. Under extreme conditions, they may take the "fast and frugal" form,36 becoming heuristically heuristic. Although they will often serve us well, inevitably they will be associated with error, and a price will be paid for cutting corners. Heuristics correspond to what Reason has termed "flesh-and-blood" decision making,22 the expression having more than metaphorical resonance, perhaps, for those who live in the swampy lowlands. It denotes the everyday, intuitive decisions that EPs make without resorting to a formal decision analysis. Good flesh-and-blood decision making leads to expedient patient disposition and optimal utilization of resources. We should know and understand heuristic thinking and the instances under which it occurs in the ED. Much has been written about the subject over the last 30 years in the field of cognitive psychology, and work continues on its refinement.
A number of errors in EM decision making arise from anchoring,33 the tendency to be unduly persuaded by features encountered early in the presentation of illness, thereby committing to a premature diagnosis.1 Attaching diagnostic labels to patients early on in their presentation is an easy way of falling victim to anchoring. Another common source of error is the powerful confirmation bias37 through which attention is directed disproportionately toward observations that appear to confirm a hypothesis instead of seeking evidence that might disprove it. The combination of anchoring with confirmation bias can dangerously compound the error; occasionally, diagnoses can gather momentum without gathering evidence. The conjunction fallacy is another source of error in which the likelihood of two or more independent instances occurring is overestimated through mistakenly linking them in a cause-effect relationship.38
Another significant source of errors in the ED arises from the process of bounded rationality,39 denoting a restrictive "keyhole" view of the problem we think confronts us.22 It is rather better described by the term "search satisficing,"40 in which the physician calls off the search for further abnormalities, having achieved satisfaction from finding the first. There are many illustrative examples of this in the ED, especially in radiological interpretation, where second or additional fractures or significant soft-tissue injuries are commonly missed. Often, we find what we expect to see and prematurely settle for it, omission errors being more common than false identification errors.41 Radiologists, too, are vulnerable to this type of false-negative error.41 Experience may reduce such errors, although the relationship may not be a simple linear one, depending on the degree of complexity of the task.22,42 These findings have broad applicability across a wide range of expert occupations,22 and we would expect them to be applicable to many ED activities.
Prevalence bias, or the tendency to misjudge the true base rate of a disease,43 is another source of cognitive error. It is a good example of how we can be misled by the representative heuristic.34 It is illustrated by the mistaken belief that circumstantial factors are representative of the event that we are anxious not to miss. For example, one study showed a fourfold overestimation of the probability of breast cancer in a woman with a positive mammogram by not taking into account the base rate of the disease and the correct detection rate of the imaging technique.44 Another example is the implicit assumption, often made, that patients presenting to the ED have more serious conditions than those who attend a clinic. Although it might be argued that such an assumption is a safe and prudent opening gambit in clinical decision making by emergency physicians, it inevitably results in an overestimation of the prevalence of serious illness, and through the obligatory strategies of "rule out worst case scenario" and "erring on the side of caution," leads to an overuse of resources. Safe and prudent decision making may sometimes be costly.
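The base-rate effect can be shown numerically. The figures below are hypothetical, not those of the cited mammography study; they illustrate only the general mechanism, namely that when a disease is rare, even a sensitive test yields a modest post-test probability, so a clinician who reads the test's sensitivity as the probability of disease will overestimate badly.

```python
# Hedged numerical illustration of base-rate neglect. All numbers are
# assumed for the sake of the example: a 1% base rate, a 90% sensitive
# test, and a 10% false-positive rate.

prevalence = 0.01        # assumed base rate of disease
sensitivity = 0.90       # P(positive | disease)
false_positive = 0.10    # P(positive | no disease), i.e., 1 - specificity

true_pos = prevalence * sensitivity
false_pos = (1 - prevalence) * false_positive
posterior = true_pos / (true_pos + false_pos)

# Ignoring the base rate suggests ~0.90; Bayes' rule gives ~0.083.
print(f"P(disease | positive) = {posterior:.3f}")
```

Most positives come from the large healthy population, which is exactly the information the representative heuristic discards.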
Still another well-documented cognitive phenomenon is hindsight bias,45 or the "knew-it-all-along" effect.22 While EPs are well aware, often to their chagrin, of the revealed wisdom and insights that the "retroscope" offers, they are less heedful of the deceptive influence of this bias in their formal review of cases such as occurs at morbidity and mortality rounds. It may result in a "faking-good" reconstruction of the decision making that actually took place, exaggerating what should have been anticipated in foresight, and overestimating what was known at the time the case was first encountered. The other side of the hindsight coin is that individuals may be exposed to an unfair appraisal, made to look bad, and left feeling that they should have performed better. Importantly, this discounts the ambient conditions at the time, which may have significantly influenced the decisions that were made, and which clearly are not reproduced in the cold and controlled light of day at rounds. Either outcome results in an illusion of control of the decision-making process, leading to a misplaced confidence in being able to perform well in future similar situations.46 Hindsight bias clearly has a negative influence on our learning. Those who have failed to learn from their errors may be destined to repeat them.
The representative bias, referred to above, is also evident at rounds, leading to a disproportionate emphasis on relatively rare and esoteric cases, which, though interesting, may actually contribute little to the learning process. A similar phenomenon occurs in medical journals,47 leading to errors associated with the availability heuristic.43 We tend to overestimate the prevalence of a disease if we have recently seen a case, or read about one, because what is readily available to our consciousness is more easily recalled and, therefore, overrepresented. The converse of the availability heuristic occurs when we have had a disproportionately low exposure to a problem or disease and miss it completely when it comes along, what Reason has referred to as the "out of sight, out of mind" failure mode.22 Other cognitive biases described in the cognitive psychology literature may find similar, useful application in the approach to clinical decision making in the ED.47,48
It seems reasonable to assume that many of these biases and heuristic failures might be circumvented by first recognizing the circumstances under which they might exert their influence, and then adopting a specific cognitive forcing strategy (CFS) to block the error; e.g., for potential errors arising from search satisficing, described above, the simple strategy would be, once a positive finding was made, to immediately commence a search for a second or others. This would deal with the classic error maxim: the most commonly missed fracture in the ED is the second.1 Using a CFS is an outcome of metacognition, the process by which we reflect upon, and have the option of regulating, what we are thinking. It describes the ability to stand back from the immediate situation and actually observe one's own thinking. It has been described as a distinguishing hallmark of human intelligence49 and has some parallel with Schön's concept of "reflection-in-action."17 Importantly, the naturalistic decision model (NDM), referred to earlier, embraces metacognition as a core construct. The NDM recognizes the two-stage process in metacognition through which an individual 1) develops awareness of the cognitive demands in a particular situation and 2) is able to specify a particular strategy for improving the decision making.
We can benefit from analogies with the design of mechanical systems. Systemic errors can be avoided by making simple changes to the design and operational characteristics of the system, such as forcing functions,50 which prevent a behavior from occurring until a particular condition has been met. Some car designs, for example, prevent the engine from being started unless the clutch pedal is depressed, thereby preventing the car from ever being started in gear, and avoiding a potential collision. Even subtle sequence changes can reduce error, for example when an automated teller machine returns the customer's card before yielding the cash, resulting in fewer cards being left in the machine.
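The forcing-function idea can be sketched in code. The class below is a toy model of the car-ignition example from the text, written purely for illustration; the names and structure are my own, not part of the cited design literature.

```python
# Minimal sketch of a "forcing function": the system makes the unsafe
# action impossible until a precondition is met, rather than relying
# on the operator's vigilance.

class CarIgnition:
    def __init__(self):
        self.clutch_depressed = False
        self.engine_running = False

    def depress_clutch(self):
        self.clutch_depressed = True

    def start_engine(self):
        # Forcing function: starting is blocked unless the clutch is
        # down, so the car can never be started in gear.
        if not self.clutch_depressed:
            raise RuntimeError("clutch must be depressed to start")
        self.engine_running = True

car = CarIgnition()
try:
    car.start_engine()          # blocked: precondition not met
except RuntimeError as err:
    print(f"Blocked: {err}")
car.depress_clutch()
car.start_engine()              # now permitted
print(car.engine_running)       # True
```

The point of the pattern is that the error path is removed by design; a cognitive forcing strategy tries to achieve the same effect with a deliberately adopted rule of thought rather than a mechanical interlock.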
For correcting cognitive errors, we do not enjoy the advantage of dealing with such simple, mechanical behaviors. Nevertheless, some improvements in cognitive performance might similarly be achieved through this metacognitive process. We must first pay attention to what we are thinking, and when a particular error-prone scenario is identified, we can select appropriate cognitive forcing strategies to avoid some of these predictable flaws in our thinking. There is evidence that some of the effects of cognitive biases can be reduced by training.24 Thus, if EPs adopted metacognitive strategies, it seems likely that their cognitive performance would improve in a number of clinical situations.
Decision making in the ED is clearly influenced by prevailing, ambient conditions. Perhaps we would like to believe that our decision making has some uniformity and consistency, but this is unlikely given the fluctuating conditions and inconstancy in most EDs. The inverse relationship between speed (S) and accuracy (A) is well documented in the field of industrial psychology. The resultant trade-offs (TO) that occur when accuracy is sacrificed for speed are referred to as SATO phenomena.51 The faster the production line goes, the more errors are likely to occur.
Similar problems clearly occur in the ED when limitations in resource availability (RA) such as number of available beds, staffing, fatigue, task overload, and time constraints compromise the continuous quality improvement (CQI) of care and decision making, leading to trade-offs (RACQITO). The vital signs of the ED may sometimes become unstable.52 Decision errors resulting from RACQITO phenomena are known to occur in a number of medical settings,53 including the operating room,54,55 the intensive care unit,56 and the ED,57 and have received widespread coverage lately in the media in both the United States and Canada. They result in iatrogenic or, more precisely, comiogenic errors, the latter term (from the Greek root komein, as in nosocomial) having been proposed to describe patient harm that may originate with any health care provider (including physicians) at the "sharp end," as well as administrative personnel at the "blunt end" of the delivery of health care.58 Errors are not the sole responsibility of physicians and nurses; they are often forced by the operating characteristics of the system in which they are obliged to practice.
The last decade has seen a tremendous upsurge of attention to medical error, with major works both from within59–62 and outside47,48,58,63,64 the profession. The interest is reflected in an almost exponential increase, over the last 20 years, in citations on medical error collated by the American Medical Association's National Patient Safety Foundation. In a recent report from the U.S. Institute of Medicine, it was suggested that medical errors were approaching the eighth leading cause of death in the United States,65 and in a discussion of the report, the ED was characterized as a "high hazard" health care setting.66 Indeed, the ED has been recognized as a "natural laboratory" for the study of comiogenic error,60 yet only recently has attention been drawn to it.57,67 Several models derived from nonmedical sources, such as the generic error modeling system (GEMS) described by Reason,22 as well as medical models,24,52,59 offer systematic approaches of potential value. Both GEMS and the NDM include in their framework a mechanism for acknowledging the types of perceptual/cognitive errors described above. The analysis of clinical decision making is an important and integral aspect of error analysis, especially in the unique climate of the ED.67 Good decision making leads to fewer errors, improvements in patient care, greater well-being among its providers, and, for those in health care risk management, the added advantage of reduced litigation costs.
Developments in the cognitive sciences, such as psychology, philosophy, and logic, have much to offer our evolving approach to understanding how we think. To date, however, there have been few interdisciplinary studies, most of them by cognitive scientists applying their phenomenological findings to what they perceive physicians do. Similar analyses have been offered by logicians, statisticians, and Bayesian theorists in applying their findings to specific medical decision-making scenarios and, less commonly, by physicians with a good grounding in probability theory.19,68
Few physicians, in fact, have a good working knowledge of statistical theory, and most are aware of this.41 Given the increasing demand for statistical knowledge made upon physicians by both their patients and their clinical practice, two problems arise. The first is that of competence in procedural statistical knowledge: the ability to conduct appropriately designed studies and apply the correct statistical tests. Most physicians, recognizing their lack of expertise, have consulted with statistical experts, providing a good example of interdisciplinary collaboration. For many who read these studies, however, the assumption of statistical propriety remains an act of faith. The second area bears more directly on cognitive error and concerns the widespread problem that physicians, like many others, have a fundamental difficulty in understanding some basic principles of statistical inference, including misconceptions about probability (conditional probabilities in particular), regression, and sample size.34,41,68 Here, heuristics based on personal experience (for example, the availability and representativeness heuristics) can easily lead to erroneous subjective estimates of probability. The cognitive scientists have again offered assistance, delineating ways in which statistical data can be made more comprehensible to clinicians.69
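The gain from recasting conditional probabilities as natural frequencies can be illustrated with a small worked example. The prevalence, sensitivity, and false-positive figures below are hypothetical illustrations, not data from this article; the point is only that the two formats yield the same posterior, while the natural-frequency form makes the base rate visible.

```python
# Hypothetical screening scenario (illustrative numbers only):
prevalence = 0.01        # 1% of patients have the disease
sensitivity = 0.80       # P(positive test | disease)
false_pos_rate = 0.10    # P(positive test | no disease)

# Bayes' theorem stated with conditional probabilities:
p_positive = sensitivity * prevalence + false_pos_rate * (1 - prevalence)
posterior = sensitivity * prevalence / p_positive

# The same computation in natural frequencies (per 1,000 patients),
# the format Hoffrage and Gigerenzer report clinicians grasp more readily:
patients = 1000
sick = prevalence * patients                     # 10 patients are sick
true_pos = sensitivity * sick                    # 8 sick patients test positive
false_pos = false_pos_rate * (patients - sick)   # 99 well patients also test positive
posterior_nf = true_pos / (true_pos + false_pos)

print(f"P(disease | positive test) = {posterior:.3f}")      # about 0.075
print(f"Natural-frequency estimate = {posterior_nf:.3f}")   # identical answer
```

The counterintuitive result, roughly a 7.5% chance of disease despite a positive test, is exactly the kind of subjective probability estimate that neglect of the base rate, abetted by the availability and representativeness heuristics, tends to inflate.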
A similar onus is placed on ED clinicians, in turn, to overcome outcome bias,70 the tendency to judge the quality of a decision in terms of its outcome, and omission bias,71 whereby we tend to prefer the consequences of inaction (omission) rather than commit ourselves to doing something that changes the patient's course, thereby assuming more direct responsibility for an adverse outcome. We need, too, to ensure that we are always communicating effectively with our patients and that they are cognitively with us. Information about the
many varied procedures in EM, ranging from simple venipuncture to thrombolytic therapy, as well as current statistical data about particular diseases, needs to be transmitted in ways that are clearly understood.
To date, the opportunity has not readily presented itself for EPs and nurses to incorporate these findings from the cognitive sciences into their own sphere of understanding, and into their practice. In addition to traditional interdisciplinary obstacles, another major impediment is that the language of cognitive psychology is complex and arduous. There is little incentive for practicing EPs to venture out and master these discipline-specific lexicons as a necessary first step to understanding how their findings might have application for them. While some preliminary work has been done in this direction,1,8,72 more effort is needed to simplify the description and categorization of cognitive errors, and to develop our own lexicon to facilitate communication and understanding among ourselves. It should be possible, as Dennett has described, to "lift ourselves clear"73 of the substrate of phenomenological language used by the cognitive sciences, so that we can all talk more clearly to each other.
TRAINING AND TEACHING
During the course of medical training there is significant pressure to acquire competence in a large domain of specific knowledge, and the anachronistic belief may still persist that once this has been accomplished, the task is mostly complete. Those with experience, however, know that this is only the beginning. Clinical acumen comes from the constant honing and refinement of our skills, principally in the cognitive sphere. Perhaps the reason excellent clinicians are less able to articulate what they do than others who observe them74 is that, traditionally, there has been little emphasis on developing insight into the cognitive aspects of decision making, or the "deeply hidden" elements that underlie our mental deliberations.75
The learning process required for educating ourselves, as well as those in training, about both cognitive and affective sources of error does not present an overly formidable task. There already exists an extensive literature on all of the heuristics, biases, and pitfalls referred to here. The first step is a commitment to the principle that the acquisition of such cognitive skills is, indeed, a worthwhile goal. The second is finding an agreeable and workable language to describe them, one that would facilitate their use and promote meaningful communication among physicians about their errors. Third, we need to recognize that mimicry is a significant feature of those in training.* We should be mindful, given the apprentice-like component of medical training, that we provide judicious models of cognitive behavior to those whom we instruct, and that our mistakes are not themselves mimicked. The final and challenging step, as Kassirer3 has proposed, is to find imaginative and creative ways in which cognitive strategies might formally be taught.
The author is grateful to his psychologist colleagues Drs. Hilton, Sougoski, Winterbotham, and Coles for their input in the development of some of these ideas, to Dr. Bob Wears for his comments on an earlier version of the manuscript, and to Dr. James Adams and Calla Farn for their suggestions for the final version. The author is indebted to his administrative assistant, Sherri Lamont, at the Dartmouth General Hospital Emergency Department, for her exemplary work. Finally, the author records his thanks to an earlier mentor, Dr. Grant Smith, who established in him some of the basics of disciplined thought.
*This ethological metaphor was brought to my attention by Dr. Colin Coles. Apparently, it was used originally in a British advanced trauma life support (ATLS) instructors' manual in 1993.

REFERENCES

1. Kovacs G, Croskerry P. Clinical decision making: an emergency medicine perspective. Acad Emerg Med. 1999; 6:947–52.
2. Kushniruk AW, Patel VL. Cognitive evaluation of decision making processes and assessment of information technology in medicine. Int J Med Inform. 1998; 51:83–90.
3. Kassirer JP. Teaching problem-solving: how are we doing? N Engl J Med. 1995; 332:1507–8.
4. Glassman NS, Andersen SM. Transference in social cognition: persistence and exacerbation of significant-other-based inferences over time. Cogn Ther Res. 1999; 23:75–91.
5. Gleitman H, Fridlund AJ, Reisberg D. Psychology (5th ed). New York: W. W. Norton, 1999.
6. Feinstein AR. The chagrin factor and qualitative decision analysis. Arch Intern Med. 1985; 145:1257–9.
7. Lichenstein S, Fischhoff B, Phillipps LD. Calibration of probabilities: the state of the art to 1980. In: Kahneman D, Slovic P, Tversky A (eds). Judgment under Uncertainty: Heuristics and Biases. New York: Cambridge University Press, 1982.
8. Croskerry P. Avoiding pitfalls in the emergency room. Can J Contin Med Educ. 1996; Apr:110.
9. Schein E. Professional Education. New York: McGraw-Hill, 1973.
10. Elstein AS. Heuristics and biases: selected errors in clinical reasoning. Acad Med. 1999; 74:791–4.
11. Beach LR, Lipshitz R. Why classical decision theory is an inappropriate standard for evaluating and aiding most human decision making. In: Klein GA, Orasanu J, Calderwood R, Zsambok CE (eds). Decision Making in Action: Models and Methods. Norwood, NJ: Ablex Publishing Corp., 1995.
12. Kassirer JP, Kopelman RI. Learning Clinical Reasoning. Baltimore: Williams and Wilkins, 1991.
13. Popper K. The Logic of Scientific Discovery. London: Unwin Hyman, 1959.
14. Cutler P. Problem Solving in Clinical Medicine: From Data to Diagnosis, 2nd ed. Baltimore: Williams and Wilkins, 1985.
15. Williams BT. Computer Aids to Clinical Decisions, vols. I and II. Boca Raton, FL: CRC Press, 1982.
16. Sackett DL, Haynes RB, Guyatt GH, Tugwell P. Clinical Epidemiology: A Basic Science for Clinical Medicine, 2nd ed. Boston: Little, Brown and Co., 1991.
17. Schon DA. The Reflective Practitioner. New York: Basic Books, 1983.
18. Seifer SD. Recent and emerging trends in undergraduate medical education: curricular responses to a rapidly changing health care system. West J Med. 1998; 168:400–11.
19. Feinstein AR. The haze of Bayes, the aerial palaces of decision analysis, and the computerized Ouija board. Clin Pharmacol Ther. 1973; 21:482–96.
20. Wears RL. The aerial palaces of decision analysis redux [commentary]. Acad Emerg Med. 2000; 7:380–2.
21. Croskerry P, Kovacs G. Reply to letter by Wears RL (comments on "Clinical decision making: an emergency medicine perspective"). Acad Emerg Med. 2000; 7:412–4.
22. Reason J. Human Error. New York: Cambridge University Press, 1990.
23. Rasmussen J. Deciding and doing: decision making in natural contexts. In: Klein GA, Orasanu J, Calderwood R, Zsambok CE (eds). Decision Making in Action: Models and Methods. Norwood, NJ: Ablex Publishing Corp., 1995.
24. Klein GA, Orasanu J, Calderwood R, Zsambok CE (eds). Decision Making in Action: Models and Methods. Norwood, NJ: Ablex Publishing Corp., 1995.
25. Hughes W. Critical Thinking, 2nd ed. Toronto: Broadview Press, 1999.
26. Flood RA, Seager CP. A retrospective examination of psychiatric case records of patients who subsequently committed suicide. Br J Psychiatry. 1968; 114:443–50.
27. Ironside W. Iatrogenic contributions to suicide and a report on 37 suicide attempts. N Z Med J. 1969; 69:207–11.
28. Maltsberger JT, Buie DH. Countertransference hate in the treatment of suicidal patients. Arch Gen Psychiatry. 1974; 30:625–33.
29. Heider F. The Psychology of Interpersonal Relations. New York: Wiley, 1958.
30. Hutzler J, Rund D. Behavioral disorders: emergency assessment and stabilization. In: Tintinalli J, Ruiz E, Krome R (eds). Emergency Medicine: A Comprehensive Study Guide. New York: McGraw-Hill, 1996.
31. Tomassi P. Logic and medicine. In: Phillips CI (ed). Logic in Medicine. London: BMJ Publishing Group, 1995.
32. Stiell IG, Greenberg GH, McKnight RD, Nair RC, McDowell I, Worthington JR. A study to develop clinical decision rules for the use of radiography in acute ankle injuries. Ann Emerg Med. 1992; 21:384–90.
33. Tversky A, Kahneman D. Belief in the law of small numbers. In: Kahneman D, Slovic P, Tversky A (eds). Judgment under Uncertainty: Heuristics and Biases. New York: Cambridge University Press, 1982.
34. Tversky A, Kahneman D. Judgment under uncertainty: heuristics and biases. Science. 1974; 185:1124–31.
35. Perrow C. Normal Accidents. Princeton, NJ: Princeton University Press, 1999.
36. Gigerenzer G, Goldstein DG. Betting on one good reason: take the best and its relatives. In: Gigerenzer G, Todd P, and the ABC Research Group (eds). Simple Heuristics That Make Us Smart. New York: Oxford University Press, 1999.
37. Wason PC. On the failure to eliminate hypotheses in a conceptual task. Q J Exp Psychol. 1960; 12:129–40.
38. Tversky A, Kahneman D. Judgments of and by representativeness. In: Kahneman D, Slovic P, Tversky A (eds). Judgment under Uncertainty: Heuristics and Biases. New York: Cambridge University Press, 1982.
39. Newell A, Simon HA. Human Problem Solving. Englewood Cliffs, NJ: Prentice-Hall, 1972.
40. Simon HA. Reason in Human Affairs. London: Basil Blackwell, 1983.
41. Klatzky RL, Geiwitz J, Fischer SC. Using statistics in clinical practice: a gap between training and application. In: Bogner MS (ed). Human Error in Medicine. Hillsdale, NJ: Lawrence Erlbaum Associates, 1994.
42. Raufaste E, Eyrolle H, Marine C. Pertinence generation in radiological diagnosis: spreading activation and the nature of expertise. Cogn Sci. 1998; 22:517–46.
43. Tversky A, Kahneman D. Availability: a heuristic for judging frequency and probability. Cogn Psychol. 1973; 5:207–32.
44. Eddy DM. Variations in physician practice: the role of uncertainty. Health Aff. 1984; 3:74–89.
45. Fischhoff B. Hindsight ≠ foresight: the effect of outcome knowledge on judgment under uncertainty. J Exp Psychol Hum Percept Perform. 1975; 1:288–99.
46. Langer EJ. The illusion of control. J Pers Soc Psychol. 1975; 7:185–207.
47. Dowie J, Elstein A (eds). Professional Judgment. Cambridge: Cambridge University Press, 1988.
48. Kahneman D, Slovic P, Tversky A (eds). Judgment under Uncertainty: Heuristics and Biases. New York: Cambridge University Press, 1982.
49. Gleitman H. Some trends in the study of cognition. In: Koch S, Leary DE (eds). A Century of Psychology as a Science. New York: McGraw-Hill, 1985.
50. Lewis C, Norman DA. Designing for error. In: Norman D, Draper S (eds). User Centered System Design. Hillsdale, NJ: Erlbaum, 1986.
51. Foley P, Murray N. Sensation, perception, and systems design. In: Salvendy G (ed). Handbook of Human Factors. New York: John Wiley and Sons, 1987.
52. Vincent C, Taylor-Adams S, Stanhope N. Framework for analysing risk and safety in clinical medicine. BMJ. 1998; 316:1154–7.
53. Leape L. Error in medicine. JAMA. 1994; 272:1851–7.
54. Weinger MB, Englund CE. Ergonomic and human factors affecting anesthetic vigilance and monitoring performance in the operating room environment. Anesthesiology. 1990; 73:995–1021.
55. Helmreich RL, Schaefer HG. Team performance in the operating room. In: Bogner MS (ed). Human Error in Medicine. Hillsdale, NJ: Lawrence Erlbaum, 1994.
56. Donchin Y, Gopher D, Olin M, et al. A look into the nature and causes of human errors in the intensive care unit. Crit Care Med. 1995; 23:294–300.
57. Risser DT, Rice MM, Salisbury ML, et al. The potential for improved teamwork to reduce medical errors in the emergency department. Ann Emerg Med. 1999; 34:373–83.
58. Sharpe VA, Faden AI. Medical Harm. Cambridge: Cambridge University Press, 1998.
59. Riegelman RK. Minimizing Medical Mistakes: The Art of Medical Decision Making. Boston: Little, Brown and Co., 1991.
60. Bogner MS (ed). Human Error in Medicine. Hillsdale, NJ: Lawrence Erlbaum Associates, 1994.
61. Youngson RM, Schott I. Medical Blunders. New York: New York University Press, 1996.
62. Spath PL (ed). Error Reduction in Health Care. San Francisco: Jossey-Bass, 1999.
63. Bosk CL. Forgive and Remember: Managing Medical Failure. Chicago: University of Chicago Press, 1979.
64. Paget MA. The Unity of Mistakes. Philadelphia: Temple University Press, 1988.
65. Kohn LT, Corrigan JM, Donaldson MS (eds). To Err Is Human: Building a Safer Health Care System. Report of the Institute of Medicine. Washington, DC: National Academy Press, 1999.
66. Eisenberg JM. Statement on Medical Errors, Agency for Healthcare Research and Quality, before the Senate Appropriations Subcommittees on Labor, Health and Human Services, and Education, December 13, 1999, Washington, DC. Rockville, MD: Agency for Healthcare Research and Quality. http://www.ahrq.gov/news/stat1213.htm.
67. Wears RL, Leape LL. Human error in emergency medicine. Ann Emerg Med. 1999; 34:370–2.
68. Macartney FJ. Diagnostic logic. In: Phillips CI (ed). Logic in Medicine, 2nd ed. London: BMJ Publishing Group, 1995.
69. Hoffrage U, Gigerenzer G. Using natural frequencies to improve diagnostic inferences. Acad Med. 1998; 73:538–40.
70. Baron J, Hershey JC. Outcome bias in decision evaluation. J Pers Soc Psychol. 1988; 54:569–79.
71. Ritov I, Baron J. Reluctance to vaccinate: omission bias and ambiguity. J Behav Decis Making. 1990; 3:263–77.
72. Bartlett EE. Physicians' cognitive errors and their liability consequences. J Healthcare Risk Manage. 1998; Fall:62–9.
73. Dennett DC. Darwin's Dangerous Idea. New York: Touchstone, 1995.
74. Epstein RM. Mindful practice. JAMA. 1999; 282:833–9.
75. Coles C. Teaching the teachers. Med Educ. 2000; 34:845.
Medical Decision Making. Boston: Little, Brown and Co., 1991.60. Bogner MS (ed). Human Error in Medicine. Hillsdale, NJ:Lawrence Erlbaum Associates, 1994.61. Youngson RM, Schott I. Medical Blunders. New York: NewYork University Press, 1996.62. Spath PL (ed). Error Reduction in Health Care. San Fran-cisco: Jossey-Bass, 1999.63. Bosk CL. Forgive and Remember: Managing Medical Fail-ure. Chicago: University of Chicago Press, 1979.64. Paget MA. The Unity of Mistakes. Philadelphia: TempleUniversity Press, 1988.65. Kohn LT, Corrigan JM, Donaldson, MS (eds). To Err IsHuman: Building a Safer Health Care System. Report of theInstitute of Medicine. Washington, DC: National AcademyPress, 1999.66. Eisenberg JM. Statement on Medical Errors, Agency forHealthcare Research and Quality, before the Senate Appropri-ations Subcommittees on Labor, Health and Human Services,and Education, December 13, 1999, Washington, DC. Rockville,MD: Agency for Healthcare Research and Quality, http://www.ahrq.gov/news/stat1213.htm.67. Wears RL, Leape LL. Human error in emergency medi-cine. Ann Emerg Med. 1999; 34:3702.68. Macartney FJ. Diagnostic logic. In: Phillips CI (ed). Logicin Medicine (2nd ed). BMJ Publishing Group. Plymouth: Lat-imer Trend and Co., 1995.69. Hoffrage U, Gigerenzer G. Using natural frequencies toimprove diagnostic inferences. Acad Med. 1998; 73:53840.70. Baron J, Hershey JC. Outcome bias in decision evaluation.J Pers Soc Psychol. 1988; 54:56979.71. Ritov I, Baron J. Reluctance to vaccinate: omission biasand ambiguity. J Behav Decis Making. 1990; 3:26377.72. Bartlett EE. Physicians cognitive errors and their liabilityconsequences. J Healthcare Risk Manage. 1998; fall:629.73. Dennett DC. Darwins Dangerous Idea. New York: Touch-stone, 1995.74. Epstein RM. Mindful practice. JAMA. 1999; 9:8339.75. Coles C. Teaching the teachers. Med Educ. 2000; 34:845.