Intelligence in Public Literature

Thinking, Fast and Slow
Daniel Kahneman (New York: Farrar, Straus and Giroux, 2011), 418 pp.

Reviewed by Frank J. Babetski

All statements of fact, opinion, or analysis expressed in this article are those of the author. Nothing in this article should be construed as asserting or implying US government endorsement of its factual statements and interpretations.

Few books are must reads for intelligence officers. Fewer still are must reads that mention Intelligence Community functions or the CIA only once, and then only in passing. Daniel Kahneman has written one of these rare books. Thinking, Fast and Slow represents an elegant summation of a lifetime of research in which Kahneman, Princeton University Professor Emeritus of Psychology and Public Affairs, and his late collaborator, Amos Tversky, changed the way psychologists think about thinking. Kahneman, who won the 2002 Nobel Prize in Economics for his work with Tversky on prospect theory, also highlights the best work of other researchers throughout the book. Thinking, Fast and Slow introduces no revolutionary new material, but it is a masterpiece because of the way Kahneman weaves existing research together.

Expert intelligence officers at CIA, an agency with the human intelligence mission at its core, have come through experience and practice to understand and exploit the human cognitive processes of which Kahneman writes. These expert officers will have many moments of recognition in reading this book, which gives an empirical underpinning for much of their hard-won wisdom. Kahneman also may challenge some strongly held beliefs. Thinking, Fast and Slow gives experts and newer officers, regardless of the intelligence agency in which they serve, an enormously useful cognitive framework upon which to hang their experiences.

The title of the book refers to what Kahneman, adapting a device that other researchers originally proposed, calls the two "systems" of the human mind. System 1, or fast thinking, operates automatically and quickly, with little or no effort and no sense of voluntary control. Most System 1 skills, such as detecting the relative distances of objects, orienting to a sudden sound, or detecting hostility in a voice, are innate and are found in other animals. Some fast and automatic System 1 skills can be acquired through prolonged practice, such as reading and understanding nuances of social situations. Experts in a field can even use System 1 to quickly, effortlessly, and accurately retrieve stored experience to make complex judgments. A chess master quickly finding strong moves and a quarterback changing a play sent to him from the sideline when he recognizes a defensive weakness are examples of acquired System 1 thinking.

System 2, or slow thinking, allocates attention to the mental activities that demand effort, such as complex computations and conscious, reasoned choices about what to think and what to do. System 2 requires most of us to pay attention to do things such as drive on an unfamiliar road during a snowstorm, calculate the product of 17 x 24, schedule transportation for a teenage daughter's activities, or understand a complex logical argument.

Kahneman focuses much of the book on the interactions of System 1 and System 2 and the problems inherent in those interactions. Both systems are "on" when we are awake. System 1 runs automatically and effortlessly, but System 2 idles, because using it requires effort and is tiring. System 1 generates impressions and feelings, which become the source of System 2's explicit beliefs and deliberate choices. System 1, when it encounters something it cannot quickly understand and did not expect (in other words, a surprise), enlists System 2 to make sense of the anomaly. The alerted System 2 takes charge, overriding System 1's automatic reactions. System 2 always has the last word when it chooses to assert it.

The systems operate to minimize effort and maximize performance and are the result of hundreds of thousands of years of human evolution in our environment. They work extremely well, usually. System 1 performs well at making accurate models and predictions in familiar environments. System 1 has two significant weaknesses: it is prone to make systemic errors in specified situations (these are biases), and it cannot be turned off. System 2 can, with effort, overrule these biases if it recognizes them. Unfortunately, System 2 is demonstrably very poor at recognizing one's own biased thinking. Trying to engage System 2 at all times to prevent System 1 errors is impractical and exhausting.

In terms of Kahneman's construct, a significant part of the missions of intelligence agencies boils down to seizing opportunities presented by the flawed interactions of the System 1 and System 2 thinking of foreign actors while at the same time recognizing and mitigating the flaws of their own System 1 and System 2 interactions. Hostile services and organizations try to do the same thing in return. Operations officers rely on the biases of foreign counterintelligence officers, essentially advising assets to avoid exciting any System 2 thinking in people positioned to do them harm. Aldrich Ames's Soviet handlers preferred that we not focus System 2 thought on how he bought a Jaguar on a GS-14 paycheck; System 1 found a tale about his wife's inheritance cognitively easy to accept.[a]

[a] If you think that you certainly would have known Ames was a Soviet spy had you known of his Jaguar, then you are probably guilty of hindsight bias, or the tendency to underestimate the extent to which you were surprised by past events. On the other hand, you are not guilty of hindsight bias if you think this (before having read about Ames) and have ever reported a colleague to counterintelligence for owning a Jaguar.

A target's biases put the "plausible" in "plausible deniability" during covert actions. Effective deceptions also fundamentally rely on a target's unchallenged biases and so make it easy for the target to believe what they already are predisposed to believe. Effective fabricators, especially those with tantalizing access, rely on our biased desire to believe them. One or two plausible reports from such a person may be enough to engage the exaggerated emotional coherence, or "halo effect." Roughly put, once lazy System 2 is satisfied, it tends to defer to System 1, which in turn projects positive qualities in one area into a generalized positive assessment.

Terrorists rely on these biases, but they are also vulnerable to them. Terrorism works because it provides extremely vivid images of death and destruction, which constant media attention magnifies. These images are immediately available to a target's System 1. System 2, even when armed with reliable statistics on the rarity of any type of terrorist event, cannot overcome System 1's associative reaction to specific events. If you are a CIA officer who was working in Langley on 25 January 1993, then chances are that you cannot make the left turn into the compound from Dolley Madison Boulevard without thinking of Aimal Kasi, the Pakistani who killed two CIA officers and wounded three others at that intersection that day.

The 9/11 hijackers on the first three planes could count on passengers to stay seated, relying on their ability to quickly remember accounts of previous hijackings in which the hijackers were motivated to survive; this is what Kahneman calls the availability bias. However, because of their success at the World Trade Center and the Pentagon, the terrorists unwittingly and immediately rendered hijacking a less effective tactic. The passengers on Flight 93, quickly armed with knowledge of the other three flights, were able to engage System 2 to overcome System 1's existing availability bias and make the decision to physically overpower the terrorists.

Kahneman's insights pertain to the entire spectrum of intelligence operations. We accept information security practices that demonstrably impede productivity in order to reduce the danger of worse losses posed by cyberattack or penetration. At the same time, we would almost certainly consider the same amount of lost productivity a major defeat if a hacker had inflicted it on us. This is what Kahneman calls the loss aversion bias. System 2 does not assert control over System 1's cognitive ease at imagining a disaster because increased productivity is much more difficult for System 2 to imagine.

Any intelligence officer making budget decisions should read Kahneman's thoughts on the biases underlying the sunk-cost fallacy, or the decision to invest additional resources in losing endeavors when better investments are available. People find it difficult to engage System 2 to cut their losses in such situations, especially when System 1 can easily convince them of the loss of prestige that would surely follow. How often does the same officer who started an expensive major project also decide to kill it? You likely did not have to engage System 2 to answer the question.

Likewise, none of us are immune to what Kahneman calls the planning fallacy, which describes plans and forecasts that are unrealistically close to best-case scenarios and could be improved by consulting statistics in similar cases. This review, for example, took twice as long to write as I thought it would, just like almost every other paper I have ever written.

Intelligence analysts should pay particularly close attention to Kahneman's chapters on the nested problems of prediction, intuition, and expertise.[b] Forecasting and prediction are core mission elements for analysts. Kahneman breaks them down into two main varieties. The first, such as the predictions engineers make, relies on look-up tables, precise calculations, and explicit analyses of outcomes observed on similar occasions. This is the approach an analyst uses to predict the amount of explosive force needed to penetrate a certain thickness of concrete, or to calculate how much fuel a certain type of airplane needs to complete a certain type of mission.

[b] Many intelligence analysts are familiar with some of these theories from Richards J. Heuer, Jr.'s Psychology of Intelligence Analysis (Washington, DC: Center for the Study of Intelligence, 1999), which is based in part on earlier versions of Kahneman's and Tversky's work. This publication is available online at https://www.cia.gov/library/center-for-the-study-of-intelligence/csi-publications/books-and-monographs/psychology-of-intelligence-analysis/index.html.

Other forecasts and predictions involve intuition and System 1 thinking. Kahneman further breaks down this variety of prediction into two subvarieties. The first draws on the skills and expertise acquired by repeated experience, in which a solution to the current problem comes quickly to mind because System 1 accurately recognizes familiar cues. The second subvariety of intuitive prediction, which is often indistinguishable from the first, is based on biased judgments. This type of intuitive prediction, typically forwarded with considerable confidence, very often leads to trouble. The expanded use in intelligence analysis of structured analytic techniques and approaches, adopted in the wake of the 9/11 attacks and the National Intelligence Estimate on Iraqi weapons of mass destruction, represents in part an effort to eliminate this latter type of prediction. The trick is in using structured techniques and approaches, or applied System 2 thinking, in a way that eliminates biased intuitive forecasts and predictions without also discouraging, delaying, or even eliminating the intuitive insights that true expertise provides. This dilemma probably explains in part why some experts in the CIA's Senior Analytic Service remain ambivalent about structured analytic techniques and approaches.

Kahneman, despite his stated preference for statistics and algorithms, cannot dismiss out of hand the value of intuitive prediction borne of true expertise. His chapter "Expert Intuition: When Can We Trust It?" centers on what he calls his "adversarial collaboration" with Gary Klein, a leading proponent of Naturalistic Decision Making, who rejects Kahneman's emphasis on biases and focuses instead on the value of expert intuition and on how intuitive skills develop. It is not difficult to imagine that their collaboration was more difficult than Kahneman generously portrays it to have been, which makes the areas on which they were able to agree even more noteworthy.

They agreed that the confidence that experts express in their intuitive judgments is not a reliable guide to their validity. They further agreed that two basic conditions must be present before intuitive judgments reflect true expertise: an environment that is sufficiently regular to be predictable and an opportunity to learn these regularities through prolonged practice. An expert firefighter's sensing the need to order his men to evacuate a burning building just before it collapses, or a race driver's knowing to slow down well before the massive accident comes into view, are due to highly valid clues that each expert's System 1 has learned to use, even if System 2 has not learned to name them.

Learning, in turn, relies on receiving timely and unambiguous feedback. Many if not most of the issues with which intelligence analysts are seized are what Kahneman and Klein would probably call "low-validity" environments, in which the intuitive predictions of experts should not be trusted at face value, irrespective of the confidence with which they are stated. Moreover, they would probably consider the feedback available to analysts, from policymakers and events, inadequate for efficient learning and expertise development. Kahneman was not referring specifically to intelligence analysts when he wrote, "it is wrong to blame anyone for failing to forecast accurately in an unpredictable world," but he has given interviews in which he discusses intelligence analysts in this context. At the same time, he also wrote, "however, it seems fair to blame professionals for believing they can succeed in an impossible task." In short, Kahneman concedes that intuition has to be valued, but it cannot necessarily be trusted.

Thinking, Fast and Slow provides intelligence officers with an accessible vocabulary to discuss the processes of human cognition, the interactions between System 1 and System 2 thinking, which are at the center of their work. It does not, however, provide solutions or reliable approaches to bias mitigation. According to Kahneman, the best we can hope to do is learn to recognize situations in which mistakes are likely, and try harder to avoid specific errors when the stakes are high.

Kahneman also spends very little time discussing how biases work in collaborative environments, despite his own very insightful accounts of his collaboration with Tversky. We can hope he will explore that in his next work.