This article was downloaded by: [The UC Irvine Libraries]
On: 26 October 2014, At: 09:51
Publisher: Routledge
Informa Ltd Registered in England and Wales Registered Number: 1072954 Registered office: Mortimer House, 37-41 Mortimer Street, London W1T 3JH, UK

International Journal of Intelligence and CounterIntelligence
Publication details, including instructions for authors and subscription information: http://www.tandfonline.com/loi/ujic20

To cite this article: James G. Breckenridge (2010) Designing Effective Teaching and Learning Environments for a New Generation of Analysts, International Journal of Intelligence and CounterIntelligence, 23:2, 307-323, DOI: 10.1080/08850600903347418

To link to this article: http://dx.doi.org/10.1080/08850600903347418

Published online: 25 Feb 2010.

Taylor & Francis makes every effort to ensure the accuracy of all the information (the “Content”) contained in the publications on our platform. However, Taylor & Francis, our agents, and our licensors make no representations or warranties whatsoever as to the accuracy, completeness, or suitability for any purpose of the Content. Any opinions and views expressed in this publication are the opinions and views of the authors, and are not the views of or endorsed by Taylor & Francis. The accuracy of the Content should not be relied upon and should be independently verified with primary sources of information. Taylor and Francis shall not be liable for any losses, actions, claims, proceedings, demands, costs, expenses, damages, and other liabilities whatsoever or howsoever caused arising directly or indirectly in connection with, in relation to or arising out of the use of the Content.

This article may be used for research, teaching, and private study purposes. Any substantial or systematic reproduction, redistribution, reselling, loan, sub-licensing, systematic supply, or distribution in any form to anyone is expressly forbidden. Terms & Conditions of access and use can be found at http://www.tandfonline.com/page/terms-and-conditions

Designing Effective Teaching and Learning Environments for a New Generation of Analysts




JAMES G. BRECKENRIDGE

Designing Effective Teaching and Learning Environments for a New Generation of Analysts

The Intelligence Community (IC) confronts an increasing demand for analysts able to effectively assess the complexities and implications of new as well as old threat issues for a dynamic world environment. At the same time, the IC grapples with the accelerating departure of its most experienced analytic cadre. The greening of the analytic community places an increased burden upon those charged with ensuring that academically qualified individuals become competent intelligence analysts.1 This challenge highlights the importance of understanding how to ensure that intelligence analysis (IA) training is effective, how to improve analytic training, and how to refine analytic competence throughout a career.

James G. Breckenridge is Dean of the Walker School of Business, and Chairman of the Department of Intelligence Studies at Mercyhurst College, Erie, Pennsylvania. In recent years, major private sector firms such as Booz Allen Hamilton and Northrop Grumman, as well as the United States Department of Homeland Security, have contracted with Mercyhurst to provide graduate education in intelligence to their analysts. Prior to joining Mercyhurst in 1998 as Director of Admissions, he was an officer with the United States Army, during which time he also served as an Assistant Professor of History at the United States Military Academy, West Point, New York, from 1987 to 1990, and Professor of Military Science at Gannon University, Erie, Pennsylvania, from 1995 to 1998.

International Journal of Intelligence and CounterIntelligence, 23: 307–323, 2010

Copyright © Taylor & Francis Group, LLC

ISSN: 0885-0607 print/1521-0561 online

DOI: 10.1080/08850600903347418


EFFECTIVENESS OF TRAINING

The recent meta-analysis of training effectiveness by Winfred Arthur Jr. and his colleagues provides a useful starting point.2 In their statistical assessment of 162 published scholarly evaluations of organizational training since 1960, they identify several design and evaluation features that determine the measured effectiveness of training programs:

• The type of evaluation criteria
• The implementation of training needs assessment
• The skill or task characteristics trained
• The match between the skill or task characteristics and the training delivery method.

The study by Arthur et al. borrows from the work of Donald L. Kirkpatrick, which distinguishes four levels of training evaluation, each successively more informed and precise in terms of training assessment, and each more substantial in terms of payoff:3

• Level 1: Documents the participants’ reactions to the training, usually with a familiar questionnaire asking “How satisfied were you with . . . ?”
• Level 2: Assesses students’ knowledge gained, typically through pre- and post-course testing.
• Level 3: Evaluates the transfer of knowledge to the learner’s work environment by assessing behavior on the job using reports from self, bosses, and peers.
• Level 4: Assesses the training results or outcomes at the organizational level, for example, in quality, efficiency, or customer satisfaction statistics; revenues; or some other metric of strategic corporate goals.4

When designing training and assessing its effectiveness, Kirkpatrick’s system helps answer the question: Effective for what purpose: pleasing employees, acquiring testable knowledge, changing job behavior, or getting better mission results? Arthur et al. determined that training effectiveness varies as a function of what an organization chooses to train, how it trains, and how it assesses the results of training. Organizations go to the trouble of making these assessments when they want to track program progress and identify needs for improvement. As another example of the contribution that psychology can make to IA, the IC could benefit from an application of the conclusions of Arthur et al. and the methodology of the Kirkpatrick model to its training programs.

The learning environment reaches across the entire learning cycle, from job analysis (what needs to be taught) through worker assessment (who needs to learn and at what levels of proficiency) and then to training techniques (how people learn) and back to job design (maximizing the likelihood that knowledge will transfer from instruction to job). This cycle should yield the employer a reasonable return on his or her investment. Extensive research supports each stage, and psychological theories of learning, cognition, and work explain how they all operate as a cycle or system. A fifth level used in higher education, but not typically in the training environment, proposes using results or outcomes to inform organizational decisions and practices. For example, if the employer identifies a deficiency in link analysis skills, this shortcoming would be communicated to the training developer for inclusion in the curriculum. The purpose of this final step is to make sure the organization is accountable and continually improves.

To take advantage of the science supporting occupational instruction, the IC can use a five-part program that contributes to employee and organizational performance enhancement:

1. Construct a comprehensive and systematic training needs assessment that can serve as the basis for the design, development, delivery, and evaluation of the training program.
2. As Arthur et al. have pointed out, “a product of the needs assessment is the specification of the training objectives that, in turn, identifies . . . the skills and tasks to be trained.”5 Educational program managers in conjunction with IA line managers identify training objectives.
3. Training specialists assess cognitive and learning styles of the potential students and match them with alternative training delivery methods.
4. Training specialists establish appropriate metrics—training evaluation criteria that will inform all the accountable stakeholders as to how well the system is working and what needs improvement.
5. Training specialists and line IA managers identify the conditions in the workplace—equipment, job design, performance metrics and rewards, evaluation criteria, etc.—that are conducive to transfer of learning to work; these are designed in and disincentives are designed out of the jobs.6

WHAT TO TEACH?

A review of Websites, training literature, and job descriptions indicates departments and agencies within the IC have conducted some training needs assessment, albeit in an uncoordinated and incomplete fashion.7 In his recent ethnographic study, Rob Johnston found no systematic, comprehensive training review encompassing all IC agencies. Academic and corporate intelligence training initiatives are, similarly, diverse and uncoordinated. The IC could realize big dividends if it rectified the disparity among IA training programs.

A comprehensive needs assessment commissioned by national intelligence authorities that involves training practitioners across the IC, academia, and contractor community could support reform. Such efforts would be an ongoing process, with an established agenda and designated participants.


The process would be preceded by the development and formulation of an IA doctrine so that the assessment has a firm foundation and point of departure. IA doctrine is a thread of continuity that can support consistency of training and measures of effectiveness from beginner through advanced levels, and weave together training, selection, management, and technology development and acquisition. The needs assessment process would address preparatory (preemployment) experience, training, and education; entry-level training, basic training, advanced training and education; and career and continuing education. Critical and elective IA training objectives and tasks would be derived from the needs assessment process.

The Director of National Intelligence (DNI) would require a professionally executed analysis of the requisite knowledge, skills, and abilities (KSAs) of intelligence analysts for the IC-wide training initiative. Professor Russell Swenson’s collected works on best practices within the IC may provide a good starting point for the discussion of training needs, objectives, and tasks.8 Swenson asserted that (as of 2003) the “available literature does not yet address the question of what knowledge, skills, and abilities are required, from the point of view of front-line managers, to support and sustain the evolution of intelligence tradecraft.”9 In an effort to address this question, David T. Moore and Lisa Krizan defined a set of core competencies that “seem to apply across the intelligence profession.”10 These core competencies are a combination of an analyst’s abilities, personal characteristics, knowledge, and skill set.11 For example, an analyst may have an in-depth knowledge of chemical weapons, a well-developed oral communication ability, a personal characteristic of being insatiably curious, and be highly skilled in Geographic Information System (GIS) software. Moore and Krizan grouped the set of core competencies into a taxonomy that managers can use in hiring and training intelligence analysts.

University students who aspire to be intelligence analysts are typically uncertain about where in the IC they will seek a career. This uncertainty could be alleviated by introductory courses designed to first build a knowledge base, followed by a skill set. In such a course, the aspiring student would learn the structure, function, and processes of the IC, and students would be exposed to the terminology and culture of the analytic community. Because academic programs feed not just the federal demand for analysts, but also state and local law enforcement and business marketplaces, some courses would address common requirements across all three fields, and some would address specialty needs.

Kristan Wheaton, who has conducted innovative training on intelligence analysis tools and methods, has pointed out that virtually all analysts now have significant collection, exploitation, and production duties. In an effort to fulfill these duties, analysts need applied courses that teach particular skill sets. Elicitation techniques, multimedia production, and image interpretation are just a few of the subjects that could be part of the common IA curriculum in the IC. Program developers may also contemplate the inclusion of a communications course that introduces students to the fundamentals of effective intelligence writing and presentation in various settings.

Courses specific to each application of IA—national security, law enforcement, and commerce—would address their particular processes and issues, thus helping students become familiar with career choices. More specialized training would ensue subsequent to job placement, when employers address their particular rules, terminology, and practices.

Courses focusing on IA may also include a heavy dose of critical thinking and problem-solving content. David Moore quotes Central Intelligence Agency (CIA) official Mark Lowenthal as saying “that critical thinking ought to be . . . ingrained in analysts as part of their training . . . but to do that, one must first understand critical thinking—what it is, how to do it, [and] how to teach or learn it.”12 Of course, an extensive body of literature exists on critical thinking, to include medical, legal, and engineering applications, but Moore’s work offers a guide to critical thinking from an IA perspective.

Training should expose prospective and new analysts to synthesis, as well as analytic thinking, intuitive cognition, divergent thinking, and ill-structured problems. Problems that are slightly out of focus, ambiguous, and poorly understood are particularly effective instructional instruments because the student analyst “must rely on judgment, intuition, creativity, general problem solving processes and heuristics.”13 Applied courses can also impart to aspiring and working analysts specific analytical techniques. Recent studies have indicated that argument mapping, Bayesian analysis, statistical modeling, and structured role playing have yielded positive results in improving human judgments.14 Teaching the analysis of competing hypotheses, as described by Richards Heuer, is an essential tool in critical thinking and problem-solving processes.15 If these courses flow from a comprehensive, Community-wide assessment, one upshot would be analysts better prepared for a career in any part of the IC, and therefore probably more confident of having chosen this career.

HOW TO TEACH AND TO LEARN—ONE SIZE DOES NOT FIT ALL

The IC’s training challenge includes creating an environment at the front end of the learning cycle that is responsive to the needs of intelligence analysts. Educational psychology literature offers research-based theories of learning and cognitive styles that can help; two are described here as illustrations and not as definitive solutions. While each theory continues to gather evidence and stir debate (with respect to the validity of its underlying conceptualization and associated measurement tools), neither should be regarded as a panacea, and both should be considered among a variety of explanations for how people learn and the varieties of instruction that serve each learning style best. Learning theorists David Kolb, Joyce Osland, and Irwin Rubin assert that,

By examining the learning process we can come closer to understanding how it is that people generate from their experience the concepts, rules, and principles that guide their behavior in new situations and how they modify these concepts to improve their effectiveness.16

These authors focus on identifying and evaluating cognitive styles in adults, or how adults learn.17 Of particular importance for IA training is David Kolb’s focus on the personality attributes associated with particular modes of problem-solving and decisionmaking, and their direct relationship to the learning process.

Due to personal cognitive or learning styles, students respond differently to various methods of instruction and learning materials. For example, some learners thrive under independent study, experiential practice and tests, and self-paced learning using visual and experiential materials. Other students are lost with these methods and need more traditional structure, setting, and activities. All these learning styles are valid; the value is in matching teaching to the learning style. Better trainers—and better training departments—have mastery of more varieties of instruction, materials, and ways of testing the acquisition of knowledge, and can match them to different types of learners. Teachers and departments that employ only one teaching method—say, lecture, reading, and recitation in tests—tend to be less effective.

Kolb’s Approach

Kolb’s Learning Style Inventory (LSI), an assessment tool, specifically addresses the strengths and weaknesses of a learner and indicates which teaching mode might be most effective.18 For example, the LSI may be useful in indicating the degree to which a new analyst orients toward concrete experience, reflective observation, abstract conceptualization, or active experimentation.19 Composite scores in the preceding categories characterize the analyst as a learner who is an accommodator, diverger, converger, or assimilator, according to the theory. A responsive and effective IA training methodology could benefit from individualizing learning as much as possible through the systematic use of results from the LSI—or another validated system—administered to analysts. Consequently, learners could choose from among a variety of structured experiences, or a classroom (if classroom training remains the only vehicle for instruction) could afford a variety of materials and tutorials optimally matched to each learner. Learning style is a key component of the personnel development process, and can be addressed as continuing education programs come on line for analysts who are new today but eventually will be journeymen and, with training and effort, experts.
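Kolb’s quadrant logic can be sketched in code. The following is an illustrative classifier only, not the actual LSI (a published instrument with its own items and scoring); the function name and the numeric scores are hypothetical, but the axis-to-type mapping follows the theory: abstract conceptualization versus concrete experience on one axis, active experimentation versus reflective observation on the other.

```python
def kolb_learner_type(ce, ro, ac, ae):
    """Illustrative classification of a learner under Kolb's theory.

    ce: concrete experience, ro: reflective observation,
    ac: abstract conceptualization, ae: active experimentation.
    The two composite axes are AC - CE (abstract vs. concrete grasping of
    experience) and AE - RO (active vs. reflective transformation of it);
    the quadrant the learner falls into names the type.
    """
    abstract = (ac - ce) >= 0  # grasps experience abstractly rather than concretely
    active = (ae - ro) >= 0    # transforms experience by doing rather than watching
    if abstract and active:
        return "converger"
    if abstract:
        return "assimilator"
    if active:
        return "accommodator"
    return "diverger"
```

A diverger (concrete and reflective), for example, might be routed toward discussion- and brainstorming-centered materials, while a converger might be given hands-on technical exercises.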

Kirton’s Approach

Michael Kirton’s Adaptation–Innovation (AI) Theory provides additional insights for the development of IA training.20 Kirton describes a unidimensional, bipolar cognitive style ranging from highly adaptive to highly innovative that markedly influences problem-solving. Adaptors produce a sufficiency of ideas within the established paradigm in their organization. In contrast, innovators are more likely to reconstruct the problem, separate it from the accepted paradigm, and emerge with a less expected solution, but one that may be equally valid and effective. In this respect, his Adaptor–Innovator continuum is construed as a contribution to the study of creativity in organizations.

The Challenge of Matchups

Research on cognitive styles poses a potentially profound training dilemma for the IC. New analysts must be trained, appropriately assigned, and retrained throughout their careers. Ideally, in order to encourage initial success and longer-term retention, training facilitates a fit between the person and job by assessing the individual’s thinking style and providing guidance about how best to use it, what to do when meeting resistance (or those using a different style), and what contexts benefit the most and least from the style. In creating work environments that maximize the transfer of learning, IA managers can assign individual tasks and structure team tasks to leverage different cognitive styles. (Alternatively, neglecting this dimension can frustrate some analysts, which increases the likelihood of team conflict, in the end discouraging transfer of learning and increasing the likelihood of undesired attrition.) The Kirton and Kolb research and related theories suggest that properly understood and engaged cognitive styles, matched systematically with compatible tasks, materials, teams, and instruction, are more likely to produce effective training experiences and transfer of knowledge, therefore maximizing the learning experience.

Automation and Tutoring

As Robert Barr and John Tagg argue, there is value in moving away from a paradigm that mistakes a means (instruction) for an end (learning), and uses as its primary learning environment the fairly passive lecture-discussion format.21 Either more automated or more personal instruction is better than classroom teaching. Paulo Freire’s work has a particular significance for informing analyst training. Such training should embrace his emphasis on a dialogue with the student, building mutual respect, rather than dispensing knowledge or making “deposits” in a student’s knowledge “bank.” Freire’s concern with praxis—actions that are informed and linked to certain values and serve to problem pose and problem solve22—should be a hallmark of analyst training. J. Dexter Fletcher and Rob Johnston, and later Philip Dodds and Fletcher, found that computer-assisted instruction—especially “smart technology” that includes diagnostic and branching capabilities and the ability to generate new material on the fly—can reduce the time it takes to learn a block of material by thirty percent, or increase achievement by the same amount. Costs for such instruction can be substantially less, as well, once initial development investment is recouped.23

In addition to the benefits of automation in learning, tutoring can pay big dividends. A stream of research comparing amounts of learning across various instructional methods shows that “tutored students learned more than those taught in classrooms.”24 Fletcher and his colleagues explain this disparity as a function of the greater learning opportunities in individual tutorials as compared to a classroom setting. In classes, a group of students asked an average of three questions per hour, and any single student asked an average of 0.11 questions per hour, suggesting very little direct learner-centered interaction. In contrast, tutorial students asked 20–30 questions per hour and in turn were asked 117–146 questions per hour. The authors equate these interactions to learning opportunities, presumably of a more potent nature than that afforded by lecturing and passive reception of information. Many more opportunities to learn only what the student needs to learn, when the student needs to learn it, translates in one set of studies to raising the achievement of the average student two standard deviations (in effect, from the middle of the herd to performance equivalent to that of the top students).25
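The size of that two-standard-deviation gain can be made concrete with a quick normal-model calculation (an arithmetic illustration of the cited effect size, not data from the underlying studies): a student lifted from the mean to two standard deviations above it moves from the 50th to roughly the 98th percentile of the original distribution.

```python
from statistics import NormalDist

# Model achievement as approximately normally distributed across students.
scores = NormalDist(mu=0, sigma=1)

before = scores.cdf(0.0)  # average student: fraction of peers at or below = 0.5
after = scores.cdf(2.0)   # same student after a +2 sigma gain: ~0.977

print(f"percentile before: {before:.1%}, after: {after:.1%}")
# percentile before: 50.0%, after: 97.7%
```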

While one-on-one tutoring is not a practical solution for the huge numbers of accessions in the IC for the next few years, some evidence indicates that group tutorials afford some portion of the benefits of one-on-one learning and are significantly improved over classroom instruction. By this token, classes of 20–30 students could be reduced to groups of six to ten students in group tutorials, thus meeting more competently the challenges of speed and volume. In combination with automated instruction, the gains from group tutorials could be greater still, and possibly enough to justify the additional costs of adding personnel for the tutorials. The comparative value of alternative training methods remains an empirical question, eminently answerable by experiments and pilot programs. In the current circumstances, where training demands profoundly strain the limits of the IC’s capacity to train and assure the quality of training, research findings that offer alternative solutions have critical, perhaps ethical, implications.

Shortcomings in Institutional Practices

The design of the IA learning environment should account for the expectations of students emerging from secondary school systems where discovery- and problem-based learning approaches increasingly augment collaborative and cooperative learning methods. The Case Study method is a particularly effective discovery- and problem-based approach for teaching IA. Thomas Schreeve, a pioneer in developing this method for teaching intelligence analysts, uses questions to guide discussion to a particular pedagogical destination. In the process, students are made partners in the reduction of ambiguity surrounding complex issues. As Schreeve points out, the advantages of the case method are the speed with which instructors become familiar with the technique and subject material, and the variety of cases already tailored to the IC.26

Unfortunately, universities and employer-based IA training programs are twice disadvantaged: First, training institutions do not uniformly select and prepare IA teachers with the same care used by employers to select and train analysts. Second, training institutions are focused on staffing classrooms for standard lecture presentations when other teaching models are, for some purposes and learners, demonstrably more effective and efficient. Training programs invest very little time and few resources to teach instructors how to create an innovative learning environment. Instructors tend to be selected solely for their subject matter expertise, plucked out of their cubicles, and thrust into the classroom with a lesson plan shoved into their hands; at that point, they are expected to engage students in a meaningful and skilled way.

IA training should, but generally does not, incorporate structured and rigorous instructor training programs with an emphasis on learning theory and techniques. New instructors could learn a spectrum of teaching methodologies, practice them in mock classes, and garner the benefit of assessments by master teachers prior to being let loose on students. However, more selectivity in assigning teachers and more time taken to train them in learning theory will make it harder, at least in the short run, to staff training programs. In the long run, such an approach is likely to be well worth the cost, with the benefit of greater quality of training.

Alternatives to the teacher-in-front-of-the-class model offer advantages for both effectiveness and efficiency. The greater number of teachers required in a tutorial-based system is not a conversation stopper for training managers, but requires some empirical testing of ways to achieve the benefits of tutoring while not bringing the training process to a crawl. The point here is not to argue for the best methodology for delivering instruction—an inherently meaningless proposition. Teaching styles, content modalities, and other variables are not absolutely good or bad, but better or worse for some purposes, some content, and some learners. The goal is to provide a mix of methodologies, and the flexibility and know-how to use any and all of them, in an effort to create an optimal learning environment for the variety of learners’ needs. Adaptability and agility in alternative training methods would well serve the IC’s professional preparation of analysts.

HOW TO ASSESS TRAINING AND LEARNING

A dispute prevails in the literature as to whether the IC has a sound basis at this point for assessing its training needs, level of expertise, and success in training. Studies and commissions writing of IA training, since at least 1996, suggest that the IC has no overriding concept, curriculum, plan, or procedures for assessing individual needs, achievement, and training effectiveness, and that its programs are either fragmented in agency-specific stovepipes or unduly duplicative (which is another manifestation of uncoordinated efforts).27 More precisely, Rob Johnston has asserted that IA lacks a taxonomy that would assist in the “development of a research matrix that identifies what is known and how that information may be used in intelligence analysis” and the “application of research from other domains to develop additional training and education programs for analysts.”28 Furthermore, a codified body of knowledge, consensually or empirically validated standards of practice, a common training curriculum, and public criteria for levels of proficiency among practitioners amount to an analytic doctrine, and none of these is part of the practice of IA.

Without doctrine, a discipline or organization has no basis on which to determine performance requirements, minimal criteria for acceptable practice, or progress toward competent independent practice. The absence of doctrine makes it nearly impossible to determine how expert someone is or needs to be, or to reliably judge what an acceptable work product is. In the absence of a standardized doctrine, “analysts are left to the rather slow and tedious process of trial and error throughout their careers”29 and the vicissitudes of the subjective judgments of ever-changing authorities.

Yet, some training proponents, such as Stephen Marrin in a post–11 September 2001 (9/11) article taking stock of progress in the CIA’s Sherman Kent School, believe that Sherman Kent laid down an early analytic doctrine that has been advanced and refined since then by practitioner–scholars. Moreover, tradecraft precepts, such as those promulgated by former CIA Director of Intelligence Douglas MacEachin and his successors, have elaborated practical elements to the early formulation.30

316 JAMES G. BRECKENRIDGE

INTERNATIONAL JOURNAL OF INTELLIGENCE


While the report of the 2005 Commission on the Intelligence Capabilities of the United States Regarding Weapons of Mass Destruction, among other calls to action previously cited here, encourages the IC to marshal more and better training on any number of topics, no one study or report addresses how to do that. Its authors may assume, erroneously, that a plan for training is common knowledge. But before anyone turns to the catalog of courses recommended, the field should establish an analytic doctrine, standards of practice and expertise, and a taxonomy of knowledge about the discipline to rationalize such a curriculum. Once these fundamentals are achieved, developing a training needs assessment scheme will be easier. In turn, ways to assess training are well known. Winfred Arthur et al. provided a hint, and training attendees in the IC already supply the data for Donald Kirkpatrick's first level of training assessment, which asks: How satisfied are you with the instructor, the content, the ventilation, and so on? The IC has no shortage of Level 1 data.

Next, how well do IC analytic trainees learn new concepts, skills, and other content? Occasionally, trainers collect this Level 2 data. Can training program managers say how much knowledge any particular course imparts on average, i.e., which concepts and abilities do students know at the end of class that were not known before? Measuring student knowledge only at their exit from a program does not show what was learned. Both pre- and post-testing are necessary to assess gains in learning. Beyond knowledge learned, the taxonomy should reveal what portion the course represents of the total knowledge needed for entry-level expertise, and what competencies should follow to climb the ladder to expert level.

Subsequently, the IC must ask: How well are Level 3 outcomes assessed and attained? What can training program managers say about the effects on job behavior of any particular lessons in the curriculum? How much better did Cultural Awareness students do in writing analyses of East Asian or Latin American countries? What is different now in how any particular student approaches the effects of culture on leaders' decisionmaking, or interprets ambiguous source reporting?

Finally, the IC should collect Level 4 data. For example, since the advent of the analysis of alternative hypotheses classes, how much better does the IC now evaluate low-likelihood, high-criticality events, the ones that previously might have been discounted out of hand in the face of high-likelihood, favored hypotheses? How many analyses are supported by an examination of disconfirming data? Has analytic surprise been avoided to any greater degree (notwithstanding the analytic problems of answering that kind of question)? Have alternative analytic methods led to more creative products, as evidenced, say, in customers conceiving unexpected or novel policy options as a result of IC analyses? How well is evidence evaluated differently now? Are some kinds of factoids receiving more scrutiny, and is the prevailing analytic line more frequently challenged? What program office is responsible for collecting and analyzing these data? How do the data link up with product evaluation programs? What is the mechanism for feeding the findings back to the workforce? Ideally, everyone in the IC should know the answers to these questions.

Substantive answers to these queries go a long way toward demonstrating that the IC and its academic and contractor partners know their business, are making progress, and are succeeding at training effective intelligence analysts. Should the IC be able to satisfactorily answer at least seventy percent of these questions (typically, a passing grade), the Community would have come a long way.
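The pre- and post-testing argument above can be made concrete with a short calculation. The sketch below is illustrative only: the normalized-gain formula is borrowed from education research, and the scores and function names are hypothetical, not drawn from any IC program.

```python
# Illustrative Kirkpatrick Level 2 assessment via pre-/post-testing.
# Scores are percentages; the data below are hypothetical.

def normalized_gain(pre: float, post: float) -> float:
    """Fraction of the possible improvement actually achieved."""
    if pre >= 100.0:
        return 0.0  # nothing left to learn on this instrument
    return (post - pre) / (100.0 - pre)

def course_gain(scores: list[tuple[float, float]]) -> float:
    """Average normalized gain across all trainees in a course."""
    gains = [normalized_gain(pre, post) for pre, post in scores]
    return sum(gains) / len(gains)

# Three hypothetical trainees: (pre-test %, post-test %)
scores = [(40.0, 70.0), (55.0, 85.0), (20.0, 60.0)]
print(round(course_gain(scores), 3))  # → 0.556
```

Measuring only the post-test column would rank these trainees very differently; the gain column is what a Level 2 evaluation is after.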

INTEGRATING THE LEARNING CYCLE31

Winfred Arthur and his colleagues asked whether training effectiveness is more a function of learning criteria (Kirkpatrick's Level 2) or of behavioral criteria (Kirkpatrick's Level 3). Learning criteria measure how much students retained from the training, while behavioral criteria measure changes in their job-related performance. A Government Accountability Office study found that some work environments are more conducive to the transfer of learning.32 This challenge is common to all training: demonstrating transfer of learning from the classroom to the job, and then aggregating the effects of all the learners' new skills into larger effects on mission accomplishment or organizational outcomes (Kirkpatrick's Level 4).

The overall challenge for training is to integrate the entire learning cycle: what to teach, how to teach it, how to assess it, and how to optimize conditions in the workplace to ensure that analysts apply their learning effectively. Incentives to apply new learning and disincentives for relapsing into old patterns are just two obvious ways to optimize the work environment. Designing the workspace and work processes, integrating continuous assessment and feedback into the normal work flow, asking returning students to train some of the uninitiated, and other participative management techniques are additional low-hanging fruit that can help optimize the environment for the application of learning.

To maximize the return on investment in training, each part of the trainee's learning cycle should be optimized. At the start, training managers can collaborate with managers and senior experts to determine the following: what KSAs are required for a job at different levels (training objectives); what each trainee needs to learn, starting from what each person already knows (gap analysis); and how to train each of those KSAs, ideally tailored to each learner's learning style. For on-the-job training, IA managers should have a solid understanding of how to create a working environment that maximizes the opportunity to use the new knowledge and skills, and should shape their environments accordingly, including reinforcements for applying new KSAs. That creation will require, among other actions, tailored assignments and intensified supervision (to catch and reward the employee for demonstrating learning), and training managers in how to do that. Finally, organizational assessments with feedback to personnel should feature the prevalence and benefits of new KSAs applied at both the job level and the organizational level. In the process, accountability throughout the cycle is more readily achieved.
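The gap-analysis step described above amounts to a set difference between the KSAs a job level requires and the KSAs a trainee has already demonstrated. A minimal sketch follows; the KSA names and levels are hypothetical examples, not an official IC competency list.

```python
# Hypothetical KSA requirements by career level; names are illustrative only.
REQUIRED_KSAS = {
    "entry": {"critical thinking", "source evaluation", "analytic writing"},
    "journeyman": {"critical thinking", "source evaluation", "analytic writing",
                   "structured analytic techniques", "briefing"},
}

def training_gap(level: str, demonstrated: set[str]) -> set[str]:
    """KSAs required at `level` that the trainee has not yet demonstrated;
    these become that individual's training objectives."""
    return REQUIRED_KSAS[level] - demonstrated

# A trainee who already writes well still needs two entry-level KSAs.
print(sorted(training_gap("entry", {"analytic writing"})))
# → ['critical thinking', 'source evaluation']
```

The same call with the full entry-level set against the "journeyman" requirements yields the KSAs for the next rung of the ladder, which is the sense in which the taxonomy rationalizes a curriculum.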

A PARTNERSHIP BETWEEN GOVERNMENT AND ACADEMIA

Rob Johnston's findings on the state of IA and training are consistent with the conclusions of the Future of Intelligence Analysis study commissioned by the Assistant Deputy Director of National Intelligence for Education and Training. The study found "a crucial need to develop education and training programs that not only improve analysis directly but also professionalize the analytic workforce. Education and training are low priority activities throughout the IC. The fact that managers do not receive consistent training throughout their careers probably reinforces this bias."33 The report goes on to recommend

including agency-specific programs and programs offered by universities; making as many education and training programs as possible IC-wide initiatives; initiating a mandatory, joint "boot camp" for all analysts in the IC within the first six months of employment; developing a coordinated education and training continuum for managers as well as analysts so that education and training becomes a standard, periodic feature of analysts' and managers' careers.34

In addition, the IC must recognize the indispensability of colleges and universities in providing prospective entry-level analysts. Higher education offers the IC advantages in recruiting potential analysts, educating them through a broad curriculum, and conducting the assessments necessary to produce relevant and measurable outcomes. Educators can actively respond to, and provide feedback for, intelligence training initiatives. Educators and trainers can identify the best techniques to inform decisionmaking, solve problems, and reduce uncertainty and surprise. In the process, educators and trainers can identify the principles, tasks, and skills that best reframe a decisionmaker's question, transform it from a question of fact to an analytic question, extract assumptions and biases from the question, and identify what the analyst knows and does not know. The marriage between academia and the IC can go a long way in providing a knowledge base for prospective analysts. The IC can then invest resources in equally important advanced and continuing education classes.


In an effort to meet the expanded requirement for entry-level analysts, the IC has stepped up its recruiting efforts at colleges and universities. Of particular note is the ODNI-directed IC Centers for Academic Excellence (CAE) program, which provides grants to four-year institutions to build IA programs. Currently, ten schools are funded through the program, and more CAEs may emerge. These programs are subject to IC oversight and annual review, as the jury is still out on how well the initiative is fulfilling its mission. Several other undergraduate and graduate programs have established curricula in IA and provide entry-level analysts to the IC alongside the ten CAEs, as noted in Table 1.

The IC looks to academic institutions to assist with the preliminary preparation of aspiring analysts. If these institutions are to be effective, evaluation standards and measures of effectiveness, as established by the IC, should be fully integrated into the academic curricula.

Intelligence Community funding and assistance to academic institutions to help prepare entry-level analysts will help alleviate the training resource problem. Analysts so educated may be able to test out of, or spend less time in, basic courses offered by the IC, and resources can be redirected to advanced and career IA courses. The question then becomes: how best to prepare students for eventual work in the IA community while, at the same time, reducing the burden of training on the IC?

Table 1. Schools with Intelligence Analysis Programs

1. The California State University, San Bernardino*
2. Clark Atlanta University, Atlanta, Ga.*
3. Embry-Riddle Aeronautical University, Prescott, Ariz.
4. Florida International University, Miami*
5. Georgetown University, Washington, D.C.
6. The Institute of World Politics, Washington, D.C.
7. Johns Hopkins University, Baltimore, Md.
8. Mercyhurst College, Erie, Pa.
9. Neumann College, Aston, Pa.
10. New Mexico State University, Las Cruces
11. Norfolk State University, Va.*
12. Tennessee State University, Nashville*
13. Trinity University, Washington, D.C.*
14. University of Notre Dame, Notre Dame, Ind.
15. University of Pittsburgh, Pa.
16. The University of Texas, El Paso*
17. The University of Texas, Pan American, McAllen*
18. The University of Washington, Seattle*
19. Wayne State University, Detroit, Mich.*

Note. Schools marked with an asterisk (*) were chartered under the DNI's Intelligence Community Centers for Academic Excellence program; the others started curricula on their own.

REFERENCES

1. A huge exodus of employees and an influx of recruits to replace them, in addition to a Congressionally mandated plus-up to the analytic corps, create a substantial training demand. Projections by the Office of Personnel Management, as cited by the Government Accountability Office, are that sixty percent of white collar and ninety percent of executive employees will be eligible to retire by 2010. Assuming these include senior experts, and that the IC's analytic corps is representative of the federal government, managers face a potentially massive turnover with implications for training. See J. Christopher Mihm, Statement to the House Committee on Appropriations, Subcommittee on Financial Services and General Government, Human Capital: Federal Workforce Challenges in the 21st Century, 6 March 2007, http://www.gao.gov/new.items/d07556t.pdf. Moreover, as an indication of one agency's need, the media report that forty percent of CIA analysts were hired after 9/11 and one out of seven have less than one year's experience on the job. See David Morgan, "U.S. Intelligence Still Years from Reform Goals," Reuters, 27 January 2007, http://www.reuters.com/article/domesticNews/idUSN2446823920070124. These ratios will become more severe over the next couple of years as the slopes of these two lines, the departures and the arrivals, approach their respective apogees.

2. Winfred Arthur Jr. et al., "Effectiveness of Training in Organizations: A Meta-Analysis of Design and Evaluation Features," Journal of Applied Psychology, Vol. 88, No. 2, 2003, pp. 234–245.

3. Donald L. Kirkpatrick, "Techniques for Evaluating Training Programs," Journal of the American Society of Training and Development, Vol. 13, 1959, pp. 3–9; Donald L. Kirkpatrick, "Evaluation of Training," in Robert L. Craig, ed., Training and Development Handbook: A Guide to Human Resource Development, 2nd ed. (New York: McGraw-Hill, 1976), pp. 301–319; Donald L. Kirkpatrick, Evaluating Training Programs: The Four Levels (San Francisco: Berrett-Koehler, 1994).

4. Donald L. Kirkpatrick, "Techniques for Evaluating," pp. 3–9.

5. Winfred Arthur Jr. et al., "Effectiveness of Training," p. 236.

6. This fifth stage, job design and management, is outside the scope of training program managers.

7. Rob Johnston, Analytic Culture in the U.S. Intelligence Community: An Ethnographic Study (Washington, DC: Center for the Study of Intelligence, 2005), https://www.cia.gov/library/center-for-the-study-of-intelligence/csi-publications/books-and-monographs/analytic-culture-in-the-u-s-intelligence-community/index.html. A review of Websites would include, for example: Director of National Intelligence (www.dni.gov), Central Intelligence Agency (www.cia.gov), National Security Agency (www.nsa.gov), Department of Homeland Security (www.dhs.gov), Federal Bureau of Investigation (www.fbi.gov).

8. Russell G. Swenson, ed., Bringing Intelligence About: Practitioners Reflect on Best Practices (Washington, DC: U.S. Government Printing Office, 2003), http://www.au.af.mil/au/awc/awcgate/dia/bring_intel_about.pdf

9. Ibid., p. 4.

10. David T. Moore and Lisa Krizan, "Core Competencies for Intelligence Analysis at the National Security Agency," in Russell G. Swenson, ed., Bringing Intelligence About: Practitioners Reflect on Best Practices, pp. 95–131.

11. For example, Ability (Teaming and Collaboration), Characteristic (Voracious Reader), Knowledge (Government Policy), Skill (Critical Thinking).

12. David T. Moore, Critical Thinking and Intelligence Analysis, Occasional Paper #14 (Washington, DC: Joint Military Intelligence College, 2006), p. x.

13. Harvey J. Brightman, Problem Solving: A Logical and Creative Approach (East Lansing, MI: Michigan State University Press, 1998), p. 6.

14. Steven Rieber and Neil Thomason, "Creation of a National Institute for Analytic Methods," Studies in Intelligence, Vol. 49, No. 4, 2005, https://www.cia.gov/library/center-for-the-study-of-intelligence/csi-publications/csi-studies/studies/vol49no4/Analytic_Methods_7.htm

15. Richards J. Heuer Jr., Psychology of Intelligence Analysis (Washington, DC: Center for the Study of Intelligence, 1999), https://www.cia.gov/library/center-for-the-study-of-intelligence/csi-publications/books-and-monographs/psychology-of-intelligence-analysis/PsychofIntelNew.pdf

16. David A. Kolb, Joyce S. Osland, and Irwin M. Rubin, Organizational Behavior (Englewood Cliffs, NJ: Prentice Hall, 1995), p. 49.

17. Researchers define cognitive style in technically different ways. For present purposes, it is useful to define the concept as consistent differences among people in the way they process, organize, understand, and use experience and information, specifically in the way they make decisions and solve problems. Cognitive style is empirically demonstrated to be separate from cognitive ability, and related to but different from personality, emotion, and attitudes.

18. David A. Kolb, The Learning Style Inventory: Self-Scoring Test and Interpretation Booklet (Boston: McBer, 1976); David A. Kolb, The Learning Style Inventory: Technical Manual (Boston: McBer, 1976).

19. David A. Kolb, Joyce S. Osland, and Irwin M. Rubin, Organizational Behavior, p. 52.

20. Kirton assesses thinking styles using the Kirton Adaption–Innovation (KAI) Inventory, already used in some sectors of the IC. See Michael Kirton, "Adaptors and Innovators: A Description and Measure," Journal of Applied Psychology, Vol. 61, No. 5, 1976, pp. 622–629; Michael Kirton, Kirton Adaption–Innovation Inventory Manual, 3rd ed. (Hertfordshire, UK: KAI Distribution Center, 1999).

21. Robert B. Barr and John Tagg, "From Teaching to Learning: A New Paradigm for Undergraduate Education," Change, Vol. 27, No. 6, 1995, pp. 13–15.

22. Paulo Freire, influential Brazilian educator, at http://www.infed.org/thinkers/et-freir.htm. See also Paulo Freire, Pedagogy of Freedom (New York: Rowman and Littlefield, 1998).

23. J. Dexter Fletcher and Rob Johnston, "Effectiveness and Cost Benefits of Computer-Based Decision Aids for Equipment Maintenance," Computers in Human Behavior, Vol. 18, No. 6, 2002, pp. 717–728; Philip Dodds and J. Dexter Fletcher, Opportunities for New "Smart" Learning Environments Enabled by Next Generation Web Capabilities (Ft. Belvoir, VA: Defense Technical Information Center, 2004).

24. J. Dexter Fletcher and Rob Johnston, "Effectiveness and Cost Benefits."

25. Three dissertations conducted under Bloom's direction, as cited in ibid., p. 88.

26. Thomas W. Shreeve, Experiences to Go: Teaching with Intelligence Case Studies, Discussion Paper #12 (Washington, DC: Joint Military Intelligence College, 2004).

27. For example, Les Aspin and Harold Brown, Commission on the Roles and Capabilities of the U.S. Intelligence Community Report: Preparing for the 21st Century: An Appraisal of U.S. Intelligence (Washington, DC: U.S. Government Printing Office, 1996); Shawn Reese, Congressional Research Service Report for Congress: Federal Counter-Terrorism Training: Issues for Congressional Oversight (Washington, DC: U.S. Government Printing Office, 2005); William Lahneman et al., The Future of Intelligence Analysis, Vol. I (College Park, MD: University of Maryland, 2006); Presidential Commission on Weapons of Mass Destruction, The Commission on the Intelligence Capabilities of the United States Regarding Weapons of Mass Destruction (Washington, DC: U.S. Government Printing Office, 2005), http://www.wmd.gov/report/wmd_report.pdf; Rob Johnston, Analytic Culture.

28. Rob Johnston, Analytic Culture in the U.S. Intelligence Community, p. 44.

29. Ibid., p. xviii.

30. Stephen Marrin, "CIA's Kent School: Improving Training for New Analysts," International Journal of Intelligence and CounterIntelligence, Vol. 16, No. 4, Winter 2003–2004, pp. 609–637.

31. The Government Accountability Office has recognized the need for a more sophisticated approach in government to strategic training. A general depiction can be found in: U.S. Government Accountability Office, Human Capital: A Guide for Assessing Strategic Training and Development Efforts in the Federal Government (Washington, DC, 2004), www.gao.gov/new.items/d04546g.pdf, accessed 10 August 2007.

32. Ibid., p. 242.

33. William J. Lahneman et al., The Future of Intelligence Analysis, Vol. I, p. 6.

34. Ibid., p. v.
