
Mandeles, Operational-level analysis, parts 1 & 2 (2011)



From: Mark D. Mandeles
Date: 31 March 2011
Subj: JCOA and operational analysis for US forces and their commanders, Part 1

Preface. To date, most Joint Center for Operational Analysis (JCOA) products have been conceived as stand-alone efforts to satisfy a specific question or task. In answering questions posed by combatant commanders during wartime, JCOA has missed opportunities to meet vital unrecognized needs. These “JCOA and operational analysis” memoranda propose a conceptual and methodological shift in JCOA’s approach to operational analysis and assessment. The argument presented, however, does not seek to disparage the quality of JCOA products or of its analysts. Rather, it presents an opportunity to increase the value of JCOA analyses and products.

JCOA Mission Statement: “As directed, JCOA collects, aggregates, analyzes and disseminates joint lessons learned and best practices across the full spectrum of military operations in order to enhance joint capabilities.” (September 2010)

Background. JCOA’s September 2010 one-sentence mission statement creates a dilemma; the means (to collect, aggregate, analyze, and disseminate joint lessons learned and best practices) do not translate directly into the achievement of the stated end (to “enhance joint capabilities”). JCOA analysts conceive and design studies around the formulation of joint lessons learned, and prepare briefings and reports to present these lessons. Therein lies the problem. JCOA develops lessons learned by examining unique events and experiences, defined by a particular place, time, and situation. Yet, the goal of enhancing joint capabilities and the implied goal of supporting operational forces and their commanders require the development of broader knowledge about employment of operational forces that then can be applied consciously to near-, mid-, and long-term “full-spectrum operations.” Developing such knowledge calls for modifying JCOA’s current approach to designing studies, developing findings, interacting with, and presenting results to combatant commanders.

The primary component of the JCOA approach to operational-level analysis should be the conceptual work of identifying and analyzing themes or patterns in the design, management, implementation, and assessment of operations and their outcomes. The elevation of the conceptual element of operational analysis in JCOA studies would decrease, but not eliminate, the current emphasis on collecting, aggregating, analyzing, and disseminating joint lessons learned. In the scheme proposed here, the individual studies are examples (or cases) of important issues and dynamics that may have been overlooked. In other words, joint lessons learned and “best practices” should be conceived as data and evidence informing a parallel and complementary effort to build, assess, and critique a growing, systematic, and coherent body of military operational knowledge.1

Plan of attack. This memo is the first section of a study to examine the organization of operational-level analysis and assessment to support operational forces. Two issues will be introduced here: (1) the importance of the growth of knowledge to enhancing joint capabilities and supporting operational forces, and (2) the need to recognize and avoid the conversion of analytic means into research ends. These issues are illustrated in a follow-up memo, “JCOA and operational analysis for US forces and their commanders, Part 2,” which compares the draft Transition to Stability Operations (TSO) report to the Iraq Information Operations (I2O) and Imposing Order on Chaos: Establishing JTF Headquarters (JTF HQ) reports, and examines how the reports contribute to the growth of knowledge about operational matters. I won’t have time to complete the draft of a proposed “JCOA and operational analysis for US forces and their commanders, Part 3,” which was conceived to examine the feasibility of developing a sustained operational-level analysis research practice and tradition, a critical element in maximizing the value and effectiveness of JCOA’s work.

1 The interplay between parallel lines of analysis—building systematic operational knowledge and developing joint lessons learned—operationalizes the Army’s efforts to institutionalize adaptation. I believe that General Dempsey argued against the imposition of a “one best way” to develop the knowledge and understanding that enable adaptation. He stated that he was “neither looking for consensus nor affirmation of our current path. In fact, the power of our great profession comes from … diversity of thought …” GEN Martin E. Dempsey, “A Campaign of Learning to Achieve Institutional Adaptation,” Army, November 2010, p. 34.

Growth of knowledge. All research methodologies express a logic and procedure of analysis. Analysts choose methodologies to overcome known obstacles to describing and explaining behavior and physical phenomena, and the conduct of research and experimentation may stimulate refinements and improvements of methods. Over time, a research tradition emerges that encompasses knowledge about topics and problems under consideration, methodologies to explore those topics and problems, and gaps in knowledge that become topics for research.2

What does it mean to call for an “organizational culture that learns”? The concept encompasses three interacting and reinforcing elements. First, the culture’s formal and informal rules defining appropriate roles and behavior must foster the application of empirical evidence to assumptions, statements, strategies, policies, and plans. Second, empirical evidence must be developed within institutionalized processes and procedures that enable and reward criticism of research tactics and methods, rules of evidence and inference, and research findings. Finally, the culture will influence practice by enabling people to communicate and share knowledge.

The institutionalization of criticism distinguishes deliberate, systematic, and conscious learning from trial-and-error activity.3 To be sure, trial-and-error methods have led to discoveries. But the growth of knowledge is faster when supported by systematic criticism. The reciprocal relationship between a learning organizational culture and the organization’s approach to research is illustrated by the following apocryphal story:

During the thirteenth century, professors at the University of Paris decided to find out whether oil would congeal if left outdoors on a cold night. They launched a research project to investigate this question. To them, research meant searching through the works of Aristotle. After much effort, they found that nothing Aristotle had written answered their question, so they declared the question unanswerable.

The essential truth of that anecdote is that the Parisian professors were right: their question was unanswerable within the research tradition to which they conformed. Research findings often tell more about the researchers’ tactics than about the phenomena studied, the tactics being shaped by the culture in which the research takes place.4

2 Imre Lakatos, “Falsification and the Methodology of Scientific Research Programmes,” in Imre Lakatos and Alan Musgrave, eds., Criticism and the Growth of Knowledge (Cambridge: Cambridge University Press, 1970); Imre Lakatos, The Methodology of Scientific Research Programmes, eds. John Worrall and Gregory Currie (Cambridge: Cambridge University Press, 1978).

3 Philosopher of science Karl R. Popper argued, “Science is one of the few human activities—perhaps the only one—in which errors are systematically criticized and fairly often, in time, corrected. That is why we can say that, in science, we often learn from our mistakes, and why we can speak clearly and sensibly about making progress there.” Conjectures and Refutations: The Growth of Scientific Knowledge, 2nd edition (New York: Harper and Row, 1965), p. 216.

Applied to JCOA, a research approach that focuses on lessons learned and best practices treats study topics and situations as singular events. There is currently no requirement for the study team to devote deliberate and conscious effort to (1) provide historical and conceptual context for the topic under consideration; (2) clarify explanatory concepts or causal mechanisms; or (3) treat the phenomena under examination as examples of known processes, types, concepts, problems, or phenomena. To be sure, many JCOA projects have been initiated in response to a crisis, and study leaders and product managers have had little time for reflection prior to a study’s data collection phase. Overcoming such practical obstacles to setting conditions for the growth of knowledge should not only be the responsibility of the product manager and study leader, but should extend to S&A leaders.

This brief discussion about the centrality of growth of knowledge to producing research that enhances joint capabilities is abstract. My 4,000-word critique of the TSO report5 shows concretely how JCOA’s approach to research and analysis has evolved and how JCOA’s research practices hinder the development of products on operational-level problems.

Conversion of analytic means into research ends. The TSO team members are smart and skilled analysts, and they worked diligently. However, they produced findings—vetted by JCOA murder boards—that do not advance observations and analyses produced by previous JCOA studies or capture and further develop knowledge produced by academia and other analytic organizations. I believe this situation results from the way JCOA has changed means—collecting, aggregating, analyzing, and disseminating joint lessons learned—into the ends of analysis. Converting means into ends is a widespread phenomenon in public and private, civilian and military organizations;6 it impedes the production of useful knowledge when “the way things get done around here” becomes “the only way things get done around here.”7

In this, JCOA is not unique. The larger and more important point is that JCOA’s research tradition hinders the application of existing knowledge to the research process and to the research findings. The consequences of failing to apply knowledge to the research process and to findings are costly; JCOA gives up opportunities to support operational forces (and their commanders), and to fill an unoccupied organizational niche within the DoD for the conduct of operational-level analyses. No DoD organization, including the Institute for Defense Analyses and the Center for Naval Analyses, fills this niche on a consistent basis.8 In work I did a few years ago for Andy Marshall, I identified examples of these FFRDCs not seizing opportunities to do such vital work.9

4 William H. Starbuck, The Production of Knowledge: The Challenge of Social Science Research (Oxford: Oxford University Press, 2006), p. 1.

5 Memorandum, Mark D. Mandeles to Russ Goehring, Subject: Draft TSO Report, 8 October 2010, Date: 5 November 2010.

6 In 1940, sociologist Robert K. Merton called this phenomenon “goal displacement.” Robert K. Merton, “Bureaucratic Structure and Personality,” Social Forces, Vol. 17, 1940, pp. 560–68.

7 Colonel Christopher R. Paparone and Colonel George Reed, “The Reflective Military Practitioner: How Military Professionals Think in Action,” Military Review, Vol. 88, March–April 2008, p. 69.

If JCOA does not fill the operational-level assessment and analysis niche, others outside DoD may try to fill that void.10 Recently, in a symposium on academic-military relations, Yale University Professor of Management and Political Science Paul Bracken suggested that university professors should conduct operational-level analysis of military operations. One justification for his proposal was that if academics opt out of research on national security issues,

It leaves the field open to others who are only too happy to shape the public debate. There is already too much content that comes unfiltered from inside the Washington beltway. A kind of auto-stimulation occurs. …[T]here’s not much thinking going on in the think tanks these days. … [T]he point I emphasize is that this creates an opportunity for academia to enter the debate. If we don’t, others will.11

By noting Bracken’s proposal, I am not arguing that academics should be excluded from national security debates. I am arguing that JCOA’s history and previous work, knowledgeable and experienced analysts, and organizational access to unified commands and commanders give it legitimacy to conduct serious, informed analysis and to develop knowledge necessary to help operational forces respond to an increasingly dangerous security environment. Building an evolving body of operational-level knowledge would additionally distinguish JCOA from other organizations that advertise themselves as sources of operational insight and well-grounded assessment and analysis.12

Indeed, it is very difficult to conduct operational-level assessment and analysis, and declaring the intent to do so does not translate directly into the deed. Yet, the efforts associated with conducting operational-level assessment and analysis distinguish high- and low-performing militaries. In the mid-1990s, Andy Marshall wrote three memoranda on the concept of military revolutions. On the one hand, the U.S. should strive to be, in Andy Marshall’s words, “the best in the intellectual task of finding the most appropriate innovations in concepts of operation and making organizational changes to fully exploit the technologies already available and those that will be available.”13 On the other hand, the memoranda are mute about comparing and ranking capabilities to conduct operational-level analysis and assessment among the U.S. allies, potential adversaries, and enemies. However, this topic was addressed in the conclusion to the Iraq Information Operations report:

8 In launching a program to improve the Army’s institutional capacity for adaptation, General Dempsey’s purpose is to generate operational- and strategic-level effects. General Dempsey recognized that one constraint this program must overcome is the necessity to “relearn” and “rediscover” principles of command. To date, unfortunately, none of General Dempsey’s articles has directly addressed the deliberate development of a systematic body of knowledge as an operational-level tool or standard to assess the relevancy of principles, such as “mission command,” to contemporary conflicts. GEN Martin E. Dempsey, “A Campaign of Learning to Achieve Institutional Adaptation,” Army, November 2010, pp. 34–35; GEN Martin E. Dempsey, “Concepts Matter,” Army, December 2010, pp. 39–40; GEN Martin E. Dempsey, “Mission Command,” Army, January 2011, pp. 43–44; GEN Martin E. Dempsey, “Leader Development,” Army, pp. 25–28; GEN Martin E. Dempsey, “Win, Learn, Focus, Adapt, Win, Again,” Army, March 2011, pp. 25–28.

9 Mark D. Mandeles, Military Transformation Past and Present: Historic Lessons for the 21st Century (Westport, CT: Praeger Security International, 2007), p. 99.

10 Anthony H. Cordesman has developed operational-level analyses that examine the relationship between means and ends, for example, “The Quadrennial Diplomacy and Development Review (QDDR): Concepts Are Not Enough” (Washington, D.C.: Center for Strategic and International Studies, 21 December 2010).

11 Paul Bracken, “Scholars and Security,” Perspectives on Politics, Vol. 8, December 2010, pp. 1098–99.

12 Army Colonels Paparone and Reed argued that improving its body of abstract professional knowledge increases an organization’s fitness (and legitimacy) in competing with other organizations offering similar services. Colonel Christopher R. Paparone and Colonel George Reed, “The Reflective Military Practitioner: How Military Professionals Think in Action,” Military Review, Vol. 88, March–April 2008, p. 67.

An additional type of ongoing assessment, which does not seem to receive much attention, is that [U.S.] commanders [in Iraq] have taken a strategic view of assessments by making routine the efforts to (1) gather data about effects and effectiveness, and (2) commission policy reviews conducted by outsiders. In this enterprise, commanders and outside agencies deliberately engaged in the process of identifying errors and figuring out how to correct them. Congress commissioned studies of IO capabilities and organization. The Services, such as the Army’s CALL, sent teams of analysts to theater to identify and analyze “lessons.” Major General Hammond, commanding general MND-B, gave carte blanche to an outside advisor to thoroughly review the conduct of IO in his division—and to recommend fixes. To be sure, the analyses and assessments already conducted are not all perfect; some assessments are limited by the availability of appropriate data; some may be guided by pre-determined conclusions; and others may not provide useful recommendations. Yet, the acceptance and application of this strategic form of assessment is remarkable; it represents an approach to rational self-correction that few public or private organizations employ; it might form the basis of a narrative to the U.S. national security community about itself; and it might offer a useful dimension to compare the performance of military organizations in a net assessment (my emphasis).14

JCOA’s research tradition has obscured opportunities to recognize and exploit a line of assessment and analysis that could further enhance its unique role in DoD. The following paragraphs offer another example of how JCOA’s research tradition hinders opportunities to enhance its role in DoD.

Illustrative draft TSO finding: “Mastering the Transitions.” The draft Transition to Stability Operations (TSO) report that I read and critiqued contains ambiguous findings that do not advance JCOA’s knowledge base about the employment of forces, or provide useful guidance to GOFOs preparing to meet the challenges of implementing national strategy in Afghanistan.

The finding “Mastering Transitions” begins with the PowerPoint-type bullet, “Strategic vision and clear guidance were crucial to facilitate the planning and execution of seamless transitions.” This sentence contains two types of problems. First, the sentence does not enable the reader to appreciate how commanders developed understanding and insight, because it assumes commanders had a clear understanding, prior to planning and operational implementation, of near-, mid-, and long-term goals and the way those goals were related. Second, the sentence assumes no previous growth or learning in units’ and commanders’ executable knowledge about how to achieve the full set of goals. Lessons learned and best practices developed from this finding do not alert commanders preparing to deploy to the complexity of and obstacles to coordinating multi-national forces, or to the difficulty of accomplishing desired goals.

13 Andrew W. Marshall, Memorandum for the Record, “Some Thoughts on Military Revolutions—Second Version,” 23 August 1993, p. 2. In two other memoranda about military revolutions, Marshall noted the national goal of leading the world in accomplishing intellectual tasks related to analysis of links between operational concepts and organizational innovation, but did not examine the reciprocal relationship between operational-level analyses and the growth of knowledge. Andrew W. Marshall, Memorandum for the Record, “Some Thoughts on Military Revolutions,” 27 July 1993; Andrew W. Marshall, Memorandum for the Record, “RMA Update,” 2 May 1994.

14 Mark D. Mandeles, Iraq Information Operations, April 2008–June 2009 (U) (JCOA, 2010), unpublished. Eighteen people reviewed and critiqued drafts of this report, but only one recognized that the capability to conduct analysis and assessment might be a useful assessment “metric” for military power.


Consider how GEN Raymond Odierno explained his vision of aligning influence activities, information operations, and kinetic operations, as described in the Iraq Information Operations report. First, he admitted that his understanding of the situation and the application of IO evolved. Second, he stated that he set “left and right boundaries” to actions in discussions with subordinate commanders, which allowed them to experiment within understood constraints. Frequent battlefield circulation enabled General Odierno to reformulate and clarify the left and right boundaries, and the initial formulation of the boundaries differed from those issued later. When he began to talk with subordinate commanders about aligning information and kinetic operations, he didn’t yet know how to execute the alignment, and his subordinate commanders invented some key techniques, e.g., Major General Hammond’s “flashlight effect.”

It is important to recognize that General Odierno assessed and learned while he commanded; his understanding of the situation—an implicit theory that applied actions and interventions to accomplish broad goals in the context of a dynamic environment—changed. His “strategic vision” evolved and his guidance was flexible (that is, the lines of operation were adapted to the task environment) rather than pre-determined and static. The TSO description of commanders’ business is less specific and less clear than that of the I2O report. The use of vague concepts has an important implication for the way JCOA analysts conduct their analyses: one cannot think clearly when using crude or ambiguous distinctions (or not recognizing a difference) among phenomena, events, or structures. In addition, to the extent that JCOA material is used to prepare new commanders for command, its usefulness rises and falls with the clarity or obscurity of the work JCOA produces.

Preview of issues in “Part 2.” In commencing a dialogue “about our Army,” GEN Martin E. Dempsey, commanding general, U.S. Army Training and Doctrine Command, appears to endorse the type of conceptual and empirical analysis developed in the I2O and JTF HQ reports. He acknowledges how difficult institutionalizing new patterns of thought, analysis, and conduct can be,15 and why the capacity to conduct operational analysis is indispensable to his larger goal of achieving institutional adaptation. Underlying all discussion relevant to producing “agile and adaptive leaders” is “realistic training and education.”16 Needless to say, realistic training and education assume a body of knowledge about how people behave in various situations and environments. The mere production and collection of time-, location-, and situation-dependent observations are inadequate substitutes for tasks that call for systematic knowledge and a clear understanding of the limits to that knowledge.

Answering combatant commanders’ stated needs does not preclude JCOA’s own experimentation on how to meet commanders’ needs, or how to develop a parallel JCOA research pathway that also will engage commanders intellectually. The next memo will compare the TSO, I2O, and JTF HQ studies, and develop a way to implement a research line of analysis parallel to the existing lessons learned/best practices tradition.

15 Dempsey, “Win, Learn, Focus, Adapt, Win, Again,” p. 25.

16 Dempsey, “A Campaign of Learning to Achieve Institutional Adaptation,” p. 35. General Dempsey added, “We must make the ‘scrimmage’ [training] as hard as the ‘game’ in both the institutional schoolhouse and at home station.” In doing so, the Army will develop “adaptive leaders who are comfortable operating in ambiguity and complexity,” and such leaders “will increasingly be our competitive advantage against future threats to our nation.” “Leader Development,” pp. 24–26.


From: Mark D. Mandeles
Date: 31 March 2011
Subj: JCOA and operational analysis for US forces and their commanders, Part 2

“Just the facts, ma’am.”
Jack Webb, as Sergeant Joe Friday, Dragnet radio and television drama

Background. “JCOA and operational analysis, Part 1” argued that useful assessment and analysis of operations and their outcomes for commanders (and their forces) requires empirical knowledge about behavior, including how people coordinate and collaborate, define and solve problems, and make decisions.17 Operational analysis conducted for combatant commanders would be of a higher quality and value if a growing body of empirical knowledge were available to guide research, data collection, assessment, and analysis. The JCOA practice of concentrating its efforts on developing lessons learned and best practices reinforces current research traditions and habits. These practices impede development of a complementary and parallel empirical research tradition that builds detailed knowledge about matters of concern to combatant commanders, such as how staff structures and processes (e.g., coordinating, problem-solving, and decision-making) affect mission achievement.

At least four advantages will accrue to JCOA from supplementing its work on lessons learned and best practices with research to build knowledge about problem-solving and decision-making. First, developing a dedicated knowledge-building research tradition will open for JCOA a unique niche for operational-level analysis in the Department of Defense.18 Second, access to knowledge about decision-making and problem-solving will provide useful context for JCOA’s lessons learned products, and sharpen insight into the diagnosis of challenges and the feasibility of recommendations. Third, building knowledge about operational-level matters responds to GEN Raymond Odierno’s suggestions to JCOA and to the wider national security community to identify and develop themes,19 and to study, learn, and build what is learned into doctrine.20 Finally, a research tradition to create knowledge entails a parallel effort of training people to think about inferences and rules of evidence, thereby helping officers who have served at JCOA to contribute more effectively to Service initiatives, such as the Army’s “Campaign of Learning.” The interwar naval aviation and Marine Corps amphibious landing communities did not purposefully set themselves the task of producing operational-level knowledge, but they did.21 In the process, they developed powerful new capabilities for their Services, and facilitated the education and training of many people who led U.S. forces to victory in Europe and the Pacific.22 Additional advantages may emerge from JCOA’s interaction with other DoD components regarding the role of knowledge and analysis in decision-making, research methodologies, and the growth of real-life knowledge about how people at all echelons think, decide, and interact in the stressful conditions of combat.

17 Political scientist James G. March, a renowned and long-time contributor to the fields of organization theory, management science, and public administration, complained that the Academy has forsaken empirical analyses of organizational behavior: “We know more about bounded rationality and the scarcity of attention as theoretical problems than we do about how organizations cope with them. … We know more about incremental hill climbing on an imagined surface using formally specified decision rules (or rules that learn) than we do about problem identification, problem solving, and change in the messy world of real organizations.” “Administrative Practice, Organization Theory, and Political Philosophy: Ruminations on the Reflections of John M. Gaus,” PS: Political Science and Politics, Vol. 30, December 1997, pp. 692–693.

18 A potential niche would be to develop knowledge, suitable to the scenarios and multiplayer on-line games built under the auspices of the Army’s “Training Brain” program, about individuals’ decision-making and problem-solving in real organizations. The goal of this program is to make Army training “more rigorous and relevant.” See GEN Martin E. Dempsey, “Leader Development,” Army, January 2011, pp. 26–27.

19 GEN Raymond Odierno, commanding general, MNF-I, Camp Victory, Baghdad, video teleconference with JCOA, Suffolk, 21 July 2009. Notes prepared by Mark D. Mandeles, 22 July 2009.

20 Ann Roosevelt, “Understanding ‘Why’ is Key to Military Operations, Odierno Says,” Defense Daily, 22 July 2010. General Odierno has emphasized a need to base doctrine on empirical description and assessment of operations on several other occasions, including in the video teleconference cited above.

Plan of attack. This memo compares how the I2O, JTF HQ, and the draft TSO reports contribute to the growth of knowledge about operational matters. Two questions preface the comparison. First, how does growth of knowledge support operational analysis? And, second, what concepts and themes, appropriate as initial context for studies, are available to JCOA?

A few comments about the feasibility of modifying JCOA’s approach to operational analysis, earlier conceived to be the core of “JCOA and operational analysis for US forces and their commanders, Part 3,” will be included in an epilogue in this, the “Part 2” memorandum. The turmoil accompanying the disestablishment of JFCOM, the reassignment of some JCOA divisions and functions to other DoD components, and impending budget and manpower limits should not prevent taking initial steps to create knowledge.

Developing knowledge supports operational analysis. The key implication of the increasing complexity of wartime decision-making, as General Odierno noted in listing the many factors relevant to leading COIN campaigns,23 is that commanders have to know a great deal about more topics than ever before. They have to oversee and manage staff structures and processes to propose lines of operation and calculate and compare impacts, interactions, and tradeoffs of many policies and programs. The complexity of aligning the commander’s staff structures, processes, procedures, and lines of operation with the task environment requires developing approaches to operational assessments and analyses that help commanders understand their mission(s); organizational structures, processes, and people; the operational environment; the ways and means to achieve desired ends; and the feasibility and wisdom of mission goals.

Empirical research on individuals’ decision-making and problem-solving has revealed cognitive limits on information-processing and memory. Such limits account for basic features of organizational structures and processes: structures, processes, and procedures are “mechanisms” people use to compensate for memory, calculation, and information-processing limits, and thereby, to accomplish complex tasks.24 One memory and information-processing aid for decision-making is the collection, synthesis, and organization of formulae and quantities of information too great to carry around in one’s head. Thus, contemporary scientific research (to extend the boundaries of what is known) and engineering (to improve existing products or processes, and to create new ones) presuppose the availability of authoritative and systematically organized data. For example, the CRC Handbook of Chemistry and Physics, 91st Edition, 2010–2011, provides authoritative listings of fundamental physical constants and of material properties, such as those of ionic liquids and solids; the CRC Handbook is updated annually. Architects and engineers rely on similar handbooks that represent the accumulation of extensive research and experience, and contain codified, detailed data and knowledge necessary to create or develop all sorts of products and structures. Military targeteers rely on “joint munitions effectiveness manuals” to relate targets, munitions, and level of damage or impairment. The logic of grounding operational assessment on authoritative data is straightforward. Likewise, the logic of how to conduct operational analysis and assessment of messy real-world military problems is straightforward; a key requirement is developing systematic, accessible, reliable, and valid data at the appropriate level of detail.25

21 Recounting the stories of how the naval aviation community and the Marine amphibious landing community produced operational-level knowledge would require a long and detailed digression into how people in many offices interacted, argued, produced analyses, conducted strategic and tactical war games, and carried out exercises and experiments. These two examples of innovation are described and explained in Mark D. Mandeles, Military Transformation Past and Present (Westport: Praeger Security International, 2007).

22 Thomas C. Hone, Norman Friedman, and Mark D. Mandeles, American and British Aircraft Carrier Development, 1919–1941 (Annapolis: Naval Institute Press, 1999); Mandeles, Military Transformation Past and Present.

23 Ann Roosevelt, “Understanding ‘Why’ is Key to Military Operations,” Defense Daily, 22 July 2010.

24 The research program on individual-level decision-making and problem-solving that has elaborated these ideas has guided research in many social science disciplines, including economics, political science, and psychology. Several researchers working in that research program have been awarded Nobel prizes in economic science, including Herbert A. Simon and Daniel Kahneman.

The deliberate effort undertaken by Wilbur and Orville Wright to explain the flight-test failure of their second glider in 1901 illustrates the reciprocal relationship among data, knowledge, and operational analysis. They had used Otto Lilienthal’s aerodynamic data and Smeaton’s coefficient to compute their gliders’ lift and drag, but both gliders experienced problems with lift and control.26 Mindful of Lilienthal’s death in the crash of a glider he had designed using Smeaton’s coefficient and data he had collected, Orville and Wilbur Wright were puzzled by their glider’s behavior. In particular, their lift and drag equations did not predict the glider’s behavior. These anomalies prompted the Wright brothers to do a simple operational analysis of their effort to invent a powered and controllable flying machine. They determined to test the accuracy of Lilienthal’s data and of the Smeaton coefficient, and to correct any deficiencies identified in the then-current understanding of aerodynamics. The Wrights developed and built wind tunnels, which they perfected as they conducted systematic research on approximately 200 model wing shapes. This research had two key outcomes. First, it led to the calculation of a coefficient for lift and drag equations more accurate than the Smeaton coefficient. Second, the Wrights developed systematic empirical data that informed the selection of the most efficient wings for their 1902 glider, and the 1903 powered, pilot-controlled aircraft.
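The arithmetic at issue can be made explicit. The following is a minimal sketch using the standard historical reconstruction of the period lift and drag equations and the commonly cited coefficient values; the notation and numbers are not drawn from the sources cited in this memorandum, so treat them as illustrative.

% Period lift and drag relations (standard historical reconstruction).
% k is Smeaton's coefficient of air pressure, S the wing area (sq ft),
% V the airspeed (mph), and C_L, C_D coefficients from Lilienthal's tables.
\[
  L = k\,S\,V^{2}\,C_{L}, \qquad D = k\,S\,V^{2}\,C_{D}
\]
% The long-accepted value was k of about 0.005; the Wrights' wind-tunnel
% research implied a value near the modern figure of roughly 0.0033.
\[
  k_{\text{Smeaton}} \approx 0.005, \qquad k_{\text{Wright}} \approx 0.0033
\]

For the same wing area, airspeed, and tabulated coefficient, the corrected value of k predicts roughly one-third less lift than the value the Wrights began with, which is consistent with the disappointing performance of the 1900 and 1901 gliders.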

This very brief vignette about how Orville and Wilbur Wright pioneered manned, controlled flight illustrates the indispensable role of systematic knowledge in setting problems and accomplishing tasks, and the need to build knowledge (through observation, experimentation, and analysis) if data—or data having the required level of detail—are unavailable. Absent such knowledge, solutions to problems must depend on less efficient and less effective approaches: trial-and-error and serendipity.

25 The logic of operational analysis and assessment is analogous to a rules-based logic of action. The concept of “logic of action” is examined in Mark D. Mandeles, Imposing Order on Chaos: Establishing JTF Headquarters (unpublished, December 2010), Unclassified.

26 In 1759, John Smeaton (sometimes called the “father of civil engineering”) described the mechanics of waterwheels and windmills with a coefficient that related pressure to velocity in a paper entitled “An Experimental Enquiry Concerning the Natural Powers of Water and Wind to Turn Mills and Other Machines Depending on Circular Motion.” By 1900, Smeaton’s coefficient had been accepted as true for 140 years. http://en.wikipedia.org/wiki/John_Smeaton, accessed 1 March 2011.


Availability of concepts and themes. Collecting “facts,” as Sgt. Joe Friday reminded witnesses, is necessary and central to investigation. In operational analysis, “facts” are not discrete actions or functions. The facts of operational analysis and assessment include interactions among people, organizational components, and echelons, and the outcomes of activities and actions across a command’s lines of operation. As General Dempsey remarked only a few years ago, “context matters.”27 Consequently, identifying operational-level facts entails questions and types of effort different from those that examine and critique tactics, techniques, and procedures. Two World War II reports illustrate this point. The first concerns Operation Torch. The second involves an effort to transform random observations of tank vulnerabilities into a systematic program to integrate operational experience into the design and production processes.

Operation Torch. Lessons learned reports focus on tactical issues. Headquarters staffs began to assess materiel and organizational shortfalls in tactical engagements shortly after the November 1942 start of Operation Torch, the Anglo-American invasion of North Africa. For example, the Eastern Assault Force’s memorandum, “Lessons from Operation ‘Torch,’” was transmitted to the War Department General Staff on 26 December 1942. This memorandum’s lessons highlighted equipment vulnerabilities (such as the fragility of the lensatic compass) and unanticipated organizational shortcomings (such as requirements for a pool of linguists available for assignment to units, and a prisoner of war collecting section). The memorandum also examined the adequacy of pre-invasion processes and functions, such as training and developing doctrine. For instance, the report noted that:

Our great weakness is the lack of adequate doctrine and technique for amphibious operations. This is especially true of the means and methods to be employed by Combat Teams and small units. The remedy appears to be to organize a training center employing officers from our divisions which have had combat amphibious experience, and there develop a technique which is suitable for our organization, and for equipment and for the amphibious missions which our Army may be called upon to perform.28

The lessons learned report’s recognition that the Army lacked amphibious landing doctrine did not prompt questions about the applicability of the Marine Corps’ amphibious doctrine—a product of more than 20 years of serious work prior to 1942—to Army operations in the Mediterranean theater.

GEN Dwight D. Eisenhower’s Report on Operation ‘Torch’, a narrative administrative history of Torch, was written partly to identify and assess errors: “our mistakes, some of which were serious, may be less apparent at this moment, and, in the interest of future operations, they should be subjected to dispassionate analysis.”29 The report examined organizational structures and functions, and their contributions to mission success, from a tactical rather than an operational-level perspective. For example, Eisenhower proposed the concept of “unity of command” as instrumental to planning and executing the overall campaign. In his words, “throughout the whole North African campaign we had in our plan of operations but one ultimate objective, our operations were to be under one commander, and they were to be controlled by a staff of dual nationality.”30 In this respect, Eisenhower was determined to make “unity of command” meaningful and real.

27 Remarks by GEN Martin E. Dempsey, commanding general, U.S. Army Training and Doctrine Command (TRADOC), at the Association of the U.S. Army’s Chapter Presidents’ Dinner, Washington, D.C., 4 October 2009.

28 Memorandum, Major General Charles W. Ryder, Commanding General, Eastern Assault Force, Subject: Lessons from Operation ‘Torch,’ Date: 26 December 1942, p. 2. This memorandum was forwarded through Brigadier General H. R. Lewis, Acting Adjutant General, to Assistant Chief of Staff, G-2, War Department General Staff (WDGS); Assistant Chief of Staff, G-3, WDGS; and Assistant Chief of Staff, G-4, WDGS, Date: 24 January 1943.

29 GEN Dwight D. Eisenhower, Report on ‘Torch’, n.d., circa 1944, http://handle.dtic.mil/100.2/ADA438657, p. 48.

Alliances in the past have often done no more than to name the common foe, and ‘unity of command’ has been a pious aspiration thinly disguising the national jealousies, ambitions and recriminations of high-ranking officers, unwilling to subordinate themselves or their forces to a commander of different nationality or different service.31

A tactical perspective on staff structures and processes assumes that all individuals—and their organizations—share the same set of primary and subsidiary goals, and the same understanding of how they contribute to mission success. Therefore, any lessons identified would concern issues tangential to the primary assumption, rather than whether the primary assumption correctly described the reality of coalition planning and coordination, with its hidden agendas and incentives.

Unfortunately, the assumption was manifestly false; the individuals and organizations assembled in the coalition to conquer North Africa shared only the overarching goal of defeating the Germans. They did not share the same set and ranking of sub-goals. Subordinate commanders had markedly different understandings of how to defeat the Germans in North Africa, and they viewed opportunities and challenges from the perspectives of—among other factors—their military occupational specialties, experience, national prerogatives, and individual egos. Yet, Eisenhower’s report did not examine such issues for combatant commanders: for instance, how coordination was effected, how bargains among the relevant people were struck, what outcomes resulted from such bargains, and under what circumstances analytic processes played a greater role in commanders’ decision-making than bargaining and negotiation.

Eisenhower did not report what happened at headquarters below his own,32 so he did not set conditions for an operational-level research tradition by deliberately examining issues appropriate to interaction between the commander and subordinate operational commanders. For example, the Operation Torch report did not:

• Recommend examining existing Marine Corps amphibious landing doctrine to determine whether it would be appropriate for operations in the Mediterranean (or to invade Europe),

• Analyze tradeoffs between overwhelming firepower and achieving tactical surprise,33

• Describe, catalogue, and clarify the varying conditions and outcomes of implementing the unity of command concept, or

• Examine the location and operation of negative feedbacks or other error-correction and adaptation processes.34

30 Eisenhower, Report on ‘Torch’, p. 9.

31 Eisenhower, Report on ‘Torch’, p. 1.

32 Eisenhower, Report on ‘Torch’, p. 38.

33 In fact, the failure of Supreme Headquarters Allied Expeditionary Force in London to review Marine Corps amphibious landing doctrine and determine its applicability to Operation Overlord was responsible for many planning and implementation errors on 6 June 1944. Adrian R. Lewis, Omaha Beach: A Flawed Victory (Chapel Hill: University of North Carolina Press, 2001).


The Operation Torch report is symptomatic of analyses of operational- and strategic-level issues in which the authors—and those who commissioned the reports—do not deliberately seek to develop and build knowledge, e.g., about decision making, delegation, and coordination. Instead, coalition forces operate with “pious aspirations” to establish unity of command even while they make ad hoc accommodations to organizational realities of coordination and implementation.

Operational assessment for feedback. GEN George S. Patton, Jr., the noted combat commander, was also his own operational analyst, and he was mindful of the interplay between combat operations and the design and production of materiel on the home front. On 30 January 1945, following the Battle of the Bulge, Generals Troy H. Middleton (VIII Corps commander) and Patton (Third Army commander), conducting battlefield circulation, drove to Saint-Vith, Belgium. Patton described what they saw and the feedback process he initiated.

[We] passed the scene of the tank battle during the initial German break-through. I [Patton] counted over a hundred American armored fighting vehicles along the road, and, as a result issued an order, subsequently carried out, that every tank should be examined and the direction, caliber, and type of hit which put it out made of record, that we would have data from which to construct a better tank. These data are now in the hands of the Ordnance Department.35

Patton’s description of this initiative to gather data so the Ordnance Department could use it to identify and correct equipment design errors is casual. But his insight was remarkable. Patton’s focused attention on military victory did not obscure his understanding of relationships among tactics, armored vehicle vulnerabilities, weapons design, and materiel development and production. One might expect that people leading the effort to design and produce armored vehicles would want to know about vulnerabilities of the materiel they produced, and, before the vehicles were deployed, to have proposed a process to collect data appropriate to correcting flaws and improving their designs. Yet, Ordnance Department leaders did not ask themselves whether data from operations would be relevant to improving vehicle performance and the design and production processes. Apparently, other senior War Department officials, consumed by their own daily tasks, also did not look at the intersection between combat and production lines of operation to inquire into mutual impacts of vehicle design, tactics, and operational success. That task of integration fell to a commander who thought about how to fight, the organizational infrastructure required to produce equipment, and the strengths and vulnerabilities of equipment he committed to battle.
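The feedback loop Patton set in motion can be made concrete with a minimal sketch of the record-keeping his order implies. The three fields come directly from his own list (direction, caliber, and type of hit); the field names, the tallying scheme, and the sample data are hypothetical illustrations, not a description of any actual Ordnance Department process.

from collections import Counter
from dataclasses import dataclass

@dataclass(frozen=True)
class HitRecord:
    """One examined vehicle loss: the three facts Patton ordered recorded."""
    direction: str   # aspect from which the round struck, e.g., "front", "flank"
    caliber_mm: int  # caliber of the round that achieved the kill (illustrative units)
    hit_type: str    # e.g., "penetration", "track", "fire"

def vulnerability_tally(records: list[HitRecord]) -> Counter:
    """Aggregate losses by (direction, caliber, type) so that designers can
    see which combinations defeat the vehicle most often."""
    return Counter((r.direction, r.caliber_mm, r.hit_type) for r in records)

# Illustrative data only; these are not historical figures.
sample = [
    HitRecord("flank", 75, "penetration"),
    HitRecord("flank", 88, "penetration"),
    HitRecord("front", 88, "penetration"),
    HitRecord("flank", 75, "penetration"),
]
for combo, count in vulnerability_tally(sample).most_common():
    print(combo, count)

The point of the sketch is only that a fixed, shared record format is what lets scattered field observations accumulate into design-relevant evidence; the specific schema would have been the Ordnance Department’s to define.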

This anecdote’s “take-away” is Patton’s recognition of the need to gather data and information relevant to the operational and strategic tasks of defeating the Germans. Of course, understanding the need to collect data on armored vehicle vulnerabilities is simpler than identifying, collecting, and assessing data relevant to diagnosing problems, for example, in the establishment of JTF HQs. But an empirical outlook underlies each effort (i.e., to employ empirical description and explanation rather than consult Aristotle’s texts; see the section on the University of Paris in the “Part 1” memorandum). Useful operational assessments require data collection about the unique situation under review, and data collection guided by concepts or themes that are not defined by specific location, time period, and circumstances. The use of concepts to conduct operational analysis is complemented by analytical methods that are not designed for neat laboratory problems.36

34 Contemporary concepts of “feedback,” “adaptation,” and “adaptive learning” are new names for the mid-1930s public administration concepts of “pre-audit” and “post-audit.” The concepts of pre-audit and post-audit were proposed by reformers to achieve greater efficiencies (and, sometimes, greater effectiveness) in the expenditure of public funds. These concepts were applied widely to local, state, and federal government departments, and were discussed and promoted in the popular press, academic journals, and government reports. Wartime planners—pre-war professional military and post-December 1941 inductees from academia and business—knew about administrative concepts to promote efficiency and effectiveness.

35 George S. Patton, Jr., War as I Knew It (New York: Houghton Mifflin, 1975), pp. 179–180.

Potential themes for operational analysis and assessment. An initial set of themes and concepts to organize research on operational-level problems includes the following:

• Battle for the narrative to coordinate kinetic and non-kinetic operations
• Centralization-delegation tradeoffs
• Communication of intent
• Coordination in a joint or multi-national headquarters
• Feedback, adaptation (e.g., “flexible human intelligence”), and error-correction strategies
• Informal organization and trusted agents
• Human rationality (e.g., means-ends analysis)
• Logics of action
• Methodological and operational implications of data collection time-lags
• Organization-environment links
• Link between organizational structure and types of organizational error (such as “groupthink”), and
• Unity of effort

This list of themes and concepts is neither exhaustive nor comprehensive, and it could be expanded or shortened as insight matures about operational problems. The themes also do not divide along academic disciplinary boundaries. Progress in describing and explaining how structures and processes affect operations requires an interdisciplinary approach. The list’s virtues include addressing different aspects of individual and headquarters decision-making and problem-solving, including the interaction between individual and organizational levels of analysis. In any event, the concepts should be clarified by subsequent work, for example, by showing when and how the concept may be applied.

The list also includes concepts that can be used to link JCOA studies that already have been developed as stand-alone products. For example, a cursory look at the CTI and JTE studies37 shows that they address four themes: centralization-delegation tradeoffs; feedback, adaptation, and error-correction strategies; organization-environment links; and informal organizations and trusted agents. A closer examination of these two reports might show that other themes, such as “link between organizational structure and types of organizational error,” could provide insight into how operational practice evolved and opportunities to mitigate challenges and improve performance.

Many of the concepts and themes listed above were also addressed in the I2O and JTF HQ reports, and the examination of these concepts was shaped by relevant academic research. For example, academic and think tank research on human and organizational rationality has explored how ways and means are devised, tested, and assessed to achieve ends. The I2O report illustrated commanders’ and staffs’ means-ends reasoning in aligning kinetic and information operations to implement General Odierno’s intent. The coordination of kinetic and information operations was not directed by detailed doctrinal guidance to commanders and staffs about what to do and how to do it. General Odierno revealed that he tried to be careful not to allow information operations doctrine to guide his actions. He operated on “commander’s feel and intuition.”38

36 I drafted discussion for the “Part 3” memorandum of how WWII operational analysis was able to deal with real-world messiness, but I could not include it in the “Part 2” epilogue due to time constraints.

37 JCOA, Counterinsurgency Targeting and Intelligence, Surveillance, and Reconnaissance (October 2007–December 2007); and JCOA, Joint Tactical Environment (March–July 2008).

In particular, the I2O report explained how General Odierno influenced and guided the application of means-ends reasoning to lines of operation: “General Odierno’s [non-doctrinal, but] simple and clear definition [of IO] made it easier for subordinate commanders and staffs to think about matching available means to ends.”39 The I2O report’s approach to examining means-ends reasoning anticipated Secretary of Defense Robert M. Gates’ January 2011 memorandum on how to employ IO: “Successful IO requires the identification of information-related capabilities most likely to achieve desired effects and not simply the employment of a capability.”40 The similarity between the way Secretary Gates described the goal of integrating IO into other lines of operation and the I2O report’s description of how General Odierno aligned kinetic and information operations suggests that personnel at the highest levels of DoD are receptive to operational-level analysis and assessment.

The final justification for developing and conducting research around a set of themes or concepts is that the themes provide an agenda around which JCOA leaders and analysts can engage intellectually with combatant commanders and their staffs.41 Building knowledge is a path-dependent process: the state of knowledge at any one time depends on what is already known. Hence, during research, when new themes are proposed or concepts are clarified, previously obscure relationships between concepts and problems become clear.

Research and analysis begin with vague and ambiguous terms and hypotheses, and this situation applies with even greater force to studies of social, political, and organizational systems. In 1946, Herbert Simon (a subsequent Nobel laureate) argued that the advice provided to organizations’ leaders had the form of “proverbs,” and that these proverbs could be used to justify and rationalize any position chosen.42 Others have made similar observations.43 Writing in 1949, Bernard Brodie could have been describing a briefing prepared today in the Pentagon when he stated that “the military profession is by no means alone in its frequent recourse to the slogan as a substitute for analysis—certain scholarly disciplines, [including] political science, have been more than a little untidy in this regard—but among the military we find some extreme examples of its ultimate development.”44 Yet, since the growth of knowledge is path-dependent, the choice and clarity of the concepts used in studies are very important. Of course, the widespread use of ambiguous concepts has a central role in debates over roles, resources, and organization in Washington (and elsewhere) because fuzzy assertions can be used to support or rationalize almost any proposed action or action taken. Nevertheless, vague language in reports about military operations is not especially useful to combatant commanders or operational forces. In addition to not knowing what actions, processes, and organization are associated with mission success, it remains unclear which actions do not work or why an event was a surprise.45

38 GEN Raymond Odierno, comments on “I2O Quicklook Brief,” 29 June 2009, Camp Victory, Baghdad, Iraq.
39 Mandeles, Iraq Information Operations, April 2008–June 2009, February 2010, p. 17; see also p. 2.
40 Memorandum, Robert M. Gates, Secretary of Defense, “Strategic Communication and Information Operations in the DoD,” 25 January 2011, p. 2.
41 JCOA lore about the CTI study reveals negotiation between General Petraeus (the requester) and JCOA over the problem statement. General Petraeus’s initial question concerned what tactics, techniques, and procedures were transferable to general purpose forces. JCOA’s COL Bill Dolan and BGEN Jim Barkley, in discussion with General Petraeus, concluded that the formulation of the initial request was misleading. Instead, they decided to examine how ISR was integrated on the battlefield, and this research provided General Petraeus the analytic support necessary to clarify his testimony to Congress.
42 Herbert A. Simon, “The Proverbs of Administration,” Public Administration Review, Vol. 6, Winter 1946, pp. 53–67.
43 For example, C. P. Snow noted, “Most of the concepts that administrative theorists use are at best rationalizations, not guides to further thought; as a rule they are unrealistically remote from the workaday experience.” C. P. Snow, Science and Government (New York: Mentor Book, 1962), p. 11.
44 Bernard Brodie, “Strategy as a Science,” World Politics, Vol. 1, July 1949, p. 471.

Conscious growth of knowledge through research may begin with ambiguous and vague observations, but the way forward relies on deliberate observation (natural and experimental) and analysis to clarify themes and concepts. This general strategy was noted in the “JCOA and operational analysis, Part 1” memo: one cannot think clearly when the distinctions among behaviors, phenomena, events, and structures are unclear.

Comparison of themes in three JCOA reports. JCOA’s response to conceptual ambiguity should be to treat each study topic as an example of a mixture of concepts and themes. In doing so, the themes and concepts under examination would be informed by previous work, and the current work would identify and assess the generalizability of those themes and concepts to other situations and circumstances. For example, compare how the I2O report,46 the JTF HQ report,47 and the October 2010 TSO draft48 examined the concepts of ad hoc organization and trusted agents.

Ad hoc organization. The I2O report observed that ad hoc structures and processes were created to fit IO functions into division HQ staffs and to coordinate IO and kinetic operations. Pre-conflict doctrine was silent or ambiguous about the functions and activities that division commanders determined they should conduct (p. 17). Division commanders’ increasing reliance on IO stimulated the creation of additional ad hoc structures and processes (p. 18). Ad hoc structures and processes were created to implement “battle for the narrative” initiatives and some targeting methodologies. But creating ad hoc structures and processes also added unpredictability to high-tempo, high-consequence operations (pp. 26–28).

The JTF HQ study extended the I2O report’s discussion of ad hoc structures and processes: the characteristics of ad hoc organization were described in greater detail, the concept of ad hoc organization was distinguished from formal organization, and some vulnerabilities of ad hoc organization were identified. In addition, General Odierno’s preference for formal over ad hoc structures and processes was discussed (pp. 13–19).

The TSO report examined the ad hoc Advise and Assist Brigades, but did not critique, correct, or reference the concept of ad hoc structures or processes. By default, the JCOA research tradition empowers each study team to choose the concepts that inform and guide its research. I support this aspect of the JCOA tradition with the following caveat: the development of a JCOA research program for operational assessment assumes a set of core concepts to organize studies, and a study team’s decision to examine some subset of these core concepts should be justified in a report appendix.

45 A short digression is necessary here to discuss JCOA’s uses of the term “best practices.” The word “best” implies a comparison of practices, e.g., under similar circumstances, at other places, and at other times. “Best practices” also implies the existence and use of analytic and assessment criteria to evaluate and place in rank order (with respect to achieving a goal) a set of actions, lines of operation, or policies. JCOA should not identify a practice as “best” if the practice has not been compared to other practices.
46 Mandeles, Iraq Information Operations, April 2008–June 2009, February 2010.
47 Mandeles, Imposing Order on Chaos: Establishing JTF Headquarters, 8 September 2010.
48 Transition to Stability Operations Study, 8 October 2010, draft.

Trusted agents. The I2O report described how person-to-person interaction facilitated coordination and collaboration (p. 3), and how daily contact enhanced the prospects of coordination and collaboration (p. 17). The JTF HQ study extended the I2O report’s discussion of trusted agents: characteristics of trusted agents were listed, and types of trusted agents were identified, such as recently established versus long-standing relationships. This discussion generated a hypothesis: being first on the scene gives joint deployable team (JDT) members situational awareness, which they may share with the arriving JTF staff, thereby establishing initial bonds of trust and facilitating cooperation and coordination (pp. 5, 8–9, 15–16). The practical implication of this hypothesis is that ADVON members of a JDT assigned to help establish a JTF HQ should try to deploy to the theater in advance of the incoming JTF HQ staff.

The draft TSO report discussed how training units embedded staff with in-theater counterparts, but did not analyze or assess the role of trusted agents in training or in the administration of programs to transition authority to Iraqis. The role of trusted agents was mentioned in the section on key leader engagement: the Advise and Assist Brigades deliberately emphasized building personal relationships with ISF personnel, and these relationships sustained the ISF’s development. Yet the TSO report did not reference, critique, or correct the theme of trusted agents developed in the earlier studies.

My draft notes for this “comparison of themes” section include comparisons of the three JCOA reports on eleven concepts and themes. Although I truncated the discussion, the basic argument remains: the work of JCOA’s talented, dedicated, and hard-working analysts is hindered by a JCOA research tradition and habits that conceive each study as a stand-alone product and permit the repetition of vague and ambiguous observations across studies. The draft TSO report did not reference, apply, or correct any of the findings and observations developed in previous JCOA studies. JCOA analysts could reduce conceptual ambiguity by building on the observations and findings of previous reports.

Epilogue. My purpose in these two “JCOA and operational analysis” memoranda has been to describe an approach to operational analysis and assessment of military operations different from the approach used by other analytic organizations, and to show the feasibility, and desirability, of consciously organizing the research process to build knowledge. I will close with a few words about how I came to set boundaries for the operational analysis and assessment mission.

State and Process Description. One approach to the design of organizational structures and processes is to contrast their “state” and “process” descriptions. A state description models the structures and processes under examination, or sets criteria to identify them; it is a description of the world as sensed. Examples of state descriptions include chemical structural formulas, blueprints, and organizational charts. A process description provides the means to produce or create structures and processes that have desired characteristics; it is a description of the world as it is acted upon. Examples of process descriptions include equations for chemical processes, recipes, and functions.49 The “JCOA and operational analysis” memoranda provided a process description of an operational analysis research tradition. The I2O and JTF HQ narrative reports provided a state description of operational analysis and assessment. Together, the state and process descriptions show that it is feasible to reorient the JCOA research tradition.

49 Herbert A. Simon, The Sciences of the Artificial, 3rd edition (Cambridge: The MIT Press, 1996), pp. 210–211.
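
A minimal sketch, in Python and with an invented headquarters example, may make Simon’s distinction concrete: the data structure below is a state description (the organization as sensed), while the function is a process description (a recipe that produces an organization with the desired characteristics). Neither reflects any actual JTF structure.

    # State description: a static chart of a (hypothetical) headquarters.
    hq_chart = {
        "Commander": ["Chief of Staff"],
        "Chief of Staff": ["J2 Intelligence", "J3 Operations", "J5 Plans"],
    }

    # Process description: a procedure for producing a chart with the
    # desired characteristics (a commander, a chief, and staff sections).
    def build_hq(commander, chief, sections):
        return {commander: [chief], chief: list(sections)}

    # The process, executed, yields the state.
    assert build_hq("Commander", "Chief of Staff",
                    ["J2 Intelligence", "J3 Operations", "J5 Plans"]) == hq_chart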

Methodology. The Services’ lessons learned programs are organized around the collection of reliable and valid empirical data and information about operations. For observations at the operational level, however, the lessons learned programs’ approach to research—i.e., their focus on data collection schedules and tempos—creates pitfalls for assessment and analysis. For example, one pitfall occurs when a study team conducts so many interviews that it does not have time to properly transcribe and annotate the discussions. This situation is an example of what General Mattis called “over-proceduralization,” in which commanders and staffs value process over the aggregation of useful data. Unchallenged adherence to a preferred procedural template precludes the “critical and creative thinking directed at understanding, visualizing, and describing complex problems.”50

A related pitfall occurs when data collection is guided by unarticulated theories, hypotheses, and assumptions about correlations, relationships, and causality. As a result, the analysis and assessment may not address the proper echelon or level of analysis,51 or may obscure thought about relationships between data and research questions.

The Way Ahead. Regardless of the wording of the next JCOA mission statement, it seems to me that JCOA’s mission should be to provide combatant commanders operational-level analyses and assessments of (1) military operations, (2) military organization, processes, and procedures, and (3) the evolving and reciprocal relationship between the security environment and military capabilities and actions.

JCOA should develop a broad format for studies that would include (1) historical and conceptual context for the topic under consideration, (2) clarification of explanatory concepts or causal mechanisms, and (3) treatment of the phenomena under examination as examples of known processes, types, concepts, problems, or phenomena.

Before data collection begins, JCOA should detail analysts to review and summarize what is known about relevant themes and concepts at a minimum of two levels of analysis (e.g., individual and organizational). If interest and resources permit, JCOA also should examine the themes and concepts with respect to other levels of analysis, such as multi-organizational systems or inter-agency coordination. The resulting collection plan should be vetted or reviewed by persons internal (or external) to JCOA who are sympathetic to JCOA’s mission, appreciate the many difficulties of doing this type of research, and possess substantive knowledge, for example, about social and political institutions, organizational behavior, and individual decision-making and problem-solving processes.
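
A minimal sketch, again in Python and with invented entries, suggests how such a pre-collection review summary might be organized by theme and level of analysis, and how gaps in coverage could be flagged before the collection plan is vetted:

    # Hypothetical review summary: theme -> level of analysis -> what is known.
    review_summary = {
        "trusted agents": {
            "individual": "Findings on person-to-person trust formation.",
            "organizational": "Findings on liaison roles and ad hoc teams.",
        },
        "centralization-delegation tradeoffs": {
            "individual": "Findings on commanders' span of attention.",
            "organizational": "Findings on delegation in joint headquarters.",
        },
    }

    def gaps(summary, required=("individual", "organizational")):
        """List (theme, level) pairs not yet covered by the summary."""
        return [(theme, lvl) for theme, levels in summary.items()
                for lvl in required if lvl not in levels]

    print(gaps(review_summary))  # -> [] once both required levels are covered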

Propositions should be selected from a summary of themes and concepts, and hypotheses prepared about what we should expect to find (and not to find) at each level of analysis. JCOA would then be in a better position to identify the concepts/themes, hypotheses, and metrics that will form a study’s line of analysis. This phase would also serve JCOA’s internal needs, since the results of each study will contribute to the content of JCOA’s research program and form the basis of JCOA’s outreach to DoD and non-DoD audiences. A JCOA-wide review and assessment of study findings, as examples of concepts, themes, and hypotheses, should serve as input for follow-on studies. The internal use of study findings should (1) inform collection plans, interview protocol analyses, and assessments of follow-up studies, and (2) provide the basis for papers or research products submitted to military professional journals (e.g., Naval War College Review, Joint Forces Quarterly, or Military Review), an on-line magazine, or a monograph series to engage the wider civilian research communities, e.g., the National Science Foundation or the National Academies of Science.

50 General James N. Mattis, “Vision for a Joint Approach to Operational Design,” 6 October 2009, pp. 1–2.
51 For example, military units’ information requirements vary by task and combat echelon, and information collected to answer questions posed by the leader of a tactical unit does not necessarily address problems considered at higher-level headquarters (and vice versa).