Booz Allen Innovation Report

Innovation in Government
Booz | Allen | Hamilton

Morgan Capilla, Joe Fleming, Chenying Huang, and Katherine Knight
April 23, 2015

SANFORD SCHOOL OF PUBLIC POLICY | DUKE UNIVERSITY

This student presentation was prepared during the spring of 2015 in partial completion of the requirements for PUBPOL 804, a course in the Master of Public Policy Program at the Sanford School of Public Policy at Duke University. The research, analysis, policy alternatives, and recommendations contained in this report are the work of the student team that authored the report, and do not represent the official or unofficial views of the Sanford School of Public Policy or of Duke University. Without the specific permission of its authors, this report may not be used or cited for any purpose other than to inform the client organization about the subject matter. The authors relied in many instances on data provided to them by the client and related organizations and make no independent representations as to the accuracy of the data.


Table of Contents

Overview
Case Studies
  1. Boston's Mayor's Office of New Urban Mechanics
     Best Practices
     Limitations
  2. The New York City Center for Economic Opportunity
     Best Practices
     Limitations
  3. Defense Advanced Research Projects Agency
     Best Practices
     DARPA Hard Test
     Limitations
  4. Xerox's Palo Alto Research Center
     Best Practices
     Limitations
  5. Office of Personnel Management Innovation Lab
     Best Practices
     Limitations
Recommendations
  1. Absorb Risk on Behalf of the Government by Piloting Projects
  2. Build Small Innovation Teams to Pilot Projects
  3. Innovation Teams Should Crowd and Open-Source Ideas
  4. Scale Up Successful Pilot Projects
  5. Develop Meaningful Interim Performance Metrics
Conclusion
Works Cited


Overview

Federal agencies often struggle to innovate the services that they provide to American citizens. The government faces such barriers as: 1) a failure to mitigate risk; 2) difficulty in managing the impact of failures; 3) slow decision-making processes; 4) an inability to adapt to changing circumstances; and 5) a lack of collaboration across agencies (Criado et al., 2013, p. 324; Hull et al., 2006, p. 59; Mergel, 2012, p. 282; Torfing et al., 2011, p. 845).

On behalf of Booz Allen Hamilton (BAH), we researched the question: How should BAH effectively spur innovation in federal agencies?

To answer this question, we highlight the best practices and limitations of the following government and private-sector innovation teams:

1. Boston's Mayor's Office of New Urban Mechanics (MONUM)
2. The New York City Center for Economic Opportunity
3. The Department of Defense's Defense Advanced Research Projects Agency (DARPA)
4. Xerox's Palo Alto Research Center (PARC)
5. The Office of Personnel Management (OPM) Innovation Lab

We identified the most successful practices from each example and analyzed how these practices could best fit BAH's own innovation strategy. We have made five recommendations to BAH based on our five case studies:

1. Absorb risk on behalf of the government by piloting projects. Innovation teams should pilot projects that absorb government risk. An innovation team can mitigate government risk by creating a physical space that distinguishes it from the rest of the agency. Innovation teams can fully test pilot projects and protect the government from implementing failed projects. Teams that pilot projects to absorb risk will benefit from engaging in public-private partnerships, which allow diverse perspectives to flow into the organizations.

2. Build small innovation teams to pilot projects. We recommend that BAH's innovation teams consist of 5-8 people with diverse experiential knowledge. Team members should be adaptable and flexible, and project managers should have strong academic and leadership backgrounds.

3. Crowd- and open-source ideas from within BAH and government agencies. BAH and government agencies should crowdsource ideas from within their organizations through internal forums. The government will be able to collaborate in the innovation process without having to bear the responsibility of producing innovative solutions entirely on its own.

4. Scale up successful pilot projects. BAH should then scale up successful pilot projects. BAH and the government will share resources to increase the likelihood of project success and reduce the cost of scaling.

5. Develop meaningful interim performance metrics. BAH should adapt performance tests, such as the DARPA Hard Test, to measure a project's potential innovation and ensure that pilots are achieving the desired outcomes. Performance metrics will save time as well as funds and direct resources toward successful projects.


Case Studies

1. Boston's Mayor's Office of New Urban Mechanics

The Boston Mayor's Office of New Urban Mechanics (MONUM) is a government innovation team that offers creative solutions to challenges within city government. MONUM engages public and private participants in its innovation projects. It focuses on using new technologies to improve public services (Puttick et al., 2014, p. 42).

In 2009, Boston Mayor Thomas Menino created MONUM as a way to drive innovation within city government without taking city employees away from their primary duties. While MONUM collects, pilots, and scales projects, regular city employees can remain focused on their daily responsibilities (Puttick et al., 2014, p. 42). As Menino's former Chief of Staff Mitch Weiss puts it, "MONUM allows us to be 'ambidextrous'—to have agencies focus on the traditional work residents expect but meanwhile also experiment on more boundary-pushing, citizen-engaged innovation" (Goldsmith, 2012).

With five highly skilled team members and an annual budget of around $600,000, MONUM works on issues related to education, civic engagement, city design, and economic development (Puttick et al., 2014, p. 104; New Urban Mechanics, 2015). In each area, team members form partnerships with public, private, academic, and nonprofit groups. These dynamic collaborations give MONUM access to a broad range of ideas and expertise for addressing city challenges (Puttick et al., 2014, p. 42; Trenkner, 2011a). Following MONUM's success, Philadelphia and Utah Valley adapted Boston's New Urban Mechanics model for their own municipal governments (New Urban Mechanics, 2015).

One of the team's most successful projects is Citizens Connect, an award-winning smartphone application that allows residents to readily report such public problems as potholes and graffiti (Lawrence, 2013; Desouza, 2014).1 City workers can easily access the reports through a partner smartphone application (Farrell, 2012). From 2009 to 2013, Citizens Connect assisted the city in efficiently resolving over 29,000 reported problems. Today the application handles 28% of Boston's service requests (Wood, 2012; The Urban Vision, 2013; New Urban Mechanics, 2015). MONUM helped to introduce the application to 54 other Massachusetts communities (Laidler, 2013). Roughly 20 other countries have adopted the Citizens Connect model (Trenkner, 2011a).

1 In 2011, Citizens Connect received Harvard's Ash Center for Democratic Governance Bright Ideas award for its innovative use of technology to engage citizens ("Harvard Announces Bright Ideas in Government", 2011, p. 4). In recognition of their accomplishments with MONUM, co-chairs Nigel Jacob and Chris Osgood won Governing Magazine's 2011 Public Official of the Year award (Trenkner, 2011b). They also received the White House's Local Innovation Champions of Change award in 2012 (Rocheleau, 2012; "Local Innovation", 2015).

Best Practices

MONUM helps the Boston city government overcome a traditional barrier to innovation: it assumes the risk affiliated with failed innovation projects. When we asked Nigel Jacob, co-chair of New Urban Mechanics, about MONUM's role, he explained:

Everyone understands that innovation requires risk. But what they don't usually understand is … who bears the brunt of that risk? Asking public officials to do that isn't fair. Instead, we have devised a mechanism by which we will own it, and that's a core part of the value-add… We can essentially de-risk [a] project for the government by making that a New Urban Mechanics project. This is really just sleight of hand, but it actually does work in practice (Interview with Nigel Jacob, 2015).

So while the fear of failure generally inhibits government agencies from exploring new solutions to public challenges, MONUM can take on the risk and more freely explore innovative projects.

MONUM's effective experimentation method allows it to mitigate the impact of failures. The team first sources ideas from various government and non-government actors. After selecting the best ideas, the team pilots the innovation on a small scale with the goal of "maximizing public impact and minimizing public costs" (Puttick et al., 2014, p. 43). MONUM packages only the most successful pilots and disseminates its findings to other cities (Puttick et al., 2014, p. 43). Thus, by identifying and eliminating failures early in the experimentation process, MONUM can minimize the costs of failed projects and provide the city with only the most effective outcomes (Georges et al., 2013, p. 3).

MONUM co-chairs Chris Osgood and Nigel Jacob make sure that its projects involve a broad range of participants. Private corporations, university students, public employees, and city residents can actively participate in MONUM initiatives by contributing specialized knowledge or supplying general feedback (Georges et al., 2013, p. 2). In forming these partnerships, MONUM acquires specialized knowledge while maintaining the quick response time of a small, agile team.

MONUM also maintains an open line of communication with the public to crowdsource innovation proposals. MONUM's website encourages any citizen with a thoroughly developed innovation proposal to submit their project plans. The team reviews and assesses proposals based on: 1) the citizen's ability to follow through with the project; 2) the scope of the audience the initiative is projected to serve; and 3) the ability to transform the proposal into a pilot project (Interview with Nigel Jacob, 2015). Enlisting public participation helps MONUM keep fresh ideas flowing into the agency and engage Boston residents.

Limitations

One limitation that MONUM faces is its tendency to undertake relatively low-risk, low-reward projects, such as those made possible through smartphone applications. City innovation teams generally work with tight budgets, and they can get stuck picking the "low-hanging fruit" (The Urban Vision, 2013). Exploring additional funding opportunities and attracting more private investment could help MONUM attain the resources necessary to experiment with high-risk, high-reward innovations.

2. The New York City Center for Economic Opportunity

The New York City Center for Economic Opportunity (CEO) seeks to reduce New York City's poverty rates by developing innovative assistance programs for low-income residents (Gais et al., 2014, p. 2). CEO works with a broad range of city agencies, universities, nonprofits, and private companies to design, pilot, monitor, and scale initiatives. Since it opened in 2006, the Center for Economic Opportunity has worked with 40 agencies to test nearly 70 initiatives that provide health, education, professional development, and youth programs for impoverished New York residents (Puttick et al., 2014, p. 66-67).2 Its 23 employment programs have enrolled over 40,000 citizens, helping participants earn, on average, about $2.00 more per hour than nonparticipants (CEO Annual Report, 2014, p. 29).3

2 According to CEO's 2012-2013 progress report, the agency ranked its top 12 most successful programs as follows: Advance in Work, Child Care Tax Credit, Community Partners, CUNY ASAP, NYC Training Guide, Office of Financial Empowerment, School-Based Health Centers, Sector-Focused Career Centers, EITC Mailings, Jobs-Plus, CEO Poverty Measure, and Food Policy Coordinator (CEO Annual Report, 2014, p. 16).

3 Linda Gibbs, NYC Deputy Mayor and co-creator of CEO, won the Harvard University Innovation in Government Award in 2006 and 2009 for her role in creating the agency (Goldsmith et al., 2010, p. 107). In 2012, the agency won this same award for its progress in advancing educational and employment opportunities across the city ("Center for Economic Opportunity", 2011). In an effort to replicate CEO's successes, the Obama Administration awarded the agency and its collaborators a $5.7 million grant from the Social Innovation Fund to implement five of its projects in seven other US cities ("Center for Economic Opportunity", 2015; Hoagland, 2012). Through Bloomberg Philanthropies, CEO co-founder and former Mayor Michael Bloomberg is currently scaling his innovation approach in 17 other US cities and two international cities ("Innovation Teams - Bloomberg Philanthropies", 2015).


The City University of New York's Accelerated Study in Associate Programs (CUNY ASAP) is one of CEO's most successful projects. The program aims to improve community college graduation rates, which were dismally low before ASAP started (Puttick et al., 2014, p. 68; "Innovations to Build On", 2013). ASAP addresses financial, structural, and learning barriers that traditionally inhibit students from graduating. In addition to financial assistance, it provides students with academic mentoring, career services, and tutoring sessions. In return for these services, ASAP requires students to maintain full-time enrollment status and strongly encourages students to adhere to a three-year graduation track (Scrivener et al., 2015, p. 18).

In a 2013 study, the Manpower Demonstration Research Corporation (MDRC), an independent social policy research institution, found that 40% of participating students graduated from New York community colleges within three years, while only 22% of nonparticipants graduated within that timeframe. After receiving their associate degrees, 25% of participating students went on to enroll in four-year institutions, while only 17% of nonparticipants did so (Scrivener et al., 2015, p. 51; p. 62).

The program has also proved to be cost-effective. ASAP invested more money per student than traditional community college assistance programs, but it succeeded in graduating a far greater percentage of students in a timely fashion. So while traditional programs invested less money per student, their failure to graduate students on time incurred greater long-term costs. Overall, ASAP helped the city save about $13,423 per degree earned (Scrivener et al., 2015, p. 81). Following these impressive results, the program is undergoing an expansion; its goal is to serve over 13,000 students across six communities by the fall of 2016 (Scrivener et al., 2015, p. 92).


Best Practices

CEO first pilots its projects on a small scale. By carefully monitoring the effects of its pilots, CEO can stop underperforming programs early on and reduce the impact of failed initiatives. Rather than completely disregarding unsuccessful projects, CEO maintains detailed information about the nature of its failures, which helps prevent similar mistakes from occurring in the future (CEO Annual Report, 2014, p. 15).

To accurately measure the impact of its anti-poverty initiatives, CEO emphasizes rigorous monitoring and evaluation practices. Of its 19 staff members, 13 focus on project design, evaluation, and poverty research (Center for Economic Opportunity, 2015). It also collaborates with nine independent research institutions to conduct randomized control trials and other studies to evaluate its programs (CEO Annual Report, 2014, p. 15). Throughout the piloting and implementation phases, CEO closely monitors the outcomes of its projects and collects extensive data to convey its findings. Whether a project succeeds or fails, CEO publishes evaluation reports on its website. This practice establishes a high level of transparency and credibility for its work (Center for Economic Opportunity, 2013).

CEO forms a variety of partnerships across the public and private sectors to attract specialized knowledge and attain funding. Collaborating with participants of various expertise and backgrounds allows CEO to remain relatively small and agile while acquiring critical knowledge and resources.

Limitations

CEO's ambitious goal of alleviating citywide poverty poses a number of challenges for the agency. To see results at the city level, CEO would need extensive resources to expand its projects. CEO works with an annual budget of around $100 million, yet it still lacks the resources needed to scale programs across the city (Puttick et al., 2014, p. 66; Gais et al., 2014, p. 29). As one city council staff member noted, "The biggest criticism of CEO…has been that the programs have not been ramped up to scale. The total number of people served…needs to expand" (Gais et al., 2014, p. 28).

Though CEO effectively monitors and evaluates the impacts of its programs on target communities, it has trouble estimating its performance in the context of citywide poverty. First Deputy Mayor Anthony E. Shorris explained:

Poverty is reflective of larger national and global phenomena…We're not unrealistic about what a city can do. But after four years, we'll be asking whether our interventions were effective in changing what would have been the course of poverty in New York (Roberts, 2014).4

4 The New York Times, online article.

CEO's programs have undoubtedly improved economic conditions for many New Yorkers. But, in pursuing such an ambitious objective, the agency has found it difficult to assess its overall impact and to attain the resources necessary to achieve large-scale progress against poverty.

3. Defense Advanced Research Projects Agency

The Defense Advanced Research Projects Agency (DARPA) is the central research organization within the Department of Defense (DoD). The agency's goal is to provide innovations in the field of national security. DARPA aims for "transformational change" rather than "incremental advancement" (DARPA, 2014). DARPA has developed significant technological innovations, such as stealth technology, ARPANET, and unmanned aerial vehicles.


Best Practices

DARPA has a flat management hierarchy that allows the swift movement of ideas throughout the DoD. Ideas can often travel slowly through government agencies with multi-layered, top-down bureaucracies (Mergel, 2012, p. 286). Removing that obstacle is especially important for DARPA because its projects are sensitive to predetermined deadlines. With its 220 total employees, DARPA has only one level separating program managers from the agency director (Dubois, 2003, p. 5).

The program managers fulfill numerous logistical responsibilities. They have the flexibility and autonomy to design and select their teams. Managers also have the authority to identify opportunities for innovation and act upon those opportunities in the role of "techno-scout" (Dubois, 2003, p. 8; Dugan, 2013). Program managers have tenures of only four years, which instills a sense of urgency and purpose into the process. To allow the team to continue its work unabated, program managers oversee project development and handle logistical responsibilities.

A program manager usually holds a PhD because deep technical knowledge is a prerequisite for the position (Dugan, 2013). That requirement may seem contrary to the culture of DARPA. However, the organization's innovation model is one of "constant flux," and program managers need technical knowledge to understand how to adapt projects. This PhD preference may be specific to DARPA due to its emphasis on the physical sciences. Yet the idea may have some general application, because project managers should have deep knowledge of the specific field in which the innovation is occurring.

According to former DARPA director Regina Dugan, previous failed efforts to imitate DARPA outside of the Department of Defense have led individuals to wrongly conclude that DARPA's model cannot be replicated (Dugan, 2013). She argues that these mistaken conclusions stem from a misunderstanding of the three core concepts of the agency: ambitious goals, temporary teams, and independence from the parent organization. DARPA aims to develop revolutionary rather than incremental change. DARPA's independence insulates the agency from DoD culture and lets it pursue projects that may threaten existing DoD business practices. The agency can contract with specialists from a number of different private organizations who would not otherwise collaborate (Dugan, 2013).

Temporary project teams are a core element of DARPA's operations. Fixed-term managers bring together capable specialists to form project teams that last no more than five years. Dugan has argued that this process allows DARPA to attract high-quality talent. Furthermore, she states that this model is pragmatic because a "high-risk effort by a diverse set of world-class experts can be sustained for only a limited period" (Dugan, 2013). DARPA allows these performers to work away from DoD in their own organizations (Dugan, 2013). Dugan argues that permanently hiring high-quality individuals would be the wrong way to encourage performers to take on high-risk projects (Dugan, 2013).

Adapting DARPA's fixed project timeframes can benefit a firm's project portfolio. By assigning benchmarks for project success, DARPA is able to move funds away from projects that will not reach their benchmarks and toward promising new proposals (Dugan, 2013).

Project proposals must satisfy a strict set of criteria, reviewed by the DARPA offices that solicit them, before program managers can recruit their teams and receive funding (Dubois, 2003, p. 6). Former DARPA director George Heilmeier established what is known as the Heilmeier Catechism, which serves as the foundation for DARPA's project criteria (Greenwald, 2013); a short sketch of how a team might track answers to these questions follows the list:

1. What are you trying to accomplish?
2. How is it done today, and what are the limitations?
3. What is truly new in your approach that will remove current limitations and improve performance? By how much? (A factor of 10? 100? More?)
4. If successful, what difference will it make, and to whom?
5. What are the midterm exams, final exams, or full-scale applications required to prove your hypothesis? When will they be done?
6. What is the DARPA exit strategy? Who will take the technologies that you have developed and turn them into a new capability or a real product?
7. How much will it cost? (Dubois, 2003, p. 7)
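As a purely illustrative sketch (not BAH or DARPA tooling), the Python snippet below shows one way an innovation team might track a proposal's answers to the catechism and hold it back until every question has a substantive response; the question list paraphrases the criteria above, and all class, function, and variable names are hypothetical.

```python
from dataclasses import dataclass, field

# Paraphrased from the Heilmeier Catechism listed above; wording condensed for brevity.
HEILMEIER_QUESTIONS = [
    "What are you trying to accomplish?",
    "How is it done today, and what are the limitations?",
    "What is truly new in your approach, and by how much will it improve performance?",
    "If successful, what difference will it make, and to whom?",
    "What are the midterm exams, final exams, or full-scale applications, and when?",
    "What is the exit strategy for turning the technology into a real capability?",
    "How much will it cost?",
]

@dataclass
class ProposalReview:
    """Hypothetical record of a proposal's catechism answers; not an actual DARPA or BAH form."""
    title: str
    answers: dict = field(default_factory=dict)  # question -> answer text

    def unanswered(self):
        """Return catechism questions that still lack a substantive answer."""
        return [q for q in HEILMEIER_QUESTIONS if not self.answers.get(q, "").strip()]

    def ready_for_review(self):
        """A proposal advances to team recruitment only once every question is addressed."""
        return not self.unanswered()

# Example: a draft proposal that has not yet answered the cost question is held back.
draft = ProposalReview("Hypothetical pilot",
                       {q: "draft answer" for q in HEILMEIER_QUESTIONS[:-1]})
print(draft.ready_for_review())  # False
print(draft.unanswered())        # ['How much will it cost?']
```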

DARPA Hard Test

BAH innovation teams can use the DARPA Hard Test to gauge a project's potential and to develop composite scores from team members and stakeholders. The Hard Test measures a project's potential innovation on a scale of 1 to 7 based on four criteria:

1) Far-reaching: It fundamentally changes how sizeable populations interact with a concept.
2) Technically challenging: New technologies are needed to push innovations forward.
3) Multidisciplinary: It requires multiple stakeholders from diverse professional and academic fields to implement the solution.
4) Actionable: It invokes direct action by stakeholders to incorporate ideas (Carleton et al., 2013, p. 213).


Scoring anchors for each dimension (1 = lowest, 7 = highest):

I. Far-Reaching: 1 = does not alter the way people think about an issue; 7 = a societal paradigm shift in how people think about a solution.
II. Technically Challenging: 1 = no new expertise is required; 7 = advanced specialized proficiency is required.
III. Multidisciplinary: 1 = no collaboration is required; 7 = a diverse skill set is needed.
IV. Actionable: 1 = little incentive to begin adapting the solution; 7 = easily adaptable.

DARPA created the Hard Test to screen for projects that are both high-risk and high-reward. For example, DARPA's Biological Technologies Office recently solicited proposals on how to reduce the effects of infectious diseases. The Office wants novel approaches to identifying the harmful effects of diseases, which can be difficult to accomplish.

The DARPA Hard Test can be easily adopted and implemented by any innovation team. However, the team must recognize that the Hard Test heavily emphasizes technological advances. The Hard Test still provides valuable metrics for innovation teams to use in vetting prospective projects. Teams may decide to alter the Hard Test to exclude the technology criterion and fit their project's needs.
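The following sketch, which is our own illustration rather than a published DARPA or BAH tool, shows how a team could record 1-7 Hard Test ratings from several raters (team members and stakeholders) and average them into per-criterion and composite scores; the rater inputs and numbers are placeholders.

```python
from statistics import mean

# The four Hard Test criteria described above.
CRITERIA = ("far_reaching", "technically_challenging", "multidisciplinary", "actionable")

def hard_test_summary(ratings):
    """Average each criterion across raters and add an overall composite score.

    `ratings` is a list of per-rater dicts mapping criterion name -> score from 1 to 7.
    """
    for r in ratings:
        if set(r) != set(CRITERIA) or not all(1 <= v <= 7 for v in r.values()):
            raise ValueError(f"each rater must score all four criteria from 1 to 7, got {r}")
    summary = {c: mean(r[c] for r in ratings) for c in CRITERIA}
    summary["composite"] = mean(summary[c] for c in CRITERIA)
    return summary

# Placeholder ratings from two hypothetical raters.
ratings = [
    {"far_reaching": 5, "technically_challenging": 3, "multidisciplinary": 6, "actionable": 4},
    {"far_reaching": 6, "technically_challenging": 2, "multidisciplinary": 6, "actionable": 5},
]
print(hard_test_summary(ratings))
# {'far_reaching': 5.5, 'technically_challenging': 2.5, 'multidisciplinary': 6,
#  'actionable': 4.5, 'composite': 4.625}
```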

Limitations

DARPA develops projects that demand deep technical knowledge of the physical and material sciences. BAH may want its team members and project managers to possess technical knowledge as well. Suitable project managers and team members will need experiential or academic knowledge in the field of public policy to identify potentially innovative practices.


4. Xerox's Palo Alto Research Center

The Palo Alto Research Center (PARC) has an impressive history of developing innovative and disruptive technologies, such as personal computers, Ethernet, and fiber optics. Michael Hiltzik catalogued these achievements in his book "Dealers of Lightning: Xerox PARC and the Dawn of the Computer Age." Xerox established PARC as its independent research wing in 1970. It centered on a consolidated team of engineers and information scientists who designed or redefined disruptive technologies. PARC was designed to explore technologies not directly related to Xerox's photocopying business (Dennis). PARC did not have to tie its innovations back to Xerox's central business practices.

Best Practices

According to PARC CEO Stephen Hoover, multiple methods exist for embracing open innovation. Open innovation may involve crowdsourcing or open-sourcing ideas, yet it is ultimately premised on the understanding that not all of the best performers work within the firm (Hoover, 2012).

Hoover argues that the best forms of open innovation rely upon three principles: 1) understanding the right problem that needs solving; 2) a willingness to experiment with new processes; and 3) specialized industry knowledge (Hoover, 2012). A firm needs strategic insight into its client's visions and goals and must work in collaboration with the client using those three principles (Hoover, 2012).

The concept of open innovation is particularly salient for Booz Allen Hamilton. BAH's client-based relationship with government agencies allows the firm to use crowdsourcing to collect ideas from employees within the client agency as well as from within BAH itself.


Alan Kay, who helped develop innovative technologies at PARC, has made the case that more companies should embrace the PARC model. In following PARC's model, companies should find exemplary individuals within the respective field and fund those individuals so that they can achieve successful innovations, even if the success rate is as low as 30% (Mui, 2012, p. 2). Kay argues that the right question to ask about spurring innovation is: "Are we funding great people in sufficient quantity to allow 30% of success to change the world and create a cornucopia of new wealth?" (Mui, 2012, p. 2).

In Malcolm Gladwell's profile of PARC, University of California, Davis psychologist Dean Simonton similarly argued for an iterative process of creativity. According to Simonton, "quality is a probabilistic function of quantity" (Gladwell, 2011). In Simonton's view, to correctly manage creative methods, managers must recognize the tradeoff between a higher number of successes and a higher number of failures (Gladwell, 2011). While government agencies may hesitate to embrace that concept, BAH may find it most applicable in a client-based project model, since it may be able to absorb the risk from the higher number of failures.

As a private firm, BAH may be well suited to absorb the risk of innovating practices on behalf of government agencies. Agencies are often averse to taking on high-risk, high-reward projects due to concerns about how failures may reflect on their performance. BAH can experiment with riskier practices and then pass the successful methods on to the agency.

Limitations

PARC's model is unique in that it encourages an iterative design process defined by the trade-off between failures and successes. Before PARC's incorporation, engineers were not obligated to report to managers for status updates on their projects. They did have to demonstrate the value of their proposals to receive funding. However, a proposal's value was not contingent upon the innovation having a pragmatic application. Rather, the issue was how far the proposal would push the limits of the technology (Gladwell, 2011).

Lawrence Lee, senior director of innovation at PARC, has argued that the iterative approach provides myriad benefits. It allows the organization to better understand how the product fits into the market, as well as to anticipate expected returns from the process (Lee, L., 2012). The caveat to those benefits, however, is that they are best realized in the absence of deadlines. For BAH's contracted work, open-ended projects would not provide the timely results that government agencies may require.

5. Office of Personnel Management Innovation Lab

In 2012, the Office of Personnel Management (OPM) opened its Innovation Lab in the basement of its Washington, D.C. headquarters. The Innovation Lab has its own office space, which provides a venue for community engagement and collaborative work (Office of Personnel Management, 2015). The Lab develops and tests solutions to problems within both OPM and the broader federal government (Mann, 2013, p. 1). In fiscal year 2013, the Lab hired six full-time employees and operated at a reported annual cost of $476,000 (United States Government Accountability Office, 2014, p. 1).

The OPM Lab mitigates the risk of failure by reviewing and facilitating each project that employees bring to the table in its problem-solving processes. The public sector has significant deterrents to innovation, such as limited funding and risk-averse cultures (GAO, 2014, p. 5). Additionally, public sector officials may fear the political consequences of underperforming projects (GAO, 2014, p. 5). The OPM Lab assumes that risk for the agency. The Lab allows federal employees to learn quickly from early mistakes by prototyping and piloting projects (GAO, 2014, p. 5). The Lab seeks to change the government's risk-averse nature by diminishing the political consequences of failure.

Best Practices

A lab needs to accommodate a wide variety of activities, often including brainstorming sessions, skills workshops, formal project reviews, office duties, meetings, and presentations. Thus, the space should be easily adaptable to accommodate the various initiatives that it will help cultivate (UNICEF, 2012, p. 24).

The OPM Lab is a good example of how to wisely manage a physical space. The Lab provides a physical space where innovators can convene. It hosts weekly training sessions on best practices and ensures the sharing of information across OPM and other federal agencies. The Lab fills the space with whiteboards, couches, and open carpeted areas to promote creativity. The Lab also uses the communal space to establish inter-organizational networks, which can be powerful drivers for supporting innovation (GAO, 2014, p. 2).

For example, the Office of Science and Technology Policy (OSTP) partnered with OPM and the Federal Community of Practice on Crowdsourcing and Citizen Science (FCPCCS) to develop a workshop that engages different stakeholders in innovating government services. The OSTP, OPM, and FCPCCS also collaborated on a toolkit to help federal agencies use crowdsourcing to address nationwide scientific and societal problems (Gustetic et al., 2014).

Designing services from the point of view of the targeted audience is a concept that has been recognized as crucial in the public sector. It helps to avoid approaches that do not improve the experience of users beyond the status quo (Puttick et al., 2014, p. 38).


The OPM Lab supports efforts to address complex policy issues by incorporating the "human-centered design" approach to problem solving. This approach is intended to generate concrete solutions driven by the needs of the targeted audience. The Lab, for example, helped reframe the problem of integrating veterans by focusing on the experience of the end users. Previously, OPM assisted veterans by requiring them to participate in job training programs (Lee, J., 2012). Now the agency more selectively places veterans into training programs, ensuring that the right individuals receive the right training.

Limitations

Internal surveys on whether employees would recommend the Lab to colleagues and on the efficacy of human-centered design have shown overwhelmingly positive responses (GAO, 2014, p. 19). However, OPM has not fully developed meaningful performance measures (GAO, 2014, p. 1). A year after the Lab opened, OPM began to measure the Lab's performance more systematically. Lab staff use a series of surveys to measure participant experience through pre-session and post-session check-ins, and to capture specific project-related outcomes for the different services they offer. However, according to the GAO, these survey instruments are unlikely to yield data of sufficient credibility and relevance to indicate the extent of the Lab's achievements (GAO, 2014, p. 19). The surveys have been susceptible to various types of respondent bias. Moreover, when employees attempted to analyze the responses to measure performance, the results proved difficult to interpret (GAO, 2014, p. 17).


Therefore, we highlight MindLab5 to show that projects benefit when managers set quantifiable interim goals, establish performance measures, and use data to adjust their practices (GAO, 2014, p. 21). Meaningful measures could help the OPM Lab assess its progress toward improving participants' abilities to solve problems. BAH should adopt performance metrics to track project costs, benefits, and performance improvements.

5 MindLab is a Danish cross-governmental innovation unit that involves citizens and businesses in creating new solutions for society. It is part of three ministries and one municipality: the Ministry of Business and Growth, the Ministry of Education, the Ministry of Employment, and Odense Municipality, and it collaborates with the Ministry for Economic Affairs and the Interior. It addresses areas such as entrepreneurship, digital self-service, education, and employment. It is also a physical space: a neutral zone for inspiring creativity, innovation, and collaboration.

Recommendations

1. Absorb Risk on Behalf of the Government by Piloting Projects

Based on our case studies, we recommend that BAH form its own innovation teams to mitigate the risk of failed innovation projects. The government is risk-averse by nature, and agencies are inhibited from exploring innovative solutions to public challenges. BAH innovation teams can help the government overcome this barrier by being accountable for project outcomes.

BAH's teams can absorb government risk by piloting projects before applying them to public challenges. The teams should work in physical spaces separated from the agency. Within these designated spaces, BAH's teams can fully test projects and absorb the risk of failed projects. As demonstrated by the successes of the OPM Lab, projects benefit when teams have a physical space to convene and collaborate. Agencies will be protected from project failures, and the implementation of poor solutions will be minimized.


When piloting projects, the BAH teams should consider legal, political, and social concerns, such as:

1) Is this something BAH will actually be able to do?
2) Do political considerations prevent the project from being viable?
3) Will the innovative solutions be accepted by the public?

If a piloted project does not meet these requirements, BAH should stop developing that project.

According to the Bloomberg Innovation Playbook,6 piloting is a way to "live test" an idea (Bloomberg Philanthropies, 2014, p. 40). The goal is to create a scenario in which the essence of the idea can be tested. For example, NYC CEO pilots its projects on a small scale before introducing the initiatives to a larger audience. The organization then scales successful pilots and discontinues underperforming ones. The OPM Lab also functions as a risk-taking agency that experiments with, prototypes, and pilots projects.

6 The Bloomberg Innovation Playbook presents the Innovation Delivery Model, which was created to provide cities with a method to reduce barriers and deliver change more effectively to their citizens. In November 2011, Atlanta, Chicago, Louisville, Memphis, and New Orleans began to use the Innovation Delivery Model. The Bloomberg Innovation Playbook functions as a guideline for creating innovation teams.

Innovation projects benefit when teams engage in public-private partnerships. For example, MONUM incorporates private corporations, university students, public employees, and city residents into its initiatives (Georges et al., 2013, p. 2). Enlisting diverse participants allows MONUM to evaluate a variety of perspectives and keep fresh ideas flowing into the organization, practices that BAH could adopt.

2. Build Small Innovation Teams to Pilot Projects

BAH innovation teams should consist of 5-8 members, and each member should possess diverse knowledge. Most team members, even the project managers, do not need expertise in a specific policy field. Yet they should be able to adapt to new projects once the goals of previous projects have been accomplished. More specifically, team members should have an array of skills suited to adaptability and flexibility. BAH project managers should look for performers with more general technical knowledge (in the fields of public policy and/or public administration). Members of successful teams generally possess the following skills: project management; quantitative and qualitative research and analytical capabilities; creative and critical thinking; and strong oral and written communication (Puttick et al., 2014).

Project managers should play crucial roles in the BAH innovation teams. They should be responsible for logistical concerns, such as identifying potential projects (after the government agency defines the priority area), crafting proposals, recruiting members of the team, managing budgets and contracts, and handling public engagements and execution issues. In building the team, managers should have the project area in mind, but they should not recruit to the specific subject area. For that reason, project managers need not wait until the proposal has been approved by the principal administration before they start building their team.

Unlike their team members, project managers should possess a rigorous academic background in addition to strong leadership abilities and experience dealing with management hierarchies. Managers need a strong academic background because they are required to identify which projects are both feasible and paradigm-altering, as well as the industry trends that might affect the project goals or processes. Examples of relevant academic and professional experience for project managers include: experience as a partner at a large consulting firm, a law degree, experience as a researcher at an institute for governmental service and research, and/or a master of public administration degree.

3. Innovation Teams Should Crowd and Open-Source Ideas

Booz Allen Hamilton has multiple avenues by which it can source ideas both internally and within government agencies. MONUM, for example, welcomes anyone with a cogent, well-designed innovation proposal to submit it via email. Team members then evaluate the proposals and select compelling projects for the piloting phase. BAH can apply these crowdsourcing techniques through websites and social media channels. If open-sourcing ideas poses a confidentiality issue, BAH can assist federal agencies in adopting their own crowdsourcing methods. The agencies, in turn, can assume responsibility for managing the ideas.

The innovation team within BAH may not necessarily have to bear the burden of monitoring the crowdsourcing process. Booz Allen Hamilton could help the client government agency conduct its own crowdsourcing by developing criteria for reviewing the crowdsourced ideas. In this arrangement, BAH removes the barriers government employees face in proposing innovative ideas and strengthens both the agency and its employees.

Identifying good ideas is pivotal. Teams should review every proposal, though they need not give comprehensive attention to ideas that are clearly lacking. Additionally, BAH should be clear with the person(s) who submitted the proposal about the idea becoming an experiment. The government agency should also be comfortable with playing a significant role in scaling the pilot after BAH drives the pilot and determines that the experiment has been successful.


4. Scale Up Successful Pilot Projects

After crowdsourcing ideas, BAH teams should aid the government agency in implementing the pilot program on a larger scale. To determine which projects are scalable, BAH teams can adopt MONUM's goal of "maximizing public impact and minimizing public costs" when piloting projects (Goldsmith, 2012). BAH teams should help the government agency scale those pilots that effectively address the challenges faced by the agency and do so in a cost-effective manner.

The BAH team should also ensure that the programs to be scaled have reached the pre-determined benchmarks for piloting success (CEO, 2013, p. 15). The principal administration may want to rely upon strict criteria, such as the aforementioned Heilmeier Catechism, to determine whether pilots have been successful enough to scale. The metrics themselves may have to be project-dependent, but the application of the criteria can be repeatable.

Engaging the federal agency in the scaling process has two main advantages. First, it engages the government as a collaborator in the innovative process much in the same way that the crowdsourcing process does. By sharing the process with BAH, the agency is able to pool resources and increase the probability of success. Second, BAH is able to reduce the cost of scaling by capitalizing on the government agency's financial and personnel resources.

5. Develop Meaningful Interim Performance Metrics

The performance metrics that dictate whether a project has been successful will often depend on the project itself and on its goals and aims. Identifying an appropriate means to measure pilot and scaled program successes will depend upon the values of the client agency, as well as the nature of the program.

If a project lends itself to rigorous quantitative evaluation, the DARPA Hard Test should help to identify beforehand the measurable benchmarks that the project must reach. If the intended outcomes are less tangible or do not require specific metrics, then MONUM may be a better model. MONUM, for example, has a central goal of civic engagement. Thus, if one of its programs does not produce a quantifiable output but has an identifiable effect on improving citizens' well-being, MONUM views that project as a success.

Regardless of whether a program should be evaluated holistically or quantitatively, setting evaluative criteria can help the client agency and BAH swiftly drop unsuccessful projects. Using process evaluation methods is better than simply evaluating projects after they have reached a conclusion. Interim evaluation enables BAH and government agencies to abandon foundering projects before the scaling process. The practice saves resources, since funds from those projects can then be diverted toward other projects with greater promise.
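To make the idea of interim evaluation concrete, the sketch below shows one possible way to encode pre-agreed benchmarks for a pilot and flag it for discontinuation before scaling if too few are met at a checkpoint; the benchmark names, values, and 75% threshold are illustrative assumptions, not BAH practice.

```python
from dataclasses import dataclass

@dataclass
class Benchmark:
    """A single pre-agreed interim target for a pilot (higher observed values count as better here)."""
    name: str
    target: float    # value the pilot should reach by the checkpoint
    observed: float  # value actually measured at the checkpoint

    def met(self):
        return self.observed >= self.target

def interim_decision(benchmarks, required_fraction=0.75):
    """Recommend continuing the pilot only if enough interim benchmarks are met."""
    met_count = sum(b.met() for b in benchmarks)
    return "continue" if met_count >= required_fraction * len(benchmarks) else "stop before scaling"

# Illustrative checkpoint for a hypothetical pilot.
pilot = [
    Benchmark("participants enrolled", target=200, observed=240),
    Benchmark("partner organizations signed on", target=3, observed=3),
    Benchmark("share of milestones delivered on time", target=0.80, observed=0.90),
]
print(interim_decision(pilot))  # 'continue' -- all three interim benchmarks were met
```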


As an example, our team applied the DARPA Hard Test to the CUNY ASAP project to demonstrate how BAH could apply the Hard Test to potential projects; a short calculation of the resulting composite score follows the list:

• On the far-reaching criterion, the ASAP project received a 6 out of 7 because 40% of participating students graduated within three years, and the program is projected to serve over 13,000 students across six communities by 2016 (Scrivener et al., 2015, p. 92).

• We assigned ASAP a 1 out of 7 on the technically challenging criterion because it did not require the development of new technologies to integrate the program into the community.

• For the multidisciplinary criterion, we ranked ASAP a 6 out of 7 because the project involves the cooperation of multiple community stakeholders, such as students, parents, educators, and government officials.

• Finally, we ranked ASAP a 6 out of 7 on the actionable criterion because implementing the program required direct action by all community stakeholders, and it resulted in 25% of participating students enrolling in four-year institutions (Scrivener et al., 2015, p. ES-6).
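Using the scores assigned above, a simple average gives ASAP a composite Hard Test score; weighting the four criteria equally is our illustrative choice, since the Hard Test itself does not prescribe a particular weighting.

```python
from statistics import mean

# Scores assigned to CUNY ASAP in the list above.
asap_scores = {
    "far_reaching": 6,
    "technically_challenging": 1,
    "multidisciplinary": 6,
    "actionable": 6,
}
composite = mean(asap_scores.values())
print(f"CUNY ASAP composite Hard Test score: {composite:.2f} out of 7")  # 4.75 out of 7
```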


Conclusion

Our recommendations, based on the five case studies, demonstrate how Booz Allen Hamilton can build innovation teams to effectively spur innovation within government agencies. Our recommendations will allow BAH to absorb risk on behalf of government agencies and facilitate innovative processes.

1. We recommend that innovation teams absorb government risk by piloting solutions to challenges that the government identifies. Teams should have their own physical spaces. Innovation teams will benefit from public-private partnerships, which will gather a variety of perspectives.

2. BAH should build small innovation teams of 5-8 people with diverse experiential knowledge to pilot projects. Innovation team members should be adaptable and flexible, and project managers should have strong leadership experience and familiarity with management hierarchies.

3. Teams should crowdsource ideas from within BAH and government agencies through internal forums. This will allow the government to be a collaborator in the innovation process.

4. Innovation teams should scale pilots that have fulfilled predetermined criteria by pooling resources with government agencies, which can reduce project-scaling costs.

5. BAH's teams should develop performance metrics, such as the DARPA Hard Test, to measure a project's potential innovation and to ensure that pilots are achieving the desired outcomes.


Works Cited

Bloomberg Philanthropies. "Transform Your City Through Innovation." 2014.
Carleton, T., W. Cockayne, and A. Tahvanainen. "Playbook for Strategic Foresight and Innovation." Innovation Leadership Board, 2013.
Center for Economic Opportunity. 2015.
Center for Economic Opportunity. Replicating Our Results. 2013.
CEO Annual Report. Center for Economic Opportunity, 2014.
Criado, J. Ignacio, Ramon Gil-Garcia, and Rodrigo Sandoval-Almazan. "Government Innovation Through Social Media." Government Information Quarterly 30.4 (2013).
Defense Advanced Research Projects Agency. DARPA: Creating Breakthrough Technologies for National Security. 2014.
Dennis, M. "Xerox PARC." Encyclopedia Britannica.
Desouza, Kevin C. "Turning Governments into Innovation Machines." Governing.com, 10 Nov. 2014.
Dubois, L. "DARPA's Approach to Innovation and Its Reflection in Industry." National Research Council Chemical Sciences Roundtable, 2003.
Dugan, R. "'Special Forces' Innovation: How DARPA Attacks Problems." Harvard Business Review, Oct. 2013.
Farrell, Michael B. "Boston Deploys Smartphone App for City Workers." Boston Globe, 15 Feb. 2012.
Gais, Thomas, Patricia Strach, and Katie Zuber. "Poverty and Evidence-Based Governance." Rockefeller Report, 2014.
Georges, Gigi, Tim Glynn Burke, and Andrea McGrath. "Improving the Local Landscape for Innovation (Part 1): Mechanics, Partners and Clusters." Ash Center for Democratic Governance and Innovation, 20 June 2013.
Gladwell, M. "Creation Myth: Xerox PARC, Apple, and the Truth About Innovation." The New Yorker, 16 May 2011.
Goldsmith, Stephen. "Boston's Pioneering Way of Innovating." Governing.com, 12 Sept. 2012.
Greenwald, T. "Secrets of DARPA's Innovation Machine." Forbes, 15 Feb. 2013.
Gustetic, Jenn, Lea Shanley, Jay Benforado, and Arianne Miller. "Designing a Citizen Science and Crowdsourcing Toolkit for the Federal Government." The White House, 2 Dec. 2014.
"Harvard Announces Bright Ideas in Government." Harvard Kennedy School Ash Center, 29 Mar. 2011.
Hoagland, Kate. "Center for Economic Opportunity Wins Harvard Innovations in American Government Award." Harvard Kennedy School, 13 Feb. 2012.
Hoover, S. "Defining (and Practicing) 'Open Innovation'." PARC, 2012.
Hull, Clyde, and Brian Lio. "Innovation in Non-profit and For-profit Organizations: Visionary, Strategic, and Financial Constraints." Journal of Change Management 6.1 (2006): 53-65.
"Innovation Teams." Bloomberg Philanthropies, 2015. Web.
"Innovations to Build On." Center for an Urban Future, 2013.
Interview with Nigel Jacob, Co-Chair of New Urban Mechanics. Skype interview, 20 Mar. 2015.
Laidler, John. "Free App Helps Residents Report Problems, Get Action." Boston Globe, 11 July 2013.
Lawrence, Alex. "Ten Innovations that Reimagine City Services." Ash Center, Harvard Kennedy School, 2013.
Lee, Jolle. "OPM's Innovation Lab Spurs New Way of Problem-solving." FederalNewsRadio.com, 14 Sept. 2012.
Lee, L. "What PARC Learned About Executing on Open Innovation." Harvard Business Review, 15 Oct. 2012.
"Local Innovation." The White House, 2015.
Mann, Amelia. "Commentary: Every Federal Agency Needs an Innovation Lab." Nextgov, National Journal Group, 9 Dec. 2013.
Mergel, I. "The Social Media Innovation Challenge in the Public Sector." Information Polity, 17 Dec. 2012.
Mui, C. "The Lesson That Market Leaders Are Failing to Learn From Xerox PARC." Forbes, 1 Aug. 2012.
New Urban Mechanics. 2015.
Office of Personnel Management, Innovation Lab. Creative States. 2015.
Puttick, Ruth, Peter Baeck, and Philip Colligan. "The Teams and Funds Making Innovation Happen in Governments Around the World." Nesta and Bloomberg Philanthropies, 2014.
Rocheleau, Matt. "White House Honors 2 City of Boston Workers as Local Innovators." The New York Times, 27 Sept. 2012.
Roberts, Sam. "Nearly Half of New Yorkers Are Struggling to Get By, Study Finds." The New York Times, 29 Apr. 2014.
Scrivener, Susan, Michael J. Weiss, Alyssa Ratledge, Timothy Rudd, Colleen Sommo, and Hannah Fresques. "Doubling Graduation Rates: Three-Year Effects of CUNY's Accelerated Study in Associate Programs (ASAP) for Developmental Education Students." MDRC, 2015.
Torfing, Jacob, and Eva Sorensen. "Enhancing Collaborative Innovation in the Public Sector." Administration and Society 43.8 (2011): 842-868.
Trenkner, Tina. "Nigel Jacob and Chris Osgood: 2011 Honoree." Governing.com, 2011a.
Trenkner, Tina. "Public Officials of the Year." Governing.com, 2011b.
UNICEF. Innovation Labs: A Do-It-Yourself Guide. 2012.
United States Government Accountability Office. "Office of Personnel Management: Agency Needs to Improve Outcome Measures to Demonstrate the Value of Its Innovation Lab." Rep. no. GAO-14-306, 2014.
The Urban Vision. "Boston's Mayor's Office of New Urban Mechanics." YouTube, 2 Apr. 2013.
Wood, Colin. "Boston's Reporting App Expands Statewide." Govtech.com, 18 Dec. 2012.