
Page 1

REFLECTIONS ON GOVERNMENT AGENCY RESEARCH & EVALUATION REALITY IN 2015

A PERSONAL VIEW: DR JOHN WREN, HONORARY RESEARCH FELLOW, AUT INSTITUTE OF WORK

JANUARY 2016

Page 2

SOME BACKGROUND TO THESE REFLECTIONS
• 15 years' experience as a principal and senior-level researcher in New Zealand government agencies
• Experience in leading public health policy initiatives
• Recipient of a New Zealand Health Research Council Post-Doctoral Fellowship, hosted by the Injury Prevention Research Unit, Otago University
• 3-year Doctoral Scholarship, Massey University
• New Zealand Public Service Scholarship for 1 year of study for a Postgraduate Diploma
• Observing, listening and talking to a wide range of government (or government-funded) research and evaluation colleagues in New Zealand and Australia
• Having been through multiple restructurings

Page 3

THEME 1: THE VALLEY OF DEATH

[Figure: the "valley of death" in research translation, with the Researcher on one side of the valley and the User / Decision-maker on the other. Source: Nature, 11 June 2008]

Page 4

THEME 2: COMMON RESEARCH / EVALUATION REALITY: A CHALLENGE TO PRODUCE
• Timely work
• Relevant work
• Value for money
• Cost-effective work that adds value
• Research that directly addresses Business Group Owners' information needs
• Information for policy and operational decision-making
• Work within timelines that are very fast and frequently move forward on you

Page 5

THEME 3: THE CHALLENGE FOR RESEARCHERS …
▶ How important are the impact factors to decision-makers?

WHAT (content): 4%
WHO: 40%
HOW (delivery): 56%

Source: Wharton Business School survey of Fortune 500 CEOs, in ACC Foundations for Leadership Training Course, Catapult, 2011.

Page 6

THEME 3: THE CHALLENGE FOR RESEARCHERS AND EVALUATORS …

Your decision-maker is typically thinking:
▶ WHO is this person?
▶ WHY should I listen to them?
▶ Do I understand what they are saying?
▶ Are they using my language?
▶ Is this information useful?
▶ Is this information important to me?

Page 7

THEME 4: ENSURING RESEARCH IS FIT FOR PURPOSE

▶ My DILEMMA as Principal Research Advisor: positioning the research in the optimal position between two approaches, along the dimensions of audience, purpose and style on one axis, and research method on the other.

TECHNOCRATIC APPROACH
• Scientific process; foci: discipline, method
• No end-user input
• Distanced from the policy and operational context
• No stakeholder ownership
• Research quality standard

PARTICIPATIVE APPROACH
• Active end-user engagement
• Alliances set up
• Tendency to be value-driven
• Context aware
• High degree of stakeholder ownership
• Can lack independence; risk of end-user capture

FOCI: Policy / Strategic Direction: political decision-makers, Board, CEO, senior executives

FOCI: Operational / Delivery: professional service providers, claimants, community groups

Copyright: Wren, J. (2013)

Page 8

THINKING ABOUT OUR RESEARCH / EVALUATION FUNCTION: HOW ABOUT USING THIS IMPACT MODEL TO GUIDE FUTURE DIRECTION?

[Figure: research impact model. Source: Professor Niki Ellis, presentation to the Actuaries Injury Scheme Seminar, November 2015]

Page 9

BEHAVIOURAL INSIGHTS REFLECTIONS: THINKING ABOUT OUR RESEARCH / EVALUATION PROCESSES
• Is our documentation easy to use?
• Are our reports attractive to look at? Do they invite the reader to pick them up and, once started, not put them down?
• Are our reports really timely?
• Do we really socialise and properly disseminate our work?

Page 10

HOW ABOUT: REBRAND, REGROUP, REFOCUS?

Rebrand as "Research / Evaluation Insights" (we differentiate ourselves on the basis of the application of rigorous scientific methods and theory, deep subject-matter expertise, and independence).

Regroup: instead of taking methods or subject matter as our default mode of organising and thinking (such as EBH, Evaluation, Surveys (voice of the customer), Injury Prevention, Rehab, etc.), focus on small teams delivering:
a. Programme and Project Design and Delivery Insights
b. Customer Voices Insights
c. Evidence for Effective Treatment, Rehabilitation and Injury Prevention Insights

Refocus by reviewing our process templates and research project deliverables with the Behavioural Insights EAST principles (Easy, Attractive, Social, Timely) in mind, from the perspective of both the end user and the research staff who have to prepare or use the documents in detail (as we are users as well).

Page 11

REBRAND, REGROUP, REFOCUS MEANS
A change, and a challenge, in emphasis in how we think about what we do and how we might like to organise.

A change in focus from HOW it is delivered to WHAT is delivered. 'What' implies a stronger emphasis on promoting the delivery of products or outcomes that our clients desire, in the language they are using, rather than on how we do it in the first instance (which I suggest is our powerful default setting, by training and interest).

For example, instead of delivering an 'Evaluation', we are delivering "insight about programme design and delivery" that just happens to primarily use evaluation methods. 'Evidence for Effective Treatment' is written advice that may be derived from a range of research methods, from EBH to general literature reviews to subject-matter expertise.

Instead of 'I am an evaluator', 'I do EBH' or 'I do surveys', the setting becomes "I deliver deep Programme and Project Insights", or "Customer Voice Insights", or "Evidence for Effective Treatment / Intervention", through the use of a range of rigorous scientific methods.

Page 12

FINAL OBSERVATIONS
• Criticisms of the research / evaluation function are not new or unique.
• The deeper question is: what does an organisation want from a "Research / Evaluation" function?
• Independence, integrity, deep content knowledge, the ability to withstand external scrutiny of decisions, critical thinking, foresight?
• Tactical, operational, policy or strategic work? What content and type of research?
• Does the organisation want to learn?

Restructuring is no substitute for:
• Deep senior-management understanding of the research / evaluation function, the value it can add to an organisation, and a willingness to support and resource it appropriately.

Anything other than a rebrand, refocus and process review is likely to require significant changes in the staff skill mix:
• Outsourcing means more contract and project management.
• A narrowing of focus means a change in skill base and content knowledge.