Assessing Quality of Science in IEA Evaluations
Lessons learned & considerations for the future
Rachel Sauvinet-Bedouin, IEA. ISPC 13th Meeting, Item 5, May 2016
Quality of Science: an evaluation criterion
[Diagram: CRP evaluation criteria: Relevance; Efficiency; Effectiveness; Impact & Sustainability; Quality of Science; Governance & Management; contribution to SLOs; research community; legitimacy; quality of research and design; Gender, Partnerships & Capacity Development]
• Management of QoS, processes for assuring QoS, incentives
• Human resources quality: team profiles (h-index); research design; data management
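For reference, the h-index used in team profiles is defined as the largest h such that a researcher has at least h publications each cited at least h times. A minimal sketch of that definition (the example citation counts are illustrative, not drawn from the evaluation):

```python
def h_index(citations):
    """Return the h-index: the largest h such that at least h papers
    each have at least h citations."""
    ranked = sorted(citations, reverse=True)  # highest-cited papers first
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:  # this paper still supports an h of `rank`
            h = rank
        else:
            break
    return h

# Six papers with these citation counts give an h-index of 4:
print(h_index([10, 8, 5, 4, 3, 0]))  # → 4
```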
IEA QoS Evaluation Framework
[Diagram: Inputs, Management Process, Outputs]
Research publications (bibliometric analysis); qualitative review of publications; peer assessments
IEA Workshop on Evaluating QoS
Date: 10-11 December 2015, Rome
Purpose:
- Consolidate and strengthen the IEA’s approach to evaluating QoS;
- An opportunity to explore the scope for achieving a common understanding and definition of QoS and how to link it to other aspects of performance in appraisal, monitoring and evaluation.
Specific objectives included:
- Draw broader lessons for the CGIAR with the aim of preparing guidelines for the evaluation of QoS.
16 participants: IEA/ISPC/IDRC/CO/donor/CRP leader/evaluation team experts
What have we learned from IEA experience? (1)
• Role and responsibility of Centers vs. CRPs: some aspects require assessment of Center products, practices and capacities
• Difficult to identify shortcomings and target recommendations
• “What can CRP management and oversight do to get a better handle on QoS?”
• “How does the CRP use its authority to demand more from the Centers?”
What have we learned from IEA experience? (2)
• Quality of science in the context of agricultural research for development: what does this mean?
• Assessing systems science and transdisciplinary research: additional challenges
Methodological issues
• Complexity of the framework: seeing the forest for the trees?
• Using quantitative and qualitative data
• How to interpret evidence and findings
• How to take a more systematic approach to assessing research design, taking into account a broad range of disciplines and types of research
IEA revised framework
Resources and their Management:
- Human resources
- Financial resources
- Infrastructure
Research Process:
- Research planning and design
- Research implementation and protocols
Outputs:
- Scientific and peer-reviewed
- Non-peer-reviewed
- Non-publication outputs
Questions for ISPC
1) Is the framework a good starting point for reaching a common understanding of the various dimensions and aspects of QoS for the CGIAR?
2) What is the minimum set of indicators/information needed when assessing QoS in program appraisal, monitoring and evaluation?
3) Does this provide sufficient flexibility for tailoring the approach to these various functions?
Quality of Science in the Context of AR4D
• Issue of QoS along the impact pathway (IP): what should the expectations be?
• How to integrate risk of failure in assessment of QoS?
• How to benchmark?
Further Considerations for ISPC
• How to assess QoS at the CRP level while most elements are the responsibility of Centers?
• How can CRPs be accountable for QoS?
• How can the QoS of CRPs be improved within this context?
Conclusions
• Revise IEA framework for assessing QoS in preparation for the next round of CRP evaluations;
• Explore and address the many unresolved issues through concerted efforts from all those involved in the design, appraisal, implementation, monitoring and evaluation of research for development.
Thank you!