
Page 1: Dr Vicky Jones, Senior Research Policy Adviser, HEFCE

Research Assessment and The Metric Tide

Vicky Jones

Senior Policy Adviser

Research impact: delivering excellence

5th July 2016

Page 2

http://www.hefce.ac.uk/rsrch/metrics/

http://www.responsiblemetrics.org

Page 3

“I can announce today that I have asked HEFCE to undertake a review of the role of metrics in research assessment and management. The review will consider the robustness of metrics across different disciplines and assess their potential contribution to the development of research excellence and impact…”

David Willetts, Minister for Universities & Science, Speech to UUK, 3 April 2014

Page 4

REF 2014: Evaluation programme

Evaluation activity

• Two-phase evaluation of impact

• Feedback from participating institutions

• REF panel feedback

• Review of costs, benefit and burden

• Multi- and inter-disciplinary research in the UK

• Equality and diversity analysis

Wider work relating to REF

• Analysis of impact case studies and database

• Independent Review of Metrics

• Open access

Page 5

The Metric Tide

Headline findings

Page 6

Across the research community, the description, production and consumption of ‘metrics’ remains contested and open to misunderstandings.

Page 7

Peer review, despite its flaws and limitations, continues to command widespread support across disciplines. Metrics should support, not supplant, expert judgement.

Page 8

Inappropriate indicators create perverse incentives. There is legitimate concern that some quantitative indicators can be gamed, or can lead to unintended consequences.

Page 9

Correlation analysis of the REF2014 results at output-by-author level has shown that individual metrics cannot provide a like-for-like replacement for REF peer review.

Page 10

Within the REF, it is not currently feasible to assess the quality of UOAs using quantitative indicators alone, or to replace narrative impact case studies, or the impact template.

Page 11

Responsible metrics

Responsible metrics can be understood in terms of:

• Robustness: basing metrics on the best possible data in terms of accuracy and scope;

• Humility: recognising that quantitative evaluation should support – but not supplant – qualitative, expert assessment;

• Transparency: keeping data collection and analytical processes open and transparent, so that those being evaluated can test and verify the results;

• Diversity: accounting for variation by field, using a variety of indicators to reflect and support a plurality of research & researcher career paths;

• Reflexivity: recognising the potential and systemic effects of indicators and updating them in response.

Page 12

The Metric Tide

Recommendations

Page 13

At an institutional level, HEI leaders should develop a clear statement of principles on their approach to research management and assessment, including the role of indicators.

Page 14

Research managers and administrators should champion these principles and the use of responsible metrics within their institutions.

Page 15

HR managers and recruitment or promotion panels in HEIs should be explicit about the criteria used for academic appointment and promotion decisions.

Page 16

Individual researchers should be mindful of the limitations of particular indicators in the way they present their own CVs and evaluate the work of colleagues.

Page 17

Like HEIs, research funders should develop their own context-specific principles for the use of quantitative indicators in research assessment and management.

Page 18

Data providers, analysts & producers of university rankings and league tables should strive for greater transparency and interoperability between different measurement systems.

Page 19

Publishers should reduce emphasis on journal impact factors as a promotional tool, and only use them in the context of a variety of journal-based metrics that provide a richer view of performance.

Page 20

The UK research system should take full advantage of ORCID as its preferred system of unique identifiers. ORCID iDs should be mandatory for all researchers in the next REF.

Page 21

Further investment in research information infrastructure is required to improve the interoperability of research management systems.

Page 22

HEFCE, funders, HEIs and Jisc should explore how to leverage data held in existing platforms to support the REF process, and vice versa.

Page 23

For the next REF cycle, in assessing outputs, we recommend that quantitative data – particularly around published outputs – continue to have a place in informing peer review judgements of research quality.

Page 24

In assessing the research environment, we recommend enhancing the use of quantitative data, for which there is clear scope.

Page 25

In assessing impact, we recommend that HEFCE builds on the analysis of the impact case studies from REF2014 to develop clear guidelines for the use of quantitative indicators in future impact case studies.

Page 26

What have we learned?

• Preparing for impact has provided benefits and strategic insight to universities

• The assessment of impact worked well, but there are areas for improvement

• Considerable and diverse impacts were submitted for assessment

• Impact derives from the integration of disciplinary knowledge

• The systematic collection of impact data has generated an important national asset, and provided new insight into the relationship between research and impact

Page 27


Page 28

• Impact is not confined to the place where research was carried out

• Research across the UK leads to impact in London

• There is some regional focus of impact

• Some regions are more locally focused in terms of impact than others

Page 29

Original areas for consultation

• No evidence to suggest the removal of the impact element or a radical change to the approach…

• …but some areas for review/reform:

  – Impact template

  – Evidence and data requirements

  – FTE thresholds

  – Changes to the guidance

• Case study database is an important source of evidence

Page 30

Thank you for listening

[email protected]