
Dr Peter Darroch, SciVal Consultant, [email protected]

Looking past the usual metrics to help researchers demonstrate Excellence to support Grant applications





Halt the avalanche of performance-based metrics

“Bogus measures of ‘scientific quality’ can threaten the peer-review system”

“The increasing dominance of quantitative research assessment threatens the subjective values that really matter in Academia”

World view – Colin Macilwain, Nature, Vol 500, 2013 http://www.nature.com/nature/journal/v500/n7462/index.html#col

Comment on Snowball metrics: “I suspect that in practice, however, it will end up being used mainly to exercise yet more control over academic staff, with every aspect of their professional lives tracked on the system.”

Response from Glenn Swaford, Oxford University: “However his portrayal of Project Snowball does not ring true with us here at Oxford. We are founding participants in this Project. Snowball puts HEIs in the driver's seat. We (not others) elect what to measure and why. Oxford is involved to assist academic-led planning at a unit level. There is and will be no 'central' much less local use of these data to assess individuals.”


Coming up…

• Conditions for a good metric and factors that can affect the value of citation metrics

• A model to help you select appropriate metrics
• Showcasing excellence to support, for example, a grant application
• An example of useful metrics for showcasing a senior researcher
• An example of useful metrics for showcasing a junior researcher
• What currently happens?

Health warning
• Using metrics is not black and white
• This session is a discussion about potential uses
• There are always exceptions, so always engage your brain!!


Adapted from Scholarly Kitchen podcast, July 10th 2013. Phil Davis – Bibliometrics in an age of abundance. http://scholarlykitchen.sspnet.org/2013/07/10/scholarly-kitchen-podcast-bibliometrics-in-an-age-of-abundance/

Conditions for a good metric

1. Transparent underlying data – Can you trace the data? Is there authority and accountability in the data set? (related to 5)

2. External validity of metric – Needs a theoretical connection to what you are trying to measure, which is not always clear

3. Reliable – Query several times and get the same or a similar result

4. Easy to replicate – Some metrics are based on complicated algorithms/calculations

5. Hard to distort – There need to be structural and human systems in place to prevent distortion/gaming


Factors that can affect the value of citation metrics

• Variety in the size of entities within the data set
• Several disciplines within the data set
• Multiple publication types within the data set
• Coverage of the data source, by geography and/or discipline
• Ease of manipulation

Accounting for these factors reveals what you actually want to see: quality of performance.

A model to help you select appropriate metrics, based on four questions


Q1 What am I trying to achieve?

This may be the most difficult part of any process. Your question/goal should drive the data and metrics that you use, not the other way round.

Example questions/goals:
• How can I show I deserve this award/grant?
• Which of several applicants would be a good fit with our existing group?
• How can I show that I should be promoted or get tenure?
• How can I attract more students/researchers to my group?

Researchers will experience the reverse direction also:
• Funders/line managers using metrics as one input into decisions (in addition to opinion and peer review)

Note: Metrics don’t need to be about top down evaluation or benchmarking


Evaluations/showcasing can fall into 3 types

Distinguishing between performance: “looking for the best”
• Typical question: Will I have the best chance of a positive outcome if I invest in X or in Y?
• Useful approach: Average of all publications, e.g. Citations per Publication

Demonstrating excellence: “showing off”
• Typical question: How can I showcase Z to look the best possible?
• Useful approach: Highlight the few top publications in a data set, e.g. Publications in Top Percentiles

Modeling scenarios: “fantasy academia”
• Typical question: What if I…?
• Useful approach: Depends…

Snowball Metric; www.snowballmetrics.com/metrics


Q2: What am I looking to evaluate or showcase?

• Institution / group of / discipline within
• Country / group of / discipline within
• Research Area / group of
• Researcher / group of
• Publication Set / group of
• Awards program of a funding agency
• Effectiveness of policy
• Etc.


Q3: How will I recognise/demonstrate good performance?

A: usually, relative to peers that you have selected to be:
• Distinguishing: equivalent to and a bit above your status
• Demonstrating excellence: equivalent to and somewhat below your status

A few considerations that may affect your selection of peers:
• Size – of publication output / student program / funding
• Status – recognition / league tables / reputation / influence
• Disciplinary focus
• Geographic location
• Degree of focus on research or teaching
• Comparators for your university, or for a department or academic


Q4: Which metrics could help me make my decision? This list displays metrics being developed by SciVal

Productivity metrics: Scholarly Output; h-indices (h, g, m); Publication Share

Citation Impact metrics: Citation Count; Citations per Publication; Cited Publications; h-indices (h, g, m); Field-Weighted Citation Impact; Publications in Top Percentiles; Publications in Top Journal Percentiles; Citation Share; Collaboration Impact; Academic-Corporate Collaboration Impact

Disciplinarity metrics: Journal count; Category count

Collaboration metrics: Number of Authors; Authorship Count; Number of Collaborating Countries; Number of Citing Countries; Collaboration; Academic-Corporate Collaboration

Showcasing excellence to support, for example, a grant application


Which metrics help me showcase performance? This list displays metrics being developed by SciVal

Productivity metrics: Scholarly Output; h-indices (h, g, m); Publication Share

Citation Impact metrics: Citation Count; Citations per Publication; Cited Publications; h-indices (h, g, m); Field-Weighted Citation Impact; Publications in Top Percentiles; Publications in Top Journal Percentiles; Citation Share; Collaboration Impact (geographical); Academic-Corporate Collaboration Impact

Disciplinarity metrics: Journal count; Category count

Collaboration metrics: Number of Authors; Authorship Count; Number of Collaborating Countries; Number of Citing Countries; Collaboration (geographical); Academic-Corporate Collaboration


Productivity metrics are “power” metrics: useful to make big entities look good, but unfair if you are comparing entities of different sizes.


Extensive Citation Impact metrics address many needs. This list displays metrics being developed by SciVal

Productivity metrics: Scholarly Output; h-indices (h, g, m); Publication Share

Citation Impact metrics: Citation Count; Citations per Publication; Cited Publications; h-indices (h, g, m); Field-Weighted Citation Impact; Publications in Top Percentiles; Publications in Top Journal Percentiles; Citation Share; Collaboration Impact (geographical); Academic-Corporate Collaboration Impact

Disciplinarity metrics: Journal count; Category count

Collaboration metrics: Number of Co-authors; Authorship Count; Number of Collaborating Countries; Number of Citing Countries; Collaboration; Academic-Corporate Collaboration


Field-Weighted Citation Impact is very useful because it accounts for several variables that affect citation counts, and values for recent publications do not drop. However, it is not transparent for new users.

Citation Count is a “power” metric: useful to make big entities look good, but unfair if you are comparing entities of different sizes, and recent publications have had little time to accumulate citations.

Citations per Publication is a size-normalized metric, useful to compare entities of different sizes; recent publications, however, have few citations.
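To make the field-weighting idea concrete: Field-Weighted Citation Impact divides each publication's citations by the citations expected for publications of the same field, year, and document type. SciVal's actual calculation runs over the full Scopus database with its own field classification; the sketch below only illustrates the principle, using a hypothetical reference set and hypothetical dictionary keys.

```python
from collections import defaultdict
from statistics import mean

def fwci(publications, reference_set):
    """Illustrative Field-Weighted Citation Impact.

    Each publication's citation count is divided by the average citations
    of all reference-set publications sharing its (field, year, type) cell;
    FWCI is the mean of those ratios, so 1.0 means "world average".
    Both arguments are lists of dicts with hypothetical keys
    'field', 'year', 'type', and 'citations'.
    """
    # Expected citations per (field, year, type) cell, from the reference set
    cells = defaultdict(list)
    for p in reference_set:
        cells[(p['field'], p['year'], p['type'])].append(p['citations'])
    expected = {cell: mean(counts) for cell, counts in cells.items()}

    ratios = [p['citations'] / expected[(p['field'], p['year'], p['type'])]
              for p in publications]
    return mean(ratios)
```

Because the denominator is recomputed per year, a young paper is compared only against equally young papers, which is why recent FWCI values do not drop the way raw citation counts do.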


h-index variants emphasize different strengths. Examples given for researchers only. This list displays metrics being developed by SciVal

Productivity metrics: Scholarly Output; h-indices (h, g, m); Publication Share

Citation Impact metrics: Citation Count; Citations per Publication; Cited Publications; h-indices (h, g, m); Field-Weighted Citation Impact; Publications in Top Percentiles; Publications in Top Journal Percentiles; Citation Share; Collaboration Impact (geographical); Academic-Corporate Collaboration Impact

Disciplinarity metrics: Journal count; Category count

Collaboration metrics: Number of Authors; Authorship Count; Number of Collaborating Countries; Number of Citing Countries; Collaboration; Academic-Corporate Collaboration


m-index is the h-index divided by the number of years of publishing activity. It ‘levels’ the playing field for researchers with different career lengths. It is not useful for researchers who have had a career break.

An h-index of 7 means that 7 of a researcher’s papers have each been cited at least 7 times. For researchers, it is useful because it indicates both productivity and citation impact. It is not useful for new researchers with few citations.

g-index emphasizes and rewards the most highly cited papers. It is always equal to or higher than the h-index. For researchers, it is good for emphasizing exceptional papers. It is not useful for average researchers where h = g, or for new researchers.
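The three index definitions above can be sketched directly from a list of per-paper citation counts. This is a minimal illustration of the standard definitions (Hirsch's h, Egghe's g, and h per year active), not SciVal's implementation:

```python
def h_index(citations):
    """Largest h such that h papers each have at least h citations."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

def g_index(citations):
    """Largest g such that the top g papers together have at least g**2 citations."""
    ranked = sorted(citations, reverse=True)
    total, g = 0, 0
    for rank, cites in enumerate(ranked, start=1):
        total += cites
        if total >= rank * rank:
            g = rank
    return g

def m_index(citations, years_active):
    """h-index per year of publishing activity."""
    return h_index(citations) / years_active
```

For example, a researcher whose papers have been cited [10, 8, 5, 4, 3, 2, 1] times has h = 4 and g = 5: the one highly cited paper lifts g above h, which is exactly the “rewards exceptional papers” behaviour described above.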

Not all Citation Impact metrics need the data set to have citations! This list displays metrics being developed by SciVal


Productivity metrics: Scholarly Output; h-indices (h, g, m); Publication Share

Citation Impact metrics: Citation Count; Citations per Publication; Cited Publications; h-indices (h, g, m); Field-Weighted Citation Impact; Publications in Top Percentiles; Publications in Top Journal Percentiles; Citation Share; Collaboration Impact (geographical); Academic-Corporate Collaboration Impact

Disciplinarity metrics: Journal count; Category count

Collaboration metrics: Number of Co-authors; Authorship Count; Number of Collaborating Countries; Number of Citing Countries; Collaboration; Academic-Corporate Collaboration


Publications in Top Journal Percentiles. This is a good metric to engage researchers. It is also useful early in a strategy or career because publications do not need their own citations. However, publications are judged based on the average performance of the journal.

Publications in Top Percentiles. Useful to distinguish between entities whose averages are similar, and to show off. Not always inclusive of average entities, and time is needed for citations to be received.
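The idea behind Publications in Top Percentiles is a simple threshold count: rank the reference set by citations, find the citation count that marks the top slice, and count how many of your own publications clear it. SciVal computes its percentile boundaries per publication year across Scopus and has specific tie-handling rules; the following is only an illustrative sketch with a hypothetical function name:

```python
def publications_in_top_percentiles(my_citations, world_citations, percentile=10):
    """Count my publications whose citations reach the world's top `percentile`%.

    my_citations / world_citations: lists of per-publication citation counts.
    """
    ranked = sorted(world_citations, reverse=True)
    # Size of the top slice, e.g. 10% of the reference set (at least one paper)
    top_n = max(1, round(len(ranked) * percentile / 100))
    threshold = ranked[top_n - 1]  # minimum citations to be in the top slice
    return sum(1 for cites in my_citations if cites >= threshold)
```

Because the threshold comes from the whole reference set, two entities with similar Citations per Publication averages can still differ sharply on this count, which is what makes it useful for showing off a few exceptional papers.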


Topical Collaboration metrics have broad value. This list displays metrics being developed by SciVal

Productivity metrics: Scholarly Output; h-indices (h, g, m); Publication Share

Citation Impact metrics: Citation Count; Citations per Publication; Cited Publications; h-indices (h, g, m); Field-Weighted Citation Impact; Publications in Top Percentiles; Publications in Top Journal Percentiles; Citation Share; Collaboration Impact (geographical); Academic-Corporate Collaboration Impact

Disciplinarity metrics: Journal count; Category count

Collaboration metrics: Number of Authors; Authorship Count; Number of Collaborating Countries; Number of Citing Countries; Collaboration (geographical); Academic-Corporate Collaboration


Collaboration metrics only need the affiliation information that authors have included on their publications; they do not need any citations. They are very useful, e.g. at the start of a new strategy, or early in a researcher’s career, when publications exist but too little time has passed for citation-based metrics to be reliable.


Showcasing the performance of a senior researcher

• Likely a large body of work available to showcase
• Publication volume and total citation counts, as well as associated indices (h- and g-index), should work well?

Metrics to consider:
• cpp (Citations per Publication)
• Top citation percentiles
• Top Journal percentiles
• h- and g-index
• Collaboration metrics
  – Demonstrate the expertise of your collaborators who support your research
  – Demonstrate your network/reach


Showcasing the performance of a junior researcher

• Potentially smaller body of work available to showcase
• Simple counts perhaps not so useful, due to lower volume and maybe not enough time to accumulate the necessary citations
• h- and g-index potentially not so useful, but m-index?

Metrics to consider:
• cpp (Citations per Publication)
• Top citation percentiles
• Top Journal percentiles
• m-index
• Collaboration metrics
  – Demonstrate the expertise of your collaborators who support your research
  – Demonstrate your network/reach


A model for selecting metrics with 4 questions

What am I trying to achieve/answer?

What am I evaluating/showcasing?

How will I recognise good performance?

Which metrics will help me?


What currently happens?

• How do Academics at your institution currently showcase their expertise?
  o Grant applications
  o PDR discussions
  o Promotion

• Do they/you use citations or any metrics?

• How do you currently support Academics in showcasing their expertise?

Thank you for your attention


Snowball Metrics are a subset of SciVal metrics. This list displays metrics being developed by SciVal

Productivity metrics: Scholarly Output; h-indices (h, g, m); Publication Share

Citation Impact metrics: Citation Count; Citations per Publication; Cited Publications; h-indices (h, g, m); Field-Weighted Citation Impact; Publications in Top Percentiles; Publications in Top Journal Percentiles; Citation Share; Collaboration Impact (geographical); Academic-Corporate Collaboration Impact

Disciplinarity metrics: Journal count; Category count

Collaboration metrics: Number of Co-authors; Authorship Count; Number of Collaborating Countries; Number of Citing Countries; Collaboration; Academic-Corporate Collaboration

Snowball Metric; www.snowballmetrics.com/metrics

Snowball Metrics are endorsed by distinguished universities. They are a manageable, convenient way to start using benchmarking data in university strategy. Other metrics allow more sophisticated analysis, and are useful for other entities.