
Page 1

Evaluation of scientists: reasons, data and methods

Gábor B. Makara

Library and Information Centre, Hungarian Academy of Sciences

Page 2

Scientists' evaluation – the topics

• The need for evaluation
• The types of evaluation
• Data for evaluation
• Data selection
• Indicators
• The peer review process

Page 3

Evaluations are everywhere

• Scientists' evaluation starts at the doctoral schools

• Postdoctoral fellowships
• Job interviews, search committees, leadership applications, promotion
• Performance reviews
• Funding decisions
• Professorships, Doctor of Academy title
• Prizes

Page 4

Evaluation goals may be widely different

Selection goals differ across situations, such as
• general funding
• funding talented young scientists
• recognizing outstanding achievement
• filling a job opening

Goals should be defined in advance and kept in focus throughout

Page 5

Experienced evaluators are scarce

• Good peer reviewers’ time is in great demand, hence the importance of scientometric indicators.

• Good data and indicators may speed up and simplify evaluation

• It is not enough to get good scientists in a panel and tell them to do the evaluation

• Scientific evaluation requires guidelines, evaluator training/experience and diligent study of the subjects

Page 6

However, evaluation is similar to soccer: everyone seems to be an expert.

Page 7

Types of evaluations

Eligibility evaluation
- Comparing indicators to thresholds, one scientist at a time

Competitive (comparative) evaluation
- Ranking or grouping by indicators, over groups of scientists

Both types may use
− Indicators of scientific competence, productivity and recognition by peers
− Thresholds for the indicators

Disciplinary differences will apply (a minimal sketch of the two types follows)
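As a rough illustration of the two types, here is a minimal sketch (Python, not from the talk); the indicator names and threshold values are invented:

```python
# A minimal sketch (not from the talk) of the two evaluation types.
# Indicator names and threshold values are hypothetical.

def eligible(indicators, thresholds):
    """Eligibility evaluation: one scientist at a time, compared to thresholds."""
    return all(indicators.get(name, 0) >= limit for name, limit in thresholds.items())

def rank(candidates, key):
    """Competitive evaluation: rank a group of scientists by one indicator."""
    return sorted(candidates, key=lambda name: candidates[name].get(key, 0), reverse=True)

thresholds = {"papers": 20, "citations": 200}   # hypothetical subfield thresholds
candidates = {
    "A": {"papers": 25, "citations": 340},
    "B": {"papers": 18, "citations": 410},
}

print([name for name in candidates if eligible(candidates[name], thresholds)])  # ['A']
print(rank(candidates, "citations"))                                            # ['B', 'A']
```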

Page 8

Eugene Garfield (1979) suggested:

"Instead of directly comparing the citation count of,say, a mathematician against that of a biochemist,

both should be ranked with their peers, and the comparison should be made between rankings.”

“Using this method, a mathematician who ranked in the 70 percentile group of mathematicians would have an edge over a biochemist who ranked in the 40 percentile group of biochemists, even if the biochemist's citation count was higher."
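A minimal sketch of Garfield's within-field percentile idea (Python; all citation counts below are invented for illustration):

```python
# Rank each scientist within their own discipline and compare percentile
# ranks, not raw citation counts.

def percentile(count, peer_counts):
    """Share of peers (in %) whose citation count is at or below this one."""
    below_or_equal = sum(1 for c in peer_counts if c <= count)
    return 100.0 * below_or_equal / len(peer_counts)

mathematicians = [3, 8, 12, 20, 35, 60]          # hypothetical citation counts
biochemists    = [40, 90, 150, 300, 500, 900]

print(percentile(35, mathematicians))   # ~83: a high rank among mathematicians
print(percentile(150, biochemists))     # 50: a middling rank among biochemists
# The mathematician with 35 citations outranks the biochemist with 150.
```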

Page 9

Data in scientific evaluations
− Qualifications, licences, training, experience (time), prior positions
− Scientific publications
− Patents
− Contracts and contract income
− Grants
− Invitations
− Prizes

Page 10

Data on scientific publications
− International non-profit databases
  − Medline, Astrophysics Data System, ...
− Commercial databases
  − Web of Science, Scopus, ...
− National or institutional databases
− Ad hoc lists, self-evaluations

Across subdisciplines these vary tremendously in completeness, accuracy and applicability.

Page 11

Hungarian Scientific Bibliography Database (Hungarian abbreviation: MTMT)

The goal is to construct and maintain a complete and validated collection of scientific and scholarly publications (and citations) of Hungarian scientists and scholars

Page 12

MTMT characteristics

• Scientific and scholarly publications and citations

• National (widely used, quasi-compulsory)

• Publicly available

• Attribution to authors and institutions

• Author and institutional responsibility: "they know their own output best"

• Complete for given periods (2007-2014, and beyond)

• Standardized

• Validated

Page 13

Uses for MTMT

• Inventory of scientific output

• Public data for funding evaluation of individual scientists

• Evaluation of applicants for the Doctor of Academy (DSc) title

• Academy membership election

• Evaluation of research groups at Institutions of HAS

• Evaluation, accreditation of professors, universities?

• Gateway to open access repositories

Page 14

Publication selection for scientists’ evaluation

Publications (primary, secondary, ...)

• Trustworthy

• Scientific

• Reporting original research

• Separate the reviews from the original research

Citations in primary, secondary, ... research papers

• Original scientific publications

• Scientific reviews (journal, book, conference)

• Patents

• Dissertations (usually not a primary research publication)

• Miscellaneous
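A minimal sketch of this selection step (Python; the record fields and type labels are hypothetical, not the MTMT schema):

```python
# Keep trustworthy scientific records and separate reviews from original
# research before computing any indicator.

publications = [
    {"title": "P1", "type": "article",      "peer_reviewed": True},
    {"title": "P2", "type": "review",       "peer_reviewed": True},
    {"title": "P3", "type": "dissertation", "peer_reviewed": False},
    {"title": "P4", "type": "article",      "peer_reviewed": True},
]

scientific = [p for p in publications if p["peer_reviewed"]]
original   = [p for p in scientific if p["type"] == "article"]
reviews    = [p for p in scientific if p["type"] == "review"]

print(len(original), len(reviews))  # 2 original research papers, 1 review
```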

Page 15

Metrics for evaluation
‒ Journal-level metrics used with individual articles
  ‒ an easily committed error, frequent in Hungary as well, in which journal impact factor values are transferred to individual research articles and summed over a scientist's publications within a time interval (see the sketch below)
‒ Article-level metrics
  ‒ Citation counts – raw
  ‒ Citation counts – selected
  ‒ Citation counts – weighted or classified
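An invented two-paper example of why the summed-impact-factor shortcut misleads, compared with the articles' own citation counts:

```python
# The summed journal impact factor rewards the venue, while article-level
# citation counts reflect the reception of the papers themselves.

papers = [
    {"journal_if": 10.0, "citations": 2},   # high-IF journal, little-cited paper
    {"journal_if": 2.0,  "citations": 80},  # modest journal, well-cited paper
]

summed_if         = sum(p["journal_if"] for p in papers)  # journal-level: 12.0
article_citations = sum(p["citations"]  for p in papers)  # article-level: 82

print(summed_if, article_citations)
```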

Page 16

Evaluation by publications

• Numbers, raw
• Numbers weighted by the prestige of the journals
• Citation counts, raw numbers
• Weighted citation counts
• Downloads
• Web links to the publications, raw numbers
• Networks of citations
• and so on

Page 17

Identifying contributors (technicalities)

Identifying scientists by names or identifiers
ORCID (Open Researcher and Contributor ID) is coming, but is not here yet

Identifying institutional affiliation ("authorship")

MTMT solves both problems locally as authors and institutions are best placed to know and label their own publications. Errors in identification are spotted and corrected by those involved.

Page 18

An expensive myth: self-citation distortion

The myth: "Author self-citations are used to manipulate impact and to artificially increase one's own position in the community. Self-citations are very harmful and must be removed from the statistics." (Wolfgang Glänzel: Seven Myths in Bibliometrics, 2008)

• Eliminating self-citations carries a large administrative overhead without adding value to a sound evaluation

• The average self-citation rate is around 20% in our sample of 3 million citations (see the sketch below)
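A minimal sketch of how such a self-citation rate can be measured, assuming each citation record carries the citing and cited author sets (a toy data structure, not the MTMT one):

```python
# A citation is a self-citation if the citing and cited author sets overlap.

citations = [
    {"citing": {"Kovacs", "Nagy"}, "cited": {"Szabo"}},
    {"citing": {"Szabo", "Toth"},  "cited": {"Szabo"}},           # self-citation
    {"citing": {"Nagy"},           "cited": {"Kovacs", "Nagy"}},  # self-citation
    {"citing": {"Toth"},           "cited": {"Kovacs"}},
]

self_cites = sum(1 for c in citations if c["citing"] & c["cited"])
print(f"self-citation rate: {100 * self_cites / len(citations):.0f}%")  # 50% here
```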

Page 19

Attribution of credit for publications

• Are authors and institutions equal contributors?
• Can we attribute partial credit? (two schemes are sketched below)
• First, last, multiple first and/or last authorships, corresponding authorship
• Collaborations (large groups of scientists, experts) as authors
• Authors for collaborations?
• Publications with many authors: more than 10, more than 1000, ...
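The slide leaves the question open; as illustration, here are two common partial-credit schemes, equal fractions and harmonic (position-weighted) credit. Neither is endorsed by the talk:

```python
# Two partial-credit schemes for an n-author paper. Illustrative only.

def equal_credit(n):
    """Every author gets the same share of one unit of credit."""
    return [1.0 / n] * n

def harmonic_credit(n):
    """Earlier byline positions get larger shares; shares sum to 1."""
    weights = [1.0 / (i + 1) for i in range(n)]
    total = sum(weights)
    return [w / total for w in weights]

print(equal_credit(4))     # [0.25, 0.25, 0.25, 0.25]
print(harmonic_credit(4))  # first author ~0.48, last author ~0.12
```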

Page 20

Reference values

For eligibility evaluation
• Define at sub-field level
  E.g. neuro-ophthalmology versus neuroscience: reference values differ more than five-fold
• Published, fixed threshold values for subfields
  o Problems with small specialities
  o Scientists active in two or more widely different subfields
• Individual reference values are preferable
• Construct a tailor-made reference publication list (András Schubert et al.)
  o Match publications by subfield and maturity
  o Compare to the real peers
  o Requires investment, time, workforce and research
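As a loose sketch of the tailor-made reference list idea, one might compare each paper to the median citation count of peer papers matched on subfield and publication year. The field names and the choice of the median are my simplifications, not necessarily Schubert et al.'s exact method:

```python
from statistics import median

def reference_value(paper, corpus):
    """Median citation count of peer papers from the same subfield and year."""
    peers = [p["citations"] for p in corpus
             if p["subfield"] == paper["subfield"] and p["year"] == paper["year"]]
    return median(peers) if peers else None

corpus = [{"subfield": "neuroscience", "year": 2010, "citations": c}
          for c in (5, 12, 18, 30, 55)]
paper = {"subfield": "neuroscience", "year": 2010, "citations": 40}

print(reference_value(paper, corpus))  # 18: this paper sits above its matched peers
```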

Page 21

Peer review groups and scientometric indicators

• Evaluation is a human activity, helped by scientometric indicators

• The scales and thresholds are multidimensional and relative

• Algorithms are not available
• Peer review is the instrument of evaluation
• It transforms a multidimensional measurement process into decision-making (yes/no, ranking or grouping)

Page 22

Panel "technology„ is important

• Each dimension is given a scale
  o marks or points are assigned
  o additivity is implied, but...
  o decision by the numbers alone is bad practice
• Open debate
• Decision by consensus
• Decision by voting on rankings, using
  o points
  o ranking separately in each dimension
  o ranking or grouping, handling ties
  o weighting of the dimensions for the goals of the evaluation
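One way to make "decision by voting on rankings" concrete is Borda-style point aggregation; a minimal sketch with invented panelist rankings (tied totals would go back to open debate):

```python
from collections import defaultdict

rankings = [            # each panelist ranks the candidates, best first
    ["A", "B", "C"],
    ["B", "A", "C"],
    ["A", "C", "B"],
]

scores = defaultdict(int)
for ranking in rankings:
    for position, candidate in enumerate(ranking):
        scores[candidate] += len(ranking) - position  # Borda points

ordered = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
print(ordered)  # [('A', 8), ('B', 6), ('C', 4)]
```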

Page 23

Seven general points, as a summary

• Evaluation of scientists is inevitable
• Design the evaluation for its goals
• Use data appropriate to the goals
• Use selected types of publications and citations
• Use article-level metrics
• Choose appropriate reference values
• Use peer review panels carefully