WHAT CAN BIBLIOMETRICS TELL US ABOUT MEDICINE & BIOMEDICINE? Philip Purnell Leiden May 2010


Page 1: Leiden Conference May 2010

WHAT CAN BIBLIOMETRICS TELL US ABOUT MEDICINE & BIOMEDICINE?

Philip Purnell

Leiden, May 2010

Page 2: Leiden Conference May 2010

HOW DO WE EVALUATE RESEARCH?

• Research grants – number and value
• Prestigious awards – Nobel Prizes
• Patents – demonstrating innovative research
• Faculty – number of post-graduate researchers
• Citation analysis – publication and citation counts, normalised by benchmarks
• Peer evaluation – expensive, time-consuming and subjective

Page 3: Leiden Conference May 2010

BRIEF HISTORY OF THE CITATION INDEX

• Concept first developed by Dr Eugene Garfield – Science, 1955
• The Science Citation Index (1963)
  – SCI in print (1960s)
  – Online with SciSearch in the 1970s
  – CD-ROM in the 1980s
  – Web interface (1997): Web of Science
• Content enhanced:
  – Social Sciences Citation Index (SSCI)
  – Arts & Humanities Citation Index (AHCI)
• The Citation Index was primarily developed for information retrieval; the development of electronic media and powerful search tools has increased its use and popularity for research evaluation

Page 4: Leiden Conference May 2010

WEB OF SCIENCE JOURNAL SELECTION POLICY

WHY DO WE SELECT JOURNALS?

Page 5: Leiden Conference May 2010

THOMSON REUTERS JOURNAL CITATION REPORTS

40% of the journals account for:
• 80% of the publications
• 92% of cited papers

4% of the journals account for:
• 30% of the publications
• 51% of cited papers

[Chart: cumulative % of the database (articles and citations) vs. number of journals]

Page 6: Leiden Conference May 2010

WEB OF SCIENCE JOURNAL SELECTION POLICY

• Approx. 2,000 journals evaluated annually – 10–12% accepted
• Thomson Reuters editors
  – Information professionals
  – Librarians
  – Experts in the literature of their subject area

[Diagram: journals under evaluation filtered by journal 'quality' into Web of Science]

Page 7: Leiden Conference May 2010

THOMSON REUTERS JOURNAL SELECTION POLICY

• Publishing standards – peer review, editorial conventions
• Editorial content – addition to knowledge in a specific subject field
• Diversity – international and regional influence of authors, editors and advisors
• Citation analysis – editors' and authors' prior work

Page 8: Leiden Conference May 2010

GLOBAL RESEARCH REPRESENTATION: WEB OF SCIENCE COVERAGE

Region               Journals in Web of Science
Europe               5,573 (49%)
North America        4,251 (38%)
Asia-Pacific           965 (9%)
Latin America          272 (2%)
Middle East/Africa     200 (1%)

Language             Journals in Web of Science
English              9,114 (81%)
Other                2,147 (19%)

Page 9: Leiden Conference May 2010

SUMMARY: CONSISTENCY IS THE KEY TO VALIDITY

Analyses are based on authoritative, consistent data from the world's leading provider of research evaluation solutions.

Thomson Reuters has developed a selection policy over the last 50 years designed to hand-pick the relevant journals containing the core content over the full range of scholarly disciplines.

This has created a large set of journals containing comparable papers and citations.

Thomson Reuters has always had one consistent editorial policy: to index all journals cover-to-cover, index all authors and index all addresses. This unique consistency makes Web of Science the only suitable data source for citation analysis.

Page 10: Leiden Conference May 2010

EVALUATING MEDICINE

Page 11: Leiden Conference May 2010

RELATIVE IMPACT OF MEDICINE


Page 12: Leiden Conference May 2010

RELATIVE CONTRIBUTION TO THE WORLD'S LITERATURE

Page 13: Leiden Conference May 2010

RELATIVE PERFORMANCE OF MEDICAL DISCIPLINES

Page 14: Leiden Conference May 2010

EVALUATING COUNTRIES

Page 15: Leiden Conference May 2010

COMPARATIVE PERFORMANCE BY COUNTRY

Page 16: Leiden Conference May 2010

EFFICIENCY COMPARED WITH EUROPEAN AND GLOBAL AVERAGES

Page 17: Leiden Conference May 2010

GOVERNMENTS AND INSTITUTIONS USING TR DATA FOR EVALUATION (INCL.)

• Netherlands: NWO & KNAW
• France: Min. de la Recherche, OST – Paris, CNRS
• Germany: Max Planck Society, several gov't labs, DKFZ, MDC
• US: National Institutes of Health; NSF – biennial Science & Engineering Indicators report (since 1974)
• United Kingdom: King's College London; HEFCE
• European Union: EC's DGXII (Research Directorate)
• Canada: NSERC, FRSQ (Quebec), Alberta Research Council
• Australia: Australian Academy of Science, gov't lab CSIRO
• Japan: Ministry of Education; Ministry of Economy, Trade & Industry
• People's Republic of China: Chinese Academy of Science
• Times Higher Education: World University Rankings (from 2010)

Page 18: Leiden Conference May 2010

GLOBAL REACH: >4,000 RESEARCH CENTRES (91 COUNTRIES)

• Asia-Pacific: 353 customers in 26 countries
• Russia: 147 customers
• Europe, Middle East and Africa: 2,500+ customers in 50 countries
• Latin America: 244 customers in 12 countries
• North America: 760 customers

Page 19: Leiden Conference May 2010

EVALUATING INSTITUTIONS

Page 20: Leiden Conference May 2010

EVALUATING INSTITUTIONS

Source: Thomson Reuters North America University Science Indicators

Page 21: Leiden Conference May 2010

COMPARATIVE PERFORMANCE OF GLOBAL MEDICINE RESEARCH

Page 22: Leiden Conference May 2010

BENCHMARK YOUR PAPERS AGAINST GLOBAL AVERAGES

• Hematology articles from this year have been cited 18.83 times on average
• This article is ranked in the 12.92nd percentile in its field by citations
• Articles published in 'Blood' from 2004 have been cited 34.30 times on average
• This paper has received 40/34.30 = 1.17 times the expected citations for this journal
• This paper has received 40/18.83 = 2.12 times the expected citations for this subject category
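The two ratios quoted on this slide can be sketched in a few lines. This is an illustrative sketch (not a Thomson Reuters tool), using only the figures given above:

```python
# Illustrative sketch of the relative citation impact shown on the slide:
# a paper's citations divided by the expected (average) citations of a baseline.
def relative_impact(times_cited, expected_citations):
    """How many times the expected citations a paper has received, relative
    to a baseline such as its journal-year or subject-category-year average."""
    return times_cited / expected_citations

# Figures from the slide: a paper cited 40 times; baselines of 34.30
# ('Blood', 2004) and 18.83 (Hematology, same year).
print(round(relative_impact(40, 34.30), 2))  # 1.17 (journal baseline)
print(round(relative_impact(40, 18.83), 2))  # 2.12 (subject-category baseline)
```

A ratio above 1.0 means the paper outperforms its baseline; the same paper can score differently against its journal and its subject category, as here.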

Page 23: Leiden Conference May 2010

WHICH COLLABORATIONS ARE THE MOST VALUABLE?

Collaborations with these institutions have produced highly cited papers within their subject fields.

Page 24: Leiden Conference May 2010

EVALUATING INDIVIDUALS

Page 25: Leiden Conference May 2010

WHO ARE OUR MOST PRODUCTIVE AUTHORS?

Page 26: Leiden Conference May 2010

WHO ARE OUR MOST INFLUENTIAL RESEARCHERS?

Page 27: Leiden Conference May 2010

WHICH AUTHORS HAVE THE MOST IMPACT?

Normalises citation counts for quantity of papers…

… but not for age of paper, document type or subject field!
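A metric that normalises for the quantity of papers but not for their age, document type or field can be sketched as follows. The slide does not name the metric; this sketch assumes it alludes to the h-index, which fits that description:

```python
# Hedged sketch of the h-index (an assumption - the slide does not name its
# metric): the largest h such that h papers have at least h citations each.
def h_index(citation_counts):
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank  # this paper still clears the rank threshold
        else:
            break
    return h

# Hypothetical citation counts for one author's papers.
print(h_index([25, 8, 5, 4, 3, 1]))  # 4: four papers cited at least 4 times
```

As the slide warns, such a count is not comparable across fields, document types or career lengths.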

Page 28: Leiden Conference May 2010

WHICH AUTHORS' PAPERS HAVE PERFORMED THE BEST IN THEIR FIELD?

Normalises the citation average for subject field and age of papers, meaning you can now compare the geneticist with the historian.
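Normalisation of the kind described here can be sketched as averaging each paper's citations against its field-year baseline. This is an illustrative sketch with hypothetical figures, not Thomson Reuters' exact formula:

```python
# Illustrative sketch: average of per-paper ratios of actual to expected
# citations, where each baseline reflects the paper's subject field and year.
def normalized_impact(papers):
    """papers: list of (times_cited, expected_citations_for_field_and_year)."""
    ratios = [cited / expected for cited, expected in papers]
    return sum(ratios) / len(ratios)

# Hypothetical portfolio: three papers, each with its own field-year baseline.
score = normalized_impact([(40, 18.83), (10, 20.0), (5, 2.5)])
print(round(score, 2))  # scores above 1.0 exceed the world average for the field
```

Because each paper is compared only with its own field and year, the resulting scores are comparable across disciplines.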

Page 29: Leiden Conference May 2010

HOW CAN WE COMPARE RESEARCHERS?


Author A: 60 papers Author B: 117 papers

Page 30: Leiden Conference May 2010

EVALUATING JOURNALS

Page 31: Leiden Conference May 2010

EFFICIENCY: JOURNAL IMPACT FACTOR

[Diagram: timeline for the 2008 impact factor – a source paper published in 2008; only cited references published in 2006 or 2007 count towards the 2008 impact factor]

Page 32: Leiden Conference May 2010

CALCULATING THE 2008 IMPACT FACTOR: EPILEPSY RESEARCH

Citations in 2008:
• to items published in 2007 = 273
• to items published in 2006 = 463
• sum = 736

Number of items:
• published in 2007 = 133
• published in 2006 = 173
• sum = 306

2008 impact factor = 736 / 306 = 2.405
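The calculation above generalises directly. A minimal sketch of the two-year impact factor, following the slide's figures (not an official JCR implementation):

```python
# Minimal sketch of the two-year journal impact factor: citations received in
# year Y to items published in Y-1 and Y-2, divided by the number of items
# published in Y-1 and Y-2.
def impact_factor(year, citations_to_items, items_published):
    """Both mappings are publication year -> count."""
    window = (year - 1, year - 2)
    cites = sum(citations_to_items[y] for y in window)
    items = sum(items_published[y] for y in window)
    return cites / items

# Epilepsy Research figures from the slide.
citations_2008 = {2007: 273, 2006: 463}   # citations received in 2008
items = {2007: 133, 2006: 173}
print(round(impact_factor(2008, citations_2008, items), 3))  # 2.405
```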

Page 33: Leiden Conference May 2010

JOURNAL IMPACT FACTOR: BIOTECHNOLOGY JOURNALS

Page 34: Leiden Conference May 2010

CITATION BEHAVIOUR VARIES BETWEEN SUBJECT CATEGORIES

Page 35: Leiden Conference May 2010

JOURNAL 5-YEAR IMPACT FACTOR: BIOTECHNOLOGY JOURNALS

Page 36: Leiden Conference May 2010

HOW DO RESEARCH INSTITUTIONS EVALUATE JOURNALS FOR LIBRARIES?

• Faculty head request

• Publisher packages

• Budget constraints

• Library recommendation

Page 37: Leiden Conference May 2010

IN WHICH JOURNALS DO OUR BIOLOGISTS PUBLISH?

Page 38: Leiden Conference May 2010

AND IS THE GLOBAL COMMUNITY INFLUENCED BY YOUR RESEARCH?

Page 39: Leiden Conference May 2010

USING THE IMPACT FACTOR: EVALUATING JOURNALS

• Appropriate use
  – To evaluate journals
• Misuse
  – Evaluation of individual articles
  – Evaluation of an institution or researcher

Page 40: Leiden Conference May 2010

USING THE IMPACT FACTOR – MISUSE: EVALUATING INDIVIDUAL PAPERS

30% of articles in Food Policy were not cited at all.

Page 41: Leiden Conference May 2010

RESEARCHER ID: SCHOLARLY RESEARCH COMMUNITY

• Accurate Identification

• Organize and Manage

• Increase Visibility &Recognition

• Measure Performance

• Collaboration

• Security

Science, March 2009

Page 42: Leiden Conference May 2010

INDIVIDUAL LEVEL RESEARCH EVALUATION

View an accurate publication list due to unique author identification. See personalized metrics using Web of Science citation data.

Page 43: Leiden Conference May 2010

RESEARCHER ID: ANALYZE COLLABORATION NETWORK

Seek global collaboration opportunities by author, field, institution or country.

Page 44: Leiden Conference May 2010

RESEARCHER ID: VISUALIZE CITING ARTICLES NETWORK

Page 45: Leiden Conference May 2010

RESEARCHER ID: ELECTRONIC CV RESOURCE

Page 46: Leiden Conference May 2010

RESEARCHER ID: GLOBAL PARTICIPATION

[Charts: top five countries and top five institutions by participation]

Top Ten Institutions:
• University College London
• The University of Queensland
• Monash University
• Harvard University
• University of Michigan
• University of Pennsylvania
• ETH Zurich
• University of Cambridge
• Stanford University
• McGill University

Page 47: Leiden Conference May 2010

RESEARCHER ID: UPLOAD SERVICE

• Institutions can upload content on behalf of their researchers
  – Upload researcher names → obtain a ResearcherID account
  – Upload individual publication portfolios → articles are matched to Web of Science records and ResearcherID portfolios are generated

Page 48: Leiden Conference May 2010

RESEARCHER ID: DOWNLOAD SERVICE

• Download data about the individuals at your institution
  – Names and name variants, current and past affiliations
• Download ResearcherID publication portfolios
  – Bibliographic details of each item in the portfolio
  – For articles successfully matched to Web of Science records, a Times Cited count and UT tag are included
• Can be used by customers for their own internal systems