
Downloads and Beyond

- new perspectives on usage metrics

This panel consists of three presentations, covering different stages in the use of articles by scholars:

Before the download – what happens during the search process?

◦ Marie Kennedy, Loyola Marymount University, Los Angeles, CA

Download-based metrics – new usage-based measures of impact

◦ Peter Shepherd, COUNTER, UK

Beyond downloads – how are journal articles shared and used?

◦ Carol Tenopir, University of Tennessee, Knoxville, TN

New usage-based measures of impact - Article Level Reporting and the Usage Factor

Peter Shepherd

COUNTER

Library Assessment Conference

Seattle, 4 August 2014

COUNTER usage-based measures in the context of altmetrics

Advantages:

Usage can be reported at the individual item and individual researcher level

Usage is more ’immediate’ than citations

Usage potentially covers all categories of online publication

COUNTER usage statistics are independently audited and generally trusted

Two new COUNTER Codes of Practice have been launched:

COUNTER Code of Practice for Articles (COUNTER Articles)

Recording, consolidation and reporting of usage at the individual article level

Standard applies to publishers, aggregators and repositories

COUNTER Code of Practice for Usage Factor

Usage-based measure of impact of journals, institutions and individual scholars

The Usage Factor for a Journal is the Median Value in a set of ordered full-text article usage data (i.e. the number of successful full-text article requests) for a specified Usage Period of articles published in a journal during a specified Publication Period.

COUNTER Articles and Usage Factor are both based on the recording and consolidation of COUNTER-compliant usage data at the individual article level

COUNTER Code of Practice for Articles

COUNTER Articles covers the following areas:

article types to be counted;

article versions to be counted;

data elements to be measured;

definitions of these data elements;

content and format of usage reports;

requirements for data collection and data processing;

requirements for independent audit (under development);

Release 1 of the COUNTER Code of Practice for Articles is available on the COUNTER website at: http://www.projectcounter.org/counterarticles.html

COUNTER Articles: data and metadata

Publisher/aggregator organizations should collect the usage data in the format specified in Article Report 1. The following data and metadata must be collected for each article (a minimal record sketch follows the list below):

Either Print ISSN OR Online ISSN

Article version, where available

Article DOI

Date of First Successful Request

Monthly count of the number of successful full-text requests - counts must remain available for at least 24 months from Online Publication Date OR date of First Successful Request

The following metadata are optional but desirable:

Journal title

Publisher name

Platform name

Journal DOI

Article title

Article type

Article publication date
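
The Code of Practice defines these data elements but not a specific data structure. As a minimal sketch only, assuming hypothetical field names that mirror the list above, an Article Report 1 record might be represented like this:

```python
# Minimal sketch of an Article Report 1 record, based on the data elements
# listed above. Field names are illustrative, not prescribed by COUNTER.
from dataclasses import dataclass, field
from typing import Dict, Optional

@dataclass
class ArticleUsageRecord:
    # Required elements
    issn: str                      # Print ISSN or Online ISSN
    article_doi: str
    date_of_first_request: str     # date of First Successful Request, e.g. "2014-03"
    monthly_requests: Dict[str, int] = field(default_factory=dict)  # "YYYY-MM" -> count
    article_version: Optional[str] = None   # where available
    # Optional but desirable metadata
    journal_title: Optional[str] = None
    publisher_name: Optional[str] = None
    platform_name: Optional[str] = None
    journal_doi: Optional[str] = None
    article_title: Optional[str] = None
    article_type: Optional[str] = None
    article_publication_date: Optional[str] = None
```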

COUNTER Articles – 3 Article Reports

Article Report 1: publisher specification for data collection by article

To be used by publishers for the collection of data and metadata

Article Report 2: number of successful full-text article requests by author, month and DOI, consolidated from different sources

To be used by publishers to report individual article usage to authors, broken down by source of usage

Article Report 3: summary of all successful full-text article requests for an author, by month

To be used by publishers to provide a summary to authors of usage for all their articles
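
The Code of Practice specifies the content of these reports, not an implementation. The following is only a rough sketch, with assumed field names, of how per-article monthly counts might be rolled up into an author-level summary in the spirit of Article Report 3:

```python
# Illustrative roll-up from article-level monthly counts to a per-author,
# per-month summary. The input structure is an assumption, not a COUNTER definition.
from collections import defaultdict

article_usage = [
    # (author, DOI, month, source, successful full-text requests)
    ("A. Author", "10.1234/abc", "2014-01", "publisher platform", 42),
    ("A. Author", "10.1234/abc", "2014-01", "repository", 7),
    ("A. Author", "10.1234/xyz", "2014-01", "publisher platform", 11),
]

# (author, month) -> total requests across all of that author's articles and sources
summary = defaultdict(int)
for author, doi, month, source, count in article_usage:
    summary[(author, month)] += count

for (author, month), total in sorted(summary.items()):
    print(f"{author}  {month}  {total}")
```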

Article Report 1: specification for data collection by article

Usage Factor: aims and outcomes

The overall aim of the Usage Factor project was to explore how online journal usage statistics might form the basis of a new measure of journal impact and quality, the Usage Factor for journals.

Specific objectives were to answer the following questions:

Will Usage Factor be a statistically meaningful measure?

Will Usage Factor be accepted by researchers, publishers, librarians and research

institutions?

Will Usage Factor be statistically credible and robust?

Is there an organizational and economic model for its implementation that would be cost-effective and acceptable to the major stakeholder groups?

Following extensive testing using usage data for over 200 journals from a range of publishers, the main outcome of the project has been the new COUNTER Code of Practice for Usage Factors. This new Code of Practice uses the article-level usage data collected under the COUNTER Code of Practice for Articles as the basis for the calculation of the Usage Factor.

The COUNTER Code of Practice for Usage Factors is available on the COUNTER website at: http://www.projectcounter.org/usage_factor.html

Who will benefit from the Usage Factor?

Four major groups will benefit from the introduction of Usage Factors:

Authors, especially those in practitioner-oriented fields, where citation-based measures understate the impact of journals, as well as those in areas outside the core STM fields of pure research, where coverage of journals by citation-based measures is weak.

Publishers, especially those with large numbers of journals outside of the core STM research areas, where there is no reliable, universal measure of journal impact, because citation-based measures are either inadequate or non-existent for these fields

Librarians, when deciding on new journal acquisitions, have no reliable, global measures of journal impact for fields outside the core STM research fields. They would use usage-based measures to help them prioritise journals to be added to their collections.

Research Funding Agencies, who are seeking a wider range of credible, consistent quantitative measures of the value and impact of the outputs of the research that they fund.

Usage Factor metric: recommendations

Usage Factors should be calculated using the median rather than the arithmetic mean

A range of Usage Factors should ideally be published for each journal: a comprehensive UF (all items, all countable versions) plus supplementary factors for selected items

Usage Factors should be published as integers with no decimal places

Usage Factors should be published with appropriate confidence levels around the average to guide their interpretation (an illustrative sketch follows this list)

The Usage Factor should be calculated initially on the basis of a maximum usage time window of 24 months.

The Usage Factor is not directly comparable across subject groups and should therefore be published and interpreted only within appropriate subject groupings.

The Usage Factor should be calculated using a publication window of 2 years
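
The Code of Practice asks for appropriate confidence levels but does not prescribe how they are computed. As one illustrative possibility (an assumption, not a COUNTER requirement), a bootstrap percentile interval could be reported alongside the integer median:

```python
# Illustrative only: reports the median as an integer Usage Factor together with
# a bootstrap percentile interval. The bootstrap approach is an assumption; the
# Code of Practice only calls for "appropriate confidence levels" around the average.
import random
import statistics

def usage_factor_with_interval(article_counts, n_boot=1000, level=0.95, seed=0):
    rng = random.Random(seed)
    uf = round(statistics.median(article_counts))  # published with no decimal places
    boot_medians = sorted(
        statistics.median(rng.choices(article_counts, k=len(article_counts)))
        for _ in range(n_boot)
    )
    lo = boot_medians[int((1 - level) / 2 * n_boot)]
    hi = boot_medians[int((1 + level) / 2 * n_boot) - 1]
    return uf, (lo, hi)

# Hypothetical per-article request totals over the 24-month usage window
counts = [3, 12, 7, 0, 25, 9, 4, 18, 6, 2]
print(usage_factor_with_interval(counts))
```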

Usage Factor: Journals - the calculation

Publishers will be able to generate Usage Factors using the Code of Practice, but will have to be independently audited for their Usage Factors to be listed in the Usage Factor Central Registry. Two categories of Usage Factor may be calculated:

The 24-month Journal Usage Factor 2010/2011: all content

The median number of successful requests during 2010/2011 to content published in the journal in 2010/2011

The Journal Usage Factor 2010/2011: full-text articles only

The median number of successful requests during 2010/2011 to full-text articles published in the journal in 2010/2011

Note:

1. The article-level data collected in COUNTER Article Report 1 will be used as the basis for the Usage Factor calculation

2. Usage Factors will be reported annually, for 2010/2011, 2011/2012, etc.
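
As a minimal sketch of the calculation described above, assuming article-level records with a publication year, an item type and monthly request counts (field names are ours, not COUNTER's):

```python
# Minimal sketch of the Journal Usage Factor: the median number of successful
# requests during the usage period to items published during the publication
# period. Record fields are illustrative assumptions.
import statistics

def journal_usage_factor(records, publication_years, usage_years, full_text_only=False):
    """records: iterable of dicts with 'publication_year', 'item_type',
    and 'monthly_requests' mapping 'YYYY-MM' -> count."""
    per_item_totals = []
    for rec in records:
        if rec["publication_year"] not in publication_years:
            continue
        if full_text_only and rec["item_type"] != "full-text article":
            continue
        total = sum(
            count for month, count in rec["monthly_requests"].items()
            if int(month[:4]) in usage_years
        )
        per_item_totals.append(total)
    return round(statistics.median(per_item_totals)) if per_item_totals else None

# Example: Journal Usage Factor 2010/2011, full-text articles only
records = [
    {"publication_year": 2010, "item_type": "full-text article",
     "monthly_requests": {"2010-06": 4, "2011-02": 9}},
    {"publication_year": 2011, "item_type": "editorial",
     "monthly_requests": {"2011-05": 30}},
]
print(journal_usage_factor(records, {2010, 2011}, {2010, 2011}, full_text_only=True))
```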

COUNTER Articles and Usage Factor - implementation

Step 1: Implement COUNTER Code of Practice for Articles

Step 2: Collect article-level usage data for 2014/2015

Step 3: Calculate and report Usage Factors using protocols specified in Code of Practice for Usage Factors

COUNTER Articles and Usage Factor

Common threads

Article-based metrics
• Can be rolled up to researcher, institution and journal level

Reliable, audited data
• Based on tested COUNTER standards

Common process/infrastructure requirements
• Similar metadata
• Efficient, cost-effective processes

For further information:

http://www.projectcounter.org/index.html

BEFORE THE DOWNLOAD: THE SEARCH PROCESS FROM A SOCIAL NETWORK ANALYSIS PERSPECTIVE

Marie R. Kennedy, Loyola Marymount University, [email protected]

David P. Kennedy, RAND Corporation, [email protected]

The finding

The found

TWO TYPES OF COUNTING MECHANISMS

COUNTER

Proxy server

EVALUATION AND USE

A MODEL OF A PERSON’S INFORMATION SEEKING

AN ALTERNATIVE COUNTING MECHANISM

METHODS

Describe and compare 3 kinds of measurements of electronic resource usage

Time frame: June 1, 2011 - May 31, 2012

COUNTER JR1

PROXY SERVER

SOCIAL NETWORK ANALYSIS

Data extracted from Gimlet

11,444 total service point interactions

4,024 tagged as reference interactions

1,548 of the reference interactions mention an electronic resource

SOCIAL NETWORK ANALYSIS

New data set created

1,548 of the reference interactions mention an electronic resource

Listed the resource mentioned and counted each time it was suggested

Analyzed and visualized using Ucinet, Netdraw
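
The study itself used Ucinet and Netdraw; purely as an illustration of the same idea, the sketch below builds a two-mode (staff and e-resource) network with networkx, using made-up records in place of the Gimlet data:

```python
# Illustrative sketch only: builds a two-mode network of Information Desk staff
# and the electronic resources they suggested, from hypothetical tagged
# reference interactions. networkx stands in for Ucinet/Netdraw here.
import networkx as nx

interactions = [
    {"staff": "Staff A", "resource": "Database X"},
    {"staff": "Staff A", "resource": "Database Y"},
    {"staff": "Staff B", "resource": "Database X"},
]

G = nx.Graph()
for row in interactions:
    staff, resource = row["staff"], row["resource"]
    G.add_node(staff, kind="staff")
    G.add_node(resource, kind="resource")
    # Edge weight counts how often a staff member suggested a given resource
    if G.has_edge(staff, resource):
        G[staff][resource]["weight"] += 1
    else:
        G.add_edge(staff, resource, weight=1)

# Degree centrality highlights "core" staff and resources in the network
print(sorted(nx.degree_centrality(G).items(), key=lambda kv: -kv[1]))
```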

FINDINGS

Electronic resources and Information Desk staff, by color/shape

Core staff and electronic resources emerge

Connections between electronic resources

DISCUSSION

“Knowledge creation is not confined to an individual, rather it is a social process between individuals, groups and organisations.”

(Zheng and Yano, 2007, p. 5)

FUTURE RESEARCH

Further analysis of the existing data set

Which kinds of e-resources are suggested to which kinds of patrons

Which kinds of reference desk staff suggest which kinds of e-resources

Expand data set to include more years of data

Develop e-resource marketing plan and look at the resulting 3 kinds of usage data

SUMMARY

We find that the perspective gained from social network analysis provides a context-aware component that gives a fuller picture of the “use” of electronic resources, following the path from “finding” to “found.”

CONTACT US

Marie R. Kennedy, [email protected]

David P. Kennedy, [email protected]

This presentation is supported by a Research Incentive Grant from the William H. Hannon Library at LMU

Preliminary results of this research were presented at the 2013 QQML Conference (Rome, Italy)

Center for Information and Communication Studies

Beyond Downloads: How Are Journal Articles Shared and Used?

Carol Tenopir, Professor, School of Information Sciences, University of Tennessee

[email protected]


Beyond Project COUNTER

• Secondary usage

• Sharing without downloading


Formal sharing methods


Informal sharing methods


Interviews / Focus Groups

Two main types of sharing

Most participants who share uploaded their own work into institutional or subject repositories.


Interviews / Focus Groups

Participants shared material to further scientific and academic discovery, to promote their own or someone else’s work, and to fulfill an information need.


Overall, the project aims to:

• define ways to measure non-download usage of digital content both within and outside institutional firewalls

• evaluate the relationship between COUNTER usage and usage of digital articles obtained through other means

• develop practical ways to estimate total digital article usage from known downloads and non-download usage (see the sketch after this list)

• initiate discussion across the publisher, STM research, and library communities regarding these issues
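
The project has not yet defined how total usage will be estimated; the sketch referenced above is purely hypothetical, scaling known COUNTER downloads by a survey-derived sharing multiplier:

```python
# Purely hypothetical illustration of estimating total article usage from known
# downloads plus non-download sharing. The "sharing multiplier" (extra readers
# reached per downloaded copy) and all numbers below are made up.
def estimate_total_usage(counter_downloads: int, sharing_multiplier: float) -> float:
    """Estimate total usage as downloads plus estimated shared reads."""
    return counter_downloads * (1 + sharing_multiplier)

# Example: 1,000 COUNTER downloads and a hypothetical average of 0.6 extra
# readers reached per download via email, Dropbox, repositories, etc.
print(estimate_total_usage(1000, 0.6))  # -> 1600.0
```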


Interviews / Focus Groups

• “Bootleg” sharing (e.g., email, print, internal network): the most frequently mentioned method of sharing.

• Twitter: the most frequently mentioned social media tool for sharing.

• Dropbox: the most frequently mentioned method used for sharing with collaborators.


Survey launching soon

• Population: researchers internationally

• Aim: to estimate amounts of sharing and calculate averages that take into account:
  ◦ Multiple ways to share
  ◦ Differences in discipline

• Development of instrument underway


Stay tuned for further results!

Carol Tenopir

[email protected]