Evaluating E-Reference: An Evidence Based Approach Elaine Lasda Bergman and Irina I. Holden University at Albany Presentation for Reference Renaissance Denver, CO August 10, 2010




Page 1: Evaluating E-Reference: An Evidence Based Approach


Elaine Lasda Bergman and Irina I. Holden
University at Albany
Presentation for Reference Renaissance
Denver, CO, August 10, 2010

Page 2

Overview

What is Evidence Based Librarianship?

Methods:
What constitutes “evidence”?
Systematic reviews and analyses

Systematic Review Process:
Research question
Database search
Article review
Critical appraisal
Synthesize, analyze, discuss

Page 3

Overview

Results of our review:
Methods of determining user satisfaction
Comparison of variables
Range of results

Conclusions, lessons learned:
About evidence based librarianship
About research quality
About user satisfaction with electronic reference

Page 4

What is Evidence Based Librarianship?

Booth and Brice’s definition of Evidence Based Information Practice:

“The Retrieval of rigorous and reliable evidence to inform… decision making”

(Booth and Brice, ix)

Page 5

What is Evidence Based Librarianship (EBL)?

History

Gained traction in medical fields in the 1990s and spread to the social sciences after that

Medical librarians were the first to bring this approach to LIS research

Increasingly used in social sciences and information/library science

Source: Booth and Brice, ix.

Page 6

Don’t we ALREADY use “evidence”?

Evidence is “out there, somewhere”

Disparate locations: many different journals, many different researchers

Evidence is not summarized, synthesized, or readily available

No formal, systematized, concerted effort to quantify whether there is a real pattern, rather than just a general sense of things

Page 7

Hierarchy of “Evidence”

Source: http://ebp.lib.uic.edu/applied_health/?q=node/12

Page 8

Systematic Reviews vs. Literature Reviews

Literature Review vs. Systematic Review:

Narrative text vs. research methodology/process

Evaluation by author’s opinion vs. formal critical appraisal process

Usually a single evaluator vs. best with multiple evaluators

Studies categorized but separately summarized vs. variables compared across studies, synthesized and analyzed

General sense of a pattern vs. quantified, identified patterns and comparisons

Page 9

Systematic Reviews: When Are They Useful?

Too much information in disparate sources

Too little information, hard to find all of the research

Help achieve consensus on debatable issues

Plan for new research

Provide teaching/learning materials

Page 10

Process of Systematic Review

Formulate Research Question

Database Search

Review Results

Critical Appraisal

Analysis

Page 11

Research Questions

Research question formulation includes:

Description of the parties involved in the studies (e.g., librarians and patrons)

What was being studied (e.g., effectiveness of instructional mode)

The outcomes and how they can be compared

What data should be collected for this purpose (e.g., student surveys or pre/post tests)

Page 12

Our Research Questions

1. What is the level of satisfaction of patrons who utilize digital reference?

2. What are the measures researchers use to quantify user satisfaction and how do they compare?

Page 13

Database Search

LISTA (EBSCO platform): 123 articles retrieved

LISA (CSA platform): 209 articles retrieved

ERIC: no unique studies retrieved

Page 14

Working with Results

279 results after de-duplication

Only format retrieved: journal articles

Abstracts were reviewed against the inclusion and exclusion criteria
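The de-duplication step described above can be sketched in code. This is a minimal illustration, not the authors' actual workflow; the record fields and the matching key (a normalized title) are assumptions.

```python
# Sketch of de-duplicating records retrieved from two databases.
# Matching on a normalized title is an assumed heuristic; real
# reference managers use fuzzier matching (author, year, DOI).

def normalize(title):
    """Lowercase and keep only letters/digits so near-identical titles match."""
    return "".join(ch for ch in title.lower() if ch.isalnum())

def deduplicate(*result_sets):
    seen, unique = set(), []
    for records in result_sets:
        for record in records:
            key = normalize(record["title"])
            if key not in seen:  # keep only the first copy of each title
                seen.add(key)
                unique.append(record)
    return unique

# Toy stand-ins for the LISTA and LISA result sets:
lista = [{"title": "Chat Reference Quality"}, {"title": "E-mail Reference Use"}]
lisa = [{"title": "Chat reference quality!"}, {"title": "IM Reference Satisfaction"}]
print(len(deduplicate(lista, lisa)))  # 3 (one duplicate removed)
```

In the review itself, the 123 + 209 retrieved records reduced to 279 after this kind of de-duplication.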

Page 15

Sample Inclusion/Exclusion Criteria

Inclusion:
Peer-reviewed journals
Articles comparing e-reference with face-to-face reference
Articles on academic, public, and special libraries
Articles on e-mail, IM, and “chat” reference

Exclusion:
Articles describing how to implement digital reference programs
Articles discussing quantitative or demographic data only
Reviews, editorials, and commentary
Non-English articles
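The screening pass can be pictured as a filter over records; in the actual review both authors read each abstract by hand, so the machine-checkable rules below (field names included) are hypothetical stand-ins that only illustrate the filtering logic.

```python
# Hypothetical sketch of applying inclusion/exclusion criteria to records.
# Field names are invented for illustration; the real screening was a
# human judgment on each abstract.

def passes_screening(record):
    if not record.get("peer_reviewed"):  # inclusion: peer-reviewed journals only
        return False
    if record.get("language") != "en":   # exclusion: non-English articles
        return False
    if record.get("type") in {"review", "editorial", "commentary"}:
        return False                     # exclusion: non-research pieces
    return True

records = [
    {"peer_reviewed": True, "language": "en", "type": "research"},
    {"peer_reviewed": True, "language": "de", "type": "research"},   # excluded: non-English
    {"peer_reviewed": True, "language": "en", "type": "editorial"},  # excluded: commentary
]
selected = [r for r in records if passes_screening(r)]
print(len(selected))  # 1
```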

Page 16

Working with Results

93 articles were selected based on inclusion/exclusion criteria

Full text was obtained and read by both authors independently to determine if at least one variable pertaining to user satisfaction was present; then the results were compared

Page 17

Results of Full Text Review

Reason for Exclusion                 # of Articles
No variable on user satisfaction     32
Advice or commentary                 18
Not about e-ref transactions          5
About resources used                  5
Showcased library experience          3
Literature review                     2
Review of another study               1
Not scholarly                         1
Not about electronic reference        1
Selected for critical appraisal      23
Found during citation search          1
Total                                94

Page 18

Critical Appraisal Tools

QUOROM (The Lancet, 1999, vol. 354, 1896-1900)

Downs-Black scale (“Checklist for study quality”)

CriSTAL: Critical Skills Training in Appraisal for Librarians (Andrew Booth)

Page 19

Glynn’s Critical Appraisal Tool

Population
Data collection
Study design
Results

Page 20

Critical Appraisal Process

24 articles were subjected to critical appraisal

Each question from Glynn’s tool was answered (yes, no, unclear, or N/A) and the results were calculated

12 research papers were selected and subjected to the systematic review
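The calculation step can be sketched as follows. Glynn's checklist answers are tallied per article; scoring “yes” answers against the applicable (non-N/A) questions, with a 75% validity cutoff, is a common convention for this tool, but treat the details here as an assumption rather than the authors' exact method.

```python
# Sketch of scoring one article against a yes/no/unclear/N/A checklist.
# Assumed convention: score = "yes" answers / applicable questions
# (N/A excluded); an article passes if the score meets the threshold.

def appraisal_score(answers):
    applicable = [a for a in answers if a != "n/a"]
    if not applicable:
        return 0.0
    return applicable.count("yes") / len(applicable)

answers = ["yes", "yes", "no", "unclear", "n/a", "yes"]
score = appraisal_score(answers)  # 3 "yes" / 5 applicable = 0.6
passes = score >= 0.75            # 75% cutoff is an assumed convention
print(round(score, 2), passes)  # 0.6 False
```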

Page 21

Analysis (Findings of Review)

Settings and general characteristics:
Multiple instruments in a single article
9 unique journals
US-based

Methods and timing of data collection:
7 paper surveys
3 pop-up surveys
3 transcript analyses

Page 22

Similar Variables in Surveys

“Willingness to return”
11 surveys of all instruments (Nilsen)
Staff person vs. service

“Have you used it before?”
Ranged from 30%–69% (email)

Positivity of experience
7-point, 4-point, and 3-point scales
65%–98.2% (email, small group)
14–417 respondents

Staff quality
7-point, 4-point, and 3-point scales
68%–92.8% (14 respondents)
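Comparing satisfaction reported on 7-, 4-, and 3-point scales requires mapping them to a common range. A min-max rescaling to 0-100%, as sketched below, is one standard way to do this; it is an illustration, not necessarily how the reviewed studies computed their percentages.

```python
# Rescale a score on an n-point scale (scale_min..scale_max) to 0-100%
# so results from 7-, 4-, and 3-point instruments can be compared.
# Illustrative only; the reviewed studies reported percentages their own way.

def to_percent(score, scale_max, scale_min=1):
    return 100 * (score - scale_min) / (scale_max - scale_min)

print(round(to_percent(6, 7), 1))  # 83.3 (a 6 on a 7-point scale)
print(round(to_percent(3, 4), 1))  # 66.7 (a 3 on a 4-point scale)
```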

Page 23

Analysis

Other questions in obtrusive studies

“Were you satisfied?” and “Would you recommend to a colleague?” were each asked in only 1 of the studies

Page 24

Analysis

Reason for variation: the nature of the questions asked is contingent on the context in which satisfaction was measured
[correlates with guidelines, librarian behaviors, reference interviews, etc.]

Page 25

Unobtrusive studies: Transcript Analysis

2 basic methods:

Transcript analysis by the person asking the question (proxy patron) (Schachaf and Horowitz, 2008; Sugimoto, 2008)
75% “complete”; 68% “mostly incomplete”

Transcripts independently assessed for quality and coded (Marsteller and Mizzy, 2003; Schachaf and Horowitz, 2008)
3-point scale; “+ or –” scale
2.24 out of 3 (level of quality); 5 negatives/200 transactions

Research question: efficacy of third-party assessors vs. user surveys

Page 26

Lessons Learned

Lessons about user satisfaction with electronic reference:

Overall pattern of users being satisfied, regardless of methodology or questions asked

Measurement of user satisfaction is contingent upon context

Researchers most often try to connect user satisfaction to another variable; satisfaction was the sole focus of only one article

Page 27

Lessons Learned

Lessons about library research:

The extensive amount of qualitative research makes performing systematic reviews challenging

Inconsistency of the methodologies used in the original research makes a systematic review challenging; meta-analysis is more often than not impossible

Common pitfalls in LIS research that affect the quality of the published article

Page 28

Lessons Learned

Benefits of undertaking a systematic review:

Sharpens literature-searching skills, benefiting both librarians and the patrons who need this kind of research

Researcher gains the ability to critically appraise research

The practice of librarianship is strengthened by basing decisions on a methodological assessment of evidence

Page 29

Systematic Reviews and EBL:Impact on the Profession

Formal gathering and synthesis of evidence may:

Affirm our intuitive sense of the patterns in current research

Refine, clarify, and enhance a more robust understanding of a current problem in librarianship

May, on occasion, provide surprising results!

Page 30

Questions?

http://www.slideshare.net/librarian68

Elaine M. Lasda Bergman
[email protected]

Irina I. Holden
[email protected]