
Lessons in Evaluation of Digital Libraries and Their Application to the CITIDEL Project


Phase 1: Plan

Phase 2: Implement

Phase 3: Report

Introduction/Background

The CITIDEL Project provides access to digital collections in computer science, information systems, information science, software engineering, computer engineering, and related disciplines [1], among them the digital libraries of ACM and IEEE-CS. The project leadership is distributed over Hofstra University, The College of New Jersey, The Pennsylvania State University, Villanova University, and Virginia Tech; the project is part of the Collections Track activities in the National SMETE Digital Library (NSDL) [2]. Metadata from applicable digital repositories is being harvested, related collections are being cross-linked for ease of searching [3, 4, 5, 6], and community development activities and effective courseware tools for computing and information technology education are being developed [7].

The MARIAN digital library software designed at Virginia Tech and the niche search engine technology from Penn State are being used to develop tailored services for the broad user community. The aim of the CITIDEL Project is to provide a service that will be both functional and informative to the entire user community, which will include students in primary schools as well as research professionals and instructors at the collegiate level [8].

Stage 1: Plan 

The initial planning phase enumerates a project timeline and a set of objectives for the entire project. The project goals are stated in terms of participant interaction, program costs, program maintenance, expected project outcomes, and effective use of human [6] and hardware resources [9]. Choices made during the implementation and report stages can have an effect on future project activity, so the planning phase must allow for changes in the long-term plan.

Stage 2: Implement 

Evaluation questions for the implementation portion of the project include issues related to the selection of participants, construction of an effective management plan, progress toward the program goals, and adherence to a timeline [2]. The evaluation component of the implementation phase focuses on providing courseware tools and an information portal that are useful to a wide range of anticipated users. Key concerns of this phase include: Can a search query compensate for abbreviated or incomplete cataloging? Are information sources correctly recognized [12]? Can individual users of the database customize the interface? How is user feedback collected? Is cross-searching possible [13]? Is user privacy maintained, including the proper use of logs?
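
To make the first of these concerns concrete, the sketch below shows one way a query layer might compensate for abbreviated or incomplete cataloging by expanding known abbreviations before matching against sparse metadata. The abbreviation table, record fields, and matching rule are illustrative assumptions, not CITIDEL's actual search implementation.

```python
# Minimal sketch: compensate for abbreviated or incomplete cataloging by
# expanding query abbreviations before matching. All names are hypothetical.

ABBREVIATIONS = {
    "os": ["operating systems"],
    "ai": ["artificial intelligence"],
    "hci": ["human-computer interaction", "human computer interaction"],
    "db": ["databases", "database systems"],
}

def expand_query(query: str) -> list[str]:
    """Return the original terms plus expansions for known abbreviations."""
    terms = query.lower().split()
    expanded = list(terms)
    for term in terms:
        expanded.extend(ABBREVIATIONS.get(term, []))
    return expanded

def matches(record: dict, query: str) -> bool:
    """True if any expanded query term appears in the record's sparse metadata."""
    text = " ".join(str(v) for v in record.values()).lower()
    return any(term in text for term in expand_query(query))

if __name__ == "__main__":
    record = {"title": "Intro to Operating Systems", "subject": ""}  # incomplete cataloging
    print(matches(record, "OS scheduling"))  # True: "OS" expands to "operating systems"
```

Substring matching over concatenated fields is deliberately crude; a production search service would use an inverted index and controlled vocabularies, but the expansion step is the point being illustrated.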

Stage 3: Report

There is no final report because funding is expected to continue; there can only be summary snapshots at various times. The periodic reports summarize the events and managerial decisions that have shaped the project up to the present. These progress reports also function to redefine sub-goals and programming initiatives that have changed over the course of the project.

Conclusion / Interdependence 

Although we divide the evaluation procedure into planning, implementation, and report sections, the inter-dependence of the phases makes the evaluation process continuous. A definite start exists, namely the initial planning phase, but successive stages are repeatedly revised, so the different evaluation stages of the CITIDEL Project must be continuously updated. A dynamic evaluation process that is easy to update allows unexpected setbacks and creative developments to be integrated into the project timeline.

This inter-dependence of the various stages of the CITIDEL Project mandates that the evaluation process be open-ended: a continuous cycle with no predetermined end, in which phases are returned to again and again. The initial planning phase is largely based on speculation, so the communication of a unified goal at the onset of the project helps maintain the initiative to meet user needs [10, 6]. The planning phase is open to change, and we therefore expect that creative, and often unexpected, improvements to the entire project will be made during the course of implementation.

References

1. 2nd Annual Conference on the Theory and Practice of Digital Libraries, June 11-13, 1995, Austin, Texas, USA; http://csdl.tamu.edu/DL95
2. The National SMETE Digital Library; http://www.steme.org
3. Alan Whitelaw and Gill Joy, Summative Evaluation of Phase 3 of the eLib Initiative: Final Report. Guildford: ESYS Consulting, 2001.
4. Beyond eLib: Lessons from Phase 3 of the Electronic Libraries Programme, 2001; http://www.ukoln.ac.uk/services/elib
5. Both of the above reports are also available at http://www.ukoln.ac.uk/services/elib/papers/other/intro.html#elib-evaluation
6. University of Michigan Digital Library; http://www.csdl.tamu.edu/DL94/paper/umdl.html
7. American Memory Learning; http://lcweb2.loc.gov/ammem/ndlpedu
8. Christine L. Borgman, "Digital libraries and the continuum of scholarly communication," Journal of Documentation, 56(4), July 2000, pp. 412-430.
9. Floraline Stevens, Frances Lawrenz, and Laure Sharp, User-Friendly Handbook for Project Evaluation: Science, Mathematics, Engineering and Technology Education, National Science Foundation, 1997.
10. Linda L. Hill, Ron Dolin, James Frew, Randall B. Kemp, Mary Larsgaard, Daniel R. Montello, Mary-Anna Rae, and Jason Simpson, User Evaluation: Summary of the Methodologies and Results for the Alexandria Digital Library, Alexandria Digital Library Project, University of California, Santa Barbara, California; http://www.asis.org/annual-97/alexia.htm
11. Berkeley Digital Library; http://elib.cs.berkeley.edu; http://elib.cs.berkeley.edu/papers.html
12. The Library of Congress National Digital Library Program; http://memory.loc.gov/ammem/dli2/html/lcndlp.html
13. eLib: The Electronic Libraries Programme; http://www.ukoln.ac.uk/services/elib/papers/other/summative-phase-3/elib-eval.main.pdf
14. Final Report of the American Memory User Evaluation, 1991-1993; http://lcweb2.loc.gov/ammem/usereval.html

Lillian N. (Boots) Cassel, Ph.D.; Filip B. Jagodzinski, B.S.
Villanova University, PA 19085, USA

[Poster figure: evaluation cycle diagram linking Start, Expected Outcomes, Cost/Efficiency, Effective Management, User Privacy and Feedback, User Needs Met?, Periodic Reports, and Inter-Dependence.]

User privacy and the digital library's desire for information are conflicting requirements that influence data-gathering goals. Requests to record information about users must be evaluated with reference to the role such information plays in the effectiveness of the digital library's functions and with regard to how users respond to information gathering.
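
One way to act on this tension, sketched below under assumed field names, is to record only a salted hash of the user identifier, so that sessions can be linked for evaluation without storing who the user is. This is an illustration, not CITIDEL's actual logging policy.

```python
# Sketch: privacy-conscious logging that stores a pseudonym, not the identity.
# Field names and salt handling are hypothetical assumptions.
import hashlib
import time

SALT = b"rotate-me-periodically"  # hypothetical; a real deployment would manage this secret

def pseudonym(user_id: str) -> str:
    """Return a stable pseudonym so sessions can be linked without storing the identity."""
    return hashlib.sha256(SALT + user_id.encode("utf-8")).hexdigest()[:16]

def log_query(log: list, user_id: str, query: str) -> None:
    """Append a log entry containing only the pseudonym, a timestamp, and the query."""
    log.append({"user": pseudonym(user_id), "time": time.time(), "query": query})

log: list = []
log_query(log, "jsmith@example.edu", "digital library evaluation")
```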

One way to keep detailed user logs is to introduce a hit counter within each database or metafile and to maintain a linked structure that records, in time order, the events a user input and the search results returned in response to each query. The creation of such a user log is an example of one phase of the project affecting a future phase: the entire metadata structure might need to be adjusted to accommodate such logging capabilities.
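
A minimal sketch of such a linked, time-ordered log structure follows, assuming a per-collection hit counter and entries that pair each query with the result identifiers returned for it; the class and field names are hypothetical, not part of the CITIDEL metadata design.

```python
# Sketch: hit counter per collection plus a time-ordered query/result log.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from collections import Counter

@dataclass
class LogEntry:
    timestamp: datetime
    user: str              # pseudonymized identifier
    query: str             # what the user input
    results: list[str]     # identifiers of the records returned in response

@dataclass
class UserLog:
    hit_counter: Counter = field(default_factory=Counter)  # hits per database/collection
    entries: list[LogEntry] = field(default_factory=list)  # time-ordered events

    def record(self, user: str, query: str, results: list[str], collection: str) -> None:
        self.hit_counter[collection] += 1
        self.entries.append(LogEntry(datetime.now(timezone.utc), user, query, results))

log = UserLog()
log.record("a1b2c3", "information retrieval evaluation", ["acm:373492", "ieee:845120"], "ACM DL")
```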

User Feedback, including quick, multiple-choice-style questions as well as personalized responses [14], will be used as an evaluation tool. Comments pertaining to data within an individual resource will be used to help with database ranking and database relevance. User responses about the tools and functions within the CITIDEL Project will be used to assess the effectiveness of the courseware tools and community development activities that are available. Because the CITIDEL Project is a unique digital library in terms of its computer-related information content, we expect users to fall into two categories: collegiate-level computing students or computing professionals, and primary and secondary school students and teachers. Computing professionals will expect research-style content and ease of cross-referencing, while students and teachers from primary and secondary schools will desire learning materials applicable to their academic needs. The wide range of users thus provides challenges and opportunities, and the user feedback will help us mold the CITIDEL digital library to meet the needs of all participants.
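
As an illustration of how quick multiple-choice feedback could feed database ranking and relevance, the sketch below averages assumed 1-5 ratings per collection and sorts collections by mean score; the rating scale and collection names are placeholders, not part of the CITIDEL design.

```python
# Sketch: turn (collection, rating) feedback pairs into a ranked list of collections.
from collections import defaultdict

def rank_collections(feedback: list[tuple[str, int]]) -> list[tuple[str, float]]:
    """feedback: (collection, rating 1-5) pairs; returns collections by mean rating."""
    ratings: dict[str, list[int]] = defaultdict(list)
    for collection, rating in feedback:
        ratings[collection].append(rating)
    means = [(c, sum(r) / len(r)) for c, r in ratings.items()]
    return sorted(means, key=lambda pair: pair[1], reverse=True)

print(rank_collections([("ACM DL", 5), ("ACM DL", 4), ("IEEE-CS DL", 3)]))
# [('ACM DL', 4.5), ('IEEE-CS DL', 3.0)]
```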

User Logs, including a time-ordered list of events that describe a user’s search queries and traversal through the various digital databases, are being investigated [14]. The number of hits per database and the number of times that a user resubmits or alters an original search query will help to evaluate the effectiveness of the search engine and will help determine the most effective structure of the contents of the digital library. An analysis of repeated user inputs will help determine which databases are correctly cross-linked and which keywords are most effective.
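
The analysis described here could start from something as simple as the sketch below, which, assuming log entries shaped like the earlier example, estimates how often a user reformulates a query (any changed consecutive query by the same user); that definition is an assumption chosen for illustration, and per-collection hit counts would come from the hit counter shown above.

```python
# Sketch: estimate query-reformulation rate from time-ordered log entries.

def reformulation_rate(entries: list[dict]) -> float:
    """entries: dicts with 'user' and 'query' keys, in time order."""
    last_query: dict[str, str] = {}
    reformulations, opportunities = 0, 0
    for e in entries:
        user, query = e["user"], e["query"]
        if user in last_query:
            opportunities += 1
            if query != last_query[user]:
                reformulations += 1
        last_query[user] = query
    return reformulations / opportunities if opportunities else 0.0

entries = [
    {"user": "a1b2", "query": "compiler design"},
    {"user": "a1b2", "query": "compiler design tutorial"},   # a reformulation
    {"user": "c3d4", "query": "sorting algorithms"},
]
print(reformulation_rate(entries))  # 1.0: the one repeat visit was a reformulation
```

A high reformulation rate suggests the first query is not returning what users need, which feeds directly into the evaluation of search effectiveness and collection structure described above.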

www.citidel.org


 

The project is funded by a grant from the National Science Foundation (http://www.nsf.gov); NSDL is the National Science Digital Library.

www.nsf.gov
