
Web Tools For Peer Reviewers... and Everyone



What are the challenges in exploring the existing journal literature and new forms of scientific communication? Ultimately in the online, digital environment, everyone is a potential consumer and reviewer of scientific content. What tools currently exist, and what future types of content certification and metrics can we imagine?


Page 1

Web Tools for Peer Reviewers… and Everyone

Richard Akerman

ICSTI Public Conference 2007

June 21, 2007

Page 2

The Motivation for Science

• If Mr. Cavor made [the gravity-blocking substance], it would go down to posterity as Cavorite or Cavorine, and he would be made an F.R.S., and his portrait given away as a scientific worthy with Nature, and things like that. And that was all he saw!

- H.G. Wells, The First Men in the Moon (1901)

Page 3

Peer Review also remains basically unchanged

• And has enduring value

• Some enhancement through
  – automation of the review process
  – citation linking
  – plagiarism detection (see the sketch after this list)

• Perhaps increased recognition or openness
  – Publish list of reviewers annually
  – Publish reviewer comments after some embargo period
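Of the enhancements listed above, plagiarism detection is the most mechanical to illustrate. Below is a minimal sketch (not any particular vendor's method) of the common shingle-overlap idea: compare word n-grams of a submission against prior text and flag high Jaccard overlap for human review. The sample sentences and the 3-gram choice are illustrative assumptions.

# Minimal sketch of shingle-overlap plagiarism screening: compare two
# manuscripts by the Jaccard similarity of their word 3-grams. Real
# services fingerprint against huge corpora; the texts here are made up.
def shingles(text: str, n: int = 3) -> set:
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a: set, b: set) -> float:
    return len(a & b) / len(a | b) if (a | b) else 0.0

submitted = "we calculate the coefficient of bulk viscosity for mixed matter"
prior_work = "we calculate the coefficient of bulk viscosity in neutron stars"

score = jaccard(shingles(submitted), shingles(prior_work))
print(f"3-gram overlap: {score:.2f}")  # flag for human review above some threshold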

Page 4
Page 5

Overview

• Exploring the problem space

• New metrics

• Certification (and reward) challenges

• Some example web tools

Page 6

Two Problem Spaces and Two Directions

• Find existing
  – Articles, experts, clusters, objects (article discovery)

• Discover new
  – Ideas, concepts, relationships (knowledge discovery)

• Retrospective
  – The existing body of scientific knowledge, the citation web, known author relationships

• Prospective (real-time)
  – New and unconnected or weakly-connected work and authors (Ramanujan)

Page 7

Retrospective Finding

• Start with some nucleus
  – idea, keywords, article, author, (collection), your history

• Find a network of related objects

• How? Recommender services.
  – Use metrics/features and find closeness/similarity in some feature space
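The slides don't name a specific recommender algorithm; as a minimal sketch of "closeness in some feature space", the snippet below builds bag-of-keywords vectors and ranks a toy corpus by cosine similarity to a starting nucleus. The corpus entries and keyword lists are made up for illustration.

# Minimal sketch: recommend related articles by cosine similarity in a
# bag-of-words feature space. Titles and keyword lists are made up.
from collections import Counter
from math import sqrt

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse feature vectors."""
    dot = sum(a[k] * b[k] for k in a.keys() & b.keys())
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def features(keywords: list) -> Counter:
    """Turn a keyword list into a term-frequency feature vector."""
    return Counter(k.lower() for k in keywords)

corpus = {
    "A": features(["neutron", "star", "viscosity", "quark"]),
    "B": features(["neutron", "star", "magnetar", "x-ray"]),
    "C": features(["peer", "review", "metrics", "citation"]),
}

def recommend(nucleus: list, top_n: int = 2):
    """Rank corpus items by closeness to the starting nucleus."""
    q = features(nucleus)
    ranked = sorted(corpus.items(), key=lambda kv: cosine(q, kv[1]), reverse=True)
    return ranked[:top_n]

print(recommend(["neutron", "star", "quark"]))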

Page 8

Features/Quality Metrics

• Networks of
  – Citations
  – Authors (reputation)

• Groups and Projects

– Certification (journal publication)

• Position in these networks
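One way to make "position in these networks" concrete: treat citations as directed edges and compute simple positional metrics. A sketch below, assuming the third-party networkx package and a made-up four-paper citation graph; in-degree plays the role of "times cited", and PageRank also weights citations by the standing of the citer.

# Sketch: "position" of an article in a citation network, using a tiny
# hypothetical graph. Edges point from the citing paper to the cited one.
import networkx as nx

g = nx.DiGraph()
g.add_edges_from([
    ("paper_B", "paper_A"),   # B cites A
    ("paper_C", "paper_A"),
    ("paper_C", "paper_B"),
    ("paper_D", "paper_A"),
])

# Simple positional metrics: times cited, and a PageRank-style weight
# that accounts for who is doing the citing.
times_cited = dict(g.in_degree())
rank = nx.pagerank(g)

for paper in g.nodes():
    print(paper, times_cited[paper], round(rank[paper], 3))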

Page 9

http://www.flickr.com/photos/darkmatterpaintball/251552464/

Page 10

Forensics & Major Miner Problems

• Examining the corpse / dead trees

• Fossilized trees -> coal -> mining!

• We need to recognize this knowledge metaphor

• We’re mining material that is already refined

Page 11

http://www.flickr.com/photos/mekin/399220499/

Page 12

Thinking about mining

• Currently most demonstrations are only on abstracts due to limited full-text availability (a toy mining sketch follows this slide's bullets)
  – So we are mining a refinement of the refinement
  – Rights are a major problem
  – Lack of mining rights is one factor driving open access

• Also knowledge discovery leads into difficult areas of machine reasoning

• What does it mean to move beyond mining?
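As a toy illustration of what "mining a refinement of the refinement" looks like in practice, the sketch below just counts frequent content terms across a couple of abstracts (the first borrowed from the eprintweb example later in this deck, the second invented). Real mining pipelines do entity and relation extraction, but they hit the same rights wall.

# Toy sketch of abstract-level text mining: most frequent content terms
# across a few abstracts. Stopword list and second abstract are made up.
import re
from collections import Counter

STOPWORDS = {"the", "of", "and", "in", "a", "we", "for", "is", "to", "with"}

abstracts = [
    "We calculate the bulk viscosity of mixed nucleon-hyperon-quark matter in neutron stars.",
    "We model the cooling of neutron stars with hyperon cores.",
]

terms = Counter(
    w for text in abstracts
    for w in re.findall(r"[a-z\-]+", text.lower())
    if w not in STOPWORDS
)
print(terms.most_common(5))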

Page 13

Taking a walk in the forest

• How do we find the new Mr. Cavor?

• Exploring the World Wide Web (including repositories) as opposed to the World Wide Literature

• Blogs, wikis, videos, datasets
  – And pre-prints

• The retrospective metrics we could use are either weaker or non-existent

Page 14

http://www.flickr.com/photos/jzakariya/191481917/

Page 15

What New Metrics Can We Use?

• Derived reputation (e.g. Nature blogs)
• Page rank and similar citation-like connections
  – Bookmarks
• Intentional rank
• Comments
• Page hits / usage / viewing time
• Derived quality
  – (important person or group) favours this object
  – (many, many people) favour this object

• Lots of privacy issues
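The metrics above could be folded into a single "derived quality" number in many ways; the sketch below shows one naive weighted combination, with log damping so raw page hits don't swamp intentional signals like bookmarks or expert endorsements. The weights and signal names are assumptions, not an established formula.

# Sketch: combine several usage/attention signals into one derived-quality
# score. The weights and the signal names are illustrative assumptions.
from math import log1p

WEIGHTS = {
    "bookmarks": 2.0,             # intentional acts count more than raw views
    "comments": 1.5,
    "page_hits": 0.2,
    "expert_endorsements": 5.0,   # "(important person or group) favours this object"
}

def derived_quality(signals: dict) -> float:
    """Log-damp each raw count so heavily-viewed items don't dominate."""
    return sum(w * log1p(signals.get(name, 0)) for name, w in WEIGHTS.items())

blog_post = {"bookmarks": 40, "comments": 12, "page_hits": 9000, "expert_endorsements": 2}
print(round(derived_quality(blog_post), 2))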

Page 16

Challenges: Download/Views

• The versions challenge – many copies of the article in many places (a version-grouping sketch follows this list)
  – http://www.lse.ac.uk/library/versions/
  – eprintweb is linking pre-prints to published versions

• Search engines (spidering)
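A sketch of the versions problem when counting downloads/views: copies of the "same" article are grouped by DOI when one is present, otherwise by a crudely normalized title, and counts are summed per group. The record fields, placeholder DOI, and counts are all made up; note that the DOI-less preprint still lands in its own bucket, which is exactly why linking pre-prints to published versions matters.

# Sketch: aggregate download counts across many copies (preprint, repository
# copy, publisher version) of the "same" article. Grouping key: DOI when
# present, otherwise a normalized title. All field values are hypothetical.
import re
from collections import defaultdict

def version_key(record: dict) -> str:
    if record.get("doi"):
        return "doi:" + record["doi"].lower()
    # Fall back to a crude normalized title (real version matching is much harder).
    title = re.sub(r"[^a-z0-9]+", " ", record["title"].lower()).strip()
    return "title:" + title

records = [
    {"title": "Bulk viscosity of mixed matter", "doi": None, "downloads": 120},                # preprint copy
    {"title": "Bulk Viscosity of Mixed Matter!", "doi": "10.1111/example", "downloads": 80},   # publisher copy
    {"title": "Bulk viscosity of mixed matter", "doi": "10.1111/example", "downloads": 30},    # repository copy
]

totals = defaultdict(int)
for r in records:
    totals[version_key(r)] += r["downloads"]

# The DOI-less preprint ends up in its own bucket - this is the versions problem.
print(dict(totals))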

Page 17

Challenges for Certification

• What should be certified?
  – Articles in repositories
  – Blog entries
  – Versioned Wiki entries
  – Data sets
  – Videos
  – Annotated mashups

• How is certification asserted?
  – Some sort of digital signature (a signing sketch follows this list)

• What sort of scientific rewards can we provide for the new Mr. Cavor?
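The slides only say "some sort of digital signature"; below is a minimal sketch of what asserting certification could look like, assuming the third-party Python cryptography package and an Ed25519 key pair held by the certifying body (journal, repository, overlay service). Anyone with the published public key can then verify that a given digital object was certified.

# Sketch: certifying a digital object (article, dataset, blog post) with a
# digital signature. Assumes the `cryptography` package; the certifying body
# would keep the private key and publish the public half for verification.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# In practice the certifier holds this key long-term; here we generate a
# throwaway pair for the example.
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()

artifact = b"contents of the certified object (PDF bytes, dataset, wiki revision...)"
signature = private_key.sign(artifact)

# Anyone holding the public key can check the assertion.
try:
    public_key.verify(signature, artifact)
    print("certification verified")
except InvalidSignature:
    print("certification invalid")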

Page 18

Example: Eigenfactor

Page 19

Example: eprintweb.org

astro-ph/0607051 (July 2006)
Bulk viscosity of mixed nucleon-hyperon-quark matter in neutron stars
Na-Na Pan, Xiao-Ping Zheng and Jia-Rong Li
Received: 04 July 2006; Last updated: 04 July 2006
Abstract: We calculate the coefficient of bulk viscosity …
Journal-ref: Mon. Not. Roy. Astron. Soc. 371 (2006) 1359
Published article DOI: 10.1111/j.1365-2966.2006.10759.x

Page 20

Example: ScienceBlogs

Page 21

Example: Postgenomic

Page 22

Example: ChemRank

Page 23

Thank You

• Richard dot Akerman at NRC dot CA

• Supplementary bookmarks at http://www.connotea.org/user/scilib/tag/icsti2007akerman

• © 2007 Government of Canada, licensed under Creative Commons: http://creativecommons.org/licenses/by-nc-sa/2.5/