NISO Recommended Practices to Support Adoption of Altmetrics
Todd A. Carpenter
3:AM Conference, Bucharest, Romania
September 29, 2016
"We're just not as alternative as we used to be"
– Robert Smith, The Cure
About
• US-based, non-profit industry trade association accredited by the American National Standards Institute
• Mission of developing and maintaining technical standards related to information, documentation, discovery, and distribution of published materials and media
• Volunteer-driven organization: 400+ contributors spread out across the world
• Responsible (directly and indirectly) for standards like ISSN, DOI, Dublin Core metadata, DAISY digital talking books, OpenURL, MARC records, and ISBN
August 20, 2016
White Paper Released: June 2014
[Chart: Community Feedback on Project Idea Themes. Responses (n=118) on a five-point scale from Unimportant to Very important.]
• Definitions and Use Cases
• Code of Conduct
• Data Metrics
• Output Types for Assessment
• Persistent Identifiers and Assessment
Definitions and Use Cases
Caveats
• Citations, usage data, and altmetrics are ALL potentially important and potentially imperfect
• Please don’t use altmetrics (or any metrics) as an uncritical proxy for scholarly impact – consider quantitative and qualitative information too
• Data quality and indicator construction are key factors in the evaluation of specific altmetrics (read as: garbage in, garbage out!)
What Is Altmetrics? A Definition
Altmetrics is a broad term that encapsulates the digital collection, creation, and use of multiple forms of assessment that are derived from activity and engagement among diverse stakeholders and scholarly outputs in the research ecosystem.
The inclusion in the definition of altmetrics of many different outputs and forms of engagement helps distinguish it from traditional citation-based metrics, while at the same time, leaving open the possibility of their complementary use, including for purposes of measuring scholarly impact. However, the development of altmetrics in the context of alternative assessment sets its measurements apart from traditional citation-based scholarly metrics.
Use Cases
Developed eight personas, across three themes:
• Showcase achievement: Indicates stakeholder interest in highlighting the positive achievements garnered by one or more scholarly outputs.
• Research evaluation: Indicates stakeholder interest in assessing the impact or reach of research.
• Discovery: Indicates stakeholder interest in discovering or increasing the discoverability of scholarly outputs and/or researchers.
Persona: academic/researcher
Persona: member of hiring committee
Persona: publishing editor
Persona: librarian
Glossary
• Activity. Viewing, reading, saving, diffusing, mentioning, citing, reusing, modifying, or otherwise interacting with scholarly outputs.
• Altmetric data aggregator. Tools and platforms that aggregate and offer online events as well as derived metrics from altmetric data providers, for example, Altmetric.com, Plum Analytics, PLOS ALM, ImpactStory, and Crossref.
• Altmetric data provider. Platforms that function as sources of online events used as altmetrics, for example, Twitter, Mendeley, Facebook, F1000Prime, Github, SlideShare, and Figshare.
• Attention. Notice, interest, or awareness. In altmetrics, this term is frequently used to describe what is captured by the set of activities and engagements generated around a scholarly output.
Glossary (more...)
• Engagement. The level or depth of interaction between users and scholarly outputs, typically based upon the activities that can be tracked within an online environment. See also Activity.
• Impact. The subjective range, depth, and degree of influence generated by or around a person, output, or set of outputs. Interpretations of impact vary depending on its placement in the research ecosystem.
• Metrics. A method or set of methods for purposes of measurement.
• Online event. A recorded entity of online activities related to scholarly output, used to calculate metrics.
Glossary (and even more...)
• Scholarly output. A product created or executed by scholars and investigators in the course of their academic and/or research efforts. Scholarly output may include but is not limited to journal articles, conference proceedings, books and book chapters, reports, theses and dissertations, edited volumes, working papers, scholarly editions, oral presentations, performances, artifacts, exhibitions, online events, software and multimedia, composition, designs, online publications, and other forms of intellectual property. The term scholarly output is sometimes used synonymously with research outputs.
• Traditional metrics. The set of metrics based upon the collection, calculation, and manipulation of scholarly citations, often at the journal level. Specific examples include raw and relative (field-normalized) citation counts and the Journal Impact Factor.
• Usage. A specific subset of activity based upon user access to one or more scholarly outputs, often in an online environment. Common examples include HTML accesses and PDF downloads.
Code of Conduct
Code of Conduct
• Why a Code of Conduct?
• Scope
• Altmetric Data Providers vs. Aggregators
Code of Conduct: Key Elements
• Transparency
• Replicability
• Accuracy
Code of Conduct: Transparency
• How data are generated, collected, and curated
• How data are aggregated, and derived data generated
• When and how often data are updated
• How data can be accessed
• How data quality is monitored
Code of Conduct: Replicability
• Provided data is generated using the same methods over time
• Changes in methods and their effects are documented
• Changes in the data following corrections of errors are documented
• Data provided to different users at the same time is identical or, if not, differences in access provided to different user groups are documented
• Information is provided on whether and how data can be independently verified
Code of Conduct: Accuracy
• The data represents what it purports to reflect
• Known errors are identified and corrected
• Any limitations of the provided data are communicated
Code of Conduct: Reporting
• List all available data and metrics (providers & aggregators) and the altmetric data providers from which data are collected (aggregators).
• Provide a clear definition of each metric provided.
• Describe the method(s) by which data is generated or collected and how this is maintained over time.
• Describe any and all known limitations of the data provided.
• Provide a documented audit trail of how and when data generation and collection methods change over time, with any and all known effects of these changes, including whether changes were applied historically or only from the change date forward.
• Describe how data is aggregated.
• Detail how often data is updated.
• Provide the process of how data can be accessed.
• Confirm that data provided to different data aggregators and users at the same time is identical and, if not, how and why they differ.
• Confirm that all retrieval methods lead to the same data and, if not, how and why they differ.
• Describe the data quality monitoring process.
• Provide the process by which data can be independently verified (aggregators only).
• Provide a process for reporting and correcting suspected inaccurate data or metrics.
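The reporting items above amount to structured documentation a provider could publish alongside each metric. A minimal sketch of such a disclosure record, with field names that are illustrative assumptions rather than part of the NISO recommended practice:

```python
# Sketch of a per-metric transparency record covering the Code of Conduct
# reporting items. All field names and values are illustrative assumptions,
# not prescribed by the NISO recommended practice.

metric_disclosure = {
    "metric": "tweet_count",
    "definition": "Number of original tweets linking to the output's DOI "
                  "or landing-page URL; retweets counted separately.",
    "collection_method": "Streaming API matched against DOI resolver links",
    "update_frequency": "daily",
    "access": "public REST API and bulk export",
    "known_limitations": ["shortened URLs may be missed",
                          "deleted tweets are not retroactively removed"],
    "change_log": [
        {"date": "2016-03-01",
         "change": "added landing-page URL matching",
         "applied_historically": False},
    ],
}

# The reporting items a disclosure record should document.
REQUIRED_FIELDS = {"metric", "definition", "collection_method",
                   "update_frequency", "access", "known_limitations",
                   "change_log"}

def is_compliant(record):
    """Return True if the record documents every required reporting item."""
    return REQUIRED_FIELDS.issubset(record)

print(is_compliant(metric_disclosure))  # True
```

Publishing records like this makes the transparency, replicability, and accuracy commitments auditable rather than aspirational.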
Non-traditional Outputs
Alternative outputs
June 25, 2016
Recommendations re Data Metrics
• Metrics on research data should be made available as widely as possible
• Data citations should be implemented following the Force11 Joint Declaration of Data Citation Principles, in particular:
  – Use machine-actionable persistent identifiers
  – Provide metadata required for a citation
  – Provide a landing page
  – Data citations should go into the reference list or similar metadata
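Those citation principles can be illustrated with a small sketch: rendering a human-readable data citation whose identifier is a machine-actionable DOI link that resolves to a landing page. The metadata record, the DOI, and the citation style below are illustrative assumptions, not a prescribed format:

```python
# Sketch: a data citation built from the elements the Force11 Joint
# Declaration calls for (creators, year, title, repository, persistent
# identifier). The record and DOI below are hypothetical examples.

record = {
    "creators": ["Smith, J.", "Lee, A."],
    "year": 2016,
    "title": "Ocean temperature measurements, 2000-2015",
    "repository": "Example Data Repository",
    "doi": "10.1234/exampledata.5678",  # hypothetical DOI
}

def format_citation(r):
    """Render a citation whose identifier is a machine-actionable
    HTTPS DOI link, suitable for a reference list."""
    authors = "; ".join(r["creators"])
    return (f"{authors} ({r['year']}). {r['title']}. "
            f"{r['repository']}. https://doi.org/{r['doi']}")

print(format_citation(record))
```

Because the DOI is expressed as a resolvable URL, the same citation serves both readers and machines counting data citations.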
Recommendations for Data Metrics
• Standards for research-data-use statistics need to be developed
  – Based on COUNTER; consider special aspects of research data
  – Two formulations for data download metrics: examine human and non-human downloads
• Research funders should provide mechanisms to support data repositories in implementing standards for interoperability and obtaining metrics
• Data discovery and sharing platforms should support and monitor "streaming" access to data via API queries
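The human vs. non-human download split recommended above can be sketched as a COUNTER-style log filter. The log records and user-agent patterns here are illustrative assumptions; COUNTER maintains its own curated robot lists:

```python
import re

# Sketch: splitting download counts into human and non-human (machine)
# traffic, in the spirit of COUNTER robot filtering. The sample events
# and the user-agent patterns are illustrative assumptions only.

ROBOT_PATTERNS = re.compile(r"bot|crawler|spider|curl|wget", re.IGNORECASE)

downloads = [
    {"dataset": "10.1234/exampledata.5678",
     "user_agent": "Mozilla/5.0 (X11; Linux)"},   # human browser
    {"dataset": "10.1234/exampledata.5678",
     "user_agent": "Googlebot/2.1"},              # crawler
    {"dataset": "10.1234/exampledata.5678",
     "user_agent": "curl/7.47.0"},                # scripted access
]

def count_downloads(events):
    """Return (human, non_human) download counts for a list of events."""
    human = sum(1 for e in events
                if not ROBOT_PATTERNS.search(e["user_agent"]))
    return human, len(events) - human

print(count_downloads(downloads))  # (1, 2)
```

Reporting the two counts separately, rather than discarding machine traffic, preserves the "streaming"/API-access signal the recommendations ask platforms to monitor.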
Persistent Identifiers
Altmetrics for #NISOALMI
• 39 presentation slides have been downloaded 32,740 times (as of July 26, 2016)
• The Phase 1 report published in 2014 has been downloaded 9,636 times
• Pages hosting content related to this project were accessed 60,548 times
• More than 2,000 people attended the 22 in-person presentations about the project
• The final report has been downloaded 2,906 times in the 7 days since publication
• More than 50 articles/blogs/papers about the initiative
Where to next?
Maturity Model for Standards Adoption: increasing trust and confidence in altmetrics

1. Initial
   • Metrics from provider
   • Ad hoc
2. Repeatable
   • Common measurement criteria from provider
   • Documented measurements and processes
   • Comparable and consistent
3. Defined
   • Measurements defined/confirmed as a standard for provider
   • Made public
   • Business processes followed consistently
   • Transparent
4. Managed
   • Standards applied
   • Controls in place
   • Checks and balances repeated over time
   • Open for comment and feedback
   • Accountable
5. Governed
   • Independent verification or 3rd-party audit
   • Evolving common industry-defined standards
   • Trust and confidence
Promote
Operationalize
Iterate
Key Original Ideas Not Yet Done
• Define the role of alternative assessment metrics in research evaluation, and identify what gaps exist in data collection around evaluation scenarios
• Identify best practices for grouping and aggregating multiple data sources
• Identify best practices for grouping and aggregating by journal, author, institution, and funder
Steering Committee
Thank you to thedozens of people
on the working groupsand
the hundreds of people who participated
in brainstorming and commentingon this effort!
For more
Project Site: www.niso.org/topics/tl/altmetrics_initiative/
Questions?
Todd Carpenter
Executive Director
[email protected]
@TAC_NISO

National Information Standards Organization (NISO)
3600 Clipper Mill Road, Suite 302
Baltimore, MD 21211 USA
+1 (301) 654-2512
www.niso.org