
Presentation on Open Science and its 'impacts'.


Open Science

Dr. Dr. phil. Rene von Schomberg, Team Leader, Open Science Policy Coordination and Development, European Commission, DG Research & Innovation, Unit A.6 (Data, Open Access and Foresight)

Open Science: a new approach to the research process


• Based on cooperative work and new ways of diffusing and sharing knowledge using digital technologies and new collaborative tools
• A systemic change to the way science is organised and research is carried out
• It affects virtually all components of doing science and research, from conceptual work to publishing, from empirical research to data analysis
• Shifting focus from "publishing as fast as possible" to "sharing knowledge as early as possible"
• 2014 public consultation on 'Science 2.0: Science in Transition'


Source: http://ec.europa.eu/research/consultations/science-2.0/background.pdf

Open Science – opening up the research process

[Figure: the phases of the research process (conceptualisation, data gathering, analysis, review, publication) surrounded by open practices and tools: open access, pre-prints, open data, open code, open workflows, open annotation, data-intensive science, scientific blogs, collaborative bibliographies, alternative reputation systems and citizen science, with example services such as Sci-starter.com, Runmycode.org, Impact Story and Openannotation.org.]

An emerging ecosystem of services and standards

It's real!

What is the most appropriate term to describe 'Science 2.0'?

[Figure: respondents' preferred term: Open science (43%), Science 2.0 (22%), Open Digital science (19%), Networked science (10%), Enhanced science (5%), Digital science (2%).]

Do you recognise the trends described in the consultation paper as 'Science 2.0'?

[Figure: Yes (70%); Yes, but with a different emphasis on particular elements (17%); Yes, but some essential elements are missing (11%); No, not at all (2%).]

What are the key drivers of 'Science 2.0'?

[Figure: level of agreement (I totally agree / I partially agree / I don't know / I partially disagree / I totally disagree) with eleven candidate drivers: citizens acting as scientists; scientific publishers engaging in 'Science 2.0'; public demand for faster solutions to societal challenges; growing public scrutiny of science and research; public funding supporting 'Science 2.0'; public demand for better and more effective science; growing criticism of the current peer-review system; increase of the global scientific population; researchers looking for new ways of collaboration; researchers looking for new ways of disseminating their output; availability of digital technologies and their increased capacities.]

What are the barriers to 'Science 2.0' at the level of the individual scientist?

[Figure: level of agreement (I totally agree / I partially agree / I don't know / I partially disagree / I totally disagree) with ten candidate barriers: concerns about ethical and privacy issues; lack of incentives for junior scientists to engage with 'Science 2.0'; lack of research skills fit for 'Science 2.0'; legal constraints (e.g. copyright law); uncertain benefits for researchers; lack of financial support; limited awareness of the benefits of 'Science 2.0' for researchers; lack of integration in the existing infrastructures; lack of credit-giving to 'Science 2.0'; concerns about quality assurance.]

What are the implications of 'Science 2.0' for society, the economy and the research system?

[Figure: level of agreement (I totally agree / I partially agree / I don't know / I partially disagree / I totally disagree) with nine candidate implications: crowd-funding as an important research funding source; research more responsive to society through crowd-funding; science more responsive to societal challenges; reconnecting science and society; greater scientific integrity; data-intensive science as a key economic driver; faster and wider innovation; more efficient science; more reliable science (e.g. through re-use of data).]


On what issues within 'Science 2.0' do you see a need for policy intervention?

[Figure: mean ranking position (with ± one standard deviation) of eleven issues, ranked from the lowest need for policy intervention (1) to the highest (11); mean positions range from 4.7 to 7.4.]
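To make the chart's summary statistic concrete, here is a minimal sketch (in Python, with made-up issue names and responses rather than the consultation's data) of how a mean ranking position and its standard deviation can be computed per issue when respondents rank issues from 1 (lowest need) to 11 (highest need).

```python
# Minimal sketch: mean ranking position and standard deviation per issue.
# Issue names and rank data are made up for illustration only.
from statistics import mean, stdev

# Each respondent ranks every issue from 1 (lowest need) to 11 (highest need).
ranks_by_issue = {
    "open access to publications": [10, 11, 9, 10, 8],
    "research data sharing": [9, 8, 11, 9, 10],
    "altmetrics": [5, 6, 4, 7, 5],
}

for issue, ranks in sorted(ranks_by_issue.items(), key=lambda kv: mean(kv[1]), reverse=True):
    print(f"{issue}: mean ranking position {mean(ranks):.1f} (std {stdev(ranks):.1f})")
```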

Five lines of potential policy actions


• Fostering and creating incentives for Open Science
• Removing barriers to Open Science
• Mainstreaming and further promoting Open Access policies
• Developing research infrastructures for Open Science
• Embedding Open Science in society as a socio-economic driver


Open Science: key issues


• The European Open Science Cloud
• Advancing Open Access and Data Policies
• Alternative systems to evaluate the quality and impact of research
• Text and Data Mining
• Towards better, more efficient and more Open Science
• Fostering Research Integrity
• Making science more inclusive: Citizen Science


Governance of the European Open Science Cloud

• Governance layer: bottom-up governance, federation, legacy and sustainability, leverage of MS investment, trust
• Data and service layer: IPR and privacy protection, big data analytics, data fusion across disciplines, high-performance computing, data access and re-use, data manipulation and export, data discovery and catalogue
• Infrastructure layer: high-speed connectivity, super-computing, data storage

[Axis: scale of scientific activity (data-driven science), from the long tail of science to lead scientific users.]

Source: DG Research and Innovation (2015)

Growth of Open Access Repository Mandates and Policies

[Figure: number of open access policies adopted by quarter, broken down by policymaker type: research organisation, funder, sub-unit of research organisation, funder and research organisation, and multiple research organisations.]

Source: http://roarmap.eprints.org/

Open Science: From Open Access to Open Scholarly Communication

[Figure: tools and services across the phases of the research workflow (discovery, analysis, writing, publication, outreach, assessment), offered by providers such as Elsevier, Springer Nature/Digital Science, Google and Wikimedia.]

• Public or private initiatives at every level of the research process offering specific services to researchers
• A layer of "commons"
• New initiatives allowing the scholarly process to be carried out differently

Sources: http://innoscholcomm.silk.co and http://blogs.lse.ac.uk/impactofsocialsciences/2015/11/11/101-innovations-in-scholarly-communication/

Towards 'better science' – Good, efficient and Open Science

[Figure: 'good', 'open' and 'efficient' science, driven by research governance changes, technical changes & standards, and economic & copyright changes. Elements include: connected tools & platforms, no publication size restrictions, null-result publishing, speed of publication, (web) standards and identifiers, semantic discovery, re-usability, versioning, open peer review, open (lab) notes, plain language, open drafting, open access, CC0/CC-BY licensing, declaring competing interests, replication & reproducibility, meaningful assessment, effective quality checks, credit where it is due, and no fraud or plagiarism.]

Open Science Policy Platform and European Open Science Agenda

• May 2016 Competitiveness Council:
• "NOTES the establishment of the Open Science Policy Platform by the Commission, which aims at supporting the further development of the European Open Science policy and promoting the uptake by stakeholders of best practices, including issues such as adapting reward and evaluation systems, alternative models for open access publishing and management of research data (including archiving), altmetrics, guiding principles for optimal reuse of research data, development and use of standards, and other aspects of open science such as fostering research integrity and developing citizen science";
• Commissioner Moedas will inform the Council biannually on advances of the Platform (which consists of 25 key stakeholders from European branch organisations)

Optimal re-use of Research Data

• Competitiveness Council:
• 1. Make research data produced by H2020 open by default
• 2. Encourage MS to promote data stewardship and implement data management plans
• 3. Encourage MS and the Commission to follow the FAIR principles in research programmes and funding mechanisms
• Follow-up by stakeholders, EC and MS:
• 1. As of 2017, Open Data is the default option under H2020; Data Management Plans will be mandatory
• 2. Evaluation of MS advances on Open Data will be necessary
• 3. An Expert Group on FAIR data will advise DG RTD in the course of 2016
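To make the FAIR principles (findable, accessible, interoperable, re-usable) concrete, here is a minimal illustrative sketch of the kind of metadata a data management plan might record for a deposited dataset. The field names and the completeness check are hypothetical choices for illustration, not an official H2020 or EC schema.

```python
# Illustrative sketch only: a hypothetical metadata record for a deposited
# dataset, loosely aligned with the FAIR principles. Field names are
# assumptions, not an official H2020 or EC schema.

dataset_record = {
    # Findable: a persistent identifier and descriptive metadata
    "identifier": "doi:10.xxxx/example-dataset",   # placeholder identifier
    "title": "Survey responses on open science practices (anonymised)",
    "keywords": ["open science", "survey", "FAIR"],
    # Accessible: where and under what conditions the data can be retrieved
    "access_url": "https://repository.example.org/records/1234",
    "access_rights": "open",
    # Interoperable: standard, well-documented formats
    "format": "text/csv",
    # Re-usable: a clear licence and provenance
    "license": "CC-BY-4.0",
    "related_publication": "doi:10.xxxx/example-article",
}

REQUIRED_FIELDS = ("identifier", "access_url", "format", "license")

def missing_fair_fields(record: dict) -> list:
    """Return required fields that are absent or empty (a toy completeness check)."""
    return [field for field in REQUIRED_FIELDS if not record.get(field)]

if __name__ == "__main__":
    gaps = missing_fair_fields(dataset_record)
    print("Missing fields:", gaps if gaps else "none")
```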

European Open Science Cloud

• Competitiveness Council:

• "CALLS on the Commission, in cooperation with Member States and stakeholders, to explore appropriate governance and funding frameworks'

• European Commission-Follow up of the April 2016 Communication on a European Cloud Initiative:

• Commission will need to have a roadmap for funding of European Open Science Cloud by end of 2016 which requires consultation of Member States.

Open Science Policy Platform

[Schematic: stakeholder opinions and expert-group advice feed, in context, into the European Open Science Agenda.]

European Commission
• DSM & framework conditions for data: copyright (TDM), data protection, free flow of data, …
• ERA & framework conditions for actors: European Charter for Researchers, Code of Conduct for Research Integrity, Charter for Access to Research Infrastructures, …

Open Science Policy Platform (wide input from stakeholders):
• ad-hoc meetings and workshops
• e-platform with the wider community
• reports and independent experts
• Expert Groups on the open science cloud, altmetrics, alternative business models for OA publishing, and FAIR open data

European Open Science Agenda:
• OA publishing models
• FAIR open data
• Science Cloud
• Alternative metrics
• Rewards & careers
• Education & skills
• Citizen Science
• Research integrity
• …

European Open Science Policy Agenda (1): Foster Open Science by creating incentives, e.g.

• Establish an Open Science Policy Platform
• Promote best practices
• Launch a European Open Science Monitor
• Promote a discussion on the evaluation criteria of research, in preparation for the next framework programme

European Open Science Agenda (2): Remove barriers, e.g.

• European copyright and data protection revisions: foresee appropriate exceptions for research activities, notably text and data mining (TDM; see the sketch after this list)
• Development of 'alternative' metrics
• Propose a European "code of conduct"
• Address low open-data skills amongst researchers and the underuse of professional support (librarians, repository managers etc.)
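As an illustration of what a text-and-data-mining (TDM) exception would permit in practice, below is a minimal sketch that mines a few made-up open-access abstracts for term frequencies. The abstracts and the term list are hypothetical; real TDM pipelines work over licensed full-text corpora at far larger scale.

```python
# Minimal text-and-data-mining sketch: count occurrences of a few
# open-science-related terms in a small set of made-up abstracts.
import re
from collections import Counter

abstracts = [
    "We share our survey data openly and publish a preprint before peer review.",
    "Open data and open code allowed independent replication of the analysis.",
    "Altmetrics complement citation counts when assessing societal impact.",
]

terms_of_interest = {"open", "data", "preprint", "replication", "altmetrics"}

def term_frequencies(texts):
    """Tokenise the texts and count occurrences of the terms of interest."""
    counts = Counter()
    for text in texts:
        tokens = re.findall(r"[a-z]+", text.lower())  # crude word tokenisation
        counts.update(t for t in tokens if t in terms_of_interest)
    return counts

if __name__ == "__main__":
    for term, count in term_frequencies(abstracts).most_common():
        print(f"{term}: {count}")
```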

Next-generation altmetrics: responsible metrics and evaluation for open science

Flash Report

EU Expert Group Altmetrics

First release September 2016

EU expert group members

James Wilsdon, University of Sheffield (chair); Judit Bar-Ilan, Bar-Ilan University; Robert Frodeman, University of North Texas; Elizabeth Lex, Graz University of Technology; Isabella Peters, Leibniz Information Centre for Economics; Paul Wouters, Leiden University

Aims /1

✓assess changing role of (alt)metrics in research evaluation

✓consider how altmetrics can be developed to support open science

✓engage stakeholders

✓consider implications of metrics for:
  ✓diversity and equality
  ✓interdisciplinarity
  ✓research cultures
  ✓gaming

Aims /2

✓examine implications of:
  ✓emerging social networks
  ✓research information systems
  ✓citation profiles
✓develop a framework for responsible metrics for research qualities and impacts, for the evaluation of Horizon 2020 and for wider use in the next framework programme
✓consider required data infrastructures

Across the research community, the description, production and consumption of ‘metrics’ remains contested and open to misunderstandings.

✓ Quantitative evaluation should support expert assessment.
✓ Measure performance in accordance with the research mission.
✓ Protect excellence in locally relevant research.
✓ Keep data collection and analytical processes open, transparent and simple.
✓ Allow for data verification.
✓ Account for variation by field in publication and citation practices.
✓ Interpret data with the difficulty of credit assignment for multi-authored publications in mind.
✓ Base assessment of individual researchers on qualitative judgment.
✓ Avoid false precision.
✓ Take the systemic effects of assessment and indicators into account, and update indicators regularly.

http://www.hefce.ac.uk/rsrch/metrics/

Responsible metrics

✓ Robustness: basing metrics on the best possible data in terms of accuracy and scope;

✓ Humility: recognizing that quantitative evaluation should support – but not supplant – qualitative, expert assessment;

✓ Transparency: keeping data collection and analytical processes open and transparent, so that those being evaluated can test and verify the results;

✓ Diversity: accounting for variation by field, using a variety of indicators to reflect and support a plurality of research & researcher career paths;

✓ Reflexivity: recognizing the potential & systemic effects of indicators and updating them in response.

Ambitions for Open Science

✓ More comprehensive measurement of traditional scientific publications (eg Mendeley)

✓ Recognizing and capturing the diversity of scientific output including new forms (eg software and blogs)

✓ Opening up the whole scientific publication system (open access) and more interactive communication

✓ Opening up the very core of knowledge creation and its role in higher education and innovation (participatory science)


Measuring is changing

✓ What counts as excellence is shaped by how we measure and define “excellence”

✓ What counts as impact is shaped by how we measure and define “impact”

✓ Qualities and interactions are the foundation for “excellence” and “impact” so we should understand those more fundamental processes first

✓ We need different indicators at different levels in the scientific system to inform wise management that strikes the right balance between trust and control

✓ Context crucial for effective data standardization

Call for evidence /1

✓strong support for development and research of open metrics and altmetrics

✓metrics should complement, not replace, human judgment of quality

✓altmetrics are not yet ready for routine use in assessment

✓EU should help develop public sector based metrics

✓diversity key criterion for metrics

Call for evidence /2

✓portfolios of metrics for societal interaction and impact urgently needed

✓open standards for data and indicator infrastructure

✓context should prevail over technical standards

✓reflexive protection against gaming strategies

✓strong support for the Metric Tide and Leiden Manifesto principles

✓portfolios of indicators to support open science
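The call for 'portfolios of indicators' can be made concrete with a small sketch: several complementary signals for one research output are kept side by side rather than collapsed into a single score, leaving interpretation to expert judgment. The indicator set and the numbers are illustrative assumptions, not a proposed standard.

```python
# Illustrative sketch of a portfolio of indicators for one research output.
# Indicator names and values are made up; deliberately no composite score.
from dataclasses import dataclass, asdict

@dataclass
class IndicatorPortfolio:
    output_id: str          # e.g. a DOI or repository handle (placeholder)
    citations: int          # traditional citation count
    mendeley_readers: int   # readership signal
    dataset_downloads: int  # re-use of underlying open data
    software_forks: int     # re-use of released code
    policy_mentions: int    # societal interaction signal

if __name__ == "__main__":
    example = IndicatorPortfolio(
        output_id="doi:10.xxxx/example",
        citations=12,
        mendeley_readers=85,
        dataset_downloads=240,
        software_forks=7,
        policy_mentions=1,
    )
    for indicator, value in asdict(example).items():
        print(f"{indicator}: {value}")
```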

Report outline

✓Metrics: technical state of the art

✓Use of metrics in policy and practice

✓Data infrastructures and open standards

✓Cultures of counting, ethics and research

✓Next generation metrics: the way forward

More information & updates on the progress of the expert panel can be found here: http://ec.europa.eu/research/openscience/index.cfm?pg=altmetrics_eg

To conclude with some problems…

• Do good metrics for science equal good metrics for Open Science?

• The impact of research is becoming more important, but what is a good impact?

• Metrics can never directly measure 'impact' or 'excellence' (whatever the definition); are metrics perhaps more useful for purposes other than those they were created for?

• Final thesis: responsible metrics resemble responsible research (see Von Schomberg, 2013, A Vision of Responsible Research and Innovation)