EU Open Science Agenda – rethinking incentives

I am delighted to have been selected to serve on the High-Level Advisory Group ‘Open Science Policy Platform’ (OSPP) at this important juncture in the EU’s efforts to embed open science in the way research across the EU is conducted and communicated.

At the Dutch Presidency’s Open Science conference in Amsterdam in April, European Commissioner Carlos Moedas announced his vision for an Open Science Policy Platform. The meeting brought together key players including governments, universities, funders, researchers, publishers and other interested parties, and set out a draft Open Science Agenda.

This agenda identified five major policy areas: fostering open science; removing barriers to open science; developing research infrastructures for open science; mainstreaming open access to research results; and embedding open science in society. As high-level objectives, they are hard to argue with, and they resonate with the work we are doing at F1000 to facilitate open science.

I was particularly pleased to see several objectives aimed at tackling the current lack of incentives and rewards for sharing new findings openly and transparently, practices that are integral to open science. Published articles and journal-based metrics remain the main currency that researchers trade when applying for tenure or a grant, fearful that without an impressive publication list they won’t be able to progress in their careers. This is despite statements from many major funders, institutions and even institutional review boards that such methods of assessment will no longer be used – DORA, Wellcome Trust – and despite an explosion in new tools and platforms that can capture use, re-use and impact-related metrics on a broader range of research outputs than ever before.

There are many services that provide insights into article-level measures of quality, attention and engagement around research outputs: some quantitative – Altmetric, PLOS ALMs, Plum Analytics – and some more qualitative – F1000Prime. The National Information Standards Organization (NISO) has also released a set of standards around metrics for non-traditional outputs.

There have also been bold new models of publishing that embrace all aspects of open science, including open data, open peer review and of course open access, such as F1000Research and ScienceOpen, while many publishers have developed new article types such as data notes and software papers, for example in Scientific Data, GigaScience, SoftwareX and F1000Research. Several projects are also working on systems to provide credit for such new outputs through data citation – FORCE11, RDA – and software citation – Software Sustainability Institute, FORCE11 – and for other important researcher activities such as peer review, via ORCID profiles and Publons.

There is great potential to use these new tools and platforms to support more holistic and tailored assessments of research and researchers; the technical infrastructure to enable this shift already exists. But for most researchers, little has really changed. While there are statements that researchers will be assessed using a wide range of measures on a broad set of outputs, those same researchers often serve on grant, career and tenure boards. They know that when faced with a large volume of CVs or grant applications, and in the absence of other impact-related data being automatically available, they end up relying on the familiar currency of high-‘impact’ papers in high-profile journals as the main indication of track record. And for researchers not on such panels, the perceived career risk of using new modes of sharing and publishing their work, and of relying on new measures of quality and attention, is just too great.

So what should we do? There seems to be widespread agreement, not just in the EU but worldwide, that moving the sharing of new research findings towards an open science approach is important, but how can we move the needle to get there?

What many of the emerging tools and platforms have in common is that they are typically driven by a single stakeholder. Without buy-in to these approaches from all sides – not just new publication models, but also incentives from Higher Education institutions and funders to encourage their use, and practical tools and information for those making grant and career decisions based on such outputs and publications – few researchers feel able to use and trust them. This is why I believe that having funders, Higher Education institutions and publishers working together towards a common goal of open science is so important to ensuring its success.

The Open Science Policy Platform is novel in bringing together all the major stakeholders in academic research, and working in a concerted and coordinated fashion will help make this initiative a success. The Agenda talks about ‘speeding up the culture shift and transition’, and that is certainly needed. Ultimately, when it comes to researchers (as with most of us, in reality), money talks.

With Open Access, things only started to shift at speed when funders stepped in and mandated the change. It is therefore good to see that the EU, a huge funder of European research, is the driving force behind this agenda. The stance that all European scientific articles should be freely accessible by 2020 is a great step forward. We need similar action on the broader set of issues around open science, beyond access to the article alone, if we want this agenda to succeed in changing the way science is done and communicated, for the benefit of science and society.
