Data review: Making it happen!
25 March, 2013 | Rebecca Lawrence
I’ve recently returned from the Beyond the PDF2 unconference in Amsterdam and I’m really excited about the enthusiastic response we received to the proposals for new data review guidelines, which I presented on behalf of a project we have been co-organising with Geraldine Stoneham-Clement, Elizabeth Newbold and Jonathan Tedds.
The purpose of the project
At F1000Research we have been thinking about the challenges presented by publishing and peer reviewing data for quite some time. Our policy is that authors must publish and share all the data behind their results (within the obvious limits of data protection rules) so we’ve had to spend time thinking about how, practically, to make this work. One of the many challenges is the complexity of the relationship between the data/article peer review conducted by our journal and the varying levels of data curation conducted by different data repositories, from institutional to subject-specific to general repositories like Dryad and Figshare. The idea of setting up a workshop and inviting other stakeholders to get together and discuss these issues had come up in many conversations with Elizabeth Newbold’s team from the British Library and DataCite, and it had naturally been identified as a core element within the JISC-MRD (Managing Research Data) PREPARDE Project, headed up by Jonathan Tedds of the University of Leicester.
What became the tipping point, certainly for me, was a conversation with Geraldine Stoneham-Clement of the UK’s Medical Research Council (MRC), who mentioned that they were about to embark on a consultation process to define how their peer reviewers should go about assessing the data management plans submitted in grant applications. We all realised that we were about to be discussing similar issues in our own silos, duplicating effort and, inevitably, setting out different and potentially conflicting requirements for data review.
And so the project was born.
The goals
We all agreed that the final output of the collaboration must be something concrete and actionable – a set of recommendations that the main players within the different stakeholder groups would be willing to agree to and implement – and not just talk. We started the process with a workshop at the British Library in mid-March, involving a small, targeted group of 35 people representing funders, repositories, institutions, researchers and publishers.
The outcomes
The workshop participants converged on a proposed initial set of 12 recommendations that focus on three fundamental areas of shared concern:
- Connecting data review with data management planning before, during and after completion of a research study;
- Connecting scientific review, technical review and curation to facilitate any resultant data publication;
- Defining the guidelines, tools and resources required to connect data review with article review.
Jonathan Tedds presented the recommendations at the Research Data Alliance (RDA) launch event in Gothenburg, formalising the creation of an RDA Working Group on data review, and I presented them at the simultaneous Beyond the PDF2 event in the ‘Making it Happen’ session in Amsterdam. We are also planning to present the recommendations at a long list of other upcoming events.
Initial thoughts and feedback
We’ve received a lot of support for the proposals and lots of promises of detailed feedback – exactly what we are looking for. As is often the case in such a rapidly growing and changing area, we’ve learned that there are other groups having similar discussions and (thankfully) coming to similar conclusions, and they’re looking around, as we are, to see who else is working on data review. Our aim now is to connect with all these groups, combine our expertise and requirements, and formulate joint proposals that will work for everyone.
As word of the data review working group has spread, numerous people have approached me asking how we might be able to help link their specific sets of standards, guidelines, white papers, etc. to our recommendations. We clearly don’t want to go to that level of specificity within the recommendations, especially given that the needs and requirements vary greatly among specialty areas and even between data types within the same field. It occurred to me, however, that as we seem to be naturally accumulating this information, it might be helpful for us to annotate the recommendations with a simple list of who is working on producing the various guidelines and tools relevant to each specific space. That way we can create a useful list of ongoing and completed projects, and provide information that could help to foster new collaboration and integration between some of these related efforts.
Now we need your help
So now it’s your turn. We need your feedback on the proposed recommendations!
- For those recommendations that relate directly to what you do, would you be prepared to implement them?
- If not, what are the barriers for you, and what alternatives would you propose? Please be bold – we need to make progress and to do so will require significant changes on everyone’s part.
- If you are working with groups already considering these issues, please tell us – we want to work with you!
- If you are involved in creating useful standards, guidelines, white papers, etc. that relate specifically to the detail of some of our data review recommendations (for example, the Data Seal of Approval in relation to publishers providing a list of trusted repositories), then please also let us know.
Please circulate these proposed recommendations to as many of your colleagues as possible so we can also hear what they think.
To send feedback, please join the new JISC data publication listserv and then e-mail in your comments.
We are planning a follow-up workshop in late June in London, where we will pull together the feedback and create a final set of recommendations that will be written up as a short practice paper. We will then all have made a shared contribution to ‘Making it Happen’!