After a journal began tagging papers that adopted open science practices — such as sharing data and materials — a few other scientists may have been nudged into doing the same.
In January 2014, Psychological Science began awarding digital badges to authors who committed to open science practices such as sharing data and materials. A study published today in PLOS Biology looks at whether publicizing such behavior helps encourage others to follow their lead.
The authors summarize their main findings in the paper:
Before badges, less than 3% of Psychological Science articles reported open data. After badges, 23% reported open data, with an accelerating trend; 39% reported open data in the first half of 2015…
Psychological Science issues three badges: “Open Data” signifies that the authors have shared their data, “Open Materials” that they have shared their materials, and “Preregistration” that they have made their analysis plans available online. The present study examined the effectiveness of two of these badges: open data and open materials.
Regarding the open materials badge, the authors reported:
Open materials also increased to a weaker degree, and there was more variability among comparison journals.
The study also tracked open practices in four other journals from the same discipline, where they saw “no change in low data sharing rates over the same time period,” said Mallory Kidwell, project coordinator at the Center for Open Science in Charlottesville, Virginia (with whom we are working to create a retraction database on the Open Science Framework), and lead author of the new paper.
Here’s what the badges look like:
Kidwell told us that, over the course of the study period (January 2012 to May 2015):
We also found that, with badges, authors were more likely to follow through in making the data accessible and sharing data that was correct, usable, and complete for other researchers to reuse or reanalyze. Among the authors that did report sharing data in the other [comparison] journals, the data was less likely to actually be available, correct, usable, or complete.
In the status quo, however, data and material sharing are treated as add-on activities, which results in extra work for researchers with little reward, said Kidwell. She noted:
The badges are an effective incentive for journals across scientific disciplines to reward authors who meet open research standards and demonstrate their value of transparent scientific research with relatively little bureaucratic burden or risk.
This isn’t the only initiative to test the idea of visually tagging papers: Other related projects have used digital badges to clarify the roles of co-authors of research papers.
But Melissa Haendel, an associate professor at the Oregon Health & Science University who peer-reviewed the new PLOS Biology paper, said the results don’t say much about the practice as a whole:
This is a case study about how effective a new policy and badge system is in a single journal at promoting open sharing of data, and therefore doesn’t have the weight that it might have if more broadly examined across a variety of journals. It is difficult to assess the difference between a change in policy having an effect, and the assignment of a badge.
She added:
To actually differentiate the effects of policy versus a badge, you’d need to have the same policy with and without the badge. Therefore, what is being evaluated here is actually policy plus the badge, and not the badges themselves. Also, no survey instrument was used on authors to further investigate their motivations as related to the badges.
Another limitation, said Haendel, is that it is difficult to determine whether the improvement is due to other factors, such as a change in governmental policies or a change in editor. Kidwell, however, pointed us to an editorial that listed the changes at the journal during this time period, and noted that implementing badges was the only change related to open science.
Furthermore, since Psychological Science has a relatively high rejection rate, authors may simply be doing whatever they can to get accepted, not embracing open science for its own sake, noted Haendel:
Here, with 93% rejection rate, it makes sense that authors are going to comply with everything in the guide to authors if they want their paper in Psychological Science, badges or not.
The authors note in the paper that the journal’s rejection rate is higher than the comparator psychology journals:
In comparison, rejection rates at Clinical Psychological Science, Developmental Psychology, Journal of Experimental Psychology: Learning, Memory, and Cognition, and Journal of Personality and Social Psychology are ~77%, 80%, 78%, and 89%, respectively.
Kidwell told us there were many reasons why the authors chose to study the effect of badges in Psychological Science rather than the five other journals that have so far adopted the badges:
1) Psychological Science has been awarding badges for the longest period of time, allowing for a better assessment of impact
2) Psychological Science is one of the leading journals publishing empirical psychology research, and its articles are highly impactful to the field
3) Articles published in Psychological Science span a broad range of disciplines within psychology, which makes its articles more comparable to articles published in all four discipline-specific comparison journals
4) Psychological Science has a comparatively high volume of articles to obtain a reasonably good sample size in a short period
The authors pointed out another limitation in their paper:
…many of the manuscript submissions of articles published in 2014 would have occurred in 2013—prior to the announcement of the new policy.
With psychology’s ongoing reproducibility crisis, access to experimental data and materials is crucial for the field, said Kidwell:
Having access to the original materials is extremely helpful when conducting a replication of an experiment, and badges are certainly a way to increase the amount of materials that are available for researchers interested in conducting replication studies.