“A crowd-sourced project to evaluate journal policies and advocate for open science practices. Currently in beta testing. We are also currently evaluating the reliability of the TOP Policy Evaluation Tool….”
“In its latest State of Open Data survey, Figshare revealed that a hefty 64 per cent of respondents made their data openly available in 2018.
The percentage, up four per cent from last year and seven per cent from 2016, indicates a healthy awareness of open data, and for Daniel Hook, chief executive of Figshare’s parent company, Digital Science, it spells good news….
For example, the majority of respondents – 63 per cent – support national mandates for open data, an eight per cent rise from 2017. And, at the same time, nearly half of the respondents – 46 per cent – reckon data citations motivate them to make data openly available. This figure is up seven per cent from last year….
Yet, amid the data-sharing success stories, myriad worries remain. Top of the pile is the potential for data misuse….
Inappropriate sharing of data is another key concern….
Results indicated that a mighty 58 per cent of respondents felt they do not receive sufficient credit for sharing data, while only nine per cent felt they do….
Coko recently won funding from the Sloan Foundation to build DataSeer, an online service that will use Natural Language Processing to identify datasets that are associated with a particular article. …”
Transparency is essential for scientific progress. Access to underlying data and materials allows us to make progress through new discoveries and to better evaluate reported findings, which increases trust in science. However, there are challenges to changing norms of scientific practice. Culture change is a slow process because of inertia and the fear of unintended consequences.
One barrier to change that we encounter as we advocate to journals for more data sharing is an editor’s uncertainty about how their publisher will react to such a change. Will they help implement that policy? Will they discourage it because of uncertainty about how it might affect submission numbers or citation rates? Amid such uncertainty, inaction seems easier.
“There are eight standards in the TOP guidelines; each moves scientific communication toward greater openness. These standards are modular, facilitating adoption in whole or in part. However, they also complement each other, in that commitment to one standard may facilitate adoption of others. Moreover, the guidelines are sensitive to barriers to openness by articulating, for example, a process for exceptions to sharing because of ethical issues, intellectual property concerns, or availability of necessary resources. The complete guidelines are available in the TOP information commons at http://cos.io/top, along with a list of signatories that numbered 86 journals and 26 organizations as of 15 June 2015. …
The journal article is central to the research communication process. Guidelines for authors define what aspects of the research process should be made available to the community to evaluate, critique, reuse, and extend. Scientists recognize the value of transparency, openness, and reproducibility. Improvement of journal policies can help those values become more evident in daily practice and ultimately improve the public trust in science, and science itself.”
Abstract: The evidence-based policy movement promotes the use of empirical evidence to inform policy decision-making. While this movement has gained traction over the last two decades, several concerns about the credibility of empirical research have been identified in scientific disciplines that use research methods and practices that are commonplace in policy analysis. As a solution, we argue that policy analysis should adopt the transparent, open, and reproducible research practices espoused in related disciplines. We first discuss the importance of evidence-based policy in an era of increasing disagreement about facts, analysis, and expertise. We then review recent credibility crises of empirical research (difficulties reproducing results), their causes (questionable research practices such as publication biases and p-hacking), and their relevance to the credibility of evidence-based policy (trust in policy analysis). The remainder of the paper makes the case for “open” policy analysis and how to achieve it. We include examples of recent policy analyses that have incorporated open research practices such as transparent reporting, open data, and code sharing. We conclude with recommendations on how key stakeholders in evidence-based policy can make open policy analysis the norm and thus safeguard trust in using empirical evidence to inform important policy decisions.
“By developing shared standards for open practices across journals, we hope to translate scientific norms and values into concrete actions and change the current incentive structures to drive researchers’ behavior toward more openness. Although there are some idiosyncratic issues by discipline, we sought to produce guidelines that focus on the commonalities across disciplines….”
“After a month of intense conversations and negotiations, the Senate Homeland Security and Governmental Affairs Committee (HSGAC) will bring the ‘Fair Access to Science and Technology Research (FASTR) Act’ up for mark-up on Wednesday, July 29th. The language that will be considered is an amended version of FASTR, officially known as the ‘Johnson-Carper Substitute Amendment,’ which was officially filed by the HSGAC leadership late on Friday afternoon, per committee rules. There are two major changes from the original bill language to be particularly aware of. Specifically, the amendment replaces the six-month embargo period with ‘no later than 12 months, but preferably sooner,’ as anticipated; and provides a mechanism for stakeholders to petition federal agencies to ‘adjust’ the embargo period if the 12 months does not serve ‘the public, industries, and the scientific community.’ We understand that these modifications were made in order to accomplish a number of things: satisfy the requirement of a number of Members of HSGAC that the language more closely track that of the OSTP Directive; meet the preference of the major U.S. higher education associations for a maximum 12-month embargo; ensure that, for the first time, a number of scientific societies will drop their opposition to the bill; and ensure that any petition process an agency may enable is focused on serving the interests of the public and the scientific community …”
“Impact is multi-dimensional, the routes by which impact occurs differ across disciplines and sectors, and impact changes over time. Jane Tinkler argues that if institutions like HEFCE specify a narrow set of impact metrics, more harm than good would come to universities forced to limit their understanding of how research is making a difference. But qualitative and quantitative indicators continue to be an incredible source of learning for how impact works in each of our disciplines, locations or sectors.”
“Open access for monographs and book chapters is a relatively new area of publishing, and there are many ways of approaching it. With this in mind, a recent publication from the Wellcome Trust aims to provide some guidance for publishers to consider when developing policies and processes for open access books. The Wellcome Trust recognises that the implementation of open access publishing for monographs and book chapters is in flux, and invites publishers to email Cecy Marden at firstname.lastname@example.org with any suggestions for further guidance that would be useful to include in this document. ‘Open Access Monographs and Book Chapters: A practical guide for publishers’ is available to download as a pdf from the Wellcome Trust website.”
“The purpose of this post is to shed some light on a specific issue in the transition to open access that particularly affects small and low-cost publishers, and to suggest one strategy to address this issue. In the words of one Resource Requirements interviewee: ‘So the other set of members that we used to have, about forty library members, but when we went to open access online, we lost the whole bunch of libraries. Yeah, so basically we sent everybody, you know, a letter saying we are going to open access online, the annual membership is only $30, we hope you will continue to support us even though there are no longer print journals, and then a whole flood of cancellations came in from a whole bunch of libraries, which we had kind of thought might happen, but given how cheap we are, I have to say I was really disappointed when it indeed did happen, especially from a whole bunch of [deleted] libraries [for which our journal is extremely relevant]. I was going, seriously, $30?’

Comments: for a university library, a society membership fee, when not required for journal subscriptions, may be difficult to justify from an accounting perspective. $30 is a small cost; however, for a university, the administrative work of tracking such memberships and cutting a check every year likely exceeds the $30 cost. With 40 library members at $30 each, the total revenue for this journal from this source was $1,200. A university or university library could sponsor this amount for less than the cost of many an article processing charge. The university and library where the faculty member is located have a support program for open access journals; clearly the will, and some funding, are there. One of the challenges is transitioning subscription dollars to support for open access, as I address in my 2013 First Monday article. 
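The revenue comparison above can be made concrete with a short sketch. The membership numbers ($30 per library, 40 libraries) come from the quoted interview; the article processing charge used for comparison is a hypothetical round figure chosen only for illustration, not a number from the post.

```python
# Membership revenue arithmetic from the quoted interview.
members = 40
membership_fee = 30  # USD per library per year, as quoted

journal_revenue = members * membership_fee  # total annual membership revenue

# Hypothetical APC for comparison (illustrative assumption, not from the post).
example_apc = 1500  # USD

print(journal_revenue)                # 1200
print(journal_revenue < example_apc)  # True: one sponsor could cover the
                                      # journal for less than one such APC
```

The point of the comparison is the one the author makes: a single institutional sponsorship covering the journal's entire membership revenue can cost less than one article processing charge at many commercial open access journals.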
Following is one suggestion for libraries, or for faculty to suggest to their libraries: why not engage your faculty who are independent or society publishers to build support for cancellations, or for tough negotiations and lower prices, on the big deals of large, highly profitable commercial publishers? I argue such steps are critical to redirecting funding to our own publishing activities. Here is one scenario that may help to explain the potential …”