BITSS Preprints | Why We Need Open Policy Analysis

Abstract:  The evidence-based policy movement promotes the use of empirical evidence to inform policy decision-making. While this movement has gained traction over the last two decades, several concerns about the credibility of empirical research have been identified in scientific disciplines that use research methods and practices that are commonplace in policy analysis. As a solution, we argue that policy analysis should adopt the transparent, open, and reproducible research practices espoused in related disciplines. We first discuss the importance of evidence-based policy in an era of increasing disagreement about facts, analysis, and expertise. We then review recent credibility crises of empirical research (difficulties reproducing results), their causes (questionable research practices such as publication biases and p-hacking), and their relevance to the credibility of evidence-based policy (trust in policy analysis). The remainder of the paper makes the case for “open” policy analysis and how to achieve it. We include examples of recent policy analyses that have incorporated open research practices such as transparent reporting, open data, and code sharing. We conclude with recommendations on how key stakeholders in evidence-based policy can make open policy analysis the norm and thus safeguard trust in using empirical evidence to inform important policy decisions.

Open Science Comes To Policy Analysis – CEGA

“This post is co-authored by Fernando Hoces de la Guardia, BITSS postdoctoral scholar, along with Sean Grant (Associate Behavioral and Social Scientist at RAND) and CEGA Faculty Director Ted Miguel. It is cross-posted with the BITSS Blog.

The Royal Society’s motto, “Take nobody’s word for it,” reflects a key principle of scientific inquiry: as researchers, we aspire to discuss ideas in the open, to examine our analyses critically, to learn from our mistakes, and to constantly improve. This type of thinking shouldn’t guide only the creation of rigorous evidence; rather, it should extend to the work of policy analysts whose findings may affect very large numbers of people. At the end of the day, a commitment to scientific rigor in public policy analysis is the only durable response to potential attacks on credibility. We, the three authors of this blog (Fernando Hoces de la Guardia, Sean Grant, and Ted Miguel), recently published a working paper suggesting a parallel between the reproducibility crisis in social science and observed threats to the credibility of public policy analysis. Researchers and policy analysts both perform empirical analyses; have a large amount of undisclosed flexibility when collecting, analyzing, and reporting data; and may face strong incentives to obtain “desired” results (for example, p-values of <0.05 in research, or large negative/positive effects in policy analysis)….”
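
The inflation of “significant” findings that undisclosed analytic flexibility can produce is easy to see in a small simulation. The sketch below is not part of the paper or the blog post; it assumes one particular form of flexibility (outcome switching, i.e., testing a null intervention against many candidate outcome measures and reporting only the smallest p-value), and the sample sizes and number of specifications are illustrative choices.

```python
# Illustrative sketch (assumption: flexibility takes the form of outcome
# switching). A "policy" with no true effect is tested against 20 unrelated
# outcome measures, and only the smallest p-value is reported.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def one_analysis(n=200, n_outcomes=20):
    """Return the smallest p-value across n_outcomes unrelated outcomes."""
    treated = rng.integers(0, 2, size=n).astype(bool)  # random assignment, no true effect
    outcomes = rng.normal(size=(n_outcomes, n))        # outcomes unrelated to treatment
    pvals = [stats.ttest_ind(y[treated], y[~treated]).pvalue for y in outcomes]
    return min(pvals)

n_reps = 2000
share = np.mean([one_analysis() < 0.05 for _ in range(n_reps)])
print(f"Share of null analyses reporting p < 0.05: {share:.2f}")
# A single pre-specified outcome would yield roughly 0.05; searching over 20
# independent outcomes pushes the rate toward 1 - 0.95**20, about 0.64.
```

The same logic applies to a policy analyst choosing among discount rates, model specifications, or sample restrictions: without transparent reporting of all analytic choices, readers cannot tell whether a headline estimate reflects the data or the search.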