Hiring Policy at the LMU Psychology Department: Better have some open science track record

“Our department embraces the values of open science and strives for replicable and reproducible research. For this goal we support transparent research with open data, open materials, and study pre-registration. Candidates are asked to describe in what way they already pursued and plan to pursue these goals.”

How green is our valley?: five-year study of selected LIS journals from Taylor & Francis for green deposit of articles

Abstract: This study reviews content from five different library and information science journals: Behavioral & Social Sciences Librarian, Collection Management, College & Undergraduate Libraries, Journal of Electronic Resources Librarianship and Journal of Library Administration over a five-year period from 2012–2016 to investigate the green deposit rate. Starting in 2011, Taylor & Francis, the publisher of these journals, waived the green deposit embargo for library and information science, heritage and archival content, allowing immediate deposit of articles in these fields. The review examines research articles and standing columns from these five journals over the five years to see whether any articles could be retrieved using the OA Button or through institutional repositories. Results indicate that fewer than a quarter of authors chose to make a green deposit of their articles in local or subject repositories. The discussion outlines some best practices to be undertaken by librarians, editors and Taylor & Francis to make this program more successful.

APA Creates Open Science and Methodology Chair to Deepen Commitment to Data Sharing, Transparency in Science

“The American Psychological Association has created an open science and methodology chair to work with its authors, reviewers, editors and publications board to understand and develop best practices for the evolving landscape of open science in psychological research. “APA is committed to promoting transparency and sound practice in psychological research,” said Rose Sokol-Chang, PhD, APA’s journals publisher. “We are enthusiastic about offering the psychology community another resource to bolster this work.”

APA’s Publications and Communications Board approved the post and will issue an open call to recruit for it in early summer. The chair will initially work with a committee to help refine and extend the P&C Board policy for APA journals related to open science practices. APA Journals is committed to publishing transparent research, publishing replications and offering resources such as its Journal Article Reporting Standards for quantitative, qualitative and mixed methods research design; open science badges; and an APA Journals data repository, in conjunction with the Center for Open Science. “APA recognizes the importance of sharing data to aid secondary discovery, increase efficiency in research discoveries and improve reproducibility in science,” said Sokol-Chang.

Qualifications for the post are experience in open science practices, including data sharing, reproducibility and preregistration; editorial experience and familiarity with APA journals policy; experience with data management, research methodology and clinical trials; and having served on an institutional review board. Interested applicants can read more about the position online or by email.

APA is the world’s largest nonprofit publisher of psychological science, setting standards for scholarship in the field. APA Publishing produces journals, books, videos, databases and educational products for researchers, practitioners, educators, students and the public.”

Frankl: An open science platform

“Frankl is a blockchain platform and tokenised economy to promote, facilitate, and incentivise the practice of open science. The initial focus of Frankl is cognitive assessment – an area of our expertise, and a research domain that faces particular challenges that are amenable to blockchain solutions.

In Phase I, Frankl will develop app-based cognitive assessments that streamline test administration and improve accessibility for children and adults with physical or cognitive disabilities. Apps will interface with blockchain-based data storage, facilitating data sharing for clinical and research purposes while maintaining privacy of individuals via encryption. Access to the Frankl suite of apps will be via micropayments in Frankl tokens.

In Phase II, Frankl will release the source code for the apps, enabling researchers, clinicians, and independent app developers to build their own cognitive assessment apps on the Frankl platform. In this way, Frankl will create a marketplace to incentivise (via Frankl tokens) the development of new and better cognitive assessments, simultaneously promoting open science and disrupting the forecast (by 2021) USD 8 billion global market for cognitive assessment and training.

This whitepaper outlines the technical specifications for the Frankl platform, the practical path to its creation, and exemplar applications including our first use case – a cognitive assessment specifically designed for autistic children. We provide details of the Frankl token economy and participation, and sketch out our long term vision for the development of Frankl as an interface whereby blockchain technologies facilitate the widespread adoption of open science practices. …”

Welcome to Cogprints – Cogprints

“Welcome to CogPrints, an electronic archive for self-archive papers in any area of Psychology, Neuroscience, and Linguistics, and many areas of Computer Science (e.g., artificial intelligence, robotics, vision, learning, speech, neural networks), Philosophy (e.g., mind, language, knowledge, science, logic), Biology (e.g., ethology, behavioral ecology, sociobiology, behaviour genetics, evolutionary theory), Medicine (e.g., Psychiatry, Neurology, human genetics, Imaging), Anthropology (e.g., primatology, cognitive ethnology, archeology, paleontology), as well as any other portions of the physical, social and mathematical sciences that are pertinent to the study of cognition….”

Estimating the reproducibility of psychological science | Science

The co-authors of this work call themselves the Open Science Collaboration.

“No single indicator sufficiently describes replication success, and the five indicators examined here are not the only ways to evaluate reproducibility. Nonetheless, collectively these results offer a clear conclusion: A large portion of replications produced weaker evidence for the original findings despite using materials provided by the original authors, review in advance for methodological fidelity, and high statistical power to detect the original effect sizes. Moreover, correlational evidence is consistent with the conclusion that variation in the strength of initial evidence (such as original P value) was more predictive of replication success than variation in the characteristics of the teams conducting the research (such as experience and expertise). The latter factors certainly can influence replication success, but they did not appear to do so here.

Reproducibility is not well understood because the incentives for individual scientists prioritize novelty over replication. Innovation is the engine of discovery and is vital for a productive, effective scientific enterprise. However, innovative ideas become old news fast. Journal reviewers and editors may dismiss a new test of a published idea as unoriginal. The claim that “we already know this” belies the uncertainty of scientific evidence. Innovation points out paths that are possible; replication points out paths that are likely; progress relies on both. Replication can increase certainty when findings are reproduced and promote innovation when they are not. This project provides accumulating evidence for many findings in psychological research and suggests that there is still more work to do to verify whether we know what we think we know….”

APA releases new journal article reporting standards

“Brian Nosek, PhD, co-founder and director of the Center for Open Science, welcomed the new standards. “Achieving the ideals of transparency in science requires knowing what one needs to be transparent about,” he said. “These updated standards will improve readers’ understanding of what happened in the research. This will improve both the accuracy of interpretation of the existing evidence, and the ability to replicate and extend the findings to improve understanding.” APA has partnered with the Center for Open Science to advance open science practices in psychological research through open science badges on articles, a data repository for APA published articles and designating the COS’ PsyArXiv as the preferred preprint server for APA titles….”