Professors Receive NSF Grant to Develop Training for Recognizing Predatory Publishing | Texas Tech Today | TTU

“With more open-access journals making research articles free for people to view, some journals are charging authors publication fees to help cover costs. While some journals that do this are still peer-reviewed and credible, others are not and will publish lower quality work strictly for profit. The difference can be hard to tell, even to the most seasoned author….”

The National Science Foundation Awards scite Competitive R&D Grant to Build Tool to Identify and…

“scite, Inc. has been awarded a National Science Foundation (NSF) Small Business Innovation Research (SBIR) grant for $224,559 to conduct research and development (R&D) work on developing a deep learning platform that can evaluate the reliability of scientific claims by citation analysis….”
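The announcement does not describe how such a platform would work internally. One common approach to citation analysis of this kind is to classify the sentences that cite a paper (its “citances”) as supporting, contrasting, or merely mentioning the cited claim. The sketch below illustrates that general idea with a generic text classifier; the labels, toy examples, and model choice are assumptions for illustration only, not scite’s actual data or system.

```python
# Illustrative sketch only: a generic text classifier over citation
# statements (supporting / contrasting / mentioning). The labels and
# toy examples are assumptions for demonstration, not scite's model.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny hypothetical training set: sentences that cite a target paper,
# labeled by whether they support, contrast with, or merely mention it.
citances = [
    "Our results confirm the effect reported by Smith et al.",
    "We replicate the main finding of the original study.",
    "In contrast to prior work, we observe no such correlation.",
    "These data contradict the earlier conclusion.",
    "Smith et al. proposed a related method.",
    "See the original paper for details of the protocol.",
]
labels = ["supporting", "supporting", "contrasting",
          "contrasting", "mentioning", "mentioning"]

clf = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),   # bag-of-words / bigram features
    LogisticRegression(max_iter=1000),     # simple multiclass classifier
)
clf.fit(citances, labels)

# Classify a new citing sentence.
print(clf.predict(["Our experiment fails to reproduce this result."]))
```

A production system would rely on much larger labeled corpora and contextual language models rather than a bag-of-words baseline, but the input/output shape is the same: citation sentences in, a support/contrast/mention judgment out.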

Extending U.S. Biodiversity Collections to Promote Research and Education

“Our national heritage of approximately one billion biodiversity specimens, once digitized, can be linked to emerging digital data sources to form an information-rich network for exploring earth’s biota across taxonomic, temporal and spatial scales. A workshop held 30 October – 1 November 2018 at Oak Spring Garden in Upperville, VA under the leadership of the Biodiversity Collections Network (BCoN) developed a strategy for the next decade to maximize the value of our collections resource for research and education. In their deliberations, participants drew heavily on recent literature as well as surveys, and meetings and workshops held over the past year with the primary stakeholder community of collections professionals, researchers, and educators.

Arising from these deliberations is a vision to focus future biodiversity infrastructure and digital resources on building a network of extended specimen data that encompasses the depth and breadth of biodiversity specimens and data held in U.S. collections institutions. The extended specimen network (ESN) includes the physical voucher specimen curated and housed in a collection and its associated genetic, phenotypic and environmental data (both physical and digital). These core data types, selected because they are key to answering driving research questions, include physical preparations such as tissue samples and their derivative products such as gene sequences or metagenomes, digitized media and annotations, and taxon- or locality-specific data such as occurrence observations, phylogenies and species distributions. Existing voucher specimens will be extended both manually and through new automated methods, and data will be linked through unique identifiers, taxon name and location across collections, across disciplines and to outside sources of data. As we continue our documentation of earth’s biota, new collections will be enhanced from the outset, i.e., accessioned with a full suite of data. We envision the ESN proposed here will be the gold standard for the structured cloud of integrated data associated with all vouchered specimens. These permanent specimen vouchers, in which genotypes and phenotypes link to a particular environment in time and space, comprise an irreplaceable resource for the millennia….”
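The report describes linking each physical voucher, through unique identifiers, taxon names, and localities, to its genetic, phenotypic, environmental, and media data. As a rough illustration of what a single node in such an extended specimen network might look like as a structured record, here is a minimal sketch; all field names and example values are hypothetical, not a schema proposed by BCoN.

```python
# Hypothetical sketch of one "extended specimen" record: a physical
# voucher plus links, via persistent identifiers, to derivative and
# associated data. Field names are illustrative, not the BCoN schema.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ExtendedSpecimen:
    specimen_guid: str                  # persistent identifier for the voucher
    scientific_name: str                # taxon name used for cross-collection linking
    collection_code: str                # holding institution / collection
    locality: Optional[str] = None      # where it was collected
    collected_on: Optional[str] = None  # collection date, if known
    tissue_sample_ids: List[str] = field(default_factory=list)      # physical preparations
    sequence_accessions: List[str] = field(default_factory=list)    # derivative gene sequences
    media_uris: List[str] = field(default_factory=list)             # digitized images, scans
    occurrence_record_ids: List[str] = field(default_factory=list)  # links to occurrence data

# Example: a voucher linked outward to a tissue sample, a sequence, and an image.
beetle = ExtendedSpecimen(
    specimen_guid="urn:uuid:0000-example",
    scientific_name="Cicindela oregona",
    collection_code="EXAMPLE-ENT",
    locality="Oregon, USA",
    tissue_sample_ids=["EXAMPLE-TISSUE-001"],
    sequence_accessions=["EXAMPLE-SEQ-001"],
    media_uris=["https://example.org/media/0000-example.jpg"],
)
print(beetle.scientific_name, len(beetle.media_uris))
```

The point of such a structure is that every outward link is an identifier into another collection or database (tissue catalogs, sequence repositories, media servers, occurrence aggregators), so records can be joined across institutions and disciplines without copying the underlying data.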

BCoN Report: Extending U.S. Biodiversity Collections to Promote Research and Education

“The Biodiversity Collections Network has released its new report, Extending U.S. Biodiversity Collections to Promote Research and Education. You are invited to download and share the summary brochure and to review the longer report that provides additional detail about this vision for the future. …”

Report urges massive digitization of museum collections | Science | AAAS

“The United States should launch an effort to create an all-encompassing database of the millions of stuffed, dried, and otherwise preserved plants, animals, and fossils in museums and other collections, a U.S. National Science Foundation (NSF)–sponsored white paper released today urges. The report, titled Extending U.S. Biodiversity Collections to Promote Research and Education, also calls for new approaches to cataloging digitized specimens and linking them to a range of other data about each organism and where it was collected. If the plan is carried out, “There will be [a] huge potential impact for the research community to do new types of research,” says NSF biology Program Director Reed Beaman in Alexandria, Virginia.

The effort could take decades and cost as much as half a billion dollars, however, and some researchers are worried the white paper will not win over policymakers. “I just wish that the report focused more on the potential benefits for noncollections communities,” says James Hanken, director of the Harvard Museum of Comparative Zoology in Cambridge, Massachusetts.

For the past 8 years, NSF has sponsored the $100 million, 10-year Advancing Digitization of Biodiversity Collections program, which has paid for nearly 62 million plant and animal specimens to be digitally photographed from multiple angles for specific research studies. New technology has greatly sped up the process. Already, researchers studying natural history and how species are related are reaping the benefits of easy access to a wealth of information previously locked in museums….”

EU open-access envoy urges foundations to join Plan S

“Organisations such as the Bill and Melinda Gates Foundation and the Wellcome Trust should join Plan S to continue their “moral leadership” on open research, Plan S founder and European Commission open-access envoy Robert-Jan Smits told Research Europe. He was speaking on his return from a weeklong tour of federal agencies, universities and learned societies in the United States, where he was attempting to boost international support for the plan….

Smits claimed that the feedback on Plan S he received in the US was mostly that independent foundations need to join….

Smits has said that Plan S is based on the Bill and Melinda Gates Foundation’s policies. These include that papers reporting research it has funded must be made openly available immediately and with a licence that permits unrestricted reuse. The foundation has forced some of the world’s most prestigious journals to change their policies so that they comply.

During the trip, Smits sought to quell fears that Plan S would undermine the so-called green open-access model, in which papers are placed in repositories, usually after a publisher-imposed embargo period. Plan S will not accept embargo periods, causing some concern that it will only support the gold open-access model in which papers are made openly available immediately, usually by paying publishers an article-processing charge.

Smits said that Plan S leaves “ample room” for repositories, article preprints and self-archiving. He also admitted that organisations in the US flagged the plan’s lack of recognition for publishers using the so-called diamond and platinum open-access models, which do not charge authors publication fees….

According to Smits, those he met who were most enthusiastic about Plan S were librarians and researchers at the Massachusetts Institute of Technology and Harvard University.

More cautiously interested parties, he said, were the White House’s Office of Science and Technology Policy, the National Institutes of Health and the National Science Foundation. Smits said this was because the OSTP is awaiting a new director who will set the agenda for open access at the federal level. Research Europe has approached these organisations for comment.

Those who were most sceptical of the plan were the learned societies, Smits said. These organisations rely on income from journal subscription charges and fear that the loss of revenue caused by a switch to open access would affect activities such as the organisation of conferences, he said….”

Digitizing the vast “dark data” in museum fossil collections | Salon.com

“The uniqueness of each museum collection means that scientists routinely make pilgrimages worldwide to visit them. It also means that the loss of a collection, as in the recent heart-wrenching fire in Rio de Janeiro, represents an irreplaceable loss of knowledge. It’s akin to the loss of family history when a family elder passes away. In Rio, these losses included one-of-a-kind dinosaurs, perhaps the oldest human remains ever found in South America, and the only audio recordings and documents of indigenous languages, including many that no longer have native speakers. Things we once knew, we know no longer; things we might have known can no longer be known.

But now digital technologies — including the internet, interoperable databases and rapid imaging techniques — make it possible to electronically aggregate museum data. Researchers, including a multi-institutional team I am leading, are laying the foundation for the coherent use of these millions of specimens. Across the globe, teams are working to bring these “dark data” — currently inaccessible via the web — into the digital light….

The sheer size of fossil collections, and the fact that most of their contents were collected before the invention of computers and the internet, make it very difficult to aggregate the data associated with museum specimens. From a digital point of view, most of the world’s fossil collections represent “dark data.” …

The Integrated Digitized Biocollections (iDigBio) site hosts all the major museum digitization efforts in the United States funded by the current NSF initiative that began in 2011….

Our group, called EPICC for Eastern Pacific Invertebrate Communities of the Cenozoic, quantified just how much “dark data” are present in our joint collections. We found that our 10 museums contain fossils from 23 times as many collection sites in California, Oregon and Washington as are currently documented in a leading online electronic database of the paleontological scientific literature, the Paleobiology Database….”

Open Science Grid | A national, distributed computing partnership for data-intensive research

“The OSG provides common service and support for resource providers and scientific institutions using a distributed fabric of high throughput computational services. The OSG does not own resources but provides software and services to users and resource providers alike to enable the opportunistic usage and sharing of resources. The OSG is jointly funded by the Department of Energy and the National Science Foundation….”

A commitment to openness | Research Information

“I started life at Jisc as a programme manager, on a project that was jointly funded by the National Science Foundation in the US and Jisc. This was a fairly forward-thinking project in digital libraries and from this, we began working on how to make sure researchers had maximum access to information and collections, and how we could do that collaboratively, building on expertise on both sides of the Atlantic.

At this, I managed the pilot site licence initiative, which in essence is what became Jisc Collections as it is today; it was about ensuring ongoing access as the world of journal archives became digital. The subsequent model licence, designed to provide a smooth transition from analogue, was, I think, a world first, and the clauses added are aligned to the aspirations of the open access movement – as the world became born digital, open access was a logical next step. There were some real thought leaders in the sector at that time who made it their mission to ensure as many people as possible could have access to that publicly funded research, as a point of principle….”

Notes on the Public Access to Public Science Act – Harvard Open Access Project

“PAPS requires covered federal agencies to develop public-access policies (Section 2.a). There are four covered agencies: the National Aeronautics and Space Administration (NASA), the National Science Foundation (NSF), the National Institute of Standards and Technology (NIST), and the National Weather Service….”