“From Yojana Sharma in University World News (July 20, 2018): ‘China’s new regulations restricting the ‘export’ of scientific data collected within the country and asserting that any research for publication in international journals must first be approved by a new, yet-to-be-set-up authority, are causing uncertainty and concern for many researchers who are working in collaboration with China.’ …
But before Americans pile on, as if this kind of blunder could never occur in a country with a constitutional right to freedom of the press, recall a similar move by the George W. Bush administration during the height of paranoia after the 9/11 attacks….”
“This document aims to clarify the key elements of open data and to serve as a proposal to institute and strictly implement a policy for climate change and disaster risk reduction-related data and information based on its articulated and internationally accepted definition in the Philippines. The document describes the different considerations for the Philippines in its decision to fully adopt, support and promote a policy for open data for DRR. Defining the standards in an open data law will mandate compliance with the key elements of open data, which include: availability in digital format of data, downloadable via the internet in bulk for ease of use; amenability to intermixing with other datasets through an interoperable format structure and machine-readability of digital files; freedom to use, reuse and redistribute, even on a commercial basis; and a ‘no conditions’ rule on the use of open data, except for appropriate citation for due credit.”
“Software now runs consumer products and critical systems that we trust with our safety and security. For example, cars, medical devices, voting machines, power grids, weapons systems, and stock markets all rely on code. While responsible companies cooperate with the technical community and the public to improve the safety of code, others do not. They instead try to prevent researchers and others from sharing safety research, threatening criminal and civil actions under the Digital Millennium Copyright Act and the Computer Fraud and Abuse Act. Chilling research puts us all at risk. Protect the public from unsafe code and help us to protect ourselves. Reform the DMCA and CFAA to unlock and encourage research about potentially dangerous safety and security weaknesses in software….”
Abstract: Energy policy often builds on insights gained from quantitative energy models and their underlying data. As climate change mitigation and economic concerns drive a sustained transformation of the energy sector, transparent and well-founded analyses are more important than ever. We assert that models and their associated data must be openly available to facilitate higher quality science, greater productivity through less duplicated effort, and a more effective science-policy boundary. There are also valid reasons why data and code are not open: ethical and security concerns, unwanted exposure, additional workload, and institutional or personal inertia. Overall, energy policy research ostensibly lags behind other fields in promoting more open and reproducible science. We take stock of the status quo and propose actionable steps forward for the energy research community to ensure that it can better engage with decision-makers and continues to deliver robust policy advice in a transparent and reproducible way.
“I mentioned that ASIO [Australian Security Intelligence Organisation] files go through a process known as ‘access examination’ before they’re released to the public. This is the case for all records more than twenty years old, not just the super secret ones. The vast majority of files are simply opened without restriction. Some, including most of the ASIO files, are opened ‘with exceptions’ — pages can be withheld, and text redacted. A few are withheld from the public completely. They have entries in RecordSearch, but you can’t see them — their access status is officially ‘closed’.
But because the metadata about access decisions is available online, we can start to build a picture of what we’re not allowed to see….”
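The analysis the author describes — building a picture of what is withheld from the access-decision metadata alone — can be sketched in a few lines of Python. The sample records, field names, and exemption codes below are hypothetical stand-ins for a real RecordSearch harvest; only the counting logic is the point.

```python
# Sketch: tallying access decisions from harvested archival metadata.
# The records, field names, and exemption codes here are illustrative
# placeholders, not the actual RecordSearch data model.
from collections import Counter

items = [
    {"status": "Open", "reasons": []},
    {"status": "Open with exception", "reasons": ["33(1)(a)"]},
    {"status": "Open with exception", "reasons": ["33(1)(a)", "33(1)(g)"]},
    {"status": "Closed", "reasons": ["33(1)(a)"]},
]

# How many files fall under each access status?
status_counts = Counter(item["status"] for item in items)

# Which exemption reasons are cited most often for restricted files?
reason_counts = Counter(r for item in items for r in item["reasons"])

print(status_counts.most_common())
print(reason_counts.most_common())
```

Even without seeing a single withheld page, aggregates like these reveal which exemptions are invoked most often and how the balance of open, partially open, and closed files shifts over time.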
“Some fields such as paleontology and archaeology have long maintained restrictions on the publication of site locations and promoted government policies and regulations to limit collection and trade in fossils, artefacts, and culturally sensitive and/or scientifically important material. Organizations such as the U.S. Forest Service do not disclose geospatial data in order to protect research sites. Other solutions include modification of research permits so that endangered species locations are not automatically uploaded into wildlife databases and masking such records on private land, as presently occurs in some states in the United States.
Is this relevant to any public health research? Other than personally identifiable information, what types of health data should not be made widely available?”
“The list of reasons why energy models and data are not openly available is long: business confidentiality; concerns over the security of critical infrastructure; a desire to avoid exposure and scrutiny; worries about data being misrepresented or taken out of context; and a lack of time and resources.
This secrecy is problematic, because it is well known that closed systems hide and perpetuate mistakes. A classic example is the spreadsheet error discovered in the influential Reinhart–Rogoff paper used to support economic policies of national austerity. The European Commission’s Energy Roadmap 2050 was based on a model that could not be viewed by outsiders, leaving it open to criticism. Assumptions that remain hidden, like the costs of technologies, can largely determine what comes out of such models. In the United Kingdom, opaque and overly optimistic cost assumptions for onshore wind went into models used for policymaking, and that may well have delayed the country’s decarbonization.
This closed culture is alien to younger researchers, who grew up with collaborative online tools and share code and data on platforms such as GitHub. Yet academia’s love affair with metrics and the pressure to publish set the wrong incentives: every hour spent on cleaning up a data set for public release or writing open-source code is time not spent working on a peer-reviewed paper.”
In the area of AI, [Schmidt] wants to see the industry push to make sure research stays out in the open and not controlled by military labs. Addressing the hall packed with security professionals, Schmidt made the case for open research, noting that historically companies have never wanted to share anything about their research. “We’ve taken the opposite view to build a large ecosystem that is completely transparent because it will get fixed faster,” he said. “Maybe there are some weaknesses, but I would rather do it that way because there are thousands of you who will help plug it….”
“With the stroke of a pen, the Librarian of Congress has authorized security researchers who are acting in good faith to conduct controlled research on consumer devices so long as the research does not violate other laws such as the Computer Fraud and Abuse Act (CFAA). This temporary exemption to the Digital Millennium Copyright Act (DMCA) begins today. The new temporary exemption is a big win for security researchers and for consumers who will benefit from increased security testing of the products they use.
The Digital Millennium Copyright Act (DMCA) makes it illegal to circumvent controls that prevent access to copyrighted material. The result is that under the DMCA, researchers can’t investigate and discover security vulnerabilities if doing so requires reverse engineering or circumventing controls such as obfuscated code….”