“The Wikipedia Library is an open research hub, a place for active Wikipedia editors to gain access to the vital reliable sources that they need to do their work and to be supported in using those resources to improve the encyclopedia. We aim to make access and use of sources free, easy, collaborative and efficient.
The Wikipedia Library is run by a team of Wikimedia Foundation staff and global volunteers. We operate on a community-organized satellite model: we administer the global project but work with local coordinators in local Wikipedia projects to help each community set up their own libraries….”
“At the Wikimedia Foundation, we believe that free access to knowledge and freedom of expression are fundamental human rights. We believe that when people have good information, they can make better decisions. Free access to information creates economic opportunity and empowers people to build sustainable livelihoods. Knowledge makes our societies more informed, more connected, and more equitable.
Over the past two years, we have seen governments censor Wikipedia, including in Turkey and most recently in China, denying these rights to millions of people around the world.
Today, we proceed to the European Court of Human Rights, an international court which hears cases of human rights violations within the Council of Europe, to ask the Court to lift the more than two-year block of Wikipedia in Turkey. We are taking this action as part of our continued commitment to knowledge and freedom of expression as fundamental rights for every person….”
“As a research scientist at the Wikimedia Foundation, you will work with researchers, software engineers, designers, and volunteers to design, test and evaluate new technologies. You will produce empirical insights to inform the organization’s and the movement’s efforts towards our strategic direction—to become the platform that serves open knowledge to the world and to empower all people to access and contribute to free knowledge. You will turn research questions into publicly shared, reproducible knowledge and work with a team that is strongly committed to principles of transparency, privacy and collaboration. You will use and develop free and open source technology and collaborate with researchers in the industry and academia….”
“While Gomila is officially launching Golden today, it’s already full of content about things like the latest batch of Y Combinator startups and morphogenetic engineering. And it’s already raised $5 million from Andreessen Horowitz, Gigafund, Founders Fund, SV Angel, Liquid 2 Ventures/Joe Montana, plus a long list of individual angel investors including Gomila’s Heyzap co-founder, Immad Akhund.
To state the obvious: Wikipedia is an incredibly useful website, but Gomila pointed out that notable companies and technologies like SV Angel, Benchling, Lisk and Urbit don’t currently have entries. Part of the problem is what he called Wikipedia’s “arbitrary notability threshold,” where pages are deleted for not being notable enough. …”
“The January #1Lib1Ref campaign saw infectious energy from its participants. The campaign brought major additions, new entrants and a new sense of competition between languages and institutions. In this iteration #1Lib1Ref reached record highs and saw extensive participation from emerging communities and languages. For the first time the French Wikipedia took the lead, with over 33% of the total number of contributions made during the campaign. Based on these results, we anticipate that #1Lib1Ref has the potential to support outreach in diverse communities….”
“But at least one new study suggests that Wikipedia is superior to other medical sources in at least one key respect: short-term knowledge acquisition. That is, when it comes to finding the right answers quickly, Wikipedia seems to lead the pack. This suggests a new way of thinking about the utility of the crowdsourced encyclopedia. Wikipedia delivers value not only by offering massive amounts of information with its nearly 5.8 million English articles so far, but by providing the means for even professional users to quickly identify and retrieve the most relevant information….
The authors of the paper, published in the Journal of Medical Internet Research in October, devised a “three-arm randomized trial” to test the comparative effects of three resources. 116 first- or second-year medical students in Canada took a multiple-choice medical test similar to the Canadian medical licensing examination, taking notes during the test on topics to research. Afterwards, each student was provided one of three pre-selected resources: Wikipedia, a digital textbook, or UpToDate, a subscription service mostly used by doctors. The students researched their topics and took written notes using the assigned resource, then retook the test using those notes.
If you’re like me, then at this point you’re probably feeling bad for the poor medical students. But at least the trial yielded a meaningful result: Students in the Wikipedia group had significantly better post-test performances on the exam compared to the digital textbook group. The Wikipedia group also outperformed the UpToDate group by a small margin, an impressive result given that UpToDate costs more than $500 annually for a subscription….
Abstract: Background: Web-based resources are commonly used by medical students to supplement curricular material. Three commonly used resources are UpToDate (Wolters Kluwer Inc), digital textbooks, and Wikipedia; there are concerns, however, regarding Wikipedia’s reliability and accuracy.
Objective: The aim of this study was to evaluate the impact of Wikipedia use on medical students’ short-term knowledge acquisition compared with UpToDate and a digital textbook.
Methods: This was a prospective, nonblinded, three-arm randomized trial. The study was conducted from April 2014 to December 2016. Preclerkship medical students were recruited from four Canadian medical schools. Convenience sampling was used to recruit participants through word of mouth, social media, and email. Participants must have been enrolled in their first or second year of medical school at a Canadian medical school. After recruitment, participants were randomized to one of the three Web-based resources: Wikipedia, UpToDate, or a digital textbook. During testing, participants first completed a multiple-choice questionnaire (MCQ) of 25 questions emulating a Canadian medical licensing examination. During the MCQ, participants took notes on topics to research. Then, participants researched topics and took written notes using their assigned resource. They completed the same MCQ again while referencing their notes. Participants also rated the importance and availability of five factors pertinent to Web-based resources. The primary outcome measure was knowledge acquisition as measured by posttest scores. The secondary outcome measures were participants’ perceptions of importance and availability of each resource factor.
Results: A total of 116 medical students were recruited. Analysis of variance of the MCQ scores demonstrated a significant interaction between time and group effects (P<.001, ηg²=0.03), with the Wikipedia group scoring higher on the MCQ posttest compared with the textbook group (P<.001, d=0.86). Access to hyperlinks, search functions, and open-source editing were rated significantly higher by the Wikipedia group compared with the textbook group (P<.001). Additionally, the Wikipedia group rated open access editing significantly higher than the UpToDate group; expert editing and references were rated significantly higher by the UpToDate group compared with the Wikipedia group (P<.001).
Conclusions: Medical students who used Wikipedia had superior short-term knowledge acquisition compared with those who used a digital textbook. Additionally, the Wikipedia group trended toward better posttest performance compared with the UpToDate group, though this difference was not significant. There were no significant differences between the UpToDate group and the digital textbook group. This study challenges the view that Wikipedia should be discouraged among medical students, instead suggesting a potential role in medical education.
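The abstract reports the Wikipedia-vs-textbook difference as Cohen's d = 0.86. To make that effect size concrete, here is a minimal sketch of the pooled-standard-deviation formula behind Cohen's d, applied to hypothetical group means, SDs, and sample sizes chosen only to show how a d of roughly 0.86 could arise; none of these numbers come from the paper itself.

```python
import math

def cohens_d(mean1, sd1, n1, mean2, sd2, n2):
    """Cohen's d between two groups, using the pooled standard deviation."""
    pooled_sd = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    return (mean1 - mean2) / pooled_sd

# Hypothetical post-test scores out of 25 (Wikipedia group vs. textbook group).
# These illustrative values yield an effect size close to the reported d = 0.86.
d = cohens_d(18.0, 2.5, 39, 15.8, 2.6, 38)
print(round(d, 2))  # → 0.86
```

A d near 0.86 means the group means differ by almost nine-tenths of a pooled standard deviation, conventionally read as a large effect.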
Abstract: This paper addresses the integration of a Named Entity Recognition and Disambiguation (NERD) service within a group of open access (OA) publishing digital platforms and considers its potential impact on both research and scholarly publishing. The software powering this service, called entity-fishing, was initially developed by Inria in the context of the EU FP7 project CENDARI and provides automatic entity recognition and disambiguation using the Wikipedia and Wikidata data sets. The application is distributed with an open-source licence, and it has been deployed as a web service in DARIAH’s infrastructure hosted by the French HumaNum. In the paper, we focus on the specific issues related to its integration on five OA platforms specialized in the publication of scholarly monographs in the social sciences and humanities (SSH), as part of the work carried out within the EU H2020 project HIRMEOS (High Integration of Research Monographs in the European Open Science infrastructure). In the first section, we give a brief overview of the current status and evolution of OA publications, considering specifically the challenges that OA monographs are encountering. In the second part, we show how the HIRMEOS project aims to face these challenges by optimizing five OA digital platforms for the publication of monographs from the SSH and ensuring their interoperability. In sections three and four we give a comprehensive description of the entity-fishing service, focusing on its concrete applications in real use cases together with some further possible ideas on how to exploit the annotations generated. We show that entity-fishing annotations can improve both the research and the publishing process. In the last section, we briefly present further possible application scenarios that could be made available through infrastructural projects.
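To illustrate what a NERD service like entity-fishing does conceptually, here is a minimal, self-contained sketch of dictionary-based entity recognition with context-overlap disambiguation. This is emphatically not the entity-fishing implementation or its API; the gazetteer, its context words, and all but the first Wikidata QID (Q90 really is Paris, France) are illustrative stand-ins.

```python
# Toy gazetteer mapping a surface form to candidate entities.
# Context words and the second QID are hypothetical examples.
GAZETTEER = {
    "paris": [
        {"qid": "Q90", "label": "Paris", "desc": "capital of France",
         "context": {"france", "city", "seine"}},
        {"qid": "Q0000", "label": "Paris", "desc": "figure of Greek mythology",
         "context": {"troy", "helen", "mythology"}},
    ],
}

def disambiguate(mention, sentence):
    """Pick the candidate whose context words overlap the sentence the most.

    Returns None when the mention is not in the gazetteer.
    """
    words = set(sentence.lower().split())
    candidates = GAZETTEER.get(mention.lower(), [])
    if not candidates:
        return None
    return max(candidates, key=lambda c: len(c["context"] & words))

best = disambiguate("Paris", "Paris sits on the Seine in France")
print(best["qid"])  # → Q90
```

Real NERD systems replace the hand-built gazetteer with Wikipedia/Wikidata-derived candidate lists and the word-overlap score with statistical or neural context models, but the recognize-then-rank-candidates structure is the same.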
“Despite Wikipedia’s importance as a resource for both practicing physicists and the wider community, it is rare for professional physicists to contribute, in part because there are few, if any, professional incentives to do so. We’re all in agreement that researchers should receive proper attribution for our work (which is why PLOS ONE supports ORCID); and as credit is not given for submitting or editing Wikipedia pages, only a small fraction of the physicists that I asked about this have edited even a single Wikipedia page.
With this in mind, we’re excited to introduce PLOS ONE Topic Pages, which are peer-reviewed review articles written with Wikipedia in mind. These provide opportunities for author attribution and will result in both journal articles and Wikipedia pages of high quality and utility….”