If you like to read science studies, you most likely get them through one of two avenues. In the long-standing business model, a print journal takes the study, does the work of formatting it, and lends its 'goodwill' to it with marketing; in return, the journal holds the copyright and subscribers pay to read it. A more recent approach has companies instead charging the scientists to publish the study while reading it is free - open access versus toll access, as proponents frame it, though in a practical sense someone is either paying to read or paying to publish.
The recent trend toward even greater government funding of science has led to calls for more open access, since it seems unfair that taxpayers should have to pay private corporations to read taxpayer-funded studies(1), and President Bush signed a law taking steps in that direction, though not without a Democrat trying to squash the movement. Outside a core group of advocates, however, the big question has been whether the pay-to-publish model has as much credibility as traditional journals where it counts - citations and impact. Open access as a cultural mantra is fine, but at the end of the day researcher performance is valued, measured and rewarded by how much a researcher's work contributes to further research by other scientists and scholars.
Do open access studies get more quality citations? And if studies that are free to read do get more citations, and therefore an advantage, why not make them all open access?
A new study published in an open access journal, PLoS One, set out to determine whether open access articles are cited more simply because anyone can access them, or because the articles most likely to be popular are the ones made open access - self-selection rather than a causal benefit.
According to the 'self-selection bias' hypothesis, the apparently greater impact of open access articles may be an artifact of authors preferentially making their better - and thus more usable and citable - articles open access. If the open access advantage is self-selection bias, then making all research open access is a non-issue and pay-to-publish and toll access journals can co-exist in a capitalist world without issue.
A team from the University of Southampton's School of Electronics and Computer Science (ECS) in the UK and l'Université du Québec à Montréal in Canada set out to find some answers.
They analyzed all articles made open access at the first four institutions in the world to mandate self-archiving - the University of Southampton School of Electronics and Computer Science, CERN, the University of Minho and Queensland University of Technology - and compared their citation impact with that of control articles from the very same journals and years, published at un-mandated institutions, that were either made open access by author self-selection or not made open access at all. The open access impact turned out to be basically the same whether open access was self-selected or required by institutions.
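To make that matched-control design concrete, here is a minimal sketch in Python using entirely made-up article records: it groups hypothetical citation counts by journal and year, then compares open access articles against non-open-access controls from the same journal-year cell. All names and numbers below are invented for illustration; the study's actual dataset and statistical analysis were far larger and more careful.

# A minimal sketch (with made-up numbers) of the kind of matched comparison
# described above: within each (journal, year) cell, compare citation counts
# of open access articles with non-open-access control articles.
from collections import defaultdict
from statistics import mean

# Hypothetical records: (journal, year, is_open_access, citation_count)
articles = [
    ("J. Example Phys.", 2004, True, 12),
    ("J. Example Phys.", 2004, False, 7),
    ("J. Example Phys.", 2004, False, 5),
    ("Example Bio. Lett.", 2005, True, 9),
    ("Example Bio. Lett.", 2005, True, 14),
    ("Example Bio. Lett.", 2005, False, 6),
]

# Group citation counts by (journal, year) and open access status.
cells = defaultdict(lambda: {"oa": [], "non_oa": []})
for journal, year, is_oa, cites in articles:
    cells[(journal, year)]["oa" if is_oa else "non_oa"].append(cites)

# Compute the open access citation advantage within each matched cell,
# then average across cells that contain both kinds of article.
ratios = []
for (journal, year), groups in cells.items():
    if groups["oa"] and groups["non_oa"]:
        ratio = mean(groups["oa"]) / mean(groups["non_oa"])
        print(f"{journal} ({year}): OA/non-OA citation ratio = {ratio:.2f}")
        ratios.append(ratio)

print(f"Mean within-cell OA citation advantage: {mean(ratios):.2f}")

The point of matching on journal and year is that any remaining citation difference cannot be explained by venue prestige or article age, which is what lets the authors separate a genuine open access effect from self-selection.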
The results also showed that the percentage of an institution's yearly research output that is made open access self-selectively varied between 5% and 25%, whereas the percentage when open access is mandated jumped to 60% and climbed toward 100% within a few years of mandate adoption.
The conclusion of the study was that the open access impact advantage is real and is caused by the greater accessibility of open access articles, and that open access mandates are the way to make all research benefit from the greater likelihood of being used and cited that open access provides. The good news for legacy publishers is that articles in higher-impact journals had a substantially greater open access benefit than those in lower-impact journals - so reputation still counts, especially with 25,000 peer-reviewed journals competing for attention.
Figure: Distribution of Journal Impact Factors (JIF) by journal. As with the distribution of individual article citation counts (Figure 3), the distribution of journal impact factors (average citation counts) is highly skewed. Most journal JIFs fall between 0 and 5, with the peak between 2 and 3, followed by a long, rapidly shrinking tail; very few journals have a JIF greater than 10.
Citation: Gargouri Y, Hajjem C, Larivière V, Gingras Y, Carr L, et al., 'Self-Selected or Mandated, Open Access Increases Citation Impact for Higher Quality Research', PLoS ONE 5(10): e13636. doi:10.1371/journal.pone.0013636
(1) Whether taxpayers should have to pay to publish studies funded by taxpayers is the flip side of that question, but not a topic for this article.