Evans and Reimer greatly underestimate effect of free access

A lot is being made of a new paper (no link – I only link to articles that are freely available) by University of Chicago sociologist James Evans and Jacob Reimer, in which they analyze the citation and free-access histories of journals in various disciplines and conclude that the effect of free access (which they mistakenly call open access) on citations is much more modest than previously estimated.

This paper – and the response to it – has many flaws. A few of the most egregious:

1) The analysis shows that there are a lot of scientists out there benefiting from free access 

First, the authors and a lot of people responding to this paper seem to assume that the modest increase in the number of citations arising from the transition to free access is somehow an argument against free access. This is silly. Even if free access didn’t change citation numbers at all, it would still be an unambiguously good thing for a wealth of other reasons that I won’t rehash here.

The reason that open access opponents are so excited by the supposed conclusions of this paper is that a small citation increase associated with free access bolsters one of their favorite tropes: that all the important people (i.e. the people who might eventually cite your paper) already have access to it, because they are affiliated with a major research university in the developed world. If this were true, making a journal's contents freely available would have very little effect on which articles those people read and cite.

But even this paper – now being cited as evidence that free access is unimportant – reports at least an 8% increase in citations associated with free access, suggesting that a significant number of active researchers are getting access to articles only because they are freely available online. That may sound like a small effect, but collectively across the global scientific community we're talking about tens or hundreds of thousands of scientists.

Do the publishers really want to argue that even this modest increase in citations is unimportant? If so, I'll remind them of it the next time they issue a press release touting a 5% increase in their impact factor…

2) The 8% number comes from analyzing a lot of old articles. For articles that become freely available online within two years of publication, the increase in citations is 20%.

If you look at the online supplement for the article, you'll see that the analysis includes a lot of old literature. Very little old literature gets cited, so it's unsurprising that free access to it has little effect on its citation counts. However, their supplemental Figure S1(c) shows that the effect on more recently published articles – the bulk of what they analyze – is MUCH larger:

[Supplemental Figure S1(c): effect of the transition to free access on citations, as a function of time since publication.]

So for articles that are less than two years old, the effect is close to 20%, and the curve is clearly still rising as you move to shorter time frames. Doing a little extrapolation, it looks like the effect of immediate free access should be at least 50%.
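For what it's worth, here is the shape of that back-of-the-envelope calculation as a minimal Python sketch. The ~8% and ~20% figures come from the discussion above; the one-year reading and the linear extrapolation are my own assumptions, so the output illustrates the reasoning rather than reanalyzing the paper.

    # Toy version of the extrapolation above. The readings below are HYPOTHETICAL
    # values eyeballed from the shape of supplemental Figure S1(c) as described in
    # the text; they are not the paper's actual data.
    readings = {       # years since publication -> % citation boost from free access
        5.0: 8.0,      # roughly the headline ~8% figure for the older literature
        2.0: 20.0,     # "close to 20%" for articles under two years old
        1.0: 32.0,     # assumed: the curve keeps steepening at shorter times
    }

    # Extend the most recent segment of the curve (2 years -> 1 year) linearly
    # back to t = 0, i.e. free access from the moment of publication.
    slope = (readings[1.0] - readings[2.0]) / (2.0 - 1.0)  # % gained per year earlier
    immediate = readings[1.0] + slope * 1.0
    print(f"Extrapolated effect of immediate free access: ~{immediate:.0f}%")
    # ~44% with these made-up readings; a curve that keeps steepening toward
    # t = 0, as the figure suggests, would push the estimate past 50%.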

3) The raw data for the paper are not available to confirm and/or reanalyze the authors' claims

Where the hell are the data for this paper? I'd love to check the validity of their analyses and to do some of my own. But, whoops, I can't, because the data are private and not provided anywhere by the authors or by Science. This despite the fact that Science's own publication policy makes it clear that:

"After publication, all data necessary to understand, assess, and extend the conclusions of the manuscript must be available to any reader of Science."

Are we really supposed to take seriously a paper presenting a bunch of complex analyses of data that the authors and the journal don’t make available?
