Did you hear the story last Friday on NPR’s Morning Edition about open access (OA) journals and peer review? About the OA “sting” from Science, “Who’s Afraid of Peer Review?” Yeah. The “sting” has angered many OA advocates, me included, and has generated many responses; several of note are linked below.
In the admittedly few responses I’ve had a chance to read now that I’m back in the office (oh, why must such stories break when I’m on vacation?!), most focus on methodological flaws, which are important to call out. DOAJ’s response calls attention to its recent work to clarify and tighten its criteria for inclusion. But what I’ve yet to see is a breakdown of numbers showing just how small a percentage of journals listed in DOAJ were actually “proven” negligent by this effort. If you only heard the NPR piece, you’d think that a seemingly big number of OA journals don’t do peer review: 157, oh my! But what both the NPR interview and the Science article fail to do is put that number in useful context. 157 sounds big if you’re talking dollars, but not so big if you’re talking pennies. When it comes to the number of OA journals, this “sting” is counting change, not bills.
As reported in the article, 187 of the 304 journals selected to receive the fake paper were listed in DOAJ as of 2 Oct 2012 (167 in DOAJ, 121 in Beall’s list, 16 in both). That accounts for only 2.27% of all DOAJ journals on that date (the article notes there were 8250 titles in DOAJ). In the results, the real researcher behind the fake paper, John Bohannon, notes that 157 accepted the paper, but at that point he does not break the number down by DOAJ/Beall’s list/both. Of those 157, we don’t know how many are in DOAJ without digging into his data (which, frankly, given how small a percentage we’re talking about, I’m not going to expend energy doing); all we know is that “for DOAJ publishers that completed the review process, 45% accepted the bogus paper.” But apparently only “106 journals discernibly performed any review.” From the article, we don’t know how many of those 106 were DOAJ titles, so that 45% sounds bigger and badder than it may be. Generously presuming all 106 were DOAJ titles, then 48 DOAJ journals appear to have faulty peer review. Those 48 journals account for a mere 0.58% of all DOAJ titles a year ago; as of last Friday, when DOAJ listed 9948 journals, that’s only 0.48%.
But even if all 157 journals (now 158, per the author’s NPR interview Friday morning) that accepted the fake paper were DOAJ titles (again, I’m being intentionally generous), that’s still only 1.58% of all current DOAJ titles. I daresay that in the traditional publishing realm, more than 1.58% of print journals would be “proven” to have shoddy-to-no peer review, too.
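The back-of-the-envelope percentages above are easy to verify. A minimal Python sketch, using only the counts reported in the Science article and in this post:

```python
# Counts as reported in the Science article and this post
doaj_titles_2012 = 8250   # DOAJ titles as of 2 Oct 2012
doaj_titles_2013 = 9948   # DOAJ titles as of last Friday
targeted_in_doaj = 187    # targeted journals listed in DOAJ
accepted = 157            # journals that accepted the fake paper
reviewed = 106            # journals that discernibly performed any review
doaj_accept_rate = 0.45   # reported acceptance rate among DOAJ publishers

# Share of DOAJ targeted at all
print(f"Targeted DOAJ share: {targeted_in_doaj / doaj_titles_2012:.2%}")  # ~2.27%

# Generous upper bound: assume all 106 reviewing journals were DOAJ titles
faulty = round(reviewed * doaj_accept_rate)  # 48
print(f"Faulty share of DOAJ, 2012: {faulty / doaj_titles_2012:.2%}")     # ~0.58%
print(f"Faulty share of DOAJ, 2013: {faulty / doaj_titles_2013:.2%}")     # ~0.48%

# Even more generous: assume all 157 accepting journals were DOAJ titles
print(f"All 157 as DOAJ titles: {accepted / doaj_titles_2013:.2%}")       # ~1.58%
```

Even under the most generous assumptions, the “sting” implicates well under 2% of DOAJ’s listings.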
Oh, and the journal behind this “sting,” Science? Yeah, it published the discredited and widely-maligned arsenic DNA paper in 2011, which slipped through its own peer review process (h/t, Michael Eisen’s response below). That lessens the sting a bit, doesn’t it?
Finally, a few quibbles – and one concession – with the NPR piece, as it was my first introduction to the controversy, as it was for many:
- The total number of journals listed in DOAJ was never noted, so the numerical context laid out above was lost on the audience (and it’s murky enough in the actual article);
- At no point in the interview is it mentioned that journals owned by Sage, Elsevier, and Wolters Kluwer – big-name publishers of primarily traditional journals – were among the 157 accepting the fake paper; and,
- To be fair, Bohannon did make a clear effort not to malign all OA journals while discussing his “sting,” which I appreciate, but unfortunately, most researchers who’ve been resistant to OA likely only half-heard that defense.
As Bohannon rightly pointed out during his NPR interview, we do need to identify disreputable publishers who are taking advantage of the low-to-no capital needed to launch an e-journal using the OA article processing fee model, who through negligence are casting OA publishing in poor light. But more than that, we need to address issues in the peer review process for all journals.
Responses worth reading:
- DOAJ: Directory of Open Access Journals
- OASPA: Open Access Scholarly Publishers Association
- SPARC: the Scholarly Publishing and Academic Resources Coalition
- Michael Eisen
Updated 8 Oct 2013 to correct numbers above, and to link to the following, many of which unpack the flawed “methodology” behind the “sting”:
- Science Gone Bad
- Anti-tutorial: how to design and execute a really bad study
- Unscientific spoof paper accepted by 157 “black sheep” open access journals – but the Bohannon study has severe flaws itself
- What Science – and the Gonzo Scientist – got wrong: open access will make research better
- New “sting” of weak open-access journals
- Science Mag sting of OA journals: is it about Open Access or about peer review? (Good list of links to coverage, including others here, at the end of this one)
- Science, Peer Review, Open Access and Controversy
- The Troubled Peer Review System, the Open Access Wars, & the Blurry Line Between Human Subjects Research & Investigative Journalism (looking at this from an IRB angle)
- Who’s Afraid of Open Access?
- What’s “open” got to do with it?
- The Guardian: Open access publishing hoax: what Science magazine got wrong