Science Magazine’s Open Access “Sting”

October 04, 2013

Yesterday, Science Magazine published an article reporting on a “sting” exercise designed to expose flaws in the editorial processes of Open Access journals. The article, “Who’s Afraid of Peer Review?” by self-proclaimed “gonzo” scientist John Bohannon, recounts his experience submitting a deliberately flawed scientific paper under a pseudonym to 304 Open Access journals over a ten-month period. He reports that, to date, more than half of those journals have accepted the paper, raising legitimate concerns about the quality of the editorial practices of some Open Access journals.

While this article shines a light on an issue of deep concern to the academic community, it is important to unpack just what Bohannon’s study did, and didn’t, investigate.

A significant limitation of this “sting” is that the flawed paper was sent only to a selected group of Open Access journals, with no comparative control group of subscription-based journals. Without such a control group, the author’s conclusion that the Open Access model is somehow responsible for lower editorial standards cannot be substantiated.

An additional issue with the construction of the “sting” is that the journals receiving the flawed paper were not selected in an appropriately randomized way. Bohannon notes that the journals were drawn from two sources – the Directory of Open Access Journals (DOAJ) and Jeffrey Beall’s list of “predatory” open-access journals – and were then winnowed on various grounds, including their discipline/subject coverage, language of publication, and publication fee policy.

The design of the “sting” was further compromised by its focus on Open Access journals that levy publication fees to support their publishing operations, excluding the large number of Open Access publishers who charge no such fees.

Because of these limitations, while Bohannon’s data might support the conclusion that a significant number of Open Access journals currently employ low-quality editorial practices, it does not support broader conclusions about how Open Access journals compare with similar subscription journals, or about the prevalence of this problem in the larger scholarly publishing marketplace.

Many Open Access publishers have already reacted to the “sting,” with two organizations in particular posting important responses. The Open Access Scholarly Publishers Association (OASPA) posted a thorough response acknowledging its awareness of the issue of low-quality editorial practices and reiterating its commitment as a trade association to promoting best practices in Open Access publishing. Its statement notes in part:

“In our view the most important lesson from this recent article in Science is that the publishing community needs stronger mechanisms to help identify reliable and rigorous journals and publishers, regardless of access or business model.”

The Directory of Open Access Journals responded in a similar manner, also acknowledging that a problem exists and describing the detailed steps that it has been taking to address the issue over time:

“Proactively, DOAJ has been working on new criteria for the best part of this year. The revised criteria will deter low-quality publishers from suggesting journals in the first place and will reveal those who fail to step up to the mark within the grace period. As far back as June 2013, DOAJ opened up the first draft of the revised criteria for public consultation.

“A response to the consultation period was then published in September along with a second revision of the criteria. All the documents have been, and remain, publicly available under these two links.”

Both of these responses reflect the deliberateness with which the major Open Access publishing organizations are addressing this issue, and their efforts are welcome.

While the Science “sting” does not lend itself to drawing any truly valid conclusions about the overall quality of Open Access journals, it does raise significant questions about potential flaws in the peer review and editorial processes used throughout the scholarly publishing marketplace. Perhaps an unintended outcome of this “sting” operation will be to spark closer scrutiny of this aspect of journal publishing.
