One month of stonewalling
In early February, a number of bloggers brought to your attention a peculiar paper on mitochondrial proteomics, a paper that was obviously odd on even casual inspection: it contained grandiose, entirely unsupported claims of a theoretical revolution and ludicrous assertions of evidence for God in the genome. Deeper examination revealed that much of the paper had also been plagiarized from various sources. To the journal's credit, the paper was quickly retracted, one month ago today; however, the retraction was based entirely on the plagiarism. None of the paper's other failings were addressed, nor were any of the patent errors in the review process at the journal Proteomics discussed. This is strange, especially in light of the fact that the Warda/Han paper was the most accessed article in the journal. This is not an issue that should be swept under the rug!
Today, several of us — Steven Salzberg, Lars Juhl Jensen, and Attila Csordas — are repeating our call for an explanation of how such an egregiously ridiculous paper leaked into print. Bad papers are a dime a dozen, and we aren't so much concerned with a detailed discussion of the flaws in this one paper as we are with seeing the integrity of the peer-review process maintained, or better, improved. The Warda/Han paper carried obvious red flags that marked it as potentially problematic in the title, in the abstract, and scattered throughout the body, and it's hard to imagine how any reviewer or editor could have let them simply slip by without comment, yet that is exactly what seems to have happened.
We want to know how this paper slipped through the cracks, because we want to know how large the cracks in the peer-review process at Proteomics are. It's a journal with a good reputation, and we are not presuming any wrongdoing or systematic failure of peer review there, but we do think the lack of transparency is cause for concern: there is no assumption of a crime, but the ongoing cover-up is grounds for suspicion. Let's see some self-criticism from the journal's editor, and an open discussion of the steps being taken to prevent such errors from recurring.
Alternatively, if the journal wants to outsource its quality control to a mob of bloggers, that works too … but we tend to be less formal and far more brutally and publicly critical than an in-house process would be, and we're also going to be less well-informed than the actual principals in the review process. Better explanations are in order. Let's see representatives of the journal provide them.