
Weeding Out the Bad Apples

2016 
“How do you deal with data manipulation and retractions?” Indicative of the nebulous nature of the issue, this is a question often posed to us in hushed tones, late at night at meetings or toward the end of a phone call. Though certainly not our favorite topic, due to the procedural complexities and inherent sadness involved, we would like to take a moment to shed some light on this type of retraction at Cell Metabolism.

In simplified terms, retractions occur because the conclusions based on the data presented in the paper can no longer be relied upon. Though still rare, the overall number of retractions has risen in recent years, contributing to the reproducibility concerns plaguing the biomedical community. However, it should be emphasized that not all retractions are due to data manipulation. At Cell Metabolism, papers are retracted if (1) there is proven scientific misconduct affecting the data presented, (2) the scope of corrections that would be needed is extensive, or (3) the changes needed would affect the conclusions of the paper. In the latter two cases, the mistakes may not be a consequence of fraud, falsification, or violation of ethical and publishing practices, but they are serious enough to shake the paper’s foundational conclusions.

A common perception is that journals are reluctant to take action on reported concerns of data manipulation because months and sometimes years go by before anything happens. However, rather than being indicative of tergiversation on our part, the silence is due to the lengthy and confidential nature of the behind-the-scenes workings until the time we issue a retraction, if one is necessary. To start an investigation, we ask to be formally notified of concerns, usually in the form of an email that clearly details the issues at hand. Our current policy at Cell Metabolism is that the editors do not follow blog discussions on data manipulation because we cannot consistently and comprehensively monitor the ongoing discourse; our experience has also been that it is difficult to ascertain the validity of many of the allegations—at face value, PowerPoint presentations showing blown-up images “with circles and arrows,” to quote Arlo Guthrie’s “Alice’s Restaurant,” invariably appear incriminating. In our opinion, to conclusively determine whether there is an issue, one needs to see the raw data.

Another point of contention that we wrestle with is that the majority of notification emails come from anonymous “concerned readers.” We certainly understand a whistleblower’s worry about backlash, especially for a junior researcher, and have always respected the person’s wishes. However, there is a fine line between genuine reports and time-wasters. Over the years, we have received our fair share of both serious and frivolous emails. Examples of the latter include “Dr. X does not buffer their PBS appropriately; you can’t trust their data; you have to retract all their papers,” and “I saw Dr. Y present the data 4 years ago and they were different; clearly the authors are lying.” Our advice is to provide the journal with sufficient confidential information about the issue and its context, and for the concerned reader to remain available for follow-up questions from the editors.

Fraudulent data fabrication and falsification can occur at various stages of the work—during data generation, analysis, and processing, or during figure preparation. On the journal end, figures are usually the first problematic aspect to be detected.
Is the western blot band A from Figure 1 the same as band B from Figure 6? Many bands look similar, but are they identical? We recently had a case at Cell Metabolism where the western blot bands in different experiments for different figures looked identical to all of us, including the authors, but after extensive analysis by a team of experts, the blots proved to be different. When a seemingly valid concern is raised, our first port of call is to ask the authors for the raw data, which raises another issue: how long can we reasonably expect authors to have access to the original data? Different institutions and funding bodies across the world have different statutes of limitations for data retention. Our current operating guideline at the journal is that, in most cases, 5–7 years is a reasonable time frame to expect authors to still have the data at hand. Based on the scope of the issues raised, we may also ask the authors to contact their institution’s office of research integrity, if they have not already done so, to review the concerns. Involving another reviewing body can add further layers of complexity and delay, for example because of the type of confidentiality agreements the institution may have. At the journal, we may wait to receive the official report of the investigation before alerting readers, although, depending on the circumstances, we may also proceed independently.

In our experience, once it is determined that there are valid concerns surrounding a paper, most authors are willing to take the responsible course of action. Retractions are extremely distressing for all parties concerned—the authors, especially those who were not aware of the problem, the journal, and the community at large. They make us all pause and reflect. As scientists, we evaluate data with a skeptical eye, though our default is to trust that each other’s data have been generated in good faith. The underlying reasons for scientific misconduct are complex, and we all unwittingly play our part in the larger problem. At the journal, we sit at the end of a chain of events, but we recognize that we can no longer rely solely on the self-correcting nature of science and that effective collaboration with the community, funding bodies, and our publishing colleagues is needed to curb real incidents of scientific misconduct.

Our collective goal is to be more vigilant from submission to publication. At Cell Press, we screen the figures of accepted papers, although we fully appreciate that the lion’s share of scrutinizing the data falls to our reviewers, who, as experts in the field with first-hand knowledge of the type of data presented, flag issues during peer review. We also often consult statistical experts, especially for the Clinical and Translational Reports at Cell Metabolism. Reviewing is increasingly challenging as science becomes more interdisciplinary and new technologies allow us to ask questions we could not previously have tackled. As we approach “Reviewer Week,” we want to take this opportunity to recognize and thank our unsung heroes at Cell Metabolism, as well as at other journals, for the tremendous work they put in every day to ensure the rigor of published papers and maintain the credibility of the metabolism field. Every year we see thousands of papers, and we can honestly say that peer review works and that collaboration among authors, reviewers, and editors almost invariably results in stronger papers.
Thank you, Reviewers, for the important work that you do.

We hope that this Editorial has helped clarify our current practices regarding retractions at Cell Metabolism. As usual, please let us know if you have questions, feedback, or ideas for improving the journal and helping us better serve the community. We love hearing from you! Please continue to make the most of our monthly editorial office hours, email us, or catch us in person at meetings and lab visits.