Did you recently log onto your favorite journal’s website and see this? (For anyone who doesn’t want to bother clicking, it’s the video from Rick Astley’s “Never Gonna Give You Up.”) If so, your favorite journal was hijacked.
We’re pleased to present a guest post from Michèle B. Nuijten, a PhD student at Tilburg University who helped develop a program called “statcheck,” which automatically spots statistical mistakes in psychology papers, making it significantly easier to find flaws. Nuijten writes about how such a program came about, and its implications for other fields.
Readers of Retraction Watch know that the literature contains way too many errors, not least, as some research suggests, in my field of psychology. And there is evidence that the problem is only likely to get worse.
To investigate these claims reliably, we wanted to study reporting inconsistencies on a large scale. However, extracting statistical results from papers and recalculating the p-values by hand is not only very tedious, it also takes a LOT of time.
So we created a program known as “statcheck” to do the checking for us, by automatically extracting statistics from papers and recalculating p-values. Unfortunately, we recently found that our suspicions were correct: Half of the papers in psychology contain at least one statistical reporting inconsistency, and one in eight papers contain an inconsistency that might have affected the statistical conclusion.
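To give a concrete sense of what a check like this involves, here is a minimal sketch in Python. To be clear, statcheck itself is an R package, and the parsing pattern and the `check_t_test` helper below are our own illustration, not the tool's actual code: the idea is to parse an APA-style result such as "t(28) = 2.20, p = .04", recompute the p-value from the test statistic and degrees of freedom (assuming a two-sided test), and flag a mismatch after rounding to the reported precision.

```python
# Illustrative sketch of an automated reporting check (requires SciPy).
# Not statcheck itself: the regex and helper below are hypothetical.
import re
from scipy import stats

def check_t_test(reported: str) -> bool:
    """Return True if an APA-style t-test report, e.g. 't(28) = 2.20, p = .04',
    is consistent with the p-value recomputed from its test statistic."""
    m = re.match(r"t\((\d+)\)\s*=\s*([-\d.]+),\s*p\s*=\s*(\.\d+)", reported)
    if not m:
        raise ValueError(f"could not parse: {reported!r}")
    df, t_value, p_text = int(m[1]), float(m[2]), m[3]
    # Two-sided p-value implied by the reported t statistic and its df.
    p_recomputed = 2 * stats.t.sf(abs(t_value), df)
    # Consistent if the recomputed p-value rounds to the reported one.
    decimals = len(p_text) - 1  # digits after the leading '.'
    return round(p_recomputed, decimals) == float(p_text)

print(check_t_test("t(28) = 2.20, p = .04"))  # True: consistent
print(check_t_test("t(28) = 2.20, p = .03"))  # False: a reporting inconsistency
```

Run over thousands of papers, a checker in this spirit turns a months-long manual audit into an afternoon of computation, which is what made the large-scale estimates above feasible.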
A group of computer scientists has received a pair of retractions for duplicating “substantial parts” of other articles written by different authors. Both papers, published in Neural Computing and Applications, are on ways to screen for breast cancer more effectively.
According to the abstract of “An improved data mining technique for classification and detection of breast cancer from mammograms,” computers make the process of identifying cancer in lesions detected by mammograms faster and more accurate:
Although general rules for the differentiation between benign and malignant breast lesion exist, only 15–30% of masses referred for surgical biopsy are actually malignant. Physician experience of detecting breast cancer can be assisted by using some computerized feature extraction and classification algorithms. Computer-aided classification system was used to help in diagnosing abnormalities faster than traditional screening program without the drawback attribute to human factors.
The article has been cited four times, according to Thomson Scientific’s Web of Knowledge. The retraction note reveals where “substantial parts” of the article came from:
A BioMed Central journal has pulled a paper by a scientist who decided to prohibit researchers in countries he considers too friendly to immigrants from using his software.
Recently, German scientist Gangolf Jobb declared that, starting October 1, scientists working in countries that are, in his opinion, too welcoming to immigrants — including Great Britain, France and Germany — could no longer use his Treefinder software, which creates trees showing potential evolutionary relationships between species. He’d already banned its use by U.S. scientists in February, citing the country’s “imperialism.” Last week, BMC Evolutionary Biology pulled the paper describing the software, noting it now “breaches the journal’s editorial policy on software availability.”
The authors of a paper on a mechanism for potential cancer therapies are retracting it after realizing they published some proprietary findings “without permission and agreement from St. Jude Children’s Research Hospital.”
Thirteen papers in Mathematics and Mechanics of Solids now have an expression of concern, after it came to light that an author on most of the papers coordinated the peer-review process.
David Y. Gao, a well-known and prolific mathematician at Federation University Australia, is the author of 11 of the papers, and also the guest editor of the special issue in which they were set to appear. The papers were published online earlier this year.
A spokesperson for SAGE, which publishes the journal, confirmed that the publisher decided to re-review the papers after learning about Gao’s role in the peer-review process:
Authors have retracted a highly cited Nature letter that purported to have discovered a much sought-after stable light source from quantum dots, after they realized the light was actually coming from another source: the glass the dots were affixed to.
When the paper “Non-blinking semiconductor nanocrystals” was published in 2009, it received some media coverage, such as in Chemistry World. That’s partly because very small sources of “non-blinking” light could have wide-ranging, big-picture applications, author Todd Krauss, a physical chemist at the University of Rochester, told us:
Off the top of my head, a quantum computer. Quantum cryptography is another one. People want a stable light source that obeys quantum physics, instead of classic physics.
The retraction note, published Wednesday, explains how the researchers found out the effect was coming from the glass, not quantum dots:
An electrical engineering paper published in April has been retracted because of similarities to a 2012 paper from different authors, including “almost identical” data in two of the papers’ tables.
The authors were unable to provide the original numbers for the suspect tables, or for a pair of “similar” figures that bore a striking resemblance to ones presented in the same 2012 paper. Corresponding author Tao Jin at Fuzhou University in China requested the withdrawal “in order to repeat the experiments and obtain new data.”
This week’s issue of Science includes a retraction of a highly cited paper about manipulating the current in a string of molecules with a magnet, after an investigation by the co-authors revealed “inappropriate data handling” by the first author.