Archive for the ‘psychology’ Category
That’s the question posed by Armin Günther at the Leibniz Institute for Psychology Information in Germany in a recent presentation. There is some evidence to suggest that psychology overall has a problem — the number of retractions has increased four-fold since 1989, and some believe the literature is plagued with errors. Social psychologist Diederik Stapel is number three on our leaderboard, with 58 retractions.
But does any particular field have more retractions, on average, than others? Günther examines some trends and provides his thoughts on the state of the field. Take a look at his presentation (we recommend switching to full-screen view): Read the rest of this entry »
A leading psychology research society in Germany has called for the end of PubPeer postings based on a computer program that trawls through psychology papers detecting statistical errors, saying it is needlessly causing reputational damage to researchers.
Last month, we reported on an initiative that aimed to clean up the psychology literature by identifying statistical errors using the algorithm “statcheck.” As a result of the project, PubPeer was set to be flooded with more than 50,000 entries for the study’s sample papers — even when no errors were detected.
On October 20, the German Psychological Society (DGPs) issued a statement criticizing the effort, expressing concern that alleged statistical errors are posted on PubPeer before the authors of the original studies are contacted. The DGPs also claimed that when mistakes detected by statcheck and posted on PubPeer turn out to be false positives, the resulting damage to researchers is "no longer controllable," as entries on PubPeer cannot be easily removed.
Today, statcheck’s creators, led by Michèle Nuijten — a PhD student at Tilburg University in the Netherlands, who we’ve previously interviewed about statcheck — responded to DGPs’ criticisms, saying that there is value in Read the rest of this entry »
The detection process uses the algorithm “statcheck” — which we’ve covered previously in a guest post by one of its co-developers — to scan just under 700,000 results from the large sample of psychology studies. Although the trends in Hartgerink’s present data are yet to be explored, his previous research suggests that around half of psychology papers have at least one statistical error, and one in eight have mistakes that affect their statistical conclusions. In the current effort, regardless of whether any mistakes are found, the results from the checks are then posted to PubPeer, and authors are alerted through an email.
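The core idea behind a tool like statcheck is straightforward: extract a reported test statistic and its p-value from the text of a paper, recompute the p-value from the statistic, and flag any discrepancy. The sketch below illustrates that idea for z statistics only; it is a simplified stand-in, not statcheck itself (the real tool is an R package that also handles t, F, r, and χ² tests), and the function names and tolerance here are hypothetical choices for illustration.

```python
import math
import re

def recompute_p_from_z(z):
    """Two-tailed p-value for a z statistic under the standard normal."""
    return math.erfc(abs(z) / math.sqrt(2))

def check_reported_result(text, tolerance=0.005):
    """Parse a result like 'z = 1.96, p = .05' and test whether the
    reported p-value is consistent with the reported statistic.

    Returns (reported_p, recomputed_p, consistent), or None if no
    recognizable result is found in the text.
    """
    m = re.search(r"z\s*=\s*(-?\d+\.?\d*)\s*,\s*p\s*=\s*(\d?\.\d+)", text)
    if m is None:
        return None
    z = float(m.group(1))
    reported_p = float(m.group(2))
    recomputed_p = recompute_p_from_z(z)
    # Reported p-values are usually rounded, so allow a small tolerance
    # rather than demanding an exact match.
    consistent = abs(reported_p - recomputed_p) < tolerance
    return reported_p, recomputed_p, consistent
```

Run over a large corpus, a check like this can only flag candidate inconsistencies; as the DGPs' complaint illustrates, distinguishing real errors from rounding artifacts or parsing mistakes still requires human judgment.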
To date, the initiative is one of the largest post-publication peer review efforts of its kind. Some researchers are, however, concerned about its process for reporting potential mistakes, particularly the fact that potentially stigmatizing entries are created even when no errors are found. Read the rest of this entry »
A communications journal has retracted parts of a paper about a famous German political scientist after her great-nephew threatened the journal with legal action, claiming bits of the paper were defamatory.
The European Journal of Communication (EJC) retracted the parts of the paper that reviewed a biography of Elisabeth Noelle-Neumann, published in Germany in 2013. The biography was titled “Elisabeth Noelle-Neumann: Demoskopin zwischen NS-Ideologie und Konservatismus,” which translates roughly as “Elisabeth Noelle-Neumann: pollster between Nazi ideology and conservatism.”
Noelle-Neumann is best known for her mass communication theory, the “Spiral of Silence,” which describes the tendency to remain silent on a subject when one’s view opposes that of the majority. Because parts of the paper are now redacted, it is unclear which statements were potentially defamatory. Read the rest of this entry »
JAMA authors have retracted — and replaced — a 2014 paper about the mental health effects of household moves on kids, after they found errors while completing an additional analysis.
The original paper concluded that in “families who moved out of high-poverty neighborhoods, boys experienced an increase and girls a decrease in rates of depression and conduct disorder,” according to a press release issued by the journal along with the paper (which also got some press attention from Reuters). But part of that conclusion is wrong.
Scientific fraud isn’t what keeps Andrew Gelman, a professor of statistics at Columbia University in New York, up at night. Rather, it’s the sheer number of unreliable studies — uncorrected, unretracted — that have littered the literature. He tells us more, below.
Whatever the vast majority of retractions are, they’re a tiny fraction of the number of papers that are just wrong — by which I mean they present no good empirical evidence for their claims.
I’ve personally had to correct two of my published articles. Read the rest of this entry »
The Open Science Framework (OSF) has pulled a dataset from 70,000 users of the online dating site OkCupid over copyright concerns, according to the study author.
The release of the dataset generated concerns because it made personal information — including personality traits — publicly available.
Emil Kirkegaard, a master’s student at Aarhus University in Denmark, told us that the OSF removed the data from its site after OkCupid filed a claim under the Digital Millennium Copyright Act (DMCA), which requires the host of online content to remove it under certain conditions. Kirkegaard also submitted a paper based on this dataset to the journal he edits, Open Differential Psychology. But with the dataset no longer public, the fate of the paper is subject to “internal discussions,” he told us.
The case raises important questions about when retractions are appropriate, and whether they can have a chilling effect on scientific discourse. Although Hanna Kokko of the University of Zurich, Switzerland — who co-authored both papers — agreed that the academic literature needed to be corrected, she didn’t want to retract the earlier paper; the journal imposed that course of action, said Kokko.
We’ve both been at conferences — Adam at the Society of Cardiovascular Anesthesiologists in Savannah, and Ivan at the Council of Science Editors in Baltimore, where he’ll be on a panel today about finding fraud — so we haven’t had a lot of time to run down retractions. But there were a few retraction-related developments in the past few days that we wanted to highlight for Retraction Watch readers:
First, another great investigation by The Cancer Letter and The New York Times, this one into the International Early Lung Cancer Action Program (I-ELCAP) run by Claudia Henschke and David Yankelevitz. The design and conclusions of that trial have been criticized by other pulmonologists.
The new investigation, however, reveals that an October 2008 review of the study found that the researchers couldn’t find 90 percent of the subjects’ consent forms. That, The Cancer Letter notes, could mean a huge number of retractions that could displace Joachim Boldt as the current record holder: Read the rest of this entry »
Monkey business? 2002 Cognition paper retracted as prominent psychologist Marc Hauser takes leave from Harvard
Marc Hauser, a prominent Harvard psychology researcher and author, will be taking a leave of absence from the university after a lengthy internal investigation “found evidence of scientific misconduct in his laboratory,” a finding that has led to the retraction of one of his papers, according to The Boston Globe.
The retraction, of a 2002 paper in Cognition, reads, in part: “An internal examination at Harvard University . . . found that the data do not support the reported findings. We therefore are retracting this article,” the Globe reports. It also includes the sentence “MH accepts responsibility for the error.”
The retraction notice does not yet appear anywhere on the journal’s site, where the PDF version of the study is still available, nor on the Medline abstract. Its circumstances appear to be atypical, according to the Globe: Read the rest of this entry »