
Retraction Watch

Tracking retractions as a window into the scientific process

Archive for the ‘unreliable findings’ Category

Dr. Oz: Following green coffee bean diet retraction, site scrubbed, “further study is needed”


On Monday, we were first to report that a study of green coffee bean extract for weight loss touted on the Dr. Oz Show had been retracted.

That story has been widely picked up by the media, including The Washington Post, which yesterday reported that the show had posted a statement about the development.


Authors retract green coffee bean diet paper touted by Dr. Oz


Two authors of a 2012 paper, sponsored by a company that made grand claims about green coffee bean extract’s ability to help people lose weight, have retracted it. The study was cited by The Dr. Oz Show, and last month it cost the company a $3.5 million settlement with the Feds.

Here’s the notice for “Randomized, double-blind, placebo-controlled, linear dose, crossover study to evaluate the efficacy and safety of a green coffee bean extract in overweight subjects,” a paper originally published in Diabetes, Metabolic Syndrome and Obesity: Targets and Therapy.

How meta: Paper on errors retracted for “too many stupid mistakes”


A paper published in Measurement in Physical Education and Exercise Science has been retracted for statistical and typographical mistakes.

Here’s the notice for “Comparing Measurement Error Between Two Different Methods of Measurement of Various Magnitudes.”

Cell line switch sinks PLoS ONE cancer paper


We’ve written before about how common cell line mix-ups are in cancer research; according to a 2012 Wall Street Journal article (paywalled), between a fifth and a third of cancer cell lines tested by suspicious researchers turned out to be misidentified.

Obviously, studying the wrong kind of cancer is a waste of precious resources, both time and money. And it’s clear the problem hasn’t gone away: PLoS ONE just retracted a cancer paper, originally published in December 2012, for studying two cell lines that had been contaminated by other cell types.

Here’s the notice for “Epithelial Mesenchymal Transition Is Required for Acquisition of Anoikis Resistance and Metastatic Potential in Adenoid Cystic Carcinoma.”

Data questions prompt retraction of PLOS ONE cardiovascular paper


PLoS ONE has retracted a 2013 article on atherosclerosis in mice over concerns about the integrity of the data.

The paper, “The Effect of Soluble RAGE on Inhibition of Angiotensin II-Mediated Atherosclerosis in Apolipoprotein E Deficient Mice,” came from a group of researchers in South Korea.

It purported to show an inhibitory effect of soluble RAGE on angiotensin II-mediated atherosclerosis in those mice.

Another Nature stem cell paper is retracted


Another stem cell paper has been retracted from Nature, this one a highly cited 2008 study that had already been the subject of what the journal’s news section called a “furore” in 2010.

According to that 2010 news story:

The researchers behind the original work, led by Thomas Skutella of the University of Tübingen, reported using cells from adult human testes to create pluripotent stem cells with similar properties to embryonic stem cells.

But a 2010 Brief Communication Arising called those findings into question. And now, the authors have retracted the paper. Here’s the notice for “Generation of pluripotent stem cells from adult human testis.”

Written by Ivan Oransky

August 1, 2014 at 10:00 am

The camel doesn’t have two humps: Programming “aptitude test” canned for overzealous conclusion


From Larry Summers to James Watson, certain prominent figures have a long and questionable tradition of using “data” to make claims about intelligence and aptitude.

So it’s no surprise that, when well-known computer scientist Richard Bornat claimed his PhD student had created a test to separate people who would succeed at programming from those who wouldn’t, people happily embraced it. After all, it’s much easier to say there’s a large population that will just never get it than to re-examine your teaching methods.

The paper, called “The camel has two humps,” suggested that, instead of a bell curve, programming success rates look more like a two-humped ungulate: the kids who get it, and the kids who never will.

Though the paper was never formally published, it made the rounds pretty extensively. Now, Bornat has published a retraction, stating that he wrote the article during an antidepressant-driven mania that also earned him a suspension from his university.

Written by Cat Ferguson

July 18, 2014 at 8:30 am
