Another busy week at Retraction Watch. Here’s what was happening elsewhere on the web:
- “A shocking piece of statistics has been uncovered in a paper published in a respectable psychiatry journal,” writes Neuroskeptic.
- Elsevier has been hacked before. Now someone is sending phishing emails posing as Elsevier to lure unsuspecting scientists.
- “[E]xcessive regulations are wasting scientists’ time and taxpayers’ dollars,” according to the NSF’s policymaking body.
- And “ethics overload may be counterproductive to the successful achievement of research outcomes,” write a group of authors who appear to have misread instructions.
- But “the HHS Office for Human Research Protections (OHRP) opened just one investigation into allegations of violations of human subject protections in all of 2013,” the Report on Research Compliance found.
- The FDA is up to its old tricks again, trying to turn reporters into stenographers. (From our sister blog, Embargo Watch)
- In related embargo news, so to speak, Kent Anderson asks what the “news” associated with publishing a paper in a journal is.
- And Gary Schwitzer, founder of HealthNewsReview.org, provides a great guide to reading health care news stories.
- It’s all well and good to require that researchers report methods to foster reproducibility, but if you can’t tell which model organism or antibody they used, what’s the point?
- Data Colada takes apart the Jens Forster paper under scrutiny.
- RIKEN, home to the STAP stem cell controversy, will review 20,000 papers. And F1000 Research posts a failed replication of the STAP cell experiments.
- “Between 0% and 94% of university students acknowledge having committed [various kinds of] academic fraud.”
- Plagiarism and faking it in journalism: The Jayson Blair story on PBS.
- Professors are prejudiced, too.
- Science writer extraordinaire Ed Yong in conversation with The Wall Street Journal's Robert Lee Hotz at NYU.
One way for scientists to feel less burdened by regulation would be for them to fund their own studies out of their own pockets. Fewer useless studies would benefit the public.
Difficult to see how that would work.
In a recent year I tabulated the supplies ordered for one of my projects. That covers about 4 people working full time (4 man-years) in molecular biology. The total for supplies was $119,000 per year. If we include floor charges (I think $12 per sq ft per year), it comes out to about $143,000 per year.
So, if we all volunteer, get no healthcare, pension or other benefits, and incur no indirect costs or administrative charges, we would still need to come up with over $35,000 each.
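For anyone who wants to check that arithmetic, here is a rough sketch; the 2,000 sq ft of floor space is my own inference from the figures above, not a number the commenter gave.

```latex
% Back-of-the-envelope check of the figures above.
% The 2,000 sq ft of floor space is inferred from the numbers, not stated.
\[
\$119{,}000\ \text{(supplies)} \;+\; \$12/\text{ft}^2 \times 2{,}000\ \text{ft}^2 \;\approx\; \$143{,}000\ \text{per year}
\]
\[
\frac{\$143{,}000}{4\ \text{people}} \;\approx\; \$35{,}750\ \text{per person per year, before any salaries or overhead}
\]
```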
I’m not entirely sure that would help improve the quality of science research.
Business does it all the time.
Startup entrepreneurs do it, not businesses. The problem is that the product we academic scientists generate is knowledge, which doesn't generate revenue. Good luck getting a bank loan. You could suggest that scientists develop sellable products instead, but then it becomes a biotech company, of which there are already too many.
Um, no.
Businesses that do research and development go to the government for funds just like academics do.
Some companies set up research institutes, but these are wholly corporate vanity projects which have no connection with the business model of the firm.
Internal research at companies is exclusively product engineering and design. Before you can think of offering a product on the market, all of the technology must be fully matured, and that maturation takes place elsewhere.
In other words, all the electronic devices in the latest iCrap had their origin 10-50 years ago in anonymous laboratories where their principles were worked out without any potential application in mind.
With respect to:
‘It’s all well and good to require that researchers report methods to foster reproducibility, but if you can’t tell which model organism or antibody they used, what’s the point?’
They state:
“This information is meant to be documented in the ‘materials and methods’ of journal articles, but as many can attest, the information provided there is often not adequate for this task. Such a fundamental shortcoming costs time and resources, and prevents efficient turns of the research cycle whereby research findings are validated and extended toward new discoveries. It also prevents us from retrospectively tagging a resource as problematic or insufficient, should the research process reveal issues with a particular resource.”
I have been told to shorten my M&M section a number of times. I generally try to cite the original methods paper for any assay I use and have had complaints about this too. Writing out exactly how you do immunohistochemistry on a fixed/waxed sample takes quite a chunk of text, especially if you know where you can get false negatives.
The journals want short M&M sections and so that is what we have.
I also think that we should upload every gel used in a paper as a raw TIFF file.
I never get a single band with any of the things I am interested in.
Is it just glioma that has different intron splicing or is it a general property of cancer cells?
Buggered if I know; the literature shows only one band in almost all the papers, whereas the manufacturers' gels show dimers/tetramers and shortened versions.
I am looking at HIF-1alpha at the moment; I see a lovely band at 63 kDa, which is a 550-aa-ish splice variant. Look in the literature and the gels only show a piece of the gel from 95-115 kDa.
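As a quick sanity check on that band size (my own rough estimate using the standard ~110 Da per amino acid residue, not a figure taken from any of the papers):

```latex
% Rough molecular-weight estimate; ~110 Da per residue is an approximation, not a value from the papers.
\[
550\ \text{aa} \times 110\ \text{Da/aa} \approx 60{,}500\ \text{Da} \approx 60\ \text{kDa}
\]
% which is in the right neighbourhood of the observed ~63 kDa band for a ~550-residue splice variant.
```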
How did we get to the stage where, as an example, this:
http://www.nature.com/ncb/journal/v14/n2/images_article/ncb2424-f2.jpg
is supposed to count as evidence?
Each gel run is full of information, yet we get to see only a tiny Photoshopped sliver, while the raw data sits in a folder somewhere rather than in a supplementary data file.
Wow. I did not know it worked like that. It sounds like gels, and maybe other methods, have much bigger problems than the sort of stuff discussed at RW: problems beyond publishing, more like actual methodology ones. This is bad. Someone needs to be in charge of setting standards. And the journals not giving you the space to include what you need to do a good job deserves profanity. Zounds. Yet there is a body of academic research on medical news stories. Sigh.
Weekend Silly Reads #4
http://i.imgur.com/x2bGxZO.png
Scientists are not overregulated. If anything, scientists are underregulated. The problem is that there is now an excess of regulations that are badly managed. Publishers should be called on to provide a more flexible platform that can deal with author and public complaints, especially regarding problems with their editorial boards or journal content. There is an increasing lack of transparency and an increasingly iron-fisted approach, often lacking in logic or openness. Worse yet, there is a problematic corporate culture of assigning as much blame as possible to authors, so as to deflect legal threats and to avoid assigning responsibility proportionately to the editors and publishers who, through poor infrastructure, have been responsible for academic oversight. I think a key question that no one is asking is: how do we effectively hold all parties accountable?
But “the HHS Office for Human Research Protections (OHRP) opened just one investigation into allegations of violations of human subject protections in all of 2013,” the Report on Research Compliance found.
Hey, guys, thanks for recognizing my analysis of OHRP activity, post-SUPPORT. I found my stats pretty shocking, too.
This is my 8th year writing Report on Research Compliance and covering OHRP. The story also updates a piece I wrote in 2011 when OHRP was already on the decline.
The reporters get stuff wrong; they don't know what the words mean; 'melodramatic drivel' alone is a basic problem. Media criticism has some excellent tools that are not used in this paper, including knowing that news stories usually follow weird genre conventions. Medical/illness ones must be uplifting. New things are always Breakthroughs! The latest research is THE best and Changes Everything! If there was ever a topic that called out for the case-study approach over the statistical one! Also, it's MDs who participate in the hypey news stories. A better process for examining medical news stories would not have doctors involved.