Editor won’t investigate data concerns about paper linking anti-prostitution laws to increased rape

After reading an economics paper that claimed to document an increase in the rate of rape in European countries following the passage of prostitution bans, a data scientist had questions. 

The scientist, who wishes to remain anonymous, sent a detailed email to an editor of the Journal of Law and Economics, which had published the paper last November, outlining concerns about the data and methods the authors used. 

Among them: the historical rates of rape recorded in the paper did not match the values in the official sources the authors said they used. In other cases, data that were available from the official sources were missing from the paper. The authors also did not incorporate all the data they had collected into their model, and a variable was coded inconsistently, the data scientist wrote. (We’ve made the full critique available here.)

Given the consequences the conclusions of the article could have for people in the sex industry, the data scientist wrote, “I hope that someone takes this very seriously and looks into it the [sic] validity of the analysis and the data they used.” 

In response, Sam Peltzman, an editor of the journal and a professor emeritus of economics at the University of Chicago’s Booth School of Business, instructed the data scientist to contact the authors of the article: 

The email raises serious questions but without any specific request. Your questions can better be answered by the authors than editors who, as you must know, cannot give each submission the kind of careful attention reflected in your email. Accordingly, we ask that you contact the authors directly if you have not already done so.

If you mean the email as a prologue to a critique, I am happy to discuss our relevant policies or any other question about our editorial process.

The data scientist wrote back with a specific request: 

I have just informed you, the editor, that it appears that the authors made an error in at least one of their models that resulted in a substantive difference in the conclusions of the article you edited … I am requesting you investigate if these models are correct and if so, at very least issue a correction. [emphasis original]

In response, Peltzman reiterated his refusal to investigate: 

I can only repeat what was in my last letter. You should take this up with the authors first. The editors cannot become involved unless your conversation with the authors fails to resolve the issues and a comment is received through the usual submission process.           

The University of Chicago Press, which publishes the Journal of Law and Economics, states on its publication ethics page that

When notified of possible errors or corrections, the editor(s) of the journal will review and resolve them in consultation with the Press and according to the Press’s best practices. 

We asked Peltzman why he refused to investigate the concerns the data scientist had raised. He told us:  

The JLE does not have the resources to investigate concerns about data procedure used by authors. 

We select referees knowledgeable about the topic of any submission.  Occasionally a referee might comment on some detail of data used by authors.  more often the referee and editors have to take data details at face value and focus their efforts on evaluating empirical results and analysis.  While I can only speak for the JLE it is my impression that these procedures are common among economics journals that publish empirical articles.

Peltzman also explained that the journal’s standard procedure for considering critiques of published articles, “designed to avoid misunderstanding and excessive burden on editors’ and referees’ time,” starts with the critic contacting the authors directly. 

If the authors don’t respond, or if their response is unsatisfactory, the critic could then submit a comment to the journal along with their correspondence with the authors, which the editors would handle as any other submission. 

“Editors obviously cannot be expected to look at raw data for every paper they review,” the data scientist acknowledged, “but when concerns are brought directly to them it is their responsibility to take them seriously. If readers can’t trust that editors will address serious concerns appropriately, it will undermine their faith in the scientific process.”  

We contacted the authors of the paper, Huasheng Gao and Vanya Stefanova Petrova of Fudan University’s Fanhai International School of Finance in Shanghai, and shared the data scientist’s critique. They responded with an 11-page PDF, available here, standing by their work. 

About the differences between the data in their paper and the official sources, they said: 

the data we have used in the paper were the most up-to-date data available at the time we started the empirical work in 2018 … Eurostat is constantly revising its data. It is possible that the data contained in its current version are different from the historical version

The data scientist was unimpressed, and noted that the authors had not responded to a key aspect of the critique: 

Even if the authors believe it was a reasonable strategy to only assess two years post policy change, the relative year variable for year 2— the year in which they identified a large causal increase in rape in the criminalized prostitution countries and a reduction in the prostitution decriminalized countries — was coded incorrectly (or differently for some reason). When the coding is consistent with their original coding scheme, a reduction in rape is seen in the criminalized prostitution group. I’m not sure why they didn’t address this in their response.

The authors also did not directly address the data scientist’s concern that, had they incorporated every year of data they had on rape rates into their model, rather than only the two years following a change in prostitution laws, they would not have gotten the same results, the scientist said. 

To check whether data values had indeed changed since the authors started their work, the scientist went to the website of the University of Michigan’s Institute for Social Research, where the survey data the authors used is available for download, and found that no substantive changes had been made. 

The scientist told us: 

If they did something wrong or made a mistake they should just take accountability and retract the article.


