Discussion Paper
No. 2017-78 | September 28, 2017
B. D. McCullough
Quis custodiet ipsos custodes?: Despite evidence to the contrary, the American Economic Review concluded that all was well with its archive
(Published in The practice of replication)

Abstract

In 2011, the Annual Report of the Editor of the American Economic Review reported that the journal’s data-code archive was functioning well, and made no changes to the archive rules. This conclusion was based on an audit of the archive that the editor had commissioned. The audit was performed by a graduate student who apparently had no experience with archives, and it concluded that all was largely well. In point of fact, all was not well: the archive did not support the publication of reproducible research. The rules for the archive should have been changed but were not; thus the American Economic Review continued to publish articles that were not reproducible. The cause of reproducible research was set back many years.

JEL Classification:

B40


Cite As

[Please cite the corresponding journal article] B. D. McCullough (2017). Quis custodiet ipsos custodes?: Despite evidence to the contrary, the American Economic Review concluded that all was well with its archive. Economics Discussion Papers, No 2017-78, Kiel Institute for the World Economy. http://www.economics-ejournal.org/economics/discussionpapers/2017-78


Comments and Questions



Andrew C. Chang, Board of Governors of the Federal Reserve System, Washington DC, USA - Referee report 1
October 11, 2017 - 21:27
see attached file

B. D. McCullough - Reply to comments of referee 1
December 13, 2017 - 08:13
see attached file

Anonymous - Referee report 2
October 18, 2017 - 10:08
This paper provides a useful audit of the audit and draws some important lessons as journals begin to take data archives more seriously. I think the paper would be more effective, however, if it focused more on the lessons and less on the finger pointing. As noted in the paper, the AER has recognized the problems with its archive and is recruiting a data editor. That renders the investigative-reporting aspect of the paper moot. I also think that a shortened version of the paper would make the key points more effectively by allowing them to stand out more. Here are some specific suggestions.

1. I would motivate the paper with the current situation, i.e. that the current editors understand that the archive is not working and are seeking a data editor. The purpose of this paper is to draw some lessons from what happened in the past to help inform the path forward. While I appreciate the desire to call out those who should have known better, that motivation is likely to dissuade those who need to learn the lessons from paying attention to the paper.

2. I think the repeated calling out of Glandon as a graduate student detracts from the larger messages. I am also not sure that the accusation about hiring someone with no experience with archives helps either. It is worth stating that Glandon, as a graduate student, had limited experience with the wide variety of datasets and code that should be included in the archive. But I would make it more about the limited breadth of his experience with data and code than pointing a finger at him as a graduate student or as having not worked with archives. I don’t see working with archives as experience that would be expected of any economist, at least not at that time. The point is really that we need people who specialize in this so that, going forward, we can audit archives. You might suggest what kind of background or experience such a person should have. Again, the point is what we should do differently going forward, more than making Glandon look bad for being a graduate student.

3. Similarly, the article reads like an attack on Moffitt. Sure, it is worth mentioning his name once or twice. But I don’t imagine any editor at that time would have done anything differently, and all the editors should hold some responsibility. Others besides Moffitt read Glandon’s report and didn’t object to Moffitt’s decision. You might use AER instead of Moffitt for some accusations, but also have fewer accusations and more recommendations.

4. The paper needs to be better structured. Right now it is all over the place, which makes it hard to tease out the important concepts and recommendations. Also, the sprinkling of accusations throughout makes the paper sound like investigative reporting and not an academic article. One suggested structure for the paper is: introduction; data vs. reproducibility (what the policies are and should be); definition of reproducibility (what tasks are included, how to interpret results); what happened in 2008 (and how it differs from what you have laid out in the prior two sections); recommendations.

5. I think the caveat of different versions of the software applies more often than we expect, making the binary rule hard to apply.

6. I don’t see the Hoxby example as relevant to your discussion of reproducibility. You say in the first paragraph that the definition is “the data and code in the archive reproduce the published results”. Hoxby’s dataset would have included her coding of the variable.
It was not a transformation of a dataset using a software program; it was a concept that she created, and she created the data for it. I agree that her research is not replicable because of her data source, but that does not mean her results are not reproducible: if you had her dataset and her code, you could reproduce the results. That is, a perfect archive would not address the measurement issue you raise. In general, it is important to make sure that your examples match the kind of replication (what you call reproducibility) that you are talking about.

In general, there are some important concepts and some good discussion in this paper, but I recommend that the author organize the discussion in a more linear manner and focus on the facts and the recommendations and less on the finger pointing. The facts do speak for themselves in the sense that the Glandon report was not an appropriate audit and Moffitt should have known better.

B. D. McCullough - Reply to comments of referee 2
December 13, 2017 - 08:14
see attached file

Anonymous - Referee report 3
November 15, 2017 - 14:06
see attached file

B. D. McCullough - Reply to comments of referee 3
December 13, 2017 - 08:15
see attached file

W. Robert Reed - Decision letter
January 20, 2018 - 19:31
see attached file