Standard Operating Procedures: A Safety Net for Pre-Analysis Plans

Published online by Cambridge University Press: 15 July 2016

Winston Lin, Columbia University
Donald P. Green, Columbia University

Abstract

Across the social sciences, growing concerns about research transparency have led to calls for pre-analysis plans (PAPs) that specify in advance how researchers intend to analyze the data they are about to gather. PAPs promote transparency and credibility by helping readers distinguish between exploratory and confirmatory analyses. However, PAPs are time-consuming to write and may fail to anticipate contingencies that arise in the course of data collection. This article proposes the use of “standard operating procedures” (SOPs)—default practices to guide decisions when issues arise that were not anticipated in the PAP. We offer an example of an SOP that can be adapted by other researchers seeking a safety net to support their PAPs.
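To make the proposal concrete, the minimal sketch below (ours, not part of the article) treats both the PAP and the SOP as lookups from contingencies to analysis rules, with the PAP taking precedence and the SOP acting as the fallback. All contingency names and default rules here are hypothetical illustrations; the actual document is Lin, Green, and Coppock (2015), listed in the references.

```python
# A minimal sketch of the PAP/SOP division of labor described above.
# The contingency names and default rules are hypothetical illustrations,
# not the contents of the published Green Lab SOP.

# Standing lab defaults, written down once and reused across studies.
SOP_DEFAULTS = {
    "missing_covariate": "impute the sample mean and add a missingness indicator",
    "noncompliance": "report the intention-to-treat estimate as primary",
    "attrition": "report bounds alongside the complete-case estimate",
}

def resolve(contingency: str, pap: dict) -> str:
    """Return the PAP's pre-specified rule if it covers the contingency;
    otherwise fall back to the lab's standing SOP default."""
    if contingency in pap:
        return pap[contingency]           # the PAP takes precedence
    if contingency in SOP_DEFAULTS:
        return SOP_DEFAULTS[contingency]  # the SOP is the safety net
    raise LookupError(
        f"no rule for {contingency!r}: document the ad hoc decision "
        "and label the resulting analysis as exploratory"
    )

# Example: this PAP anticipated noncompliance but not attrition.
pap = {"noncompliance": "report ITT and CACE estimates"}
print(resolve("noncompliance", pap))  # rule comes from the PAP
print(resolve("attrition", pap))      # rule falls back to the SOP
```

The point of the precedence rule is that an SOP never overrides what was pre-registered; it only fills gaps, so every analytic decision is either pre-specified or traceable to a standing default.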

Type: The Profession
Copyright: © American Political Science Association 2016

References

Anderson, Michael L. 2008. “Multiple Inference and Gender Differences in the Effects of Early Intervention: A Reevaluation of the Abecedarian, Perry Preschool, and Early Training Projects.” Journal of the American Statistical Association 103 (484): 1481–1495.
Angrist, Joshua D., Imbens, Guido W., and Rubin, Donald B. 1996. “Identification of Causal Effects Using Instrumental Variables.” Journal of the American Statistical Association 91 (434): 444–455.
Bidwell, Kelly, Casey, Katherine, and Glennerster, Rachel. 2015. “The Impact of Voter Knowledge Initiatives in Sierra Leone.” AEA RCT Registry. https://www.socialscienceregistry.org/trials/26.
Brodeur, Abel, Lé, Mathias, Sangnier, Marc, and Zylberberg, Yanos. 2016. “Star Wars: The Empirics Strike Back.” American Economic Journal: Applied Economics 8 (1): 1–32.
Casey, Katherine, Glennerster, Rachel, and Miguel, Edward. 2012. “Reshaping Institutions: Evidence on Aid Impacts Using a Preanalysis Plan.” Quarterly Journal of Economics 127 (4): 1755–1812.
Chambers, Christopher D., Feredoes, Eva, Muthukumaraswamy, Suresh D., and Etchells, Peter J. 2014. “Instead of ‘Playing the Game’ It Is Time to Change the Rules: Registered Reports at AIMS Neuroscience and Beyond.” AIMS Neuroscience 1 (1): 4–17.
Chan, An-Wen, Tetzlaff, Jennifer M., Gøtzsche, Peter C., Altman, Douglas G., Mann, Howard, Berlin, Jesse A., Dickersin, Kay, Hróbjartsson, Asbjørn, Schulz, Kenneth F., Parulekar, Wendy R., Krleža-Jerić, Karmela, Laupacis, Andreas, and Moher, David. 2013. “SPIRIT 2013 Explanation and Elaboration: Guidance for Protocols of Clinical Trials.” BMJ 346: e7586.
Efron, Bradley. 2010. Large-Scale Inference: Empirical Bayes Methods for Estimation, Testing, and Prediction. New York: Cambridge University Press.
Franco, Annie, Malhotra, Neil, and Simonovits, Gabor. 2014. “Publication Bias in the Social Sciences: Unlocking the File Drawer.” Science 345 (6203): 1502–1505.
Franco, Annie, Malhotra, Neil, and Simonovits, Gabor. 2016. “Underreporting in Psychology Experiments: Evidence from a Study Registry.” Social Psychological and Personality Science 7 (1): 8–12.
Freedman, David A. 2008. “Oasis or Mirage?” Chance 21 (1): 59–61.
Freedman, David A. 2010. “Survival Analysis: An Epidemiological Hazard?” In Statistical Models and Causal Inference: A Dialogue with the Social Sciences, ed. Collier, David, Sekhon, Jasjeet S., and Stark, Philip B., 169–192. New York: Cambridge University Press.
Gerber, Alan and Malhotra, Neil. 2008. “Do Statistical Reporting Standards Affect What Is Published? Publication Bias in Two Leading Political Science Journals.” Quarterly Journal of Political Science 3 (3): 313–326.
Humphreys, Macartan, Sanchez de la Sierra, Raul, and van der Windt, Peter. 2013. “Fishing, Commitment, and Communication: A Proposal for Comprehensive Nonbinding Research Registration.” Political Analysis 21 (1): 1–20.
Lin, Winston, Green, Donald P., and Coppock, Alexander. 2015. “Standard Operating Procedures for Don Green’s Lab at Columbia.” https://github.com/acoppock/Green-Lab-SOP.
McKenzie, David. 2012. “A Pre-Analysis Plan Checklist.” World Bank. http://blogs.worldbank.org/impactevaluations/a-pre-analysis-plan-checklist.
Miguel, E., Camerer, C., Casey, K., Cohen, J., Esterling, K. M., Gerber, A., Glennerster, R., Green, D. P., Humphreys, M., Imbens, G., Laitin, D., Madon, T., Nelson, L., Nosek, B. A., Petersen, M., Sedlmayr, R., Simmons, J. P., Simonsohn, U., and van der Laan, M. 2014. “Promoting Transparency in Social Science Research.” Science 343 (6166): 30–31.
Monogan, James E. III. 2013. “A Case for Registering Studies of Political Outcomes: An Application in the 2010 House Elections.” Political Analysis 21 (1): 21–37.
Monogan, James E. III. 2015. “Research Preregistration in Political Science: The Case, Counterarguments, and a Response to Critiques.” PS: Political Science and Politics 48 (3): 425–429.
Nosek, B. A., Alter, G., Banks, G. C., Borsboom, D., Bowman, S. D., Breckler, S. J., Buck, S., Chambers, C. D., Chin, G., Christensen, G., Contestabile, M., Dafoe, A., Eich, E., Freese, J., Glennerster, R., Goroff, D., Green, D. P., Hesse, B., Humphreys, M., Ishiyama, J., Karlan, D., Kraut, A., Lupia, A., Mabry, P., Madon, T., Malhotra, N., Mayo-Wilson, E., McNutt, M., Miguel, E., Paluck, E. L., Simonsohn, U., Soderberg, C., Spellman, B. A., Turitto, J., VandenBos, G., Vazire, S., Wagenmakers, E. J., Wilson, R., and Yarkoni, T. 2015. “Promoting an Open Research Culture.” Science 348 (6242): 1422–1425.
Nyhan, Brendan. 2015. “Increasing the Credibility of Political Science Research: A Proposal for Journal Reforms.” PS: Political Science and Politics 48 (S1): 78–83.
O’Donoghue, Ted and Rabin, Matthew. 2001. “Choice and Procrastination.” Quarterly Journal of Economics 116 (1): 121–160.
Olken, Benjamin A. 2015. “Promises and Perils of Pre-Analysis Plans.” Journal of Economic Perspectives 29 (3): 61–80.
Open Science Collaboration. 2015. “Estimating the Reproducibility of Psychological Science.” Science 349: aac4716.
Rosenthal, Robert. 1979. “The ‘File Drawer Problem’ and Tolerance for Null Results.” Psychological Bulletin 86 (3): 638–641.
Rubin, Donald B. 2007. “The Design versus the Analysis of Observational Studies for Causal Effects: Parallels with the Design of Randomized Trials.” Statistics in Medicine 26 (1): 20–36.
Simmons, Joseph P., Nelson, Leif D., and Simonsohn, Uri. 2011. “False-Positive Psychology: Undisclosed Flexibility in Data Collection and Analysis Allows Presenting Anything as Significant.” Psychological Science 22 (11): 1359–1366.
Tukey, J. W. 1993. “Tightening the Clinical Trial.” Controlled Clinical Trials 14 (4): 266–285.
Westfall, Peter H., Tobias, Randall D., and Wolfinger, Russell D. 2011. Multiple Comparisons and Multiple Tests Using SAS. 2nd ed. Cary, NC: SAS Institute.