
Publication Biases in Replication Studies

Published online by Cambridge University Press: 16 November 2020

Adam J. Berinsky
Affiliation: Department of Political Science, Massachusetts Institute of Technology, Cambridge, MA, USA. Email: [email protected], [email protected]

James N. Druckman*
Affiliation: Department of Political Science, Northwestern University, Evanston, IL 60208, USA. Email: [email protected]

Teppei Yamamoto
Affiliation: Department of Political Science, Massachusetts Institute of Technology, Cambridge, MA, USA. Email: [email protected], [email protected]

* Corresponding author: James N. Druckman

Abstract

One of the strongest findings across the sciences is that publication bias occurs. Of particular note is a “file drawer bias” whereby statistically significant results are privileged over nonsignificant ones. Recognition of this bias, along with increased calls for “open science,” has led to an emphasis on replication studies. Yet few have explored publication bias and its consequences in replication studies. We offer a model of the publication process involving an initial study and a replication. We use the model to describe three types of publication biases: (1) file drawer bias, (2) a “repeat study” bias against the publication of replication studies, and (3) a “gotcha bias” in which replication results that run contrary to a prior study are more likely to be published. We estimate the model’s parameters with a vignette experiment conducted with political science professors teaching at Ph.D.-granting institutions in the United States. We find evidence of all three types of bias, although those explicitly involving replication studies are notably smaller. This bodes well for the replication movement. That said, the aggregation of all of the biases increases the number of false positives in a literature. We conclude by discussing a path for future work on publication biases.
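
The abstract's closing point, that the three biases together inflate the number of false positives left standing in a literature, can be illustrated with a toy simulation. The sketch below is not the authors' model: the share of true hypotheses, statistical power, the significance level, and every publication probability are assumed values chosen only to show the mechanism, and the "standing false positive" metric (a published significant initial result on a null effect that no published nonsignificant replication contradicts) is a simplification introduced here.

# Toy illustration (not the authors' model): file drawer, repeat-study, and
# gotcha biases applied to a stylized literature of study pairs. All
# parameter values are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
n = 200_000                 # hypothetical (initial study, replication) pairs
p_true = 0.5                # assumed share of hypotheses with a real effect
power, alpha = 0.8, 0.05    # assumed power and per-study false-positive rate

# Hypothetical publication probabilities
pub_sig, pub_null = 0.6, 0.1            # file drawer bias in the initial study
pub_rep_match, pub_rep_clash = 0.2, 0.4 # repeat-study bias vs. gotcha bias

def simulate(biased):
    true_eff = rng.random(n) < p_true
    # significance of the initial study and the replication
    sig1 = np.where(true_eff, rng.random(n) < power, rng.random(n) < alpha)
    sig2 = np.where(true_eff, rng.random(n) < power, rng.random(n) < alpha)
    if biased:
        # initial study: significant results are far more likely to appear
        pub1 = rng.random(n) < np.where(sig1, pub_sig, pub_null)
        # replication: published only if the original was, with a higher
        # probability when it contradicts the original (gotcha bias)
        pub2 = pub1 & (rng.random(n) < np.where(sig2 == sig1,
                                                pub_rep_match, pub_rep_clash))
    else:
        # benchmark: everything is published
        pub1 = np.ones(n, dtype=bool)
        pub2 = np.ones(n, dtype=bool)
    # standing false positive: published significant initial result on a null
    # effect, not contradicted by a published nonsignificant replication
    standing_fp = pub1 & sig1 & ~true_eff & ~(pub2 & ~sig2)
    return standing_fp.mean()

print("standing false positives per study pair, no bias:", round(simulate(False), 4))
print("standing false positives per study pair, biased :", round(simulate(True), 4))

Under these assumed parameters the biased scenario leaves several times more uncorrected false positives per study pair than the publish-everything benchmark, which is the qualitative pattern the abstract describes.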

Type: Article
Copyright: © The Author(s) 2020. Published by Cambridge University Press on behalf of the Society for Political Methodology


Footnotes

Edited by Jeff Gill

Supplementary material

Berinsky et al. Dataset (link)
Berinsky et al. supplementary material (PDF, 284.3 KB)