
Verify original results through reanalysis before replicating

Published online by Cambridge University Press:  27 July 2018

Michèle B. Nuijten
Affiliation:
Department of Methodology and Statistics, Tilburg School of Social and Behavioral Sciences, Tilburg University, 5037 AB, Tilburg, The Netherlands. [email protected] http://mbnuijten.com
Marjan Bakker
Affiliation:
Department of Methodology and Statistics, Tilburg School of Social and Behavioral Sciences, Tilburg University, 5037 AB, Tilburg, The Netherlands. [email protected] http://marjanbakker.eu
Esther Maassen
Affiliation:
Department of Methodology and Statistics, Tilburg School of Social and Behavioral Sciences, Tilburg University, 5037 AB, Tilburg, The Netherlands. [email protected] https://www.tilburguniversity.edu/webwijs/show/e.maassen/
Jelte M. Wicherts
Affiliation:
Department of Methodology and Statistics, Tilburg School of Social and Behavioral Sciences, Tilburg University, 5037 AB, Tilburg, The Netherlands. [email protected] http://jeltewicherts.net

Abstract

In determining the need to directly replicate, it is crucial to first verify the original results through independent reanalysis of the data. Original results that appear erroneous and that cannot be reproduced by reanalysis offer little evidence to begin with, thereby diminishing the need to replicate. Sharing data and scripts is essential to ensure reproducibility.
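One part of such verification is checking whether reported statistics are internally consistent, the idea behind tools like statcheck. Below is a minimal, hypothetical sketch of that idea in Python (not the authors' method): given a reported standard-normal test statistic, it recomputes the two-sided p-value and checks whether the reported, rounded p-value is consistent with it. The function names and rounding convention are illustrative assumptions.

```python
import math

def two_sided_p_from_z(z: float) -> float:
    """Recompute the two-sided p-value for a standard-normal test statistic."""
    return math.erfc(abs(z) / math.sqrt(2.0))

def consistent(z: float, reported_p: float, decimals: int = 2) -> bool:
    """True if the reported p-value matches the recomputed one after rounding.

    A mismatch flags a possible reporting error worth resolving before
    investing in a replication. (Illustrative convention: compare at the
    precision to which p-values are typically reported.)
    """
    return round(two_sided_p_from_z(z), decimals) == round(reported_p, decimals)

# z = 1.96 recomputes to p ≈ .050, so a report of p = .05 is consistent:
print(consistent(1.96, 0.05))  # True
# z = 2.30 recomputes to p ≈ .021, so a report of p = .04 is inconsistent:
print(consistent(2.30, 0.04))  # False
```

A real consistency check (as in statcheck) also handles t-, F-, and chi-square statistics, one-sided tests, and inequality reporting such as "p < .05"; this sketch covers only the simplest case.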

Type
Open Peer Commentary
Copyright
Copyright © Cambridge University Press 2018 
