
The lack of robust evidence for cleansing effects

Published online by Cambridge University Press:  18 February 2021

Ivan Ropovik
Affiliation:
Charles University, Institute for Research and Development of Education, Prague, Czechia, [email protected]; University of Presov, Faculty of Education, Presov 08001, Slovakia
Alessandro Sparacio
Affiliation:
Université Grenoble Alpes, Laboratoire Inter-universitaire de Psychologie, Grenoble, France, [email protected] [email protected]; Swansea University, Department of Psychology, Swansea SA2 8PP, UK
Hans IJzerman
Affiliation:
Université Grenoble Alpes, Laboratoire Inter-universitaire de Psychologie, Grenoble, France, [email protected] [email protected]; Institut Universitaire de France, Paris 75231, France.

Abstract

The pattern of data underlying the successful replications of cleansing effects is improbable and most consistent with selective reporting. Moreover, the meta-analytic approach presented by Lee and Schwarz is likely to find an effect even if none existed. Absent more robust evidence, there is no need to develop a theoretical account of grounded procedures.
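The mechanism at issue is that a meta-analysis which naively aggregates selectively reported studies will recover a positive pooled effect even when the true effect is zero. The following minimal Python simulation is an illustration of that general point, not an analysis from the commentary or from Lee and Schwarz; the per-group sample size, the number of "published" studies, and the p < .05 selection rule are all illustrative assumptions.

import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_per_group = 30   # assumed per-group sample size of each simulated study
n_published = 20   # studies that survive selection on p < .05

published_d, published_var = [], []
while len(published_d) < n_published:
    # Two-group experiment with a true effect of exactly zero
    control = rng.normal(0.0, 1.0, n_per_group)
    treatment = rng.normal(0.0, 1.0, n_per_group)
    t, p = stats.ttest_ind(treatment, control)
    if p < 0.05:   # selective reporting: only significant studies are kept
        d = (treatment.mean() - control.mean()) / np.sqrt(
            (treatment.var(ddof=1) + control.var(ddof=1)) / 2
        )
        # Approximate sampling variance of Cohen's d for equal group sizes
        v = 2 / n_per_group + d**2 / (4 * n_per_group)
        published_d.append(d)
        published_var.append(v)

# Naive inverse-variance (fixed-effect) meta-analysis of the surviving studies
w = 1 / np.array(published_var)
pooled = np.sum(w * np.array(published_d)) / np.sum(w)
se = np.sqrt(1 / np.sum(w))
print(f"Pooled d = {pooled:.2f} (SE = {se:.2f}); true effect = 0")

Because only significant studies enter the pool, the naive pooled estimate lands well away from zero with a small standard error, which is the kind of spurious "effect" that bias-robust methods are designed to detect.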

Type
Open Peer Commentary
Copyright
Copyright © The Author(s), 2021. Published by Cambridge University Press
