
In Defense of HARKing

Published online by Cambridge University Press:  06 March 2018

Jeffrey B. Vancouver*

Affiliation: Department of Psychology, Ohio University

*Correspondence concerning this article should be addressed to Jeffrey B. Vancouver, Ohio University, Department of Psychology, 200 Porter Hall, Athens, OH 45701. E-mail: [email protected]

Extract

Science is a complex task. It involves the creation and dissemination of knowledge. The creation of knowledge requires identifying and abstracting patterns (i.e., identifying phenomena and theorizing about the processes that bring them about), as well as observing systematically to better see and quantify the patterns (e.g., effect size estimation) or to assess the validity of the abstractions used to explain the patterns (i.e., theory testing). To help (a) home in on what observations would be useful and (b) communicate what the patterns mean, we are encouraged to develop and report hypotheses. That is, strategically, hypotheses facilitate the planning of data collection by helping the researcher understand what patterns need to be observed to assess the merit of an explanation. Meanwhile, tactically, hypotheses help focus the audience on the crucial patterns needed to answer a question or test a theory. When the strategic hypotheses are not supported, a question arises regarding what to do tactically. Depending on the result (i.e., an effect in a different direction or a null finding), one might construct a hypothesis to facilitate dissemination without reporting its post hoc construction, or remove mention of a hypothesis altogether. This practice is called HARKing (i.e., hypothesizing after results are known). HARKing has been so disparaged as to be considered a “detrimental research practice” (Grand et al., 2018, p. 6). As such, the Society for Industrial and Organizational Psychology's (SIOP) Robust and Reliability Science task force appears to recommend that HARKing not be taught by educators, encouraged by reviewers or editors, or practiced by authors. I do not agree with those recommendations, and I elaborate on my position below.

Type: Commentaries

Copyright © Society for Industrial and Organizational Psychology 2018


References

Bosco, F. A., Aguinis, H., Field, J. G., Pierce, C. A., & Dalton, D. R. (2016). HARKing's threat to organizational research: Evidence from primary and meta-analytic sources. Personnel Psychology, 69, 709–750.
Grand, J. A., Rogelberg, S. G., Allen, T. A., Landis, R. S., Reynolds, D. H., Scott, J. C., . . . Truxillo, D. M. (2018). A systems-based approach to fostering robust science in industrial-organizational psychology. Industrial and Organizational Psychology: Perspectives on Science and Practice, 11(1), 4–42.
Hollenbeck, J. R., & Wright, P. M. (2017). Harking, sharking, and tharking: Making the case for post hoc analysis of scientific data. Journal of Management, 43, 5–18.
John, L. K., Loewenstein, G., & Prelec, D. (2012). Measuring the prevalence of questionable research practices with incentives for truth telling. Psychological Science, 23, 524–532.
Kerr, N. L. (1998). HARKing: Hypothesizing after the results are known. Personality and Social Psychology Review, 2, 196–217.
Runkel, P. J., & McGrath, J. E. (1972). Research on human behavior: A systematic guide to method. New York: Holt, Rinehart, and Winston.
Vancouver, J. B. (2010). Improving I-O science through synthetic validity. Industrial and Organizational Psychology: Perspectives on Science and Practice, 3, 360–362.