
Improving I-O Science Through Synthetic Validity

Published online by Cambridge University Press:  07 January 2015

Jeffrey B. Vancouver*
Affiliation: Ohio University
*Correspondence: [email protected], Department of Psychology, Ohio University, Athens, OH 45701

Extract

The purpose of this comment is to advocate for the development of a synthetic validity database along the lines described by Johnson et al. (2010), not only because of its practical utility but also because of its potential contribution to the science of industrial and organizational (I-O) psychology generally. Specifically, one of the core scientific pursuits of psychology is the understanding of individual differences (IDs). Knowledge of the contexts in which specific IDs matter (i.e., in terms of the level of behavior exhibited) can provide information about the nature of those IDs. For example, if the degree to which a job requires continuous learning of symbolic material is related to the predictive validity of general cognitive ability (g; i.e., learning context moderates the g–performance relationship), it is likely that g has something to do with the acquisition of new knowledge. Alternatively, or additionally, if the degree to which a job requires making decisions among large numbers of multi-attribute options is related to the validity of g as a predictor of job performance, it implies that g involves information-processing capacity (and perhaps it is via this mechanism that g is related to learning). That is, evidence of the moderator effects of context (i.e., conditions on which jobs or tasks vary) on ID–performance relationships provides information about the nature of the ID construct, as well as whether the ID construct is likely to be predictive of performance in any particular context. This is another reason to develop a database that indicates which context variables predict the validities of ID constructs.
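One minimal way to formalize the moderation logic described above is a standard moderated regression; the notation below is illustrative and not taken from the original commentary. Let Y_ij denote the job performance of person i in job j, g_i that person's general cognitive ability, and C_j a context variable on which jobs vary (e.g., the degree of continuous learning the job requires):

\[
Y_{ij} = \beta_0 + \beta_1 g_i + \beta_2 C_j + \beta_3 \,(g_i \times C_j) + \varepsilon_{ij}
\]

A nonzero \beta_3 is the formal sense in which the context variable moderates the g–performance relationship: the predictive validity of g changes as a function of C_j, which is the kind of evidence a synthetic validity database could accumulate across jobs.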

Type: Commentaries
Copyright © Society for Industrial and Organizational Psychology 2010


References

Ackerman, P. L. (1988). Determinants of individual differences during skill acquisition: Cognitive abilities and information processing. Journal of Experimental Psychology: General, 117, 288–318.
Drasgow, F. (2003). Intelligence and the workplace. In Borman, W. C., Ilgen, D. R., & Klimoski, R. J. (Eds.), Handbook of psychology: Industrial and organizational psychology (Vol. 12, pp. 107–130). Hoboken, NJ: Wiley.
Harvey, R. J. (1991). Job analysis. In Dunnette, M. D., & Hough, L. M. (Eds.), Handbook of industrial and organizational psychology (2nd ed., Vol. 2, pp. 71–163). Palo Alto, CA: Consulting Psychologists Press.
Johnson, J. W., Steel, P., Scherbaum, C. A., Hoffman, C. C., Jeanneret, P. R., & Foster, J. (2010). Validation is like motor oil: Synthetic is better. Industrial and Organizational Psychology: Perspectives on Science and Practice, 3, 305–328.
Keil, C. T., & Cortina, J. M. (2001). Degradation of validity over time: A test and extension of Ackerman's model. Psychological Bulletin, 127, 673–697.
Meyer, R. D., Dalal, R. S., & Bonaccio, S. (2009). A meta-analytic investigation into the moderating effects of situational strength on the conscientiousness–performance relationship. Journal of Organizational Behavior, 30, 1077–1102.
Steel, P., & Kammeyer-Mueller, J. (2009). Using a meta-analytic perspective to enhance job component validation. Personnel Psychology, 62, 533–552.