
Developing an Online Synthetic Validation Tool

Published online by Cambridge University Press:  07 January 2015

Rodney A. McCloy*
Affiliation: Human Resources Research Organization
Dan J. Putka
Affiliation: Human Resources Research Organization
Robert E. Gibby
Affiliation: Procter & Gamble
*Corresponding author. E-mail: [email protected], Address: Human Resources Research Organization (HumRRO), 10503 Timberwood Circle, Suite 101, Louisville, KY 40223

Extract

In their comprehensive apologetic treatment of synthetic validity, Johnson et al. (2010) echo Hough (2001) in advocating the development of a central synthetic validation database, which would serve as a repository of validity information to support future synthetic validation efforts. They offer two potential approaches for developing such a database. The first entails conducting “a large-scale study in which tests are administered to and performance ratings gathered on incumbents in a large number of jobs in a variety of organizations.” The authors consider this approach “ideal but impractical,” largely because of the scope and cost of the required data collection and the resulting investment demanded of any sponsoring organization. The second entails conducting multiple local studies to generate empirical estimates of the relationships between measures of various predictor constructs and a standardized set of job components. Johnson et al. consider this approach more practical, citing Meyer, Dalal, and Bonaccio (2009) as a benchmark example.

Type
Commentaries
Copyright
Copyright © Society for Industrial and Organizational Psychology 2010 



References

Hough, L. M. (2001). I/Owes its advances to personality. In Roberts, B. W., & Hogan, R. T. (Eds.), The intersection of personality and industrial/organizational psychology (pp. 19–44). Washington, DC: American Psychological Association.
Johnson, J. W., Steel, P., Scherbaum, C. A., Hoffman, C. A., Jeanneret, P. R., & Foster, J. (2010). Validation is like motor oil: Synthetic is better. Industrial and Organizational Psychology, 3, 305–328.
Meyer, R. D., Dalal, R. S., & Bonaccio, S. (2009). A meta-analytic investigation into the moderating effects of situational strength on the conscientiousness–performance relationship. Journal of Organizational Behavior, 30, 1077–1102.
Nunnally, J. C., & Bernstein, I. H. (1994). Psychometric theory (3rd ed.). New York: McGraw-Hill.
Oswald, F. L., & McCloy, R. A. (2003). Meta-analysis and the art of the average. In Murphy, K. R. (Ed.), Validity generalization: A critical review (pp. 311–338). Mahwah, NJ: Lawrence Erlbaum Associates.
Peterson, N. G., Mumford, M. D., Borman, W. C., Jeanneret, P. R., & Fleishman, E. A. (1995). Development of prototype Occupational Information Network (O*NET) content model. Volume I: Report [and] Volume II: Appendices. Washington, DC: American Institutes for Research.
Peterson, N. G., Wise, L. L., Arabian, J., & Hoffman, R. G. (2001). Synthetic validation and validity generalization: When empirical validation is not possible. In Campbell, J. P., & Knapp, D. J. (Eds.), Exploring the limits of personnel selection and classification (pp. 411–451). Mahwah, NJ: Erlbaum.