
9 - Participant Recruitment

from Part II - The Building Blocks of a Study

Published online by Cambridge University Press:  25 May 2023

Edited by Austin Lee Nichols (Central European University, Vienna) and John Edlund (Rochester Institute of Technology, New York)

Summary

A strong participant recruitment plan is a major determinant of the success of human subjects research. The plan researchers adopt determines the kinds of inferences that can be drawn from the collected data and how much the data will cost to collect. Studies with weak or non-existent recruitment plans risk recruiting too few participants, or the wrong kind of participants, to answer the question that motivated them. This chapter outlines key considerations for researchers developing recruitment plans and offers suggestions for making recruitment more efficient.
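The risk of recruiting too few participants can be made concrete with back-of-envelope planning arithmetic. The sketch below is not from the chapter itself; the function name and the example rates are illustrative assumptions. It estimates how many invitations a study would need, given a target number of completed responses and assumed response and eligibility rates:

```python
import math

def invitations_needed(target_completes: int,
                       response_rate: float,
                       eligibility_rate: float = 1.0) -> int:
    """Estimate invitations required to reach a target number of completes.

    Hypothetical helper: divides the target by the product of the assumed
    response rate and eligibility rate, rounding up to a whole person.
    """
    if not (0.0 < response_rate <= 1.0 and 0.0 < eligibility_rate <= 1.0):
        raise ValueError("rates must be in (0, 1]")
    return math.ceil(target_completes / (response_rate * eligibility_rate))

# e.g. 400 completes, assuming 20% respond and 80% of contacts are eligible
print(invitations_needed(400, 0.20, 0.80))  # → 2500
```

Such an estimate is only as good as the assumed rates, which is why the chapter's emphasis on planning matters: an optimistic response-rate assumption silently translates into an underpowered sample.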

Type: Chapter
Publisher: Cambridge University Press
Print publication year: 2023



  • Participant Recruitment
  • Edited by Austin Lee Nichols, Central European University, Vienna, John Edlund, Rochester Institute of Technology, New York
  • Book: The Cambridge Handbook of Research Methods and Statistics for the Social and Behavioral Sciences
  • Online publication: 25 May 2023
  • Chapter DOI: https://doi.org/10.1017/9781009010054.010