
10 - Survey Research

from Part III - Deep Dives on Methods and Tools for Testing Your Question of Interest

Published online by Cambridge University Press:  12 December 2024

Edited by Harry T. Reis (University of Rochester, New York), Tessa West (New York University), and Charles M. Judd (University of Colorado Boulder)

Summary

Survey research is a method commonly used to understand what members of a population think, feel, and do. This chapter uses the total survey error perspective and the fitness for use perspective to explore how biasing and variable errors occur in surveys. Coverage error and sample frames, nonprobability samples and web panels, sampling error, nonresponse rates and nonresponse bias, and sources of measurement error are discussed. Different pretesting methods and modes of data collection commonly used in surveys are described. The chapter concludes that survey research is a tool that social psychologists may use to improve the generalizability of studies, to evaluate how different populations react to different experimental conditions, and to understand patterns in outcomes that may vary over time, place, or people.

Type: Chapter
Publisher: Cambridge University Press
Print publication year: 2024



  • Book: Handbook of Research Methods in Social and Personality Psychology
  • Online publication: 12 December 2024
  • Chapter DOI: https://doi.org/10.1017/9781009170123.011
