
15 - Self-Report Measures

from Part III - Self-Report Measures

Published online by Cambridge University Press:  12 December 2024

Edited by John E. Edlund, Rochester Institute of Technology, New York
Austin Lee Nichols, Central European University, Vienna

Summary

Self-report measures are questions that respondents answer about themselves. They are essential to researchers and policy-makers, offering a direct window into what people know, what they do, and how they think about an issue, a person, or an event. This chapter begins with an overview of how people go about answering survey questions. To answer a survey question, respondents must first understand what it asks. Next, they retrieve the relevant information and integrate it into an estimate or judgment. Finally, they map that estimate or judgment onto one of the response options provided. At each stage of this response process, respondents can run into problems that reduce the accuracy and completeness of their answers. The chapter then discusses how the context in which a survey item is asked and the mode of data collection affect self-report measures, and concludes with recommendations for improving the quality of self-report measures.

Type: Chapter
Publisher: Cambridge University Press
Print publication year: 2024



  • Self-Report Measures
  • Edited by John E. Edlund, Rochester Institute of Technology, New York, Austin Lee Nichols, Central European University, Vienna
  • Book: The Cambridge Handbook of Research Methods and Statistics for the Social and Behavioral Sciences
  • Online publication: 12 December 2024
  • Chapter DOI: https://doi.org/10.1017/9781009000796.016