
11 - Representing the populations: what general social surveys can learn from surveys among specific groups

Published online by Cambridge University Press:  05 September 2014

Ineke Stoop, The Netherlands Institute for Social Research/SCP
Roger Tourangeau, Westat Research Organisation, Maryland
Brad Edwards, Westat Research Organisation, Maryland
Timothy P. Johnson, University of Illinois, Chicago
Kirk M. Wolter, University of Chicago
Nancy Bates, US Census Bureau

Summary

The usual suspects: why don’t they participate?

Some groups are unlikely candidates for survey participation. However, even groups deemed very difficult may turn out not to be so difficult after all. People with learning disabilities – and their parents and carers – were much keener to answer questions about their own situation than expected (Stoop & Harrison, 2012a). Ethnic minorities may be hard to reach but, once reached by someone who speaks their language, may participate just as often as the majority population (Feskens, Hox, Lensvelt-Mulders, & Schmeets, 2007). Busy people may have little time to spare, but they are used to doing many things quickly and are therefore generally not underrepresented in burdensome time-use studies (Van Ingen, Stoop, & Breedveld, 2009).

When one tries to identify hard-to-reach respondents, it quickly becomes clear that being hard to reach is related not only to the sociodemographic and socioeconomic characteristics of the individuals under study but also, to a large extent, to the survey design. Survey mode, in particular, has been identified as an important factor in reaching different groups in the population. Geographically isolated groups may be easy to reach by telephone; mobile-only households may have no problem with a web-based survey; linguistic minorities may be quite happy to complete a questionnaire translated into their language; and the illiterate may enjoy a conversation with a face-to-face interviewer. Mixed-mode surveys aim to overcome the fact that different groups of people may be more or less easy to reach, and more or less willing to respond, in a particular mode (de Leeuw, 2008; de Leeuw, Dillman, & Hox, 2008; Dillman & Messer, 2010). More generally, securing participation by the hard to reach or hard to survey will depend on the design and topic of the survey and on the effort one is willing to make.

Type: Chapter
Publisher: Cambridge University Press
Print publication year: 2014


References

Anderson, T., & Stewart, J. B. (2007). Introduction to African American Studies: Transdisciplinary Approaches and Implications. Baltimore: Inprint.
Aspinall, P. J. (2009). Estimating the Size and Composition of the Lesbian, Gay, and Bisexual Population in Britain. Equality and Human Rights Commission, Research Report 37.
Blom, A. G. (2012). Explaining cross-country differences in survey contact rates: application of decomposition methods. Journal of the Royal Statistical Society: Series A (Statistics in Society), 175, Part 1, 217–42.
Blom, A., Lynn, P., & Jäckle, A. (2008). Understanding Cross-National Differences in Unit Non-Response: The Role of Contact Data. ISER Working Paper Series No. 2008–01, University of Essex.
Braun, M. (2003). Communication and social cognition. In Harkness, J. A., van de Vijver, F. J. R., & Mohler, P. Ph. (eds.), Cross-Cultural Survey Methods (pp. 57–67). Hoboken, NJ: John Wiley & Sons.
D’Arrigo, J., Durrant, G. B., & Steele, F. (2009). Using Field Process Data to Predict Best Times of Contact, Conditioning on Household and Interviewer Influences. Working Paper M09–12. University of Southampton.
Davis, R. E., Caldwell, C. H., Couper, M. P., Janz, N. K., Alexander, G. L., Greene, S. M., et al. (2012). Ethnic identity, questionnaire content, and the dilemma of race matching in surveys of African Americans by African American interviewers. Field Methods, 25(2), 142–61.
de Leeuw, E. D. (2005). To mix or not to mix data collection modes in surveys. Journal of Official Statistics, 21(2), 233–55.
de Leeuw, E. D. (2008). Choosing the method of data collection. In de Leeuw, E. D., Hox, J. J., & Dillman, D. A. (eds.), International Handbook of Survey Methodology (pp. 299–316). New York: Taylor & Francis Group/Lawrence Erlbaum Associates.
de Leeuw, E. D., Dillman, D. A., & Hox, J. J. (2008). Mixed mode surveys: when and why. In de Leeuw, E. D., Hox, J. J., & Dillman, D. A. (eds.), International Handbook of Survey Methodology (pp. 113–35). New York: Taylor & Francis Group/Lawrence Erlbaum Associates.
de Voogd, L. (2007). Ethnic and cultural surveys: what are the challenges in conducting surveys on ethnicity and religion? Research World, December 2007, 20–21.
Dept, S., Ferrari, A., & Wäyrynen, L. (2010). Developments in translation verification procedures in three multilingual assessments: a plea for an integrated translation and adaptation monitoring tool. In Harkness, J. A., Braun, M., Edwards, B., Johnson, T. P., Lyberg, L. E., Mohler, P. Ph., Pennell, B. E., & Smith, T. W. (eds.), Survey Methods in Multinational, Multiregional, and Multicultural Contexts (pp. 157–73). Hoboken, NJ: John Wiley & Sons.
Dillman, D. A. (2000). Mail and Internet Surveys: The Tailored Design Method (2nd edn.). New York: John Wiley & Sons.
Dillman, D. A., & Messer, B. L. (2010). Mixed-mode surveys. In Marsden, P. & Wright, J. (eds.), Handbook of Survey Research (2nd edn.) (pp. 551–74). Bingley: Emerald Group Publishing Ltd.
Ellison, G., & Gunstone, B. (2009). Sexual Orientation Explored: A Study of Identity, Attraction, Behaviour and Attitudes in 2009. Equality and Human Rights Commission, Research Report 35.
ESF (1999). The European Social Survey (ESS) – A Research Instrument for the Social Sciences in Europe. Report prepared for the Standing Committee for the Social Sciences (SCSS) of the European Science Foundation. Strasbourg: European Science Foundation.
European Social Survey (2002). Core Questionnaire Development. (downloaded August 8, 2011).
European Social Survey (2014). ESS6–2012 Survey Documentation Report. Edition 1.3. Bergen: European Social Survey Data Archive, Norwegian Social Science Data Services.
Feskens, R. C. W. (2009). Difficult groups in survey research and the development of tailor-made approach strategies. Dissertation, Universiteit Utrecht.
Feskens, R., Hox, J., Lensvelt-Mulders, G., & Schmeets, H. (2007). Nonresponse among ethnic minorities: a multivariate analysis. Journal of Official Statistics, 23(3), 387–408.
Fitzgerald, R., Widdop, S., Gray, M., & Collins, D. (2011). Identifying sources of error in cross-national questionnaires: application of an error source typology to cognitive interview data. Journal of Official Statistics, 27(4), 569–99.
Goyder, J. (1987). The Silent Minority: Nonrespondents on Sample Surveys. Cambridge: Polity Press.
Goyder, J., Boyer, L., & Martinelli, G. (2006). Integrating exchange and heuristic theories of survey nonresponse. Bulletin de Méthodologie Sociologique, 92, October 2006, 28–44.
Groves, R. M. (1989). Survey Errors and Survey Costs. New York: John Wiley & Sons.
Groves, R. M. (2006). Nonresponse rates and nonresponse bias in household surveys. Public Opinion Quarterly, 70(5), 646–75.
Groves, R. M., & Couper, M. P. (1998). Nonresponse in Household Interview Surveys. New York: John Wiley & Sons.
Groves, R. M., & McGonagle, K. A. (2001). A theory-guided interview training protocol regarding survey participation. Journal of Official Statistics, 17(2), 249–66.
Groves, R. M., Presser, S., & Dipko, S. (2004). The role of topic interest in survey participation decisions. Public Opinion Quarterly, 68, 2–31.
Groves, R. M., Singer, E., & Corning, A. (2000). Leverage-saliency theory of survey participation: description and an illustration. Public Opinion Quarterly, 64, 299–308.
Häder, S., & Lynn, P. (2007). How representative can a multi-nation survey be? In Jowell, R., Roberts, C., Fitzgerald, R., & Eva, G. (eds.), Measuring Attitudes Cross-Nationally: Lessons from the European Social Survey (pp. 33–52). London: Sage.
Harkness, J. A. (2003). Questionnaire translation. In Harkness, J. A., van de Vijver, F. J. R., & Mohler, P. Ph. (eds.), Cross-Cultural Survey Methods (pp. 35–56). Hoboken, NJ: John Wiley & Sons.
Harkness, J. A. (2007). Improving the comparability of translations. In Jowell, R., Roberts, C., Fitzgerald, R., & Eva, G. (eds.), Measuring Attitudes Cross-Nationally: Lessons from the European Social Survey (pp. 79–93). London: Sage.
Harkness, J. A. (2011). Questionnaire design. Cross-cultural survey guidelines. (downloaded October 15, 2012).
Harkness, J. A., Edwards, B., Hansen, S. E., Millar, D. R., & Villar, A. (2010). Designing questionnaires for multi-population research. In Harkness, J. A., Braun, M., Edwards, B., Johnson, T. P., Lyberg, L. E., Mohler, P. Ph., Pennell, B.-E., & Smith, T. W. (eds.), Survey Methods in Multinational, Multiregional, and Multicultural Contexts (pp. 33–58). Hoboken, NJ: John Wiley & Sons.
Harkness, J. A., Van de Vijver, F. J. R., & Johnson, T. P. (2003). Questionnaire design in comparative research. In Harkness, J. A., van de Vijver, F. J. R., & Mohler, P. Ph. (eds.), Cross-Cultural Survey Methods (pp. 19–34). Hoboken, NJ: John Wiley & Sons.
Hu, S. S., Link, M. W., & Mokdad, A. H. (2010). Reaching linguistically isolated people: findings from a telephone survey using real-time interpreters. Field Methods, 22(1), 39–56.
Irving Fisher Committee on Central Bank Statistics. (2008). STCPM31: Accounting for the very rich in household surveys of income and wealth. IFC Bulletin No. 28. IFC’s contribution to the 56th ISI Session, Lisbon, 2007. Basel, Switzerland: Bank for International Settlements, 399–431. (downloaded October 15, 2012).
Jäckle, A., Roberts, C., & Lynn, P. (2008). Assessing the effect of data collection mode on measurement. International Statistical Review, 78(1), 3–20.
Johnston, L. G., & Sabin, K. (2010). Sampling hard-to-reach populations with respondent-driven sampling. Methodological Innovations Online, 5(2), 38–48.
Jowell, R., Kaase, M., Fitzgerald, R., & Eva, G. (2007). The European Social Survey as a measurement model. In Jowell, R., Roberts, C., Fitzgerald, R., & Eva, G. (eds.), Measuring Attitudes Cross-Nationally: Lessons from the European Social Survey (pp. 1–31). London: Sage.
Jowell, R., Roberts, C., Fitzgerald, R., & Eva, G. (eds.) (2007). Measuring Attitudes Cross-Nationally: Lessons from the European Social Survey. London: Sage.
Koch, A., Blom, A. G., Stoop, I., & Kappelhof, J. (2009). Data collection quality assurance in cross-national surveys: the example of the ESS. Methoden Daten Analysen. Zeitschrift für Empirische Sozialforschung, 3(2), 219–47.
Koch, A., Fitzgerald, R., Stoop, I., & Widdop, S. (2010). Field Procedures in the European Social Survey Round 5: Enhancing Response Rates. Mannheim: European Social Survey, GESIS.
Kreuter, F., & Kohler, U. (2009). Analyzing contact sequences in call record data: potential and limitations of sequence indicators for nonresponse adjustments in the European Social Survey. Journal of Official Statistics, 25(2), 203–26.
Loosveldt, G. (2008). Face-to-face interviews. In de Leeuw, E. D., Hox, J. J., & Dillman, D. A. (eds.), International Handbook of Survey Methodology (pp. 201–20). New York: Taylor & Francis Group/Lawrence Erlbaum Associates.
Matsuo, H., Billiet, J., Loosveldt, G., & Malnar, B. (2010). Response-based Quality Assessment of ESS Round 4: Results for 30 Countries Based on Contact Files. Onderzoeksverslag Centrum voor Sociologisch Onderzoek, CeSO/SM/2010-2.
Obeid, S., & Baron-Epel, O. (2012). The Social Capital of the Arab Community in Israel. Paper presented at the International Conference on European Social Survey, Nicosia, Cyprus, November 2012.
Purdon, S., Campanelli, P., & Sturgis, P. (1999). Interviewers’ calling strategies on face-to-face interview surveys. Journal of Official Statistics, 15(2), 199–216.
Said, E. W. (1979). Orientalism. New York: Vintage Books. 25th Anniversary Edition (2003).
Saris, W. E., & Gallhofer, I. N. (2007a). Design, Evaluation, and Analysis of Questionnaires for Survey Research. Wiley Series in Survey Methodology. New York: John Wiley & Sons.
Saris, W. E., & Gallhofer, I. N. (2007b). Can questions travel successfully? In Jowell, R., Roberts, C., Fitzgerald, R., & Eva, G. (eds.), Measuring Attitudes Cross-Nationally: Lessons from the European Social Survey (pp. 53–77). London: Sage.
Schaeffer, N. C., Dykema, J., & Maynard, D. W. (2010). Interviewers and interviewing. In Marsden, P. & Wright, J. (eds.), Handbook of Survey Research (2nd edn.) (pp. 37–70). Bingley: Emerald Group Publishing Ltd.
Simon, P. (2007). Ethnic Statistics and Data Protection in the Council of Europe Countries. Study Report. Strasbourg: European Commission against Racism and Intolerance (ECRI).
Singer, E. (2011). Towards a cost-benefit theory of survey participation: evidence, further test, and implications. Journal of Official Statistics, 27(2), 379–92.
Smith, T. M. (2003). Developing comparable questions in cross-national surveys. In Harkness, J. A., van de Vijver, F. J. R., & Mohler, P. Ph. (eds.), Cross-Cultural Survey Methods (pp. 69–91). Hoboken, NJ: John Wiley & Sons.
Stoop, I. (2005). The Hunt for the Last Respondent. The Hague: Social and Cultural Planning Office.
Stoop, I. (2012). Unit non-response due to refusal. In Gideon, L. (ed.), Handbook of Survey Methodology for the Social Sciences (pp. 121–47). Heidelberg: Springer.
Stoop, I., Billiet, J., Koch, A., & Fitzgerald, R. (2010). Improving Survey Response: Lessons Learned from the European Social Survey. Chichester: John Wiley & Sons.
Stoop, I., Devacht, S., Billiet, J., Loosveldt, G., & Philippens, M. (2003). The Development of a Uniform Contact Description Form in the ESS. Paper presented at the 14th International Workshop on Household Survey Nonresponse, Leuven, September 2003.
Stoop, I., & Harrison, E. (2012a). Classification of surveys. In Gideon, L. (ed.), Handbook of Survey Methodology for the Social Sciences (pp. 7–21). Heidelberg: Springer.
Stoop, I., & Harrison, E. (2012b). Repeated cross-sectional surveys using F2F. In Gideon, L. (ed.), Handbook of Survey Methodology for the Social Sciences (pp. 249–76). Heidelberg: Springer.
Tourangeau, R., & Smith, T. W. (1996). Asking sensitive questions: the impact of data collection mode, question format, and question context. Public Opinion Quarterly, 60(2), 275–304.
Van Ingen, E., Stoop, I., & Breedveld, K. (2009). Nonresponse in the Dutch time use survey: strategies for response enhancement and bias reduction. Field Methods, 21(1), 69–90.
West, B., & Olson, K. (2010). How much of interviewer variance is really nonresponse error variance? Public Opinion Quarterly, 74(5), 1004–26.
