18 - Online Research Methods

from Part III - Data Collection

Published online by Cambridge University Press: 25 May 2023

Edited by
Austin Lee Nichols, Central European University, Vienna
John Edlund, Rochester Institute of Technology, New York

Summary

This chapter examines four prominent online research methods – online surveys, online experiments, online content analysis, and qualitative approaches – together with the issues and best practices that scholars across disciplines have identified for each. It also identifies several platforms for conducting online research, including online survey and experimental design platforms, online content capture programs, and related quantitative and qualitative data analysis tools. The advantages of each method (e.g., savings in time and cost) and its disadvantages (e.g., sampling, validity, privacy, and ethical issues) are then discussed, along with best practices for applying the method in online research.
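
The online content analysis mentioned above lends itself to a brief illustration. The following is a minimal Python sketch, not drawn from the chapter, of one common step in quantitative content analysis of captured online posts: applying a keyword-based coding scheme and computing a simple percent-agreement check between two coders. The categories, keywords, and example posts are hypothetical placeholders.

# A minimal, illustrative sketch (not from the chapter) of one step in a
# quantitative online content analysis: applying a keyword-based coding
# scheme to collected posts and checking simple intercoder agreement.
# The categories, keywords, and example posts below are hypothetical.

from collections import Counter

# Hypothetical coding scheme: category -> indicator keywords
CODING_SCHEME = {
    "emotional_support": {"sorry", "hugs", "thinking of you"},
    "informational_support": {"doctor", "treatment", "dosage"},
}

def code_post(text: str) -> set[str]:
    """Return the set of categories whose keywords appear in the post."""
    lowered = text.lower()
    return {
        category
        for category, keywords in CODING_SCHEME.items()
        if any(keyword in lowered for keyword in keywords)
    }

def category_frequencies(posts: list[str]) -> Counter:
    """Count how many posts fall into each category (a post may hit several)."""
    counts = Counter()
    for post in posts:
        counts.update(code_post(post))
    return counts

def percent_agreement(coder_a: list[set[str]], coder_b: list[set[str]]) -> float:
    """Share of posts on which two coders assigned identical category sets."""
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return matches / len(coder_a) if coder_a else 0.0

if __name__ == "__main__":
    # Hypothetical posts captured from an online support community
    posts = [
        "So sorry to hear this, sending hugs.",
        "My doctor changed the dosage and the treatment helped.",
        "Hang in there everyone.",
    ]
    print(category_frequencies(posts))

    # In practice a second human coder would replace this automatic pass
    auto_codes = [code_post(p) for p in posts]
    human_codes = [{"emotional_support"}, {"informational_support"}, set()]
    print(f"Percent agreement: {percent_agreement(auto_codes, human_codes):.2f}")

In an actual study, trained human coders and a chance-corrected reliability statistic (e.g., Krippendorff's alpha) would typically replace the keyword pass and the raw percent agreement shown here.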

Type
Chapter
Information
Book: The Cambridge Handbook of Research Methods and Statistics for the Social and Behavioral Sciences
Publisher: Cambridge University Press
Print publication year: 2023
Chapter DOI: https://doi.org/10.1017/9781009010054.019

