
18 - Eyetracking Research

from Part IV - Behavioral Measures

Published online by Cambridge University Press: 12 December 2024

John E. Edlund (Rochester Institute of Technology, New York)
Austin Lee Nichols (Central European University, Vienna)

Summary

This chapter describes the use of eyetracking as an advanced research tool in the social and behavioral sciences. It covers the relationship between eye movements and behavior, the basic anatomy of the eye and its movements, and the different kinds of eyetrackers that can be used to capture a range of behaviors. It also explains how to select an eyetracker and how to obtain good-quality data. Because data quality always affects the final result, the chapter explains how the accuracy and precision of gaze data affect behavioral analysis and gives examples of real-world applications of eyetracking in social and behavioral research. The potential of eyetracking is vast: it provides a window into visual perception, which is likely to make it an increasingly important tool in the years to come.
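
The summary highlights accuracy and precision as the two key gaze-data quality metrics. The sketch below, which is not taken from the chapter, illustrates one common way these quantities are computed for a single validation fixation: accuracy as the mean angular offset of gaze samples from a known target, and precision as the root-mean-square of sample-to-sample distances. The function name gaze_quality, the simulated samples, and the assumption that gaze coordinates are already expressed in degrees of visual angle are all hypothetical.

```python
# Minimal illustrative sketch (not from the chapter): estimating the
# accuracy and precision of gaze data recorded while a participant
# fixates a known validation target. Assumes gaze coordinates are
# already expressed in degrees of visual angle (small-angle planar
# approximation); names and sample values are hypothetical.
import numpy as np


def gaze_quality(gaze_x, gaze_y, target_x, target_y):
    """Return (accuracy, precision) for one validation fixation.

    accuracy  -- mean angular offset of the samples from the target (deg)
    precision -- RMS of sample-to-sample angular distances (deg)
    """
    gaze_x, gaze_y = np.asarray(gaze_x), np.asarray(gaze_y)

    # Accuracy: how far, on average, the reported gaze lands from the target.
    offsets = np.hypot(gaze_x - target_x, gaze_y - target_y)
    accuracy = offsets.mean()

    # Precision: how much consecutive samples scatter around one another,
    # independent of any systematic offset.
    dx, dy = np.diff(gaze_x), np.diff(gaze_y)
    precision = np.sqrt(np.mean(dx ** 2 + dy ** 2))

    return accuracy, precision


# Simulated 120-sample fixation on a target at (5.0, 0.0) deg, with a
# 0.3 deg systematic offset and 0.1 deg of sample noise.
rng = np.random.default_rng(seed=0)
gx = 5.0 + 0.3 + rng.normal(0.0, 0.1, 120)
gy = 0.0 + rng.normal(0.0, 0.1, 120)

acc, prec = gaze_quality(gx, gy, 5.0, 0.0)
print(f"accuracy ~ {acc:.2f} deg, precision (RMS-S2S) ~ {prec:.2f} deg")
```

In this formulation, a calibration bias inflates accuracy but leaves precision untouched, which is why the two measures are reported separately when characterizing an eyetracker.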

Type: Chapter
Publisher: Cambridge University Press
Print publication year: 2024


