Don't admit defeat: A new dawn for the item in visual search
Published online by Cambridge University Press: 24 May 2017
Abstract
Even though we lack a precise definition of “item,” it is clear that people do parse their visual environment into objects (the real-world equivalent of items). We will review evidence that items are essential in visual search, and argue that computer vision – especially deep learning – may offer a solution for the lack of a solid definition of “item.”
Type: Open Peer Commentary
Copyright: © Cambridge University Press 2017
Target article
The impending demise of the item in visual search
Related commentaries (30)
An appeal against the item's death sentence: Accounting for diagnostic data patterns with an item-based model of visual search
Analysing real-world visual search tasks helps explain what the functional visual field is, and what its neural mechanisms are
Chances and challenges for an active visual search perspective
Cognitive architecture enables comprehensive predictive models of visual search
Contextual and social cues may dominate natural visual search
Don't admit defeat: A new dawn for the item in visual search
Eye movements are an important part of the story, but not the whole story
Feature integration, attention, and fixations during visual search
Fixations are not all created equal: An objection to mindless visual search
Gaze-contingent manipulation of the FVF demonstrates the importance of fixation duration for explaining search behavior
How functional are functional viewing fields?
Item-based selection is in good shape in visual compound search: A view from electrophysiology
Looking further! The importance of embedding visual search in action
Mathematical fixation: Search viewed through a cognitive lens
Oh, the number of things you will process (in parallel)!
Parallel attentive processing and pre-attentive guidance
Scanning movements during haptic search: similarity with fixations during visual search
Searching for unity: Real-world versus item-based visual search in age-related eye disease
Set size slope still does not distinguish parallel from serial search
Task implementation and top-down control in continuous search
The FVF framework and target prevalence effects
The FVF might be influenced by object-based attention
The “item” as a window into how prior knowledge guides visual search
Those pernicious items
Until the demise of the functional field of view
What fixations reveal about oculomotor scanning behavior in visual search
Where the item still rules supreme: Time-based selection, enumeration, pre-attentive processing and the target template?
Why the item will remain the unit of attentional selection in visual search
“I am not dead yet!” – The Item responds to Hulleman & Olivers
“Target-absent” decisions in cancer nodule detection are more efficient than “target-present” decisions!
Author response
On the brink: The demise of the item in visual search moves closer