
Neurophysiological evidence for visual perceptual categorization of words and faces within 150 ms

Published online by Cambridge University Press:  01 May 1998

HALINE E. SCHENDAN
Affiliation:
Program in Neurosciences, University of California at San Diego, La Jolla, USA; Interdisciplinary Program in Cognitive Science, University of California at San Diego, La Jolla, USA
GIORGIO GANIS
Affiliation:
Department of Cognitive Science, University of California at San Diego, La Jolla, USA
MARTA KUTAS
Affiliation:
Program in Neurosciences, University of California at San Diego, La Jolla, USA; Interdisciplinary Program in Cognitive Science, University of California at San Diego, La Jolla, USA; Department of Cognitive Science, University of California at San Diego, La Jolla, USA

Abstract

The nature and early time course of the initial processing differences between visually matched linguistic and nonlinguistic images were studied with event-related potentials (ERPs). The first effect began at 90 ms, when ERPs to written words diverged from those to other objects, including faces. By 125 ms, ERPs to words and faces were more positive than those to other objects, effects identified with the P150. The amplitude and scalp distribution of P150s to words and faces were similar. The P150 seemed to be elicited selectively by images resembling any well-learned category of visual patterns. We propose that (a) visual perceptual categorization based on long-term experience begins by 125 ms, (b) P150 amplitude varies with the cumulative experience people have discriminating among instances of specific categories of visual objects (e.g., words, faces), and (c) the P150 is a scalp reflection of letterstring and face intracranial ERPs in the posterior fusiform gyrus.

Type
Research Article
Copyright
© 1998 Society for Psychophysiological Research
