
Latent Semantic Analysis (LSA), a disembodied learning machine, acquires human word meaning vicariously from language alone

Published online by Cambridge University Press:  01 August 1999

Thomas K. Landauer
Affiliation:
Institute of Cognitive Science, University of Colorado at Boulder, Boulder, CO 80309 [email protected] psych-www.colorado.edu/faculty/landauer.html

Abstract

The hypothesis that perceptual mechanisms could have more representational and logical power than is usually assumed is interesting and provocative, especially with regard to brain evolution. However, the importance of embodiment and grounding is exaggerated, and the implication that there is no highly abstract representation at all, and that human-like knowledge cannot be learned or represented without human bodies, is very doubtful. A machine-learning model, Latent Semantic Analysis (LSA), which closely mimics human word and passage meaning relations, is offered as a counterexample.
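The core mechanism behind LSA can be sketched briefly: build a term-by-passage co-occurrence matrix from raw text, apply a truncated singular value decomposition to project words into a low-dimensional latent space, and compare words by cosine similarity there. The toy corpus and dimensionality below are illustrative assumptions, not Landauer's actual training data; real LSA also applies a log-entropy weighting before the SVD, omitted here for brevity.

```python
import numpy as np

# Toy corpus standing in for the large text collections LSA is trained on.
passages = [
    "human machine interface",
    "human computer interaction",
    "machine learning interface",
    "tree graph minors",
    "graph minors survey",
]

# Term-by-passage count matrix (rows = words, columns = passages).
vocab = sorted({w for p in passages for w in p.split()})
X = np.array([[p.split().count(w) for p in passages] for w in vocab],
             dtype=float)

# LSA's key step: truncated SVD keeps only the k strongest latent dimensions,
# so words are related through shared contexts, not just direct co-occurrence.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 2  # illustrative choice; LSA typically uses a few hundred dimensions
word_vecs = U[:, :k] * s[:k]

def cosine(a, b):
    """Similarity of two word vectors in the latent space."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

sim = cosine(word_vecs[vocab.index("graph")],
             word_vecs[vocab.index("minors")])
print(sim)  # ≈ 1.0 here: the two words occur in exactly the same passages
```

Passage meanings are handled the same way: a passage vector is the (weighted) sum of its word vectors, which is how LSA mimics human judgments of text similarity without any perceptual grounding.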

Type: Open Peer Commentary
Copyright: © 1999 Cambridge University Press
