Book contents
- Frontmatter
- Contents
- General Introduction
- PART I THE NATURE OF MACHINE ETHICS
- PART II THE IMPORTANCE OF MACHINE ETHICS
- Introduction
- 4 Why Machine Ethics?
- 5 Authenticity in the Age of Digital Companions
- PART III ISSUES CONCERNING MACHINE ETHICS
- PART IV APPROACHES TO MACHINE ETHICS
- PART V VISIONS FOR MACHINE ETHICS
- References
5 - Authenticity in the Age of Digital Companions
from PART II - THE IMPORTANCE OF MACHINE ETHICS
Published online by Cambridge University Press: 01 June 2011
Summary
With the advent of “thinking” machines, old philosophical questions about life and consciousness acquired new immediacy. Computationally rich software and, more recently, robots have challenged our values and caused us to ask new questions about ourselves (Turkle, 2005 [1984]). Are there some tasks, such as providing care and companionship, that only befit living creatures? Can a human being and a robot ever be said to perform the same task? In particular, how shall we assign value to what we have traditionally called relational authenticity? In their review of psychological benchmarks for human-robot interaction, Kahn et al. (2007) include authenticity as something robots can aspire to, but it is clear that from their perspective robots will be able to achieve it without sentience. Here, authenticity is situated on a more contested terrain.
Eliza and the crisis of authenticity
Joseph Weizenbaum's computer program Eliza brought some of these issues to the fore in the 1960s. Eliza prefigured an important element of the contemporary robotics culture in that it was one of the first programs that presented itself as a relational artifact, a computational object explicitly designed to engage a user in a relationship (Turkle, 2001, 2004; Turkle, Breazeal, Dasté, & Scassellati, 2006; Turkle, Taggart, Kidd, & Dasté, 2006). Eliza was designed to mirror users' thoughts and thus seemed consistently supportive, much like a Rogerian psychotherapist.
Machine Ethics, pp. 62-76. Publisher: Cambridge University Press. Print publication year: 2011.