Book contents
- Frontmatter
- Contents
- General Introduction
- PART I THE NATURE OF MACHINE ETHICS
- PART II THE IMPORTANCE OF MACHINE ETHICS
- PART III ISSUES CONCERNING MACHINE ETHICS
- Introduction
- 6 What Matters to a Machine?
- 7 Machine Ethics and the Idea of a More-Than-Human Moral World
- 8 On Computable Morality
- 9 When Is a Robot a Moral Agent?
- 10 Philosophical Concerns with Machine Ethics
- 11 Computer Systems
- 12 On the Morality of Artificial Agents
- 13 Legal Rights for Machines
- PART IV APPROACHES TO MACHINE ETHICS
- PART V VISIONS FOR MACHINE ETHICS
Introduction
from PART III - ISSUES CONCERNING MACHINE ETHICS
Published online by Cambridge University Press: 01 June 2011
Summary
Several of the authors in this part raise doubts about whether machines are capable of making ethical decisions, doubts that would seem to thwart the entire project of attempting to create ethical machines. Drew McDermott, for instance, in "What Matters to a Machine?" characterizes ethical dilemmas in such a way that machines would seem incapable of experiencing them, and therefore incapable of acting ethically. He takes as the paradigm of an ethical dilemma a situation of moral temptation, in which one knows what the morally correct action is but one's self-interest (or the interest of someone one cares about) inclines one to do something else. He claims that "the idiosyncratic architecture of the human brain is responsible for our ethical dilemmas and our regrets about the decisions we make," and that this architecture is virtually impossible to automate. As a result, he thinks it extremely unlikely that we could create machines complex enough to act morally or immorally.
Critics will maintain that McDermott has defined "ethical dilemma" in a way that few ethicists would accept. (See S. L. Anderson's article in this part.) Typically, an ethical dilemma is understood as a situation in which several courses of action are possible and one is not sure which of them is correct, not one in which one knows the correct action but is disinclined to perform it.
Machine Ethics, pp. 79-87. Publisher: Cambridge University Press. Print publication year: 2011.