Book by Sven Nyholm.
Published by Rowman & Littlefield Publishers.
Can robots perform actions, make decisions, collaborate with humans, be our friends, perhaps fall in love, or potentially harm us? Even before these things truly happen, ethical and philosophical questions already arise. The reason is that we humans have a tendency to spontaneously attribute minds and “agency” to anything even remotely humanlike. Moreover, some people already say that robots should be our companions and have rights; others say that robots should be slaves. This book tackles emerging ethical issues about human beings, robots, and agency head-on. It explores the ethics of creating robots that are, or appear to be, decision-making agents. From military robots and self-driving cars to care robots and even sex robots equipped with artificial intelligence: how should we interpret the apparent agency of such robots? This book argues that we need to explore how human beings can best coordinate and collaborate with robots in responsible ways. It investigates ethically important differences between human agency and robot agency in order to work towards an ethics of responsible human-robot interaction.
Table of Contents
- Artificial Agency, Human Responsibility: An “Existential” Problem
- Human-Robot Collaborations and Responsibility-Loci
- Moral Machines? Robotic Agents and the Traditional Moral Theories
- The Ethics of Human-Robot Coordination
- Human-Robot Relationships: Could There Be Friendship or Mutual Love between Humans and Robots?
- Is It Wrong to Create Robots That Humans Will Become Emotionally Attached To?
- Robot Rights: Should Robots Be “Slaves”?
About the Author
- Sven Nyholm’s 2012 dissertation (from the University of Michigan) won the ProQuest Distinguished Dissertation Award and was published in book form by De Gruyter. Nyholm’s articles have appeared in general philosophy journals, ethics journals, philosophy of technology journals, and bioethics journals. He writes about ethical theory, human self-understanding, and emerging technologies.