A project by the MIT Media Lab. The Moral Machine is a platform for gathering a human perspective on moral decisions made by machine intelligence, such as self-driving cars. We generate moral dilemmas in which a driverless car must choose the lesser of two evils, such as killing two passengers or five pedestrians. As outside observers, people judge which outcome they think is more acceptable and can then see how their responses compare with those of other people. If they are feeling creative, people can also design their own scenarios for others to view, share, and discuss. [ . . . ]
- Thesis by Edmond Awad: Moral Machines: Perception of Moral Judgment Made by Machines
- Article in Slate: Should a Self-Driving Car Kill Two Jaywalkers or One Law-Abiding Citizen?