Cass R. Sunstein
With respect to questions of fact, people use heuristics—mental short-cuts, or rules of thumb, that generally work well, but that also lead to systematic errors. People use moral heuristics too—moral short-cuts, or rules of thumb, that lead to mistaken and even absurd moral judgments. These judgments are highly relevant to law and politics. Examples are given from a number of domains, with an emphasis on appropriate punishment. Moral framing effects are discussed as well.
JOSHUA D. GREENE
ABSTRACT This article reviews recent advances in the cognitive neuroscience of moral judgment and behavior. This field is conceived not as the study of a distinct set of neural functions, but as an attempt to understand how the brain’s core neural systems coordinate to solve problems that we define, for non-neuroscientific reasons, as “moral.” These systems enable the representation of value, cognitive control, the imagination of distal events, and the representation of mental states. Research examines the brains of morally pathological individuals, the responses of healthy brains to prototypically immoral actions, and the brain’s responses to more complex moral problems such as philosophical and economic dilemmas.
The fate of industrially farmed animals is one of the most pressing ethical questions of our time. Tens of billions of sentient beings, each with complex sensations and emotions, live and die on a production line…
This is the 12th in a series of interviews with philosophers on race that I am conducting for The Stone. This week’s conversation is with Peter Singer, a professor of bioethics at Princeton University. He is the author of numerous books, including, most recently, “The Most Good You Can Do.” — George Yancy
Philosophers have been gnawing on the infamous Trolley Problem for decades, and it has always been a purely intellectual exercise with no “right” answer. But we are suddenly in a world in which autonomous machines, including self-driving cars, have to be programmed to deal with Trolley Problem-like emergencies in which lives hang in the balance. There’s no dodging the issue: the programmers have to decide how machines should behave in crunch time (as it were).