What would you do?
The brakes on your car have been sabotaged and you are racing down the road toward a crowd of pedestrians. If you do nothing, the car will stay on its course and kill five people. If you sharply turn the steering wheel, the crowd will be saved, but someone else on the side of the road will be killed.
That hypothetical situation, known as a “moral dilemma,” is the kind of vexing ethical question that Florida State University scholars used in a new study published in the journal Cognition.
Nick Byrd, a doctoral candidate in the Department of Philosophy, and Paul Conway, assistant professor in the Department of Psychology, collaborated on the research.
Their findings clarified the psychology of making moral decisions.
“We are interested in understanding what causes individual differences in moral judgments,” Byrd said. “If you and I respond differently to the same moral dilemma, we want to understand why because that can help us understand the psychology of moral judgment and more generally, how morality works.”
The FSU research challenges some fundamental stereotypes about two responses people may have when confronted with moral dilemmas. They might respond analytically, which is often viewed as reflective. Or they might have an intuitive or emotional reaction, which is usually considered more impulsive. It’s the classic contrast between making choices with the gut or with the brain.
Previous ethical reasoning research suggested that analytical people tended to determine right or wrong based on consequences — that’s an ethical theory called utilitarianism. Utilitarians focus on the end result of their choices, and the outcome is more important than existing rules or orders.
In the brake-failure example, choosing to steer the car toward one person to save five people would be utilitarian because, mathematically, the death of one person would be deemed a better consequence than the deaths of five people.
Conversely, allowing the five people to die fits with deontological philosophy, typically associated with intuitive people. They tend to follow strict moral rules, such as “Thou Shalt Not Kill,” so intentionally killing one person, even to save others, would break that rule.
Previous research treated deontological responses as the polar opposite of utilitarianism. Byrd and Conway’s research showed the differences were not so black and white.
When they presented new moral dilemmas to participants of the study, the findings confirmed that analytic utilitarian responses correlated with careful reasoning. However, the FSU research challenged previous studies suggesting harm-averse deontological thinkers were solely emotional and impulsive.
“They seem to be quite analytical and reflective but in a different way, not a mathematical way like utilitarians,” Byrd said. “Our research suggests that deontological thinkers might reflect on the logical effects of a moral rule, while utilitarians might reflect on the costs and benefits of consequences. Both mindsets can involve careful reflection, albeit in different ways.”
Conway echoed that point, emphasizing that deontological judgments can include emotional impulses, but that the FSU research demonstrated those choices are not ruled by emotion.
“These findings show there are logical reasons to refuse to directly cause harm, as people guided by a deontological perspective might respond,” Conway said. “They believe moral rules exist for a reason, so giving people the freedom to break moral rules might undermine society.”
Ultimately, Byrd hopes the FSU research raises awareness of the nuances between deontological and utilitarian thinkers and dispels stereotypes about different responses to challenging situations.