Teetering On The Footbridge

Tuesday, June 27, 2006

By Wray Herbert

Imagine that you are the operator of a San Francisco cable car. One day, the car’s brakes go out, and you’re careering down Powell Street at an alarming speed. Ahead you see five students crossing the track on their way home from class. There is no way to stop the car or warn the students. The only way to avoid killing all five is to throw a switch and turn onto another track. But if you do that, you will run over and kill another student who is straggling behind the group. What do you do?

This is a slightly embellished version of what philosophers call the “trolley dilemma,” which is used to explore how people reason about morally ambiguous situations. The scenario is often paired with another, the so-called “footbridge dilemma.” In this case, a runaway trolley is again heading toward five innocent victims. But you’re no longer the driver. You and a fat man are standing on a footbridge overlooking the track, and you realize that the only way to spare the five students is to push the fat man off the bridge, onto the track below. Push or no push?

Never mind that even a very fat man would probably not stop a runaway trolley car. That’s not the point. Focus on the two dilemmas, which are fundamentally the same. In each, you can sacrifice one life to save five. Yet people react very differently to the two situations. People automatically see the logic in the trolley dilemma, and almost all opt for the utilitarian solution. But given the footbridge dilemma, most are morally repulsed by the idea of pushing the fat man off the bridge. They won’t do it. This seeming inconsistency has baffled both philosophers and psychologists for years.

Why does the human brain process these two dilemmas so differently? Why does our reason fail us on the footbridge? Northeastern University psychologists Piercarlo Valdesolo and David DeSteno are among the scientists who have been studying moral judgments in the laboratory, and they are coming to believe that moral reasoning is not as, well, reasonable as we like to think. Indeed, what we do in the name of morality may be more emotional than rational. On this view, humans operate according to certain “rules of thumb.” These are automatic, knee-jerk assessments, and they are very powerful, requiring a lot of mental work to overcome. Much of the time they are helpful in routine, everyday matters, but we also fall back on them in situations of uncertainty—or moral ambiguity. And they sometimes fool the more rational mind.

That’s what happens on the footbridge, say Valdesolo and DeSteno. Apparently one rule of thumb, emotionally powerful, says we don’t push people off bridges. Perhaps it’s the tactile nature of the act that makes it seem more like murder than saving lives. Whatever the source of the feeling, it’s strong enough to prevent what’s arguably the more reasonable (and moral) action: Keeping five students from perishing. There is experimental evidence for this: The rare few who do opt to sacrifice the fat man clearly struggle with the choice. They take much longer to decide, as if they had to free themselves from the tug of the quicker intuitive impulse.

Valdesolo and DeSteno wondered: If our emotions are so influential in our moral judgments, might it be possible to sway people’s choices by manipulating their emotions? The short answer, as they report in the June issue of Psychological Science, is yes. The scientists presented research subjects with the two classic dilemmas, but before they did, they primed the subjects’ emotions with completely irrelevant material. One group watched a video clip of a Saturday Night Live skit, while another watched part of a short documentary about a Spanish village.

As funny as the Spanish village was, it was no competition for the Not-Ready-For-Prime-Time Players, so the first group headed into the dilemmas feeling much more upbeat. And this uplifted mood trumped the negative feelings tied to pushing the fat man off the bridge. The participants were more likely to choose the practical, logical course of action on the footbridge, and what’s more, the longer they took, the more likely they were to choose the greatest good for the greatest number. The mood manipulation did not affect choices in the trolley dilemma, which makes sense, since that scenario was not as ambiguous to begin with.

None of this answers the fundamental question: Are you a better person if you murder one person to spare five? That’s a question for the philosophers. But you probably are a more humble person now, knowing just how easily your most profound judgments and actions can be shaped by others.

For more insights into human nature, visit the Association for Psychological Science website at www.psychologicalscience.org/onlyhuman.


posted by Wray Herbert @ 10:48 AM

5 Comments:

At 5:34 PM, Blogger ninaangel75 said...

These are excellent scenarios to consider when determining your own moral judgment. I chose to push the fat man for the greater good. It still makes me wonder, though, if I really were in such a situation, would I decide the same way? Is this really a knee-jerk reaction for me? I'm not so sure it would be.

At 7:34 PM, Blogger I n g e r said...

I heard on RadioLab from a researcher at Princeton, I think--one of your guys?--who worked with equipment that could photograph the brain and tell us which parts of it were firing at the precise moment of decision. He posed this exact dilemma to subjects: flip the switch or push the fat guy.

Turns out that there are two distinct parts of the brain firing off in each of those decisions. There is the part that we get from our primate past: don't kill the guy beside you. (We are genetically programmed to not kill the guy beside us.) But there's another part of our brain that fires off when we try to calculate a matter of greater good, like killing one to save five. That's new: that's pure human. If you can actually look at the brain when we contemplate the dilemma--flip the switch or push the fat guy--it looks like a battlefield, firing off all over the place, primitive instinct warring against reason. Sophie's Choice.

I love this stuff.

At 1:16 PM, Blogger Maggie said...

The other aspect of this that hasn't been considered at all is to throw oneself off the footbridge to save the lives of six people, if you count the fat man. But this definitely goes against our primal instincts of survival, and is a more (most?) human decision. I do agree with Michelle that the easiest decision is to let fate take its course and not intercede at all.

At 9:38 AM, Blogger Phil said...

I agree with the idea that pushing a lever is much easier than deciding to take another man's life. Flipping the lever seems like nothing in the heat of the moment, but weighing the moral cost of losing five students or one fat man against the separate moral rule against killing makes the situation difficult. In the footbridge situation you are left with two choices that seem equally wrong. The question at hand is: Should I kill or let kill?

At 6:36 AM, Blogger TonyRP said...

I can see the moral rationalization at work. Our intuition to not push the fat man in the second scenario is instantaneous and incredibly strong, and so, having reacted emotionally, we subsequently come up with "rational" reasons why our gut reaction was the correct one.

I can see no way that you are an "agent of death" in one situation but not the other. In either case, it is the trolley striking the man that has killed him. In either case, you are entirely 100% responsible for the trolley striking him. You have killed a man, the only difference in the two situations being a pull of a lever, or a push of a heavyset gentleman. The difference in the end comes down to exerted force and arm motion and nothing more.

Likewise, there is no third option in the second scenario that doesn't exist in the first. (Consider the conditions specified: in both cases, you have realized that you have a course of action which will save the lives of five at the expense of one; in both cases, you know with absolute certainty that, if you carry out this course of action, five lives will be saved, and one will end; there is no one else around to pull the lever or to push the fat man, so the decision is entirely yours.) In either situation, you can take action and kill one to save five, or you can allow five to die in order that you do not have to kill the one. To do nothing, whether it be to not pull the lever, or not push the man, is in no way different from the second option I mentioned, allowing five to die so that you do not have to kill one. Logically, these situations are identical in every relevant way, and different only in morally irrelevant details.

The real issue is whether consequentialism (utilitarianism) is a justifiable ethical doctrine. Most, confronted with this or a similar situation, would say yes. But consider the implications. Torture is the first example that comes to my mind. Is torture ever justified? To keep to a logically consistent ethical theory, and assuming you said it was the morally right thing to kill the one to save the five, wouldn't you have to accept that, in certain cases, torture, if it may save many at the price of death, or grave psycho- and physiological damage to one or a few, is justified? This is what is really at issue: what sort of ethical theory should guide our morality, and what are the implications and shortcomings of any particular theory?

