The Neurology of Stereotypes

Thursday, April 24, 2008

By Wray Herbert

I once attended a polytechnic school where the mascot was an engineer. It was a men’s school, so the mascot was a geeky-looking guy. He wore a goofy hat and was always surveying land or harnessing electrical power or some such. It was an unflattering caricature, meant to be funny in a self-deprecating kind of way.

A lot of students and alumni didn’t like it, though, and the engineer was eventually replaced with a bird. That’s the problem with stereotypes: They contain enough truth to be both humorous and cruel. We all use stereotypes, probably more than we’d care to admit, because they are fast and efficient cognitive shortcuts that save us a lot of time and energy. You probably have a caricature of an engineer in your mind’s eye right now.

Psychologists are very interested in stereotypes, and how the brain processes them. Do we know when we are falling back on a broad caricature, or do we use them automatically, without deliberation or conscious awareness? How do we police our own lazy mental habits to avoid harming others with simplistic stereotypical thinking? Do we know that stereotypes are wrong, yet find them too psychologically tempting to avoid?

Psychologist Wim De Neys of Leuven University, Belgium, decided that the best way to explore these questions was to actually look at the brain in action. Past research has shown that a particular region of the brain’s frontal lobe becomes active when we detect conflict in our thinking—between an easy stereotype, say, and a more reasoned and complex view. But actually overriding stereotypical thinking requires another part of the frontal lobe. De Neys basically wanted to see whether stereotypical thinking is a detection problem or a self-control problem, so he watched these two brain regions during stereotypical thinking to see which lit up.

He used a classic psychology problem to make people summon up the stereotypes residing in their neurons. Here’s how it works: Say there’s a room with 1000 people in it, and we know that 995 are lawyers and the other five are engineers. We get to meet just one of these people, named Jack, picked randomly from the group. We learn that Jack is 45 years old and has four children. He has little interest in politics or social issues and is generally conservative. He likes sailing and mathematical puzzles. Is Jack a lawyer, or an engineer?

Well, which is he? Logically, if you use only the statistical part of your brain, the obvious answer is that he’s a lawyer, simply because lawyers vastly outnumber engineers in the room, so the odds of meeting a lawyer are far better. But a lot of people immediately say engineer because Jack fits a stereotype. The majority of even highly educated people do this. Others do say lawyer—and so quickly that it seems instantaneous—but the question is whether the brain needs to quash that powerful engineer caricature in order to give the more reasoned response.
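The base rate is strong enough that it wins even if you grant the stereotype considerable weight. Here is a minimal sketch of the arithmetic using Bayes’ rule; the likelihoods (how typical Jack’s description is of each profession) are illustrative assumptions of mine, not figures from the study:

```python
# Base rates from the problem: 995 lawyers and 5 engineers in the room.
p_lawyer = 995 / 1000
p_engineer = 5 / 1000

# Illustrative (made-up) likelihoods: suppose Jack's description is
# ten times more typical of an engineer than of a lawyer.
p_desc_given_engineer = 0.50
p_desc_given_lawyer = 0.05

# Bayes' rule: posterior probability that Jack is an engineer,
# given his description.
numerator = p_desc_given_engineer * p_engineer
posterior_engineer = numerator / (
    numerator + p_desc_given_lawyer * p_lawyer
)

print(f"P(engineer | description) = {posterior_engineer:.3f}")  # ≈ 0.048
```

Even granting the description ten-to-one odds in favor of the engineer stereotype, Jack is still about 95 percent likely to be a lawyer—the prior swamps the caricature, which is exactly the point of the base-rate problem.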

De Neys watched volunteers’ brains as they puzzled through this and similar problems. He found (and describes in the May issue of the journal Psychological Science) that the brain’s stereotype detector lit up regardless of whether the subject answered stereotypically or rationally. So apparently we all detect the stereotype and recognize that it is out of sync with reality. But the brain’s inhibition center—the part of the brain that says, “No, I am not falling for that simplistic idea”—lit up only when the subjects actually reasoned that Jack was a lawyer—that is, only when they overrode the stereotype and made a calculation based on probability. Apparently some of us find the ready caricatures too tempting and use them anyway, against our better judgment.

This goes way beyond fairness to engineers. Think about another stereotype, this one the typical lung cancer victim. He’s old, right? That’s at least the stereotype that most teenagers have, and the one they use to justify taking up smoking. Young people don’t die of lung cancer, so smoking must not be risky for the young—all scientific evidence to the contrary notwithstanding. Stereotypes can indeed be cruel and hurtful—even to those who conjure them up.

For more insights into the quirks of human nature, visit “We’re Only Human . . .” at www.psychologicalscience.org/onlyhuman. This blog now also appears in both the print and web editions of Scientific American Mind at www.sciam.com.


posted by Wray Herbert @ 3:44 PM

The Gist of the Matter

Wednesday, April 23, 2008

By Wray Herbert

I had a brief stint teaching writing and rhetoric to college freshmen, and I tried to pass on to my students a valuable lesson a favorite professor had given me: Nuance is good; generalities are facile. Be wary of any thinker who deals in black or white, insists on yes or no, or argues without gradation or grayness or subtlety. God is in the details.

I guess I still believe that. But I’ll have to say that the intervening years of parenting have given me an appreciation for some plain and absolute values: AIDS, bad. Seat belts, good. Heroin, bad. And so forth. I’m not really interested in discussing the subtleties of these positions. Am I getting more rigid as I get older?

Well, maybe I am, but that may not be an entirely bad thing. Recent psychological research suggests that our brains are like hybrid engines, switching back and forth between two very different kinds of thinking. Sometimes we crunch data and painstakingly calculate choices and positions, and sometimes we rapidly and automatically seize on the essence, the simple value, the gist of the matter. And how we think determines in large part the decisions we make. The trick is in knowing when to do which.

Teenagers are not very good at this, as it turns out, with sometimes tragic consequences. Consider this recent experiment by three Cornell University psychologists. Britain Mills, Valerie Reyna and Steven Estrada used a laboratory manipulation to trigger either precise, quantitative thinking or unnuanced “gist thinking” in a large sample of high school students. Then they studied their actual life choices, and their intentions for the near future. The topic was sex and its risks, including pregnancy and diseases like syphilis and AIDS.

Here’s an example of the triggers that the psychologists used. Sometimes they asked a specific question, like: “Are you likely to get pregnant or get someone pregnant in the next six months?” Other times they asked very general questions, like: “Overall, for you, which of the following best describes the risks of having sex: low, medium or high?” The idea here is that precise questions trigger precise memories—actual literal memories of past experiences—and that recalling these specific events jumpstarts the brain process devoted to fine-grained analysis: The teenagers start actually weighing risks and benefits, and when they do that, they often end up rationalizing “acceptable risk.” The global questions, by contrast, summon up simple values and qualitative thinking, like: “Risk, bad. Avoid risk.”

They also asked the teenagers to say yes or no to statements like these: “Less risk is better than more risk” and “No risk is better than some risk.” The idea was to sort out the absolute thinkers from the relativists, on the theory that any thinking about relative risk puts the brain into analytic mode, which in turn leads (paradoxically) to increased risk-taking.

And paradoxical or not, that is exactly what the scientists found. They subsequently asked all the students not only if they had had sex already, but also the specifics of what they were planning on: sex within the coming year, sex before age 20, and so forth. As reported in the May issue of the journal Psychological Science, the teenagers who weighed the relative risks and benefits of sex were much more likely to actually have sex (or plan on it) than were those who thought in global ways about risk and peril. Put another way, simple absolute values were protective. Too much data crunching was not. Or to borrow a newer version of the old maxim: “The devil’s in the details.”



posted by Wray Herbert @ 11:48 AM

A Deadly Philosophy

Thursday, April 03, 2008

By Wray Herbert

Humans are the only species that systematically murders its own for ideological reasons. More than 50 million people were victims of mass murder in the 20th century, making it the deadliest century on record. That included the Ottoman Turks’ murder of 1.5 million Armenians, the Nazis’ extermination of six million Jews, Mao’s murder of 30 million Chinese, and the Khmer Rouge’s destruction of 1.7 million Cambodians. The list goes on.*

Some of these deaths had to do with land and water and such, but most did not. Most were over philosophy. Why would this be? Philosophy is not threatening in any literal sense; it can’t maim or kill you, even when it’s very different from your own view. Psychologists are very interested in this paradox: Why is philosophy—or worldview, or ideology—so threatening? Put another way, what are the cognitive and emotional underpinnings of mass murder and genocide?

One emerging theory suggests that genocide may make sense, at least on an emotional level. Think of it this way. Besides being the only animal to murder on principle, humans are also the only animal cognitively advanced enough to understand mortality. We all know we are going to die, and there is absolutely nothing we can do to prevent this. That fact should be utterly terrifying, so terrifying that we should be paralyzed by fear and trembling.

But we’re not. We get up every morning, dress and groom ourselves, go to work, play with the kids, and so forth. How do we manage this? Well, one way we manage is by constructing meaning, and we do that by imagining a meaningful world. That’s called philosophy—or religion, or whatever. Humans are meaning-making creatures.

The problem occurs when our carefully constructed philosophy is threatened. And the greatest threat to a belief system is, well, an alternative belief system. To put it bluntly, your unfamiliar worldview makes me keenly aware of my mortality; it threatens my very existence. So why shouldn’t I wish you dead? Philosophy is personal.

Scientists have actually been studying this entanglement of personal mortality and cultural hatred in the lab, with some interesting results. Here’s a recent experiment by Joseph Hayes and his colleagues at the University of Alberta, Canada. These psychologists wanted to explore whether a philosophical threat could indeed conjure up thoughts of death, and further whether those thoughts might be quelled by actual annihilation of the philosophical “enemy.” To explore this, they recruited devout Christians for an experiment. They had these Christians read an actual news story about the “Muslimization of Nazareth”: The article described how Jesus’s boyhood home had become a largely Muslim city, and how the dominant (and militant) Muslim population was marginalizing the Christians who remained.

The idea was that this unwelcome news about a holy Christian landmark would threaten the Christian readers’ worldview—and in turn their personal security. And indeed it appears it did. After they had read about Nazareth, they all took a psychological test that gauges preoccupation with thoughts of death and dying. As reported in the May issue of the journal Psychological Science, those who had read the report were much more morbid in their imagery than those who had not. They were also much more derogatory toward Muslims than were Christians who had not read the news.

So that’s pretty unsettling in itself. But here’s where it gets really interesting. Hayes and his colleagues then told half of the participants another bit of news, only in this case it was made up. They told them that an airplane had crashed on its way to Nazareth, killing all 117 devout Muslims aboard. When they crunched the data, they found that those who had “witnessed” the annihilation of the Muslims were significantly less morbid in their thinking and significantly less derogatory toward Muslims. Put another way, knowing of the violent death of the Muslims effectively undid the perceived threat to the Christians’ philosophy and well-being. It restored meaning and security to their lives.

Isn't it possible that the plane crash simply made the Christians more sympathetic toward the Muslims, at least temporarily? The psychologists actually considered and rejected this idea, based on a surprising finding. The Christians who read about Nazareth became increasingly negative not only toward Muslims, but also toward Buddhists and Hindus and atheists. That is, they became antagonistic toward any worldview that questioned the absolute validity of Christianity. What's more, those who read the fabricated story about the plane crash were less disparaging of all these worldviews. Since no Hindus or Buddhists or atheists perished in the crash, there would be no reason for the Christians to feel sympathy toward these people.

So our brain fights death with death. It reasons that if an enemy dies, his philosophy must have been perverse or weak or just plain wrong, and thus no real threat to our superior worldview—nor to our life and limb. It’s a powerful psychological defense. In real life, of course, it just raises the ante. It’s tit for tat, and the new century starts counting its genocide victims.

*For a thorough examination of 20th century genocide, see Lewis Simons’s “Genocide and the Science of Proof” in the January 2006 issue of National Geographic magazine.



posted by Wray Herbert @ 4:15 PM