Some of my best friends are pawns

Thursday, November 19, 2009

By Wray Herbert

There are certain rules of conduct on which most ethical people would agree. It’s not nice to date the boss’s daughter just to get ahead in the company. Or to marry the boss’s son. And no parent would approve of a child befriending another child just because he happens to own an Xbox 360 Elite. That would be like an adult warming up to a colleague simply because he happens to have season tickets for the New Orleans Saints.

All of these ethical lapses fall under the general category of using people, which we’re taught early on not to do. People are not instruments or tools to be wielded for our own purposes, pawns to help us achieve our personal goals.

Yet we do use people anyway, often in more subtle ways than these. Why is that? Why do these moral strictures fail much of the time? New and forgiving research suggests that the urge to use people may be deeply embedded in human nature. Indeed, seeing others as useful or not may be as fundamental as perceiving gender or race in navigating our social world.

University of Waterloo psychologist Grainne Fitzsimons is interested in the interplay of personal goals and stereotypes. We are all motivated by goals, from big ones like career success to more modest ones, like losing ten pounds—or simply getting to the train on time. In fact, we spend much of our daily lives in pursuit of one goal or another. We also categorize people. We all do, whether we like it or not, simply because we need to find order in the world’s complexity. So we pigeonhole others as blue-collar or professional, conservative or liberal, Black or white or Asian, man or woman, young or old.

Given that personal goals and stereotyping are both so basic to our psychology, Fitzsimons reasoned, is it possible that our goals actually influence how we pigeonhole people? Or put another way, why would we not categorize others as instruments or tools if we see them as helping us get what we want in life? Working with psychologist James Shah of Duke, she designed an experiment to explore this possibility.

Here’s the gist of the study. They had a group of volunteers focus on a goal—say, staying fit and healthy. Then they had them pick three people who they felt could help them meet their goal; let’s call them Ian, Susan and Joe. They also listed three people who they did not perceive as helpful or useful in staying fit—not a hindrance but not instrumental either. We’ll call them Nancy, Ben and Lori.

The names are important because, later on, the volunteers read a series of sentences with these names embedded in them: “The cashier gave Ian his change.” “Ben was tired of arguing.” And so forth. There was a pretext for this reading, but then the psychologists surprised the volunteers with a memory test, in which they had to supply the right names: “The cashier gave ____ his change.”

The researchers expected mistakes. Indeed, it was really the mistakes they were studying. They wanted to see if the volunteers were more likely to mix up people they had categorized as useful with other people they saw as useful (confusing Ian with Susan, for example), as opposed to confusing useful people with non-useful people (Joe and Nancy, for instance). If they did the former—confusing instrumental people only with each other—that would suggest that they were grouping everyone who served their purposes together. It would suggest that we have a mental category for “people-who-get-me-what-I-want.”
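A toy sketch makes the logic of this error analysis concrete. The names and responses below are hypothetical illustrations, not the study’s actual data; the point is simply how each naming mistake gets counted as either a within-category or a between-category confusion:

```python
# Which category each name belonged to for a given volunteer.
category = {
    "Ian": "useful", "Susan": "useful", "Joe": "useful",
    "Nancy": "not_useful", "Ben": "not_useful", "Lori": "not_useful",
}

# (correct name, name the volunteer actually wrote) for each test sentence.
responses = [("Ian", "Susan"), ("Ben", "Lori"), ("Joe", "Nancy"), ("Ian", "Ian")]

within, between = 0, 0
for correct, given in responses:
    if given == correct:
        continue  # a correct answer, not an error
    if category[given] == category[correct]:
        within += 1   # mixed up two people from the same category
    else:
        between += 1  # crossed the useful / not-useful boundary

print(within, between)  # 2 within-category errors, 1 between-category error
```

A preponderance of within-category errors, as in this made-up tally, is the signature the researchers were looking for: it means the volunteers’ memories were organized around usefulness itself.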

And that’s precisely what they found. As reported on-line this week in the journal Psychological Science, the controls—those who were not focused on the fitness goal—made random errors, confusing Ben with Ian with Nancy with Susan. But those who were intent on their personal health-and-fitness goal were much more likely to perceive and remember people categorically, according to their utility, their value in helping reach the goal. Not to put too fine a point on it: All pawns look alike.

This is humbling, but it does not mean we’re slaves to our automatic stereotyping. Our neurons may be categorizing the boss’s daughter as a useful tool for achieving our career goals, but whether or not to be a cad remains a choice. Our ethical sensibilities can still trump the impulse to use people as pawns, but it helps to be mindful of our baser nature.

For more insights into the quirks of human nature, visit “Full Frontal Psychology” at True/Slant. Selections from “We’re Only Human” also appear regularly in the magazine Scientific American Mind.


posted by Wray Herbert @ 2:39 PM

"The Piece of Cake Heuristic"

Tuesday, November 17, 2009

By Wray Herbert

Don’t bother searching your long-term memory. There is no “Piece of Cake Heuristic.” I just made that up. I made it up and capitalized the main words and threw in an obscure word and added quotation marks—all so you, the reader, might consider the concept intellectually important and worthy of your attention. After all, it has a name and it’s in print—so it must have some heft, right?

Well, maybe—or maybe not, according to new research. University of Chicago psychologist Aparna Labroo and colleagues wondered if simply naming an idea—an economic theory, a medical diagnosis, a legal precedent—might make it easier for the mind to process, and thus more accessible. They further speculated that this cognitive ease might shape judgments of importance. They gave this idea a jargony label (the “Name-Ease” Effect), and then tested it in the laboratory.

Labroo’s idea is consistent with much earlier work on mental effort: If ideas are easier to process for whatever reason, we tend to find them more familiar and comfortable. Vocabulary, pronunciation, even the typeface in which these sentences are printed—all these can affect cognitive palatability. Labroo wanted to see if official names might have the same force. The link to importance is a bit more complicated. We all believe ideas are important if they are memorable—after all, that’s why we remember them. But we also associate importance with difficulty: The tougher to grasp, the more important an idea must be. If it’s too easy to process, it must be trivial.

The psychologists wanted to sort out these competing ideas, and here’s one of several experiments they ran. They had a group of volunteers read a legal case concerning school prayer. They all read the same case description, but only some were told the case’s name, Engel v. Vitale. Once they had all read the case, some of the volunteers were asked to recall the details of the case, while others were instructed to think about the meaning of the case. In other words, some completed a memory task while others completed a comprehension task. Then they all rated the importance of the school prayer case.

The researchers were exploring the interplay of effort, memory and understanding in judgments of importance—and the findings were intriguing. Knowing that the case was officially called Engel v. Vitale made it seem more important—but only for those who were focused on remembering it. In other words, the name made the information easier to process, and attributing this ease to the case’s memorability gave it weight. The case name did the opposite for those who were actually trying to comprehend the case: It made the case seem too familiar, and thus run-of-the-mill and simplistic.

Labroo and her colleagues reran this experiment many times, with a variety of ideas: an economic principle (the Coase Theorem); a mathematical concept (the Weierstrass Theorem); a medical diagnosis (acromegaly); and a psychological concept (Optimal Distinctiveness Theory). They got the same basic results, no matter what the subject matter. The psychologists’ paper on the “Name-Ease” Effect was published on-line this week in the journal Psychological Science. You be the judge of its importance.


For more insights into the quirks of human nature, visit the “Full Frontal Psychology” blog at True/Slant. Excerpts from the “We’re Only Human” blog also appear regularly in the magazine Scientific American Mind. Wray Herbert's book on heuristics will be published by Crown in autumn 2010.


posted by Wray Herbert @ 1:10 PM

A Case for the Distractible Toddler

Tuesday, November 10, 2009

By Wray Herbert

When my oldest son was three years old, someone gave him a very large can of Legos as a gift, enough to build a fortress. So we decided to build a fortress. Or I did, but he was an enthusiastic co-conspirator in the project—at least for about ten minutes. But then he got distracted by the sound of an ambulance siren outside; then he re-discovered a plastic triceratops; then he thought he should inspect the ashes in the fireplace. I tried to reengage him in the fortress, because I was doing an excellent job. But he had lots of things to do. He was busy.

Toddlers are distractible. Their minds flit constantly here and there, and they have a terrible time concentrating on even the most stimulating project. They might be fascinated by a colorful new toy, but only until the next best toy comes along, or the next or the next.

This can be maddening for parents, especially for those of us who want to give our kids a leg up on getting into a premier university. Parents often try to teach their toddlers self-control and mental discipline, to rein in their impulsivity. Increasingly, pre-school teachers do this, too. They see inattention and lack of focus as academic problems to be fixed.

But should we really be trying to teach self-control? Is there perhaps a reason why toddlers are such space cadets? Psychologists are beginning to raise these questions, and some are even suggesting that it may be detrimental to the developing brain to push it toward maturity too soon. Indeed, children’s impulsivity may be an essential tradeoff, one that allows the young mind to learn social conventions and language.

University of Pennsylvania neuropsychologist Sharon Thompson-Schill and her colleagues study a region of the brain called the prefrontal cortex, or PFC. This is basically the part of the brain that gives us mental agility and self-control; it filters out irrelevant information and allows us to focus. It is also the last part of the brain to mature and become fully functional. It lags behind the rest of the brain until about age four.

Why would that be? Well, the psychologists speculate that an immature PFC may not be a deficit at all, but rather an advantage in the first years of life. Here’s an example of their evidence, discussed in the most recent issue of the journal Current Directions in Psychological Science. It has to do with guessing. Say you are naïve about the game of football, but you are playing a guessing game: Will the offensive team pass or run the ball? You observe that the team passes the ball three out of every four plays, so you guess “pass” 75 percent of the time and “run” 25 percent of the time.

That’s not smart. Smart would be saying “pass” all the time. And if you played this game with your toddler, that is likely what he or she would do. Toddlers are often better at this, because their immature brains are still operating on a brute-force competition between two alternatives: pass or run. They are not yet capable of nuance and probability. That is, they’re not really capable of guessing.
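The arithmetic here is easy to check. A probability matcher is right only when the guess and the play happen to line up—0.75 × 0.75 plus 0.25 × 0.25, or about 62.5 percent of the time—while always saying “pass” is right 75 percent of the time. Here is a minimal simulation of that gap; the setup is my own illustration, not the researchers’ actual task:

```python
import random

random.seed(0)
P_PASS = 0.75      # the offense passes on 3 of every 4 plays
N = 100_000        # number of simulated plays

plays = ["pass" if random.random() < P_PASS else "run" for _ in range(N)]

# "Probability matching": guess in proportion to the observed frequencies,
# the way adults tend to play this game.
matching_hits = sum(
    play == ("pass" if random.random() < P_PASS else "run") for play in plays
)

# "Maximizing": always guess the more likely outcome, as toddlers tend to.
maximizing_hits = sum(play == "pass" for play in plays)

print(round(matching_hits / N, 2))    # ~0.62, i.e. 0.75*0.75 + 0.25*0.25
print(round(maximizing_hits / N, 2))  # ~0.75
```

The all-or-nothing strategy wins every time the probabilities are lopsided, which is exactly the kind of lopsided regularity—rules, conventions, irregular verbs—that fills a toddler’s world.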

And good thing, because toddlers can’t afford to guess. They have a lot of learning to do, and much of that learning has to do with hard-and-fast rules and conventions. Having an immature and inflexible mind is an advantage in finding patterns in the chaos of the world. In fact, this rigidity may be essential to language acquisition. Learning language is an intimidating task; it requires saying the right thing in the right context, and agreeing with everyone else that these are the right things to say. Consider the example of irregular verbs: They are simply conventions; they can only be learned by brute force, and that’s precisely how toddlers learn them. It’s no surprise, the psychologists note, that kids pick up languages so effortlessly compared to adults.

And it’s not just language. Toddlers are mastering all sorts of social conventions that, like irregular verbs, simply must be learned. They’re the rules of the world. In this sense, trying to hasten the brain’s development may be not only difficult but unwise.

For more insights into the quirks of human nature, visit the “Full Frontal Psychology” blog at True/Slant. Selections from “We’re Only Human” also appear in the magazine Scientific American Mind.


posted by Wray Herbert @ 1:25 PM

Close Encounters of the Rude Kind

Thursday, November 05, 2009

By Wray Herbert

One of my personal crotchets is people who walk down busy city sidewalks without looking where they’re going. These days they might be texting on an electronic device, but it’s not the technology I object to. They could just as well be reading a book. What’s annoying is the expectation that the crowds will part, that all the other pedestrians will make the effort to get out of their way.

This may be simple rudeness. But I suspect that some of these people truly believe they can skillfully multi-task even in a crowd. Well, they can’t, and I’ve now got science to prove it. Finnish researchers ran a laboratory simulation to see how pedestrians avoid collisions in everyday sidewalk encounters. Millions of people pass by millions of other people without incident every day on the world’s streets, and the scientists wanted to know how we manage this. Although they simulated polite pedestrians, their findings hold a valuable lesson for the self-centered as well.

Cognitive psychologist Lauri Nummenmaa and her colleagues studied volunteers’ eye gaze as they encountered an animated man walking toward them on a city street. They wanted to see if the simulated stranger’s eye gaze was an important cue in avoiding sidewalk collisions. In the simulation, the stranger looked steadily either to the left or the right, and the volunteers had to decide which way to move. The results, reported on-line this week in the journal Psychological Science, were clear: If the stranger looked to his left, volunteers not only looked but also moved to the stranger’s right; and vice versa. The scientists also ran a more realistic scenario in which the stranger looked straight ahead until the last minute, and then suddenly shifted his gaze left or right. They got the same results.

Much recent work on the brain’s “mirror neurons” suggests that humans automatically mimic others, and that this unconscious aping is important to social interaction. Interestingly, the volunteers in these studies did not mirror the stranger’s eye gaze, suggesting that their own eye movements are not simply an automatic neuronal reflex. That reflex may be occurring, but it doesn’t stop there: It appears the pedestrians are also “mind reading,” quickly but deliberately interpreting a stranger’s eye gaze as a signal of intent to walk left or right. That is, they are social animals, analyzing and navigating a social world.

This lab simulation captures only half a real-life sidewalk encounter. On an actual city street, not only am I observing and reasoning about your gaze and intentions, you are doing the same with my gaze. It’s a social contract that protects both of us and keeps the world moving smoothly. Unless, of course, your mind is somewhere else.

For more insights into the quirks of human nature, visit the “Full Frontal Psychology” blog at True/Slant. Excerpts from “We’re Only Human” appear regularly in Newsweek.com and in the magazine Scientific American Mind.


posted by Wray Herbert @ 12:40 PM