"For just pennies a day"

Friday, September 25, 2009

By Wray Herbert

There are so many things you can purchase or accomplish for just pennies a day. You can get lots of interesting magazine subscriptions, or a good life insurance plan—no physical required. You can adopt a needy child in Africa, or save the Earth from global warming.

The “pennies a day” marketing scheme has been around a long time, and whoever came up with it showed extraordinary psychological insight. Indeed, science is only now beginning to demonstrate what these marketers sensed intuitively—that people are not entirely rational when it comes to processing numbers. What’s more, the way we think about scales and rates and ratios can make us into either cautious or indiscriminate consumers.

In a way this is obvious. “Pennies a day” is a meaningless ratio, because we’re not really reaching into our pockets each and every day for those copper coins. That’s what the marketers want you to visualize, but most of us are not truly fooled by the ruse. We know automatically—without doing any arithmetic at all—that we’re really talking about dollars a month, and maybe hundreds of dollars over a year or years. It’s all a matter of knowing the meaningful scale.

But what if the manipulation of numbers is more subtle, or more complex? Are there marketing phrases and terms that do fool our imperfect minds? University of Michigan psychologist Katherine Burson and her co-workers believe so, and they’ve run a couple of interesting experiments to simulate the kinds of offers we might well encounter in our daily lives. Here’s an example:

Imagine you’re in the market for a cell phone plan. After shopping around, you’ve narrowed your choices to two: Plan A costs $32 a month, and for that you’re guaranteed no more than 42 dropped calls out of 1000. Plan B costs only $27 a month, but the number of dropped calls rises to 65 out of 1000. In other words, you get what you pay for, and consumers make their choice based on what’s more important—money or service.

But what if the same offer were phrased this way? Plan A costs $384 a year, and drops 4.2 calls per 100. Plan B costs $324 and drops 6.5 calls per 100. It takes only the tiniest bit of arithmetic to see that nothing has changed. The offers are identical to what they were before, except that the scale has changed. But actually two scales have changed, and in different ways, so it’s not a no-brainer like “pennies a day.”
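The rescaling is simple enough to verify in a few lines of Python (a sketch for illustration; note that $32 a month works out to $384 a year):

```python
# Rescale a plan's monthly price and per-1000 drop rate into the
# reframed offer: a yearly price and a per-100 drop rate.
def rescale(monthly_price, drops_per_1000):
    yearly_price = monthly_price * 12    # 12 billing months per year
    drops_per_100 = drops_per_1000 / 10  # same ratio, smaller base
    return yearly_price, drops_per_100

print(rescale(32, 42))  # Plan A: (384, 4.2)
print(rescale(27, 65))  # Plan B: (324, 6.5)
```

The ratios themselves never move; only the units they are quoted in do.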

So how do consumers process these different offers? The psychologists gave these choices to a large group of volunteers, and the results were interesting. Consumers preferred Plan B when it was described as having a lower price per year, but they preferred Plan A when it was described as having fewer dropped calls per 1000. Notice that it’s the “per year” and “per 1000” that matter. Expanding the scale made the differences look bigger, so consumers felt they were getting much better service or a big savings in cost. They actually changed their preferences at the larger scale—they became more discriminating—even though the real terms remained unchanged.

This is pretty remarkable—and unnerving. But there’s more. In a second experiment, the researchers offered a slightly different choice, this time between movie rental plans. In this scenario, Plan A costs $10 a month for seven new movies per week. Plan B costs $12 a month for nine new movies a week. As before, either choice could make sense, depending on which meets your financial and movie-watching needs.

Then they once again changed the terms: This time the prices stayed the same, but instead of a weekly allotment of movies, consumers now got a yearly allotment. That is, for $10 a month they got 364 movies per year, and for $12 a month they got 468. How did the movie aficionados process these offers? As reported in the current issue of the journal Psychological Science, dramatically more consumers chose Plan B when it was expressed in movies per year. It’s the emotional impact of that number—468. That’s a lot of movies, and a lot more than the other plan gets you, and still for only $12 a month. Come to think of it, that’s really just pennies a day.
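The same trick runs in both directions: the weekly allotment can be blown up to a yearly figure, while the monthly price can be shrunk to a daily one. A short illustrative sketch (assuming 52 weeks and 365 days per year):

```python
# Reframe a movie-rental plan at two different scales: weekly movies
# become a yearly allotment, and the monthly price becomes cents per day.
def reframe(monthly_price, movies_per_week):
    movies_per_year = movies_per_week * 52           # expand the benefit
    cents_per_day = monthly_price * 12 * 100 / 365   # shrink the cost
    return movies_per_year, round(cents_per_day, 1)

print(reframe(10, 7))  # Plan A: (364, 32.9)
print(reframe(12, 9))  # Plan B: (468, 39.5)
```

Expanding the scale of the benefit and shrinking the scale of the cost pushes both numbers in the direction the seller wants.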


For more insights into the quirks of human nature, visit the "Full Frontal Psychology" blog at True/Slant. Excerpts from “We’re Only Human” appear regularly in the magazine Scientific American Mind and at Newsweek.com.


posted by Wray Herbert @ 3:16 PM

Changing the old dating rules

Tuesday, September 22, 2009

By Wray Herbert

Women are much choosier than men when it comes to romance. This is well known, but the reason for this gender difference is unclear. Evolutionary psychologists think it’s because, way back in prehistoric times, “dating” was much riskier for women. Men who made an ill-advised choice in the ancient version of a singles bar simply had one lousy night. Women who chose unwisely could end up facing years of motherhood.

That’s less true today, yet women remain much more selective. Is this difference a vestige of our early ancestry? Or might it be totally unrelated to reproductive risk, something more modern and mundane? A couple of Northwestern University psychologists, Eli Finkel and Paul Eastwick, decided to explore this question in an unusual laboratory: a real-life speed dating event.

For the uninitiated, speed dating is an increasingly popular way for men and women to meet and find potential partners. Participants attend a sponsored event and go on a series of very brief “dates,” about four minutes each. Typically, the women sit scattered around a room, and the men make the rounds. Afterward, both men and women indicate to the sponsor if they would be interested in seeing any of the others again. If two “yeses” match up, they get phone numbers and that’s it. They’re on their own.

Men say “yes” a lot more than women. That’s expected, but Finkel and Eastwick had a novel theory about why. Perhaps it could be explained by the simple convention of men standing and approaching—and women sitting passively. There has been a lot of recent work on the mutual influence of body and mind—how we embody our thoughts and emotions—and the psychologists speculated that the physical act of approaching someone might be enough to make a potential date more appealing romantically, and thus to make the men less choosy overall.

They tested this in a series of 15 heterosexual speed dating events, involving 350 young men and women. Each participant went on about 12 dates, but the researchers changed the rules: In these events, the women and men approached each other about equally. Following each date, each participant rated the other for romantic desire and romantic chemistry. They also rated their own sense of self-confidence on the date. A bit later, they decided thumbs up or thumbs down.

The results were striking. As reported on-line in the journal Psychological Science, the well-known gender difference vanished when men and women assumed more egalitarian roles. The difference didn’t completely reverse, though, when women were on the move: their choosiness went away, but they didn’t become more indiscriminate than the men. This suggests that the ancient tendencies may still have some force, but that they are also reinforced by arbitrary social norms. What’s more, it was increased self-confidence that appeared to make the difference: Simply standing and being on the move boosted confidence, which in turn boosted romantic attraction.

We don’t speed date through real life, of course, but there are all sorts of social conventions based on gender, and these presumably shape romantic feelings and actions. Having men behave more like women and women more like men appears at least to narrow this one gap between the sexes.



posted by Wray Herbert @ 11:41 AM

Making Sense of Pat

Tuesday, September 15, 2009

By Wray Herbert

Fans of the old Saturday Night Live will remember skits about the androgynous Pat. Pat’s formless body and nondescript clothes offered no clue about gender. Nor did Pat’s behavior, and the running joke was that the celebrity guest hosts would go to ridiculous lengths to figure out if Pat was a man or a woman. They always failed.

The skits were funny in part because Pat defied a deep-seated urge to put people into tidy pigeonholes—to stereotype. Pat wasn’t aggressive in a stereotypical male way, and Pat wasn’t particularly caring in a stereotypical female way. Pat was just Pat.

We all trade in stereotypes every day, whether we like it or not. It’s how we sort an impossibly complex world into manageable categories: man, woman, Italian, Chinese, lawyer, engineer. Stereotypes can be unfair and hurtful to many people, but the power of stereotyping is undeniable. It’s a fact of the human psyche.

But what exactly is going on in the mind when we stereotype someone? Is the process instantaneous and automatic, or do we deliberate over traits and categories before making judgments? A clever new study of the actual internal process of stereotyping—from basic perception to judgment—offers some provocative findings.

Tufts University psychologists Jonathan Freeman and Nalini Ambady used many common stereotypes, including gender stereotypes, to explore a new theory about the cognitive mechanics underlying stereotyping. Here’s the basic idea: When we catch sight of a stranger’s face, we immediately begin to extract information. That’s no problem if the face belongs to the Marlboro Man or Betty Crocker, but most of us aren’t archetypal icons of our gender. Most humans are somewhere in between, so our immediate perception is usually more tentative: “He’s probably male.” This tentative perception in turn triggers a tentative stereotype: “He’s likely to be aggressive.” In other words, our perceptions and categories are not crisp and fixed, but rather in dynamic flux. It takes a few seconds for this ambiguous impression to stabilize into a final interpretation of the stranger.

At least that’s the theory, which the psychologists decided to test in the lab. To do so, they morphed photos of men and women into amalgams of male and female traits, some more ambiguous than others. None were as baffling as the fictional Pat, but they were deliberately ambiguous—as in the real world. Then the researchers used an innovative lab technique to explore the cognitive processing of these faces: Instead of scanning volunteers’ brains, they tracked their hand movements. They flashed the photographs on a screen and instructed the volunteers to move a mouse rapidly toward one of two adjectives—for example, “aggressive” and “caring”—in opposite corners of the screen. The psychologists tracked the computer mouse movements to see how quickly and directly the volunteers categorized each face by stereotypical traits.

The idea here is that the hands have a mind of their own, in the sense that their movements reflect the mind’s hesitation and conflict. The results were fascinating. An instantaneous stereotype would show up as a straight line from the starting point to one of the two adjectives—male, therefore aggressive, no hesitation. Nobody did that. Instead the movements appeared as curves, suggesting some hesitation and deliberation in each judgment.
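Mouse-tracking analyses of this kind are often summarized with a curvature statistic. As a rough illustration (hypothetical trajectories, not the study's actual data or code), one common measure is a path's maximum deviation from the straight line joining its start and end points:

```python
import math

def max_deviation(path):
    """Largest perpendicular distance of a mouse path (a list of
    (x, y) points) from the straight line joining its endpoints."""
    (x0, y0), (x1, y1) = path[0], path[-1]
    length = math.hypot(x1 - x0, y1 - y0)
    best = 0.0
    for x, y in path:
        # Perpendicular point-to-line distance via the cross product.
        dist = abs((x1 - x0) * (y - y0) - (y1 - y0) * (x - x0)) / length
        best = max(best, dist)
    return best

direct = [(0, 0), (1, 1), (2, 2)]    # no hesitation: deviation 0
curved = [(0, 0), (2, 0.5), (4, 4)]  # tugged toward the other corner
print(max_deviation(direct), max_deviation(curved))
```

A larger deviation means the hand was pulled farther toward the rejected adjective before settling on the final judgment.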

But here’s the really interesting part, reported on-line this week in the journal Psychological Science: The more ambiguous the face was, the more curved the path to judgment. That is, a male face with female traits might ultimately be judged as male and therefore aggressive, but not before the volunteer’s hand was tugged a bit toward the alternative stereotype of caring female. It’s like the mind is saying: Yeah, probably aggressive, but what about those nurturing features? What do I make of those? It’s as if the perceived gender ambiguity triggers a cognitive “competition” between incomplete and contradictory stereotypes, which persists until the mind settles on one or the other.

This is more than just a clever experiment, Freeman and Ambady believe. Even though the cognitive ambiguity is active only for an instant during the stereotyping process, those few seconds of contemplating life’s ambiguity may undermine our mind’s rigid categories—and have lasting effects on social judgments and behavior way down the line.



posted by Wray Herbert @ 4:52 PM

Cold Shoulder, Warm Heart

Friday, September 11, 2009

By Wray Herbert

One of Robert Frost’s best-loved poems is the short verse “Fire and Ice,” which goes like this:

Some say the world will end in fire;
Some say in ice.
From what I’ve tasted of desire
I hold with those who favor fire.
But if I had to perish twice,
I think I know enough of hate
To know that for destruction ice
Is also great
And would suffice.


Most good poets are part psychologist, and Frost shows keen insight into the human mind in these seemingly simple lines. Indeed, his 1920 poem anticipated ideas that are just now emerging in cognitive science—specifically the notion that our bodily sensations are inextricably bound up with emotions like hatred and desire. Or to put it in a way that the Bard of New England would have appreciated, the metaphorical thermometer is as much a gauge of social life as it is of degrees Fahrenheit.

At least that’s the theory, which psychologists have been exploring in various ways in the laboratory. Here’s a recent example, from Hans IJzerman and Gun Semin of Utrecht University. The psychologists were intrigued by such metaphors as “the cold shoulder” and “warm feelings,” and decided to test the link between thermometer readings and feelings of closeness or distance, affection or iciness. They ran a few experiments to test this in different ways.

The first experiment was straightforward. Volunteers who had just arrived in the lab were asked to hold the experimenter’s beverage for a few minutes, ostensibly so he could do something that required two hands. Some were handed a cold beverage, and others a warm one. Then they were asked to rate both themselves and an acquaintance on a well-known scale that measures social proximity: the more they overlapped with the other person, the higher their score on closeness; the less overlap, the more distant they were feeling. The results were also straightforward. Holding the warm beverage induced greater feelings of closeness than holding the cold one.

Those findings are intriguing but hardly conclusive, so the researchers looked at the body-mind link in a different way. When we are literally close to someone or something, we see more detail; our experience is more concrete. Similarly, distance makes our vision of things more vague and abstract. The psychologists reasoned from this that feelings of warmth would induce not only emotional closeness toward others, but also perceptual closeness—and thus more vivid and concrete perceptions.

They didn’t use beverages in this study. Instead they varied the room temperature, from the low 60s F to low 70s F. This isn’t a huge variation, but the researchers figured it would be enough to test the idea that temperature shapes emotion and thought. They showed all the volunteers a short film clip of chess pieces moving around, but not the usual way chess pieces move, and they asked the volunteers to describe “in their own words” what was happening. The idea was that room temperature would shape their perceptions and as a result the language that the volunteers used. That is, warm observers would write concrete descriptions of the chess scene, and chilly observers would write more abstract descriptions.

And that’s exactly what they found. When they coded the language in the narratives, they found that room temperature did indeed affect the volunteers’ choice of words. The warm volunteers also expressed greater feelings of closeness toward the experimenter.

The psychologists decided to take this one step further, to see if temperature shapes not only language but worldview. It’s well known that people from cultures that place a high value on individualism—Americans, for example—have a particular cognitive style compared to more communitarian cultures. Specifically, those from communal cultures tend to see patterns in the world, where individualists tend to see disconnected parts. The researchers suspected that warmth would spark a more relational worldview, while cold would induce a more self-reliant view.

They varied the room temperature as before, but this time they had the volunteers take a perception test specifically designed to differentiate these cognitive styles. That is, some people perceive patterns where others see independent components, and this is taken as a measure of either a relational or individualistic worldview. And once again, temperature showed a clear and direct connection to how volunteers processed what they saw. As reported on-line in the journal Psychological Science, warmth made volunteers see the connections between things, while the chilly were more individualistic in their perceptions of the world.

So affection, concrete language, communitarian worldview—that’s a lot to hook to the simple rising and falling of mercury. But perhaps it shouldn’t be surprising, the researchers say. After all, the mind evolved along with the body over millions of years, so the way we think and feel was no doubt shaped by real and important experiences in the world. What could be more basic than staying warm?



posted by Wray Herbert @ 12:34 PM

The Myth of Binge Eating

Thursday, September 03, 2009

By Wray Herbert



An inviolable principle of most addiction recovery programs is total abstinence. It appears that for true addicts, one drink or one toke or one line is enough to trigger a binge—and a likely relapse. This dogma is not so hard and fast when it comes to food because . . . well, because we all have to eat.

Still, chronic overeaters do often embrace a version of the abstinence dogma, treating certain foods like Johnnie Walker to an alcoholic. It might be an economy-sized bag of potato chips or a hot fudge sundae or a double order of Buffalo wings. Every dieter has a taboo food or two that will predictably shatter his or her discipline and willpower and trigger a face-stuffing freefall.

Or so the wisdom goes. But is it true? Surprisingly, this idea has never been tested in a real-life situation, so a team of psychologists decided to do just that. Traci Mann of the University of Minnesota and several colleagues suspected that the notion of catastrophic relapse was too simplistic for a complex behavior like eating. Food-minded people do violate their own rules, of course, but perhaps they make up for their transgressions with a little deprivation later on. This is the idea they wanted to explore.

To do so, the psychologists recruited a large group of college undergrads—all women. They deliberately chose college-age women because as a group they tend to be more weight-conscious and to diet more than the general population. They also questioned each of the volunteers individually to identify their attitudes toward eating, how often they dieted, their weight fluctuations, and so forth. The women thought they were taking part in a broad study of “health habits.”

To make the situation as realistic as possible, the women simply went about their days—going to class, studying, socializing, whatever—but they carried electronic “diaries” with them at all times. The psychologists paged the women once an hour during waking hours and asked them a variety of questions, including queries about eating and snacking and—importantly—about diet violations. The study lasted two days, and the results showed no evidence that eating a forbidden food triggers a binge or relapse. This was true even among the women most preoccupied with weight and dieting.

The researchers wanted to double-check this finding. So they ran a second study, this one lasting eight days, during which the women kept detailed logs of their food consumption. In the first study, it was unclear how each of the volunteers defined a food violation. It might have been a single bite of a Snickers bar, or an entire tray of lasagna. So in this study, the researchers created a ruse that required about half the volunteers to drink an 8-ounce milkshake; they figured this would count as an eating violation for most weight-conscious college women. They then compared the drinkers’ post-milkshake calorie consumption to their calorie consumption for the week before, and they also compared the violators to those who had not violated their diet.

And guess what? Drinking the forbidden milkshake was not a dietary catastrophe. Indeed, as reported on-line in the journal Psychological Science, the women who drank the shake ate no more calories overall than the other women, and their calorie consumption the day of the violation was no greater than their typical daily consumption had been for the prior week—about 1,400 calories. In other words, they somehow compensated for the milkshake later in the day—skipping an evening snack, going light at dinner—and as a result got themselves back on track without delay.

This is good news. It’s not clear from this study if the women deliberately compensated for the taboo milkshake, or if that caloric balancing act takes place on an unconscious level. Perhaps that doesn’t matter. The bottom line is that a milkshake is just that and no more. It’s not symbolic of weakness or failure, and doesn’t have to ruin a day or a week or a lifetime commitment.



posted by Wray Herbert @ 1:42 PM