Brrrr. It's Lonely Out There

Thursday, August 28, 2008

By Wray Herbert

Sylvia Plath was only 24 years old when she penned her brooding poem “Winter Landscape, with Rocks,” which ends this way: “Last summer’s reeds are all engraved in ice, as is your image in my eye; dry frost glazes the window of my hurt; what solace can be struck from rock to make heart’s waste grow green again? Who’d walk in this bleak place?” It would be six years before the young artist’s depression would drive her to suicide, but the pain of her isolation was already apparent in the cold, wintry metaphors of this poem.

But why cold and wintry? What made this troubled young woman think of ice and frost when she wanted to depict the emotional bleakness of her life, her desperate sense of disconnection? Why not searing heat and punishing sunshine? What does loneliness have to do with the temperature?

If this seems like a silly question, it’s because we all make the same connection in our minds all the time, and it’s seemingly automatic. Just think of the clichés: the cold shoulder, a chilly reception, an icy stare. The idea of being alone—including social disconnection and rejection—appears to be inextricably tied to the sub-zero end of the thermometer.

Psychologists are curious about this metaphor, and others like it. Some believe that metaphors are much more than literary conventions, indeed that they are constellations of ancient and recent experience that we use to help us comprehend the complexity of our emotional lives. According to this view, metaphors are readily available because they are deep-wired into our neurons.

But how did they get there? Two psychologists at the University of Toronto decided to explore this question in the laboratory. Chen-Bo Zhong and Geoffrey Leonardelli wanted to see if our use of metaphor in thinking and judgment might be influenced by our most basic perceptions of the world—the information that enters the brain through the senses. Our ancient ancestors probably linked warmth and togetherness by necessity, as do infants still; bodily warmth often means comfort and survival. Might cold and isolation be similarly linked in the mind?

Here’s how the psychologists tested the idea. They divided a group of volunteers in two, and had half of them recall a personal experience in which they had been socially excluded—rejection from a club, for example. This was meant to “prime” their unconscious feelings of isolation and loneliness. The others recalled a happier experience, one in which they had been accepted into a group.

Then they had all the volunteers estimate the temperature in the room, on the pretext that the building’s maintenance staff wanted that information. The estimates ranged widely, from about 54 degrees F to a whopping 104 degrees F. That’s surprising in itself, but here’s the interesting part: Those who had been primed to feel isolated and rejected gave consistently lower estimates of the temperature, by almost five degrees. In other words, the recalled memories of being ostracized actually made people experience the world as colder.

The psychologists decided to double-check these findings a slightly different way. In another experiment, instead of relying on volunteers’ memories, the researchers actually triggered feelings of exclusion. They had the volunteers play a computer-simulated ball-tossing game, but the game was actually rigged. Some of the volunteers tossed the ball around in a normal, friendly way, but others were left out, just as an unpopular kid might be left out by other kids at the playground.
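For the curious, here is a minimal sketch, in Python, of how a rigged toss game of this kind could be put together. It is only a sketch under my own assumptions; the player labels, the number of throws, and the pass probabilities below are illustrative, not details from the study.

```python
import random

def play_round(excluded, n_throws=30, include_prob=0.33, exclude_prob=0.05):
    """Simulate a three-player toss game in which the two computer players
    either include the human player or mostly freeze them out."""
    holder = "player_a"           # a computer player starts with the ball
    throws_to_you = 0

    for _ in range(n_throws):
        if holder == "you":
            # the human throws to one of the computer players at random
            holder = random.choice(["player_a", "player_b"])
        else:
            # the rigged part: how often the computer players pass to the human
            p = exclude_prob if excluded else include_prob
            if random.random() < p:
                holder = "you"
                throws_to_you += 1
            else:
                holder = "player_a" if holder == "player_b" else "player_b"
    return throws_to_you

print("included condition:", play_round(excluded=False), "tosses to you")
print("excluded condition:", play_round(excluded=True), "tosses to you")
```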

Afterwards, all the volunteers rated the desirability of certain drinks and foods: hot coffee, crackers, an ice-cold Coke, an apple, and hot soup. The findings were striking. As reported in the September issue of the journal Psychological Science, the “unpopular” volunteers who had been ostracized on the virtual “playground” were much more likely than the others to want either hot soup or coffee. Their preference for warmth, for “comfort food,” presumably resulted from actually feeling the cold in the cold shoulder.

It appears that physical sensations and abstract psychological experience are tightly intertwined, and that intertwining may explain the power and appeal of metaphor. But it may also illuminate the relationship between our very real moods and our perceptions of the world around us. Experiencing cold may actually act as a catalyst in mood disorders, the psychologists suggest, exacerbating feelings of isolation and loneliness.

So it’s literally a cold, cruel world for some, which makes one wonder about Sylvia Plath’s suicide: The poet killed herself in London in February of 1963, in the middle of England’s coldest winter in hundreds of years.


For more insights into human nature, visit “We’re Only Human” weblog at www.psychologicalscience.org/onlyhuman. Selections from Wray Herbert’s blog also appear in the magazine Scientific American Mind and at http://www.sciam.com/.



Foraging in the Modern World

Thursday, August 14, 2008

By Wray Herbert

I live in a town with hundreds of restaurants serving many of the world’s cuisines: sushi bars, pizza parlors, pho, tapas, KFC, you name it. My family eats out a fair amount, and we know and appreciate all these tastes, so we could conceivably explore a different menu every outing. But we don’t. Some years ago we discovered a neighborhood café that we all really like, and that’s pretty much where we go. It’s our place.

I know that other people are different. We’re basically opting for certainty and predictability, where others prefer exploration and change. But why do people differ on this trait? What motivates some to constantly seek out the next best thing, the greener grass, while others of us are content to stick with what’s known and safe? How do we know there’s not a new and better favorite eatery just around the corner? Are we trading off curiosity and novelty for the luxury of not having to make a decision?

Psychologists are very interested in this question, and some believe it may reflect a fundamental difference in cognitive style, wired into our neurons. Think of it this way: Our ancient ancestors had to forage in the savanna for food and water, but there was no telling where they would find these resources. The environment was patchy, with a watering hole here and an antelope herd there, but no uniformity or predictability. So what was the best search strategy? Once you find a hunting ground with some antelope in it, do you set up camp and make it your own, or go looking for a better hunting ground, then a better one still?

Now fast-forward to modern times. Our challenges are perhaps more intellectual and abstract, but we still have to decide how to deal with an uncertain world. Faced with a problem or decision or choice, do we bear down and exploit one idea for all it’s worth, or move rapidly on from one solution to another to another? Or maybe we do both, depending on the problem, toggling back and forth depending on what works.

Indiana University psychologists Thomas Hills, Peter Todd and Robert Goldstone decided to explore these questions in the laboratory. They wanted to see if people do indeed have a consistent cognitive style for foraging, whether it’s for food or ideas. They also wanted to see if priming those ancient foraging neurons—triggering either exploration or exploitation instincts—influences the way people approach modern problems.

Since they couldn’t actually ask people to forage for food in the wild, they used some modern tools: a computer game and a board game. They had a group of volunteers use icons to “forage” in a computerized world, moving around until they stumbled upon a hidden supply of food or water, then deciding whether and when to move on, in which direction to continue the search, and so forth. The scientists tracked their movements.

But the volunteers explored two very different worlds: Some foraged in a “clumpy” world, which had fewer but richer supplies of nutrients. Others explored a “diffuse” environment, which had many more, but much smaller, supplies. The idea was to “prime” the optimal foraging strategy for each possible world. Those in a diffuse world would in theory do better by giving up on any one spot quickly, moving on rapidly, and navigating so as not to cover the same ground twice. Those in a clumpy world would be more likely to stay put, exploiting the rich lodes of nutrients rather than keeping up the search.
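To make the two environments concrete, here is a minimal sketch of how a “clumpy” and a “diffuse” world might be generated with the same total amount of food. The grid size, the number of patches, and the food counts are my own illustrative assumptions, not parameters from the study.

```python
import random

def make_world(kind, size=50, total_food=200):
    """Scatter the same total amount of food either into a few rich clumps
    or into many tiny caches spread across the whole grid."""
    world = {}
    if kind == "clumpy":
        for _ in range(4):                           # a handful of rich patches
            cx, cy = random.randrange(size), random.randrange(size)
            for _ in range(total_food // 4):
                x = min(size - 1, max(0, cx + random.randint(-2, 2)))
                y = min(size - 1, max(0, cy + random.randint(-2, 2)))
                world[(x, y)] = world.get((x, y), 0) + 1
    else:                                            # "diffuse"
        for _ in range(total_food):                  # many small, scattered caches
            cell = (random.randrange(size), random.randrange(size))
            world[cell] = world.get(cell, 0) + 1
    return world

print(len(make_world("clumpy")), "occupied cells in the clumpy world")
print(len(make_world("diffuse")), "occupied cells in the diffuse world")
```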

That was the first part of the experiment. Afterward, the volunteers took on a more abstract, intellectual search task based on the board game Scrabble. They didn’t actually play Scrabble, but they were dealt letters as if they were about to play, and had to search their memory for as many words as they could make with those letters. As in the real game, they could trade in their letters for new ones, but in the experiment they could do so whenever they wanted. That wholesale trading of letters is what the psychologists were actually observing: They wanted to compare the volunteers’ Scrabble strategies with their foraging strategies, to see whether they stuck with the letters they were given—or rapidly abandoned one set of letters for another, more promising one. In other words, would those who had been mentally primed for a clumpy world treat their Scrabble letters as rich clumps, worth sticking with, while those primed for a diffuse world quickly abandon one set of letters for another?
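Here, in the same spirit, is a toy model of the letter task: a simulated player keeps trying to pull words out of the current letter set and trades all the letters after a run of failures. The giving-up thresholds and success probabilities are assumptions made purely for illustration; the point is only that a short fuse produces many trades and a long fuse produces few.

```python
import random
import string

def scrabble_session(giving_up_time, n_rounds=20):
    """Toy model of the letter task: keep mining the current letters for words;
    after `giving_up_time` failures in a row, trade the whole set for new letters.
    Returns how many times the player traded."""
    letters = random.sample(string.ascii_lowercase, 7)   # the current letter set
    dry_spell = 0
    trades = 0
    for _ in range(n_rounds):
        # stand-in for searching memory: success gets rarer as the set is mined
        found_word = random.random() < 0.5 / (1 + dry_spell)
        if found_word:
            dry_spell = 0
        else:
            dry_spell += 1
            if dry_spell >= giving_up_time:
                letters = random.sample(string.ascii_lowercase, 7)
                dry_spell = 0
                trades += 1
    return trades

print("persevering player (long fuse):", scrabble_session(giving_up_time=6), "trades")
print("exploratory player (short fuse):", scrabble_session(giving_up_time=2), "trades")
```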

The results were striking. As reported in the August issue of the journal Psychological Science, those whose neurons were primed for exploration in the wild were also more restless and exploratory in Scrabble, while those primed for exploitation were more focused and persevering when they switched to the abstract mental challenge. Put another way, the human brain appears capable of toggling back and forth between exploration and exploitation, depending on the demands of the task.

But the psychologists also found that individuals were consistent in their cognitive style. That is, the most persevering foragers were also the most persevering Scrabble players, just as gadabouts in the food search tended to be gadabouts in intellectual matters as well. And presumably in life: They would probably be too antsy to settle for a “good enough” neighborhood café.

But dining out is trivial, and these findings have more serious implications related to other recent work on brain chemistry and cognitive disorders. Exploratory and inattentive foraging—actual or abstract—appears linked to decreases in the brain chemical dopamine. Similarly, many problems related to attention—including ADHD, drug addiction, some forms of autism and schizophrenia—have been linked to such a dopamine deficit. It’s possible, the psychologists say, that computer foraging might reveal underlying cognitive style—either persistence or the lack of it. It’s even possible that such simulated foraging could have long-term effects on thinking style, and possibly even lead to therapies for such cognitive disorders. That’s something worth exploring.

For more insights into the quirks of human behavior, visit “We’re Only Human . . .” at www.psychologicalscience.org/onlyhuman. Excerpts from the weblog also appear in the magazine Scientific American Mind and at http://www.sciam.com/.



The Corners of My (Stone-Age) Mind

Friday, August 08, 2008

By Wray Herbert

My first phone number was Prospect 67210. The quaint sound of that number is enough to tell you how long it has been bouncing around in my neurons. I also recall the street address that went with that phone number: 211 Elm Drive. I can vividly picture the grey Cape Cod house in my mind’s eye, and I believe I could even find my way around the neighborhood all these decades later.

I wonder why that would be. That chapter in my life is long gone, and it will never recur. I have never once used this information since I moved away from my childhood home. What possible value could it have, that it should still be lodged in my synapses? And why those particular details, when I have forgotten so much else?

Psychologists have spent a lot of time over the years describing what and how we remember, and there are volumes on how to improve memory. But little is known about the most basic question of all: Why remember? Why do we have memory at all? Purdue University psychologists James Nairne and Josefa Pandeirada decided to tackle this root question from a Darwinian perspective. They figured that memory, much like our kidneys and eyes and limbs, must have been shaped by eons of evolution. That is, it must have had some sort of survival value deep in the past. But why would nature have designed our “mnemonic organ” to work precisely the way it does? What’s the purpose of storing away the past?

The psychologists decided to explore these questions in the laboratory, starting with this premise: The only value of the past is in illuminating the present or predicting the future. It therefore makes sense to remember only those things that once helped solve “problems” related to the survival of the species: the location of food and water, signs of predators and potential mates, and so forth. It would also make sense to forget a lot of the rest, since the clutter of indiscriminate remembering would paralyze us.

To test this idea, the scientists had volunteers imagine spending a couple of months alone in an unknown and uncivilized place, a grassland, without any useful tools for survival. Then they were given a list of words, which they were asked to rate for survival value. Because the words were randomly selected (stone, chair, meadow, and so forth), volunteers had to think a bit about whether each thing could conceivably have any usefulness: A chair might be a nice luxury, for example, but how might it be useful to a survivalist? How about a rock, or flowers?

The psychologists had other volunteers rate the same words either for pleasantness or for their relevance to moving abroad. Then they gave all the volunteers a surprise memory quiz, asking them simply to recall as many words from the list as they could. The results were memorable: Those who had imagined getting by in the wild recalled far more of the words than those who had rated them for pleasantness or for a move abroad. These findings suggest that memory is indeed adaptive; that it has been “tuned” to information about evolutionary fitness.
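For readers who like to see the bare bones of such a design, here is a minimal skeleton: every group rates the same words, each under a different question, and the surprise recall test is scored the same way for everyone. The word list and the prompts are stand-ins of my own, not the study’s actual materials.

```python
# Toy skeleton of the design, not the actual materials or data.
WORDS = ["stone", "chair", "meadow", "rope", "flower", "truck"]   # illustrative list

PROMPTS = {
    "survival":     "How useful would this be while stranded in the grasslands?",
    "pleasantness": "How pleasant is this?",
    "moving":       "How relevant is this to moving abroad?",
}

def score_recall(recalled_words):
    """Proportion of the studied list that the volunteer managed to recall."""
    return len(set(recalled_words) & set(WORDS)) / len(WORDS)

# e.g. a volunteer in the survival group who later recalls three list words:
print(PROMPTS["survival"])
print(score_recall(["stone", "meadow", "rope", "banana"]))   # -> 0.5
```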

Nairne and Pandeirada decided to look at these findings a different way. As described in the August issue of Current Directions in Psychological Science, they pitted survival memory against the best-known tricks that memory researchers have for enhancing memory, in a head-to-head competition. This “who’s who” of memory devices includes forming a visual image of a word, creating an autobiographical memory related to a word, and simple effortful memorization. Again, they used a random list of words, and had volunteers use these various methods to process them. And again, simply thinking about words in terms of their survival relevance led to far greater recall than any of the other tried-and-true memory enhancement techniques.

So what does this say about how memory is organized in the brain? The scientists say it’s unlikely that the brain has a “survival module”; the concept of species survival is just too broad and amorphous. It’s more likely that evolution has engineered more finely tuned modules for recognition and storage of information—about predators, say, or poisons or nutrition.

Or safety and shelter, perhaps. Think about it: For a kid, what could be more important to survival than being able to find your way back home? It would make sense to burn those coordinates indelibly into the neurons. Which would explain why I could still reach home in an instant: Just dial Prospect 67210.

For more insights into the quirks of human nature, visit “We’re Only Human . . .” at www.psychologicalscience.org/onlyhuman. Selections from the weblog also appear in the magazine Scientific American Mind and at http://www.sciam.com/.

