The Science of Kids

Friday, August 28, 2009

By Wray Herbert

Nick is a 6-year-old boy who doesn’t lie. At least according to his father, Steve. So imagine Steve’s chagrin when he witnessed what a hidden camera had documented in the McGill University laboratory of psychologist Victoria Talwar. In order to win a prize, Nick readily cheated in a game, then lied to cover up his cheating. When pressed, he elaborated on his lie, and he showed not a glimmer of remorse. Indeed, he was gleeful.

Is Nick a “young sociopath in the making”? Probably not. In fact, he’s fairly typical of 6-year-olds, who lie about once an hour, usually to cover up a transgression of some kind. That’s about twice as much lying as 4-year-olds do, which suggests that kids are learning to lie. Across kids of all ages, fully 96 percent are liars. Indeed, Talwar views lying as an important developmental milestone, one linked to intelligence.

That doesn’t mean lying is okay, and both father and son know this. It’s uncomfortable to watch Nick squirm through his lies as he digs himself in deeper. And Steve is a fairly typical parent too: parents, it turns out, are very bad at lie detection. What’s more, Nick likely learned to lie from watching his parents tell white lies. Parents typically view precocious lying as innocent, something that will correct itself, but in fact a lot of kids get “hooked” on lying very early.

Nick’s story comes from science writers Po Bronson and Ashley Merryman, who include it in NurtureShock, their delightful new collection of essays on the “science of kids.” Though not exactly a parenting manual, the book does offer a lot of useful information on why kids do what they do. For example, Talwar and her colleagues have tried using stories to teach kids like Nick to curb their lying. In one study, they had kids listen to either “The Boy Who Cried Wolf” or “George Washington and the Cherry Tree”; the kids heard the story after they had cheated, but before the psychologist asked them about the cheating.

For those who don’t recall: In “The Boy Who Cried Wolf,” the shepherd boy lies repeatedly about a wolf, and in the end is eaten by one when nobody believes his calls for help. So it’s a story about severe punishment for lying. George Washington, by contrast, tells his father the truth about chopping down the cherry tree, and is forgiven and praised for his truthfulness. When Bronson and Merryman conducted a survey, three of four respondents said the wolf story would be the more effective teaching tool, but in fact the opposite was true. The honest-George tale cut lying by 75 percent in boys and 50 percent in girls.

Why? Probably because kids already know that lying is a punishable offense; they’re not learning anything new there. What’s new—and welcome information—is that honesty might bring them both immunity from punishment and parental praise.

Bronson and Merryman’s essay on lying is representative of this engaging volume, with its mix of pitch-perfect science writing and soft-pedaled guidance for parents. Many of their essays—on sleep, racial attitudes, self-control, sibling relations, and more—are animated by actual flesh-and-blood kids, whom we meet on an excursion through many of the nation’s top child psychology laboratories. It’s a rewarding and entertaining trip. NurtureShock is published by Twelve Books and is in bookstores now.

For more insights into the quirks of human nature, visit the “Full Frontal Psychology” blog at True/Slant. Selections from “We’re Only Human” appear regularly at Newsweek.com and in the magazine Scientific American Mind.


posted by Wray Herbert @ 11:37 AM

A cognitive metamorphosis

Tuesday, August 25, 2009

By Wray Herbert

Franz Kafka’s 1919 short story “A Country Doctor” is a tale about . . . well, who knows what it’s really about? The bare-bones plot involves a physician who must make his way through a blizzard to tend to a young boy who is ailing. Or might be ailing, or might not; it’s not clear. Beyond that, it is hard to describe, much less interpret, the string of absurdities and nonsense that makes up this short piece. Time and traditional narrative break down entirely. It’s a disorienting assault on meaning.


That won’t surprise any reader familiar with the works of Kafka and other existentialist writers, who deliberately toyed with reality in order to disorient the reader. Indeed, the word Kafkaesque has come to be a synonym for bizarre, confusing, surreal.

But why is this great literature rather than just gibberish? What is its effect on the reader’s mind? How does a surreal tale like “A Country Doctor” work on a psychological level?

We may never know Kafka’s intentions, but psychologists are beginning to get some insight into the mental dynamics of reading such absurdist writing. One recent study suggests that Kafkaesque threats to life’s meaning might actually prime our need for (and perception of) order and pattern in the world. So, paradoxically, experiencing meaninglessness may inspire a keener search for meaning. Here’s the evidence.

Psychologists Travis Proulx of UC-Santa Barbara and Steven Heine of the University of British Columbia ran an experiment in which volunteers read a modified version of “A Country Doctor,” this one illustrated with a series of drawings as nonsensical as the text. Other volunteers read a short story roughly like the Kafka tale but more conventional in form. When they were done reading, all the volunteers took a difficult test that required them to identify patterns in long and seemingly random strings of letters. The psychologists expected that those who were disoriented by the Kafkaesque prose would be more earnest in searching for patterns—and more successful in spotting order in the chaos.

And that’s exactly what they found. As reported online in the journal Psychological Science, those who were unmoored by Kafka found more of the hidden patterns that actually existed, but they also identified more patterns overall, correctly and incorrectly—suggesting that they were highly motivated to seek and find order. But here’s the most intriguing aspect of these findings: A disorienting literary experience appears to have sharpened the volunteers’ yearning for meaning on a fundamental cognitive level; it’s unlikely that the volunteers even thought of themselves as searching for meaning, yet their neurons seemed primed to make order anywhere and everywhere they could.

Proulx and Heine ran a second, similar experiment and got the same results. Taken together, the studies suggest that we humans are irrepressible meaning makers. Indeed, the need for order and predictability may be fundamental to the human condition, and challenging the world’s predictability may be one key to art’s psychological power. Kafka apparently had this uncanny insight into the human mind nearly a century ago, at age 36.

For more insights into the quirks of the human mind, visit the “Full Frontal Psychology” blog at True/Slant. Selections from “We’re Only Human” also appear regularly at Newsweek.com and in the magazine Scientific American Mind.


posted by Wray Herbert @ 11:30 AM

Carpe diem! Did our ancient ancestors have personalities?

Friday, August 14, 2009

By Wray Herbert

I have high school friends who are dead already, as a direct result of their chosen lifestyles. They drank too much, drove too fast, ate whatever they craved at any given moment. They were impulsive, live-for-today types, and they paid a price for these traits. Nobody’s shocked that they died early.

We all know people like this. We also know people who are conscientious workers, homebodies and parents, committed partners and committed bachelors, workaholics and health nuts, the easygoing and the neurotic. There’s no denying the stark individual differences in personality. “Who we are” seems to emerge early in life and to endure through the lifespan. It shapes our life choices, from health to family to work and finances.

But why do we have personality at all? It wouldn’t seem to make sense from an evolutionary point of view. The traits that have been wired into our genes and neurons over the millennia tend not to be differences but commonalities: habits of mind that have helped the entire human species survive and adapt. That’s why evolutionary psychologists have tended to dismiss personality traits as irrelevant “noise.”

Until recently. Now a small cadre of psychologists has been revisiting personality, to see how it might fit into an evolutionary understanding of humanity. One of the leaders in this effort is University of Texas psychologist David Buss, who lays out several emerging ideas in the April issue of the journal Perspectives on Psychological Science. Here’s just one:

Each of us has a finite supply of time and energy. Think of a hypothetical young man making his way in the modern world. He might choose to put his energy into prospering—being healthy and well-fed—or he might instead pursue the life of a romantic gadabout. Or perhaps he’ll opt for being a devoted parent and provider. But he probably can’t do all these things well. He has to make choices.

So it was with our ancient ancestors. They were similarly called upon to make tradeoffs, spending their time and energy on one life “problem” or another. They probably weren’t as aware of making choices as we are today, but they were nevertheless prioritizing things like romance, parenting, and social climbing.

So the constant challenge that all early humans faced was making the optimal energy tradeoff. The individual choices they made—and that we continue to make today—were shaped by their supply of energy and time, their personal qualities, and their circumstances. Very attractive men, for example, might put a lot of their energy into mating rather than parenting, while people with bleak mating prospects might opt for career or for nurturing others’ children.

And those who lack energy, or who perceive the future as short, might discount mating and parenting and career, and squander their limited energy now. Those are the live-for-today types, according to Buss. In that sense, what is often disparaged as a maladjusted personality marked by poor self-control might more generously be viewed as a realistic adaptation to what life throws at you. Carpe diem.

For more insights into the quirks of human nature, visit the new “Full Frontal Psychology” blog at the True/Slant website. Selections from “We’re Only Human” also appear regularly in the magazine Scientific American Mind and at Newsweek.com.


posted by Wray Herbert @ 3:17 PM

I learned it at the movies

Tuesday, August 04, 2009

By Wray Herbert

In the 2003 movie The Last Samurai, Tom Cruise plays a former US Army captain named Nathan Algren, an alcoholic mercenary who in the 1870s goes to Japan to work for the Emperor Meiji. The young emperor is facing a samurai rebellion, and Algren trains a ragtag bunch of farmers and peasants in modern warfare, including the use of rifles. When Algren is captured by the samurai, however, he is gradually converted to their ways, and he ends up fighting alongside the warriors in a losing battle against the Imperial Army he helped create.

The movie was both a critical and popular success, and why not? Lots of exciting swordplay, exotic costumes, and a fascinating piece of history that was probably unfamiliar to most Americans before the film was released. Indeed, it’s fair to say that many Americans have learned much of what they know about the westernization of Japan from watching films like The Last Samurai.

That’s probably not a good thing, because the film is full of historical errors. Most notably, it was the French and Dutch, not Americans, who played the key role in Japan’s modernization in the late 19th century, and the Algren character is loosely based on a French officer named Jules Brunet. What’s more, the movie conflates two decades of military history for the sake of simplicity, and presents a highly romanticized view of the samurai warriors.

I know, I know. The Last Samurai is not a documentary, and people go to the movies to be entertained, not to be instructed in history. No argument there. But films like The Last Samurai are increasingly used in the classroom as well, as adjuncts to textbooks and lectures. Educators believe that the vividness of film can be a valuable teaching tool, enlivening and reinforcing students’ memories of otherwise dry historical text. But is that a good thing if the facts are wrong? Are such films doing more harm than good?

A team of psychologists has begun exploring these questions experimentally. Andrew Butler of Washington University in St. Louis and his colleagues decided to simulate a classroom where popular films are used as a teaching tool, to see if the practice improved or distorted students’ understanding of history. The Last Samurai was in fact one of the films they used in the experiment, along with Amadeus, Glory, Amistad, and a few others. All of the films contained both accurate and inaccurate information about the historical incidents they depicted.

The students watched the film clips either before or after they read an accurate account of the historical events. With The Last Samurai, for example, they read a version that correctly identified the hero as French, not American, and that was faithful to the actual timeline of Japanese history. In addition, some of the students received a general warning about the inaccuracy of popular historical films, while others got very specific warnings (about the changed nationality of the hero, for instance). The idea was to see which teaching method led to the most accurate comprehension of the events: reading, watching a movie, or both, with or without the teacher’s commentary.

When the psychologists tested all the students a week later, the verdict for classroom movies was one thumb up, one thumb down. Watching the films clearly did help the students learn more—but only when the information was the same in both text and film. Apparently the vividness of the film—and simply having a second version of the same facts—helped the students form stronger memories of the material. But when the film and the reading contradicted each other—that is, when the film was inaccurate—the students were more likely to recall the film’s distorted version. What’s more, they were very confident in these memories, even though they were wrong. This happened even when the students were warned that filmmakers often play fast and loose with the facts.

So should films be banned from the classroom? Not necessarily, and here’s why. As the psychologists report online in the journal Psychological Science, a good teacher can trump a movie’s shortcomings. They found that when teachers gave very detailed warnings about inaccuracies in the film version, the students got it. But those warnings had to be precise, something like: Pay attention when you watch the film, and you’ll see that the filmmaker has changed the nationality of the hero from French to American, which is not the way it was. With such warnings, the students apparently “tagged” the information as false in their minds—and remembered the accurate version when quizzed later.

In this sense, the movie’s distorted version of history can be used as a teachable moment.* Students learn the truth by identifying the mistakes and labeling them, so their take-away learning is: the film says this, but in fact it’s that. Not a bad way to learn, assuming the classroom teacher knows enough to point out what’s this and that.

*For an entertaining guide to some historical inaccuracies in popular films, check out this slideshow at the Washington University in St. Louis website: http://news-info.wustl.edu/tips/page/normal/14418.html

For more insights into the quirks of human nature, visit the "Full Frontal Psychology" blog at the True/Slant website. Selections from “We’re Only Human” also appear regularly in the magazine Scientific American Mind and at Newsweek.com.


posted by Wray Herbert @ 1:38 PM