Not long ago, Harvard University professor Marc Hauser dropped in on his daughter Sofia's kindergarten class and presented the children with a moral dilemma.
"You must all keep your eyes closed for 30 seconds," he told them. "If none of you raises your hand during that time, you will each get a sheet of stickers when it's over. But if one of you raises your hand, only that child will get all the stickers."
The task brought immediate cries of protest, Hauser recalled. "But that's not fair!" some children exclaimed, shocked at the idea that one child could hog all the stickers.
Some might say that the kindergartners, in their short lives, had already learned much about the nature of justice. But Hauser goes a step further: Morality, he argues, is influenced by cultural teachings but is also so deep and universal an aspect of human existence that it is effectively "hard-wired" into the brain, much like the instinct for language.
At work, he says, are principles as unconscious and yet powerful as the grammar rules we use when we speak - and the challenge to scientists is to figure out what those built-in moral rules are and how they work.
To that end, Hauser and other researchers are experimenting with children, monkeys, online survey takers, brain-damaged patients and even psychopaths and remote hunter-gatherers.
His theory that morality is based in biology has plunged Hauser into an intellectual fray that spans from the pages of The New York Times to the rows of students who take his evolution classes at Harvard.
A psychologist, evolutionary biologist and anthropologist, Hauser has felt students grow restless as he talks about the underpinnings of morality. In one class, he recalled, a student said, "I know where you're going: Because it's universal, it's biological and therefore there's no role for religion."
Hauser recalls responding: "I'm not saying you shouldn't derive meaning from religion. I'm just telling you that at some level, the nature of the moral judgments that you make and I make is the same, even though I don't go to church and you do."
Others challenge the very premise that the underpinnings of morality are innate rather than learned, pointing to the striking differences between cultures. Why, for example, do some cultures condone "honor killings," in which male relatives kill women who have violated sexual mores?
Jesse Prinz, a professor of philosophy at the University of North Carolina, said it seems more likely to him that "morality is a human invention that developed to deal with the complexities of living in large-scale societies" and more of an "add-on" to biology.
Still, Prinz said, Hauser and other morality researchers are doing "exceedingly useful" work to illuminate "important facts about our moral intuition." The next step, he said, "is to determine where those intuitions come from. There's no doubt that biology is making some contribution, but I think it's going to be an equally important part of the story to look at the specific ways in which moral systems are informed" by culture.
Some critics also charge that Hauser's emphasis on biology negates the concept of free will and implies that all our moral choices are predetermined.
But he is not saying that at all, Hauser responds. A greater understanding of how moral minds work by no means translates into automatic prescriptions and decisions, he says.
Rather, Hauser and other morality researchers are working to tease apart "the system that allows us to intuitively, unconsciously make moral judgments about what's right or wrong," he said. "And that capacity is distinct from how we go about justifying what we do, or what we actually do." Such a system would be so fundamental that it would be present in all cultures.
For example: One ingrained moral principle that seems to span ages and cultures is that doing something bad is worse than letting something bad happen, even when the ultimate effect is the same. A boy who purposely breaks his mother's favorite vase is judged more harshly than one who stands back and lets a strong wind sweep the vase onto the floor, even though he saw what was about to happen and could have prevented it.
Such an underlying principle can help explain a common attitude toward euthanasia: that it is not acceptable to kill a terminally ill patient, but it is acceptable to stand back and withhold medical treatment that could have prolonged the patient's life.
Hauser and other morality researchers perform many of their experiments by presenting subjects with moral dilemmas drawn from a repertoire of hundreds. (For a taste, go to moral.wjh.harvard.edu/index2.html.)
For example, psychologists Douglas Medin and Dan Bartels of Northwestern University recently published a study in the journal Psychological Science that involved a series of questions meant to probe how moral decisions can shift.
They found that when people were initially asked whether they would kill two fish species by opening a dam to save 20 other species upstream, they tended to say no.
But when asked repeatedly whether they would be willing to open the dam if it killed two species, or four species, or six species, their focus shifted to the fact that more species would be saved than killed - and they became much more willing to open the dam.
The study concluded, in part, that "how people's moral values influence their decisions is subject to where their attention is directed," Bartels said. When the different questions shifted attention away from the act and toward its net results, moral repulsion to the act lessened.
In general, Hauser's morality work is part of a growing movement called experimental philosophy that has philosophers rising from their armchairs and seeking to gather hard evidence on the deep moral workings of the mind: "evidence from evolutionary theory, from comparing humans to other animals, and other methods to derive constraints on the nature of these principles, constraints we couldn't just derive by reasoning alone," said Joshua Knobe, an experimental philosopher at UNC.
In animals, Hauser and his graduate student, Justin Wood, have been experimenting with chimpanzees and tamarin and rhesus monkeys to try to discern the building blocks, or precursors, of the human moral sense. Do they make a distinction between actions that are intentional and those that are accidental?
How about between harm caused by an action versus harm caused by a failure to act? Previous work suggests that tamarins do pay attention to the intent behind an action. If one monkey shares food as an unintended side effect of its own selfish desire to get food for itself, while another monkey shares food intentionally, the generous monkey will win much more cooperation from a fellow tamarin than the accidental giver does.
Such work may also help determine which aspects of moral reasoning and behavior are uniquely human.
As for the kindergarten class and the sticker dilemma, not a single child raised a greedy hand - and they shared the stickers equally.