PASADENA, Calif.--In the biblical story in which two women bring a baby to King Solomon, both claiming to be the mother, he suggests dividing the child so that each woman can have half. Solomon's proposed solution, meant to reveal the real mother, also illustrates an issue central to economics and moral philosophy: how to distribute goods fairly.
Now, researchers at the California Institute of Technology have discovered that reason struggles with emotion to find equitable solutions, and have pinpointed the region of the brain where this takes place. The concept of fairness, they found, is processed in the insular cortex, or insula, which is also the seat of emotional reactions.
"The fact that the brain has such a robust response to unfairness suggests that sensing unfairness is a basic evolved capacity," notes Steven Quartz, an associate professor of philosophy at Caltech and author of the study, voicing a sentiment that anyone who has seen children fight over a treat can relate to.
"The movement to look into the neural basis for ethical decision making is only about seven years old," Quartz adds. "This is the first study where people made real decisions with real consequences."
The subjects in the study, 26 men and women between 28 and 55 years old, faced a real-world moral dilemma. They began the experiment by reading a short biography of each of the 60 orphans at the Canaan Children's Home in Uganda. The orphanage would receive a sum of money that depended on the decisions the subjects made. In the end, $2,279 was donated.
While a functional magnetic resonance imaging (fMRI) machine scanned their brains for peak activity regions, the participants each had about eight seconds to decide how to distribute meals among groups of children in different scenarios. In one, their choice would grant either four extra meals to each of two children or six extra meals to one child. The children they didn't choose would get nothing. In another scenario, the kids had been given extra meals and the subjects had to decide whether it was better to take six meals away from each of two kids, or ten meals away from one.
Ultimately the subjects' brains made a choice, and Quartz and his collaborators got to peek into where that calculation was made. "You wonder what is happening at different levels--is your brain's decision right or not?" Quartz says.
When they got to give food to the children, the study participants' orbitofrontal cortex, the brain's reward region, lit up. When they instead had to take food away, the insula, the emotional processor, was activated.
Quartz suggests that the insula was triggered by the inequity of the choices. The activity varied considerably across subjects, indicating that individual differences in moral sensitivity may be rooted in the strength of the biological responses, he adds.
"The emotional response to unfairness pushes people from extreme inequity and drives them to be fair," Quartz says. This observation, he adds, suggests that "our basic impulse to be fair isn't a complicated thing that we learn."
This study, which appears in the May 8 early online edition of the journal Science, is the first to examine "neuroethics"--the neural underpinnings of moral decision making--with real-world consequences. It may also help guide policy decisions about how to distribute resources. And, adds Jonathan Katz, chair of Caltech's Division of the Humanities and Social Sciences, "It's one of the first studies to bridge humanities research with social science and biology," a central effort at Caltech.
Other authors are former Caltech graduate students Ming Hsu and Cédric Anen. Hsu is now a postdoc at the University of Illinois at Urbana-Champaign.