"Long before it's in the papers"
January 28, 2015

RETURN TO THE WORLD SCIENCE HOME PAGE


Brain mishaps produce “cold” morality

March 21, 2007
Courtesy University of Southern California
and World Science staff

Imagine that someone you know has AIDS and plans to infect others, some of whom will die. Your only options are to let it happen or to kill the person. Do you pull the trigger?

Most people waver or say they couldn’t, even if they agree that in theory they should. But a new study reports that people with damage to one part of the brain make a less personal calculation. The logical choice, they say, is to sacrifice one life to save many.

The research shows that emotion plays a key role in moral decisions, scientists claim: if certain emotions are blocked, we make decisions that—right or wrong—seem unnaturally cold.

Past studies have linked damage to some brain areas with a lack of any discernible conscience, part of a syndrome commonly called psychopathy. The new study, by contrast, identified a region of brain damage tied to what the researchers portrayed as a narrower deficit: one that strips morality of an emotional component while leaving its logical part intact.

The scientists presented 30 males and females with scenarios pitting immediate harm to one person against future harm to many. Six participants had damage to the ventromedial prefrontal cortex, a small region behind the forehead; 12 had brain damage elsewhere; another 12 had no damage.

The scenarios in the study were extreme, but the core dilemma isn’t. Should one confront a co-worker, challenge a neighbor, or scold a loved one to uphold the greater good? The subjects with ventromedial prefrontal damage stood out in their stated willingness to harm an individual—a prospect that usually generates strong aversion, researchers said.

“They have abnormal social emotions in real life. They lack empathy and compassion,” said Ralph Adolphs of the California Institute of Technology in Pasadena, Calif., one of the researchers.

“In those circumstances most people… will be torn. But these particular subjects seem to lack that conflict,” said Antonio Damasio of the University of Southern California, in Los Angeles, another of the scientists.

“Our work provides the first causal account of the role of emotions in moral judgments,” added a third member of the research team, Marc Hauser of Harvard University in Cambridge, Mass. The study appears March 21 in the advance online edition of the research journal Nature.

What’s “astonishing,” Hauser added, is “how selective the deficit is... [it] leaves intact a suite of moral problem solving abilities, but damages judgments in which an aversive action is put into direct conflict with a strong utilitarian outcome.” Utilitarianism is the belief that the top priority in ethics should be what’s best for the greatest number of people.

Humans often deviate from this principle because they recoil from directly harming one another. This aversion is “a combination of rejection of the act [and] compassion for that particular person,” Damasio said. The question, Adolphs asked, is whether “social emotions” such as compassion are “necessary to make these moral judgments.”

The study’s answer will inform a classic philosophical debate on whether humans make moral judgments based on norms and societal rules, or based on their emotions, the scientists predicted. It also holds another implication for philosophy, they said: it shows that humans are neurologically unfit for strict utilitarian thinking, and thus suggests neuroscience could test different philosophies for compatibility with human nature.

