"Long before it's in the papers"
June 03, 2013

RETURN TO THE WORLD SCIENCE HOME PAGE


Study suggests we can flip our opinions in moments without knowing

Sept. 20, 2012
Courtesy of Lund University
and World Science staff

Using nothing more than a trick adapted from stage magic, scientists say they have induced volunteers to reverse their stated opinions on moral topics without noticing the inconsistency.

The findings, published Sep. 19 in the research journal PLoS ONE, expose surprising flexibility in human moral attitudes and could have implications for survey-taking, the investigators say.

In the research, led by Lars Hall of Lund University in Sweden, experimenters approached people as they strolled through a park and presented them with a survey on some moral issues. Some of the surveys covered basic moral principles, while others covered hot news topics.

The survey procedure step by step. (1) The questionnaire is attached to a clipboard. A paper slip with moral statements is attached to the first page of the questionnaire, concealing the same set of statements printed in negated form. (2) The participants rate their agreement with the statements on the first page of the questionnaire, (3) turn to the second page, and (4) rate their agreement with a second set of principles. (5) When the participants are asked to flip the survey back to the first page to discuss their opinions, the add-on paper slip from (1) sticks to a patch of stronger glue on the backside of the clipboard and remains attached there. When the participants now read the manipulated statements, the meaning has been reversed (the equivalent of moving the actual rating score to the mirror side of the scale). (6) During the debriefing, the experimenter demonstrates the workings of the paper slip to the participants and explains how the manipulation led to the reversal of their position. (Credit: L. Hall et al, PLoS ONE)
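As a reading aid, and not part of the study's materials, the caption's "mirror side of the scale" point can be expressed as simple arithmetic: reversing a statement's wording flips the rating it implies about the original wording. The short Python sketch below illustrates that mapping; the 1-to-9 agreement scale is assumed here for illustration and is not necessarily the scale used in the experiment.

    # Illustrative sketch only: reversing a statement's wording is equivalent
    # to mirroring the participant's rating on the agreement scale.
    # The 1-9 scale is an assumption, not taken from the study.
    SCALE_MIN, SCALE_MAX = 1, 9

    def mirrored_rating(rating: int) -> int:
        # A rating of r toward a statement reads as (min + max - r)
        # toward that statement's negation.
        return SCALE_MIN + SCALE_MAX - rating

    print(mirrored_rating(9))  # -> 1: "strongly agree" reads as "strongly disagree"
    print(mirrored_rating(5))  # -> 5: the scale midpoint is unaffected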


In each case, participants were asked to state whether they agreed or disagreed with various statements, such as “even if an action might harm the innocent, it can still be morally permissible to perform it.”

Each survey had two pages. But the paper used was subjected to a “magic trick” that caused the first page to change while the participant was distracted filling out the second.

When the participant returned to look at the first page, two of the original statements appeared in their opposite form, though the answers remained unchanged. Thus, a reading of that page before and after the alteration would make it seem as though conflicting opinions had been expressed.

The sleight-of-hand was accomplished with no high-tech tools, just a clipboard, a bit of glue and a questionnaire page that was made up of two layers of paper. (For a video illustration of the experiment, see http://www.lucs.lu.se/cbq/; the full report on the study appears in PLoS ONE.)

After the participants returned to the original page, they were asked to discuss their responses. The researchers found that many participants supported their answers as seen on the revised page, even though these were in effect opposite to what they had originally intended to express.

While close to half of participants did catch on that at least something was wrong, “a full 53 percent of the participants argued unequivocally for the opposite of their original attitude in at least one of the manipulated trials,” the researchers wrote.

These participants went beyond just casually confirming what was on the page. They “often constructed coherent and unequivocal arguments supporting the opposite of their original position,” the investigators wrote, suggesting “a dramatic potential for flexibility in our moral attitudes.”

Part of the explanation may be people’s surprisingly high capacity for after-the-fact “rationalization” of views that they have expressed, or even just thought they expressed, the researchers wrote. Participants were debriefed and interviewed after the surveys to find out what, if anything, they had noticed amiss.

The findings “could have significant impact on research that uses self-reported questionnaires,” Hall said. “Either we would have to conclude that many participants hold no real attitudes about the topics we investigate, or that standard survey scales fail to capture the complexity of the attitudes people actually hold.”

