"Long before it's in the papers"
October 28, 2015

RETURN TO THE WORLD SCIENCE HOME PAGE


Music processed by matching brain rhythms, study finds

Oct. 28, 2015
Courtesy of New York University
and World Science staff

Scientists say they have found how brain rhythms are used to process music, a finding that shows how our perception of notes and melodies can be used to better understand the hearing system.

The study, which appears in the journal Proceedings of the National Academy of Sciences this week, points to a new role for “cortical oscillations,” or rhythmic repetitions of nerve cell activity in the brain.

These oscillations are involved in detecting musical sequences, and musical training may enhance their function, according to the study.

“We’ve isolated the rhythms in the brain that match rhythms in music,” explains Keith Doelling, a doctoral student at New York University and the study’s lead author. “Our findings show that the presence of these rhythms enhances our perception of music and of pitch changes.”

The study found that musicians have stronger oscillatory mechanisms than non-musicians, a finding whose importance goes beyond the value of musical instruction, the authors said.

“What this shows is we can be trained, in effect, to make more efficient use of our auditory-detection systems,” said study co-author David Poeppel, a professor at the university. “Musicians, through their experience, are simply better at this type of processing.”

Previous research has shown that brain rhythms synchronize very precisely with speech, enabling us to parse continuous streams of it; in other words, they help explain how we isolate syllables, words, and phrases from speech, which, as we hear it, is not marked by spaces or punctuation.

But it hasn’t been clear what role such cortical brain rhythms, or oscillations, play in processing other types of natural and complex sounds, such as music.

The researchers conducted three experiments using a technique called magnetoencephalography, which allows measurements of the tiny magnetic fields generated by brain activity.

Participants in the study were asked to detect short pitch distortions in 13-second clips of classical piano music by Bach, Beethoven, and Brahms that varied in tempo, from half a note to eight notes per second.

For music faster than one note per second, both musicians and non-musicians showed cortical oscillations that synchronized with the note rate of the clips, the authors said. In other words, everyone effectively used these oscillations to process the sounds, although the musicians’ brains synchronized more closely with the musical rhythms.
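One way to picture this kind of synchronization test is to ask whether a recorded brain signal carries extra power at the same rate as the notes. The short Python sketch below is only an illustration of that idea, not the authors’ analysis pipeline: the sampling rate, note rate, and simulated signal are all assumed for the example.

    import numpy as np

    # Illustrative sketch only: does a noisy signal carry a rhythm at the note rate?
    fs = 500.0                      # sampling rate in Hz (assumed)
    note_rate = 4.0                 # notes per second in the clip (assumed)
    t = np.arange(0, 13, 1 / fs)    # a 13-second clip, as in the study

    # Simulated "brain" signal: a weak oscillation at the note rate buried in noise.
    rng = np.random.default_rng(0)
    signal = 0.5 * np.sin(2 * np.pi * note_rate * t) + rng.normal(size=t.size)

    # If spectral power is concentrated at note_rate, the signal is synchronized
    # ("entrained") to the rhythm of the music.
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(t.size, d=1 / fs)
    peak = spectrum[np.argmin(np.abs(freqs - note_rate))]
    print(f"Power at {note_rate} Hz is {peak / np.median(spectrum):.1f}x the median power")

A peak at the note rate that stands well above the surrounding spectrum would indicate a brain rhythm matching the music’s rhythm; in the study, such peaks were stronger in musicians.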

But only musicians showed oscillations that synchronized with unusually slow clips.

This difference, the researchers say, may suggest that non-musicians are unable to process unusually slow music as a continuous melody, hearing it instead as a series of individual notes. Moreover, musicians detected the pitch distortions far more accurately, as evidenced by corresponding cortical oscillations. Brain rhythms, they add, therefore seem to play a role in parsing and grouping sound streams into “chunks” that are then analyzed as speech or music.



