Comparing the processing of music and language meaning using EEG and fMRI provides evidence for similar and distinct neural representations

PLoS One. 2008 May 21;3(5):e2226. doi: 10.1371/journal.pone.0002226.

Abstract

Recent demonstrations that music is capable of conveying semantically meaningful information have raised several questions as to what the underlying mechanisms of establishing meaning in music are, and whether the meaning of music is represented in a fashion comparable to language meaning. This paper presents evidence that expressed affect is a primary pathway to musical meaning and that meaning in music is represented in a fashion very similar to language meaning. In two experiments using EEG and fMRI, it was shown that single chords varying in harmonic roughness (consonance/dissonance), and thus in perceived affect, could prime the processing of subsequently presented affective target words, as indicated by an increased N400 and activation of the right middle temporal gyrus (MTG). Most importantly, however, when primed by affective words, single chords incongruous with the preceding affect also elicited an N400 and activated the right posterior superior temporal sulcus (STS), an area implicated in processing the meaning of a variety of signals (e.g. prosody, voices, motion). This provides an important piece of evidence that musical meaning is represented in a fashion similar to, yet distinct from, language meaning: both elicit an N400, but they activate different portions of the right temporal lobe.

Publication types

  • Comparative Study

MeSH terms

  • Adult
  • Brain / physiology*
  • Electroencephalography
  • Female
  • Humans
  • Language*
  • Magnetic Resonance Imaging
  • Male
  • Music*