Perceptual Organization

Musical perception can also be studied at the level of the perceptual organization of auditory events along these basic perceptual-acoustic axes, both when the events occur simultaneously or overlap in time, and when they are extended across time sequentially. Although the previous section touched on this level, the following discussion focuses on the perceptual organization of multiple auditory events.

1. Pitches

As discussed in the previous section, there is considerable evidence supporting a critical role for the right auditory cortex in the processing of pitch, as distinguished from frequency, particularly when judgments must be made. In this section, we consider what is known about the perceptual processing of multiple pitched auditory events, occurring either simultaneously (as in musical chords of two or more notes) or sequentially (as in tone sequences or melodies).

a. Harmonic (Simultaneous) Few studies have systematically examined the localization of component processes involved in the perception of harmony or the subjective effect of particular combinations of musical pitches or pitch sequences. Mark Tramo, Jamshed Bharucha, and Frank Musiek asked a patient with bilateral temporal lobe strokes to make judgments of mistuning within a three-note chord that was preceded by a different three-note chord. Normal subjects showed facilitation of mistuning judgments when the preceding chord was harmonically related. Even though the patient was impaired in tuning judgments, with a bias toward dissonance, his judgments were still normally "primed" by the harmonic relatedness of the preceding chord. This result suggests that simple judgments of consonance, which may be dependent on primary auditory cortex, can be dissociated from higher level effects of harmonic relatedness and expected sequence, which may be mediated by secondary auditory cortical areas.

b. Tonal (Sequential) The processing of sequences of auditory events possessing pitch is obviously central to music perception. In its simplest form, each of these events consists of a harmonic complex tone with a relatively constant pitch. The neural basis for processing sequences of such tones has been explored using several different methods. Sequences of tones differing in pitch can be encoded relatively crudely in terms of the pattern of changes in frequency direction, regardless of the magnitude of change. This type of pattern is called contour. Alternatively, the specific pitches (or the size of the intervals they form) can be encoded, and the resulting melodic contour or tonal melody may be transposed in frequency.
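The distinction between contour and interval encoding can be made concrete with a short sketch. This is an illustration, not code from the original text; it uses MIDI-style semitone numbers as a stand-in for pitch.

```python
# Illustrative sketch: two crude encodings of a tone sequence,
# using MIDI-style semitone numbers to represent pitch.

def contour(pitches):
    """Encode only the direction of each pitch change: +1 up, -1 down, 0 same."""
    return [(p2 > p1) - (p2 < p1) for p1, p2 in zip(pitches, pitches[1:])]

def intervals(pitches):
    """Encode the exact size (in semitones) of each pitch change."""
    return [p2 - p1 for p1, p2 in zip(pitches, pitches[1:])]

melody = [60, 64, 62, 67]             # C4 E4 D4 G4
transposed = [p + 5 for p in melody]  # same melody, a perfect fourth higher

# Both encodings are invariant under transposition...
assert contour(melody) == contour(transposed) == [1, -1, 1]
assert intervals(melody) == intervals(transposed) == [4, -2, 5]

# ...but contour discards magnitude: a different melody with the same
# up-down-up shape shares its contour while its intervals differ.
other = [60, 61, 60, 72]
assert contour(other) == contour(melody)
assert intervals(other) != intervals(melody)
```

The sketch shows why contour alone cannot identify a melody: many different tonal melodies share one contour, whereas the interval sequence (like the melody itself) survives transposition intact.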

Only a few neurophysiological studies have examined single cell recordings in response to the individual tones contained in such stimuli. Norman Weinberger and T. McKenna studied the responses of neurons in the cat's auditory cortex to five-tone sequences. Under anesthesia, strong responses were obtained only to the first tone of each sequence, so measurements were carried out in awake animals, which showed strong responses to each tone. Discharges to the same tone varied depending on the contour of the particular sequence in which it was embedded, within both primary and secondary auditory cortical areas.

In a related study using three-tone sequences, I. Espinosa and George Gerstein recorded from primary auditory cortex in the cat, testing all permutations of the three tones. In a groundbreaking experiment, they recorded up to 24 neurons simultaneously, permitting analyses of the functional connectivity between pairs of neurons. These patterns of connectivity varied when elicited by the same tone as it occurred in different sequences, suggesting that contour and/or tonal melody are encoded in the responses of populations of neurons.

Techniques for recording from many neurons or neuronal clusters simultaneously have undergone rapid advances recently, thus permitting further investigation of these phenomena. Only then will we be able to proceed beyond these initial suggestive inquiries toward a more complete understanding of how sequences of pitches are encoded within the tonotopic auditory nervous system. Also, only then will we be able to judge the prescience of William James, who speculated in 1890 that memories might be retained as "paths ... in the finest recesses of the brain's tissue," recalled by "the functional excitement of the tracts and paths in question."

Using PET, Zatorre and colleagues measured increases in CBF in response to standard Western diatonic melodies, in contrast to amplitude envelope-matched noise bursts. In a recent (2000) reanalysis of the original 1994 report, the localization of the focus within the right superior temporal gyrus was further refined to its ventrolateral aspect, anterior to Heschl's gyrus, and the smaller increase in the left superior temporal gyrus reached significance (Fig. 1). Although the contrast between melodies and noise sequences is complex, these results suggest the involvement of secondary auditory cortical areas in the processing of melodies, with a right hemisphere predominance that is nevertheless relative.

Figure 1 Melody perception vs matched noise bursts. Horizontal, coronal, and sagittal views through the peak of CBF increases in the superior temporal gyrus, as measured with PET, superimposed on the group-averaged MRI volume. The peak is located on the lateral surface of the right superior temporal gyrus, posterior to Heschl's gyrus (original data from Zatorre et al., 1994; reanalysis and figure reproduced with permission from Zatorre and Binder, 2000).

A recent study by Aniruddh Patel and Evan Balaban measured the magnetoencephalogram while subjects listened to tone sequences varying parametrically in structure from random to music-like. Patterns of temporal synchronization between brain regions were greatest for the most structured, melody-like sequences and suggested a high degree of interhemispheric connectivity.

Although a number of studies of the effects of brain damage on the processing of tonal melody indicate a relative right hemisphere predominance, many also indicate a critical involvement of processing in the left hemisphere as well, particularly for more complex sequences and tasks. Ensuring that the melodies do not have associated lyrics, that they are novel, and that the task does not involve naming is not sufficient to eliminate potentially critical left hemisphere contributions to tone sequence perception.

For example, Peretz found that groups of patients with right or left hemisphere lesions were both impaired on a task of making same-different judgments to pairs of novel melodies, relative to normal control subjects. The right hemisphere group was more impaired for trials in which the contour was violated. However, both groups were equivalently impaired when contour was preserved, and comparison was thus based solely on pitch interval information. Peretz hypothesized that the right hemisphere is predominant for the encoding of contour and the left hemisphere for the encoding of specific pitch intervals. She further proposed that the encoding of specific pitch intervals is dependent on the formation of a representation of contour, resulting in the observed pattern of deficits. However, studies by Zatorre and colleagues found normal effects of these cues on melody recognition regardless of which hemisphere was lesioned and overall found greater and more consistent effects of right hemisphere excisions, particularly those that encroach on the primary auditory region.

Differences in tasks and in the types and extents of lesions are no doubt critical, but clearly cortical areas in the left hemisphere also contribute to melody perception.

2. Duration

The musical organization of durations is called rhythm, and it often reflects an underlying meter or repeated pattern of stress against which other temporal variations are ordered. These aspects of music perception have received less study than tonal pitch organization, and there are fewer hypotheses regarding their neural substrates.

a. Metric Peretz compared rhythm and meter discrimination by groups of right and left hemisphere-damaged patients. Although rhythm discrimination was impaired for both groups, neither group was impaired in meter discrimination. This evidence that meter discrimination is more resistant to cortical damage may implicate subcortical and/or spared cortical processing.

b. Rhythmic A recent study using PET carried out by Virginia Penhune, Robert Zatorre, and colleagues found right-sided activity in the superior temporal gyrus when the perception and manual reproduction of isochronous sequences were contrasted to perception alone. These results suggest the involvement of secondary auditory cortical areas in the right hemisphere for the simplest form of rhythmic perception and/or reproduction. When reproduction of more complex rhythmic sequences was contrasted to reproduction of a similar repeated, well-learned sequence, increases were seen in the cerebellar vermis and hemispheres for both auditory and visual stimulus presentation. This cerebellar activation may result from its contribution to the flexible production of precisely timed sequential movements and/or to the perception of complex, changing rhythms, but in either case it is not unique to auditory rhythms. The perception of visual rhythm may itself be considered an important musical function, at least for members of conductor-led ensembles.

3. Intensity

Although the intensity patterns underlying musical phrase structure have been studied extensively by cognitive scientists such as Henry Shaffer, Eric Clarke, and Caroline Palmer, the neural basis of their perceptual processing remains to be explored.

4. Integration of Tonal, Rhythmic, and Intensity Organizations

In actual musical performance, and thus in its product, musical sound, the tonal structure, rhythmic structure, and intensity pattern are usually highly interrelated. Their integrated product, the musical phrase at one level of local detail, is perhaps one of the most essential and defining elements of music. However, a neural understanding of the mechanisms supporting this integration must follow further study of the component processes outlined previously.

Finally, more complex levels of perceptual organization can be considered, such as the perception of an auditory ''scene'' and the integration of auditory and visual perception across time.

5. Auditory Scene Analysis

Spatial location is only one potential attribute of musical sounds that may be used dynamically, along with temporal and spectral cues, in order to extract auditory objects or events from complex sound mixtures. The physical separation of players in an orchestra can facilitate their extraction from the sound of the orchestra as a whole, finding an extreme in the off-stage ensembles sometimes employed in complex orchestrations. However, temporal and spectral cues also play prominent roles in the segregation of auditory objects. Sufficient differences on at least two of these three axes are needed in order to segregate an auditory object from its surround. In an experiment examining the combination of spectral and spatial differences, David Perry and Pierre Divenyi found lower thresholds for spatial separation within the left auditory hemifield in normal controls and disturbances following cortical damage in the contralateral temporoparietal junction. The normal asymmetry is suggestive of a right hemisphere preference for some aspect of auditory scene analysis based on temporal/spectral integration and could be explained simply by the demand for spectral processing.

A recent study of professional conductors found evidence from evoked potentials (EPs) of enhanced auditory localization when compared to that of nonmusicians or pianists. Although the ability to accurately localize the sources of sounds is critical for a symphony conductor, auditory spatial perception can also enhance the perception of music for casual listeners, whether real, as in live performance, or virtual, as in studio-produced recordings for two or more speakers. Specifically musical investigations must in large part follow further basic neuroscientific investigations of auditory scene analysis.

6. Auditory-Visual Integration

The integration of auditory and visual information is an active focus of basic neuroscience research that has relevance for understanding music perception. Although music can be enjoyed as a purely auditory phenomenon, without correlated visual input, live or videotaped performances of music present the listener with the correlated movements of players, singers, and conductors. Choreography set to music, live or videotaped dance, movie musicals, music videos, cartoons, and other music-video compositions present the listener/viewer with artificially assembled auditory-visual coincidences. Concomitant visual inputs can influence music perception, just as they can influence speech perception. Advances in basic research on low-level auditory-visual integration (e.g., a tone paired with a spatial grating) and on visual influences on speech perception will facilitate investigations of the neural mechanisms supporting auditory-visual integration in specifically musical contexts.

7. Absolute ("Perfect") Pitch

Some individuals, who are said to possess absolute or perfect pitch (AP), can name the pitch of a musical tone or sing a named pitch on demand without referring to any other sounds. Gottfried Schlaug measured the planum temporale, a region of secondary auditory cortex on the superior temporal plane posterior to Heschl's gyrus, and found that AP possessors exhibited an exaggerated leftward asymmetry.

Robert Zatorre, David Perry, and Christine Beckett used PET to measure CBF increases while musicians with and without AP performed a series of simple tasks involving pairs of complex tones. The musicians either simply listened to the tone pairs or decided whether they formed major or minor intervals. For listening in contrast to noise bursts, both groups showed bilateral activation of auditory cortex, but only the AP group showed an additional focus in the left posterior dorsolateral frontal cortex. However, when making interval judgments both groups showed activation in this region (Fig. 2). Studies in nonhuman primates by Michael Petrides have implicated this region as critical for conditional associative learning. AP can be conceptualized as a form of conditional associative learning, in which a stimulus attribute (tonal pitch) is arbitrarily associated with a verbal label (note name). Therefore, the activation of left posterior dorsolateral frontal cortex when simply listening to the associated stimulus (and while note names are obligatorily retrieved) may be explained by the retrieval of these associations. Its activation when musicians labeled intervals as "major" or "minor" may similarly be explained by the association between an arbitrary label and a perceptual attribute, in this case the quality of the interval.
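The mapping that an AP possessor is thought to have learned, from a tone's pitch to an arbitrary verbal label, can be illustrated with a short sketch. This is a hypothetical illustration, not the authors' method; the frequency-to-note arithmetic is the standard equal-temperament convention (A4 = 440 Hz).

```python
import math

# Hypothetical illustration of AP as a learned pitch-to-label mapping.
# Standard equal temperament: MIDI note number = 69 + 12 * log2(f / 440).
NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def note_name(freq_hz):
    """Map a frequency to the nearest equal-tempered note name and octave."""
    midi = round(69 + 12 * math.log2(freq_hz / 440.0))  # nearest MIDI note
    return f"{NOTE_NAMES[midi % 12]}{midi // 12 - 1}"

assert note_name(440.0) == "A4"
assert note_name(261.63) == "C4"   # middle C
```

The point of the sketch is the arbitrariness of the association: the note names are a cultural convention attached to particular pitch values, which is what makes AP a natural candidate for conditional associative learning.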
