
Figure 7 Auditory event-related potentials (ERPs) recorded from the scalp of a human subject in response to the presentation of brief sounds. The ERP is divided into three epochs: a short latency auditory brain stem response (ABR) (A), a middle latency response (MLR) (B), and a late, long-latency response (C) (adapted with permission from Picton, T.W., et al., EEG Clin Neurophysiol. 36, 179-190, 1974).


firing pattern is reflected in the sequence of peaks and valleys that make up the resulting evoked potential waveform, referred to as the event-related potential (ERP). The ERP typically is of very low amplitude and usually indistinguishable from the ongoing EEG. Thus, in practice, averaging the responses to hundreds or thousands of stimulus presentations is needed to reveal the ERP temporal waveform. The ERP is traditionally divided into three time epochs (Fig. 7). The seven waves arising during the first 10-15 msec after stimulus onset (Fig. 7A) constitute the auditory brain stem response (ABR). Intraoperative recording from the human auditory nerve, cochlear nuclei, and inferior colliculi has provided the most compelling evidence that waves I and II are generated by the auditory nerve, wave III by the cochlear nuclei, and wave IV by the SOC. Wave V probably originates in or below the inferior colliculus, whereas waves VI and VII originate in the ICC or thalamocortical projection system. Thus, the temporal waveform of the ERP, its spatial distribution on the scalp, and its sensitivity to various acoustic parameters can provide information about the transmission and encoding properties of large neuronal assemblies at all levels of the central auditory pathway. The ABR is highly stable across a wide range of conditions, including changes in consciousness and arousal such as attention, sleep, wakefulness, and sedation. It is present at birth and matures by approximately 18 months of age. Today, it is used routinely to screen for hearing impairment in newborns. Overall, the ABR has proven to be a very useful, noninvasive, and objective way to test for hearing impairment and other disorders of the peripheral and central auditory pathways, especially in subjects who are unwilling or unable to participate in behavioral testing.
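To make the averaging step concrete, the sketch below (Python; the array names `eeg`, `onset_samples`, and the sampling rate `fs` are illustrative assumptions, not from the original text) cuts a fixed window around each stimulus onset, baseline-corrects it against the pre-stimulus interval, and averages across presentations. Because the ongoing EEG is not time-locked to the stimulus, it tends to average toward zero while the stimulus-locked ERP remains, with the signal-to-noise ratio improving roughly as the square root of the number of trials.

```python
import numpy as np

def average_erp(eeg, onset_samples, fs, pre_ms=10.0, post_ms=500.0):
    """Average stimulus-locked EEG epochs to reveal the ERP.

    eeg            : 1-D array of continuous single-channel EEG (microvolts)
    onset_samples  : sample indices of the stimulus onsets
    fs             : sampling rate in Hz
    pre_ms/post_ms : epoch window around each onset, in milliseconds
    """
    pre = int(round(pre_ms * fs / 1000.0))
    post = int(round(post_ms * fs / 1000.0))
    epochs = []
    for onset in onset_samples:
        if onset - pre < 0 or onset + post > len(eeg):
            continue  # skip presentations whose window runs off the recording
        seg = eeg[onset - pre:onset + post].astype(float)
        seg -= seg[:pre].mean()  # baseline-correct to the pre-stimulus interval
        epochs.append(seg)
    erp = np.mean(epochs, axis=0)  # non-time-locked EEG cancels; the ERP remains
    t_ms = np.arange(-pre, post) / fs * 1000.0
    return t_ms, erp
```

In practice the same routine would be applied with a short window for visualizing the brain stem components and with longer windows for the middle and late components described below.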

A cluster of three major wave peaks with onset latencies beyond the ABR components (out to about 50-60 msec) is widely recorded over frontal and temporal cortex and is referred to as the middle latency response (MLR; Fig. 7B). The generation of the MLR may involve the interaction of many brain structures within and outside of the classic lemniscal auditory pathways. Lesions of auditory cortex in humans and animals disrupt MLR waveforms, and the MLR is influenced by the subject's state of arousal, suggesting involvement of the reticular formation. The MLR is used clinically to assess hearing thresholds in the low-frequency range and to evaluate auditory pathway function both in hearing individuals and in subjects who have cochlear implants.

When trains of brief stimuli are presented at rates of approximately 40 Hz, the MLR components become time-locked to the individual stimuli. This 40-Hz response to an appropriately timed stimulus train is exhibited by other sensory systems as well, indicating that it represents a general mechanism for recognizing a sensory event. Spontaneous electrical oscillations in the human brain at frequencies of approximately 40 Hz, and their resetting by a sensory stimulus, have been postulated to reflect cortical mechanisms involved in the temporal binding of sensory stimuli and perceptual scene segregation.
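One way to quantify such time-locking, sketched below under the assumption that an averaged response to repeated 40-Hz stimulus trains is already available, is to measure the spectral amplitude of the averaged waveform at the stimulation rate; a response that is truly locked to the train shows a clear peak at that frequency.

```python
import numpy as np

def steady_state_amplitude(averaged_response, fs, stim_rate_hz=40.0):
    """Spectral amplitude of an averaged response at the stimulus-train rate.

    averaged_response : 1-D array, response averaged over many train presentations
    fs                : sampling rate in Hz
    stim_rate_hz      : repetition rate of the stimulus train (e.g., 40 Hz)
    """
    n = len(averaged_response)
    windowed = averaged_response * np.hanning(n)        # taper to reduce spectral leakage
    spectrum = np.fft.rfft(windowed)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    idx = int(np.argmin(np.abs(freqs - stim_rate_hz)))  # bin closest to the stimulation rate
    return 2.0 * np.abs(spectrum[idx]) / n              # single-sided amplitude at ~40 Hz
```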

Finally, a series of peaks and valleys is recorded with latencies exceeding 50 msec (Fig. 7C). The amplitudes of these waves are larger than those of the earlier components but also more variable, depending on the conscious state of the subject and the stimulus paradigm used to evoke them. This has led to the suggestion that the late components represent the convergence of input from a number of forebrain systems whose interactions are related to the attentive or cognitive state of the subject.

The currents that give rise to the electrical potentials recorded on the scalp also give rise to weak magnetic fields. These fields can be measured by an array of sensitive magnetometers (superconducting quantum interference devices) surrounding the head. The method is known as magnetoencephalography (MEG), and the response to a sound is referred to as the auditory-evoked magnetic field (AEF) (Fig. 8). AEF data reveal many of the same cortical processes as the ERP. Both methods yield similar waveforms with excellent temporal resolution, capable of tracking neural events with high fidelity. In addition, MEG offers relatively high spatial resolution, on the order of millimeters. Because of its differential sensitivity to currents flowing tangentially to the scalp, MEG is particularly suited for noninvasive study of cortex buried within fissures, including auditory cortex within the Sylvian fissure on the superior temporal plane. Because any recorded electrical potential or magnetic field may have more than one possible source configuration in the brain (the so-called inverse problem), a source model is applied in which the orientation, strength, and location of equivalent current dipoles are estimated so as to best account for the measured data on statistical grounds.
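The equivalent-current-dipole idea can be illustrated with a minimal sketch. Because the predicted field is linear in the dipole moment once a location is fixed, one simple strategy is to scan a grid of candidate locations, solve a linear least-squares problem for the moment at each, and keep the location that leaves the smallest residual. The `leadfield` forward model used here (for example, a spherical-head approximation) is assumed rather than implemented, and this single-dipole grid search is only one of several source-estimation strategies in use.

```python
import numpy as np

def fit_single_dipole(measured, candidate_locations, leadfield):
    """Grid-search fit of one equivalent current dipole to a measured field map.

    measured            : (n_sensors,) field measured at one latency
    candidate_locations : iterable of candidate 3-D source positions
    leadfield(loc)      : assumed forward model returning an (n_sensors, 3) matrix
                          that maps a dipole moment at `loc` to the sensor signals
    """
    best = None
    for loc in candidate_locations:
        L = leadfield(loc)                       # field is linear in the dipole moment
        moment, *_ = np.linalg.lstsq(L, measured, rcond=None)
        residual = measured - L @ moment
        err = float(residual @ residual)         # unexplained power at this location
        if best is None or err < best[2]:
            best = (loc, moment, err)
    loc, moment, err = best
    residual_variance = err / float(measured @ measured)  # 0 would be a perfect fit
    return loc, moment, residual_variance
```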


Figure 8 Electromagnetic waves generated by the cortex and recorded on or near the head in response to an acoustic stimulus. The event-related potential (ERP) represents electrical potentials recorded with an electrode in contact with the scalp. The auditory-evoked magnetic field (AEF) is the magnetic field recorded from a detector very near the head. The major component is a long-latency response because it occurs approximately 100 msec after stimulus onset (adapted with permission from Hari, R., Adv. Audiol. 6, 222-282, 1990).


The ERP and AEF methods have been used effectively to study the neural activity associated with attentional and cognitive processes. Figure 9 (top) shows an example of the ERP obtained when a subject heard the same sound repeatedly and then when a rare, or oddball, stimulus was introduced at random times. The major changes in the waveform occurred relatively late, after about 200 msec. Similarly, the AEF waveforms (Fig. 9, bottom) were obtained under conditions in which the subject was awake but reading and when the subject was attending to the stimulus. As with the ERP, the late components are the ones most affected by this difference in attentional state. These late components are therefore thought to reflect the cognitive processing of auditory stimuli.
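In oddball paradigms like the one in Fig. 9, the late, attention-related activity is commonly isolated by subtracting the averaged response to the frequent stimulus from the averaged response to the rare stimulus and then measuring the largest deflection in a late time window. A minimal sketch, with hypothetical array inputs on a shared time base:

```python
import numpy as np

def oddball_difference_wave(erp_rare, erp_frequent):
    """Difference wave isolating activity specific to the rare (oddball) stimulus."""
    return np.asarray(erp_rare, float) - np.asarray(erp_frequent, float)

def peak_latency_ms(wave, t_ms, window=(150.0, 400.0)):
    """Latency of the largest deflection within a late time window (milliseconds)."""
    wave, t_ms = np.asarray(wave, float), np.asarray(t_ms, float)
    mask = (t_ms >= window[0]) & (t_ms <= window[1])
    idx = int(np.argmax(np.abs(wave[mask])))
    return float(t_ms[mask][idx])
```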

b. Functional Imaging: PET and fMRI

In recent years, the use of noninvasive functional imaging techniques (fMRI and PET) has provided a wealth of new information concerning the location and extent of regions of the human brain that are metabolically activated by acoustic stimulation. Both methods provide indirect evidence of neuronal activation, based on the facts that, when neurons become more active, they require more oxygen and glucose, and that this increased demand is met by a compensatory increase in regional cerebral blood flow. Using these indirect imaging methods, investigators have noted that broad regions of the supratemporal plane and lateral superior temporal gyrus are bilaterally activated by a wide
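The indirect, blood-flow-based nature of these signals is usually handled by comparing the measured time course against a predicted hemodynamic response, obtained by convolving the stimulus timing with a hemodynamic response function. The sketch below uses a double-gamma-style response with illustrative parameter values; the function names, parameters, and sampling choices are assumptions for illustration rather than details from the text.

```python
import numpy as np
from math import gamma

def gamma_hrf(t, peak=6.0, undershoot=16.0, ratio=1.0 / 6.0):
    """Double-gamma-style hemodynamic response; parameter values are illustrative."""
    pos = t ** (peak - 1) * np.exp(-t) / gamma(peak)
    neg = t ** (undershoot - 1) * np.exp(-t) / gamma(undershoot)
    return pos - ratio * neg

def expected_bold(stimulus_onsets_s, duration_s, tr_s=2.0, dt=0.1):
    """Predicted BOLD time course for brief stimuli at the given onset times (seconds)."""
    t_hi = np.arange(0.0, duration_s, dt)                       # fine-grained time grid
    impulses = np.zeros_like(t_hi)
    for onset in stimulus_onsets_s:
        impulses[int(np.argmin(np.abs(t_hi - onset)))] = 1.0    # impulse at each stimulus
    hrf = gamma_hrf(np.arange(0.0, 32.0, dt))                   # ~32 s of hemodynamic response
    bold = np.convolve(impulses, hrf)[: len(t_hi)]              # each stimulus evokes a delayed response
    return bold[:: int(round(tr_s / dt))]                       # resample at the scanner repetition time
```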

