Language lateralisation and early right ear deafness: was Wernicke right?


The effects of early right ear deafness on lateralisation of auditory language functions are not fully known. A 36 year old right handed man with a history of perinatal right ear deafness, undergoing evaluation for surgical treatment of seizures that began at age 10 years, was studied. Language lateralisation testing by intracarotid sodium amobarbital injection showed receptive and expressive language functions to be strongly lateralised to the left hemisphere. Results with intracarotid sodium amobarbital injection further suggested that transmission of auditory input to the patient's left hemisphere was partially dependent on ipsilateral left ear pathways. Cortical language mapping through implanted subdural electrodes localised auditory language functions to traditional left posterior perisylvian language areas. These results suggest that early right ear deafness does not impede left hemisphere lateralisation and localisation of auditory language functions. Moreover, transmission of auditory information to the patient's left hemisphere seems to be accomplished, in part, by recruitment of ipsilateral left ear pathways.

  • language lateralisation
  • right ear deafness

Almost a century ago, Carl Wernicke hypothesised that early right ear deafness would result in language dominance in the right hemisphere.1 This hypothesis rested on the assumptions that auditory input is required for language lateralisation and that it is transmitted exclusively from the contralateral ear. Both assumptions were subsequently challenged by the finding that each ear projects to both hemispheres, and by reports of expressive aphasia in congenitally deaf people with left hemispheric lesions.2 This second finding suggests that language will lateralise to the left hemisphere in the absence of auditory input. Recent functional neuroimaging studies of congenitally deaf patients report left hemispheric activation during sign language tasks, supporting the notion that auditory input is not required for left hemispheric language lateralisation.3 4 What cannot be determined from such cases, however, is whether lateralisation of receptive (auditory) language functions is affected by absence of early unilateral, and in particular right ear, auditory input. Although each ear projects to both hemispheres, projections to the contralateral hemisphere take precedence over ipsilateral projections.5 Consequently, the primary source of auditory input to the left hemisphere is the right ear, as evidenced by the well established right ear advantage for speech on tests of dichotic listening.6 Atypical patterns of language lateralisation might be expected, therefore, in the absence of early right ear input, as Wernicke predicted. This hypothesis is relevant in the light of recent reports that the right hemisphere is capable of supporting auditory language functions even in the adult brain.7 8

Children with early right ear deafness are at risk for developmental language disorders.9 Their language learning difficulties have been attributed to the degradation of acoustic information transmitted by interhemispheric pathways from the right hemisphere and/or ipsilateral left ear pathways.10 Alternatively, such impairments could, as Wernicke suggested, reflect partial or complete lateralisation of auditory language functions to the less linguistically competent right hemisphere.1 11 We had the opportunity to test this hypothesis directly in an adult patient with early unilateral right ear deafness undergoing evaluation for surgical treatment of seizures.

Case report

A 36 year old right handed man with a profound, unilateral, right ear sensorineural hearing loss, which occurred concurrent with mumps at the age of 2 months, was referred for resection surgery for treatment of intractable complex partial seizures. He began having seizures at the age of 10 years, at a rate of 3–4 seizures a month. Scalp EEG identified a left anterior temporal seizure focus. Neurological examination was normal, as was MRI of the brain. There was no family history of seizures or left handedness. The patient had 14 years of education and was self employed. He described his academic performance as average, noting that he was a poor reader and speller. The patient gave informed consent for all testing.

As part of his presurgery evaluation, the patient underwent neuropsychological testing, language lateralisation testing by sodium amobarbital injection, and cortical language mapping through implanted subdural electrodes. All testing was completed within a 1 month period. The patient underwent initial surgery for implantation of the electrodes, followed by a second operation 10 days later for electrode removal and resection. At the time of testing, the patient's daily medications included Tegretol (800 mg) and phenobarbital (90 mg). He performed within normal limits on all neuropsychological tests, including tests of attention, auditory verbal memory (digit span), picture naming (Boston naming test), and word reading. His performance on the Boston diagnostic aphasia test was within normal limits, although numerous spelling errors were noted. His speech was described as fluent and grammatical. He obtained a full scale IQ of 103 (verbal IQ=104, performance IQ=94) on the WAIS-R.12 Audiological evaluation confirmed a stable severe to profound right ear sensorineural hearing loss at 250–8000 Hz, with no measurable word recognition. Left ear thresholds were normal (for example, ⩽25 dB) at 250–8000 Hz, with excellent (96%) word recognition in quiet conditions. However, left ear word recognition in background noise (multitalker babble, +10 dB signal to noise ratio) was impaired (72% correct) compared with that of normal hearing subjects (⩾92% correct).

Language lateralisation was tested by sequential intracarotid injections of sodium amobarbital. Contralateral limb weakness and unilateral EEG slowing confirmed cerebral deactivation. Auditory comprehension was tested by spoken word-picture matching and semantic judgments (for example, “is ice hot?”). Expressive language was tested by naming and word reading. Deactivation of the left hemisphere resulted in speech arrest and severely impaired receptive language abilities. Conversely, all language functions were spared with right hemispheric deactivation.

Cortical language mapping was performed through a 6×8 electrode array that was implanted subdurally over the lateral left cortical surface and remained indwelling for 10 days. Electrodes were 3 mm in diameter and spaced 10 mm apart in medical grade silastic. The three top rows of the array, containing eight electrodes each, were positioned over the inferior frontal lobe and parietal lobe. The three bottom rows of electrodes were positioned over the temporal lobe, covering the superior and middle temporal gyri and extending posteriorly to the end of the sylvian fissure. Electrode locations were determined by intraoperative photographs and CT. Electrical interference was produced by passing an electrical current between adjacent electrode pairs for 5 second periods, at a maximal current level of 15 mA.13 Receptive and expressive language functions were tested at all electrode pairs except those where sensory or motor effects, or both, were elicited. Auditory comprehension was assessed by the Token test14 and by spoken word-picture matching. We also tested speech perception by asking the patient to judge 25 auditory word pairs as the same or different (for example, pat-bat). Expressive language testing included picture naming, word and paragraph reading, and word repetition. At baseline, the patient performed ⩾98% correctly on all language tests. Deficits during electrical interference testing were identified as more than one deviation from the patient's baseline responses. Trials with after discharges were excluded. Results showed that all language functions mapped to traditional left perisylvian language areas. Specifically, auditory comprehension and naming were associated with multiple electrode sites in the superior and middle temporal gyri, the inferior parietal lobe, and the inferior frontal lobe. Speech arrest was induced at two electrode pairs in the left inferior frontal lobe. Speech perception, auditory comprehension, and naming were impaired at a single electrode pair located in the left posterior superior temporal gyrus, in the classic Wernicke's area.


Discussion

Intracarotid sodium amobarbital injection testing disclosed that, despite the absence of early right ear input, all expressive and receptive language functions lateralised exclusively to the patient's left hemisphere. Moreover, auditory language functions were localised by cortical stimulation to Wernicke's area, as typically seen in people with normal hearing.3 13 These results indicate that despite our patient's history of early right ear deafness, auditory language functions are strongly lateralised to left posterior perisylvian language areas.

Auditory input from the isolated left ear can be transmitted to the left hemisphere by two pathways: ipsilateral left ear projections or, more indirectly, interhemispheric transfer of auditory information projected from the left ear to the right hemisphere. Although the small number of stimuli presented during sodium amobarbital injection testing precludes definitive claims about which pathway was used, the sparing of auditory comprehension after right hemispheric deactivation suggests that our patient was using ipsilateral pathways from the left ear to access language systems in the left hemisphere. This may account for our patient's impaired word recognition in noise, as ipsilateral auditory pathways are considered less efficient and, therefore, more vulnerable to degradation of the acoustic signal, as occurs in background noise.10 Furthermore, the prevalence of adverse listening conditions may contribute to the poor academic performance of those, including our patient, with early unilateral right ear hearing losses.9

Our results challenge Wernicke's hypothesis that early right ear deafness results in atypical patterns of language lateralisation. Previous studies of congenitally deaf persons have shown that expressive and non-auditory receptive language functions lateralise to the left hemisphere.2-4 The bilateral nature of their hearing loss, however, precluded investigation of auditory language functions. The clinical circumstances of our patient provided a rare opportunity to investigate the effects of early right ear deafness on lateralisation of auditory language functions. Our results suggest that auditory language functions follow traditional patterns of left hemispheric lateralisation and localisation, despite early right ear deafness, and that access to the left hemisphere is accomplished, in part, by recruitment of ipsilateral left ear pathways.


This study was supported by NIDCD grant R29–03081. We thank Dr John Hart and Moona Alidoost for technical assistance. Special thanks to Drs. Stephen Reich and Pamela Talalay for their thoughtful comments.

