Sign language in the brain

Sign language refers to any natural language that uses visual gestures produced with the hands and body to express meaning. As with speech, the left hemisphere of the brain is dominant for producing and comprehending sign language.[1] In 1861, Paul Broca studied patients who could understand spoken language but could not produce it. The damaged region, located in the inferior frontal gyrus of the left hemisphere (Brodmann areas 44 and 45), was named Broca's area. Soon after, in 1874, Carl Wernicke studied patients with the reverse deficit: they could produce spoken language but could not comprehend it. The damaged region, located in the posterior superior temporal gyrus of the left hemisphere (Brodmann area 22), was named Wernicke's area.

Signers with damage in Broca's area have problems producing signs; those with damage in Wernicke's area, in the temporal lobe of the left hemisphere, have problems comprehending signed languages. Early on, it was noted that Broca's area lies near the part of the motor cortex controlling the face and mouth, while Wernicke's area lies near the auditory cortex. These motor and auditory areas are important in spoken language production and processing, but their connection to signed languages had yet to be uncovered. For this reason, the left hemisphere was described as the verbal hemisphere, with the right hemisphere deemed responsible for spatial tasks. This classification was used to dismiss signed languages as unequal to spoken language until it became widely accepted that, given the similarities in cortical connectivity, the two are linguistically and cognitively equivalent.

In the 1980s, deaf patients who had suffered left-hemisphere strokes were studied to explore the brain's connection with signed languages. The left perisylvian region was found to be functionally critical for language, both spoken and signed.[1][2] Its location near several key auditory processing regions had fostered the belief that language processing required auditory input, a belief used to discredit signed languages as "real languages."[2] This research opened the door to linguistic analysis of, and further research on, signed languages. Signed languages, like spoken languages, are highly structured linguistic systems with their own phonological, morphological, and syntactic characteristics. Despite some differences between spoken and signed languages, the associated brain areas share much in common.[3]

Figure 1. Schematic of the ascending auditory pathway
  1. Campbell, Ruth (June 29, 2007). "Sign Language and the Brain". Journal of Deaf Studies and Deaf Education. 13 (1): 3–20. doi:10.1093/deafed/enm035. PMID 17602162.
  2. Campbell, Ruth; et al. (2008). "Sign Language and the Brain: A Review". Journal of Deaf Studies and Deaf Education. 13 (1): 3–20. https://www.jstor.org/stable/42658909.
  3. Poizner, H.; Klima, E. S.; Bellugi, U. (1987). What the Hands Reveal About the Brain. Cambridge, MA: The MIT Press.