Sign language refers to any natural language that uses visual gestures produced with the hands, face, and body to express meaning. As with speech, the brain's left hemisphere is dominant for producing and understanding sign language.[1] In 1861, Paul Broca studied patients who could understand spoken language but could not produce it. The damaged region, later named Broca's area, is located in the left hemisphere's inferior frontal gyrus (Brodmann areas 44 and 45). Soon after, in 1874, Carl Wernicke studied patients with the reverse deficit: they could produce spoken language but could not comprehend it. The damaged region, named Wernicke's area, is located in the left hemisphere's posterior superior temporal gyrus (Brodmann area 22).
Signers with damage to Broca's area have problems producing signs, while those with damage to Wernicke's area, in the temporal lobe of the left hemisphere, have problems comprehending signed languages. Early on, it was noted that Broca's area lies near the part of the motor cortex controlling the face and mouth, and that Wernicke's area lies near the auditory cortex. These motor and auditory areas are important in spoken language production and processing, but their connection to signed languages had yet to be uncovered. For this reason, the left hemisphere was described as the verbal hemisphere, with the right hemisphere deemed responsible for spatial tasks. This classification was used to dismiss signed languages as not equal to spoken languages, until it was widely agreed that, given the similarities in cortical connectivity, the two are linguistically and cognitively equivalent.
In the 1980s, deaf patients with left hemisphere strokes were studied to explore the brain's connection to signed languages. The left perisylvian region was found to be functionally critical for language, both spoken and signed.[1][2] Its location near several key auditory processing regions had led to the belief that language processing required auditory input, a belief used to discredit signed languages as "real languages."[2] This research opened the door to linguistic analysis and further study of signed languages. Signed languages, like spoken languages, are highly structured linguistic systems with their own phonological, morphological, and syntactic characteristics. Despite some differences between spoken and signed languages, the associated brain areas share much in common.[3]