This is your brain on… American Sign Language

Recently, I tried to teach myself American Sign Language. It didn’t go very well. I couldn’t grasp the new grammar structure or the vocabulary; it was all so foreign to me. I don’t know why I was so surprised. I guess I figured this physical way of communicating might be a little easier to pick up than a foreign language like Spanish or French.

Spoken languages and sign languages share more similarities than we may think.

What I did not know is that many of the brain regions used to understand spoken language are also used to understand sign language, and that the brain processes sign language much like any spoken language.

The temporal lobe is home to auditory language processing, with the auditory cortex lying near the Sylvian fissure and Wernicke’s area, which supports speech comprehension, nearby (Baars & Gage, 2010). Broca’s area, which supports speech production, sits in the frontal lobe just anterior to the portions of the motor cortex associated with the face, tongue, and larynx (Johns, 2014). These areas are used in all aspects of communication, including non-vocal communication like American Sign Language.


Neuroscientists and linguists alike have learned the properties of these language areas by studying what happens when they are damaged. Damage to Wernicke’s or Broca’s area and the surrounding regions, typically from a stroke or a severe head injury, can result in aphasia: an impaired ability to produce or comprehend language.

Numerous case studies of damage to Wernicke’s and Broca’s areas have allowed neuroscientists to understand the roles these two areas play in spoken language comprehension and production. For example, Gabrielle Giffords, then a US Representative from Arizona, suffered serious injuries after being shot in the head during a speech in Tucson in 2011. Giffords miraculously survived but faced severe setbacks in her speech and had to step away from her career to attend speech therapy. She could comprehend information perfectly but could hardly produce more than single-word utterances up to a year after her injury. This was classified as Broca’s aphasia, as she showed deficits in speech production but not in comprehension (Profiles of Aphasia, 2017).

What is even more interesting is that signers with damage to these same areas also show deficits in signed language.

Does sign language work the same way?

A 2013 case study published by Falchook and colleagues looked at the behavioral changes of a congenitally deaf 55-year-old woman suffering from Alzheimer’s disease (AD). The patient presented with classic symptoms of AD, such as impaired episodic memory, but she also presented with impaired fingerspelling, anomia (an inability to recall the names of everyday objects), and ideomotor apraxia (deficits in pantomiming, e.g., pretending to use an imaginary tool, and in communicative gestures, e.g., waving goodbye).

American Sign Language (ASL) consists of gestures, fingerspelling, facial expression, and body language.

The authors concluded, through various neurological tests, that the patient’s deficits in fingerspelling could be a product both of learning her native language (American Sign Language) later in life and of AD’s effects on language production. The patient in this study learned ASL between 8 and 9 years of age, much later than most children learn to speak, and research suggests that later ASL learners tend to have a shallower grasp of linguistic properties and must rely on posterior regions such as the left lingual gyrus and left middle occipital gyrus rather than the usual perisylvian regions (near the Sylvian fissure).

The authors hypothesize that late ASL learners with AD may show deficits in sign production because they depend on these posterior cortical regions for syntactic processing. Overall, Falchook and colleagues determined that this patient’s deficits in signing were similar to those of hearing patients who acquired language later than their peers.

This case study, and others involving signers with aphasia, allows us to better understand the neurological similarities between spoken and signed language.

What does this tell us?

The case studies of Gabrielle Giffords and the deaf AD patient shed light on the fact that speaking and signing share a lot of neurological real estate. This is astounding: the areas of the brain required for speech are not limited to the spoken word but extend to signing as well. This feat is yet another example of the brain’s plasticity and its ability to adjust and adapt.


Baars, B. J., & Gage, N. M. (2010). The brain. In Cognition, brain, and consciousness: Introduction to cognitive neuroscience (2nd ed., pp. 126-154). Burlington, MA: Elsevier.

Johns, P. (2014). Functional neuroanatomy. In Clinical neuroscience (pp. 27-47). Churchill Livingstone.

Profiles of Aphasia: Gabby Giffords. (2017, July 30). Retrieved from

Falchook, A. D., Mayberry, R. I., Poizner, H., Burtis, D. B., Doty, L., & Heilman, K. M. (2012). Sign language aphasia from a neurodegenerative disease. Neurocase, 19(5), 434–444. doi:10.1080/13554794.2012.690427


Busch, J. (n.d.). National Foreign Language Honor Society. Retrieved from

The Internet Stroke Center. (n.d.). Retrieved from

American Sign Language Guide. (2018, February 01). Retrieved from
Giphy. (n.d.). American Sign Language GIFs – Get the best GIF on GIPHY. Retrieved from
