Signing in Time: Closing the Learning Gap for Deaf Children

Psychology Professor Jennie Pyers
Author: Katie Noah Gibson

Psychology Professor Jennie Pyers’ mother and father were born deaf but did not learn sign language growing up. “When my parents were growing up, the expectation was that you were going to learn how to read lips, you were going to learn to rely on your hearing aids, as much as possible,” Pyers said. That approach can work for one-on-one conversations, but participating in group discussions proved difficult and exhausting.

Her parents attended Gallaudet University, a school for the deaf in Washington, D.C., where they learned sign language and, for the first time, could communicate more freely. “Not having those accessible conversations and then moving into a place where you could have these accessible conversations with your peers was a revolutionary moment for my parents,” Pyers said.

While an undergraduate at Smith, she studied with a psychology professor who worked with deaf children who were not learning sign language. The students attended a school that focused on teaching them English and discouraged sign language for fear it would interfere with spoken language acquisition. That experience, combined with her parents’ experience, spurred Pyers to change direction after Smith, where she had majored in art history, and pursue research in language and cognition.

Because language and cognition are closely connected, the language deprivation experienced by the majority of deaf children can have long-term effects on their cognitive and social development. Deaf children born to parents who are not fluent in a sign language sometimes experience delays in “theory of mind,” the ability to understand what other people think and know. They can also experience delays in number reasoning and other cognitive processes.

Pyers, along with Naomi Caselli and Amy Lieberman of Boston University and other researchers, is studying how deaf children acquire American Sign Language (ASL) vocabulary and developing reliable measures of vocabulary development in children who learn to sign. Pyers, herself a native signer of ASL, said their project, funded by the National Institute on Deafness and Other Communication Disorders, addresses these central questions:

  1. Can deaf children whose parents are not fluent signers develop a vocabulary in ASL at the same level as deaf children who are learning ASL from their deaf, signing parents? (A question this team recently answered in a newly published study.)

  2. Is there a critical period during which deaf infants and toddlers must be exposed to a sign language in order to develop a vocabulary similar to deaf children who are natively acquiring a sign language?

  3. How does ASL vocabulary size relate to English vocabulary size and to later-developing ASL skills (such as use of grammar)?

“Several states have passed legislation to ensure that deaf children have the language skills to be ‘kindergarten ready,’” Pyers said, noting that measuring those skills can be a challenge for parents and educators, especially those who are not fluent signers themselves. As part of their research on early vocabulary development in deaf children born to hearing parents who are exposing them to ASL, she and her colleagues have developed a vocabulary assessment that she said is the first one that parents of a deaf child can complete online without the guidance of a trained clinician. “Parent reports of children’s vocabulary development are one of the most common ways that researchers and clinicians can get a sense of a child’s vocabulary size,” Pyers explained.

In developing the ASL vocabulary assessment, Pyers and her colleagues updated an existing pen-and-paper checklist, developed in the 1990s, that provided rough English translations of ASL signs. The previous assessment did not give researchers the ability to “build a data set of norms to show what vocabulary development looked like for deaf children natively acquiring ASL,” Pyers said. With the new online assessment, “we now have a data set, and with every participant, our norms are expanding and becoming more robust.” The new assessment is administered bilingually in ASL and English, using videos of ASL signs instead of English translations to help parents provide more accurate responses.

Pyers and her colleagues collected data by administering the assessment to 120 deaf children with deaf parents, to provide a basis for comparing vocabulary development in different household settings. “Ongoing data collection is expanding the sample,” Pyers said, “so we can have more accurate norms against which to compare children’s language development.” Parents can access the assessment through an online portal and complete it multiple times as their children develop. They can also share their child’s assessment scores with clinicians and researchers as needed.

The project dovetails nicely with Wellesley’s cognitive and linguistic sciences and data science programs, as well as the Quantitative Analysis Institute. In Pyers’ lab, the Laboratory for Language and Cognitive Development (LLCD), students analyze the data from projects like this one. “The data sets that we get from administering the vocabulary assessment online are massive and frankly unwieldy,” Pyers said. “I’ve relied on [students] to help wrangle the data and explore new questions with it.”