Remember the famous 30 million word gap in language exposure between the children of professional families and those on welfare, and all its attendant problems in reading and attention? How could educators make up for the gap for a child with no exposure to language at all in the first year or two of life?
That's the potential gap facing the more than 1 in 500 children born deaf or hard of hearing in the United States each year if their parents do not sign to them, according to Gallaudet University's Visual Language and Visual Learning Center, or VL2. The center, one of the National Science Foundation's Science of Learning Centers, launched its first "knowledge festival" on Monday night to bring together researchers working on ways to understand and build language development across multiple languages, including English, Chinese, and American Sign Language.
New neurological and longitudinal behavioral studies presented at the forum suggest that visual and spoken languages are biologically equal in the brain, and that early exposure to one can support the other, in students who are deaf and students with normal hearing alike.
For example, Clifton Langdon, assistant professor and assistant director of the Brain and Language Laboratory for Neuroimaging, said researchers there have found that infants' brains are most sensitive to sign language syllables produced at about 1.5 hertz, roughly on par with the rate of about 1.5 syllables per second at which infants parse spoken language. "That tells us the brain is not looking for particular signs or symbols of speech, but processing underlying patterns of language," Langdon said. Early exposure to this "rhythmic patterning" allows infants to begin to process language, he said. [CORRECTION: Langdon clarified the process of infants processing sound and visual syllables.]
Need for Visual, Spoken Language Exposure
Hearing loss is the most common congenital sensory problem in the country, and the American Academy of Pediatrics estimates that 98 percent of newborns receive a hearing test. But only about 1 in 4 children with hearing loss are properly diagnosed and given services before they are 6 months old.
Children with normal hearing, even those in the most disadvantaged homes, still were exposed to about 15 million words by age 3. But the overwhelming majority of children with hearing loss are born to hearing parents who do not sign, and most do not begin to receive interventions until well past age 1, said Laura-Ann Petitto, scientific director and co-principal investigator of VL2. Even children who receive a cochlear implant generally are 18 months or older before they begin to regularly hear and develop their first spoken words, because the device can take several months to fine-tune.
"Brain systems are on different maturation timetables, and one of the most unforgiving is the system for language development," Petitto said. "It peaks at age 3 ... and if the child doesn't experience exposure to the fundamental patterns of language in early life, 6 to 10 months old, you are putting the child at severe risk" of major language, reading, and even math delays later on.
That may explain why deaf children of parents who sign from birth perform significantly better in reading and attention outcomes than deaf children whose parents did not sign.
In a three-year longitudinal study of 3- to 5-year-old children who were deaf or hard of hearing across 20 states, VL2 co-principal investigator Thomas Allen found that signing skill and reading skill develop in tandem. "Those who master one typically master the other," Allen said.
Further, young signers hit language milestones in the same time periods as hearing children did in spoken languages: "fingerbabbling" at ages 3 to 6 months, for example; using their first simple sentences between ages 1 and 2; and becoming proficient in signing by around age 3. Children with hearing loss who grew up in homes with significant signing and fingerspelling on average met those developmental benchmarks faster than those who grew up with no early signing. Their letter-word recognition abilities grew faster, too.
A few states have started to take visual languages into account when measuring early literacy: one state will begin to require districts to include visual language ability when assessing their deaf students' literacy, and a similar bill in Kansas is awaiting the governor's signature.
Designing Literacy Supports for Vision and Speech
The Gallaudet researchers are developing new ways to use visual and spoken languages to support each other. For example, VL2's Motion Light Lab is developing a series of storybook apps (it has released five so far) that allow children to read along with written text and a visual signer, and click on individual words to learn signed, written, and spoken vocabulary.
Melissa Malzkuhn, director of the Motion Light Lab, said teachers and parents can also use it to write and sign their own stories for children.
The group is also working to create nursery rhymes in sign language via computer-animated avatars. "Those amazing rhythmic patterns are sometimes lost in the translation to ASL from English," Malzkuhn said. "We're trying to recreate those patterns in ASL."
Photos: Top: Petitto, left, and Langdon cheer in American Sign Language during a presentation Monday on research breakthroughs on language development in children who are deaf and hard of hearing. Source: Sarah D. Sparks
Bottom: Young students who are deaf or hard of hearing can read along with "The Little Airplane That Could," a story told in written type, spoken English, and American Sign Language, via an app developed by Gallaudet University researchers.
Video: An excerpt from one of the lab鈥檚 motion-capture projects allows students to follow along with a nursery rhyme performed in ASL. Source: Gallaudet University