This is a writing sample by “nycghostwriter,” AKA Barbara Finkelstein. It is “Can Technology Celebrate Deaf Culture,” written for Teachers College, Columbia University.

Published in TC Today – Volume 37, No. 1 | 12/7/2012

In “Practice,” deaf poet Raymond Luczak writes about the audio distortions he suffers when using a hearing aid to talk to his father on the phone. “I don’t understand, Dad,” his narrator says, and then smashes the hearing aid with the telephone receiver. “Wearing a hearing aid does not clarify sound for Luczak,” as Professor Russell Rosen, a lecturer in the Department of Health and Behavior Studies at Teachers College, has written. For Luczak, audio technology renders sound as “noisy and meaningless.”

By contrast, Michael Sagum, a graduate student in the Deaf and Hard of Hearing Program at TC, elected to receive a cochlear implant at fifteen and a half. Born profoundly deaf in both ears, Sagum relied on hearing aids and FM listening systems as a mainstreamed student in Seattle and found both tools wanting. He still gets emotional when he remembers the moment his implant was activated and he heard birds chirping. Hearing his own speech became a “source of pride” for Sagum, and yet he also chose to learn American Sign Language at the University of Washington, where he received a B.A. in 2010 in the comparative history of ideas.

These two examples of deaf interaction with a hearing world underscore the range of feelings the Deaf and Hard of Hearing (D/HH) have about the use of assistive technologies to augment hearing or even experience it for the first time. They also point to a deaf community that is coming to embrace a diverse approach to self-expression — while highlighting the need to research the effect that both technology and signing have on the cognitive life of deaf people.

Since the nineteenth century, the deaf and hearing worlds have been at loggerheads over the role that sign language and technology can play — and ought to play — in compensating for deafness or hearing loss. On the one hand, deaf individuals, and the organizations that speak for them, view deafness as a unique culture, with its own language and modes of social interaction.

Hence the skepticism, even disdain, among advocates of deaf culture for cochlear implants, or any assistive technology that purports to “correct” deafness. The once anti-technology National Association of the Deaf, for example, waited until 2000 to endorse the “bionic” cochlear implant as one acceptable communication choice among many.

On the other hand, most of the hearing world, as well as some schools for the deaf, resisted recognizing American Sign Language as a legitimate language until the mid-1990s. In fact, in the early 1970s, when Ruth Rabinovitch Herzel was a graduate student in the Department of Health and Behavior Studies, TC’s emphasis in deaf education was the acquisition of oral language. “The TC curriculum today shows an amazing acceptance of ASL as the predominant language of the deaf in the U.S. and English-speaking Canada,” Herzel says. “TC had to evolve its thinking just like the rest of society.”

Clearly, this clash of language philosophies is not unique to D/HH people. Think Flemish versus French, English versus Quebecois, Polish versus Lithuanian and Russian versus Ukrainian, to name a few comparable conflicts. Indeed, the struggle between technology advocates, who mostly see deafness as a correctable disability, and deaf practitioners of ASL, who view signing as one aspect of a rich deaf culture, is almost as passionate as these infamous spoken language wars. Ironically, the partisan nature of the “deaf wars” is increasingly mitigated by the interplay of technology, political activism and legislation — the very factors that provoked so much sound and fury about deafness. Together, these conditions have created a deaf community that, arguably, could be a model of tolerance and diversity for the adversaries involved in the spoken language conflicts around the world.

The Technology

Assistive technology for the deaf and hard of hearing got its start in 1876, when Alexander Graham Bell and others began applying various acoustic technologies to hearing loss. Bell had been interested since childhood in developing a device to “cure” his mother’s deafness. His work with the acoustic telegraph, an apparatus capable of transmitting voices and other sounds telegraphically, contributed to the invention of the hearing aid. Nearly seventy-five years passed before hearing aids became truly marketable, when Zenith began manufacturing the Miniature 75 vacuum tube model. The system consisted of a microphone case about as big as a smartphone and a single receiver earmold. It sold for seventy-five dollars.

As poet Raymond Luczak made clear, hearing aids do not necessarily enhance the quality of human speech. Despite their ability to amplify sound, they do not separate speech from ambient noise. Nor can they adequately amplify high pitches, particularly high female voices. D/HH people who wear hearing aids in theaters, for example, have trouble hearing stage dialogue without the help of an audio induction loop system. The IL system — a loop of wire cable installed around the room — works by converting an input signal into electromagnetic waves that radiate out from the cable. A “T-coil” in the hearing aid detects these waves and converts them into an alternating electrical current. The hearing aid’s amplifier boosts this current, and the hearing aid’s receiver converts it back into sound.

But for the IL system to work effectively, the D/HH person has to be wearing a hearing aid. The hearing aid has to contain a “T-coil.” And the T-coil has to be strong enough to pick up the electromagnetic wave. In short, it is a chain of contingencies that a single gap can undo.

Other sound field technologies, especially the personal FM system, have proved somewhat more effective, particularly in the classroom. This assistive system improves the speech-to-noise ratio, a value that expresses how much louder the teacher’s voice is than the background noise of scraping chairs, climate control units and students. It operates like a private radio station: the teacher speaks into a microphone, usually a wearable lavalier, and the student receives the signal via an FM receiver worn as a wire loop around the neck, inside headphones or button-type earmolds, or through direct audio input. FM systems may or may not rely on a hearing aid interface. Teachers receive training from an audiologist to make the most of the various settings and options.
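The arithmetic behind that ratio is simple: it is the difference, in decibels, between the level of the speech and the level of the noise at the listener’s ear. Using illustrative figures (not drawn from this article), a teacher’s voice arriving at 65 dB over classroom noise of 50 dB gives

$$\text{speech-to-noise ratio} = L_{\text{speech}} - L_{\text{noise}} = 65\,\text{dB} - 50\,\text{dB} = +15\,\text{dB}$$

Because the FM microphone sits only inches from the teacher’s mouth, it picks up speech at a high level relative to the room noise, which is how the system raises this difference beyond what a hearing aid microphone across the room can manage.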

Personal FM systems have helped deaf children in classrooms focus on the teacher’s voice, especially when combined with a cochlear implant. Like most technologies, however, they have drawbacks. The FM signal may be degraded by unpredictable interference, and the system may be too complex for young children to use without considerable adult intervention.

The assistive technology that has stirred the greatest hopes — and controversy — among severe-to-profoundly deaf people is the cochlear implant. Unlike hearing aids, which rely on inner-ear hair cells to convert vibrations into nerve signals that travel to the brain, cochlear implants bypass the damaged parts of the ear and send electrical signals directly to the auditory nerve, which carries them to the brain to be interpreted as sound.

Some implant recipients, such as TC’s Michael Sagum, praise the device for its chief advantage: It facilitates the ability to hear speech, learn spoken English and interact with hearing people. Other D/HH individuals, notably TC’s Professor Rosen, find the cochlear implant problematic for reasons of auditory quality: “What’s the point of using a CI if it does not do anything for me except [make me] aware of environmental noises?” he says via email. More to the point, he says, are the cultural and cognitive issues that distinguish D/HH people as a community. He observes that “hearing is not the only means of obtaining information and communicating with people. What’s wrong with deaf culture and [using] its language of ASL?”

The Cognitive Questions

As with most conversations about deafness, Rosen’s rhetorical question speaks to an implied debate over how best to educate D/HH children. If the debate were restricted solely to the merits and shortcomings of technology, a D/HH individual might simply study a table of pros and cons. But educators, linguists, hearing parents of deaf children and D/HH people themselves continue to ponder the role of hearing in cognitive processes such as thinking, reasoning, judging and learning. So relevant is the relationship between “audition,” or the sense of hearing, and cognition that Robert E. Kretschmer, associate professor of education and psychology at TC, raises it in “Development of language for individuals who are d/Deaf or hard of hearing,” a first-year course in the Deaf and Hard of Hearing Program.

“Does language map out what you already know or does language dictate thought?” Kretschmer asks his students, alluding to linguistic thinking by Jean Piaget and Benjamin Lee Whorf, respectively. He also cites the work of Lev Vygotsky, the early twentieth-century psychologist who discussed the role that social and cultural patterns of interaction play in the development of language.

“As educators and researchers, we are obligated to ask how children process the world if they do so without the sense of hearing,” Kretschmer says. He asks his students to consider a host of issues related to the cognition and education of D/HH children: Is signing an absolute equivalent of spoken language? Will a child whose deafness goes undiagnosed past the age of three experience lifelong learning and thinking deficits? Even with the use of assistive technologies, will a deaf child identify more with the hearing or the signing world? How long does it take to become a fluent signer? What is the impact on the child of the parents’ approach to language?

The range of courses that Kretschmer alone has taught, from “Language development and rehabilitation: The foundations” to “Audiological principles and the teaching of speech and listening skills to individuals who are d/Deaf or hard of hearing,” attests to the multidisciplinary complexity of educating deaf and hard of hearing children.

The Impact of Deaf Culture on “IDEA” Legislation

So much of the discussion about assistive technology and its impact — or lack thereof — has been shaped by the deaf and hard of hearing themselves. Capitalizing the letter “d,” for example, expresses a commitment to the political, cultural and social values that have grown out of the deaf experience. In short, D/HH is a community of individuals that views deafness as a unique mode of existence, not a disability. Its adherents are a “linguistic minority” who use ASL as their primary language. And D/HH is a culture that must be protected and facilitated by law.

So influential has D/HH culture been on mainstream hearing culture that it helped revise the premise in U.S. Public Law 94-142, the “Education for All Handicapped Children Act” (1975), that deafness is a disability akin to mental retardation or cerebral palsy. By 1990, Law 94-142 had evolved into the Individuals with Disabilities Education Act, ensuring that students who were deaf or hard of hearing could attend schools in their own neighborhoods rather than state residential schools for the deaf. With “IDEA,” D/HH students now had a right to receive a “free and appropriate public education.” By 1997, IDEA had been amended to include language acknowledging the need for “special considerations” for D/HH students. Public schools had to “consider the communications needs” of the child, as well as “opportunities for direct instruction in the child’s language and communication mode.”

So, what role does technology play in educating deaf children and adults?

Except for very young children, whose hearing or deaf parents will make decisions about assistive technologies for them, the role of technology is increasingly determined by the D/HH community.

“The D/HH community has been impacted by the Internet and its related technologies as much as the hearing world,” says Dale Atkins, Ph.D., a television personality who received an M.A. in special education and deafness from TC in 1971. “But people are starting to understand that the conversation about deafness isn’t really about technology. It’s about enabling people to be comfortable with where they are. It’s about celebrating their children for the precious people they are. This is a human advance, not a technological one.”
