Post by laurasnowbird on Oct 22, 2004 9:10:57 GMT -5
Just read this and found it verrrry interesting:
Left and right ears not created equal as newborns process sound
Challenging decades of scientific belief that the decoding of sound originates from a preferred side of the brain, UCLA and University of Arizona scientists have demonstrated that right-left differences for the
auditory processing of sound start at the ear.
Reported in the Sept. 10 edition of Science, the new research could hold profound implications for rehabilitation of persons with hearing loss in one or both ears, and help doctors enhance speech and language development in hearing-impaired newborns. "From birth, the ear is structured to distinguish between various types of sounds and to send them to the optimal side in the brain for processing," explained Yvonne Sininger, Ph.D., visiting professor of head and neck surgery at the David Geffen School of Medicine at UCLA.
"Yet no one has looked closely at the role played by the ear in processing auditory signals."
Scientists have long understood that the auditory regions of the two halves of the brain sort out sound differently. The left side dominates in deciphering speech and other rapidly changing signals, while the right side leads in processing tones and music. Because of how the brain's neural network is organized, the left half of the brain controls the right side of the body, and the left ear is more directly connected to the right side of the brain.
Prior research had assumed that a mechanism arising from cellular properties unique to each brain hemisphere explained why the two sides of the brain
process sound differently. But Sininger's findings suggest that the difference is inherent in the ear itself. "We always assumed that our left and right ears worked exactly the same way," she said. "As a result, we tended to think it didn't matter which ear was impaired in a person. Now we see that it may have profound implications for the individual's speech and
language development."
Working with co-author Barbara Cone-Wesson, Ph.D., associate professor of speech and hearing sciences at the University of Arizona, Sininger studied tiny amplifiers in the outer hair cells of the inner ear. "When we hear a sound, tiny cells in our ear expand and contract to amplify the vibrations," explained Sininger. "The inner hair cells convert the vibrations to neural signals and send them to the brain, which decodes the input."
"These amplified vibrations also leak back out to the ear in a phenomenon called otoacoustic emission (OAE)," added Sininger. "We measured the OAE by inserting a microphone in the ear canal."
In a six-year study, the UCLA/UA team evaluated more than 3,000 newborns for hearing ability before they left the hospital. Sininger and Cone-Wesson placed a tiny probe device in the baby's ear to test its hearing. The probe emitted a sound and measured the ear's OAE.
The researchers measured the babies' OAEs with two types of sound: first rapid clicks, then sustained tones. They were surprised to find that
the left ear provides extra amplification for tones like music, while the right ear provides extra amplification for rapid sounds timed like speech.
"We were intrigued to discover that the clicks triggered more amplification in the baby's right ear, while the tones induced more amplification in the baby's left ear," said Sininger. "This parallels how the brain processes
speech and music, except the sides are reversed due to the brain's cross connections."
"Our findings demonstrate that auditory processing starts in the ear before it is ever seen in the brain," said Cone-Wesson. "Even at birth, the ear is structured to distinguish between different types of sound and to send it to the right place in the brain."
Previous research supports the team's new findings. For example, earlier research shows that children with impairment in the right ear encounter more trouble learning in school than children with hearing loss in the left ear.
"If a person is completely deaf, our findings may offer guidelines to surgeons for placing a cochlear implant in the individual's left or right ear and influence how cochlear implants or hearing aids are programmed to
process sound," explained Cone-Wesson. "Sound-processing programs for hearing devices could be individualized for each ear to provide the best
conditions for hearing speech or music."
"Our next step is to explore parallel processing in brain and ear simultaneously," said Sininger. "Do the ear and brain work together or independently in dealing with stimuli? How does one-sided hearing loss affect this process? And finally, how does one-sided hearing loss in the right ear compare with loss in the left?"
Me again. Food for thought, huh? I would say that this is consistent with what we found with Ethan. His hearing tested totally normal in one ear, but not the other. His ENT said that he only needed good hearing in one ear to develop speech. I felt the need to make certain he could hear as well as possible out of both ears, and wanted to protect the "good" ear as much as possible, so I requested that the ENT put tubes in Ethan's ears. He had never had an ear infection and was 21 months old when the tubes were inserted.
His language started to take right off after that. He was vocalizing more, and I counted the other day: he has more than 50 words and two-word phrases that he uses consistently, and is adding more each day. And seriously, you can ask Debu about this one, we were VERY concerned about his speech, because at 21 months, he didn't say OR sign ANYTHING.
Anyway, just wanted to share, because this study seems consistent with what we experienced. Here is a link to a similar article:
yalenewhavenhealth.org/healthnews/healthday/040909HD521137.htm