T had both a speech therapy session and an audiology appointment today (with a nap in between!)
T has gotten good at producing an “ahhhh” sound – our speech therapist explained that this tends to be one of the first sounds babies produce since it’s “easier” compared to other vowel sounds – the mouth is open and relaxed (rather than requiring a coordinated mouth shape or tongue position). Now, we’re starting to work on getting T to produce other vowel sounds, like “oooo” and “eeee.”
Our speech therapist explained that these are “harder” than “ahhh,” because they require tension in the mouth. Making those sounds out loud now, I notice that my lips are more closed compared to “ahhh,” and my tongue is in a particular position. One of the things we’ve been doing at speech therapy for a few weeks now is pairing different sounds with visual motions. Our speech therapist explained that this is to make the sounds more salient by combining them with an attention-grabbing visual motion cue. For example, we’ve been playing with ribbons a lot – when we make “ahhh” sounds, we wave the ribbons around loosely, up and down. In contrast, when we make “ooo” and “eee” sounds, we often grab the ends of the ribbon (or let T hold one end) and pull on the other, to give a visual cue of tension while making the sound. Today, we also focused on sounds and words that have an “ooo” or “eee” sound, like saying “choo choo!” when playing with a train, or playing peekaboo.
T is going through some big gross motor leaps lately. In the past 2 weeks, he’s figured out crawling, and just learned to pull himself up to standing yesterday. Now, all he wants to do is move around and practice his mobility skills. Our speech therapist told us that when babies are working hard on their gross motor skills, speech production motor skills often take a backseat, since their brains are so focused on the new movement skills. That was really interesting to learn!
T sees his audiologist every 4 to 6 weeks or so – she will usually check and tweak his hearing aid settings, check for fluid in his ear, and, for the past few sessions, try to get some behavioral audiometric data.
Measuring an audiogram from a cooperating adult is pretty straightforward – they can tell you when they hear a sound. It’s much trickier with babies though! Until T was 6 months old or so, the audiologist estimated his audiogram with an Auditory Brainstem Response (ABR) measurement. Now that he’s older, we have started trying to get behavioral audiometric results. At each audiology appointment, we have continued to refine his audiograms; we’re only able to test him for small chunks of time (maybe 10-15 minutes), but we are slowly but surely getting a more accurate picture of his hearing loss.
The test that T’s audiologist has been doing with him is called Visually Reinforced Audiometry (VRA). She plays sounds at different frequencies and different loudnesses, and when he responds in a way that indicates that he heard the sound, she “rewards” him by showing him light-up dancing puppets (I find them pretty creepy, but T LOVES them). In this way, T is conditioned to indicate which sounds he hears without having to verbally respond – once he realized that when he looks in the direction the sound came from, he gets to see the puppets, he became much more motivated to respond whenever he heard a sound.
In order to separately measure his right and left ears (since he won’t tolerate headphones), the audiologist stuck the probe wire with the speaker into his ear mold (with the processor disconnected) – I thought this was such a clever trick! T is already used to wearing his ear molds since he wears his hearing aids, so he barely even noticed the probe wire (although he did try to chew on the wire once he noticed it).
T was in a great mood, and we were able to get data at both mid (1000 Hz) and high (4000 Hz) frequencies for both ears, and the results continue to confirm a mild bilateral hearing loss.
After measuring his responses with air conduction (through the speaker played into the ear canal), the audiologist wanted to measure his bone conduction responses. With air conduction, the sounds go through the full chain of processing – through the outer ear, the middle ear, and then through the inner ear. With bone conduction, a little oscillator vibrates against the mastoid bone (just behind the ear), which bypasses the outer and middle ears. The audiologist explained that if the air conduction and bone conduction results differ (for example, if air conduction results indicate a hearing loss but bone conduction results are normal), this can suggest a middle ear problem, or conductive hearing loss. For T, the bone conduction results agreed with the air conduction results, indicating a sensorineural hearing loss (that is, a problem with the hair cells in the cochlea). One really interesting thing the audiologist mentioned to me was that the bone conduction results indicate hearing levels for the better ear – she had placed the oscillator behind his left ear, but she said that the vibration is picked up by the “better” cochlea, regardless of where the oscillator is placed!