The Mechanics of Hearing
When sound waves enter the ear, they strike the eardrum, setting in motion three tiny bones: the malleus, incus, and stapes, collectively known as the ossicles. Their vibrations carry through to the cochlea, a fluid-filled organ lined with microscopic hair cells.
Each group of hair cells responds to a specific frequency, sending signals along the auditory nerve. When these nerve impulses reach the brain, the real work begins. Like a biological supercomputer that's always active, the brain analyzes and interprets the sounds around us in real time, so fast that we hardly even realize it's happening.
The Auditory Center: From Noise to Meaning
When a sound reaches the brain through the auditory nerve, the first thing the brain needs to do is strip that sound down to its component parts, such as pitch and volume. Once it's done so, it quickly checks those components against the patterns in its stored memory. If no exact pattern exists, it categorizes the sound based on the closest match available.
Finally, once it has identified and categorized the sound, it quickly determines whether it demands our attention.
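For readers who think in code, the sequence described above (decompose the sound, check it against stored patterns, fall back to the nearest match, then prioritize) resembles a simple classification pipeline. The sketch below is a loose analogy only; every sound label and priority in it is invented for illustration, not neurological data.

```python
from difflib import SequenceMatcher

# A toy "stored memory" of known sound patterns and their priorities.
# All labels and priorities here are invented for this example.
KNOWN_SOUNDS = {
    "fire alarm": "demands focus",
    "phone ringing": "worth noticing",
    "air conditioner hum": "ignore",
    "distant traffic": "ignore",
}

def closest_match(label: str) -> str:
    """If no stored pattern matches exactly, fall back to the most similar one."""
    return max(KNOWN_SOUNDS, key=lambda known: SequenceMatcher(None, label, known).ratio())

def process_sound(label: str) -> tuple[str, str]:
    """Identify and categorize a sound, then decide whether it demands focus."""
    category = label if label in KNOWN_SOUNDS else closest_match(label)
    return category, KNOWN_SOUNDS[category]

print(process_sound("fire alarm"))          # an exact match in "memory"
print(process_sound("fire alarm beeping"))  # no exact match: nearest pattern wins
```

The fallback step mirrors what the article describes: an unfamiliar sound isn't discarded, it's filed under the closest category the brain already knows.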
Let's say, for example, that you're talking on the phone in a crowded mall. The left hemisphere of your brain will filter out the background noise so that you can hear the person on the other end of the line. In some cases, such as with auditory processing disorder and ADHD, this automatic filtering doesn't happen properly: every sound is given equal focus, making it impossible to distinguish one from another.
Note also that although auditory processing happens subconsciously, it's possible to influence it with our conscious mind as well. If there's a particular sound we want to focus on, we can temporarily override the automation. This could be anything from hearing the sound of a child crying over an air conditioner to focusing on a single conversation on the subway.
Always-On: The Auditory Center and Sleep
The auditory center never 'turns off.' Even when we sleep, it constantly scans our environment for sound, filtering and processing as it usually does. The difference is that while we sleep, most sounds are categorized as unimportant and blocked out.
The exception is when it picks up on something unusual, something that may signify danger to ourselves or our loved ones. This could be anything from the sound of a car crash to a fire alarm to a child's crying.
Safeguarding Against Sensory Overload
As we've already mentioned, the auditory cortex acts as a natural firewall of sorts, blocking out irrelevant stimuli so that we can focus on what's important. Without this natural filter, we would quickly become overwhelmed by noise. It's why sensory overload is so common in individuals with disorders such as ADHD — because they cannot filter sounds as effectively.
Although the brain is critical to analyzing and understanding sound, it can only do this when connected to well-functioning ears. If the auditory information that reaches its processing center is in some way incomplete or distorted, it's unable to work as effectively. It's sort of like being on a Zoom call with a damaged headset.
Tinnitus is a good example. Described by hearing professionals as nerve cells in the brain 'talking to themselves,' it is the result of the brain making a futile attempt to amplify frequencies it no longer receives. Unfortunately, the brain tends to overcompensate, producing a ringing or buzzing sound even when no such sound is present.
The Brain and Language: A Biological Supercomputer in Action
Comprehension of speech is arguably one of the most complex processes tackled by the auditory center, and one that takes years to master.
Not only does the brain have to process and understand each individual word, but it also needs to account for pitch, vocal frequency, body language, accents, dialects, and speed. An identical sentence uttered by two different people might thus carry a completely different meaning. Yet somehow, the brain manages all this processing at lightning speed, analyzing up to 14 speech signals a second while simultaneously absorbing information from all of our other senses.
This is a level of power that even modern supercomputers have difficulty matching.
And we still don't understand everything about how the brain processes speech. We do not, for instance, fully understand how the brain differentiates our own voice from the voices of those around us. Even in noisy environments, our own voice somehow always seems clear to us.
Big Things Come in Small Packages
Believe it or not, the brain's auditory center is only slightly larger than a thumbnail. It has two halves, each embedded in one side of the cerebral cortex, and each responsible for processing different frequencies and functions. The left half, for instance, is mainly responsible for interpretation and is in constant communication with the right.
And again, the brain parses all this information together with the vast array of data gathered by our other senses — a fact which is equal parts fascinating and staggering.