Have you ever wondered what it subjectively feels like to read braille? What about echolocating?
Scientists are learning that both these senses have a lot more in common with sight – both physiologically and in terms of subjective perception – than you might expect.
Let’s start with braille. PET studies since at least the late ’90s have revealed some intriguing facts about the brains of people who have ocular blindness – blindness caused by damage or a defect somewhere in the visual pathway – but who still have a functioning visual cortex. As these patients read braille with their hands, their striate cortex (also known as the V1 layer of the visual cortex) shows activation patterns similar to those observed in the V1 of sighted subjects as they read print.
If this means what it appears to mean, people reading braille are – in a way – actually seeing the letters their fingertips touch:
One subject reported that when doing crosswords she ‘sees’ embossed Braille dots on top of printed characters in her mind’s eye. She also reported that she uses the information of the printed letters to solve the crossword and not the Braille characters in which the crossword is presented to her.
What’s even more exciting is that patients who lose their sight late in life can learn this visualizing ability with practice. This means even a fully developed adult brain can rewire some of its major sensory pathways!
Now, the idea of visualizing what we touch isn’t too hard to imagine. But what about using sound to paint a vivid mental picture? As it turns out, that’s exactly what human echolocators do.
Like bats and dolphins, humans can use the echoes of clicks and squeaks to build up an image of their surroundings. As if it weren’t incredible enough that we humans (with our – let’s face it – pretty scrawny ears) can learn to do this at all, human echolocators can also determine the size and shape of objects with a stunningly high degree of accuracy, even when listening to sound recordings of echoes in a room:
[One subject], for example, could distinguish a 3° difference in the position of a pole in the sealed room, as well as from the pre-recorded sounds. [Another] was slightly less accurate, distinguishing 9° differences in position of the pole while in the room, and 22° differences from the recordings.
As you’ve probably guessed by now, fMRI scans of these subjects’ brains show that echolocation signals are processed in the visual cortex, not the auditory cortex. And experts confirm that anyone at all can learn the skill.
But simple navigation is just the beginning – one blind teenager has beaten video games using echolocation, and “blind adventurer” Erik Weihenmayer has (get ready for this) climbed Mt. Everest using an echolocation prosthesis that sends signals through his tongue.
It’s mind-blowing stories like these that demonstrate just how adaptable our sensory pathways are. Many of us tend to think of our sensory experience as being chopped up into categories – sight, sound, touch, and so on – but those boundaries are often much more fluid than we realize.