Looking at an object doesn’t just activate visual regions of our brain – it also triggers touch-sensitive areas, a new study shows.
The connection between these two senses is so strong, in fact, that scientists can look at fMRI scans of a person’s somatosensory cortex – a crucial region for touch sensations – and predict which object that person is looking at.
When it comes to our senses, a lot of what we learned in school just isn’t true – rather than five distinct senses, our brains seem to have a variety of processing pathways for incoming sense-data, and those pathways often cross or blur together. For example, echolocation recruits visual parts of the brain, allowing some blind people to, in effect, see with sound. In short, where a sensory signal comes from doesn’t seem to be nearly as important as which area of the brain ends up processing it.
This study was led by USC’s Antonio Damasio, Hanna Damasio, and Kaspar Meyer, who are no strangers to the idea of sensory linkage – back in 2010, they were able to predict which objects volunteers were looking at by studying activity in the auditory cortex:
As subjects viewed sound-implying, but silent, visual stimuli, activity in auditory cortex differentiated among sounds related to various animals, musical instruments and objects. These results support the idea that early sensory cortex activity reflects perceptual experience, rather than sensory stimulation alone.
But this time, the journal Cerebral Cortex reports, the Damasios and their team set out to study the connections between visual stimuli and somatosensory activation.
They started by placing volunteers in an fMRI scanner, and showing them video clips of hands touching various objects. The subjects were asked to imagine the sensation of touching each object (soft fur, smooth metal, and so on) while the scanner mapped their brains’ somatosensory activity.
When the team fed the fMRI scans into a specially designed computer program, they found that it could reliably predict which of the video clips corresponded to each activation pattern:
Using multivariate pattern analysis of functional magnetic resonance imaging data, we were able to predict, based exclusively on the activity pattern in SI, which of several objects a subject saw being explored.
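To give a feel for what this kind of decoding involves, here's a toy sketch of pattern classification: a nearest-centroid classifier over made-up "voxel" activation vectors. This is a deliberately simplified stand-in for the multivariate pattern analysis the study used, and every number and object label below is synthetic, not from the paper:

```python
import random
import math

def centroid(patterns):
    # Average a list of equal-length activation patterns, element-wise.
    n = len(patterns)
    return [sum(vals) / n for vals in zip(*patterns)]

def classify(pattern, centroids):
    # Assign the pattern to the object whose mean pattern is nearest (Euclidean).
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(centroids, key=lambda obj: dist(pattern, centroids[obj]))

# Synthetic "voxel" patterns: each object evokes a characteristic mean
# activation plus noise. (Purely illustrative stand-in for fMRI data.)
random.seed(0)
prototypes = {"fur": [1.0, 0.2, 0.1], "metal": [0.1, 1.0, 0.3], "wood": [0.2, 0.3, 1.0]}

def sample(obj):
    # One noisy "trial" for the given object.
    return [v + random.gauss(0, 0.1) for v in prototypes[obj]]

# "Train" on a few noisy trials per object, then decode a held-out trial.
centroids = {obj: centroid([sample(obj) for _ in range(10)]) for obj in prototypes}
print(classify(sample("metal"), centroids))  # decodes to "metal"
```

The real analysis works on thousands of voxels and uses cross-validated classifiers, but the core idea is the same: if the decoder beats chance on held-out scans, the activation patterns must carry information about which object the subject saw.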
This strongly suggests that our memories aren’t sorted into discrete categories of sight, sound, touch and so on, but stored multimodally: all our sense-impressions of an object or idea are linked together, to some degree, in our long-term memories.
Whenever I read studies like this, I find myself amazed that my brain can ever assemble a coherent sense of the world around me. Well…somewhat coherent, at any rate.