
Musical Matchups

Trained musicians’ brains are specially tuned to match what they see with what they hear when their own instruments are involved, a new study finds.

Let me stop ya right there - "Stairway to Heaven" is off-limits.

As our brains organize information from our senses into a coherent representation of the world around us, they’re constantly hard at work associating data from one sense – say, sight – with data from another – say, hearing.

A lot of the time, this process is pretty straightforward – for instance, if we see a man talking and hear a nearby male voice, it’s typically safe for our brains to assume the voice “goes with” the man’s lip movements. But it’s also not too hard for others to trick this association process – as anyone who’s watched a good ventriloquism act knows.
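For the computationally inclined: one popular way to formalize this “does the voice go with the lips?” judgment is as Bayesian causal inference – the brain weighs how likely the two signals are to share a single source. Here’s a toy Python sketch of that idea; all of the numbers, and the flat “unrelated sources” assumption, are my own inventions for illustration, not anything from the research below.

```python
import numpy as np

def p_same_source(lag_ms, sigma_ms=80.0, window_ms=500.0, prior=0.5):
    """Toy model of audiovisual binding as Bayesian causal inference.

    lag_ms    : measured audio-visual lag (audio onset minus video onset).
    sigma_ms  : expected lag spread if both signals share one source
                (value is made up for illustration).
    window_ms : if the signals are unrelated, any lag in
                [-window_ms, +window_ms] is assumed equally likely.
    prior     : prior belief that the signals share a source.

    Returns the posterior probability of a common source; a high
    value means the brain should "bind" the voice to the lips.
    """
    # Likelihood of the observed lag under a common source (Gaussian).
    like_same = np.exp(-lag_ms**2 / (2 * sigma_ms**2)) / (sigma_ms * np.sqrt(2 * np.pi))
    # Likelihood under independent sources (flat over the window).
    like_diff = 1.0 / (2 * window_ms)
    # Bayes' rule.
    return like_same * prior / (like_same * prior + like_diff * (1 - prior))

if __name__ == "__main__":
    for lag in (0, 50, 150, 400):  # milliseconds
        print(f"lag {lag:4d} ms -> P(same source) = {p_same_source(lag):.2f}")
```

At small lags the model happily binds voice to lips; stretch the lag far enough and it gives up – which is why a good ventriloquist keeps the timing tight even while throwing the location.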

Now, as the journal Proceedings of the National Academy of Sciences (PNAS) reports, a team led by HweeLing Lee and Uta Noppeney at the Max Planck Institute for Biological Cybernetics has discovered a way in which musicians’ brains are specially tuned to correlate information from different senses when their favorite instruments are involved.

Neuroscientists have known for years that the motor cortex in the brains of well-trained guitar and piano players devotes much more processing power to fingertip touch and finger movement than the same area of a non-musician’s brain does. But what this new study tells us is that the brains of pianists are also much more finely tuned to detect whether a finger stroke is precisely synchronous with a sound produced by the touch of a piano key.

To figure this out, the team assembled 18 pianists – amateurs who practice on a regular basis – and compared their ability to tell synchronous piano tones and keystrokes from slightly asynchronous ones while they lay in an fMRI scanner (presumably by showing them this video). The researchers also tested the pianists’ ability to tell when lip movements were precisely synchronized with spoken sentences.

The team then compared the musicians’ test results against the results of equivalent tests taken by 19 non-musicians. What they found was pretty striking:

Behaviorally, musicians exhibited a narrower temporal integration window than non-musicians for music but not for speech. At the neural level, musicians showed increased audiovisual asynchrony responses and effective connectivity selectively for music in a superior temporal sulcus-premotor-cerebellar circuitry.

In short, pianists are much more sensitive than non-musicians to a slight asynchrony between a keystroke and a piano tone – but this heightened sensitivity doesn’t carry over to speech and lip movements. It shows up only when piano keystrokes are involved.
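To make “narrower temporal integration window” a little more concrete, here’s a rough Python simulation of how such a window might be estimated from simple synchrony judgments. The Gaussian response model, trial counts, and window widths are all invented for illustration – this is not the paper’s actual analysis:

```python
import numpy as np

rng = np.random.default_rng(0)
lags_ms = np.arange(-300, 301, 50)   # keystroke-to-tone lags tested (ms)
TRIALS = 60                          # trials per lag (made-up count)

def simulate_judgments(window_sd_ms):
    """Simulate one observer's "synchronous?" yes/no judgments.

    The chance of reporting "synchronous" is modeled as a Gaussian
    function of lag, whose SD plays the role of the observer's
    temporal integration window. Purely illustrative.
    """
    p_sync = np.exp(-lags_ms**2 / (2 * window_sd_ms**2))
    return rng.binomial(TRIALS, p_sync) / TRIALS

def fit_window_sd(judged_sync):
    """Recover the window SD by fitting ln(p) = -lag^2 / (2 * sd^2)."""
    mask = judged_sync > 0.05        # drop lags where nothing registered
    slope = np.polyfit(lags_ms[mask]**2, np.log(judged_sync[mask]), 1)[0]
    return np.sqrt(-1.0 / (2 * slope))

musician = simulate_judgments(window_sd_ms=80)    # narrower window
control  = simulate_judgments(window_sd_ms=140)   # wider window
print(f"musician window ~ {fit_window_sd(musician):.0f} ms")
print(f"control  window ~ {fit_window_sd(control):.0f} ms")
```

The narrower the fitted width, the smaller the lag an observer needs before reliably noticing that something is off – which is exactly the behavioral difference the study reports between pianists and non-musicians.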

Another important finding is that the researchers could predict how sensitive each musician would be to asynchrony from the asynchrony responses the fMRI scanner detected in their motor cortex:

Our results suggest that piano practicing fine tunes an internal forward model mapping from action plans of piano playing onto visible finger movements and sounds.

This means there’s a direct link between coordination within the brain’s own circuitry and ear-eye coordination out in the world. I don’t know about you, but I think that’s pretty incredible.
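To picture what an “internal forward model” might be doing here, the sketch below treats it as a predictor of when the seen keystroke and the heard tone should arrive, flagging asynchrony whenever the prediction error exceeds some tolerance. The class, timing values, and tolerances are all my own illustrative guesses, not the model from the paper:

```python
from dataclasses import dataclass

@dataclass
class ForwardModel:
    """Toy internal forward model for a single piano keystroke.

    Given a motor plan (when the finger will strike), it predicts
    when the visible movement and the tone should arrive. A smaller
    tolerance_ms means a more finely tuned model, as practice would
    presumably produce. All values are illustrative.
    """
    motor_to_visual_ms: float = 0.0    # predicted lag, plan -> seen strike
    motor_to_sound_ms: float = 10.0    # predicted lag, plan -> heard tone
    tolerance_ms: float = 40.0         # prediction error still "in sync"

    def flags_asynchrony(self, plan_t, seen_t, heard_t):
        # Compare observed timings against the model's predictions.
        visual_err = abs(seen_t - (plan_t + self.motor_to_visual_ms))
        sound_err = abs(heard_t - (plan_t + self.motor_to_sound_ms))
        return max(visual_err, sound_err) > self.tolerance_ms

pianist = ForwardModel(tolerance_ms=40.0)    # fine-tuned by practice
novice  = ForwardModel(tolerance_ms=120.0)   # looser predictions

# A tone arriving 80 ms late: the pianist notices, the novice doesn't.
print(pianist.flags_asynchrony(plan_t=0, seen_t=0, heard_t=90))  # True
print(novice.flags_asynchrony(plan_t=0, seen_t=0, heard_t=90))   # False
```

On this picture, practice doesn’t sharpen the senses themselves so much as it sharpens the predictions – which would explain why the pianists’ advantage is specific to piano keystrokes.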

The researchers hope that as they study similar data from musicians who work with other instruments, they’ll come to better understand how our brains learn to associate stimuli from one sense with information from another – and maybe even how they learn when and when not to “sync up” these stimuli in our subjective experience of reality.

It’s too bad we can’t hook up the brain of, say, Mozart or Hendrix to an fMRI scanner – who knows what amazing discoveries we might make. But even so, I’m sure you can think of some living musical geniuses whose brains you’d like to see in action.

Sensory Fluidity

What do you think it’d be like to see a smell? Or to feel an electromagnetic field?

Brueghel liked to get all allegorical with the senses.

In the last few posts, I’ve talked a lot about the differences between the synthesized concept of a self, and the raw subjective experience of being oneself in the present moment. I’ve also explored some of the boundaries and dimensions of selfhood. But this all leaves a question that’s nagged at my mind for years: what exactly is subjective experience?

It’s simple enough to say that my perceptions of reality (both external and internal) are represented by coordinated patterns of activity in my nervous system – but that doesn’t really answer the question: what do I really mean when I say I “experience” those patterns? What are the precise neurophysiological correlates of “having an experience?”

In a recent blog post, Bradley Voytek expresses this in a way I can really relate to:

When we “hear” things, the sound pressure waveform hits the tympanic membrane (eardrum) and ultimately causes the basilar membrane in your cochlea to vibrate. The basilar membrane is stiffer at one end (the basal end) and less stiff at the other end (the apical end). Okay, great, so we know a ton of the basic biology and cellular mechanisms of the signal transduction mechanisms of our sensory apparatus. But damn if I’m still not amazed by the actual experience of sensation.

I’ve got to admit, the nature of that actual experience has me a bit stumped at the moment. But thinking about it has raised some intriguing considerations that I think are worth mulling over. Here’s the thing: the senses we use to experience the world aren’t as hard-wired or discrete as we often assume.

One common example is synesthesia – a condition in which stimulation of one sensory or cognitive pathway leads to activation of another. In one patient’s case, sounds triggered a somatosensory response:

About a year and a half after her stroke, a 36-year-old professor started to feel sounds. A radio announcer’s voice made her tingle. Background noise in a plane felt physically uncomfortable. Sophisticated imaging of the woman’s brain revealed that new links had grown between its auditory part, which processes sound, and the somatosensory region, which handles touch.

This has led some scientists to research the idea that our sense of hearing developed as an enhancement of our sense of touch – a hypothesis supported by lab tests in which people are better able to detect a quiet sound when their skin’s touch receptors are stimulated, and vice versa.

Then there are cases of blind patients who use echolocation to navigate. Scientists have found that most sounds these patients hear are processed in their auditory cortex – but sounds used for echolocation are processed in the visual cortex, allowing these patients to literally see an image of their environment constructed from sound. One guy has even learned to beat video games from sound cues alone.

Quinn Norton's finger, displaying Magneto-style powers.

And some people have gone so far as to forcibly hack their own senses. Quinn Norton, for example, inserted a magnet into her hand, and discovered it enabled her to feel electromagnetic fields. According to Norton and other body hackers, projects like these are just the beginning – we may be the first generation to choose what senses we have, and how we want to experience them.

So, while it might be tricky to pin down just what subjective experience is, it’s clear that the senses from which it’s constructed are incredibly flexible and fluid. And as Voytek is quick to point out, the amount of our sensory experience that we consciously pay attention to is just a fraction of what we actually receive:

It turns out that humans can, in fact, detect as few as 2 photons entering the retina. Similarly, it appears that the limits to our threshold of hearing may actually be Brownian motion. That means that we can almost hear the random movements of atoms. We can also smell as few as 30 molecules of certain substances.

But our brains – at least, our conscious attention – can’t parse all this data at every moment of the day. The visual pathway is a good example: our pupils can contract or expand in response to overall light levels, our retinas have a built-in response threshold that prevents them from passing along every bit of input they receive, and the visual cortex devotes most of its resources to analyzing motion and edges, while filling in the more monotonous areas (hence why we usually don’t notice our blind spots). In other words, the entire nervous system is adept at filtering out all but the most unusual data.
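To see how little it takes to implement that kind of filtering, here’s a toy one-dimensional “retina” in Python: a crude stand-in for lateral inhibition (just a discrete derivative) plus a response threshold, which together pass the edge downstream and discard everything else. The numbers are arbitrary:

```python
import numpy as np

# A toy "retina": a 1-D luminance profile with a dim flat region,
# a sharp edge, and a bright flat region.
luminance = np.concatenate([np.full(20, 0.2), np.full(20, 0.9)])

# Center-surround filtering (here just a discrete derivative, a crude
# stand-in for retinal lateral inhibition): uniform areas produce no
# output, so only the edge generates a response.
response = np.abs(np.diff(luminance))

# A response threshold filters out tiny fluctuations too.
THRESHOLD = 0.1                      # made-up value
signal_passed = response > THRESHOLD

print(f"{signal_passed.sum()} of {signal_passed.size} locations fire")
# -> 1 of 39 locations fire: the edge, and nothing else.
```

Out of 39 candidate locations, exactly one – the edge – makes it past the filter; everything monotonous gets thrown away before it ever reaches conscious attention.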

One funny example of this is the classic “selective attention test” video. Take a minute to watch it, and see if you can catch the unusual element. As Voytek says, “How can we see two photons, but miss that?!” Because attention is a limited resource.

Still, as the examples above demonstrate, the human connectome is stunningly pliable, and can learn to rewire its sensory perceptions – or even to perceive entirely new senses – with a bit of practice.

I think it could be fun to write some tutorials on attention hacking, and explore some ways in which we could all push our own senses to new heights of perceptive sensitivity and fluidity. I’d say this holds some intriguing implications for erotic neurophysiology – but I think that’s only the beginning. As Voytek says, we may all be “inattentive superheroes” just waiting to be born.
