This means that if we’re verbally primed with a scent-related cue – say, someone asking us whether a glass of milk has gone rotten – we’re more likely to detect a tinge of rottenness in the milk’s smell, whether it’s actually gone sour or not.
The relationship between priming and perception is a hot topic of debate in today’s neuroscience and psychology labs (I’ve written here about the influence of verbal priming on visual working memory, and here about its influence on mathematical intuition). Though psychologists have understood for decades that priming often affects what we think and experience, the neural underpinnings of the phenomenon have remained poorly understood.
But now, as the journal Neuron reports, a team led by Christina Zelano at Northwestern University’s Feinberg School of Medicine has discovered some exciting new specifics about the links between what we expect and what we smell.
The researchers monitored the brain activity of volunteers in an fMRI scanner as the subjects tried to identify a specific smell – watermelon or Play-Doh – within a series of odors. They found that, even before the volunteers actually smelled an odor, their brains showed activity patterns that closely matched those associated with the scent itself:
Ensemble activity patterns in anterior piriform cortex (APC) and orbitofrontal cortex (OFC) reflected the attended odor target both before and after stimulus onset. In contrast, prestimulus ensemble representations of the odor target in posterior piriform cortex (PPC) gave way to poststimulus representations of the odor itself.
In other words, the anterior piriform cortex (APC) – an area crucial for processing smells – and the orbitofrontal cortex (OFC) – an area involved in anticipating upcoming events – both showed activity patterns encoding the expected odor before, during, and after the actual smell was presented. But the posterior piriform cortex (PPC) encoded the expected odor only before the stimulus, then switched to encoding the odor actually presented – which strongly suggests that this region helps compare the scents we expect against those we actually experience.
The researchers take these discoveries to mean that activity in the PPC helps shape what we smell, but doesn’t completely control it:
The robustness of target-related patterns in PPC predicted subsequent behavioral performance. Our findings directly show that the brain generates predictive templates or “search images” in PPC, with physical correspondence to odor-specific pattern representations, to augment olfactory perception.
In short, while the PPC might help us distinguish one odor from another, it doesn’t directly dictate what we smell. This fits neatly with recent discoveries about priming in the visual pathway, which seems to function in much the same way.
It’s just one more confirmation that reality isn’t something we passively take in – it’s something our brains actively construct throughout each moment.
So, the next time something smells funny to you, take a moment to consider what you’re expecting to catch a whiff of. Your anticipation might be stinking up the joint more than you’d think.