Posts Tagged ‘learning’

Neuroscience Friends!

I’ve just returned from a thrilling weekend at the BIL Conference in Long Beach, California (yes, the pun on “TED” is very intentional) where I met all kinds of smart, fun people – including lots of folks who share my love for braaaiiins!

The conference was held in... The Future!

So I thought I’d introduce you guys to some of the friends I made. I think you’ll be as surprised – and as excited – as I am.

Backyard Brains
Their motto is “neuroscience for everyone” – how cool is that? They sell affordable kits that let you experiment at home with the nervous systems of insects and other creatures. They gave a super-fun presentation where I got to help dissect a cockroach and send electrical signals through its nerves.

Interaxon
They build all kinds of cutting-edge tools that let home users study their brain activity, and even control machines and art projects with it. Their founder, Ariel Garten, has a great TED talk here – I’ve rarely met anyone else who was so excited to have weird new neuroscience adventures.

Deltaself and Dangerously Hardcore
Two blogs by the very smart Naomi Most – the first is about how scientific data is changing the way we all understand our minds and bodies; the second is about hacking your own behavior to stay healthier and live better.

Halcyon Molecular
Their aim is to put the power to sequence and modify genomes in everyone’s hands within the next few decades. They’re getting some huge funding lately, and lots of attention in major science journals.

Bonus – XCOR Aerospace
They’re building a privately-funded suborbital spacecraft for independent science missions. If there’s anybody who can help us all join the search for alien life in the near future, I bet it’s these guys.

So check those links out and let me know what you think. I’d love to get these folks involved in future videos, especially if you’re interested in any of them.

Consider This an Invitation

This photo got me thinking. Only 24 percent? Really?

We’re finding weird new exoplanets every day – hell, NASA hasn’t even ruled out the possibility that there could be life on Europa and Titan, two moons in our own solar system – yet so many people have lost faith in space’s limitless potential to surprise us.

But we’re entering an age when that potential is no longer the exclusive domain of first-world governments and media conglomerates. The fact that we even have a contest like Google’s X Prize proves that independent space exploration is becoming a very real possibility for each one of us.

The question isn’t whether a private company is going to mount an alien-hunting expedition – it’s who’s gonna be the first to try?

Crazy? Of course it’s crazy! Every awesome expedition is!

So what do you guys say? I say it’s possible if we put our resources and our heads together. Even if we don’t find E.T., we’ll have one hell of a story to tell our grandkids.

Forget Me Not

Having trouble remembering where you left your keys? You can improve with a little practice, says a new study.

"I've forgotten more than you'll ever...wait, what was I saying?"

It’s an idea that had never occurred to me before, but one that seems weirdly obvious once you think about it: people who train their brains to recall the locations of objects for a few minutes each day show greatly improved ability to remember where they’ve left things.

No matter what age you are, you’ve probably had your share of “Alzheimer’s moments,” when you’ve walked into a room only to forget why you’re there, or set something down and immediately forgotten where you put it. Attention is a limited resource, and when you’re multitasking, there’s not always enough of it to go around.

For people sliding toward real Alzheimer’s disease, though, these little moments of forgetfulness can add up to a frustrating inability to complete even simple tasks from start to finish. This stage is known as mild cognitive impairment (MCI), and its symptoms can range from amnesia to problems with counting and logical reasoning.

That’s because all these tasks depend on memory – even if it’s just the working memory that holds our sense of the present moment together – and most of our memories are dependent on a brain structure called the hippocampus, which is one of the major areas attacked by Alzheimer’s.

What exactly the hippocampus does is still a hotly debated question, but it seems to help sync up neural activity when new memories are “written down” in the brain, as well as when they’re recalled (a process that rewrites the memory anew each time). So it makes sense that the more we associate a particular memory with other memories – and with strong emotions – the more easily even a damaged hippocampus will be able to help retrieve it.

But now, a team led by Benjamin Hampstead at the Emory University School of Medicine has made a significant breakthrough in rehabilitating people with impaired memories, the journal Hippocampus reports: the researchers have demonstrated that patients suffering from MCI can learn to remember better with practice.

The team took a group of volunteers with MCI and taught them a three-step memory-training strategy: 1) the subjects focused their attention on a visual feature of the room that was near the object they wanted to remember, 2) they memorized a short explanation for why the object was there, and 3) they imagined a mental picture that contained all that information.

Not only did the patients’ memory measurably improve after a few training sessions – fMRI scans also showed that the training physically changed their brains:

Before training, MCI patients showed reduced hippocampal activity during both encoding and retrieval, relative to HEC. Following training, the MCI MS group demonstrated increased activity during both encoding and retrieval. There were significant differences between the MCI MS and MCI XP groups during retrieval, especially within the right hippocampus.

In other words, the hippocampus in these patients became much more active during memory storage and retrieval than it had been before the training.

Now, it’s important to point out that this finding doesn’t necessarily imply improvement – studies have shown that decreased neural activity is often more strongly correlated with mastery of a task than increased activity is – but it does show that these people’s brains were learning to work differently as their memories improved.

So next time you experience a memory slipup, think of it as an opportunity to learn something new. You’d be surprised what you can train your brain to do with a bit of practice.

That is, as long as you remember to practice.

Connection Clusters

As our brains learn something, our neurons form new connections in clustered groups, says a new study.

Some clusters are juicier than others.

In other words, synapses – connections between neurons – are much more likely to form near other brand-new synapses than they are to emerge near older ones.

As our neuroscience friends like to say: “Cells that fire together wire together” – and that process of rewiring never stops. From before you were born right up until this moment, the synaptic pathways in your brain have been transforming, hooking up new electrochemical connections and trimming away the ones that aren’t needed. Even when you’re sound asleep, your brain’s still burning the midnight oil, looking for ever-sleeker ways to do its many jobs.
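
If you like seeing ideas like this in the abstract, here’s a minimal toy sketch in Python – my own made-up numbers, not anyone’s data – of the Hebbian rule that saying describes: connections between cells that fire together get stronger, while every synapse slowly fades and the weakest get trimmed away:

```python
import numpy as np

rng = np.random.default_rng(0)

n = 8                                   # a toy network: cells 0-3 and 4-7
w = np.full((n, n), 0.05)               # synaptic weights, all start weak
np.fill_diagonal(w, 0.0)

eta, decay, prune_below = 0.02, 0.99, 0.005

for step in range(2000):
    x = np.zeros(n)
    x[0:4] = rng.random() < 0.3         # group A tends to fire together...
    x[4:8] = rng.random() < 0.3         # ...and so does group B, independently
    w += eta * np.outer(x, x)           # Hebb: co-active pairs strengthen
    w *= decay                          # every synapse slowly fades...
    w[w < prune_below] = 0.0            # ...and the weakest get trimmed away
    np.fill_diagonal(w, 0.0)

within = w[0:4, 0:4][~np.eye(4, dtype=bool)].mean()
between = w[0:4, 4:8].mean()
print(f"within-group weight: {within:.2f}, between-group weight: {between:.2f}")
```

Run it and the within-group connections end up several times stronger than the between-group ones – cells that fire together, wiring together.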

I like to imagine that this happens to the sound of a really pumped-up drumbeat, as my brain says things like, “We can rebuild this pathway – we have the technology! We can make it better! Faster! Stronger!”

What’s even more amazing is how delicate these adjustments can be. We’re not just talking about growing dendrites here – we’re talking about dendritic spines, the tiny knobs that branch off from dendrites and bloom into postsynaptic densities – molecular interfaces that allow one neuron to receive information from its neighbors.

Back in 2005, a team led by Yi Zuo at the University of California Santa Cruz found that as a mouse learns a new task, thousands of fresh dendritic spines blossom from the dendrites of neurons in the motor cortex (an area of the brain that helps control movement). In short, they actually observed neurons learning to communicate better.

And now Zuo’s back with another hit, the journal Nature reports. This time, Zuo and her team have shown that those new dendritic spines aren’t just popping up at random – they grow in bunches:

A third of new dendritic spines (postsynaptic structures of most excitatory synapses) formed during the acquisition phase of learning emerge in clusters, and that most such clusters are neighbouring spine pairs.

The team discovered this by studying fluorescent mouse neurons under a microscope (Oh, did you know there are mice with glowing neurons? Because there are mice with glowing neurons.). As in Zuo’s earlier study, they focused on neurons in the motor cortex:

We followed apical dendrites of layer 5 pyramidal neurons in the motor cortex while mice practised novel forelimb skills.

But as it turned out, their discovery about clustered spines was just the tip of the iceberg – the researchers also found that when a second dendritic spine formed close to one that was already there, the first spine grew larger, strengthening the connection even more. And they learned that clustered spines were much more likely to persist than non-clustered ones were, which just goes to show the importance of a solid support network. And finally, they found that the new spines don’t form when just any signal passes through – new connections only blossom when a brain is learning through repetition.

Can you imagine how many new dendritic spines were bursting to life in the researchers’ brains as they learned all this? And what about in your brain, right now?

It’s kinda strange to think about this stuff, I know – even stranger is the realization that your brain isn’t so much an object as it is a process – a constantly evolving system of interconnections. You could say that instead of human beings, we’re really human becomings – and thanks to your adaptable neurons, each moment is a new opportunity to decide who – or what – you’d like to become.

Why I Love and Hate “Game”

Yes, it’s that special time of year again – time for flamboyant bouquets and chalky candy to appear at office desks – time for Facebook pages to drown in cloying iconography – time for self-labeled “forever aloners” to dredge the back alleys of OKCupid in last-ditch desperation – and time for me to load up my trusty gatling crossbow with oxytocin-tipped darts and hit the streets.

Valentine's Day also means it's time to enjoy the traditional dish of Earlobe.

Oh, and it’s time for everyone to complain about how misogynistic all this “Game” stuff is.

So, while I guess I could write about, say, a new study that says cutting your romantic partner some slack can make him or her more capable of actual change, or this one that says love and chocolate are good for cardiovascular health, I think it’ll be much more interesting to talk about what’s really on most of our minds today:

What does science have to say about “getting the girl” (or guy) of your dreams? And what do actual girls (and guys) think about it?

Let’s start with some full disclosure: about this time last year, I decided to see what all the fuss was about, and I read The Game for myself – and then I read some of the other works it cites, too. And I started talking to my friends (both male and female) about what they thought of the ideas in those books – and I tested a lot of the ideas I read, the same way I’d test any hypothesis: I wrote down the predictions various authors made, and checked how well those predictions lined up with my own real-world experiences.

In short, I went Full Geek on the topic.

What I learned is that, on the spectrum of scientific rigor – a scale from, say, astrology (0) to molecular chemistry (10) – most of this stuff falls somewhere in the 4-to-6 range: It tends to be more evidence-based than, say, ghost-hunting; but it still falls firmly into the realm of the “softer” sciences, like psychotherapy and so on.

The reason for this is that – as many pick-up artists freely admit – their craft is at least as much an artistic pursuit as a scientific one. Much like, say, Aristotle and Hobbes and Descartes, PUAs do their best to ground their conclusions logically in real-world data that anyone is free to test and refute – but at the same time, like those great philosophers of old, PUAs tend to be more intent on constructing elaborate thought systems than on presenting their “ugly” raw data for independent labs to crunch through.

This means pick-up manuals tend to read more like philosophical treatises than scientific papers.

And I think it’s this very feature of pick-up art that explains why it’s such a polarizing topic – why many women (and plenty of men) find the very concept insulting and distasteful, while other men swear that it’s transformed them from self-loathing losers into sexually fulfilled alpha males.

See, many women will tell you in no uncertain terms that pickup “tricks” don’t work on someone as intelligent and experienced as them; and that even if such tricks did work, they don’t want to be “picked up” –  instead, they want to fall in love (or at least in lust) with a man who’s honest about his real self and his real feelings. Many men, too, would agree that crafty seduction techniques somehow cheapen the process – that it’s better to be “forever alone” than to be surrounded by adoring women who were manipulated into their romantic feelings.

Meanwhile, men who’ve had “success” (however they choose to define it) as a result of a pick-up system’s techniques will often defend that system to the death – much like how a person who’s found inner peace thanks to, say, Buddhism will often defend it passionately against anti-Buddhist viewpoints.

What I’m arguing here, though, is that none of these reactions pertain directly to the underlying process of seduction at all – rather, they’re reactions to the (often sleazy-sounding) thought-systems that various writers have constructed around their experiences with that process.

Because – let’s get right down to it – in all our interactions with other humans, we’re hoping to manipulate the outcome somehow. Double entendres, pop-cultural references, stylish clothes and makeup, kind gestures, subtle dishonesty – even honesty itself – all these are tools and techniques that we hope will garner us a certain response.

For example, if you choose to callously manipulate the people around you, you may get a lot more sex than you would otherwise – but you’ll also end up with a lot of shallow relationships, which you’ll probably come to regret eventually. If you choose to be completely honest all the time, you may repel some people – but you’ll probably also find that those who stick around end up respecting you for who you really are.

It’s Game Theory 101: Players who “win” are those who understand the rules, risks and rewards of the game – and play accordingly. All the sleazy lingo and tricks – all the elaborate systems – are just various people’s attempts to explain these dynamics as they play out in gender relations, and to sell their vision of the process to a demographic of sex-starved men, whose desires they understand quite well.

But still – the underlying process itself is no more and no less sleazy than the mind of the person using it.

In other words, when you read between the lines of these PUA systems, most of them turn out to be geared toward the same premises: That to grow as a person, you need to 1) be fully honest with yourself about what you want from the people around you, 2) acknowledge the personal changes that need to be made in order to achieve those results, and 3) steadily work to make those changes in yourself.

From an evolutionary psychology perspective, it’s hard for me to see how that’s inherently more “cheap” than, say, a woman learning how to dress and speak seductively in order to get what she wants.

Yes, there are a lot of sleazy men out there who objectify women and sweet-talk them into one-night stands. There are also plenty of sweet-talking women out there who milk men for the contents of their wallets, then move on. And so we label each other “douchebags” and “bitches,” and keep engaging in the same defensive behaviors, and no one’s really happy.

And I hate that Game. I despise it.

At the same time, though, it’s clear that we humans, like many other animals, have evolved to play competitive social games – there’s no getting around that fact. But unlike many animals, we don’t have to play the game exactly as our instincts tell us to – we’re metacognitive, so we can learn to play using strategies that don’t result in zero-sum outcomes: We can develop tactics that help both sides get more of what they want. We can harness our evolutionary drives to mutually-beneficial behavior patterns.
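
For the curious, here’s that distinction in the simplest possible terms – two toy payoff tables in Python (the moves and numbers are mine, purely for illustration). In the zero-sum game, no joint choice can raise the total payoff; in the non-zero-sum game, cooperating makes the pie bigger for both players:

```python
# Payoffs are (player_1, player_2). Moves and numbers are made up.
zero_sum = {
    ("compete", "compete"): (0, 0),
    ("compete", "yield"):   (1, -1),   # my gain is exactly your loss
    ("yield",   "compete"): (-1, 1),
    ("yield",   "yield"):   (0, 0),
}

non_zero_sum = {                          # a simple "stag hunt" style game
    ("cooperate", "cooperate"): (3, 3),   # both sides win big together
    ("cooperate", "defect"):    (0, 2),
    ("defect",    "cooperate"): (2, 0),
    ("defect",    "defect"):    (1, 1),
}

for name, game in [("zero-sum", zero_sum), ("non-zero-sum", non_zero_sum)]:
    moves, payoff = max(game.items(), key=lambda kv: sum(kv[1]))
    print(f"{name}: best joint outcome {moves} pays {payoff}, total {sum(payoff)}")
```

In the first game the total is stuck at zero no matter what anyone does; in the second, the best joint outcome pays both players more than any defection could.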

Doesn’t that make you want to learn to play more creatively, instead of trying not to play at all?

I mean, at the end of the day, it kinda fills me with love for the Game.

What do you think?

Beyond Perfection

If you continue to practice a skill even after you’ve achieved mastery of it, your brain keeps learning to perform it more and more efficiently, says a new study.

Believing you've reached perfection can lead you to engage in some...interesting...behavior.

As we perform a task – say, dunking a basketball or playing a sweet guitar solo – over and over again, we eventually reach a point that some psychologists call “unconscious competence,” where we execute each movement perfectly without devoting any conscious attention to it at all. But even after this point, our bodies keep finding ways to perform the task more and more efficiently, burning less energy with each repetition.

This story’s got it all – brain-hacks, mysterious discoveries, robots – but to put it all in perspective, we’ve gotta start by talking about this idea we call perfection.

“Practice makes perfect,” the old saying goes – but what’s this “perfect” we’re trying to reach? Isn’t it often a matter of opinion? What I mean is, how do we judge, say, a “perfect” backflip or a “perfect” dive? We compare it to others we’ve seen, and decide that it meets certain criteria better than those examples did; that it was performed with less error.

But where do these criteria for perfection come from? Well, some have said there’s a Platonic realm of “perfect forms” that our minds are somehow tapping into – a realm that contains not only “The Perfect Chair” but “the perfect version of that chair” and “the perfect version of that other chair” and “the perfect version of that molecule” and so on, ad infinitum. Kinda weird, I know – but a lot of smart people believed in ideas like this for thousands of years, and some still do.

Science, though, works in a different way: Instead of trying to tap into a world of perfect forms, scientists (and engineers and mathematicians and programmers and so on) work to find errors and fix them.

And it turns out that the human body is quite talented at doing exactly that. A team led by Alaa Ahmed at the University of Colorado at Boulder found this out firsthand, with the help of robots, the Journal of Neuroscience reports:

Seated subjects made horizontal planar reaching movements toward a target using a robotic arm.

These researchers weren’t interested in brain activity – instead, as the volunteers practiced moving the arm, the researchers measured their oxygen consumption, their carbon dioxide output, and their muscle activity.

As you might expect, the scientists found that as people got better at moving the arm, their consumption of oxygen and production of carbon dioxide, and their overall muscle activity, steadily decreased:

Subjects decreased movement error and learned the novel dynamics. By the end of learning, net metabolic power decreased by ∼20% from initial learning. Muscle activity and coactivation also decreased with motor learning.

But the volunteers’ bodies didn’t stop there. As people kept practicing, their oxygen consumption and carbon dioxide output continued to decrease – and so did their muscle activation. In short, their bodies kept learning to move the arm with measurably less and less physical effort.
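
To make that shape concrete, here’s a toy model in Python – my own illustrative curves, not the paper’s data – in which movement error flattens out early while metabolic cost keeps creeping downward long afterward, ending up roughly 20% below where it started:

```python
import math

# Toy learning curves (illustrative numbers only, not the paper's data).
# Error plateaus quickly; metabolic cost keeps decaying long after.
for trial in range(0, 251, 50):
    error = math.exp(-trial / 15)             # movement error, arbitrary units
    power = 80 + 20 * math.exp(-trial / 120)  # net metabolic power, watts
    print(f"trial {trial:3d}: error {error:.3f}, metabolic power {power:.1f} W")
```

By the last trials the error score has long since bottomed out, but the energy bill is still shrinking – the same pattern the researchers describe.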

Though this study didn’t record any data from the subjects’ brains, it’s easy to see how this continual improvement is just one reflection of a very versatile ability. For instance, we know that when two neurons get really friendly, they become more sensitive to each other’s signals – and we also know that underused neural pathways gradually fade away, making room for new ones. Self-improvement impulses are woven deeply into our bodies – into our cells.

When I say that our brains and bodies are cities, I’m not just speaking metaphorically – you are, quite literally, a vast community – an ecosystem composed of trillions of interdependent microorganisms, each one constantly struggling for its own nourishment and safety.

And though your conscious mind is one part – a very significant part – of this great microscopic nation, it’s not the only part that can learn. At this moment, all throughout the lightless highways and chambers of your body, far below your conscious access, networks of cells are changing, adapting, learning, adjusting – finding errors and fixing them.

So, you can think about “perfection” all you want – but even at that magical moment when you achieve it, the multitudes within you are still hard at work, figuring out how to reach beyond that ideal.

What do you think they’re up to right now?

Learning Expectations

Researchers have isolated a specific pathway our brains use when learning new beliefs about others’ motivations, a new study says.

"M'lord! 'Tis improper to influence the lady's anterior cingulate!"

Though this type of learning, like many others, depends heavily on the neurotransmitter chemical dopamine’s influence in a set of ancient brain structures called the basal ganglia, it’s also influenced by the rostral anterior cingulate cortex (ACC) – a structure that helps us weigh certain emotional reactions against others – indicating that emotions like empathy also play crucial roles.

As we play competitively against other people, our brains get to work constructing mental models that aim to predict our opponents’ future actions. This means we’re not only learning from the consequences of our own actions, but figuring out the reasons behind others’ actions as well. This ability is known as theory of mind, and it’s thought to be one of the major mental skills that separates the minds of humans – and of our closest primate cousins – from those of other animals.

Though plenty of studies have examined the neural correlates of straightforward cause-and-effect learning, the process by which we learn from the actions of other people still remains somewhat unclear – largely because complex emotions like empathy and regret seem to involve many areas of the brain, including parts of the temporal, parietal and prefrontal cortices, as well as more ancient structures like the basal ganglia and cingulate cortex.

That’s why a team led by the University of Illinois’ Kyle Mathewson set out to track exactly what happens in our brains as we learn new ideas about others’ motivations, the journal Proceedings of the National Academy of Sciences reports.

The team used functional magnetic resonance imaging (fMRI) to study activity deep within volunteers’ brains as they played a competitive betting game against one another – focusing especially on moments when players learned whether they’d won or lost a round, and how much their opponents had wagered.

The researchers then used a computational model to match up patterns of brain activity with patterns of play – and found that the volunteers’ brains learned others’ behaviors and motivations through a complex interplay of several regions:

We found that the reinforcement learning (RL) prediction error was correlated with activity in the ventral striatum.

In other words, the ventral striatum – an area of the basal ganglia – was crucial for learning by reinforcement, much as the researchers expected…

In contrast, activity in the ventral striatum, as well as the rostral anterior cingulate (rACC), was correlated with a previously uncharacterized belief-based prediction error. Furthermore, activity in rACC reflected individual differences in degree of engagement in belief learning.

…while the anterior cingulate seemed to dictate how attentively players watched their opponents’ patterns of play, and how much thought they put into predicting those patterns.

Thus, it appears that theory of mind is built atop an ancient “substructure” of simple reinforcement learning, which supports layers of more emotionally complex attitudes and beliefs about others’ thoughts, feelings and motivations – many of which are influenced by our perceptions of our own internal feelings.
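
Here’s a minimal sketch of those two learning signals in Python – the update rules are textbook delta-rule simplifications, and the variable names and numbers are mine, not the study’s actual computational model:

```python
# Two simplified learning signals (names and numbers are mine):
# 1) reward prediction error: "was the outcome better than I expected?"
# 2) belief prediction error:  "did my opponent act as I predicted?"

alpha = 0.2                    # learning rate
value = 0.0                    # my expected payoff for the move I keep making
p_high = 0.5                   # my belief that the opponent bets high

rounds = [                     # (payoff I received, did the opponent bet high?)
    (1.0, True), (-1.0, True), (1.0, False), (1.0, True), (-1.0, True),
]

for payoff, bet_high in rounds:
    reward_pe = payoff - value            # reinforcement learning signal
    value += alpha * reward_pe            # (ventral striatum, per the study)

    belief_pe = float(bet_high) - p_high  # belief-based signal
    p_high += alpha * belief_pe           # (rACC, per the study)

    print(f"reward PE {reward_pe:+.2f} -> value {value:+.2f} | "
          f"belief PE {belief_pe:+.2f} -> P(high) {p_high:.2f}")
```

The first signal only cares whether the payoff beat expectations; the second only cares whether the opponent behaved as predicted – and the study’s point is that the brain appears to track both, in different places.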

And that points back to an important aspect of subjective experience in general: Many of our perceptions of the external world are extrapolated from our perceptions of our internal states. When we say, “It’s hot,” we really mean, “I feel hot”; when we say, “It’s loud in here,” we really mean, “It sounds loud to me.” In fact, the great philosopher Bertrand Russell went so far as to suggest that instead of saying, “I think,” it’d be more accurate to say “It thinks in me,” the same way we say “It’s raining.”

Anyway, no matter how you choose to phrase it, the point is that thinking isn’t a single process, but a relationship of many processes to one another. Which means that no matter how much we think we know, there’s always plenty left to learn.

Musical Learning

A new study throws some light on how musical aptitude can offset one very specific aspect of the aging process.

The question of Why Those Young Men Always Sound So Angry remains ripe for investigation.

In research comparing older adults with musical training to those without, the people who’d spent time regularly practicing or teaching music consistently displayed much faster neural reaction times to certain kinds of sounds.

The idea that the human brain has a deep relationship with music is obviously nothing new – but lately, research has been demonstrating more and more ways in which music is a major ingredient in mental health. For example, a 2007 study found that the brain reacts to music by automatically heightening attention, and one in 2010 found that an ear for harmony was correlated with a better ability to distinguish speech from noise.

The therapeutic implications of all this haven’t gone unnoticed. The neuroscientist Michael Merzenich has cured patients of chronic tinnitus (ear-ringing) by prescribing them musical training – and he’s had remarkable success using it to improve the responsiveness of autistic children.

Inspired by Merzenich’s work, a team led by Northwestern University’s Nina Kraus devised an experiment: they decided to record the reaction times of musicians’ brains when they heard certain sounds, and compare those against the reaction times of people with no musical training.

As the journal Neurobiology of Aging reports, the team used electrodes to record exactly how quickly the participants’ auditory cortex reacted to a variety of speech sounds.

They found that older musicians’ brains seemed to keep their youthful reaction speeds – at least when it came to a certain kind of sound: the syllable “da,” whose rapid consonant-to-vowel shift is known as a formant transition in science slang:

Although younger and older musicians exhibited equivalent response timing for the formant transition, older nonmusicians demonstrated significantly later response timing relative to younger nonmusicians … The main effect of musicianship observed for the neural response to the onset and the transition was driven solely by group differences in the older participants.

In other words, a musician’s brain responds to the “da” sound just as quickly as it did in youth – but a nonmusician’s response time slows down significantly with age.

The slowdown isn’t much – only a few milliseconds – but in brain time, that can be enough to cause problems. See, we’re not talking about conscious reaction time here – this is electrophysiological reaction time – the speed at which information travels in the brain.

Why does this matter? Because mental issues like autism, senile dementia and schizophrenia are all related to very slight timing errors in the brain’s elaborate communication patterns. An aging brain isn’t so much an old clock as an old city. Ever notice how the most ancient cities tend to be the ones with the weirdest cultures? Well, there ya go.

Just like old cities, though, autism and dementia and schizophrenia – and aging – can be scary sometimes, but they’re also the sources of great breakthroughs, and remarkable insights, and all sorts of conversations that couldn’t have happened otherwise.

What I’m saying is, the only measurable difference between a disorder and a gift is that one is helpful and the other isn’t. And in most cases, that difference really comes down to timing.

The Memory Master

A gene that may underlie the molecular mechanisms of memory has been identified, says a new study.

Some of us feel that "yellow" and "red" are open to interpretation...

The gene’s called neuronal PAS domain protein 4 (Npas4 to its friends). When a brain has a new experience, Npas4 leaps into action, activating a whole series of other genes that modify the strength of synapses – the connections that allow neurons to pass electrochemical signals around.

You can think of synapses as being a bit like traffic lights: a very strong synapse is like a green light, allowing lots of traffic (i.e., signals) to pass down a particular neural path when the neuron fires. A weaker synapse is like a yellow light – some signals might slip through now and then, but most won’t make it. Some synapses can inhibit others, acting like red lights – stopping any signals from getting through. And if a particular synapse goes untraveled for long enough, the road starts to crumble away – until finally, there’s no synapse left.
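
Just to push the analogy, here’s the traffic-light picture as a Python toy (entirely my own illustration – real synaptic dynamics are vastly more intricate): each connection has a strength, green lights pass signals, yellow lights let some slip through, red lights block, and long-untraveled roads crumble:

```python
import random

# Synaptic strengths (all made up): w >= 0.5 is a green light,
# 0 < w < 0.5 is yellow, and w <= 0 is a red (inhibitory) light.
synapses = {"A->B": 0.9, "A->C": 0.3, "B->C": -0.8, "C->D": 0.05}
last_used = {name: 0 for name in synapses}

def transmit(name, signal, now):
    w = synapses.get(name, 0.0)
    if w <= 0:
        return 0.0                                 # red: traffic stopped
    last_used[name] = now
    if w >= 0.5:
        return signal                              # green: traffic flows freely
    return signal if random.random() < w else 0.0  # yellow: some slip through

def prune(now, max_idle=100):
    """Roads that go untraveled for too long crumble away."""
    for name in [n for n, t in last_used.items() if now - t > max_idle]:
        del synapses[name], last_used[name]

print(transmit("A->B", 1.0, now=1))    # green light: the signal gets through
print(transmit("B->C", 1.0, now=1))    # red light: always 0.0
print(transmit("A->B", 1.0, now=150))  # this road stays in regular use
prune(now=150)                         # the idle roads crumble away...
print(sorted(synapses))                # ...leaving only "A->B" standing
```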

There’s a saying in neuroscience: “Cells that fire together wire together.” (And vice versa.) In other words, synaptic plasticity – the ability of neurons to modify their connectivity patterns – is what allows neural networks to physically change as they take in new information.  It’s what gives our brains the ability to learn.

In fact, millions of neurons are delicately tinkering with their connectivity patterns right now, inside your head, as you learn this stuff. Pretty cool, huh?

Anyway, synaptic plasticity’s not exactly breaking news – scientists have been studying it in animals like squid and sea slugs since the 1970s. Neurons in those animals are pretty easy to study with electrodes and a microscope, because a) the animals are anatomically simple compared to humans, and b) some of their neurons are so huge they can be seen with the naked eye.

Studying synapses in humans isn’t quite so simple, though. For one thing, most people wouldn’t like it if you cut open their brain and started poking around while they were alive and conscious – and besides, a lot of the really interesting stuff happens down at the molecular level.

That brings up an important point: though you normally hear about genes in connection with traits – say, a “gene for baldness” and so on – these complex molecular strands actually play all sorts of roles in the body, from building cells to adjusting chemical levels to telling other genes what to do.

That’s why MIT’s Yingxi Lin and her team set out to study the functions of certain genes found in the hippocampus – a brain structure central to memory formation – the journal Science reports. The researchers taught a group of mice to avoid a little room in which they received a mild electric shock, then used a precise chemical tracking technique to isolate which genes in the mouse hippocampus were activated right when the mice learned which room to avoid.

In particular, they focused on a hippocampal region with the sci-fi-sounding name of Cornu Ammonis 3 – or CA3 for short:

We found that the activity-dependent transcription factor Npas4 regulates a transcriptional program in CA3 that is required for contextual memory formation. Npas4 was specifically expressed in CA3 after contextual learning.

By “transcriptional program,” the paper’s authors mean a series of genetic “switches” – genes that Npas4 activates – which in turn make chemical adjustments that strengthen or weaken synaptic connections. In short, Npas4 appears to be part of the master “traffic conductor program” for many of the brain’s synapses.

Though they were pretty excited by this discovery (who wouldn’t be?), the researchers took a deep breath, calmed down, and double-checked their results by testing memory formation in mice whose brains were unable to produce Npas4:

Global knockout or selective deletion of Npas4 in CA3 both resulted in impaired contextual memory, and restoration of Npas4 in CA3 was sufficient to reverse the deficit in global knockout mice.

In short, they make a pretty convincing argument that Npas4 is a necessary ingredient in a mouse’s ability – and probably our ability – to form certain types of new memories.

Exactly how that program relates to our experience of memory remains unclear, but it’s a promising starting point for fine-tuning future memory research. I don’t know about you, but I’d be thrilled to green-light such a project.

Catchin’ Some Waves

Our capacity for short-term memory depends on the synchronization of two types of brainwaves – rapid cycles of electrical activation – says a new study.

Theta and gamma waves try to get their dance steps synced up.

When the patterns of theta waves (4-7 Hz) and gamma waves (25-50 Hz) are closely synchronized, pieces of verbal information seem to be “written” into our short-term memory. But it also turns out that longer theta cycles help us remember more bits of information, while longer gamma cycles are correlated with lower recall.

These patterns are measured using electroencephalography (EEG), a lab technique with a long and successful history. Back in the 1950s, it helped scientists unravel the distinct brainwave patterns associated with REM (rapid-eye movement) and deep sleep. More recently, it’s been used to help people with disabilities control computers, and it’s even helped home users get an up-close look at their own brain activity.

Though more modern techniques like fMRI and DTI are much better at mapping tiny activity patterns deep within the brain, EEG remains a useful tool for measuring the overall patterns of synchronized electrical activity that sweep across the entire brain in various wave-like patterns – hence the term “brainwaves.”

Several types of brainwaves have been well studied since the 1950s: alpha waves, which appear during relaxed, eyes-closed wakefulness; beta waves, which are associated with alert, focused thinking; delta waves, which dominate deep sleep; theta waves, which are associated with drowsiness and meditation; and gamma waves, which burst rapidly across the brain when we come to a realization or an understanding.

And now, as the International Journal of Psychophysiology reports, a team led by Jan Kamiński at the Polish Academy of Sciences has discovered a new way of mapping the relationships between these wave patterns, arriving at a new understanding of how theta and gamma waves work together: they studied the lengths of the two cycles relative to one another – and what they found was pretty amazing:

We have observed that the longer the theta cycles, the more information ‘bites’ the subject was able to remember; the longer the gamma cycle, the less the subject remembered.

The researchers discovered this in a very straightforward way – they simply kept tabs on volunteers’ EEG activity as they sat with eyes closed and let their minds wander; then they compared these recordings against ones taken as the volunteers memorized longer and longer strings of numbers – from three digits up to nine.

The correlation between long theta cycles and greater memory for digits turned out to be quite strong – and for gamma waves, the reverse turned out to be true. This means that gamma waves are probably much more crucial for forming ideas than they are for rote memorization.
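
One way to picture why the cycle lengths matter – this is the classic theta-gamma “multiplexing” idea, offered as my gloss rather than the paper’s claim – is that each remembered item occupies one gamma cycle, and a batch of items rides inside one theta cycle. Capacity then works out to roughly the theta period divided by the gamma period:

```python
def capacity(theta_hz, gamma_hz):
    """Rough item capacity if one item fits per gamma cycle inside a theta cycle."""
    theta_period = 1.0 / theta_hz   # longer theta cycle -> more room
    gamma_period = 1.0 / gamma_hz   # longer gamma cycle -> fewer items fit
    return theta_period / gamma_period

for theta_hz, gamma_hz in [(7, 50), (4, 50), (4, 25)]:
    print(f"theta {theta_hz} Hz, gamma {gamma_hz} Hz -> "
          f"~{capacity(theta_hz, gamma_hz):.1f} items")
```

Longer theta cycles raise the estimate and longer gamma cycles lower it – the same directions the EEG data point to.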

Though this finding might not seem all that revolutionary, it provides an elegant demonstration of how even older technologies like EEG can still be used to help us make brand-new discoveries. Which means that in the brains of those of us who keep pluggin’ away at home EEG experiments, there’s probably still a place of honor for those wonderful little gamma waves.
