Posts Tagged 'genetics'

Doubling Up

Our big brains may be the result of a doubled gene that lets brain cells migrate to new areas, says a new study.

SRGAP2 seems to have a lot to do with the human "inflate-o-brain."

The gene, known as SRGAP2, has been duplicated in our genomes at least twice in the four million years since our ancestors diverged from those of the other great apes. It codes for a protein that interferes with filopodia – tiny molecular structures that shape the growth of neurons in a developing brain. Researchers think that as SRGAP2's protein disrupted the "normal" growth of our ancestors' filopodia, millions of their neurons migrated outward to thicken the cerebral cortex – the outer "rind" of the cerebrum, where many of our most advanced cognitive functions are processed.

This SRGAP2 duplication isn’t just common – it’s universal: every human being alive today shares it. In fact, it’s one of 23 duplicated genes that are shared by every person’s genome, but aren’t shared with chimps, gorillas, and orangutans. Thus, geneticists think these 23 genes may be crucial parts of the instructions for building a full-fledged Homo sapiens.

A team led by Megan Dennis at the University of Washington examined SRGAP2 in the genomes of more than 150 people, and discovered that this duplication story has an interesting twist: about 3.4 million years ago, SRGAP2 was partially duplicated – and this partial duplication is missing in some people. But then, about 2.4 million years ago, a copy of that partial copy was created and added to chromosome 1. That copied copy is present in the genome of every human being alive today.

Dennis and her team studied the effects of this duplicated duplicate, and found that the version of SRGAP2 we all carry interferes with neurons’ ability to make filopodia. Since our great ape cousins don’t share this duplication, it seems reasonable to think that this filopodia “defect” played a major part in shaping the modern human brain.

This research is still in the pretty early stages (I’ve found plenty of popular-press reportage on it; but unless I’m missing something, there’s no peer-reviewed journal paper yet) but it still provides some exciting clues as to why our brains might be so different from those of our closest genetic relatives. I’m pretty interested to see what future studies on SRGAP2 will reveal about the structure of the human cerebral cortex – especially the prefrontal cortex (PFC). As I hear more news, I’ll keep you posted.

But for now, it’s off to kill some brain cells in the hope of making myself smarter. It’ll totally work – Science says so!

Autistic Genetics

Some forms of autism seem to be linked with variations in certain genes, a new study says.

One of these chromosomes is not like the others (maybe).

Deleting the mouse equivalent of a certain cluster of 27 genes on human chromosome 16 – specifically a region known as 16p11.2 – causes autism-like features to develop in mouse brains. These mice exhibited hyperactivity, repetitive behaviors, and difficulty adjusting to new environments, much like human children with autism. (As I mention a lot on this blog, mouse brains provide pretty reliable models of certain human brain functions, which is why neuroscientists experiment on them.)

The idea that chromosome 16 might be linked to autism dates back to 2007, when Michael Wigler at Cold Spring Harbor Laboratory (CSHL) discovered that many children with autism had a deletion of a certain set of 27 genes in region 16p11.2.

Tired of acronyms yet? I sure hope not, because here come some more.

As the journal Proceedings of the National Academy of Sciences (PNAS) reports, a team led by Guy Horev at CSHL genetically engineered mice to manifest this same chromosomal copy number variation (CNV):

We used chromosome engineering to generate mice harboring deletion of the chromosomal region corresponding to 16p11.2, as well as mice harboring the reciprocal duplication. These 16p11.2 CNV models have dosage-dependent changes in gene expression, viability, brain architecture, and behavior. For each phenotype, the consequence of the deletion is more severe than that of the duplication.

In short, a deletion of those 27 genes produced autism-like symptoms, while mice with an extra copy of region 16p11.2 didn’t seem to be autistic at all.
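To make that "dosage-dependent" idea concrete, here's a minimal Python sketch. The linear expression model and the severity weighting are invented for illustration – this is a hypothetical cartoon of gene dosage, not the study's actual model:

```python
# Hypothetical sketch of dosage dependence in region 16p11.2:
# fewer copies means less expression of its 27 genes, and the
# deletion (1 copy) is more severe than the duplication (3 copies).
# All numbers are illustrative, not measured values.

def expression_level(copy_number, per_copy=0.5):
    return copy_number * per_copy   # crude linear dosage model

def severity(copy_number):
    # Distance from the normal 2-copy expression level, weighted so
    # that losing dosage hurts more than gaining it (as the mouse
    # phenotypes suggest).
    delta = expression_level(copy_number) - expression_level(2)
    return abs(delta) * (2.0 if delta < 0 else 1.0)

for copies, label in [(1, "deletion"), (2, "typical"), (3, "duplication")]:
    print(label, severity(copies))
```

With these made-up weights, the deletion scores twice as "severe" as the duplication, mirroring the direction (though not the magnitude) of the mouse results.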

Interestingly, half the mice with the deletion died soon after birth. The ones that survived to adulthood were physically healthy and fertile, but when the researchers examined their brains with MRI scans, they found a set of neurological symptoms that were all too recognizable:

[The mice] have alterations in the hypothalamus and exhibit a “behavior trap” phenotype—a specific behavior characteristic of rodents with lateral hypothalamic and nigrostriatal lesions.

The hypothalamus is a brain region involved in regulating our hormones, balancing our body temperature, and motivating us to perform semi-automatic tasks like eating, drinking, and sleeping. The nigrostriatal pathway connects the midbrain’s substantia nigra to the forebrain’s striatum; it’s a major dopamine pathway that’s involved in motivating movement. Abnormalities in all these regions have been linked with autism in previous studies.

The next step in this research will be to pinpoint which of the 27 genes in region 16p11.2 influence autism's development, and how. The researchers hope future discoveries along those lines will help doctors diagnose autism early in a child's life, or perhaps even predict its likelihood from the parents' genetics.

Like other predictive genetic tests, this could lead to some tough ethical dilemmas for would-be mothers and fathers – but it's also likely to lead to more proactive treatments. And besides, even if you know your child will be born with autism, you never know if he or she might be the next Daniel Tammet or Temple Grandin.

Enzyme Alarm Clock

Researchers have isolated a protein that sounds our biological clock’s alarm each morning, a new study reports.

Energy-saving tip: try replacing your alarm clock with a live rooster!

A gene known as KDM5A codes for an enzyme (i.e., a protein that increases the rate of a chemical reaction) called JARID1a. This enzyme acts as a switch that starts the biochemical process of waking us from sleep – like some kind of weird molecular rooster.

Here’s the background: scientists have known for years that levels of a protein called PER rise in the morning and fall toward nighttime. The level of PER in our bodies helps our cells know what time of day it is – higher levels tell us it’s time to be awake, while lower levels help make us sleepy. This wake/sleep process is known as the circadian rhythm or circadian cycle.

Two genes – known as CLOCK and BMAL1 – help raise levels of PER. When those levels reach a certain critical point in the evening, CLOCK and BMAL1 stop triggering a rise in PER levels – and as those levels drop, our heart rate, blood pressure, and mental activity slow down in preparation for bedtime.

But this research marks a new discovery: a specific enzyme that restarts the circadian cycle in the morning, telling CLOCK and BMAL1 to start raising PER levels again.

As the journal Science reports, a team led by Satchidananda Panda at Salk's Regulatory Biology Laboratory collaborated with teams at McGill University and Albert Einstein College of Medicine to discover JARID1a's role in the circadian cycle. The teams genetically engineered fruit flies to under-produce JARID1a, and these flies seemed to have no idea what time of day it was – they woke and slept at random hours, and took naps throughout the day… much like college students.

Human and mouse cells engineered to produce less JARID1a also produced odd levels of PER:

JARID1a increased histone acetylation by inhibiting histone deacetylase 1 function and enhanced transcription by CLOCK-BMAL1 in a demethylase-independent manner. Depletion of JARID1a in mammalian cells reduced Per promoter histone acetylation, dampened expression of canonical circadian genes, and shortened the period of circadian rhythms.

In other words, less JARID1a led to shorter circadian cycles of PER production in the cells.

But wait – there’s more! Panda and his team also discovered that JARID1a counteracts the effects of a protein called HDAC1, which acts as a molecular brake on CLOCK and BMAL1 each night. From this, the scientists reason that rising levels of PER tell HDAC1 to put the brakes on its own production as the night goes on, which would eventually allow JARID1a to restart the cycle by kicking CLOCK and BMAL1 back into gear, which would start raising PER levels again. Now how’s that for an intricate clock?!
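The loop described above can be sketched as a toy simulation. Everything numeric here (the 0–10 PER scale, thresholds, one-hour steps) is invented for illustration; this is a cartoon of the logic, not the real biochemistry:

```python
# Cartoon of the CLOCK/BMAL1 - PER - HDAC1 - JARID1a feedback loop.
# All thresholds and rates are invented for illustration.

def simulate(hours=48, jarid1a_present=True):
    per = 2            # PER level on an arbitrary 0-10 scale
    drive_on = True    # CLOCK/BMAL1 actively raising PER
    trace = []
    for _ in range(hours):
        if drive_on:
            per += 1                # CLOCK and BMAL1 push PER up
            if per >= 10:           # evening: PER peaks,
                drive_on = False    # HDAC1 brakes CLOCK/BMAL1
        else:
            per = max(0, per - 1)   # PER decays overnight
            if per <= 2 and jarid1a_present:
                drive_on = True     # morning: JARID1a releases the brake
        trace.append(per)
    return trace

print(simulate()[:20])                       # oscillates up and down
print(simulate(jarid1a_present=False)[-5:])  # decays and stays flat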

The researchers confirmed this idea by inserting the JARID1a gene into the fruit flies that lacked it – and sure enough, JARID1a released the HDAC1 brake and put the flies on a normal circadian cycle.

Scientists hope this discovery will aid the development of certain types of drugs – for instance, as people age, their circadian cycles seem to shorten, and the researchers suspect this may have something to do with JARID1a. People with certain types of diabetes also tend to have out-of-whack circadian rhythms, so similar drugs might benefit them as well.

As for nocturnal college students and rockstars, though, I suspect the solution may not be quite so simple.

Optimistic Genetics

For the first time, scientists have pinpointed a particular gene variation linked with optimism and self-esteem, a new study reports.

A Genuine Scientific Image of the OXTR gene's G/G allele.

Two different versions – alleles – of the oxytocin receptor gene (OXTR) exist: one with the nucleotide "A" (adenine) at a certain location, and one with "G" (guanine) at that same location. Previous studies had found that people with at least one "A" at that location tended to have heightened sensitivity to stress and worse social skills.

But as the journal Proceedings of the National Academy of Sciences (PNAS) reports, a team led by UCLA's Shelley E. Taylor was able to correlate certain alleles of OXTR with specific psychological symptoms:

We report a link between the oxytocin receptor (OXTR) SNP rs53576 and psychological resources, such that carriers of the “A” allele have lower levels of optimism, mastery, and self-esteem, relative to G/G homozygotes. OXTR was also associated with depressive symptomatology.

In other words, people who have either two “A” nucleotides, or one “A” and one “G,” at that specific location have lower-than-normal levels of optimism and self-esteem, and much higher levels of depressive symptoms, than people with two “G” nucleotides at that location on the gene.

Meanwhile, people with two "G" nucleotides at that location on their OXTR gene are more likely to be able to buffer themselves against stress. This is one of the most precise correlations between a single-nucleotide difference and psychological traits yet discovered. And while this correlation doesn't determine behavior, it may well turn out to be a useful predictor.
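The genotype grouping the study compares is simple enough to sketch in a few lines of Python. The genotype strings below are made-up sample data, not records from the study:

```python
# Grouping rs53576 genotypes into the classes the study compares:
# "A" carriers (A/A or A/G) vs. G/G homozygotes.

def rs53576_group(genotype):
    alleles = set(genotype.split("/"))
    if not alleles <= {"A", "G"}:
        raise ValueError(f"unexpected allele in {genotype!r}")
    return "A carrier" if "A" in alleles else "G/G homozygote"

for g in ("A/A", "A/G", "G/G"):
    print(g, "->", rs53576_group(g))
```

The key point the code makes explicit: A/A and A/G land in the same group, which is why the study speaks of "carriers of the 'A' allele" rather than three separate genotypes.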

To figure this out, Taylor’s team studied the DNA from the saliva of 326 volunteers, and examined this data along with questionnaires each subject had completed. The questionnaires measured subjective feelings like self-worth, confidence, and positivity. The subjects also completed a set of questionnaires used to diagnose depression.

As is usual when stories like this – that is, about genes linked with certain traits – hit the press, there’ll probably be a flurry of articles with titles like “The Happy Gene,” making vague claims that the “gene for optimism” has been isolated. And that’s not what this study is about, at all – it’s about a connection between certain versions of a gene and the availability of certain psychological resources:

Some people think genes are destiny, that if you have a specific gene, then you will have a particular outcome. That is definitely not the case. This gene is one factor that influences psychological resources and depression, but there is plenty of room for environmental factors as well. Even people with the “A” variant can overcome depression and manage stress.

In short, what these allele differences mainly predict is a person’s susceptibility to certain psychological disorders if they encounter certain types of stress – not the likelihood that they’ll actually develop a given disorder.

It also remains to be seen what role, exactly, the neurotransmitter oxytocin, and its receptors, play in managing psychological troubles. As I’ve mentioned before, it’s been shown to lower stress and increase generosity – and it’s also involved in timing birth, encouraging hunger, and… heightening racist feelings.

Still, studies like this continue to bring us more accurate and precise methods of diagnosing mental disorders – and even of discovering whether a person might be at risk for them. It's also more evidence that our minds, like our bodies, are not all created equal – each is a unique neurochemical environment with its own thresholds of responsiveness.

So, next time somebody’s getting on your nerves, just tell them, “You better hope my oxytocin receptor genes are G/G alleles.” Take it from me: they’ll know exactly what you mean, and will probably back off and offer an apology.

Skin Into Brain

Scientists have discovered a way to convert human skin cells into working brain cells. Cue the Weird Science theme song!

"The neurons! They live! They liiiiiive!"

Using strands of microRNA molecules and a few carefully chosen genes, a team led by Dr. Sheng Ding at the Scripps Research Institute reprogrammed the genetic code of skin cells taken from a 55-year-old patient, transforming them into full-fledged neurons that actually synapse with each other.

As the lab’s report in the journal Cell Stem Cell1 explains, this new reprogramming method allows scientists to directly transform one cell type from an adult human into a normal, functional cell of a completely different type:

These human induced neurons (hiNs) exhibit typical neuronal morphology and marker gene expression, fire action potentials, and produce functional synapses between each other.

But this isn’t only a major breakthrough because the induced neurons work so well – it also represents a major leap forward in the field of cell transformation.

See, over the past few years, scientists have had some success at turning various kinds of human somatic (body) cells into artificial stem cells – known as induced pluripotent stem cells (iPSCs) – which can then be grown into various other types of cells.

This process is far from perfect – the successful conversion rate has often been less than one percent, and it's not yet clear whether iPSCs behave and grow exactly like their authentic stem cell equivalents. Still, iPSCs have proven to be useful tools for modeling diseases and testing drugs – and they may soon offer a less controversial alternative to research on human embryonic stem cells.

More recently, though, an even cooler technique has emerged: reprogramming one type of somatic cell directly into another type, without any intermediate iPSC stage. This is the technique Ding’s team used to make their induced neurons.

And that’s only the beginning, Ding says. For one thing, he thinks this method can be used to grow neurons for patients with Alzheimer’s and other neurodegenerative diseases – and they’ll be safe to transplant, because they’ll be grown from the patient’s own body:

Rather than using models made in yeast, flies or mice for disease research, all cell-reprogramming technology allows human brain, heart and other cells to be created from the skin cells of patients with a specific disease.

It’s also likely that as the technology improves, we’ll use it to grow transplantable tissue for hearts, lungs, and other organs – not to mention using it to study the development of various diseases in a specific patient’s body, in a controlled, safe environment:

“This will help us avoid any genome modifications,” said Dr. Ding. “These cells are not ready yet for transplantation, but this work removes some of the major technical hurdles to using reprogrammed cells to create transplant-ready cells for a host of diseases.”

As this research shows, we may soon be able to grow brand-new, guaranteed-compatible body parts from just a small sample of cells from the patient’s body. It looks like solutions to some age-old problems could be right on the horizon.


1. Clearly, Cell Press hired a whole team of marketing geniuses to come up with that name.

Inherited Memory

Scientists have discovered a molecular mechanism by which a parent’s experiences can alter the genes of its children.

Do you dare to learn the secrets of...The Chromosome?!

Several recent studies have demonstrated connections between environmental factors and inherited genetic traits – for instance, people whose parents lived through famine tend to have higher rates of diabetes and heart disease – but this latest research, published in the journal Nature, marks the first hard evidence of such a modification process at work.

Before we dive into the details, though, let’s back up a bit, and take a look at why this is such a Big Freakin’ Deal.

See, the whole concept of inheritance via experience has been – and will probably continue to be – a very hard sell to modern biologists. As you might remember from science class, the early-19th-century theory of Lamarckism, which proposed that traits could be modified through an individual's "use and disuse" of them, was blown out of the water by Darwin's theory of evolution by natural selection, which demonstrated – through mountains of hard data – that traits are shaped by countless tiny variations, and by the degree to which those variations help individuals produce successful offspring that carry them on.

For more than a century, the whole Lamarck debate seemed to be settled – but lately, it looks like some elements of Lamarckism may be making a surprise comeback, in a new field called epigenetics - the study of gene changes caused by chemical factors other than modifications to the DNA sequence itself:1

Epigenetic memory comes in various guises, but one important form involves histones — the proteins around which DNA is wrapped. Particular chemical modifications can be attached to histones and these modifications can then affect the expression of nearby genes, turning them on or off.

For example, in 2007, the journal Nature published data showing that exposure to extended periods of cold could cause plants to alter the molecular process by which their DNA was copied – thus “silencing” certain genes in their offspring. Then in 2009, MIT’s Technology Review discussed some animal studies with even more shocking results: apparently, a mouse’s environment can affect the memory capacity it passes on to its descendants.

But whereas those studies were mainly concerned with the effects of epigenetic alteration, this new research has gone a step further, and mapped a specific epigenetic process that turns genes on and off. A team led by Professor Martin Howard and Professor Caroline Dean at the John Innes Centre performed a mathematical analysis of experimental data, and discovered the mechanism by which plants exposed to extended cold periods produce descendants with delayed flowering:

Professor Howard produced a mathematical model of the FLC system. The model predicted that inside each individual cell, [a gene known as] FLC should be either completely activated or completely silenced, with the fraction of cells switching to the silenced state increasing with longer periods of cold.

Not only did the plants' gene expression pattern line up with the mathematical predictions – the team's research also showed that histone proteins were modified in a way that would alter the FLC gene, and that this alteration happened during the period of cold. This is the first true demonstration of histones' role in epigenetic modification.
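The model's central prediction – each cell's FLC is all-or-none, and longer cold raises the fraction of silenced cells – can be sketched as a toy stochastic simulation. The per-week switching probability is an invented number, not one measured by the team:

```python
import random

# Toy version of the FLC prediction: each cell is either fully active
# or fully silenced, and longer cold increases the *fraction* of
# silenced cells. p_per_week is an illustrative guess.

def silenced_fraction(cold_weeks, n_cells=10_000, p_per_week=0.3, seed=42):
    rng = random.Random(seed)
    silenced = 0
    for _ in range(n_cells):
        # each week of cold gives this cell another chance to flip off
        if any(rng.random() < p_per_week for _ in range(cold_weeks)):
            silenced += 1
    return silenced / n_cells

for weeks in (0, 2, 6):
    print(weeks, "weeks of cold ->", silenced_fraction(weeks))
```

Note that no individual cell is ever "partially silenced" here; the graded, population-level response emerges purely from the growing fraction of cells that have switched – which is exactly the digital-switch behavior the model predicted.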

At first glance, this might not seem to have a whole lot to do with neuroscience – but in a wider context, the implications of epigenetics look pretty incredible.

For plenty of obvious reasons, modern biological science has never been big on the idea of "genetic" or "ancestral" memories – recollections of specific events passed down to us from our ancestors (e.g., the Jungian collective unconscious). And yet, while epigenetic modifications don't offer much support for that idea, they do seem to suggest that our ancestors' experiences can affect the biological development – and thus, the minds – of future generations.

We’ve still got a long way to go before we understand all the ways in which gene expression affects brain development. But if histones can help mice pass their enriched learning capacity on to their children, the question of what humans can pass on seems, at the very least, worth investigating.


1. Just to be safe, I want to point out that, on the whole, the theory of evolution through natural selection still explains a great deal about how biological systems work, and its principles have been verified in a multitude of fields. At the risk of being over-obvious: these new discoveries don’t threaten the main trunk of evolutionary theory; they’re just pruning a few of its branches.

