Aging and its 11 hippocampal genes

Aging is being quite extensively studied these days, and here is another advance in the field. Pardo et al. (2017) looked at what happens in the hippocampus of 2-month-old (young) and 28-month-old (old) female rats. The hippocampus is a seahorse-shaped structure, no more than 7 cm in length and 4 g in weight, situated at the level of your temples, deep in the brain, and absolutely necessary for memory.

First the researchers tested the rats in a classical maze test (Barnes maze) designed to assess their spatial memory performance. Not surprisingly, the old performed worse than the young.

Then they dissected the hippocampi and looked at neurogenesis: the young rats had more newborn neurons than the old ones. Also, the old rats had more reactive microglia, a sign of inflammation. Microglia are small cells in the brain that are not neurons but serve very important functions.

After that, the researchers looked at the hippocampal transcriptome, meaning they looked at which genes are being expressed there (I know, transcription is not translation, but the general assumption of transcriptome studies is that the amount of protein X corresponds to the amount of RNA X). They found 210 genes that were differentially expressed in the old rats: 81 were upregulated and 129 were downregulated. Most of these genes, 170 to be exact, are found in humans too.

But after comparing these results with male versus female data and with human and mouse aging data, the authors came up with 11 genes that are deregulated (7 up- and 4 down-regulated) in the aging hippocampus, regardless of species or gender. These genes are involved in the inflammatory immune response. In more detail, the immune system activates microglia, which stay activated, and this “prolonged microglial activation leads to the release of pro-inflammatory cytokines that exacerbate neuroinflammation, contributing to neuronal loss and impairment of cognitive function” (p. 17). Moreover, these 11 genes have been associated with neurodegenerative diseases and brain cancers.


These are the 11 genes: C3 (up), Cd74 (up), Cd4 (up), Gpr183 (up), Clec7a (up), Gpr34 (down), Gapt (down), Itgam (down), Itgb2 (up), Tyrobp (up), Pld4 (down). “Up” and “down” indicate the direction of deregulation: upregulation or downregulation.

I wish the above sentence had been stated as explicitly in the paper as I wrote it here, so I didn’t have to comb through their supplemental Excel files to figure it out. Other than that, good paper, good work. It gets us closer to unraveling and maybe undoing some of the burdens of aging, because, as the actress Bette Davis said, “growing old isn’t for sissies”.

Reference: Pardo J, Abba MC, Lacunza E, Francelle L, Morel GR, Outeiro TF, Goya RG. (13 Jan 2017, Epub ahead of print). Identification of a conserved gene signature associated with an exacerbated inflammatory environment in the hippocampus of aging rats. Hippocampus, doi: 10.1002/hipo.22703. ARTICLE

By Neuronicus, 25 January 2017



Don’t eat snow

Who hasn’t stuck out a tongue to catch a few snowflakes? Probably only those who have never encountered snow.

The bad news is that snow, particularly urban snow, is bad, really bad for you. The good news is that this was not always the case, so there is hope that in the far future it will be pristine again.

Nazarenko et al. (2016) constructed a very clever contraption that reminds me of NASA space exploration instruments. The authors refer to it by the humble name of ‘environmental chamber’, but it is in fact a complex construction with different modules designed to measure how car exhaust and snow interact (see Fig. 1).

Fig. 1 from Nazarenko et al. (2016, DOI: 10.1039/c5em00616c). Released under CC BY-NC 3.0.

After many experiments, the researchers concluded that snow absorbs pollutants very effectively. Among the many kinds of organic compounds soaked up by snow within just one hour of exposure to exhaust fumes were the infamous BTEX (benzene, toluene, ethylbenzene, and xylenes). The amounts of these chemicals in the snow were not at all negligible; to give you an example, the BTEX concentration increased from virtually 0 to between 50 and 380 ug kg-1. The authors provide detailed measurements for all the 40+ compounds they identified.

Needless to say, many of these compounds are known carcinogens. Snow absorbs them, alters their size distributions, and then it melts… Some of them may be released back into the air, as they are volatile; some will go into the ground and rivers as polluted water. After this gloomy reality check, I’ll leave you with the words of the researchers:

“The accumulation and transfer of pollutants from exhaust – to snow – to meltwater need to be considered by regulators and policy makers as an important area of focus for mitigation with the aim to protect public health and the environment” (p. 197).


Reference: Nazarenko Y, Kurien U, Nepotchatykh O, Rangel-Alvarado RB, & Ariya PA. (Feb 2016). Role of snow and cold environment in the fate and effects of nanoparticles and select organic pollutants from gasoline engine exhaust. Environmental Science: Processes & Impacts, 18(2):190-199. doi: 10.1039/c5em00616c. ARTICLE | FREE FULLTEXT PDF

By Neuronicus, 26 December 2016



Soccer and brain jiggling

It is no news or surprise that strong hits to the head produce transient or permanent brain damage. But how about mild hits produced by light objects like, say, a volleyball or a soccer ball?

During a game of soccer, a player is allowed to touch the ball with any part of his/her body except the hands. Therefore, hitting the ball with the head, a.k.a. soccer heading, is a legal move, and goals scored with such a move are thought to be the most spectacular by the refined connoisseur.

A year back, in 2015, the United States Soccer Federation forbade the heading of the ball by children 10 years old and younger, after a class-action lawsuit against it. There have been some data showing that soccer players display a loss of brain matter associated with cognitive impairment, but such studies were correlational in nature.

Now, Di Virgilio et al. (2016) conducted a study designed to explore the consequences of soccer heading in more detail. They recruited 19 young amateur soccer players, mostly male, who were instructed to perform 20 rotational headings as if responding to corner kicks in a game. The ball was delivered by a machine at a speed of approximately 38 kph. The mean force of impact for the group was 13.1 ± 1.9 g. Immediately after the heading session and at 24 h, 48 h, and 2 weeks post-heading, the authors performed a series of tests, among them a transcranial magnetic stimulation (TMS) recording, a cognitive function assessment (using the Cambridge Neuropsychological Test Automated Battery), and a postural control test.

Not being a TMS expert myself, I was wondering how you record with a stimulator. TMS stimulates, it doesn’t measure anything. Or so I thought. The authors delivered brief (1 ms) stimulating pulses to the brain area that controls the leg (primary motor cortex). Then they placed an electrode over the corresponding muscle (the rectus femoris, part of the quadriceps) and recorded how the muscle responded. Pretty neat. Moreover, the authors believe that they can make inferences about the levels of inhibitory chemicals in the brain from the way the muscle responds. Namely, if the muscle is sluggish in responding to stimulation, then the brain released an inhibitory chemical, like GABA (gamma-aminobutyric acid), hence the name of this process: corticomotor inhibition. Personally, I find this GABA inference a bit of a leap of faith, but, like I said, I am not fully versed in TMS studies, so it may be well documented. Whether or not GABA is responsible for the muscle sluggishness, one thing is well documented: this sluggishness is the most consistent finding in concussions.

The subjects had impaired short-term and long-term memory function immediately after the ball heading, but not 24 h or more later. Also transient was the corticomotor inhibition. In other words, soccer ball heading results in measurable changes in brain function. Changes for the worse.

Even if these changes are transient, there is no knowing (as of yet) what prolonged ball heading might do. There is ample evidence that successive concussions have devastating effects on the brain. Granted, soccer heading does not produce concussions, at least in this paper’s setting, but I cannot think that even sub-concussion intensity brain disruption can be good for you.

On a lighter note, although the title of the paper features the word “soccer”, the rest of the paper refers to the game as “football”. I’ll let you guess the authors’ nationality, or at least their continent of provenance ;).


Reference: Di Virgilio TG, Hunter A, Wilson L, Stewart W, Goodall S, Howatson G, Donaldson DI, & Ietswaart M. (Nov 2016, Epub 23 Oct 2016). Evidence for Acute Electrophysiological and Cognitive Changes Following Routine Soccer Heading. EBioMedicine, 13:66-71. PMID: 27789273, DOI: 10.1016/j.ebiom.2016.10.029. ARTICLE | FREE FULLTEXT PDF

By Neuronicus, 20 December 2016

Amusia and stroke

Although a complete musical anti-talent myself, that doesn’t prohibit me from fully enjoying the works of the masters in the art. When my family is out of earshot, I even bellow – because it cannot be called music – from the top of my lungs alongside the most famous tenors ever recorded. A couple of days ago I loaded one of my most eclectic playlists. While remembering my younger days as an Iron Maiden concert goer (I never said I listen only to classical music :D) and screaming the “Fear of the Dark” chorus, I wondered what’s new on the front of music processing in the brain.

And I found an interesting recent paper about amusia. Amusia is, as those of you with ancient Greek proclivities might have surmised, a deficit in the perception of music, mainly of pitch but sometimes of rhythm and other aspects of music. A small percentage of the population is born with it, but a whopping 35 to 69% of stroke survivors exhibit the disorder.

So Sihvonen et al. (2016) decided to take a closer look at this phenomenon with the help of 77 stroke patients. These patients had an MRI scan within the first 3 weeks following the stroke and another one 6 months poststroke. They also completed a behavioral test for amusia within the first 3 weeks following the stroke and again 3 months later. For reasons undisclosed, and thus raising my eyebrows, the behavioral assessment was not performed at 6 months poststroke, nor an MRI at the 3-month follow-up. It would have been nice to have behavioral assessments and brain images at the same time points, because a lot can happen in weeks, let alone months, after a stroke.

Nevertheless, the authors used a novel way to look at the brain pictures, called voxel-based lesion-symptom mapping (VLSM). Well, it’s not really novel; it’s been around for 15 years or so. Basically, to ascertain the function of a brain region, researchers either find people with a specific brain lesion and then look for a behavioral deficit, or start from a symptom and then look for a brain lesion. Both approaches have distinct advantages but also disadvantages (see Bates et al., 2003). To overcome the disadvantages of these methods, enter VLSM, a mathematical/statistical gimmick that allows you to explore the relationship between brain and function without forming preconceived ideas, i.e. without forcing dichotomous categories. They also performed voxel-based morphometry (VBM), which is a fancy way of saying they checked whether the grey and white matter differed over time in the brains of their subjects.
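The core VLSM idea is easy to sketch: treat each voxel as its own little experiment and test whether patients lesioned there score worse on the behavior. Here is a toy illustration of that mass-univariate logic; all the data below are simulated, and real VLSM (as described in Bates et al., 2003) additionally corrects for the enormous number of voxelwise comparisons, e.g. with permutation tests.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Toy data: 77 patients, 1000 voxels. lesions[i, v] is True if voxel v
# is damaged in patient i; scores are made-up amusia test scores
# (lower = worse). Everything here is invented for illustration.
n_patients, n_voxels = 77, 1000
lesions = rng.random((n_patients, n_voxels)) < 0.2
scores = rng.normal(100, 10, n_patients)
# Pretend damage in voxels 0-49 (our stand-in "right temporal" cluster)
# causes the deficit: each lesioned voxel there costs 5 points.
scores = scores - 5.0 * lesions[:, :50].sum(axis=1)

# One t-test per voxel: do patients with damage here score worse?
t_map = np.full(n_voxels, np.nan)
for v in range(n_voxels):
    damaged, spared = scores[lesions[:, v]], scores[~lesions[:, v]]
    if damaged.size > 1 and spared.size > 1:
        t_map[v] = stats.ttest_ind(damaged, spared).statistic

# Voxels whose damage tracks the deficit end up with strongly negative
# t values; the rest hover around zero.
```

The point of the exercise is that no anatomical category is imposed beforehand: the map of t values itself reveals which voxels matter for the symptom.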

After much analysis, Sihvonen et al. (2016) conclude that damage to the right hemisphere is more likely to be conducive to amusia, as opposed to aphasia, which is due mainly to damage to the left hemisphere. More specifically,

“damage to the right temporal areas, insula, and putamen forms the crucial neural substrate for acquired amusia after stroke. Persistent amusia is associated with further [grey matter] atrophy in the right superior temporal gyrus (STG) and middle temporal gyrus (MTG), locating more anteriorly for rhythm amusia and more posteriorly for pitch amusia.”

The more we know, the better chances we have to improve treatments for people.


(Unless you’re left-handed, in which case things may be reversed.)


1. Sihvonen AJ, Ripollés P, Leo V, Rodríguez-Fornells A, Soinila S, & Särkämö T. (24 Aug 2016). Neural Basis of Acquired Amusia and Its Recovery after Stroke. Journal of Neuroscience, 36(34):8872-8881. PMID: 27559169, DOI: 10.1523/JNEUROSCI.0709-16.2016. ARTICLE  | FULLTEXT PDF

2. Bates E, Wilson SM, Saygin AP, Dick F, Sereno MI, Knight RT, & Dronkers NF (May 2003). Voxel-based lesion-symptom mapping. Nature Neuroscience, 6(5):448-50. PMID: 12704393, DOI: 10.1038/nn1050. ARTICLE

By Neuronicus, 9 November 2016


Another puzzle piece in the autism mystery

Just like in the case of schizophrenia, hundreds of genes have been associated with autistic spectrum disorders (ASDs). Here is another candidate.


Féron et al. (2016) reasoned that most of the info we have about the genes that behave badly in ASDs comes from studies that used adult cells. Because ASDs are present before or very shortly after birth, they figured that looking for genetic abnormalities in cells at the very early stages of ontogenesis might prove enlightening. Those cells are stem cells. Of the pluripotent kind. FYI, based on what they can become (a.k.a. how potent they are), stem cells are divided into totipotent, pluripotent, multipotent, oligopotent, and unipotent. So the pluripotents are very ‘potent’ indeed, having the potential of producing practically any part of a person.

Tongue-twisters aside, the authors’ approach is sensible, albeit not hypothesis-driven. Which means they didn’t have anything specific in mind when they started looking for differences in gene expression between olfactory nasal cells obtained from 11 adults with ASD and from 11 age-matched normal controls. Luckily for them, since transcriptome studies have a tendency to be difficult to replicate, they found anomalies in the expression of genes that had already been associated with ASD. But they also found a new one, the MOCOS (MOlybdenum COfactor Sulfurase) gene, which was poorly expressed in the ASD group (downregulated, in genetic speak). The gene encodes the enzyme MOCOS (am I the only one who thinks that MOCOS isolated from nasal cells sounds too similar to mucus? is the acronym actually a backronym?).

The enzyme was not known to play any role in the nervous system. Therefore, the researchers looked to see where the gene is expressed. Its enzyme could be found all over the brain in both mouse and human, and also in the intestine, kidneys, and liver. So not much help there.

Next, the authors deleted this gene in a worm, Caenorhabditis elegans, and found that the worm’s cells had trouble dealing with oxidative stress (e.g. the toxic effects of free radicals). In addition, its neurons had abnormal synaptic transmission due to problems with vesicular packaging.

Then they managed – with great difficulty – to produce human induced pluripotent stem cells (iPSCs) in a Petri dish in which the MOCOS gene was partially knocked down. ‘Partially’, because the totally knocked-down cells did not survive. Which tells us that MOCOS is necessary for the survival of iPSCs. The mutant cells had fewer synaptic boutons than the normal cells, meaning they formed fewer synapses.

The study, besides identifying a new candidate for diagnosis and treatment, offers potential explanations for some beguiling data that other studies have brought forth, like the fact that all sorts of neurotransmitter systems and all sorts of brain regions seem to be impaired in ASDs, making it very hard to grab the tiger by the tail when the tiger sprouts a new tail every time you look at it, just like the Hydra’s heads. But discovering a molecule involved in a ubiquitous process like synapse formation may provide a way to leave the tiger’s tail(s) alone and focus on the teeth. In the authors’ words:

“As a molecule involved in the formation of dense core vesicles and, further down, neurotransmitter secretion, MOCOS seems to act on the container rather than the content, on the vehicle rather than one of the transported components” (p. 1123).

The knowledge uncovered by this paper makes a very good piece of the ASDs puzzle. Maybe not a corner, but a good edge. Alright, even if it’s not an edge, at least it’s a crucial piece full of details, not one of those sky pieces.

Reference: Féron F, Gepner B, Lacassagne E, Stephan D, Mesnage B, Blanchard MP, Boulanger N, Tardif C, Devèze A, Rousseau S, Suzuki K, Izpisua Belmonte JC, Khrestchatisky M, Nivet E, & Erard-Garcia M (Sep 2016, Epub 4 Aug 2016). Olfactory stem cells reveal MOCOS as a new player in autism spectrum disorders. Molecular Psychiatry, 21(9):1215-1224. PMID: 26239292, DOI: 10.1038/mp.2015.106. ARTICLE | FREE FULLTEXT PDF

By Neuronicus, 31 August 2016

One parent’s gene better than the other’s

Not all people with the same bad genetic makeup that predisposes them to a particular disease go and develop that disease or, at any rate, not with the same severity and prognosis. The question is why? After all, they have the same genes…

Here comes a study that answers that very important question. Eloy et al. (2016) looked at the most common pediatric eye cancer (1 in 15,000), called retinoblastoma (Rb). In the hereditary form of this cancer, the disease occurs if the child carries a mutant (i.e. bad) copy of the RB1 tumour suppressor gene located on chromosome 13 (13q14). These gene copies, called alleles, are inherited by the child from the mother or from the father. But some children with this genetic disadvantage do not develop Rb. They should, so why not?

The authors studied 57 families with Rb history. They took blood and tumour samples from the participants and then did a bunch of genetic tests: DNA, RNA, and methylation analyses.

They found out that when the mutant RB1 allele is inherited from the mother, the child has only a 9.7% chance of developing Rb, but when it is inherited from the father, the child has a 67.5% chance of developing Rb.

The mechanism behind these different outcomes may reside in the differential methylation of the gene. Methylation is a chemical process that suppresses the expression of a gene, meaning that less protein is produced from it. The maternal gene had less methylation, meaning that more protein was produced, which was able to offer some protection against the cancer. This seems counter-intuitive – you’d think less of a bad protein would be a good thing – but there is a long and complicated explanation for it, which, in a very simplified form, posits that other events influence the function of the resultant protein.

Again, epigenetics seem to offer explanations for pesky genetic inheritance questions. Epigenetic processes, like DNA methylation, are modalities through which traits can be inherited that are not coded in the DNA itself.


Reference: Eloy P, Dehainault C, Sefta M, Aerts I, Doz F, Cassoux N, Lumbroso le Rouic L, Stoppa-Lyonnet D, Radvanyi F, Millot GA, Gauthier-Villars M, & Houdayer C (29 Feb 2016). A Parent-of-Origin Effect Impacts the Phenotype in Low Penetrance Retinoblastoma Families Segregating the c.1981C>T/p.Arg661Trp Mutation of RB1. PLoS Genetics, 12(2):e1005888. eCollection 2016. PMID: 26925970, PMCID: PMC4771840, DOI: 10.1371/journal.pgen.1005888. ARTICLE | FREE FULLTEXT PDF

By Neuronicus, 24 July 2016

Stress can kill you and that’s no metaphor

The term ‘heartbreak’ is used as a metaphor to describe the intense feeling of loss, sometimes also called emotional pain. But what if the metaphor has roots in something more tangible than a feeling – in the actual muscular organ giving signs of failure?

Although there have been previous reports that found stress causes cardiovascular problems, including myocardial infarction, Graff et al. (2016) conducted the largest study to date investigating this link: they had almost 1 million subjects. That’s right, 1 million people (well, actually 974 732). Out of these, almost 20% had a partner who died between 1995 and 2014. The chosen stressor was the loss of a loved one because “the loss of a partner is considered one of the most severely stressful life events and is likely to affect most people, independently of coping mechanisms” (p. 1-2). The authors looked at Danish hospital records for people who were diagnosed with atrial fibrillation (AF) for the first time and correlated that data with bereavement information. AF increases the risk of death due to stroke or heart failure.

The people who suffered a loss had an increased risk of developing AF for 1 year after the loss. The risk was most pronounced in the first 8-14 days after the loss, when the bereaved had a 90% higher risk of developing AF than non-bereaved people. By the end of the first month the risk had declined, but it was still a whopping 41% higher than average. Only 1 year after the loss was the risk of developing AF similar to that of non-bereaved people.
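For readers unused to risk statistics, "90% higher risk" is a ratio of incidences between groups, not an absolute probability. A minimal back-of-the-envelope sketch of that arithmetic, with invented counts (the paper reports adjusted ratios from Danish registry data, not these raw numbers):

```python
# Hedged illustration of what "X% higher risk" means as a risk ratio.
# All counts below are hypothetical, chosen only to reproduce the
# reported 90% and 41% figures.

def relative_risk(cases_exposed, n_exposed, cases_unexposed, n_unexposed):
    """Risk in the exposed (bereaved) group divided by risk in the
    unexposed (non-bereaved) group."""
    return (cases_exposed / n_exposed) / (cases_unexposed / n_unexposed)

# Hypothetical: 19 vs 10 new AF diagnoses per 100,000 person-weeks
# in days 8-14 after the loss
rr_peak = relative_risk(19, 100_000, 10, 100_000)        # ratio of 1.9
# Hypothetical: 141 vs 100 per 1,000,000 by the end of the first month
rr_month = relative_risk(141, 1_000_000, 100, 1_000_000)  # ratio of 1.41

print(f"{rr_peak - 1:.0%} higher risk at the peak")       # 90% higher risk
print(f"{rr_month - 1:.0%} higher risk at one month")     # 41% higher risk
```

So a ratio of 1.9 is reported as "90% higher", which in this hypothetical still means only 19 rather than 10 cases per 100,000 person-weeks: an elevated relative risk, but a small absolute one.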

The risk was even higher in young people or if the death of the partner was unexpected. The authors also looked to see if other variables play a role in the risk, like gender, civil status, education, diabetes, or cardiovascular medication and none influenced the results.

I suspect the number of people that have heart problems after major stress is actually a lot higher because of the under-reporting bias. In other words, not everybody who feels their heart aching would go to the hospital, particularly in the first couple of weeks after losing a loved one.

As for the mechanism, there is some data pointing to stress hormones (like adrenaline or cortisol), which can damage the heart. Other substances released in abundance during stress, and likely to act in concert with the stress hormones, are proinflammatory cytokines, which can also lead to arrhythmias.

Reference: Graff S, Fenger-Grøn M, Christensen B, Søndergaard Pedersen H, Christensen J, Li J, & Vestergaard M (2016). Long-term risk of atrial fibrillation after the death of a partner. Open Heart, 3: e000367. doi:10.1136/openhrt-2015-000367. Article | FREE FULLTEXT PDF

By Neuronicus, 16 April 2016

Eating high-fat dairy may lower your risk of being overweight

Many people buy low-fat dairy, like 2% milk, in the hope that ingesting less fat means they will put on less fat themselves.

Contrary to this popular belief, a new study found that consumption of high-fat dairy lowers the risk of weight gain by 8% in middle-aged and elderly women.

Rautiainen et al. (2016) studied 18 438 women over 45 years old who did not have cancer, diabetes, or cardiovascular diseases. They collected data on the women’s weight, eating habits, smoking, alcohol use, physical activity, medical history, hormone use, and vitamin intake for 8 to 17 years. “Total dairy product intake was calculated by summing intake of low-fat dairy products (skim and low-fat milk, sherbet, yogurt, and cottage and ricotta cheeses) and high-fat dairy products (whole milk, cream, sour cream, ice cream, cream cheese, other cheese, and butter)” (p. 980).

At the beginning of the study, all women included in the analyses were normal weight.

Over the course of the study, all women gained some weight, probably as a result of normal aging.

Women who ate more dairy gained less weight than women who didn’t. This finding is due to the high-fat dairy intake; in other words, women who ate high-fat dairy gained less weight than women who consumed low-fat dairy. Skimmed milk seemed to be worse for weight gain than low-fat yogurt.

I did not notice any speculation as to why this may be the case, so I’ll offer one: maybe people who eat high-fat dairy get more calories from the same amount of food, so maybe they eat less overall.

Reference: Rautiainen S, Wang L, Lee IM, Manson JE, Buring JE, & Sesso HD (Apr 2016, Epub 24 Feb 2016). Dairy consumption in association with weight change and risk of becoming overweight or obese in middle-aged and older women: a prospective cohort study. The American Journal of Clinical Nutrition, 103(4): 979-988. doi: 10.3945/ajcn.115.118406. Article | FREE FULLTEXT PDF | SuppData

By Neuronicus, 7 April 2016

Cats and uncontrollable bursts of rage in humans


That many domestic cats carry the parasite Toxoplasma gondii is no news. Nor is the fact that 30-50% of the global population is infected with it, mainly as a result of contact with cat feces.

The news is that individuals with toxoplasmosis are a lot more likely to have episodes of uncontrollable rage. It was previously known that toxoplasmosis is associated with some psychological disturbances, like personality changes or cognitive impairments. In this new longitudinal study (meaning a study that followed its subjects over time – here, more than a decade), published three days ago, Coccaro et al. (2016) tested 358 adults with or without psychiatric disorders for toxoplasmosis. They also submitted the subjects to a battery of psychological tests for anxiety, impulsivity, aggression, depression, and suicidal behavior.

The results showed that all the subjects who were infected with T. gondii had higher scores on aggression, regardless of their mental status. Among the people with toxoplasmosis, the aggression scores were highest in patients previously diagnosed with intermittent explosive disorder, a little lower in patients with non-aggressive psychiatric disorders, and lowest (but still significantly higher than in non-infected people) in healthy people.

The authors are adamant in pointing out that this is a correlational study, therefore no causal direction can be inferred. So don’t kick out your felines just yet. However, as the CDC points out, a little more care when changing the cat litter or a little more vigorous washing of the kitchen counters would not hurt anybody and may protect against T. gondii infection.

Reference: Coccaro EF, Lee R, Groer MW, Can A, Coussons-Read M, & Postolache TT (23 March 2016). Toxoplasma gondii Infection: Relationship With Aggression in Psychiatric Subjects. The Journal of Clinical Psychiatry, 77(3): 334-341. doi: 10.4088/JCP.14m09621. Article Abstract | FREE Full Text | The Guardian cover

By Neuronicus, 26 March 2016

Younger children in a grade are more likely to be diagnosed with ADHD

A few weeks ago I was drawing attention to the fact that some children diagnosed with ADHD do not have attention deficits. Instead, a natural propensity for seeking more stimulation may have led to overdiagnosing and overmedicating these kids.

Another reason for the dramatic increase in ADHD diagnoses over the past couple of decades may stem from the increasingly age-inappropriate demands that we place on children. Namely, children in the same grade can be as much as 1 year apart in chronological age, and at these young ages 1 year means quite a lot in terms of cognitive and behavioral development. So if we set a standard of expectations based on how the older children behave, the younger children in the same grade will fall short of that standard simply because they are too immature to live up to it.

So what does the data say? Two studies, Morrow et al. (2012) and Chen et al. (2016), checked to see if the younger children in a given grade are more likely to be diagnosed with ADHD and/or medicated. The first study was conducted in almost 1 million Canadian children, aged 6-12 years, and the second investigated almost 400,000 Taiwanese children, aged 4-17 years.

In Canada, the cut-off date for starting school is Dec. 31. Which means that in the first grade, a child born in January is almost a year older than a child born in December. Morrow et al. (2012) concluded that children born in December were significantly more likely to receive a diagnosis of ADHD than those born in January (30% more likely for boys and 70% for girls). Moreover, the children born in December were more likely to be given an ADHD medication prescription (41% more likely for boys and 77% for girls).

In Taiwan, the cut-off date for starting school is August 31. Similar to the Canadian study, Chen et al. (2016) found that children born in August were more likely to be diagnosed with ADHD and to receive ADHD medication than children born in September.

Now let’s be clear on one thing: ADHD is no trivial matter. It is a real disorder. It’s an incredibly debilitating disease for both children and their parents. Impulsivity, inattention, and hyperactivity are the hallmarks of almost every activity the child engages in, leading to very poor school performance (the majority cannot get a college degree) and a hard family life, plus a lifetime of stigma that brings its own “gifts” such as marginalization, loneliness, depression, anxiety, poor eating habits, etc.

The data presented above favor the “immaturity hypothesis”, which posits that the behaviors expected of some children are not performed not because something is wrong with the children, but because they are simply too immature to perform them. That does not mean that every child diagnosed with ADHD will just grow out of it; the researchers merely point out that ignoring the chronological age of the child, coupled with prematurely entering a highly stressful and demanding system such as school, might lead to ADHD overdiagnosis.

Bottom line: ignoring the chronological age of the child might explain some of the increase in the prevalence of ADHD through overdiagnosis (in the US alone, the rise is from 6% of children diagnosed with ADHD in 2000 to 11-15% in 2015).


  1. Morrow RL, Garland EJ, Wright JM, Maclure M, Taylor S, & Dormuth CR. (17 Apr 2012, Epub 5 Mar 2012). Influence of relative age on diagnosis and treatment of attention-deficit/hyperactivity disorder in children. Canadian Medical Association Journal, 184 (7), 755-762, doi: 10.1503/cmaj.111619. Article | FREE PDF 
  2. Chen M-H, Lan W-H, Bai Y-M, Huang K-L, Su T-P, Tsai S-J, Li C-T, Lin W-C, Chang W-H, Pan T-L, Chen T-J, & Hsu J-W. (10 Mar 2016). Influence of Relative Age on Diagnosis and Treatment of Attention-Deficit Hyperactivity Disorder in Taiwanese Children. The Journal of Pediatrics [Epub ahead of print]. Article | FREE PDF

By Neuronicus, 14 March 2016