The Kallmann syndrome should be Maestre syndrome (or MSJJ)

Few Generation Xers are unfamiliar with the cult hit TV series Twin Peaks. Of those who know it, even fewer find the dwarf dancing in the Red Room anything less than scary as hell. And fewer still know who the man singing “Sycamore Trees” in that room is. To refresh your memory, here is the clip.

The man with the unusual voice, to match all the rest of the… um… “unusualness”, is none other than Jimmy Scott, a phenomenal jazz singer. If you listen to him, particularly in the hit “I’m Afraid The Masquerade Is Over”, you’ll notice that, if you close your eyes, you might be unsure whether the voice belongs to an adult man or a very gifted prepubertal boy.

And that is because Jimmy Scott suffered his entire life from a rare and obscure disease called Kallmann syndrome. This is a genetic disorder that prevents a person from starting or fully completing puberty. And that is because they have low circulating sex hormones: testosterone in males, estrogen and progesterone in females. And that is because their hypothalamus does not produce enough gonadotropin-releasing hormone (GnRH), or does not produce it at the proper times. And that is because, sometime during the first trimester of pregnancy, there was an abnormality in the development of the olfactory fibers. You might ask what smell development has to do with sex hormones. Well, I’m glad you asked. The cells that will end up releasing GnRH during puberty need to migrate from where they are born (the nasal epithelium) to the hypothalamus. If something impedes said migration, like, say, a mass, then the cells cannot reach their destination and you get a person who cannot start or finish puberty, plus a few other symptoms (Teixeira et al., 2010).

As some of you might have already surmised, something else besides the lack of puberty must happen to these people regarding their sense of smell. After all, their olfactory fibers got tangled during embryonic development, probably forming a benign tumor, a neuroma. Not surprisingly, the person ends up with a severely diminished or absent sense of smell, information that can be used clinically for diagnosis (Yu et al., 2022).

The first one to notice that people who fail to start or fully complete puberty are also anosmic was Aureliano Maestre de San Juan, a Spanish scientist. Unfortunately, the syndrome he documented in 1856 was not named after him, but after the German scientist Franz Joseph Kallmann, who described it almost a century later, in 1944 (Martin et al., 2011). Not only did Kallmann not discover the syndrome himself, but he was a staunch supporter of “racial hygiene”, advocating for finding and sterilizing the relatives of people with schizophrenia so as to eradicate the disease from future generations. Ironically, because of his Jewish heritage, he fled Germany in 1939 for a country that enthusiastically embraced eugenics and ran its own sterilization programs: the USA (Benbassat, 2016).

So, I hereby propose, in this obscure unvisited corner of the Internet, to rename the disease the Maestre syndrome, or MSJJ, after the man who actually noticed it first and published about it. Besides, it’s his birthday today, having been born on October 17, 1828.

REFERENCES (in order of appearance):

Yu B, Chen K, Mao J, Hou B, You H, Wang X, Nie M, Huang Q, Zhang R, Zhu Y, Sun B, Feng F, Zhou W, & Wu X (2022, Sep 22). The diagnostic value of the olfactory evaluation for congenital hypogonadotropic hypogonadism. Frontiers in Endocrinology (Lausanne), 13: 909623. doi: 10.3389/fendo.2022.909623, PMCID: PMC9523726, PMID: 36187095. ARTICLE | FREE FULLTEXT PDF

Teixeira L, Guimiot F, Dodé C, Fallet-Bianco C, Millar RP, Delezoide A-L, & Hardelin J-P (2010, Oct 1, Published online 2010 Sep 13). Defective migration of neuroendocrine GnRH cells in human arrhinencephalic conditions. Journal of Clinical Investigation, 120(10): 3668–3672. doi: 10.1172/JCI43699, PMCID: PMC2947242, PMID: 20940512. ARTICLE | FREE FULLTEXT PDF

Benbassat, CA (2016, Published online 2016 Apr 19). Kallmann Syndrome: Eugenics and the Man behind the Eponym. Rambam Maimonides Medical Journal, 7(2): e0015. doi: 10.5041/RMMJ.10242, PMCID: PMC4839542, PMID: 27101217. ARTICLE | FREE FULLTEXT ARTICLE

Martin C, Balasubramanian R, Dwyer AA, Au MG, Sidis Y, Kaiser UB, Seminara SB, Pitteloud N,  Zhou Q-Y, & Crowley, Jr WF (2011, Published online 2010 Oct 29). The Role of the Prokineticin 2 Pathway in Human Reproduction: Evidence from the Study of Human and Murine Gene Mutations. Endocrine Reviews, 32(2): 225–246. doi: 10.1210/er.2010-0007, PMCID: PMC3365793, PMID: 21037178. ARTICLE | FREE FULLTEXT PDF

Original references which I couldn’t find, as they appear in Martin et al. (2011):

Maestre de San Juan A. (1856). Teratologia: Falta total de los nervios olfactorios con anosmia en un individuo en quien existia una atrofia congénita de los testículos y miembro viril. Siglo Medico, 3:211–221 [Google Scholar]

Kallmann F, Schoenfeld W & Barrera S (1944). The genetic aspects of primary eunuchoidism. Am J Ment Defic, 48:203–236 [Google Scholar]

By Neuronicus, 17 October 2022

Another puzzle piece in the autism mystery

Just like in the case of schizophrenia, hundreds of genes have been associated with autistic spectrum disorders (ASDs). Here is another candidate.


Féron et al. (2016) reasoned that most of the info we have about the genes that behave badly in ASDs comes from studies that used adult cells. Because ASDs are present before or very shortly after birth, they figured that looking for genetic abnormalities in cells at the very early stages of ontogenesis might prove enlightening. Those cells are stem cells. Of the pluripotent kind. FYI, based on what they can become (a.k.a. how potent they are), stem cells are divided into omnipotent, pluripotent, multipotent, oligopotent, and unipotent. So the pluripotents are very ‘potent’ indeed, having the potential of producing a perfect person.

Tongue-twisters aside, the authors’ approach is sensible, albeit not hypothesis-driven. Which means they didn’t have anything specific in mind when they started looking for differences in gene expression between olfactory nasal cells obtained from 11 adults with ASD and 11 age-matched normal controls. Luckily for them, as transcriptome studies have a tendency to be difficult to replicate, they found anomalies in the expression of genes that had already been associated with ASD. But they also found a new one, the MOCOS (MOlybdenum COfactor Sulfurase) gene, which was poorly expressed in ASDs (downregulated, in genetic speak). The enzyme it encodes is also called MOCOS (am I the only one who thinks that MOCOS isolated from nasal cells sounds too similar to mucus? is the acronym actually a backronym?).

The enzyme is not known to play any role in the nervous system. Therefore, the researchers looked to see where the gene is expressed. Its enzyme could be found all over the brain of both mouse and human. Also, in the intestine, kidneys, and liver. So not much help there.

Next, the authors deleted this gene in a worm, Caenorhabditis elegans, and they found out that the worm’s cells have issues in dealing with oxidative stress (e.g. the toxic effects of free radicals). In addition, their neurons had abnormal synaptic transmission due to problems with vesicular packaging.

Then they managed – with great difficulty – to produce human induced pluripotent stem cells (iPSCs) in a Petri dish in which the MOCOS gene was partially knocked down. ‘Partially’, because the ‘totally’ did not survive. Which tells us that MOCOS is necessary for the survival of iPSCs. The mutant cells had fewer synaptic boutons than the normal cells, meaning they formed fewer synapses.

The study, besides identifying a new candidate for diagnosis and treatment, offers some potential explanations for beguiling data that other studies have brought forth, like the fact that all sorts of neurotransmitter systems and all sorts of brain regions seem to be impaired in ASDs, making it very hard to grab the tiger by the tail if the tiger sprouts a new tail every time you look at it, just like the Hydra’s heads. But discovering a molecule that is involved in a ubiquitous process like synapse formation may provide a way to leave the tiger’s tail(s) alone and focus on the teeth. In the authors’ words:

“As a molecule involved in the formation of dense core vesicles and, further down, neurotransmitter secretion, MOCOS seems to act on the container rather than the content, on the vehicle rather than one of the transported components” (p. 1123).

The knowledge uncovered by this paper makes a very good piece of the ASDs puzzle. Maybe not a corner, but a good edge. Alright, even if it’s not an edge, at least it’s a crucial piece full of details, not one of those sky pieces.

Reference: Féron F, Gepner B, Lacassagne E, Stephan D, Mesnage B, Blanchard MP, Boulanger N, Tardif C, Devèze A, Rousseau S, Suzuki K, Izpisua Belmonte JC, Khrestchatisky M, Nivet E, & Erard-Garcia M (Sep 2016, Epub 4 Aug 2016). Olfactory stem cells reveal MOCOS as a new player in autism spectrum disorders. Molecular Psychiatry, 21(9):1215-1224. PMID: 26239292, DOI: 10.1038/mp.2015.106. ARTICLE | FREE FULLTEXT PDF

By Neuronicus, 31 August 2016

Can you tickle yourself?

As I said before, with so many science outlets out there, it’s hard to find something new and interesting to cover that hasn’t been covered already. Admittedly, sometimes a new paper comes out that is so funny or interesting that I too fall in line with the rest of them and cover it. But most of the time I try to bring you something that you won’t find reported by other science journalists. So I’m sacrificing novelty for originality by choosing something from my absolutely huge article folder (about 20,000 papers).

And here is the gem for today, enticingly titled “Why can’t you tickle yourself?”. Blakemore, Wolpert & Frith (2000) review several papers on the subject, including some of their own, and arrive at the conclusion that the reason you can’t tickle yourself is that you expect it. Let me explain: when you make a movement that results in a sensation, you have a pretty accurate expectation of how it’s going to feel. This expectation then dampens the sensation, a process probably evolved to let you focus on more relevant things in the environment than on what you’re doing to yourself (don’t let your mind go all dirty now, ok?).

Mechanistically speaking, it goes like this: when you move your arm to tickle your foot, a copy of the motor command you gave to the arm (the authors call this the “efference copy”) goes to a ‘predictor’ region of the brain (the authors believe this is the cerebellum) that generates an expectation (see Fig. 1). Once the movement has been completed, the actual sensation is compared to the expected one. If there is a discrepancy, you get tickled; if not, not so much. But, you might say, even when someone else is going to tickle me I have a pretty good idea what to expect, so where’s the discrepancy? Why do I still get tickled when I expect it? Because you can’t fool your brain that easily. The brain then says: “Alright, alright, we expect tickling. But do tell me this: where is that motor command? Hm? I didn’t get any!” So here is your discrepancy: when someone tickles you, there is the sensation, but no motor command; signals 1 and 2 from the diagram are missing.
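The comparator logic can be sketched as a toy calculation. This is purely my illustration, not the authors’ actual model: the function name and the attenuation value are made up for the example.

```python
# Toy sketch of the efference-copy comparator (illustrative only).
def perceived_intensity(actual_sensation, efference_copy=None,
                        attenuation=0.9):
    """Compare the actual sensation against the prediction derived
    from a copy of the motor command (the 'efference copy')."""
    if efference_copy is None:
        # No motor command was issued, so there is no prediction:
        # the sensation comes through at full strength.
        return actual_sensation
    predicted = efference_copy  # an accurate prediction of self-touch
    discrepancy = abs(actual_sensation - predicted)
    # The predicted part of the sensation is attenuated; only the
    # discrepancy (the 'surprise') is felt in full.
    return (1 - attenuation) * predicted + discrepancy

# Someone else tickles you: no efference copy, full sensation.
print(perceived_intensity(1.0))
# You tickle yourself: the prediction matches, the sensation is damped.
print(perceived_intensity(1.0, efference_copy=1.0))
```

When someone tickles you with your own hand, you could think of it as a case in between: a partial, less accurate efference copy, hence partial attenuation.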

Fig. 1. My take on the tickling mechanism after Blakemore, Wolpert & Frith (2000). Credits. Picture: Sobotta 1909, Diagram: Neuronicus 2016. Data: Blakemore, Wolpert & Frith (2000). Overall: Public Domain

Likewise, when someone tickles you with your own hand, there is an attenuation of sensation, but it does not disappear completely, because there is some registration in the brain of the movement of your own arm, even if the motor command was not initiated by you. So you get tickled just a little bit. The brain is no fool: it is aware of who has done what and with whose hands (your dirty mind thought that, I didn’t say it!).

This mechanism of comparing sensation with movement of self and others appears to be impaired in schizophrenia. So when these patients say “I hear voices and I can’t shut them up” or “My hand moved of its own accord, I had no control over it”, it may be that they are not aware of initiating those movements; the self-monitoring mechanism is all wacky. Supporting this hypothesis, the authors conducted an fMRI experiment (Reference 2) in which they showed that the somatosensory and anterior cingulate cortices show reduced activation when attempting to self-tickle as opposed to being tickled by the experimenter (please, stop that line of thinking…). Correspondingly, the behavioral portion of the experiment showed that patients with schizophrenia can tickle themselves. Go figure!


Reference 1: Blakemore SJ, Wolpert D, & Frith C (3 Aug 2000). Why can’t you tickle yourself? Neuroreport, 11(11):R11-6. PMID: 10943682. ARTICLE FULLTEXT

Reference 2: Blakemore SJ, Smith J, Steel R, Johnstone CE, & Frith CD (Sep 2000, Epub 17 October 2000). The perception of self-produced sensory stimuli in patients with auditory hallucinations and passivity experiences: evidence for a breakdown in self-monitoring. Psychological Medicine, 30(5):1131-1139. PMID: 12027049. ARTICLE

By Neuronicus, 7 August 2016

Transcranial direct current stimulation & cognitive enhancement

There’s so much research out there… So much that some time ago I learned that in science, as probably in other fields too, one has only to choose a side of an argument and then, provided that s/he has some good academic search-engine skills and institutional access to journals, get the articles that support that side. Granted, that works for relatively small questions restricted to narrow domains, like “is that brain structure involved in x” or something like that; I doubt you would be able to find any paper that invalidates theories like gravity or the central dogma of molecular biology (DNA to RNA to protein).

If you’re a scientist trying to answer a question, you’ll probably comb through some dozens of papers and form an opinion of your own after weeding out the papers with small sample sizes, the ones with shoddy methodology, or simply the bad ones (yes, they do exist; even scientists are people and hence prone to mistakes). And if you’re not a scientist, or the question you’re trying to answer is not from your field, then you’ll probably go for reviews or meta-analyses.

Meta-analyses are studies that look at several papers (dozens or hundreds), pool their data together and then apply some complicated statistics to see the overall results. One such meta-analysis concerns the benefits, if any, of transcranial direct current stimulation (tDCS) on working memory (WM) in healthy people.
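The “complicated statistics” usually boil down to weighting each study’s effect size by the inverse of its variance, so precise studies count for more. A minimal fixed-effect sketch, with made-up numbers that are not from any of the meta-analyses discussed here:

```python
import math

# Hypothetical (effect_size, standard_error) pairs -- illustrative
# numbers only, not data from the cited meta-analyses.
studies = [(0.30, 0.15), (0.10, 0.08), (-0.05, 0.20)]

# Inverse-variance weights: precise studies (small SE) weigh more.
weights = [1 / se**2 for _, se in studies]
pooled = sum(w * d for (d, _), w in zip(studies, weights)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))

print(f"pooled effect = {pooled:.3f} +/- {1.96 * pooled_se:.3f} (95% CI)")
```

Note how the pooled estimate lands closest to the most precise study; random-effects models add a between-study variance term on top of this, but the weighting idea is the same.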

tDCS is a method of applying electrical current through electrodes to your neurons to change how they work and thus change some brain functions. It is similar to repetitive transcranial magnetic stimulation (rTMS), only in the latter case the change in neuronal activity is due to the application of a magnetic field.

Some people look at these methods not only as possible treatments for a variety of disorders, but also as cognitive enhancement tools. And not only researchers, but also various companies that sell the relatively inexpensive equipment to gamers and others. But does tDCS work in the first place?


Mancuso et al. (2016) say that there have been 3 recent meta-analyses done on this issue and they found that “the effects [of tDCS on working memory in healthy volunteers] are reliable though small (Hill et al., 2016), partial (Brunoni & Vanderhasselt, 2014), or nonexistent (Horvath et al., 2015)” (p. 2). But they say these studies are somewhat flawed and that’s why they conducted their own meta-analysis, which concludes that “the true enhancement potential of tDCS for WM remains somewhat uncertain” (p.19). Maybe it works a little bit if used during the training phase of a working memory task, like n-back, and even then that’s a maybe…

Boring, you may say. I’ll grant you that. So… all that work and it revealed virtually nothing new! I’ll grant you that too. But what this meta-analysis brings new, besides some interesting statistics, like controlling for publication bias, is a nice discussion of why they didn’t find much, exploring possible causes, like the small sample and effect sizes which seem to plague many behavioral studies. Another explanation, which, to tell you the truth, the authors do not seem to be too enamored with, is that, maybe, just maybe, tDCS simply doesn’t have any effect on working memory, period.

Besides, papers with seemingly boring findings do not catch the media eye, so I had to give it a little attention, didn’t I 😉 ?

Reference: Mancuso LE, Ilieva IP, Hamilton RH, & Farah MJ. (Epub 7 Apr 2016, Aug 2016) Does Transcranial Direct Current Stimulation Improve Healthy Working Memory?: A Meta-analytic Review. Journal of Cognitive Neuroscience, 28(8):1063-89. PMID: 27054400, DOI: 10.1162/jocn_a_00956. ARTICLE

By Neuronicus, 2 August 2016

One parent’s gene better than the other’s

Not all people with the same bad genetic makeup that predisposes them to a particular disease go and develop that disease or, at any rate, not with the same severity and prognosis. The question is why? After all, they have the same genes…

Here comes a study that answers that very important question. Eloy et al. (2016) looked at the most common pediatric eye cancer (1 in 15,000) called retinoblastoma (Rb). In the hereditary form of this cancer, the disease occurs if the child carries mutant (i.e. bad) copies of the RB1 tumour suppressor gene located on chromosome 13 (13q14). These copies, called alleles, are inherited by the child from the mother or from the father. But some children with this genetic disadvantage do not develop Rb. They should, so why not?

The authors studied 57 families with Rb history. They took blood and tumour samples from the participants and then did a bunch of genetic tests: DNA, RNA, and methylation analyses.

They found that when the mutant RB1 allele is inherited from the mother, the child has only a 9.7% chance of developing Rb, but when it is inherited from the father, the chance jumps to 67.5%.
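To put those two penetrance figures side by side (the arithmetic below is mine, the percentages are the ones reported in the paper):

```python
# Parent-of-origin penetrance reported by Eloy et al. (2016).
maternal = 0.097  # chance of Rb when the mutant allele comes from the mother
paternal = 0.675  # chance of Rb when the mutant allele comes from the father

# Same mutant allele, roughly a sevenfold difference in risk
# depending solely on which parent it came from.
relative_risk = paternal / maternal
print(f"paternal inheritance carries ~{relative_risk:.1f}x the risk")
```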

The mechanism behind these different outcomes may reside in the differential methylation of the gene. Methylation is a chemical process that suppresses the expression of a gene, meaning that less protein is produced from that gene. The maternal allele had less methylation, meaning that more protein was produced, which was able to offer some protection against the cancer. It seems counter-intuitive (you’d think less bad protein is a good thing), but there is a long and complicated explanation for that, which, in a very simplified form, posits that other events influence the function of the resultant protein.

Again, epigenetics seem to offer explanations for pesky genetic inheritance questions. Epigenetic processes, like DNA methylation, are modalities through which traits can be inherited that are not coded in the DNA itself.


Reference: Eloy P, Dehainault C, Sefta M, Aerts I, Doz F, Cassoux N, Lumbroso le Rouic L, Stoppa-Lyonnet D, Radvanyi F, Millot GA, Gauthier-Villars M, & Houdayer C (29 Feb 2016). A Parent-of-Origin Effect Impacts the Phenotype in Low Penetrance Retinoblastoma Families Segregating the c.1981C>T/p.Arg661Trp Mutation of RB1. PLoS Genetics, 12(2):e1005888. eCollection 2016. PMID: 26925970, PMCID: PMC4771840, DOI: 10.1371/journal.pgen.1005888. ARTICLE | FREE FULLTEXT PDF

By Neuronicus, 24 July 2016

Intracranial recordings in human orbitofrontal cortex

How reward is processed in the brain has been of great interest to neuroscience because of the relevance of pleasure (or the lack of it) to a plethora of disorders, from addiction to depression. Among the cortical areas (that is, the surface of the brain), the structure most involved in reward processing is the orbitofrontal cortex (OFC). Most of the knowledge about the human OFC comes from patients with lesions or from imaging studies. Now, for the first time, we have insights into how and when the OFC processes reward from a group of scientists who studied it up close and personal, by recording directly from those neurons in the living, awake, behaving human.

Li et al. (2016) gained access to six patients who had electrodes implanted to monitor their brain activity before they went into surgery for epilepsy. All the patients’ epilepsy foci were elsewhere in the brain, so the authors figured the overall function of the OFC was relatively intact.

While being recorded directly from the OFC, the patients performed a probabilistic monetary reward task: five visually different slot machines appeared on a screen, and each machine had a different probability of winning 20 Euros (0%, 25%, 50%, 75%, and 100% chances), a fact that was not told to the patients. The patients were asked to press a button if a particular slot machine was more likely to give money. Then they would use the slot machine and the outcome (winning 20 or 0 Euros) would appear on the screen. The patients figured out quickly which slot machine was which, meaning they ‘guessed’ correctly the probability of being rewarded after only 1 to 4 trials (generally, learning is defined in behavioral studies as > 80% correct responses). The researchers also timed the patients during every part of the task.

Not surprisingly, the subjects spent more time deciding whether the 50%-chance slot machine was a winner than in all the other four cases. In other words, the riskier the choice, the slower the reaction time in making that choice.

The design of the task allowed the researchers to observe three phases, which were linked with three different signals in the OFC:

1) the expected value phase, when the subjects saw the slot machine and made their judgement. The corresponding signal showed an increase in the neurons’ firing about 400 ms after the slot machine appeared on the screen, in both medial and lateral OFC.

2) the risk or uncertainty phase, when the subjects were waiting for the slot machine to stop its spinners and show whether they won or not (1000-1500 ms). They called it the risk phase because both medial and lateral OFC had the highest responses when the riskiest probability, i.e. the 50% chance, was presented. Unexpectedly, the OFC did not distinguish between the winning and the non-winning outcomes at this phase.

3) the experienced value or outcome phase, when the subjects found out whether they won or not. Only the lateral OFC responded during this phase, that is, immediately upon finding out whether the action was rewarded or not.

For the professional interested in precise anatomy, the article provides a nicely detailed diagram with the locations of the electrodes in Fig. 6.

The paper is also covered, for the neuroscientists’ interest (that is, in full scientific jargon), by Kringelbach in the same journal, a prominent neuroscientist mostly known for his work in affective neuroscience and the OFC. One of the reasons I covered this paper too is that both its full text and Kringelbach’s commentary are behind a paywall, so I am giving you a preview of the paper in case you don’t have access to it.


Reference: Li Y, Vanni-Mercier G, Isnard J, Mauguière F & Dreher J-C (1 Apr 2016, Epub 25 Jan 2016). The neural dynamics of reward value and risk coding in the human orbitofrontal cortex. Brain, 139(4):1295-1309. DOI: http://dx.doi.org/10.1093/brain/awv409. Article

By Neuronicus, 25 March 2016

Younger children in a grade are more likely to be diagnosed with ADHD

A few weeks ago I was drawing attention to the fact that some children diagnosed with ADHD do not have attention deficits. Instead, a natural propensity for seeking more stimulation may have led to overdiagnosing and overmedicating these kids.

Another reason for the dramatic increase in ADHD diagnoses over the past couple of decades may stem from the increasingly age-inappropriate demands that we place on children. Namely, children in the same grade can be as much as one year apart in chronological age, and at these young ages one year means quite a lot in terms of cognitive and behavioral development. So if we set a standard of expectations based on how the older children behave, then the younger children in the same grade will fall short of these standards simply because they are too immature to live up to them.

So what does the data say? Two studies, Morrow et al. (2012) and Chen et al. (2016) checked to see if the younger children in a given grade are more likely to be diagnosed with ADHD and/or medicated. The first study was conducted in almost 1 million Canadian children, aged 6-12 years and the second investigated almost 400,000 Taiwanese children, aged 4-17 years.

In Canada, the cut-off date for starting school is Dec. 31. This means that in the first grade, a child born in January is almost a year older than a child born in December. Morrow et al. (2012) concluded that the children born in December were significantly more likely to receive a diagnosis of ADHD than those born in January (30% more likely for boys and 70% for girls). Moreover, the children born in December were more likely to be given an ADHD medication prescription (41% more likely for boys and 77% for girls).
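A quick note on what “30% more likely” means, since relative increases are easy to misread as absolute ones. The baseline rate below is a hypothetical round number of my own, not a figure reported by either study:

```python
# "X% more likely" is a relative increase over a baseline rate.
def apply_relative_increase(baseline_rate, pct_more_likely):
    return baseline_rate * (1 + pct_more_likely / 100)

# If (hypothetically) 5% of January-born boys were diagnosed,
# a 30% relative increase puts December-born boys at 6.5%.
print(f"boys:  {apply_relative_increase(0.05, 30):.3f}")
# The 70% figure for girls works the same way.
print(f"girls: {apply_relative_increase(0.05, 70):.3f}")
```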

In Taiwan, the cut-off date for starting school is August 31. Similar to the Canadian study, Chen et al. (2016) found that the children born in August were more likely to be diagnosed with ADHD and to receive ADHD medication than the children born in September.

Now let’s be clear on one thing: ADHD is no trivial matter. It is a real disorder. It’s an incredibly debilitating disease for both children and their parents. Impulsivity, inattention, and hyperactivity are the hallmarks of almost every activity the child engages in, leading to very poor school performance (the majority cannot get a college degree) and a hard family life, plus a lifetime of stigma that brings its own “gifts”, such as marginalization, loneliness, depression, anxiety, poor eating habits, etc.

The data presented above favor the “immaturity hypothesis”, which posits that the behaviors expected of some children cannot be performed not because something is wrong with them, but because they are simply too immature to perform those behaviors. That does not mean that every child diagnosed with ADHD will just grow out of it; the researchers simply point out that ignoring the chronological age of the child, coupled with premature entry into a highly stressful and demanding system such as school, might lead to ADHD overdiagnosis.

Bottom line: ignoring the chronological age of the child might explain some of the increase in the prevalence of ADHD through overdiagnosis (in the US alone, the rise is from 6% of children diagnosed with ADHD in 2000 to 11-15% in 2015).

References:

  1. Morrow RL, Garland EJ, Wright JM, Maclure M, Taylor S, & Dormuth CR. (17 Apr 2012, Epub 5 Mar 2012). Influence of relative age on diagnosis and treatment of attention-deficit/hyperactivity disorder in children. Canadian Medical Association Journal, 184(7): 755-762. doi: 10.1503/cmaj.111619. Article | FREE PDF
  2. Chen M-H, Lan W-H, Bai Y-M, Huang K-L, Su T-P, Tsai S-J, Li C-T, Lin W-C, Chang W-H, Pan T-L, Chen T-J, & Hsu J-W. (10 Mar 2016). Influence of Relative Age on Diagnosis and Treatment of Attention-Deficit Hyperactivity Disorder in Taiwanese Children. The Journal of Pediatrics [Epub ahead of print]. DOI: http://dx.doi.org/10.1016/j.jpeds.2016.02.012. Article | FREE PDF

By Neuronicus, 14 March 2016

Not all children diagnosed with ADHD have attention deficits

Given the alarming increase in the diagnosis of attention deficit/hyperactivity disorder (ADHD) over the last 20 years, I thought it pertinent to feature today an older paper, from the year 2000.

Dopamine, one of the chemicals that the neurons use to communicate, has been heavily implicated in ADHD. So heavily in fact that Ritalin, the main drug used for the treatment of ADHD, has its main effects by boosting the amount of dopamine in the brain.

Swanson et al. (2000) reasoned that people with a particular genetic abnormality that makes their dopamine receptors work less than optimally may be more likely to have ADHD. The specialist reader may want to know that the genetic abnormality in question is a 7-repeat allele of a 48-bp variable number of tandem repeats in exon 3 of the dopamine receptor D4 gene located on chromosome 11, whose expression results in a weaker dopamine receptor. We’ll call it DRD4,7-present, as opposed to DRD4,7-absent (i.e. people without this genetic abnormality).

They had access to 96 children diagnosed with ADHD according to the diagnostic criteria of the DSM-IV and 48 matched controls (children of the same gender, age, school affiliation, socio-economic status, etc., but without ADHD). About half of the children diagnosed with ADHD were DRD4,7-present.

The authors tested the children on 3 tasks:

(i) a color-word task to probe the executive function network linked to anterior cingulate brain regions and to conflict resolution;
(ii) a cued-detection task to probe the orienting and alerting networks linked to posterior parietal and frontal brain regions and to shifting and maintenance of attention; and
(iii) a go-change task to probe the alerting network (and the ability to initiate a series of rapid response in a choice reaction time task), as well as the executive network (and the ability to inhibit a response and re-engage to make another response) (p. 4756).

Invalidating the authors’ hypothesis, the results showed that the controls and the DRD4,7-present had similar performance at these tasks, in contrast to the DRD4,7-absent who showed “clear abnormalities in performance on these neuropsychological tests of attention” (p. 4757).

This means two things:
1) Half of the children diagnosed with ADHD did not have an attention deficit.
2) These same children had the DRD4,7-present genetic abnormality, which has been previously linked with novelty seeking and risky behaviors. So it may be just possible that these children do not suffer from ADHD, but “may be easily bored in the absence of highly stimulating conditions, may show delay aversion and choose to avoid waiting, may have a style difference that is adaptive in some situations, and may benefit from high activity levels during childhood” (p. 4758).

Great paper and highly influential. The last author of the article (meaning the chief of the laboratory) is none other than Michael I. Posner, whose attentional networks, models, and tests feature in every psychology and neuroscience textbook. If he doesn’t know about attention, then I don’t know who does.

One of the reasons I chose this paper is that it seems to me that a lot of teachers, nurses, social workers, and even pediatricians feel qualified to scare the living life out of parents by suggesting that their unruly child may have ADHD. In deference to most of the above-mentioned professions, the majority of people recognize their limits and tell the concerned parents to have the child tested by a qualified psychologist. And, unfortunately, even that may result in dosing your child with Ritalin needlessly, when the child’s propensity toward a sensation-seeking temperament and extraverted personality may instead require a different approach to learning, with a higher level of stimulation (after all, the children from the above study had been diagnosed by qualified people using their latest diagnostic manual).

Bottom line: beware of any psychologist or psychiatrist who does not employ a battery of attention tests when diagnosing your child with ADHD.


Reference: Swanson J, Oosterlaan J, Murias M, Schuck S, Flodman P, Spence MA, Wasdell M, Ding Y, Chi HC, Smith M, Mann M, Carlson C, Kennedy JL, Sergeant JA, Leung P, Zhang YP, Sadeh A, Chen C, Whalen CK, Babb KA, Moyzis R, & Posner MI. (25 April 2000). Attention deficit/hyperactivity disorder children with a 7-repeat allele of the dopamine receptor D4 gene have extreme behavior but normal performance on critical neuropsychological tests of attention. Proceedings of the National Academy of Sciences of the United States of America, 97(9):4754-4759. doi: 10.1073/pnas.080070897. Article | FREE FULLTEXT PDF

P.S. If you think that “weeell, this research happened 16 years ago, surely something came out of it”, then think again. The newer DSM-5 criteria are likely to cause an increase in the prevalence of ADHD diagnoses.

By Neuronicus, 26 February 2016

I am blind, but my other personality can see


This is a truly bizarre report.

A woman named BT suffered an accident when she was 20 years old and became blind. Thirteen years later she was referred to Bruno Waldvogel (one of the two authors of the paper) for psychotherapy by a psychiatric clinic that had diagnosed her with dissociative identity disorder, formerly known as multiple personality disorder.

The diagnosis of cortical blindness was established after extensive ophthalmologic tests in which she appeared blind, but not because of damage to the eyes. So, by inference, it had to be damage to the brain. Remarkably (we shall see later why), she had no oculomotor reflexes in response to glare. Moreover, visual evoked potentials (VEP, an EEG recorded over the occipital region) showed no activity in the primary visual area of the brain (V1).

During the four years of psychotherapy, BT showed more than 10 distinct personalities. One of them, a teenage male, started to see words on a magazine and pretty soon could see everything. With the help of hypnotherapeutic techniques, more and more personalities started to see.

“Sighted and blind states could alternate within seconds” (Strasburger & Waldvogel, 2015).

The VEP showed no or very little activity when the blind personality was “on” and normal activity when a sighted personality was “on”. This is extremely curious, because similar studies in people with psychogenic blindness, or in anesthetized people, showed intact VEPs.

There are a couple of conclusions to draw from this: 1) BT was misdiagnosed, as it is unlikely there was any brain damage, given that some personalities could see; and 2) multiple personalities – or dissociative identities, as they are now called – are real in the sense that they can be separated at the biological level.

The visual pathway that mediates conscious visual perception. a) A side view of the human brain with the retinogeniculocortical pathway shown inside (blue). b) A horizontal section through the brain exposing the same pathway.

Fascinating! The next question is, obviously: what’s the mechanism behind this? The authors say that it is very likely the LGN (the lateral geniculate nucleus of the thalamus), which is the only relay between the retina and V1 (see pic). It could well be. Unfortunately, so could other putative mechanisms, as about 10% of the neurons in the retina also project to the superior colliculus, and some others go directly to the hypothalamus, completely bypassing the thalamus. Also, because it is impossible to time precisely the switching between personalities, even if you put the woman in an MRI it would be difficult to establish whether the switch to blindness is the result of bottom-up or top-down modulation (i.e., the visual information never reaches V1, it reaches V1 and is suppressed there, or some signal from other brain areas inhibits V1 completely, so it is unresponsive when the visual information arrives).

Despite the limitations, I would certainly try to get the woman into an fMRI. C’mon, people, this is an extraordinary subject and if she gave permission for the case study report, surely she would not object to the scanning.

Reference: Strasburger H & Waldvogel B (Epub 15 Oct 2015). Sight and blindness in the same person: Gating in the visual system. PsyCh Journal. doi: 10.1002/pchj.109.  Article | FULLTEXT PDF | Washington Post cover

By Neuronicus, 29 November 2015

The werewolf and his low fibroblast growth factor 13 levels

Petrus Gonsalvus, anonymous painting of the first recorded case of hypertrichosis in 1642. License: PD

Although they are very rare, werewolves do exist. And now the qualifier: werewolves as in people with excessive hair growth all over the body, not the more familiar kind that changes into a wolf every time there is a full moon. The condition is called hypertrichosis, and its various forms have been associated with distinct genetic abnormalities.

In a previous report, DeStefano et al. (2013) identified the genetic locus of X-linked congenital generalized hypertrichosis (CGH): a 19-Mb region on Xq24-27 that spans about 82 genes and results mainly from insertions from chromosomes 4 and 5. Now, they wanted to see what mechanism is responsible for the disease. First, they looked at the hair follicles of a man afflicted with CGH, who has hair almost all over his body, and noticed some structural abnormalities. Then, they analyzed the expression of several genes from the affected region of the chromosome in this man and others with CGH, and observed that only the levels of Fibroblast Growth Factor 13 (FGF13), a protein found in hair follicles, were much lower in CGH. Finally, they did some more experiments to establish the crucial role of FGF13 in regulating follicle growth.

An interesting find of the study is that, at least in the case of hypertrichosis, it is not the content of the genomic sequences added to chromosome X that matters, but their mere presence, which affects a gene located 1.2 Mb away from the insertion site.

Reference: DeStefano GM, Fantauzzo KA, Petukhova L, Kurban M, Tadin-Strapps M, Levy B, Warburton D, Cirulli ET, Han Y, Sun X, Shen Y, Shirazi M, Jobanputra V, Cepeda-Valdes R, Cesar Salas-Alanis J, & Christiano AM ( 7 May 2013, Epub 19 Apr 2013). Position effect on FGF13 associated with X-linked congenital generalized hypertrichosis. Proceedings of the National Academy of Sciences of the U.S.A., 110(19):7790-5. doi: 10.1073/pnas.1216412110. Article | FREE FULLTEXT PDF

By Neuronicus, 17 November 2015

TMS decreases religiosity and ethnocentrism

Medieval knight dressed in an outfit with the Cross of St. James of Compostela. Image from Galicianflag.

Rituals are anxiolytic; we developed them because they decrease anxiety. So it makes sense that when we feel the most stressed we turn to soothing ritualistic behaviors. Likewise, in times of threat, be it anywhere from war to financial depression, people show a sharp increase in adherence to political or religious ideologies.

Holbrook et al. (2015) used TMS (transcranial magnetic stimulation) to locally downregulate the activity of the posterior medial frontal cortex (which includes the dorsal anterior cingulate cortex and the dorsomedial prefrontal cortex), a portion of the brain the authors have reasons to believe is involved in augmenting the adherence to ideological convictions in times of threat.

They selected 38 U.S. undergraduates who scored similarly on political views (moderate or extremely conservative; the extreme liberals were excluded). Curiously, they did not measure religiosity prior to testing. Then they submitted the subjects to a group-prejudice test designed to increase ethnocentrism (reading a critique of the USA written by an immigrant) and a high-level conflict designed to increase religiosity (a reminder of death), while half of them received TMS and the other half received sham stimulation.

Under these conditions, the TMS decreased the belief in God and also the negative evaluations of the critical immigrant, compared to the people that received sham TMS.

The paper is, without doubt, interesting, despite the many possible methodological confounds. The authors themselves acknowledged some of the drawbacks in the discussion section, so regard the article as a pilot investigation. It doesn’t even have a picture with the TMS coordinates. Nevertheless, reducing someone’s religiosity and extremism by inactivating a portion of the brain… Sometimes I get afraid of my discipline.

Reference: Holbrook C, Izuma K, Deblieck C, Fessler DM, & Iacoboni M (Epub 4 Sep 2015). Neuromodulation of group prejudice and religious belief. Social Cognitive and Affective Neuroscience. DOI: 10.1093/scan/nsv107. Article | Research Gate full text PDF

By Neuronicus, 3 November 2015

Are you in love with an animal?

Sugar Candy Hearts by Petr Kratochvil taken from publicdomainpictures. License: PD

Ren et al. (2015) gave a sweet drink (Fanta), sweet food (Oreos), salty-vinegar food (Lay’s chips), or water to 422 people and then asked them about their romantic relationship or, if they didn’t have one, about a hypothetical relationship. For hitched people, the foods or drinks had no effect on the evaluation of their relationship. In contrast, the singles who received sweets were more eager to initiate a relationship with a potential partner and evaluated a hypothetical relationship more favorably (how do you do that? I mean, if it’s hypothetical… why wouldn’t you evaluate it favorably from your singleton perspective?). Anyway, the singles who got sweets tended to see things a little more on the rosy side, as opposed to the taken ones.

The rationale for doing this experiment is that metaphors alter our perceptions (fair enough). Given that many terms of endearment include reference to the taste of sweet, like “Honey”, “Sugar” or “Sweetie”, maybe this is not accidental or just a metaphor and, if we manipulate the taste, we manipulate the perception. Wait, what? Now re-read the finding above.

The authors take their results as supporting the view that “metaphorical thinking is one fundamental way of perceiving the world; metaphors facilitate social cognition by applying concrete concepts (e.g., sweet taste) to understand abstract concepts (e.g., love)” (p. 916).

So… I am left with many questions, the first being: if the sweet appellatives in a romantic relationship stem from an extrapolation of the concrete taste of sweet to an abstract concept like love, then, I wonder, what concrete concept underlies the prevalence of “baby” as a term of endearment? Do I dare speculate what the metaphor stands for? Should people who are referred to as “baby” by their partners alert the authorities to a possible pedophilic ideation? And what do we do about the non-English cultures (apparently non-Germanic and non-Mandarin too) in which the lovey-dovey terms tend to cluster around various small objects (e.g., tassels), vegetables (e.g., pumpkin), cute onomatopoeia (I am at a loss for a transcription here), or baby animals (e.g., chick, kitten, puppy)? Believe me, such cultures do exist and are numerous. “Excuse me, officer, I suspect my partner is in love with an animal. Oh, wait, that didn’t come out right…”

Ok, maybe I missed something with this paper, as halfway through I failed to maintain proper focus due to an intruding – and disturbing! – image of a man, a chicken, and a tassel. So take with a grain of salt the authors’ words when they say that their study “not only contributes to the literature on metaphorical thinking but also sheds light on an understudied factor that influences relationship initiation, that of taste” (p. 918). Oh, metaphors, how sweetly misleading you are…

Please use the “Comments” section below to share the strangest metaphor used as term of endearment you have ever heard in a romantic relationship.

Reference: Ren D, Tan K, Arriaga XB, & Chan KQ (Nov 2015). Sweet love: The effects of sweet taste experience on romantic perceptions. Journal of Social and Personal Relationships, 32(7): 905 – 921. DOI: 10.1177/0265407514554512. Article | FREE FULLTEXT PDF

By Neuronicus, 21 October 2015

Have we missed the miracle painkiller?

How a classic pain scale would look to a person with congenital insensitivity to pain.

Pain insensitivity has been introduced to the larger public via TV shows of the medical drama genre (House, ER, Grey’s Anatomy, and the like). It seems fascinating to explore the consequences of a life without pain. But these shows do not feature, quite understandably, the gruesome aspects of this rare and life-threatening disorder. For example, did you know that sometimes the baby teeth of these people are extracted before they reach 1 year of age so they stop biting their fingers and tongues off? Or that a good portion of the people born with pain insensitivity die before reaching adulthood?

Nahorski et al. (2015) discovered a new disorder that includes pain insensitivity, along with touch insensitivity, cognitive delay, and other severe disabilities. They investigated a family in which the husband and wife are double first cousins and produced offspring. The authors had access to all the family’s DNA, including the children’s. Extensive analyses revealed a mutation in the gene CLTCL1, which encodes the protein CHC22. This protein is required for the normal development of the cells that feel pain and touch, among other things.

Other genetic studies into various syndromes of painlessness have produced data that led to the discovery of new analgesics. Therefore, the hope with this study is that CHC22 may become a target for future painkiller discovery.

But, on a side note, what made me feature this paper is more than just the potential for new analgesics; it is the last paragraph of the paper: “rodents have lost CLTCL1 and thus must have alternative pathway(s) to compensate for this. Thus, some pain research results generated in these animals may not be applicable to man” (p. 2159).

The overwhelming majority of pain research and painkiller search is done in rodents. So… how much of what we know from rodents really translates to humans? Worse yet, how many false negatives have we discarded already? What if the panaceum universalis has already been tried in mice and nobody knows what it is because it didn’t work? It’s not like there is a database of negative results published somewhere where we can all go ferret it out and, in the light of these new discoveries, give those loser chemicals another try… Food for thought and yet ANOTHER reason why all research should be published, not just the positive results.

Reference: Nahorski MS, Al-Gazali L, Hertecant J, Owen DJ, Borner GH, Chen YC, Benn CL, Carvalho OP, Shaikh SS, Phelan A, Robinson MS, Royle SJ, & Woods CG. (August 2015, Epub 11 Jun 2015). A novel disorder reveals clathrin heavy chain-22 is essential for human pain and touch development. Brain, 138(Pt 8):2147-2160. doi: 10.1093/brain/awv149. Article | FREE FULLTEXT PDF

By Neuronicus, 20 October 2015

Really? That’s your argument?!

Photo by FreeStockPhotos.biz Collection. Released under FSP Standard License

I don’t believe there is a single human being who, during an argument, has not thought or exclaimed “Really? That’s your argument?” or something along those lines. The saying/attitude is meant to convey the emotional response (often contemptuous) to the identification of the opponent’s argument as weak and unworthy of debate. We seem to be very critical of other people’s reasoning when it does not match our own. On the other hand, we also seem to be a little more indulgent with the strength of our own arguments. This phenomenon has been dubbed “selective laziness”, as one is not so diligent in applying the stringent rules of rational thinking to one’s own line of argumentation.

But what happens when the argument that one so easily dismisses as invalid is one’s own? Trouche et al. (2015) managed to fool 47% of their participants (115 individuals) into believing that the arguments for a reasoning choice were their own when, in point of fact, they were not (see Fig. 1). When asked to evaluate the “other” argument (which was their own), 56% of them (65 people, or 27% of the whole sample) “rejected their own argument, choosing instead to stick to the answer that had been attributed to them. Moreover, these participants (Non-Detectors) were more likely to accept their own argument for the valid than for an invalid answer. These results shows that people are more critical of their own arguments when they think they are someone else’s, since they rejected over half of their own arguments when they thought that they were someone else’s”. (p. 8). I had to do this math on a PostIt, as the authors were a little bit… lazy in reporting anything but percentages, and there are no graphs.
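For the curious, the PostIt math can be reconstructed from the reported percentages alone (the total sample size below is back-derived from “47% = 115 individuals”, an assumption on my part, not a number stated in the paper):

```python
# Back-of-the-PostIt reconstruction of Trouche et al. (2015) percentages.
fooled = 115                         # participants fooled, reported as 47%
total = round(fooled / 0.47)         # implied sample size (an assumption)
rejecters = 65                       # fooled participants who rejected their own argument

print(total)                             # 245
print(round(rejecters / fooled * 100, 1))  # 56.5 -> reported as 56%
print(round(rejecters / total * 100))      # 27 -> 27% of the whole sample
```

So the “56%” and “27%” figures in the paragraph above are mutually consistent, given an implied sample of roughly 245 participants.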

Fig. 1 from Trouche et al. (2015). © 2015 Cognitive Science Society, Inc.

The authors replicated their findings to address some limitations of the first experiment, with similar results. And they provide some speculation about the adaptive value of ‘selective laziness’, which, frankly, I think is baloney. Nevertheless, the paper quantifies and provides a way to study a reasoning bias we are all familiar with.

Reference: Trouche E, Johansson P, Hall L, & Mercier H. (9 October 2015). The Selective Laziness of Reasoning. Cognitive Science, 1-15. doi: 10.1111/cogs.12303. [Epub ahead of print]. Article | PDF

By Neuronicus, 15 October 2015

Dopamine role still not settled

No idea why the prefrontal cortex neuron is Australian, but here you go. Cartoon made by me with free (to the best of my knowledge) clipart elements. Feel free to use to your heart’s content.

There have been literally thousands of pages published about the dopamine function(s). Dopamine, which made its stage debut as the “pleasure molecule”, is a chemical produced by some neurons in your brain that is vital to its functioning. It has been involved in virtually all types of behavior and most diseases, from pain to pleasure, from mating to addiction, from working-memory to decision-making, from autism to Parkinson’s, from depression to schizophrenia.

Here is another account of what dopamine really does in the brain. Schwartenbeck et al. (2015) trained 26 young adults to play a game in which they had to decide whether to accept an initial offer of small change or to wait for a more substantial offer. If they waited too long, they would lose everything. After that, the subjects played the game in the fMRI scanner. The authors argue that their clever game allows segregation between previously proposed roles of dopamine, like salience or reward prediction.

As expected with most fMRI studies, a brain salad lit up (that is, the task activated many other structures in addition to the region of interest), which the authors address only very briefly. Instead, they focus on the timing of activation of their near and dear midbrain dopamine neurons, which they cannot detect directly in the scanner because their cluster is too small, so they inferred their location by proxy. Anyway, after some glorious mental (and mathematical) gymnastics, Schwartenbeck et al. (2015) conclude that

1) “humans perform hierarchical probabilistic Bayesian inference” (p. 3434) (i.e. “I don’t have a clue what’s going on here, so I’ll go with my gut instinct on this one”) and

2) dopamine discharges reflect the confidence in those inferences (i.e. “how sure am I that doing this is going to bring me goodies?”)
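To make the two claims concrete, here is a toy sketch of Bayesian inference and “confidence” in the game’s terms. This is my own minimal illustration with invented numbers, not the authors’ actual hierarchical model: the two hypotheses, the cue, and the probabilities are all hypothetical.

```python
# Toy Bayesian update: given a cue, how confident am I that waiting
# for the bigger offer will pay off? (Invented numbers, not the paper's.)
prior = {"big_offer_coming": 0.5, "offer_will_vanish": 0.5}

# P(observed cue | hypothesis): the cue favors the "big offer" hypothesis.
likelihood = {"big_offer_coming": 0.8, "offer_will_vanish": 0.3}

# Bayes' rule: posterior ∝ prior × likelihood, normalized over hypotheses.
evidence = sum(prior[h] * likelihood[h] for h in prior)
posterior = {h: prior[h] * likelihood[h] / evidence for h in prior}

# One simple proxy for "confidence": the probability of the favored hypothesis.
confidence = max(posterior.values())

print(posterior)    # big_offer_coming ~0.727, offer_will_vanish ~0.273
print(confidence)
```

Per claim 2, dopamine discharges would then track something like this confidence value as beliefs are updated trial by trial, rather than the reward itself.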

With the obvious caveats that MRI doesn’t have the resolution to isolate the midbrain dopamine clusters, and that these clusters comprise two very distinct populations of dopamine neurons (ventral tegmental area and substantia nigra) with different physiological, topographical, and anatomical properties, and distinct connections, the study adds to the body of knowledge of “for the love of Berridge and Schultz, what the hell are you DOIN’, dopamine neuron?”.

Reference: Schwartenbeck, P., FitzGerald, T. H., Mathys, C., Dolan, R., & Friston K. (October 2015, Epub 23 July 2014). The Dopaminergic Midbrain Encodes the Expected Certainty about Desired Outcomes. Cerebral Cortex, 25:3434–3445, doi:10.1093/cercor/bhu159. Article + FREE PDF

By Neuronicus, 8 October 2015