I am blind, but my other personality can see


This is a truly bizarre report.

A woman named BT suffered an accident when she was 20 years old and became blind. Thirteen years later she was referred to Bruno Waldvogel (one of the two authors of the paper) for psychotherapy by a psychiatric clinic that had diagnosed her with dissociative identity disorder, formerly known as multiple personality disorder.

The diagnosis of cortical blindness had been established after extensive ophthalmologic tests in which she appeared blind, but not because of damage to the eyes; so, by inference, it had to be damage to the brain. Remarkably (we shall see later why), she had no oculomotor reflexes in response to glare. Moreover, visual evoked potentials (VEPs, EEG responses recorded over the occipital region) showed no activity in the primary visual area of the brain (V1).

During the four years of psychotherapy, BT showed more than 10 distinct personalities. One of them, a teenage male, started to see words in a magazine and pretty soon could see everything. With the help of hypnotherapeutic techniques, more and more personalities started to see.

“Sighted and blind states could alternate within seconds” (Strasburger & Waldvogel, 2015).

The VEP showed no or very little activity when the blind personality was “on” and normal activity when the sighted personality was “on”. This is extremely curious, because similar studies in people with psychogenic blindness or under anesthesia showed intact VEPs.

There are a couple of conclusions to draw from this: 1) BT was misdiagnosed, as it is unlikely there was any brain damage, given that some personalities could see, and 2) multiple personalities – or dissociative identities, as they are now called – are real in the sense that they can be separated at the biological level.

The visual pathway that mediates conscious visual perception. a) A side view of the human brain with the retinogeniculocortical pathway shown inside (blue). b) A horizontal section through the brain exposing the same pathway.

Fascinating! The next question is, obviously: what’s the mechanism behind this? The authors say it is very likely the LGN (the lateral geniculate nucleus of the thalamus), which is the only relay between retina and V1 (see pic). It could be; it is certainly possible. Unfortunately, so are other putative mechanisms, as 10% of the neurons in the retina also project to the superior colliculus, and some others go directly to the hypothalamus, completely bypassing the thalamus. Also, because it is impossible to time precisely the switching between personalities, even if you put the woman in an MRI it would be difficult to establish whether the switch to blindness is the result of bottom-up or top-down modulation (i.e., the visual information never reaches V1, it reaches V1 and is suppressed there, or some signal from other brain areas inhibits V1 completely, so it is unresponsive when the visual information arrives).

Despite the limitations, I would certainly try to get the woman into an fMRI. C’mon, people, this is an extraordinary subject and if she gave permission for the case study report, surely she would not object to the scanning.

Reference: Strasburger H & Waldvogel B (Epub 15 Oct 2015). Sight and blindness in the same person: Gating in the visual system. PsyCh Journal. doi: 10.1002/pchj.109.  Article | FULLTEXT PDF | Washington Post cover

By Neuronicus, 29 November 2015

The runner’s euphoria and opioids

The runner’s high is most likely due to the release of endorphins, which bind to the opioid receptors, according to Boecker et al. (2008, doi: 10.1093/cercor/bhn013). Image courtesy of Pixabay.

We all know that exercise is good for you: it keeps you fit, reduces stress and improves your mood. And also, sometimes, particularly after endurance running, it gets you high. The mechanism of the euphoria reported by some runners after endurance training is unknown. Here is a nice paper trying to figure it out.

Boecker et al. (2008) scanned 10 trained male athletes at rest and after 2 hours’ worth of endurance running. By “trained athletes” they mean people who had run 4-10 hours weekly for the past 2 years. The scanning was done using positron emission tomography (PET). PET tracks a particular chemical that has been injected into the bloodstream of the subjects, in this case a non-selective opioidergic ligand (it binds to all opioid receptors in the brain; morphine, for example, binds only to a subclass of the opioid receptors).

The rationale is as follows: if we see an increase in ligand binding, then the receptors were free and unoccupied, indicating a reduction in the endogenous neurotransmitter, that is, the substance the brain produces for those receptors; if we see a decrease in ligand binding, it is because the receptors were occupied, meaning there was an increase in the production of the endogenous neurotransmitter. The endogenous neurotransmitters for the opioid receptors are the endorphins (don’t confuse them with epinephrine, a.k.a. adrenaline; different systems entirely).
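The rationale can be put in numbers. Here is a minimal sketch of the inference; the binding values are made up for illustration and are not from the paper:

```python
# Illustrative occupancy estimate from PET ligand binding.
# All numbers are hypothetical, chosen only to mirror the rationale:
# less ligand binding after running -> more receptors occupied by endorphins.

def occupancy_change(binding_rest, binding_post):
    """Fraction of receptors newly occupied by the endogenous
    neurotransmitter, inferred from the drop in ligand binding."""
    return (binding_rest - binding_post) / binding_rest

rest = 2.0   # hypothetical ligand binding at rest (receptors relatively free)
post = 1.5   # hypothetical binding after running (receptors now occupied)

print(f"Inferred endorphin occupancy increase: {occupancy_change(rest, post):.0%}")
```

The point of the sketch is only the direction of the inference: a drop in ligand binding is read as a rise in endogenous opioid release.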

After running, the subjects reported that they were euphoric and happy, with no change in other feelings (confusion, anger, sadness, fear, etc.; there was a reduction in fear, but it was not significant). The scans showed less binding of the opioidergic ligand in many places in the brain (for the specialists, here you go: prefrontal/orbitofrontal cortices, dorsolateral prefrontal cortex, anterior and posterior cingulate cortex, insula and parahippocampal gyrus, sensorimotor/parietal regions, cerebellum and basal ganglia).

Regression analysis showed a link between the feeling of euphoria and receptor occupancy: the more euphoric the people said they were, the more endorphins (i.e. endogenous opioids) they had bound in the brain. This study is the first to show this kind of link.

Reference: Boecker H, Sprenger T, Spilker ME, Henriksen G, Koppenhoefer M, Wagner KJ, Valet M, Berthele A, & Tolle TR. (Nov 2008, Epub 21 Feb 2008). The Runner’s High: Opioidergic Mechanisms in the Human Brain. Cerebral Cortex, 18:2523–2531. doi:10.1093/cercor/bhn013. Article | FREE FULLTEXT PDF

By Neuronicus, 28 November 2015


The FIRSTS: the isolation of tryptophan (1901)

The post-Thanksgiving dinner drowsiness is due to the very carbohydrate-rich meal and not to the amount of tryptophan in the turkey meat, which is not higher than that in chicken.

There is a myth that says the post-Thanksgiving dinner drowsiness is due to high amounts of tryptophan found in the turkey meat. Nothing could be further from the truth; in fact, it is due to the high amounts of carbohydrates in the Thanksgiving dinner, which trigger massive insulin production. Anyway, the myth lives on, despite evidence that turkey has about the same amount of tryptophan as chicken. That being said, what’s this tryptophan business?

Tryptophan is an amino acid necessary for many things in the body, including the production of serotonin, a brain neurotransmitter. You cannot live without it and your body cannot make it. Thus, you need to eat it. There are many sources of tryptophan, like eggs, soybeans, cheeses, various meats and so on.

Tryptophan was first isolated by Hopkins & Cole (1901) through hydrolysis of casein, a protein found in milk. And there were no two ways about it: “there is indeed not the smallest doubt that our substance is the much-sought tryptophane” (p. 427). No “we’re confident that…”, no “we’re suggesting this…”, none of the maybes, possiblys, probablys, and most likelys that one finds in overwhelming abundance in the cautious tone adopted by today’s studies. Many more scientists today, fewer job openings; one has a career to think about…

Digression aside, Hopkins went on to prove that tryptophan is an essential amino acid by feeding mice a tryptophan-free diet (and the mice died). By 1929 he had been knighted and had received the Nobel Prize for his contributions to the vitamin field. Also, a little-known fact for you butter lovers: Hopkins proved that margarine is worse than butter because it lacks certain vitamins, and you have him to thank for the vitamin-enriched margarine that you find today.

Reference: Hopkins FG & Cole SW (Dec 1901). A contribution to the chemistry of proteids: Part I. A preliminary study of a hitherto undescribed product of tryptic digestion. The Journal of Physiology, 27 (4-5): 418–28. doi:10.1113/jphysiol.1901.sp000880. PMC 1540554. PMID 16992614. Article | FREE FULLTEXT PDF

By Neuronicus, 27 November 2015

Lucy’s 9 vertebrae are actually 8

Lucy. Left: picture of the real skeleton. Middle and Right: reconstructions. Courtesy of Wikipedia

As Google reminded us, today is the 41st anniversary of the finding of Lucy, the first discovered member of the species Australopithecus afarensis. Lucy lived in Ethiopia about 3.2 million years ago and the most extraordinary fact about her is that her fossil represents the first evidence of bipedalism in a hominin (we are also hominins).

Lucy is a “missing link” (not ‘missing’ anymore, obviously) between the common ancestor of humans and chimpanzees and modern humans, because she has both ape-like features (jaw, forehead, long arms, small cranium) and human-like features (knee, ankle, lumbar curve, pelvic bones), and walked upright.

Meyer et al. (2015) wanted to do a comprehensive reconstruction of Lucy for display at the American Museum of Natural History in New York. During this work they noticed that one vertebra out of the total of nine found is kind of small compared to the others. So they set out to measure vertebrae from all sorts of other species, living and extinct, and after some factor analysis they concluded that of Lucy’s nine found vertebrae, the little one is not actually hers, but belongs to a different species from the genus Theropithecus (a baboon ancestor).

This finding is functionally uninformative, and their “work does not refute previous work on Lucy or its importance for human evolution, but rather highlights the importance of studying original fossils, as well as the efficacy of the scientific method.” In other words, give the poor anthropologists not reconstructions but the original fossils to work with (most people worked with Lucy’s reconstructions, which missed some details, thus allowing this pesky vertebra to remain miscataloged for 40 years).

The new alignment from doi: 10.1016/j.jhevol.2015.05.007

This is the first pure anthropology paper that I have read in full, and let me tell you, I found a lot of curious things unrelated to Lucy. For example, from an anthropologist’s point of view, an adult is someone with the third molar completely erupted. We should then look into people’s mouths before giving them the keys to the wine cellar, because some 21-year-olds are definitely not adults. Also, instead of a medical doctor, get an anthropologist to teach anatomy, because oh boy, do these people know their skeletons! Here is an excerpt from the Methods section: “The overall size of the A.L. 288-1am partial vertebra was calculated as the geometric mean of six linear dimensions: lamina superoinferior height and dorsoventral thickness, pars interarticularis width, interarticular facet height, and superior and inferior articular interfacet maximum transverse widths. The pars interarticularis geometric mean includes three variables from the pars interarticularis: lamina superoinferior height and dorsoventral thickness, and pars interarticularis width” (p. 175).

All in all, nice!

Reference: Meyer MR, Williams SA, Smith MP, Sawyer GJ (August 2015, Epub 6 Jun 2015). Lucy’s back: Reassessment of fossils associated with the A.L. 288-1 vertebral column. Journal of Human Evolution, 85:174-80. doi: 10.1016/j.jhevol.2015.05.007. Article | FREE FULLTEXT PDF

By Neuronicus, 24 November 2015


Dead salmon engaged in human perspective-taking, uncorrected fMRI study reports


“Subject. One mature Atlantic Salmon (Salmo salar) participated in the fMRI study. The salmon was approximately 18 inches long, weighed 3.8 lbs, and was not alive at the time of scanning.

Task. The task administered to the salmon involved completing an open-ended mentalizing task. The salmon was shown a series of photographs depicting human individuals in social situations with a specified emotional valence. The salmon was asked to determine what emotion the individual in the photo must have been experiencing.”

Before explaining why you read what you just read and whether it’s true (it is!), let me tell you that for many people, me included, imaging studies seem very straightforward compared to, say, immunohistochemistry protocols. I mean, what do you have to do? You stick a human in a big scanner (fMRI, PET, or what-have-you), you start the image acquisition software, and then some magic happens and you get pretty pictures of the human brain on your computer, associated with some arbitrary numbers. Then you tell the humans to do something and you acquire more images, which come with a different set of numbers. Finally, you compare the two sets of numbers and voilà: the neural correlates of whatever. Easy-peasy.

Well, it turns out it’s not so easy-peasy. Those numbers correspond to voxels, something like pixels, only 3D; a voxel is a small cube of brain (with a side of, say, 2 or 3 mm) comprising hundreds of thousands to millions of brain cells. After this division, depending on your voxel size, you end up with a whopping 40,000 to 130,000 voxels or thereabouts for one brain. So, a lot of numbers to compare.
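The voxel counts follow from simple division. A back-of-the-envelope sketch, assuming a round brain volume of ~1,200 cm³ (the real counts also depend on masking out non-brain voxels, which is why the quoted figures run a bit lower):

```python
# Back-of-the-envelope voxel count: (brain volume) / (voxel volume).
# The ~1,200 cm^3 brain volume is an assumed round figure for illustration.

BRAIN_VOLUME_MM3 = 1_200_000  # ~1,200 cm^3 expressed in mm^3

def voxel_count(side_mm):
    """Number of cubic voxels of the given side needed to tile the brain."""
    return BRAIN_VOLUME_MM3 // side_mm**3

print(voxel_count(3))  # 3 mm voxels: 44444
print(voxel_count(2))  # 2 mm voxels: 150000
```

Halving the voxel side multiplies the number of comparisons roughly eightfold, which is why voxel size matters so much for the statistics below.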

When you do so many comparisons, by chance alone you will find some that are significant. This is nature’s perverse way of showing relationships where there are none and of screwing up a PhD. Those relationships are called false positives, and the more comparisons you do, the more likely it is to find something statistically significant. So, in the ’90s, when the problem became very pervasive with the staggering amount of data generated by an fMRI scan, researchers came up with mathematical ways to dodge it, called multiple comparisons corrections (like the application of Gaussian Random Field Theory). Unfortunately, even 20 years later one can still find imaging studies with uncorrected results.
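You can watch the false positives appear with a ten-line simulation. A sketch under the null hypothesis (no real activation in any voxel, so p-values are uniformly distributed), using a common uncorrected threshold of p < 0.001 and a simple Bonferroni correction rather than the fancier methods mentioned above:

```python
import random

# Null-data simulation of the multiple comparisons problem:
# no voxel carries a real signal, yet some pass an uncorrected threshold.
random.seed(42)  # fixed seed for reproducibility

N_VOXELS = 40_000   # one brain's worth of independent tests
ALPHA = 0.001       # common uncorrected per-voxel threshold

# Under the null hypothesis, p-values are uniform on [0, 1].
p_values = [random.random() for _ in range(N_VOXELS)]

uncorrected = sum(p < ALPHA for p in p_values)            # expect ~40 by chance
bonferroni = sum(p < 0.05 / N_VOXELS for p in p_values)   # corrected: expect ~0

print(f"Uncorrected 'significant' voxels: {uncorrected}")
print(f"Bonferroni 'significant' voxels: {bonferroni}")
```

With pure noise, dozens of voxels pass the uncorrected threshold, while the corrected threshold wipes them out; this is exactly the contrast the study below demonstrates in the flesh (well, in the fish).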

To show how important it is to perform that statistical correction, Bennett et al. (2010) did an fMRI study on perspective taking with one subject: a salmon. The subject was dead at the time of scanning. Now you can re-read the above excerpt from the Methods section.

Scroll down a bit to the Results section: “Out of a search volume of 8064 voxels a total of 16 voxels were significant”, p(uncorrected) < 0.001, showing that the salmon was engaging in active perspective-taking.

After the multiple comparisons correction, no voxel lit up, meaning that the salmon was not really imagining what the humans were feeling. Bummer…

The study has been extensively covered by the media and I jumped on that bandwagon too – even if a bit late – because I never tire of this study; it’s absolutely funny and timeless. The authors even received the 2012 Ig Nobel Prize for Neuroscience, justly deserved. I refrained from fish puns because there are plenty in the links I provided after the Reference. Feel free to come up with your own. Enjoy!

Reference: Bennett, CM, Baird AA, Miller MB & Wolford GL (2010). Neural correlates of interspecies perspective taking in the post-mortem Atlantic Salmon: An argument for multiple comparisons correction. Journal of Serendipitous Unexpected Results, 1, 1–5, presented as poster at the 2009 Human Brain Mapping conference. PDF | Nature cover | Neuroskeptic cover | Scientific American full story

By Neuronicus, 23 November 2015

Pesticides reduce pollination

Close-up of a bee with pollen flying by a flower. Credit: Jon Sullivan. License: PD

Bees are having a difficult time these days, what with that mysterious colony collapse disorder on top of various viral, bacterial and parasitic diseases. Of course, the widespread use of pesticides did not help the hives thrive, as many pesticides have various deleterious effects on bees, from poor foraging and reduced reproduction to outright death.

The relatively new (’90s) class of insecticides – the neonicotinoids – has been met with great hope because it has low toxicity in birds and mammals, as opposed to the organophosphates, for example. Why that should be the case is a mystery to me, because the neonicotinoids bind irreversibly to the nicotinic receptors present in both the peripheral and central nervous systems, which does not put them in a favorable light.

Now Stanley et al. (2015) have found that exposure to the neonicotinoid thiamethoxam reduces the pollination that bumblebees provide to apples. They tested this using 24 bumblebee colonies, with low-level exposure over 13 days meant to mimic realistic in-field exposure. The apples visited by insecticide-exposed bumblebees showed a 36% reduction in seed count.

Almost 90% of flowering plants need pollination to reproduce, so any threat to pollination can cause serious problems. Over the past few years, virtually all U.S. corn has been treated with neonicotinoids; the EU banned thiamethoxam use in 2013. And, to make matters worse, neonicotinoids are but one class of the many toxins affecting bees.

Related post: Golf & Grapes OR Grandkids (but not both!)

Reference: Stanley DA, Garratt MP, Wickens JB, Wickens VJ, Potts SG, & Raine NE. (Epub 18 Nov 2015). Neonicotinoid pesticide exposure impairs crop pollination services provided by bumblebees. Nature, doi: 10.1038/nature16167. Article

By Neuronicus, 21 November 2015

Will you trust a pigeon pathologist? That’s right, he’s a bird. Stop being such an avesophobe!


From Levenson et al. (2015), doi: 10.1371/journal.pone.0141357. License: CC BY 4.0

Pigeons have amazing visual skills. They can remember facial expressions, recall almost 2,000 images, recognize all the letters of the alphabet (well, even I can do that), and even tell apart a Monet from a Picasso! (OK, birdie, you got me on that one.)

Given their visual prowess, Levenson et al. (2015) figured that pigeons might be able to distinguish medically relevant images (a bit of a big step in reasoning there, but let’s go with it). They got a few dozen pigeons, starved them a bit so the birds would be motivated to work for food, and started training them to recognize histology pictures of malignant versus non-malignant breast tumors. These are the exact same pictures your radiologist looks at after a mammogram and your pathologist after a breast biopsy; they were not retouched in any way for the pigeons’ benefit (except to make the task more difficult, see below). Every time a pigeon pecked on the correct image, it got a morsel of food (see picture). Training continued for a few weeks on over 100 images.

For biopsies, the birds performed overwhelmingly well, reaching 99% accuracy regardless of the magnification of the picture; for mammograms, they reached up to 80% accuracy, just like their human counterparts. Modifying the pictures’ attributes, like rotation, compression or color, lowered their accuracy somewhat, but they still scored only marginally lower than humans and considerably better than any computer software. More importantly, after training the pigeons were able to generalize, correctly classifying previously unseen pictures.

Let’s be clear: I’m not talking about some fancy breed here, but your common beady-eyed, suspicious-sidling, feral-looking rock pigeon. Yes, the one and only pest that receives stones and bread in equal measure, the former usually accompanied by vicious swearing uttered by those who have encountered its slushy “gifts” under their shoes, on the windshield or in the coffee, and the latter offered by more kindly disposed yet utterly naive individuals in the misguided hope of befriending nature. Columba livia by its scientific name: at once an exasperating pest and an excellent pathologist! Who knew?!

The authors even suggest using pigeons instead of training and paying clinicians. Hmmm… But whom do I sue if my mother’s breast cancer gets missed by the bird, in one of those 1% of cases? Because somehow making a pigeon face the guillotine does not seem like justice to me. Or is this yet another plot to get the clinicians off the hook for misdiagnoses? Leave the medical profession alone, birdies – it is morally sensitive as it is – and seek employment with the police or Google; they always need better performance in the ever-challenging task of face recognition in surveillance videos.

P.S. The reason you didn’t recognize the word “avesophobe” in the title is that I just invented it, to distinguish the hatred of birds from a more serious affliction, ornithophobia, the fear of birds.

Reference: Levenson RM, Krupinski EA, Navarro VM, & Wasserman EA (18 Nov 2015). Pigeons (Columba livia) as Trainable Observers of Pathology and Radiology Breast Cancer Images. PLoS One, 10(11):e0141357. doi: 10.1371/journal.pone.0141357.  Article | FREE FULLTEXT PDF

By Neuronicus, 19 November 2015

Prions in urine


This is one of the scariest papers I have read.

All prion diseases – like mad cow disease, scrapie, kuru or Creutzfeldt-Jakob disease (CJD) – are incurable and fatal. Until recently, we thought the only way you could get one was by ingesting the meat of an affected animal. Or, as I reported a couple of months ago, by ingesting drugs derived from the pituitary glands of infected dead humans.

A paper published 4 years ago describes another unexpected way to contract this horrible, deadly disease. Using electrophoresis, mass spectrometry and liquid chromatography selected reaction monitoring, Van Dorsselaer et al. (2011) found prion proteins in a class of infertility drugs, the injectable urine-derived gonadotropins. These drugs are given to hundreds of thousands of women in North America for infertility treatment. They are derived from the urine of donor women, who are screened for all sorts of diseases; but CJD has an incubation period of decades and may thus be undetectable by non-invasive methods.

Now, this in itself is not so worrisome, as additional screening of the final medicine can be done to eliminate the batches with prions. What scared the living you-know-what out of me is the thought that infected humans pee in the toilet, and then that goes to the water treatment plants and then comes out of your faucet. My question is: can the purification done at the water treatment plant eliminate the prions? I really, really do not wish to be alarming and panicky, especially in a world where every other piece of news you read/hear seems to be something scary, so I invite anybody with knowledge about water treatment to comment and let us all know that it is impossible, or at least highly unlikely, to get prions from drinking water. I don’t know how; maybe some step in the water treatment destroys proteins as a matter of course, or something.

Reference:  Van Dorsselaer A, Carapito C, Delalande F, Schaeffer-Reiss C, Thierse D, Diemer H, McNair DS, Krewski D, & Cashman NR. (23 Mar 2011). Detection of prion protein in urine-derived injectable fertility products by a targeted proteomic approach. PLoS One, 6(3):e17815. doi: 10.1371/journal.pone.0017815. Article | FREE FULLTEXT PDF

By Neuronicus, 18 November 2015


The werewolf and his low fibroblast growth factor 13 levels

Petrus Gonsalvus, anonymous painting of the first recorded case of hypertrichosis in 1642. License: PD

Although they are very rare, werewolves do exist. And now the qualifier: werewolves as in people with excessive hair growth all over the body, and not the more familiar kind that changes into a wolf every time there is a full moon. The condition is called hypertrichosis and its various forms have been associated with distinct genetic abnormalities.

In a previous report, DeStefano et al. (2013) identified the genetic locus of X-linked congenital generalized hypertrichosis (CGH): a 19-Mb region on Xq24-27 that spans about 82 genes, resulting mainly from insertions from chromosomes 4 and 5. Now they wanted to see what mechanism is responsible for the disease. First, they looked at the hair follicles of a man afflicted with CGH who has hair almost all over his body, and noticed some structural abnormalities. Then they analyzed the expression of several genes from the affected region of the chromosome in this man and others with CGH, and observed that only the levels of Fibroblast Growth Factor 13 (FGF13), a protein found in hair follicles, were much lower in CGH. Then they did some more experiments to establish the crucial role of FGF13 in regulating follicle growth.

An interesting finding of the study is that, at least in the case of hypertrichosis, it is not the content of the genomic sequences added to chromosome X that matters, but their mere presence, affecting a gene located 1.2 Mb away from the insertion.

Reference: DeStefano GM, Fantauzzo KA, Petukhova L, Kurban M, Tadin-Strapps M, Levy B, Warburton D, Cirulli ET, Han Y, Sun X, Shen Y, Shirazi M, Jobanputra V, Cepeda-Valdes R, Cesar Salas-Alanis J, & Christiano AM ( 7 May 2013, Epub 19 Apr 2013). Position effect on FGF13 associated with X-linked congenital generalized hypertrichosis. Proceedings of the National Academy of Sciences of the U.S.A., 110(19):7790-5. doi: 10.1073/pnas.1216412110. Article | FREE FULLTEXT PDF

By Neuronicus, 17 November 2015

Pic of the day: Evolution


Dobzhansky was, in his words, an “evolutionist and creationist”, a geneticist with deep faith in God.

Reference: Dobzhansky, T. (Mar 1973). Nothing in Biology Makes Sense Except in the Light of Evolution,  American Biology Teacher, 35 (3): 125–129. DOI: 10.2307/4444260. Article | FULLTEXT PDF via Research Gate

Terrorist attacks increase male fetal loss

The odds of having a baby boy decrease after terrorist attacks, natural or man-made disasters, or economic depressions. Several studies worldwide support this finding. It is somewhat counter-intuitive, because there are anecdotal accounts reporting an increase in male births after a war, presumably to make up for the lost men.

Bruckner et al. (2010) wanted to see whether this decrease in the odds of a male birth, also called the secondary sex ratio, is due to a failure to conceive male babies or to male fetuses dying in the womb before birth. They looked at the public databases of fetal deaths and births from 1996-2002 from the U.S. National Center for Health Statistics.

The results showed that in the months following the September 11, 2001 terrorist attacks, deaths of male fetuses older than 20 weeks increased significantly. The authors invoke the communal bereavement hypothesis, which stipulates that stress increases even in persons not directly affected by a tragedy. Although the effects of stress on pregnant women are well documented, why male fetuses seem to be more susceptible to the mother’s stress is unknown.

I chose to feature this paper because of the recent Paris atrocities.

Reference: Bruckner TA, Catalano R, & Ahern J. (25 May 2010). Male fetal loss in the U.S. following the terrorist attacks of September 11, 2001. BMC Public Health.;10:273. doi: 10.1186/1471-2458-10-273. Article | FREE FULLTEXT PDF

By Neuronicus, 15 November 2015

Kinesin in axon regeneration

Fig. 8 from Lu, Lakonishok, & Gelfand (2015). License: Creative Commons 2.

The longest neuron a human has runs from the spinal cord to the tips of the toes. Like any cell, it needs various proteins in various places. How is this transport done? Surely not by diffusion: the proteins would degrade or would arrive at inopportune membrane-moments (I just coined that). Molecular motors, on the other hand, are toiling proteins that haul huge cargoes for the benefit of the cell in an incredibly ingenious manner (they have feet and sticky soles and gears and so on). Notable motors are kinesin and dynein: the former brings stuff to the terminal buttons of the axon, the latter goes in the opposite direction, to the soma. They walk on a railway-like scaffold in a very funny manner, if you are to believe the simulations. Go on, I dare you, search for kinesin or dynein animations on Google or YouTube and tell me then that biology is not funny.

And because no self-respecting scientist can work with molecular motors without adding his/her contribution to the above-mentioned wealth of animations, the paper below comes with no fewer than 9 movies (as online supplemental material)! Lu et al. (2015) focused their attention on the role of kinesin in injured neurons. The authors dyed several types of proteins in fly neurons and then cultured the cells in a Petri dish. And then cut their axons with a glass needle. After that, they used a really fancy microscope (and a good microscopist; you should look at their pictures) to see what happens. Which is this: the cut activates a c-Jun N-terminal kinase cascade (the cell’s response to stress), which leads to the sliding of microtubules (part of the cell’s cytoskeleton), which is completely dependent on the kinesin-1 heavy chain. This sliding initiates axonal regeneration (see picture).

I believe the kinesins and dyneins are the most charming, funny, and endearing proteins out there. Yes, I’m anthropomorphizing clumps of amino acids. I know, I’m a geek.

Reference: Lu W, Lakonishok M, & Gelfand VI (1 Apr 2015, Epub 5 Feb 2015). Kinesin-1–powered microtubule sliding initiates axonal regeneration in Drosophila cultured neurons. Molecular Biology of the Cell, 26(7):1296-307. doi: 10.1091/mbc.E14-10-1423. Article | FREE FULLTEXT PDF | Supplemental movies

Some youtube videos I mentioned before, quite accurate, too: best in show

by Neuronicus, 12 November 2015

Putative mechanism for decreased spermatogenesis following SSRI

The SSRIs (selective serotonin reuptake inhibitors) are the most commonly prescribed antidepressants around the world. Whether it is Prozac, Zoloft or Celexa, chances are that 1 in 4 Americans (or 1 in 10, depending on the study) will at some point in their lifetime face the decision of whether to start an antidepressant course or not. And yet adherence to treatment is significantly low, as many people get off the SSRIs because of their side effects, one of the main complaints being sexual dysfunction in the form of reduced libido and pleasure.

Now a new study finds a mechanism for an even more worrisome effect of citalopram (Celexa), an SSRI: the reduction of spermatogenesis. Prasad et al. (2015) used male zebrafish as a model and exposed them to citalopram at 3 different doses for 2- or 4-week periods. They found that the expression in the brain of serotonin-related genes (trp2 and sert) and gonadotropin genes (lhb, sdhb, gnrh2, and gnrh3) was differentially affected depending on the dose and duration of treatment. In the testes, the “long-term medium- and high-dose citalopram treatments displayed a drastic decrease in the developmental stages of spermatogenesis as well as in the matured sperm cell count” (p. 5). The authors also looked at how the neurons are organized and found that the serotonin fibers are associated with the fibers of neurons that release gonadotropin-releasing hormone 3 (GnRH3) in the preoptic area, a brain region in the hypothalamus heavily involved in sexual and parental behavior in both humans and fish.

Put shortly: in the brain, citalopram affects the gene expression profiles and fiber density of serotonin neurons, which in turn decreases the production of GnRH3, which may account for the sexual dysfunctions that follow citalopram treatment. In the testes, citalopram may act directly, binding to local serotonin receptors and decreasing spermatogenesis.

Reference: Prasad P, Ogawa S, & Parhar IS. (Oct 2015, Epub 8 Jul 2015). Serotonin Reuptake Inhibitor Citalopram Inhibits GnRH Synthesis and Spermatogenesis in the Male Zebrafish. Biology of Reproduction. 93(4):102, 1-10. doi: 10.1095/biolreprod.115.129965. Article | FREE FULLTEXT PDF

By Neuronicus, 11 November 2015

The FIRSTS: the pons (1572)

The pons varolii, as described in plate 677 of Henry Gray’s Anatomy of the Human Body (1918). License: PD.

The name of the pons, that part of the brainstem that is so important for survival functions (like breathing) and holds the nuclei of several cranial nerves, is actually pons varolii. I wondered why that is. When I learned neuroanatomy I was extremely lucky, because my knowledge of Latin, such as it is, contributed immensely to memorizing brain structures; pons means “bridge” in Latin, which makes sense because the structure looks like one (see picture). But I was at a loss with varolii. Was it some sort of joke that I missed? Was it the “rude bridge” or, more colloquially, the “a**hole bridge”?! Varo (or the closest thing to it) in Latin means rude or uncivilized.

Title page of Varolio’s published letter in 1573.

Well, it turns out that the man who first described the pons is Costanzo Varolio (1543–1575), and the structure is named after him. Duh! As if it’s uncommon to name things after their discoverer… Anyway, I didn’t read the original account, which is freely available in its digitized-by-Google form of dubious quality (you can see the actual thumb of the person who scanned it on the last page, and many pages are illegible due to poor scanning technique). I got the information about the pons from the Pioneers in Neurology section in the Journal of Neurology. Varolio wrote a huge letter (seventy-some pages’ worth!) on 1 April 1572 to another physician describing the optic nerves and the pons. The letter was published a year later in Padua, Italy. The pons may have been described and/or named earlier, but, alas, those works were either never published or published much later. Goes to show that publication is more important than discovery…

Original reference: Varolio, C. (1573). De Nervis Opticis nonnullisque aliis praeter communem opinionem in Humano capite observatis (On the optic nerves observed in the human brain and a few other particulars adverse to the common opinion). Padua. Google ebook

Reference: Zago S & Meraviglia MV (July 2009, Epub 6 June 2009). Costanzo Varolio (1543–1575). Journal of Neurology, 256(7):1195-6. doi: 10.1007/s00415-009-5192-5. Article | FREE FULLTEXT PDF

By Neuronicus, 10 November 2015

The culprit in methamphetamine-induced psychosis is very likely BDNF

Psychoses. Credit: NIH (Publication Number 15-4209) & Neuronicus. License: PD.

Prolonged methamphetamine use may lead to psychotic episodes in the absence of the drug. These episodes are persistent and closely resemble schizophrenia. One of the (many) molecules involved in both schizophrenia and meth abuse is BDNF (brain-derived neurotrophic factor), a protein mainly known for its role in neurogenesis and long-term memory.

Lower BDNF levels have been observed in schizophrenia, so Manning et al. (2015) wondered whether BDNF is also involved in meth-induced psychosis. They took normal mice and mice genetically engineered to express lower levels of BDNF. They gave them meth for 3 weeks, with doses escalating from one week to the next. Interestingly, no meth on weekends, which made me rapidly scroll to the beginning of the paper and confirm my suspicion that the experiments were not done in the USA; if they were, the grad students would not have had the weekends off, and the mice would have received meth every day, weekends included. Look how social customs can influence research! Anyway, social commentary aside, after the meth injections the researchers left the mice untroubled for 2 more weeks. And then they tested them on a psychosis test.

How do you measure psychosis in rodents? By inference, since a mouse will not grab your coat and tell you about the newly appeared hypnotizing wall pattern and the like. Basically, it was observed that psychotic people have a tendency to walk in a disorganized manner when given the opportunity to explore, a behavior also observed in rodents on amphetamines. This disorganized walk can be quantified into an entropic index, which is thought to reflect the occurrence of psychosis (I know, a lot of inferring. But you come up with a better model of psychosis in rodents!).
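To make the idea of an “entropic index” concrete, here is a minimal sketch of one way to score how disorganized a walk is: bin the animal’s turning angles and take the Shannon entropy of that distribution. This is my own illustration of the general concept, not necessarily the exact measure Manning et al. computed.

```python
import math
import random

def entropic_index(path, n_bins=12):
    """Shannon entropy (bits) of a 2D path's turning-angle distribution.
    Higher values = more disorganized locomotion. Hypothetical index for
    illustration, not the published measure."""
    # headings between consecutive positions
    headings = [math.atan2(y2 - y1, x2 - x1)
                for (x1, y1), (x2, y2) in zip(path, path[1:])]
    # turning angles between consecutive headings, wrapped to [-pi, pi)
    turns = [(h2 - h1 + math.pi) % (2 * math.pi) - math.pi
             for h1, h2 in zip(headings, headings[1:])]
    counts = [0] * n_bins
    for t in turns:
        counts[min(int((t + math.pi) / (2 * math.pi) * n_bins), n_bins - 1)] += 1
    total = len(turns)
    return -sum(c / total * math.log2(c / total) for c in counts if c)

random.seed(1)
straight = [(i, 0) for i in range(30)]                            # orderly walk
erratic = [(random.random(), random.random()) for _ in range(30)]  # disorganized
assert entropic_index(straight) < entropic_index(erratic)
```

A perfectly straight walk concentrates all turns in one bin (entropy 0), while erratic wandering spreads them out, so a high index flags the amphetamine-like disorganized exploration described above.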

Manning et al. (2015) gave their mice amphetamine to mimic psychosis and then observed their behavior. And the results were that the mice genetically engineered to express less BDNF showed reduced psychosis (i.e., a lower entropic index). In conclusion, alteration of the BDNF pathway may be responsible for the development of psychosis in methamphetamine users.

Reference: Manning EE, Halberstadt AL, & van den Buuse M. (Epub 9 Oct 2015). BDNF-Deficient Mice Show Reduced Psychosis-Related Behaviors Following Chronic Methamphetamine. International Journal of Neuropsychopharmacology, 1–5. doi: 10.1093/ijnp/pyv116. Article | FREE FULLTEXT PDF

By Neuronicus, 9 November 2015

Hope for a new migraine medication

Headache. Courtesy of ClipArtHut.

The best current anti-migraine medications are triptans (5-HT1B/1D receptor agonists). Because these medications are contraindicated in patients with a variety of other diseases (cardiovascular, renal, hepatic, etc.), the search for alternative drugs continues.

The heat- and pain-sensitive TRPV1 receptors (Transient Receptor Potential Vanilloid 1) localized on the trigeminal terminals (the fifth cranial nerve) have been implicated in the production of headaches. That is, if you activate them by, say, capsaicin, the same substance that gives chili peppers their hotness, you get headaches (you’d have to eat an awful lot of peppers to get a migraine, though). On the other hand, if you block these receptors, you may alleviate the migraine. All good and well, so let’s hunt for some TRPV1 antagonists, i.e. blockers. But, as theory often doesn’t meet practice, the first two antagonists tried were dropped in clinical trials for lack of efficacy.

Meents et al. (2015) are giving another try to two different TRPV1 antagonists, with the fetching names of JNJ-38893777 and JNJ-17203212, respectively. Because you cannot ask a rat if it has a headache, it is very difficult to have a rodent model of migraine. Instead, the researchers gave rats an inflammatory soup directly into the subarachnoid space, or capsaicin directly into the carotid artery, manipulations which they have reason to believe produce severe headaches along with some biological changes, like increased expression of a certain gene (c-fos, if you must know) in the trigeminal brain stem complex and release of the neurotransmitter calcitonin gene-related peptide (CGRP).

JNJ-17203212 got rid of all those physiological changes in a dose-dependent manner, and presumably of the migraine, too. The other drug, JNJ-38893777, was effective only at the highest dose. Give these drugs a few more tests to pass, and off to the clinical trials with them. I’m joking; it takes a lot more research than a single paper between discovery and human drug trials.

Reference: Meents JE, Hoffmann J, Chaplan SR, Neeb L, Schuh-Hofer S, Wickenden A, & Reuter U (December 2015, Epub 24 June 2015). Two TRPV1 receptor antagonists are effective in two different experimental models of migraine. The Journal of Headache and Pain. 16:57. doi: 10.1186/s10194-015-0539-z. Article | FREE FULLTEXT PDF

By Neuronicus, 8 November 2015

The FIRSTS: discovery of the polymerase (1956)

Arthur Kornberg. License: PD, courtesy of the National Library of Medicine.

Polymerases are enzymes that synthesize nucleic acids. The main types of polymerases are DNA polymerases and RNA polymerases. Everything alive has them. Saying that you cannot have cellular life on Earth without them is like saying you cannot have a skeleton without bones.

The first polymerase was discovered by Arthur Kornberg in 1956. Of note, the discovery he made with his two postdocs and a lab technician was rejected for publication by The Journal of Biological Chemistry, basically on the grounds that they didn’t know what they were talking about or weren’t qualified to talk about it. It took a new Editor-in-Chief to push the publication through; it finally appeared in the July 1958 issue. Talk about politicking in academia…

Diagram of DNA polymerase extending a DNA strand and proof-reading. License: PD. Credit: Madeleine Price Ball.

Anyway, less than a year after publication, in 1959, Kornberg (but not his co-authors) received the Nobel Prize for the discovery of the polymerase. He isolated it from a bug called E. coli, the same bacterium that can be found in your intestines and poop or can give you food poisoning (same species, but not necessarily the same strain).

Reference: Lehman IR, Bessman MJ, Simms ES, & Kornberg A (July 1958). Enzymatic Synthesis of Deoxyribonucleic Acid. I. Preparation of Substrates and Partial Purification of an Enzyme from Escherichia coli. The Journal of Biological Chemistry, 233:163-170. FREE FULLTEXT PDF | 2005 JBC Centennial Cover

By Neuronicus, 7 November 2015

Is religion turning perfectly normal children into selfish, punitive misanthropes? Seems like it.

Screenshot from “Children of the Corn” (Director: Fritz Kiersch, 1984).

The main argument that religious people have against atheism or agnosticism is: without a guiding deity and a set of rules of behavior, how can one trust a non-religious person to behave morally? In other words, there is no incentive for the non-religious to behave in a societally accepted manner. Or so it seemed. Past tense. There has been some evidence showing that, contrary to expectations, non-religious people are less prone to violence and deliver more lenient punishments than religious people do. Also, the non-religious show charitable behavior equal to that of the religious folks, despite the latter self-reporting more charitable acts. But these studies were done with adults, usually with non-ecological tests. Now a truly first-of-its-kind study finds something even more interesting, something that calls into question the fundamental basis of Christianity’s and Islam’s moral justifications.

Decety et al. (2015) administered a test of altruism and a test of moral sensitivity to 1170 children, aged 5-12, from the USA, Canada, Jordan, Turkey, and South Africa. Based on parents’ reports about their household practices, the children were divided into 280 Christian, 510 Muslim, and 323 Not Religious (the remaining 57 children belonged to other religions but were not included in the analyses due to lack of statistical power). The altruism test consisted of letting children choose their favorite 10 out of 30 stickers, theirs to keep; but because there weren’t enough stickers for everybody, the child could give some of her/his stickers to another child not fortunate enough to play the sticker game (the researcher gave the child privacy while choosing). Altruism was scored as the number of stickers given to the fictive child. In the moral sensitivity task, children watched 10 videos of a child pushing, shoving, etc. another child, either intentionally or accidentally, and were then asked to rate the meanness of the action and to judge the amount of punishment deserved for each action.

And… the highlighted results are:

  1. “Family religious identification decreases children’s altruistic behaviors.
  2. Religiousness predicts parent-reported child sensitivity to injustices and empathy.
  3. Children from religious households are harsher in their punitive tendencies.”
From Current Biology (DOI: 10.1016/j.cub.2015.09.056). Copyright © 2015 Elsevier Ltd. NOTE: ns. means non-significant difference.

Parents’ educational level did not predict children’s behavior, but the level of religiosity did: the more religious the household, the less altruistic, the more judgmental, and the harsher in punishment the children were. Also, in stark contrast with the actual results, religious parents viewed their children as more empathetic and sensitive to injustice than the non-religious parents did theirs. This was a linear relationship: the more religious the parents, the higher their self-reports of the child’s socially desirable behavior, but the lower the child’s objective empathy and altruism scores.

Childhood is an extraordinarily sensitive period for learning desirable social behavior. So… is religion really turning perfectly normal children into selfish, vengeful misanthropes? What anybody does at home is their business, but maybe we could make a secular schooling paradigm mandatory to level the field (i.e. forbid religion teachings in school)? I’d love to read your comments on this.

Reference: Decety J, Cowell JM, Lee K, Mahasneh R, Malcolm-Smith S, Selcuk B, & Zhou X. (16 Nov 2015, Epub 5 Nov 2015). The Negative Association between Religiousness and Children’s Altruism across the World. Current Biology. DOI: 10.1016/j.cub.2015.09.056. Article | FREE PDF | Science Cover

By Neuronicus, 5 November 2015

Only the climate change scientists are interested in evidence. The rest is politics

Satellite image of clouds created by the exhaust of ship smokestacks (2005). Credit: NASA. License: PD.
Satellite image of clouds created by the exhaust of ship smokestacks (2005). Credit: NASA. License: PD.

Medimorec & Pennycook (2015) analyzed the language used in two prominent reports regarding climate change. Climate change is no longer a subject of scientific debate, but of political discourse. Nevertheless, there are a few scientists who are skeptical of climate change. As part of a conservative think tank, they formed the Nongovernmental International Panel on Climate Change (NIPCC) as an alternative to the Intergovernmental Panel on Climate Change (IPCC). “In 2013, the NIPCC authored Climate Change Reconsidered II: Physical Science (hereafter referred to as ‘NIPCC’; Idso et al. 2013), a scientific report that is a direct response to IPCC’s Working Group 1: The Physical Science Basis (hereafter referred to as ‘IPCC’; Stocker et al. 2013), also published in 2013” (Medimorec & Pennycook, 2015).

The authors are not climate scientists, but psychologists armed with nothing but three text analysis tools: the Coh-Metrix text analyzer, Linguistic Inquiry and Word Count (LIWC), and the AntConc 3.3.5 concordancer toolkit. They do not even fully understand the two very lengthy and highly technical reports; as they put it,

“it is very unlikely that non-experts (present authors included) would have the requisite knowledge to be able to distinguish the NIPCC and IPCC reports based on the validity of their scientific arguments”.

So they proceeded to count nouns, verbs, adverbs, and the like. The results: the IPCC used more formal language, more nouns, more abstract words, more infrequent words, more complex syntax, and a lot more tentative language (‘possible’, ‘probable’, ‘might’) than the NIPCC. Which is ironic, since the climate change proponents are the ones accused of alarmism and trumpeting catastrophes. On the contrary, their language was much more restrained, perhaps out of fear of controversy or, just as likely, because they are scientists and very afraid to put their reputations at stake by risking type I errors.
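The core of this kind of analysis is conceptually simple word counting. As a toy sketch of the idea (my own hedging word list and tokenizer, not the Coh-Metrix/LIWC/AntConc machinery the authors actually used), one can estimate how “tentative” a text is:

```python
import re

# Toy hedging lexicon for illustration only; the published analysis relied
# on validated tool categories, not this hand-picked list.
TENTATIVE = {"possible", "probable", "might", "may", "likely",
             "perhaps", "suggests", "appears", "could"}

def tentativeness(text):
    """Fraction of word tokens that are hedging ('tentative') words."""
    tokens = re.findall(r"[a-z]+", text.lower())
    if not tokens:
        return 0.0
    return sum(t in TENTATIVE for t in tokens) / len(tokens)

cautious = "Warming may be likely; the data suggests it is possible."
assertive = "Warming is false. The data proves it beyond any doubt."
assert tentativeness(cautious) > tentativeness(assertive)
```

Scoring each report this way and comparing the fractions is, in miniature, how a higher rate of tentative language in the IPCC report would show up.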

In the authors’ words (I know, I am citing them 3 times in 4 paragraphs, but I really enjoyed their eloquence),

“the IPCC authors used more conservative (i.e., more cautious, less explicit) language to present their claims compared to the authors of the NIPCC report […]. The language style used by climate change skeptics suggests that the arguments put forth by these groups warrant skepticism in that they are relatively less focused upon the propagation of evidence and more intent on discrediting the opposing perspective”.

And this comes just from text analysis…

Reference: Medimorec, S. & Pennycook, G. (Epub 30 August 2015). The language of denial: text analysis reveals differences in language use between climate change proponents and skeptics. Climatic Change, doi:10.1007/s10584-015-1475-2. Article | Research Gate full text PDF

By Neuronicus, 4 November 2015

TMS decreases religiosity and ethnocentrism

Medieval knight dressed in an outfit with the Cross of St. James of Compostela. Image from Galicianflag.

Rituals are anxiolytic; we developed them because they decrease anxiety. So it makes sense that when we feel the most stressed we turn to soothing ritualistic behaviors. Likewise, in times of threat, be it anywhere from war to financial depression, people show a sharp increase in adherence to political or religious ideologies.

Holbrook et al. (2015) used TMS (transcranial magnetic stimulation) to locally downregulate the activity of the posterior medial frontal cortex (which includes the dorsal anterior cingulate cortex and the dorsomedial prefrontal cortex), a portion of the brain the authors have reason to believe is involved in augmenting adherence to ideological convictions in times of threat.

They selected 38 U.S. undergraduates who scored similarly on political views (moderate or extremely conservative; extreme liberals were excluded). Curiously, they did not measure religiosity prior to testing. Then they submitted the subjects to a group-prejudice test designed to increase ethnocentrism (reading a critique of the USA written by an immigrant) and a high-level conflict designed to increase religiosity (a reminder of death), while half of them received TMS and the other half received sham stimulation.

Under these conditions, the TMS decreased the belief in God and also the negative evaluations of the critical immigrant, compared to the people that received sham TMS.

The paper is, without doubt, interesting, despite the many possible methodological confounds. The authors themselves acknowledged some of the drawbacks in the discussion section, so regard the article as a pilot investigation. It doesn’t even have a picture with the TMS coordinates. Nevertheless, reducing someone’s religiosity and extremism by inactivating a portion of the brain… Sometimes I get afraid of my discipline.

Reference: Holbrook C, Izuma K, Deblieck C, Fessler DM, & Iacoboni M (Epub 4 Sep 2015). Neuromodulation of group prejudice and religious belief. Social Cognitive and Affective Neuroscience. DOI: 10.1093/scan/nsv107. Article | Research Gate full text PDF

By Neuronicus, 3 November 2015