Cognitive-stimulation experiment suggests new tools for healthy brain aging

The brain’s default mode network (DMN) was the focus of the cognitive-stimulation experiment. Top: fMRI scans showing DMN regions; bottom: a diagram of typical connectivity between these regions. (credit: John Graner/Walter Reed National Military Medical Center, Andreas Horn et al./NeuroImage, Abigail G. Garrity/Am. J. Psych.)

Neuroscientists in Italy and the U.K. have developed cognitive-stimulation exercises and tested them in a month-long experiment with healthy aging adults. The exercises were based on studies of the brain’s resting state, known as the “default mode network”* (DMN).

In a paper published in Brain Research Bulletin, the researchers explain that in aging (and at a pathological level in AD patients), the posterior (back) region of the DMN is underactive while the anterior (front) region is overactive. In addition, the two regions are not well connected in aging. So, as a proof of concept, the researchers designed an experiment with screened, mentally healthy adults over age 50, aiming to improve functional connectivity within the posterior region and between the anterior and posterior regions.

Intensive mental exercises

The intensive cognitive exercises were conducted over a period of up to 42 days using E-Prime 2.0 software from Psychology Software Tools. The computer-based exercises were chosen to be in the domains of “semantic processing, memory retrieval, logical reasoning, and executive processing,” the neuroscientists said, with “simultaneous activity in widespread neocortical, and mediotemporal and limbic areas” (the posterior component of the DMN). “Further exercises were then added to foster functional connectivity between anterior and posterior regions.”

Sample trials from the “sequence completion” (left) and “sentence completion” (right) tasks in the exercises. More difficult sequence-completion trials were characterized by abstract or indirect inter-image relations (e.g., “bear” and “bee” on the left, “cat” in the upper central position, “1: milk” and “2: cow” as the alternative answers). (credit: Matteo De Marco et al./Brain Research Bulletin and Psychology Software Tools)

An MRI protocol and a battery of neuropsychological tests were administered at baseline and at the end of the study.

The exercises were followed by fMRI exams. “Significant associations were found between task performance and gray-matter volume of multiple DMN core regions,” the authors note. Functional regulation of resting-state connectivity was found within the posterior component of the DMN, but no change in connectivity between the posterior and the anterior components. “These findings suggest that the program devised may have a preventive and therapeutic role in association with early AD-type neurodegeneration,” they conclude.
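The connectivity measure referred to here (and in the abstract below) is seed-based resting-state functional connectivity: the time course of a seed region, in this study a posterior-cingulate seed, is correlated with the time course of every other region. The Python sketch below is purely illustrative, assumes preprocessed and region-averaged BOLD time series, and is not the authors' analysis pipeline.

import numpy as np

def seed_connectivity(timeseries, seed_index):
    """Correlate a seed region's BOLD signal with every other region's signal.

    timeseries: 2-D array of shape (n_timepoints, n_regions), assumed to be
    already preprocessed (motion-corrected, filtered, region-averaged).
    Returns one Pearson correlation per region (the seed correlates 1.0 with itself).
    """
    z = (timeseries - timeseries.mean(axis=0)) / timeseries.std(axis=0)
    seed = z[:, seed_index]
    return z.T @ seed / len(seed)

# Toy example: 200 time points, 10 regions, seed = region 0 (e.g., posterior cingulate)
rng = np.random.default_rng(0)
bold = rng.standard_normal((200, 10))
print(seed_connectivity(bold, seed_index=0))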

The researchers are affiliated with IRCCS Fondazione Ospedale and University of Modena in Italy and University of Sheffield in the U.K.

* “The default mode network is most commonly shown to be active when a person is not focused on the outside world and the brain is at wakeful rest, such as during daydreaming and mind-wandering, but it is also active when the individual is thinking about others, thinking about themselves, remembering the past, and planning for the future. The network activates “by default” when a person is not involved in a task. … The DMN can also be defined by the areas deactivated during external directed tasks compared to rest.” — Wikipedia 1/9/2016 


Abstract of Cognitive stimulation of the default-mode network modulates functional connectivity in healthy aging

A cognitive-stimulation tool was created to regulate functional connectivity within the brain Default-Mode Network (DMN). Computerized exercises were designed based on the hypothesis that repeated task-dependent coactivation of multiple DMN regions would translate into regulation of resting-state network connectivity.

Forty seniors (mean age: 65.90 years; SD: 8.53) were recruited and assigned either to an experimental group (n = 21) who received one month of intensive cognitive stimulation, or to a control group (n = 19) who maintained a regime of daily-life activities explicitly focused on social interactions. An MRI protocol and a battery of neuropsychological tests were administered at baseline and at the end of the study. Changes in the DMN (measured via functional connectivity of posterior-cingulate seeds), in brain volumes, and in cognitive performance were measured with mixed models assessing group-by-timepoint interactions. Moreover, regression models were run to test gray-matter correlates of the various stimulation tasks.

Significant associations were found between task performance and gray-matter volume of multiple DMN core regions. Training-dependent up-regulation of functional connectivity was found in the posterior DMN component. This interaction was driven by a pattern of increased connectivity in the training group, while little or no up-regulation was seen in the control group. Minimal changes in brain volumes were found, but there was no change in cognitive performance.

The training-dependent regulation of functional connectivity within the posterior DMN component suggests that this stimulation program might exert a beneficial impact in the prevention and treatment of early AD neurodegeneration, in which this neurofunctional pathway is progressively affected by the disease.

Real-time 3-D video of nematode brains links neurons with motion and behavior

Princeton University researchers have captured some of the first near-whole-brain recordings of 3-D neural activity of a free-moving animal, and at single-neuron resolution. They studied the nematode Caenorhabditis elegans, a worm species 1 millimeter long with a nervous system containing just 302 neurons.

The three-dimensional recordings could provide scientists with a better understanding of how neurons coordinate action and perception in animals.

As the researchers report in the journal Proceedings of the National Academy of Sciences, their technique allowed them to record the activity of 77 neurons from the animal’s nervous system, focusing on specific behaviors such as backward or forward motion and turning.


Andrew Leifer/Lewis-Sigler Institute for Integrative Genomics | This video — displayed in quarter-time — shows the four simultaneous video feeds the Princeton researchers used to capture the nematodes’ neural activity. Upper left: the position of the nuclei in all the neurons in an animal’s brain. Upper right: recorded neural activity, indicated by a fluorescent calcium indicator. Lower left: the animal’s posture on the microscope plate, which automatically adjusted to keep the animal within the cameras’ view. Lower right: a low-magnification fluorescent image of a nematode brain, which contains 302 neurons.

Most previous research on brain activity has focused on small subregions of the brain or is based on observations of organisms that are unconscious or somehow limited in mobility, explained corresponding author Andrew Leifer, an associate research scholar in Princeton’s Lewis-Sigler Institute for Integrative Genomics.

“This system is exciting because it provides the most detailed picture yet of brain-wide neural activity with single-neuron resolution in the brain of an animal that is free to move around,” Leifer said. “Neuroscience is at the beginning of a transition towards larger-scale recordings of neural activity and towards studying animals under more natural conditions,” he said. “This work helps push the field forward on both fronts.”

A current focus in neuroscience is understanding how networks of neurons coordinate to produce behavior. “The technology to record from numerous neurons as an animal goes about its normal activities, however, has been slow to develop,” Leifer said.


Andrew Leifer, Lewis-Sigler Institute for Integrative Genomics | Nematode neural nuclei in 3-D, showing the location of brain-cell nuclei in a nematode’s head.

The simpler nervous system of C. elegans provided the researchers with a more manageable testing ground, but could also reveal information about how neurons work together, which applies to more complex organisms, Leifer said. For instance, the researchers were surprised by the number of neurons involved in the seemingly simple act of turning around.

“One reason we were successful was that we chose to work with a very simple organism,” Leifer said. “It would be immensely more difficult to perform whole-brain recordings in humans. The technology needed to perform similar recordings in humans is many years away. By studying how the brain works in a simple animal like the worm, however, we hope to gain insights into how collections of neurons work that are universal for all brains, even humans.”

The researchers designed an instrument that captures calcium levels in brain cells as they communicate with one another. The level of calcium in each brain cell tells the researchers how active that cell is in its communication with other cells in the nervous system. They induced the nematodes’ brain cells to generate a protein known as a “calcium indicator” that becomes fluorescent when it comes in contact with calcium.

The researchers used a special type of microscope to record the nematodes’ free movements and also neuron-level calcium activity for more than four minutes and in 3-D. Special software the researchers designed monitored the position of an animal’s head in real time as a motorized platform automatically adjusted to keep the animal within the field of view of a series of cameras.
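The instrument described in the paper uses custom real-time tracking software and two feedback loops that drive a motorized stage and objective (see the abstract below). As a rough, hypothetical sketch of the core idea only, a single proportional feedback loop could locate the head in each fluorescence frame and nudge the stage so the head stays centered; the gain and pixel-to-micrometer scale below are made-up values, not the instrument's parameters.

import numpy as np

def track_step(frame, stage_xy, gain=0.5, px_to_um=2.0):
    """One iteration of a simple closed-loop tracker (illustrative only).

    frame: 2-D fluorescence image in which the head is the brightest blob.
    stage_xy: current motorized-stage position (x, y) in micrometers.
    Returns an updated stage position that moves the head toward the image center.
    """
    ys, xs = np.indices(frame.shape)
    total = frame.sum()
    # Intensity-weighted centroid of the frame, in pixel coordinates (x, y).
    centroid = np.array([(xs * frame).sum() / total, (ys * frame).sum() / total])
    # Error = how far the centroid sits from the center of the field of view.
    center = np.array([frame.shape[1] / 2.0, frame.shape[0] / 2.0])
    error_px = centroid - center
    # Proportional correction: cancel a fraction of the error each frame.
    return stage_xy - gain * px_to_um * error_px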


Andrew Leifer, Lewis-Sigler Institute for Integrative Genomics | A visualization of neural activity in the nematode brain. Upper-left: Each colored sphere represents a neuron, and its location in the drawing shows the position of that neuron in the worm’s head. Upper-right: The size and color of a sphere indicates the level of neural activity (purple spheres: the least amount of activity; large yellow spheres: most significant). By watching neurons that grow and shrink, the viewer can get an impression of the range of neural activity in the worm. Lower left and right panels: The worm’s movement in real time and the worm’s location plotted on a graph.

Leifer said these recordings are very large and the researchers have only begun the process of carefully mining all of the data.

“An exciting next step is to use correlations in our recordings to build mathematical and computer models of how the brain functions,” he said. “We can use these models to generate hypotheses about how neural activity generates behavior. We plan to then test these hypotheses, for example, by stimulating specific neurons in an organism and observing the resulting behavior.”


Abstract of Whole-brain calcium imaging with cellular resolution in freely behaving Caenorhabditis elegans

The ability to acquire large-scale recordings of neuronal activity in awake and unrestrained animals is needed to provide new insights into how populations of neurons generate animal behavior. We present an instrument capable of recording intracellular calcium transients from the majority of neurons in the head of a freely behaving Caenorhabditis elegans with cellular resolution while simultaneously recording the animal’s position, posture, and locomotion. This instrument provides whole-brain imaging with cellular resolution in an unrestrained and behaving animal. We use spinning-disk confocal microscopy to capture 3D volumetric fluorescent images of neurons expressing the calcium indicator GCaMP6s at 6 head-volumes/s. A suite of three cameras monitor neuronal fluorescence and the animal’s position and orientation. Custom software tracks the 3D position of the animal’s head in real time and two feedback loops adjust a motorized stage and objective to keep the animal’s head within the field of view as the animal roams freely. We observe calcium transients from up to 77 neurons for over 4 min and correlate this activity with the animal’s behavior. We characterize noise in the system due to animal motion and show that, across worms, multiple neurons show significant correlations with modes of behavior corresponding to forward, backward, and turning locomotion.

How brain architecture relates to consciousness and abstract thought


Human brain connectome (credit: NIH Human Connectome Project)

Ever wonder how your brain creates your thoughts, based on everything that’s happening around you (and within you), and where these thoughts are actually located in the brain?

UMass Amherst computational neuroscientist Hava Siegelmann has, and she created a geometry-based method for doing just that. Her team did a massive data analysis of 20 years of functional magnetic resonance imaging (fMRI) data from tens of thousands of brain imaging experiments. The goal was to understand how abstract thought arises from brain structure, which could lead to better ways to identify and treat brain disease and even to new deep-learning artificial intelligence (AI) systems.

Details appear in an open-access article in the current issue of Nature Scientific Reports.

How abstract thoughts are formed

KurzweilAI has covered more than 200 research projects involving fMRI. Basically, fMRI detects changes in neural blood flow, which relates to specific brain activities (such as imagining what an object looks like, or talking). More blood flow means higher levels of neural activity in that specific brain region. While fMRI-based research has done an impressive job of relating specific brain areas with activities, surprisingly, “no one had ever tied together the tens of thousands of experiments performed over decades to show how the physical brain could give rise to abstract thought,” Siegelmann notes.

For this study, the researchers took a data-science approach. First, they defined a physiological directed network (a form of a graph with nodes and links) of the whole brain, starting at input areas and labeling each brain area with the distance (or “depth”) from sensory inputs. For example, in the drawing below, the visual cortex (in green) is located far away from the eyes (on the left) while the auditory cortex (in yellow) is relatively close to the ears (although routing via the thalamus makes this more complex).

OK, so what does that mean in terms of thinking? To find out, they processed a massive repository of fMRI data from about 17,000 experiments, representing about one-fourth of the fMRI literature.

Regions of motor and sensory cortex (credit: Blausen.com staff/Blausen gallery 2014/Wikiversity)

“The idea was to project the active regions for a cognitive behavior onto the network depth and describe that cognitive behavior in terms of its depth distribution,” says Siegelmann. “We momentarily thought our research failed when we saw that each cognitive behavior showed activity through many network depths. Then we realized that cognition is far richer; it wasn’t the simple hierarchy that everyone was looking for. So, we developed our geometrical ‘slope’ algorithm.”

Ranking cognitive behaviors

The researchers summed all neural activity for a given behavior over all related fMRI experiments, then analyzed it using the slope algorithm. “With a slope identifier, behaviors could now be ordered by their relative depth activity, with no human intervention or bias,” she adds. They ranked slopes for all cognitive behaviors from the fMRI databases from negative to positive and found that they ordered from more tangible to highly abstract. An independent test of an additional 500 study participants supported the result.
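To make the idea concrete: each brain region carries a "network depth" (its graph distance from the sensory inputs), activation for a behavior is summed across all related experiments, and the slope of activation over depth orders the behaviors. The Python sketch below is a minimal illustration with made-up toy numbers; the published algorithm involves more steps than this.

import numpy as np

def behavior_slope(depths, activation):
    """Least-squares slope of summed activation vs. network depth.

    depths: per-region distance from sensory inputs.
    activation: per-region activation for one cognitive behavior, summed
    over all fMRI experiments tagged with that behavior.
    A more positive slope means activity is weighted toward deeper
    (more abstract) regions; more negative means shallower (more concrete).
    """
    slope, _intercept = np.polyfit(depths, activation, deg=1)
    return slope

# Toy data: 6 regions at depths 1..6
depths = np.arange(1, 7, dtype=float)
finger_tapping = np.array([9, 7, 4, 3, 2, 1], dtype=float)  # shallow-weighted
naming = np.array([1, 2, 3, 5, 7, 9], dtype=float)          # deep-weighted

for name, act in [("finger tapping", finger_tapping), ("naming", naming)]:
    print(name, round(behavior_slope(depths, act), 2))
# Behaviors are then ordered by slope, from most concrete to most abstract.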

She and colleagues found that cognitive function and abstract thought exist as a combination of many cortical sources ranging from those close to sensory cortices to far deeper from them along the brain connectome, or connection wiring diagram.

Generated by human-blind automated procedures, this diagram depicts an oversimplified graphical model of the information-representation flow from sensory inputs (bottom) to abstract representations (top) in the human cortex. The bottom layer of the pyramid includes a sample representative description of the 20th percentile of behavioral elements closest to sensory inputs; the next layer up includes a sample description of behavioral elements from the 20–40th percentile; and so on, with the top layer containing a sample description of the behavioral elements distributed deepest in the cortical network, at the structural pinnacle of cognition. (credit: P. Taylor et al./Nature Scientific Reports)

The authors say their work demonstrates that all cognitive behaviors exist on a hierarchy, starting with the most tangible behaviors (such as finger tapping or pain), then to consciousness, and extending to the most abstract thoughts and activities such as naming. This hierarchy of abstraction is related to the connectome structure of the whole human brain — the connections between different regions of the brain — they add.

Creating a massively recurrent deep learning network

Siegelmann says this work will have great impact in computer science, especially in deep learning. “Deep learning is a computational system employing a multi-layered neural net, and is at the forefront of artificial intelligence (AI) learning algorithms,” she explains. “It bears similarity to the human brain in that higher layers are agglomerations of previous layers, and so provides more information in a single neuron.

“But the brain’s processing dynamic is far richer and less constrained because it has recurrent interconnection, sometimes called feedback loops. In current human-made deep learning networks that lack recurrent interconnections, a particular input cannot be related to other recent inputs, so they can’t be used for time-series prediction, control operations, or memory.”
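To illustrate the feedforward/recurrent distinction (a toy sketch, not Siegelmann's network): a feedforward layer's output depends only on the current input, while a recurrent layer also feeds its previous state back in, so the current input can be related to recent inputs. The dimensions and weights below are arbitrary.

import numpy as np

def feedforward_step(x_t, W_in):
    """Output depends only on the current input x_t."""
    return np.tanh(W_in @ x_t)

def recurrent_step(x_t, h_prev, W_in, W_rec):
    """Output depends on the current input AND the previous hidden state,
    i.e., a feedback loop that carries information across time steps."""
    return np.tanh(W_in @ x_t + W_rec @ h_prev)

# Arbitrary sizes: 4 inputs, 8 hidden units, a 5-step input sequence
rng = np.random.default_rng(1)
W_in = 0.5 * rng.standard_normal((8, 4))
W_rec = 0.5 * rng.standard_normal((8, 8))
h = np.zeros(8)
for x_t in rng.standard_normal((5, 4)):
    h = recurrent_step(x_t, h, W_in, W_rec)  # h now reflects the whole sequence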

Her lab is now creating a “massively recurrent deep learning network,” she says, for a more brain-like and superior learning AI, along with a new geometric data-science tool, which may find widespread use in other fields where massive data is difficult to view coherently due to data overlap.

New hope for biomarkers of brain disorders

Siegelmann believes this work will also have far-reaching effects for brain disorders. “Many brain disorders are implicated by non-standard processing or abnormal combination of sensory information,” she says. “Currently, many brain disorders lack a clear biological identifier, and are diagnosed by symptoms, such as confusion, memory loss and depression.

“Our research suggests an entirely new method for analyzing brain abnormalities and is a source of new hope for developing biomarkers for more accurate and earlier diagnoses of psychiatric and neurological diseases.”

Siegelmann is director of the Biologically Inspired Neural and Dynamical Systems Laboratory at UMass Amherst and one of 16 recipients in 2015 of the National Science Foundation’s (NSF) Brain Research through Advancing Innovative Neurotechnologies (BRAIN) program initiated by President Obama to advance understanding of the brain. The work is supported by the U.S. Office of Naval Research.


Abstract of The global landscape of cognition: hierarchical aggregation as an organizational principle of human cortical networks and functions

Though widely hypothesized, limited evidence exists that human brain functions organize in global gradients of abstraction starting from sensory cortical inputs. Hierarchical representation is accepted in computational networks, and tentatively in visual neuroscience, yet no direct holistic demonstrations exist in vivo. Our methods developed network models enriched with tiered directionality, by including input locations, a critical feature for localizing representation in networks generally. Grouped primary sensory cortices defined network inputs, displaying global connectivity to fused inputs. Depth-oriented networks guided analyses of fMRI databases (~17,000 experiments; ~1/4 of fMRI literature). Formally, we tested whether network depth predicted localization of abstract versus concrete behaviors over the whole set of studied brain regions. For our results, new cortical graph metrics, termed network-depth, ranked all databased cognitive function activations by network-depth. Thus, we objectively sorted stratified landscapes of cognition, starting from grouped sensory inputs in parallel, progressing deeper into cortex. This exposed escalating amalgamation of function or abstraction with increasing network-depth, globally. Nearly 500 new participants confirmed our results. In conclusion, data-driven analyses defined a hierarchically ordered connectome, revealing a related continuum of cognitive function. Progressive functional abstraction over network depth may be a fundamental feature of brains, and is observed in artificial networks.

A ‘garbage disposal’ drug may slow Alzheimer’s and other neurodegenerative diseases

Rolipram drug activates the brain’s garbage disposal system, eliminating excess tau proteins (glowing red dots) associated with neurodegenerative diseases such as Alzheimer’s. (credit: Laboratory of Karen Duff/Columbia University Medical Center)

Rolipram, a drug that boosts activity in the brain’s “garbage disposal” system, can decrease levels of toxic proteins associated with Alzheimer’s disease and other neurodegenerative disorders and improve cognition in mice, a new study by neuroscientists has found.

Rolipram causes nausea, but similar drugs do not, and could be tested in clinical trials quickly, the researchers say.

“This has the potential to open up new avenues of treatment for Alzheimer’s and many other neurodegenerative diseases,” said study leader Karen E. Duff, PhD, professor of pathology and cell biology at Columbia University Medical Center (CUMC) and New York State Psychiatric Institute (NYSPI).

A “garbage-disposal” switch

To remain healthy, brain cells must continually clear out old, worn, or damaged proteins. This task is performed by a large protein complex called the proteasome, which works like a kitchen garbage-disposal system, grinding up the old proteins so they can be recycled into new ones. However, in neurodegenerative diseases such as Alzheimer’s, Parkinson’s, and Huntington’s, proteins tagged for destruction accumulate in the brain’s neurons. This suggests that the cells’ proteasomes are impaired.

The researchers first discovered the cause, using a genetically engineered mouse: tau, a protein that accumulates in Alzheimer’s and other brain diseases, sticks to the proteasome and jams up the protein garbage-disposal process.

In the new research, administering rolipram activated the proteasome and restored protein disposal. The drug also improved memory in diseased mice to levels seen in healthy mice.

Rolipram has been tested before in mice, and was shown to improve memory. But the new research shows a previously unknown function of the drug: it produces a physical change in the proteasome and increases its activity.

Should ‘clear out everything at once’

Duff says we still don’t know exactly which form of a particular protein is toxic to the brain, which has made it difficult to develop drugs to treat neurodegenerative diseases. “In Alzheimer’s disease, the problem is compounded because several types of abnormal protein can accumulate in a person’s brain, including amyloid, tau, alpha-synuclein, and TDP43.

However, the researchers think that “a well-functioning proteasome will be able to clear out everything at once,” she says, including the abnormal proteins implicated in Alzheimer’s, frontotemporal degeneration, Huntington’s, and Parkinson’s.

The study was published Tuesday (Dec. 22) in the online edition of Nature Medicine. The National Institutes of Health’s National Institute of Neurological Disorders and Stroke provided funding for the study.


Abstract of Tau-driven 26S proteasome impairment and cognitive dysfunction can be prevented early in disease by activating cAMP-PKA signaling

The ubiquitin proteasome system (UPS) degrades misfolded proteins including those implicated in neurodegenerative diseases. We investigated the effects of tau accumulation on proteasome function in a mouse model of tauopathy and in a cross to a UPS reporter mouse (line Ub-G76V-GFP). Accumulation of insoluble tau was associated with a decrease in the peptidase activity of brain 26S proteasomes, higher levels of ubiquitinated proteins and undegraded Ub-G76V-GFP. 26S proteasomes from mice with tauopathy were physically associated with tau and were less active in hydrolyzing ubiquitinated proteins, small peptides and ATP. 26S proteasomes from normal mice incubated with recombinant oligomers or fibrils also showed lower hydrolyzing capacity in the same assays, implicating tau as a proteotoxin. Administration of an agent that activates cAMP–protein kinase A (PKA) signaling led to attenuation of proteasome dysfunction, probably through proteasome subunit phosphorylation. In vivo, this led to lower levels of aggregated tau and improvements in cognitive performance.

Genetic ‘intelligence networks’ discovered in the brain

Color-coded heatmap of gradient of expression of the M1 gene network, spanning fetal development to late adulthood and expressed in distinct cortical regions (listed on right, such as primary somatosensory cortex, S1C). Most of the genes in this network express in cortical regions (indicated by red), except for the V1C (primary visual cortex), STR (striatum), CBC (cerebellar cortex), and MD (mediodorsal nucleus of thalamus) brain areas. (credit: Michael R. Johnson et al./Nature Neuroscience)

Scientists from Imperial College London have identified two clusters (“gene networks”) of genes that are linked to human intelligence. Called M1 and M3, these gene networks appear to influence cognitive function, which includes memory, attention, processing speed and reasoning.

Importantly, the scientists have discovered that these two networks are likely to be under the control of master regulator switches. The researchers want to identify those switches and see if they can manipulate them, and ultimately find out if this knowledge of gene networks could allow for boosting cognitive function.

“We know that genetics plays a major role in intelligence but until now, haven’t known which genes are relevant,” said Michael Johnson, lead author of the study from the Imperial College London Department of Medicine. Johnson says the genes they have found so far are likely to share a common regulation, which means it may be possible to manipulate a whole set of genes linked to human intelligence.

Combining data from brain samples, genomic information, and IQ tests

In the study, published in the journal Nature Neuroscience, the international team of researchers looked at samples of human brain from patients who had undergone neurosurgery for epilepsy. The investigators analyzed thousands of genes expressed in the human brain, and then combined these results with genetic information from healthy people who had undergone IQ tests and from people with neurological disorders such as autism spectrum disorder and intellectual disability.

Then they conducted various computational analyses and comparisons to identify the gene networks influencing healthy human cognitive abilities. Remarkably, they found that some of the same genes that influence human intelligence in healthy people cause impaired cognitive ability and epilepsy when mutated. They also found that the genes underlying the ability to form new memories, or to make sensible decisions when faced with lots of complex information, overlap with those that cause severe childhood-onset epilepsy or intellectual disability.

“This study shows how we can use large genomic datasets to uncover new pathways for human brain function in both health and disease,” Johnson said. “Eventually, we hope that this sort of analysis will provide new insights into better treatments for neurodevelopmental diseases such as epilepsy, and ameliorate or treat the cognitive impairments associated with these devastating diseases.”


Abstract of Systems genetics identifies a convergent gene network for cognition and neurodevelopmental disease

Genetic determinants of cognition are poorly characterized, and their relationship to genes that confer risk for neurodevelopmental disease is unclear. Here we performed a systems-level analysis of genome-wide gene expression data to infer gene-regulatory networks conserved across species and brain regions. Two of these networks, M1 and M3, showed replicable enrichment for common genetic variants underlying healthy human cognitive abilities, including memory. Using exome sequence data from 6,871 trios, we found that M3 genes were also enriched for mutations ascertained from patients with neurodevelopmental disease generally, and intellectual disability and epileptic encephalopathy in particular. M3 consists of 150 genes whose expression is tightly developmentally regulated, but which are collectively poorly annotated for known functional pathways. These results illustrate how systems-level analyses can reveal previously unappreciated relationships between neurodevelopmental disease–associated genes in the developed human brain, and provide empirical support for a convergent gene-regulatory network influencing cognition and neurodevelopmental disease.

Pulsed laser light turns whole-brain activity on and off

Optogenetic laser light stimulation of the thalamus (credit: Jia Liu et al./eLife)

By flashing high-frequency (40 to 100 pulses per second) optogenetic lasers at the brain’s thalamus, scientists were able to wake up sleeping rats and cause widespread brain activity. In contrast, flashing the laser at 10 pulses per second suppressed the activity of the brain’s sensory cortex and caused rats to enter a seizure-like state of unconsciousness.

“We hope to use this knowledge to develop better treatments for brain injuries and other neurological disorders,” said Jin Hyung Lee, Ph.D., assistant professor of neurology, neurosurgery, and bioengineering at Stanford University, and a senior author of the study, published in the open-access journal eLife.

Located deep inside the brain, the thalamus regulates arousal, acting as a relay station to the cortex for neural signals from the body. Damage to neurons in the central part of the thalamus may lead to problems with sleep, attention, and memory.*

Combining light stimulation and fMRI measurements

The observations used a combination of optogenetics and whole-brain functional MRI (fMRI) — known as “ofMRI” — to detect overall effects on the brain, along with EEG and single-unit cell recordings. The researchers noted in the paper that “using targeted, temporally precise optogenetic stimulation in the current study allowed us to selectively excite a single group of neuronal elements and identify their specific role in creating distinct modes of network function.” That could not be achieved with conventional electrode stimulation, the researchers say.

They explain that this method may allow for deep-brain stimulation (DBS) therapeutic methods to be optimized in the clinic “for a wide range of neurological disorders that currently lack such treatment.”

“This study takes a big step towards understanding the brain circuitry that controls sleep and arousal,” said Yejun (Janet) He, Ph.D., program director at NIH’s National Institute of Neurological Disorders and Stroke (NINDS), which partially funded the study.

* Further experiments suggested the different effects may be due to a unique firing pattern by inhibitory neurons in a neighboring brain region, the zona incerta, during low frequency stimulation. Cells in this brain region have been shown to send inhibitory signals to cells in the sensory cortex. Electrical recordings showed that during low frequency stimulation of the central thalamus, zona incerta neurons fired in a spindle pattern that often occurs during sleep. In contrast, sleep spindles did not occur during high frequency stimulation. Moreover, when the scientists blocked the firing of the zona incerta neurons during low frequency stimulation of the central thalamus, the average activity of sensory cortex cells increased.


Abstract of Frequency-selective control of cortical and subcortical networks by central thalamus

Central thalamus plays a critical role in forebrain arousal and organized behavior. However, network-level mechanisms that link its activity to brain state remain enigmatic. Here, we combined optogenetics, fMRI, electrophysiology, and video-EEG monitoring to characterize the central thalamus-driven global brain networks responsible for switching brain state. 40 and 100 Hz stimulations of central thalamus caused widespread activation of forebrain, including frontal cortex, sensorimotor cortex, and striatum, and transitioned the brain to a state of arousal in asleep rats. In contrast, 10 Hz stimulation evoked significantly less activation of forebrain, inhibition of sensory cortex, and behavioral arrest. To investigate possible mechanisms underlying the frequency-dependent cortical inhibition, we performed recordings in zona incerta, where 10, but not 40, Hz stimulation evoked spindle-like oscillations. Importantly, suppressing incertal activity during 10 Hz central thalamus stimulation reduced the evoked cortical inhibition. These findings identify key brain-wide dynamics underlying central thalamus arousal regulation.

Stanford researcher scans his own brain for a year and a half — the most studied in the world


Human connectome (Credit: NIH Human Connectome Project)

You’ve probably seen the “connectome” map of the major networks between different functional areas of the human brain. Cool graphic. But this is just an average.

It raises a lot of questions: How does this map relate to your brain? Do these connections persist over a period of months or more? Do they vary with different conditions (a happy or sad mood, for example)? And if you’re a schizophrenic, an alcoholic, a meditator, or a videogamer, how does your connectome look?

These questions obsessed Stanford psychologist Russell Poldrack, leading to his “MyConnectome project.” In the noble DIY tradition of Marie Curie, Jonas Salk, and Albert Hofmann, he started off his day by climbing into an MRI machine and scanning his brain for 10 minutes every Tuesday and Thursday for a year and a half — making his brain the most studied in the world.

Poldrack’s morning fMRI scan (credit: Russell Poldrack)

He also fasted and drew blood on Tuesdays for testing with metabolomics (chemical fingerprints in biological fluids) and genomics (gene tests, performed by 23andMe).

The results — the most complete study of the brain’s network connections over time — are published in an open-access article in Nature Communications.

An overview of the resting-state fMRI analysis pipeline (credit: Russell A. Poldrack et al./Nature Communications)

Here is some of what he found out:

  • His connectivity was surprisingly consistent, which is good news for researchers studying differences between healthy brains and those of patients with disorders that may involve disrupted connectivity, such as schizophrenia or bipolar disorder.
  • There was a strong correlation between brain activity and changes in the expression of many different families of genes. The expression of genes related to inflammation and immune response matched Poldrack’s psoriasis flare-ups, for example.
  • Fasting with no caffeine on Tuesdays radically changed the connection between the somatosensory motor network and the higher vision network: it grew significantly tighter without caffeine. “That was totally unexpected, but it shows that being caffeinated radically changes the connectivity of your brain,” Poldrack said. “We don’t really know if it’s better or worse, but it’s interesting that these are relatively low-level areas. It may well be that I’m more fatigued on those days, and that drives the brain into this state that’s focused on integrating those basic processes more.”

Network connections for Tuesdays (fasted) and Thursdays (fed/caffeinated). Hubs are shown as larger nodes, with provincial hubs depicted as circles and connector hubs depicted as triangles. Network module membership is coded by node color; major networks are shaded, including somatomotor (red), second visual (blue), cingulo-opercular (purple), fronto-parietal (yellow) and default mode (black). (credit: Russell A. Poldrack et al./Nature Communications)

What’s next

“I’m generally a pretty happy and even-keeled person,” Poldrack said. “My positive mood is almost always high, and my negative mood is almost always non-existent. It would be interesting to scan people with a wider emotional variation and see how their connections look over time.” As he suggests in the video (below), “We need to learn a lot more about how individual brains differ from one another. … There are many more questions yet to be answered. … When it comes to understanding the brain, we’ve really just scratched the surface.”

Fortunately, Poldrack and his colleagues have made the entire data set, along with ready-built tools to analyze it, openly available. The data set is large and deep; Poldrack said he hopes people will approach it from innovative angles and uncover connections that will help advance the research. Meanwhile, Poldrack plans to hone software to elucidate the interplay between brain function and gene expression.

But so far we only have an experimental population of one. Any volunteers (and funders) for a follow-up study?

* In any action that a person undertakes, many different regions of the brain communicate with each other, serving as a sort of check-and-balance system to make sure that the correct actions are taken to deal with the situation at hand. These messages are communicated over more than a dozen networks, sets of functional areas of the brain that preferentially talk to one another.

There are multiple networks for vision, a somatosensory/motor network, and there are others that are attributed to attention or task management. Collectively, these are known as the connectome. Because the strength or efficiency of these individual networks can affect behavior, they have become of greater interest to researchers in recent years. To isolate these connections, researchers examine functional MRI data collected while the patient is at rest.
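One common way to isolate such networks from resting-state fMRI is to correlate every pair of region-averaged time series and keep only the strongest links; groups of densely inter-correlated regions then approximate the functional networks described above. The Python sketch below is a generic illustration with an arbitrary threshold, not the pipeline used in the Poldrack study.

import numpy as np

def resting_state_network(timeseries, threshold=0.3):
    """Build a simple connectivity graph from resting-state data (illustrative).

    timeseries: array of shape (n_timepoints, n_regions) of region-averaged
    BOLD signals recorded at rest (preprocessing assumed to be done already).
    Returns the full correlation matrix and a thresholded adjacency matrix
    whose densely connected groups of regions approximate functional networks.
    """
    corr = np.corrcoef(timeseries.T)        # region-by-region correlations
    adjacency = np.abs(corr) > threshold    # keep only the stronger links
    np.fill_diagonal(adjacency, False)      # ignore self-connections
    return corr, adjacency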


Stanford | Stanford researcher scans his own brain for a year and a half


Abstract of Long-term neural and physiological phenotyping of a single human

Psychiatric disorders are characterized by major fluctuations in psychological function over the course of weeks and months, but the dynamic characteristics of brain function over this timescale in healthy individuals are unknown. Here, as a proof of concept to address this question, we present the MyConnectome project. An intensive phenome-wide assessment of a single human was performed over a period of 18 months, including functional and structural brain connectivity using magnetic resonance imaging, psychological function and physical health, gene expression and metabolomics. A reproducible analysis workflow is provided, along with open access to the data and an online browser for results. We demonstrate dynamic changes in brain connectivity over the timescales of days to months, and relations between brain connectivity, gene expression and metabolites. This resource can serve as a testbed to study the joint dynamics of human brain and metabolic function over time, an approach that is critical for the development of precision medicine strategies for brain disorders.

Importance of physical activity and aerobic exercise for healthy brain function

Results of exploratory whole-brain analysis. Parts (a) and (b) illustrate the results of an exploratory whole brain analysis, showing regions (red) where gray matter volume may be associated with fitness percentile or memory accuracy, respectively. Results are depicted within the group average brain. (credit: Andrew S. Whiteman et al./NeuroImage)

Young adults who have greater aerobic fitness also have greater volume of their entorhinal cortex, an area of the brain responsible for memory, Boston University School of Medicine (BUSM) researchers have found.

While aerobic fitness was not directly associated with performance on a recognition memory task, participants with a larger entorhinal cortex did perform better on that task.

The entorhinal cortex is a brain area known to show early pathology in Alzheimer’s disease, which is characterized by profound memory impairment.

The researchers recruited healthy young adults (ages 18-35 years) who underwent a treadmill test to measure aerobic capacity. During this test, the amount of oxygen and carbon dioxide in the participants’ breath as they walked or ran on a treadmill was measured.

Participants then underwent magnetic resonance imaging and performed a recognition memory task. Entorhinal and hippocampal volumes were determined using a method known as voxel-based morphometry, and regression analysis was then used to examine whether recognition memory and aerobic fitness predicted brain volumes.
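A stripped-down version of that regression step, purely for illustration: brain volume is modeled as a linear function of fitness and memory scores. The numbers below are simulated, and the study's actual analysis (voxel-based morphometry with additional covariates) is more involved.

import numpy as np

def volume_regression(volume, fitness, memory):
    """Ordinary least squares: volume ~ intercept + fitness + memory.

    volume, fitness, memory: 1-D arrays with one value per participant.
    Returns the fitted coefficients [intercept, b_fitness, b_memory].
    """
    X = np.column_stack([np.ones_like(fitness), fitness, memory])
    coefs, *_ = np.linalg.lstsq(X, volume, rcond=None)
    return coefs

# Simulated data for 33 participants (the study's sample size)
rng = np.random.default_rng(2)
fitness = rng.normal(40, 8, 33)      # e.g., VO2 max (mL/kg/min)
memory = rng.normal(0.75, 0.1, 33)   # recognition accuracy
volume = 2.0 + 0.01 * fitness + 0.5 * memory + rng.normal(0, 0.05, 33)
print(volume_regression(volume, fitness, memory))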

Effects of aerobic exercise

“Our results suggest that aerobic exercise may have a positive effect on the medial temporal lobe memory system (which includes the entorhinal cortex) in healthy young adults. This suggests that exercise training, when designed to increase aerobic fitness, might have a positive effect on the brain in healthy young adults,” explained corresponding author and principal investigator Karin Schon, PhD, BUSM assistant professor of anatomy and neurobiology.

Researchers said this work could support previous studies that suggest aerobic exercise may forestall cognitive decline in older individuals at risk of dementia, and extends the idea that exercise may be beneficial for brain health to younger adults. “This is critical given that obesity, which has recently been linked with cognitive deficits in young and middle-aged adults, and physical inactivity are on the rise in young adults,” Schon said.

These findings appear in the journal NeuroImage.


Abstract of Entorhinal volume, aerobic fitness, and recognition memory in healthy young adults: A voxel-based morphometry study

Converging evidence supports the hypothesis that effects of aerobic exercise and environmental enrichment are beneficial for cognition, in particular for hippocampus-supported learning and memory. Recent work in humans suggests that exercise training induces changes in hippocampal volume, but it is not known if aerobic exercise and fitness also impact the entorhinal cortex. In animal models, aerobic exercise increases expression of growth factors, including brain derived neurotrophic factor (BDNF). This exercise-enhanced expression of growth hormones may boost synaptic plasticity, and neuronal survival and differentiation, potentially supporting function and structure in brain areas including but not limited to the hippocampus. Here, using voxel based morphometry and a standard graded treadmill test to determine cardio-respiratory fitness (Bruce protocol; VO2 max), we examined if entorhinal and hippocampal volumes were associated with cardio-respiratory fitness in healthy young adults (N = 33). In addition, we examined if volumes were modulated by recognition memory performance and by serum BDNF, a putative marker of synaptic plasticity. Our results show a positive association between volume in right entorhinal cortex and cardio-respiratory fitness. In addition, average gray matter volume in the entorhinal cortex, bilaterally, was positively associated with memory performance. These data extend prior work on the cerebral effects of aerobic exercise and fitness to the entorhinal cortex in healthy young adults thus providing compelling evidence for a relationship between aerobic fitness and structure of the medial temporal lobe memory system.

How much TV you watch as a young adult may affect midlife cognitive function

(credit: iStock)

Watching a lot of TV and having a low physical activity level as a young adult were associated with worse cognitive function 25 years later in midlife, according to an article published online by JAMA Psychiatry.

The researchers* examined associations between 25-year patterns of television viewing and physical activity and midlife cognition in a study of 3,247 adults (ages 18 to 30), using a questionnaire to assess television viewing and physical activity during repeated visits over 25 years.

Cognitive function was evaluated at year 25 using three tests that assessed processing speed, executive function and verbal memory.

Participants with high television viewing (more than three hours per day for more than two-thirds of the visits) during 25 years were more likely to have poor cognitive performance on some of the tests. Low physical activity (measured as units based on time and intensity) during 25 years was associated with poor performance on one of the tests. The odds of poor cognitive performance were almost two times higher for adults with both high television viewing and low physical activity.

These behaviors were associated with slower processing speed and worse executive function but not with verbal memory. Participants with the least active patterns of behavior (both low physical activity and high television viewing time) were the most likely to have poor cognitive function. … Individuals with both low physical activity and high sedentary behavior may represent a critical target group, the study concludes.
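The "almost two times higher" figures reported above are adjusted odds ratios from logistic regression models (the abstract below lists the covariates). As a rough sketch of how such an adjusted odds ratio is estimated, not the study's exact model, a logistic regression can be fit and its coefficients exponentiated; the predictor columns here are placeholders.

import numpy as np
import statsmodels.api as sm

def adjusted_odds_ratios(poor_cognition, predictors):
    """Fit a logistic regression and return odds ratios (illustrative sketch).

    poor_cognition: 0/1 outcome per participant (1 = poor test performance).
    predictors: 2-D array with one column per exposure/covariate, e.g.
    high TV viewing (0/1), low physical activity (0/1), age, education, ...
    Exponentiated coefficients are the adjusted odds ratios.
    """
    X = sm.add_constant(predictors)
    fit = sm.Logit(poor_cognition, X).fit(disp=0)
    return np.exp(fit.params)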

The authors acknowledge a few limitations, including possible selection bias and that physical activity and TV viewing were self-reported. (Also, correlation does not imply causation.)

* Tina D. Hoang, M.S.P.H., of the Northern California Institute for Research and Education at the Veterans Affairs Medical Center, San Francisco, Kristine Yaffe, M.D., of the University of California, San Francisco, and coauthors.


Abstract of Effect of Early Adult Patterns of Physical Activity and Television Viewing on Midlife Cognitive Function

IMPORTANCE Sedentary behaviors and physical inactivity are not only increasing worldwide but also are critical risk factors for adverse health outcomes. Yet, few studies have examined the effects of sedentary behavior on cognition or the long-term role of either behavior in early to middle adulthood.

OBJECTIVE To investigate the association between 25-year patterns of television viewing and physical activity and midlife cognition.

DESIGN, SETTING, AND PARTICIPANTS Prospective study of 3247 adults (black and white races; aged 18-30 years) enrolled in the Coronary Artery Risk Development in Young Adults (CARDIA) Study (March 25, 1985, to August 31, 2011). Data analysis was performed June 1, 2014, through April 15, 2015.

MAIN OUTCOMES AND MEASURES We assessed television viewing and physical activity at repeated visits (3 assessments) over 25 years using a validated questionnaire. A 25-year pattern of high television viewing was defined as watching TV above the upper baseline quartile (>3 hours/d) for more than two-thirds of the visits, and a 25-year pattern of low physical activity was defined as activity levels below the lower, sex-specific baseline quartile for more than two-thirds of the visits. We evaluated cognitive function at year 25 using the Digit Symbol Substitution Test (DSST), Stroop test, and Rey Auditory Verbal Learning Test.

RESULTS At baseline, the mean (SD) age of the 3247 study participants was 25.1 (3.6) years, 1836 (56.5%) were female, 1771 (54.5%) were white, and 3015 (92.9%) had completed at least high school. Compared with participants with low television viewing, those with high television viewing during 25 years (353 of 3247 [10.9%]) were more likely to have poor cognitive performance (<1 SD below the race-specific mean) on the DSST and Stroop test, with findings reported as adjusted odds ratio (95% CI): DSST, 1.64 (1.21-2.23) and Stroop test, 1.56 (1.13-2.14), but not the Rey Auditory Verbal Learning Test, adjusted for age, race, sex, educational level, smoking, alcohol use, body mass index, and hypertension. Low physical activity during 25 years in 528 of 3247 participants (16.3%) was significantly associated with poor performance on the DSST, 1.47 (1.14-1.90). Compared with participants with low television viewing and high physical activity, the odds of poor performance were almost 2 times higher for adults with both high television viewing and low physical activity in 107 of 3247 (3.3%) (DSST, 1.95 [1.19-3.22], and Stroop test, 2.20 [1.36-3.56]).

CONCLUSIONS AND RELEVANCE High television viewing and low physical activity in early adulthood were associated with worse midlife executive function and processing speed. This is one of the first studies to demonstrate that these risk behaviors may be critical targets for prevention of cognitive aging even before middle age.

Parkinson’s disease researchers discover a way to reprogram the genome to produce dopamine neurons

Image shows a protein found only in neurons (red) and an enzyme that synthesizes dopamine (green). Cell DNA is labeled in blue. (credit: Jian Feng, University at Buffalo)

Parkinson’s disease researchers at the Jacobs School of Medicine and Biomedical Sciences at the University at Buffalo have developed a way to ramp up the conversion of skin cells into neurons that can produce dopamine.

For decades, the elusive holy grail in Parkinson’s disease research has been finding a way to repair faulty dopamine neurons and put them back into patients, where they will start producing dopamine again. Researchers have tried fetal material (difficult to obtain and of variable quality), embryonic stem cells (a long process with a low yield), and, more recently, skin cells (from which it is difficult to obtain sufficient quantities of neurons).

To control movement and balance, dopamine signals travel from the substantia nigra in the midbrain up to brain regions including the corpus striatum, the globus pallidus, and the thalamus. But in Parkinson’s disease, most of the dopamine signals from the substantia nigra are lost. (credit: NIH)

Bypassing the cellular “gatekeeper”

The new UB research, published Dec. 7 in an open-access article in Nature Communications, is based on their discovery that p53, a transcription factor protein, acts as a gatekeeper protein.

“We found that p53 tries to maintain the status quo in a cell; it guards against changes from one cell type to another,” explained Jian Feng, PhD, senior author and professor in the Department of Physiology and Biophysics in the Jacobs School of Medicine and Biomedical Sciences at UB.

That is, p53 acts as a kind of gatekeeper protein to prevent conversion into another type of cell. “Once we lowered the expression of p53, then things got interesting: We were able to reprogram the [skin cell] fibroblasts into neurons much more easily.”

The advance may also be important for basic cell biology, Feng said. “This is a generic way for us to change cells from one type to another,” he said. “It proves that we can treat the cell as a software system when we remove the barriers to change. If we can identify transcription factor combinations that control which genes are turned on and off, we can change how the genome is being read. We might be able to play with the system more quickly and we might be able to generate tissues similar to those in the body, even brain tissue.

“People like to think that things proceed in a hierarchical way, that we start from a single cell and develop into an adult with about 40 trillion cells, but our results prove that there is no hierarchy,” he continued. “All our cells have the same source code as our first cell; this code is read differently to generate all types of cells that make up the body.”

Generating new dopamine neurons via cellular conversion

Timing was key to their success.  “We found that the point in the cell cycle just before the cell tries to sense its environment to ensure that all is ready for duplicating the genome is the prime time when the cell is receptive to change,” said Feng.

By lowering the genomic gatekeeper p53 at the right point in the cell cycle, they could easily turn the skin cells into dopamine neurons, using transcription-factor combinations discovered in previous studies. These manipulations turn on the expression of Tet1, a DNA-modification enzyme that changes how the genome is read.

“Our method is faster and much more efficient than previously developed ones,” said Feng. “The best previous method could take two weeks to produce 5 percent dopamine neurons. With ours, we got 60 percent dopamine neurons in ten days.”

The researchers have done multiple experiments to prove that these neurons are functional mid-brain dopaminergic neurons, the type lost in Parkinson’s disease.

The finding may enable researchers to generate patient-specific neurons in a dish that could then be transplanted into the brain to repair the faulty neurons, or used to efficiently screen new treatments for Parkinson’s disease.


Abstract of Cell cycle and p53 gate the direct conversion of human fibroblasts to dopaminergic neurons

The direct conversion of fibroblasts to induced dopaminergic (iDA) neurons and other cell types demonstrates the plasticity of cell fate. The low efficiency of these relatively fast conversions suggests that kinetic barriers exist to safeguard cell-type identity. Here we show that suppression of p53, in conjunction with cell cycle arrest at G1 and appropriate extracellular environment, markedly increase the efficiency in the transdifferentiation of human fibroblasts to iDA neurons by Ascl1, Nurr1, Lmx1a and miR124. The conversion is dependent on Tet1, as G1 arrest, p53 knockdown or expression of the reprogramming factors induces Tet1 synergistically. Tet1 knockdown abolishes the transdifferentiation while its overexpression enhances the conversion. The iDA neurons express markers for midbrain DA neurons and have active dopaminergic transmission. Our results suggest that overcoming these kinetic barriers may enable highly efficient epigenetic reprogramming in general and will generate patient-specific midbrain DA neurons for Parkinson’s disease research and therapy.