Neuroscientists create organic-computing ‘Brainet’ network of rodent and primate brains — humans next

Experimental apparatus scheme for a Brainet computing device. A Brainet of four interconnected brains is shown. The arrows represent the flow of information through the Brainet. Inputs were delivered (red) as simultaneous intracortical microstimulation (ICMS) patterns (via implanted electrodes) to the somatosensory cortex of each rat. Neural activity (black) was then recorded and analyzed in real time. Rats were required to synchronize their neural activity with the other Brainet participants to receive water. (credit: Miguel Pais-Vieira et al./Scientific Reports)

Duke University neuroscientists have created networks of brains, called “Brainets,” that use signals from arrays of electrodes implanted in the brains of multiple rats or monkeys to merge their collective brain activity and jointly control a virtual avatar arm (monkeys) or perform sophisticated computations (rats), including image pattern recognition and even weather forecasting.
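In the rodent experiments (see the caption above), rats received a water reward only when their cortical activity became synchronized across brains. As a rough, hypothetical illustration of that idea, and not the Duke team’s actual analysis, the sketch below scores synchronization as the mean pairwise correlation of binned firing rates; the simulated data and the reward threshold are assumptions.

```python
# Hypothetical sketch (not the Duke team's code): score cross-brain synchronization
# as the mean pairwise correlation of binned firing rates across rats.
import numpy as np

def synchronization_score(binned_rates):
    """binned_rates: array of shape (n_rats, n_time_bins). Returns the mean
    pairwise Pearson correlation across all unique rat pairs."""
    corr = np.corrcoef(binned_rates)                        # n_rats x n_rats matrix
    pairs = corr[np.triu_indices(binned_rates.shape[0], k=1)]
    return float(pairs.mean())

# Simulated example: 4 rats driven by a shared ICMS-like input plus private noise
rng = np.random.default_rng(0)
shared_input = rng.normal(size=200)                         # common drive, 200 time bins
rates = shared_input + 0.5 * rng.normal(size=(4, 200))      # each rat's noisy response
score = synchronization_score(rates)
print(score, score > 0.5)    # above an (assumed) threshold -> deliver the water reward
```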

Brain-machine interfaces (BMIs) are computational systems that allow subjects to use their brain signals to directly control the movements of artificial devices, such as robotic arms, exoskeletons or virtual avatars. The Duke researchers at the Center for Neuroengineering previously built BMIs to capture and transmit the brain signals of individual rats, monkeys, and even human subjects, to control devices.

“Supra-brain” — the Matrix for monkeys?

As reported in two open-access papers in the July 9, 2015 issue of Scientific Reports, the new research outfitted rhesus monkeys with multielectrode arrays implanted in their motor and somatosensory cortices to capture and transmit their brain activity.

For one experiment, two monkeys were placed in separate rooms where they observed identical images of an avatar on a display monitor in front of them, and worked together to move the avatar on the screen to touch a moving target.

In another experiment, three monkeys were able to mentally control three degrees of freedom (dimensions) of a virtual arm movement in 3-D space. To achieve this performance, all three monkeys had to synchronize their collective brain activity to produce a “supra-brain” in charge of generating the 3-D movements of the virtual arm.

In the second Brainet study, three to four rats whose brains had been interconnected via pairwise brain-to-brain interfaces (BtBIs) were able to perform a variety of sophisticated shared classification tasks and other computations in a distributed, parallel computing architecture.
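The rat paper’s key computational claim is that pooling activity across brains matches or beats a single brain on classification tasks. The toy sketch below is only a software analogy of that principle, not how the BtBI actually works: three noisy “rat” classifiers, each correct 65% of the time, are pooled by majority vote, and the pooled vote is reliably more accurate than any individual.

```python
# Toy analogy of the Brainet principle: pooling several noisy classifiers
# ("rats") by majority vote beats any single one. Numbers are illustrative.
import numpy as np

rng = np.random.default_rng(1)
n_trials, n_rats, p_single = 2000, 3, 0.65        # each rat alone is right 65% of the time

truth = rng.integers(0, 2, size=n_trials)         # true binary stimulus class per trial
correct = rng.random((n_rats, n_trials)) < p_single
votes = np.where(correct, truth, 1 - truth)       # each rat's (possibly wrong) report

single_accuracy = correct[0].mean()                        # one rat on its own
pooled = (votes.sum(axis=0) >= 2).astype(int)              # majority vote of the 3 rats
pooled_accuracy = (pooled == truth).mean()
print(f"single rat: {single_accuracy:.2f}, pooled: {pooled_accuracy:.2f}")  # ~0.65 vs ~0.72
```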

Human Brainets next

These results support the Duke researchers’ original claim that Brainets may serve as test beds for the development of organic computers created by interfacing multiple animal brains with computers. Such an arrangement would employ a unique hybrid digital-analog computational engine as the basis of its operation, in a clear departure from the classical digital-only mode of operation of modern computers.

“This is the first demonstration of a shared brain-machine interface,” said Miguel Nicolelis, M.D., Ph.D., co-director of the Center for Neuroengineering at the Duke University School of Medicine and principal investigator of the study. “We foresee that shared-BMIs will follow the same track and soon be translated to clinical practice.”

Nicolelis and colleagues of the Walk Again Project, based at the project’s laboratory in Brazil, are currently working to implement a non-invasive human Brainet to be employed in their neuro-rehabilitation training paradigm with severely paralyzed patients.


In this movie, three monkeys share control over the movement of a virtual arm in 3-D space. Each monkey contributes to two of three axes (X, Y and Z). Monkey C contributes to y- and z-axes (red dot), monkey M contributes to x- and y-axes (blue dot), and monkey K contributes to x- and z-axes (green dot). The contribution of the two monkeys to each axis is averaged to determine the arm position (represented by the black dot). (credit: Arjun Ramakrishnan et al./Scientific Reports)
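The caption above describes the three-monkey control scheme concretely: each monkey’s decoder outputs a 2-D contribution, and each axis of the arm is the average of the two monkeys assigned to it. Here is a minimal sketch of that averaging step; the decoded values are made-up placeholders, not real decoder output.

```python
# Minimal sketch of the shared-control averaging described in the caption.
# The decoded contributions below are placeholder numbers, not real decoder output.
contributions = {
    "C": {"y": 0.2, "z": -0.4},   # monkey C: y- and z-axes
    "M": {"x": 0.5, "y": 0.1},    # monkey M: x- and y-axes
    "K": {"x": 0.3, "z": -0.6},   # monkey K: x- and z-axes
}

arm_position = {}
for axis in ("x", "y", "z"):
    values = [c[axis] for c in contributions.values() if axis in c]
    arm_position[axis] = sum(values) / len(values)   # two monkeys share each axis

print(arm_position)   # each coordinate is the mean of its two contributing monkeys
```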


Abstract of Building an organic computing device with multiple interconnected brains

Recently, we proposed that Brainets, i.e. networks formed by multiple animal brains, cooperating and exchanging information in real time through direct brain-to-brain interfaces, could provide the core of a new type of computing device: an organic computer. Here, we describe the first experimental demonstration of such a Brainet, built by interconnecting four adult rat brains. Brainets worked by concurrently recording the extracellular electrical activity generated by populations of cortical neurons distributed across multiple rats chronically implanted with multi-electrode arrays. Cortical neuronal activity was recorded and analyzed in real time, and then delivered to the somatosensory cortices of other animals that participated in the Brainet using intracortical microstimulation (ICMS). Using this approach, different Brainet architectures solved a number of useful computational problems, such as discrete classification, image processing, storage and retrieval of tactile information, and even weather forecasting. Brainets consistently performed at the same or higher levels than single rats in these tasks. Based on these findings, we propose that Brainets could be used to investigate animal social behaviors as well as a test bed for exploring the properties and potential applications of organic computers.

Abstract of Computing arm movements with a monkey Brainet

Traditionally, brain-machine interfaces (BMIs) extract motor commands from a single brain to control the movements of artificial devices. Here, we introduce a Brainet that utilizes very-large-scale brain activity (VLSBA) from two (B2) or three (B3) nonhuman primates to engage in a common motor behaviour. A B2 generated 2D movements of an avatar arm where each monkey contributed equally to X and Y coordinates; or one monkey fully controlled the X-coordinate and the other controlled the Y-coordinate. A B3 produced arm movements in 3D space, while each monkey generated movements in 2D subspaces (X-Y, Y-Z, or X-Z). With long-term training we observed increased coordination of behavior, increased correlations in neuronal activity between different brains, and modifications to neuronal representation of the motor plan. Overall, performance of the Brainet improved owing to collective monkey behaviour. These results suggest that primate brains can be integrated into a Brainet, which self-adapts to achieve a common motor goal.

Crowdsourcing neurofeedback data

In front of an audience, the collective neurofeedback of 20 participants was projected on the 360° surface of the semi-transparent dome as artistic video animations, with soundscapes generated from a pre-recorded sound library and improvisations by live musicians (credit: Natasha Kovacevic et al./PLoS ONE/Photo: David Pisarek)

In a large-scale art-science installation called My Virtual Dream in Toronto in 2013, more than 500 adults wearing a Muse wireless electroencephalography (EEG) headband inside a 60-foot geodesic dome participated in an unusual neuroscience experiment.

As they played a collective neurofeedback computer game where they were required to manipulate their mental states of relaxation and concentration, the group’s collective EEG signals triggered a catalog of related artistic imagery displayed on the dome’s 360-degree interior, along with spontaneous musical interpretation by live musicians on stage.
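According to the study abstract reproduced below, the neurofeedback targeted relative spectral power in the alpha and beta frequency bands. The sketch below shows how such a feedback value can be computed from a short EEG window; it is illustrative only, not the installation’s actual signal chain, and the sample rate and band edges are assumptions.

```python
# Illustrative only: estimate relative alpha/beta power from one EEG channel,
# the kind of quantity a relaxation/concentration neurofeedback loop can map
# onto visuals. Band edges and sample rate are assumptions.
import numpy as np
from scipy.signal import welch

def relative_band_power(eeg, fs, band):
    """Fraction of total spectral power falling inside `band` (lo, hi) in Hz."""
    freqs, psd = welch(eeg, fs=fs, nperseg=min(len(eeg), 2 * fs))
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    return psd[in_band].sum() / psd.sum()

fs = 256                                    # assumed headband sample rate
t = np.arange(0, 4, 1 / fs)                 # a 4-second analysis window
rng = np.random.default_rng(2)
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.normal(size=t.size)   # fake "relaxed" signal
alpha = relative_band_power(eeg, fs, (8, 12))
beta = relative_band_power(eeg, fs, (13, 30))
print(f"relative alpha: {alpha:.2f}, relative beta: {beta:.2f}")    # feeds the visuals
```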

“What we’ve done is taken the lab to the public. We collaborated with multimedia artists, made this experiment incredibly engaging, attracted highly motivated subjects, which is not easy to do in the traditional lab setting, and collected useful scientific data from their experience.”

Collective neurofeedback: a new kind of neuroscience research

Participant instructions (credit: Natasha Kovacevic et al./PLoS ONE)

Results from the experiment demonstrated the scientific viability of collective neurofeedback as a potential new avenue of neuroscience research that takes into account individuality, complexity and sociability of the human mind. They also yielded new evidence that neurofeedback learning can have an effect on the brain almost immediately, the researchers say.

Studying brains in a social and multi-sensory environment is closer to real life and may help scientists to approach questions of complex real-life social cognition that otherwise are not accessible in traditional labs that study one person’s cognitive functions at a time.

“In traditional lab settings, the environment is so controlled that you can lose some of the fine points of real-time brain activity that occur in a social life setting,” said Natasha Kovacevic, creative producer of My Virtual Dream and program manager of the Centre for Integrative Brain Dynamics at Baycrest’s Rotman Research Institute.

The massive amount of EEG data collected in one night yielded an interesting and statistically significant finding: subtle brain activity changes were taking place within approximately one minute of starting the neurofeedback learning exercise, a speed of learning-related change that had not been demonstrated before.

Building the world’s first virtual brain

“These results really open up a whole new domain of neuroscience study that actively engages the public to advance our understanding of the brain,” said Randy McIntosh, director of the Rotman Research Institute and vice-president of Research at Baycrest. He is a senior author on the paper.

The idea for the Nuit Blanche art-science experiment was inspired by Baycrest’s ongoing international project to build the world’s first functional, virtual brain — a research and diagnostic tool that could one day revolutionize brain healthcare.

Baycrest cognitive neuroscientists collaborated with artists and gaming and wearable technology industry partners for over a year to create the My Virtual Dream installation. Partners included the University of Toronto, Scotiabank Nuit Blanche, Muse, and Uken Games.

Plans are underway to take My Virtual Dream to other cities around the world.


Abstract of ‘My Virtual Dream’: Collective Neurofeedback in an Immersive Art Environment

While human brains are specialized for complex and variable real world tasks, most neuroscience studies reduce environmental complexity, which limits the range of behaviours that can be explored. Motivated to overcome this limitation, we conducted a large-scale experiment with electroencephalography (EEG) based brain-computer interface (BCI) technology as part of an immersive multi-media science-art installation. Data from 523 participants were collected in a single night. The exploratory experiment was designed as a collective computer game where players manipulated mental states of relaxation and concentration with neurofeedback targeting modulation of relative spectral power in alpha and beta frequency ranges. Besides validating robust time-of-night effects, gender differences and distinct spectral power patterns for the two mental states, our results also show differences in neurofeedback learning outcome. The unusually large sample size allowed us to detect unprecedented speed of learning changes in the power spectrum (~ 1 min). Moreover, we found that participants’ baseline brain activity predicted subsequent neurofeedback beta training, indicating state-dependent learning. Besides revealing these training effects, which are relevant for BCI applications, our results validate a novel platform engaging art and science and fostering the understanding of brains under natural conditions.

‘Brainprints’ could replace passwords

Sarah Laszlo, an assistant professor of psychology, adjusting an EEG electrode (credit: Jonathan Cohen, Binghamton University)

The way your brain responds to certain words could be used to replace passwords, according to a study by researchers from Binghamton University published in the journal Neurocomputing.

The psychologists recorded EEG signals from volunteers reading a list of acronyms, focusing on the part of the brain associated with reading and recognizing words.

Participants’ “event-related potential” signals reacted differently to each acronym, enough that a computer system was able to identify each volunteer with 94 percent accuracy, using only three electrodes.
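A simplified sketch of how ERP-based identification can work in principle: epochs time-locked to each acronym are averaged into per-person ERP templates, and a new recording is labeled with the enrolled person whose templates it correlates with best. This is an illustration with simulated data, not the Binghamton team’s classifier; the dimensions and the correlation matcher are assumptions.

```python
# Illustrative ERP-identification sketch with simulated data (not the actual
# Binghamton classifier): average epochs into per-person ERP templates, then
# label a new session by its best-correlating enrolled template.
import numpy as np

rng = np.random.default_rng(42)
n_people, n_acronyms, n_epochs, n_samples = 5, 10, 40, 120

# Each person has an idiosyncratic "true" ERP per acronym; recordings are noisy copies.
true_erps = rng.normal(size=(n_people, n_acronyms, n_samples))

def record_session(person):
    epochs = true_erps[person][:, None, :] + rng.normal(
        scale=2.0, size=(n_acronyms, n_epochs, n_samples))
    return epochs.mean(axis=1)              # time-locked averaging -> one ERP per acronym

templates = np.stack([record_session(p) for p in range(n_people)])   # enrollment step

def identify(session_erps):
    flat = session_erps.ravel()
    scores = [np.corrcoef(flat, templates[p].ravel())[0, 1] for p in range(n_people)]
    return int(np.argmax(scores))           # most similar enrolled person

print(identify(record_session(3)) == 3)     # correctly re-identifies person 3 (usually)
```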

The results suggest that brainwaves could be used by security systems to verify a person’s identity.

Better than fingerprints or retinal patterns

According to Sarah Laszlo, assistant professor of psychology and linguistics at Binghamton University and co-author of the “Brainprint” paper, brain biometrics are appealing because they are cancellable (can be reset) and cannot be stolen by malicious means, such as copying a fingerprint.

“If someone’s fingerprint is stolen, that person can’t just grow a new finger to replace the compromised fingerprint — the fingerprint for that person is compromised forever. Fingerprints are ‘non-cancellable.’ Brainprints, on the other hand, are potentially cancellable.

“So, in the unlikely event that attackers were actually able to steal a brainprint from an authorized user, the authorized user could then ‘reset’ their brainprint,” Laszlo said, meaning the user could simply record the EEG pattern associated with another word or phrase.

Useful in high-security environments

Sample correctly classified brainprint recording (credit: B.C. Armstrong/Neurocomputing)

Zhanpeng Jin, an assistant professor in Binghamton University’s departments of Electrical and Computer Engineering and Biomedical Engineering, doesn’t see brainprints as the kind of system that would be mass-produced for low-security applications (at least in the near future*), but they could have important security applications.

“We tend to see the applications of this system as being more along the lines of high-security physical locations, like the Pentagon, where there aren’t that many users that are authorized to enter, and those users don’t need to constantly be authorizing the way that a consumer might need to authorize into their phone or computer,” Jin said.

The project is funded by the National Science Foundation and Binghamton University’s Interdisciplinary Collaboration Grants (ICG) Program.

* Widespread use of low-cost EEG devices could potentially change that.


Abstract of Brainprint: Assessing the uniqueness, collectability, and permanence of a novel method for ERP biometrics

The human brain continually generates electrical potentials representing neural communication. These potentials can be measured at the scalp, and constitute the electroencephalogram (EEG). When the EEG is time-locked to stimulation – such as the presentation of a word – and averaged over many such presentations, the Event-Related Potential (ERP) is obtained. The functional characteristics of components of the ERP are well understood, and some components represent processing that may differ uniquely from individual to individual—such as the N400 component, which represents access to the semantic network. We applied several pattern classifiers to ERPs representing the response of individuals to a stream of text designed to be idiosyncratically familiar to different individuals. Results indicate that there are robustly identifiable features of the ERP that enable labeling of ERPs as belonging to individuals with accuracy reliably above chance (in the range of 82–97%). Further, these features are stable over time, as indicated by continued accurate identification of individuals from ERPs after a lag of up to six months. Even better, the high degree of labeling accuracy achieved in all cases was achieved with the use of only 3 electrodes on the scalp—the minimal possible number that can acquire clean data.

Autistic brain is hyper-functional — needs predictable, paced environments, study finds

Part of the calming “Squeeze Machine” designed by Temple Grandin (credit: Therafin Corp.)

A new open-access study shows that social and sensory overstimulation drives autistic behaviors and supports the unconventional view that the autistic brain is actually hyper-functional. The research offers new hope, with therapeutic emphasis on paced and non-surprising environments tailored to the individual’s sensitivity.

For decades, autism has been viewed as a form of mental retardation, a brain disease that destroys children’s ability to learn, feel and empathize, thus leaving them disconnected from our complex and ever-changing social and sensory surroundings. From this perspective, the main kind of therapeutic intervention in autism to date aims at strongly engaging the child to revive brain functions believed dormant.

Predictability is key

Now researchers at the Swiss Federal Institute of Technology in Lausanne (EPFL) have completed a study that turns this traditional view of autism completely around. The study, conducted on rats exposed to a known risk factor in humans, demonstrates that unpredictable environmental stimulation drives autistic symptoms at least as much as an impoverished environment does.

It also shows that predictable stimulation can prevent these symptoms.

The study also argues for a drastic shift in the clinical approach to autism, away from the idea of a damaged brain that demands extensive stimulation. Instead, autistic brains may be hyper-functional and thus require enriched environments that are non-surprising, structured, safe, and tailored to each individual’s sensitivity.

“The valproate rat model used is highly relevant for understanding autism, because children exposed to valproate in the womb have an increased chance of presenting autism after birth,” says Prof. Henry Markram, co-author of the study and father of a child with autism. He notes that the rats exposed to valproate in early embryonic development demonstrate behavioral, anatomical and neurochemical abnormalities that are comparable to characteristics of human autism.

The scientists show that if these rats are reared in a home environment that is calm, safe, and highly predictable, with little surprise, while still rich in sensory and social engagement, they do not develop symptoms of emotional over-reactivity, such as fear and anxiety, or of social withdrawal and sensory abnormalities.

“We were amazed to see that environments lacking predictability, even if enriched, favored the development of hyper-emotionality in rats exposed to the prenatal autism risk factor,” says Markram.

The study critically shows that in certain individuals, non-predictable environments lead to the development of a wider range of negative symptoms, including social withdrawal and sensory abnormalities. Such symptoms normally prevent individuals from fully benefiting from and contributing to their surroundings, and are thus key targets of therapy.

The study identifies drastically opposite behavioral outcomes depending on levels of predictability in the enriched environment, and suggests that the autistic brain is unusually sensitive to predictability in the rearing environment, although to a different extent in different individuals.

Hyper-functional brain microcircuits

The study is strong evidence for the Intense World Theory of Autism, proposed in 2007 by neuroscientists Kamila Markram and Henry Markram, both co-authors on the present study. This theory is based on recent research suggesting that the autistic brain, in both humans and animal models, reacts differently to stimuli.

It proposes that an interaction between an individual’s genetic background and biologically toxic events early in embryonic development triggers a cascade of abnormalities that create hyper-functional brain microcircuits, the functional units of the brain.

Once activated, these hyper-functional circuits could become autonomous and affect further functional connectivity and development in the brain. This would lead to an experience of the world as intense, fragmented, and overwhelming, while differences in severity between persons with autism would stem from which systems are affected and the timing of the effect.

Stable, structured environment

Instead, a stable, structured environment rich in stimuli could help children with autism, by providing a safe haven from an overload of sensory and emotional stimuli, the authors suggest.

This study has immediate implications for clinical and research settings. It suggests that if brain hyper-function can be diagnosed soon after birth, at least some of the debilitating effects of a supercharged brain can be prevented by highly specialized environmental stimulation that is safe, consistent, controlled, announced and only changed very gradually at the pace determined by each child.

The research supports the work of Temple Grandin, PhD, an author and professor of animal science at Colorado State University. One of the therapeutic methods she developed (and used herself) was the “hug machine” (AKA “squeeze machine”), a deep-pressure device designed to calm hypersensitive persons. The device is featured in an award-winning biographical film, Temple Grandin.


Abstract of Predictable enriched environment prevents development of hyper-emotionality in the VPA rat model of autism

Understanding the effects of environmental stimulation in autism can improve therapeutic interventions against debilitating sensory overload, social withdrawal, fear and anxiety. Here, we evaluate the role of environmental predictability on behavior and protein expression, and inter-individual differences, in the valproic acid (VPA) model of autism. Male rats embryonically exposed (E11.5) either to VPA, a known autism risk factor in humans, or to saline, were housed from weaning into adulthood in a standard laboratory environment, an unpredictably enriched environment, or a predictably enriched environment. Animals were tested for sociability, nociception, stereotypy, fear conditioning and anxiety, and for tissue content of glutamate signaling proteins in the primary somatosensory cortex, hippocampus and amygdala, and of corticosterone in plasma, amygdala and hippocampus. Standard group analyses on separate measures were complemented with a composite emotionality score, using Cronbach’s Alpha analysis, and with multivariate profiling of individual animals, using Hierarchical Cluster Analysis. We found that predictable environmental enrichment prevented the development of hyper-emotionality in the VPA-exposed group, while unpredictable enrichment did not. Individual variation in the severity of the autistic-like symptoms (fear, anxiety, social withdrawal and sensory abnormalities) correlated with neurochemical profiles, and predicted their responsiveness to predictability in the environment. In controls, the association between socio-affective behaviors, neurochemical profiles and environmental predictability was negligible. This study suggests that rearing in a predictable environment prevents the development of hyper-emotional features in animals exposed to an autism risk factor, and demonstrates that unpredictable environments can lead to negative outcomes, even in the presence of environmental enrichment.

Missing link found between brain, immune system

Maps of the lymphatic system: old (left) and updated (right) (credit: University of Virginia Health System)

Overturning decades of textbook teaching, researchers at the University of Virginia School of Medicine have discovered that the brain is directly connected to the immune system by vessels previously thought not to exist.

The finding could have significant implications for the study and treatment of neurological diseases ranging from autism to Alzheimer’s disease to multiple sclerosis.

“It changes entirely the way we perceive the neuro-immune interaction. We always perceived it before as something esoteric that can’t be studied. But now we can ask mechanistic questions,” said Jonathan Kipnis, PhD, professor in the UVA Department of Neuroscience and director of UVA’s Center for Brain Immunology and Glia (BIG).

“We believe that for every neurological disease that has an immune component to it, these vessels may play a major role,” Kipnis said. “Hard to imagine that these vessels would not be involved in a [neurological] disease with an immune component.”

“Very Well Hidden”

A schematic representation of the connection between the glymphatic system, which collects interstitial fluid from within the central nervous system parenchyma into the cerebrospinal fluid, and the newly identified meningeal lymphatic vessels (credit: Antoine Louveau et al./Nature)

The discovery was made possible by the work of Antoine Louveau, PhD, a postdoctoral fellow in Kipnis’ lab, who noticed vessel-like patterns in the distribution of immune cells on his slides of a mouse’s meninges — the membranes covering the brain.

So how did the brain’s lymphatic vessels manage to escape notice all this time? Kipnis described them as “very well hidden” — they follow a major blood vessel down into the sinuses, an area difficult to image. “It’s so close to the blood vessel, you just miss it… if you don’t know what you’re after.”

Alzheimer’s, Autism, MS and Beyond

The unexpected presence of the lymphatic vessels raises a tremendous number of questions that now need answers, both about the workings of the brain and the diseases that plague it.

For example: “In Alzheimer’s, there are accumulations of big protein chunks in the brain,” Kipnis said. “We think they may be accumulating in the brain because they’re not being efficiently removed by these vessels.”

He noted that the vessels look different with age, so the role they play in aging is another avenue to explore. And there’s an enormous array of other neurological diseases, from autism to multiple sclerosis, that must be reconsidered in light of the presence of something science insisted did not exist.

The findings have been published online by the journal Nature and will appear in a forthcoming print edition.


Abstract of Structural and functional features of central nervous system lymphatic vessels

One of the characteristics of the central nervous system is the lack of a classical lymphatic drainage system. Although it is now accepted that the central nervous system undergoes constant immune surveillance that takes place within the meningeal compartment, the mechanisms governing the entrance and exit of immune cells from the central nervous system remain poorly understood. In searching for T-cell gateways into and out of the meninges, we discovered functional lymphatic vessels lining the dural sinuses. These structures express all of the molecular hallmarks of lymphatic endothelial cells, are able to carry both fluid and immune cells from the cerebrospinal fluid, and are connected to the deep cervical lymph nodes. The unique location of these vessels may have impeded their discovery to date, thereby contributing to the long-held concept of the absence of lymphatic vasculature in the central nervous system. The discovery of the central nervous system lymphatic system may call for a reassessment of basic assumptions in neuroimmunology and sheds new light on the aetiology of neuroinflammatory and neurodegenerative diseases associated with immune system dysfunction.

Scientists recover ‘lost’ memories using brain stimulation by blue light

(credit: Christine Daniloff/MIT)

MIT researchers have found they were able to reactivate memories in mice that could not otherwise be retrieved, using optogenetics — in which proteins are added to neurons to allow them to be activated with light.

The breakthrough finding, in a paper published Thursday (May 28) in the journal Science, appears to answer a longstanding question in neuroscience regarding amnesia.

Damaged or blocked memory?

Neuroscience researchers have for many years debated whether retrograde amnesia — which follows traumatic injury, stress, or diseases such as Alzheimer’s — is caused by damage to specific brain cells, meaning a memory cannot be stored, or if access to that memory is somehow blocked, preventing its recall.

The answer, according to Susumu Tonegawa, the Picower Professor in MIT’s Department of Biology and director of the RIKEN-MIT Center at the Picower Institute for Learning and Memory: “Amnesia is a problem of retrieval impairment.”

Memory researchers have previously speculated that somewhere in the brain network is a population of neurons that are activated during the process of acquiring a memory, causing enduring physical or chemical changes.

If these groups of neurons are subsequently reactivated by a trigger such as a particular sight or smell, for example, the entire memory is recalled. These neurons are known as “memory engram cells.”

Blocking, then activating memories with light

Until now, no one has been able to show that these groups of neurons undergo enduring chemical changes, in a process known as memory consolidation. One such change, known as “long-term potentiation” (LTP), involves the strengthening of synapses, the structures that allow groups of neurons to send signals to each other, as a result of learning and experience.

To find out if these chemical changes do indeed take place, the researchers first identified a group of engram cells in the hippocampus that, when activated using optogenetic tools, were able to express a memory.

When they then recorded the activity of this particular group of cells, they found that the synapses connecting them had been strengthened. “We were able to demonstrate for the first time that these specific cells — a small group of cells in the hippocampus — had undergone this augmentation of synaptic strength,” Tonegawa says.

The researchers then attempted to discover what happens to memories without this consolidation process. By administering a compound called anisomycin, which blocks protein synthesis within neurons, immediately after mice had formed a new memory, the researchers were able to prevent the synapses from strengthening.

When they returned one day later and attempted to reactivate the memory using an emotional trigger, they could find no trace of it. “So even though the engram cells are there, without protein synthesis those cell synapses are not strengthened, and the memory is lost,” Tonegawa says.

But startlingly, when the researchers then reactivated the protein synthesis-blocked engram cells using optogenetic tools, they found that the mice exhibited all the signs of recalling the memory in full.

“If you test memory recall with natural recall triggers in an anisomycin-treated animal, it will be amnesiac, you cannot induce memory recall,” Tonegawa says. “But if you go directly to the putative engram-bearing cells and activate them with light, you can restore the memory, despite the fact that there has been no LTP.”

Memories are stored in a circuit of groups of cells in multiple brain areas, not synapses

Further studies carried out by Tonegawa’s group demonstrated that memories are stored not in synapses strengthened by protein synthesis in individual engram cells, but in a circuit, or “pathway” of multiple groups of engram cells and the connections between them.

“We are proposing a new concept, in which there is an engram cell ensemble pathway, or circuit, for each memory,” he says. “This circuit encompasses multiple brain areas and the engram cell ensembles in these areas are connected specifically for a particular memory.”

The research dissociates the mechanisms used in memory storage from those of memory retrieval, according to study co-author Tomas Ryan. “The strengthening of engram synapses is crucial for the brain’s ability to access or retrieve those specific memories, while the connectivity pathways between engram cells allows the encoding and storage of the memory information itself,” he says.

Changes in synaptic strength and in spine properties have long been associated with learning and memory, according to Alcino Silva, director of the Integrative Center for Learning and Memory at the University of California at Los Angeles.

“This groundbreaking paper suggests that these changes may not be as critical for memory as once thought, since under certain conditions, it seems to be possible to disrupt these changes and still preserve memory,” he says. “Instead, it appears that these changes may be needed for memory retrieval, a mysterious process that has so far evaded neuroscientists.”


Abstract of Engram cells retain memory under retrograde amnesia

Memory consolidation is the process by which a newly formed and unstable memory transforms into a stable long-term memory. It is unknown whether the process of memory consolidation occurs exclusively through the stabilization of memory engrams. By using learning-dependent cell labeling, we identified an increase of synaptic strength and dendritic spine density specifically in consolidated memory engram cells. Although these properties are lacking in engram cells under protein synthesis inhibitor–induced amnesia, direct optogenetic activation of these cells results in memory retrieval, and this correlates with retained engram cell–specific connectivity. We propose that a specific pattern of connectivity of engram cells may be crucial for memory information storage and that strengthened synapses in these cells critically contribute to the memory retrieval process.

Light electric stimulation of the brain may improve memory for people with schizophrenia

Transcranial direct-current stimulation device (credit: GoFlow)

Lightly stimulating the brain with transcranial direct current stimulation (tDCS) may improve short-term memory in people with schizophrenia, according to a new study by researchers at the Johns Hopkins University School of Medicine.

The tDCS procedure involves placing sponge-covered electrodes on the head and passing a weak electrical current between them.

David Schretlen, Ph.D., a professor of psychiatry and behavioral sciences at the Johns Hopkins University School of Medicine, reasoned that this type of brain stimulation might ease some of the cognitive difficulties that afflict people with schizophrenia.

A test based on prefrontal cortex stimulation

To test that possibility, Schretlen and Johns Hopkins colleagues targeted a brain region called the left dorsolateral prefrontal cortex, which plays an important role in short-term or working memory and is abnormal in people with schizophrenia, according to Schretlen.

Schretlen recruited 11 participants: five adults with confirmed schizophrenia and six of their close relatives (parents, siblings, and children of people with schizophrenia show some of the same abnormalities to a lesser degree, says Schretlen).

Each participant received two 30-minute treatments: one using a negative electrical charge, which the researchers thought might prove beneficial, and the other using a positive charge as a control. During and after each treatment, participants completed a battery of cognitive tests.

Thinking improvements

There were two notable results:

  • On tests of verbal and visual working memory, participants performed significantly better after receiving a negative charge, and the effects were “surprisingly strong,” says Schretlen.
  • Participants did better at the challenging task of switching between naming categories of items in a supermarket after a negatively charged treatment. The stimulation “was associated with better performance on working memory and subtle changes in word retrieval,” Schretlen says. People with schizophrenia often struggle to find the right words, he says. Because the prefrontal cortex contains a brain region responsible for word retrieval, Schretlen reasoned that transcranial direct current stimulation might help.

Schretlen is now studying transcranial direct current stimulation in a larger sample of patients using repeated sessions of stimulation, which he hopes will induce lasting benefits.

“Cognitive impairment is as ubiquitous as hallucinations in schizophrenia, yet medications only treat the hallucinations,” Schretlen says. “So even with medication, affected individuals often remain very disabled.” His hope is that transcranial direct current stimulation could give people with schizophrenia a shot at leading a more normal life.

Other findings

A related study last year showed that tDCS improved correction of mistakes. But another recent study found that after a repeated IQ test (which is normally expected to show improvements), IQ scores of people who underwent tDCS brain stimulation improved markedly less than did the IQ scores of people in the placebo group.

The tDCS procedure is also being studied by other researchers as a treatment for depression and Alzheimer’s-related memory loss, and to enhance recovery following strokes.

The research is described in a paper published online in Clinical Schizophrenia & Related Psychoses. The study was funded by the Therapeutic Cognitive Neuroscience Professorship; the Therapeutic Cognitive Neuroscience Fund; the Benjamin and Adith Miller Family Endowment on Aging, Alzheimer’s and Autism; and the National Institute of Child Health and Human Development.


Abstract of Can Transcranial Direct Current Stimulation Improve Cognitive Functioning in Adults with Schizophrenia?

Cognitive impairment is nearly ubiquitous in schizophrenia. First-degree relatives of persons with schizophrenia often show similar but milder deficits. Current methods for the treatment of schizophrenia are often ineffective in cognitive remediation. Since transcranial direct current stimulation (tDCS) can enhance cognitive functioning in healthy adults, it might provide a viable option to enhance cognition in schizophrenia. We sought to explore whether tDCS can be tolerated by persons with schizophrenia and potentially improve their cognitive functioning. We examined the effects of anodal versus cathodal tDCS on working memory and other cognitive tasks in five outpatients with schizophrenia and six first-degree relatives of persons with schizophrenia. Each participant completed tasks thought to be mediated by the prefrontal cortex during two 30-minute sessions of tDCS to the left and right dorsolateral prefrontal cortex (DLPFC). Anodal stimulation over the left DLPFC improved performance relative to cathodal stimulation on measures of working memory and aspects of verbal fluency relevant to word retrieval. The patient group showed differential changes in novel design production without alteration of overall productivity, suggesting that tDCS might be capable of altering self-monitoring and executive control. All participants tolerated tDCS well. None withdrew from the study or experienced any adverse reaction. We conclude that adults with schizophrenia can tolerate tDCS while engaging in cognitive tasks and that tDCS can alter their performance.

Converting blood stem cells to sensory neural cells to predict and treat pain

McMaster University scientists have discovered how to make adult sensory neurons from a patient’s blood sample to measure pain (credit: McMaster University)

Stem-cell scientists at McMaster University have developed a way to directly convert adult human blood cells to sensory neurons, providing the first objective measure of how patients may feel things like pain, temperature, and pressure, the researchers reveal in an open-access paper in the journal Cell Reports.

Currently, scientists and physicians have a limited understanding of the complex issue of pain and how to treat it. “The problem is that unlike blood, a skin sample or even a tissue biopsy, you can’t take a piece of a patient’s neural system,” said Mick Bhatia, director of the McMaster Stem Cell and Cancer Research Institute and research team leader. “It runs like complex wiring throughout the body and portions cannot be sampled for study.

“Now we can take easy to obtain blood samples, and make the main cell types of neurological systems in a dish that is specialized for each patient,” said Bhatia. “We can actually take a patient’s blood sample, as routinely performed in a doctor’s office, and with it we can produce one million sensory neurons, [which] make up the peripheral nerves. We can also make central nervous system cells.”

Testing pain drugs

The new technology has “broad and immediate applications,” said Bhatia: It allows researchers to understand disease and improve treatments by asking questions such as: Why is it that certain people feel pain versus numbness? Is this something genetic? Can the neuropathy that diabetic patients experience be mimicked in a dish?

It also paves the way for the discovery of new pain drugs that don’t just numb the perception of pain. Bhatia said non-specific opioids used for decades are still being used today. “If I was a patient and I was feeling pain or experiencing neuropathy, the prized pain drug for me would target the peripheral nervous system neurons, but do nothing to the central nervous system, thus avoiding addictive drug side effects,” said Bhatia.

“Until now, no one’s had the ability and required technology to actually test different drugs to find something that targets the peripheral nervous system, and not the central nervous system, in a patient-specific, or personalized manner.”

A patient time machine 

Bhatia’s team also successfully tested their process with cryopreserved (frozen) blood. Since blood samples are taken and frozen in many clinical trials, this gives them “almost a bit of a time machine” to run tests on neurons created from blood samples of patients taken in past clinical trials, where responses and outcomes have already been recorded.

In the future, the process may have prognostic (predictive diagnostic) potential, explained Bhatia: one might be able to look at a patient with type 2 diabetes and predict whether they will experience neuropathy by running tests in the lab using their own neural cells derived from their blood sample.

“This bench-to-bedside research is very exciting and will have a major impact on the management of neurological diseases, particularly neuropathic pain,” said Akbar Panju, medical director of the Michael G. DeGroote Institute for Pain Research and Care, a clinician and professor of medicine.

“This research will help us understand the response of cells to different drugs and different stimulation responses, and allow us to provide individualized or personalized medical therapy for patients suffering with neuropathic pain.”

This research was supported by the Canadian Institutes of Health Research, Ontario Institute of Regenerative Medicine, Marta and Owen Boris Foundation, J.P. Bickell Foundation, the Ontario Brain Institute, and Brain Canada.

Pain insensitivity

In related news, an international team of researchers co-led by the University of Cambridge reported Monday in the journal Nature Genetics that they have identified a gene, PRDM12, that is essential to the production of pain-sensing neurons in humans. Rare individuals — around one in a million people in the UK — are born unable to feel pain, in a condition known as congenital insensitivity to pain (CIP). These people accumulate numerous self-inflicted injuries, often leading to reduced lifespan.

The researchers are hopeful that this new gene could be an excellent candidate for drug development.


Abstract of Single Transcription Factor Conversion of Human Blood Fate to NPCs with CNS and PNS Developmental Capacity

The clinical applicability of direct cell fate conversion depends on obtaining tissue from patients that is easy to harvest, store, and manipulate for reprogramming. Here, we generate induced neural progenitor cells (iNPCs) from neonatal and adult peripheral blood using single-factor OCT4 reprogramming. Unlike fibroblasts that share molecular hallmarks of neural crest, OCT4 reprogramming of blood was facilitated by SMAD+GSK-3 inhibition to overcome restrictions on neural fate conversion. Blood-derived (BD) iNPCs differentiate in vivo and respond to guided differentiation in vitro, producing glia (astrocytes and oligodendrocytes) and multiple neuronal subtypes, including dopaminergic (CNS related) and nociceptive neurons (peripheral nervous system [PNS]). Furthermore, nociceptive neurons phenocopy chemotherapy-induced neurotoxicity in a system suitable for high-throughput drug screening. Our findings provide an easily accessible approach for generating human NPCs that harbor extensive developmental potential, enabling the study of clinically relevant neural diseases directly from patient cohorts.

Robotic arm precisely controlled by thought

Erik Sorto smoothly controls robotic arm with his brain (credit: Spencer Kellis and Christian Klaes /Caltech)

Paralyzed from the neck down, Erik G. Sorto can now smoothly move a robotic arm just by thinking about it, thanks to a clinical collaboration between Caltech, Keck Medicine of USC, and Rancho Los Amigos National Rehabilitation Center.

Previous neural prosthetic devices, such as Braingate, were implanted in the motor cortex, resulting in delayed, jerky movements. The new device was implanted in the posterior parietal cortex (PPC), a part of the brain that controls the intent to move, not the movement directly.

That makes Sorto, who has been paralyzed for over 10 years, the first quadriplegic person in the world to perform a fluid hand-shaking gesture or play “rock, paper, scissors,” using a robotic arm.

In April 2013, Keck Medicine of USC surgeons implanted a pair of small electrode arrays in two parts of the posterior parietal cortex, one that controls reach and another that controls grasp.

Each 4-by-4 millimeter array contains 96 active electrodes that, in turn, each record the activity of single neurons in the PPC. The arrays are connected by a cable to a system of computers that process the signals, to decode the brain’s intent and control output devices, such as a computer cursor and a robotic arm.
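How such decoding can work in outline: during calibration, recorded firing rates are regressed against intended movements, and the fitted mapping then turns new activity into velocity commands. The sketch below is a generic linear (ridge-regression) decoder on simulated data, offered as an assumption-laden illustration rather than the decoder actually used in this trial.

```python
# Generic linear decoder sketch (not the trial's actual algorithm): fit a
# ridge-regression map from 96-channel firing rates to intended 2-D velocity,
# then decode a new bin of activity into a command for the cursor or arm.
import numpy as np

rng = np.random.default_rng(7)
n_units, n_bins = 96, 500

# Simulated calibration data: known intended velocities and direction-tuned rates
intended_vel = rng.normal(size=(n_bins, 2))                    # (vx, vy) per time bin
tuning = rng.normal(size=(2, n_units))                         # each unit's direction preference
rates = intended_vel @ tuning + rng.normal(scale=0.5, size=(n_bins, n_units))

# Ridge regression: W = (X^T X + lambda I)^-1 X^T Y
lam = 1.0
W = np.linalg.solve(rates.T @ rates + lam * np.eye(n_units), rates.T @ intended_vel)

new_rates = rng.normal(size=n_units)        # one new bin of (simulated) population activity
decoded_velocity = new_rates @ W            # -> (vx, vy) command sent to the effector
print(decoded_velocity)
```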

Although Sorto was able to move the robot arm with his thoughts immediately, weeks of imagined practice further refined his control of the arm.

Now, Sorto is able to execute advanced tasks with his mind, such as controlling a computer cursor; drinking a beverage; making a hand-shaking gesture; and performing various tasks with the robotic arm.

Designed to test the safety and effectiveness of this new approach, the clinical trial was led by principal investigator Richard Andersen, the James G. Boswell Professor of Neuroscience at Caltech, neurosurgeon Charles Y. Liu, professor of neurological surgery and neurology at the Keck School of Medicine of USC and biomedical engineering at USC, and neurologist Mindy Aisen, chief medical officer at Rancho Los Amigos.

Aisen, also a clinical professor of neurology at the Keck School of Medicine of USC, says that advancements in prosthetics like these hold promise for the future of patient rehabilitation.

NeuroPort microelectrode array implanted in Erik Sorto’s posterior parietal cortex (credit: Blackrock Microsystems)

“This research is relevant to the role of robotics and brain-machine interfaces as assistive devices, but also speaks to the ability of the brain to learn to function in new ways,” Aisen said. “We have created a unique environment that can seamlessly bring together rehabilitation, medicine, and science as exemplified in this study.”

Sorto has signed on to continue working on the project for a third year. He says the study has inspired him to continue his education and pursue a master’s degree in social work.

The results of the clinical trial appear in the May 22, 2015, edition of the journal Science. The implanted device and signal processors used in the clinical trial were the NeuroPort Array and NeuroPort Bio-potential Signal Processors developed by Blackrock Microsystems in Salt Lake City, Utah. The robotic arm used in the trial was the Modular Prosthetic Limb, developed at the Applied Physics Laboratory at Johns Hopkins.

This trial was funded by the National Institutes of Health, the Boswell Foundation, the Department of Defense, and the USC Neurorestoration Center.


Caltech | Next Generation of Neuroprosthetics: Science Explained — R. Andersen May 2015


Keck Medicine of USC | Next Generation of Neuroprosthetics: Erik’s Story


Abstract of Decoding motor imagery from the posterior parietal cortex of a tetraplegic human

Nonhuman primate and human studies have suggested that populations of neurons in the posterior parietal cortex (PPC) may represent high-level aspects of action planning that can be used to control external devices as part of a brain-machine interface. However, there is no direct neuron-recording evidence that human PPC is involved in action planning, and the suitability of these signals for neuroprosthetic control has not been tested. We recorded neural population activity with arrays of microelectrodes implanted in the PPC of a tetraplegic subject. Motor imagery could be decoded from these neural populations, including imagined goals, trajectories, and types of movement. These findings indicate that the PPC of humans represents high-level, cognitive aspects of action and that the PPC can be a rich source for cognitive control signals for neural prosthetics that assist paralyzed patients.

‘Natural’ sounds improve mood and productivity, study finds

(credit: iStock)

Playing natural sounds such as flowing water in offices could boost worker moods and improve cognitive abilities in addition to providing speech privacy, according to a new study from researchers at Rensselaer Polytechnic Institute.

An increasing number of modern open-plan offices employ sound masking systems such as “white noise” that raise the background sound of a room so that speech is rendered unintelligible beyond a certain distance and distractions are less annoying.

“If you’re close to someone, you can understand them. But once you move farther away, their speech is obscured by the masking signal,” said Jonas Braasch, an acoustician and musicologist at the Rensselaer Polytechnic Institute in New York.

Braasch and his team are currently testing whether masking signals inspired by natural sounds might work just as well, or better, than white noise. The idea was inspired by previous work by Braasch and his graduate student Mikhail Volf, which showed that people’s ability to regain focus improved when they were exposed to natural sounds versus silence or machine-based sounds.

Recently, Braasch and his graduate student Alana DeLoach built upon those results to start a new experiment.

They are exposing 12 human participants to three different sound stimuli while the participants perform a task that requires close attention: typical office noises with the conventional random electronic masking signal; an office soundscape with a “natural” masker; and an office soundscape with no masker. The test subjects encounter only one of the three stimuli per visit.

The natural sound used in the experiment was designed to mimic the sound of flowing water in a mountain stream. “The mountain stream sound possessed enough randomness that it did not become a distraction,” DeLoach said. “This is a key attribute of a successful masking signal.”
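One way to see the design intuition (this is a rough, hypothetical approximation, not DeLoach and Braasch’s stimulus): a stream-like masker can be sketched by tilting white noise toward low frequencies and adding a slow, random amplitude fluctuation, so the sound stays irregular without ever carrying meaning.

```python
# Rough, hypothetical "stream-like" masker: spectrally tilted noise with a slow
# random amplitude fluctuation. All filter settings are assumptions.
import numpy as np
from scipy.signal import butter, sosfilt

fs, seconds = 44100, 10
rng = np.random.default_rng(3)
white = rng.normal(size=fs * seconds)

# Tilt the spectrum toward low frequencies for a duller, water-like timbre
shaped = sosfilt(butter(2, 2000, btype="low", fs=fs, output="sos"), white)

# Slow (~1 Hz) random envelope adds burbling irregularity without meaning
env = sosfilt(butter(2, 1.0, btype="low", fs=fs, output="sos"), rng.normal(size=white.size))
env = env / (np.abs(env).max() + 1e-12)                  # normalize fluctuation depth
masker = shaped * np.clip(1.0 + 0.5 * env, 0.2, None)
masker = masker / np.abs(masker).max()                   # scale before setting playback level

print(masker.shape)   # ready to write to a WAV file or stream to the masking system
```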

They want to find out if workers who are listening to natural sounds are more productive and overall in better moods than the workers exposed to traditional masking signals.

Braasch said using natural sounds as a masking signal could have benefits beyond the office environment. “You could use it to improve the moods of hospital patients,” for example, Braasch said.


Abstract of Tuning the cognitive environment: sound masking with “natural” sounds in open-plan offices

With the gain in popularity of open-plan office design and the engineering efforts to achieve acoustical comfort for building occupants, a majority of workers still report dissatisfaction in their workplace environment. Office acoustics influence organizational effectiveness, efficiency, and satisfaction through meeting appropriate requirements for speech privacy and ambient sound levels. Implementing a sound masking system is one tried-and-true method of achieving privacy goals. Although each sound masking system is tuned for its specific environment, the signal – random steady state electronic noise, has remained the same for decades. This session explores how “natural” sounds may be used as an alternative to this standard masking signal employed so ubiquitously in sound masking systems in the contemporary office environment. As an unobtrusive background sound, possessing the appropriate spectral characteristics, this proposed use of “natural” sounds for masking challenges the convention that masking sounds should be as meaningless as possible. Based on psychophysical data and a sound-field analysis through an auditory model, we hypothesize that “natural” sounds as masking sounds have the ability (with equal success as conventional masking sounds) to meet standards and criteria for speech privacy while enhancing cognitive functioning, optimizing the ability to concentrate, and increasing overall worker satisfaction.