Experiments show magnetic chips could dramatically increase computing’s energy efficiency

Magnetic microscope image of three nanomagnetic computer bits. Each bit is a tiny bar magnet only 90 nanometers long. The image shows a bright spot at the “North” end and a dark spot at the “South” end of the magnet. The “H” arrow shows the direction of the magnetic field applied to switch the direction of the magnets. (credit: Jeongmin Hong et al./Science Advances)

UC Berkeley engineers have shown for the first time that magnetic chips can actually operate at the lowest fundamental energy dissipation theoretically possible under the laws of thermodynamics. That means dramatic reductions in power consumption are possible — down to as little as one-millionth the amount of energy per operation used by transistors in modern computers.

The findings were published March 11 in an open-access paper in the peer-reviewed journal Science Advances.

This is critical at two ends of the size scale: for mobile devices, which demand powerful processors that can run for a day or more on small, lightweight batteries; and on an industrial scale, as computing increasingly moves into “the cloud,” where the electricity demands of the giant cloud data centers are multiplying, collectively taking an increasing share of the country’s — and world’s — electrical grid.

“The biggest challenge in designing computers and, in fact, all our electronics today is reducing their energy consumption,” said senior author Jeffrey Bokor, a UC Berkeley professor of electrical engineering and computer sciences and a faculty scientist at the Lawrence Berkeley National Laboratory.

Lowering energy use is a relatively recent shift in focus in chip manufacturing after decades of emphasis on packing greater numbers of increasingly tiny and faster transistors onto chips to keep up with Moore’s law.

“Making transistors go faster was requiring too much energy,” said Bokor, who is also the deputy director of the Center for Energy Efficient Electronics Science, a Science and Technology Center at UC Berkeley funded by the National Science Foundation. “The chips were getting so hot they’d just melt.”

So researchers have been turning to alternatives to conventional transistors, which rely on the movement of electrons to switch between 0s and 1s. Partly because of electrical resistance, it takes a fair amount of energy to keep the 0 and 1 states clearly and reliably distinguishable, and that energy ends up as excess heat.

Nanomagnetic computing: how low can you get?

The UC Berkeley team used an innovative technique to measure the tiny amount of energy dissipation that resulted when they flipped a nanomagnetic bit. The researchers used a laser probe to carefully follow the direction that the magnet was pointing as an external magnetic field was used to rotate the magnet from “up” to “down” or vice versa.
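
In the underlying paper, the dissipated energy is extracted from exactly this kind of magnetization-versus-field data; a standard way to do that is to integrate the area enclosed by the M-H loop (E = μ0·V·∮H dM). Below is a minimal sketch of that bookkeeping in Python on a synthetic loop; the magnet dimensions, saturation magnetization, coercive field, and field sweep are illustrative assumptions, not values from the paper.

```python
import numpy as np

mu_0 = 4e-7 * np.pi                  # vacuum permeability, T*m/A
volume = 90e-9 * 30e-9 * 10e-9       # assumed nanomagnet dimensions, m^3 (illustrative)

# Synthetic M-H loop: field swept up, then back down (A/m)
Hc, w, Ms = 1e3, 5e2, 8e5            # assumed coercive field, switching width, saturation magnetization
H_up = np.linspace(-8e3, 8e3, 200)
H_dn = np.linspace(8e3, -8e3, 200)
M_up = Ms * np.tanh((H_up - Hc) / w) # ascending branch switches near +Hc
M_dn = Ms * np.tanh((H_dn + Hc) / w) # descending branch switches near -Hc
H = np.concatenate([H_up, H_dn])
M = np.concatenate([M_up, M_dn])

# Energy dissipated per switching cycle = mu_0 * V * (area enclosed by the loop),
# approximated here by a trapezoidal line integral of H dM around the closed path.
loop_area = np.sum(0.5 * (H[1:] + H[:-1]) * np.diff(M))
E_cycle = mu_0 * volume * abs(loop_area)
print(f"dissipated energy per cycle: {E_cycle * 1e21:.0f} zJ")  # ~100 zJ for this toy loop
```

An abrupt field reversal like the toy loop above dissipates far more energy than the roughly 3-zeptojoule Landauer limit; the experiment's carefully shaped field protocol is what shrinks this loop area toward that limit.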

They determined that it only took 15 millielectron volts of energy — the equivalent of 3 zeptojoules — to flip a magnetic bit at room temperature, effectively demonstrating the Landauer limit (the lowest limit of energy required for a computer operation). *

This is the first time that a practical memory bit could be manipulated and observed under conditions that would allow the Landauer limit to be reached, the authors said. Bokor and his team published a paper in 2011 that said this could theoretically be done, but it had not been demonstrated until now.

While the paper is a proof of principle, Bokor noted that putting such chips into practical production will take more time. But the authors wrote in the paper that “the significance of this result is that today’s computers are far from the fundamental limit and that future dramatic reductions in power consumption are possible.”

The National Science Foundation and the U.S. Department of Energy supported this research.

* The Landauer limit was named after IBM Research Lab’s Rolf Landauer, who in 1961 found that in any computer, each single bit operation must expend an absolute minimum amount of energy. Landauer’s discovery is based on the second law of thermodynamics, which states that as any physical system is transformed, going from a state of higher concentration to lower concentration, it gets increasingly disordered. That loss of order is called entropy, and it comes off as waste heat. Landauer developed a formula to calculate this lowest limit of energy required for a computer operation. The result depends on the temperature of the computer; at room temperature, the limit amounts to about 3 zeptojoules, or one-hundredth the energy given up by a single atom when it emits one photon of light.
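
As a quick check on the numbers in that note (this calculation is not from the paper), the Landauer bound kBT ln(2) can be evaluated directly; the 550 nm wavelength used for the visible-photon comparison is an illustrative assumption for “one photon of light.”

```python
import math

k_B = 1.380649e-23      # Boltzmann constant, J/K
h   = 6.62607015e-34    # Planck constant, J*s
c   = 2.99792458e8      # speed of light, m/s
eV  = 1.602176634e-19   # joules per electron volt

T = 300.0                            # room temperature, K
E_landauer = k_B * T * math.log(2)   # minimum heat per irreversible bit operation
print(f"Landauer limit at {T:.0f} K: {E_landauer * 1e21:.2f} zJ "
      f"({E_landauer / eV * 1e3:.1f} meV)")          # ~2.87 zJ, ~17.9 meV

E_photon = h * c / 550e-9            # energy of a single photon, assumed 550 nm (green light)
print(f"visible photon: {E_photon * 1e21:.0f} zJ "
      f"(Landauer limit is ~1/{E_photon / E_landauer:.0f} of it)")   # roughly 1/100
```

That works out to about 3 zeptojoules, roughly one-hundredth of the photon’s energy, consistent with the figures above.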


Abstract of Experimental test of Landauer’s principle in single-bit operations on nanomagnetic memory bits

Minimizing energy dissipation has emerged as the key challenge in continuing to scale the performance of digital computers. The question of whether there exists a fundamental lower limit to the energy required for digital operations is therefore of great interest. A well-known theoretical result put forward by Landauer states that any irreversible single-bit operation on a physical memory element in contact with a heat bath at a temperature T requires at least kBT ln(2) of heat be dissipated from the memory into the environment, where kB is the Boltzmann constant. We report an experimental investigation of the intrinsic energy loss of an adiabatic single-bit reset operation using nanoscale magnetic memory bits, by far the most ubiquitous digital storage technology in use today. Through sensitive, high-precision magnetometry measurements, we observed that the amount of dissipated energy in this process is consistent (within 2 SDs of experimental uncertainty) with the Landauer limit. This result reinforces the connection between “information thermodynamics” and physical systems and also provides a foundation for the development of practical information processing technologies that approach the fundamental limit of energy dissipation. The significance of the result includes insightful direction for future development of information technology.

Major steps toward a bioengineered heart for transplantation

A human whole heart was stripped of cells, leaving behind a “structural scaffold” of connective tissue, which was then partially re-seeded with human heart-like cells grown in the laboratory from skin cells. Once these new heart-like cells were re-seeded into the heart scaffold, the heart was regenerated in this bioreactor, which delivers a nutrient solution and replicates some of the environmental conditions in a living heart. (credit: Bernhard Jank, MD/Ott Lab, Center for Regenerative Medicine, Massachusetts General Hospital)

Massachusetts General Hospital (MGH) researchers have taken early steps towards producing a bioengineered heart for transplantation that would use cells from the patient receiving the heart.

Using a patient’s own cells would help to overcome some of the problems associated with receiving a heart donated by another person, including immune rejection of the donated heart, as well as the long-term side effects of life-long treatment with the immunosuppressive drugs needed to suppress the immune system and reduce the risk of rejection.

To take those steps, Jacques Guyette, PhD, of the Massachusetts General Hospital’s Center for Regenerative Medicine, lead author of a paper in Circulation Research, says the research team had to overcome three major technical challenges.

Future methodology for producing bioengineered hearts (credit: Julie Milland)

1. Create a structural scaffold from a human heart

One challenge is producing a structural scaffold able to support new functioning heart cells. To do this, the researchers take a human heart and remove the heart muscle and other cells and components that would stimulate the recipient’s immune system to reject the organ. Once stripped, the structural and connective tissues that give the heart its 3D structure are left behind.

If researchers can re-seed this structural scaffold with viable heart-like cells from the patient who will receive the heart (steps 2 and 3 below), the engineered heart would have the potential to reduce the risk of rejection and the resulting need for long-term immunosuppressive treatment.

Team leader Harald Ott, MD, an assistant professor of surgery at Harvard Medical School, has pioneered a method for stripping the living cells from donor organs and then re-populating the remaining scaffold with new cells.*

2. Grow cells that will function and contract like heart cells

Another challenge is developing a method that will enable researchers to use cells from the potential heart recipient to produce cells that will function like heart cells. These are generated from induced pluripotent stem cells (iPSCs) — cells with the potential to be turned into many different types of cells.

The team generated the heart-like muscle cells from reprogrammed skin cells. Once they had checked the quality of these cells, they grew them in the laboratory for several days and showed that the cells developed into tissue that spontaneously contracted like heart cells.**

3. Re-seed the human heart scaffold and grow it in an automated bioreactor

The final challenge was developing an automated bioreactor system capable of supporting a whole human heart while the re-seeded heart cells take hold.*** In this initial study, the researchers only partially re-seeded the scaffold — many more cells would be needed to totally re-populate a functioning heart scaffold.

After incubating the engineered heart in the bioreactor, the researchers showed that the regenerated tissue behaved like immature cardiac muscle tissue that was able to contract in response to electrical stimulation.

Next steps

“Regenerating a whole heart is most certainly a long-term goal that is several years away,” says Guyette. “Among the next steps that we are pursuing are improving methods to generate even more cardiac cells.”

He says re-seeding a whole heart would take tens of billions of cells, so the team needs to optimize the bioreactor techniques to improve the maturation and function of engineered cardiac tissue. They also need to integrate the electrical function of the regenerated tissue in the bioengineered heart. In the meantime, Guyette says the team is working on engineering a functional myocardial patch that could be used to replace tissue damaged after a heart attack or heart failure.

The study was supported by a National Institutes of Health Director’s New Innovator Award and by National Heart, Lung, and Blood Institute grants.

* Since 2008, Ott’s team has used the approach to generate functional rat kidneys and lungs and has stripped cells from large-animal hearts, lungs and kidneys.

This study used 73 human hearts donated through the New England Organ Bank that were unsuitable for transplantation and recovered under research consent. Using a scaled-up version of a process originally developed in rat hearts, the team stripped cells from hearts of both brain-dead donors and donors who had undergone cardiac death. The resulting cardiac scaffolds showed high retention of matrix proteins and preserved blood vessels, and were free of cardiac cells and other molecules that could induce rejection.

** Instead of using genetic manipulation to generate iPSCs from adult cells, the team used a newer method to reprogram skin cells, which should be both more efficient and less likely to run into regulatory hurdles. They then induced the iPSCs to create cardiac muscle cells or cardiomyocytes, documenting patterns of gene expression that reflected developmental milestones and generating cells in sufficient quantity for possible clinical application. Cardiomyocytes were then reseeded into three-dimensional matrix tissue, first into thin matrix slices and then into 15 mm fibers, which developed into spontaneously contracting tissue after several days in culture.

*** The team delivered about 500 million heart-like cells grown from iPSC into the left ventricular wall of the heart scaffolds. The re-seeded organs were mounted for 14 days in an automated bioreactor system developed by the team. The automated bioreactor system supplied the organ with a circulating nutrient solution and applied environmental conditions to reproduce conditions within a living heart.
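
The article does not describe the bioreactor’s control software, so the following is purely an illustration of what an “automated” system maintaining nutrient flow and physiological conditions might look like in code. Every sensor, setpoint, and interface here is a hypothetical stand-in, not part of the MGH system.

```python
import random
from dataclasses import dataclass

@dataclass
class Setpoints:
    temperature_c: float = 37.0   # physiological temperature (assumed target)
    flow_ml_min: float = 100.0    # nutrient-solution flow rate (hypothetical value)

def read_sensors(state):
    # Simulated sensor readings with a little noise, standing in for real hardware
    return (state["temp"] + random.gauss(0, 0.05),
            state["flow"] + random.gauss(0, 1.0))

def control_step(state, sp, gain=0.2):
    # Proportional corrections nudging heater power and pump speed toward the setpoints
    temp, flow = read_sensors(state)
    state["temp"] += gain * (sp.temperature_c - temp)
    state["flow"] += gain * (sp.flow_ml_min - flow)

state = {"temp": 25.0, "flow": 0.0}   # start from ambient temperature with the pump off
sp = Setpoints()
for _ in range(30):                   # simulate 30 control cycles
    control_step(state, sp)
print(f"after 30 cycles: {state['temp']:.1f} C, {state['flow']:.0f} ml/min")
```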


Abstract of Bioengineering Human Myocardium on Native Extracellular Matrix

Rationale: More than 25 million individuals have heart failure worldwide, with ≈4000 patients currently awaiting heart transplantation in the United States. Donor organ shortage and allograft rejection remain major limitations with only ≈2500 hearts transplanted each year. As a theoretical alternative to allotransplantation, patient-derived bioartificial myocardium could provide functional support and ultimately impact the treatment of heart failure.

Objective: The objective of this study is to translate previous work to human scale and clinically relevant cells for the bioengineering of functional myocardial tissue based on the combination of human cardiac matrix and human induced pluripotent stem cell–derived cardiomyocytes.

Methods and Results: To provide a clinically relevant tissue scaffold, we translated perfusion-decellularization to human scale and obtained biocompatible human acellular cardiac scaffolds with preserved extracellular matrix composition, architecture, and perfusable coronary vasculature. We then repopulated this native human cardiac matrix with cardiomyocytes derived from nontransgenic human induced pluripotent stem cells and generated tissues of increasing 3-dimensional complexity. We maintained such cardiac tissue constructs in culture for 120 days to demonstrate definitive sarcomeric structure, cell and matrix deformation, contractile force, and electrical conduction. To show that functional myocardial tissue of human scale can be built on this platform, we then partially recellularized human whole-heart scaffolds with human induced pluripotent stem cell–derived cardiomyocytes. Under biomimetic culture, the seeded constructs developed force-generating human myocardial tissue and showed electrical conductivity, left ventricular pressure development, and metabolic function.

A roadmap for the next generation of additive manufacturing materials and processes

The Strategic Roadmap for the Next Generation of Additive Manufacturing Materials offers a strategy for building the fundamental knowledge necessary to accelerate the design and application of additive manufacturing (AM) materials over the next 10 years. It organizes research and activities for developing additive manufacturing materials into five strategic thrusts: enabling integrated design methodologies for materials, processes and parts; developing AM process-structure-property relationships; establishing part and feedstock testing protocols; building AM process analytics capabilities; and exploring next-generation AM materials and processes. (credit: Penn State)

Penn State University researchers have released a roadmap for developing future additive manufacturing (3D printing) materials and processes.

It’s much needed. According to the researchers, most of the feedstock materials currently used in 3D printing are costly, not readily available, and limited in variety. The first additive manufacturing (AM) processes were developed 30 years ago, yet all of the metal alloys currently used, for example, were originally developed to be processed by casting and forging.

The researchers also point to a limited understanding of these materials and to their inadequate compatibility with current AM processing technologies.

Funded by the U.S. National Institute of Standards and Technology, the new roadmap offers “a strategy for building the fundamental knowledge necessary to accelerate the design and application of additive manufacturing (AM) materials over the next 10 years.”

An example of a radical new additive-manufacturing process: JPL’s prototype of a compositionally graded mirror mount made by a new metal-based AM powder deposition technique. The gradient alloy component design — which contains a nickel and nickel-iron alloy at the top of the part and stainless steel at the base — replaces epoxy bonding techniques and mitigates the effects of thermal expansion caused by the extreme temperatures of outer space. (credit: NASA JPL)

The roadmapping effort involved more than 120 participants from industry, government, and academia, according to Todd Palmer, Penn State associate professor of materials science and engineering and senior research associate with the Applied Research Lab (ARL), who is the principal investigator on the project.

The roadmap organizes research and activities into five strategic thrusts: enabling integrated design methodologies for materials, processes and parts; developing AM process-structure-property relationships; establishing part and feedstock testing protocols; building AM process analytics capabilities; and exploring next-generation AM materials and processes.

The researchers have also been coordinating their roadmapping efforts with America Makes, the National Additive Manufacturing Innovation Institute, which helps transition research and development in AM into the marketplace.

The researchers are also hoping that the roadmap generates enough interest from academia, research institutions, government labs, and industry partners so that they can launch the Consortium for Additive Manufacturing Materials (CAMM).

Additive manufacturing (3D printing) could affect a wide range of industries, including defense, energy, aerospace, automotive, medical and metals manufacturing.


editor’s comments: It will be interesting to see what cool new products, materials, and processes for the maker community become available. What would you make if you had the ideal 3D printer and ideal materials?