A new Energy Frontier Research Center is spearheading the search for quantum materials that can revolutionize electronics and computer technologies
Rubul Mout

Neuromorphic computing, or cognitive computing, is an approach to building computer chips that mimics our brain’s neurons and synapses, as compared to traditional transistor-based computing. Image courtesy Nathan Johnson, Pacific Northwest National Laboratory.

In 1947, a new electronic device called the transistor was invented. It ended the hegemony of the earlier analog devices known as vacuum tubes and set the platform for the entire electronics industry for the next half-century or so. Transistors, which primarily amplify electric currents, also provided a means to miniaturize on/off electrical switches. Computers had just begun to evolve at about the same time, and they quickly adopted these miniature on/off transistor switches as '0' and '1', called bits, the smallest units of data in nearly all computing devices today.

The computer industry advanced quickly and packed a growing number of ever-smaller transistors into chips, enabling the development of more and more powerful computers. In the 1960s, Gordon Moore, co-founder of Intel Corporation and one of the industry's early entrepreneurs, observed this trend and predicted that the number of transistors per chip would double every year. This prediction became known as "Moore's law," and for decades it shaped the future of the computer industry, and arguably the global economy, politics, and more. Moore also asked: How small can a transistor be? Carver Mead, a physicist at the California Institute of Technology, helped answer this question: a transistor can be as small as a few nanometers.

The transistor's size is now approaching that lower limit, and the trend predicted by Moore's law can no longer continue. Experts believe that by the year 2025, transistor-based computers can't be made more powerful than those available today.

Yet the demand for computational power grows every day. A three-year-old child, for example, does better than today's most powerful computers at recognizing objects and gestures. Because artificial intelligence (AI) holds the potential to solve many of the problems humanity faces today, continually increasing computational power will be key to realizing AI's promise.

How should we, then, meet this demand?

Emulating the brain

Our brain transmits information from one neuron to others through synapses. Image courtesy Nathan Johnson, Pacific Northwest National Laboratory.

Carver Mead is also credited with the idea that the "brain" of a computer (the chip) can be made to resemble our own brains. Human brains have about 100 billion neurons (a type of cell). Each neuron makes around 7,000 connections to other neurons. These neuronal connections are called synapses, which occur through parts of neuron bodies called axons and dendrites. It is estimated that the brain of a three-year-old child has about 10^15 synapses.
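Those two figures are consistent with each other, as a quick back-of-the-envelope check shows (a rough order-of-magnitude sketch, not data from the article's sources):

```python
# Sanity check: ~100 billion neurons, each with ~7,000 connections,
# gives a synapse count on the order of 10^15.
neurons = 100e9                  # ~100 billion neurons
connections_per_neuron = 7000    # ~7,000 synapses per neuron

synapses = neurons * connections_per_neuron
print(f"{synapses:.0e}")  # 7e+14, i.e., on the order of 10^15
```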

Mead proposed that if we make a chip containing components that mimic the human brain's neurons and synapses, we can build more powerful, and more energy-efficient, alternatives to transistor-based computers. He called these "neuromorphic computers."

Over the past decade or so, a few companies and academic labs have built such neuromorphic devices (e.g., IBM's TrueNorth and Intel's Loihi chip). But these early prototypes have major limitations. Chief among them: their main components are transistors, which can only be made so small. It is estimated that one would need to fill about 20 rooms with stacks of transistor-based neuromorphic chips, each room consuming large amounts of energy, to build a computer with the computing power of a human brain. In comparison, the human brain is only about 15 cm long, weighs merely 3 pounds, and needs only about 20 watts of power.

So then, is it possible to build a neuromorphic computer that emulates the brain’s capabilities in terms of size, energy consumption, and computational efficiency? Possibly yes, if new materials and technologies are developed and used.

This is where a new Energy Frontier Research Center (EFRC) funded by the U.S. Department of Energy, called Quantum Materials for Energy-Efficient Neuromorphic Computing (Q-MEEN-C), comes in. The center aims to develop powerful new materials that could revolutionize computing technologies.

Meeting the quantum pioneers

On a Friday afternoon, I met three of the pioneers of Q-MEEN-C through a Zoom teleconference to discuss their center's vision and research. The director of the center and a professor of physics at the University of California San Diego (UC San Diego), Ivan Schuller, pointed out, "Our center's name tells everything—we are studying materials that have the potential to change how computers are built. These materials are called 'quantum materials,' which cannot be understood with conventional theories. Our objective is to understand them and use them in an energy-efficient fashion for neuromorphic computing."

Quantum materials host so-called "quasi-particles," which carry properties like "charge" and "spin" that can potentially be controlled, making these materials unique and useful.

Alex Frano, assistant director of the center and an assistant professor of physics at UC San Diego, noted that experts in very disparate fields are needed if the vision for Q-MEEN-C is to be realized. “We have to understand these materials at a fundamental level, which requires a comprehensive approach involving materials and device synthesis, characterization, and theory/modeling to predict their behavior,” said Frano.

Oleg Shpyrko, associate director of the center and a professor of physics at UC San Diego, interjected enthusiastically, “Big change requires the transformation of the platform itself. It [has] happened throughout human civilization. When the vacuum tube was replaced by transistors, there was a big revolution. Now the transistors need to be replaced for another revolution.” Schuller then amplified that point, “Vacuum tubes to the transistor; and now transistor to quantum devices.” Everyone agreed.

But fundamentally, what makes quantum materials promising for neuromorphic applications? To learn more, I contacted other researchers at Q-MEEN-C.

Tiny brain-like devices based on moving charges

Mimicking the brain's neurons and synapses through new quantum materials design could transform how computers work today. Image courtesy Q-MEEN-C.

I made a phone call to Javier del Valle Granda, a postdoctoral researcher in Schuller’s lab, who had recently published a paper in Nature describing how quantum materials were used to make an energy-efficient, charge-based neuromorphic computer.

In our brain, neurons communicate with each other by sending tiny electrical spikes (i.e., firing) through the synapses. Synapses control how much of a neuron's signal is transferred to other neurons. Analogously, quantum materials can be laid on a chip to act as neurons and synapses. Similar to how our brain's neurons fire in response to external stimuli, like light and temperature, quantum materials can respond to external stimuli, such as an electric field, temperature, pressure, or light, to trigger a firing. In del Valle's study, this firing used the metal-insulator transition (MIT) that occurs in some quantum materials. In an MIT, the material changes from an electrical insulator to a conductor (metal), or vice versa, in response to an external stimulus.

In addition to controlling electrical signals, some quantum materials can have memory. An insulator with an MIT becomes conducting when a threshold external stimulus is applied, and once the stimulus is withdrawn, the material goes back to its original insulating state. del Valle and his colleagues found that, in a certain class of quantum materials called Mott devices, there is a slight delay in going back to the insulating state, which imparts memory to the material.

"This is exciting because, if you have to trigger another firing in the same device within a short time, you need less than the threshold stimulus," explained del Valle, adding, "it already has the memory."
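The subthreshold firing del Valle describes can be caricatured with a leaky threshold model. This is a toy sketch of the idea, not the actual device physics from the Nature paper, and all the parameters below are invented for illustration:

```python
# Toy model of a Mott-like device with a slowly relaxing internal state.
# A stimulus raises the state; crossing the threshold triggers a "firing"
# (the metal-insulator transition). Because the state relaxes slowly after
# firing, a weaker follow-up stimulus delivered soon after can still fire
# the device, while the same weak stimulus alone cannot.

THRESHOLD = 1.0
DECAY = 0.8  # fraction of internal state retained per time step (the "memory")

def stimulate(state, amplitude):
    """Apply one stimulus pulse; return (new_state, fired?)."""
    state += amplitude
    if state >= THRESHOLD:
        return 0.9, True   # device fires; state stays elevated briefly
    return state, False

state = 0.0
state, fired1 = stimulate(state, 1.2)   # strong pulse: fires
state *= DECAY                          # short wait: partial relaxation
state, fired2 = stimulate(state, 0.4)   # weak pulse: fires thanks to memory

state = 0.0                             # fully relaxed device
state, fired3 = stimulate(state, 0.4)   # same weak pulse alone: no firing
print(fired1, fired2, fired3)  # True True False
```

The key behavior, mirroring the quote above, is that the second, subthreshold pulse succeeds only because the device has not yet "forgotten" the first one.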

While "charge-based" devices behave much like our brain's neurons and synapses, "spin-based" devices work in a fundamentally different way. To understand that, I had to call another expert at Q-MEEN-C.

Pendulum-like spin materials make perfect brains, too

Andy Kent, a professor of physics and the Director of the Center for Quantum Phenomena at New York University, and a principal investigator in Q-MEEN-C, explained how ‘spin-based’ devices made of quantum materials will work in neuromorphic computers.

The magnetic moment, or spin, of certain materials oscillates when in a magnetic field. These oscillations can be modified when an external stimulus, such as an electric current, is applied, analogous to how the oscillations of a pendulum can be changed by a push. Note that it is the magnetic moment that oscillates, not the material itself, just as it is the pendulum that oscillates, not the whole pendulum clock. When the spins of the material oscillate, they emit a wave known as a "spin wave." This spin wave can make connections with neighboring oscillators, similar to the way neurons connect through synapses.

Most amazingly, like the charge-based devices, spin oscillators have memory, too. But how is that possible?

Kent put it in this way, “Well, think of the pendulum again. Once it oscillates, it already has momentum, and you don’t need to push it as hard as the first time; as long as it has not completely stopped oscillating. But once it stops, it loses its memory.”
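Kent's pendulum picture maps onto a damped oscillator: while some amplitude remains, a small push restores full swing; once the motion has died out, a full push is needed again. A minimal numerical sketch of that idea (the damping rate and time scales are invented for illustration):

```python
import math

DAMPING = 0.05  # fractional amplitude decay rate per time step (illustrative)

def ring_down(amplitude, steps):
    """Let a damped oscillator decay freely for `steps` time steps."""
    return amplitude * math.exp(-DAMPING * steps)

def push_needed(amplitude, target=1.0):
    """Extra amplitude required to restore the target oscillation."""
    return max(target - amplitude, 0.0)

a_soon = ring_down(1.0, 5)     # still ringing: it "remembers" being driven
a_late = ring_down(1.0, 200)   # long wait: the oscillation has died out

print(round(push_needed(a_soon), 2))   # small push restores full swing
print(round(push_needed(a_late), 2))   # ~1.0: a full push is needed again
```

Once the amplitude decays to zero, no trace of the earlier drive remains, which is the "forgetting" Kent describes.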

But losing memory is also sometimes good (yes, we all want to forget our painful memories in our personal lives!). “To be a good neuron, oscillators need to remember, but also forget,” added Kent.

These aren’t the only types of oscillators that Q-MEEN-C is developing—they are also developing ones that are sensitive to stimuli like light, pressure, and temperature. Q-MEEN-C is also looking for new materials that could use both charge and spin for computations.

An important step in the neuromorphic computer revolution

As Q-MEEN-C forges new frontiers toward the eventual development of neuromorphic computers, one thing seems clear: quantum materials will play an important role in 'new-age' computing. Ivan Schuller made an important point during our conversation: companies like Bell Labs, which invested deeply in the fundamental research crucial to the development of the transistor and related technologies, no longer exist. Schuller predicts that DOE's decision to fund Q-MEEN-C will be deemed a wise and high-impact investment in the development of energy-efficient, brain-inspired computing.

More Information

del Valle J, JG Ramírez, MJ Rozenberg, and IK Schuller. "Challenges in materials and devices for resistive-switching-based neuromorphic computing." Journal of Applied Physics 124, 211101 (2018). DOI: 10.1063/1.5047800

del Valle J, P Salev, F Tesler, NM Vargas, Y Kalcheim, P Wang, J Trastoy, M-H Lee, G Kassabian, JG Ramírez, MJ Rozenberg, and IK Schuller. "Subthreshold firing in Mott nanodevices." Nature 569, 388–392 (2019). DOI: 10.1038/s41586-019-1159-6


About the author(s):

Rubul Mout, Ph.D., is a Washington Research Foundation Innovation Fellow at the Institute for Protein Design, University of Washington. He is a member of the Center for the Science of Synthesis Across Scales (CSSAS), an Energy Frontier Research Center. He received his Ph.D. from the University of Massachusetts Amherst, where he worked in the field of nanotechnology and gene editing. In his current research, he makes "brand new" protein machines, never before built in nature, for a range of applications including cell signaling, optogenetics, and bioelectronics. He is also the author of two books, a collection of short stories and a memoir, in his native language Assamese.