AI Chip Breakthrough: Memristors Mimic Neural Timekeeping

Artificial neural networks could soon process time-dependent data more efficiently with the development of a tunable memristor. This technology, detailed in a University of Michigan-led study, could significantly reduce AI energy consumption. Credit: SciTechDaily.com

In the brain, timekeeping is done with neurons that relax at different rates after receiving a signal; now memristors—hardware analogs of neurons—can do that too.

Artificial neural networks may soon be able to process time-dependent information, such as audio and video data, more efficiently. The first memristor with a ‘relaxation time’ that can be tuned is reported today in Nature Electronics, in a study led by the University of Michigan.

Energy Efficiency and AI

Memristors, electrical components that store information in their electrical resistance, could reduce AI’s energy needs by about a factor of 90 compared to today’s graphics processing units. AI is already projected to account for about half a percent of the world’s total electricity consumption in 2027, and that share could balloon as more companies sell and use AI tools.

“Right now, there’s a lot of interest in AI, but to process bigger and more interesting data, the approach is to increase the network size. That’s not very efficient,” said Wei Lu, the James R. Mellor Professor of Engineering at U-M and co-corresponding author of the study with John Heron, U-M associate professor of materials science and engineering.

The Problem With GPUs

The problem is that GPUs operate very differently from the artificial neural networks that run the AI algorithms—the whole network and all its interactions must be sequentially loaded from the external memory, which consumes both time and energy. In contrast, memristors offer energy savings because they mimic key aspects of the way that both artificial and biological neural networks function without external memory. To an extent, the memristor network can embody the artificial neural network.

Innovations in Memristor Materials

“We anticipate that our brand-new material system could improve the energy efficiency of AI chips six times over the state-of-the-art material without varying time constants,” said Sieun Chae, a recent U-M Ph.D. graduate in materials science and engineering and co-first author of the study with Sangmin Yoo, a recent U-M Ph.D. graduate in electrical and computer engineering.

In a biological neural network, timekeeping is achieved through relaxation. Each neuron receives electrical signals and sends them on, but there is no guarantee that a signal will move forward. Some threshold of incoming signals must be reached before the neuron will send its own, and it has to be met within a certain amount of time. If too much time passes, the neuron is said to relax as the electrical energy seeps out of it. Having neurons with different relaxation times in our neural networks helps us understand sequences of events.
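The relaxation mechanism described above can be sketched as a simple leaky integrate-and-fire model. This is a generic illustration, not the study's model; the time constant `tau`, the input weight, and the threshold are all made-up values chosen for the example.

```python
import math

def first_spike_time(input_times, tau, threshold=1.0, weight=0.6):
    """Leaky integrate-and-fire sketch: each input adds `weight` to the
    neuron's potential, which decays exponentially with time constant
    `tau` between inputs (the 'relaxation'). Returns the time the
    potential first crosses `threshold`, or None if it never does."""
    v, t_prev = 0.0, 0.0
    for t in input_times:
        v *= math.exp(-(t - t_prev) / tau)  # energy seeps out over time
        v += weight                          # incoming signal arrives
        t_prev = t
        if v >= threshold:
            return t                         # neuron fires
    return None

spikes = [0.0, 1.0, 2.0]
slow = first_spike_time(spikes, tau=5.0)  # slow relaxation: inputs accumulate
fast = first_spike_time(spikes, tau=0.1)  # fast relaxation: inputs forgotten
```

A neuron with a long relaxation time still "remembers" the earlier inputs when the later ones arrive, so it fires; a fast-relaxing neuron treats each input in isolation and never reaches threshold. That sensitivity to input timing is what makes a population of neurons with varied relaxation times useful for sequence processing.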

How Memristors Work

Memristors operate a little differently. Rather than the total presence or absence of a signal, what changes is how much of the electrical signal gets through. Exposure to a signal reduces the resistance of the memristor, allowing more of the next signal to pass. In memristors, relaxation means that the resistance rises again over time.
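A toy model of the behavior described above: each signal pulse boosts the device's conductance (lowers its resistance), and that boost decays back toward the baseline over a relaxation time. The baseline conductance `g0`, boost `dg`, and time constant `tau` are illustrative values, not device parameters from the study.

```python
import math

def conductance(pulse_times, t_read, g0=1.0, dg=0.5, tau=2.0):
    """Toy memristor: each past pulse adds `dg` to the conductance, and
    the excess decays exponentially with relaxation time `tau`, i.e.
    the resistance rises again as the device 'forgets'."""
    excess = sum(
        dg * math.exp(-(t_read - t) / tau)
        for t in pulse_times
        if t <= t_read
    )
    return g0 + excess

# Read shortly after a pulse: conductance is still elevated, so more of
# the next signal would get through. Read much later: the device has
# relaxed almost all the way back to baseline.
g_soon = conductance([0.0], t_read=0.1)
g_late = conductance([0.0], t_read=10.0)
```

In this picture, tuning `tau` per device, which the source reports is what the new material system enables, is what lets a memristor network encode different timescales the way biological neurons with different relaxation times do.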

While Lu’s group had explored building relaxation time into memristors in the past, it was not something that could be systematically controlled. But now, Lu and Heron’s team have shown that variations on a base material can provide different relaxation times, enabling memristor networks to mimic this timekeeping mechanism.

Material Composition and Testing

The team built the materials on the superconductor YBCO, made of yttrium, barium, copper, and oxygen. It has no electrical resistance at temperatures below -292 °F.