Brain-inspired hafnium-oxide device could cut AI energy use by up to 70%

A brain-like nanoelectronic device developed by researchers at the University of Cambridge could sharply reduce the electricity demand of artificial intelligence, potentially cutting energy use by as much as 70%. The device, described in Science Advances, mimics the way neurons process and store information in one place, an approach known as neuromorphic computing.
Today’s AI systems rely on conventional chips that shuttle data constantly between memory and processors, a power-hungry back-and-forth that grows more costly as models scale. By contrast, neuromorphic hardware merges computation and storage, enabling analogue “in-memory” computing that more closely resembles the brain’s architecture.
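To make the "in-memory" idea concrete, here is a minimal sketch of how an analogue crossbar performs a multiply-accumulate in place. The numbers and array sizes are illustrative, not from the paper: each weight is stored as a device conductance, input voltages drive the rows, and the column wires physically sum the resulting currents.

```python
import numpy as np

# Illustrative sketch (not the paper's device): in an analogue crossbar,
# each stored weight is a conductance G (siemens). A row voltage V makes
# each device pass a current I = G * V (Ohm's law), and the column wire
# sums those currents (Kirchhoff's current law), so a matrix-vector
# product happens where the data is stored, with no memory shuttling.

rng = np.random.default_rng(0)
weights = rng.uniform(0.1, 1.0, size=(4, 3))   # conductances: 4 inputs x 3 outputs
inputs = rng.uniform(0.0, 0.5, size=4)         # row voltages

column_currents = inputs @ weights             # what the column wires read out

# The same result computed one device at a time, for comparison:
expected = np.zeros(3)
for col in range(3):
    for row in range(4):
        expected[col] += weights[row, col] * inputs[row]

assert np.allclose(column_currents, expected)
```

In a digital chip, each of those multiplications would require fetching the weight from memory first; in the crossbar, the physics of the array does the arithmetic.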
The Cambridge team engineered a modified form of hafnium oxide to function as a highly stable, low-energy memristor, a device intended to emulate synaptic connections. Lead author Dr Babak Bakhit, of Cambridge's Department of Materials Science and Metallurgy and the Department of Engineering, said AI hardware needs extremely low currents, excellent stability, uniformity across cycles and devices, and the ability to switch among many distinct states.
Most existing memristors rely on tiny conductive filaments in metal oxides, a mechanism whose inherent randomness and high operating voltages limit scalability. To avoid filamentary behavior, the researchers added strontium and titanium to create a hafnium-based thin film and used a two-step growth process that formed p–n junctions at the interfaces between layers.
Instead of forming and rupturing filaments, the device changes resistance by adjusting the energy barrier at these interfaces, producing smoother, more predictable switching. According to the team, the devices operate at switching currents roughly a million times lower than some conventional oxide-based memristors and achieve hundreds of stable conductance levels.
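Hundreds of distinguishable conductance levels matter because each device can then store a multi-bit weight rather than a single bit. The sketch below assumes 256 levels and an arbitrary conductance range purely for illustration; neither figure is taken from the paper.

```python
# Hypothetical illustration: with 256 distinguishable conductance levels
# (the paper reports "hundreds"), one device can hold an 8-bit weight.
# The conductance range and level count below are assumed, not measured.

G_MIN, G_MAX, LEVELS = 1e-9, 1e-6, 256  # siemens; illustrative range

def program(weight: float) -> float:
    """Map a weight in [0, 1] to the nearest stable conductance level."""
    step = (G_MAX - G_MIN) / (LEVELS - 1)
    level = round(weight * (LEVELS - 1))
    return G_MIN + level * step

def read(conductance: float) -> float:
    """Recover the stored weight from a conductance readout."""
    return (conductance - G_MIN) / (G_MAX - G_MIN)

w = 0.7301
stored = read(program(w))
# Quantization error is bounded by one level spacing:
assert abs(stored - w) < 1 / (LEVELS - 1)
```

More levels per device means fewer devices per weight, which is part of why multilevel switching is listed alongside low current and uniformity as a requirement for AI hardware.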
Lab tests showed the devices withstood tens of thousands of switching cycles and held their programmed states for about a day. They also exhibited spike-timing dependent plasticity, a key biological learning behavior in which the timing of signals strengthens or weakens connections.
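The timing dependence described above is commonly modeled with an exponential pair-based rule. The sketch below uses textbook constants chosen for illustration; they are not parameters measured for the Cambridge devices.

```python
import math

# Illustrative pair-based STDP rule (a standard textbook model, not the
# paper's measured characteristics): if the presynaptic spike arrives just
# before the postsynaptic one (dt > 0), the connection strengthens; if
# just after (dt < 0), it weakens. The effect decays with the time gap.

A_PLUS, A_MINUS = 0.05, 0.055   # learning-rate amplitudes (assumed)
TAU = 20.0                      # decay time constant in ms (assumed)

def stdp_delta(dt_ms: float) -> float:
    """Weight change for a spike pair separated by dt_ms = t_post - t_pre."""
    if dt_ms > 0:
        return A_PLUS * math.exp(-dt_ms / TAU)    # potentiation
    if dt_ms < 0:
        return -A_MINUS * math.exp(dt_ms / TAU)   # depression
    return 0.0

assert stdp_delta(5.0) > 0                             # pre before post: strengthen
assert stdp_delta(-5.0) < 0                            # pre after post: weaken
assert abs(stdp_delta(100.0)) < abs(stdp_delta(5.0))   # effect fades with |dt|
```

A memristor exhibiting this behavior can, in principle, adjust its own conductance in response to spike timing, letting learning happen in the hardware itself.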
“Because our devices switch at the interface, they show outstanding uniformity from cycle to cycle and from device to device,” Bakhit said. A major obstacle remains: fabrication currently requires temperatures around 700°C, higher than typical limits in standard semiconductor manufacturing.
The researchers said they are working to lower the processing temperature to make the technology compatible with industry workflows. If that hurdle can be cleared, the devices could be integrated into practical chip-scale systems aimed at far more energy-efficient AI.
