Researchers track qubit data loss 100 times faster, clarifying quantum instability

Quantum computers promise to solve problems beyond the reach of today’s machines, but their Achilles’ heel remains instability: qubits lose information unpredictably.
A research collaboration led by the Niels Bohr Institute in Copenhagen, working with scientists at the Norwegian University of Science and Technology (NTNU), says it has developed a way to track that loss more than 100 times faster than before, offering a clearer view into what goes wrong inside these systems.
“In quantum computers, information is transmitted and stored using qubits. But quantum information can quickly be lost,” said Jeroen Danon, a professor in NTNU’s Department of Physics. A central obstacle has been measuring exactly how fast that information disappears.
In widely used superconducting qubits, Danon noted, the average lifetime can be reasonable, but it fluctuates in ways that appear random over time, which makes performance hard to improve without better diagnostics. Until now, gauging how long quantum information lasts typically took about one second, an eternity in quantum physics.
“We managed to do it in approximately 10 milliseconds, i.e. more than 100 times faster. And more or less in real time,” Danon said. The team describes its approach as a measurement method that captures the time it takes for information to be lost with high speed and accuracy, enabling researchers to watch instability develop as it happens.
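The article does not describe the team's protocol, so as a rough illustration of the kind of estimate involved, here is a minimal sketch of the standard way a qubit's energy-relaxation lifetime (often called T1) is gauged: prepare the excited state, wait a variable delay, measure whether the qubit is still excited, and fit the survival probability to an exponential decay. All numbers below are hypothetical, and the simple log-linear fit stands in for whatever faster estimator the researchers actually use.

```python
import numpy as np

# Hypothetical illustration only: simulate T1 decay data and fit it.
rng = np.random.default_rng(0)

true_t1 = 50e-6                        # assumed lifetime: 50 microseconds
delays = np.linspace(0, 200e-6, 25)    # wait times before each measurement
shots = 2000                           # repetitions per delay

# Each shot finds the qubit still excited with probability exp(-t / T1);
# binomial sampling models shot noise in the measured probabilities.
p_true = np.exp(-delays / true_t1)
p_meas = rng.binomial(shots, p_true) / shots

# Log-linear least-squares fit: log p = -(1/T1) * t, so the slope gives T1.
slope, _ = np.polyfit(delays, np.log(p_meas), 1)
t1_fit = -1.0 / slope
print(f"fitted T1 = {t1_fit * 1e6:.1f} microseconds")
```

The point of the speedup described in the article is that, in practice, collecting and fitting such data fast enough to follow a fluctuating lifetime in real time is the hard part, not the fit itself.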
That leap in speed, the researchers said, reveals subtle, rapid changes that would otherwise be missed. By pinpointing when and how qubits degrade, it becomes easier to identify the underlying causes of information loss and to refine quantum processors accordingly.
While further work is needed to translate these insights into more robust machines, the team says the technique could reshape how scientists test and fine-tune quantum hardware, an incremental but important step toward practical, reliable quantum computing.
