Researchers at the Niels Bohr Institute have achieved a major milestone: a real-time monitoring system capable of tracking the microscopic fluctuations that occur inside quantum computers. By combining commercially available hardware with adaptive measurement techniques, the team can now observe sudden shifts in quantum behavior that earlier methods were too slow to detect.
This quantum computing breakthrough operates about one hundred times faster than previous methods. Instead of waiting minutes for data, the new system delivers results in milliseconds. Scientists can now spot almost instantly when a stable quantum bit shifts into an unstable state, providing a clearer path toward making quantum hardware more reliable.
The Fragile Nature of Qubits
Qubits serve as the fundamental building blocks of quantum computers, which scientists hope will eventually outperform today’s supercomputers. However, managing qubits is an incredibly delicate process because they are highly sensitive to their environment.
The physical materials used to construct processors often contain microscopic defects that scientists do not yet fully understand. These tiny imperfections can shift hundreds of times per second. When these defects move, they alter the speed at which a qubit loses energy, resulting in the loss of valuable quantum information. Researchers compare this challenge to a workhorse pulling a plow; if obstacles constantly appear in the horse’s path faster than the farmer can react, unpredictable disruptions ruin the work.
Overcoming Slow Measurement Methods
Before this advancement, standard testing routines were too slow to capture real-time behavior. Traditional characterization could take up to a full minute to evaluate how a single qubit was performing. Because fluctuations happen in a fraction of a second, old testing methods missed the rapid changes entirely.
As a result, scientists were forced to rely on an average energy-loss rate. This masked the true, unstable nature of the hardware, giving researchers an incomplete picture of what was actually happening inside the processor.
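To see why a single averaged number can hide the problem, consider a small back-of-the-envelope calculation. The T1 values and the fraction of time spent in the degraded state below are invented for illustration, not figures from the study:

```python
# Illustrative numbers only (not taken from the study): a qubit that relaxes
# slowly most of the time but occasionally jumps to a fast-decay state when
# a nearby material defect shifts.
t1_good_us = 50.0    # energy-relaxation time (T1) in the stable state, in microseconds
t1_bad_us = 5.0      # T1 while a defect is active
fraction_bad = 0.10  # assume the qubit spends 10% of its time in the degraded state

# A minute-long characterization effectively reports one averaged decay rate.
avg_rate = (1 - fraction_bad) / t1_good_us + fraction_bad / t1_bad_us
print(f"Time-averaged T1 looks like {1 / avg_rate:.1f} microseconds")

# But errors from energy loss during a gate of length t scale roughly as t / T1,
# so the brief bad episodes dominate the failures that the average hides.
gate_us = 1.0
print(f"Error per gate: {gate_us / t1_good_us:.1%} when stable vs "
      f"{gate_us / t1_bad_us:.1%} while the defect is active")
```

In this toy example the averaged T1 still looks respectable, while the error rate during the brief unstable episodes is ten times worse, which is exactly the behavior a slow, averaged measurement cannot reveal.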
Reaching New Speeds with FPGA Hardware
To solve this, a research team led by postdoctoral researcher Dr. Fabrizio Berritta developed an adaptive measurement approach. The project involved the Niels Bohr Institute, the Novo Nordisk Foundation Quantum Computing Programme, Chalmers University, Leiden University, and the Norwegian University of Science and Technology.
The team achieved unprecedented speeds using a Field-Programmable Gate Array (FPGA), a classical chip whose circuitry can be configured to carry out a specific task with very low latency. By running the experiment logic directly on this chip, the researchers generated a “best guess” of the energy-loss rate from only a handful of measurements, eliminating the need to send data on slow round trips to a conventional computer.
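Conceptually, such a “best guess” can be built with Bayesian inference over candidate decay times: each single-shot readout reweights the candidates according to how likely that outcome would be under exponential energy decay. The sketch below illustrates the idea in plain Python on a simulated qubit; the wait time, the grid of candidates, and the “true” T1 are assumptions made for the demo and say nothing about the team's actual FPGA code:

```python
import numpy as np

rng = np.random.default_rng(0)

# Candidate energy-relaxation times (T1), in microseconds, with a flat prior.
t1_grid = np.linspace(5.0, 100.0, 200)
posterior = np.ones_like(t1_grid) / t1_grid.size

def excited_probability(t_wait, t1):
    """Chance the qubit is still measured 'excited' after waiting t_wait."""
    return np.exp(-t_wait / t1)

# Simulate a handful of single-shot measurements of a qubit whose true T1
# is 30 us (a made-up value), each probed after a fixed 20 us wait.
true_t1, t_wait = 30.0, 20.0
for _ in range(10):
    outcome = rng.random() < excited_probability(t_wait, true_t1)  # True = still excited
    p1 = excited_probability(t_wait, t1_grid)
    likelihood = p1 if outcome else (1.0 - p1)
    posterior *= likelihood          # Bayes' rule: reweight each candidate T1
    posterior /= posterior.sum()     # renormalize

best_guess = t1_grid[np.argmax(posterior)]
print(f"Best-guess T1 after 10 shots: {best_guess:.1f} us")
```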
Adaptive Learning in Milliseconds
Programming an FPGA is notoriously complex, yet the team succeeded in configuring the controller to update its internal Bayesian model after every single measurement, allowing the system to continuously refine its picture of the quantum state in real time.
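A minimal sketch of what an update-after-every-shot loop can look like, assuming the same grid-based Bayesian model as above and a generic rule of thumb for choosing the next wait time (probing near the current T1 estimate); neither detail is taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
t1_grid = np.linspace(5.0, 100.0, 200)
posterior = np.ones_like(t1_grid) / t1_grid.size
true_t1 = 30.0  # hidden "true" value used only by the simulation

t_wait = 20.0   # initial probe delay in microseconds, an arbitrary starting point
for shot in range(50):
    # One measurement: excite the qubit, wait t_wait, then read it out.
    outcome = rng.random() < np.exp(-t_wait / true_t1)
    p1 = np.exp(-t_wait / t1_grid)
    posterior *= p1 if outcome else (1.0 - p1)
    posterior /= posterior.sum()

    # Adaptive step: re-aim the next probe delay at the current estimate.
    # Probing near t ~ T1 is a common heuristic for an informative shot;
    # the team's actual scheduling rule is not described in this article.
    t1_estimate = float(np.sum(t1_grid * posterior))
    t_wait = t1_estimate

print(f"Estimated T1 after 50 adaptive shots: {t1_estimate:.1f} us")
```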
Because the controller updates in milliseconds, it keeps pace with the rapidly changing environment. These experiments also revealed new insights: before this project, scientists simply did not know exactly how fast fluctuations occurred inside superconducting qubits.
Commercial Technology Meets Quantum Hardware
The team achieved these results using the OPX1000, a commercially available FPGA controller provided by Quantum Machines. It can be programmed using a language similar to Python, keeping the technology accessible to other physicists globally.
The quantum processing unit used was designed and fabricated at Chalmers University. Associate Professor Morten Kjaergaard noted that the controller enabled a very tight integration between logic, measurements, and feedforward, which made the experiment possible.
Implications for Future Processors
While large-scale quantum computers are still under development, these findings reshape how scientists calibrate superconducting processors. With current manufacturing methods, real-time monitoring is essential for improving hardware reliability.
According to Dr. Berritta, the overall performance of a quantum processor is limited by its worst qubits. The team was surprised to discover that a highly functional qubit can degrade into a poorly performing one in fractions of a second.
Thanks to the fast-control algorithm, the hardware can pinpoint unstable components instantly, gathering vital statistics in seconds instead of days. While researchers cannot yet explain all of the physical fluctuations they observe, understanding these mechanisms will be necessary for scaling quantum computers to a useful size.
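One simple way live estimates like these could be turned into an instability flag is a threshold test on the streaming T1 value; the trace, the threshold, and the rule below are illustrative assumptions, not the team's calibration procedure:

```python
import numpy as np

# Hypothetical stream of live T1 estimates (microseconds), one per millisecond,
# of the kind a real-time monitor might produce. The jump at index 60 is
# invented to stand in for a defect suddenly activating.
live_t1 = np.concatenate([np.full(60, 48.0), np.full(40, 7.0)])
live_t1 += np.random.default_rng(2).normal(0, 1.5, live_t1.size)  # estimation noise

threshold_us = 20.0  # flag the qubit if its live T1 estimate drops below this
flagged = np.flatnonzero(live_t1 < threshold_us)
if flagged.size:
    print(f"Qubit flagged as unstable {flagged[0]} ms into the trace")
```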
