A newly discovered trade-off in the way time-keeping devices operate on a fundamental level could set a hard limit on the performance of large-scale quantum computers, according to researchers from the Vienna University of Technology.
While the issue isn’t exactly pressing, our ability to grow systems based on quantum operations from backroom prototypes into practical number-crunching behemoths will depend on how reliably we can dissect the day into ever finer portions. This is a feat the researchers say will only become more challenging.
Whether you’re counting the seconds with whispers of Mississippi or dividing them up with the pendulum-swing of an electron in atomic confinement, the measure of time is bound by the limits of physics itself.
One of these limits involves the resolution with which time can be split. Measures of any event shorter than 5.39 x 10⁻⁴⁴ seconds, a duration known as the Planck time, run afoul of theories on the basic functions of the Universe. They just don’t make any sense, in other words.
Yet even before we get to that hard line in the sands of time, physicists think there is a toll to be paid that could prevent us from continuing to measure ever smaller units.
Sooner or later, every clock winds down. The pendulum slows, the battery dies, the atomic laser needs resetting. This isn’t merely an engineering challenge: the march of time itself is a feature of the Universe’s progress from a highly ordered state to a chaotic, disordered mess, a one-way slide tracked by a quantity known as entropy.
“Time measurement always has to do with entropy,” says senior author Marcus Huber, a systems engineer who leads a research group at the intersection of Quantum Information and Quantum Thermodynamics at the Vienna University of Technology.
In their recently published theorem, Huber and his team lay out the logic that connects entropy as a thermodynamic phenomenon with resolution, demonstrating that unless you’ve got infinite energy at your fingertips, your fast-ticking clock will eventually run into precision problems.
Or as the study’s first author, theoretical physicist Florian Meier puts it, “That means: Either the clock works quickly or it works precisely – both are not possible at the same time.”
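The quick-or-precise trade-off can be illustrated with a toy stochastic clock; this is a hedged sketch for intuition, not the authors' actual model. Suppose each tick of a clock is built from m random sub-events, where each sub-event stands in for a fixed dose of dissipated entropy, so the total entropy budget per unit time is pinned by the sub-event rate alone. Averaging over more sub-events per tick makes the tick steadier (relative jitter falls as roughly 1/√m) but also slower (tick rate falls as 1/m): precision and resolution pull against each other at fixed entropy cost.

```python
import random
import statistics

def erlang_tick_periods(m, rate, n_ticks, seed=0):
    """Simulate n_ticks of a toy clock whose tick fires only after m
    exponential sub-events of the given rate (an 'Erlang clock').
    Each sub-event stands in for a fixed dose of dissipated entropy,
    so the entropy budget per unit time is set by `rate` alone."""
    rng = random.Random(seed)
    return [sum(rng.expovariate(rate) for _ in range(m))
            for _ in range(n_ticks)]

def clock_stats(m, rate=1.0, n_ticks=20000):
    """Return (resolution, relative jitter) for the toy clock."""
    periods = erlang_tick_periods(m, rate, n_ticks)
    mean = statistics.fmean(periods)
    jitter = statistics.stdev(periods) / mean  # relative spread per tick
    resolution = 1.0 / mean                    # ticks per unit time
    return resolution, jitter

# At a fixed entropy (sub-event) rate, more sub-events per tick
# buys a steadier tick at the cost of a slower one:
for m in (1, 16, 256):
    res, jit = clock_stats(m)
    print(f"m={m:4d}  resolution={res:.4f}  relative jitter={jit:.4f}")
```

Running the sweep shows jitter shrinking and resolution shrinking together as m grows, which is the trade-off Meier describes: at a fixed entropy budget the clock can tick quickly or tick steadily, but not both.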
This might not be a major problem if you want to count out seconds that won’t deviate over the lifetime of our Universe. But for technologies like quantum computing, which rely on the temperamental nature of particles hovering on the edge of existence, timing is everything.
This isn’t a big problem when the number of particles is small. As they increase in number, though, the risk that any one of them will be knocked out of its delicate quantum state rises, leaving less and less time to carry out the necessary computations.
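Why the time budget shrinks with scale can be seen with a simple hedged model: assume, purely for illustration (this assumption is not taken from the paper), that each qubit independently decoheres at a fixed exponential rate. The computation is limited by the first qubit to fail, and the expected time to that first failure falls as 1/N with N qubits.

```python
import random
import statistics

def time_to_first_error(n_qubits, rate=1.0, trials=20000, seed=1):
    """Average time until the first of n_qubits fails, with each qubit
    decohering independently at the given exponential rate.
    (An illustrative toy assumption, not a model from the paper.)"""
    rng = random.Random(seed)
    return statistics.fmean(
        min(rng.expovariate(rate) for _ in range(n_qubits))
        for _ in range(trials))

# The usable computation window shrinks as qubits are added:
for n in (1, 10, 100):
    print(f"{n:4d} qubits -> mean time to first error "
          f"{time_to_first_error(n):.4f}")
```

Under this toy assumption, a hundred-fold increase in qubit count cuts the expected error-free window a hundred-fold, which is why ever-finer, ever-more-precise timing matters as machines scale up.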
Plenty of research has gone into exploring the potential for errors in quantum technology caused by a noisy, imperfect Universe. This appears to be the first time researchers have looked at the physics of timekeeping itself as a potential obstacle.
“Currently, the accuracy of quantum computers is still limited by other factors, for example the precision of the components used or electromagnetic fields,” says Huber.
“But our calculations also show that today we are not far from the regime in which the fundamental limits of time measurement play the decisive role.”
It’s likely other advances in quantum computing will improve stability, reduce errors, and ‘buy time’ for scaled-up devices to operate in optimal ways. But whether entropy will have the final say on just how powerful quantum computers can get, only time will tell.
This research was published in Physical Review Letters.