Cracking the limits of clocks: a new uncertainty relation for time itself

What if a chemical reaction, ocean waves or even your heartbeat could all be used as clocks? That’s the starting point of a new study by Kacper Prech, Gabriel Landi and collaborators, who uncovered a fundamental, universal limit to how precisely time can be measured in noisy, fluctuating systems. Their discovery – the clock uncertainty relation (CUR) – doesn’t just refine existing theory; it reframes timekeeping as an information problem embedded in the dynamics of physical processes, from nanoscale biology to engineered devices.

This work rests on a simple but powerful reframing: anything that “clicks” regularly is a clock. In the research paper’s opening analogy, a castaway tries to cook a fish without a wristwatch. They could count bird calls, ocean waves or heartbeats – each a potential timekeeper with a different cadence and regularity. But questions remain: given real-world fluctuations, what’s the best way to estimate time, and what are the inescapable limits?

The authors answer both. They show that for a huge class of systems – those described by classical, Markovian jump processes (systems where the future depends only on the present state, not the past history – a standard model across statistical physics and biophysics) – there is a tight, achievable bound on timekeeping precision. The bound is controlled not by how often the system jumps on average (the traditional “dynamical activity”) but by a subtler quantity: the mean residual time, the average time you’d wait for the next event if you start observing at a random moment. That distinction matters.

The inspection paradox
The graphic illustrates the mean residual time used in the CUR and how it connects to the so-called inspection paradox – a counterintuitive bias where randomly arriving observers are more likely to land in longer gaps between events. Buses arrive in clusters (gaps of 5 min) separated by long intervals (15 min), so while the average time between buses might seem moderate, a randomly arriving passenger (represented by the coloured figures) is statistically more likely to land in one of the long 15-min gaps than in a short 5-min one. The mean residual time is the average time a passenger waits for their bus if they arrive at the bus stop at a random time. Counterintuitively, this can be much longer than the average time between buses. The visual also demonstrates why the mean residual time captures more information than the simple average interval, since it accounts for the uneven distribution of gaps that biases your real waiting experience. (Courtesy: IOP Publishing)
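The caption’s numbers can be checked with a few lines of arithmetic. A minimal sketch in Python, assuming a 2:1 mix of short to long gaps (the caption gives only the two gap lengths, so this ratio is illustrative):

```python
# Bus gaps as in the caption: clusters of short 5-min gaps separated by
# long 15-min intervals. The 2:1 mix of short to long gaps is assumed
# purely for illustration.
gaps = [5, 5, 15]

mean_gap = sum(gaps) / len(gaps)   # ~8.33 min between buses on average
naive_wait = mean_gap / 2          # ~4.17 min, ignoring the length bias

# A random arrival lands in a gap with probability proportional to its
# length, then waits half of that gap on average, so the mean residual
# time is E[g^2] / (2 E[g]).
mean_residual = sum(g * g for g in gaps) / (2 * sum(gaps))   # 5.5 min

# The gap a random passenger actually finds themselves in is length-biased:
biased_gap = sum(g * g for g in gaps) / sum(gaps)            # 11 min, not 8.33
```

With these numbers a typical passenger sits in an 11-minute gap even though the average gap is only 8.33 minutes, and waits 5.5 minutes rather than the naive 4.17; for heavier-tailed gap distributions the residual time can exceed the mean gap itself.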

The study introduces the CUR, a universal, tight bound on timekeeping precision that – unlike earlier bounds – can actually be saturated, and the researchers identify the exact observables that achieve this limit. Surprisingly, the optimal strategy for estimating time from a noisy process is remarkably simple: sum the expected waiting times of each observed state along the trajectory, rather than relying on complex fitting methods. The work also reveals that the true limiting factor for precision isn’t the traditional dynamical activity but the inverse of the mean residual time. This makes the CUR provably tighter than the earlier kinetic uncertainty relation, especially in systems far from equilibrium.
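The “sum the expected waiting times” idea is easy to sketch on a toy two-state Markovian jump process. The rates below are chosen arbitrarily and this is an illustration of the principle, not the paper’s construction: each observed dwell is replaced by the mean waiting time of the state it occurred in.

```python
import random

random.seed(1)

# Toy two-state Markovian jump process with escape rates r[0] and r[1]
# (values chosen arbitrarily for illustration).
r = [1.0, 5.0]
state, t_true, t_est = 0, 0.0, 0.0

for _ in range(20_000):
    dwell = random.expovariate(r[state])  # the actual, noisy waiting time
    t_true += dwell                       # true elapsed time
    t_est += 1.0 / r[state]               # estimator: add the *mean* wait instead
    state = 1 - state                     # jump to the other state

rel_err = abs(t_est - t_true) / t_true    # shrinks as trajectories get longer
```

Over long trajectories the noisy dwells average out, so this trivially simple estimator tracks the true elapsed time to within a fraction of a percent.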

The team also connects precision to two practical clock metrics: resolution (how often a clock ticks) and accuracy (how many ticks it delivers before drifting by one tick). The two trade off against each other: achieving steadier ticks comes at the cost of accepting fewer of them per unit of time.
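Both metrics can be computed from a record of tick intervals. A sketch using the standard definitions from the stochastic-clock literature (resolution ν = 1/μ, accuracy N = μ²/σ²); the article itself gives no formulas, so these definitions are an assumption:

```python
import statistics

def clock_metrics(tick_intervals):
    """Return (resolution, accuracy) for a list of tick intervals.

    resolution = 1/mean interval (ticks per unit time);
    accuracy   = mean^2/variance, roughly the number of ticks elapsed
    before the clock has drifted by one tick.
    """
    mu = statistics.mean(tick_intervals)
    var = statistics.pvariance(tick_intervals)
    return 1.0 / mu, mu * mu / var

# A steady but slow clock versus a fast but noisy one:
steady = [2.0, 2.1, 1.9, 2.0, 2.05, 1.95]
noisy = [0.2, 1.5, 0.4, 1.1, 0.3, 1.3]

nu_s, N_s = clock_metrics(steady)   # low resolution, high accuracy
nu_n, N_n = clock_metrics(noisy)    # high resolution, low accuracy
```

The noisy clock ticks more often (higher ν) but its accuracy N collapses, which is the resolution-accuracy trade-off in miniature.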

This framework offers practical tools across several domains. It can serve as a diagnostic for detecting hidden states in complex biological or chemical systems: if measured event statistics violate the CUR, that signals the presence of hidden transitions or memory effects. For nanoscale and molecular clocks – like biomolecular oscillators (cellular circuits that produce rhythmic chemical signals) and molecular motors (protein machines that walk along cellular tracks) – the CUR sets fundamental performance limits and guides the design of optimal estimators. Finally, while this work focuses on classical systems, it establishes a benchmark for quantum clocks, pointing toward potential quantum advantages and opening new questions about what trade-offs emerge in the quantum regime.

Landi, an associate professor of theoretical quantum physics at the University of Rochester, emphasizes the conceptual shift: that clocks aren’t just pendulums and quartz crystals. “Anything is a clock,” he notes. The team’s framework “gives the recipe for constructing the best possible clock from whatever fluctuations you have,” and tells you “what the best noise-to-signal ratio” can be. In everyday terms, the Sun is accurate but low-resolution for cooking; ocean waves are higher resolution but noisier. The CUR puts that intuition on firm mathematical ground.

Looking forward, the group is exploring quantum generalizations and leveraging CUR-violations to infer hidden structure in biological data. A tantalizing foundational question lingers: can robust biological timekeeping emerge from many bad, noisy clocks, synchronizing into a good one?

Ultimately, this research doesn’t just sharpen a bound; it reframes timekeeping as a universal inference task grounded in the flow of events. Whether you’re a cell sensing a chemical signal, a molecular motor stepping along a track or an engineer building a nanoscale device, the message is clear: to tell time well, count cleverly – and respect the gaps.

The research is detailed in Physical Review X.

The post Cracking the limits of clocks: a new uncertainty relation for time itself appeared first on Physics World.

  •  

Bidirectional scattering microscope detects micro- and nanoscale structures simultaneously

A new microscope that can simultaneously measure both forward- and backward-scattered light from a sample could allow researchers to image both micro- and nanoscale objects at the same time. The device could be used to observe structures as small as individual proteins, as well as the environment in which they move, say the researchers at the University of Tokyo who developed it.

“Our technique could help us link cell structures with the motion of tiny particles inside and outside cells,” explains Kohki Horie of the University of Tokyo’s department of physics, who led this research effort. “Because it is label-free, it is gentler on cells and better for long observations. In the future, it could help quantify cell states, holding potential for drug testing and quality checks in the biotechnology and pharmaceutical industries.”

Detecting forward and backward scattered light at the same time

The new device combines two powerful imaging techniques routinely employed in biomedical applications: quantitative phase microscopy (QPM) and interferometric scattering (iSCAT).

QPM measures forward-scattered (FS) light – that is, light waves that travel in the same direction as before they were scattered. This technique is excellent at imaging structures in the Mie scattering region (greater than 100 nm, referred to as microscale in this study). This makes it ideal for visualizing complex structures such as biological cells. It falls short, however, when it comes to imaging structures in the Rayleigh scattering region (smaller than 100 nm, referred to as nanoscale in this study).

The second technique, iSCAT, detects backward-scattered (BS) light. This is light that’s reflected back towards the direction from which it came and which predominantly contains Rayleigh scattering. As such, iSCAT exhibits high sensitivity for detecting nanoscale objects. Indeed, the technique has recently been used to image single proteins, intracellular vesicles and viruses. It cannot, however, image microscale structures because of its limited ability to detect in the Mie scattering region.
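The size scalings behind this division of labour are worth making explicit. For a particle much smaller than the wavelength, the scattered intensity falls as the sixth power of diameter, while an interferometric scheme such as iSCAT detects the field amplitude, which falls only as the third power. A sketch (prefactors omitted, only the scaling is meaningful; the 100 nm reference size follows the study’s own micro/nano convention):

```python
# Relative signal strength for a Rayleigh-regime particle of diameter
# d_nm, normalized to a 100 nm reference particle (the study's dividing
# line between nanoscale and microscale). Prefactors are dropped.
def relative_signals(d_nm, d_ref_nm=100.0):
    s = d_nm / d_ref_nm
    intensity = s**6        # scattered *intensity* scales as d^6
    interferometric = s**3  # interferometric cross term scales as amplitude, d^3
    return intensity, interferometric

dark, iscat = relative_signals(10.0)  # a 10 nm, protein-sized particle
# Pure scattering drops a millionfold, while the interferometric signal
# drops only a thousandfold, which is why iSCAT can reach single proteins.
```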

The team’s new bidirectional quantitative scattering microscope (BiQSM) is able to detect both FS and BS light at the same time, thereby overcoming these previous limitations.

Cleanly separating the signals from FS and BS

The BiQSM system illuminates a sample through an objective lens from two opposite directions and detects both the FS and BS light using a single image sensor. The researchers use the spatial-frequency multiplexing method of off-axis digital holography to capture both images simultaneously. The biggest challenge, says Horie, was to cleanly separate the signals from FS and BS light in the images while keeping noise low and avoiding mixing between them.
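The separation principle can be illustrated with a toy one-dimensional model. This is a generic off-axis holography sketch with made-up carrier frequencies, not the actual BiQSM optics: each reference beam’s tilt puts its interference term on a distinct spatial-frequency carrier, so a Fourier transform of the single camera frame splits the two signals into non-overlapping bands.

```python
import numpy as np

n = 1024
x = np.arange(n)
k_fs, k_bs = 60, 200   # assumed carrier frequencies (cycles per frame)

# One camera line containing both interference terms on their carriers:
frame = (1.0
         + 0.5 * np.cos(2 * np.pi * k_fs * x / n)   # FS interference term
         + 0.3 * np.cos(2 * np.pi * k_bs * x / n))  # BS interference term

# In the Fourier domain the two signals occupy separate bands and can be
# demodulated independently, despite sharing one image sensor.
spectrum = np.abs(np.fft.rfft(frame)) / n
peak_fs = int(np.argmax(spectrum[1:150])) + 1     # skip the DC term
peak_bs = int(np.argmax(spectrum[150:])) + 150
```

As long as the two bands don’t overlap, each can be filtered out and demodulated to recover its image, which is why keeping the signals spectrally clean was the main engineering challenge.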

Horie and colleagues, Keiichiro Toda, Takuma Nakamura and team leader Takuro Ideguchi, tested their technique by imaging live cells. They were able to visualize micron-sized cell structures, including the nucleus, nucleoli and lipid droplets, as well as nanoscale particles. They compared the FS and BS results using the scattering-field amplitude (SA), defined as the amplitude ratio between the scattered wave and the incident illumination wave.

“SA characterizes the light scattered in both the forward and backward directions within a unified framework,” says Horie, “so allowing for a direct comparison between FS and BS light images.”

Spurred on by their findings, which are detailed in Nature Communications, the researchers say they now plan to study even smaller particles such as exosomes and viruses.

The post Bidirectional scattering microscope detects micro- and nanoscale structures simultaneously appeared first on Physics World.

  •  

The Oceans Just Keep Getting Hotter

For the eighth year in a row, the world’s oceans absorbed a record-breaking amount of heat in 2025. It was equivalent to the energy it would take to boil 2 billion Olympic swimming pools.

  •  

2026 will clarify Europe’s new priorities for space


Launchers

Isar Aerospace is expected to attempt its second two-stage Spectrum vehicle test flight, a key step after its first, partially successful liftoff in 2025. In parallel, Spain’s PLD Space and its Miura-5 remain the second contender — after Isar — for the European Launcher Challenge, a competition that increasingly looks like Europe’s closest analogue […]

The post 2026 will clarify Europe’s new priorities for space appeared first on SpaceNews.

  •  

The ‘space tax’ on your self-driving car

LiDAR costs, compute power and AI training are the “big three” usually associated with the high cost of autonomous vehicles (AVs). We rarely look up. But maybe we should. High above the Earth, the ionosphere, a chaotic, sun-charged layer of our atmosphere, is levying an invisible tax on every self-driving car in development. If you […]

The post The ‘space tax’ on your self-driving car appeared first on SpaceNews.

  •  

Quantum information theory sheds light on quantum gravity

This episode of the Physics World Weekly podcast features Alex May, whose research explores the intersection of quantum gravity and quantum information theory. Based at Canada’s Perimeter Institute for Theoretical Physics, May explains how ideas being developed in the burgeoning field of quantum information theory could help solve one of the most enduring mysteries in physics – how to reconcile quantum mechanics with Einstein’s general theory of relativity, creating a viable theory of quantum gravity.

This interview was recorded in autumn 2025 when I had the pleasure of visiting the Perimeter Institute and speaking to four physicists about their research. This is the last of those conversations to appear on the podcast.

The first interview in this series from the Perimeter Institute was with Javier Toledo-Marín, “Quantum computing and AI join forces for particle physics”; the second was with Bianca Dittrich, “Quantum gravity: we explore spin foams and other potential solutions to this enduring challenge”; and the third was with Tim Hsieh, “Building a quantum future using topological phases of matter and error correction”.

This episode is supported by the APS Global Physics Summit, which takes place on 15–20 March, 2026, in Denver, Colorado, and online.

The post Quantum information theory sheds light on quantum gravity appeared first on Physics World.

  •  

Chess960 still results in white having an advantage, finds study

Chess is a seemingly simple game, but one that hides incredible complexity. In the standard game, the starting positions of the pieces are fixed so top players rely on memorizing a plethora of opening moves, which can sometimes result in boring, predictable games. It’s also the case that playing as white, and therefore going first, offers an advantage.

In the 1990s, former chess world champion Bobby Fischer proposed another way to play chess to encourage more creative play.

This form of the game – dubbed Chess960 – keeps the pawns in place but randomizes where the pieces on the back rank – the knights, bishops, rooks, king and queen – start, subject to two constraints: the bishops must sit on opposite-coloured squares and the king must stand between the two rooks. All other rules are unchanged, and the name comes from the 960 legal starting positions this shuffling allows.
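The count of 960 follows from Chess960’s two placement rules – bishops on opposite-coloured squares, king between the rooks – and a brute-force enumeration (a quick sketch, not taken from the study) confirms it:

```python
from itertools import permutations

# Enumerate back-rank arrangements of R,N,B,Q,K,B,N,R that satisfy
# Chess960's placement rules: bishops on opposite-coloured squares and
# the king somewhere between the two rooks.
def chess960_positions():
    seen = set()
    for perm in permutations("RNBQKBNR"):
        b1, b2 = (i for i, p in enumerate(perm) if p == "B")
        r1, r2 = (i for i, p in enumerate(perm) if p == "R")
        k = perm.index("K")
        if (b1 + b2) % 2 == 1 and r1 < k < r2:  # opposite colours; K between Rs
            seen.add(perm)
    return seen

print(len(chess960_positions()))  # 960, including the standard arrangement
```

The same count falls out of direct combinatorics: 4 light-square bishop choices × 4 dark × 6 queen squares × 10 knight pairings × 1 forced rook-king-rook order = 960.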

It was thought that Chess960’s randomized openings could make the game fairer for both players. Yet research by physicist Marc Barthelemy at Paris-Saclay University suggests it’s not that simple.

Initial advantage

He used the open-source chess engine Stockfish to analyze each of the 960 starting positions and developed a statistical method to measure decision-making complexity by calculating how much “information” a player needs to identify the best moves.
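The article doesn’t spell out the paper’s exact information measure, but the idea can be sketched: turn the engine’s move evaluations into a probability distribution and ask how many bits it takes to single out the top move. Everything below – the softmax weighting, the β parameter, the sample evaluations – is a hypothetical illustration, not Barthelemy’s method:

```python
import math

def decision_bits(evals, beta=2.0):
    """Bits needed to single out the best move, given engine evaluations
    (in pawns) for the candidate moves. A softmax with inverse
    temperature beta turns evaluations into probabilities; the surprisal
    of the top move, -log2 p(best), is the information cost."""
    weights = [math.exp(beta * e) for e in evals]
    p_best = max(weights) / sum(weights)
    return -math.log2(p_best)

sharp = decision_bits([1.5, -0.5, -0.8, -1.0])   # one move clearly best
murky = decision_bits([0.30, 0.28, 0.25, 0.27])  # four near-equal moves

# A sharp position costs a small fraction of a bit; a murky four-way
# choice costs nearly log2(4) = 2 bits.
```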

He found that the standard game can be unfair, as the player with the black pieces, moving second, is constantly forced to respond to white’s moves.

Yet regardless of the starting positions on the back rank, Barthelemy discovered that white retains an advantage in almost all – 99.6% – of the 960 positions. He also found that the standard set-up – rook, knight, bishop, queen, king, bishop, knight, rook – is nothing special: it is presumably a historical accident, perhaps persisting because the visually symmetrical arrangement is easy to remember.

“Standard chess, despite centuries of cultural evolution, does not occupy an exceptional location in this landscape: it exhibits a typical initial advantage and moderate total complexity, while displaying above-average asymmetry in decision difficulty,” writes Barthelemy.

For a fairer, more balanced match, Barthelemy suggests playing position #198, whose back rank runs queen, knight, bishop, rook, king, bishop, knight, rook.

The post Chess960 still results in white having an advantage, finds study appeared first on Physics World.
