
Schrödinger cat state sets new size record

Massively quantum: The University of Vienna’s Multi-Scale Cluster Interference Experiment (MUSCLE), where researchers detected quantum interference in massive nanoparticles. (Courtesy: S Pedalino / Uni Wien)

Classical mechanics describes our everyday world of macroscopic objects very well. Quantum mechanics is similarly good at describing physics on the atomic scale. The boundary between these two regimes, however, is still poorly understood. Where, exactly, does the quantum world stop and the classical world begin?

Researchers in Austria and Germany have now pushed the line further towards the macroscopic regime by showing that metal nanoparticles made up of thousands of atoms clustered together continue to obey the rules of quantum mechanics in a double-slit-type experiment. At over 170 000 atomic mass units, these nanoparticles are heavier than some viroids and proteins – a fact that study leader Sebastian Pedalino, a PhD student at the University of Vienna, says demonstrates that quantum mechanics remains valid at this scale and alternative models are not required.

Multiscale cluster interference

According to the rules of quantum mechanics, even large objects behave as delocalized waves. However, we do not observe this behaviour in our daily lives because the characteristic length over which this behaviour extends – the de Broglie wavelength λdB = h/mv, where h is Planck’s constant, m is the object’s mass and v is its velocity – is generally much smaller than the object itself.

In the new work, a team led by Vienna’s Markus Arndt and Stefan Gerlich, in collaboration with Klaus Hornberger at the University of Duisburg-Essen, created clusters of sodium atoms in a helium–argon mixture at 77 K in an ultrahigh vacuum. The clusters each contained between 5000 and 10 000 atoms and travelled at velocities of around 160 m s−1, giving them de Broglie wavelengths between 10 and 22 femtometres (1 fm = 10−15 m).
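As a rough cross-check, plugging the article’s figures into the de Broglie formula λdB = h/mv puts a 170 000 u cluster squarely in this wavelength range (a back-of-the-envelope sketch, not the team’s analysis):

```python
# Back-of-the-envelope de Broglie wavelength for a sodium cluster,
# using the figures quoted in the article (illustrative only).
h = 6.62607015e-34   # Planck constant, J s
u = 1.66053907e-27   # atomic mass unit, kg

m = 170_000 * u      # cluster mass of ~170 000 atomic mass units
v = 160.0            # cluster velocity, m/s

lam = h / (m * v)    # de Broglie wavelength, lambda = h / (m v)
print(f"lambda = {lam * 1e15:.1f} fm")  # → lambda = 14.7 fm, within the 10-22 fm span
```

A wavelength of tens of femtometres, orders of magnitude smaller than the nanoparticle itself, is why a dedicated Talbot–Lau interferometer is needed to see the interference at all.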

To observe matter-wave interference in objects with such ultra-short de Broglie wavelengths, the team used an interferometer containing three diffraction gratings constructed with deep ultraviolet laser beams in a so-called Talbot–Lau configuration. The first grating channels the clusters through narrow gaps, from which their wave function expands. This wave is then modulated by the second grating, resulting in interference that produces a measurable striped pattern at the third grating.

This result implies that each cluster’s location is not fixed as it propagates through the apparatus. Instead, its wave function is spread over a span dozens of times larger than the cluster itself, meaning that it is in a superposition of locations rather than occupying a fixed position in space. This is known as a Schrödinger cat state, in reference to the famous thought experiment in which physicist Erwin Schrödinger imagined a cat in a sealed box that is both dead and alive at once.

Pushing the boundaries for quantum experiments

The Vienna-Duisburg-Essen researchers characterized their experiment by calculating a quantity known as macroscopicity that combines the duration of the quantum state (its coherence time), the mass of the object in that state and the degree of separation between states. In this work, which they detail in Nature, the macroscopicity reached a value of 15.5 – an order of magnitude higher than the previous record for a measurement of this kind.

Arndt explains that this milestone was reached thanks to a long-term research programme that aims to push quantum experiments to ever higher masses and complexity. “The motivation is simply that we do not yet know if quantum mechanics is the ultimate theory or if it requires any modification at some mass limit,” he tells Physics World. While several speculative theories predict some degree of modification, he says, “as experimentalists our task is to be agnostic and see what happens”.

Arndt notes that the team’s machine is very sensitive to small forces, which can generate notable deflections of the interference fringes. In the future, he thinks this effect could be exploited to characterize the properties of materials. In the longer term, this force-sensing capability could even be used to search for new particles.

Interpretations and adventures

While Arndt says he is “impressed” that these mesoscopic objects – which are in principle easy to see and even to localize under a scattering microscope – can be delocalized on a scale more than 10 times their size if they are isolated and non-interacting, he is not entirely surprised. The challenge, he says, lies in understanding what it means. “The interpretation of this phenomenon, the duality between this delocalization and the apparently local nature in the act of measurement, is still an open conundrum,” he says.

Looking ahead, the researchers say they would now like to extend their research to higher mass objects, longer coherence times, higher force sensitivity and different materials, including nanobiological materials as well as other metals and dielectrics. “We still have a lot of work to do on sources, beam splitters, detectors, vibration isolation and cooling,” says Arndt. “This is a big experimental adventure for us.”

The post Schrödinger cat state sets new size record appeared first on Physics World.

  •  

Viridian inks cooperative agreement with Air Force Research Laboratory

SAN FRANCISCO — Viridian Space Corp. signed a cooperative research and development agreement (CRADA) with the Air Force Research Laboratory. The five-year CRADA will provide the Southern California startup with access to testing facilities and satellite-operations expertise at AFRL’s Kirtland Air Force Base in New Mexico. “There seems to be a good collaborative opportunity for testing […]

The post Viridian inks cooperative agreement with Air Force Research Laboratory appeared first on SpaceNews.

  •  

Space telescopes at light speed

The Nancy Grace Roman Space Telescope completed final assembly in late November 2025. Credit: Jolearra Tshiteya/NASA

Light is the fastest phenomenon in the universe, clocking in at just under 300,000 kilometers per second. The telescopes that observe that light, from radio waves to gamma rays, are built at rather slower speeds. Take, as one example, the James Webb Space Telescope. NASA began feasibility studies for the mission in the mid-1990s and […]

The post Space telescopes at light speed appeared first on SpaceNews.

  •  

Using AI boosts scientific productivity and career prospects, finds study

Using artificial intelligence (AI) increases scientists’ productivity and impact but collectively leads to a shrinking of research focus. That is according to an analysis of more than 41 million research papers by scientists in China and the US, which finds that scientists who produce AI-augmented research also progress faster in their careers than their colleagues who do not (Nature 649 1237).

The study was carried out by James Evans, a sociologist at the University of Chicago, and his colleagues who analysed 41.3 million papers listed in the OpenAlex dataset published between 1980 and 2025. They looked at papers in physics and five other disciplines – biology, chemistry, geology, materials science and medicine.

Using an AI language model to identify AI-assisted work, the team picked out almost 310 000 AI-augmented papers from the dataset. They found that AI-supported publications receive more citations than non-AI-assisted papers, are more impactful across multiple indicators and appear more often in high-impact journals.

Individual researchers who adopt AI publish, on average, three times as many papers and get almost five times as many citations as those not using AI. In physics, researchers who use AI tools garner 183 citations every year, on average, while those who do not use AI get only 51 annually.

AI also boosts career trajectories. Based on an analysis of more than two million scientists in the dataset, the study finds that junior researchers who adopt AI are more likely to become established scientists. They also gain project leadership roles almost one-and-a-half years earlier, on average, than those who do not use AI.

Fundamental questions

But when the researchers examined the knowledge spread of a random sample of 10 000 papers, half of which used AI, they found that AI-produced work shrinks the range of topics covered by almost 5%. The finding is consistent across all six disciplines. Furthermore, AI papers are more clustered than non-AI papers, suggesting a tendency to concentrate on specific problems.

AI tools, in other words, appear to funnel research towards areas rich in data and help to automate established fields rather than exploring new topics. Evans and colleagues think this AI-induced convergence could drive science away from foundational questions and towards data-rich operational topics.

AI could, however, help combat this trend. “We need to reimagine AI systems that expand not only cognitive capacity but also sensory and experimental capacity,” they say. “[This could] enable and incentivize scientists to search, select and gather new types of data from previously inaccessible domains rather than merely optimizing analysis of standing data.”

Meanwhile, a new report by the AI company OpenAI has found that messages on advanced topics in science and mathematics on ChatGPT over the last year have grown by nearly 50%, to almost 8.4 million per week. The firm says its generative AI chatbot is being used to advance research across scientific fields from experiment planning and literature synthesis to mathematical reasoning and data analysis.

The post Using AI boosts scientific productivity and career prospects, finds study appeared first on Physics World.

  •  

Interactions between dark matter and neutrinos could resolve a cosmic discrepancy

Hints of non-gravitational interactions between dark matter and “relic” neutrinos in the early universe have emerged in a study of astronomical data from different periods of cosmic history. The study was carried out by cosmologists in Poland, the UK and China, and team leader Sebastian Trojanowski of Poland’s NCBJ and NCAC PAS notes that future telescope observations could verify or disprove these hints of a deep connection between dark matter and neutrinos.

Dark matter and neutrinos play major roles in the evolution of cosmic structures, but they are among the universe’s least-understood components. Dark matter is thought to make up over 25% of the universe’s mass, but it has never been detected directly; instead, its existence is inferred from its gravitational interactions. Neutrinos, for their part, are fundamental subatomic particles that have a very low mass and interact only rarely with normal matter.

Analysing data from different epochs

According to the standard (ΛCDM) model of cosmology, dark matter and neutrinos do not interact with each other. The work of Trojanowski and colleagues challenges this model by proposing that dark matter and neutrinos may have interacted in the past, when the universe was younger and contained many more neutrinos than it does today.

This proposal, they say, was partly inspired by a longstanding cosmic conundrum. Measurements of the early universe suggest that structures such as galaxies should have grown more rapidly than ΛCDM predicts. At the same time, observations of today’s universe indicate that matter is slightly less densely packed than expected. This suggests a slight mismatch between early and late measurements.

To explore the impact that dark matter-neutrino interactions (νDM) would have on this mismatch, a team led by Trojanowski’s colleague Lei Zu analysed data from different epochs of the universe’s evolution. Data from the young (high redshift) universe came from two instruments – the ground-based Atacama Cosmology Telescope and the space-based Planck Telescope, which the European Space Agency operated from 2009 to 2013 – that were designed to study the afterglow of the Big Bang, which is known as the cosmic microwave background (CMB). Data from the older (low-redshift, or z < 3.5) universe, meanwhile, came from a variety of sources, including galaxy maps from the Sloan Digital Sky Survey and weak gravitational lensing data from the Dark Energy Survey (DES) conducted with the Dark Energy Camera on the Victor M Blanco Telescope in Chile.

“New insight into how structure formed in the universe”

Drawing on these data, the team calculated that an interaction strength u ≈ 10−4 between dark matter and neutrinos would be enough to resolve the discrepancy. The statistical significance of this result is nearly 3σ, which team member Sming Tsai Yue-Lin of the Purple Mountain Observatory in Nanjing, China, says was “largely achieved by incorporating the high-precision weak lensing data from the DES with the weak lensing component”.

While this is not high enough to definitively disprove the ΛCDM model, the researchers say it does show that the model is incomplete and requires further investigation. “Our study shows that interactions between dark matter and neutrinos could help explain this difference, offering new insight into how structure formed in the universe,” explains team member Eleonora Di Valentino, a senior research fellow at Sheffield University, UK.

Trojanowski adds that the ΛCDM model has come under growing pressure in recent years, while the Standard Model of particle physics cannot explain the nature of dark matter. “These two theories need to be extended to resolve these problems, and studying dark matter-neutrino interactions is a promising way to achieve this goal,” he says.

The team’s result, he continues, adds to the “massive amount of data” suggesting that we are reaching the limits of the standard cosmological model and may be at the dawn of understanding physics beyond it. “We illustrate that we likely need to bridge cosmological data and fundamental particle physics to describe the universe across different scales and so resolve current anomalies,” he says.

Two worlds

One of the challenges of doing this, Trojanowski adds, is that the two fields involved – cosmological data analysis and theoretical astroparticle physics – are very different. “Each field has its own approach to problem-solving and even its own jargon,” he says. “Fortunately, we had a great team and working together was really fun.”

The researchers say that data from future telescope observations, such as those from the Vera C Rubin Observatory (formerly known as the Large Synoptic Survey Telescope, LSST) and the China Space Station Telescope (CSST), could place more stringent tests on their hypothesis. Data from CMB experiments and weak lensing surveys, which map the distribution of mass in the universe by analysing how distant galaxies distort light, could also come in useful.

They detail their present research in Nature Astronomy.

The post Interactions between dark matter and neutrinos could resolve a cosmic discrepancy appeared first on Physics World.

  •  

Silicon as strategy: the hidden battleground of the new space race

Photo of a 200mm silicon wafer. Credit: Goldenvu via Wikimedia Commons; CC BY-SA 4.0

In the consumer electronics playbook, custom silicon is the final step in the marathon: you use off-the-shelf components to prove a product, achieve mass scale and only then invest in proprietary chips to create differentiation, improve operations, and optimize margins. In the modern satellite communications (SATCOM) ecosystem, this script has been flipped. For the industry’s […]

The post Silicon as strategy: the hidden battleground of the new space race appeared first on SpaceNews.

  •  

As satellites become targets, Space Force plans a broader role

Gen. Shawn Bratton spoke with SpaceNews’ Sandra Erwin Jan. 21 at the Johns Hopkins University Bloomberg Center. Credit: Johns Hopkins University Bloomberg Center

Gen. Shawn Bratton, the Space Force’s vice chief of space operations, spoke with SpaceNews’ Sandra Erwin as part of an event focused on the Space Force 2040 at the Johns Hopkins University Bloomberg Center on Jan. 21. Here are six takeaways from their conversation: Planning for 2040 means more space superiority A long-range planning initiative […]

The post As satellites become targets, Space Force plans a broader role appeared first on SpaceNews.

  •  

Quantum states that won’t entangle

Quantum entanglement is a uniquely quantum link between particles that makes their properties inseparable. It underlies the power of many quantum technologies from secure communication to quantum computing, by enabling correlations impossible in classical physics.

Entanglement nevertheless remains poorly understood and is therefore the subject of much research, in both quantum technology and fundamental physics.

In this context, separability refers to a composite system whose state can be written as a simple product (or mixture of products) of the states of its individual parts. This implies there is no entanglement between the parts; to create entanglement, a global transformation is needed.

A system that remains completely free of entanglement, even after any possible global invertible transformation is applied, is called absolutely separable. In other words, it can never become entangled under the action of quantum gates.
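For ordinary (not absolute) separability, the simplest bipartite case shows what a separability test looks like in practice. The sketch below, assuming NumPy, applies the standard Peres–Horodecki (positive partial transpose, PPT) criterion, which for two qubits is both necessary and sufficient; it is a minimal illustration of the concept, not the linear-map machinery introduced in the paper discussed here:

```python
import numpy as np

def partial_transpose(rho, dims=(2, 2)):
    """Transpose only the second subsystem of a bipartite density matrix."""
    dA, dB = dims
    r = rho.reshape(dA, dB, dA, dB)      # indices (iA, iB, jA, jB)
    return r.transpose(0, 3, 2, 1).reshape(dA * dB, dA * dB)

# A product state |0><0| (x) |+><+| -- separable by construction
ket0 = np.array([1.0, 0.0])
plus = np.array([1.0, 1.0]) / np.sqrt(2)
rho_sep = np.kron(np.outer(ket0, ket0), np.outer(plus, plus))

# A Bell state (|00> + |11>)/sqrt(2) -- maximally entangled
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
rho_bell = np.outer(bell, bell)

# PPT criterion: a separable state keeps a positive partial transpose;
# a negative eigenvalue certifies entanglement (and for 2x2 systems
# positivity certifies separability).
min_sep = np.linalg.eigvalsh(partial_transpose(rho_sep)).min()
min_bell = np.linalg.eigvalsh(partial_transpose(rho_bell)).min()
print(min_sep)    # ~0   -> no entanglement detected
print(min_bell)   # -0.5 -> entangled
```

Beyond two qubits, and for the absolute version of the question, no such simple eigenvalue test exists in general, which is what makes the analytical criteria described below valuable.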

Absolutely separable
Separable, Absolutely Separable and Entangled sets: It is impossible to make absolutely separable states entangled with a global transformation (Courtesy J. Abellanet Vidal and A. Sanpera Trigueros)

Necessary and sufficient conditions to ensure separability exist only in the simplest cases or for highly restricted families of states. In fact, entanglement verification and quantification are known to be generically NP-hard problems.

Recent research published by a team of researchers from Spain and Poland has tackled this problem head-on. By introducing new analytical tools such as linear maps and their inverses, they were able to identify when a quantum state is guaranteed to be absolutely separable.

These tools work in any number of dimensions and allow the authors to pinpoint specific states that are on the border of being absolutely separable or not (mathematically speaking, ones that lie on the boundary of the set). They also show how different criteria for absolute separability, which may not always agree with each other, can be combined and refined using convex geometry optimisation.

Being able to more easily and accurately determine whether a quantum state is absolutely separable will be invaluable in quantum computation and communication.

The team’s results for multipartite systems (systems with more than two parts) also reveal how little we currently understand about the entanglement properties of mixed, noisy states. This knowledge gap suggests that much more research is needed in this area.

Read the full article

Sufficient criteria for absolute separability in arbitrary dimensions via linear map inverses – IOPscience

J. Abellanet Vidal et al, 2025 Rep. Prog. Phys. 88 107601

The post Quantum states that won’t entangle appeared first on Physics World.
