Viridian inks cooperative agreement with Air Force Research Laboratory

SAN FRANCISCO — Viridian Space Corp. signed a cooperative research and development agreement (CRADA) with the Air Force Research Laboratory. The five-year CRADA will provide the Southern California startup with access to testing facilities and satellite-operations expertise at AFRL’s Kirtland Air Force Base in New Mexico. “There seems to be a good collaborative opportunity for testing […]

The post Viridian inks cooperative agreement with Air Force Research Laboratory appeared first on SpaceNews.

  •  

Space telescopes at light speed

The Nancy Grace Roman Space Telescope completed final assembly in late November 2025. Credit: Jolearra Tshiteya/NASA

Light is the fastest phenomenon in the universe, clocking in at just under 300,000 kilometers per second. The telescopes that observe that light, from radio waves to gamma rays, are built at rather slower speeds. Take, as one example, the James Webb Space Telescope. NASA began feasibility studies for the mission in the mid-1990s and […]

The post Space telescopes at light speed appeared first on SpaceNews.

  •  

Using AI boosts scientific productivity and career prospects, finds study

Using artificial intelligence (AI) increases scientists’ productivity and impact but collectively leads to a shrinking of research focus. That is according to an analysis of more than 41 million research papers by scientists in China and the US, which finds that scientists who produce AI-augmented research also progress faster in their careers than their colleagues who do not (Nature 649 1237).

The study was carried out by James Evans, a sociologist at the University of Chicago, and his colleagues, who analysed 41.3 million papers published between 1980 and 2025 and listed in the OpenAlex dataset. They looked at papers in physics and five other disciplines – biology, chemistry, geology, materials science and medicine.

Using an AI language model to identify AI-assisted work, the team picked out almost 310 000 AI-augmented papers from the dataset. They found that AI-supported publications receive more citations than non-AI-assisted papers, while also being more impactful across multiple indicators and having a higher prevalence in high-impact journals.

Individual researchers who adopt AI publish, on average, three times as many papers and get almost five times as many citations as those not using AI. In physics, researchers who use AI tools garner 183 citations every year, on average, while those who do not use AI get only 51 annually.

AI also boosts career trajectories. Based on an analysis of more than two million scientists in the dataset, the study finds that junior researchers who adopt AI are more likely to become established scientists. They also gain project leadership roles almost one-and-a-half years earlier, on average, than those who do not use AI.

Fundamental questions

But when the researchers examined the knowledge spread of a random sample of 10 000 papers, half of which used AI, they found that AI-produced work shrinks the range of topics covered by almost 5%. The finding is consistent across all six disciplines. Furthermore, AI papers are more clustered than non-AI papers, suggesting a tendency to concentrate on specific problems.

AI tools, in other words, appear to funnel research towards areas rich in data and help to automate established fields rather than exploring new topics. Evans and colleagues think this AI-induced convergence could drive science away from foundational questions and towards data-rich operational topics.

AI could, however, help combat this trend. “We need to reimagine AI systems that expand not only cognitive capacity but also sensory and experimental capacity,” they say. “[This could] enable and incentivize scientists to search, select and gather new types of data from previously inaccessible domains rather than merely optimizing analysis of standing data.”

Meanwhile, a new report by the AI company OpenAI has found that the number of ChatGPT messages on advanced topics in science and mathematics has grown by nearly 50% over the last year, to almost 8.4 million per week. The firm says its generative AI chatbot is being used to advance research across scientific fields, from experiment planning and literature synthesis to mathematical reasoning and data analysis.

The post Using AI boosts scientific productivity and career prospects, finds study appeared first on Physics World.

  •  

Interactions between dark matter and neutrinos could resolve a cosmic discrepancy

Hints of non-gravitational interactions between dark matter and “relic” neutrinos in the early universe have emerged in a study of astronomical data from different periods of cosmic history. The study was carried out by cosmologists in Poland, the UK and China, and team leader Sebastian Trojanowski of Poland’s NCBJ and NCAC PAS notes that future telescope observations could verify or disprove these hints of a deep connection between dark matter and neutrinos.

Dark matter and neutrinos play major roles in the evolution of cosmic structures, but they are among the universe’s least-understood components. Dark matter is thought to make up over 25% of the universe’s mass, but it has never been detected directly; instead, its existence is inferred from its gravitational interactions. Neutrinos, for their part, are fundamental subatomic particles that have a very low mass and interact only rarely with normal matter.

Analysing data from different epochs

According to the standard (ΛCDM) model of cosmology, dark matter and neutrinos do not interact with each other. The work of Trojanowski and colleagues challenges this model by proposing that dark matter and neutrinos may have interacted in the past, when the universe was younger and contained many more neutrinos than it does today.

This proposal, they say, was partly inspired by a longstanding cosmic conundrum. Measurements of the early universe suggest that structures such as galaxies should have grown more rapidly than ΛCDM predicts. At the same time, observations of today’s universe indicate that matter is slightly less densely packed than expected. This suggests a slight mismatch between early and late measurements.
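
This early-versus-late mismatch is often quantified with the clustering parameter S8 – a background definition included here for context, not a quantity defined in the paper itself:

\[ S_8 \;\equiv\; \sigma_8 \sqrt{\Omega_m / 0.3} \]

where σ8 measures the amplitude of matter fluctuations and Ωm is the matter density parameter; CMB-based fits within ΛCDM prefer a slightly higher value of S8 than low-redshift weak-lensing surveys find.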

To explore the impact that dark matter-neutrino interactions (νDM) would have on this mismatch, a team led by Trojanowski’s colleague Lei Zu analysed data from different epochs of the universe’s evolution. Data from the young (high-redshift) universe came from two instruments designed to study the afterglow of the Big Bang, known as the cosmic microwave background (CMB): the ground-based Atacama Cosmology Telescope and the space-based Planck Telescope, which the European Space Agency operated from 2009 to 2013. Data from the older (low-redshift, or z < 3.5) universe, meanwhile, came from a variety of sources, including galaxy maps from the Sloan Digital Sky Survey and weak gravitational lensing data from the Dark Energy Survey (DES), conducted with the Dark Energy Camera on the Victor M Blanco Telescope in Chile.

“New insight into how structure formed in the universe”

Drawing on these data, the team calculated that an interaction strength u ≈ 10⁻⁴ between dark matter and neutrinos would be enough to resolve the discrepancy. The statistical significance of this result is nearly 3σ, which team member Sming Tsai Yue-Lin of the Purple Mountain Observatory in Nanjing, China, says was “largely achieved by incorporating the high-precision weak lensing data from the DES with the weak lensing component”.
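
The article does not spell out how the interaction strength u is defined. In the dark matter-neutrino literature it is commonly the dimensionless ratio of the scattering cross-section to the Thomson cross-section, normalised to a 100 GeV dark-matter mass – an assumed convention here, which the paper itself may state differently:

\[ u \;\equiv\; \frac{\sigma_{\mathrm{DM}\text{-}\nu}}{\sigma_{\mathrm{T}}} \left( \frac{m_{\mathrm{DM}}}{100\ \mathrm{GeV}} \right)^{-1} \]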

While this is not high enough to definitively disprove the ΛCDM model, the researchers say it does show that the model is incomplete and requires further investigation. “Our study shows that interactions between dark matter and neutrinos could help explain this difference, offering new insight into how structure formed in the universe,” explains team member Eleonora Di Valentino, a senior research fellow at the University of Sheffield, UK.

Trojanowski adds that the ΛCDM model has been under growing pressure in recent years, while the Standard Model of particle physics cannot explain the nature of dark matter. “These two theories need to be extended to resolve these problems, and studying dark matter-neutrino interactions is a promising way to achieve this goal,” he says.

The team’s result, he continues, adds to the “massive amount of data” suggesting that we are reaching the limits of the standard cosmological model and may be at the dawn of understanding physics beyond it. “We illustrate that we likely need to bridge cosmological data and fundamental particle physics to describe the universe across different scales and so resolve current anomalies,” he says.

Two worlds

One of the challenges of doing this, Trojanowski adds, is that the two fields involved – cosmological data analysis and theoretical astroparticle physics – are very different. “Each field has its own approach to problem-solving and even its own jargon,” he says. “Fortunately, we had a great team and working together was really fun.”

The researchers say that data from future telescope observations, such as those from the Vera C Rubin Observatory (formerly known as the Large Synoptic Survey Telescope, LSST) and the China Space Station Telescope (CSST), could put their hypothesis to more stringent tests. Data from CMB experiments and weak lensing surveys, which map the distribution of mass in the universe by analysing how light from distant galaxies is distorted, could also come in useful.

They detail their present research in Nature Astronomy.

The post Interactions between dark matter and neutrinos could resolve a cosmic discrepancy appeared first on Physics World.

  •  

Silicon as strategy: the hidden battleground of the new space race

Photo of a 200mm silicon wafer. Credit: Goldenvu via Wikimedia Commons; CC BY-SA 4.0

In the consumer electronics playbook, custom silicon is the final step in the marathon: you use off-the-shelf components to prove a product, achieve mass scale and only then invest in proprietary chips to create differentiation, improve operations, and optimize margins. In the modern satellite communications (SATCOM) ecosystem, this script has been flipped. For the industry’s […]

The post Silicon as strategy: the hidden battleground of the new space race appeared first on SpaceNews.

  •  

As satellites become targets, Space Force plans a broader role

Gen. Shawn Bratton spoke with SpaceNews’ Sandra Erwin Jan. 21 at the Johns Hopkins University Bloomberg Center. Credit: Johns Hopkins University Bloomberg Center

Gen. Shawn Bratton, the Space Force’s vice chief of space operations, spoke with SpaceNews’ Sandra Erwin as part of an event focused on the Space Force 2040 at the Johns Hopkins University Bloomberg Center on Jan. 21. Here are six takeaways from their conversation: Planning for 2040 means more space superiority A long-range planning initiative […]

The post As satellites become targets, Space Force plans a broader role appeared first on SpaceNews.

  •  

Quantum states that won’t entangle

Quantum entanglement is a uniquely quantum link between particles that makes their properties inseparable. It underlies the power of many quantum technologies from secure communication to quantum computing, by enabling correlations impossible in classical physics.

Entanglement nevertheless remains poorly understood and is therefore the subject of much research, both in quantum technologies and in fundamental physics.

In this context, separability refers to a composite system whose state can be written as a simple product (or a mixture of products) of the states of its individual parts. This implies there is no entanglement between the parts; to create entanglement, a global transformation is needed.

A system that remains completely free of entanglement, even after any possible global invertible transformation is applied, is called absolutely separable. In other words, it can never become entangled under the action of quantum gates.
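
Schematically, for a system with two parts A and B (the paper works in arbitrary dimensions and phrases the condition for general global invertible transformations), a state ρ_AB is separable if it is a mixture of product states, and absolutely separable if no global unitary – no quantum gate – can ever entangle it:

\[ \rho_{AB} \;=\; \sum_k p_k \, \rho_k^{A} \otimes \rho_k^{B}, \qquad p_k \ge 0, \;\; \sum_k p_k = 1 \]

\[ \rho_{AB} \ \text{absolutely separable} \;\iff\; U \rho_{AB} U^{\dagger} \ \text{separable for every global unitary } U \]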

Absolutely separable
Separable, Absolutely Separable and Entangled sets: It is impossible to make absolutely separable states entangled with a global transformation (Courtesy J. Abellanet Vidal and A. Sanpera Trigueros)

Necessary and sufficient conditions to ensure separability exist only in the simplest cases or for highly restricted families of states. In fact, entanglement verification and quantification is known to be generically an NP-hard problem.

Recent research by a team from Spain and Poland has tackled this problem head-on. By introducing new analytical tools such as linear maps and their inverses, the researchers were able to identify when a quantum state is guaranteed to be absolutely separable.

These tools work in any number of dimensions and allow the authors to pinpoint specific states that are on the border of being absolutely separable or not (mathematically speaking, ones that lie on the boundary of the set). They also show how different criteria for absolute separability, which may not always agree with each other, can be combined and refined using convex geometry optimisation.

Being able to more easily and accurately determine whether a quantum state is absolutely separable will be invaluable in quantum computation and communication.

The team’s results for multipartite systems (systems with more than two parts) also reveal how little we currently understand about the entanglement properties of mixed, noisy states. This knowledge gap suggests that much more research is needed in this area.

Read the full article

Sufficient criteria for absolute separability in arbitrary dimensions via linear map inverses – IOPscience

J. Abellanet Vidal et al, 2025 Rep. Prog. Phys. 88 107601

The post Quantum states that won’t entangle appeared first on Physics World.

  •  

The secret limits governing quantum relaxation

When we interact with everyday objects, we take for granted that physical systems naturally settle into stable, predictable states. A cup of coffee cools down. A playground swing slows down after being pushed.  Quantum systems, however, behave very differently.

These systems can exist in multiple states at once, and their evolution is governed by probabilities rather than certainties. Nevertheless, even these strange systems do eventually relax and settle down, losing information about their earlier state. The speed at which this happens is called the relaxation rate.

Relaxation rates tell us how fast a quantum system forgets its past, how quickly it thermalises, reaches equilibrium, decoheres, or dissipates energy. These rates are important not just for theorists but also for experimentalists, who can measure them directly in the lab.

Recently, researchers discovered that these rates obey a surprisingly universal rule. For a broad class of quantum processes (those described by what physicists call Markovian semigroups), the fastest possible relaxation rate cannot exceed a certain limit. Specifically, it must be no larger than the sum of all relaxation rates divided by the system’s dimension. This constraint, originally a conjecture, was first proven using tools from classical mathematics known as Lyapunov theory.
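
Written out as stated above (a sketch only; the precise definitions and indexing of the rates are in the paper linked below), for a system of dimension d with relaxation rates Γk the bound on the largest rate reads:

\[ \Gamma_{\max} \;\le\; \frac{1}{d} \sum_{k} \Gamma_{k} \]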

In a recently published paper, an international team of researchers provides a new, more direct algebraic proof of this universal bound. The new proof has several advantages over the older one – notably, it can be generalised more easily – but that’s not all.

The very surprising outcome of their work is that the rule doesn’t require complete positivity. Instead, a weaker condition – two-positivity – is enough. The distinction between these two requirements is crucial.

Essentially, both are measures of how well-behaved a quantum system’s evolution is – of how well it is protected from producing nonsensical results. The difference is that two-positivity is slightly less stringent but far more general, and hence very useful for many real-world applications.

The fact that the new proof requires only two-positivity means that this universal bound on relaxation rates can be applied to many more scenarios.

What’s more, when the positivity requirement is weakened even further, a slightly softer version of the universal constraint still holds. This shows that the structure behind these bounds is richer and more subtle than previously understood.

Read the full article

A universal constraint for relaxation rates for quantum Markov generators: complete positivity and beyond – IOPscience

D. Chruściński et al, 2025 Rep. Prog. Phys. 88 097602

The post The secret limits governing quantum relaxation appeared first on Physics World.

  •  