The British-American theoretical physicist Anthony Leggett died on 8 March at the age of 87. Leggett shared the 2003 Nobel Prize in Physics with Alexei Abrikosov and Vitaly Ginzburg for their “pioneering contributions to the theory of superconductors and superfluids”.
Born on 26 March 1938 in London, UK, Leggett graduated in literae humaniores (classical languages and literature, philosophy and Greco-Roman history) at the University of Oxford in 1959.
While philosophy was Leggett’s strongest subject, he did not envisage a career as a philosopher because he felt that the subject depended more on turns of phrase than objective criteria.
As part of an experiment at Balliol College, Oxford, to see whether a classicist with minimal qualifications in maths and science could be turned into a physicist, Leggett took a second undergraduate degree, in physics, which he was awarded in 1961.
Leggett then embarked on a DPhil in physics, which he completed at Oxford in 1964, followed by postdocs at the University of Illinois Urbana-Champaign in the US and Kyoto University, Japan.
In 1967 he moved back to the UK, spending the next 15 years at Sussex University. It was at Sussex that he carried out his Nobel-prize-winning work on the theory of superfluidity – the ability of a fluid to flow without viscosity.
Superfluidity in helium-4 was discovered in the 1930s, and in the 1960s several theorists predicted that helium-3 might also be a superfluid.
However, the two forms of helium are fundamentally different. Helium-4 atoms are bosons and can all condense into the same quantum ground state at low enough temperatures – an essential feature of both superfluidity and superconductivity.
Helium-3 atoms, on the other hand, are fermions and the Pauli exclusion principle prevents them from entering such a quantum state.
Electrons, which are also fermions, overcome this problem by forming Cooper pairs as described by the BCS theory of superconductivity that was developed in the mid-1950s by John Bardeen, Leon Cooper and Robert Schrieffer.
Theorists predicted that helium-3 atoms could do something similar and in 1972 superfluidity in helium-3 was finally observed at Cornell University – a feat that earned David Lee, Douglas Osheroff and Robert Richardson the 1996 Nobel Prize in Physics.
Yet many of the results puzzled theorists. In particular there were three different superfluid phases, and the results of nuclear magnetic resonance experiments on the samples could not be explained.
Leggett showed that these results could be explained by the spontaneous breaking of various symmetries in the superfluid, and for this work he was awarded a third of the 2003 Nobel Prize in Physics, with Abrikosov and Ginzburg honoured for their theoretical work on superconductors.
In 1983 Leggett moved to the University of Illinois at Urbana-Champaign, where he remained until retiring in 2019. There he focussed on problems in high-temperature superconductivity, superfluidity in quantum gases and the foundations of quantum mechanics.
In 1998 he was elected an Honorary Fellow of the Institute of Physics and in 2004 was appointed Knight Commander of the Order of the British Empire (KBE) “for services to physics”. In 2023 the Institute for Condensed Matter Theory at the University of Illinois at Urbana-Champaign was renamed the Sir Anthony Leggett Institute.
As well as the Nobel prize, Leggett won many other awards, including the 2002 Wolf Prize in Physics. He also published two books: The Problems of Physics (Oxford University Press, 1987) and Quantum Liquids (Oxford University Press, 2006).
Peter McClintock from Lancaster University, who has carried out work on superfluidity, says he is “very sad” to hear the news. “[Leggett] was a brilliant physicist whose genius was to comprehend underlying mechanisms and processes and explain their physical essence in comprehensible ways,” says McClintock. “My dominant memory is of the discovery of the superfluid phases of helium-3 and of the way in which [Leggett] was able to interpret each new item of experimental information and slot it into a nascent theoretical framework to build up a coherent picture of what was going on – while always enumerating the remaining loose ends and possible alternative explanations.”
In a statement, Makoto Gonokami, president of the RIKEN labs in Japan, said he was “deeply saddened” by the news, adding that Leggett had “provided warm support for researchers in Japan” through his many trips to the country.
“Leggett made pioneering contributions to our understanding of how quantum mechanics manifests itself in macroscopic matter [and] his theoretical work on superfluid helium-3 provided profound insights into quantum order in strongly interacting fermionic systems,” notes Gonokami. “His work significantly advanced the study of quantum condensed matter and macroscopic quantum coherence.”
The post Condensed-matter physics pioneer and Nobel laureate Anthony Leggett dies aged 87 appeared first on Physics World.
Imagine all the different ways you can rearrange a list of labelled items. If you can see only a tiny fraction of the labels, it’s easy to assume you have almost no information about how the list as a whole has been rearranged. After all, if you shuffle a large deck of cards and then hide most of the labels, how could anyone possibly tell what permutation you made?
Recent theoretical work by physicists at Universitat Autonoma de Barcelona (UAB), Spain, and Hunter College of the City University of New York (CUNY), US, reveals that this intuition can fail in surprising ways, hinting at deep links between information, symmetry and computation. Specifically, the UAB-CUNY team found that quantum mechanics plays a key role in preserving parity – a global property of a permutation – even when most local information is erased.
Imagine a clever magician named Alice. She hands you a stack of n coloured disks in a known order and leaves the room while you shuffle them. When she returns, she asks: “Can I tell how you permuted the disks?”
If every disk has its own unique label, the answer is obviously “yes”. But if Alice removes some of the labels, she can pose a subtler challenge: “Can I at least tell whether your shuffle swapped the positions of the disks an even or odd number of times?”
Classically, the answer is “no”. With fewer labels than disks, some labels must be repeated. Swapping two disks with the same label leaves the observed configuration unchanged, yet flips the parity of the underlying permutation. As a result, determining parity with certainty requires one unique label per disk. Anything less, and the information is fundamentally lost.
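To make the classical argument concrete, here is a minimal Python sketch (an illustration of the counting argument above, not code from the paper). It computes the parity of a permutation from its cycle decomposition and confirms that swapping two identically labelled disks flips the parity while leaving the observed sequence of labels unchanged:

def parity(perm):
    """Return +1 if perm (a permutation of range(len(perm))) is even, -1 if odd."""
    seen = [False] * len(perm)
    sign = 1
    for start in range(len(perm)):
        if seen[start]:
            continue
        # Trace the cycle through `start`; a k-cycle contributes (-1)**(k - 1).
        length, i = 0, start
        while not seen[i]:
            seen[i] = True
            i = perm[i]
            length += 1
        sign *= (-1) ** (length - 1)
    return sign

labels = ["red", "red", "blue", "green"]      # fewer labels than disks

identity  = [0, 1, 2, 3]                      # no shuffle
swap_reds = [1, 0, 2, 3]                      # swap the two red disks

print(parity(identity),  [labels[i] for i in identity])    # +1 ['red', 'red', 'blue', 'green']
print(parity(swap_reds), [labels[i] for i in swap_reds])   # -1 ['red', 'red', 'blue', 'green']

The two print statements show identical label sequences with opposite parities, which is exactly why no classical observer can do better than guessing.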
Quantum mechanics changes this conclusion. In their paper, which is published in Physical Review Letters, UAB’s Arnau Diebra and colleagues showed that as long as there are at least √n labels – far fewer than the total number of disks, since 10,000 disks would need only 100 distinct labels – one can still determine the parity of any permutation applied to the system when the game follows the rules of quantum mechanics. The problem remains the same; the only difference is that the initial state is now prepared as a quantum state. In other words, even when most of the detailed information about individual elements is erased, a global feature of the transformation survives, and carefully chosen quantum states and measurements make it possible to extract it. This is not sleight of hand: it is a genuine mathematical insight into how much information certain global properties retain under massive data reduction.
In the field of quantum science, it’s common to ask whether quantum systems can outperform classical ones at specific tasks, a phenomenon known as quantum advantage. Here, “advantage” does not necessarily mean doing everything faster, but rather the ability to solve carefully chosen problems using fewer resources such as time, memory or information. Notable examples include quantum algorithms that factor large numbers more efficiently than any known classical method, and quantum communication protocols that achieve tasks that would be impossible with classical correlations alone.
The parity-identification problem fits naturally into this landscape. Parity is a global property, insensitive to most local details. In this respect, it resembles many other quantities studied in quantum physics, from topological invariants to entanglement measures.
What makes quantum advantage possible in this problem is entanglement – and lots of it. A compound quantum system is said to be entangled when its subsystems are correlated in a nonclassical way. A simple example might be a pair of qubits (quantum bits) for which measuring the state of one qubit gives you information about the state of the other in a way that cannot be reproduced by any classical correlation. In their work, the UAB-CUNY physicists used a geometric measure of entanglement: the “distance” between the state of the system and a state in which all subsystems are separable (that is, not entangled). If this distance is too short, the protocol fails entirely.
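As a toy numerical illustration of this geometric measure (ours, not the paper’s calculation), the Python sketch below estimates the maximum squared overlap between a two-qubit Bell state and any product state by random sampling; the value approaches 1/2, so the Bell state keeps a fixed distance from the separable set:

import numpy as np

rng = np.random.default_rng(0)

def random_qubit():
    """A Haar-random single-qubit pure state as a normalized complex 2-vector."""
    v = rng.normal(size=2) + 1j * rng.normal(size=2)
    return v / np.linalg.norm(v)

# |Phi+> = (|00> + |11>)/sqrt(2) in the |00>, |01>, |10>, |11> basis
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)

# Maximum squared overlap with a separable (product) state, estimated by sampling
best = max(abs(np.vdot(np.kron(random_qubit(), random_qubit()), bell)) ** 2
           for _ in range(100_000))
print(best)   # approaches 0.5: no product state gets closer than that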
The crucial point is that entanglement allows information about the permutation to be stored in genuinely nonlocal correlations among particles (the “cards” in the deck), rather than in properties of each particle/card individually. In effect, the “memory” needed to identify the parity is written into the joint quantum state. No single particle carries the answer, but the system as a whole does. This is precisely what classical systems cannot replicate: once local labels are lost, there is nowhere left for the information to hide.
The fact that the threshold for quantum advantage scales with √n is one of the most intriguing aspects of the work. At present, the reason for this remains an open question. While Diebra and colleagues emphasize that the scaling is provably optimal within quantum mechanics, they acknowledge that a more intuitive or fundamental explanation is still missing. Finding such an explanation could illuminate broader principles governing how quantum systems compress and protect global information.
While the parity-identification problem has no immediate known applications, understanding how global properties can be inferred from limited information is crucial when dealing with realistic quantum devices, where noise, decoherence and imperfect measurements severely restrict what information can be accessed. Results like this therefore suggest that some computational or informational tasks may remain feasible even when our view of the system is drastically incomplete.
Speaking more broadly, the conceptual implications of proving new examples of quantum advantage are clear: even for extremely simple inference tasks, quantum strategies can outperform classical ones in unexpected and qualitatively new ways. The result therefore provides a clean testing ground for deeper questions about quantum resources, symmetry and information compression. Which specific features of entanglement are responsible for the advantage? Can similar thresholds be found for other groups or more complex symmetries? And does the square-root scaling reflect a universal principle?
For now, the work serves as a reminder that – even decades into the development of quantum information theory – basic questions about how information is stored, hidden, and revealed in quantum systems can still produce genuine surprises.
The post Physicists identify unexpected quantum advantage in a permutation parity task appeared first on Physics World.

Modern society has become profoundly reliant on Global Navigation Satellite Systems (GNSS). These systems support aviation safety, emergency services, finance, communications, energy networks and an expanding array of autonomous and industrial systems. Yet despite this reliance, GNSS remains inherently fragile: low‑power signals transmitted from medium Earth orbit are surprisingly easy to degrade, and the consequences […]
The post GNSS resilience is an economic and security priority appeared first on SpaceNews.

Voyager Technologies is investing in Max Space to help accelerate a partnership between the two companies on developing lunar habitats.
The post Voyager Technologies invests in Max Space appeared first on SpaceNews.

SpaceX is pushing back the first launch of the latest version of Starship even as NASA is asking it to accelerate work on a lunar lander version of the vehicle.
The post First Starship V3 launch slips appeared first on SpaceNews.

A senior Chinese space scientist and delegate to the country’s national congress is proposing the prioritization of an unprecedented orbiter mission to the ice giant Neptune.
The post Chinese official calls for prioritizing Neptune orbiter mission appeared first on SpaceNews.
A new way of searching for dark-matter candidate particles called axions has produced the tightest constraint yet on how they can interact with normal matter. Using a two-city network of quantum sensors based on nuclear spins, physicists in China narrowed the possible values of a parameter known as the axion-nucleon coupling below a limit previously set by astrophysical observations. As well as offering insights into the nature of dark matter, the technique could aid investigations of other beyond-the-Standard-Model phenomena such as axion stars, axion strings and Q-balls.
Dark matter is thought to make up over 25% of the universe’s mass, but it has never been detected directly. Instead, we infer its existence from its gravitational interactions with visible matter and its effect on the large-scale structure of the universe.
While the Standard Model of particle physics does not incorporate dark matter, several physicists have proposed ideas for how to bring it into the fold. One of the most promising involves particles called axions. First hypothesized in the 1970s as a way of explaining unresolved questions about charge-parity violation, axions are chargeless and much less massive than electrons. This means they interact only weakly with matter and electromagnetic radiation.
According to theoretical calculations, the Big Bang should have produced axions in abundance. During phase transitions in the early universe, these axions would have formed topological defects – defects that study leader Xinhua Peng of the University of Science and Technology of China (USTC) says should, in principle, be detectable. “These defects are expected to interact with nuclear spins and induce signals as the Earth crosses them,” Peng explains.
The problem, Peng continues, is that such signals are expected to be extremely weak and transient. She and her colleagues therefore developed an alternative axion search method that exploits a different predicted behaviour.
When fermions (particles with half-integer spin) interact, or couple, with axions, they should produce a pseudo-magnetic field. Peng and colleagues looked for evidence of this interaction using a network of five quantum sensors, four in Hefei and one in Hangzhou. These sensors combined a large ensemble of polarized rubidium-87 (⁸⁷Rb) atoms with polarized xenon-129 (¹²⁹Xe) nuclear spins.
“Using nuclear spins has many advantages,” Peng explains. “These include a higher energy resolution detection for topological dark matter (TDM) axions thanks to a much smaller gyromagnetic ratio of nuclear spins; substantial spin amplification owing to the high ensemble density of noble-gas spins; and efficient optimal filtering enabled by the long nuclear-spin coherence time.”
The USTC researchers’ setup also has other advantages over previous laboratory-based TDM searches, including the Global Network of Optical Magnetometers for Exotic physics searches (GNOME). While GNOME operates in a steady-state detection mode, the USTC researchers use a detection scheme that probes transient “free-decay oscillating” signals generated on spins after a TDM crossing. The USTC team also implemented a dual-phase optimal filtering algorithm to extract TDM signals with a signal-to-noise ratio at the theoretical maximum.
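The article’s sources do not spell out the filter, but the core idea of optimal (matched) filtering for a transient free-decay signal can be sketched in a few lines of Python, assuming an exponentially damped oscillation buried in white Gaussian noise (all frequencies, decay times and amplitudes below are illustrative, not the experiment’s values):

import numpy as np

fs = 1000.0                                   # sampling rate in Hz (illustrative)
t_data = np.arange(0, 2.0, 1 / fs)            # 2 s of sensor data
t_tmpl = np.arange(0, 1.0, 1 / fs)            # 1 s template

f0, tau = 25.0, 0.3                           # assumed signal frequency and decay time
template = np.exp(-t_tmpl / tau) * np.sin(2 * np.pi * f0 * t_tmpl)
template /= np.linalg.norm(template)          # unit-norm template

rng = np.random.default_rng(1)
data = rng.normal(size=t_data.size)           # white Gaussian noise background
onset = 500                                   # transient starts 0.5 s into the record
data[onset:onset + template.size] += 5.0 * template

# For white noise, correlating the data against the expected waveform is the
# optimal filter: the correlation peak marks the most likely crossing time.
snr = np.correlate(data, template, mode="valid")
print("estimated onset sample:", np.argmax(snr))   # close to 500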
Peng tells Physics World that these advantages enabled the team to explore regions of TDM parameter space well beyond limits set by astrophysical searches. The transient-state detection scheme also enables sensitive searches for TDM in the region where the axion mass exceeds 100 peV – a region that GNOME cannot access.
The researchers have not yet recorded a statistically significant topological crossing event using their setup, so the dark matter search is not over. However, they have set more stringent constraints on the axion-nucleon coupling across a range of axion masses from 10 peV to 0.2 μeV. Notably, they calculated that the energy scale associated with the axion-nucleon coupling must be greater than 4.1 × 10¹⁰ GeV at an axion mass of 84 peV. This limit is stricter than those obtained from astrophysical observations, though Peng notes that these rely on different assumptions.
Peng says the technique developed in this study, which is published in Nature, could lead to the development of even larger, more sensitive networks for detecting transient spin signals such as those from TDM. It also opens new avenues for investigating other physical phenomena beyond the Standard Model that have been theoretically proposed, but have so far lacked a pathway for experimental exploration.
The researchers now plan to increase the number of sensor stations in their network and extend their geographical baselines to intercontinental and even space-based scales. Peng explains that doing so will enhance the network’s detection sensitivity and boost signal confidence. “We also want to enhance the sensitivity of individual sensors via better spin polarization, longer coherence times and advanced quantum control techniques,” she says. Switching to a ³He–K system, she adds, could boost their current spin-rotation sensitivity by up to four orders of magnitude.
The post Long-distance quantum sensor network advances the search for dark matter appeared first on Physics World.


German launch startup Rocket Factory Augsburg (RFA) says it is planning its first launch for this summer after delivering two of its stages to the launch site.
The post RFA plans first launch this summer appeared first on SpaceNews.

Program will test large segmented optical system for future surveillance satellites
The post Air Force lab awards BlackSky contract worth up to $99 million for large optical satellite payload appeared first on SpaceNews.

NASA has selected the Centaur upper stage currently used on United Launch Alliance’s Vulcan rocket for future flights of the Space Launch System.
The post NASA selects Centaur for new SLS upper stage appeared first on SpaceNews.
