
The humanity of machines: the relationship between technology and our bodies

4 March 2026, 12:00

Humanity has had a complicated relationship with machines and technology for centuries. While we created these inventions to make our lives easier, and have become heavily reliant upon them, we have often feared their impact on society.

In her debut book, The Body Digital: a Brief History of Humans and Machines from Cuckoo Clocks to ChatGPT, Vanessa Chang tells the story of this symbiotic partnership, covering tools as diverse as the self-playing piano and generative AI products. The short book combines creative storytelling, an inward look at our bodies and interpersonal relationships, and a detailed history of invention. Chang – who is the director of programmes at Leonardo, the International Society for the Arts, Sciences, and Technology in California – offers us a framework for examining future worlds based on the relationship between humanity and machines.

“Technology” has no easy definition. The Body Digital therefore takes a broad approach, looking at software, machines, infrastructure and tools. Chang examines objects as mundane as the pen and as complex as the road networks that define our cities. She focuses on the interplay between machine and human: how tools have lightened our load and become embedded in our behaviour. In doing this she asks the reader: is it possible for the human body to extract itself from technology?

Each chapter of the book centres on a different part of the human anatomy – hand, voice, ear, eye, foot, body and mind – looking at the historical relationship between that body part and technology. Chang follows this thread through to the modern day and the large-scale impact these technologies have had on the development of our communities, communications and social structures. The chapters are a vehicle for Chang to present interesting pieces of history and discussions about society and culture. Her explanations are tightly knit, and the book covers huge ground in its relatively concise page count.

Chang avoids “doomerism”, remaining even-handed about our reservations towards technological advancement. She is careful in her discussion of new technology, particularly those that are often fraught in the public discourse, such as the use of generative AI in creating art, and the potential harms of facial-recognition software.

She includes genuine concerns – like biases creeping into training data for large language models – but mitigates these fears by discussing how technologies have become enmeshed in human culture through history. Our fear of some technologies has been unfounded – take, for example, the idea that the self-playing piano would supersede live piano concerts. These debates, Chang argues, have happened throughout the history of technology, and some of the same arguments from the past can easily be applied to future technology.

While this commentary is often thought-provoking, it sometimes doesn’t go as far as it might. There is relatively limited discussion throughout the book about the technological ecosystem we currently live in and how that might impact our level of optimism about the future. In particular, the topics of human labour being supplanted by machine labour, and the impacts of tech monoliths like Apple and Google, are relatively minimal.

In one example, Chang discusses the ways in which “telecommunication technologies might serve as channels into the afterlife”, allowing us to use technology to artificially recreate the voices of our loved ones after death. While the book contains a full discussion of how uncanny and alarming this type of “artistic necrophilia” might be, Chang tempers fear by pointing out that by being careful with our data, careful with our digital selves, we might be able to “mitigate the transformation of [our] voices into pure commodities”. However, the book’s discussion of who controls our data, the relationship between data and capital, and the level of control we have over how our data are used is somewhat limited.

Poetic technology

The line between offering interesting ideas and overexplaining is a difficult one to walk, and Chang walks it successfully. One striking feature of The Body Digital is the quality of the prose. Chang has a background in fiction writing and her descriptions reflect this. An automaton is anthropomorphized as a “petite, barefoot boy” with a “cloud of brown hair”; and the humble footpath is described as “veer[ing] at a jaunty angle from the pavement, an unruly alternative to concrete”. As a consequence, her ideas are interesting and memorable, making the book readable and often moving.

Particularly impressive is Chang’s attitude to exposition, which mimics fiction’s age-old adage of “show, don’t tell”. She gives the reader enough information to learn something new in context and ask follow-up questions, without banging the reader over the head with an answer to these questions. The book mimics the same relationship between the written word and human consciousness that Chang discusses within it. The Body Digital lingers with the reader in the way any good novel might, while teaching them something new.

The result is a poetic and well-observed text, which offers the reader a different way of understanding humanity’s relationship with technology. It reminds us that we have coexisted with machines throughout the history of our species, and that they have been helpful and positively shaped the direction of our world. While she covers too much ground to gaze in any one direction for too long, the reader is likely to come away enriched and perhaps even hopeful. And, as Chang points out, we have the opportunity to shape the future of technology, by “attending to the rich, idiosyncratic intelligence of our bodies”.

  • 2025 Melville House Publishing, 256pp, £14.99 pb / £9.49 ebook

The post The humanity of machines: the relationship between technology and our bodies appeared first on Physics World.

Making multipartite entanglement easier to detect

4 March 2026, 09:37

Genuine multipartite entanglement is the strongest form of entanglement, in which every part of a quantum system is entangled with every other part. It plays a central role in advanced quantum tasks such as quantum metrology and quantum error correction. To detect this deep form of entanglement in practice, researchers often use entanglement witnesses: fast, experimentally friendly tests that certify entanglement whenever a measurable quantity exceeds a certain bound.
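To illustrate the witness idea in its simplest textbook form (this is the standard qubit fidelity witness, not the qudit stabiliser construction developed in the paper), consider the three-qubit GHZ state. The witness W = I/2 − |GHZ⟩⟨GHZ| certifies genuine multipartite entanglement whenever Tr(Wρ) < 0, i.e. whenever the GHZ fidelity of the state exceeds 1/2. A minimal sketch in Python:

```python
import numpy as np

def ghz_state(n):
    """|GHZ_n> = (|0...0> + |1...1>) / sqrt(2) as a 2^n-dimensional vector."""
    psi = np.zeros(2**n)
    psi[0] = psi[-1] = 1 / np.sqrt(2)
    return psi

def fidelity_witness_value(rho, psi):
    """Tr(W rho) for the fidelity witness W = I/2 - |psi><psi|.
    A negative value certifies genuine multipartite entanglement."""
    return 0.5 - np.real(psi.conj() @ rho @ psi)

n = 3
psi = ghz_state(n)
target = np.outer(psi, psi)

# White-noise robustness: rho = p * I/2^n + (1 - p) * |GHZ><GHZ|.
# The witness stays negative for p < 4/7, its white-noise threshold.
for p in (0.3, 0.5, 0.6):
    rho = p * np.eye(2**n) / 2**n + (1 - p) * target
    print(f"p = {p}: Tr(W rho) = {fidelity_witness_value(rho, psi):+.4f}")
```

The “noise robustness” discussed below is exactly this kind of threshold: a more robust witness remains negative up to a higher noise fraction p.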

In this work, the researchers significantly extend previous witness‑construction methods to cover a much broader family of multipartite quantum states. Their approach is built within the multi‑qudit stabiliser formalism, a powerful framework widely used in quantum error correction and known for describing large classes of entangled states, both pure and mixed. They generalise earlier results in two major directions: (i) to systems with arbitrary prime local dimension, going far beyond qubits, and (ii) to stabiliser subspaces, where the stabiliser defines not just a single state but an entire entangled subspace.

This generalisation allows them to construct witnesses tailored to high‑dimensional graph states and to stabiliser‑defined subspaces, and they show that these witnesses can be more robust to noise than those designed for multiqubit systems. In particular, witnesses tailored to GHZ‑type states achieve the strongest resistance to white noise, and in some cases the authors identify the most noise‑robust witness possible within this construction. They also demonstrate that stabiliser‑subspace witnesses can outperform graph‑state witnesses when the local dimension is greater than two.

Overall, this research provides more powerful and flexible tools for detecting genuine multipartite entanglement in noisy, high‑dimensional and computationally relevant quantum systems. It strengthens our ability to certify complex entanglement in real‑world quantum technologies and opens the door to future extensions beyond the stabiliser framework.

Read the full article

Entanglement witnesses for stabilizer states and subspaces beyond qubits

Jakub Szczepaniak et al 2025 Rep. Prog. Phys. 88 117602

Do you want to learn more about this topic?

Focus on Quantum Entanglement: State of the Art and Open Questions guest edited by Anna Sanpera and Carlo Marconi (2025-2026)


Resolving the spin of sound

4 March 2026, 09:32

Acoustic waves are usually thought of as purely longitudinal, oscillating back and forth along the direction of travel with no intrinsic rotation and therefore no spin (spin‑0). Recent work has shown that acoustic waves can in fact carry local spin‑like behaviour. However, until now, the total spin angular momentum of an acoustic field was believed to vanish, with the local positive and negative spin contributions cancelling each other to give an overall global spin‑0. In this work, the researchers show that acoustic vortex beams can carry a non‑zero longitudinal spin angular momentum when the beam is guided by certain boundary conditions. This overturns the long‑held assumption that longitudinal waves cannot possess a global spin degree of freedom.
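The local spin‑like behaviour mentioned above comes from the velocity field rotating locally even though the wave is longitudinal. In the acoustic-spin literature the local spin density is proportional to Im(v* × v) for a complex velocity amplitude v (the prefactor ρ₀/2ω is dropped here, and the field values are illustrative assumptions, not taken from the paper). A minimal numerical sketch:

```python
import numpy as np

def spin_direction(v):
    """Im(conj(v) x v): the direction of the local acoustic spin density
    (up to the prefactor rho_0 / 2 omega). Nonzero only when the
    velocity field traces out a local rotation."""
    return np.imag(np.cross(np.conj(v), v))

# A purely linear oscillation along x: no local rotation, no spin.
v_linear = np.array([1.0, 0.0, 0.0])

# Equal x and y components 90 degrees out of phase: the velocity
# vector rotates in the x-y plane, giving spin along z.
v_circular = np.array([1.0, 1.0j, 0.0]) / np.sqrt(2)

print(spin_direction(v_linear))    # [0. 0. 0.]
print(spin_direction(v_circular))  # [0. 0. 1.]
```

The paper’s result concerns the field’s *total* spin: whether such local contributions can fail to cancel globally in a guided vortex beam.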

Using a self‑consistent theoretical framework, the researchers derive the full spin, orbital and total angular momentum of these beams and reveal a new kind of spin–orbit interaction that appears when the beam is compressed or expanded. They also uncover a detailed relationship between the two competing descriptions of angular momentum in acoustics: the canonical (Minkowski) and kinetic (Abraham) forms. They demonstrate that only the canonical‑Minkowski form is truly conserved and directly tied to the beam’s azimuthal quantum number, which describes how the wave twists as it travels.

The team further demonstrates this mechanism experimentally using a waveguide with a slowly varying cross‑section. They show that the effect is not limited to this setup: it can also arise in evanescent acoustic fields and even in other wave systems such as electromagnetism. These results introduce a missing fundamental degree of freedom in longitudinal waves, offer new strategies for manipulating acoustic spin and orbital angular momentum, and open the door to future applications in wave‑based devices, underwater communication and particle manipulation.

Read the full article

Longitudinal acoustic spin and global spin–orbit interaction in vortex beams

Wei Wang et al 2025 Rep. Prog. Phys. 88 110501

Do you want to learn more about this topic?

Acoustic manipulation of multi-body structures and dynamics by Melody X Lim, Bryan VanSaders and Heinrich M Jaeger (2024)


Quantum memories could help make long-baseline optical astronomy a reality

4 March 2026, 09:30

Quantum-entangled sensors placed over a kilometre apart could allow interferometric measurements of optical light with single photon sensitivity, experiments in the US suggest. While this proof-of-principle demonstration of a theoretical proposal first made in 2012 is not yet practically useful for astronomy, it marks a significant step forward in quantum sensing.

Radio telescopes are often linked together to provide more detailed images with better angular resolution than would otherwise be possible. The Event Horizon Telescope array, for example, performs very long baseline interferometry of signals from observatories on four continents to take astrophysical images such as the first picture of a black hole in 2019. At shorter wavelengths, however, much weaker signals are often parcelled into higher-energy photons. “You start getting this granularity at the single photon level,” says Pieter-Jan Stas at Harvard University.

According to textbook quantum mechanics, one can create an interferometric image from single photons by recombining their paths at a single detector – provided that their paths are not measured before then. This principle is used in laboratory spectroscopy. In astronomical observations, however, attempting to transport single photons from widely spread telescopes to a central detector would almost certainly result in them being lost. The baseline of infrared and optical telescopes is therefore restricted to about 300 m.
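The payoff of a longer baseline follows from the diffraction limit: the finest resolvable angle scales as λ/B for wavelength λ and baseline B. A rough back-of-the-envelope sketch (the baselines and wavelengths here are illustrative round numbers, not figures from the experiment):

```python
# Diffraction-limited angular resolution: theta ~ lambda / B (radians),
# converted to milliarcseconds to show why baseline length matters.
RAD_TO_MAS = 180 / 3.141592653589793 * 3600e3  # radians -> milliarcseconds

def resolution_mas(wavelength_m, baseline_m):
    return wavelength_m / baseline_m * RAD_TO_MAS

# 1.3 mm radio waves over an Earth-scale (~10,000 km) baseline:
print(resolution_mas(1.3e-3, 1.0e7))   # ~0.027 mas
# 550 nm optical light over a ~300 m baseline, the current optical limit:
print(resolution_mas(550e-9, 300.0))   # ~0.38 mas
# The same optical light over a 1.5 km quantum-linked baseline:
print(resolution_mas(550e-9, 1500.0))  # ~0.076 mas
```

This is why even a modest extension of the optical baseline is attractive: optical wavelengths are so short that kilometre-scale baselines would rival what radio interferometry achieves only with Earth-sized arrays.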

In 2012, theorist Daniel Gottesman, then at the Perimeter Institute for Theoretical Physics in Canada, and colleagues proposed using a central single source of entangled photons as a quantum repeater to generate entanglement between two detection sites, putting them into the same quantum state. The effect of an incoming photon on this combined state could therefore be measured without having to recombine the paths and collect the photon at a central detector.

Hidden information

“In reality, the photon will be in a superposition of arriving at both of the detectors,” says Stas. “That’s where this advantage comes from – you have this photon that is delocalized and arrives at both the left and the right station – so you truly have this baseline that helps you with improving your resolution, but to do this you have to keep the ‘which path’ information hidden.”

The 2012 proposal was not thought to be practical, because it required distributing entanglement at a rate comparable with the telescope’s spectral bandwidth. In 2019, however, Harvard’s Mikhail Lukin and colleagues proposed integrating a quantum memory into the system. In the new research, they demonstrate this in practice.

The team used qubits made from silicon–vacancy centres in diamond. These can be very long lived because the spin of the centre’s electron (which interacts with the photon) is mapped to the nuclear spin, which is very stable. The researchers used a central laser as a coherent photon source, generating heralded entanglement to certify that the qubits were event-ready. “It’s not like you have to receive the space signal to be simultaneous with the arrival of the photon,” says team member Aziza Suleymanzade at the University of California, Berkeley. “In our case, we distribute entanglement, and it has some coherence time, and during that time you can detect your signal.”

Using two detectors placed in adjacent laboratories and synthetic light sources, the researchers demonstrated photon detection above vacuum fluctuations in fibres over 1.5 km in length. They acknowledge that much work remains before this can be viable in practical astronomy, such as a higher rate of entanglement generation, but Stas says that “this is one step towards bringing quantum techniques into sensing”.

Similar work in China

The research is described in Nature. Researchers in China led by Jian-Wei Pan have achieved a similar result, but their work has yet to be peer reviewed.

Yujie Zhang of the University of Waterloo in Canada points out that Lukin and colleagues have done similar work on distributed quantum communication and the quantum internet. “The major difference is that for most of the original protocols, what people care about is trying to entangle different quantum memories in the quantum network so then they can do gates on those quantum memories,” he says. “There’s nothing about extra information from the environment… This one is different in that they have to get the information mapped from the starlight to their quantum memory.” He notes several difficulties acknowledged by the researchers – such as the fact that vacancy centres are very narrowband – but says that now people know the system can work, they can work to show that it can beat classical systems in practice.

“I think this is definitely a step towards [realizing the protocol envisaged in 2012],” says Gottesman, now at the University of Maryland, College Park. “There have been previous experiments where they generated the entanglement and they did some interference but they didn’t have the repeater aspect, which is the real value-added aspect of doing quantum-assisted interferometry. Its rate is still well short of what you’d need to have a functioning telescope, but this is putting one of the important pieces into place.”



UK physics leaders express ‘deep concern’ over funding cuts in letter to science minister Patrick Vallance

3 March 2026, 15:48

The heads of university physics departments in the UK have published an open letter expressing their “deep concern” about funding changes announced late last year by UK Research and Innovation (UKRI), the umbrella organisation for the UK’s research councils.

Addressed to science minister Patrick Vallance, the letter says the cuts are causing “reputational risk” and calls for “strategic clarity and stability” to ensure that UK physics can thrive.

It has so far been signed by 58 people who represent 45 different universities, including Birmingham, Bristol, Cambridge, Durham, Imperial College, Liverpool, Manchester and Oxford.

The letter says that the changes at UKRI “risk undermining science’s fundamental role in improving our prosperity, health and quality of life, as well as delivering sustainable growth through innovation, productivity and scientific leadership”.

The signatories warn that the UK’s international standing in physics is “a strategic asset” and that areas such as particle physics, astronomy and nuclear physics are “especially important”.

Raising concerns

The decision by the heads of physics to write to Vallance comes in the wake of UKRI stating in December that it will be adjusting how it allocates government funding for scientific research and infrastructure.

The Science and Technology Facilities Council (STFC), which is part of UKRI, stated that projects would need to be cut given inflation, rising energy costs as well as “unfavourable movements in foreign exchange rates” that have increased STFC’s annual costs by over £50m a year.

The STFC noted that it would need to reduce spending from its core budget by at least 30% compared with 2024/2025 levels while also cutting the number of projects financed by its infrastructure fund.

The council has already said two UK national facilities – the Relativistic Ultrafast Electron Diffraction and Imaging facility and a mass spectrometry centre dubbed C‑MASS – will now not be prioritised.

In addition, two international particle-physics projects will not be supported: a UK-led upgrade to the LHCb experiment at CERN as well as a contribution to the Electron-Ion Collider at the Brookhaven National Laboratory that is currently being built.

Philip Burrows, director of the John Adams Institute for Accelerator Science at the University of Oxford, who is one of the signatories of the letter, told Physics World that the cuts are “like buying a Formula-1 car but not being able to afford the driver”.

Burrows acknowledges that the STFC has been hit “particularly hard” by its flat-cash settlement, given that a large fraction of its expenditure goes on paying the UK’s subscriptions to international facilities and on operating the UK’s flagship national facilities.

But because most of the rest of the STFC’s budget supports scientists to do research at those facilities, he is concerned that the funding cuts will fall disproportionately on the science programme.

“Constraining these areas risks weakening the very talent pipeline on which the UK’s innovation economy depends,” the letter states. “Fundamental physics also delivers substantial public engagement and cultural impact, strengthening public support for science and reinforcing the UK’s reputation as a global scientific leader.”

The signatories also say they are “particularly concerned” about the UK’s capacity to lead the scientific exploitation of major international projects. “An abrupt pause in funding for key international science programmes risks damaging UK researchers’ competitive advantage into the 2040s,” they note.

The letter now calls on the government to work with UKRI and STFC to “stabilise” curiosity-driven grants for physics within STFC “at a minimum of flat funding in real terms” as well as protect post-docs, students and technicians from the cuts.

It also calls on the UK to develop a long-term strategy for infrastructure, and on the government to address facilities cost pressures through “dedicated and equitable mechanisms so that external shocks do not singularly erode the UK’s research base in STFC-funded research areas”.

The news comes as Michele Dougherty today formally stepped down from her role as IOP president. Dougherty, who also holds the position of executive chair of the STFC, had previously stepped back from presidential duties on 26 January due to a conflict of interest.

Paul Howarth, who has been IOP president-elect since September, will now become IOP president.


Ancient reversal of Earth’s magnetic field took an extraordinarily long time

3 March 2026, 15:00

The Earth’s magnetic poles have reversed 540 times over the past 170 million years. Usually, these reversals are relatively speedy in geological terms, taking around 10,000 years to complete. Now, however, scientists in the US, France and Japan have found evidence of much slower reversals deep in Earth’s geophysical past. Their findings could have important implications for our understanding of Earth’s climate and evolutionary history.

Scientists think the Earth’s magnetic field arises from a dynamo effect created by molten metal circulating inside the planet’s outer core. Its consequences include the bubble-like magnetosphere, which shields us from the solar wind and cosmic radiation that would otherwise erode our atmosphere.

From time to time, this field weakens, and the Earth’s magnetic north and south poles switch places. This is known as a geomagnetic reversal, and we know about it because certain types of terrestrial rocks and marine sediment cores contain evidence of past reversals. Judging from this evidence, reversals usually take a few thousand years, during which time the poles drift before settling again on opposite sides of the globe.

Looking into the past

Researchers led by Yuhji Yamamoto of Kochi University, Japan and Peter Lippert at the University of Utah, US, have now identified two major exceptions to this rule. Drawing on evidence obtained during the Integrated Ocean Drilling Program expedition in 2012, they say that around 40 million years ago, during the Eocene epoch, the Earth experienced two reversals that took 18,000 and 70,000 years.

The team based these findings on cores of sediment extracted off the coast of Newfoundland, Canada, up to 250 metres below the seabed. These cores contain crystals of magnetite – an iron oxide – produced by a combination of ancient microorganisms and other natural processes. The magnetite grains aligned with the polarity of the Earth’s magnetic field at the time the sediments were deposited. Because marine sediments are far less affected by erosion and weathering than sediments onshore, Yamamoto says the information they preserve about past Earth environments – including geomagnetic conditions – is exceptionally clean.

Significance for evolutionary history

The team says the difference between a geomagnetic reversal that takes 10,000 years and one that takes 70,000 years is significant because prolonged intervals of weaker geomagnetic fields would have exposed the Earth to higher amounts of cosmic radiation for longer. The effects on living creatures could have been devastating, says Lippert. As well as higher rates of genetic mutations due to increased radiation, he points out that organisms from bacteria to birds use the Earth’s magnetic field while navigating. “A lower strength field would create sustained pressures on these organisms to adapt,” he says.

If humans had existed at the time of these reversals, the effects on our species could have been similarly profound. “Modern humans (Homo sapiens) are thought to have begun dispersing out of Africa only about 50,000 years ago,” Yamamoto observes. “If a geomagnetic reversal can persist for a period comparable to – or even longer than – this timescale, it implies that the Earth’s environment could undergo substantial and continuous change throughout the entire period of human evolution.”

Although our genetic ancestors dodged that particular bullet, Yamamoto thinks the team’s findings, which are published in Nature Communications Earth & Environment, offer a valuable perspective on how evolution and environmental change could interact in the future. “This period corresponds to an epoch when Earth was far warmer than it is today, and when Greenland is thought to have been a truly ‘green land’,” he explains. “We also know that atmospheric CO₂ concentrations during this era were comparable to levels projected for the end of this century, making it an important ‘climate analogue’ for understanding near‑future climate conditions.”

The discovery could also have more direct implications for future life on Earth. The magnitude of the Earth’s magnetic field has decreased by around 5% in each century since records began. This decrease, combined with the slow drift of our current magnetic North Pole towards Siberia, could indicate that we are in the early stages of a new geomagnetic reversal. Re‑evaluating the duration of such reversals is thus not only an issue for geophysicists, Yamamoto says. It’s also an important opportunity to reconsider fundamental questions about how we should coexist with our planet and how we ought to confront a continually changing environment.
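To put that 5%-per-century figure in perspective, a quick calculation (assuming, purely for illustration, that the decline continues at the same compounding rate, which it need not):

```python
import math

# If the field strength drops ~5% each century and the rate compounds,
# how many centuries until it falls to half its present value?
decay_per_century = 0.05
centuries_to_halve = math.log(0.5) / math.log(1 - decay_per_century)
print(round(centuries_to_halve, 1))  # ~13.5 centuries
```

On that naive extrapolation the field would take over a thousand years to halve – a reminder that even a "rapid" geomagnetic change unfolds on timescales far beyond human records, which is precisely why sedimentary archives like those in this study matter.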

Motivation for future studies

John Tarduno, a geophysicist at the University of Rochester, US, who was not involved in the study, describes it as “outstanding” work that “documents an exciting discovery bearing on the nature of magnetic shielding through time and the geomagnetic reversal process”. He agrees that reduced shielding could have had biotic effects, and adds that the discovery of long reversal transitions could influence scientific thinking on the statistics of field reversals – including questions of whether the field retains some “memory” of previous events. “This new study will provide motivation to examine reversal transitions at very high resolution,” Tarduno says.

For their next project, Yamamoto and colleagues aim to use sequences of lava flows in Iceland to analyse how the Earth’s magnetic field evolved. Lippert’s team, for its part, will be studying features called geomagnetic excursions that appear in both deep sea and terrestrial sediments. Such excursions are evidence of short-lived, incomplete attempts at field reversals, and Lippert explains that they can be excellent stratigraphic markers, helping scientists correlate records on geological timescales and compare them with samples taken from different parts of the world. “Excursions, like long reversals, can inform our understanding of what ultimately causes a geomagnetic field reversal to start and persist to completion,” he says.


Spectrum showdown

3 March 2026, 13:00
Illustration of a Weather System Follow-on Microwave satellite. Credit: BAE Systems

As satellite communications constellations grow in size and number, they are also competing for a scarce and increasingly valuable resource: spectrum, the bands of radio frequencies that are crucial for communications and broadband service — and for tracking weather. The pace is intensifying as companies race to expand global communications networks, raising alarms at some […]

The post Spectrum showdown appeared first on SpaceNews.
