
High-speed 3D microscope improves live imaging of fast biological processes

12 September 2025, 10:00

A new high-speed multifocus microscope could facilitate discoveries in developmental biology and neuroscience thanks to its ability to image rapid biological processes over the entire volume of tiny living organisms in real time.

Images from many 3D microscopes are acquired sequentially by scanning through different depths, making them too slow for accurate live imaging of fast-moving processes in individual cells and microscopic animals. Even current multifocus microscopes that capture 3D images simultaneously either have relatively poor image resolution or can image only to shallow depths.

In contrast, the new 25-camera “M25” microscope – developed during his doctorate by Eduardo Hirata-Miyasaki and his supervisor Sara Abrahamsson, both then at the University of California Santa Cruz, together with collaborators at the Marine Biological Laboratory in Massachusetts and the New Jersey Institute of Technology – enables high-resolution 3D imaging over a large field-of-view, with each camera capturing 180 × 180 × 50 µm volumes at a rate of 100 per second.

“Because the M25 microscope is geared towards advancing biomedical imaging we wanted to push the boundaries for speed, high resolution and looking at large volumes with a high signal-to-noise ratio,” says Hirata-Miyasaki, who is now based in the Chan Zuckerberg Biohub in San Francisco.

The M25, detailed in Optica, builds on previous diffractive-based multifocus microscopy work by Abrahamsson, explains Hirata-Miyasaki. In order to capture multiple focal planes simultaneously, the researchers devised a multifocus grating (MFG) for the M25. This diffraction grating splits the image beam coming from the microscope into a 5 × 5 grid of evenly illuminated 2D focal planes, each of which is recorded on one of the 25 synchronized machine vision cameras, such that every camera in the array captures a 3D volume focused on a different depth. To avoid blurred images, a custom-designed blazed grating in front of each camera lens corrects for the chromatic dispersion (which spreads out light of different wavelengths) introduced by the MFG.
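As a rough illustration of the geometry, the 25 focal planes can be pictured as evenly spaced steps through the 50 µm imaging depth. The even spacing is a simplifying assumption for this sketch; the actual plane positions are set by the MFG design.

```python
# Sketch: depth assignment for a 5x5 multifocus grid (M25-style).
# Assumes 25 evenly spaced focal planes spanning the 50 um depth
# reported for each volume; the real grating spacing may differ.
GRID = 5              # 5 x 5 grid of focal planes
DEPTH_UM = 50.0       # total imaged depth per volume, in micrometres

step = DEPTH_UM / (GRID * GRID - 1)   # spacing between adjacent planes
planes = [round(i * step, 3) for i in range(GRID * GRID)]

# Print the grid row by row: each camera sees one of these depths.
for row in range(GRID):
    print(planes[row * GRID:(row + 1) * GRID])
```

With 25 planes over 50 µm, adjacent cameras are focused roughly 2 µm apart, which is why every camera in the array sees a different slice of the same volume at the same instant.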

The team used computer simulations to reveal the optimal designs for the diffractive optics, before creating them at the University of California Santa Barbara nanofabrication facility by etching nanometre-scale patterns into glass. To encourage widespread use of the M25, the researchers have published the fabrication recipes for their diffraction gratings and made the bespoke software for acquiring the microscope images open source. In addition, the M25 mounts to the side port of a standard microscope, and uses off-the-shelf cameras and camera lenses.

The M25 can image a range of biological systems, since it can be used for fluorescence microscopy – in which fluorescent dyes or proteins are used to tag structures or processes within cells – and can also work in transmission mode, in which light is shone through transparent samples. The latter allows small organisms like C. elegans larvae, which are commonly used for biological research, to be studied without disrupting them.

The researchers performed various imaging tests using the prototype M25, including observations of the natural swimming motion of entire C. elegans larvae. This ability to study cellular-level behaviour in microscopic organisms over their whole volume may pave the way for more detailed investigations into how the nervous system of C. elegans controls its movement, and how genetic mutations, diseases or medicinal drugs affect that behaviour, Hirata-Miyasaki tells Physics World. He adds that such studies could further our understanding of human neurodegenerative and neuromuscular diseases.

“We live in a 3D world that is also very dynamic. So with this microscope I really hope that we can keep pushing the boundaries of acquiring live volumetric information from small biological organisms, so that we can capture interactions between them and also [see] what is happening inside cells to help us understand the biology,” he continues.

As part of his work at the Chan Zuckerberg Biohub, Hirata-Miyasaki is now developing deep-learning models for analysing multichannel live datasets of dynamic cells and organisms, like those acquired by the M25, “so that we can extract as much information as possible and learn from their dynamics”.

Meanwhile, Abrahamsson, who is currently working in industry, hopes that other microscopy development labs will build their own M25 systems. She is also considering commercializing the instrument to help ensure its widespread use.

The post High-speed 3D microscope improves live imaging of fast biological processes appeared first on Physics World.


Juno: the spacecraft that is revolutionizing our understanding of Jupiter

11 September 2025, 15:55

This episode of the Physics World Weekly podcast features Scott Bolton, who is principal investigator on NASA’s Juno mission to Jupiter. Launched in 2011, the mission has delivered important insights into the nature of the gas-giant planet. In this conversation with Physics World’s Margaret Harris, Bolton explains how Juno continues to change our understanding of Jupiter and other gas giants.

Bolton and Harris chat about the mission’s JunoCam, which has produced some gorgeous images of Jupiter and its moons.

Although the Juno mission was expected to last only a few years, the spacecraft is still going strong despite operating in Jupiter’s intense radiation belts. Bolton explains how the Juno team has rejuvenated radiation-damaged components, which has provided important insights for those designing future missions to space.

However, Juno’s future is uncertain. Despite its great success, the mission is currently scheduled to conclude at the end of September, something Bolton also addresses in the conversation.


Optimizing upright proton therapy: hybrid delivery provides faster, sharper treatments

11 September 2025, 10:00

A combination of static proton arcs and shoot-through proton beams could increase plan conformity and homogeneity and reduce delivery times in upright proton therapy, according to new research from RaySearch Laboratories in Sweden.

Proton arc therapy (PAT) is an emerging rotational delivery technique with potential to improve plan quality – reducing dose to organs-at-risk while maintaining target dose. The first clinical PAT treatments employed static arcs, in which multiple energy layers are delivered from many (typically 10 to 30) discrete angles. Importantly, static arc PAT can be delivered on conventional proton therapy machines. It also offers simpler beam arrangements than intensity-modulated proton therapy (IMPT).

“In IMPT of head-and-neck cancers, the beam directions are normally set up in a complicated pattern in different planes, with range shifters needed to treat the shallow part of the tumour,” explains Erik Engwall, chief physicist at RaySearch Laboratories. “In PAT, the many beam directions are arranged in the same plane and no range shifters are typically needed. With all beams in the same plane, it is easier to move to upright treatments.”

Upright proton therapy involves rotating the patient (in an upright position) in front of a static horizontal treatment beam. The approach could reduce costs by using compact proton delivery systems. This compactness, however, places energy selection close to the patient, increasing scattering in the proton beam. To combat this, the team propose adding a layer of shoot-through protons to each direction of the proton arc.

The idea is that while most protons are delivered with Bragg peaks placed in the target, the sharp penumbra of the high-energy protons shooting through the target will combat beam broadening. The rotational delivery in the proton arc spreads the exit dose from these shoot-through beams over many angles, minimizing dose to surrounding tissues. And as the beamline is fixed, shoot-through protons exit in the same direction (behind the patient) for all angles, simplifying shielding to a single beam dump opposite the fixed beam.

Simulation studies

To test this approach, Engwall and colleagues simulated treatment plans for a virtual phantom containing three targets and an organ-at-risk, reporting their findings in Medical Physics. They used a development version of RayStation v2025 with a beam model of the Mevion s250-FIT system (which combines a compact cyclotron, an upright positioner and an in-room CT scanner).

For each target, the team created static arc plans with (Arc+ST) and without shoot-through beams and with/without collimation, as well as 3-beam IMPT plans with and without shoot-through beams (all with collimation). Arc plans used 20 uniformly spaced beam directions, and the shoot-through plans included an additional layer of the highest system energy (230 MeV) for each direction.
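The Arc+ST plan structure described above can be sketched as simple bookkeeping: 20 uniformly spaced beam directions, each carrying its Bragg-peak energy layers plus one shoot-through layer at the highest system energy. The Bragg-layer energies below are hypothetical placeholders; only the 230 MeV shoot-through value comes from the text.

```python
# Sketch: direction and energy-layer bookkeeping for an Arc+ST plan.
# The 230 MeV shoot-through layer is from the paper; the Bragg-peak
# layer energies here are illustrative, not actual plan values.
N_DIRECTIONS = 20
SHOOT_THROUGH_MEV = 230.0                  # highest system energy
bragg_layers_mev = [110.0, 120.0, 130.0]   # hypothetical Bragg layers

# 20 uniformly spaced directions around a full rotation.
angles_deg = [i * 360.0 / N_DIRECTIONS for i in range(N_DIRECTIONS)]

# Every direction gets its Bragg layers plus the shoot-through layer.
plan = {a: bragg_layers_mev + [SHOOT_THROUGH_MEV] for a in angles_deg}

print(len(plan), plan[0.0][-1])
```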

For all targets, Arc+ST plans showed superior conformity, homogeneity and target robustness to arc plans without shoot-through protons. Adding collimation slightly improved the arc plans without shoot-through protons but had little impact on Arc+ST plans.

The IMPT plans achieved similar homogeneity and robustness to the best arc plans, but with far lower conformity due to the shoot-through protons delivering a concentrated exit dose behind the target (while static arcs distribute this dose over many directions). Adding shoot-through protons improved IMPT plan quality, but to a lesser degree than for PAT plans.

Clinical case

The researchers repeated their analysis for a clinical head-and-neck cancer case, comparing static arcs with 5-beam IMPT. Again, Arc+ST plans performed better than any others for almost all metrics. “The Arc+ST plans have the best quality due to the sharpening of the penumbra of the shoot-through part, even better than when using a collimator,” says Engwall.

Plan comparisons: (a) static arc with an additional shoot-through layer, (b) partial static arcs with collimation and (c) 5-beam collimated plan. Panel (d) shows the shoot-through portion of the dose distribution in (a). Dose–volume histograms are displayed for the targets and representative organs-at-risk. (Courtesy: CC BY 4.0/Med. Phys. 10.1002/mp.18051)

Notably, the findings suggest that collimation is not needed when combining arcs with shoot-through beams, enabling rapid treatments. With fast energy switching and the patient rotation at 1 rpm, Arc+ST achieved an estimated delivery time of less than 5.4 min – faster than all other plans for this case, including 5-beam IMPT.
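A back-of-envelope version of that timing estimate can be put together from the plan parameters, with all per-layer timings invented for illustration; the sub-5.4 min figure in the paper comes from a full delivery simulation, not this arithmetic.

```python
# Back-of-envelope delivery-time sketch for a static-arc plan.
# Per-layer and switching times are assumptions for illustration only.
N_ANGLES = 20                 # discrete beam directions (from the paper)
LAYERS_PER_ANGLE = 4          # hypothetical: Bragg layers + shoot-through
ROTATE_S = 60.0 / N_ANGLES    # 1 rpm -> 3 s to rotate between angles
DELIVERY_S = 2.0              # hypothetical beam-on time per layer
SWITCH_S = 0.5                # hypothetical fast energy-switch time

per_angle_s = LAYERS_PER_ANGLE * (DELIVERY_S + SWITCH_S)
total_s = N_ANGLES * (ROTATE_S + per_angle_s)
print(round(total_s / 60, 1), "min")
```

With these made-up numbers the estimate lands around 4.3 min, comfortably consistent with the reported sub-5.4 min delivery; the point is that skipping collimator leaf motion removes one of the slowest terms from this sum.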

“Treatment time is reduced when the leaves of the dynamic collimator do not need to move,” Engwall explains. “There is also no risk of mechanical failures of the collimator and the secondary neutron production will be lower when there are fewer objects in the beamline.”

Another benefit of upright delivery is that the shoot-through protons can be used for range verification during treatment, using a detector integrated into the beam dump behind the patient. The team investigated this concept with three simulated error scenarios: a 5% systematic shift in stopping-power ratio; a 5 mm setup shift; and a 2 cm shoulder movement. The technique successfully detected all three errors.

As the range detector is permanently installed in the treatment room and the shoot-through protons are part of the treatment plan, this method does not add time to the patient setup and can be used in every treatment fraction to detect both intra- and inter-fraction uncertainties.

Although this is a proof-of-concept study, the researchers conclude that it highlights the combined advantages of the new treatment technique, which could “leverage the potential of compact upright proton treatments and make proton treatments more affordable and accessible to a larger patient group”.

Engwall tells Physics World that the team is now collaborating with several clinical research partners to investigate the technique’s potential across larger patient data sets, for other treatment sites and multiple treatment machines.


LIGO could observe intermediate-mass black holes using artificial intelligence

10 September 2025, 19:41

A machine learning-based approach that could help astronomers detect lower-frequency gravitational waves has been unveiled by researchers in the UK, US, and Italy. Dubbed deep loop shaping, the system would apply real-time corrections to the mirrors used in gravitational wave interferometers. This would dramatically reduce noise in the system, and could lead to a new wave of discoveries of black hole and neutron star mergers – according to the team.

In 2015, the two LIGO interferometers made the very first observation of a gravitational wave, attributed to the merger of two black holes roughly 1.3 billion light-years from Earth.

Since then, numerous gravitational waves have been observed at frequencies ranging from 30 to 2000 Hz. These are believed to come from the mergers of small black holes and neutron stars.

So far, however, the lower reaches of the gravitational wave frequency spectrum (corresponding to much larger black holes) have gone largely unexplored. Being able to detect gravitational waves at 10–30 Hz would allow us to observe the mergers of intermediate-mass black holes at 100–100,000 solar masses. We could also measure the eccentricities of binary black hole orbits. However, these detections are not currently possible because of vibrational noise in the mirrors at the end of each interferometer arm.
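The link between frequency band and black-hole mass can be sketched with the standard innermost-stable-circular-orbit (ISCO) estimate for a non-spinning binary. This is an order-of-magnitude guide only, not the waveform analysis LIGO actually uses.

```python
import math

# Sketch: gravitational-wave frequency at ISCO for a non-spinning
# binary of total mass M. Heavier systems merge at lower frequencies,
# which is why the 10-30 Hz band targets intermediate-mass black holes.
C = 2.998e8        # speed of light (m/s)
G = 6.674e-11      # gravitational constant (SI)
M_SUN = 1.989e30   # solar mass (kg)

def f_isco_hz(total_mass_suns: float) -> float:
    m = total_mass_suns * M_SUN
    return C**3 / (6**1.5 * math.pi * G * m)

for mass in (100, 200, 400):
    print(f"{mass} M_sun -> {f_isco_hz(mass):.0f} Hz")
```

A 100 solar-mass binary merges near 44 Hz, and by a few hundred solar masses the merger frequency has already dropped into the 10–30 Hz band that is currently swamped by mirror noise.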

Subatomic precision

“As gravitational waves pass through LIGO’s two 4-km arms, they warp the space between them, changing the distance between the mirrors at either end,” explains Rana Adhikari at Caltech, who is part of the team that has developed the machine-learning technique. “These tiny differences in length need to be measured to an accuracy of 10⁻¹⁹ m, which is 1/10,000th the size of a proton. [Vibrational] noise has limited LIGO for decades.”
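The strain those numbers imply is easy to check: dividing the quoted length accuracy by the arm length gives the dimensionless strain the instrument must resolve.

```python
# Sketch: strain sensitivity implied by the figures quoted above.
ARM_LENGTH_M = 4_000.0   # LIGO arm length
DELTA_L_M = 1e-19        # length accuracy quoted in the article

strain = DELTA_L_M / ARM_LENGTH_M   # dimensionless strain h = dL/L
print(f"h ~ {strain:.1e}")
```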

To minimize noise today, these mirrors are suspended by a multi-stage pendulum system to suppress seismic disturbances. The mirrors are also polished and coated to eliminate surface imperfections almost entirely. On top of this, a feedback control system corrects for many of the remaining vibrations and imperfections in the mirrors.

Yet for lower-frequency gravitational waves, even this subatomic level of precision and correction is not enough. As a laser beam impacts a mirror, the mirror can absorb minute amounts of energy – creating tiny thermal distortions that complicate mirror alignment. In addition, radiation pressure from the laser, combined with seismic motions that are not fully eliminated by the pendulum system, can introduce unwanted vibrations in the mirror.

The team proposed that this problem could finally be addressed with the help of artificial intelligence (AI). “Deep loop shaping is a new AI method that helps us to design and improve control systems, with less need for deep expertise in control engineering,” describes Jonas Buchli at Google DeepMind, who led the research. “While this is helping us to improve control over high precision devices, it can also be applied to many different control problems.”

Deep reinforcement learning

The team’s approach is based on deep reinforcement learning, whereby a system tests small adjustments to its controls and adapts its strategy over time through a feedback system of rewards and penalties.
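As a caricature of that reward-and-penalty loop (not DeepMind’s actual deep-loop-shaping algorithm, and with a made-up noise penalty), here is random hill climbing on a single controller gain: propose a small adjustment, keep it only if it lowers the penalty.

```python
import random

# Toy illustration of reward-driven control tuning: perturb a gain,
# keep the change when it reduces a (hypothetical) noise penalty.
random.seed(0)

def noise_penalty(gain: float) -> float:
    return (gain - 0.7) ** 2      # hypothetical optimum at gain = 0.7

gain = 0.0
for _ in range(2000):
    trial = gain + random.uniform(-0.05, 0.05)   # small adjustment
    if noise_penalty(trial) < noise_penalty(gain):
        gain = trial              # "reward": keep the improvement

print(round(gain, 2))
```

Deep reinforcement learning replaces this single scalar with a neural-network policy and a far richer reward signal, but the accept-what-improves feedback structure is the same.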

With deep loop shaping, the team introduced smarter feedback controls for the pendulum system suspending the interferometer’s mirrors. This system can adapt in real time to keep the mirrors aligned with minimal control noise – counteracting thermal distortions, seismic vibrations, and forces induced by radiation pressure.

“We tested our controllers repeatedly on the LIGO system in Livingston, Louisiana,” Buchli continues. “We found that they worked as well on hardware as in simulation, confirming that our controller keeps the observatory’s system stable over prolonged periods.”

Based on these promising results, the team is now hopeful that deep loop shaping could help to boost the cosmological reach of LIGO and other existing detectors, along with future generations of gravitational-wave interferometers.

“We are opening a new frequency band, and we might see a different universe much like the different electromagnetic bands like radio, light, and X-rays tell complementary stories about the universe,” says team member Jan Harms at the Gran Sasso Science Institute in Italy. “We would gain the ability to observe larger black holes, and to provide early warnings for neutron star mergers. This would allow us to tell other astronomers where to point their telescopes before the explosion occurs.”

The research is described in Science.


Physicists set to decide location for next-generation Einstein Telescope

10 September 2025, 11:30

A decade ago, on 14 September 2015, the twin detectors of the Laser Interferometer Gravitational-Wave Observatory (LIGO) in Hanford, Washington, and Livingston, Louisiana, finally detected a gravitational wave. The LIGO detectors – two L-shaped laser interferometers with 4 km-long arms – had measured tiny differences in laser beams bouncing off mirrors at the end of each arm. The variations in the length of the arms, caused by the presence of a gravitational wave, were converted into the now famous audible “chirp signal”, which indicated the final approach between two merging black holes.

Since that historic detection, which led to the 2017 Nobel Prize for Physics, the LIGO detectors, together with VIRGO in Italy, have measured several hundred gravitational waves – from mergers of black holes to neutron-star collisions. More recently, they have been joined by the KAGRA detector in Japan, which is located some 200 m underground, shielding it from vibrations and environmental noise.

Yet the current number of gravitational waves could be dwarfed by what the planned Einstein Telescope (ET) would measure. This European-led, third-generation gravitational-wave detector would be built several hundred metres underground and be at least 10 times more sensitive than its second-generation counterparts including KAGRA. Capable of “listening” to a thousand times larger volume of the universe, the new detector would be able to spot many more sources of gravitational waves. In fact, the ET will be able to gather in a day what it took LIGO and VIRGO a decade to collect.
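The jump from “10 times more sensitive” to “a thousand times larger volume” is just the cube law for surveyed volume: detectable range grows linearly with strain sensitivity, and volume grows with range cubed.

```python
# Sketch: sensitivity -> surveyed-volume scaling for a GW detector.
sensitivity_gain = 10           # ET vs second-generation detectors
range_gain = sensitivity_gain   # detectable distance scales linearly
volume_gain = range_gain ** 3   # surveyed volume scales as distance cubed
print(volume_gain)
```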

The ET is designed to operate in two frequency domains. The low-frequency regime – 2–40 Hz – is below current detectors’ capabilities and will let the ET pick up waves from more massive black holes. The high-frequency domain, on the other hand, would operate from 40 Hz to 10 kHz and detect a wide variety of astrophysical sources, including merging black holes and other high-energy events. Detected signals would also be much longer with the ET, lasting for hours. This would allow physicists to “tune in” much earlier as black holes or neutron stars approach each other.

Location, location, location

But all that is still a pipe dream, because the ET, which has a price tag of €2bn, is not yet fully funded and is unlikely to be ready until 2035 at the earliest. The precise costs will depend on the final location of the experiment, which is still up for grabs.

Three regions are vying to host the facility: the Italian island of Sardinia, the Belgian-German-Dutch border region and the German state of Saxony. Each candidate is currently investigating the suitability of its preferred site (see box below), the results of which will be published in a “bid book” by the end of 2026. The winning site will be picked in 2027 with construction beginning shortly after.

Other factors that will dictate where the ET is built include logistics in the host region, the presence of companies and research institutes (to build and exploit the facility) and government support. With the ET offering high-quality jobs, economic return, scientific appeal and prestige, that could give the German-Belgian-Dutch candidacy the edge given the three nations could share the cost.

Another major factor is the design of the ET. One proposal is to build it as an equilateral triangle with each side being 10 km. The other is a twin L-shaped design in which both arms are 15 km long and the two detectors are located far from each other. The latter design is similar to the two LIGO over-ground detectors, which are 3000 km apart. If the “2L design” is chosen, the detector would be built at two of the three competing sites.

The 2L design is being investigated by all three sites, but those behind the Sardinia proposal strongly favour this approach. “With the detectors properly oriented relative to each other, this design could outperform the triangular design across all key scientific objectives,” claims Domenico D’Urso, scientific director of the Italian candidacy. He points to a study by the ET collaboration in 2023 that investigated the impact of the ET design on its scientific goals. “The 2L design enables, for example, more precise localization of gravitational wave sources, enhancing sky-position reconstruction,” he says. “And it provides superior overall sensitivity.”

Where could the next-generation Einstein Telescope be built?

Three sites are vying to host the Einstein Telescope (ET), with each offering various geological advantages. Lausitz in Saxony benefits from being a former coal-mining area. “Because of this mining past, the subsurface was mapped in great detail decades ago,” says Günther Hasinger, founding director of the German Center for Astrophysics, which is currently being built in Lausitz and would house the ET if picked. The granite formation in Lausitz is also suitable for a tunnel complex because the rock is relatively dry. Not much water would need to be pumped away, causing less vibration.

Thanks to the former lead, zinc and silver mine of Sos Enattos, meanwhile, the subsurface near Nuoro in Sardinia – another potential location for the ET – is also well known. The island is on a very stable, tectonic microplate, making it seismically quiet. Above ground, the area is undeveloped and sparsely populated, further shielding the experiment from noise.

The third ET candidate, lying near the point where Belgium, Germany and the Netherlands meet, also has a hard subsurface, which is needed for the tunnels. It is topped by a softer, clay-like layer that would dampen vibrations from traffic and industry. “We are busy investigating the suitability of the subsurface and the damping capacity of the top layer,” says Wim Walk of the Dutch Center for Subatomic Physics (Nikhef), which is co-ordinating the candidacy for this location. “That research requires a lot of work, because the subsurface here has not yet been properly mapped.”

Localization is important for multimessenger astronomy. In other words, if a gravitational-wave source can be located quickly and precisely in the sky, other telescopes can be pointed towards it to observe any eventual light or other electromagnetic (EM) signals. This is what happened after LIGO detected a gravitational wave on 17 August 2017, originating from a neutron star collision. Dozens of ground- and space-based observatories were able to pick up a gamma-ray burst and the subsequent EM afterglow.

The triangle design, however, is favoured by the Belgian-German-Dutch consortium. It would be the Earth-based equivalent of the European Space Agency’s planned LISA gravitational-wave detector, which will consist of three spacecraft in a triangular configuration and is set for launch in 2035, the same year the ET could open. LISA would detect gravitational waves of much lower frequency, coming, for example, from mergers of supermassive black holes.

While the Earth-based triangle design would not be able to locate the source as precisely, it would – unlike the 2L design – be able to do “null stream” measurements. These would yield a clearer picture of the noise from the environment and the detector itself, including “glitches”, which are bursts of noise that overlap with gravitational-wave signals. “With a non-stop influx of gravitational waves but also of noise and glitches, we need some form of automatic clean-up of the data,” says Jan Harms, a physicist at the Gran Sasso Science Institute in Italy and member of the scientific ET collaboration. “The null stream could provide that.”

However, it is not clear whether that null stream would be a fundamental advantage for data analysis, and Harms and colleagues think more work is needed. “For example, different forms of noise could be connected to each other, which would compromise the null stream,” he says. Another problem is that a detector with a null stream has not yet been realized – and that applies to the triangle design in general. “The 2L design, meanwhile, is well established in the scientific community,” adds D’Urso.

Backers of the triangle design see the ET as being part of a wider, global network of third-generation detectors, where the localization argument no longer matters. Indeed, the US already has plans for an above-ground successor to LIGO. Known as the Cosmic Explorer, it would feature two L-shaped detectors with arm lengths of up to 40 km. But with US politics in turmoil, it is questionable how realistic these plans are.

Matthew Evans, a physicist at the Massachusetts Institute of Technology and member of the LIGO collaboration, recognizes the “network argument”. “I think that the global gravitational waves community are double counting in some sense,” he says. Yet for Evans it is all about the exciting discoveries that could be made with a next-generation gravitational-wave detector. “The best science will be done with ET as 2Ls,” he says.


The destructive effects of ionising radiation

10 September 2025, 10:32

Studying the chain of processes that take place when UV or x-ray radiation interacts with solid matter is crucial in various fields from medical physics to materials testing.

The first of these processes is typically photoionisation, where an atom or molecule absorbs a photon and loses one or more electrons as a result.

What comes next can vary depending on the strength of the radiation and the nature of the material, but some of the most important effects are secondary ionisation processes. These often go on to dominate the dynamics of the whole system.

One of these is interatomic Coulombic decay (ICD), in which energy is transferred from one excited atom or molecule to a neighbouring one, which in turn is ionised.

ICD has captured considerable interest since its discovery, not least because it often produces low-energy electrons, which cause radiation damage in biological matter.

Understanding this phenomenon better was the goal of the researchers behind this latest work. Using the ASTRID2 synchrotron in Aarhus, Denmark, they studied what happened when extreme-UV photons interacted with small clusters of helium atoms.

To capture what was going on, they used an electron velocity-map imaging spectrometer. This powerful diagnostic measures the energy of any emitted electrons as well as their angular distribution.

Using this technique, they were able to show that the ICD process is even more efficient than previously thought. The researchers expect it to play a crucial role in other condensed phase systems exposed to ionising radiation as well.

Knowledge gained from studies such as this one is crucial for fields like radiation therapy, where the effects of ionising radiation on human cells must be tightly controlled.


A route to more efficient wireless charging?

10 September 2025, 10:32

Wireless power transfer (WPT) is increasingly used in consumer electronics, electric vehicles, and medical devices, with technologies like inductive charging and resonant coupling leading the way.

The key to successful WPT technologies is ensuring that a high percentage of the input power is transferred to its intended destination – i.e. it must be efficient.

The idea of parity-time (PT) symmetry helps achieve this goal by offering a way to balance energy gain and loss in a system, maintaining stable and efficient power flow – even in the presence of disturbances.

Here parity means spatial reflection (like flipping left and right), and time refers to reversing the direction of time. A PT-symmetric system behaves the same when both of these transformations are applied together.
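A minimal sketch of this idea is the textbook two-mode model with balanced gain and loss, often used to describe resonantly coupled WPT coils. The mode with gain +g and the mode with loss -g are coupled with strength k; the eigenvalues of the effective Hamiltonian are ±√(k² − g²), so they stay real (stable oscillation) while the coupling dominates, and turn complex when it weakens. The numbers below are illustrative.

```python
import cmath

# Toy PT-symmetric two-mode model: effective Hamiltonian
#   H = [[+i*g, k], [k, -i*g]]
# with balanced gain/loss g and coupling k. Its eigenvalues are
# +/- sqrt(k^2 - g^2): real when k >= g, imaginary when k < g.
def eigenvalues(g: float, k: float):
    root = cmath.sqrt(k**2 - g**2)
    return root, -root

for g, k in [(0.5, 1.0), (1.0, 0.5)]:
    lam, _ = eigenvalues(g, k)
    regime = "real (PT-symmetric)" if abs(lam.imag) < 1e-12 else "complex (broken)"
    print(f"g={g}, k={k}: {regime}")
```

The transition at k = g is the "PT-symmetry-breaking point"; practical WPT schemes based on this model must keep the system on the real-eigenvalue side, which is part of why such fine tuning is needed.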

However, this method has its limitations. It often requires very fine tuning, and it can struggle when devices do not behave as idealised resistors. Most importantly, it falls short of achieving the theoretical maximum efficiency of WPT.

This is where the new paper comes in. Its authors have performed a comprehensive experimental and theoretical study demonstrating that dispersive gain can greatly enhance the efficiency of WPT beyond the limits of previous methods.

Dispersive gain describes a type of energy amplification in a system where the gain depends on the frequency of the signal. This means the system amplifies energy differently at different frequencies, rather than uniformly across all frequencies.

This allows the system to naturally shift energy into the most efficient frequency modes for transfer.

Their work could be used to enable new technologies or make existing ones more affordable. It could also open up new possibilities for harnessing dispersion effects across electronics and optics.

Read the full article

Dispersive gains enhance wireless power transfer with asymmetric resonance – IOPscience

Hao et al. 2025 Rep. Prog. Phys. 88 020501


‘Breathing’ crystal reversibly releases oxygen

9 September 2025, 17:03

A new transition-metal oxide crystal that reversibly and repeatedly absorbs and releases oxygen could be ideal for use in fuel cells and as the active medium in clean energy technologies such as thermal transistors, smart windows and new types of batteries. The “breathing” crystal, discovered by scientists at Pusan National University in Korea and Hokkaido University in Japan, is made from strontium, cobalt and iron and contains oxygen vacancies.

Transition-metal oxides boast a huge range of electrical properties that can be tuned all the way from insulating to superconducting. This means they can find applications in areas as diverse as energy storage, catalysis and electronic devices.

Among the different material parameters that can be tuned are the oxygen vacancies. Indeed, ordering these vacancies can produce new structural phases that show much promise for oxygen-driven programmable devices.

Element-specific behaviours

In the new work, a team of researchers led by physicist Hyoungjeen Jeen of Pusan and materials scientist Hiromichi Ohta in Hokkaido studied SrFe0.5Co0.5Ox. The researchers focused on this material, they say, since it belongs to the family of topotactic oxides, which are the main oxides being studied today in solid-state ionics. “However, previous work had not discussed which ion in this compound was catalytically active,” explains Jeen. “What is more, the cobalt-containing topotactic oxides studied so far were fragile and easily fractured during chemical reactions.”

The team succeeded in creating a unique platform from a solid solution of epitaxial SrFe0.5Co0.5O2.5 in which both the cobalt and iron ions bathed in the same chemical environment. “In this way, we were able to test which ion was better for reduction reactions and whether or not it sustained its structural integrity,” Jeen tells Physics World. “We found that our material showed element-specific reduction behaviours and reversible redox reactions.”

The researchers made their material using pulsed laser deposition, a technique ideal for the epitaxial synthesis of multi-element oxides, which allowed them to grow SrFe0.5Co0.5O2.5 crystals in which the iron and cobalt ions are randomly distributed. This random arrangement was key to the material’s ability to repeatedly release and absorb oxygen, they say.

“It’s like giving the crystal ‘lungs’ so that it can inhale and exhale oxygen on command,” says Jeen.

Stable and repeatable

This simple breathing picture comes from the difference in the catalytic activity of cobalt and iron in the compound, he explains. Cobalt ions prefer to lose and gain oxygen and these ions are the main sites for the redox activity. However, since iron ions prefer not to lose oxygen during the reduction reaction, they serve as pillars in this architecture. This allows for stable and repeatable oxygen release and uptake.

Until now, most materials that absorb and release oxygen in such a controlled fashion were either too fragile or only functioned at extremely high temperatures. The new material works under more ambient conditions and is stable. “This finding is striking in two ways: only cobalt ions are reduced, and the process leads to the formation of an entirely new and stable crystal structure,” explains Jeen.

The researchers also showed that the material could return to its original form when oxygen was reintroduced, so proving that the process is fully reversible. “This is a major step towards the realization of smart materials that can adjust themselves in real time,” says Ohta. “The potential applications include developing a cathode for intermediate solid oxide fuel cells, an active medium for thermal transistors (devices that can direct heat like electrical switches), smart windows that adjust their heat flow depending on the weather and even new types of batteries.”

Looking ahead, Jeen, Ohta and colleagues aim to investigate the material’s potential for practical applications.

They report their present work in Nature Communications.

The post ‘Breathing’ crystal reversibly releases oxygen appeared first on Physics World.

New hollow-core fibres break a 40-year limit on light transmission

9 septembre 2025 à 11:32

Optical fibres form the backbone of the Internet, carrying light signals across the globe. But some light is always lost as it travels, becoming attenuated by about 0.14 decibels per kilometre even in the best fibres. That means signals must be amplified every few dozen kilometres – a performance that hasn’t improved in nearly four decades.

Physicists at the University of Southampton, UK, have now developed an alternative that could call time on that decades-long lull. Writing in Nature Photonics, they report hollow-core fibres that exhibit 35% less attenuation while transmitting signals 45% faster than standard glass fibres.

“A bit like a soap bubble”

The core of conventional fibres is made of pure glass and is surrounded by a cladding of slightly different glass. Because the core has a higher refractive index than the cladding, light entering the fibre reflects internally, bouncing back and forth in a process known as total internal reflection. This effect traps the light and guides it along the fibre’s length.

The Southampton team led by Francesco Poletti swapped the standard glass core for air. Because air is more transparent than glass, channelling light through it cuts down on scattering and speeds up signals. The problem is that air’s refractive index is lower, so the new fibre can’t use total internal reflection. Instead, Poletti and colleagues guided the light using a mechanism called anti-resonance, which requires the walls of the hollow core to be made from ultra-thin glass membranes.

“It’s a bit like a soap bubble,” Poletti says, explaining that such bubbles appear iridescent because their thin films reflect some wavelengths and let others through. “We designed our fibre the same way, with glass membranes that reflect light at certain frequencies back into the core.” That anti-resonant reflection, he adds, keeps the light trapped and moving through the fibre’s hollow centre.

Greener telecommunications

To make the new air-core fibre, the researchers stacked thin glass capillaries in a precise pattern, forming a hollow channel in the middle. Heating and drawing the stack into a hair-thin filament preserved this pattern on a microscopic scale. The finished fibre has a nested design: an air core surrounded by ultra-thin layers that provide anti-resonant guidance and cut down on leakage.

To test their design, the team measured transmission through a full spool of fibre, then cut the fibre shorter and compared the results. They also fired in light pulses and tracked the echoes. Their results show that the hollow fibres reduce attenuation to just 0.091 decibels per kilometre. This lower loss implies that fewer amplifiers would be needed in long cables, lowering costs and energy use. “There’s big potential for greener telecommunications when using our fibres,” says Poletti.
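The loss figures can be turned into rough span-length estimates. The sketch below (plain Python; the 20 dB amplifier budget is an illustrative assumption, not a figure from the article) converts a dB/km attenuation into a surviving power fraction and a reach:

```python
def surviving_fraction(loss_db_per_km: float, distance_km: float) -> float:
    """Fraction of optical power remaining after propagating a given distance."""
    total_loss_db = loss_db_per_km * distance_km
    return 10 ** (-total_loss_db / 10)

def reach_km(loss_db_per_km: float, budget_db: float = 20.0) -> float:
    """Distance over which the signal stays within an assumed loss budget."""
    return budget_db / loss_db_per_km

# Standard fibre (~0.14 dB/km) vs the new hollow-core fibre (~0.091 dB/km)
print(round(reach_km(0.14), 1))   # span length for a 20 dB budget, standard fibre
print(round(reach_km(0.091), 1))  # span length, hollow-core fibre
```

With the assumed 20 dB budget, moving from 0.14 to 0.091 dB/km stretches a span from roughly 143 km to about 220 km, which is where the saving in amplifiers comes from.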

Poletti adds that reduced attenuation (and thus lower energy use) is only one of the new fibre’s advantages. At the 0.14 dB/km attenuation benchmark, the new hollow fibre supports a bandwidth of 54 THz compared to 10 THz for a normal fibre. At the reduced 0.1 dB/km attenuation, the bandwidth is still 18 THz, which is close to twice that of a normal cable. This means that a single strand can carry far more channels at once.

Perhaps the most impressive advantage is that because the speed of light is faster in air than in glass, data could travel the same distance up to 45% faster. “It’s almost the same speed light takes when we look at a distant star,” Poletti says. The resulting drop in latency, he adds, could be crucial for real-time services like online gaming or remote surgery, and could also speed up computing tasks such as training large language models.
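The latency gain follows directly from the refractive indices. A minimal sketch (the silica group index of about 1.47, the air index of about 1.0003 and the 6000 km route are all illustrative assumptions, not figures from the article):

```python
C = 299_792.458  # speed of light in vacuum, km/s

def one_way_latency_ms(distance_km: float, refractive_index: float) -> float:
    """One-way propagation delay in milliseconds for light in a medium."""
    return distance_km * refractive_index / C * 1000

# Illustrative 6000 km transatlantic route
glass = one_way_latency_ms(6000, 1.47)    # light in solid silica
hollow = one_way_latency_ms(6000, 1.0003) # light in the air core
print(round(glass, 1), round(hollow, 1))
```

Under these assumptions the glass fibre takes about 29 ms one way and the air core about 20 ms, a speed-up of roughly 47%, in the same ballpark as the quoted 45% figure.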

Field testing

As well as the team’s laboratory tests, Microsoft has begun testing the fibres in real systems, installing segments in its network and sending live traffic through them. These trials prove the hollow-core design works with existing telecom equipment, opening the door to gradual rollout. In the longer run, adapting amplifiers and other gear that are currently tuned for solid glass fibres could unlock even better performance.

Poletti believes the team’s new fibres could one day replace existing undersea cables. “I’ve been working on this technology for more than 20 years,” he says, adding that over that time, scepticism has given way to momentum, especially now with Microsoft as an industry partner. But scaling up remains a real hurdle. Making short, flawless samples is one thing; mass-producing thousands of kilometres at low cost is another. The Southampton team is now refining the design and pushing toward large-scale manufacturing. They’re hopeful that improvements could slash losses by another order of magnitude and that the anti-resonant design can be tuned to different frequency bands, including those suited to new, more efficient amplifiers.

Other experts agree the advance marks a turning point. “The work builds on decades of effort to understand and perfect hollow-core fibres,” says John Ballato, whose group at Clemson University in the US develops fibres with specialty cores for high-energy laser and biomedical applications. While Ballato notes that such fibres have been used commercially in shorter-distance communications “for some years now”, he believes this work will open them up to long-haul networks.

The post New hollow-core fibres break a 40-year limit on light transmission appeared first on Physics World.

Indefinite causal order: how quantum physics is challenging our understanding of cause and effect

9 septembre 2025 à 10:01

The concept of cause and effect plays an important role in both our everyday lives, and in physics. If you set a ball down in front of a window and kick it hard, a split-second later the ball will hit the window and smash it. What we don’t observe is a world where the window smashes on its own, thereby causing the ball to be kicked – that would seem rather nonsensical. In other words, kick before smash, and smash before kick, are two different physical processes each having a unique and definite causal order.

But does definite causal order also reign supreme in the quantum world, where concepts like position and time can be fuzzy? Most physicists are happy to accept the paradox of Schrödinger’s cat – a thought experiment in which a cat hidden in a box is simultaneously dead and alive, until you open the box to check. Schrödinger’s cat illustrates the quantum concept of “superposition”, whereby a system can be in two or more states at the same time. It is only when a measurement is made (by opening the box) that the system collapses into one of its possible states.

But could two (or more) causally distinct processes occur at the same time in the quantum world? The answer, perhaps shockingly, is yes and this paradoxical phenomenon is called indefinite causal order (ICO).

Stellar superpositions and the order of time

It turns out that different causal processes can also exist in a superposition. One example is a thought experiment called the “gravitational quantum switch”, which was proposed in 2019 by Magdalena Zych of the University of Queensland and colleagues (Nat. Comms 10 3772). This features our favourite quantum observers Alice and Bob, who are in the vicinity of a very large mass, such as a star. Alice and Bob have initially synchronized clocks which, in the absence of gravity, would continue to run at identical rates. However, Einstein’s general theory of relativity dictates that the flow of time is influenced by the distribution of matter in their vicinity. This means that if Alice is closer to the star than Bob, her clock will run slower than Bob’s, and vice versa.

Like with Schrödinger’s cat, quantum mechanics allows the star to be in a superposition of spatial states; meaning that in one state Alice is closer to the star than Bob, and in the other Bob is closer to the star than Alice. In other words, this is a superposition of a state in which Alice’s clock runs slower than Bob’s, and a state in which Bob’s clock runs slower than Alice’s.

Alice and Bob are both told they will receive a message at a specific time (say noon) and that they would then pass that message on to their counterpart. If Alice’s clock is running faster than Bob’s then she will receive the message first, and then pass it on to Bob, and vice versa. This superposition of Alice-to-Bob with Bob-to-Alice is an example of indefinite causal order.

Now, you might be thinking “so what” because this seems to be a trivial example. But it becomes more interesting if you replace the message with a quantum particle like a photon; and have Alice and Bob perform different operations on that photon. If the two operations do not commute – such as rotations of the photon polarization in the X and Z planes – then the order in which the operations are done will affect the outcome.
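The role of the non-commuting operations can be checked with a toy calculation, using the Pauli X and Z matrices as illustrative stand-ins for the two polarization rotations (a minimal sketch, not the operators used in the actual experiments):

```python
def matmul(a, b):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# Pauli X and Z, stand-ins for Alice's and Bob's rotations
X = [[0, 1], [1, 0]]
Z = [[1, 0], [0, -1]]

print(matmul(X, Z))  # [[0, -1], [1, 0]]
print(matmul(Z, X))  # [[0, 1], [-1, 0]]
# The two orderings give different results (X and Z anticommute),
# so "Alice then Bob" acts differently on the photon than "Bob then Alice".
```

Because the two orderings produce different final states, superposing them in a quantum switch leads to observable interference.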

As a result, this “gravitational quantum switch” is a superposition of two different causal processes with two different outcomes. Going further, Alice and Bob could perform more exotic operations on the photon, such as “measure-and-reprepare” operations (where a quantum system is first measured and then, based on the measurement outcome, a new quantum state is prepared). In this case Alice measures the quantum state of the received photon and prepares a photon that she sends to Bob (or vice versa).

Much like Schrödinger’s cat, a gravitational quantum switch cannot currently be realized in the lab. But, never say never. Physicists have been able to create experimental analogues of some thought experiments, so who knows what the future will bring. Indeed, a gravitational quantum switch could provide important information regarding a quantum description of gravity – something that has eluded physicists ever since quantum mechanics and general relativity were being developed in the early 20th century.

Switches and superpositions

Moving on to more practical ICO experiments, physicists have already built and tested light-based quantum switches in the lab. Instead of having the position of the star determining whether Alice or Bob go first, the causal order is determined by a two-level quantum state – which can have a value of 0 or 1. If this control state is 0, then Alice goes first and if the control state is 1, then Bob goes first. Crucially, when the control state is in a superposition of 0 and 1 the system shows indefinite causal order (see figure 1).

1 Simultaneous paths

Illustration of a photon travelling between Alice and Bob on different routes
(Illustration courtesy: Mayank Shreshtha)

In this illustration of a quantum switch a photon (driving a car) can follow two different paths, each with a different causal order. One path (top) leads to Alice’s garage followed by a visit to Bob’s drive-thru. The second path (middle) visits Bob first, and then Alice. The path taken by the photon is determined by a control qubit that is represented by a traffic light. If the value of the qubit is “0” then the photon visits Alice first; if the qubit is “1” then the photon visits Bob first. Both of these scenarios have definite causal order.

However, the control qubit can exist in a quantum superposition of “0” and “1” (bottom). In this superposition, the path followed by the photon – and therefore the temporal order in which it visits Alice and Bob – is not defined. This is an example of indefinite causal order. Of course, any attempt to identify exactly which path the photon goes through initially will destroy the superposition (and therefore the ICO) and the photon will take only one definite path.

The first such quantum switch was created in 2015 by Lorenzo Procopio (now at Germany’s University of Paderborn) and colleagues at the Vienna Center for Quantum Science and Technology (Nat. Comms 6 7913). Their quantum switch involves firing a photon at a beam splitter, which puts the photon into a superposition of a photon that has travelled straight through the splitter (state 0) and a photon that has been deflected by 90 degrees (state 1). This spatial superposition is the control state of the quantum switch, playing the role of the star in the gravitational quantum switch.

State 0 photons first travel to an Alice apparatus where a polarization rotation is done in a specific direction (say X). Then the photons are sent to a Bob apparatus where a non-commuting rotation (say Z) is done. Conversely, the photons that travel along the state 1 path encounter Bob before Alice.

Finally, the state 0 and state 1 paths are recombined at a second beamsplitter, which is monitored by two photon-detectors. Because Alice-then-Bob has a different effect on a photon than does Bob-then-Alice, interference can occur between recombined photons. This interference is studied by systematically changing certain aspects of the experiment – for example, Alice’s direction of rotation or the polarization of the incoming photons.

In 2017 quantum-information researcher Giulia Rubino, then at the Vienna Center for Quantum Science and Technology, teamed up with Procopio and colleagues to verify ICO in their quantum switch using a “causal witness” (Sci. Adv. 3 e1602589). This involves doing a specific set of experiments on the quantum switch and calculating a mathematical entity (the causal witness) that reveals whether a system has definite or indefinite causal order. Sure enough, this test revealed that their system does indeed have ICO. Since then, physicists working in several independent labs have successfully created their own quantum switches.

Computational speed up?

While this effect might still seem somewhat obscure, in 2019, an international team led by the renowned Chinese physicist Jian-Wei Pan showed that a quantum switch can be very useful for doing computations that are distributed between two parties (Phys. Rev. Lett. 122 120504). In such a scenario a string of data is received and then processed by Alice, who then passes the results on to Bob for further processing. In an experiment using photons, they showed that ICO delivers an exponential speed-up of the rate at which longer strings are processed – compared to a system with no ICO.

Physicists are also exploring if ICO could be used to enhance quantum metrology. Indeed, recent calculations by Oxford University’s Giulio Chiribella and colleagues suggest that it could lead to a significant increase in precision when compared to techniques that involve states with definite causal order (Phys. Rev. Lett. 124 190503).

While other applications could be possible, it is often difficult to work out whether ICO offers the best solution to a specific problem. For example, physicists had thought a quantum switch offered an advantage when it comes to communicating along a noisy channel, but it turns out that some configurations of Alice and Bob with definite causal order were just as good as an ICO.

Beyond the quantum switch, there are other types of circuits that would display ICO. These include “quantum circuits with quantum control of causal order”, which have yet to be implemented in the lab because of their complexity.

But despite the challenges in creating ICO systems and proving that they outperform other solutions, it looks like ICO is set to join the ranks of other weird phenomena such as superposition and entanglement that have found practical applications in quantum technologies.

This article forms part of Physics World‘s contribution to the 2025 International Year of Quantum Science and Technology (IYQ), which aims to raise global awareness of quantum physics and its applications.

Stay tuned to Physics World and our international partners throughout the year for more coverage of the IYQ.

Find out more on our quantum channel.

The post Indefinite causal order: how quantum physics is challenging our understanding of cause and effect appeared first on Physics World.

Reformulation of general relativity brings it closer to Newtonian physics

5 septembre 2025 à 15:37

The first-ever detection of gravitational waves was made by LIGO in 2015 and since then researchers have been trying to understand the physics of the black-hole and neutron-star mergers that create the waves. However, the physics is very complicated and is defined by Albert Einstein’s general theory of relativity.

Now Jiaxi Wu, Siddharth Boyeneni and Elias Most at the California Institute of Technology (Caltech) have addressed this challenge by developing a new formulation of general relativity that is inspired by the equations that describe electromagnetic interactions. They show that general relativity behaves in the same way as the gravitational inverse square law described by Isaac Newton more than 300 years ago. “This is a very non-trivial insight,” says Most.

One of the fascinations of black holes is the extreme physics they invoke. These astronomical objects pack so much mass into so little space that not even light can escape their gravitational pull. Black holes (and neutron stars) can exist in binary systems in which the objects orbit each other. These pairs eventually merge to create single black holes in events that create detectable gravitational waves. The study of these waves provides an important testbed for gravitational physics. However, the mathematics of general relativity that describe these mergers is very complicated.

Inverse square law

According to Newtonian physics, the gravitational attraction between two masses is proportional to the inverse of the square of the distance between them – the inverse square law. However, as Most points out, “Unless in special cases, general relativity was not thought to act in the same way.”
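As a reminder of the form in question, Newton's law of gravitation in its standard textbook expression (not reproduced in the article) is:

```latex
F = \frac{G m_1 m_2}{r^2}
```

where $G$ is the gravitational constant, $m_1$ and $m_2$ are the two masses and $r$ is the distance between them.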

Over the past decade, gravitational-wave researchers have taken various approaches including post-Newtonian theory and effective one-body approaches to better understand the physics of black-hole mergers. One important challenge is how to model parameters such as orbital eccentricity and precession in black hole systems and how best to understand “ringdown”. The latter is the process whereby a black hole formed by a merger emits gravitational waves as it relaxes into a stable state.

The trio’s recasting of the equations of general relativity was inspired by the Maxwell equations that describe how electric and magnetic fields leapfrog each other through space. According to these equations, the forces between electric charges diminish according to the same inverse square law as Newton’s gravitational attraction.

Early reformulations

The original reformulations of “gravitoelectromagnetism” date back to the 1990s. Most explains how among those who did this early work was his Caltech colleague and LIGO Nobel laureate Kip Thorne, who exploited a special mathematical structure of the curvature of space–time.

“This structure mathematically looks like the equations governing light and the attraction of electric charges, but the physics is quite different,” Most tells Physics World. The gravito-electric field thus derived describes how an object might squish under the forces of gravity. “Mathematically this means that the previous gravito-electric field falls off with inverse distance cubed, which is unlike the inverse distance square law of Newtonian gravity or electrostatic attraction,” adds Most.

Most’s own work follows on from previous studies of the potential radio emission from the interaction of magnetic fields during the collision of neutron stars and black holes from which it seemed reasonable to then “think about whether some of these insights naturally carry over to Einstein’s theory of gravity”. The trio began with different formulations of general relativity and electromagnetism with the aim of deriving gravitational analogues for the electric and magnetic fields that behave more closely to classical theories of electromagnetism. They then demonstrated how their formulation might describe the behaviour of a non-rotating Schwarzschild black hole, as well as a black hole binary.

Not so different

“Our work says that actually general relativity is not so different from Newtonian gravity (or better, electric forces) when expressed in the right way,” explains Most. The actual behaviour predicted is the same in both formulations but the trio’s reformulation reveals how general relativity and Newtonian physics are more similar than they are generally considered to be. “The main new thing is then what does it mean to ‘observe’ gravity, and what does it mean to measure distances relative to how you ‘observe’.”

Alexander Philippov is a black-hole expert at the University of Maryland in the US and was not directly involved with Most’s research. He describes the research as “very nice”, adding that while the analogy between gravity and electromagnetism has been extensively explored in the past, there is novelty in the interpretation of results from fully nonlinear general relativistic simulations in terms of effective electromagnetic fields. “It promises to provide valuable intuition for a broad class of problems involving compact object mergers.”

The research is described in Physical Review Letters.

The post Reformulation of general relativity brings it closer to Newtonian physics appeared first on Physics World.

Researchers create glow-in-the-dark succulents that recharge with sunlight

5 septembre 2025 à 14:17

“Picture the world of Avatar, where glowing plants light up an entire ecosystem,” says Shuting Liu of South China Agricultural University in Guangzhou.

Well, that vision is now a step closer thanks to researchers in China who have created glow-in-the-dark succulents that recharge in sunlight.

Instead of coaxing cells to glow through genetic modification, the team instead used afterglow phosphor particles – materials similar to those found in glow-in-the-dark toys – that can absorb light and release it slowly over time.

The researchers then injected the particles into succulents, finding that they produced a strong glow, thanks to the narrow, uniform and evenly distributed channels within the leaf that helped to disperse the particles.

After a couple of minutes of exposure to sunlight or indoor LED light, the modified plants glowed for up to two hours. By using different types of phosphors, the researchers created plants that shine in various colours, including green, red and blue.

The team even built a glowing plant wall with 56 succulents, which was bright enough to illuminate nearby objects.

“I just find it incredible that an entirely human-made, micro-scale material can come together so seamlessly with the natural structure of a plant,” notes Liu. “The way they integrate is almost magical. It creates a special kind of functionality.”

The post Researchers create glow-in-the-dark succulents that recharge with sunlight appeared first on Physics World.

Big data helps Gaelic football club achieve promotion following 135-year wait

5 septembre 2025 à 11:18

An astrophysics PhD student from County Armagh in Northern Ireland has combined his passion for science with Gaelic football to help his club achieve a historic promotion.

Eamon McGleenan plays for his local team – O’Connell’s GAC Tullysaran – and is a PhD student at Queen’s University Belfast, where he is a member of the Predictive Sports Analytics (PSA) research team, which was established in 2023.

McGleenan and his PhD supervisor David Jess teamed up with GAC Tullysaran to investigate whether data analysis and statistical techniques could improve their training and results.

Over five months, the Queen’s University researchers took over 550 million individual measurements from the squad, which included information such as player running speed, accelerations and heart rates.

“We applied mathematical models to the big data we obtained from the athletes,” notes McGleenan. “This allowed us to examine how the athletes evolved over time and we then provided key insights for the coaching staff, who then generated bespoke training routines and match tactics.”

The efforts immediately paid off as in July GAC Tullysaran won their league by two points and were promoted for the first time in 135 years to the top-flight Senior Football League, which they will start in March.

“The statistical insight provided by PSA is of great use and I like how it lets me get the balance of training right, especially in the run-up to match day,” noted Tullysaran manager Pauric McGlone, who adds that it also provided a bit of competition in the squad that ensured the players were “conditioned in a way that allows them to perform at their best”.

For more about the PSA’s activities, see here.

The post Big data helps Gaelic football club achieve promotion following 135-year wait appeared first on Physics World.

Zero-point motion of atoms measured directly for the first time

5 septembre 2025 à 10:11

Physicists in Germany say they have measured the correlated behaviour of atoms in molecules prepared in their lowest quantum energy state for the first time. Using a technique known as Coulomb explosion imaging, they showed that the atoms do not simply vibrate individually. Instead, they move in a coupled fashion that displays fixed patterns.

According to classical physics, molecules with no thermal energy – for example, those held at absolute zero – should not move. However, according to quantum theory, the atoms making up these molecules are never completely “frozen”, so they should exhibit some motion even at this chilly temperature. This motion comes from the atoms’ zero-point energy, which is the minimum energy allowed by quantum mechanics for atoms in their ground state at absolute zero. It is therefore known as zero-point motion.
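For a single vibrational mode modelled as a quantum harmonic oscillator, the allowed energies are given by the standard textbook result (not derived in the article):

```latex
E_n = \hbar \omega \left( n + \tfrac{1}{2} \right), \qquad n = 0, 1, 2, \dots
```

so even the ground state ($n = 0$) retains the zero-point energy $E_0 = \hbar\omega/2$, which is why the atoms keep moving at absolute zero.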

Reconstructing the molecule’s original structure

To study this motion, a team led by Till Jahnke from the Institute for Nuclear Physics at Goethe University Frankfurt and the Max Planck Institute for Nuclear Physics in Heidelberg used the European XFEL in Hamburg to bombard their sample – an iodopyridine molecule consisting of 11 atoms – with ultrashort, high-intensity X-ray pulses. These pulses violently eject electrons from the iodopyridine, causing its constituent atoms to become positively charged (and thus to repel each other) so rapidly that the molecule essentially explodes.

To image the molecular fragments generated by the explosion, the researchers used a customized version of a COLTRIMS reaction microscope. This approach allowed them to reconstruct the molecule’s original structure.

From this reconstruction, the researchers were able to show that the atoms do not simply vibrate individually, but rather vibrate in correlated, coordinated patterns. “This is known, of course, from quantum chemistry, but it had so far not been measured in a molecule consisting of so many atoms,” Jahnke explains.

Data challenges

One of the biggest challenges Jahnke and colleagues faced was interpreting what the microscope data was telling them. “The dataset we obtained is super-rich in information and we had already recorded it in 2019 when we began our project,” he says. “It took us more than two years to understand that we were seeing something as subtle (and fundamental) as ground-state fluctuations.”

Since the technique provides detailed information that is hidden to other imaging approaches, such as crystallography, the researchers are now using it to perform further time-resolved studies – for example, of photochemical reactions. Indeed, they performed and published the first measurements of this type at the beginning of 2025, while the current study (which is published in Science) was undergoing peer review.

“We have pushed the boundaries of the current state-of-the-art of this measurement approach,” Jahnke tells Physics World, “and it is nice to have seen a fundamental process directly at work.”

For theoretical condensed matter physicist Asaad Sakhel at Balqa Applied University, Jordan, who was not involved in this study, the new work is “an outstanding achievement”. “Being able to actually ‘see’ zero-point motion allows us to delve deeper into the mysteries of quantum mechanics in our quest for a further understanding of its foundations,” he says.

The post Zero-point motion of atoms measured directly for the first time appeared first on Physics World.

Artificial intelligence predicts future directions in quantum science

4 septembre 2025 à 15:55

Can artificial intelligence predict future research directions in quantum science? Listen to this episode of the Physics World Weekly podcast to discover what is already possible.

My guests are Mario Krenn – who heads the Artificial Scientist Lab at Germany’s Max Planck Institute for the Science of Light – and Felix Frohnert, who is doing a PhD on the intersection of quantum physics and machine learning at Leiden University in the Netherlands.

Frohnert, Krenn and colleagues published a paper earlier this year called “Discovering emergent connections in quantum physics research via dynamic word embeddings” in which they analysed more than 66,000 abstracts from the quantum-research literature to see if they could predict future trends in the field. They were particularly interested in the emergence of connections between previously isolated subfields of quantum science.

We chat about what motivated the duo to use machine learning to study quantum science; how their prediction system works; and I ask them whether they have been able to predict current trends in quantum science using historical data.

Their paper appears in the journal Machine Learning: Science and Technology. It is published by IOP Publishing – which also brings you Physics World. Krenn is on the editorial board of the journal and in the podcast he explains why it is important to have a platform to publish research at the intersection of physics and machine learning.

This article forms part of Physics World‘s contribution to the 2025 International Year of Quantum Science and Technology (IYQ), which aims to raise global awareness of quantum physics and its applications.

Stay tuned to Physics World and our international partners throughout the next 12 months for more coverage of the IYQ.

Find out more on our quantum channel.


The post Artificial intelligence predicts future directions in quantum science appeared first on Physics World.

Errors in A-level physics papers could jeopardize student university admissions, Institute of Physics warns

4 septembre 2025 à 15:30

Errors in some of this year’s A-level physics exam papers could leave students without good enough grades to study physics at university. The mistakes have forced Tom Grinyer, chief executive of the Institute of Physics (IOP), to write to all heads of physics at UK universities, calling on them to “take these exceptional circumstances into account during the final admissions process”. The IOP is particularly concerned about students whose grades are lower than expected or are “a significant outlier” compared to other subjects.

The mistakes in question appeared in the physics (A) exam papers 1 and 2 set by the OCR exam board. Erratum notices had been issued to students at the start of the exam in June, but a further error in paper 2 was only spotted after the exam had taken place, causing some students to get stuck. Physics paper 2 from the rival AQA exam board was also seen to contain complex phrasing that hindered students’ ability to answer the question and led to time pressures.

A small survey of physics teachers carried out after the exam by the IOP, which publishes Physics World, reveals that 41% were dissatisfied with the OCR physics exam papers and more than half (58%) felt that students had a negative experience. Two-thirds of teachers, meanwhile, reported that students had a negative experience during the AQA exam. A-levels are mostly taken by 18-year-olds in England, Wales and Northern Ireland, with the grades being used by universities to decide admission.

Grinyer says that the IOP is engaging in “regular, open dialogue with exam boards” to ensure that the assessment process supports and encourages students, while maintaining the rigour and integrity of the qualification. “Our immediate concern,” Grinyer warns, “is that the usual standardization processes and adjustments to grade boundaries – particularly for the OCR paper with errors – may fail to compensate fully for the negative effect on exam performance for some individuals.”

An OCR spokesperson told Physics World that the exam board is “sorry to the physics students and teachers affected by errors in A-level physics this year”. The board says that it “evaluated student performance across all physics papers, and took all necessary steps to mitigate the impact of these errors”. The OCR claims that the 13,000 students who sat OCR A-level physics A this year “can be confident” in their A-level physics results.

“We have taken immediate steps to review and strengthen our quality assurance processes to prevent such issues from occurring in the future,” the OCR adds. “We appreciated the opportunity to meet with the Institute of Physics to discuss these issues, and also to discuss our shared interest in encouraging the growth of this vital subject.”

Almost 23,500 students sat AQA A-level physics this year and an AQA spokesperson told Physics World that the exam board “listened to feedback and took steps to make A-level physics more accessible” to students and that there “is no need for universities to make an exception for AQA physics outcomes when it comes to admissions criteria”.

“These exam papers were error-free, as teachers and students would expect, and we know that students found the papers this year to be more accessible than last year,” they say. “We’ll continue to engage with any feedback that we receive, including feedback from the Institute of Physics, to explore how we can enhance our A-level physics assessments and give students the best possible experience when they sit exams.”

Students ‘in tears’

The IOP now wants A-level physics students to be given a “fair opportunity” when it comes to university admissions. “These issues are particularly concerning for students on widening participation pathways, many of whom already face structural barriers to high-stakes assessment,” the IOP letter states. “The added challenge of inaccessible or error-prone exam papers risks compounding disadvantage and may not reflect the true potential of these students.”

The IOP also contacted AQA last year over inaccessible contexts and language used in previous physics exams. But despite AQA’s assurances that the problems would be addressed, some of the same issues have now recurred. Helen Sinclair, head of physics at the all-girls Wimbledon High School, believes that the “variable quality” of recent A-level papers has had “far-reaching consequences” for young people thinking of going into physics at university.

“Our students have exceptionally high standards for themselves and the opaque nature of many questions affects them deeply, no matter what grades they ultimately achieve. This has even led some to choose to apply for other subjects at university,” she told Physics World. “This is not to say that papers should not be challenging; however, better scaffolding within some questions would help students anchor themselves in what is an already stressful environment, and would ultimately enable them to better demonstrate their full potential within an exam.”

Students come out of the exams feeling disheartened, and those students share their perceptions with younger students

Abbie Hope, Stokesley School

Those concerns are echoed by Abbie Hope, head of physics at Stokesley School near Middlesbrough. She says the errors in this year’s exam papers are “not acceptable” and believes that OCR has “failed their students”. Hope says that AQA physics papers in recent years have been “very challenging” and have resulted in students feeling like they cannot do physics. She also says some have emerged from exam halls in tears.

“Students come out of the exams feeling disheartened and share their perceptions with younger students,” she says. “I would rather students sat a more accessible paper, with higher grade boundaries so they feel more successful when leaving the exam hall, rather than convinced they have underachieved and then getting a surprise on results day.” Hope fears the mistakes will undermine efforts to encourage uptake and participation in physics and that exam boards need to serve students and teachers better.

A ‘growing unease’

Rachael Houchin, head of physics at Royal Grammar School Newcastle, says this year’s errors have added to her “growing unease” about the state of physics education in the UK. “Such incidents – particularly when they are public and recurring – do little to improve the perception of the subject or encourage its uptake,” she says. “Everyone involved in physics education – at any level – has a duty to get it right. If we fail, we risk physics drifting into the category of subjects taught predominantly in selective or independent schools, and increasingly absent from the mainstream.”

Hari Rentala, associate director of education and workforce at the IOP, is concerned that the errors unfairly “perpetuate the myth” that physics is a difficult subject. “OCR appear to have managed the situation as best they can, but this is not much consolation for how students will have felt during the exam and over the ensuing weeks,” says Rentala. “Once again AQA set some questions that were overly challenging. We can only hope that the majority of students who had a negative experience as a result of these issues at least receive a fair grade – as grade boundaries have been adjusted down.”

Mixed news for pupils

Despite the problems with some specific papers, almost 45,000 students took A-level physics in the UK – a rise of 4.3% on last year and the highest number for 25 years. Physics is now the sixth most popular subject at A-level, up from ninth last year, with girls representing a quarter of all candidates. Meanwhile, in Scotland the numbers of entries in National 5 and Higher physics were 13,680 and 8560 respectively, up from 13,355 and 8065 last year.

“We are delighted so many young people, and increasing numbers of girls, are hearing the message that physics can open up a lifetime of opportunities,” says Grinyer. “If we can build on this momentum there is a real opportunity to finally close the gap between boys and girls in physics at A-level. To do that we need to continue to challenge the stereotypes that still put too many young people off physics and ensure every young person knows that physics – and a career in science and innovation – could be for them.”

However, there is less good news for younger pupils, with a new IOP report finding that more than half a million GCSE students are expected to start the new school year with no physics teacher. It reveals that a quarter of English state schools have no specialist physics teachers at all, and warns that more than 12,000 students could miss out on taking A-level physics as a result. The IOP wants the UK government to invest £120m over the next 10 years to address the shortage by retaining, recruiting and retraining a new generation of physics teachers.

The post Errors in A-level physics papers could jeopardize student university admissions, Institute of Physics warns appeared first on Physics World.

Quantum sensors reveal ‘smoking gun’ of superconductivity in pressurized bilayer nickelates

4 septembre 2025 à 10:00

Physicists at the Chinese Academy of Sciences (CAS) have used diamond-based quantum sensors to uncover what they say is the first unambiguous experimental evidence for the Meissner effect – a hallmark of superconductivity – in bilayer nickelate materials at high pressures. The discovery could spur the development of highly sensitive quantum detectors that can be operated under high-pressure conditions.

Superconductors are materials that conduct electricity without resistance when cooled to below a certain critical transition temperature Tc. Apart from a sharp drop in electrical resistance, another important sign that a material has crossed this threshold is the appearance of the Meissner effect, in which the material expels a magnetic field from its interior (diamagnetism). This expulsion creates such a strong repulsive force that a magnet placed atop the superconducting material will levitate above it.

In “conventional” superconductors such as solid mercury, the Tc is so low that the materials must be cooled with liquid helium to keep them in the superconducting state. In the late 1980s, however, physicists discovered a new class of superconductors that have a Tc above the boiling point of liquid nitrogen (77 K). These “unconventional” or high-temperature superconductors are derived not from metals but from insulators containing copper oxides (cuprates).

Since then, the search has been on for materials that superconduct at still higher temperatures, and perhaps even at room temperature. Discovering such materials would have massive implications for technologies ranging from magnetic resonance imaging machines to electricity transmission lines.

Enter nickel oxides

In 2019 researchers at Stanford University in the US identified nickel oxides (nickelates) as additional high-temperature superconductors. This created a flurry of interest in the superconductivity community because these materials appear to superconduct in a way that differs from their copper-oxide cousins.

Among the nickelates studied, La3Ni2O7-δ (where δ can range from 0 to 0.04) is considered particularly promising because in 2023, researchers led by Meng Wang of China’s Sun Yat-Sen University spotted certain signatures of superconductivity at a temperature of around 80 K. However, these signatures only appeared when crystals of the material were placed in a device called a diamond anvil cell (DAC). This device subjects samples of material to extreme pressures of more than 400 GPa (or 4 × 10⁶ atmospheres) as it squeezes them between the flattened tips of two tiny, gem-grade diamond crystals.

The problem, explains Xiaohui Yu of the CAS’ Institute of Physics, is that it is not easy to spot the Meissner effect under such high pressures. This is because the structure of the DAC limits the available sample volume and hinders the use of highly sensitive magnetic measurement techniques such as SQUID magnetometry. Another problem is that the sample used in the 2023 study contains several competing phases that could mix and degrade the signal from the La3Ni2O7-δ.

Nitrogen-vacancy centres embedded as in-situ quantum sensors

In the new work, Yu and colleagues used nitrogen-vacancy (NV) centres embedded in the DAC as in-situ quantum sensors to track and image the Meissner effect in pressurized bilayer La3Ni2O7-δ. This newly developed magnetic sensing technique boasts both high sensitivity and high spatial resolution, Yu says. What is more, it fits perfectly into the DAC high-pressure chamber.

Next, they applied a small external magnetic field of around 120 G. Under these conditions, they measured the optically detected magnetic resonance (ODMR) spectra of the NV centres point by point. They could then extract the local magnetic field from the resonance frequencies of these spectra. “We directly mapped the Meissner effect of the bilayer nickelate samples,” Yu says, noting that the team’s image of the magnetic field clearly shows both a diamagnetic region and a region where magnetic flux is concentrated.
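As a rough sketch of how a field value is extracted from ODMR data: in a generic NV-magnetometry analysis (not the group's actual code), the two spin resonances sit at D ± γB around the zero-field splitting D, so the axial field follows directly from their splitting. The constants below are standard NV values; the resonance frequencies are hypothetical.

```python
# Standard NV-centre constants (assumed values for this illustration)
D_MHZ = 2870.0         # zero-field splitting (MHz)
GAMMA_MHZ_PER_G = 2.8  # gyromagnetic ratio (MHz per gauss)

def field_from_odmr(f_minus_mhz, f_plus_mhz):
    """Axial magnetic field (gauss) from the two ODMR resonance dips,
    which sit at D - gamma*B and D + gamma*B."""
    return (f_plus_mhz - f_minus_mhz) / (2.0 * GAMMA_MHZ_PER_G)

# Hypothetical resonances for a ~120 G applied field: 2870 ∓/± 2.8 × 120 MHz
b = field_from_odmr(2534.0, 3206.0)
print(round(b, 1))  # → 120.0
```

Repeating this extraction point by point across the sample is what turns the ODMR spectra into the spatial map of the local field described above.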

Weak demagnetization signal

The researchers began their project in late 2023, shortly after receiving single-crystal samples of La3Ni2O7-δ from Wang. “However, after two months of collecting data, we still had no meaningful results,” Yu recalls. “From these experiments, we learnt that the demagnetization signal in La3Ni2O7-δ crystals was quite weak and that we needed to improve either the nickelate sample or the sensitivity of the quantum sensor.”

To overcome these problems, they switched to using polycrystalline samples, enhancing the quality of the nickelate samples by doping them with praseodymium to make La2PrNi2O7. This produced a sample with an almost pure bilayer structure and thus a much stronger demagnetization signal. They also used shallow NV centres implanted on the DAC culet (the smaller face of the two diamond tips).

“Unlike the NV centres in the original experiments, which were randomly distributed in the pressure-transmitting medium and have relatively large ODMR widths, leading to only moderate sensitivity in the measurements, these shallow centres are evenly distributed and well aligned, making it easier for us to perform magnetic imaging with increased sensitivity,” Yu explains.

These improvements enabled the team to obtain a demagnetization signal from the La2PrNi2O7 and La3Ni2O7-δ samples, he tells Physics World. “We found that the diamagnetic signal from the La2PrNi2O7 samples is about five times stronger than that from the La3Ni2O7-δ ones prepared under similar conditions – a result that is consistent with the fact that the Pr-doped samples are of a better quality.”

Physicist Jun Zhao of Fudan University, China, who was not involved in this work, says that Yu and colleagues’ measurement represents “an important step forward” in nickelate research. “Such measurements are technically very challenging, and their success demonstrates both experimental ingenuity and scientific significance,” he says. “More broadly, their result strengthens the case for pressurized nickelates as a new platform to study high-temperature superconductivity beyond the cuprates. It will certainly stimulate further efforts to unravel the microscopic pairing mechanism.”

As well as allowing for the precise sensing of magnetic fields, NV centres can also be used to accurately measure many other physical quantities that are difficult to measure under high pressure, such as strain and temperature distribution. Yu and colleagues say they are therefore looking to further expand the application of these structures for use as quantum sensors in high-pressure sensing.

They report their current work in National Science Review.

The post Quantum sensors reveal ‘smoking gun’ of superconductivity in pressurized bilayer nickelates appeared first on Physics World.

Quantum foundations: towards a coherent view of physical reality

3 septembre 2025 à 12:00

One hundred years after its birth, quantum mechanics remains one of the most powerful and successful theories in all of science. From quantum computing to precision sensors, its technological impact is undeniable – and one reason why 2025 is being celebrated as the International Year of Quantum Science and Technology.

Yet as we celebrate these achievements, we should still reflect on what quantum mechanics reveals about the world itself. What, for example, does this formalism actually tell us about the nature of reality? Do quantum systems have definite properties before we measure them? Do our observations create reality, or merely reveal it?

These are not just abstract, philosophical questions. Having a clear understanding of what quantum theory is all about is essential to its long-term coherence and its capacity to integrate with the rest of physics. Unfortunately, there is no scientific consensus on these issues, which continue to provoke debate in the research community.

That uncertainty was underlined by a recent global survey of physicists about quantum foundational issues, conducted by Nature (643 1157). It revealed a persistent tension between “realist” views, which seek an objective, visualizable account of quantum phenomena, and “epistemic” views that regard the formalism as merely a tool for organizing our knowledge and predicting measurement outcomes.

Only 5% of the 1100 people who responded to the Nature survey expressed full confidence in the Copenhagen interpretation, which is still prevalent in textbooks and laboratories. Further divisions were revealed over whether the wavefunction is a physical entity, a mere calculation device, or a subjective reflection of belief. The lack of agreement on such a central feature underscores the theoretical fragility underlying quantum mechanics.

The willingness to explore alternatives reflects the intellectual vitality of the field but also underscores the inadequacy of current approaches

More broadly, 75% of respondents believe that quantum theory will eventually be replaced, at least partially, by a more complete framework. Encouragingly, 85% agree that attempts to interpret the theory in intuitive or physical terms are valuable. This willingness to explore alternatives reflects the intellectual vitality of the field but also underscores the inadequacy of current approaches.

Beyond interpretation

We believe that this interpretative proliferation stems from a deeper problem, which is that quantum mechanics lacks a well-defined physical foundation. It describes the statistical outcomes of measurements, but it does not explain the mechanisms behind them. The concept of causality has been largely abandoned in favour of operational prescriptions such that quantum theory works impressively in practice but remains conceptually opaque.

In our view, the way forward is not to multiply interpretations or continue debating them, but to pursue a deeper physical understanding of quantum phenomena. One promising path is stochastic electrodynamics (SED), a classical theory augmented by a random electromagnetic background field, the real vacuum or zero-point field discovered by Max Planck as early as 1911. This framework restores causality and locality by explaining quantum behaviour as the statistical response of particles to this omnipresent background field.

Over the years, several researchers from different lines of thought have contributed to SED. Since our early days with Trevor Marshall, Timothy Boyer and others, we have refined the theory to the point that it can now account for the emergence of features that are considered building blocks of quantum formalism, such as the basic commutator and Heisenberg inequalities.

Particles acquire wave-like properties not by intrinsic duality, but as a consequence of their interaction with the vacuum field. Quantum fluctuations, interference patterns and entanglement emerge from this interaction, without the need to resort to non-local influences or observer-dependent realities. The SED approach is not merely mechanical, but rather electrodynamic.

Coherent thoughts

We’re not claiming that SED is the final word. But it does offer a coherent picture of microphysical processes based on physical fields and forces. Importantly, it doesn’t abandon the quantum formalism but rather reframes it as an effective theory – a statistical summary of deeper dynamics. Such a perspective enables us to maintain the successes of quantum mechanics while seeking to explain its origins.

For us, SED highlights that quantum phenomena can be reconciled with concepts central to the rest of physics, such as realism, causality and locality. It also shows that alternative approaches can yield testable predictions and provide new insights into long-standing puzzles. One phenomenon lying beyond current quantum formalism that we could now test, thanks to progress in experimental physics, is the predicted violation of Heisenberg’s inequalities over very short time periods.

As quantum science continues to advance, we must not lose sight of its conceptual foundations. Indeed, a coherent, causally grounded understanding of quantum mechanics is not a distraction from technological progress but a prerequisite for its full realization. By turning our attention once again to the foundations of the theory, we may finally complete the edifice that began to rise a century ago.

The centenary of quantum mechanics should be a time not just for celebration but critical reflection too.

This article forms part of Physics World‘s contribution to the 2025 International Year of Quantum Science and Technology (IYQ), which aims to raise global awareness of quantum physics and its applications.

Stay tuned to Physics World and our international partners throughout the year for more coverage of the IYQ.

Find out more on our quantum channel.

The post Quantum foundations: towards a coherent view of physical reality appeared first on Physics World.

Twisted graphene reveals a new type of chirality

3 septembre 2025 à 10:32

Structural chirality refers to the geometric property of objects that are not superimposable on their mirror images, a concept that is central to organic chemistry. In contrast, topological chirality in physics involves quantum properties like spin and is essential for understanding topological edge states. The connection between these two forms of chirality remains an open question.

Traditionally, topological phenomena have been studied in spinful systems, where the presence of spin allows for chiral interactions and symmetry-breaking effects. This new study challenges that paradigm by demonstrating that topological chirality can arise even in spinless systems, purely from the three-dimensional structural arrangement of otherwise featureless units.

The researchers mathematically investigated two types of twisted 3D graphite systems composed of stacked 2D graphene layers. Importantly, large twist angles (21.8°) were used. In one configuration, the layers are twisted into a helical screw-like structure, while in the other, the twist angles alternate between layers, forming a periodic chiral pattern. These structural designs give rise to novel topological phases.
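The 21.8° figure is the largest commensurate twist angle of bilayer graphene, and can be checked with the standard commensuration formula cos θᵢ = (3i² + 3i + ½)/(3i² + 3i + 1). This check is our illustration, not a calculation from the paper.

```python
import math

def commensurate_angle_deg(i):
    """Commensurate twist angle (degrees) of twisted bilayer graphene
    for the standard (i, i+1) commensuration index i."""
    num = 3 * i * i + 3 * i + 0.5
    den = 3 * i * i + 3 * i + 1.0
    return math.degrees(math.acos(num / den))

theta = commensurate_angle_deg(1)  # largest commensurate angle
print(round(theta, 1))  # → 21.8
```

Larger indices i give progressively smaller commensurate angles, approaching the small-angle (moiré) regime studied elsewhere.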

A key mechanism underlying these effects is intervalley Umklapp scattering. This scattering captures the chirality of the twisted interfaces and induces a sign-flipped interlayer hopping, by introducing a π-flux lattice gauge field. This field alters the symmetry algebra of the system, enabling the emergence of spinless topological chirality.

This research opens up a new design principle for topological materials. By engineering the spatial patterning of structureless units, researchers can induce topological chirality without relying on spin. This has significant implications for the development of topological photonic and acoustic devices, potentially leading to simpler, more tunable materials for applications in quantum computing, sensing, and waveguiding technologies.

Read the full article

Spinless topological chirality from Umklapp scattering in twisted 3D structures

Cong Chen et al 2025 Rep. Prog. Phys. 88 018001

Do you want to learn more about this topic?

Interacting topological insulators: a review by Stephan Rachel (2018)

The post Twisted graphene reveals a new type of chirality appeared first on Physics World.

Unveiling topological edge states with attosecond precision

3 septembre 2025 à 10:31

In condensed matter physics, topological phase transitions are a key area of research because they lead to unusual and potentially useful states of matter. One example is the Floquet topological insulator, which can switch from a non-topological to a topological phase when exposed to a laser pulse. However, detecting these transitions is difficult due to the extremely fast timescales involved and interference from infrared fields, which can distort the photoelectron signals.

A Chern insulator is a unique material that acts as an insulator in its bulk but conducts electricity along its edges. These edge states arise from the topology of the material’s bulk crystal structure. Unlike other topological materials, Chern insulators do not require magnetic fields. Their edge conduction is topologically protected, meaning it is highly resistant to defects and noise. This makes them promising candidates for quantum technologies, spintronics, and energy-efficient electronics.

In this study, researchers developed a new method to detect phase changes in Chern insulators. Using numerical simulations, they demonstrated that attosecond x-ray absorption spectroscopy, combined with polarization-dependent dichroism, can effectively reveal these transitions. Their semi-classical approach isolates the intra-band Berry connection, providing deeper insight into how topological edge states form and how electrons behave in these systems.

This work represents a significant advance in topological materials research. It offers a new way to observe changes in quantum materials in real time, expands the use of attosecond spectroscopy from simple atoms and molecules to complex solids, and opens the door to studying dynamic systems like Floquet topological insulators.

Read the full article

Topological phase transitions via attosecond x-ray absorption spectroscopy

Juan F P Mosquera et al 2024 Rep. Prog. Phys. 87 117901

Do you want to learn more about this topic?

Strong-laser-field physics, non-classical light states and quantum information science by U Bhattacharya, Th Lamprou, A S Maxwell, A Ordóñez, E Pisanty, J Rivera-Dean, P Stammer, M F Ciappina, M Lewenstein and P Tzallas (2023)

The post Unveiling topological edge states with attosecond precision appeared first on Physics World.

Broadband wireless gets even broader thanks to integrated transmitter

3 septembre 2025 à 10:00

Researchers in China have unveiled an ultrabroadband system that uses the same laser and resonator to process signals at frequencies ranging from below 1 GHz up to more than 100 GHz. The system, which is based on a thin-film lithium niobate resonator developed in 2018 by members of the same team, could facilitate the spread of the so-called “Internet of things” in which huge numbers of different devices are networked together at different frequency bands to avoid interference.

Modern complementary metal-oxide-semiconductor (CMOS) electronic devices generally produce signals at frequencies of a few GHz. These signals are then often shifted into other frequency bands for processing and transmission. For example, sending signals long distances down silica optical fibres generally means using a frequency of around 200 THz, as silica is transparent at the corresponding “telecoms” wavelength of 1550 nm.
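The roughly 200 THz figure follows directly from c = λf at the telecoms wavelength; a quick arithmetic check (illustration only):

```python
# Convert the 1550 nm telecoms wavelength to an optical frequency via c = λf
C = 299_792_458.0  # speed of light (m/s)

wavelength_nm = 1550.0
freq_thz = C / (wavelength_nm * 1e-9) / 1e12  # Hz → THz
print(round(freq_thz))  # → 193
```

The exact value is about 193 THz, conventionally quoted as "around 200 THz".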

One of the most popular materials for performing this conversion is lithium niobate. This material has been called “the silicon of photonics” because it is highly nonlinear, allowing optical signals to be generated efficiently at a wide range of frequencies.

In integrated devices, bulk lithium niobate modulators are undesirable. However, in 2018 Cheng Wang and colleagues led by Marko Lončar of Harvard University in Massachusetts, US, developed a miniaturized, thin-film version that used an interferometric design to create a much stronger electro-optic effect in a shorter distance. “Usually, the bandwidth limit is set by the radiofrequency loss,” explains Wang, who is now at the City University of Hong Kong, China. “Being shorter means you can go to much higher frequencies.”

A broadband data transmission system

In the new work, Wang, together with researchers at Peking University in China and the University of California, Santa Barbara in the US, used an optimized version of this setup to make a broadband data transmission system. They divided the output of a telecom-wavelength oscillator into two arms. In one of these arms, optical signal modulation software imprinted a complex amplitude-phase pattern on the wave. The other arm was exposed to the data signal and a lithium niobate microring resonator. The two arms were then recombined at a photodetector, and the frequency difference between the two arms (in the GHz range) was transmitted using an antenna to a detector, where the process was reversed.

Crucially, the offset between the centre frequencies of the two arms (the frequency of the beat note at the photodetector when the two arms are recombined) is determined solely by the frequency shift imposed by the lithium niobate resonator. This can be tuned anywhere between 0.5 GHz and 115 GHz via the thermo-optic effect – essentially, incorporating a small electronic heater and using it to tune the refractive index. The signal is then encoded in modulations of the beat frequency, with additional information imprinted into the phase of the waves.
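The beat-note principle can be sketched numerically: a square-law photodetector mixing two tones produces a spectral line at their frequency difference. The toy model below uses scaled-down frequencies in arbitrary units, not the experiment's actual optical and GHz values.

```python
import numpy as np

# Two optical "arms" at slightly different frequencies, mixed on a
# square-law photodetector; the beat appears at the difference frequency
fs = 10_000.0                    # sample rate (arbitrary units)
t = np.arange(0, 1.0, 1.0 / fs)  # one unit of time
f1, f2 = 1000.0, 1035.0          # the two arm frequencies; offset = 35

field = np.cos(2 * np.pi * f1 * t) + np.cos(2 * np.pi * f2 * t)
intensity = field**2             # square-law detection

# Spectrum of the detected intensity, keeping only the low-frequency
# (RF) part, as a real receiver chain would
spec = np.abs(np.fft.rfft(intensity - intensity.mean()))
freqs = np.fft.rfftfreq(len(t), 1.0 / fs)
mask = freqs < 500.0
beat = freqs[mask][np.argmax(spec[mask])]
print(beat)  # → 35.0
```

Shifting one arm's frequency (here, via the thermo-optically tuned resonator) moves the beat note anywhere in the accessible band, which is what gives the system its 0.5–115 GHz tuning range.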

The researchers say this system is an improvement on standard electronic amplifiers because such devices usually operate in relatively narrow bands. Using them to make large jumps in frequency therefore means that signals need to be shifted multiple times. This introduces cumulative noise into the signal and is also problematic for applications such as robotic surgery, where the immediate arrival of a signal can literally be a matter of life and death.

Internet of things applications

The researchers demonstrated wireless data transfer across a distance of 1.3 m, achieving speeds of up to 100 gigabits per second. In the present setup, they used three different horn antennas to transmit microwaves of different frequencies through free space, but they hope to improve this: “That is our next goal – to get a fully frequency-tuneable link,” says Peking University’s Haowen Shu.

The researchers believe such a wideband setup could be crucial to the development of the “Internet of things” in which all sorts of different electronic devices are networked together without unwanted interference. Atmospheric transparency windows below 6 GHz, where loss is lower and propagation lengths are longer, are likely to be crucial for providing wireless Internet access to rural areas. Meanwhile, higher frequencies – with higher data rates – will probably be needed for augmented reality and remote surgery applications.

Alan Willner, an electrical engineer and optical scientist at the University of Southern California, US, who was not involved in the research, thinks the team is on the right track. “You have lots of spectrum in various radio bands for wireless communications,” he says. “But how are you going to take advantage of these bands to transmit high data rates in a cost-effective and flexible way? Are you going to use multiple different systems – one each for microwave, millimetre wave, and terahertz? Using one tuneable and reconfigurable integrated platform to cover these bands is significantly better. This research is a great step in that direction.”

The research is published in Nature.

The post Broadband wireless gets even broader thanks to integrated transmitter appeared first on Physics World.

From ‘rewarding and exciting’ to ‘challenging and overwhelming’: what it means to have a career in intelligence and cyber security

2 septembre 2025 à 14:00
GCHQ spelt out in Scrabble pieces on a chess board
Your next move? A career in intelligence can suit physicists with the right mindset. (Courtesy: Shutterstock/shaneinswedenx)

As a physics graduate or an early career researcher looking for a job, you might not think of the UK’s primary intelligence and security agency – Government Communications Headquarters (GCHQ) – as somewhere you might consider. But GCHQ, which covers counter-terrorism, cybersecurity, organized crime and defence support for the UK, hires a large number of physicists. Indeed, to celebrate the 2025 International Year of Quantum Science and Technology, the agency has hosted internal talks, run information campaigns and more.

GCHQ works with the Secret Intelligence Service (MI6), MI5, as well as the armed forces, a number of international partners, and firms in the private sector and academia. To find out more about a career at GCHQ – working with cutting-edge technology to identify, analyse and disrupt threats to the UK – Physics World speaks to two people with academic backgrounds who have a long career at the organization. They tell us about the benefits, the difficulties and the complexity of working at an intelligence agency.

Nia is the deputy director for science at GCHQ, where she has worked for the past 15 years. After studying physics at university, she joined GCHQ as a graduate and has since contributed to a wide range of scientific and technological initiatives in support of national security. She is a Fellow of both the Institute of Physics (IOP), which publishes Physics World, and the Institution of Engineering and Technology (IET).

Cheryl leads GCHQ’s adoption of quantum technologies. Following a degree in engineering, her career began as an apprentice at an avionics company. Since then, she has had many roles across research and development at GCHQ and across broader UK government departments, with a focus on understanding and implementing emerging technology. Cheryl is a Fellow of the IET and a Member of the IOP. 

When did your interest in science first develop?

Nia My fascination with science was nurtured from a young age, largely inspired by my parents. My mum was a physics teacher, and my dad is a passionate historian with an insatiable curiosity about the world. Growing up in an environment rich with books, experiments, and discussions about how things work – whether exploring astrophysics, geology or ancient Egypt – instilled in me a lifelong desire to understand our universe. My mum’s electronics, mechanics and physics lessons meant there were always breadboards, crocodile clips and even a Van de Graaff generator in the house, transforming learning into an exciting tangible experience.

Cheryl As a child I was always interested in nature and in how things work. I used to build bug farms in the garden and still have my old Observer’s books with the butterflies, etc, ticked off when spotted. Leaning towards my practical side of constantly making things (and foolishly believing my careers teacher that a physics degree would only lead to teaching), I took physics, chemistry and maths A-levels and a degree in engineering.

Could you tell us a bit about your educational background and the career path that led you to work at GCHQ?

Nia I was born and grew up in South Wales and attended a Welsh-language school where I studied physics, maths and chemistry at A-level. I then studied physics at Durham University for four years, before I started working at GCHQ as a graduate. My first role was in an area that is now the National Cyber Security Centre (NCSC). As the cyber security arm of GCHQ, it researches the reliability of semiconductors in national security applications and uses that research to shape policy and security standards. This was great for me, as my final year at university focused on materials science and condensed-matter physics, which came in very useful.

Cheryl My engineering degree apprenticeship was through an aerospace company in Cheltenham, and I worked there afterwards designing test kits for the RAF. It was almost natural that I should at least try a few years at GCHQ as a local employer and I had plans to then move to other R&D labs.

What’s it like to work here – what are some of the stresses of working in this kind of an environment and not being able to discuss your job with friends and family? What are some of the best aspects of working at GCHQ?

Nia Working at GCHQ is rewarding and exciting especially as we look at the most exciting developments in emerging technologies. It can also be challenging especially when navigating the complexities of global security challenges amid an unpredictable geopolitical landscape. There are days when media reports or international events feel overwhelming, but knowing that my work contributes towards safeguarding the UK’s interests today and into the future offers a strong sense of purpose.

The most rewarding aspect, by far, is the people. We have some of the brightest, most dedicated experts – mentors, colleagues, friends – whose commitment inspires me daily. Their support and collaboration make even the most demanding days manageable.

Cheryl At GCHQ I found that I have been able to enjoy several very different “careers” within the organization, including opportunities to travel and to develop diverse skills. This, together with a flexibility to change working patterns to suit stages of family life, has meant I have stayed for most of my career.

I’ve had some amazing and unique opportunities and experiences. In the Cheltenham area it’s accepted that many people work here, and it’s widely respected that we cannot talk about the detail of what we do.

Fingerprint on circuitboard illustration
Safety net Maintaining secure communication and anticipating new threats are key to the work carried out at GCHQ. (Shutterstock/S and V Design)

What role does physics and especially quantum science play in what you do? And what role does physics play when it comes to the national security of the UK?

Nia As deputy director of science at GCHQ, my role involves collaborating with experts to understand how emerging technologies, including quantum science, impact national security. Quantum offers extraordinary potential for secure communication and advanced sensing – but it equally threatens to upend existing security protocols if adversaries harness it maliciously. A deep understanding of physics is crucial – not only to spot opportunities but also to anticipate and counter threats.

Quantum science is just one example of how a fundamental understanding of physics and maths gives you the foundations to understand the broad waterfront of emerging technologies coming our way. We work closely with government departments, academia, industry and start-ups to ensure the UK remains at the forefront of this field, shaping a resilient and innovative security ecosystem.

Cheryl I first came across quantum science, technologies and quantum computing around 15 years ago through an emerging technology analysis role in R&D, and I watched and learned keenly as I could see that these would be game changing. Little did I know at the time that I would later be leading our adoption of quantum, and just how significant these emerging technologies for sensing, timing and computing would grow to be.

The UK national ecosystem developing around quantum technologies is a great mix of minds from academia, industry and government departments and is one of the most collegiate, inspiring and well-motivated communities that I have interacted with.

For today’s physics graduates who might be interested in a career at GCHQ, what are some of the key skills they require?

Nia Many people will have heard of historic tales of the tap on the shoulder for people to work in intelligence agencies, but as with all other jobs the reality is that people can find out about careers at GCHQ in much the same way they would with any other kind of job.

I would emphasize qualities like curiosity, problem-solving and resilience as being key. The willingness to roll up your sleeves, a genuine care for collaborative work, and empathy are equally important – particularly because much of what we do is sensitive and demands trust and discretion. Maintaining a hunger to learn and adapt is what will set you apart.

Cheryl We have roles where you will be helping to solve complex problems – doing work you simply won’t find anywhere else. It’s key to have curiosity and an open mind – and don’t be put off by the fact that you can’t ask too many questions in advance!

What sort of equality, diversity and inclusion initiatives do you have at GCHQ and how are you looking to get more women and minorities working there?

Nia Diversity and inclusion are mission-critical for us at GCHQ, gathering the right mix of minds to find innovative solutions to the toughest of problems. We’re committed to building on our work to better represent the communities we serve, including increasing the number of people from ethnic minority backgrounds and the number of women in senior roles.

Cheryl We are committed to having a workforce that reflects the communities we serve. Our locations in the north-west, in both Manchester and now Lancashire, are part of the mission to find the right mix of minds.

What is your advice to today’s physics grads? What is it that you know today that you wish you knew at the start of your career?

Nia One key lesson is that career paths are rarely linear. When starting out, uncertainty can feel daunting, but it’s an opportunity for growth. Embrace challenges and seize opportunities that excite you – whether they seem narrowly related to your studies or not. Every experience contributes to your development. Additionally, don’t underestimate the importance of work–life balance. GCHQ offers a supportive environment – remember, careers are marathons, not sprints. Patience and curiosity will serve you well.

Cheryl It takes multidisciplinary teams to deliver game-changers and new ecosystems. Your initial “career choices” are just a stepping stone from which you can forge your own path and follow your instincts.

The post From ‘rewarding and exciting’ to ‘challenging and overwhelming’: what it means to have a career in intelligence and cyber security appeared first on Physics World.

Desert dust helps freeze clouds in the northern hemisphere

2 septembre 2025 à 10:07

Micron-sized dust particles in the atmosphere could trigger the formation of ice in certain types of clouds in the Northern Hemisphere. This is the finding of researchers in Switzerland and Germany, who used 35 years of satellite data to show that nanoscale defects on the surface of these aerosol particles are responsible for the effect. Their results, which agree with laboratory experiments on droplet freezing, could be used to improve climate models and to advance studies of cloud seeding for geoengineering.

In the study, which was led by environmental scientist Diego Villanueva of ETH Zürich, the researchers focused on clouds in the so-called mixed-phase regime, which form at temperatures of between −39°C and 0°C and are commonly found in mid- and high-latitudes, particularly over the North Atlantic, Siberia and Canada. These mixed-phase regime clouds (MPRCs) are often topped by a liquid or ice layer, and their makeup affects how much sunlight they reflect back into space and how much water they can release as rain or snow. Understanding them is therefore important for forecasting weather and making projections of future climate.

Researchers have known for a while that MPRCs are extremely sensitive to the presence of ice-nucleating particles in their environment. Such particles mainly come from mineral dust aerosols (such as K-feldspar, quartz, albite and plagioclase) that get swept up into the upper atmosphere from deserts. The Sahara Desert in northern Africa, for example, is a prime source of such dust in the Northern Hemisphere.

More dust leads to more ice clouds

Using 35 years of satellite data collected as part of the Cloud_cci project and MERRA-2 aerosol reanalyses, Villanueva and colleagues looked for correlations between dust levels and the formation of ice-topped clouds. They found that at temperatures of between −15°C and −30°C, the more dust there was, the more frequent the ice clouds were. What is more, their calculated increase in ice-topped clouds with increasing dust loading agrees well with previous laboratory experiments that predicted how dust triggers droplet freezing.
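The kind of statistical link reported here can be illustrated with a toy calculation: correlate a dust-loading time series against the frequency of ice-topped clouds. The data below are entirely synthetic – the logarithmic dust response and the scatter are assumptions for illustration only, whereas the actual study drew on 35 years of Cloud_cci retrievals and MERRA-2 reanalyses.

```python
import numpy as np

# Synthetic illustration of a dust vs ice-cloud correlation analysis.
rng = np.random.default_rng(0)
dust = rng.lognormal(mean=0.0, sigma=1.0, size=500)   # arbitrary units

# Assume ice-cloud frequency rises with the log of dust loading
# (a common shape for ice-nucleating particle activity), plus
# observational scatter; both choices are illustrative assumptions.
ice_freq = 0.3 + 0.1 * np.log(dust) + rng.normal(0, 0.05, size=500)

r = np.corrcoef(np.log(dust), ice_freq)[0, 1]
print(f"correlation (log dust vs ice-cloud frequency): r = {r:.2f}")
```

In the real analysis the correlation is computed per temperature bin, which is how the −15°C to −30°C window emerges as the range where dust matters most.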

The new study, which is detailed in Science, shows that there is a connection between aerosols in the micrometre-size range and cloud ice observed over distances of several kilometres, Villanueva says. “We found that it is the nanoscale defects on the surface of dust aerosols that trigger ice clouds, so the process of ice glaciation spans more than 15 orders of magnitude in length,” he explains.

Thanks to this finding, Villanueva tells Physics World that climate modellers can use the team’s dataset to better constrain aerosol–cloud processes, potentially helping them to construct better estimates of cloud feedback and global temperature projections.

The result also shows how sensitive clouds are to varying aerosol concentrations, he adds. “This could help bring forward the field of cloud seeding and include this in climate geoengineering efforts.”

The researchers say they have successfully replicated their results using a climate model and are now drafting a new manuscript to further explore the implications of dust-driven cloud glaciation for climate, especially for the Arctic.

The post Desert dust helps freeze clouds in the northern hemisphere appeared first on Physics World.

Radioactive ion beams enable simultaneous treatment and imaging in particle therapy

2 septembre 2025 à 10:00

Researchers in Germany have demonstrated the first cancer treatment using a radioactive carbon ion beam (11C), on a mouse with a bone tumour close to the spine. Performing particle therapy with radioactive ion beams enables simultaneous treatment and visualization of the beam within the body.

Particle therapy using beams of protons or heavy ions is a highly effective cancer treatment, with the favourable depth–dose deposition – the Bragg peak – providing extremely conformal tumour targeting. This conformality, however, makes particle therapy particularly sensitive to range uncertainties, which can impact the Bragg peak position.

One way to reduce such uncertainties is to use positron emission tomography (PET) to map the isotopes generated as the treatment beam interacts with tissues in the patient. For therapy with carbon (12C) ions, currently performed at 17 centres worldwide, this involves detecting the beta decay of 10C and 11C projectile fragments. Unfortunately, such fragments generate a small PET signal, while their lower mass shifts the measured activity peak away from the Bragg peak.

The researchers – working within the ERC-funded BARB (Biomedical Applications of Radioactive ion Beams) project – propose that treatment with positron-emitting ions such as 11C could overcome these obstacles. Radioactive ion beams have the same biological effectiveness as their corresponding stable ion beams, but generate an order of magnitude larger PET signal. They also reduce the shift between the activity and dose peaks, enabling precise localization of the ion beam in vivo.

“Range uncertainty remains the main problem of particle therapy, as we do not know exactly where the Bragg peak is,” explains Marco Durante, head of biophysics at the GSI Helmholtz Centre for Heavy Ion Research and principal investigator of the BARB project. “If we ‘aim-and-shoot’ using a radioactive beam and PET imaging, we can see where the beam is and can then correct it. By doing this, we can reduce the margins around the target that spoil the precision of particle therapy.”

In vivo experiments

To test this premise, Durante and colleagues performed in vivo experiments at the GSI/FAIR accelerator facility in Darmstadt. For online range verification, they used a portable small-animal in-beam PET scanner built by Katia Parodi and her team at LMU Munich. The scanner, initially designed for the ERC project SIRMIO (Small-animal proton irradiator for research in molecular image-guided radiation-oncology), contains 56 depth-of-interaction detectors – based on scintillator blocks of pixelated LYSO crystals – arranged spherically with an inner diameter of 72 mm.

LMU researchers with small-animal PET scanner
LMU researchers Members of the LMU team involved in the BARB project (left to right: Peter Thirolf, Giulio Lovatti, Angelica Noto, Francesco Evangelista, Munetaka Nitta and Katia Parodi) with the small-animal PET scanner. (Courtesy: Katia Parodi/Francesco Evangelista, LMU)

“Not only does our spherical in-beam PET scanner offer unprecedented sensitivity and spatial resolution, but it also enables on-the-fly monitoring of the activity implantation for direct feedback during irradiation,” says Parodi, co-principal investigator of the BARB project.

The researchers used a radioactive 11C-ion beam – produced at the GSI fragment separator – to treat 32 mice with an osteosarcoma tumour implanted in the neck near the spinal cord. To encompass the full target volume, they employed a range modulator to produce a spread-out Bragg peak (SOBP) and a plastic compensator collar, which also served to position and immobilize the mice. The anaesthetized animals were placed vertically inside the PET scanner and treated with either 20 or 5 Gy at a dose rate of around 1 Gy/min.

For each irradiation, the team compared the measured activity with Monte Carlo-simulated activity based on pre-treatment microCT scans. The activity distributions were shifted by about 1 mm, attributed to anatomical changes between the scans (with mice positioned horizontally) and irradiation (vertical positioning). After accounting for this anatomical shift, the simulation accurately matched the measured activity. “Our findings reinforce the necessity of vertical CT planning and highlight the potential of online PET as a valuable tool for upright particle therapy,” the researchers write.

With the tumour so close to the spine, even small range uncertainties risk damage to the spinal cord, so the team used the online PET images generated during the irradiation to check that the SOBP did not cover the spine. This was not seen in any of the animals, but Durante notes that if it had been, the beam could be moved to enable “truly adaptive” particle therapy. Assessing the mice for signs of radiation-induced myelopathy (which can lead to motor deficits and paralysis) revealed that no mice exhibited severe toxicity, further demonstrating that the spine was not exposed to high doses.

PET imaging in a mouse
PET imaging in a mouse (a) Simulation showing the expected 11C-ion dose distribution in the pre-treatment microCT scan. (b) Corresponding simulated PET activity. (c) Online PET image of the activity during 11C irradiation, overlaid on the same microCT used for simulations. The target is outlined in black, the spine in red. (Courtesy: CC BY 4.0/Nat. Phys. 10.1038/s41567-025-02993-8)

Following treatment, tumour measurements revealed complete tumour control after 20 Gy irradiation and prolonged tumour growth delay after 5 Gy, suggesting complete target coverage in all animals.

The researchers also assessed the washout of the signal from the tumour, which includes a slow activity decrease due to the decay of 11C (which has a half-life of 20.34 min), plus a faster decrease as blood flow removes the radioactive isotopes from the tumour. The results showed that the biological washout was dose-dependent, with the fast component visible at 5 Gy but disappearing at 20 Gy.
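A minimal way to picture this washout behaviour is a two-component model: the physical decay of 11C (half-life 20.34 min) multiplied by a biological term with a fast, perfusion-driven component and a slow residual one. All rate constants and fractions below are illustrative assumptions, not values fitted by the BARB team.

```python
import numpy as np

# Toy two-component washout model of the kind described: measured 11C
# activity falls with the physical half-life (20.34 min) multiplied by
# a biological washout term with fast and slow components.
T_HALF = 20.34 * 60                 # 11C physical half-life (s)
lam_phys = np.log(2) / T_HALF

def activity(t, f_fast=0.3, tau_fast=120.0):
    """Relative activity at time t (s): physical decay times washout.

    f_fast and tau_fast (fast-component fraction and time constant)
    are illustrative assumptions, not fitted values.
    """
    washout = (1 - f_fast) + f_fast * np.exp(-t / tau_fast)
    return np.exp(-lam_phys * t) * washout

for ti in np.linspace(0, 1200, 5):   # first 20 minutes
    print(f"t = {ti:5.0f} s  relative activity = {activity(ti):.3f}")
```

In this picture, the dose dependence reported by the researchers corresponds to the fast fraction shrinking towards zero at 20 Gy as the tumour vasculature is damaged, leaving only physical decay and the slow component.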

“We propose that this finding is due to damage to the blood vessel feeding the tumour,” says Durante. “If this is true, high-dose radiotherapy may work in a completely different way from conventional radiotherapy: rather than killing all the cancer stem cells, we just starve the tumour by damaging the blood vessels.”

Future plans

Next, the team intends to investigate the use of 10C or 15O treatment beams, which should provide stronger signals and increased temporal resolution. A new Super-FRS fragment separator at the FAIR accelerator facility will provide the high-intensity beams required for studies with 10C.

Looking further ahead, clinical translation will require a realistic and relatively cheap design, says Durante. “CERN has proposed a design [the MEDICIS-Promed project] based on ISOL [isotope separation online] that can be used as a source of radioactive beams in current accelerators,” he tells Physics World. “At GSI we are also working on a possible in-flight device for medical accelerators.”

The findings are reported in Nature Physics.

The post Radioactive ion beams enable simultaneous treatment and imaging in particle therapy appeared first on Physics World.
