
Organic LED can electrically switch the handedness of emitted light

7 January 2026, 14:00

Circularly polarized (CP) light is encoded with information through its photon spin and can be used in applications such as low-power displays, encrypted communications and quantum technologies. Organic light-emitting diodes (OLEDs) produce CP light with a left or right “handedness”, depending on the chirality of the light-emitting molecules used to create the device.

While OLEDs usually only emit either left- or right-handed CP light, researchers have now developed OLEDs that can electrically switch between emitting left- and right-handed CP light – without needing different molecules for each handedness.

“We had recently identified an alternative mechanism for the emission of circularly polarized light in OLEDs, using our chiral polymer materials, which we called anomalous circularly polarized electroluminescence,” says lead author Matthew Fuchter from the University of Oxford. “We set about trying to better understand the interplay between this new mechanism and the generally established mechanism for circularly polarized emission in the same chiral materials”.

Light handedness controlled by molecular chirality

The CP light handedness of an organic emissive molecule is controlled by its chirality. A chiral molecule is one that exists as two mirror-image stereoisomers that cannot be superimposed on each other. Each of these non-superimposable forms is called an enantiomer, and will absorb, emit and refract CP light with a defined spin angular momentum. The two enantiomers produce CP light of opposite handedness, through an optical mechanism called normal circularly polarized electroluminescence (NCPE).

OLED designs typically require access to both enantiomers, but most chemical synthesis processes will produce racemic mixtures (equal amounts of the two enantiomers) that are difficult to separate. Extracting each enantiomer so that they can be used individually is complex and expensive, but the research at Oxford has simplified this process by using a molecule that can switch between emitting left- and right-handed CP light.

The molecule in question, (P)-aza[6]helicene, is the right-handed enantiomer of a helical molecule. Even though it is a single-handed form, the researchers found a way to control the handedness of the OLED’s emission, enabling it to switch between both.

Switching handedness without changing the structure

The researchers designed the helicene molecules so that the handedness of the light could be switched electrically, without needing to change the structure of the material itself. “Our work shows that either handedness can be accessed from a single-handed chiral material without changing the composition or thickness of the emissive layer,” says Fuchter. “From a practical standpoint, this approach could have advantages in future circularly polarized OLED technologies.”

Instead of making a structural change, the researchers changed the way that electric charges recombine in the device, using interlayers to alter the recombination position and charge carrier mobility. Depending on where the recombination zone sits, charge transport is either balanced or unbalanced, which in turn determines the handedness of the CP light the device emits.

When the recombination zone is located in the centre of the emissive layer, charge transport is balanced and the NCPE mechanism operates. In this situation, the helicene emits with its normal handedness (right-handed).

However, when the recombination zone sits close to one of the transport layers, charge transport becomes unbalanced, giving rise to a mechanism called anomalous circularly polarized electroluminescence (ACPE). ACPE overrides the NCPE mechanism and inverts the emission to left-handed by altering the balance of induced orbital angular momentum in electrons versus holes. The coexistence of these two electroluminescence mechanisms means the device’s handedness can be controlled electrically, by tuning the charge carrier mobility and the position of the recombination zone.

The research allows the creation of OLEDs with controllable spin angular momentum information using a single emissive enantiomer, while probing the fundamental physics of chiral optoelectronics. “This work contributes to the growing body of evidence suggesting further rich physics at the intersection of chirality, charge and spin. We have many ongoing projects to try and understand and exploit such interplay,” Fuchter concludes.

The researchers describe their findings in Nature Photonics.

The post Organic LED can electrically switch the handedness of emitted light appeared first on Physics World.

Francis Crick: a life of twists and turns

7 January 2026, 12:00

Physicist, molecular biologist, neuroscientist: Francis Crick’s scientific career took many turns. And now, he is the subject of zoologist Matthew Cobb’s new book, Crick: a Mind in Motion – from DNA to the Brain.

Born in 1916, Crick studied physics at University College London in the mid-1930s, before working for the Admiralty Research Laboratory during the Second World War. But after reading physicist Erwin Schrödinger’s 1944 book What Is Life? The Physical Aspect of the Living Cell, and a 1946 article on the structure of biological molecules by chemist Linus Pauling, Crick left his career in physics and switched to molecular biology in 1947.

Six years later, while working at the University of Cambridge, he played a key role in decoding the double-helix structure of DNA, in collaboration with biologist James Watson, biophysicist Maurice Wilkins and other researchers including chemist and X-ray crystallographer Rosalind Franklin. Crick, alongside Watson and Wilkins, went on to receive the 1962 Nobel Prize in Physiology or Medicine for the discovery.

Finally, Crick’s career took one more turn in the mid-1970s. After experiencing a mental health crisis, Crick left Britain and moved to California. He took up neuroscience in an attempt to understand the roots of human consciousness, as discussed in his 1994 book, The Astonishing Hypothesis: the Scientific Search for the Soul.

Parallel lives

When he died in 2004, Crick’s office wall at the Salk Institute in La Jolla, US, carried portraits of Charles Darwin and Albert Einstein, as Cobb notes on the final page of his deeply researched and intellectually fascinating biography. But curiously, there is not a single other reference to Einstein in Cobb’s massive book. Furthermore, there is no reference at all to Einstein in the equally large 2009 biography of Crick, Francis Crick: Hunter of Life’s Secrets, by historian of science Robert Olby, who – unlike Cobb – knew Crick personally.

Nevertheless, a comparison of Crick and Einstein is illuminating. Crick’s family background (in the shoe industry), and his childhood and youth are in some ways reminiscent of Einstein’s. Both physicists came from provincial business families of limited financial success, with some interest in science yet little intellectual distinction. Both did moderately well at school and college, but were not academic stars. And both were exposed to established religion, but rejected it in their teens; they had little intrinsic respect for authority, without being open rebels until later in life.

The similarities continue into adulthood, with the two men following unconventional early scientific careers. Both of them were extroverts who loved to debate ideas with fellow scientists (at times devastatingly), although they were equally capable of long, solitary periods of concentration throughout their careers. In middle age, they migrated from their home countries – Germany (Einstein) and Britain (Crick) – to take up academic positions in the US, where they were much admired and inspiring to other scientists, but failed to match their earlier scientific achievements.

In their personal lives, both Crick and Einstein had a complicated history with women. Having divorced their first wives, they had a variety of extramarital affairs – as discussed by Cobb without revealing the names of these women – while remaining married to their second wives. Interestingly, Crick’s second wife, Odile Crick (whom he was married to for 55 years) was an artist, and drew the famous schematic drawing of the double helix published in Nature in 1953.

Stories of friendships

Although Cobb misses this fascinating comparison with Einstein, many other vivid stories light up his book. For example, he recounts Watson’s claim that just after their success with DNA in 1953, “Francis winged into the Eagle [their local pub in Cambridge] to tell everyone within hearing distance that we had found the secret of life” – a story that later appeared on a plaque outside the pub.

“Francis always denied he said anything of the sort,” notes Cobb, “and in 2016, at a celebration of the centenary of Crick’s birth, Watson publicly admitted that he had made it up for dramatic effect (a few years earlier, he had confessed as much to Kindra Crick, Francis’s granddaughter).” No wonder Watson’s much-read 1968 book The Double Helix caused a furious reaction from Crick and a temporary breakdown in their friendship, as Cobb dissects in excoriating detail.

Watson’s deprecatory comments on Franklin helped to provoke the current widespread belief that Crick and Watson succeeded by stealing Franklin’s data. After an extensive analysis of the available evidence, however, Cobb argues that the data was willingly shared with them by Franklin, but that they should have formally asked her permission to use it in their published work – “Ambition, or thoughtlessness, stayed their hand.”

In fact, it seems Crick and Franklin were friends in 1953, and remained so – with Franklin asking Crick for his advice on her draft scientific papers – until her premature death from ovarian cancer in 1958. Indeed, after her first surgery in 1956, Franklin went to stay with Crick and his wife at their house in Cambridge, and then returned to them after her second operation. There certainly appears to be no breakdown in trust between the two. When Crick was nominated for the Nobel prize in 1961, he openly stated, “The data which really helped us obtain the structure was mainly obtained by Rosalind Franklin.”

As for Crick’s later study of consciousness, Cobb comments, “It would be easy to dismiss Crick’s switch to studying the brain as the quixotic project of an ageing scientist who did not know his limits. After all, he did not make any decisive breakthrough in understanding the brain – nothing like the double helix… But then again, nobody else did, in Crick’s lifetime or since.” One is perhaps reminded once again of Einstein, and his preoccupation during later life with his unified field theory, which remains an open line of research today.

  • 2025 Profile Books £30.00 hb 595pp


A theoretical physicist’s journey through the food and drink industry

6 January 2026, 12:00

Rob Farr is a theorist and computer modeller whose career has taken him down an unconventional path. He studied physics at the University of Cambridge, UK, from 1991 to 1994, staying on to do a PhD in statistical physics. But while many of his contemporaries then went into traditional research fields – such as quantum science, high-energy physics and photonic technologies – Farr got a taste for the food and drink manufacturing industry. It’s a multidisciplinary field in which Farr has worked for more than 25 years.

After leaving academia in 1998, his first stop was Unilever’s €13bn foods division. For two decades, latterly as a senior scientist, Farr guided R&D teams working across diverse lines of enquiry – “doing the science, doing the modelling”, as he puts it. Along the way, Farr worked on all manner of consumer products including ice-cream, margarine and non-dairy spreads, as well as “dry” goods such as bouillon cubes. There was also the occasional foray into cosmetics, skin creams and other non-food products.

As a theoretical physicist working in industrial-scale food production, Farr’s focus has always been on the materials science of the end-product and how it gets processed. “Put simply,” says Farr, “that means making production as efficient as possible – regarding both energy and materials use – while developing ‘new customer experiences’ in terms of food taste, texture and appearance.” 

Ice-cream physics

One tasty multiphysics problem that preoccupied Farr for a good chunk of his time at Unilever is ice cream. It is a hugely complex material that Farr likens to a high-temperature ceramic, in the sense that the crystalline part of it is stored very near to the melting point of ice. “Equally, the non-ice phase contains fats,” he says, “so there’s all sorts of emulsion physics and surface science to take into consideration.”

Ice cream also has polymers in the mix, so theoretical modelling needs to incorporate the complex physics of polymer–polymer phase separation as well as polymer flow, or “rheology”, which contributes to the product’s texture and material properties. “Air is another significant component of ice cream,” adds Farr, “which means it’s a foam as well as an emulsion.”

As well as trying to understand how all these subcomponents interact, there’s also the thorny issue of storage. After it’s produced, ice cream is typically kept at low temperatures of about –25 °C – first in the factory, then in transit and finally in a supermarket freezer. But once that tub of salted-caramel or mint choc chip reaches a consumer’s home, it’s likely to be popped in the ice compartment of a fridge freezer at a much milder –6 or –7 °C.

Manufacturers therefore need to control how those temperature transitions affect the recrystallization of ice. This unwanted outcome can lead to phenomena like “sintering” (which makes a harder product) and “ripening” (which can lead to big ice crystals that can be detected in the mouth and detract from the creamy texture).

“Basically, the whole panoply of soft-matter physics comes into play across the production, transport and storage of ice cream,” says Farr. “Figuring out what sort of materials systems will lead to better storage stability or a more consistent product texture are non-trivial questions given that the global market for ice cream is worth in excess of €100bn annually.”

A shot of coffee?

After almost 20 years working at Unilever, in 2017 Farr took up a role as coffee science expert at JDE Peet’s, the Dutch multinational coffee and tea company. Switching from the chilly depths of ice cream science to the dark arts of coffee production and brewing might seem like a steep career phase change, but the physics of the former provides a solid bridge to the latter.

The overlap is evident, for example, in how instant coffee gets freeze-dried – a low-temperature dehydration process that manufacturers use to extend the shelf-life of perishable materials and make them easier to transport. In the case of coffee, freeze drying (known technically as lyophilization) also helps to retain flavour and aromas.


After roasting and grinding the raw coffee beans, manufacturers extract a coffee concentrate using high pressure and water. This extract is then frozen, ground up and placed in a vacuum well below 0 °C. A small amount of heat is applied to sublime the ice away and remove the remaining water from the non-ice phase.

The quality of the resulting freeze-dried instant coffee is better than ordinary instant coffee. However, freeze-drying is also a complex and expensive process, which manufacturers seek to fine-tune by implementing statistical methods to optimize, for example, the amount of energy consumed during production.

Such approaches involve interpolating the gaps between existing experimental data sets, which is where a physics mind-set comes in. “If you want to study a parameter space that’s not been explored before,” says Farr, “the only way to do that is to simulate the core processes using fundamental physics.”
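Interpolating a sparse experimental parameter space with a surrogate model, then searching the surrogate for an optimum, is a standard way to do this kind of optimization. The sketch below is purely illustrative – the freeze-drying numbers are invented, and a quadratic fit is just the simplest surrogate one could choose, not anything JDE Peet’s is known to use:

```python
import numpy as np

# Invented example: interpolate a handful of freeze-drying trials to
# explore untested settings (numbers are made up for illustration).
pressure = np.array([10.0, 30.0, 60.0, 100.0])  # chamber pressure (Pa)
energy   = np.array([8.2,  6.1,  5.4,  5.9])    # kWh per kg of extract

coeffs = np.polyfit(pressure, energy, 2)        # quadratic surrogate model
grid = np.linspace(10, 100, 181)                # fill the gaps at 0.5 Pa steps
predicted = np.polyval(coeffs, grid)

best = grid[np.argmin(predicted)]               # estimated optimal pressure
print(round(best, 1))
```

In practice one would use a richer surrogate (Gaussian processes are common) and validate any predicted optimum with a confirmation experiment, since the model is only as good as the data it interpolates.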

Beyond the production line, Farr has also sought to make coffee more stable when it’s stored at home. Sustainability is the big driver here: JDE Peet’s has committed to make all its packaging compostable, recyclable or reusable by 2030. “Shelf-life prediction has been a big part of this R&D initiative,” he explains. “The work entails using materials science and the physics of mass transfer to develop next-generation packaging and container systems.”

Line of sight

After eight years unpacking the secrets of coffee physics at JDE Peet’s, Farr was given the option to relocate to the Netherlands in mid-2025 as part of a wider reorganization of the manufacturer’s corporate R&D function. However, he decided to stay put in Oxford and is now deciding between another role in the food-manufacturing sector and a move into a new area of research, such as nuclear energy or even education.

Cool science “The whole panoply of soft-matter physics comes into play across the production, transport and storage of ice-cream,” says industrial physicist Rob Farr. (Courtesy: London Institute for Mathematical Sciences)

Farr believes he gained a lot from his time at JDE Peet’s. As well as studying a wide range of physics problems, he also benefited from the company’s rigorous approach to R&D, whereby projects are regularly assessed for profitability and quickly killed off if they don’t make the cut. Such prioritization avoids wasted effort and investment, but it also demands agility from staff scientists, who have to build long-term research strategies against a project landscape in constant flux.


To thrive in that setting, Farr says collaboration and an open mind are essential. “A senior scientist needs to be someone who colleagues come to informally to discuss their technical challenges,” he says. “You can then find the scientific question which underpins seemingly disparate problems and work with colleagues to deliver commercially useful solutions.” For Farr, it’s a self-reinforcing dynamic. “As more people come to you, the more helpful you become – and I love that way of working.”

What Farr calls “line-of-sight” is another unique feature of industrial R&D in food materials. “Maybe you’re only building one span of a really long bridge,” he notes, “but when you can see the process end-to-end, as well as your part in it, that is a fantastic motivator.” Indeed, Farr believes that for physicists who want a job doing something useful, the physics of food materials makes a great career. “There are,” he concludes, “no end of intriguing and challenging research questions.”


Band-aid-like wearable sensor continuously monitors foetal movement

5 January 2026, 15:00

Multimodal monitoring Pressure and strain sensors on a clinical trial volunteer undergoing an ultrasound scan (left). Snapshot image of the ultrasound video recording (right). (Courtesy: Yap et al., Sci. Adv. 11 eady2661)

The ability to continuously monitor and interpret foetal movement patterns in the third trimester of a pregnancy could help detect any potential complications and improve foetal wellbeing. Currently, however, such assessment of foetal movement is performed only periodically, with an ultrasound exam at a hospital or clinic.

A lightweight, easily wearable, adhesive patch-based sensor developed by engineers and obstetricians at Monash University in Australia may change this. The two patches, worn on the abdomen, can detect foetal movements such as kicking, waving, hiccups, breathing, twitching, and head and trunk motion.

Reduced foetal movement can be associated with potential impairment in the central nervous system and musculoskeletal system, and is a common feature observed in pregnancies that end in foetal death and stillbirth. A foetus compromised in utero may reduce movements as a compensatory strategy to lower oxygen consumption and conserve energy.

To help identify foetuses at risk of complications, the Monash team developed an artificial intelligence (AI)-powered wearable pressure–strain combo sensor system that continuously and accurately detects foetal movement-induced motion in the mother’s abdominal skin. As reported in Science Advances, the “band-aid”-like sensors can discriminate between foetal and non-foetal movement with over 90% accuracy.

The system comprises two soft, thin and flexible patches designed to conform to the abdomen of a pregnant woman. One patch incorporates an octagonal gold nanowire-based strain sensor (the “Octa” sensor), while the other is an interdigitated electrode-based pressure sensor.

Pressure and strain combo Photograph of the sensors on a pregnant mother (A). Exploded illustration of the foetal kicks strain sensor (B) and the pressure sensor (C). Dimensions of the strain (D) and pressure (E) sensors. (Courtesy: Yap et al., Sci. Adv. 11 eady2661)

The patches feature a soft polyimide-based flexible printed circuit (FPC) that integrates a thin lithium polymer battery and various integrated circuit chips, including a Bluetooth radiofrequency system for reading the sensor’s electrical resistance, storing data and communicating with a smartphone app. Each patch is encapsulated with kinesiology tape and sticks to the abdomen using a medical double-sided silicone adhesive.

The Octa sensor connects to the primary device via a separate FPC connector, enabling easy replacement after each study. The pressure sensor is mounted on the silicone adhesive, to connect with the interdigitated electrode beneath the primary device. The Octa and pressure sensor patches are lightweight (about 3 g) and compact, measuring 63 x 30 x 4 mm and 62 x 28 x 2 mm, respectively.

Trialling the device

The researchers validated their foetal movement monitoring system via comparison with simultaneous ultrasound exams, examining 59 healthy pregnant women at Monash Health. Each participant had the pressure sensor attached to the area of their abdomen where they felt the most vigorous foetal movements, typically in the lower quadrant, while the strain sensor was attached to the region closest to foetal limbs. An accelerometer placed on the participant’s chest captured non-foetal movement data for signal denoising and training the machine-learning model.

Principal investigator Wenlong Cheng, now at the University of Sydney, and colleagues report that “the wearable strain sensor featured isotropic omnidirectional sensitivity, enabling detection of maternal abdominal [motion] over a large area, whereas the wearable pressure sensor offered high sensitivity with a small domain, advantageous for accurate localized foetal movement detection”.

The researchers note that the pressure sensor demonstrated higher sensitivity to movements directly beneath it compared with motion farther away, while the Octa sensor performed consistently across a wider sensing area. “The combination of both sensor types resulted in a substantial performance enhancement, yielding an overall AUROC [area under the receiver operating characteristic curve] accuracy of 92.18% in binary detection of foetal movement, illustrating the potential of combining diverse sensing modalities to achieve more accurate and reliable monitoring outcomes,” they write.
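An AUROC figure like the one quoted has a simple interpretation: it is the probability that the classifier scores a randomly chosen true foetal-movement epoch above a randomly chosen non-movement epoch. The sketch below illustrates that rank-based calculation with invented scores and labels – it has no relation to the authors’ actual pipeline:

```python
def auroc(scores, labels):
    """Area under the ROC curve via the Mann-Whitney statistic: the
    probability that a randomly chosen positive gets a higher score
    than a randomly chosen negative (ties count half)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Invented example: scores from a hypothetical combined pressure+strain
# classifier; label 1 = ultrasound-confirmed foetal movement, 0 = none.
scores = [0.9, 0.8, 0.7, 0.6, 0.4, 0.3, 0.2, 0.1]
labels = [1,   1,   0,   1,   1,   0,   0,   0]
print(auroc(scores, labels))  # → 0.875
```

Because AUROC depends only on the ranking of scores, it is a natural way to compare single-sensor models against the combined pressure-plus-strain model on the same held-out epochs.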

In a press statement, co-author Fae Marzbanrad explains that the device’s strength lies in a combination of soft sensing materials, intelligent signal processing and AI. “Different foetal movements create distinct strain patterns on the abdominal surface, and these are captured by the two sensors,” she says. “The machine-learning system uses the signals to detect when movement occurs while cancelling maternal movements.”

The lightweight and flexible device can be worn by pregnant women for long periods without disrupting daily life. “By integrating sensor data with AI, the system automatically captures a wider range of foetal movements than existing wearable concepts while staying compact and comfortable,” Marzbanrad adds.

The next steps towards commercialization of the sensors will include large-scale clinical studies in out-of-hospital settings, to evaluate foetal movements and investigate the relationship between movement patterns and pregnancy complications.


Unlocking novel radiation beams for cancer treatment with upright patient positioning

5 January 2026, 13:27

Since the beginning of radiation therapy, almost all treatments have been delivered with the patient lying on a table while the beam rotates around them. But a resurgence in upright patient positioning is changing that paradigm. Novel treatment systems – such as proton therapy, very high energy electron (VHEE) and FLASH therapy machines – are often too large to rotate around the patient, limiting access to these beams. By rotating the patient instead, such previously hard-to-access beams could become mainstream in the future.

Join leading clinicians and experts as they discuss how this shift in patient positioning is enabling exploration of new treatment geometries and supporting the development of advanced future cancer therapies.

L-R Serdar Charyyev, Eric Deutsch, Bill Loo, Rock Mackie

Novel beams covered and their representative speakers

Serdar Charyyev – Proton Therapy – Clinical Assistant Professor at Stanford University School of Medicine
Eric Deutsch – VHEE FLASH – Head of Radiotherapy at Gustave Roussy
Bill Loo – FLASH Photons – Professor of Radiation Oncology at Stanford Medicine
Rock Mackie – Emeritus Professor at University of Wisconsin and Co-Founder and Chairman of Leo Cancer Care


The environmental and climate cost of war

2 January 2026, 12:00

Despite not being close to the frontline of Russia’s military assault on Ukraine, life at the Ivano-Frankivsk National Technical University of Oil and Gas is far from peaceful. “While we continue teaching and research, we operate under constant uncertainty – air raid alerts, electricity outages – and the emotional toll on staff and students,” says Lidiia Davybida, an associate professor of geodesy and land management.

Last year the university was the target of a Russian missile strike, which caused extensive damage to buildings that has still not been fully repaired – although, fortunately, no casualties were reported. The university also continues to lose staff and students to the war effort – some of whom will tragically never return – while new student numbers dwindle as many school graduates leave Ukraine to study abroad.

Despite these major challenges, Davybida and her colleagues remain resolute. “We adapt – moving lectures online when needed, adjusting schedules, and finding ways to keep research going despite limited opportunities and reduced funding,” she says.

Resolute research

Davybida’s research focuses on environmental monitoring using geographic information systems (GIS), geospatial analysis and remote sensing. She has been using these techniques to monitor the devastating impact that the war is having on the environment and its significant contribution to climate change.

In 2023 she published results from using Sentinel-5P satellite data and Google Earth Engine to monitor the air-quality impacts of the war on Ukraine (IOP Conf. Ser.: Earth Environ. Sci. 1254 012112). Her results reveal that, as during the worldwide COVID-19 lockdowns, levels of common pollutants such as carbon monoxide, nitrogen dioxide and sulphur dioxide were, on average, down from pre-invasion levels. This reflects the temporary disruption that the war has brought to economic activity in the country.

Wider consequences Ukrainian military, emergency services and volunteers work together to rescue people from a large flooded area in Kherson on 8 June 2023. Two days earlier, the Russian army blew up the dam of the Kakhovka hydroelectric power station, meaning about 80 settlements in the flood zone had to be evacuated. (Courtesy: Sergei Chuzavkov/SOPPA Images/Shutterstock)

More worrying, from an environment and climate perspective, were the huge concentrations of aerosols, smoke and dust in the atmosphere. “High ozone concentrations damage sensitive vegetation and crops,” Davybida explains. “Aerosols generated by explosions and fires may carry harmful substances such as heavy metals and toxic chemicals, further increasing environmental contamination.” She adds that these pollutants can alter sunlight absorption and scattering, potentially disrupting local climate and weather patterns, and contributing to long-term ecological imbalances.

A significant toll has been wrought by individual military events too. A prime example is Russia’s destruction of the Kakhovka Dam in southern Ukraine in June 2023. An international team – including Ukrainian researchers – recently attempted to quantify this damage by combining on-the-ground field surveys, remote-sensing data and hydrodynamic modelling, a tool for predicting water flow and pollutant dispersion.

The results of this work are sobering (Science 387 1181). Though 80% of the ecosystem is expected to re-establish itself within five years, the dam’s destruction released as much as 1.7 cubic kilometres of sediment contaminated by a host of persistent pollutants, including nitrogen, phosphorus and 83,000 tonnes of heavy metals. Discharging this toxic sludge across the land and waterways will have unknown long-term environmental consequences for the region, as the contaminants could be spread by future floods, the researchers concluded (figure 1).

1 Dam destruction

(Reused with permission from Science 387 1181 10.1126/science.adn8655)

This map shows areas of Ukraine affected or threatened by dam destruction in military operations. Arabic numerals 1 to 6 indicate rivers: Irpen, Oskil, Inhulets, Dnipro, Dnipro-Bug Estuary and Dniester, respectively. Roman numerals I to VII indicate large reservoir facilities: Kyiv, Kaniv, Kremenchuk, Kaminske, Dnipro, Kakhovka and Dniester, respectively. Letters A to C indicate nuclear power plants: Chornobyl, Zaporizhzhia and South Ukraine, respectively.
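Hydrodynamic models of the kind used in such studies couple real flow fields to pollutant transport; at their core sits the advection–diffusion equation. The deliberately minimal 1D sketch below illustrates that idea only – every parameter value is invented, and it is in no way a substitute for the study’s actual model:

```python
import numpy as np

# Minimal 1D advection-diffusion sketch of a contaminant slug in a river
# reach (all values invented; a real hydrodynamic model is far richer).
# dc/dt = -u * dc/dx + D * d2c/dx2, upwind advection, periodic boundaries.
nx, dx, dt = 200, 100.0, 10.0    # 20 km reach: 100 m cells, 10 s time steps
u, D = 1.0, 5.0                  # flow speed (m/s), diffusivity (m^2/s)

c = np.zeros(nx)
c[10:20] = 1.0                   # initial slug of contaminated sediment

for _ in range(1000):            # ~2.8 hours of transport
    adv = -u * (c - np.roll(c, 1)) / dx                         # upwind advection
    dif = D * (np.roll(c, -1) - 2 * c + np.roll(c, 1)) / dx**2  # diffusion
    c = c + dt * (adv + dif)

# The slug's peak drifts roughly u*t = 10 km (~100 cells) downstream,
# spreading as it goes; this periodic scheme conserves total mass.
print("plume peak at cell", int(np.argmax(c)))
```

The explicit scheme is stable here because the Courant number u·dt/dx = 0.1 and the diffusion number D·dt/dx² = 0.005 are both small; real models add 2D/3D flow, sediment dynamics and reaction chemistry on top of this skeleton.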

Dangerous data

A large part of the reason for the researchers’ uncertainty, and indeed more general uncertainty in environmental and climate impacts of war, stems from data scarcity. It is near-impossible for scientists to enter an active warzone to collect samples and conduct surveys and experiments. Environmental monitoring stations also get damaged and destroyed during conflict, explains Davybida – a wrong she is attempting to right in her current work. Many efforts to monitor, measure and hopefully mitigate the environmental and climate impact of the war in Ukraine are therefore less direct.

In 2022, for example, climate-policy researcher Mathijs Harmsen from the PBL Netherlands Environmental Assessment Agency and international collaborators decided to study the global energy crisis (which was sparked by Russia’s invasion of Ukraine) to look at how the war will alter climate policy (Environ. Res. Lett. 19 124088).

They did this by plugging in the most recent energy price, trade and policy data (up to May 2023) into an integrated assessment model that simulates the environmental consequences of human activities worldwide. They then imposed different potential scenarios and outcomes and let it run to 2030 and 2050. Surprisingly, all scenarios led to a global reduction of 1–5% of carbon dioxide emissions by 2030, largely due to trade barriers increasing fossil fuel prices, which in turn would lead to increased uptake of renewables.

But even though the sophisticated model represents the global energy system in detail, some factors are hard to incorporate and some actions can transform the picture completely, argues Harmsen. “Despite our results, I think the net effect of this whole war is a negative one, because it doesn’t really build trust or add to any global collaboration, which is what we need to move to a more renewable world,” he says. “Also, the recent intensification of Ukraine’s ‘kinetic sanctions’ [attacks on refineries and other fossil fuel infrastructure] will likely have a larger effect than anything we explored in our paper.”

Elsewhere, Toru Kobayakawa was, until recently, working for the Japan International Cooperation Agency (JICA), leading the Ukraine support team. Kobayakawa used a non-standard method to more realistically estimate the carbon footprint of reconstructing Ukraine when the war ends (Environ. Res.: Infrastruct. Sustain. 5 015015). The Intergovernmental Panel on Climate Change (IPCC) and other international bodies only account for carbon emissions within the territorial country. “The consumption-based model I use accounts for the concealed carbon dioxide from the production of construction materials like concrete and steel imported from outside of the country,” he says.

Using Eora26, an open-source database that tracks financial flows between countries’ major economic sectors in simple input–output tables, Kobayakawa calculated that Ukraine’s post-war reconstruction will amount to 741 million tonnes of carbon-dioxide equivalent over 10 years. This is 4.1 times Ukraine’s pre-war annual carbon-dioxide emissions, or the combined annual emissions of Germany and Austria.
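Consumption-based accounting of this kind rests on environmentally extended input–output analysis: total sectoral output x satisfies x = (I − A)⁻¹f for a technical-coefficient matrix A and final demand f, and emissions follow by weighting x with per-sector emission intensities. The sketch below uses a hypothetical two-sector economy with made-up numbers – not Eora26’s actual tables or Kobayakawa’s figures – purely to show the mechanics:

```python
# Minimal consumption-based emissions sketch using a two-sector
# input-output (Leontief) model. All numbers are illustrative, not
# taken from Eora26 or Kobayakawa's study.

def invert_2x2(m):
    """Invert a 2x2 matrix given as [[a, b], [c, d]]."""
    (a, b), (c, d) = m
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

def matvec(m, v):
    return [sum(m[i][j] * v[j] for j in range(2)) for i in range(2)]

# A[i][j]: input from sector i needed per unit output of sector j
# (sector 0 = construction, sector 1 = materials such as steel/cement)
A = [[0.1, 0.2],
     [0.3, 0.1]]

# Final demand: reconstruction spending lands mostly on construction
f = [100.0, 10.0]  # arbitrary monetary units

# Leontief inverse L = (I - A)^-1, total output x = L f
I_minus_A = [[1 - A[0][0], -A[0][1]],
             [-A[1][0], 1 - A[1][1]]]
x = matvec(invert_2x2(I_minus_A), f)

# Emission intensities per unit output (illustrative)
e = [0.5, 2.0]  # tonnes CO2-eq per monetary unit

total = sum(e[i] * x[i] for i in range(2))
print(round(total, 1))
```

The point of the Leontief inverse is that demand on construction pulls in output (and emissions) from the materials sector too, wherever it is produced – exactly the "concealed" upstream emissions that territorial accounting misses.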

However, as with most war-related findings, these figures come with a caveat. “Our input–output model doesn’t take into account the current situation,” notes Kobayakawa. “It is the worst-case scenario.” Nevertheless, the research has provided useful insights, such as that the Ukrainian construction industry will account for 77% of total emissions.

“Their construction industry is notorious for inefficiency, needing frequent rework, which incurs additional costs, as well as additional carbon-dioxide emissions,” he says. “So, if they can improve efficiency by modernizing construction processes and implementing large-scale recycling of construction materials, that will contribute to reducing emissions during the reconstruction phase and ensure that they build back better.”

Military emissions gap

As the experiences of Davybida, Harmsen and Kobayakawa show, cobbling together relevant and reliable data in the midst of war is a significant challenge, from which only limited conclusions can be drawn. Researchers and policymakers need a fuller view of the environmental and climate cost of war if they are to improve matters once a conflict ends.

That’s certainly the view of Benjamin Neimark, who studies geopolitical ecology at Queen Mary University of London. For some time he has been trying to close what he sees as the biggest data gap preventing accurate estimates of the climate and environmental cost of war: military emissions. During the 2021 United Nations Climate Change Conference (COP26), for example, he and colleagues partnered with the Conflict and Environment Observatory (CEOBS) to launch The Military Emissions Gap, a website to track and trace what a country reports as its military emissions to the United Nations Framework Convention on Climate Change (UNFCCC).

At present, reporting military emissions is voluntary, so data are often absent or incomplete – but gathering such data is vital. According to a 2022 estimate extrapolated from the small number of nations that do share their data, the total military carbon footprint is approximately 5.5% of global emissions. This would make the world’s militaries the fourth biggest carbon emitter if they were a nation.

The website is an attempt to fill this gap. “We hope that the UNFCCC picks up on this and mandates transparent and visible reporting of military emissions,” Neimark says (figure 2).

2 Closing the data gap

Five sets of icons indicating categories of military and conflict-related carbon emissions
(Reused with permission from Neimark et al. 2025 War on the Climate: A Multitemporal Study of Greenhouse Gas Emissions of the Israel–Gaza Conflict. Available at SSRN)

Current United Nations Framework Convention on Climate Change (UNFCCC) greenhouse-gas emissions reporting obligations do not include all the possible types of conflict emissions, and there is no commonly agreed methodology or scope on how different countries collect emissions data. In a recent publication, War on the Climate: a Multitemporal Study of Greenhouse Gas Emissions of the Israel–Gaza Conflict, Benjamin Neimark et al. came up with this framework, using the UNFCCC’s existing protocols. These reporting categories cover militaries and armed conflicts, and aim to highlight previously “hidden” emissions.

Measuring the destruction

Beyond plugging the military emissions gap, Neimark is also involved in developing and testing methods that he and other researchers can use to estimate the overall climate impact of war. Building on foundational work from his collaborator, Dutch climate specialist Lennard de Klerk – who developed a methodology for identifying, classifying and providing ways of estimating the various sources of emissions associated with the Russia–Ukraine war – Neimark and colleagues are trying to estimate the greenhouse-gas emissions from the Israel–Gaza conflict.

Their studies encompass pre-conflict preparation, the conflict itself and post-conflict reconstruction. “We were working with colleagues who were doing similar work in Ukraine, but every war is different,” says Neimark. “In Ukraine, they don’t have large tunnel networks, or they didn’t, and they don’t have this intensive, incessant onslaught of air strikes from carbon-intensive F16 fighter aircraft.” Some of these factors, like the carbon impact of Hamas’ underground maze of tunnels under Gaza, seem unquantifiable, but Neimark has found a way.

“There’s some pretty good data for how big these are in terms of height, the amount of concrete, how far down they’re dug and how thick they are,” says Neimark. “It’s just the length we had to work out based on reported documentation.” Finding the total amount of concrete and steel used in these tunnels involved triangulating open-source information with media reports to finalize an estimate of the dimensions of these structures. Standard emission factors could then be applied to obtain the total carbon emissions. According to data from Neimark’s Confronting Military Greenhouse Gas Emissions report, the carbon emissions from construction of concrete infrastructure by both Israel and Hamas were more than the annual emissions of 33 individual countries and territories (figure 3).

3 Climate change and the Gaza war

Three lists of headline facts and figures about carbon emissions from the Israel–Gaza war, split into direct military actions, large war-related infrastructure, and future rebuilding
(Reused with permission from Neimark et al. 2024 Confronting Military Greenhouse Gas Emissions, Interactive Policy Brief, London, UK. Available from QMUL.)

Data from Benjamin Neimark, Patrick Bigger, Frederick Otu-Larbi and Reuben Larbi’s Confronting Military Greenhouse Gas Emissions report estimates the carbon emissions of the war in Gaza for three distinct periods: direct war activities; large-scale war infrastructure; and future reconstruction.

The impact of Hamas’ tunnels and Israel’s “iron wall” border fence are just two of many pre-war activities that must be factored in to estimate the Israel–Gaza conflict’s climate impact. Then, the huge carbon cost of the conflict itself must be calculated, including, for example, bombing raids, reconnaissance flights, tanks and other vehicles, cargo flights and munitions production.

Gaza’s eventual reconstruction, which makes up a big proportion of the total impact of the war (as Kobayakawa’s Ukraine calculations showed), must also be included. The United Nations Environment Programme (UNEP) has been systematically studying and reporting on “Sustainable debris management in Gaza”, tracking debris from damaged buildings and infrastructure in Gaza since the outbreak of the conflict in October 2023. Alongside estimating the amounts of debris, UNEP also models different management scenarios – ranging from disposal to recycling – to evaluate the time, resource needs and environmental impacts of each option.

Visa restrictions and the security situation have prevented UNEP staff from entering the Gaza strip to undertake environmental field assessments to date. “While remote sensing can provide a valuable overview of the situation … findings should be verified on the ground for greater accuracy, particularly for designing and implementing remedial interventions,” says a UNEP spokesperson. They add that when it comes to the issue of contamination, UNEP needs “confirmation through field sampling and laboratory analysis” and that UNEP “intends to undertake such field assessments once conditions allow”.

The main risk from hazardous debris – which is likely to make up about 10–20% of the total debris – arises when it is mixed with and contaminates the rest of the debris stock. “This underlines the importance of preventing such mixing and ensuring debris is systematically sorted at source,” adds the UNEP spokesperson.

The ultimate cost

With all these estimates, and adopting a Monte Carlo analysis to account for uncertainties, Neimark and colleagues concluded that, from the first 15 months of the Israel–Gaza conflict, total carbon emissions were 32 million tonnes, which is huge given that the territory has a total area of just 365 km². The number also continues to rise.
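A Monte Carlo analysis of this kind can be sketched in a few lines: draw each uncertain emission category from an assumed distribution, sum the draws, and repeat many times to build up a distribution for the total. The categories, means and spreads below are invented for illustration and are not the inputs Neimark and colleagues used:

```python
# Toy Monte Carlo sketch of how uncertain emission categories can be
# combined into a total with a confidence interval. Category means and
# spreads are made up for illustration, not taken from the study.
import random

random.seed(42)

# (mean, standard deviation) in Mt CO2-eq -- hypothetical values
categories = {
    "direct military activity": (10.0, 2.0),
    "war infrastructure": (5.0, 1.5),
    "reconstruction": (17.0, 4.0),
}

N = 100_000
totals = []
for _ in range(N):
    # Draw each category independently; clamp at zero since
    # negative emissions make no sense here
    totals.append(sum(max(0.0, random.gauss(mu, sd))
                      for mu, sd in categories.values()))

totals.sort()
mean = sum(totals) / N
lo, hi = totals[int(0.025 * N)], totals[int(0.975 * N)]
print(f"mean {mean:.1f} Mt, 95% interval {lo:.1f}-{hi:.1f} Mt")
```

Reporting the spread alongside the headline number is the whole point: with war data this scarce, the interval is often as informative as the mean.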

Khan Younis in ruins
Rubble and ruins Khan Younis in the Gaza Strip on 11 February 2025, showing the widespread damage to buildings and infrastructure. (Courtesy: Shutterstock/Anas Mohammed)

Why does this number matter? When lives are being lost in Gaza, Ukraine, and across Sudan, Myanmar and other regions of the world, calculating the environmental and climate cost of war might seem like something only worth bothering about when the fighting stops.

But doing so even while conflicts are taking place can help protect important infrastructure and land, avoid environmentally disastrous events, and ensure the long rebuild, wherever the conflict may be happening, is informed by science. The UNEP spokesperson says that it is important to “systematically integrate environmental considerations into humanitarian and early recovery planning from the outset” rather than treating the environment as an afterthought. They highlight that governments should “embed it within response plans – particularly in areas where it can directly impact life-saving activities, such as debris clearance and management”.

With Ukraine still in the midst of war, it seems right to leave the final word to Davybida. “Armed conflicts cause profound and often overlooked environmental damage that persists long after the fighting stops,” she says. “Recognizing and monitoring these impacts is vital to guide practical recovery efforts, protect public health, prevent irreversible harm to ecosystems and ensure a sustainable future.”

The post The environmental and climate cost of war appeared first on Physics World.

Exploring the icy moons of the solar system

30 décembre 2025 à 12:00

Our blue planet is a Goldilocks world. Earth lies at just the right distance from the Sun that – like Baby Bear’s porridge – it is neither too hot nor too cold, allowing our planet to be bathed in oceans of liquid water. But further out in our solar system are icy moons that eschew the Goldilocks principle, maintaining oceans and possibly even life far from the Sun.

We call them icy moons because their surface, and part of their interior, is made of solid water-ice. There are over 400 icy moons in the solar system – most are teeny moonlets just a few kilometres across, but a handful are quite sizeable, from hundreds to thousands of kilometres in diameter. Of the big ones, the best known are Jupiter’s moons, Europa, Ganymede and Callisto, and Saturn’s Titan and Enceladus.

Yet these moons are more than just ice. Deep beneath their frozen shells – with surfaces at –160 to –200 °C and bathed in radiation – lie oceans of water, kept liquid thanks to tidal heating as their interiors flex in the strong gravitational grip of their parent planets. With water being a prerequisite for life as we know it, these frigid worlds are our best chance of finding life beyond Earth.

The first hints that these icy moons could harbour oceans of liquid water came when NASA’s Voyager 1 and 2 missions flew past Jupiter in 1979. On Europa they saw a broken and geologically youthful-looking surface, just millions of years old, featuring dark cracks that seemed to have slushy material welling up from below. Those hints turned into certainty when NASA’s Galileo mission visited Jupiter between 1995 and 2003. Gravity and magnetometer experiments proved that not only does Europa contain a liquid layer, but so do Ganymede and Callisto.

Meanwhile at Saturn, NASA’s Cassini spacecraft (which arrived in 2004) encountered disturbances in the ringed planet’s magnetic field. They turned out to be caused by plumes of water vapour erupting out of giant fractures splitting the surface of Enceladus, and it is believed that this vapour originates from an ocean beneath the moon’s ice shell. Evidence for an ocean on Titan is a little less certain, but gravity and radio measurements performed by Cassini and its European-built lander Huygens point towards the possibility of some liquid or slushy water beneath the surface.

Water, ice and JUICE

“All of these ocean worlds are going to be different, and we have to go to all of them to understand the whole spectrum of icy moons,” says Amanda Hendrix, director of the Planetary Science Institute in Arizona, US. “Understanding what their oceans are like can tell us about habitability in the solar system and where life can take hold and evolve.”

To that end, an armada of spacecraft will soon be on their way to the icy moons of the outer planets, building on the successes of their predecessors Voyager, Galileo and Cassini–Huygens. Leading the charge is NASA’s Europa Clipper, which is already heading to Jupiter. Clipper will reach its destination in 2030, with the Jupiter Icy moons Explorer (JUICE) from the European Space Agency (ESA) just a year behind it. Europa is scientists’ primary target because its “astrobiological potential” makes it possibly Jupiter’s most interesting moon. That’s the view of Olivier Witasse, JUICE project scientist at ESA, and it’s why Europa Clipper will perform nearly 50 fly-bys of the icy moon, some as low as 25 km above the surface. JUICE will also visit Europa twice on its tour of the Jovian system.

The challenge at Europa is that it’s close enough to Jupiter to be deep inside the giant planet’s magnetosphere, which is loaded with high-energy charged particles that bathe the moon’s surface in radiation. That’s why Clipper and JUICE are limited to fly-bys; the radiation dose in orbit around Europa would be too great to linger. Clipper’s looping orbit will take it back out to safety each time. Meanwhile, JUICE will focus more on Callisto and Ganymede – which are both farther out from Jupiter than Europa is – and will eventually go into orbit around Ganymede.

“Ganymede is a super-interesting moon,” says Witasse. For one thing, at 5262 km across it is larger than Mercury, a planet. It also has its own intrinsic magnetic field – one of only three solid bodies in the solar system to do so (the others being Mercury and Earth).

Beneath the icy exterior

It’s the interiors of these moons that are of the most interest to JUICE and Clipper. That’s where the oceans are, hidden beneath many kilometres of ice. While the missions won’t be landing on the Jovian moons, these internal structures aren’t as inaccessible as we might at first think. In fact, there are three independent methods for probing them.

A cross section of Europa
Many layers A cross section of Jupiter’s moon Europa, showing its internal layering: a rocky core and ocean floor (possibly with hydrothermal vents), the ocean itself and the ice shell above. (Courtesy: NASA/JPL–Caltech)

If a moon’s ocean contains salts or other electrically conductive contaminants, interesting things happen as the moon passes through its parent planet’s varying magnetic field. “The liquid is a conductive layer within a varying magnetic field and that induces a magnetic field in the ocean that we can measure with a magnetometer using Faraday’s law,” says Witasse. The amount of salty contaminants, plus the depth of the ocean, influences the magnetometer readings.

Then there’s radio science – the way that an icy moon’s mass bends a radio signal from a spacecraft to Earth. By making multiple fly-bys with different trajectories during different points in a moon’s orbit around its planet, the moon’s gravity field can be measured. Once that is known to exacting detail, it can be applied to models of that moon’s internal structure.

Perhaps the most remarkable method, however, is using a laser altimeter to search for a tidal bulge in the surface of a moon. This is exactly what JUICE will be doing when in orbit around Ganymede. Its laser altimeter will map features of the surface – such as hills and crevasses – but gravitational tidal forces from Jupiter are also expected to cause a bulge, deforming the surface by 1–10 m. How large the bulge is depends upon how deep the ocean is.

“If the surface ice is sitting above a liquid layer then the tide will be much bigger because if you sit on liquid, you are not attached to the rest of the moon,” says Witasse. “Whereas if Ganymede were solid the tide would be quite small because it is difficult to move one big, solid body.”

As for what’s below the oceans, those same gravity and radio-science experiments during previous missions have given us a general idea about the inner structures of Jupiter’s Europa, Ganymede and Callisto. All three have a rocky core. Inside Europa, the ocean surrounds the core, with a ceiling of ice above it. The rock–ocean interface potentially provides a source of chemical energy and nutrients for the ocean and any life there.

Ganymede’s interior structure is more complex. Separating the 3400 km-wide rocky core and the ocean is a layer, or perhaps several layers, of high-pressure ice, and there is another ice layer above the ocean. Without that rock–ocean interface, Ganymede is less interesting from an astrobiological perspective.

Meanwhile, Callisto, being the farthest from Jupiter, receives the least tidal heating of the three. This is reflected in Callisto’s lack of evolution, with its interior having not differentiated into layers as distinct as those of Europa and Ganymede. “Callisto looks very old,” says Witasse. “We’re seeing it more or less as it was at the beginning of the solar system.”

Crazy cryovolcanism

Tidal forces don’t just keep the interiors of the icy moons warm. They can also drive dramatic activity, such as cryovolcanoes – icy eruptions that spew out gases and volatile materials like liquid water (which quickly freezes in space), ammonia and hydrocarbons. The most obvious example of this is found on Saturn’s Enceladus, where giant water plumes squirt out through “tiger stripe” cracks at the moon’s south pole.

But there’s also growing evidence of cryovolcanism on Europa. In 2012 the Hubble Space Telescope caught sight of what looked like a water plume jetting out 200 km from the moon. But the discovery remains controversial, despite more data from Hubble and even supporting evidence found in archive data from the Galileo mission. What’s missing is cast-iron proof for Europa’s plumes. That’s where Clipper comes in.

Three of Jupiter’s moons
By Jove Three of Jupiter’s largest moons have solid water-ice. (Left) Europa, imaged by the JunoCam on NASA’s Juno mission to Jupiter. The surface sports myriad fractures and dark markings. (Middle) Ganymede, also imaged by the Juno mission, is the largest moon in our solar system. (Right) Our best image of ancient Callisto was taken by NASA’s Galileo spacecraft in 2001. The arrival of JUICE in the Jovian system in 2031 will place Callisto under much-needed scrutiny. (CC BY 3.0 NASA/JPL–Caltech/SwRI/MSS/ image processing by Björn Jónsson; CC BY 3.0 NASA/JPL–Caltech/SwRI/MSS/ image processing by Kalleheikki Kannisto; NASA/JPL/DLR)

“We need to find out if the plumes are real,” says Hendrix. “What we do know is if there is plume activity happening on Europa then it’s not as consistent or ongoing as is clearly happening at Enceladus.”

At Enceladus, the plumes are driven by tidal forces from Saturn, which squeeze and flex the 500 km-wide moon’s innards, forcing out water from an underground ocean through the tiger stripes. If there are plumes at Europa then they would be produced the same way, and would provide access to material from an ocean that’s dozens of kilometres below the icy crust. “I think we have a lot of evidence that something is happening at Europa,” says Hendrix.

These plumes could therefore be the key to characterizing the hidden oceans. One instrument on Clipper that will play an important role in investigating the plumes at Europa is an ultraviolet spectrometer – an approach that proved very useful on the Cassini mission.

Because Enceladus’ plumes were not known until Cassini discovered them, the spacecraft’s instruments had not been designed to study them. However, scientists were able to use the mission’s ultraviolet imaging spectrometer to analyse the vapour when it was between Cassini and the Sun. The resulting absorption lines in the spectrum showed the plumes to be mostly pure water, ejected into space at a rate of 200 kg per second.

Black and white image of liquid eruptions from a moon's surface
Ocean spray Geysers of water vapour loaded with salts and organic molecules spray out from the tiger stripes on Enceladus. (Courtesy: NASA/JPL/Space Science Institute)

The erupted vapour freezes as it reaches space and some of it snows back down onto the surface. Cassini’s ultraviolet spectrometer was again used, this time to detect solar ultraviolet light reflected and scattered off these icy particles in the uppermost layers of Enceladus’ surface. Scientists found that any freshly deposited snow from the plumes has a different chemistry from older surface material that has been weathered and chemically altered by micrometeoroids and radiation, and therefore a different ultraviolet spectrum.

Icy moon landing

Another two instruments that Cassini’s scientists adapted to study the plumes were the cosmic dust analyser, and the ion and neutral mass spectrometer. When Cassini flew through the fresh plumes and Saturn’s E-ring, which is formed from older plume ejections, it could “taste” the material by sampling it directly. Recent findings from this data indicate that the plumes are rich in salt as well as organic molecules, including aliphatic and cyclic esters and ethers (oxygen-bearing organic compounds) (Nature Astron. 9 1662). Scientists also found nitrogen- and oxygen-bearing compounds that play a role in basic biochemistry and which could therefore potentially be building blocks of prebiotic molecules or even life in Enceladus’ ocean.

Direct image of Enceladus showing blue stripes
Blue moon Enceladus, as seen by Cassini in 2006. The tiger stripes are the blue fractures towards the south. (Courtesy: NASA/JPL/Space Science Institute)

While Cassini could only observe Enceladus’ plumes and fresh snow from orbit, astronomers are planning a lander that could let them directly inspect the surface snow. Currently in the technology development phase, it would be launched by ESA sometime in the 2040s to arrive at the moon in 2054, when winter at Enceladus’ southern, tiger stripe-adorned pole turns to spring and daylight returns.

“What makes the mission so exciting to me is that although it looks like every large icy moon has an ocean, Enceladus is one where there is a very high chance of actually sampling ocean water,” says Jörn Helbert, head of the solar system section at ESA, and the science lead on the prospective mission.

The planned spacecraft will fly through the plumes with more sophisticated instruments than Cassini’s, designed specifically to sample the vapour (like Clipper will do at Europa). Yet adding a lander could get us even closer to the plume material. By landing close to the edge of a tiger stripe, a lander would dramatically increase the mission’s ability to analyse the material from the ocean in the form of fresh snow. In particular, it would look for biosignatures – evidence of the ocean being habitable, or perhaps even inhabited by microbes.

However, new research urges caution in drawing hasty conclusions about organic molecules present in the plumes and snow. While not as powerful as Jupiter’s, Saturn also has a magnetosphere filled with high-energy ions that bombard Enceladus. A recent laboratory study, led by Grace Richards of the Istituto Nazionale di Astrofisica e Planetologia Spaziale (IAPS-INAF) in Rome, found that when these ions hit surface-ice they trigger chemical reactions that produce organic molecules, including some that are precursors to amino acids, similar to what Cassini tasted in the plumes.

So how can we be sure that the organics in Enceladus’ plumes originate from the ocean, and not from radiation-driven chemistry on the surface? It is the same quandary for dark patches around cracks on the surface of Europa, which seem to be rich with organic molecules that could either originate via upwelling from the ocean below, or just from radiation triggering organic chemistry. A lander on Enceladus might solve not just the mystery of that particular moon, but provide important pointers to explain what we’re seeing on Europa too.

More icy companions

Enceladus is not Saturn’s only icy moon; there’s Titan too. As the ringed planet’s largest moon at 5150 km across, Titan (like Ganymede) is larger than Mercury. However, unlike the other moons in the solar system, Titan has a thick atmosphere rich in nitrogen and methane. The atmosphere is opaque, hiding the surface from orbiting spacecraft except at infrared and radar wavelengths, which means that getting below the smoggy atmosphere is a must.

ESA did this in 2005 with the Huygens lander, which, as it parachuted down to Titan’s frozen surface, revealed it to be a land of hills and dune plains with river channels, lakes and seas of flowing liquid hydrocarbons. These organic molecules originate from the methane in its atmosphere reacting with solar ultraviolet.

Until recently, it was thought that Titan has a core of rock, surrounded by a shell of high-pressure ice, above which sits a layer of salty liquid water and then an outer crust of water ice. However, new evidence from re-analysing Cassini’s data suggests that rather than oceans of liquid water, Titan has “slush” below the frozen exterior, with pockets of liquid water (Nature 648 556). The team, led by Flavio Petricca from NASA’s Jet Propulsion Laboratory, looked at how Titan’s shape morphs as it orbits Saturn. There is a several-hour lag between the moon passing the peak of Saturn’s gravitational pull and its shape shifting, implying that while there must be some form of non-solid substance below Titan’s surface to allow for deformation, more energy is lost or dissipated than would be if it were liquid water. Instead, the researchers found that a layer of high-pressure ice close to its melting point – or slush – better fits the data.

Titan's atmosphere
Hello halo Titan is different to other icy moons in that it has a thick atmosphere, seen here with the moon in silhouette. (Courtesy: NASA/JPL/Space Science Institute)

To find out more about Titan, NASA is planning to follow in Huygens’ footsteps with the Dragonfly mission, but in an excitingly different way. Set to launch in 2028, Dragonfly should arrive at Titan in 2034, where the rotorcraft will fly over the moon’s surface, beneath the smog, occasionally touching down to take readings. Scientists are intending to use Dragonfly to sample surface material with a mass spectrometer to identify organic compounds and therefore better assess Titan’s biological potential. It will also perform atmospheric and geological measurements, even listening for seismic tremors while landed, which could provide further clues about Titan’s interior.

Jupiter and Saturn are also not the only planets to possess icy moons. We find them around Uranus and Neptune too. Even the dwarf planet Pluto and its largest moon Charon have strong similarities to icy moons. Whether any of these bodies, so far out from the Sun, can maintain an ocean is unclear, however.

Recent findings point to an ocean inside Uranus’ moon Ariel that may once have been 170 km deep, kept warm by tidal heating (Icarus 444 116822). But over time Ariel’s orbit around Uranus has become increasingly circular, weakening the tidal forces acting on it, and the ocean has partly frozen. Another of Uranus’ moons, Miranda, has a chaotic surface that appears to have melted and refrozen, and the pattern of cracks on its surface strongly suggests that the moon also contains an ocean, or at least did 150 million years ago. A new mission to Uranus is a top priority in the US’s most recent planetary-science decadal survey.

It’s becoming clear that icy ocean moons could far outnumber more traditional habitable planets like Earth, not just in our solar system, but across the galaxy (although none have been confirmed yet). Understanding the internal structures of the icy moons in our solar system, and characterizing their oceans, is vital if we are to expand the search for life beyond Earth.

The post Exploring the icy moons of the solar system appeared first on Physics World.

Check your physics knowledge with our bumper end-of-year quiz

24 décembre 2025 à 11:00

How well have you been following events in physics? There are 20 questions in total: blue is your current question and white means unanswered, with green and red being right and wrong.

16–20 Top quark – congratulations, you’ve hit Einstein level

11–15 Strong force – good but not quite Nobel standard

6–10 Weak force – better interaction needed

0–5 Bottom quark – not even wrong

The post Check your physics knowledge with our bumper end-of-year quiz appeared first on Physics World.

ZAP-X radiosurgery and ZAP-Axon SRS planning: technology overview, workflow and complex case insights from a leading SRS centre

24 décembre 2025 à 10:11

ZAP-X is a next-generation, cobalt-free, vault-free stereotactic radiosurgery system purpose-built for the brain. Delivering highly precise, non-invasive treatments with exceptionally low whole-brain and whole-body dose, ZAP-X’s gyroscopic beam delivery, refined beam geometry and fully integrated workflow enable state-of-the-art SRS without the burdens of radioactive sources or traditional radiation bunkers.

Theresa Hofman headshot
Theresa Hofman

Theresa Hofman is deputy head of medical physics at the European Radiosurgery Center Munich (ERCM), specializing in stereotactic radiosurgery with the CyberKnife and ZAP‑X systems. She has been part of the ERCM team since 2018 and has extensive clinical experience with ZAP‑X; ERCM was one of the first centres worldwide to implement the technology, in 2021. Since then, the team has treated more than 900 patients with ZAP‑X, and she is deeply involved in both clinical use and evaluation of its planning software.

She holds a master’s degree in physics from Ludwig Maximilian University of Munich, where she authored two first‑author publications on range verification in carbon‑ion therapy. At ERCM, she has published additional first‑author studies on CyberKnife kidney‑treatment accuracy and on comparative planning between ZAP‑X and CyberKnife. She is currently conducting further research on the latest ZAP‑X planning software. Her work is driven by the goal of advancing high‑quality radiosurgery and ensuring the best possible treatment for every patient.

The post ZAP-X radiosurgery and ZAP-Axon SRS planning: technology overview, workflow and complex case insights from a leading SRS centre appeared first on Physics World.

Physics-based battery model parameterization from impedance data

23 December 2025 at 08:18

Electrochemical impedance spectroscopy (EIS) provides valuable insights into the physical processes within batteries – but how can these measurements directly inform physics-based models? In this webinar, we present recent work showing how impedance data can be used to extract grouped parameters for physics-based models such as the Doyle–Fuller–Newman (DFN) model or the reduced-order single-particle model with electrolyte (SPMe).

We will introduce PyBaMM (Python Battery Mathematical Modelling), an open-source framework for flexible and efficient battery simulation, and show how our extension, PyBaMM-EIS, enables fast numerical impedance computation for any implemented model at any operating point. We also demonstrate how PyBOP, another open-source tool, performs automated parameter fitting of models using measured impedance data across multiple states of charge.

Battery modelling is challenging, and obtaining accurate fits can be difficult. Our technique offers a flexible way to update model equations and parameterize models using impedance data.
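The measurement-to-model loop described above can be illustrated with a toy example. The sketch below is not the PyBaMM, PyBaMM-EIS or PyBOP API; it fits a simple Randles-type equivalent circuit (hypothetical parameter values) to synthetic impedance data using a plain grid search and only the Python standard library. Physics-based models such as DFN or SPMe have many more grouped parameters, but the fitting idea has the same shape: simulate impedance from candidate parameters, compare against the measured spectrum, and minimize the mismatch.

```python
# Illustrative sketch (not the PyBaMM/PyBOP API): fit the parameters of a
# simple Randles-type circuit, Z(w) = R0 + R1 / (1 + j*w*R1*C1), to a
# synthetic impedance spectrum by coarse grid search.
def impedance(w, R0, R1, C1):
    """Complex impedance of a series resistance plus a parallel RC element."""
    return R0 + R1 / (1 + 1j * w * R1 * C1)

# Synthetic "measured" spectrum with known ground-truth parameters
true = (0.05, 0.10, 2.0)                          # R0 [ohm], R1 [ohm], C1 [F]
freqs = [10 ** (k / 4) for k in range(-8, 17)]    # ~1e-2 .. 1e4 rad/s
data = [impedance(w, *true) for w in freqs]

def cost(params):
    """Sum of squared complex residuals between model and data."""
    return sum(abs(impedance(w, *params) - z) ** 2 for w, z in zip(freqs, data))

# Coarse grid search over plausible parameter ranges
best = min(
    ((r0, r1, c1)
     for r0 in [0.01 * i for i in range(1, 11)]
     for r1 in [0.02 * i for i in range(1, 11)]
     for c1 in [0.5 * i for i in range(1, 9)]),
    key=cost,
)
print(tuple(round(p, 3) for p in best))  # → (0.05, 0.1, 2.0)
```

On noise-free data the grid search recovers the ground-truth parameters exactly; real fitting tools such as PyBOP use gradient-based or evolutionary optimizers rather than a grid, and fit across multiple states of charge simultaneously.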

Join us to see how our tools create a smooth path from measurement to model to simulation.

An interactive Q&A session follows the presentation.

Noël Hallemans

Noël Hallemans is a postdoctoral research assistant in engineering science at the University of Oxford, where he previously lectured in mathematics at St Hugh’s College. He earned his PhD in 2023 from the Vrije Universiteit Brussel and the University of Warwick, focusing on frequency-domain, data-driven modelling of electrochemical systems.

His research at the Battery Intelligence Lab, led by Professor David Howey, integrates electrochemical impedance spectroscopy (EIS) with physics-based modelling to improve understanding and prediction of battery behaviour. He also develops multisine EIS techniques for battery characterisation during operation (for example, charging or relaxation).

 


The post Physics-based battery model parameterization from impedance data appeared first on Physics World.

Higgs decay to muon–antimuon pairs sheds light on the origin of mass

22 December 2025 at 16:51

A new measurement by CERN’s ATLAS Collaboration has strengthened evidence that the masses of fundamental particles originate through their interaction with the Higgs field. Building on earlier results from CERN’s CMS Collaboration, the observations suggest that muon–antimuon pairs (dimuons) can be created by the decay of Higgs bosons.

In the Standard Model of particle physics, the fermionic particles are organized into three different generations, broadly in terms of their masses. The first generation comprises the two lightest quarks (up and down), the lightest lepton (the electron) and the electron neutrino. The second includes the strange and charm quarks, the muon and its neutrino; and the third generation the bottom and top quarks, the tau and its neutrino. In terms of the charged fermions, the top quark is nearly 340,000 times heavier than the lightest – the electron.

All of the quarks and leptons have both right- and left-handed components, which relate to the direction of a particle’s spin relative to its direction of motion (right-handed if both directions are aligned; left-handed if they are anti-aligned).

Right- and left-handed particles are treated the same by the strong and electromagnetic forces, regardless of their generation in the Standard Model. The weak force, however, only acts on left-handed particles.

Flipping handedness

In the 1960s, Steven Weinberg uncovered a theoretical solution to this seemingly bizarre asymmetry. He proposed that the Higgs field acts as a bridge between each particle’s left- and right-handed components, in a way that respects the Standard Model’s underlying symmetry. This interaction causes the particle to constantly flip between its two components, creating a resistance to motion that can be perceived as mass.

However, this deepens the mystery. According to Weinberg’s theory, higher-mass particles must interact more strongly with the Higgs field – yet the strong and electromagnetic forces can only differentiate between particles according to their charges (colour and electrical). The question is how the Higgs field can distinguish between particles in different generations if their charges are identical.

Key to solving this mystery will be to observe the decay products of Higgs bosons with different interaction strengths. For stronger interactions, corresponding to heavier generations, these decays should become far more likely.
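The mass–coupling relationship can be made concrete. In the Standard Model the Yukawa coupling of a charged fermion scales with its mass, y_f = √2 m_f / v, where v ≈ 246 GeV is the vacuum expectation value of the Higgs field. The short sketch below uses standard approximate masses (illustrative textbook values, not figures from the article) to show how steeply the coupling varies across the generations.

```python
# Back-of-the-envelope illustration: Standard Model Yukawa couplings scale
# with fermion mass as y_f = sqrt(2) * m_f / v, with v ~ 246 GeV the Higgs
# vacuum expectation value. Masses are standard approximate values in GeV.
from math import sqrt

v = 246.0  # Higgs vacuum expectation value, GeV
masses = {"electron": 0.000511, "muon": 0.1057, "tau": 1.777, "top": 172.5}

for name, m in masses.items():
    y = sqrt(2) * m / v
    print(f"{name:8s} y = {y:.2e}")

# The top/electron mass ratio drives the ~340,000-fold spread in couplings
print(f"top/electron mass ratio ~ {masses['top'] / masses['electron']:.0f}")
```

The top quark’s coupling comes out close to one, while the electron’s is of order 10⁻⁶, which is why second- and first-generation Higgs decays are so much rarer than third-generation ones.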

In 2022, both the ATLAS and CMS collaborations did just this. Through proton–proton collision experiments at CERN’s Large Hadron Collider (LHC), the groups independently observed Higgs bosons decaying to tau–antitau pairs. This relatively common process occurred at the same rate as predicted by theory.

Rare decay

A year earlier, similar experiments by the CMS collaboration probed the second generation by observing muon–antimuon pairs from the decays of Higgs bosons. This rarer event occurs in just 1 in 5000 Higgs decays.

In their latest study, the ATLAS collaboration have now reproduced this CMS result independently. They collided protons at about 13 TeV and observed muon–antimuon pairs in the same range of energies predicted by theory.

These new results improve on the earlier CMS analysis and bring dimuon observations to a statistical significance of 3.4σ. This is still below the 5σ standard required for the observation to be considered a discovery, so more work is needed.

The research could also provide guidance in the search for much rarer Higgs interactions that involve first-generation particles. This includes decays to electron–positron pairs, which are predicted to occur in just 1 in 200 million Higgs decays.

The research is described in Physical Review Letters.

The post Higgs decay to muon–antimuon pairs sheds light on the origin of mass appeared first on Physics World.

Russia plans to revive abandoned Soviet-era particle accelerator

22 December 2025 at 11:00

Russia wants to revive a Soviet-era particle accelerator that has been abandoned since the 1990s. The Kurchatov Institute for High Energy Physics has allocated 176 million rubles ($25m) to assess the current condition of the unfinished 600 GeV Proton Accelerator and Storage Complex (UNK) in Protvino near Moscow. The move is part of plans to strengthen Russia’s technological sovereignty and its activity in high-energy physics.

Although work on the UNK was officially halted in the 1990s, construction only ceased in 2013. At that time, a 21 km tunnel had been built at a depth of 60 m along with underground experimental hall lighting and ventilation systems.

In February 2025, physicist Mikhail Kovalchuk, president of the Kurchatov Institute National Research Center, noted in Russia’s Kommersant newspaper that enormous intellectual and material resources had been invested in the UNK’s design and development before it was cancelled.

According to Kovalchuk, Western sanctions provided an additional impetus to restore the project, as scientists who had previously worked on CERN projects could no longer do so.

“By participating in [CERN] projects, we not only preserved our scientific potential and survived a difficult period, but also enriched ourselves intellectually and technologically,” added Kovalchuk. “Today we are self-sufficient.”

Anatoli Romaniouk, a Russian particle physicist who has worked at CERN since 1990, told Physics World that a revival of the UNK will at least maintain fundamental physics research in Russia.

“If this project is realized, then there is hope that it will be possible to at least somewhat slow down the scientific lag of Russian physics with global science,” says Romaniouk.

While official plans for the accelerator have not been disclosed, it is thought that the proton beam energy could be upgraded to reach 3 TeV. Romaniouk says it is also unclear what kind of science will be done with the accelerator, which will depend on what ideas come forward.

Yet some Russian scientists say that it could be used to produce neutrinos. This would involve putting a neutrino detector nearby to characterize the beam before it is sent some 4000 km towards Lake Baikal where a neutrino detector – the Baikal Deep Underwater Neutrino Telescope – is already installed 1 km underground.

“I think it’s possible to find an area of high-energy physics where the research with the help of this collider could be beneficial,” adds Romaniouk.

The post Russia plans to revive abandoned Soviet-era particle accelerator appeared first on Physics World.

Quantum cluster targets business growth

18 December 2025 at 13:52
Julia Sutcliffe (second from the left), Chief Scientific Advisor for the UK's Department for Business and Trade, visits the NQCC's experimental facilities on the Harwell Cluster (Courtesy: NQCC)

Ever since the National Quantum Computing Centre was launched five years ago, its core mission has been to accelerate the pathway towards practical adoption of the technology. That has required technical innovation to scale up hardware platforms and create the software tools and algorithms needed to tackle real-world applications, but there has also been a strong focus on engaging with companies to build connections, provide access to quantum resources, and identify opportunities for deriving near-term value from quantum computing.

It makes sense, then, that the NQCC should form the cornerstone of a new Quantum Cluster at the Harwell Campus of Science and Innovation in Oxfordshire. The hope is that the NQCC’s technical expertise and infrastructure, combined with the services and facilities available on the wider Harwell Campus, will provide a magnet for new quantum start-ups as well as overseas companies that are seeking to establish a presence within the UK’s quantum ecosystem.

By accelerating collaboration across government, industry and academia, we will turn research excellence into industrial strength.

“We want to leverage the public investment that has been made into the NQCC to catalyse business growth and attract more investment into the UK’s quantum sector,” said Najwa Sidqi, manager of the Harwell Quantum Cluster, at the official launch event in November. “By accelerating collaboration across government, industry and academia, we will turn research excellence into industrial strength.”

The cluster, which has been ramping up its activities over the last year, is working to ambitious targets. Over the next decade the aim is to incubate at least 100 quantum companies on the Harwell site, create more than 1000 skilled jobs, and generate more than £1 billion of private and public investment. “Our aim is to build the foundations of a globally competitive quantum economy that delivers impact far beyond science and research,” added Sidqi.

Tangible evidence that the approach works is offered by the previous clustering activities on the Harwell Campus, notably the Space Cluster that has expanded rapidly since its launch in 2010. Anchored by the RAL Space national laboratory and bolstered by the presence of ESA and the UK Space Agency, the Space Cluster now comprises more than 100 organizations that range from small start-ups to the UK technology hubs of global heavyweights such as Airbus and Lockheed Martin.

More generally, the survival rate of start-up companies operating on the Harwell site is around 95%, compared with an average of around 50%. “At Harwell there is a high density of innovators sharing the same space, which generates more connections and more ideas,” said Julia Sutcliffe, Chief Scientific Advisor for the UK’s Department for Business and Trade. “It provides an incredible combination of world-class infrastructure and expertise, accelerating the innovation pathway and helping to create a low-risk environment for early-stage businesses and investors.”

The NQCC has already seeded that innovation activity through its early engagement with both quantum companies and end users of the technology. One major initiative has been the testbed programme, which has enabled the NQCC to invest £30m in seven hardware companies to deploy prototype quantum computers on the Harwell Campus. As well as providing access to operational systems based on all of the leading qubit modalities, the testbed programme has also provided an impetus for inward investment and job creation.

One clear example is provided by QuEra Computing, a US-based spin-off from Harvard University and the Massachusetts Institute of Technology that is developing a hardware platform based on neutral atoms. QuEra was one of the companies to win funding through the testbed programme, with the firm setting up a UK-based team to deploy its prototype system on the Harwell Campus. But the company could soon see the benefits of establishing a UK centre for technology development on the site. “Harwell is immensely helpful to us,” said Ed Durking, Corporate Director of QuEra Computing UK. “It’s a nucleus where we enjoy access to world-class talent, vendors, customers, and suppliers.”

On a practical level, establishing its UK headquarters on the Harwell Campus has provided QuEra with easy access to specialist contractors and services for fitting out its laboratories. In June the company moved into a building that is fully equipped with flexible lab space for R&D and manufacturing, and since then the UK-based team has started to build the company’s most powerful quantum computer at the facility. Longer term, establishing a base within the UK could open the door to new collaborations and funding opportunities for QuEra to further develop its technology, with the company now focused on integrating full error correction into its neutral-atom platform by 2026.

Access to the world-class infrastructure on the Harwell Campus has benefitted the other testbed providers in different ways. For ORCA Computing, a UK company developing and manufacturing photonic quantum computers, the goal was to install a testbed system within Harwell’s high-performance computing centre rather than the NQCC’s experimental labs. “Our focus is to build commercial photonic quantum systems that can be integrated into conventional datacentres, enabling hybrid quantum-classical workflows for real-world applications,” explained Geoff Barnes, Head of Customer Success at ORCA. “Having the NQCC as an expert customer enabled us to demonstrate and validate our capabilities, building the system in our own facility and then deploying it within an operational environment.”

This process provided a valuable learning experience for the ORCA engineers. The experts at Harwell helped them to navigate the constraints of installing equipment within a live datacentre, while also providing practical assistance with the networking infrastructure. Now that the system is up and running, the Harwell site also provides ORCA with an open environment for showcasing its technology to prospective customers. “As well as delivering a testbed system to the NQCC, we can now demonstrate our platform to clients within a real-world setting,” added Barnes. “It has also been a critical step toward commercial deployment on our roadmap, enabling our partners to access our systems remotely for applications development.”

Michael Cuthbert (left), Director of the NQCC, takes Sutcliffe and other visitors on a tour of the national lab (Courtesy: NQCC)

While the NQCC has already played a vital role in supporting companies as they make the transition towards commercialization, the Quantum Cluster has a wider remit to extend those efforts into other quantum technologies, such as sensing and communications, that are already finding real-world applications. It will also have a more specific focus on attracting new investment into the UK, and supporting the growth of companies that are transitioning from the start-up phase to establish larger scale commercial operations.

“In the UK we have traditionally been successful in creating spin-off activities from our strong research base, but it has been more challenging to generate the large capital investments needed to scale businesses in the technology sector,” commented Sidqi. “We want to strengthen that pipeline to ensure that the UK can translate its leadership in quantum research and early-stage innovation into long-term prosperity.”

To accelerate that process the Quantum Cluster announced a strategic partnership with Quantum Exponential, the first UK-based venture capital fund to be entirely focused on quantum technologies. Ian Pearson, the non-executive chairman of Quantum Exponential, explained that the company is working to generate an investment fund of £100m by the end of 2027 that will support quantum companies as they commercialize their technologies and scale up their businesses. “Now is the time for investment into the quantum sector,” said Pearson. “A specialist quantum fund with the expertise needed to analyse and price deals, and to do all the necessary due diligence, will attract more private investment that will help UK companies to grow and scale.”

Around two-thirds of the investments will be directed towards UK-based companies, and as part of the partnership Quantum Exponential will work with the Quantum Cluster to identify and support high-potential quantum businesses within the Harwell Campus. The Quantum Cluster will also play a crucial role in boosting investor confidence – particularly in the unique ability of the Harwell Campus to nurture successful technology businesses – and making connections with international innovation networks to provide UK-based companies with improved access to global markets.

“This new cluster strengthens our national capability and sends a clear signal to global investors that the UK is the place to develop and scale quantum technologies,” commented Michael Cuthbert, Director of the NQCC. “It will help to ensure that quantum innovation delivers benefits not just for science and industry, but for the economy and society as a whole.”

The post Quantum cluster targets business growth appeared first on Physics World.

Transparent and insulating aerogel could boost energy efficiency of windows

18 December 2025 at 13:07

An aerogel material that is more than 99% transparent to light and is an excellent thermal insulator has been developed by Ivan Smalyukh and colleagues at the University of Colorado Boulder in the US. Called MOCHI, the material can be manufactured in large slabs and could herald a major advance in energy-efficient windows.

While the insulating properties of building materials have steadily improved over the past decades, windows have consistently lagged behind. The problem is that current materials used in windows – mostly glass – have an inherent trade-off between insulating ability and optical transparency. This is addressed to some extent by using two or three layers of glass in double- and triple-glazed windows. However, windows remain the largest source of heat loss from most buildings.

A solution to the window problem could lie with aerogels in which the liquid component of a regular gel is replaced with air. This creates solid materials with networks of pores that make aerogels the lightest solid materials ever produced. If the solid component is a poor conductor of heat, then the aerogel will be an extremely good thermal insulator.

“Conventional aerogels, like the silica and cellulose based ones, are common candidates for transparent, thermally insulating materials,” Smalyukh explains. “However, their visible-range optical transparency is intrinsically limited by the scattering induced by their polydisperse pores – which can range from nanometres to micrometres in scale.”

Hazy appearance

While this problem can be overcome fairly easily in thin aerogel films, creating appropriately-sized pores on the scale of practical windows has so far proven much more difficult, leading to a hazy, translucent appearance.

Now, Smalyukh’s team has developed a new fabrication technique involving a removable template. Their approach hinges on the tendency of surfactant molecules called CPCL to self-assemble in water. Under carefully controlled conditions, the molecules spontaneously form networks of cylindrical tubes, called micelles. Once assembled, the aerogel precursor – a silicone material called polysiloxane – condenses around the micelles, freezing their structure in place.

“The ensuing networks of micelle-templated polysiloxane tubes could be then preserved upon the removal of surfactant, and replacing the fluid solvent with air,” says Smalyukh. The end result was a consistent mesoporous structure, with pores ranging from 2–50 nm in diameter. This is too small to scatter visible light, but large enough to interfere with heat transport.

As a result, the mesoporous, optically clear heat insulator (MOCHI) maintains its transparency even when fabricated in slabs over 3 cm thick and a square metre in area. This suggests that it could be used to create practical windows.
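The pore-size argument can be made quantitative with the scattering size parameter x = πd/λ: when x ≪ 1 a pore sits deep in the Rayleigh regime, where scattered intensity falls off roughly as x⁴. The sketch below uses illustrative numbers (not measurements from the paper) to show why 2–50 nm pores stay invisible while the micrometre-scale pores of conventional aerogels produce haze.

```python
# Why 2-50 nm pores don't scatter visible light: for pore diameter d and
# wavelength lam, the size parameter x = pi*d/lam is << 1, and Rayleigh
# scattering intensity scales roughly as x**4. Illustrative numbers only.
from math import pi

lam = 550e-9  # green light, m
for d in (2e-9, 50e-9, 1e-6):  # small pore, large pore, micron-scale pore
    x = pi * d / lam
    print(f"d = {d*1e9:7.0f} nm  x = {x:.3f}  relative scattering ~ x^4 = {x**4:.2e}")
```

For the micron-scale pore x is well above one, so the Rayleigh approximation breaks down and scattering becomes strong, which is the origin of the hazy appearance of conventional polydisperse aerogels.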

High thermal performance

“We demonstrated thermal conductivity lower than that of still air, as well as an average light transmission above 99%,” Smalyukh says. “Therefore, MOCHI glass units can provide a similar rate of heat transfer to high-performing building roofs and walls, with thicknesses comparable to double pane windows.”
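As a rough illustration of what a still-air-level conductivity means in practice, the steady-state conductive heat flux through a slab is q = kΔT/L. The conductivity values below are assumed placeholders (the article only states that MOCHI conducts less than still air, about 0.026 W/m/K, and gives no measured figure), and the single-pane glass number ignores the surface air films that dominate real window heat loss, so this is a conduction-only sketch, not a window rating.

```python
# Conduction-only estimate, q = k * dT / L, comparing a 3 cm aerogel-like slab
# against a single pane of glass. The aerogel k is an assumed placeholder,
# not a measured MOCHI value; real windows are also limited by surface films.
def flux(k, dT, L):
    """Conductive heat flux in W/m^2 through a slab of thickness L (m)."""
    return k * dT / L

dT = 20.0  # indoor-outdoor temperature difference, K
print(f"glass pane (k=1.0 W/m/K, 4 mm):    {flux(1.0, dT, 0.004):.0f} W/m^2")
print(f"aerogel slab (k=0.02 W/m/K, 3 cm): {flux(0.02, dT, 0.03):.1f} W/m^2")
```

Even with these crude assumptions the slab passes a few hundred times less conducted heat than bare glass, which is why a transparent slab with below-air conductivity can approach the insulation of a well-built wall.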

If rolled out on commercial scales, this could lead to entirely new ways to manage interior heating and cooling. According to the team’s calculations, a building retrofitted with MOCHI windows could boost its energy efficiency from around 6% (a typical value in current buildings) to over 30%, while reducing the heat energy passing through by around 50%.

With its ability to admit light while blocking heat transport, the researchers suggest that MOCHI could unlock entirely new functionalities for conventional windows. “Such transparent insulation also allows for efficient harnessing of thermal energy from unconcentrated solar radiation in different climate zones, promising the use of parts of opaque building envelopes as solar thermal energy generating panels,” Smalyukh adds.

The new material is described in Science.

The post Transparent and insulating aerogel could boost energy efficiency of windows appeared first on Physics World.

Learning through laughter at Quantum Carousel 

17 December 2025 at 17:00

Quantum physics, kung-fu, LEGO and singing are probably not things you would normally put together. But that’s exactly what happened at this year’s Quantum Carousel.

The event is a free variety show where incredible performers from across academia and industry converge for an evening of science communication. Held in Bristol, UK, on 14 November 2025, this was the second year the event was run – and once again it was entirely sold out.

As organizers, our goal was to bring together those involved in quantum and adjacent fields for an evening of learning and laughter. Each act was only seven minutes long and audience participation was encouraged, with questions saved for the dinner and drinks intervals.

All together now Speakers at Quantum Carousel 2025, which was organized by Zulekha Samiullah (second from right) and Hugh Barrett (far right). (Courtesy: Yolan Ankaine)

The evening kicked off with a rousing speech and song from Chris Stewart, motivating the promotion of science communication and understanding. Felix Flicker related electron spin rotations to armlocks, with a terrific demonstration on volunteer Tony Short, while Michael Berry entertained us all with his eye-opening talk on how quantum physics has democratized music.  

PhD student double act Eesa Ali and Sebastien Bisdee then welcomed volunteers to the stage to see who could align a laser fastest. Maria Violaris expertly taught us the fundamentals of quantum error correction using LEGO.

Mike Shubrook explained the quantum thermodynamics of beer through stand-up comedy. And finally, John Rarity and his assistant Hugh Barrett (event co-organizer and co-author of this article) rounded off the night by demonstrating the magic of entanglement.  

Our event sponsors introduced the food and drinks portions of the evening, with Antonia Seymour (chief executive of IOP Publishing) and Matin Durrani (editor-in-chief of Physics World) opening the dinner interval, while Josh Silverstone (founder and chief executive of Hartley Ultrafast) kickstarted the networking drinks reception.  

Singing praises

Whether it was singing along to an acoustic guitar or rotating hands to emulate electron spin, everyone got involved, and feedback cited audience participation as a highlight.

“The event ran very smoothly, it was lots of fun and a great chance to network in a relaxed atmosphere,” said one attendee. Another added: “The atmosphere was really fun, and it was a really nice event to get loads of the quantum community together in an enjoyable setting.”

Appreciation of the atmosphere went both ways, with one speaker saying that their favourite part of the night was that “the audience was very inviting and easy to perform to”.  

Audience members also enjoyed developing a better understanding of the science that drives their industry. “I understood it and I don’t have any background in physics,” said one attendee. “I feel a marker of being a good scientist is being able to explain it in layperson’s terms.”

Reaching out

With the quantum community rapidly expanding, it needs people from a wide range of backgrounds such as computer science, engineering and business. Quantum Carousel was designed to strike a balance between high-level academic discussion and entertainment through entry-level talks, such as explaining error correction with props, or relating research to impact from stimulated emission to CDs.

By focusing on real-world analogies, these talks can help newcomers to develop an intuitive and memorable understanding. Meanwhile, those already in the field can equip themselves with new ways of communicating elements of their research. 

We look forward to hosting Quantum Carousel again in the future. We want to make it bigger and better, with an even greater range of diverse acts.

But if you’re interested in organizing a similar outreach event of your own, it helps to consider how you can create an environment that can best spark connections between both speakers and attendees. Consider your audience and how your event can attract different people for different reasons. In our case, this included the chance to network, engage with the performances, and enjoy the food and drink. 

  • Quantum Carousel was founded by Zulekha Samiullah in 2024, and she and Hugh Barrett now co-lead the event. Quantum Carousel 2025 was sponsored by the QE-CDT, IOP Publishing and Hartley Ultrafast.

This article forms part of Physics World‘s contribution to the 2025 International Year of Quantum Science and Technology (IYQ), which aims to raise global awareness of quantum physics and its applications.

Stay tuned to Physics World and our international partners throughout the year for more coverage of the IYQ.

Find out more on our quantum channel.

The post Learning through laughter at Quantum Carousel  appeared first on Physics World.

Korea’s long-term strategy for 2D materials: fundamental science is the secret of success

17 December 2025 at 16:03
Scaling up The IBS Center for Van der Waals Quantum Solids (IBS-VdWQS) acts as a catalyst for advances in fundamental materials science and condensed-matter physics. The purpose-built facility is colocated on the campus of POSTECH, one of Korea’s leading universities. (Courtesy: IBS)

What’s the research mission of the IBS Center for Van der Waals Quantum Solids (IBS-VdWQS)?

Our multidisciplinary team aims to create heteroepitaxial van der Waals quantum solids at system scales, where the crystal lattices and symmetries of these novel 2D materials are artificially moulded to atomic precision via epitaxial growth. Over time, we also hope to translate these new solids into quantum device platforms.

Clearly there’s all sorts of exotic materials physics within that remit.

Correct. We form van der Waals heterostructures by epitaxial manipulation of the crystal lattice in diverse, atomically thin 2D materials – for example, 2D heterostructures incorporating graphene, boron nitride or transition-metal dichalcogenides (such as MoS2, WSe2, NbSe2, TaSe2 and so on). Crucially, the material layers are held in place only by weak van der Waals forces and with no dangling chemical bonds in the direction normal to the layers.

These 2D layers can also be laterally “stitched” into hexagonal or honeycomb lattices, with the electronic and atomic motions confined into the atomic layers. Using state-of-the-art epitaxial techniques, our team can then artificially stack these lattices to form a new class of condensed matter with exotic interlayer couplings and emergent electronic, optical and magnetic properties – properties that, we hope, will find applications in next-generation quantum devices.

The IBS-VdWQS is part of Korea’s Institute for Basic Science (IBS). How does this arrangement work?

Moon-Ho Jo “While the focus is very much on basic science, epitaxial scalability is hard-wired into all our lines of enquiry.” (Courtesy: IBS)

The IBS headquarters was established in 2011 as Korea’s first dedicated institute for fundamental science. It’s an umbrella organization coordinating the activity of 38 centres-of-excellence across the physical sciences, life sciences, as well as mathematics and data science. In this way, IBS specializes in long-range initiatives that require large groups of researchers from Korea and abroad.

Our IBS-VdWQS is a catalyst for advances in fundamental materials science and condensed-matter physics, essentially positioned as a central-government-funded research institution in a research-oriented university. Particularly important in this regard is our colocation on the campus of Pohang University of Science and Technology (POSTECH), one of Korea’s leading academic centres, and our adjacency to large-scale facilities like the Pohang Synchrotron Radiation Facility (PAL) and Pohang X-ray free-electron laser (PAL-XFEL). It’s worth noting as well that all the principal investigators (PIs) in our centre hold dual positions as IBS researchers and POSTECH professors.

So IBS is majoring on strategic research initiatives?

Absolutely – and that perspective also underpins our funding model. The IBS-VdWQS was launched in 2022 and is funded by IBS for an initial period through to 2032 (with a series of six-year extensions subject to the originality and impact of our research). As such, we are able to encourage autonomy across our 2D materials programme, giving scientists the academic freedom to pursue questions in basic research without the bureaucracy and overhead of endless grant proposals. Team members know that, with plenty of hard work and creativity, they have everything they need here to do great science and build their careers.

Your core remit is fundamental science, but what technologies could eventually emerge from the IBS-VdWQS research programme?

While the focus is very much on basic science, epitaxial scalability is hard-wired into all our lines of enquiry. In short: we are creating new 2D materials via epitaxial growth and this ultimately opens a pathway to wafer-scale industrial production of van der Waals materials with commercially interesting semiconducting, superconducting or emergent properties in general.

Right now, we are investigating van der Waals semiconductors and the potential integration of MoS2 and WSe2 with silicon for new generations of low-power logic circuitry. On a longer timeline, we are developing new types of high-Tc (around 10 K) van der Waals superconductors for applications in Josephson junctions, which are core building blocks in superconducting quantum computers.

There’s a parallel opportunity in photonic quantum computing, with van der Waals materials shaping up as promising candidates for quantum light-emitters that generate on-demand (deterministic) and highly coherent (indistinguishable) single-photon streams.

Establishing a new research centre from scratch can’t have been easy. How are things progressing?

It’s been a busy three years since the launch of the IBS-VdWQS. The most important task at the outset was centralization – pulling together previously scattered resources, equipment and staff from around the POSTECH campus. We completed the move into our purpose-built facility, next door to the PAL synchrotron light source, at the end of last year and have now established dedicated laboratory areas for the van der Waals Epitaxy Division; Quantum Device and Optics Division; Quantum Device Fabrication Division; and the Imaging and Spectroscopy Division.

One of our front-line research efforts is building a van der Waals Quantum Solid Cluster, an integrated system of multiple instruments connected by ultra-high-vacuum lines to maintain atomically clean surfaces. We believe this advanced capability will allow us to reliably study air-sensitive van der Waals materials and open up opportunities to discover new physics in previously inaccessible van der Waals platforms.

Integrated thinking The IBS-VdWQS hosts an end-to-end research programme spanning advanced fabrication, materials characterization and theoretical studies. From left to right: vapour-phase van der Waals crystal growth; femtosecond laser spectroscopy for studying ultrafast charge, spin and lattice dynamics; and an STM system for investigations of electronic structure and local quantum properties in van der Waals materials. (Courtesy: IBS)

Are there plans to scale the IBS-VdWQS work programme?

Right now, my priority is to promote opportunities for graduate students, postdoctoral researchers and research fellows to accelerate the centre’s expanding research brief. Diversity is strength, so I’m especially keen to encourage more in-bound applications from talented experimental and theoretical physicists in Europe and North America. Our current research cohort comprises 30+ PhD students, seven postdocs (from the US, India, China and Korea) and seven PIs.

Over the next five years, we aim to scale up to 25+ postdocs and research fellows and push out in new directions such as scalable quantum devices. In particular, we are looking for scientists with specialist know-how and expertise in areas like materials synthesis, quantum transport, optical spectroscopy and scanning probe microscopy (SPM) to accelerate our materials research.

How do you support your early-career researchers at IBS-VdWQS?

We are committed to nurturing global early-career talent and provide a clear development pathway from PhD through postdoctoral studies to student research fellow and research fellow/PI. Our current staff PIs have diverse academic backgrounds – materials science, physics, electronic engineering and chemistry – and we therefore allow early-career scientists to have a nominated co-adviser alongside their main PI. This model means research students learn in an integrated fashion that encourages a “multidisciplinarian” mindset – majoring in epitaxial growth, low-temperature electronic devices and optical spectroscopy, say, while also maintaining a watching brief (through their co-adviser) on the latest advances in materials characterization and analysis.

What does success look like at the end of the current funding cycle?

With 2032 as the first milestone year in this budget cycle, we are working to establish a global hub for van der Waals materials science – a highly collaborative and integrated research programme spanning advanced fabrication, materials characterization/analysis and theoretical studies. More capacity, more research infrastructure, more international scientists are all key to delivering our development roadmap for 2D semiconductor and superconductor integration towards scalable, next-generation low-power electronics and quantum computing devices.

Building a scientific career in 2D materials

Myungchul Oh “We are exploring the microscopic nature of quantum materials and their device applications.” (Courtesy: IBS)

Myungchul Oh joined the IBS-VdWQS in 2023 after a five-year stint as a postdoctoral physicist at Princeton University in the US, where he studied strongly correlated phenomena, superconductivity and topological properties in “twisted” graphene systems.

Recruited as an IBS-POSTECH research fellow, Oh holds dual academic positions: team leader for the quantum-device microscopy investigations at IBS-VdWQS and assistant professor in the semiconductor engineering department at POSTECH.

Van der Waals heterostructures, assembled layer-by-layer from 2D materials, enable precise engineering of quantum properties through the interaction between different atomic layers. By extension, Oh and his colleagues are focused on the development of novel van der Waals systems; their integration into devices via nanofabrication; and the study of electrical, magnetic and topological properties under extreme conditions, where quantum-mechanical effects dominate.

“We are exploring the microscopic nature of quantum materials and their device applications,” Oh explains. “Our research combines novel 2D van der Waals heterostructure device fabrication techniques with cryogenic scanning probe microscopy (SPM) measurements – the latter to access the atomic-scale electronic structure and local physical properties of quantum phases in 2D materials.”


The post Korea’s long-term strategy for 2D materials: fundamental science is the secret of success appeared first on Physics World.

Atomic system acts like a quantum Newton’s cradle

17 décembre 2025 à 12:15

Atoms in a one-dimensional quantum gas behave like a Newton’s cradle toy, transferring energy from atom to atom without dissipation. Developed by researchers at the TU Wien, Austria, this quantum fluid of ultracold, confined rubidium atoms can be used to simulate more complex solid-state systems. By measuring transport quantities within this “perfect” atomic system, the team hope to obtain a deeper understanding of how transport phenomena and thermodynamics behave at the quantum level.

Physical systems transport energy, charge and mass in various ways. Electrical currents streaming along a wire, heat flowing through a solid and light travelling down an optical fibre are just three examples. How easily these quantities move inside a material depends on the resistance they experience, with collisions and friction slowing them down and making them fade away. This level of resistance largely determines whether the material is classed as an insulator, a conductor or a superconductor.

The mechanisms behind such transport fall into two main categories. The first is ballistic transport, which features linear movement without loss, like a bullet travelling in a straight line. The second is diffusive transport, where the quantity is subject to many random collisions. A good example is heat conduction, where the heat moves through a material gradually, travelling in many directions at once.
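The scaling difference between the two regimes can be illustrated with a toy 1D random walk – a sketch, not the experiment's model. Ballistic displacement grows linearly with time, while diffusive displacement grows only as its square root:

```python
import random
import statistics

def mean_abs_displacement(steps, walkers=2000, ballistic=False):
    """Average |displacement| after `steps` unit moves on a 1D lattice."""
    totals = []
    for _ in range(walkers):
        if ballistic:
            x = steps  # every step in the same direction: no collisions
        else:
            # diffusive: each step is a random kick left or right
            x = sum(random.choice((-1, 1)) for _ in range(steps))
        totals.append(abs(x))
    return statistics.mean(totals)

# Quadrupling the time quadruples the ballistic displacement,
# but only doubles the diffusive one (sqrt scaling).
for steps in (100, 400):
    print(steps,
          mean_abs_displacement(steps, ballistic=True),
          round(mean_abs_displacement(steps), 1))
```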

Breaking the rules

Most systems are strongly affected by diffusion, which makes it surprising that the TU Wien researchers could build an atomic system where mass and energy flowed freely without it. According to study leader Frederik Møller, the key to this unusual behaviour is the magnetic and optical fields that keep the rubidium atoms confined to one dimension, “freezing out” interactions in the atoms’ two transverse directions.

Because the atoms can only move along a single direction, Møller explains, they transfer momentum perfectly, without scattering their energy as would be the case in normal matter. Consequently, the 1D atomic system does not thermalize despite being subject to thousands of collisions.

To quantify the transport of mass (charge) and energy within this system, the researchers measured quantities known as Drude weights, which are fundamental parameters that describe ballistic, dissipationless transport in solid-state environments. According to these measurements, the single-dimensional interacting bosonic atoms do indeed demonstrate perfect dissipationless transport. The results also agree with the generalized hydrodynamics (GHD) theoretical framework, which describes the large-scale, inhomogeneous dynamics of one-dimensional integrable quantum many-body systems such as ultracold atomic gases or specific spin chains.

A Newton’s cradle for atoms

According to team leader Jörg Schmiedmayer, the experiment is analogous to a Newton’s cradle toy, which consists of a row of metal balls suspended on wires (see below). When the ball on one end of the row is made to collide with the one next to it, its momentum transfers straight through the other balls to the ball on the opposite end, which swings out. Schmiedmayer adds that the system makes it possible to study transport under perfectly controlled conditions and could open new ways of understanding how resistance emerges, or disappears, at the quantum level. “Our next steps are applying the method to strongly correlated transport and to transport in a topological fluid,” he tells Physics World.
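The classical physics behind the analogy is that an elastic collision between equal masses in 1D simply swaps their velocities, so momentum passes through a resting row without loss. A minimal sketch (not the team's model, just the textbook mechanics):

```python
def elastic_collision(v1, v2):
    """Equal masses in 1D: an elastic collision swaps the two velocities."""
    return v2, v1

def cradle(velocities):
    """Propagate momentum through a row of equal-mass balls.

    Resolve pairwise collisions left to right until no ball is
    catching up on a slower ball ahead of it.
    """
    v = list(velocities)
    changed = True
    while changed:
        changed = False
        for i in range(len(v) - 1):
            if v[i] > v[i + 1]:  # ball i is about to hit ball i+1
                v[i], v[i + 1] = elastic_collision(v[i], v[i + 1])
                changed = True
    return v

# One ball arrives with unit velocity; the momentum emerges
# unchanged at the far end of the row.
print(cradle([1, 0, 0, 0, 0]))  # → [0, 0, 0, 0, 1]
```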


Karèn Kheruntsyan, a theoretical physicist at the University of Queensland, Australia, who was not involved in this research, calls it “a significant step for studying quantum transport”. He says the team’s work clearly demonstrates ballistic (dissipationless) transport at a finite temperature, providing an experimental benchmark for theories of integrability and disorder. The work also validates the thermodynamic meaning of Drude weights, while confirming that linear-response theory and GHD accurately describe transport in quantum systems.

In Kheruntsyan’s view, though, the team’s biggest achievement is the quantitative extraction of Drude weights that characterize atomic and energy currents, with “excellent agreement” between experiment and theory. This, he says, shows truly ballistic transport in an interacting many-body system. One caveat, though, is that the system’s limited spatial resolution and near-ideal integrability prevent it from being used to explore diffusive regimes or stronger interaction effects, leaving microscopic dynamics such as dispersive shock waves unresolved.

The study is published in Science.

The post Atomic system acts like a quantum Newton’s cradle appeared first on Physics World.

Want a strong future for physics? Here’s why we must focus on students from under-represented groups

17 décembre 2025 à 12:00

Physics students from under-represented groups consistently report a lower sense of belonging at university than their over-represented peers. These students experience specific challenges that make them feel undervalued and excluded. Yet a strong sense of belonging has been shown to lead to improved academic performance, greater engagement in courses and better mental wellbeing. It is vital, then, that universities make changes to help eliminate these challenges.

Students are uniquely placed to describe the issues when it comes to belonging in physics. With this in mind, as an undergraduate physics student with a passion for making the discipline more diverse and inclusive, I conducted focus groups with current and former physics students, interviewed experts and analysed the current literature. This was part of a summer project funded by the Royal Institution and is currently being finalized for publication.

From this work it became clear that under-represented groups face many challenges to developing a strong sense of belonging in physics, but, at the same time, there are ways to improve the everyday experiences of students. When it comes to barriers, one is the widely held belief – reflected in the way physicists are depicted in the media and textbooks – that you need to be a “natural genius” to succeed in university physics. This notion hampers students from under-represented groups, who see peers from the over-represented majority appearing to grasp concepts more quickly and lecturers suggesting certain topics are “easy”.

The feeling that physics demands natural ability also arises from the so-called “weed out” culture, which is defined as courses that are intentionally designed to filter students out, reduce class sizes and diminish sense of belonging. Students who we surveyed believe that the high fail rate is caused by a disconnect between the teaching and workshops on the course and the final exam.

A third cause of this perception that you need some innate ability to succeed in physics is the attitudes and behaviour of some professors, lecturers and demonstrators. This includes casual sexist and racist behaviour; belittling students who ask for help; and acting as if they’re uninterested in teaching. Students from under-represented groups report significantly lower levels of respect and recognition from instructors, which damages their resilience and weakens sense of belonging.

Students from under-represented groups are also more likely to be isolated from their classmates and to feel socially excluded. This means they lack a support network, leaving them with no-one to turn to when they encounter challenges. Outside the lecture theatre, students from under-represented groups typically face many microaggressions in their day-to-day university experience. These are subtle indignities or insults directed, consciously or unconsciously, at minorities – such as people of colour being told they “speak English very well”, male students refusing to accept women’s ideas, and the assumption that gender minorities will take on administrative roles in group projects.

Focus on the future

So what can be done? The good news is that there are many solutions to mitigate these issues and improve a sense of belonging. First, institutions should place more emphasis on small group “active learning” – which includes discussions, problem solving and peer-based learning. These pedagogical strategies have been shown to boost belonging, particularly for female students. After these active-learning sessions, non-academic, culturally sensitive social lunches can help turn “course friends” into “real friends” who choose to meet socially and can become a support network. This can help build connections within and between degree cohorts.

Another solution is for universities to invite former students to speak about their sense of belonging and how it evolved or improved throughout their degree. Hearing about struggles and learning tried-and-tested strategies to improve resilience can help students better prepare for stressful situations. Alumni are more relatable than generic messaging from the university wellbeing team.

Building closer links between students and staff also enhances a sense of belonging: it helps humanise lecturers and demonstrates that staff care about student wellbeing and success. Institutions can support this by formally recognizing and professionalizing the service roles of faculty members.

Universities should also focus on hiring more diverse teaching staff, who can serve as role models, using their experiences to relate to and engage with under-represented students. Students will end up feeling more embedded within the physics community, improving both their sense of belonging and performance.

One practical way to increase diversity in hiring is for institutions to re-evaluate what they value. While securing large grants is valuable, so is advocating for equality, diversity and inclusion; public engagement; and the ability to inspire the next generation of physicists.

Another approach is to establish “departmental action teams” to find tailored solutions to unite undergraduates, postgraduates, teaching and research staff. Such teams identify issues specific to their particular university, and they can gather data through surveying the department to identify trends and recommend practical changes to boost belonging.

Implementing these measures will not only improve the sense of belonging for students from under-represented groups but also cultivate a more inclusive, diverse physics workforce. That in turn will boost the overall research culture, opening up research directions that may have previously been overlooked, and yielding stronger scientific outputs. It is crucial that we do more to support physics students from under-represented groups to create a more diverse physics community. Ultimately, it will benefit physics and society as a whole.

The post Want a strong future for physics? Here’s why we must focus on students from under-represented groups appeared first on Physics World.

Motion through quantum space–time is traced by ‘q-desics’

16 décembre 2025 à 17:19

Physicists searching for signs of quantum gravity have long faced a frustrating problem. Even if gravity does have a quantum nature, its effects are expected to show up only at extremely small distances, far beyond the reach of experiments. A new theoretical study by Benjamin Koch and colleagues at the Technical University of Vienna in Austria suggests a different strategy. Instead of looking for quantum gravity where space–time is tiny, the researchers argue that subtle quantum effects could influence how particles and light move across vast cosmological distances.

Their work introduces a new concept called q-desics, short for quantum-corrected paths through space–time. These paths generalize the familiar trajectories predicted by Einstein’s general theory of relativity and could, in principle, leave observable fingerprints in cosmology and astrophysics.

General relativity and quantum mechanics are two of the most successful theories in physics, yet they describe nature in radically different ways. General relativity treats gravity as the smooth curvature of space–time, while quantum mechanics governs the probabilistic behaviour of particles and fields. Reconciling the two has been one of the central challenges of theoretical physics for decades.

“One side of the problem is that one has to come up with a mathematical framework that unifies quantum mechanics and general relativity in a single consistent theory,” Koch explains. “Over many decades, numerous attempts have been made by some of the most brilliant minds humanity has to offer.” Despite this effort, no approach has yet gained universal acceptance.

Deeper difficulty

There is another, perhaps deeper difficulty. “We have little to no guidance, neither from experiments nor from observations that could tell us whether we actually are heading in the right direction or not,” Koch says. Without experimental clues, many ideas about quantum gravity remain largely speculative.

That does not mean the quest lacks value. Fundamental research often pays off in unexpected ways. “We rarely know what to expect behind the next tree in the jungle of knowledge,” Koch says. “We only can look back and realize that some of the previously explored trees provided treasures of great use and others just helped us to understand things a little better.”

Almost every test of general relativity relies on a simple assumption. Light rays and freely falling particles follow specific paths, known as geodesics, determined entirely by the geometry of space–time. From gravitational lensing to planetary motion, this idea underpins how physicists interpret astronomical data.

Koch and his collaborators asked what happens to this assumption when space–time itself is treated as a quantum object. “Almost all interpretations of observational astrophysical and astronomical data rest on the assumption that in empty space light and particles travel on a path which is described by the geodesic equation,” Koch says. “We have shown that in the context of quantum gravity this equation has to be generalized.”
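For context, the classical geodesic equation referred to here determines free fall entirely from the metric connection. The paper's generalized form is not reproduced in this article, so the correction term below is purely schematic:

```latex
% Classical geodesic: acceleration fixed entirely by space-time geometry
\ddot{x}^{\mu} + \Gamma^{\mu}_{\alpha\beta}\,\dot{x}^{\alpha}\dot{x}^{\beta} = 0
% Schematic q-desic form: a quantum correction Q^{\mu}, sourced by the
% underlying quantum structure of space-time, modifies the trajectory
\ddot{x}^{\mu} + \Gamma^{\mu}_{\alpha\beta}\,\dot{x}^{\alpha}\dot{x}^{\beta} = Q^{\mu}
```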

Generalized q-desic

The result is the q-desic equation. Instead of relying only on an averaged, classical picture of space–time, q-desics account for the underlying quantum structure more directly. In practical terms, this means that particles may follow paths that deviate slightly from those predicted by classical general relativity, even when space–time looks smooth on average.

Crucially, the team found that these deviations are not confined to tiny distances. “What makes our first results on the q-desics so interesting is that apart from these short distance effects, there are also long range effects possible, if one takes into account the existence of the cosmological constant,” Koch says.

This opens the door to possible tests using existing astronomical data. According to the study, q-desics could differ from ordinary geodesics over cosmological distances, affecting how matter and light propagate across the universe.

“The q-desics might be distinguished from geodesics at cosmological large distances,” Koch says, “which would be an observable manifestation of quantum gravity effects.”

Cosmological tensions

The researchers propose revisiting cosmological observations. “Currently, there are many tensions popping up between the Standard Model of cosmology and observed data,” Koch notes. “All these tensions are linked, one way or another, to the use of geodesics at vastly different distance scales.” The q-desic framework offers a new lens through which to examine such discrepancies.

So far, the team has explored simplified scenarios and idealized models of quantum space–time. Extending the framework to more realistic situations will require substantial effort.

“The initial work was done with one PhD student (Ali Riahina) and one colleague (Ángel Rincón),” Koch says. “There are many things to be revisited and explored that our to-do list is growing far too long for just a few people.” One immediate goal is to encourage other researchers to engage with the idea and test it in different theoretical settings.

Whether q-desics will provide an observational window into quantum gravity remains to be seen. But by shifting attention from the smallest scales to the largest structures in the cosmos, the work offers a fresh perspective on an enduring problem.

The research is described in Physical Review D.

The post Motion through quantum space–time is traced by ‘q-desics’ appeared first on Physics World.

Remote work expands collaboration networks but reduces research impact, study suggests

15 décembre 2025 à 13:41

Academics who switch to hybrid working and remote collaboration do less impactful research. That’s according to an analysis of how scientists’ collaboration networks and academic outputs evolved before, during and after the COVID-19 pandemic (arXiv: 2511.18481). It involved studying author data from the arXiv preprint repository and the online bibliographic catalogue OpenAlex.

To explore the geographic spread of collaboration networks, Sara Venturini from the Massachusetts Institute of Technology and colleagues looked at the average distance between the institutions of co-authors. They found that while the average distance between team members on publications increased from 2000 to 2021, there was a particularly sharp rise after 2022.
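The study's exact methodology isn't detailed here, but the metric itself – the average pairwise distance between co-authors' institutions – can be sketched with great-circle distances. The coordinates below are illustrative, not taken from the paper:

```python
from itertools import combinations
from math import asin, cos, radians, sin, sqrt

def haversine_km(a, b):
    """Great-circle distance in km between two (lat, lon) points in degrees."""
    lat1, lon1, lat2, lon2 = map(radians, (*a, *b))
    h = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * asin(sqrt(h))  # 6371 km = mean Earth radius

def mean_team_distance(institutions):
    """Average pairwise distance between co-authors' institutions."""
    pairs = list(combinations(institutions, 2))
    if not pairs:
        return 0.0  # single-institution team: zero spread
    return sum(haversine_km(a, b) for a, b in pairs) / len(pairs)

# Hypothetical three-institution team (roughly MIT, Oxford, Tokyo)
team = [(42.36, -71.09), (51.75, -1.25), (35.71, 139.76)]
print(round(mean_team_distance(team)), "km")
```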

This pattern, the researchers claim, suggests that the pandemic led to scientists collaborating more often with geographically distant colleagues. They found consistent patterns when they separated papers related to COVID-19 from those in unrelated areas, suggesting the trend was not solely driven by research on COVID-19.

The researchers also examined how the number of citations a paper received within a year of publication changed with distance between the co-authors’ institutions. In general, as the average distance between collaborators increases, citations fall, the authors found.

They suggest that remote and hybrid working hampers research quality by reducing spontaneous, serendipitous in-person interactions that can lead to deep discussions and idea exchange.

Despite what the authors say is a “concerning decline” in citation impact, there are benefits to increasing remote interactions. In particular, as the geographic spread of collaboration networks increases, so too do international partnerships and authorship diversity.

Remote tools

Lingfei Wu, a computational social scientist at the University of Pittsburgh, who was not involved in the study, told Physics World that he was surprised by the finding that remote teams produce less impactful work.

“In our earlier research, we found that historically, remote collaborations tended to produce more impactful but less innovative work,” notes Wu. “For example, the Human Genome Project published in 2001 shows how large, geographically distributed teams can also deliver highly impactful science. One would expect the pandemic-era shift toward remote collaboration to increase impact rather than diminish it.”

Wu says his work suggests that remote work is effective for implementing ideas but less effective for generating them, indicating that scientists need a balance between remote and in-person interactions. “Use remote tools for efficient execution, but reserve in-person time for discussion, brainstorming, and informal exchange,” he adds.

The post Remote work expands collaboration networks but reduces research impact, study suggests appeared first on Physics World.

How well do you know AI? Try our interactive quiz to find out

15 décembre 2025 à 13:00

There are 12 questions in total: blue is your current question and white means unanswered, with green and red being right and wrong. Check your scores at the end – and why not test your colleagues too?

How did you do?

10–12 Top shot – congratulations, you’re the next John Hopfield

7–9 Strong skills – good, but not quite Nobel standard

4–6 Weak performance – should have asked ChatGPT

0–3 Worse than random – are you a bot?

Reports on Progress in Physics


Physics World‘s coverage of this interactive quiz is supported by Reports on Progress in Physics, which offers unparalleled visibility for your ground-breaking research.

The post How well do you know AI? Try our interactive quiz to find out appeared first on Physics World.

Components of RNA among life’s building blocks found in NASA asteroid sample

12 décembre 2025 à 12:30

More molecules and compounds vital to the origin of life have been detected in asteroid samples delivered to Earth by NASA’s OSIRIS-REx mission. The discovery strengthens the case that not only did life’s building blocks originate in space, but that the ingredients of RNA, and perhaps RNA itself, were brought to our planet by asteroids.

Two new papers in Nature Geoscience and Nature Astronomy describe the discovery of the sugars ribose and glucose in the 120 g of samples returned from the near-Earth asteroid 101955 Bennu, as well as an unusual carbonaceous “gum” that holds important compounds for life. The findings complement the earlier discovery of amino acids and the nucleobases of RNA and DNA in the Bennu samples.

A third new paper, in Nature Astronomy, addresses the abundance of pre-solar grains – dust, such as dust from supernovae, that originated before the birth of our Solar System. Scientists led by Ann Nguyen of NASA’s Johnson Space Center found six times more dust direct from supernova explosions than is found, on average, in meteorites and other sampled asteroids. This could suggest differences in the concentration of different pre-solar dust grains across the disc of gas and dust that formed the Solar System.

Space gum

It’s the discovery of organic materials useful for life that steals the headlines, though. For example, the discovery of the space gum, which is essentially a hodgepodge chain of polymers, represents something never found in space before.

Scott Sandford of NASA’s Ames Research Center, co-lead author of the Nature Astronomy paper describing the gum discovery, tells Physics World: “The material we see in our samples is a bit of a molecular jumble. It’s carbonaceous, but much richer in nitrogen and, to a lesser extent, oxygen, than most of the organic compounds found in extraterrestrial materials.”

Sandford refers to the material as gum because of its pliability, bending and dimpling when pressure is applied, rather like chewing gum. And while much of its chemical functionality is replicated in similar materials on our planet, “I doubt it matches exactly with anything seen on Earth,” he says.

Initially, Sandford found the gum using an infrared microscope, nicknaming the dust grains containing the gum “Lasagna” and “Neapolitan” because the grains are layered. To extract them from the rock in the sample, Sandford went to Zack Gainsforth of the University of California, Berkeley, who specializes in analysing and extracting materials from samples like this.

Platinum scaffolding

Having welded a tungsten needle to the Neapolitan sample in order to lift it, the pair quickly realised that the grain was very delicate.

“When we tried to lift the sample it began to deform,” Gainsforth says. “Scott and I practically jumped out of our chairs and brainstormed what to do. After some discussion, we decided that we should add straps to give it enough mechanical rigidity to survive the lift.”

Microscopic particle of asteroid Bennu
Fragile sample A microscopic particle of asteroid Bennu is manipulated under a transmission electron microscope. To move the 30 µm fragment for further analysis, the researchers reinforced it with thin platinum strips (the L shape on the surface). (Courtesy: NASA/University of California, Berkeley)

By straps, Gainsforth is referring to micro-scale platinum scaffolding applied to the grain to reinforce its structure while they cut it away with an ion beam. Platinum is often used as a radiation shield to protect samples from an ion beam, “but how we used it was anything but standard,” says Gainsforth. “Scott and I made an on-the-fly decision to reinforce the samples based on how they were reacting to our machinations.”

With the sample extracted and reinforced, they used the ion beam cutter to shave it down until it was a thousand times thinner than a human hair, at which point it could be studied by electron microscopy and X-ray spectrometry. “It was a joy to watch Zack ‘micro-manipulate’ [the sample],” says Sandford.

The nitrogen in the gum was found to be in nitrogen heterocycles, which are the building blocks of nucleobases in DNA and RNA. This brings us to the other new discovery, reported in Nature Geoscience, of the sugars ribose and glucose in the Bennu samples, by a team led by Yoshihiro Furukawa of Tohoku University in Japan.

The ingredients of RNA

Glucose is the primary source of energy for life, while ribose is a key component of the sugar-phosphate backbone that connects the information-carrying nucleobases in RNA molecules. Furthermore, the discovery of ribose now means that everything required to assemble RNA molecules is present in the Bennu sample.

Notable by its absence, however, was deoxyribose, which is ribose minus one oxygen atom. Deoxyribose in DNA performs the same job as ribose in RNA, and Furukawa believes that its absence supports a popular hypothesis about the origin of life on Earth called RNA world. This describes how the first life could have used RNA instead of DNA to carry genetic information, catalyse biochemical reactions and self-replicate.

Intriguingly, the presence of all RNA’s ingredients on Bennu raises the possibility that RNA could have formed in space before being brought to Earth.

“Formation of RNA from its building blocks requires a dehydration reaction, which we can expect to have occurred both in ancient Bennu and on primordial Earth,” Furukawa tells Physics World.

However, RNA itself would be very hard to detect because of its expected low abundance in the samples. So until there’s information to the contrary, “the present finding means that the ingredients of RNA were delivered from space to the Earth,” says Furukawa.

Nevertheless, these discoveries are major milestones in the quest of astrobiologists and space chemists to understand the origin of life on Earth. Thanks to Bennu and the asteroid 162173 Ryugu, from which a sample was returned by the Japanese Aerospace Exploration Agency (JAXA) mission Hayabusa2, scientists are increasingly confident that the building blocks of life on Earth came from space.

The post Components of RNA among life’s building blocks found in NASA asteroid sample appeared first on Physics World.

Slow spectroscopy sheds light on photodegradation

9 December 2025 at 18:22

Using a novel spectroscopy technique, physicists in Japan have revealed how organic materials accumulate electrical charge through long-term illumination by sunlight – leading to material degradation. Ryota Kabe and colleagues at the Okinawa Institute of Science and Technology have shown how charge separation occurs gradually via a rare multi-photon ionization process, offering new insights into how plastics and organic semiconductors degrade in sunlight.

In a typical organic solar cell, an electron-donating material is interfaced with an electron acceptor. When the donor absorbs a photon, one of its electrons may jump across the interface, creating a bound electron-hole pair which may eventually dissociate – creating two free charges from which useful electrical work can be extracted.

Although such an interface vastly boosts the efficiency of this process, it is not necessary for charge separation to occur when an electron donor is illuminated. “Even single-component materials can generate tiny amounts of charge via multiphoton ionization,” Kabe explains. “However, experimental evidence has been scarce due to the extremely low probability of this process.”

To trigger charge separation in this way, an electron needs to absorb one or more additional photons while in its excited state. Since the vast majority of electrons fall back into their ground states before this can happen, the spectroscopic signature of this charge separation is very weak. This makes it incredibly difficult to detect using conventional spectroscopy techniques, which can generally only make observations over timescales of up to a few milliseconds.

The opposite approach

“While weak multiphoton pathways are easily buried under much stronger excited-state signals, we took the opposite approach in our work,” Kabe describes. “We excited samples for long durations and searched for traces of accumulated charges in the slow emission decay.”

Key to this approach was an electron donor called NPD. This organic material has a relatively long-lived triplet state, in which an excited electron is prevented from transitioning directly back to its ground state. As a result, these molecules emit phosphorescence over relatively long timescales.

In addition, Kabe’s team dispersed their NPD samples into different host materials with carefully selected energy levels. In one medium, the energies of both the highest-occupied and lowest-unoccupied molecular orbitals lay below NPD’s corresponding levels, so that the host material acted as an electron acceptor. As a result, charge transfer occurred in the same way as it would across a typical donor-acceptor interface.

Yet in another medium, the host’s lowest-unoccupied orbital lay above NPD’s – blocking charge transfer, and allowing triplet states to accumulate instead. In this case, the only way for charge separation to occur was through multi-photon ionization.

Slow emission decay analysis

Since NPD’s long triplet lifetime allowed its electrons to be excited gradually over an extended period of illumination, its weak charge accumulation became detectable through slow emission decay analysis. In contrast, more conventional methods involve multiple, ultra-fast laser pulses, severely restricting the timescale over which measurements can be made. Altogether, this approach enabled the team to clearly distinguish between the two charge generation pathways.
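The logic of the slow-decay approach can be illustrated with a toy two-component decay model. The lifetimes and amplitudes below are purely illustrative assumptions, not values from the study: a strong but short-lived phosphorescence component masks a far weaker, long-lived signal from recombining charges, which only becomes visible at long delay times.

```python
import math

def emission(t, a_phos=1.0, tau_phos=1.0, a_rec=1e-4, tau_rec=100.0):
    """Total emission: strong, short-lived phosphorescence plus a very weak,
    long-lived tail from recombination of accumulated charges (toy model,
    arbitrary units)."""
    return a_phos * math.exp(-t / tau_phos) + a_rec * math.exp(-t / tau_rec)

def recombination_fraction(t):
    """Share of the detected signal due to charge recombination at time t."""
    phos = 1.0 * math.exp(-t / 1.0)
    return 1.0 - phos / emission(t)

# Early on, the weak charge signal is buried (well under 0.1% of the total)...
print(recombination_fraction(1.0))
# ...but at long delays it dominates the slow emission decay.
print(recombination_fraction(50.0))
```

In a pulsed ultrafast measurement restricted to the first few lifetimes, the second component would be invisible; exciting continuously for long durations and watching the slow tail of the decay is what makes it detectable.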

“Using this method, we confirmed that charge generation occurred via resonance-enhanced multiphoton ionization mediated by long-lived triplet states, even in single-component organic materials,” Kabe describes.

This result offers insights into how plastics and organic semiconductors are degraded by sunlight over years or decades. The conventional explanation is that sunlight generates free radicals. These are molecules that lose an electron through ionization, leaving behind an unpaired electron which readily reacts with other molecules in the surrounding environment. Since photodegradation unfolds over such a long timescale, researchers could not observe this charge generation in single-component organic materials – until now.

“The method will be useful for analysing charge behaviour in organic semiconductor devices and for understanding long-term processes such as photodegradation that occur gradually under continuous light exposure,” Kabe says.

The research is described in Science Advances.

The post Slow spectroscopy sheds light on photodegradation appeared first on Physics World.

Oak Ridge Quantum Science Center prioritizes joined-up thinking, multidisciplinary impacts

8 December 2025 at 15:00

Travis Humble is a research leader who’s thinking big, dreaming bold, yet laser-focused on operational delivery. The long game? To translate advances in fundamental quantum science into a portfolio of enabling technologies that will fast-track the practical deployment of quantum computers for at-scale scientific, industrial and commercial applications.

As director of the Quantum Science Center (QSC) at Oak Ridge National Laboratory (ORNL) in East Tennessee, Humble and his management team are well placed to transform that research vision into scientific, economic and societal upside. Funded to the tune of $115 million through its initial five-year programme (2020–25), QSC is one of five dedicated National Quantum Information Science Research Centers (NQISRC) within the US Department of Energy (DOE) National Laboratory system.

Validation came in spades last month when, despite the current turbulence around US science funding, QSC was given follow-on DOE backing of $125 million over five years (2025–30) to create “a new scientific ecosystem” for fault-tolerant, quantum-accelerated high-performance computing (QHPC). In short, QSC will target the critical research needed to amplify the impact of quantum computing through its convergence with leadership-class exascale HPC systems.

“Our priority in Phase II QSC is the creation of a common software ecosystem to host the compilers, programming libraries, simulators and debuggers needed to develop hybrid-aware algorithms and applications for QHPC,” explains Humble. Equally important, QSC researchers will develop and integrate new techniques in quantum error correction, fault-tolerant computing protocols and hybrid algorithms that combine leading-edge computing capabilities for pre- and post-processing of quantum programs. “These advances will optimize quantum circuit constructions and accelerate the most challenging computational tasks within scientific simulations,” Humble adds.

Classical computing, quantum opportunity

At the heart of the QSC programme sits ORNL’s leading-edge research infrastructure for classical HPC, a capability that includes Frontier, the first supercomputer to break the exascale barrier and still one of the world’s most powerful. On that foundation, QSC is committed to building QHPC architectures that take advantage of both quantum computers and exascale supercomputing to tackle all manner of scientific and industrial problems beyond the reach of today’s HPC systems alone.

“Hybrid classical-quantum computing systems are the future,” says Humble. “With quantum computers connecting both physically and logically to existing HPC systems, we can forge a scalable path to integrate quantum technologies into our scientific infrastructure.”

Frontier, a high-performance supercomputer
Quantum acceleration ORNL’s current supercomputer, Frontier, was the first high-performance machine to break the exascale barrier. Plans are in motion for a next-generation supercomputer, Discovery, to come online at ORNL by 2028. (Courtesy: Carlos Jones/ORNL, US DOE)

Industry partnerships are especially important in this regard. Working in collaboration with the likes of IonQ, Infleqtion and QuEra, QSC scientists are translating a range of computationally intensive scientific problems – quantum simulations of exotic matter, for example – onto the vendors’ quantum computing platforms, generating excellent results out the other side.

“With our broad representation of industry partners,” notes Humble, “we will establish a common framework by which scientific end-users, software developers and hardware architects can collaboratively advance these tightly coupled, scalable hybrid computing systems.”

It’s a co-development model that industry values greatly. “Reciprocity is key,” Humble adds. “At QSC, we get to validate that QHPC can address real-world research problems, while our industry partners gather user feedback to inform the ongoing design and optimization of their quantum hardware and software.”

Quantum impact

Innovation being what it is, quantum computing systems will continue on an accelerating development trajectory, with more qubits, enhanced fidelity, error correction and fault tolerance as key reference points on the roadmap. Phase II QSC, for its part, will integrate five parallel research thrusts to advance the viability and uptake of QHPC technologies.

The collaborative software effort, led by ORNL’s Vicente Leyton, will develop openQSE, an adaptive, end-to-end software ecosystem for QHPC systems and applications. Yigit Subasi from Los Alamos National Laboratory (LANL) will lead the hybrid algorithms thrust, which will design algorithms that combine conventional and quantum methods to solve challenging problems in the simulation of model materials.

Meanwhile, the QHPC architectures thrust, under the guidance of ORNL’s Chris Zimmer, will co-design hybrid computing systems that integrate quantum computers with leading-edge HPC systems. The scientific applications thrust, led by LANL’s Andrew Sornberger, will develop and validate applications of quantum simulation to be implemented on prototype QHPC systems. Finally, ORNL’s Michael McGuire will lead the thrust to establish experimental baselines for quantum materials that ultimately validate QHPC simulations against real-world measurements.

Longer term, ORNL is well placed to scale up the QHPC model. After all, the laboratory is credited with pioneering the hybrid supercomputing model that uses graphics processing units in addition to conventional central processing units (including the launch in 2012 of Titan, the first supercomputer of this type operating at over 10 petaFLOPS).

“The priority for all the QSC partners,” notes Humble, “is to transition from this still-speculative research phase in quantum computing, while orchestrating the inevitable convergence between quantum technology, existing HPC capabilities and evolving scientific workflows.”

Collaborate, coordinate, communicate

Much like its NQISRC counterparts (which have also been allocated further DOE funding through 2030), QSC provides the “operational umbrella” for a broad-scope collaboration of more than 300 scientists and engineers from 20 partner institutions. With its own distinct set of research priorities, that collective activity cuts across other National Laboratories (Los Alamos and Pacific Northwest), universities (among them Berkeley, Cornell and Purdue) and businesses (including IBM and IQM) to chart an ambitious R&D pathway addressing quantum-state (qubit) resilience, controllability and, ultimately, the scalability of quantum technologies.

“QSC is a multidisciplinary melting pot,” explains Humble, “and I would say, alongside all our scientific and engineering talent, it’s the pooled user facilities that we are able to exploit here at Oak Ridge and across our network of partners that give us our ‘grand capability’ in quantum science [see box, “Unique user facilities unlock QSC opportunities”]. Certainly, when you have a common research infrastructure, orchestrated as part of a unified initiative like QSC, then you can deliver powerful science that translates into real-world impacts.”

Unique user facilities unlock QSC opportunities

Stephen Streiffer tours the LINAC Tunnel at the Spallation Neutron Source
Neutron insights ORNL director Stephen Streiffer tours the linear accelerator tunnel at the Spallation Neutron Source (SNS). QSC scientists are using the SNS to investigate entirely new classes of strongly correlated materials that demonstrate topological order and quantum entanglement. (Courtesy: Alonda Hines/ORNL, US DOE)

Deconstructed, QSC’s Phase I remit (2020–25) spanned three dovetailing and cross-disciplinary research pathways: discovery and development of advanced materials for topological quantum computing (in which quantum information is stored in a stable topological state – or phase – of a physical system rather than the properties of individual particles or atoms); development of next-generation quantum sensors (to characterize topological states and support the search for dark matter); as well as quantum algorithms and simulations (for studies in fundamental physics and quantum chemistry).

Underpinning that collective effort: ORNL’s unique array of scientific user facilities. A case in point is the Spallation Neutron Source (SNS), an accelerator-based neutron-scattering facility that enables a diverse programme of pure and applied research in the physical sciences, life sciences and engineering. QSC scientists, for example, are using SNS to investigate entirely new classes of strongly correlated materials that demonstrate topological order and quantum entanglement – properties that show great promise for quantum computing and quantum metrology applications.

“The high-brightness neutrons at SNS give us access to this remarkable capability for materials characterization,” says Humble. “Using the SNS neutron beams, we can probe exotic materials, recover the neutrons that scatter off of them and, from the resultant signals, infer whether or not the materials exhibit quantum properties such as entanglement.”

While SNS may be ORNL’s “big-ticket” user facility, the laboratory is also home to another high-end resource for quantum studies: the Center for Nanophase Materials Sciences (CNMS), one of the DOE’s five national Nanoscience Research Centers, which offers QSC scientists access to specialist expertise and equipment for nanomaterials synthesis; materials and device characterization; as well as theory, modelling and simulation in nanoscale science and technology.

Thanks to these co-located capabilities, QSC scientists pioneered another intriguing line of enquiry – one that will now be taken forward elsewhere within ORNL – by harnessing so-called quantum spin liquids, in which electron spins can become entangled with each other to demonstrate correlations over very large distances (relative to the size of individual atoms).

In this way, it is possible to take materials that have been certified as quantum-entangled and use them to design new types of quantum devices with unique geometries – as well as connections to electrodes and other types of control systems – to unlock novel physics and exotic quantum behaviours. The long-term goal? Translation of quantum spin liquids into a novel qubit technology to store and process quantum information.

SNS, CNMS and Oak Ridge Leadership Computing Facility (OLCF) are DOE Office of Science user facilities.

When he’s not overseeing the technical direction of QSC, Humble is acutely attuned to the need for sustained and accessible messaging. The priority? To connect researchers across the collaboration – physicists, chemists, material scientists, quantum information scientists and engineers – as well as key external stakeholders within the DOE, government and industry.

“In my experience,” he concludes, “the ability of the QSC teams to communicate efficiently – to understand each other’s concepts and reasoning and to translate back and forth across disciplinary boundaries – remains fundamental to the success of our scientific endeavours.”

Further information

Listen to the Physics World podcast: Oak Ridge’s Quantum Science Center takes a multidisciplinary approach to developing quantum materials and technologies

Scaling the talent pipeline in quantum science

Quantum science graduate students and postdoctoral researchers present and discuss their work during a poster session
The next generation Quantum science graduate students and postdoctoral researchers present and discuss their work during a poster session at the fifth annual QSC Summer School. Hosted at Purdue University in April this year, the school is one of several workforce development efforts supported by QSC. (Courtesy: Dave Mason/Purdue University)

With an acknowledged shortage of skilled workers across the quantum supply chain, QSC is doing its bit to bolster the scientific and industrial workforce. Front-and-centre: the fifth annual QSC Summer School, which was held at Purdue University in April this year, hosting 130 graduate students (the largest cohort to date) through an intensive four-day training programme.

The Summer School sits as part of a long-term QSC initiative to equip ambitious individuals with the specialist domain knowledge and skills needed to thrive in a quantum sector brimming with opportunity – whether that’s in scientific research or out in industry with hardware companies, software companies or, ultimately, the end-users of quantum technologies in key verticals like pharmaceuticals, finance and healthcare.

“While PhD students and postdocs are integral to the QSC research effort, the Summer School exposes them to the fundamental ideas of quantum science elaborated by leading experts in the field,” notes Vivien Zapf, a condensed-matter physicist at Los Alamos National Laboratory who heads up QSC’s advanced characterization efforts.

“It’s all about encouraging the collective conversation,” she adds, “with lots of opportunities for questions and knowledge exchange. Overall, our emphasis is very much on training up scientists and engineers to work across the diversity of disciplines needed to translate quantum technologies out of the lab into practical applications.”

The programme isn’t for the faint-hearted, though. Student delegates kicked off this year’s proceedings with a half-day of introductory presentations on quantum materials, devices and algorithms. Next up: three and a half days of intensive lectures, panel discussions and poster sessions covering everything from entangled quantum networks to quantum simulations of superconducting qubits.

Many of the Summer School’s sessions were also made available virtually on Purdue’s Quantum Coffeehouse Live Stream on YouTube – the streamed content reaching quantum learners across the US and further afield. Lecturers were drawn from the US National Laboratories, leading universities (such as Harvard and Northwestern) and the quantum technology sector (including experts from IBM, PsiQuantum, NVIDIA and JPMorganChase).

The post Oak Ridge Quantum Science Center prioritizes joined-up thinking, multidisciplinary impacts appeared first on Physics World.
