Multi-ion cancer therapy tackles the LET trilemma

Cancer treatments using heavy ions offer several key advantages over conventional proton therapy: a sharper Bragg peak and reduced lateral scattering for precise tumour targeting, as well as higher linear energy transfer (LET). High-LET radiation induces complex DNA damage in cancer cells, enabling effective treatment of even hypoxic, radioresistant tumours. A team at the National Institutes for Quantum Science and Technology (QST) in Japan is now exploring the potential benefits of multi-ion therapy, which combines beams of carbon, oxygen and neon ions.

“Different ions exhibit distinct physical and biological characteristics,” explains QST researcher Takamitsu Masuda. “Combining them in a way that is tailored to the specific characteristics of a tumour and its environment allows us to enhance tumour control while reducing damage to surrounding healthy tissues.”

The researchers are using multi-ion therapy to increase the dose-averaged LET (LETd) within the tumour, and are performing a phase I trial at the QST Hospital to evaluate the safety and feasibility of this LETd escalation for head-and-neck cancers. But while high-LETd prescriptions can improve treatment efficacy, increasing LETd can also degrade plan robustness. This so-called “LET trilemma” – a complex trade-off between target dose homogeneity, range robustness and high LETd – is a major challenge in particle therapy optimization.
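
For readers unfamiliar with the quantity being escalated: LETd in a voxel is simply the dose-weighted mean of the LET values of all beam contributions depositing dose there. The snippet below is a minimal illustration of that definition; it is not the QST team's planning code, and the numbers are hypothetical.

```python
import numpy as np

def dose_averaged_let(doses, lets):
    """Dose-averaged LET (LETd) in a voxel: the dose-weighted mean of the
    LET values of all beam/ion contributions depositing dose there.

    doses -- dose contributions d_i (Gy) from each beam/ion component
    lets  -- corresponding LET values L_i (keV/um)
    """
    doses = np.asarray(doses, dtype=float)
    lets = np.asarray(lets, dtype=float)
    total = doses.sum()
    return float((doses * lets).sum() / total) if total > 0 else 0.0

# Hypothetical voxel receiving carbon- and oxygen-ion contributions:
# adding the higher-LET oxygen component pulls LETd up towards a target value.
print(dose_averaged_let([1.2, 0.8], [55.0, 95.0]))  # 71.0 keV/um
```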

In their latest study, reported in Physics in Medicine & Biology, Masuda and colleagues evaluated the impact of range and setup uncertainties on LETd-optimized multi-ion treatment plans, examining strategies that could potentially overcome this LET trilemma.

Robustness evaluation

The team retrospectively analysed the data of six patients who had previously been treated with carbon-ion therapy. Patients 1, 2 and 3 had small, medium and large central tumours, respectively, and adjacent dose-limiting organs-at-risk (OARs); and patients 4, 5 and 6 had small, medium and large peripheral tumours and no dose-limiting OARs.

Multi-ion therapy plans Reference dose and LETd distributions for patients 1, 2 and 3 for multi-ion therapy with a target LETd of 90 keV/µm. The GTV, clinical target volume (CTV) and OARs are shown in cyan, green and magenta, respectively. (Courtesy: Phys. Med. Biol. 10.1088/1361-6560/ae387b)

For each case, the researchers first generated baseline carbon-ion therapy plans and then incorporated oxygen- or neon-ion beams and tuned the plans to achieve a target LETd of 90 keV/µm to the gross tumour volume (GTV).

Particle therapy plans can be affected by both range uncertainties and setup variations. To assess the impact of these uncertainties, the researchers recalculated the multi-ion plans to incorporate range deviations of +2.5% (overshoot) and –2.5% (undershoot) and various setup uncertainties, evaluating their combined effects on dose and LETd distributions.
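
The recalculations themselves are done with a treatment-planning system's dose engine, but the scenario bookkeeping is straightforward. Here is a minimal sketch, assuming a hypothetical callable that returns a plan-quality metric (such as target dose coverage) for a given range error and setup shift; the ±2 mm shifts and the toy stand-in at the bottom are purely illustrative, not values from the paper.

```python
from itertools import product

def uncertainty_band(recompute_metric,
                     range_errors=(-0.025, 0.0, +0.025),
                     setup_shifts_mm=(-2.0, 0.0, +2.0)):
    """Evaluate a plan-quality metric over all combinations of range error and
    setup shift, and return the reference value plus the worst-case band."""
    reference = recompute_metric(0.0, 0.0)
    values = [recompute_metric(r, s)
              for r, s in product(range_errors, setup_shifts_mm)]
    return reference, min(values), max(values)

# Illustrative stand-in for a dose-engine recalculation: overshoot (+) raises
# the target metric slightly, undershoot (-) lowers it, and setup shifts add a
# small penalty. In reality each scenario is a full dose recalculation.
toy = lambda range_err, shift_mm: 100.0 * (1.0 + 2.0 * range_err) - 0.3 * abs(shift_mm)

ref, lo, hi = uncertainty_band(toy)
print(f"reference {ref:.1f}%, uncertainty band {lo:.1f}-{hi:.1f}%")
```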

They found that range uncertainty was the main contributor to degraded plan quality. In general, range overshoot increased the dose to the target, while undershoot decreased it. Range uncertainties had the largest effect on small and central tumours: patient 1 exhibited a dose deviation of around ±6% from the reference, while patient 3 showed a deviation of just ±1%. Robust target coverage was maintained in all large or peripheral tumours, but deteriorated in patient 1, leading to an uncertainty band of roughly 11%.

“Wide uncertainty bands indicate a higher risk that the intended dose may not be accurately delivered,” Masuda explains. “In particular, a pronounced lower band for the GTV suggests the potential for cold spots within the tumour, which could compromise local tumour control.”

The team also observed that range undershoot increased LETd and overshoot decreased it, although absolute differences in LETd within the entire target were small. Importantly, all OAR dose constraints were satisfied even in the largest error scenarios, with uncertainty bands comparable to those of conventional carbon-ion treatment plans.

Addressing the LET trilemma

To investigate strategies to improve plan robustness, the researchers created five new plans for patient 1, who had a small, central tumour that was particularly susceptible to uncertainties. They modified the original multi-ion plan (carbon- and oxygen-ion beams delivered at 70° and 290°) in five ways: expanding the target; altering the beam angles to an orthogonal arrangement; altering them to an opposing arrangement; increasing the number of irradiation fields to four; and using oxygen ions for both beam ports (“heavier-ion selection”).

The heavier-ion selection plan proved the most effective in mitigating the effects of range uncertainty, substantially narrowing the dose uncertainty bands compared with the original plan. The team attribute this to the inherently higher LETd in heavier ions, making the 90 keV/µm target easier to achieve with oxygen-ion beams alone. The other plan modifications led to limited improvements.

Improving robustness Dose–volume histograms for patient 1, for the original multi-ion plan and the heavier-ion selection plan, showing the combined effects of range and setup uncertainties. Solid, dashed and dotted curves represent the reference plans, and upper and lower uncertainty scenarios, respectively. (Courtesy: Phys. Med. Biol. 10.1088/1361-6560/ae387b)

These findings suggest that strategically employing heavier ions to enhance plan robustness could help control the balance among range robustness, uniform dose and high LETd – potentially offering a practical strategy to overcome the LET trilemma.

“Clinically, this strategy is particularly well-suited for small, deep-seated tumours and complex, variable sites such as the nasal cavity, where range uncertainties are amplified by depth, steep dose gradients and daily anatomical changes,” says Masuda. “In such cases, the use of heavier ions enables robust dose delivery with high LETd.”

The researchers are now exploring the integration of emerging technologies – such as robust optimization, arc therapy, dual-energy CT, in-beam PET and online adaptation – to minimize uncertainties. “This integration is highly desirable for applying multi-ion therapy to challenging cases such as pancreatic cancer, where uncertainties are inherently large, or hypofractionated treatments, where even a single error can have a significant impact,” Masuda tells Physics World.

The post Multi-ion cancer therapy tackles the LET trilemma appeared first on Physics World.


New project takes aim at theory-experiment gap in materials data

Condensed-matter physics and materials science have a silo problem. Although researchers in these fields have access to vast amounts of data – from experimental records of crystal structures and conditions for synthesizing specific materials to theoretical calculations of electron band structures and topological properties – these datasets are often fragmented. Integrating experimental and theoretical data is a particularly significant challenge.

Researchers at the Beijing National Laboratory for Condensed Matter Physics and the Institute of Physics (IOP) of the Chinese Academy of Sciences (CAS) recently decided to address this challenge. Their new platform, MaterialsGalaxy, unifies data from experiment, computation and scientific literature, making it easier for scientists to identify previously hidden relationships between a material’s structure and its properties. In the longer term, their goal is to establish a “closed loop” in which experimental results validate theory and theoretical calculations guide experiments, accelerating the discovery of new materials by leveraging modern artificial intelligence (AI) techniques.

Physics World spoke to team co-leader Quansheng Wu to learn more about this new tool and how it can benefit the materials research community.

How does MaterialsGalaxy work?

The platform works by taking the atomic structure of materials and mathematically mapping it into a vast, multidimensional vector space. To do this, every material – regardless of whether its structure is known from experiment, from a theoretical calculation or from simulation – must first be converted into a unique structural vector that acts like a “fingerprint” for the material.

Then, when a MaterialsGalaxy user focuses on a material, the system automatically identifies its nearest neighbours in this vector space. This allows users to align heterogeneous data – for example, linking a synthesized crystal in one database with its calculated topological properties in another – even when different data sources define the material slightly differently.

The vector-based approach also enables the system to recommend “nearest neighbour” materials (analogs) to fill knowledge gaps, effectively guiding researchers from known data into unexplored territories. It does this by performing real-time vector similarity searches to dynamically link relevant experimental records, theoretical calculations and literature information. The result is a comprehensive profile for the material.
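
The exact fingerprinting scheme used by MaterialsGalaxy is not described here, so the sketch below only shows the generic pattern: every entry is reduced to a fixed-length descriptor vector, and neighbours are ranked by cosine similarity. The 64-dimensional vectors are random placeholders; in practice they would be derived from the crystal structures themselves.

```python
import numpy as np

def nearest_neighbours(query_vec, library_vecs, k=5):
    """Rank library 'fingerprint' vectors by cosine similarity to the query
    and return the indices and scores of the k closest entries."""
    q = query_vec / np.linalg.norm(query_vec)
    lib = library_vecs / np.linalg.norm(library_vecs, axis=1, keepdims=True)
    scores = lib @ q                       # cosine similarity to every entry
    order = np.argsort(scores)[::-1][:k]   # best matches first
    return order, scores[order]

# Placeholder structural fingerprints: one query material and a library of
# 10,000 entries.
rng = np.random.default_rng(0)
library = rng.normal(size=(10_000, 64))
query = rng.normal(size=64)
idx, sims = nearest_neighbours(query, library, k=3)
print(idx, sims)
```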

Where does data for MaterialsGalaxy come from?

We aggregated data from three primary channels: public databases; our institute’s own high-quality internal experimental records (known as the MatElab platform); and the scientific literature. All data underwent rigorous standardization using tools such as the pymatgen (Python Materials Genomics) materials analysis code and the spglib crystal-symmetry library to ensure consistent definitions for crystal structures and physical properties.
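
As an illustration of what such a standardization step can look like (a sketch with a toy structure, not the platform's actual pipeline), pymatgen's symmetry tools, which call spglib internally, can detect the space group of a structure and return its conventional cell:

```python
# Requires pymatgen, which uses spglib under the hood for symmetry analysis.
from pymatgen.core import Lattice, Structure
from pymatgen.symmetry.analyzer import SpacegroupAnalyzer

# Toy example: rock-salt NaCl built from its cubic lattice and two sites.
nacl = Structure.from_spacegroup(
    "Fm-3m", Lattice.cubic(5.64), ["Na", "Cl"], [[0, 0, 0], [0.5, 0.5, 0.5]]
)

# Detect the space group and return the conventional standard cell, so that
# entries from different sources share one consistent description.
sga = SpacegroupAnalyzer(nacl, symprec=1e-3)
print(sga.get_space_group_symbol())                  # Fm-3m
conventional = sga.get_conventional_standard_structure()
print(conventional.composition.reduced_formula, len(conventional))  # NaCl 8
```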

Who were your collaborators on this project?

This project is a multi-disciplinary effort involving a close-knit collaboration among several research groups at the IOP, CAS and other leading institutions. My colleague Hongming Weng and I supervised the core development and design under the strategic guidance of Zhong Fang, while Tiannian Zhu (the lead author of our Chinese Physics B paper about MaterialsGalaxy) led the development of the platform’s architecture and core algorithms, as well as its technical implementation.

We enhanced the platform’s capabilities by integrating several previously published AI-driven tools developed by other team members. For example, Caiyuan Ye contributed the Con-CDVAE model for advanced crystal structure generation, while Jiaxuan Liu contributed VASPilot, which automates and streamlines first-principles calculations. Meanwhile, Qi Li contributed PXRDGen, a tool for simulating and generating powder X-ray diffraction patterns.

Finally, much of the richness of MaterialsGalaxy stems from the high-quality data it contains. This came from numerous collaborators, including Weng (who contributed the comprehensive topological materials database, Materiae), Youguo Shi (single-crystal growth), Shifeng Jin (crystal structure and diffraction), Jinbo Pan (layered materials), Qingbo Yan (2D ferroelectric materials), Yong Xu (nonlinear optical materials), and Xingqiu Chen (topological phonons). My own contribution was a library of AI-generated crystal structures produced by the Con-CDVAE model.

What does MaterialsGalaxy enable scientists to do that they couldn’t do before?

One major benefit is that it prevents researchers from becoming stalled when data for a specific material is missing. By leveraging the tool’s “structural analogs” feature, they can look to the properties or growth paths of similar materials for insights – a capability not available in traditional, isolated databases.

We also hope that MaterialsGalaxy will offer a bridge between theory and experiment. Traditionally, experimentalists tend to consult the Inorganic Crystal Structure Database while theorists check the Materials Project. Now, they can view the entire lifecycle of a material – from how to grow a single crystal (experiment) to its topological invariants (theory) – on a single platform.

Beyond querying known materials, MaterialsGalaxy also allows researchers to use integrated generative AI models to create new structures. These can be immediately compared against the known database to assess synthesis feasibility and potential performance through the “vertical comparison” workflow.

What do you plan to do next?

We’re focusing on enhancing the depth and breadth of the tool’s data fusion. For example, we plan to develop representations based on graph neural networks (GNNs) to better handle experimental data that may contain defects or disorder, thereby improving matching accuracy.

We’re also interested in moving beyond crystal structure by introducing multi-modal anchors such as electronic band structures, X-ray diffraction (XRD) patterns and spectroscopic data. To do this, we plan to utilize techniques derived from contrastive language–image pretraining (CLIP) to enable cross-modal retrieval, for example searching for theoretical band data by uploading an experimental XRD pattern.
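
As a very rough sketch of what CLIP-style cross-modal retrieval involves: two modality-specific encoders are trained so that matching pairs (here, an XRD pattern and the band-structure descriptor of the same material) land close together in one shared embedding space, after which retrieval is a similarity search. Everything below (the random projections, the dimensions and the data) is hypothetical; the platform's real models are not described in the interview.

```python
import numpy as np

def embed(features, projection):
    """Project modality-specific features into the shared space and
    L2-normalize them, as in CLIP-style contrastive models."""
    z = projection @ features
    return z / np.linalg.norm(z)

# Hypothetical (untrained, random) projections for two modalities: an
# experimental XRD pattern and a calculated band-structure descriptor.
rng = np.random.default_rng(1)
proj_xrd = rng.normal(size=(32, 256))    # 256 XRD features -> 32-dim shared space
proj_band = rng.normal(size=(32, 128))   # 128 band features -> 32-dim shared space

xrd_query = embed(rng.normal(size=256), proj_xrd)
band_library = np.stack([embed(rng.normal(size=128), proj_band)
                         for _ in range(1000)])

# Cross-modal retrieval: rank band-structure entries by cosine similarity to
# the uploaded XRD pattern's embedding.
best = int(np.argmax(band_library @ xrd_query))
print("closest band-structure entry:", best)
```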

Separately, we want to continue to expand our experimental data coverage, specifically targeting synthesis recipes and “failed” experimental records, which are crucial for training the next generation of “AI-enabled” scientists. Ultimately, we plan to connect an even wider array of databases, establishing robust links between them to realize a true Materials Galaxy of interconnected knowledge.

The post New project takes aim at theory-experiment gap in materials data appeared first on Physics World.


The pros and cons of patenting

For any company or business, it’s important to recognize and protect intellectual property (IP). In the case of novel inventions, which can include machines, processes and even medicines, a patent offers IP protection and lets firms control how those inventions are used. Patents, which in most countries can be granted for up to 20 years, give the owner exclusive rights so that others can’t directly copy the creation. A patent essentially prevents others from making, using or selling your invention.

But there are more reasons for holding a patent than IP protection alone. In particular, patents go some way to protecting the investment that may have been necessary to generate the IP in the first place, such as the cost of R&D facilities, materials, labour and expertise. Those factors need to be considered when you’re deciding if patenting is the right approach or not.

Patents are tangible assets that can be sold to other businesses or licensed for royalties to provide your company with regular income

Patents are in effect a form of currency. Counting as tangible assets that add to the overall value of a company, they can be sold to other businesses or licensed for royalties to provide regular income. Some companies, in fact, build up or acquire significant patent portfolios, which can be used for bargaining with competitors, potentially leading to cross-licensing agreements where both parties agree to use each other’s technology.

Patents also say something about the competitive edge of a company, by demonstrating technical expertise and market position through the control of a specific technology. Essentially, patents give credibility to a company’s claims of technical know-how: a patent shows investors that a firm has a unique, protected asset, making the business more attractive to further investment.

However, it’s not all one-way traffic and there are obligations on the part of the patentee. Firstly, a patent holder has to reveal to the world exactly how their invention works. Governments favour this kind of public disclosure as it encourages broader participation in innovation. The downside is that whilst your competitors cannot directly copy you, they can enhance and improve upon your invention, provided those changes aren’t covered by the original patent.

It’s also worth bearing in mind that a patent holder is responsible for patent enforcement and any ensuing litigation; a patent office will not do this for you. So you’ll have to monitor what your competitors are up to and decide on what course of action to take if you suspect your patent’s been infringed. Trouble is, it can sometimes be hard to prove or disprove an infringement – and getting the lawyers in can be expensive, even if you win.

Money talks

Probably the biggest consideration of all is the cost and time involved in making a patent application. Filing a patent requires a rigorous understanding of “prior art” – the existing body of relevant knowledge on which novelty is judged. You’ll therefore need to do a lot of work finding out about relevant established patents, any published research and journal articles, along with products or processes publicly disclosed before the patent’s filing date.

Before it can be filed with a patent office, a patent needs to be written as a legal description, which includes an abstract, background, detailed specifications, drawings and the claims of the invention. Once filed, an examiner with expertise in the relevant technical field will be assigned to assess the application; this examiner must be satisfied that the invention is both novel and “non-obvious” before the patent is granted.

Even when an invention is judged to be technically novel, to be non-obvious it must also involve an “inventive step” that would not be obvious to a person with “ordinary skill” in that technical field at the time of filing. The assessment phase can result in significant to-ing and fro-ing between the examiner and the applicant to determine exactly what is patentable. If the examiner cannot be satisfied on these points, the patent application will be refused.

Patents are only ever granted in a particular country or region, such as Europe, and the application process has to be repeated for each new place (although the information required is usually pretty similar). Translations may be required for some countries, there are fees for each application and, even if a patent is granted, you have to pay an additional annual bill to maintain the patent (which in the UK rises year on year).

Patents can take years to process, which is why many companies pay specialized firms to support their applications

Patent applications, in other words, can be expensive and can take years to process. That’s why many companies pay specialized firms to support their patent applications. Those firms employ patent attorneys – legal experts with a technical background who help inventors and companies manage their IP rights by drafting patent applications, navigating patent office procedures and advising on IP strategy. Attorneys can also represent their clients in disputes or licensing deals, thereby acting as a crucial bridge between science/engineering and law.

Perspiration and aspiration

It’s impossible to write about patents without mentioning the impact that Thomas Edison had as an inventor. During the 20th century, he became the world’s most prolific inventor, with a staggering 1093 US patents granted in his lifetime. This record stood until 2003, when it was surpassed by the Japanese inventor Shunpei Yamazaki, and again in 2008 by the Australian “patent titan” Kia Silverbrook.

Edison clearly saw there was a lot of value in patents, but how did he achieve so much? His approach was grounded in systematic problem solving, which he pursued through his Menlo Park lab in New Jersey. Dedicated to technological development and invention, it was effectively the world’s first corporate R&D lab. And whilst Edison’s name appeared on all the patents, many were primarily the work of his staff; in effect, he was credited for inventions made by his employees.

I have a love–hate relationship with patents or at least the process of obtaining them

I will be honest; I have a love–hate relationship with patents or at least the process of obtaining them. As a scientist or engineer, it’s easy to think all the hard work is getting an invention over the line, slogging your guts out in the lab. But applying for a patent can be just as expensive and time-consuming, which is why you need to be clear on what and when to patent. Even Edison grew tired of being hailed a genius, stating that his success was “1% inspiration and 99% perspiration”.

Still, without the sweat of patents, your success might be all but 99% aspiration.

The post The pros and cons of patenting appeared first on Physics World.


Starlink and the unravelling of digital sovereignty

Wind sweeps dust across southeastern Iran in January 2025. Credit: NASA Earth Observatory image by Michala Garrison

In January 2026, Iranian authorities shut down landline and mobile telecommunications infrastructure in the country to clamp down on coordinated protests. Starlink terminals, which were discreetly mounted on rooftops, helped Iranian protesters bypass this internet blackout. The role played by Starlink in the recent Iranian protests challenges the notion of digital sovereignty and promotes corporate […]

The post Starlink and the unravelling of digital sovereignty appeared first on SpaceNews.


Practical impurity analysis for biogas producers

Biogas is a renewable energy source formed when bacteria break down organic materials such as food waste, plant matter, and landfill waste in an oxygen‑free (anaerobic) process. It contains methane and carbon dioxide, along with trace amounts of impurities. Because of its high methane content, biogas can be used to generate electricity and heat, or to power vehicles. It can also be upgraded to almost pure methane, known as biomethane, which can directly replace natural fossil gas.

Strict rules apply to the amount of impurities allowed in biogas and biomethane, as these contaminants can damage engines, turbines, and catalysts during upgrading or combustion. EN 16723 is the European standard that sets maximum allowable levels of siloxanes and sulfur‑containing compounds for biomethane injected into the natural gas grid or used as vehicle fuel. These limits are extremely low, meaning highly sensitive analytical techniques are required. However, most biogas plants do not have the advanced equipment needed to measure these impurities accurately.

Researchers from the Paul Scherrer Institute, Switzerland: Julian Indlekofer (left) and Ayush Agarwal (right), with the Liquid Quench Sampling System (Courtesy: Markus Fischer/Paul Scherrer Institute PSI)

Researchers at the Paul Scherrer Institute (PSI) in Switzerland have developed a new, simpler method to sample and analyse biogas using GC-ICP-MS. Gas chromatography (GC) separates chemical compounds in a gas mixture based on how quickly they travel through a column. Inductively coupled plasma mass spectrometry (ICP-MS) then detects the elements within those compounds at very low concentrations. Crucially, this combined method can measure both siloxanes and sulfur compounds simultaneously. It avoids matrix effects that can limit other detectors and cause biased or ambiguous results. It also achieves the very low detection limits required by EN 16723.

This sampling approach, combined with centralized measurement, enables biogas plants to meet regulatory standards using an efficient, less complex and more cost-effective method with fewer errors. Overall, this research provides a practical, high-accuracy tool that makes reliable biogas impurity monitoring accessible to plants of all sizes, strengthening biomethane quality, protecting infrastructure and accelerating the transition to cleaner energy systems.

Read the full article

Sampling to analysis: simultaneous quantification of siloxanes and sulfur compounds in biogas for cleaner energy

Ayush Agarwal et al 2026 Prog. Energy 8 015001

Do you want to learn more about this topic?

Household biogas technology in the cold climate of low-income countries: a review of sustainable technologies for accelerating biogas generation Sunil Prasad Lohani et al. (2024)

The post Practical impurity analysis for biogas producers appeared first on Physics World.


Saudi Space Agency Announces Winners of Global ‘DebriSolver’ Competition at Space Debris Conference

The Saudi Space Agency announced on Tuesday the names of the winning teams of the global “DebriSolver” competition, one of the flagship initiatives accompanying the Space Debris Conference 2026. Launched […]

The post Saudi Space Agency Announces Winners of Global ‘DebriSolver’ Competition at Space Debris Conference appeared first on SpaceNews.
