17 April 2025 — Sciences

Why Resilient GPS (R-GPS) Matters for US Military Superiority: We Must Address GPS Vulnerabilities

17 April 2025 at 13:00

GPS is not only a cornerstone of our military superiority; it is foundational to our national and global economic stability. In fact, analysts warn that GPS outages could cost our […]

The post Why Resilient GPS (R-GPS) Matters for US Military Superiority: We Must Address GPS Vulnerabilities appeared first on SpaceNews.

Strange metals get their strangeness from quantum entanglement

17 April 2025 at 11:00

A concept from quantum information theory appears to explain at least some of the peculiar behaviour of so-called “strange” metals. The new approach, which was developed by physicists at Rice University in the US, attributes the unusually poor electrical conductivity of these metals to an increase in the quantum entanglement of their electrons. The team say the approach could advance our understanding of certain high-temperature superconductors and other correlated quantum structures.

While electrons can travel through ordinary metals such as gold or copper relatively freely, strange metals resist their flow. Intriguingly, some high-temperature superconductors have a strange metal phase as well as a superconducting one. This phenomenon cannot be explained by conventional theories that treat electrons as independent particles, ignoring any interactions between them.

To unpick these and other puzzling behaviours, a team led by Qimiao Si turned to the concept of quantum Fisher information (QFI). This statistical tool is typically used to measure how correlations between electrons evolve under extreme conditions. In this case, the team focused on a theoretical model known as the Anderson/Kondo lattice that describes how magnetic moments are coupled to electron spins in a material.

Correlations become strongest when strange metallicity appears

These analyses revealed that electron-electron correlations become strongest at precisely the point at which strange metallicity appears in a material. “In other words, the electrons become maximally entangled at this quantum critical point,” Si explains. “Indeed, the peak signals a dramatic amplification of multipartite electron spin entanglement, leading to a complex web of quantum correlations between many electrons.”

What is striking, he adds, is that this surge of entanglement provides a new and positive characterization of why strange metals are so strange, while also revealing why conventional theory fails. “It’s not just that traditional theory falls short, it is that it overlooks this rich web of quantum correlations, which prevents the survival of individual electrons as the elementary objects in this metallic substance,” he explains.

To test their finding, the researchers, who report their work in Nature Communications, compared their predictions with neutron scattering data from real strange-metal materials. They found that the experimental data was a good match. “Our earlier studies had also led us to suspect that strange metals might host a deeply entangled electron fluid – one whose hidden quantum complexity had yet to be fully understood,” adds Si.
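The article does not spell out how QFI is extracted from scattering data, but a widely used relation from quantum metrology (due to Hauke and co-workers) connects the QFI density to the dissipative part of the dynamical spin susceptibility, which is exactly what inelastic neutron scattering measures:

$$f_Q(\mathbf{k}) = \frac{4}{\pi}\int_0^{\infty}\mathrm{d}\omega\,\tanh\!\left(\frac{\hbar\omega}{2k_{\mathrm{B}}T}\right)\chi''(\mathbf{k},\omega)$$

For spin-1/2 systems, a QFI density greater than m witnesses at least (m + 1)-partite entanglement, which is how a peak in f_Q can be read as a surge of multipartite entanglement at the quantum critical point.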

The implications of this work are far-reaching, he tells Physics World. “Strange metals may hold the key to unlocking the next generation of superconductors — materials poised to transform how we transmit energy and, perhaps one day, eliminate power loss from the electric grid altogether.”

The Rice researchers say they now plan to explore how QFI manifests itself in the charge of electrons as well as their spins. “Until now, our focus has only been on the QFI associated with electron spins, but electrons also of course carry charge,” Si says.

The post Strange metals get their strangeness from quantum entanglement appeared first on Physics World.

16 April 2025 — Sciences

KATRIN sets tighter limit on neutrino mass

16 April 2025 at 17:00

Researchers from the Karlsruhe Tritium Neutrino experiment (KATRIN) have announced the most precise upper limit yet on the neutrino’s mass. Thanks to new data and upgraded techniques, the new limit – 0.45 electron volts (eV) at 90% confidence – is half that of the previous tightest constraint, and marks a step toward answering one of particle physics’ longest-standing questions.

Neutrinos are ghostlike particles that barely interact with matter, slipping through the universe almost unnoticed. They come in three types, or flavours: electron, muon, and tau. For decades, physicists assumed all three were massless, but that changed in the late 1990s when experiments revealed that neutrinos can oscillate between flavours as they travel. This flavour-shifting behaviour is only possible if neutrinos have mass.

Although neutrino oscillation experiments confirmed that neutrinos have mass, and showed that the masses of the three flavours are different, they did not divulge the actual scale of these masses. Doing so requires an entirely different approach.

Looking for clues in electrons

In KATRIN’s case, that means focusing on a process called tritium beta decay, where a tritium nucleus (a proton and two neutrons) decays into a helium-3 nucleus (two protons and one neutron) by releasing an electron and an electron antineutrino. Due to energy conservation, the total energy from the decay is shared between the electron and the antineutrino. The neutrino’s mass determines the balance of the split.

“If the neutrino has even a tiny mass, it slightly lowers the energy that the electron can carry away,” explains Christoph Wiesinger, a physicist at the Technical University of Munich, Germany and a member of the KATRIN collaboration. “By measuring that [electron] spectrum with extreme precision, we can infer how heavy the neutrino is.”

Because the subtle effects of neutrino mass are most visible in decays where the neutrino carries away very little energy (most of it bound up in mass), KATRIN concentrates on measuring electrons that have taken the lion’s share. From these measurements, physicists can calculate neutrino mass without having to detect these notoriously weakly-interacting particles directly.
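Schematically, and in a simplified form that neglects final-state and instrumental effects, the electron spectrum near the tritium endpoint energy E₀ ≈ 18.6 keV behaves as

$$\frac{\mathrm{d}\Gamma}{\mathrm{d}E} \propto (E_0 - E)\sqrt{(E_0 - E)^2 - m_\nu^2 c^4}, \qquad E \le E_0 - m_\nu c^2,$$

so a non-zero neutrino mass both pulls the endpoint down and distorts the shape of the final few electronvolts of the spectrum – the tiny region KATRIN scans with extreme precision.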

Improvements over previous results

The new neutrino mass limit is based on data taken between 2019 and 2021, with 259 days of operations yielding over 36 million electron measurements. “That’s six times more than the previous result,” Wiesinger says.

Other improvements include better temperature control in the tritium source and a new calibration method using a monoenergetic krypton source. “We were able to reduce background noise rates by a factor of two, which really helped the precision,” he adds.

Keeping track: Laser system for the analysis of the tritium gas composition at KATRIN’s Windowless Gaseous Tritium Source. Improvements to temperature control in this source helped raise the precision of the neutrino mass limit. (Courtesy: Tritium Laboratory, KIT)

At 0.45 eV, the new limit means the neutrino is at least a million times lighter than the electron. “This is a fundamental number,” Wiesinger says. “It tells us that neutrinos are the lightest known massive particles in the universe, and maybe that their mass has origins beyond the Standard Model.”
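A quick check of that factor: the electron’s rest energy is about 511 keV, so

$$\frac{m_e c^2}{m_\nu c^2} \ge \frac{511\,000\ \mathrm{eV}}{0.45\ \mathrm{eV}} \approx 1.1 \times 10^{6}.$$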

Despite the new tighter limit, however, definitive answers about the neutrino’s mass are still some ways off. “Neutrino oscillation experiments tell us that the lower bound on the neutrino mass is about 0.05 eV,” says Patrick Huber, a theoretical physicist at Virginia Tech, US, who was not involved in the experiment. “That’s still about 10 times smaller than the new KATRIN limit… For now, this result fits comfortably within what we expect from a Standard Model that includes neutrino mass.”

Model independence

Though Huber emphasizes that there are “no surprises” in the latest measurement, KATRIN has a key advantage over its rivals. Unlike cosmological methods, which infer neutrino mass based on how it affects the structure and evolution of the universe, KATRIN’s direct measurement is model-independent, relying only on energy and momentum conservation. “That makes it very powerful,” Wiesinger argues. “If another experiment sees a measurement in the future, it will be interesting to check if the observation matches something as clean as ours.”

KATRIN’s own measurements are ongoing, with the collaboration aiming for 1000 days of operations by the end of 2025 and a final sensitivity approaching 0.3 eV. Beyond that, the plan is to repurpose the instrument to search for sterile neutrinos – hypothetical heavier particles that don’t interact via the weak force and could be candidates for dark matter.

“We’re testing things like atomic tritium sources and ultra-precise energy detectors,” Wiesinger says. “There are exciting ideas, but it’s not yet clear what the next-generation experiment after KATRIN will look like.”

The research appears in Science.

The post KATRIN sets tighter limit on neutrino mass appeared first on Physics World.

On the path towards a quantum economy

16 April 2025 at 16:15
The high-street bank HSBC has worked with the NQCC, hardware provider Rigetti and the Quantum Software Lab to investigate the advantages that quantum computing could offer for detecting the signs of fraud in transactional data. (Courtesy: Shutterstock/Westend61 on Offset)

Rapid technical innovation in quantum computing has yielded an array of hardware platforms that can run increasingly sophisticated algorithms. In the real world, however, such technical advances will remain little more than a curiosity if they are not adopted by businesses and the public sector to drive positive change. As a result, one key priority for the UK’s National Quantum Computing Centre (NQCC) has been to help companies and other organizations to gain an early understanding of the value that quantum computing can offer for improving performance and enhancing outcomes.

To meet that objective the NQCC has supported several feasibility studies that enable commercial organizations in the UK to work alongside quantum specialists to investigate specific use cases where quantum computing could have a significant impact within their industry. One prime example is a project involving the high-street bank HSBC, which has been exploring the potential of quantum technologies for spotting the signs of fraud in financial transactions. Such fraudulent activity, which affects millions of people every year, now accounts for about 40% of all criminal offences in the UK and in 2023 generated total losses of more than £2.3 bn across all sectors of the economy.

Banks like HSBC currently exploit classical machine learning to detect fraudulent transactions, but these techniques require a large computational overhead to train the models and deliver accurate results. Quantum specialists at the bank have therefore been working with the NQCC, along with hardware provider Rigetti and the Quantum Software Lab at the University of Edinburgh, to investigate the capabilities of quantum machine learning (QML) for identifying the tell-tale indicators of fraud.

“HSBC’s involvement in this project has brought transactional fraud detection into the realm of cutting-edge technology, demonstrating our commitment to pushing the boundaries of quantum-inspired solutions for near-term benefit,” comments Philip Intallura, Group Head of Quantum Technologies at HSBC. “Our philosophy is to innovate today while preparing for the quantum advantage of tomorrow.”

Another study focused on a key problem in the aviation industry that has a direct impact on fuel consumption and the amount of carbon emissions produced during a flight. In this logistical challenge, the aim was to find the optimal way to load cargo containers onto a commercial aircraft. One motivation was to maximize the amount of cargo that can be carried; the other was to balance the weight of the cargo to reduce drag and improve fuel efficiency.

“Even a small shift in the centre of gravity can have a big effect,” explains Salvatore Sinno of technology solutions company Unisys, who worked on the project along with applications engineers at the NQCC and mathematicians at the University of Newcastle. “On a Boeing 747 a displacement of just 75 cm can increase the carbon emissions on a flight of 10,000 miles by four tonnes, and also increases the fuel costs for the airline company.”

A hybrid quantum–classical solution has been used to optimize the configuration of air freight, which can improve fuel efficiency and lower carbon emissions. (Courtesy: Shutterstock/supakitswn)

With such a large number of possible loading combinations, classical computers cannot produce an exact solution for the optimal arrangement of cargo containers. In their project the team improved the precision of the solution by combining quantum annealing with high-performance computing, a hybrid approach that Unisys believes can offer immediate value for complex optimization problems. “We have reached the limit of what we can achieve with classical computing, and with this work we have shown the benefit of incorporating an element of quantum processing into our solution,” explains Sinno.
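The article does not give the mathematical formulation, but cargo-loading problems of this kind are typically cast as a quadratic binary optimization, the native input of a quantum annealer. Below is a minimal sketch with invented container weights and hand-tuned penalty terms – not Unisys’s actual model. The capacity inequality would need slack variables to become a strict QUBO, and an annealer or hybrid solver would sample the same cost landscape that is brute-forced here:

```python
import itertools

# Hypothetical data: container weights (tonnes) and slot positions (m from nose).
weights = [3.2, 1.8, 4.5, 2.7, 3.9, 1.1]
positions = [5, 10, 15, 20, 25, 30]
capacity = 12.0      # maximum payload (tonnes)
target_cg = 17.5     # desired centre of gravity (m)
lam_cap, lam_cg = 10.0, 0.05  # penalty strengths, tuned by hand

def cost(x):
    """Objective for a 0/1 assignment x (x[i] = 1 means container i is loaded):
    reward total weight, penalize exceeding capacity and centre-of-gravity offset.
    The CG term (moment - target*weight)^2 is genuinely quadratic in the bits."""
    w = sum(xi * wi for xi, wi in zip(x, weights))
    moment = sum(xi * wi * pi for xi, wi, pi in zip(x, weights, positions))
    cg_penalty = (moment - target_cg * w) ** 2
    over = max(0.0, w - capacity)  # needs slack bits in a strict QUBO encoding
    return -w + lam_cap * over ** 2 + lam_cg * cg_penalty

best = min(itertools.product([0, 1], repeat=len(weights)), key=cost)
print("load pattern:", best,
      "| weight:", sum(b * w for b, w in zip(best, weights)))
```

On six containers an exhaustive search is trivial; the point of annealing, quantum or classical, is that the same formulation scales to loads where the 2^N search space becomes unmanageable.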

The HSBC project team also found that a hybrid quantum–classical solution could provide an immediate performance boost for detecting anomalous transactions. In this case, a quantum simulator running on a classical computer was used to run quantum algorithms for machine learning. “These simulators allow us to execute simple QML programmes, even though they can’t be run to the same level of complexity as we could achieve with a physical quantum processor,” explains Marco Paini, the project lead for Rigetti. “These simulations show the potential of these low-depth QML programmes for fraud detection in the near term.”
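As a flavour of what a “low-depth QML programme” can look like, here is a toy statevector simulation in plain Python/NumPy: one qubit encodes a single transaction feature as a rotation, a trainable rotation follows, and the probability of measuring |1⟩ is read out as an anomaly score. This is purely illustrative, not HSBC’s or Rigetti’s actual model:

```python
import numpy as np

def ry(theta):
    """Single-qubit Y-rotation matrix."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def score(x, theta):
    """P(|1>) after encoding feature x, then applying trainable angle theta."""
    state = ry(theta) @ ry(x) @ np.array([1.0, 0.0])  # circuit acts on |0>
    return state[1] ** 2

# Synthetic data: 'normal' transactions cluster low, anomalies sit apart.
rng = np.random.default_rng(0)
xs = np.concatenate([rng.normal(0.3, 0.1, 50), rng.normal(2.2, 0.2, 10)])
ys = np.array([0] * 50 + [1] * 10)

# Gradient descent via the parameter-shift rule (the constant factor in the
# shift formula is absorbed into the learning rate).
theta, lr = 0.0, 0.2
for _ in range(300):
    grad = np.mean([(score(x, theta) - y) *
                    (score(x, theta + np.pi / 2) - score(x, theta - np.pi / 2))
                    for x, y in zip(xs, ys)])
    theta -= lr * grad

print("trained theta:", round(float(theta), 3))
print("mean score, normal:", round(float(np.mean([score(x, theta) for x in xs[:50]])), 3))
print("mean score, anomalous:", round(float(np.mean([score(x, theta) for x in xs[50:]])), 3))
```

Scaling this idea up means more qubits, entangling layers and richer feature encodings – exactly the regime where classical simulators run out of steam and physical quantum processors become interesting.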

The team also simulated more complex QML approaches using a similar but smaller-scale problem, demonstrating a further improvement in performance. This outcome suggests that running deeper QML algorithms on a physical quantum processor could deliver an advantage for detecting anomalies in larger datasets, even though the hardware does not yet provide the performance needed to achieve reliable results. “This initiative not only showcases the near-term applicability of advanced fraud models, but it also equips us with the expertise to leverage QML methods as quantum computing scales,” comments Intallura.

Indeed, the results obtained so far have enabled the project partners to develop a roadmap that will guide their ongoing development work as the hardware matures. One key insight, for example, is that even a fault-tolerant quantum computer would struggle to process the huge financial datasets produced by a bank like HSBC, since a finite amount of time is needed to run the quantum calculation for each data point. “From the simulations we found that the hybrid quantum–classical solution produces more false positives than classical methods,” says Paini. “One approach we can explore would be to use the simulations to flag suspicious transactions and then run the deeper algorithms on a quantum processor to analyse the filtered results.”

This particular project also highlighted the need for agreed protocols to navigate the strict rules on data security within the banking sector. For this project the HSBC team was able to run the QML simulations on its existing computing infrastructure, avoiding the need to share sensitive financial data with external partners. In the longer term, however, banks will need reassurance that their customer information can be protected when processed using a quantum computer. Anticipating this need, the NQCC has already started to work with regulators such as the Financial Conduct Authority, which is exploring some of the key considerations around privacy and data security. That initial work is feeding into international initiatives that are starting to consider regulatory frameworks for using quantum computing within the financial sector.

For the cargo-loading project, meanwhile, Sinno says that an important learning point has been the need to formulate the problem in a way that can be tackled by the current generation of quantum computers. In practical terms that means defining constraints that reduce the complexity of the problem, but that still reflect the requirements of the real-world scenario. “Working with the applications engineers at the NQCC has helped us to understand what is possible with today’s quantum hardware, and how to make the quantum algorithms more viable for our particular problem,” he says. “Participating in these studies is a great way to learn and has allowed us to start using these emerging quantum technologies without taking a huge risk.”

Indeed, one key feature of these feasibility studies is the opportunity they offer for different project partners to learn from each other. Each project includes an end-user organization with a deep knowledge of the problem, quantum specialists who understand the capabilities and limitations of present-day solutions, and academic experts who offer an insight into emerging theoretical approaches as well as methodologies for benchmarking the results. The domain knowledge provided by the end users is particularly important, says Paini, to guide ongoing development work within the quantum sector. “If we only focused on the hardware for the next few years, we might come up with a better technical solution but it might not address the right problem,” he says. “We need to know where quantum computing will be useful, and to find that convergence we need to develop the applications alongside the algorithms and the hardware.”

Another major outcome from these projects has been the ability to make new connections and identify opportunities for future collaborations. As a national facility, the NQCC has played an important role in providing networking opportunities that bring diverse stakeholders together, creating a community of end users and technology providers, and supporting project partners with an expert and independent view of emerging quantum technologies. The NQCC has also helped the project teams to share their results more widely, generating positive feedback from the wider community that has already sparked new ideas and interactions.

“We have been able to network with start-up companies and larger enterprise firms, and with the NQCC we are already working with them to develop some proof-of-concept projects,” says Sinno. “Having access to that wider network will be really important as we continue to develop our expertise and capability in quantum computing.”

The post On the path towards a quantum economy appeared first on Physics World.

In-flight connectivity – where national policy and global service (don’t) mix

16 April 2025 at 15:00

An increase in the demand for better in-flight connectivity is inevitable: people don’t want to be incommunicado anymore when they travel. The in-flight connectivity (IFC) market is transitioning from niche […]

The post In-flight connectivity – where national policy and global service (don’t) mix appeared first on SpaceNews.

Microwaves slow down chemical reactions at low temperatures

16 April 2025 at 14:17

In new experiments, researchers in Switzerland have tested models of how microwaves affect low-temperature chemical reactions between ions and molecules. Using an innovative setup, Valentina Zhelyazkova and colleagues at ETH Zurich showed for the first time how applying microwave pulses can slow down reaction rates via nonthermal mechanisms.

Physicists have been studying chemical reactions between ions and neutral molecules for some time. At close to room temperature, classical models can closely predict how the electric fields emanating from ions will induce dipoles in nearby neutral molecules, allowing researchers to calculate these reaction rates with impressive accuracy. Yet as temperatures drop close to absolute zero, a wide array of more complex effects come into play, which have gradually been incorporated into the latest theoretical models.

“At low temperatures, models of reactivity must include the effects of the permanent electric dipoles and quadrupole moments of the molecules, the effect of their vibrational and rotational motion,” Zhelyazkova explains. “At extremely low temperatures, even the quantum-mechanical wave nature of the reactants must be considered.”

Rigorous experiments

Although these low-temperature models have steadily improved in recent years, the ability to put them to the test through rigorous experiments has so far been hampered by external factors.

In particular, stray electric fields in the surrounding environment can heat the ions and molecules, so that any important quantum effects are quickly drowned out by noise. “Consequently, it is only in the past few years that experiments have provided information on the rates of ion–molecule reactions at very low temperatures,” Zhelyazkova explains.

In their study, Zhelyazkova’s team improved on these past experiments through an innovative approach to cooling the internal motions of the molecules being heated by stray electric fields. Their experiment involved a reaction between positively-charged helium ions and neutral molecules of carbon monoxide (CO). This creates neutral atoms of helium and oxygen, and a positively-charged carbon atom.
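Written out, with charge balanced on both sides, the reaction is

$$\mathrm{He^{+} + CO \;\longrightarrow\; He + C^{+} + O}$$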

To initiate the reaction, the researchers created separate but parallel supersonic beams of helium and CO that were combined in a reaction cell. “In order to overcome the problem of heating the ions by stray electric fields, we study the reactions within the distant orbit of a highly excited electron, which makes the overall system electrically neutral without affecting the ion–molecule reaction taking place within the electron orbit,” explains ETH’s Frédéric Merkt.

Giant atoms

In such a “Rydberg atom”, the highly excited electron is some distance from the helium nucleus and its other electron. As a result, a Rydberg helium atom can be considered an ion with a “spectator” electron, which has little influence over how the reaction unfolds. To ensure the best possible accuracy, “we use a printed circuit board device with carefully designed surface electrodes to deflect one of the two beams,” explains ETH’s Fernanda Martins. “We then merged this beam with the other, and controlled the relative velocity of the two beams.”
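To see why the spectator electron sits so far away, recall that the mean orbital radius of a Rydberg state grows quadratically with the principal quantum number n:

$$\langle r \rangle \sim n^2 a_0, \qquad a_0 \approx 0.053\ \mathrm{nm}.$$

The article does not quote the n used, but for illustration n = 30 already gives an orbit of roughly 50 nm, hundreds of times larger than the ion–molecule collision complex at its centre.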

Altogether, this approach enabled the researchers to cool the molecules internally to temperatures below 10 K – where their quantum effects can dominate over externally induced noise. With this setup, Zhelyazkova, Merkt, Martins, and their colleagues could finally put the latest theoretical models to the test.

According to the latest low-temperature models, the rate of the CO–helium ion reaction should be determined by the quantized rotational states of the CO molecule – whose energies lie within the microwave range. In this case, the team used microwave pulses to put the CO into different rotational states, allowing them to directly probe their influence on the overall reaction rate.
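For context (the article does not quote numbers), a rigid-rotor estimate puts CO’s rotational levels at

$$E_J = hBJ(J+1), \qquad B_{\mathrm{CO}} \approx 57.6\ \mathrm{GHz},$$

so the J = 0 → 1 transition sits at 2B ≈ 115 GHz – squarely in the microwave band, which is why microwave pulses are the natural tool for preparing specific rotational states.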

Three important findings

Altogether, their experiment yielded three important findings. First, it confirmed that the reaction rate varies depending on the rotational state of the CO molecule. Second, it showed that this reactivity can be modified by using a short microwave pulse to excite the CO molecule from its ground state to its first excited state – with the first excited state being less reactive than the ground state.

The third and most counterintuitive finding is that microwaves can slow down the reaction rate, via mechanisms unrelated to the heat they impart on the molecules absorbing them. “In most applications of microwaves in chemical synthesis, the microwaves are used as a way to thermally heat the molecules up, which always makes them more reactive,” Zhelyazkova says.

Building on the success of their experimental approach, the team now hopes to investigate these nonthermal mechanisms in more detail – with the aim of shedding new light on how microwaves can influence chemical reactions via effects other than heating. In turn, their results could ultimately pave the way for advanced new techniques for fine-tuning the rate of reactions between ions and neutral molecules.

The research is described in Physical Review Letters.

The post Microwaves slow down chemical reactions at low temperatures appeared first on Physics World.

Robert P Crease lifts the lid on 25 years as a ‘science critic’

16 April 2025 at 12:00

A quarter of a century ago, in May 2000, I published an article entitled “Why science thrives on criticism”. The article, which ran to slightly over a page in Physics World magazine, was the first in a series of columns called Critical Point. Periodicals, I said, have art and music critics as well as sports and political commentators, and book and theatre reviewers too. So why shouldn’t Physics World have a science critic?

The implication that I had a clear idea of the “critical point” for this series was not entirely accurate. As the years go by, I have found myself improvising, inspired by politics, books, scientific discoveries, readers’ thoughts, editors’ suggestions and more. If there is one common theme, it’s that science is like a workshop – or a series of loosely related workshops – as I argued in The Workshop and the World, a book that sprang from my columns.

Workshops are controlled environments, inside which researchers can stage and study special things – elementary particles, chemical reactions, plant uptakes of nutrients – that appear rarely or in a form difficult to study in the surrounding world. Science critics do not participate in the workshops themselves or even judge their activities. What they do is evaluate how workshops and worlds interact.

This can happen in three ways.

Critical triangle

First is to explain why what’s going on inside the workshops matters to outsiders. Sometimes, those activities can be relatively simple to describe, which leads to columns concerning all manner of everyday activities. I have written, for example, about the physics of coffee and breadmaking. I’ve also covered toys, tops, kaleidoscopes, glass and other things that all of us – physicists and non-physicists alike – use, value and enjoy.

Sometimes I draw out more general points about why those activities are important. Early on, I invited readers to nominate their most beautiful experiments in physics. (Spoiler alert: the clear winner was the double-slit experiment with electrons.) I later did something similar about the cultural impact of equations – inviting readers to pick their favourites and reporting on their results (second spoiler alert: Maxwell’s equations came top). I also covered readers’ most-loved literature about laboratories.

Physicists often engage in activities that might seem inconsequential to them yet are an intrinsic part of the practice of physics

When viewing science as workshops, a second role is to explain why what’s outside the workshops matters to insiders. That’s because physicists often engage in activities that might seem inconsequential to them – they’re “just what the rest of the world does” – yet are an intrinsic part of the practice of physics. I’ve covered, for example, physicists taking out patents, creating logos, designing lab architecture, taking holidays, organizing dedications, going on retirement and writing memorials for the deceased.

Such activities I term “black elephants”. That’s because they’re a cross between things physicists don’t want to talk about (“elephants in the room”) and things that force them to renounce cherished notions (just as “black swans” disprove that “all swans are white”).

A third role of a science critic is to explain what matters that takes place both inside and outside the workshop. I’m thinking of things like competition, leadership, trust, surprise, workplace training courses, cancel culture and even jokes and funny tales. Interpretations of the meaning of quantum mechanics, such as “QBism”, which I covered both in 2019 and 2022, are an ongoing interest. That’s because they’re relevant both to the structure of physics and to philosophy as they disrupt notions of realism, objectivity, temporality and the scientific method.

Being critical

The term “critic” may suggest someone with a congenitally negative outlook, but that’s wrong. My friend Fred Cohn, a respected opera critic, told me that, in a conversation after a concert, he criticized the performance of the singer Luciano Pavarotti. His remark provoked a woman to shout angrily at him: “Could you do better?” Of course not! It’s the critic’s role to evaluate performances of an activity, not to perform the activity oneself.

Working practices In his first Critical Point column for Physics World, philosopher and historian of science Robert P Crease interrogated the role of the science critic. (Courtesy: iStock/studiostockart)

Having said that, sometimes a critic must be critical to be honest. In particular, I hate it when scientists try to delegitimize the experience of non-scientists by saying, for example, that “time does not exist”. Or when they pretend that they see not rainbows but wavelengths of light, or that they see not sunrises or the plane of a Foucault pendulum moving but the Earth spinning. Comments like that turn non-scientists off science by making it seem elitist and other-worldly. It’s what I call “scientific gaslighting”.

Most of all, I hate it when scientists pontificate that philosophy is foolish or worthless, especially when it’s the likes of Steven Pinker, who ought to know better. Writing in Nature (518 300), I once criticized the great theoretical physicist Steven Weinberg, who I counted as a friend, for taking a complex and multivalent text, plucking out a single line, and misreading it as if the line were from a physics text.

The text in question was Plato’s Phaedo, where Socrates expresses his disappointment with his fellow philosopher Anaxagoras for giving descriptions of heavenly bodies “in purely physical terms, without regard to what is best”. Weinberg claimed this statement meant that Socrates “was not very interested in natural science”. Nothing could be further from the truth.

At that moment in the Phaedo, Socrates is recounting his intellectual autobiography. He has just come to the point where, as a youth, he was entranced by materialism and was eager to hear Anaxagoras’s opposing position. When Anaxagoras promised to describe the heavens both mechanically and as the product of a wise and divine mind but could do only the former, Socrates says he was disappointed.

Weinberg’s jibe ignores the context. Socrates is describing how he had once embraced Anaxagoras’s view of a universe ruled by a divine mind but later rejected that view. As an adult, Socrates learned to evaluate hypotheses and other claims by putting them to the test, just as modern-day scientists do. Weinberg was misrepresenting Socrates by ascribing to him a position he had long since abandoned.

The critical point of the critical point

Ultimately, the “critical point” of my columns over the last 25 years has been to provoke curiosity and excitement about what philosophers, historians and sociologists do for science. I’ve also wanted to raise awareness that these fields are not just fripperies but essential if we are to fully understand and protect scientific activity.

As I have explained several times – especially in the wake of the US shutting its High Flux Beam Reactor and National Tritium Labeling Facility – scientists need to understand and relate to the surrounding world with the insight of humanities scholars. Because if they don’t, they are in danger of losing their workshops altogether.

The post Robert P Crease lifts the lid on 25 years as a ‘science critic’ appeared first on Physics World.

Helium nanobubble measurements shed light on origins of heavy elements in the universe

16 April 2025 at 10:00

New measurements by physicists from the University of Surrey in the UK have shed fresh light on where the universe’s heavy elements come from. The measurements, which were made by smashing high-energy protons into a uranium target to generate strontium ions, then accelerating these ions towards a second, helium-filled target, might also help improve nuclear reactors.

The origin of the elements that follow iron in the periodic table is one of the biggest mysteries in nuclear astrophysics. As Surrey’s Matthew Williams explains, the standard picture is that these elements were formed when other elements captured neutrons, then underwent beta decay. The two ways this can happen are known as the rapid (r) and slow (s) processes.

The s-process occurs in the cores of stars and is relatively well understood. The r-process is comparatively mysterious. It occurs during violent astrophysical events such as certain types of supernovae and neutron star mergers that create an abundance of free neutrons. In these neutron-rich environments, atomic nuclei essentially capture neutrons before the neutrons can turn into protons via beta-minus decay, which occurs when a neutron emits an electron and an antineutrino.

From the night sky to the laboratory

One way of studying the r-process is to observe older stars. “Studies on heavy element abundance patterns in extremely old stars provide important clues here because these stars formed at times too early for the s-process to have made a significant contribution,” Williams explains. “This means that the heavy element pattern in these old stars may have been preserved from material ejected by prior extreme supernovae or neutron star merger events, in which the r-process is thought to happen.”

Recent observations of this type have revealed that the r-process is not necessarily a single scenario with a single abundance pattern. It may also have a “weak” component that is responsible for making elements with atomic numbers ranging from 37 (rubidium) to 47 (silver), without getting all the way up to the heaviest elements such as gold (atomic number 79) or actinides like thorium (90) and uranium (92).

This weak r-process could occur in a variety of situations, Williams explains. One scenario involves radioactive isotopes (that is, those with a few more neutrons than their stable counterparts) forming in hot neutrino-driven winds streaming from supernovae. This “flow” of nucleosynthesis towards higher neutron numbers is caused by processes known as (alpha,n) reactions, which occur when a radioactive isotope fuses with a helium nucleus and spits out a neutron. “These reactions impact the final abundance pattern before the neutron flux dissipates and the radioactive nuclei decay back to stability,” Williams says. “So, to match predicted patterns to what is observed, we need to know how fast the (alpha,n) reactions are on radioactive isotopes a few neutrons away from stability.”

The 94Sr(alpha,n)97Zr reaction

To obtain this information, Williams and colleagues studied a reaction in which radioactive strontium-94 absorbs an alpha particle (a helium nucleus), then emits a neutron and transforms into zirconium-97. To produce the radioactive 94Sr beam, they fired high-energy protons at a uranium target at TRIUMF, the Canadian national accelerator centre. Using lasers, they selectively ionized and extracted strontium from the resulting debris before filtering out 94Sr ions with a magnetic spectrometer.
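Written out with a quick bookkeeping check, the reaction is

$$^{94}_{38}\mathrm{Sr} + {}^{4}_{2}\mathrm{He} \;\longrightarrow\; {}^{97}_{40}\mathrm{Zr} + n,$$

with charge (38 + 2 = 40) and mass number (94 + 4 = 97 + 1) both conserved.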

The team then accelerated a beam of these 94Sr ions to energies representative of collisions that would happen when a massive star explodes as a supernova. Finally, they directed the beam onto a nanomaterial target made of a silicon thin film containing billions of small nanobubbles of helium. This target was made by researchers at the Materials Science Institute of Seville (CSIC) in Spain.

“This thin film crams far more helium into a small target foil than previous techniques allowed, thereby enabling the measurement of helium burning reactions with radioactive beams that characterize the weak r-process,” Williams explains.

To identify the 94Sr(alpha,n)97Zr reactions, the researchers used a mass spectrometer to select for 97Zr while simultaneously using an array of gamma-ray detectors around the target to look for the gamma rays it emits. When they saw both a heavy ion with an atomic mass of 97 and a 97Zr gamma ray, they knew they had identified the reaction of interest. In doing so, Williams says, they were able to measure the probability that this reaction occurs at the energies and temperatures present in supernovae.

Williams thinks that scientists should be able to measure many more weak r-process reactions using this technology. This should help them constrain where the weak r-process comes from. “Does it happen in supernovae winds? Or can it happen in a component of ejected material from neutron star mergers?” he asks.

As well as shedding light on the origins of heavy elements, the team’s findings might also help us better understand how materials respond to the high radiation environments in nuclear reactors. “By updating models of how readily nuclei react, especially radioactive nuclei, we can design components for these reactors that will operate and last longer before needing to be replaced,” Williams says.

The work is detailed in Physical Review Letters.

The post Helium nanobubble measurements shed light on origins of heavy elements in the universe appeared first on Physics World.
