
Nano-engineered flyers could soon explore Earth’s mesosphere

21 August 2025 at 13:00

Small levitating platforms that can stay airborne indefinitely at very high altitudes have been developed by researchers in the US and Brazil. Propelled by photophoresis, the devices could be adapted to carry small payloads in the mesosphere, where flight is notoriously difficult. The technique could even be used in the atmospheres of moons and other planets.

Photophoresis occurs when light illuminates one side of a particle, heating it slightly more than the other. The resulting temperature difference in the surrounding gas means that molecules rebound with more energy on the warmer side than the cooler side – producing a tiny but measurable push.

For most of the time since its discovery in the 1870s, the effect was little more than a curiosity. But with more recent advances in nanotechnology, researchers have begun to explore how photophoresis could be put to practical use.

“In 2010, my graduate advisor, David Keith, had previously written a paper that described photophoresis as a way of flying microscopic devices in the atmosphere, and we wanted to see if larger devices could carry useful payloads,” explains Ben Schafer at Harvard University, who led the research. “At the same time, [Igor Bargatin’s group at the University of Pennsylvania] was doing fascinating work on larger devices that generated photophoretic forces.”

Carrying payloads

These studies considered a wide variety of designs: from artificial aerosols, to thin disks with surfaces engineered to boost the effect. Building on this earlier work, Schafer’s team investigated how lightweight photophoretic devices could be optimized to carry payloads in the mesosphere: the atmospheric layer at about 50–80 km above Earth’s surface, where the sparsity of air creates notoriously difficult flight conditions for conventional aircraft or balloons.

“We used these results to fabricate structures that can fly in near-space conditions, namely, under less than the illumination intensity of sunlight and at the same pressures as the mesosphere,” Schafer explains.

The team’s design consists of two alumina membranes – each 100 nm thick, and perforated with nanoscale holes. The membranes are positioned a short distance apart, and connected by ligaments. In addition, the bottom membrane is coated with a light-absorbing chromium layer; as it absorbs incoming sunlight, it heats the surrounding air more than the top layer does.

As a result, air molecules move preferentially from the cooler top side toward the warmer bottom side through the membranes’ perforations: a photophoretic process known as thermal transpiration. This one-directional flow creates a pressure imbalance across the device, generating upward thrust. If this force exceeds the device’s weight, it can levitate and even carry a payload. The team also suggests that the devices could be kept aloft at night using the infrared radiation emitted by Earth into space.
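The levitation condition described above – thrust from the transpiration-driven pressure imbalance exceeding the device's weight – can be sketched in a few lines. The Python estimate below uses entirely illustrative numbers (the ambient pressure, membrane temperatures, porosity and efficiency factor are assumptions for the purpose of the sketch, not the team's values):

```python
import math

# Order-of-magnitude levitation check for a photophoretic flyer.
# All numbers below are illustrative assumptions, not values from the paper.
g = 9.81            # m/s^2
p = 1.0             # Pa, rough ambient pressure near 75 km altitude
T_cold = 200.0      # K, cooler (top) membrane
T_hot = 230.0       # K, chromium-coated bottom membrane, assumed ~30 K warmer
efficiency = 0.1    # assumed fraction of the ideal Knudsen-pump pressure head

# Zero-flow thermal transpiration can sustain p_hot/p_cold ~ sqrt(T_hot/T_cold);
# a real perforated membrane realises only a fraction of that pressure head.
dp = efficiency * p * (math.sqrt(T_hot / T_cold) - 1.0)   # Pa

radius = 0.03                       # m, a 3 cm flyer
area = math.pi * radius**2          # m^2
thrust = dp * area                  # N

# Two 100 nm alumina membranes (density ~3950 kg/m^3), assumed ~50% porous:
areal_density = 3950.0 * 2 * 100e-9 * 0.5   # kg/m^2
weight = areal_density * area * g           # N

payload_mg = (thrust - weight) / g * 1e6    # spare lift, in milligrams
print(f"thrust {thrust:.2e} N vs weight {weight:.2e} N, payload ~{payload_mg:.1f} mg")
```

With these assumed numbers the thrust comfortably exceeds the membrane weight, leaving of order a milligram of spare lift – the right ballpark for the payloads discussed below, though the real figures depend on the detailed design.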

Simulations and experiments

Through a combination of simulations and experiments, Schafer and his colleagues examined how factors such as device size, hole density, and ligament distribution could be tuned to maximize thrust at different mesospheric altitudes – where both pressure and temperature can vary dramatically. They showed that platforms 10 cm in radius could feasibly remain aloft throughout the mesosphere, powered by sunlight at intensities lower than those actually present there.

Based on these results, the team created a feasible design for a photophoretic flyer with a 3 cm radius, capable of carrying a 10 mg payload indefinitely at an altitude of 75 km. With an optimized design, they predict payloads as large as 100 mg could be supported during daylight.

“These payloads could support a lightweight communications payload that could transmit data directly to the ground from the mesosphere,” Schafer explains. “Small structures without payloads could fly for weeks or months without falling out of the mesosphere.”

With this proof of concept, the researchers are now eager to see photophoretic flight tested in real mesospheric conditions. “Because there’s nothing else that can sustainably fly in the mesosphere, we could use these devices to collect ground-breaking atmospheric data to benefit meteorology, perform telecommunications, and predict space weather,” Schafer says.

Requiring no fuel, batteries, or solar panels, the devices would be completely sustainable. And the team’s ambitions go beyond Earth: with the ability to stay aloft in any low-pressure atmosphere with sufficient light, photophoretic flight could also provide a valuable new approach to exploring the atmosphere of Mars.

The research is described in Nature.

The post Nano-engineered flyers could soon explore Earth’s mesosphere appeared first on Physics World.

Deep-blue LEDs get a super-bright, non-toxic boost

21 August 2025 at 10:00

A team led by researchers at Rutgers University in the US has discovered a new semiconductor that emits bright, deep-blue light. The hybrid copper iodide material is stable, non-toxic, can be processed in solution and has already been integrated into a light-emitting diode (LED). According to its developers, it could find applications in solid-state lighting and display technologies.

Creating white light for solid-state lighting and full-colour displays requires bright, pure sources of red, green and blue light. While stable materials that efficiently emit red or green light are relatively easy to produce, those that generate blue light (especially deep-blue light) are much more challenging. Existing blue-light emitters based on organic materials are unstable, meaning they lose their colour quality over time. Alternatives based on lead-halide perovskites or cadmium-containing colloidal quantum dots are more stable, but also toxic for humans and the environment.

Hybrid copper-halide-based emitters promise the best of both worlds, being both non-toxic and stable. They are also inexpensive, with tuneable optical properties and a high luminescence efficiency, meaning they are good at converting power into visible light.

Researchers have already used a pure inorganic copper iodide material, Cs3Cu2I5, to make deep-blue LEDs. This material emits light at the ideal wavelength of 445 nm, is robust to heat and moisture, and re-emits 87–95% of the excitation photons it absorbs as luminescence photons, giving it a high photoluminescence quantum yield (PLQY).

However, the maximum ratio of photon output to electron input (known as the maximum external quantum efficiency, EQEmax) for this material is very low, at just 1.02%.
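The definition of external quantum efficiency – photons out per electron in – can be made concrete with a short sketch. The drive current and photon rate below are hypothetical numbers chosen for illustration, not measurements from the paper:

```python
# External quantum efficiency: photons emitted out of the device per
# electron injected.  Numbers below are illustrative, not from the paper.
E_CHARGE = 1.602176634e-19  # elementary charge, C

def external_quantum_efficiency(current_a, photon_rate_per_s):
    """EQE = (photons out per second) / (electrons in per second)."""
    electrons_per_s = current_a / E_CHARGE
    return photon_rate_per_s / electrons_per_s

# A hypothetical device driven at 1 mA that emits 7.9e14 photons per second:
current = 1e-3       # A
photons = 7.9e14     # photons/s escaping the device
eqe = external_quantum_efficiency(current, photons)
print(f"EQE = {eqe:.1%}")
```

Most injected electrons are lost to non-radiative recombination or to photons trapped inside the device, which is why even emitters with near-unity PLQY can have single-digit EQE.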

Strong deep-blue photoluminescence

In the new work, a team led by Rutgers materials chemist Jing Li developed a hybrid copper iodide with the chemical formula 1D-Cu4I8(Hdabco)4 (CuI(Hda)), where Hdabco is 1,4-diazabicyclo[2.2.2]octane-1-ium. This material emits strong deep-blue light at 449 nm with a PLQY near unity (99.6%).

Li and colleagues opted to use CuI(Hda) as the sole light-emitting layer and built a thin-film LED out of it using a solution process. The new device has an EQEmax of 12.6% with colour coordinates (0.147, 0.087) and a peak brightness of around 4000 cd m⁻². It is also relatively stable, with an operational half-lifetime (T50) of approximately 204 hours under ambient conditions. These figures mean that its performance rivals the best existing solution-processed deep-blue LEDs, Li says. The team also fabricated a large-area device measuring 4 cm² to demonstrate that the material could be used in real-world applications.

Interfacial hydrogen-bond passivation strategy

The low efficiency of previous such devices is partly due to the fact that charge carriers (electrons and holes) in these materials rapidly recombine in a non-radiative way, typically because of surface and bulk defects, or traps. The charge carriers also have a low radiative recombination rate, which is associated with a small exciton (electron-hole pair) binding energy.

Li and colleagues overcame this problem in their new device thanks to a dual-interfacial hydrogen-bond passivation (DIHP) strategy that introduces hydrogen bonds via an ultrathin sheet of poly(methyl methacrylate) (PMMA) and a carbazole-phosphonic-acid-based self-assembled monolayer (Ac2PACz) at the two interfaces of the CuI(Hda) emissive layer. This effectively passivates both heterojunctions of the hybrid copper-iodide light-emitting layer and optimizes exciton binding energies. “Such a synergistic surface modification dramatically boosts the performance of the deep-blue LED fourfold,” explains Li.

According to Li, the study suggests a promising route for developing blue emitters that are both energy-efficient and environmentally benign, without compromising on performance. “Through the fabrication of blue LEDs using a low cost, stable and nontoxic material capable of delivering efficient deep-blue light, we address major energy and ecological limitations found in other types of solution-processable emitters,” she tells Physics World.

Li adds that the hydrogen-bonding passivation technique is not limited to the material studied in this work. It could also be applied to minimize interfacial energy losses in a wide range of other solution-based, light-emitting optoelectronic systems.

The team is now pursuing strategies for developing other solution-processable, high-performance hybrid copper iodide-based emitter materials similar to CuI(Hda). “Our goal is to further enhance the efficiency and extend the operational lifetime of LEDs utilizing these next-generation materials,” says Li.

The present work is detailed in Nature.


Physicists discover a new proton magic number

20 August 2025 at 15:00

The first precise mass measurements of an extremely short-lived and proton-rich nucleus, silicon-22, have revealed the “magic” – that is, unusually tightly bound – nature of nuclei containing 14 protons. As well as shedding light on nuclear structure, the discovery could improve our understanding of the strong nuclear force and the mechanisms by which elements form.

At the lighter end of the periodic table, stable nuclei tend to contain similar numbers of neutrons and protons. As the number of protons increases, additional neutrons are needed to balance out the mutual repulsion of the positively-charged protons. As a rule, therefore, an isotope of a given element will be unstable if it contains either too few neutrons or too many.

In 1949, Maria Goeppert Mayer and J Hans D Jensen proposed an explanation for this rule. According to their nuclear shell model, nuclei that contain certain “magic” numbers of nucleons (neutrons and/or protons) are more tightly bound because they have just the right number of nucleons to completely fill their shells. Nuclei that contain magic numbers of both protons and neutrons are more tightly bound still, and are said to be “doubly magic”. Subsequent studies showed that for neutrons, these magic numbers are 2, 8, 20, 28, 50, 82 and 126.
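The shell-filling logic can be sketched with the standard textbook level ordering (including spin-orbit splitting). Each level of angular momentum j holds 2j+1 identical nucleons, and the magic numbers appear as cumulative counts at large energy gaps; notably, the count 14 appears as a sub-shell closure when the 1d5/2 level is filled:

```python
# Cumulative nucleon counts in the standard shell-model level ordering
# (with spin-orbit splitting); a level of angular momentum j holds 2j+1
# identical nucleons.  Gaps after certain closures give the magic numbers.
levels = [
    ("1s1/2", 2), ("1p3/2", 4), ("1p1/2", 2),
    ("1d5/2", 6), ("2s1/2", 2), ("1d3/2", 4),
    ("1f7/2", 8), ("2p3/2", 4), ("1f5/2", 6), ("2p1/2", 2), ("1g9/2", 10),
]

total, closures = 0, {}
for label, occupancy in levels:
    total += occupancy
    closures[label] = total

print(closures)
# the classic magic numbers 2, 8, 20, 28 and 50 appear as cumulative
# counts -- and so does 14, once the 1d5/2 level is filled
```

Whether such a sub-shell gap is large enough to act as a genuine magic number in exotic nuclei is exactly what measurements like the silicon-22 one test.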

While the magic numbers for stable and long-lived nuclei are now well-established, those for exotic, short-lived ones with unusual proton-neutron ratios are comparatively little understood. Do these highly unstable nuclei have the same magic numbers as their more stable counterparts? Or are they different?

In recent years, studies showing that neutron-rich nuclei have magic numbers of 14, 16, 32 and 34 have brought scientists closer to answering this question. But what about protons?

“The hunt for new magic numbers in proton-rich nuclei is just as exciting,” says Yuan-Ming Xing, a physicist at the Institute for Modern Physics (IMP) of the Chinese Academy of Sciences, who led the latest study on silicon-22. “This is because we know much less about the evolution of the shell structure of these nuclei, in which the valence protons are loosely bound.” Protons in these nuclei can even couple to states in the continuum, Xing adds, forming the open quantum systems that have become such a hot topic in quantum research.

Mirror nuclei

After measurements on oxygen-22 (14 neutrons, 8 protons) showed that 14 is a magic number of neutrons for this neutron-rich isotope, the hunt was on for a proton-rich counterpart. An important theory in nuclear physics known as isospin symmetry states that nuclei with interchanged numbers of protons and neutrons will have identical characteristics. The magic numbers for protons and neutrons for these “mirror” nuclei, as they are known, are therefore expected to be the same. “Of all the new neutron-rich doubly-magic nuclei discovered, only one loosely bound mirror nucleus for oxygen-22 exists,” says IMP team member Yuhu Zhang. “This is silicon-22.”

The problem is that silicon-22 (14 protons, 8 neutrons) has a short half-life and is hard to produce in quantities large enough to study. To overcome this, the researchers used an improved version of a technique known as Bρ-defined isochronous mass spectrometry.

Working at the Cooler-Storage Ring of the Heavy Ion Research Facility in Lanzhou, China, Xing, Zhang and an international team of collaborators began by accelerating a primary beam of stable 36Ar15+ ions to around two thirds the speed of light. They then directed this beam onto a 15-mm-thick beryllium target, causing some of the 36Ar ions to fragment into silicon-22 nuclei. After injecting these nuclei into the storage ring, the researchers could measure their velocity and the time it took them to circle the ring. From this, they could determine their mass. This measurement confirmed that the proton number 14 is indeed magic in silicon-22.
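The relation behind this kind of storage-ring measurement is that the magnetic rigidity Bρ equals the momentum per charge, Bρ = γmv/q, so a measured velocity pins down the mass-to-charge ratio. A minimal sketch, with a placeholder rigidity value rather than the experiment's actual settings:

```python
import math

C_LIGHT = 299_792_458.0  # speed of light, m/s

def mass_to_charge(b_rho, v):
    """m/q in kg/C from the magnetic rigidity B*rho = p/q = gamma*m*v/q."""
    beta = v / C_LIGHT
    gamma = 1.0 / math.sqrt(1.0 - beta**2)
    return b_rho / (gamma * v)

# Hypothetical numbers: an ion circulating at two thirds of the speed of
# light in a ring of known rigidity (5 T*m is a placeholder value).
v = (2.0 / 3.0) * C_LIGHT
m_over_q = mass_to_charge(5.0, v)
print(f"m/q = {m_over_q:.3e} kg/C")
```

In practice the revolution time around a ring of known circumference gives the velocity, and comparison with reference ions of well-known mass calibrates away systematic uncertainties.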

A better understanding of nucleon interactions

“Our work offers an excellent opportunity to test the fundamental theories of nuclear physics for a better understanding of nucleon interactions, of how exotic nuclear structures evolve and of the limit of existence of extremely exotic nuclei,” says team member Giacomo de Angelis, a nuclear physicist affiliated with the National Laboratories of Legnaro in Italy as well as the IMP. “It could also help shed more light on the reaction rates for element formation in stars – something that could help astrophysicists to better model cosmic events and understand how our universe works.”

According to de Angelis, this first mass measurement of the silicon-22 nucleus and the discovery of the magic proton number 14 is “a strong invitation not only for us, but also for other nuclear physicists around the world to investigate further”. He notes that researchers at the Facility for Rare Isotope Beams (FRIB) at Michigan State University, US, recently measured the energy of the first excited state of the silicon-22 nucleus. The new High Intensity Heavy-Ion Accelerator Facility (HIAF) in Huizhou, China, which is due to come online soon, should enable even more detailed studies.

“HIAF will be a powerful accelerator, promising us ideal conditions to explore other loosely bound systems, thereby helping theorists to more deeply understand nucleon-nucleon interactions, quantum mechanics of open quantum systems and the origin of elements in the universe,” he says.

The present study is detailed in Physical Review Letters.


Equations, quarks and a few feathers: more physics than birds

20 August 2025 at 12:00

Lots of people like birds. In Britain alone, 17 million households collectively spend £250m annually on 150,000 tonnes of bird food, while 1.2 million people are paying members of the Royal Society for the Protection of Birds (RSPB), Europe’s largest conservation charity. But what is the Venn diagram overlap between those who like birds and those who like physics?

The 11,000 or more species of birds in the world have evolved to occupy separate ecological niches, with many remarkable abilities that, while beyond human capabilities, can be explained by physics. Owls, for example, detect their prey by hearing with asymmetric ears then fly almost silently to catch it. Kingfishers and ospreys, meanwhile, dive for fish in fresh water or the sea, compensating for the change of refractive index at the surface. Kestrels and hummingbirds, on the other hand, can hover through clever use of aerodynamics.

Many birds choose when to migrate by detecting subtle changes in barometric pressure. They are often colourful and can even appear blue – a colour that pigments rarely produce in nature – thanks to the structure of their feathers, which can make them look kaleidoscopic depending on the viewing angle. Many species can even see into the ultraviolet; the blue tits in our gardens look very different in each other’s eyes than they do to ours.

Those of us with inquisitive minds cannot help but wonder how they do these things. Now, The Physics of Birds and Birding: the Sounds, Colors and Movements of Birds, and Our Tools for Watching Them by retired physicist Michael Hurben covers all of these wonders and more.

Where are the birds?

In each chapter Hurben introduces a new physics-related subject, often with an unexpected connection to birds. The more abstruse topics include fractals, gravity, electrostatics, osmosis and Fourier transforms. You might not think quarks would be mentioned in a book on birds, but they are. Some of these complicated subjects, however, take the author several pages to explain, and it can then be a disappointment to discover just a short paragraph mentioning a bird. It is also only in the final chapter that the author explains flight, the attribute unique among vertebrates to birds (and bats).

The antepenultimate chapter justifies the second part of the book’s title – birding. It describes the principles underlying some of the optical instruments used by humans to detect and identify birds, such as binoculars, telescopes and cameras. The physics is simpler, so the answers here might be more familiar to non-scientist birders. Indeed, focal lengths, refractive indices, shape of lenses and anti-reflection coatings, for example, are often covered in school physics and known to anyone wearing spectacles.

Unfortunately, Hurben has not heeded the warning given to Stephen Hawking by the editor of A Brief History of Time, which is that each equation would halve the book’s readership. That masterpiece includes just a single equation, one that any physicist could predict. But The Physics of Birds and Birding sets the scene with seven equations in its first chapter, and many more throughout. While understanding is helped by over 100 small diagrams, if you’re expecting beautiful photos and illustrations of birds, you’ll be disappointed. In fact, there are no images of birds whatsoever – and without them the book appears like an old-fashioned black-and-white textbook.

Physicist or birder?

The author’s interest in birds appears to be in travelling to see them, and he has a “life-list” of over 5000 species. But not much attention in this book is paid to those of us who are more interested in studying birds for conservation. For example, there is no mention of thermal imaging instruments or drones – technology that depends a lot on physics – which are increasingly being used to avoid fieldworkers having to search through sensitive vegetation or climb trees to find birds or their nests. Nowadays, there are more interactions between humans and birds using devices such as smartphones, GPS or digital cameras, or indeed the trackers attached to birds by skilled and licensed scientists, but none of these is covered in The Physics of Birds and Birding.

Although I am a Fellow of the Institute of Physics and the Royal Society of Biology who has spent more than 50 years as an amateur birder and published many papers on both topics, it is not clear to me who the intended audience for this volume is. It seems to me that it would be of more interest to physicists who enjoy seeing physics applied to the natural world than to birders who want to understand how birds work. Either way, the book is definitely for only a select part of the birder-physicist Venn diagram.

  • 2025 Pelagic Publishing 240pp £30 pb; £30 e-book


Probing quantum entanglement in top quark pairs at the LHC

20 August 2025 at 10:19

Quantum entanglement is a fascinating phenomenon in which the states of two particles become intrinsically linked, such that a change in one particle’s state instantly affects the other, regardless of the distance between them. This nonlocal connection leads to remarkable effects, including the ability to influence one particle by manipulating its entangled partner. Famously, Einstein referred to quantum entanglement as “spooky action at a distance”. Until now, entanglement had been observed only in systems involving atoms, electrons and photons.

Researchers from the CMS Collaboration at CERN have used data collected in 2016 at the Large Hadron Collider (LHC) to investigate quantum entanglement in top quark–antiquark pairs. In these experiments, protons were collided at extremely high energies (13 TeV), which, as predicted by Quantum Chromodynamics, can lead to the production of top quark pairs. The top quark is the heaviest known fundamental particle, and both it and its antiquark counterpart decay almost instantly after being produced.

In this study, CMS focused on events where two leptons (such as electrons or muons) were detected with opposite charges and high momentum. These leptons originate from the decay of the top quark and antiquark and carry information about their parent particles, including their spin. To probe entanglement, the researchers used an observable called D, which quantifies the correlation between the spins of the top quark and antiquark. A value of D < -1/3 serves as a clear signature of quantum entanglement. The CMS result lies more than five standard deviations below this threshold, establishing the observation of entanglement in top quark-antiquark pairs. This offers valuable insight into the behaviour of quantum systems under extreme energy conditions.
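The role of the D observable can be illustrated with a toy Monte Carlo. In the dilepton channel the normalised distribution of the angle φ between the two lepton directions (each taken in its parent's rest frame) is (1/σ)dσ/dcosφ = (1 − D cosφ)/2, so ⟨cosφ⟩ = −D/3 and D can be estimated as −3⟨cosφ⟩. The injected D value and sample size below are hypothetical:

```python
import random

# Toy estimate of the spin-correlation observable D from the angle phi
# between the two decay leptons.  The normalised distribution is
# (1/sigma) dsigma/dcos(phi) = (1 - D*cos(phi))/2, so D = -3*<cos(phi)>.
def sample_cos_phi(d_true, n, rng):
    """Rejection-sample cos(phi) from (1 - d_true*x)/2 on [-1, 1]."""
    out = []
    pmax = (1.0 + abs(d_true)) / 2.0
    while len(out) < n:
        x = rng.uniform(-1.0, 1.0)
        if rng.uniform(0.0, pmax) < (1.0 - d_true * x) / 2.0:
            out.append(x)
    return out

rng = random.Random(42)
data = sample_cos_phi(-0.48, 200_000, rng)   # injected D below -1/3
d_hat = -3.0 * sum(data) / len(data)
print(f"D estimate: {d_hat:.3f}")  # recovers a value below the -1/3 threshold
```

A sample whose estimator lands significantly below −1/3, as here, is the statistical signature of entanglement that the analysis looks for.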

The ATLAS Collaboration previously reported the first observation of entanglement in top quark pairs. That measurement was carried out at the level of stable particles, reconstructed after hadronization. In contrast, the CMS measurement was performed at the parton level, which refers to the behaviour of the original building blocks of particles (quarks and gluons) before they form composite particles. This approach provides a complementary view of the quantum state.

This groundbreaking work not only confirms that quantum mechanics holds true at the highest energies and shortest timescales by revealing entanglement in the production of the heaviest fundamental particles, but also establishes particle colliders like the LHC as powerful platforms for exploring quantum information science.

Read the full article

Observation of quantum entanglement in top quark pair production in proton-proton collisions at √s = 13 TeV

The CMS Collaboration 2024 Rep. Prog. Phys. 87 117801

Do you want to learn more about this topic?

Top quark physics in hadron collisions by Wolfgang Wagner (2005)


How does entanglement affect high-energy collisions?

20 August 2025 at 10:17

Entanglement is fundamental to our understanding of the microscopic world and remains one of the strangest aspects of quantum mechanics.

There are various ways to quantify the level of entanglement in quantum systems. One of these measures is called entanglement entropy.

In this context, entropy refers to the minimum amount of information required to describe a system. A system with high entropy requires a lot of information to describe it. This also means that it contains a large amount of uncertainty.

In recent years, there has been a growing interest in quantum entanglement within high-energy physics, for example in understanding the structure of protons and other hadrons.

Hadrons themselves are made up of quarks, which are tightly bound together via exchanges of gluons. The properties of these hadrons can be calculated using our best theory of the strong force – quantum chromodynamics (QCD) – but this is usually very challenging.

In this work, the team investigated how entanglement entropy evolves in high-energy processes. They particularly focused on deep inelastic scattering, where a high-energy electron probes the internal structure of a proton.

By examining how entanglement entropy depends on velocity, the researchers connected theoretical predictions with experimental data on hadron production.

Their results suggest that, in many cases, a state of maximum entanglement is reached. This is where the particles are as strongly correlated as quantum mechanics allows.
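What "as strongly correlated as quantum mechanics allows" means can be made concrete with a two-qubit example: for a Bell state, tracing out one qubit leaves a maximally mixed state, whose von Neumann entropy is the one-qubit maximum of ln 2. A minimal sketch:

```python
import numpy as np

# Entanglement entropy of a two-qubit Bell state: trace out one qubit
# and take the von Neumann entropy S = -Tr(rho ln rho) of what remains.
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2.0)   # (|00> + |11>)/sqrt(2)
rho = np.outer(bell, bell.conj())                      # full density matrix

# Partial trace over the second qubit: reshape to (2,2,2,2) and contract
# the second-qubit indices against each other.
rho_a = np.einsum("ikjk->ij", rho.reshape(2, 2, 2, 2))

eigvals = np.linalg.eigvalsh(rho_a)
S = -sum(p * np.log(p) for p in eigvals if p > 1e-12)
print(S)  # equals ln 2, the maximum for a single qubit
```

For a product (unentangled) state the same procedure gives S = 0; maximal entanglement means the reduced state carries the largest possible uncertainty.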

The team’s work will lead to a deeper understanding of fundamental QCD processes and help bridge the gap between theoretical predictions and experimental observations of particle collisions.

Read the full article

QCD evolution of entanglement entropy

Martin Hentschinski et al 2024 Rep. Prog. Phys. 87 120501


Laser-driven implosion could produce megatesla magnetic fields

20 August 2025 at 10:00

Magnetic fields so strong that they are typically only observed in astrophysical jets and highly magnetized neutron stars could be created in the laboratory, say physicists at the University of Osaka, Japan. Their proposed approach relies on directing extremely short, intense laser pulses into a hollow tube housing sawtooth-like inner blades. The fields created in this improved version of the established “microtube implosion” technique could be used to imitate effects that occur in various high-energy-density processes, including non-linear quantum phenomena and laser fusion as well as astrophysical systems.

Researchers have previously shown that advanced ultra-intense femtosecond (10⁻¹⁵ s) lasers can generate magnetic fields with strengths of up to several kilotesla. More recently, a suite of techniques that combines advanced laser technologies with complex microstructures promises to push this limit even higher, into the megatesla regime.

Microtube implosion is one such technique. Here, femtosecond laser pulses with intensities between 10²⁰ and 10²² W/cm² are aimed at a hollow cylindrical target with an inner radius of between 1 and 10 mm. This produces a plasma of hot electrons with MeV energies that form a sheath field along the inner wall of the tube. These electrons accelerate ions radially inward, causing the cylinder to implode.

At this point, a “seed” magnetic field deflects the ions and electrons in opposite azimuthal directions via the Lorentz force. The loop currents induced in the same direction ultimately generate a strong axial magnetic field.

Self-generated loop current

Although the microtube implosion technique is effective, it does require a kilotesla-scale seed field. This complicates the apparatus and makes it rather bulky.

In the latest work, Osaka’s Masakatsu Murakami and colleagues propose a new setup that removes the need for this seed field. It does this by replacing the 1‒10 mm cylinder with a micron-sized one that has a periodically slanted inner surface housing sawtooth-shaped blades. These blades introduce a geometrical asymmetry in the cylinder, causing the imploding plasma to swirl asymmetrically inside it and generating circulating currents near its centre. These self-generated loop currents then produce an intense axial magnetic field with a magnitude in the gigagauss range (1 gigagauss = 100 000 T).
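The scale of loop current required can be estimated from the field at the centre of a single current loop, B = μ₀I/(2R). A rough sketch with illustrative numbers (the target field and loop radius below are assumptions, not values from the paper):

```python
import math

MU_0 = 4e-7 * math.pi  # vacuum permeability, T*m/A

# On-axis field at the centre of a single current loop: B = mu0 * I / (2R).
# Rough scale of the self-generated loop current needed for a megatesla
# field in a micron-sized tube (illustrative numbers, not from the paper).
B_target = 1.0e6   # T  (1 megatesla = 10 gigagauss)
R = 1.0e-6         # m, micron-scale loop radius

I_needed = 2.0 * R * B_target / MU_0
print(f"current ~ {I_needed:.2e} A")  # of order a mega-ampere
```

Mega-ampere currents circulating on micron scales are plausible for the relativistic electron populations that intense femtosecond pulses drive, which is what makes the scheme feasible on paper.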

Using “particle-in-cell” simulations running the fully relativistic EPOCH code on Osaka’s SQUID supercomputer, the researchers found that such vortex structures and their associated magnetic field arise from a self-consistent positive feedback mechanism. The initial loop current amplifies the central magnetic field, which in turn constrains the motion of charged particles more tightly via the Lorentz force – and thereby reinforces and intensifies the loop current further.

“This approach offers a powerful new way to create and study extreme magnetic fields in a compact format,” Murakami says. “It provides an experimental bridge between laboratory plasmas and the astrophysical universe and could enable controlled studies of strongly magnetized plasmas, relativistic particle dynamics and potentially magnetic confinement schemes relevant to both fusion and astrophysics.”

The researchers, who report their work in Physics of Plasmas, are now looking to realize their scheme in an experiment using petawatt-class lasers. “We will also investigate how these magnetic fields can be used to steer particles or compress plasmas,” Murakami tells Physics World.


Travis Humble from Oak Ridge’s Quantum Science Center explains how large facilities benefit from collaboration

19 August 2025 at 11:47

What is the mission of the Quantum Science Center?

The Quantum Science Center is one of five National Quantum Information Science Research Centers funded by the US Department of Energy (DOE). It is a partnership led by Oak Ridge that includes more than 20 other institutions, including universities and companies.

ORNL and the other national laboratories play a crucial role in research and development within the United States. And partnerships play a crucial role in that activity. Some partnerships are with universities, especially individual investigators who require access to the powerful instruments that we develop and maintain. The labs are funded by the DOE, and users can apply to use the instruments or collaborate with our resident scientists to develop research ideas and follow through to publication.

In addition to providing cutting edge facilities to the nation’s scientists, national labs also play an important role in creating a scientific workforce that is educated in how to develop and use a range of scientific infrastructure and instrumentation. These personnel will ensure that scientific breakthroughs will continue to be made and that scientists will continue to have access to the best research facilities.

Travis Humble, who focuses on the discovery, synthesis and characterization of new quantum materials at Oak Ridge National Laboratory. (Courtesy: ORNL)

ORNL is home to several facilities for material characterization, including the Spallation Neutron Source (SNS). How is the lab using these facilities to develop new materials for quantum technologies?

ORNL has many unique facilities, including the SNS, which is a user facility of the DOE. It is one of the brightest sources of neutrons in the world, which makes it an incredibly powerful tool for characterizing novel materials.

We are using the SNS to look at some remarkable materials that have useful quantum properties such as topological order and entanglement. These strongly correlated systems have useful electronic or magnetic properties. What makes them so interesting is that under the right conditions they have unique phases in which their electrons or spins are entangled quantum mechanically.

We probe these materials using a range of instruments on the SNS and infer whether or not the materials are entangled. This allows us to identify materials that will be useful for developing new quantum technologies.

What other instruments are used to develop new quantum materials at ORNL?

The SNS is certainly one of our biggest and boldest instruments that we have for characterizing these types of systems, but in no way is it the only one. ORNL is also home to the Center for Nanophase Materials Sciences (CNMS). This is one of the DOE’s Office of Science Nanoscale Science Research Centers and it’s a remarkable facility co-located with the SNS at ORNL. The CNMS enables the synthesis and characterization of new quantum materials with the ultimate goal of gaining control over their useful properties.

A scanning electron microscope image of MnSi microcrystal
Zoom in A scanning electron microscope image of MnSi microcrystal grown using chemical vapour deposition at the Center for Nanophase Materials Sciences. (Courtesy: ORNL)

Can you give an example of that work?

We are very interested in a type of material called a spin liquid. It’s a magnetic system in which the quantum spins can become entangled with each other and correlated over very large distances relative to the size of individual atoms. Using the SNS, we have detected a signature of entanglement in one such material, ruthenium trichloride, a type of quantum magnet that we are studying at ORNL.

The next step is to take materials that have been certified as quantum or entangled and translate them into new types of devices. This is where the CNMS excels: the work involves fabricating a quantum material into unique geometries and connecting it to electrodes and other types of control systems, which then provide a path to demonstrating novel physics and other types of unique behaviours.

In the case of quantum spin liquids, for example, we believe that they can be translated into a new qubit (quantum bit) technology that can store and process quantum information. We haven’t got there yet, but the tools and capabilities that we have here at Oak Ridge are incredibly empowering for that type of technology development.

ORNL is also famous for its supercomputing capabilities and is home to Frontier, which is one of the world’s most powerful supercomputers. How does the lab’s high-performance computing expertise support the use of its instrumentation for the development of new quantum materials?

High-performance computing (HPC) has been a remarkable and disruptive tool because it allows us to explain and understand the physics of complex systems and how these systems can be controlled and ultimately exploited to create new technologies. One of the most remarkable developments in the last several years has been the application of HPC to machine learning and artificial intelligence.

This benefits material science, chemistry, biology, and the study of other complex physical systems, where modelling and simulation play a huge role in our scientific discovery process. And supercomputers are also used in the design and optimization of products that are based on those complex physical systems. In fact, many people would say that next to theory and experiment, computation is a third pillar of the R&D ecosystem.

In my view HPC is just as powerful a research tool as the SNS or the CNMS when it comes to understanding and exploring these complex physical systems.

Why is it important to have the SNS, CNMS, Frontier and other facilities co-located at ORNL?

This integration allows our researchers to very quickly compare computer simulations to experimental data from neutron scattering and other experiments. This results in a very tight and coordinated cycle of development that ensures that we get to the best results the fastest way possible. This way of working also highlights the multidisciplinary nature of how we operate – something we call “team science”.

This requires good coordination between all parties: you have to have clarity in your communication, and you have to have a very clear vision about the goals that you’re trying to accomplish. This ensures that everyone has a common understanding of the mission of ORNL and the DOE.

The POWGEN powder diffractometer at the Spallation Neutron Source
Power up The POWGEN powder diffractometer at the Spallation Neutron Source, which is one of the brightest sources of neutrons in the world. (Courtesy: ORNL)

How do you ensure that collaboration across different instruments and disciplines is done efficiently?

In my experience, the ability of team members to communicate efficiently, to understand each other’s concepts and reasoning, and to translate back and forth across these disciplinary boundaries is probably one of the central and most important parts of this type of scientific development. This is crucial to ensuring that people using a common infrastructure gain powerful scientific results.

For example, when we talk about qubits in quantum science and technology, people working in different subdisciplines have slightly different definitions of that word. For computer scientists, a qubit is a logical representation of information that we manipulate through quantum algorithms. In contrast, material scientists think of a qubit as a two-level quantum system that is embedded in some electronic or magnetic degree of freedom. They will often see qubits as being independent of the logical and computational connections that are necessary to create quantum computers. Bridging such differences is an important aspect of multidisciplinary research and amplifies success at our facilities.

I would say that the facilities and the ecosystem that we’ve created within the laboratory support interchange and collaboration across disciplines. The instruments enable new fundamental discoveries and the science is then developed and translated into new technologies.

That is a multi-step process and can only succeed when you have a team of people working together on large-scale problems – and that team is always multidisciplinary.

Can you talk about your partnerships with industry?

In the last decade quantum science and technology has emerged as a national priority in terms of science, industry and national security. This interest is driven by concerns for national security as well as the economic advantage – in terms of new products and services – that quantum brings. Innovation in the energy sector is an important example of how quantum has important implications for both security and the economy.

As a result, US national laboratories are partnering very closely with industry to provide access to instruments at the SNS, the CNMS and other facilities. At the same time, our researchers get access to technologies developed by industry – especially commercial quantum computing platforms.

I think that one of the most exciting things for us at ORNL today is gaining insights into these new quantum products and services and adapting this knowledge into our own scientific discovery workflows.

This article forms part of Physics World‘s contribution to the 2025 International Year of Quantum Science and Technology (IYQ), which aims to raise global awareness of quantum physics and its applications.

Stay tuned to Physics World and our international partners throughout the year for more coverage of the IYQ.

Find out more on our quantum channel.

The post Travis Humble from Oak Ridge’s Quantum Science Center explains how large facilities benefit from collaboration appeared first on Physics World.

How hot can you make a solid before it melts?

19 August 2025 at 14:56

Gold can remain solid at temperatures of over 14 times its melting point, far beyond a long-assumed theoretical limit dubbed the “entropy catastrophe”. This finding is based on temperature measurements made using high-resolution inelastic X-ray scattering, and according to team leader Thomas White of the University of Nevada, US, it implies that the question “How hot can you make a solid before it melts?” has no good answer.

“Until now, we thought that solids could not exist above about three times their melting temperatures,” White says. “Our results show that if we heat a material rapidly – that is, before it has time to expand – it is possible to bypass this limit entirely.”

Gold maintains its solid crystalline structure

In their experiments, which are detailed in Nature, White and colleagues heated a 50-nanometre-thick film of gold using intense laser pulses just 50 femtoseconds long (1 fs = 10⁻¹⁵ s). The key is the speed at which heating occurs. “By depositing energy faster than the gold lattice could expand, we created a state in which the gold was incredibly hot, but still maintained its solid crystalline structure,” White explains.

The team’s setup made it possible to achieve heating rates in excess of 10¹⁵ K s⁻¹, and ultimately to heat the gold to 14 times its melting temperature of 1337 K (1064 °C). This is far beyond the boundary of the supposed entropy catastrophe, which was previously predicted to strike at around 3000 °C.

To measure such extreme temperatures accurately, the researchers used the Linac Coherent Light Source (LCLS) at Stanford University as an ultrabright X-ray thermometer. In this technique, the atoms or molecules in a sample absorb photons from an X-ray laser at one frequency and then re-emit photons at a different frequency. The difference between these frequencies depends on the Doppler shift imparted by each atom’s motion, and thus on whether the atom is moving towards or away from the detector.

The method works because all the atoms in a material exhibit random thermal motion, and the temperature of the sample depends on the average kinetic energy of its atoms. Higher temperatures correspond to faster-moving atoms and a bigger spread in the velocities of atoms moving towards or away from the detector. Hence, the width of the spectrum of light scattered by the sample can be used to estimate its temperature. “This approach bypasses the need for complex computer modelling because we simply measure the velocity distribution of atoms directly,” White explains.
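For readers who want to see the arithmetic, the link between a measured velocity spread and a temperature can be sketched in a few lines. This is a back-of-the-envelope illustration, not the team’s analysis pipeline: it assumes a Maxwell–Boltzmann distribution, for which the one-dimensional velocity spread obeys sigma_v² = k_B T / m, and the 900 m/s spread used below is a hypothetical number chosen for illustration.

```python
# Illustrative estimate: temperature from the 1D spread of atomic velocities,
# assuming a Maxwell-Boltzmann distribution (sigma_v**2 = k_B * T / m).
k_B = 1.380649e-23      # Boltzmann constant, J/K
u = 1.66053906660e-27   # atomic mass unit, kg
m_gold = 196.97 * u     # mass of one gold atom, kg

def temperature_from_velocity_spread(sigma_v):
    """Temperature (K) from the 1D standard deviation of atomic velocities (m/s)."""
    return m_gold * sigma_v**2 / k_B

# A hypothetical spread of ~900 m/s corresponds to roughly 19,000 K --
# about 14 times the ~1337 K melting point of gold.
print(temperature_from_velocity_spread(900.0))
```

Because the mass of the atom is known, the only measured quantity needed is the width of the Doppler-broadened spectrum, which is why White describes the method as model-independent.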

A direct, model-independent method

The team, which also includes researchers from the UK, Germany and Italy, undertook this project because its members wanted to develop a direct, model-independent method to measure atom temperatures in extreme conditions. The technical challenges of doing so were huge, White recalls. “We not only needed a high-resolution X-ray spectrometer capable of resolving energy features of just millielectronvolts (meV) but also an X-ray source bright enough to generate meaningful signals from small, short-lived samples,” he says.

A further challenge is that while pressure and density measurements under extreme conditions are more or less routine, temperature is typically inferred – often with large uncertainties. “In our experiments, these extreme states last just picoseconds or nanoseconds,” he says. “We can’t exactly insert a thermometer.”

White adds that this limitation has slowed progress across plasma and materials physics. “Our work provides the first direct method for measuring ion temperatures in dense, strongly driven matter, unlocking new possibilities in areas like planetary science – where we can now probe conditions inside giant planets – and in fusion energy, where temperature diagnostics are critical.”

Fundamental studies in materials science could also benefit, he adds, pointing out that scientists will now be able to explore the ultimate stability limits of solids experimentally as well as theoretically, studying how materials behave when pushed far beyond conventional thermodynamic boundaries.

The researchers are now applying their method to shock-compressed materials. “Just a few weeks ago, we completed a six-night experiment at the LCLS using the same high-resolution scattering platform to measure both particle velocity and temperature in shock-melted iron,” White says. “This is a major step forward. Not only are we tracking temperature in the solid phase, but now we’re accessing molten states under dynamic compression, that is, at conditions like those found inside planetary interiors.”

White tells Physics World that these experiments also went well, and he and his colleagues are now analysing the results. “Ultimately, our goal is to extend this approach to a wide range of materials and conditions, allowing for a new generation of precise, real-time diagnostics in extreme environments,” he says.

The post How hot can you make a solid before it melts? appeared first on Physics World.

Overlooked pioneers from quantum history

19 August 2025 at 11:54

In the folklore of physics, the origins of quantum mechanics are often told as the story of a handful of brilliant young men, trading ideas in lecture halls and cafes. The German term Knabenphysik – “boys’ physics” – helped cement that image, and its gender bias went largely unchallenged for decades.

The latest Physics World Stories podcast, hosted by Andrew Glester, features Margriet van der Heijden, professor of science communication at Eindhoven University of Technology in the Netherlands, and Michelle Frank, a 2024–25 Public Scholar with the US National Endowment for the Humanities. Both contributed to Women in the History of Quantum Physics: Beyond Knabenphysik, a new book that brings together the stories of sixteen women whose work, ideas and problem-solving helped shape the field from the very start.

The book challenges the “lone genius” narrative, showing that quantum theory emerged from a much wider network of people – many of whom were women, and many of whom went unrecognized. The discussion also reflects on barriers that remain in physics today.

Van der Heijden and Frank are part of the international working group of Women in the History of Quantum Physics. Visit the group’s website for links to a range of publications and events.

This article forms part of Physics World‘s contribution to the 2025 International Year of Quantum Science and Technology (IYQ), which aims to raise global awareness of quantum physics and its applications.

Stay tuned to Physics World and our international partners throughout the year for more coverage of the IYQ.

Find out more on our quantum channel.

The post Overlooked pioneers from quantum history appeared first on Physics World.

Big data, big wins: how solar astrophysics can be a ‘game-changer’ in sports analytics

18 August 2025 at 16:15
NASA image of the sun plus a high-tech outline of a football player
Star potential Data-analysis techniques originally developed for studying information about the Sun could help nurture the sporting stars of tomorrow. (Courtesy: NASA/Goddard/SDO; Shutterstock/E2.art.lab)

If David Jess were a professional footballer – and not a professional physicist – he’d probably be a creative midfielder: someone who links defence and attack to set up goal-scoring opportunities for his team mates. Based in the Astrophysics Research Centre at Queen’s University Belfast (QUB), Northern Ireland, Jess orchestrates his scientific activities in much the same way. Combining vision, awareness and decision-making, he heads a cross-disciplinary research team pursuing two very different and seemingly unconnected lines of enquiry.

Jess’s research within the QUB’s solar-physics groups centres on optical studies of the Sun’s lower atmosphere. That involves examining how the Sun’s energy travels through its near environment – in the form of both solar flares and waves. In addition, his group is developing instruments to support international research initiatives in astrophysics, including India’s upcoming National Large Solar Telescope.

But Jess is also a founding member of the Predictive Sports Analytics (PSA) research group within QUB and Ulster University’s AI Collaboration Centre – a £16m R&D facility supporting the adoption of AI and machine-learning technologies in local industry. PSA links researchers from a mix of disciplines – including physics, mathematics, statistics and computer science – with sports scientists in football, rugby, cycling and athletics. Its goal is to advance the fundamental science and application of predictive modelling in sports and health metrics. 

Joined-up thinking

Astronomy and sports science might seem worlds apart, but they have lots in common, not least because both yield vast amounts of data. “We’re lucky,” says Jess. “Studying the closest star in the solar system means we are not photon-starved – there’s no shortage of light – and we are able to make observations of the Sun’s atmosphere at very high frame rates, which means we’re accustomed to managing and manipulating really big data sets.”

Similarly, big data also fuels the sports analytics industry. Many professional athletes wear performance-tracking sports vests with embedded GPS trackers that can generate tens of millions of data points over the course of, say, a 90-minute football match. The trackers capture information such as a player’s speed, their distance travelled, and the number of sprints and high-intensity runs.

“Trouble is,” says Jess, “you’re not really getting the ebb and flow of all that data by just summing it all up into the ‘one big number’.” Researchers in the PSA group are therefore trying to understand how athlete data evolves over time – often in real-time – to see if there’s some nuance or wrinkle that’s been missed in the “big-picture” metrics that emerge at the end of a game or training session.

It’s all in the game for PSA

Meeting to look at PSA's data
Team talk As PSA’s research in sports analytics grows, David Jess (second right) wants to recruit PhD students keen to move beyond their core physics and maths to develop skills in other disciplines too. (Courtesy: QUB)

Set up in 2023, the Predictive Sports Analytics (PSA) research group in Belfast has developed collaborations with professional football teams, rugby squads and other sporting organizations across Northern Ireland and beyond. From elite level to grassroots sports, real-world applications of PSA’s research aim to give athletes and coaches a competitive edge. Current projects include:

  • Player/squad speed distribution analyses to monitor strength and conditioning improvements with time (also handy for identifying growth and performance trajectories in youth sport)
  • Longitudinal examination of acceleration intensity as a proxy for explosive strength, which correlates with heart-rate variability (a useful aid to alert coaching staff to potential underlying cardiac conditions)
  • 3D force vectorization to uncover physics-based thresholds linked to concussion and musculoskeletal injury in rugby
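The first item in the list – working with a player’s full speed distribution rather than a single summary figure – can be sketched in a few lines. This is a minimal illustration using made-up 10 Hz GPS fixes, not PSA’s actual pipeline, and the positions below are invented for the example:

```python
import math
from statistics import quantiles

# Hypothetical 10 Hz GPS fixes (x, y) in metres for one player; real wearable
# trackers sample far faster and over a full 90-minute match.
positions = [(0.0, 0.0), (0.1, 0.0), (0.25, 0.02), (0.45, 0.05), (0.70, 0.08),
             (1.00, 0.12), (1.35, 0.16), (1.75, 0.20), (2.20, 0.24), (2.70, 0.28)]
dt = 0.1  # seconds between fixes

def speeds(points, dt):
    """Instantaneous speed (m/s) between consecutive GPS fixes."""
    return [math.dist(a, b) / dt for a, b in zip(points, points[1:])]

v = speeds(positions, dt)
# Summarizing the distribution (top speed plus quartiles) keeps the ebb and
# flow of effort that a single total-distance number would hide.
print(f"top speed: {max(v):.2f} m/s, quartiles: {quantiles(v, n=4)}")
```

Tracking how such a distribution shifts between sessions is one way to monitor the strength and conditioning improvements mentioned above.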

The group’s work might, for example, make it possible not only to measure how tired a player becomes after a 90-minute game but also to pinpoint the rates and causes of fatigue during the match. “Insights like this have the power to better inform coaching staff so they can create bespoke training regimes to target these specific areas,” adds Jess.

Work at PSA involves a mix of data mining, analysis, interpretation and visualization – teasing out granular insights from raw, unfiltered data streams by adapting and applying tried-and-tested statistical and mathematical methods from QUB’s astrophysics research. Take, for example, observational studies of solar flares – large eruptions of electromagnetic radiation from the Sun’s atmosphere lasting for a few minutes up to several hours.

David Jess
Solar insights David Jess from Queen’s University Belfast assembles a near-UV instrument for hyperspectral imaging at the Dunn Solar Telescope in New Mexico, US. (Courtesy: QUB)

“We might typically capture a solar-flare event at multiple wavelengths – optical, X-ray and UV, for example – to investigate the core physical processes from multiple vantage points,” says Jess. In other words, they can see how one wavelength component differs from another or how the discrete spectral components correlate and influence each other. “Statistically, that’s not so different from analysing the player data during a football match, with each player offering a unique vantage point in terms of the data points they generate,” he adds.

If that sounds like a stretch, Jess insists that PSA is not an indulgence or sideline. “We are experts in big data at PSA and, just as important, all of us have a passion for sports,” says Jess, who is a big fan of Chelsea FC. “What’s more, knowledge transfer between QUB’s astrophysics and sports analytics programmes works in both directions and delivers high-impact research dividends.”  

The benefits of association

In-house synergies are all well and good, but the biggest operational challenge for PSA since it was set up in 2023 has been external. As a research group in QUB’s School of Mathematics and Physics, Jess and colleagues need to find ways to “get in the door” with prospective clients and clubs in the professional sports community. Bridging that gap isn’t straightforward for a physics lab that isn’t established in the sports-analytics business.

But clear communication as well as creative and accessible data visualization can help successful engagement. “Whenever we meet sports scientists at a professional club, the first thing we tell them is we’re not trying to do their job,” says Jess. “Rather, it’s about making their job easier to do and putting more analytical tools at their disposal.”

PSA’s skill lies in extracting “hidden signals” from big data sets to improve how athlete performance is monitored. Those insights can then be used by coaches, physiotherapists and medical staff to optimize training and recovery schedules as well as to improve the fitness, health and performance of individual athletes and teams.

Validation is everything in the sports analytics business, however, and the barriers to entry are high. That’s one reason why PSA’s R&D collaboration with STATSports could be a game-changer. Founded in 2007 in Newry, Northern Ireland, the company makes wearable devices that record and transmit athlete performance metrics hundreds of times each second.

Athlete running and being monitored by the PSA team.
Fast-track physics Real-time monitoring of athlete performance by PSA PhD students Jack Brown (left) and Eamon McGleenan. The researchers capture acceleration and sprint metrics to provide feedback on sprint profiling and ways to mitigate injury risks. (Courtesy: QUB)

STATSports is now a global leader in athlete monitoring and GPS performance analysis. Its technology is used by elite football clubs such as Manchester City, Liverpool, Arsenal and Juventus, as well as national football teams (including England, Argentina, USA and Australia) and leading teams in rugby and American football.

The tie-up lets PSA work with an industry “name”, while STATSports gets access to blue-sky research that could translate into technological innovation and commercial opportunities.

“PSA is an academic research team first and foremost, so we don’t want to just rest on our laurels,” explains Jess. “With so much data – whether astrophysics or sports analytics – we want to be at the cutting edge and deliver new advances that loop back to enhance the big data techniques we’re developing.”

Right now, physics PhD student Eamon McGleenan provides the direct line from PSA into STATSports, which is funding his postgraduate work. The joint research project, which also involves sports scientists from Saudi Pro League football club Al Qadsiah, uses detailed data about player sprints during a game. The aim is to use force, velocity and acceleration curves – as well as the power generated by a player’s legs – to evaluate the performance metrics that underpin athlete fatigue.

By reviewing these metrics during the course of a game, McGleenan and colleagues can model how an athlete’s performance drops off in real-time, indicating their level of fatigue. The hope is that the research will lead to in-game modelling systems to help coaches and medical staff at pitch-side to make data-driven decisions about player substitutions (rather than just taking a player off because they “look leggy”).
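One minimal way to model this kind of drop-off – purely illustrative, with invented numbers rather than STATSports data – is to fit an exponential decay to a player’s peak sprint speeds sampled through the match:

```python
import math

# Hypothetical peak sprint speeds (m/s) sampled every 15 minutes of a match;
# a real pipeline would use the force-velocity data described in the article.
times = [0, 15, 30, 45, 60, 75, 90]          # minutes
peaks = [9.1, 8.9, 8.6, 8.5, 8.1, 7.8, 7.5]  # m/s

# Fit peak ~ v0 * exp(-t / tau) by ordinary least squares on log(peak):
# a straight-line fit of ln(speed) against time.
n = len(times)
xs, ys = times, [math.log(p) for p in peaks]
xbar, ybar = sum(xs) / n, sum(ys) / n
slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
        sum((x - xbar) ** 2 for x in xs)
v0, tau = math.exp(ybar - slope * xbar), -1.0 / slope
print(f"fresh peak speed ~ {v0:.2f} m/s, fatigue time constant ~ {tau:.0f} min")
```

Watching the fitted decay steepen in real time is the sort of signal that could prompt a data-driven substitution decision.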

Six physicists who also succeeded at sport

Illustration of people doing a range of sports shown in silhouette
(Courtesy: Shutterstock/Christos Georghiou)

Quantum physicist Niels Bohr was a keen footballer, who played in goal for Danish side Akademisk Boldklub in the early 1900s. He once let a goal in because he was more focused on solving a maths problem mid-game by scribbling calculations on the goal post. His mathematician brother Harald Bohr also played for the club and won silver at the 1908 London Olympics for the Danish national team.

Jonathan Edwards, who originally studied physics at Durham University, still holds the men’s world record for the triple jump. Edwards broke the record twice on 7 August 1995 at the World Athletics Championships in Gothenburg, Sweden, first jumping 18.16 m and then 18.29 m barely 20 minutes later.

David Florence, who studied physics at the University of Nottingham, won silver in the single C1 canoe slalom at the Beijing Olympics in 2008. He also won silver in the doubles C2 slalom at the 2012 Olympics in London and in Rio de Janeiro four years later.

Louise Shanahan is a middle-distance runner who competed for Ireland in the women’s 800m race at the delayed 2020 Summer Olympics while still doing a PhD in physics on the properties of nanodiamonds at the University of Cambridge. She has recently set up a sports website called TrackAthletes.

US professional golfer Bryson DeChambeau is nicknamed “The Scientist” owing to his analytical, science-based approach to the sport – and the fact that he majored in physics at Southern Methodist University in Dallas, US. DeChambeau won the 2020 and 2024 US Open.

In 2023 Harvard University’s Jenny Hoffman, who studies the electronic properties of exotic materials, became the fastest woman to run across the US, completing the 5000 km journey in 47 days, 12 hours and 35 minutes. In doing so, she beat the previous record by more than a week.

Matin Durrani

The transfer market

Jess says that the PSA group has been inundated with applications from physics students since it was set up. That’s not surprising, argues Jess, given that a physics degree provides many transferable skills to suit PSA’s broad scientific remit. Those skills include being able to manage, mine and interpret large data sets; disseminate complex results and actionable insights to a non-specialist audience; and work with industry partners in the sports technology sector.

“We’re looking for multidisciplinarians at PSA,” says Jess, with a nod to his group’s ongoing PhD recruitment opportunities. “The ideal candidates will be keen to move beyond their existing knowledge base in physics and maths to develop skills in other specialist fields.” There have also been discussions with QUB’s research and enterprise department about the potential for a PSA spin-out venture – though Jess, for his part, remains focused on research.

“My priority is to ensure the sustainability of PSA,” he concludes. “That means more grant funding – whether from the research councils or industry partners – while training up the next generation of early-career researchers. Longer term, though, I do think that PSA has the potential to be a ‘disruptor’ in the sports-analytics industry.”

The post Big data, big wins: how solar astrophysics can be a ‘game-changer’ in sports analytics appeared first on Physics World.

Researchers perform first real-time visualization of human embryo implantation

18 August 2025 at 09:40

Human reproduction is an inefficient process, with less than one third of conceptions leading to live births. Failure of the embryo to implant in the uterus is one of the main causes of miscarriage. Recording this implantation process in vivo in real time is not yet possible, but a team headed up at the Institute for Bioengineering of Catalonia (IBEC) has designed a platform that enables visualization of human embryo implantation in the laboratory. The researchers hope that quantifying the dynamics of implantation could impact fertility rates and help improve assisted reproductive technologies.

At its very earliest stage, an embryo comprises a small ball of cells called a blastocyst. About six days after fertilization, this blastocyst starts to embed itself into the walls of the uterus. To study this implantation process in real time, the IBEC team created an ex vivo platform that simulates the outer layers of the uterus. Unlike previous studies that mostly focused on the biochemical and genetic aspects of implantation, the new platform enables study of the mechanical forces exerted by the embryo to penetrate the uterus.

The implantation platform incorporates a collagen gel to mimic the extracellular matrix encountered in vivo, as well as globulin-rich proteins that are required for embryo development. The researchers designed two configurations: a 2D platform, in which blastocysts settle on top of a flat gel; and a 3D version where the blastocysts are placed directly inside collagen drops.

To capture the dynamics of blastocyst implantation, the researchers recorded time-lapse movies using fluorescence imaging and traction force microscopy. They imaged the matrix fibres and their deformations using light scattering and visualized autofluorescence from the embryo under multiphoton illumination. To quantify matrix deformation, they used the fibres as markers for real-time tracking and derived maps showing the direction and amplitude of fibre displacements – revealing the regions where the embryo applied force and invaded the matrix.
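The displacement maps described above amount to a simple vector calculation: for each tracked fibre marker, subtract its reference position from its current one. The coordinates below are invented for illustration; real traction force microscopy tracks many thousands of markers and then converts the displacement field into forces using the known stiffness of the gel.

```python
import math

# Hypothetical fibre-marker coordinates (micrometres) before and after the
# embryo pulls on the collagen matrix.
before = [(10.0, 5.0), (20.0, 5.0), (30.0, 5.0)]
after  = [(10.4, 5.3), (20.9, 5.1), (30.2, 5.0)]

def displacement_map(ref, cur):
    """Magnitude (um) and direction (degrees) of each marker's displacement."""
    out = []
    for (x0, y0), (x1, y1) in zip(ref, cur):
        dx, dy = x1 - x0, y1 - y0
        out.append((math.hypot(dx, dy), math.degrees(math.atan2(dy, dx))))
    return out

for mag, ang in displacement_map(before, after):
    print(f"{mag:.2f} um at {ang:.0f} deg")
```

Regions where the magnitudes are large and the directions converge are where the embryo is pulling hardest on, and invading, the matrix.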

Quantifying implantation dynamics

In the 2D platform, 72% of human blastocysts attached to and then integrated into the collagen matrix, reaching a depth of up to 200 µm in the gel. The embryos increased in size over time and maintained a spherical shape without spreading on the surface. Implantation in the 3D platform, in which the embryo is embedded directly inside the matrix, led to an 80% survival and invasion rate. In both platforms, the blastocysts showed motility in the matrix, illustrating the invasion capacity of human embryos.

Samuel Ojosnegros, Anna Seriola and Amélie Godeau
Research team From left to right: Samuel Ojosnegros, Anna Seriola and Amélie Godeau at IBEC labs. (Courtesy: Institute for Bioengineering of Catalonia)

The researchers also monitored the traction forces that the embryos exerted on the collagen matrix, moving and reorganising it with a displacement that increased over time. They note that the displacement was not perfectly uniform and that the pulling varied over time and space, suggesting that this pulsatile behaviour may help the embryos to continuously sense the environment.

“We have observed that human embryos burrow into the uterus, exerting considerable force during the process,” explains study leader Samuel Ojosnegros in a press statement. “These forces are necessary because the embryos must be able to invade the uterine tissue, becoming completely integrated with it. It is a surprisingly invasive process. Although it is known that many women experience abdominal pain and slight bleeding during implantation, the process itself had never been observed before.”

For comparison, the researchers also examined the implantation of mouse blastocysts. In contrast to the complete integration seen for human blastocysts, mouse embryo outgrowth was limited to the matrix surface. In both platforms, initial attachment was followed by invasion and proliferation of trophoblast cells (the outer layer of the blastocyst). The embryo applied strong pulling forces to the fibrous matrix, remodelling the collagen and aligning the fibres around it during implantation. The displacement maps revealed a fluctuating pattern, as seen for the human embryos.

“By measuring the direct impact of the embryo on the matrix scaffold, we reveal the underlying mechanics of embryo implantation,” the researchers write. “We found that mouse and human embryos generated forces during implantation using a species-specific pattern.”

The team is now working to incorporate a theoretical framework to better understand the physical processes underlying implantation. “Our observations at earlier stages show that attachment is a limiting factor at the onset of human embryo implantation,” co-first author Amélie Godeau tells Physics World. “Our next step is to identify the key elements that enable a successful initial connection between the embryo and the matrix.”

The study is reported in Science Advances.

The post Researchers perform first real-time visualization of human embryo implantation appeared first on Physics World.

Melting ice propels itself across a patterned surface

15 août 2025 à 15:42

Researchers in the US are the first to show how a melting ice disc can quickly propel itself across a patterned surface in a manner reminiscent of the Leidenfrost effect. Jonathan Boreyko and colleagues at Virginia Tech demonstrated how the discs can suddenly slingshot themselves along herringbone channels when a small amount of heat is applied.

The Leidenfrost effect is a classic physics experiment whereby a liquid droplet levitates above a hot surface – buoyed by vapour streaming from the bottom of the droplet. In 2022, Boreyko’s team extended the effect to a disc of ice. This three-phase Leidenfrost effect requires a much hotter surface because the ice must first melt to liquid, which then evaporates.

The team also noticed that the ice discs can propel themselves in specific directions across an asymmetrically-patterned surface. This ratcheting effect also occurs with Leidenfrost droplets, and is related to the asymmetric emission of vapour.

“Quite separately, we found out about a really interesting natural phenomenon at Death Valley in California, where boulders slowly move across the desert,” Boreyko adds. “It turns out this happens because they are sitting on thin rafts of ice, which the wind can then push over the underlying meltwater.”

Combined effects

In their latest study, Boreyko’s team considered how these two effects could be combined – allowing ice discs to propel themselves across cooler surfaces like the Death Valley boulders, but without any need for external forces like the wind.

They patterned a surface with a network of V-shaped herringbone channels, each branching off at an angle from a central channel. At first, meltwater formed an even ring around the disc – but as the channels directed its subsequent flow, the ice began to move in the same direction.

“For the Leidenfrost droplet ratchets, they have to heat the surface way above the boiling point of the liquid,” Boreyko explains. “In contrast, for melting ice discs, any temperature above freezing will cause the ice to melt and then move along with the meltwater.”

The speed of the disc’s movement depended on how easily water spreads out onto the herringbone channels. When etched onto bare aluminium, the channels were hydrophilic – encouraging meltwater to flow along them. Predictably, since liquid water is far denser and more viscous than vapour, this effect unfolded far more slowly than the three-phase Leidenfrost effect demonstrated in the team’s previous experiment.

Surprising result

Yet as Boreyko describes, “a much more surprising result was when we tried spraying a water-repellent coating over the surface structure.” While preventing meltwater from flowing quickly through the channels, this coating roughened the surface with nanostructures, which initially locked the ice disc in place as it rested on the ridges between the channels.

As the ice melted, the ring of meltwater partially filled the channels beneath the disc. Gradually, however, the ratcheted surface directed more water to accumulate in front of the disc – introducing a Laplace pressure difference between both sides of the disc.

When this pressure difference is strong enough, the ice suddenly dislodges from the surface. “As the meltwater preferentially escaped on one side, it created a surface tension force that ‘slingshotted’ the ice at a dramatically higher speed,” Boreyko describes.
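The magnitude of such a capillary push can be estimated with a simple back-of-envelope calculation. The sketch below is illustrative only and is not taken from the paper: it assumes the standard Laplace relation for a cylindrical water meniscus, with hypothetical curvature radii chosen purely to show how an asymmetric meltwater ring produces a net force.

```python
# Illustrative estimate (not from the study): the Laplace pressure jump
# across a curved water meniscus with one radius of curvature is
# dP = gamma / R. With more meltwater pooled in front of the disc than
# behind it, the front and back menisci curve differently, giving a net push.

GAMMA = 0.072  # surface tension of water at ~20 C, N/m (assumed)

def laplace_pressure(radius_m: float) -> float:
    """Pressure jump (Pa) across a cylindrical meniscus of radius R."""
    return GAMMA / radius_m

# Hypothetical meniscus radii: tight curvature behind the disc,
# gentler curvature in front where meltwater accumulates.
dp_back = laplace_pressure(0.5e-3)   # 0.5 mm radius of curvature
dp_front = laplace_pressure(2.0e-3)  # 2.0 mm radius of curvature
net = dp_back - dp_front
print(f"net pressure difference = {net:.0f} Pa")  # prints 108 Pa
```

Even these modest, assumed curvatures yield a pressure difference of order 100 Pa, which is ample to dislodge a small, low-friction ice disc once it unpins from the surface ridges.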

Applications of the new effect include surfaces that could be de-iced with just a small amount of heating. Alternatively, energy could be harvested from ice-disc motion. It could also be used to propel large objects across a surface, says Boreyko. “It turns out that whenever you have more liquid on the front side of an object, and less on the backside, it creates a surface tension force that can be dramatic.”

The research is described in ACS Applied Materials & Interfaces.

The post Melting ice propels itself across a patterned surface appeared first on Physics World.

Android phone network makes an effective early warning system for earthquakes

15 août 2025 à 10:00

The global network of Android smartphones makes a useful earthquake early warning system, giving many users precious seconds to act before the shaking starts. These findings, which come from researchers at Android’s parent organization Google, are based on a three-year-long study involving millions of phones in 98 countries. According to the researchers, the network’s capabilities could be especially useful in areas that lack established early warning systems.

“By using Android smartphones, which make up 70% of smartphones worldwide, the Android Earthquake Alert (AEA) system can help provide life-saving warnings in many places around the globe,” says study co-leader Richard Allen, a visiting faculty researcher at Google who directs the Berkeley Seismological Laboratory at the University of California, Berkeley, US.

Traditional earthquake early warning systems use networks of seismic sensors expressly designed for this purpose. First implemented in Mexico and Japan, and now also deployed in Taiwan, South Korea, the US, Israel, Costa Rica and Canada, they rapidly detect earthquakes in areas close to the epicentre and issue warnings across the affected region. Even a few seconds of warning can be useful, Allen explains, because it enables people to take protective actions such as the “drop, cover and hold on” (DCHO) sequence recommended in most countries.

Building such seismic networks is expensive, and many earthquake-prone regions do not have them. What they do have, however, is smartphones. Most such devices contain built-in accelerometers, and as their popularity soared in the 2010s, seismic scientists began exploring ways of using them to detect earthquakes. “Although the accelerometers in these phones are less sensitive than the permanent instruments used in traditional seismic networks, they can still detect tremors during strong earthquakes,” Allen tells Physics World.

A smartphone-based warning system

By the late 2010s, several teams had developed smartphone apps that could sense earthquakes when they happen, with early examples including Mexico’s SkyAlert and Berkeley’s MyShake. The latest study takes this work a step further. “By using the accelerometers in a network of smartphones like a seismic array, we are now able to provide warnings in some parts of the world where they didn’t exist before and are most needed,” Allen explains.

Working with study co-leader Marc Stogaitis, a principal software engineer at Android, Allen and colleagues tested the AEA system between 2021 and 2024. During this period, the app detected an average of 312 earthquakes a month, with magnitudes ranging from 1.9 to 7.8 (corresponding to events in Japan and Türkiye, respectively).

Detecting earthquakes with smartphones

Animation showing phones detecting shaking as a magnitude 6.2 earthquake in Türkiye progressed. Yellow dots are phones that detect shaking. The yellow circle is the P-wave’s estimated location and the red circle is for the S-wave. Note that phones can detect shaking for reasons other than an earthquake, and the system needs to handle this source of noise. This video has no sound. (Courtesy: Google)

For earthquakes of magnitude 4.5 or higher, the system sent “TakeAction” alerts to users. These alerts are designed to draw users’ attention immediately and prompt them to take protective actions such as DCHO. The system sent alerts of this type on average 60 times per month during the study period, for an average of 18 million individual alerts per month. The system also delivered lesser “BeAware” alerts to regions expected to experience a shaking intensity of 3 or 4.

To assess how effective these alerts were, the researchers used Google Search to collect voluntary feedback via user surveys. Between 5 February 2023 and 30 April 2024, 1 555 006 people responded to a survey after receiving alerts generated from an AEA detection. Their responses indicated that 85% of them did indeed experience shaking, with 36% receiving the alert before the ground began to move, 28% during and 23% after.

Graphic showing responses to survey on the effectiveness of the AEA and users' responses to alerts
Feeling the Earth move: Feedback from users who received an alert. A total of 1 555 006 responses to the user survey were collected over the period 5 February 2023 to 30 April 2024. During this time, alerts were issued for 1042 earthquakes detected by AEA. (Courtesy: Google)

Principles of operation

AEA works on the same principles of seismic wave propagation as traditional earthquake detection systems. When an Android smartphone is stationary, the system uses the output of its accelerometer to detect the type of sudden increase in acceleration that P and S waves in an earthquake would trigger. Once a phone detects such a pattern, it sends a message to Google servers with the acceleration information and an approximate location. The servers then search for candidate seismic sources that tally with this information.

“When a candidate earthquake source satisfies the observed data with a high enough confidence, an earthquake is declared and its magnitude, hypocentre and origin time are estimated based on the arrival time and amplitude of the P and S waves,” explains Stogaitis. “This detection capability is deployed as part of Google Play Services core system software, meaning it is on by default for most Android smartphones. As there are billions of Android phones around the world, this system provides an earthquake detection capability wherever there are people, in both wealthy and less-wealthy nations.”
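One step in any such server-side search can be illustrated with a classic single-station calculation. The sketch below is a toy example, not Google's actual algorithm: because S waves travel more slowly than P waves, the S-minus-P arrival delay at one station fixes the distance to the source, using assumed average crustal wave speeds.

```python
# Toy illustration (not the AEA implementation): the delay between
# P- and S-wave arrivals at a single station constrains the distance
# to the earthquake source. If both waves leave the source together,
# distance d satisfies d/V_S - d/V_P = delay, so
# d = delay * (V_P * V_S) / (V_P - V_S).

V_P = 6.0  # assumed average crustal P-wave speed, km/s
V_S = 3.5  # assumed average crustal S-wave speed, km/s

def epicentral_distance_km(sp_delay_s: float) -> float:
    """Distance (km) implied by the S-minus-P arrival delay at one station."""
    return sp_delay_s * (V_P * V_S) / (V_P - V_S)

print(epicentral_distance_km(10.0))  # a 10 s delay implies 84.0 km
```

Combining such distance constraints from many phones at known locations is what lets the servers triangulate a hypocentre and origin time, with the shaking amplitudes then fixing the magnitude.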

In the future, Allen says that he and his colleagues hope to use the same information to generate other hazard-reducing tools. Maps of ground shaking, for example, could assist the emergency response after an earthquake.

For now, the researchers, who report their work in Science, are focused on improving the AEA system. “We are learning from earthquakes as they occur around the globe and the Android Earthquake Alerts system is helping to collect information about these natural disasters at a rapid rate,” says Allen. “We think that we can continue to improve both the quality of earthquake detections, and also improve on our strategies to deliver effective alerts.”

The post Android phone network makes an effective early warning system for earthquakes appeared first on Physics World.

Predicted quasiparticles called ‘neglectons’ hold promise for robust, universal quantum computing

14 août 2025 à 16:00

Quantum computers open the door to profound increases in computational power, but the quantum states they rely on are fragile. Topologically protected quantum states are more robust, but the most experimentally promising route to topological quantum computing limits the calculations these states can perform. Now, however, a team of mathematicians and physicists in the US has found a way around this barrier. By exploiting a previously neglected aspect of topological quantum field theory, the team showed that these states can be much more broadly useful for quantum computation than was previously believed.

The quantum bits (qubits) in topological quantum computers are based on particle-like knots, or vortices, in the sea of electrons washing through a material. In two-dimensional materials, the behaviour of these quasiparticles diverges from that of everyday bosons and fermions, earning them the name of anyons (from “any”). The advantage of anyon-based quantum computing is that the only thing that can change the state of anyons is moving them around in relation to each other – a process called “braiding” that alters their relative topology.

Photo of a blackboard containing a diagram of anyon braiding. Writing on the blackboard says "Quantum gates are implemented by braiding anyons" and "Key idea: Quantum state evolves by braiding output only depends on the topology of the braid, *not* the path taken"
Topological protection: Diagram of a scheme for implementing quantum gates by braiding anyons. (Courtesy: Gus Ruelas/USC)

However, as team leader Aaron Lauda of the University of Southern California explains, not all anyons are up to the task. Certain anyons derived from mathematical symmetries appear to have a quantum dimension of zero, meaning that they cannot be manipulated in quantum computations. Traditionally, he says, “you just throw those things away”.

The problem is that in this so-called “semisimple” model, braiding the remaining anyons, which are known as Ising anyons, only lends itself to a limited range of computational logic gates. These gates are called Clifford gates, and they can be efficiently simulated by classical computers, which reduces their usefulness for truly ground-breaking quantum machines.

New mathematical tools for anyons

Lauda’s interest in this problem was piqued when he realized that there had been some progress in the mathematical tools that apply to anyons. Notably, in 2011, Nathan Geer at Utah State University and Jonathan Kujawa at the University of Oklahoma in the US, together with Bertrand Patureau-Mirand at Université de Bretagne-Sud in France showed that what appear to be zero-dimensional objects in topological quantum field theory (TQFT) can actually be manipulated in ways that were not previously thought possible.

“What excites us is that these new TQFTs can be more powerful and possess properties not present in the traditional setting,” says Geer, who was not involved in the latest work.

Photo of a blackboard containing an explanation of how to encode qubits into the collective state of a neglecton and two Ising anyons, which are quasiparticle vortices in a 2D material. The explanation includes a diagram showing the neglecton and the Ising anyons in a 2D material placed in a vertically oriented magnetic field. It also includes sketches showing how to perform braiding with this collection of particles and create 0 and 1 ket states
Just add neglectons: Encoding qubits into collective state of three anyons. (Courtesy: Gus Ruelas/USC)

As Lauda explains it, this new approach to TQFT led to “a different way to measure the contribution” of the anyons that the semisimple model leaves out – and surprisingly, the result wasn’t zero. Better still, he and his colleagues found that when certain types of discarded anyons – which they call “neglectons” because they were neglected in previous approaches – are added back into the model, Ising anyons can be braided around them in such a way as to allow any quantum computation.

The role of unitarity

Here, the catch was that including neglectons meant that the new model lacked a property known as unitarity. This is essential in the widely held probabilistic interpretation of quantum mechanics. “Most physicists start to get squeamish when you have, like, ‘non-unitarity’ or what we say, non positive definite [objects],” Lauda explains.

The team solved this problem with some ingenious workarounds created by Lauda’s PhD student, Filippo Iulianelli. Thanks to these workarounds, the team was able to confine the computational space to only those regions where anyon transformations work out as unitary.

Shawn Cui, who was not involved in this work, but whose research at Purdue University, US, centres around topological quantum field theory and quantum computation, describes the research by Lauda and colleagues as “a substantial theoretical advance with important implications for overcoming limitations of semisimple models”. However, he adds that realizing this progress in experimental terms “remains a long-term goal”.

For his part, Lauda points out that there are good precedents for particles being discovered after mathematical principles of symmetry were used to predict their existence. Murray Gell-Mann’s prediction of the omega minus baryon in 1962 is, he says, a case in point. “One of the things I would say now is we already have systems where we’re seeing Ising anyons,” Lauda says. “We should be looking also for these neglectons in those settings.”

The research is published in Nature Communications.

The post Predicted quasiparticles called ‘neglectons’ hold promise for robust, universal quantum computing appeared first on Physics World.

Graphite ‘hijacks’ the journey from molten carbon to diamond

14 août 2025 à 13:00

At high temperatures and pressures, molten carbon has two options. It can crystallize into diamond and become one of the world’s most valuable substances. Alternatively, it can crystallize into graphite, which is industrially useful but somewhat less exciting.

Researchers in the US have now discovered what causes molten carbon to “choose” one crystalline form over the other. Their findings, which are based on sophisticated simulations that use machine learning to predict molecular behaviour, have implications for several fields, including geology, nuclear fusion and quantum computing as well as industrial diamond production.

Monitoring crystallization in molten carbon is challenging because the process is rapid and occurs under conditions that are hard to produce in a laboratory. When scientists have tried to study this region of carbon’s phase diagram using high pressure flash heating, their experiments have produced conflicting results.

A better understanding of phase changes near the crystallization point could bring substantial benefits. Liquid-phase carbon is a known intermediate in the synthesis of artificial diamonds, nanodiamonds and the nitrogen-vacancy-doped diamonds used in quantum computing. The presence of diamond in natural minerals can also shed light on tectonic processes in Earth-like planets and the deep-Earth carbon cycle.

Crystallization process can be monitored in detail

In the new work, a team led by chemist Davide Donadio of the University of California, Davis used machine-learning-accelerated, quantum-accurate molecular dynamics simulations to model how diamond and graphite form as liquid carbon cools from 5000 to 3000 K at pressures ranging from 5 to 30 GPa. While such extreme conditions can be created using laser heating, Donadio notes that doing so requires highly specialized equipment. Simulations also provide a level of control over conditions and an ability to monitor the crystallization process at the atomic scale that would be difficult, if not impossible, to achieve experimentally.

The team’s simulations showed that the crystallization behaviour of molten carbon is more complex than previously thought. While it crystallizes into diamond at higher pressures, at lower pressures (up to 15 GPa) it forms graphite instead. This was surprising, the researchers say, because even at these slightly lower pressures, the material’s most thermodynamically stable phase ought to be diamond rather than graphite.

“Nature taking the path of least resistance”

The team attributes this unexpected behaviour to an empirical observation known as Ostwald’s step rule, which states that crystallization often proceeds through intermediate metastable phases rather than directly to the phase that is most thermodynamically stable. In this case, the researchers say that graphite, a nucleating metastable crystal, acts as a stepping stone because its structure more closely resembles that of the parent liquid carbon. For this reason, it hinders the direct formation of the stable diamond phase.

“The liquid carbon essentially finds it easier to become graphite first, even though diamond is ultimately more stable under these conditions,” says co-author Tianshu Li, a professor of civil and environmental engineering at George Washington University. “It’s nature taking the path of least resistance.”

The insights gleaned from this work, which is described in Nature Communications, could help resolve inconsistencies among historical electrical and laser flash-heating experiments, Donadio says. Though these experiments were aimed at resolving the phase diagram of carbon near the graphite-diamond-liquid triple point, various experimental details and recrystallization conditions may have meant that their systems instead became “trapped” in metastable graphitic configurations. Understanding how this happens could prove useful for manufacturing carbon-based materials such as synthetic diamonds and nanodiamonds at high pressure and temperature.

“I have been studying crystal nucleation for 20 years and have always been intrigued by the behaviour of carbon,” Donadio tells Physics World. “Studies based on so-called empirical potentials have been typically unreliable in this context and ab initio density functional theory-based calculations are too slow. Machine learning potentials allow us to overcome these issues, having the right combination of accuracy and computational speed.”

Looking to the future, Donadio says he and his colleagues aim to study more complex chemical compositions. “We will also be focusing on targeted pressures and temperatures, the likes of which are found in the interiors of giant planets in our solar system.”

The post Graphite ‘hijacks’ the journey from molten carbon to diamond appeared first on Physics World.

Building a quantum powerhouse in the US Midwest

14 août 2025 à 11:59

In this episode of the Physics World Weekly podcast I am in conversation with two physicists who are leading lights in the quantum science and technology community in the US state of Illinois. They are Preeti Chalsani, who is chief quantum officer at Intersect Illinois, and David Awschalom, who is director of Q-NEXT.

As well as being home to Chicago, the third largest urban area in the US, the state also hosts two national labs (Fermilab and Argonne) and several top universities. In this episode, Awschalom and Chalsani explain how the state is establishing itself as a burgeoning hub for quantum innovation – along with neighbouring regions in Wisconsin and Indiana.

Chalsani talks about the Illinois Quantum and Microelectronics Park, a 128-acre technology campus that is being developed on the site of a former steel mill just south of Chicago. The park has already attracted its first major tenant, PsiQuantum, which will build a utility-scale, fault-tolerant quantum computer there.

Q-NEXT is led by Argonne National Laboratory, and Awschalom explains how academia, national labs, industry, and government are working together to make the region a quantum powerhouse.

  • Related podcasts include interviews with Celia Merzbacher of the US’s Quantum Economic Development Consortium; Nadya Mason of the Pritzker School of Molecular Engineering at the University of Chicago; and Travis Humble of the Quantum Science Center at Oak Ridge National Laboratory

Courtesy: American Elements

This podcast is supported by American Elements, the world’s leading manufacturer of engineered and advanced materials. The company’s ability to scale laboratory breakthroughs to industrial production has contributed to many of the most significant technological advancements since 1990 – including LED lighting, smartphones, and electric vehicles.

The post Building a quantum powerhouse in the US Midwest appeared first on Physics World.

Richard Muller: ‘Physics stays the same. What changes is how the president listens’

13 août 2025 à 16:00

Richard Muller, a physicist at the University of California, Berkeley, was in his office when someone called Liz, who’d once taken one of his classes, showed up. She said her family had invited a physicist over for dinner who touted controlled nuclear fusion as a future energy source. When Liz suggested solar power was a better option, the guest grew patronizing. “If you wanted to power California,” he told her, “you’d have to plaster the entire state with solar cells.”

Fortunately, Liz remembered what she’d learned on Muller’s course, entitled “Physics for Future Presidents”, and explained why the dinner guest was wrong. “There’s a kilowatt in a square metre of sunlight,” she told him, “which means a gigawatt in a square kilometre – only about the space of a nuclear power plant.” Stunned, the physicist grew silent. “Your numbers don’t sound wrong,” he finally said. “Of course, today’s solar cells are only 15% efficient. But I’ll take a look again.”

It’s a wonderful story that Muller told me when I visited him a few months ago to ask about his 2008 book Physics for Future Presidents: the Science Behind the Headlines. Based on the course that Liz took, the book tries to explain physics concepts underpinning key issues including energy and climate change. “She hadn’t just memorized facts,” Muller said. “She knew enough to shut up an expert who hadn’t done his homework. That’s what presidents should be able to do.” A president, Muller believes, should know enough science to have a sense for the value of expert advice.

Dissenting minds

Muller’s book was published shortly before Barack Obama’s two terms as US president. Obama was highly pro-science, appointing the Nobel-prize-winning physicist Steven Chu as his science adviser. With Donald Trump in the White House, I had come to ask Muller what advice – if any – he would change in the book. But it wasn’t easy for me to keep Muller on topic, as he derails easily with anecdotes of fascinating situations and extraordinary people that he’s encountered in his remarkable life.

Richard Muller
Talking physics Richard Muller explaining antimatter to students at the University of California, Berkeley, in 2005. (Courtesy: WikiCommons)

Born in New York City, Muller, 81, attended the Bronx High School of Science and Columbia University, joining the University of California, Berkeley as a graduate student in the autumn of 1964. A few weeks after entering, he joined the Free Speech Movement to protest against the university’s ban on campus political activities. During a sit-in, Muller was arrested and dragged down the steps of Sproul Hall, Berkeley’s administration building.

As a graduate student, Muller worked with Berkeley physicist Luis Alvarez – who would later win the 1968 Nobel Prize for Physics – to send a balloon with a payload of cosmic-ray detectors over the Pacific. Known as the High Altitude Particle Physics Experiment (HAPPE), the apparatus crashed in the ocean. Or so Muller thought.

As Muller explained in a 2023 article in the Wall Street Journal, US intelligence recovered a Chinese surveillance device, shot down over Georgia by the US military, with a name that translated as “HAPI”. Muller found enough other similarities to conclude that the Chinese had recovered the device and copied it as a model for their balloons. But by then Muller had switched to studying negative kaon particles using bubble chambers. After his PhD, he stayed at Berkeley as a postdoc, eventually becoming a professor in 1980.

Muller is a prominent contrarian, publishing an article advancing the controversial – though some now argue that it’s plausible – view that the COVID-19 virus originated in a Chinese lab. For a long time he was a global-warming sceptic, but in 2012, after three years of careful analysis, he publicly changed his mind via an article in the New York Times. Former US President Bill Clinton cited Muller as “one of my heroes because he changed his mind on global warming”. Muller loved that remark, but told me: “I’m not a hero. I’m just a scientist.”

Muller was once shadowed by a sociology student for a week for a course project. “She was like [the primatologist] Dian Fossey and I was a gorilla,” Muller recalls. She was astonished. “I thought physicists spent all their time thinking and experimenting,” the student told him. “You spend most of your time talking.” Muller wasn’t surprised. “You don’t want to spend your time rediscovering something somebody already knows,” he said. “So physicists talk a lot.”

Recommended recommendations

I tried again to steer Muller back to the book. He said it was based on a physics course at Berkeley known originally as “Qualitative physics” and informally as physics for poets or dummies. One of the first people to teach it had been the theorist and “father of the fusion bomb” Edward Teller. “Teller was exceedingly popular,” Muller told me, “possibly because he gave everyone in class an A and no exams.”

After Teller, fewer and fewer students attended the course until enrolment dropped to 20. So when Muller took over in 1999 he retitled it “Physics for future presidents”, refocused it on contemporary issues and rebuilt the enrolment until it typically filled a large auditorium with about 500 students. He retired in 2010 after a decade of teaching the course.

Making a final effort, I handed Muller a copy of his book, turned to the last page where he listed a dozen or so specific recommendations for future presidents, and asked him to say whether he had changed his mind in the intervening 17 years.

Fund strong programmes in energy efficiency and conservation? “Yup!”

Raise the miles-per-gallon of autos substantially? “Yup.”

Support efforts at sequestering carbon dioxide? “I’m not much in favour anymore because the developing world can’t afford it.”

Encourage the development of nuclear power? “Yeah. Particularly fission; fusion’s too far in the future. Also, I’d tell the president to make clear that nuclear waste storage is a solved problem, and make sure that Yucca mountain is quickly approved.”

See that China and India are given substantial carbon credits for building coal-fired power stations and nuclear plants? “Nuclear power plants yes, carbon credits no. Over a million and a half people in China die from coal pollution each year.”

Encourage solar and wind technologies? “Yes.” Cancel subsidies on corn ethanol? “Yes”. Encourage developments in efficient lighting? “Yes.” Insulation is better than heating? “Yes.” Cool roofs save more energy than air conditioners and often better than solar cells? “Yes.”

The critical point

Muller’s final piece of advice to the future president was that the “emphasis must be on technologies that the developing world can afford”. He was adamant. “If what you are doing is buying expensive electric automobiles that will never sell in the developing world, it’s just virtue signalling in luxury.”

I kept trying to find some new physics Muller would tell the president, but it wasn’t much. “Physics mostly stays the same,” Muller concluded, “so the advice mainly does, too.” But not everything remains unvarying. “What changes the most”, he conceded, “is how the president listens”. Or even whether the president is listening at all.

The post Richard Muller: ‘Physics stays the same. What changes is how the president listens’ appeared first on Physics World.

NASA launches TRACERS mission to study Earth’s ‘magnetic shield’

13 août 2025 à 13:02

NASA has successfully launched a mission to explore the interactions between the Sun’s and Earth’s magnetic fields. The Tandem Reconnection and Cusp Electrodynamics Reconnaissance Satellites (TRACERS) craft was sent into low-Earth orbit on 23 July from Vandenberg Space Force Base in California by a SpaceX Falcon 9 rocket. Following a month of calibration, the twin-satellite mission is expected to operate for a year.

The spacecraft will observe particles and electromagnetic fields in the Earth’s northern magnetic “cusp region”, which encircles the North Pole where the Earth’s magnetic field lines curve down toward Earth.

This unique vantage point allows researchers to study how magnetic reconnection — when field lines connect and explosively reconfigure — affects the space environment. Such observations will help researchers understand how processes change over both space and time.

The two satellites will collect data from over 3000 cusp crossings during the one-year mission with the information being used to understand space-weather phenomena that can disrupt satellite operations, communications and power grids on Earth.

Each of the nearly identical octagonal satellites weighs less than 200 kg and features six instruments, including magnetometers, electric-field instruments and devices to measure the energy of ions and electrons in the plasma around the spacecraft.

The mission will operate in a Sun-synchronous orbit about 590 km above ground, with the satellites following one behind the other in close separation, passing through regions of space at least 10 seconds apart.
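That 10-second spacing corresponds to a sizeable along-track distance. A back-of-envelope estimate (assuming a circular orbit; the numbers below are our own rough calculation, not from the mission documents) gives a feel for the tandem geometry:

```python
# Rough estimate: along-track separation of two satellites trailing each
# other by 10 s in a 590 km circular orbit (illustrative, not mission data).
import math

MU_EARTH = 3.986004418e14  # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6.371e6          # mean Earth radius, m

altitude = 590e3                      # m, from the article
r = R_EARTH + altitude                # orbital radius
v = math.sqrt(MU_EARTH / r)          # circular orbital speed, m/s
separation = v * 10.0                # along-track distance for a 10 s lag

print(f"orbital speed ~{v/1e3:.1f} km/s, separation ~{separation/1e3:.0f} km")
```

At low-Earth-orbit speeds of roughly 7.6 km/s, a 10-second lag puts the two spacecraft about 75 km apart along their track.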

“TRACERS is an exciting mission,” says Stephen Fuselier from the Southwest Research Institute in Texas, who is the mission’s deputy principal investigator. “The data from that single pass through the cusp were amazing. We can’t wait to get the data from thousands of cusp passes.”

The post NASA launches TRACERS mission to study Earth’s ‘magnetic shield’ appeared first on Physics World.

Jet stream study set to improve future climate predictions

13 août 2025 à 11:35
Driven by global warming: the researchers identified which factors influence the jet stream in the southern hemisphere. (Courtesy: Leipzig University/Office for University Communications)

An international team of meteorologists has found that half of the recently observed shifts in the southern hemisphere’s jet stream are directly attributable to global warming – and pioneered a novel statistical method to pave the way for better climate predictions in the future.

Prompted by recent changes in the behaviour of the southern hemisphere’s summertime eddy-driven jet (EDJ) – a band of strong westerly winds located at a latitude of between 30°S and 60°S – the Leipzig University-led team sifted through historical measurement data to show that wind speeds in the EDJ have increased, while the wind belt has moved consistently toward the South Pole. They then used a range of innovative methods to demonstrate that 50% of these shifts are directly attributable to global warming, with the remainder triggered by other climate-related changes, including warming of the tropical Pacific and the upper tropical atmosphere, and the strengthening of winds in the stratosphere.

“We found that human fingerprints on the EDJ are already showing,” says lead author Julia Mindlin, research fellow at Leipzig University’s Institute for Meteorology. “Global warming, springtime changes in stratospheric winds linked to ozone depletion, and tropical ocean warming are all influencing the jet’s strength and position.”

“Interestingly, the response isn’t uniform, it varies depending on where you look, and climate models are underestimating how strong the jet is becoming. That opens up new questions about what’s missing in our models and where we need to dig deeper,” she adds.

Storyline approach

Rather than collecting new data, the researchers used existing, high-quality observational and reanalysis datasets – including the long-running HadCRUT5 surface temperature data, produced by the UK Met Office and the University of East Anglia, and a variety of sea surface temperature (SST) products including HadISST, ERSSTv5 and COBE.

“We also relied on something called reanalysis data, which is a very robust ‘best guess’ of what the atmosphere was doing at any given time. It is produced by blending real observations with physics-based models to reconstruct a detailed picture of the atmosphere, going back decades,” says Mindlin.

To interpret the data, the team – which also included researchers at the University of Reading, the University of Buenos Aires and the Jülich Supercomputing Centre – used a statistical approach called causal inference to help isolate the effects of specific climate drivers. They also employed “storyline” techniques to explore multiple plausible futures rather than simply averaging qualitatively different climate responses.
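The attribution step can be illustrated with a toy calculation. The sketch below is not the team’s code and uses entirely made-up driver time series; it simply shows how a jet-shift signal can be regressed onto several candidate drivers to estimate each one’s contribution:

```python
# Illustrative sketch of driver attribution via least squares.
# All series and coefficients here are synthetic, not the study's data.
import numpy as np

rng = np.random.default_rng(0)
n_years = 45  # hypothetical length of the observational record

# Hypothetical driver time series
global_warming = np.linspace(0.0, 1.0, n_years)      # steady warming trend
tropical_sst = rng.standard_normal(n_years) * 0.5    # tropical Pacific SSTs
strat_winds = rng.standard_normal(n_years) * 0.5     # stratospheric winds

# Synthetic jet shift: by construction, half the trend comes from warming
jet_shift = (0.5 * global_warming + 0.3 * tropical_sst
             + 0.2 * strat_winds + rng.standard_normal(n_years) * 0.1)

# Least-squares fit of the jet shift onto the drivers
X = np.column_stack([global_warming, tropical_sst, strat_winds])
coef, *_ = np.linalg.lstsq(X, jet_shift, rcond=None)

# Estimated sensitivity of the jet to each driver
print(dict(zip(["warming", "tropical_sst", "strat_winds"], coef.round(2))))
```

The real analysis goes well beyond a single regression (causal inference handles confounding between drivers, and storylines span plausible futures rather than averaging them), but the fitted coefficients play the same role: they quantify how much of the observed shift each driver accounts for.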

“These tools offer a way to incorporate physical understanding while accounting for uncertainty, making the analysis both rigorous and policy-relevant,” says Mindlin.

Future blueprint

For Mindlin, these findings are important for several reasons. First, they demonstrate “that the changes predicted by theory and climate models in response to human activity are already observable”. Second, she notes that they “help us better understand the physical mechanisms that drive climate change, especially the role of atmospheric circulation”.

“Third, our methodology provides a blueprint for future studies, both in the southern hemisphere and in other regions where eddy-driven jets play a role in shaping climate and weather patterns,” she says. “By identifying where and why models diverge from observations, our work also contributes to improving future projections and enhances our ability to design more targeted model experiments or theoretical frameworks.”

The team is now focused on improving understanding of how extreme weather events, like droughts, heatwaves and floods, are likely to change in a warming world. Since these events are closely linked to atmospheric circulation, Mindlin stresses that it is critical to understand how circulation itself is evolving under different climate drivers.

One of the team’s current areas of focus is drought in South America. Mindlin notes that this is especially challenging due to the short and sparse observational record in the region, and the fact that drought is a complex phenomenon that operates across multiple timescales.

“Studying climate change is inherently difficult – we have only one Earth, and future outcomes depend heavily on human choices,” she says. “That’s why we employ ‘storylines’ as a methodology, allowing us to explore multiple physically plausible futures in a way that respects uncertainty while supporting actionable insight.”

The results are reported in the Proceedings of the National Academy of Sciences.

The post Jet stream study set to improve future climate predictions appeared first on Physics World.

Festival opens up the quantum realm

13 août 2025 à 10:07
Collaborative insights: The UK Quantum Hackathon, organized by the NQCC for the fourth consecutive year and a cornerstone of the Quantum Fringe festival, allowed industry experts to work alongside early-career researchers to explore practical use cases for quantum computing. (Courtesy: NQCC)

The International Year of Quantum Science and Technology (IYQ) has already triggered an explosion of activities around the world to mark 100 years since the emergence of quantum mechanics. In the UK, the UNESCO-backed celebrations have provided the perfect impetus for the University of Edinburgh’s Quantum Software Lab (QSL) to work with the National Quantum Computing Centre (NQCC) to organize and host a festival of events that have enabled diverse communities to explore the transformative power of quantum computing.

Known collectively as the Quantum Fringe, in a clear nod to Edinburgh’s famous cultural festival, some 16 separate events have been held across Scotland throughout June and July. Designed to make quantum technologies more accessible and more relevant to the outside world, the programme combined education and outreach with scientific meetings and knowledge exchange.

The Quantum Fringe programme evolved from several regular fixtures in the quantum calendar. One of these cornerstones was the NQCC’s flagship event, the UK Quantum Hackathon, which is now in its fourth consecutive year. In common with previous editions, the 2025 event challenged teams of hackers to devise quantum solutions to real-world use cases set by mentors from different industry sectors. The teams were supported throughout the three-day event by the industry mentors, as well as by technical experts from providers of various quantum resources.

Time constrained: the teams of hackers were given two days to formulate their solution and test it on simulators, annealers and physical processors. (Courtesy: NQCC)

This year, perhaps buoyed by the success of previous editions, there was a significant uptick in the number of use cases submitted by end-user organizations. “We had twice as many applications as we could accommodate, and over half of the use cases we selected came from newcomers to the event,” said Abby Casey, Quantum Readiness Delivery Lead at the NQCC. “That level of interest suggests that there is a real appetite among the end-user community for understanding how quantum computing could be used in their organizations.”

Reflecting the broader agenda of the IYQ, this year the NQCC particularly encouraged use cases that offered some form of societal benefit, and many of the 15 that were selected aimed to align with the UN’s Sustainable Development Goals. One team investigated the accuracy of quantum-powered neural networks for predicting the progression of a tumour, while another sought to optimize the performance of graphene-based catalysts for fuel cells. Moonbility, a start-up firm developing digital twins to optimize the usage of transport and infrastructure, challenged its team to develop a navigation system capable of mapping out routes for people with specific mobility requirements, such as step-free access or calmer environments for those with anxiety disorders.

During the event the hackers were given just two days to explore the use case, formulate a solution, and generate results using quantum simulators, annealers and physical processors. The last day provided an opportunity for the teams to share their findings with their peers and a five-strong judging panel chaired by Sir Peter Knight, one of the architects of the UK’s National Quantum Technologies Programme and co-chair of the IYQ’s Steering Committee. “Your effort, energy and passion have been quite extraordinary,” commented Sir Peter at the end of the event. “It’s truly impressive to see what you have achieved in just two days.”

From the presentations it was clear that some of the teams had adapted their solution to reflect the physical constraints of the hardware platform they had been allocated. Those explorations were facilitated by the increased participation of mentors from hardware developers, including QuEra and Pasqal for cold-atom architectures, and Rigetti and IBM for gate-based superconducting processors. “Cold atoms offer greater connectivity than superconducting platforms, which may make them more suited to solving particular types of problems,” said Gerard Milburn of the University of Sussex, who has recently become a Quantum Fellow at the NQCC.

Results day: The final day of the hackathon allowed the teams to share their results with the other participants and a five-strong judging panel. (Courtesy: NQCC)

The winning team, which had been challenged by Aioi R&D Lab to develop a quantum-powered solution for scheduling road maintenance, won particular praise for framing the problem in a way that recognized the needs of all road users, not just motorists. “It was really interesting that they thought about the societal value right at the start, and then used those ethical considerations to inform the way they approached the problem,” said Knight.

The wider impact of the hackathon is clear to see, with the event providing a short, intense and collaborative learning experience for early-career researchers, technology providers, and both small start-up companies and large multinationals. This year, however, the hackathon also provided the finale to the Quantum Fringe, which was the brainchild of Elham Kashefi and her team at the QSL. Taking inspiration from the better-known Edinburgh Fringe, the idea was to create a diverse programme of events to engage and inspire different audiences with the latest ideas in quantum computing.

“We wanted to celebrate the International Year of Quantum in a unique way,” said Mina Doosti, one of the QSL’s lead researchers. “We had lots of very different events, many of which we hadn’t foreseen at the start. It was very refreshing, and we had a lot of fun.”

One of Doosti’s favourite events was a two-day summer school designed for senior high-school students. As well as introducing the students to the concepts of quantum computing, the QSL researchers challenged them to write some code that could be run on IBM’s free-to-access quantum computer. “The organizers and lecturers from the QSL worked hard to develop material that would make sense to the students, and the attendees really grabbed the opportunity to come and learn,” Doosti explained. “From the questions they were asking and the way they tackled the games and challenges, we could see that they were interested and that they had learnt something.”

From the outset the QSL team were also keen for the Quantum Fringe to become a focal point for quantum-inspired activities that were being planned by other organizations. Starting from a baseline of four pillar events that had been organized by the NQCC and the QSL in previous years, the programme eventually swelled to 16 separate gatherings with different aims and outcomes. That included a public lecture organized by the new QCi3 Hub – a research consortium focused on interconnected quantum technologies – which attracted around 200 people who wanted to know more about the evolution of quantum science and its likely impact across technology, industry, and society. An open discussion forum hosted by Quantinuum, one of the main sponsors of the festival, also brought together academic researchers, industry experts and members of the public to identify strategies for ensuring that quantum computing benefits everyone in society, not just a privileged few.

Quantum researchers also had plenty of technical events to choose from. The regular AIMday Quantum Computing, now in its third year, enabled academics to work alongside industry experts to explore a number of business-led challenges. More focused scientific meetings allowed researchers to share their latest results in quantum cryptography and cybersecurity, algorithms and complexity, and error correction in neutral atoms. For her part, Doosti co-led the third edition of Foundations in Quantum Computing, a workshop that combines invited talks with dedicated time for focused discussion. “The speakers are briefed to cover the evolution of a particular field and to highlight open challenges, and then we use the discussion sessions to brainstorm ideas around a specific question,” she explained.

Those scientific meetings were complemented by a workshop on responsible quantum innovation, again hosted by the QCi3 Hub, and a week-long summer school on the Isle of Skye that was run by Heriot-Watt University and the London School of Mathematics. “All of our partners ran their events in the way they wanted, but we helped them with local support and some marketing and promotion,” said Ramin Jafarzadegan, the QSL’s operations manager and the chair of the Quantum Fringe festival. “Bringing all of these activities together delivered real value because visitors to Edinburgh could take part in multiple events.”

Indeed, one clear benefit of this approach was that some of the visiting scientists stayed for longer, which also enabled them to work alongside the QSL team. That has inspired a new scheme, called QSL Visiting Scholars, that aims to encourage scientists from other institutions to spend a month or so in Edinburgh to pursue collaborative projects.

As a whole, the Quantum Fringe has helped both the NQCC and the QSL in their ambitions to bring diverse stakeholders together to create new connections and to grow the ecosystem for quantum computing in the UK. “The NQCC should have patented the ‘quantum hackathon’ name,” joked Sir Peter. “Similar events are popping up everywhere these days, but the NQCC’s was among the first.”

The post Festival opens up the quantum realm appeared first on Physics World.

Understanding strongly correlated topological insulators

13 août 2025 à 09:03

Topological insulators have generated a lot of interest in recent years because of their potential applications in quantum computing, spintronics and information processing.

The defining property of these materials is that their interior behaves as an electrical insulator while their surface behaves as an electrical conductor. In other words, electrons can only move along the material’s surface.

In some cases, however – known as strongly correlated systems – the strong interactions between electrons cause this relatively simple picture to break down.

Understanding and modelling strongly correlated topological insulators, it turns out, is extremely challenging.

A team of researchers from the Kavli Institute for Theoretical Sciences in China has recently tackled this challenge using a new approach based on fermionic tensor states.

Their framework notably reduces the number of parameters needed in numerical simulations. This should lead to a greatly improved computational efficiency when modelling these systems.

By combining their methods with advanced numerical techniques, the researchers expect to be able to overcome the challenges posed by strong interaction effects.

This will lead to a deeper understanding of the properties of strongly correlated systems and could also enable the discovery of new materials with exciting new properties.

The post Understanding strongly correlated topological insulators appeared first on Physics World.

Physicists get dark excitons under control

12 août 2025 à 16:30
Dark exciton control: Researchers assemble a large cryostat in an experimental physics laboratory, preparing for ultra-low temperature experiments with quantum dots on a semiconductor chip. (Courtesy: Universität Innsbruck)

Physicists in Austria and Germany have developed a means of controlling quasiparticles known as dark excitons in semiconductor quantum dots for the first time. The new technique could be used to generate single pairs of entangled photons on demand, with potential applications in quantum information storage and communication.

Excitons are bound pairs of negatively charged electrons and positively charged “holes”. When these electrons and holes have opposite spins, they recombine easily, emitting a photon in the process. Excitons of this type are known as “bright” excitons. When the electrons and holes have parallel spins, however, direct recombination by emitting a photon is not possible because it would violate the conservation of spin angular momentum. This type of exciton is therefore known as a “dark” exciton.
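The “bright” and “dark” labels follow from simple angular-momentum bookkeeping. In the standard textbook picture of a quantum dot exciton (general background, not specific to this experiment), the electron contributes a spin projection of ±1/2 and the heavy hole ±3/2, so the total projection is either ±1 or ±2:

```latex
% Heavy-hole exciton angular-momentum projections (textbook picture):
% electron spin  s_z^e = \pm 1/2,  heavy-hole  j_z^h = \pm 3/2
\begin{align}
  M = s_z^e + j_z^h &= \pm 1
    &&\text{(bright: matches the photon spin $\pm\hbar$, can decay)}\\
  M = s_z^e + j_z^h &= \pm 2
    &&\text{(dark: no single photon can carry $2\hbar$, decay forbidden)}
\end{align}
```

A single emitted photon carries angular momentum ±ħ, so only the M = ±1 states can recombine radiatively, which is exactly the conservation argument above.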

Because dark excitons are not optically active, they have much longer lifetimes than their bright cousins. For quantum information specialists, this is an attractive quality, because it means that dark excitons can store quantum states – and thus the information contained within these states – for much longer. “This information can then be released at a later time and used in quantum communication applications, such as optical quantum computing, secure communication via quantum key distribution (QKD) and quantum information distribution in general,” says Gregor Weihs, a quantum photonics expert at the Universität Innsbruck, Austria who led the new study.

The problem is that dark excitons are difficult to create and control. In semiconductor quantum dots, Weihs explains, dark excitons tend to be generated randomly – for example, when a quantum dot in a higher-energy state decays into a lower-energy state.

Chirped laser pulses lead to reversible exciton production

In the new work, which is detailed in Science Advances, the researchers showed that they could control the production of dark excitons in quantum dots by using laser pulses that are chirped, meaning that the frequency (or colour) of the laser light varies within the pulse. Such chirped pulses, Weihs explains, can turn one quantum dot state into another.

“We first bring the quantum dot to the (bright) biexciton state using a conventional technique and then apply a (storage) chirped laser pulse that turns this biexciton occupation (adiabatically) into a dark state,” he says. “The storage pulse is negatively chirped – its frequency decreases with time, or in terms of colour, it turns redder.” Importantly, the process is reversible: “To convert the dark exciton back into a bright state, we apply a (positively chirped) retrieval pulse to it,” Weihs says.
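A linearly chirped pulse can be written schematically as follows (an illustrative textbook form, not the exact pulse shape used in the experiment), with the sign of the chirp rate α determining whether the pulse gets redder or bluer over time:

```latex
% Gaussian pulse of duration \tau with linear chirp rate \alpha:
E(t) \propto \exp\!\left(-\frac{t^{2}}{2\tau^{2}}\right)
      \cos\!\left(\omega_{0} t + \tfrac{1}{2}\alpha t^{2}\right),
\qquad
\omega(t) = \omega_{0} + \alpha t .
% \alpha < 0: frequency falls with time (the "redder" storage pulse);
% \alpha > 0: frequency rises with time (the retrieval pulse).
```

The instantaneous frequency ω(t) sweeps through the pulse, which is what allows the chirped pulse to carry the system adiabatically from one state to another rather than driving an abrupt transition.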

One possible application for the new technique would be to generate single pairs of entangled photons on demand – the starting point for many quantum communication protocols. Importantly, Weihs adds that this should be possible with almost any type of quantum dot, whereas an alternative method known as polarization entanglement works for only a few quantum dot types with very special properties. “For example, it could be used to create ‘time-bin’ entangled photon pairs,” he tells Physics World. “Time-bin entanglement is particularly suited to transmitting quantum information through optical fibres because the quantum state stays preserved over very long distances.”

The study’s lead author, Florian Kappe, and his colleague Vikas Remesh describe the project as “a challenging but exciting and rewarding experience” that combined theoretical and experimental tools. “The nice thing, we feel, is that on this journey, we developed a number of optical excitation methods for quantum dots for various applications,” they say via e-mail.

The physicists are now studying the coherence time of the dark exciton states, which is an important property in determining how long they can store quantum information. According to Weihs, the results from this work could make it possible to generate higher-dimensional time-bin entangled photon pairs – for example, pairs of quantum states called qutrits that have three possible values.

“Thinking beyond this, we imagine that the technique could even be applied to multi-excitonic complexes in quantum dot molecules,” he adds. “This could possibly result in multi-photon entanglement, such as so-called GHZ (Greenberger-Horne-Zeilinger) states, which are an important resource in multiparty quantum communication scenarios.”

The post Physicists get dark excitons under control appeared first on Physics World.

IOP president-elect Michele Dougherty named next Astronomer Royal

12 août 2025 à 13:57

The space scientist Michele Dougherty from Imperial College London has been appointed the next Astronomer Royal – the first woman to hold the position. She will succeed the University of Cambridge cosmologist Martin Rees, who has held the role for the past three decades.

The title of Astronomer Royal dates back to the creation of the Royal Observatory in Greenwich in 1675, when it mostly involved advising Charles II on using the stars to improve navigation at sea. John Flamsteed from Derby was the first Astronomer Royal and since then 15 people have held the role.

Dougherty will now act as the official adviser to King Charles III on astronomical matters. She will hold the role alongside her Imperial job as well as being executive chair of the Science and Technology Facilities Council and the next president of the Institute of Physics (IOP), a two-year position she will take up in October.

After gaining a PhD in 1988 from the University of Natal in South Africa, Dougherty moved to Imperial in 1991, where she was head of physics from 2018 until 2024. She has been principal investigator for the magnetometer on the Cassini–Huygens mission to Saturn and its moons, and for the magnetometer on the JUICE craft, which is currently travelling to Jupiter to study its three icy moons.

She was made Commander of the Order of the British Empire in the 2018 New Year Honours for “services to UK Physical Science Research”. Dougherty is also a fellow of the Royal Society, which awarded her its Hughes Medal in 2008 for her studies of Saturn’s moons; she held a Royal Society Research Professorship from 2014 to 2019.

“I am absolutely delighted to be taking on the important role of Astronomer Royal,” says Dougherty. “As a young child I never thought I’d end up working on planetary spacecraft missions and science, so I can’t quite believe I’m actually taking on this position. I look forward to engaging the general public in how exciting astronomy is, and how important it and its outcomes are to our everyday life.”

Tom Grinyer, IOP group chief executive officer, offered his “warmest congratulations” to Dougherty. “As incoming president of the IOP and the first woman to hold this historic role [of Astronomer Royal], Dougherty is an inspirational ambassador for science and a role model for every young person who has gazed up at the stars and imagined a future in physics or astronomy.”

The post IOP president-elect Michele Dougherty named next Astronomer Royal appeared first on Physics World.
