Microscale ‘wave-on-a-chip’ device sheds light on nonlinear hydrodynamics

28 November 2025, 10:40

A new microscale version of the flumes that are commonly used to reproduce wave behaviour in the laboratory will make it far easier to study nonlinear hydrodynamics. The device consists of a layer of superfluid helium just a few atoms thick on a silicon chip, and its developers at the University of Queensland, Australia, say it could help us better understand phenomena ranging from oceans and hurricanes to weather and climate.

“The physics of nonlinear hydrodynamics is extremely hard to model because of instabilities that ultimately grow into turbulence,” explains study leader Warwick Bowen of Queensland’s Quantum Optics Laboratory. “It is also very hard to study in experiments since these often require hundreds-of-metre-long wave flumes.”

While such flumes are good for studying shallow-water dynamics like tsunamis and rogue waves, Bowen notes that they struggle to access many of the complex wave behaviours, such as turbulence, found in nature.

Amplifying the nonlinearities in complex behaviours

The team say that the geometrical structure of the new wave-on-a-chip device can be designed at will using lithographic techniques and built in a matter of days. Superfluid helium placed on its surface can then be controlled optomechanically. Thanks to these innovations, the researchers were able to experimentally measure nonlinear hydrodynamics millions of times faster than would be possible using traditional flumes. They could also “amplify” the nonlinearities of complex behaviours, making them orders of magnitude stronger than is possible in even the largest wave flumes.

“This promises to change the way we do nonlinear hydrodynamics, with the potential to discover new equations that better explain the complex physics behind it,” Bowen says. “Such a technique could be used widely to improve our ability to predict both natural and engineered hydrodynamic behaviours.”

So far, the team has used the chip to measure several effects, including wave steepening, shock fronts and solitary-wave fission. While these nonlinear behaviours had been predicted in superfluids, they had never been directly observed there until now.

Waves can be generated in a very shallow depth

The Quantum Optics Laboratory researchers have been studying superfluid helium for over a decade. A key feature of this quantum liquid is that it flows without resistance, similar to the way electrons move without resistance in a superconductor. “We realized that this behaviour could be exploited in experimental studies of nonlinear hydrodynamics because it allows waves to be generated in a very shallow depth – even down to just a few atoms deep,” Bowen explains.

In conventional fluids, Bowen continues, resistance to motion becomes hugely important at small scales, and ultimately limits the nonlinear strengths accessible in traditional flume-based testing rigs. “Moving from the tens-of-centimetre depths of these flumes to tens-of-nanometres, we realized that superfluid helium could allow us to achieve many orders of magnitude stronger nonlinearities – comparable to the largest flows in the ocean – while also greatly increasing measurement speeds. It was this potential that attracted us to the project.”
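To see why depth matters, a back-of-the-envelope sketch helps: the strength of shallow-water nonlinearity is commonly characterized by the ratio of wave amplitude to fluid depth, so a nanometre-deep film reaches order-unity nonlinearity at wave amplitudes that are completely negligible in a laboratory flume. The amplitude and depth values below are illustrative assumptions, not figures from the study.

```python
# Toy estimate: shallow-water nonlinearity scales with the
# amplitude-to-depth ratio a/h (order unity => strongly nonlinear).
def nonlinearity(amplitude_m: float, depth_m: float) -> float:
    """Dimensionless nonlinearity parameter epsilon = a / h."""
    return amplitude_m / depth_m

# Laboratory flume: ~10 cm amplitude waves in ~1 m of water.
eps_flume = nonlinearity(0.1, 1.0)      # weakly nonlinear

# Superfluid film: ~10 nm amplitude waves on a ~30 nm deep film.
eps_film = nonlinearity(10e-9, 30e-9)   # strongly nonlinear

print(f"flume epsilon = {eps_flume:.2f}, film epsilon = {eps_film:.2f}")
```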

The experiments were far from simple, however. To do them, the researchers needed to cryogenically cool the system to near absolute zero temperatures. They also needed to fabricate exceptionally thin superfluid helium films that interact very weakly with light, as well as optical devices with structures smaller than a micron. Combining all these components required what Bowen describes as “something of a hero experiment”, with important contributions coming from the team’s co-leader, Christopher Baker, and Walter Wasserman, who was then a PhD student in the group. The wave dynamics themselves, Bowen adds, were “exceptionally complex” and were analysed by Matthew Reeves, the first author of a Science paper describing the device.

As well as the application areas mentioned earlier, the team say the new work, which is supported by the US Defense Advanced Research Projects Agency’s APAQuS Program, could also advance our understanding of strongly-interacting quantum structures that are difficult to model theoretically. “Superfluid helium is a classic example of such a system,” explains Bowen, “and our measurements represent the most precise measurements of wave physics in these. Other applications may be found in quantum technologies, where the flow of superfluid helium could – somewhat speculatively – replace superconducting electron flow in future quantum computing architectures.”

The researchers now plan to use the device and machine learning techniques to search for new hydrodynamics equations.

The post Microscale ‘wave-on-a-chip’ device sheds light on nonlinear hydrodynamics appeared first on Physics World.

Is your WiFi spying on you?

25 November 2025, 17:00

WiFi networks could pose significant privacy risks even to people who aren’t carrying or using WiFi-enabled devices, say researchers at the Karlsruhe Institute of Technology (KIT) in Germany. According to their analysis, the current version of the technology passively records information that is detailed enough to identify individuals moving through networks, prompting them to call for protective measures in the next iteration of WiFi standards.

Although wireless networks are ubiquitous and highly useful, they come with certain privacy and security risks. One such risk stems from a phenomenon known as WiFi sensing, which the researchers at KIT’s Institute of Information Security and Dependability (KASTEL) define as “the inference of information about the networks’ environment from its signal propagation characteristics”.

“As signals propagate through matter, they interfere with it – they are either transmitted, reflected, absorbed, polarized, diffracted, scattered, or refracted,” they write in their study, which is published in the Proceedings of the 2025 ACM SIGSAC Conference on Computer and Communications Security (CCS ’25). “By comparing an expected signal with a received signal, the interference can be estimated and used for error correction of the received data.”

 An under-appreciated consequence, they continue, is that this estimation contains information about any humans who may have unwittingly been in the signal’s path. By carefully analysing the signal’s interference with the environment, they say, “certain aspects of the environment can be inferred” – including whether humans are present, what they are doing and even who they are.

“Identity inference attack” is a threat

The KASTEL team terms this an “identity inference attack” and describes it as a threat that is as widespread as it is serious. “This technology turns every router into a potential means for surveillance,” says Julian Todt, who co-led the study with his KIT colleague Thorsten Strufe. “For example, if you regularly pass by a café that operates a WiFi network, you could be identified there without noticing it and be recognized later – for example by public authorities or companies.”

While Todt acknowledges that security services, cybercriminals and others do have much simpler ways of tracking individuals – for example by accessing data from CCTV cameras or video doorbells – he argues that the ubiquity of wireless networks lends itself to being co-opted as a near-permanent surveillance infrastructure. There is, he adds, “one concerning property” about wireless networks: “They are invisible and raise no suspicion.”

Identity of individuals could be extracted using a machine-learning model

Although the possibility of using WiFi networks in this way is not new, most previous WiFi-based security attacks worked by analysing so-called channel state information (CSI). These data indicate how a radio signal changes when it reflects off walls, furniture, people or animals. However, the KASTEL researchers note that a newer WiFi standard, WiFi 5 (802.11ac), changes the picture by enabling a new and potentially easier form of attack based on beamforming feedback information (BFI).

While beamforming relies on similar information to CSI, Todt explains that it does so on the sender’s side instead of the receiver’s. This means that a BFI-based surveillance method would require nothing more than standard devices connected to the WiFi network. “The BFI could be used to create images from different perspectives that might then serve to identify persons that find themselves in the WiFi signal range,” Todt says. “The identity of individuals passing through these radio waves could then be extracted using a machine-learning model. Once trained, this model would make an identification in just a few seconds.”

In their experiments, Todt and colleagues studied 197 participants as they walked through a WiFi field while being simultaneously recorded with CSI and BFI from four different angles. The participants had five different “walking styles” (such as walking normally and while carrying a backpack) as well as different gaits. The researchers found that they could identify individuals with nearly 100% accuracy, regardless of the recording angle or the individual’s walking style or gait.
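The classification step can be sketched in miniature. The toy model below is an assumption-laden illustration, not the KASTEL pipeline: it treats each person’s influence on the channel as a characteristic feature vector and recognizes people with a simple nearest-centroid classifier. The signatures, noise level and feature dimension are all invented for illustration.

```python
import numpy as np

# Illustrative sketch: treat each person's WiFi channel response as a
# feature vector, build per-person centroids from "training" samples,
# then check recognition accuracy on held-out samples.
rng = np.random.default_rng(0)
n_people, n_features, n_train, n_test = 20, 64, 30, 10

# Each "person" perturbs the channel in a characteristic way:
signatures = rng.normal(size=(n_people, n_features))

def sample(person, n):
    """Simulated BFI-like measurements: signature plus noise."""
    return signatures[person] + 0.3 * rng.normal(size=(n, n_features))

centroids = np.stack([sample(p, n_train).mean(axis=0)
                      for p in range(n_people)])

def predict(x):
    """Assign a measurement to the nearest person centroid."""
    return int(np.argmin(np.linalg.norm(centroids - x, axis=1)))

correct = sum(predict(x) == p
              for p in range(n_people)
              for x in sample(p, n_test))
accuracy = correct / (n_people * n_test)
print(f"identification accuracy: {accuracy:.2%}")
```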

“Risks to our fundamental rights”

“The technology is powerful, but at the same time entails risks to our fundamental rights, especially to privacy,” says Strufe. He warns that authoritarian states could use the technology to track demonstrators and members of opposition groups, prompting him and his colleagues to “urgently call” for protective measures and privacy safeguards to be included in the forthcoming IEEE 802.11bf WiFi standard.

“The literature on all novel sensing solutions highlights their utility for various novel applications,” says Todt, “but the privacy risks that are inherent to such sensing are often overlooked, or worse — these sensors are claimed to be privacy-friendly without any rationale for these claims. As such, we feel it necessary to point out the privacy risks that novel solutions such as WiFi sensing bring with them.”

The researchers say they would like to see approaches developed that can mitigate the risk of identity inference attack. However, they are aware that this will be difficult, since this type of attack exploits the physical properties of the actual WiFi signal. “Ideally, we would influence the WiFi standard to contain privacy-protections in future versions,” says Todt, “but even the impact of this would be limited as there are already millions of WiFi devices out there that are vulnerable to such an attack.”

‘Patchy’ nanoparticles emerge from new atomic stencilling technique

25 November 2025, 10:00

Researchers in the US and Korea have created nanoparticles with carefully designed “patches” on their surfaces using a new atomic stencilling technique. These patches can be controlled with incredible precision, and could find use in targeted drug delivery, catalysis, microelectronics and tissue engineering.

The first step in the stencilling process is to create a mask on the surface of gold nanoparticles. This mask prevents a “paint” made from grafted-on polymers from attaching to certain areas of the nanoparticles.

“We then use iodide ions as a stencil,” explains Qian Chen, a materials scientist and engineer at the University of Illinois at Urbana-Champaign, US, who led the new research effort. “These adsorb (stick) to the surface of the nanoparticles in specific patterns that depend on the shape and atomic arrangement of the nanoparticles’ facets. That’s how we create the patches – the areas where the polymers selectively bind.” Chen adds that she and her collaborators can then tailor the surface chemistry of these tiny patchy nanoparticles in a very controlled way.

A gap in the field of microfabrication stencilling

The team decided to develop the technique after realizing there was a gap in the field of microfabrication stencilling. While techniques in this area have advanced considerably in recent years, allowing ever-smaller microdevices to be incorporated into ever-faster computer chips, most of them rely on top-down approaches for precisely controlling nanoparticles. By comparison, Chen says, bottom-up methods have been largely unexplored even though they are low-cost, solution-processable, scalable and compatible with complex, curved and three-dimensional surfaces.

Reporting their work in Nature, the researchers say they were inspired by the way proteins naturally self-assemble. “One of the holy grails in the field of nanomaterials is making complex, functional structures from nanoscale building blocks,” explains Chen. “It’s extremely difficult to control the direction and organization of each nanoparticle. Proteins have different surface domains, and thanks to their interactions with each other, they can make all the intricate machines we see in biology. We therefore adopted that strategy by creating patches or distinct domains on the surface of the nanoparticles.”

“Elegant and impressive”

Philip Moriarty, a physicist at the University of Nottingham, UK, who was not involved in the project, describes it as “elegant and impressive” work. “Chen and colleagues have essentially introduced an entirely new mode of self-assembly that allows for much greater control of nanoparticle interactions,” he says, “and the ‘atomic stencil’ concept is clever and versatile.”

The team, which includes researchers at the University of Michigan, Pennsylvania State University, Cornell, Brookhaven National Laboratory and Korea’s Chonnam National University as well as Urbana-Champaign, agrees that the potential applications are vast. “Since we can now precisely control the surface properties of these nanoparticles, we can design them to interact with their environment in specific ways,” explains Chen. “That opens the door for more effective drug delivery, where nanoparticles can target specific cells. It could also lead to new types of catalysts, more efficient microelectronic components and even advanced materials with unique optical and mechanical properties.”

She and her colleagues say they now want to extend their approach to different types of nanoparticles and different substrates to find out how versatile it truly is. They will also be developing computational models that can predict the outcome of the stencilling process – something that would allow them to design and synthesize patchy nanoparticles for specific applications on demand.

Plasma bursts from young stars could shed light on the early life of the Sun

21 November 2025, 10:00

The Sun frequently ejects high-energy bursts of plasma that then travel through interplanetary space. These so-called coronal mass ejections (CMEs) are accompanied by strong magnetic fields, which, when they interact with the Earth’s atmosphere, can trigger solar storms that can severely damage satellite systems and power grids.

In the early days of the solar system, the Sun was far more active than it is today and ejected much bigger CMEs. These might have been energetic enough to affect our planet’s atmosphere and therefore influence how life emerged and evolved on Earth, according to some researchers.

Since it is impossible to study the early Sun, astronomers use proxies – that is, stars that resemble it. These “exo-suns” are young G-, K- and M-type stars and are far more active than our Sun is today. They frequently produce CMEs with energies far larger than the most energetic solar flares recorded in recent times, which might not only affect their planets’ atmospheres, but may also affect the chemistry on these planets.

Until now, direct observational evidence for eruptive CME-like phenomena on young solar analogues has been limited. This is because clear signatures of stellar eruptions are often masked by the brightness of the host stars and by flares on the stars themselves. Measurements of Doppler shifts in optical lines have allowed astronomers to detect a few possible stellar eruptions associated with giant superflares on a young solar analogue, but these detections have been limited to single-wavelength data at “low temperatures” of around 10⁴ K. Studies at higher temperatures have been few and far between. And although scientists have tried out promising techniques, such as X-ray and UV dimming, to advance their understanding of these “cool” stars, few simultaneous multi-wavelength observations have been made.

A large Carrington-class flare from EK Draconis

On 29 March 2024, astronomers at Kyoto University in Japan detected a large Carrington-class flare – or superflare – in the far-ultraviolet from EK Draconis, a G-type star located approximately 112 light-years away from the Sun. Thanks to simultaneous observations in the ultraviolet and optical ranges of the electromagnetic spectrum, they say they have now been able to obtain the first direct evidence for a multi-temperature CME from this young solar analogue (which is around 50 to 125 million years old and has a radius similar to the Sun’s).

The researchers’ campaign spanned four consecutive nights from 29 March to 1 April 2024. They made their ultraviolet observations with the Hubble Space Telescope and the Transiting Exoplanet Survey Satellite (TESS) and performed optical monitoring using three ground-based telescopes in Japan, Korea and the US.

They found that the far-ultraviolet and optical lines were Doppler shifted during and just before the superflare, with the ultraviolet observations showing blueshifted emission indicative of hot plasma. About 10 minutes later, the optical telescopes observed blueshifted absorption in the hydrogen Hα line, which indicates cooler gases. According to the team’s calculations, the hot plasma had a temperature of 100 000 K and was ejected at speeds of 300–550 km/s, while the “cooler” gas (with a temperature of 10 000 K) was ejected at 70 km/s.
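Converting such measurements from wavelength shifts to speeds uses the standard non-relativistic Doppler relation. The short sketch below shows the arithmetic for the ~70 km/s cool component seen in Hα; the rest wavelength is the standard value, and the shift is back-computed purely for illustration.

```python
# Non-relativistic Doppler relation used to turn line shifts into
# line-of-sight speeds: v = c * (lambda_obs - lambda_rest) / lambda_rest.
C_KM_S = 299_792.458  # speed of light in km/s

def doppler_velocity(lam_obs_nm: float, lam_rest_nm: float) -> float:
    """Line-of-sight velocity in km/s (negative = blueshift)."""
    return C_KM_S * (lam_obs_nm - lam_rest_nm) / lam_rest_nm

# H-alpha rest wavelength; shift corresponding to a ~70 km/s
# blueshifted cool component (illustrative).
H_ALPHA = 656.281  # nm
shift = -70.0 / C_KM_S * H_ALPHA   # roughly -0.15 nm
v = doppler_velocity(H_ALPHA + shift, H_ALPHA)
print(f"blueshift of {shift:.3f} nm -> v = {v:.1f} km/s")
```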

“These findings imply that it is the hot plasma rather than the cool plasma that carries kinetic energy into planetary space,” explains study leader Kosuke Namekata. “The existence of this plasma suggests that such CMEs from our Sun in the past, if frequent and strong, could have driven shocks and energetic particles capable of eroding or chemically altering the atmosphere of the early Earth and the other planets in our solar system.”

“The discovery,” he tells Physics World, “provides the first observational link between solar and stellar eruptions, bridging stellar astrophysics, solar physics and planetary science.”

Looking forward, the researchers, who report their work in Nature Astronomy, now plan to conduct similar, multiwavelength campaigns on other young solar analogues to determine how frequently such eruptions occur and how they vary from star to star.

“In the near future, next-generation ultraviolet space telescopes such as JAXA’s LAPYUTA and NASA’s ESCAPADE, coordinated with ground-based facilities, will allow us to trace these events more systematically and understand their cumulative impact on planetary atmospheres,” says Namekata.

Flattened halo of dark matter could explain high-energy ‘glow’ at Milky Way’s heart

20 November 2025, 18:00

Astronomers have long puzzled over the cause of a mysterious “glow” of very high energy gamma radiation emanating from the centre of our galaxy. One possibility is that dark matter – the unknown substance thought to make up more than 25% of the universe’s mass – might be involved. Now, a team led by researchers at Germany’s Leibniz Institute for Astrophysics Potsdam (AIP) says that a flattened rather than spherical distribution of dark matter could account for the glow’s properties, bringing us a step closer to solving the mystery.

Dark matter is believed to be responsible for holding galaxies together. However, since it does not interact with light or other electromagnetic radiation, it can only be detected through its gravitational effects. Hence, while astrophysical and cosmological evidence has confirmed its presence, its true nature remains one of the greatest mysteries in modern physics.

“It’s extremely consequential and we’re desperately thinking all the time of ideas as to how we could detect it,” says Joseph Silk, an astronomer at Johns Hopkins University in the US and the Institut d’Astrophysique de Paris and Sorbonne University in France who co-led this research together with the AIP’s Moorits Mihkel Muru. “Gamma rays, and specifically the excess light we’re observing at the centre of our galaxy, could be our first clue.”

Models might be too simple

The problem, Muru explains, is that the way scientists have usually modelled dark matter to account for the excess gamma-ray radiation in astronomical observations was highly simplified. “This, of course, made the calculations easier, but simplifications always fuzzy the details,” he says. “We showed that in this case, the details are important: we can’t model dark matter as a perfectly symmetrical cloud and instead have to take into account the asymmetry of the cloud.”

Muru adds that the team’s findings, which are detailed in Phys. Rev. Lett., provide a boost to the “dark matter annihilation” explanation of the excess radiation. According to the standard model of cosmology, all galaxies – including our own Milky Way – are nested inside huge haloes of dark matter. The density of this dark matter is highest at the centre, and while it primarily interacts through gravity, some models suggest that it could be made of massive, neutral elementary particles that are their own antimatter counterparts. In these dense regions, therefore, such dark matter species could be mutually annihilating, producing substantial amounts of radiation.
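Because the annihilation rate scales with the square of the density, the predicted gamma-ray signal along any line of sight is an integral of ρ² (the “J-factor”), and flattening the halo changes that integral off the galactic plane. The sketch below illustrates this with an NFW-style profile evaluated on ellipsoidal shells; the profile, flattening parameter and distances are illustrative assumptions, not values from the AIP simulations.

```python
import numpy as np

# Sketch: compare the rho^2 line-of-sight integral for a spherical
# halo (q = 1) and a flattened one (q = 0.7), looking toward the
# galactic centre at a given galactic latitude.
R_SUN, R_S = 8.2, 20.0   # kpc: Sun-centre distance, scale radius

def density(x, y, z, q):
    """NFW-like profile on ellipsoidal shells with flattening q."""
    m = np.sqrt(x**2 + y**2 + (z / q) ** 2)
    return 1.0 / ((m / R_S) * (1.0 + m / R_S) ** 2)

def j_factor(lat_deg, q, n=4000, s_max=30.0):
    """Integral of rho^2 along a line of sight at latitude lat_deg."""
    b = np.radians(lat_deg)
    s = np.linspace(1e-3, s_max, n)   # distance from the Sun, kpc
    x = -R_SUN + s * np.cos(b)        # toward the galactic centre
    z = s * np.sin(b)
    rho2 = density(x, 0.0, z, q) ** 2
    # trapezoidal rule
    return float(np.sum(0.5 * (rho2[1:] + rho2[:-1]) * np.diff(s)))

j_sph = j_factor(10.0, q=1.0)
j_flat = j_factor(10.0, q=0.7)
print(f"J(10 deg): spherical {j_sph:.3f}, flattened {j_flat:.3f}")
```

Off the plane, the flattened halo is less dense everywhere along the sightline, so its J-factor is smaller; in the plane the two profiles coincide, which is why flattening reshapes the latitude profile of the predicted glow.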

Pierre Salati, an emeritus professor at the Université Savoie Mont Blanc, France, who was not involved in this work, says that in these models, annihilation plays a crucial role in generating a dark matter component with an abundance that agrees with cosmological observations. “Big Bang nucleosynthesis sets stringent bounds on these models as a result of the overall concordance between the predicted elemental abundances and measurements, although most models do survive,” Salati says. “One of the most exciting aspects of such explanations is that dark matter species might be detected through the rare antimatter particles – antiprotons, positrons and anti-deuterons – that they produce as they currently annihilate inside galactic halos.”

Silvia Manconi of the Laboratoire de Physique Théorique et Hautes Energies (LPTHE), France, who was also not involved in the study, describes it as “interesting and stimulating”. However, she cautions that – as is often the case in science – reality is probably more complex than even advanced simulations can capture. “This is not the first time that galaxy simulations have been used to study the implications of the excess and found non-spherical shapes,” she says, though she adds that the simulations in the new work offer “significant improvements” in terms of their spatial resolution.

Manconi also notes that the study does not demonstrate how the proposed distribution of dark matter would appear in data from the Fermi Gamma-ray Space Telescope’s Large Area Telescope (LAT), or how it would differ quantitatively from observations of a distribution of old stars. Forthcoming observations with radio telescopes such as MeerKAT and FAST, she adds, may soon identify pulsars in this region of the galaxy, shedding further light on other possible contributions to the excess of gamma rays.

New telescopes could help settle the question

Muru acknowledges that better modelling and observations are still needed to rule out other possible hypotheses. “Studying dark matter is very difficult, because it doesn’t emit or block light, and despite decades of searching, no experiment has yet detected dark matter particles directly,” he tells Physics World. “A confirmation that this observed excess radiation is caused by dark matter annihilation through gamma rays would be a big leap forward.”

New gamma-ray telescopes with higher resolution, such as the Cherenkov Telescope Array, could help settle this question, he says. If these telescopes, which are currently under construction, fail to find star-like sources for the glow and only detect diffuse radiation, that would strengthen the alternative dark matter annihilation explanation.

Muru adds that a “smoking gun” for dark matter would be a signal that matches current theoretical predictions precisely. In the meantime, he and his colleagues plan to work on predicting where dark matter should be found in several of the dwarf galaxies that circle the Milky Way.

“It’s possible we will see the new data and confirm one theory over the other,” Silk says. “Or maybe we’ll find nothing, in which case it’ll be an even greater mystery to resolve.”

New cylindrical metamaterials could act as shock absorbers for sensitive equipment

20 November 2025, 10:00

A 3D-printed structure called a kagome tube could form the backbone of a new system for muffling damaging vibrations. The structure is part of a class of materials known as topological mechanical metamaterials, and unlike previous materials in this group, it is simple enough to be deployed in real-world situations. According to lead developer James McInerney of the Wright-Patterson Air Force Base in Ohio, US, it could be used as shock protection for sensitive systems found in civil and aerospace engineering applications.

McInerney and colleagues’ tube-like design is made from a lattice of beams arranged in such a way that low-energy vibrational modes called floppy modes become localized to one side. “This provides good properties for isolating vibrations because energy input into the system on the floppy side does not propagate to the other side,” McInerney says.

The key to this desirable behaviour, he explains, is the arrangement of the beams that form the lattice structure. Using a pattern first proposed by the 19th century physicist James Clerk Maxwell, the beams are organized into repeating sub-units to form stable, two-dimensional structures known as topological Maxwell lattices.

Self-supporting design

Previous versions of these lattices could not support their own weight. Instead, they were attached to rigid external mounts, making it impractical to integrate them into devices. The new design, in contrast, is made by folding a flat Maxwell lattice into a cylindrical tube that is self-supporting. The tube features a connected inner and outer layer – a kagome bilayer – and its radius can be precisely engineered to give it the topological behaviour desired.

The researchers, who detail their work in Physical Review Applied, first tested their structure numerically by attaching a virtual version to a mechanically sensitive sample and a source of low-energy vibrations. As expected, the tube diverted the vibrations away from the sample and towards the other end of the tube.

Next, they developed a simple spring-and-mass model to understand the tube’s geometry by considering it as a simple monolayer. This modelling indicated that the polarization of the tube should be similar to the polarization of the monolayer. They then added rigid connectors to the tube’s ends and used a finite-element method to calculate the frequency-dependent patterns of vibrations propagating across the structure. They also determined the effective stiffness of the lattice as they applied loads parallel and perpendicular to it.
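The frequency-dependent behaviour that such spring-and-mass models capture can be illustrated with a much simpler system: a uniform chain of masses and springs transmits a harmonic drive only below its cutoff frequency and attenuates it exponentially above. The sketch below is a generic illustration of this mechanism, not the team’s kagome model, and all parameter values are arbitrary.

```python
import numpy as np

# A chain of N masses coupled by springs (ends tied to ground) has a
# passband below omega_c = 2*sqrt(k/m); driving above the band gives
# an evanescent, strongly attenuated response at the far end.
N, k, m, damping = 40, 1.0, 1.0, 0.02

# Stiffness matrix of the grounded chain.
K = np.zeros((N, N))
for i in range(N):
    K[i, i] = 2.0 * k
    if i + 1 < N:
        K[i, i + 1] = K[i + 1, i] = -k

def transmission(omega):
    """|end amplitude / driven amplitude| for harmonic forcing."""
    A = K - (omega**2) * m * np.eye(N) + 1j * omega * damping * np.eye(N)
    f = np.zeros(N, dtype=complex)
    f[0] = 1.0                       # drive the first mass
    x = np.linalg.solve(A, f)
    return abs(x[-1]) / abs(x[0])

t_pass = transmission(1.0)   # inside the band (cutoff is 2.0)
t_stop = transmission(3.0)   # above the band: strongly attenuated
print(f"in-band transmission {t_pass:.3f}, above-band {t_stop:.2e}")
```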

The researchers are targeting vibration-isolation applications that would benefit from a passive support structure, especially in cases where the performance of alternative passive mechanisms, such as viscoelastomers, is temperature-limited. “Our tubes do not necessarily need to replace other vibration isolation mechanisms,” McInerney explains. “Rather, they can enhance the capabilities of these by having the load-bearing structure assist with isolation.”

The team’s first and most important task, McInerney adds, will be to explore the implications of physically mounting the kagome tube on its vibration isolation structures. “The numerical study in our paper uses idealized mounting conditions so that the input and output are perfectly in phase with the tube vibrations,” he says. “Accounting for the potential impedance mismatch between the mounts and the tube will enable us to experimentally validate our work and provide realistic design scenarios.”

New experiments on static electricity cast doubt on previous studies in the field

13 November 2025, 10:00

Static electricity is an everyday phenomenon, but it remains poorly understood. Researchers at the Institute of Science and Technology Austria (ISTA) have now shed new light on it by capturing an “image” of charge distributions as charge transfers from one surface to another. Their conclusions challenge longstanding interpretations of previous experiments and enhance our understanding of how charge behaves on insulating surfaces.

Static electricity is also known as contact electrification because it occurs when charge is transferred from one object to another by touch. The most common laboratory example involves rubbing a balloon on someone’s head to make their hair stand on end. However, static electricity is also associated with many other activities, including coffee grinding, pollen transport and perhaps even the formation of rocky planets.

One of the most useful ways of studying contact electrification is to move a metal tip slowly over the surface of a sample without touching it, recording a voltage all the while. These so-called scanning Kelvin methods produce an “image” of voltages created by the transferred charge. At the macroscale, around 100 μm to 10 cm, the main method is termed scanning Kelvin probe microscopy (SKPM). At the nanoscale, around 10 nm to 100 μm, a related but distinct variant known as Kelvin probe force microscopy (KPFM) is used instead.

In previous fundamental physics studies using these techniques, the main challenges have been to make sense of the stationary patterns of charge left behind after contact electrification, and to investigate how these patterns evolve over space and time. In the latest work, the ISTA team chose to ask a slightly different question: when are the dynamics of charge transfer too fast for measured stationary patterns to yield meaningful information?

Mapping the charge on the contact-electrified surface of a polymer film

To find out, ISTA PhD student Felix Pertl built a special setup that could measure a sample’s surface charge with KPFM; transfer it below a linear actuator so that it could exchange charge when it contacted another material; and then transfer it underneath the KPFM again to image the resulting change in the surface charge.

“In a typical set-up, the sample transfer, moving the AFM to the right place and reinitiation and recalibration of the KPFM parameters can easily take as long as tens of minutes,” Pertl explains. “In our system, this happens in as little as around 30 s. As all aspects of the system are completely automated, we can repeat this process, and quickly, many times.”

Side view of the experimental set-up: the counter-sample (white rod with green sample holder and PDMS at its end) approaches the sample and induces electric charge via contact, while the AFM head (left) waits until the sample returns to its original position. (Courtesy: Felix Pertl)

This speed-up is important because static electricity dissipates relatively rapidly. In fact, the researchers found that the transferred charge disappeared from the sample’s surface more quickly than the time required for most KPFM scans. Their data also revealed that the deposited charge was, in effect, uniformly distributed across the surface and that its dissipation depended on the material’s electrical conductivity. Additional mathematical modelling and subsequent experiments confirmed that the more insulating a material is, the more slowly it dissipates charge.
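The conductivity dependence can be illustrated with a minimal sketch. This is an assumption-laden resistor–capacitor picture, not the authors’ full model (which, as noted below, is more complex than a simple RC description): deposited charge decays exponentially with the dielectric relaxation time τ = ε₀εᵣ/σ, so lower conductivity means longer-lived charge.

```python
# Illustrative sketch only (not the ISTA team's model): in the simplest
# resistor-capacitor picture, surface charge decays exponentially with
# the dielectric relaxation time tau = eps0 * eps_r / sigma.
import math

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def relaxation_time(eps_r, sigma):
    """Dielectric (Maxwell) relaxation time in seconds."""
    return EPS0 * eps_r / sigma

def remaining_charge(q0, t, eps_r, sigma):
    """Charge left after time t under simple exponential decay."""
    return q0 * math.exp(-t / relaxation_time(eps_r, sigma))

# Hypothetical materials chosen for illustration
for name, eps_r, sigma in [("leaky polymer", 3.0, 1e-10),
                           ("good insulator", 3.0, 1e-14)]:
    print(f"{name}: tau ~ {relaxation_time(eps_r, sigma):.3g} s")
```

With these made-up conductivities the relaxation times come out near 0.3 s and 2700 s respectively: the first would vanish long before a conventional tens-of-minutes KPFM scan finished, which is exactly the regime the fast automated setup is designed to catch.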

Surface heterogeneity likely not a feature of static electricity

Pertl says that these results call into question the validity of some previous static electricity studies that used KPFM to study charge transfer. “The most influential paper in our field to date reported surface charge heterogeneity using KPFM,” he tells Physics World. At first, the ISTA team’s goal was to understand the origin of this heterogeneity. But when their own experiments showed an essentially homogeneous distribution of surface charge, the researchers had to change tack.

“The biggest challenge in our work was realizing – and then accepting – that we could not reproduce the results from this previous study,” Pertl says. “Convincing both my principal investigator and myself that our data revealed a very different physical mechanism required patience, persistence and trust in our experimental approach.”

The discrepancy, he adds, implies that the surface heterogeneity previously observed was likely not a feature of static electricity, as was claimed. Instead, he says, it was probably “an artefact of the inability to image the charge before it had left the sample surface”.

A historical precedent

Studies of contact electrification go back a long way. Philippe Molinié of France’s GeePs Laboratory, who was not involved in this work, notes that the first experiments were performed by the English scientist William Gilbert as far back as the sixteenth century. As well as coining the term “electricity” (from the Greek “elektron”, meaning amber), Gilbert was also the first to establish that magnets maintain their attraction over time, while the forces produced by contact-charged insulators slowly decrease.

“Four centuries later, many mysteries remain unsolved in the contact electrification phenomenon,” Molinié observes. He adds that the surfaces of insulating materials are highly complex and usually strongly disordered, which affects their ability to transfer charge at the molecular scale. “The dynamics of the charge neutralization, as Pertl and colleagues underline, is also part of the process and is much more complex than could be described by a simple resistance-capacitor model,” Molinié says.

Although the ISTA team studied these phenomena with sophisticated Kelvin probe microscopy rather than the rudimentary tools available to Gilbert, it is, Molinié says, “striking that the competition between charge transfer and charge screening that comes from the conductivity of an insulator, first observed by Gilbert, is still at the very heart of the scientific interrogations that this interesting new work addresses.”

“A more critical interpretation”

The Austrian researchers, who detail their work in Phys. Rev. Lett., say they hope their experiments will “encourage a more critical interpretation” of KPFM data in the future, with a new focus on the role of sample grounding and bulk conductivity in shaping observed charge patterns. “We hope it inspires KPFM users to reconsider how they design and analyse experiments, which could lead to more accurate insights into charge behaviour in insulators,” Pertl says.

“We are now planning to deliberately engineer surface charge heterogeneity into our samples,” he reveals. “By tuning specific surface properties, we aim to control the sign and spatial distribution of charge on defined regions of these surfaces.”

The post New experiments on static electricity cast doubt on previous studies in the field appeared first on Physics World.

Twistelastics controls how mechanical waves move in metamaterials

7 November 2025 at 14:57
How it works Researchers use twisted surfaces to manipulate mechanical waves, enabling new technologies for imaging, electronics and sensors. (Courtesy: A Alù)

By simply placing two identical elastic metasurfaces atop each other and then rotating them relative to each other, the topology of the elastic waves dispersing through the resulting stacked structure can be changed – from elliptic to hyperbolic. This new control technique, from physicists at the CUNY Advanced Science Research Center in the US, works over a broad frequency range and has been dubbed “twistelastics”. It could allow for advanced reconfigurable phononic devices with potential applications in microelectronics, ultrasound sensing and microfluidics.

The researchers, led by Andrea Alù, say they were inspired by the recent advances in “twistronics” and its “profound impact” on electronic and photonic systems. “Our goal in this work was to explore whether similar twist-induced topological phenomena could be harnessed in elastodynamics in which phonons (vibrations of the crystal lattice) play a central role,” says Alù.

In twistelastics, the rotations between layers of identical, elastic engineered surfaces are used to manipulate how mechanical waves travel through the materials. The new approach, say the CUNY researchers, allows them to reconfigure the behaviour of these waves and precisely control them. “This opens the door to new technologies for sensing, communication and signal processing,” says Alù.

From elliptic to hyperbolic

In their work, the researchers used computer simulations to design metasurfaces patterned with micron-sized pillars. When they stacked one such metasurface atop the other and rotated them by different angles, the resulting combined structure changed the way phonons spread. Indeed, their dispersion topology went from elliptic to hyperbolic.

At a specific rotation angle, known as the “magic angle” (just like in twistronics), the waves become highly focused and begin to travel in one direction. This effect could allow for more efficient signal processing, says Alù, with the signals being easier to control over a wide range of frequencies.

“The new twistelastic platform offers broadband, reconfigurable, and robust control over phonon propagation,” he tells Physics World. “This may be highly useful for a wide range of application areas, including surface acoustic wave (SAW) technologies, ultrasound imaging and sensing, microfluidic particle manipulation and on-chip phononic signal processing.”

New frontiers

Since the twist-induced transitions are topologically protected, again like in twistronics, the system is resilient to fabrication imperfections, meaning it can be miniaturized and integrated into real-world devices, he adds. “We are part of an exciting science and technology centre called ‘New Frontiers of Sound’, of which I am one of the leaders. The goal of this ambitious centre is to develop new acoustic platforms for the above applications enabling disruptive advances for these technologies.”

Looking ahead, the researchers say they are looking into miniaturizing their metasurface design for integration into microelectromechanical systems (MEMS). They will also be studying multi-layer twistelastic architectures to improve how they can control wave propagation, and investigating active tuning mechanisms, such as electromechanical actuation, to dynamically control twist angles. “Adding piezoelectric phenomena for further control and coupling to the electromagnetic waves” is also on the agenda, says Alù.

The present work is detailed in PNAS.

The post Twistelastics controls how mechanical waves move in metamaterials appeared first on Physics World.

Ternary hydride shows signs of room-temperature superconductivity at high pressures

7 November 2025 at 10:00
Crystal structure In the new high-Tc superconductor, lanthanum and scandium atoms constitute the MgB2-type sublattice, while the surrounding hydrogen atoms form two types of cage-like configurations. (Courtesy: Guangtao Liu, Jilin University)

Researchers in China claim to have made the first ever room-temperature superconductor by compressing an alloy of lanthanum-scandium (La-Sc) and the hydrogen-rich material ammonia borane (NH3BH3) together at pressures of 250–260 GPa, observing superconductivity with a maximum onset temperature of 298 K. While these high pressures are akin to those at the centre of the Earth, the work marks a milestone in the field of superconductivity, they say.

Superconductors conduct electricity without resistance and many materials do this when cooled below a certain transition temperature, Tc. In most cases this temperature is very low – for example, solid mercury, the first superconductor to be discovered, has a Tc of 4.2 K. Researchers have therefore been looking for superconductors that operate at higher temperatures – perhaps even at room temperature. Such materials could revolutionize a host of application areas, including increasing the efficiency of electrical generators and transmission lines through lossless electricity transmission. They would also greatly simplify technologies, such as MRI, that rely on the generation or detection of magnetic fields.

Researchers made considerable progress towards this goal in the 1980s and 1990s with the discovery of the “high-temperature” copper oxide superconductors, which have Tc values between 30 and 133 K. Fast-forward to 2015 and the maximum known critical temperature rose even higher thanks to the discovery of a sulphide material, H3S, that has a Tc of 203 K when compressed to pressures of 150 GPa.

This result sparked much interest in solid materials containing hydrogen atoms bonded to other elements and in 2019, the record was broken again, this time by lanthanum decahydride (LaH10), which was found to have a Tc of 250–260 K, albeit again at very high pressures. Then in 2021, researchers observed high-temperature superconductivity in the cerium hydrides, CeH9 and CeH10, which are remarkable because they are stable and boast high-temperature superconductivity at lower pressures (about 80 GPa, or 0.8 million atmospheres) than the other so-called “superhydrides”.

Ternary hydrides

In recent years, researchers have started turning their attention to ternary hydrides – substances that comprise three different atomic species rather than just two. Compared with binary hydrides, ternary hydrides are more structurally complex, which may allow them to have higher Tc values. Indeed, Li2MgH16 has been predicted to exhibit “hot” superconductivity with a Tc of 351–473 K under multimegabar pressures and several other high-Tc hydrides, including MBxHy, MBeH8 and Mg2IrH6-7, have been predicted to be stable under comparatively lower pressures.

In the new work, a team led by physicist Yanming Ma of Jilin University studied LaSc2H24 – a compound made by doping Sc into the well-known La-H binary system. Ma and colleagues had already predicted in theory – using the crystal structure prediction (CALYPSO) method – that this ternary material should feature hexagonal P6/mmm symmetry. Introducing Sc into La-H results in the formation of two novel interlinked H24 and H30 hydrogen clathrate “cages”, with the H24 surrounding Sc and the H30 surrounding La.

The researchers predicted that these two novel hydrogen frameworks should produce an exceptionally large hydrogen-derived density of states at the Fermi level (the highest energy level that electrons can occupy in a solid at a temperature of absolute zero), as well as enhancing coupling between electrons and phonons (vibrations of the crystal lattice) in the material, leading to an exceptionally high Tc of up to 316 K at high pressure.

To characterize their material, the researchers placed it in a diamond-anvil cell, a device that generates extreme pressures as it squeezes the sample between two tiny, gem-grade crystals of diamond (one of the hardest substances known) while heating it with a laser. In situ X-ray diffraction experiments revealed that the compound crystallizes into a hexagonal structure, in excellent agreement with the predicted P6/mmm LaSc2H24 structure.

A key piece of experimental evidence for superconductivity in the La-Sc-H ternary system, says co-author Guangtao Liu, came from measurements that repeatedly demonstrated the onset of zero electrical resistance below the Tc.

Another significant proof, Liu adds, is that the Tc decreases monotonically with the application of an external magnetic field in a number of independently synthesized samples. “This behaviour is consistent with the conventional theory of superconductivity since an external magnetic field disrupts Cooper pairs – the charge carriers responsible for the zero-resistance state – thereby suppressing superconductivity.”

“These two main observations demonstrate the superconductivity in our synthesized La-Sc-H compound,” he tells Physics World.

Difficult experiments

The experiments were not easy, Liu recalls. The first six months of attempting to synthesize LaSc2H24 below 200 GPa yielded no obvious Tc enhancement. “We then tried higher pressure and above 250 GPa, we had to manually deposit three precursor layers and ensure that four electrodes (for subsequent conductance measurements) were properly connected to the alloy in an extremely small sample chamber, just 10 to 15 µm in size,” he says. “This required hundreds of painstaking repetitions.”

And that was not all: to synthesize the LaSc2H24, the researchers had to prepare the correct molar ratios of a precursor alloy. The Sc and La elements cannot form a solid solution because of their different atomic radii, so using a normal melting method makes it hard to control this ratio. “After about a year of continuous investigations, we finally used the magnetron sputtering method to obtain films of LaSc2H24 with the molar ratios we wanted,” Liu explains. “During the entire process, most of our experiments failed and we ended up damaging at least 70 pairs of diamonds.”

Sven Friedemann of the University of Bristol, who was not involved in this work, says that the study is “an important step forward” for the field of superconductivity with a new record transition temperature of 295 K. “The new measurements show zero resistance (within resolution) and suppression in magnetic fields, thus strongly suggesting superconductivity,” he comments. “It will be exciting to see future work probing other signatures of superconductivity. The X-ray diffraction measurements could be more comprehensive and leave some room for uncertainty as to whether it is indeed the claimed LaSc2H24 structure giving rise to the superconductivity.”

Ma and colleagues say they will continue to study the properties of this compound – and in particular, verify the isotope effect (a signature of conventional superconductors) or measure the superconducting critical current. “We will also try to directly detect the Meissner effect – a key goal for high-temperature superhydride superconductors in general,” says Ma. “Guided by rapidly advancing theoretical predictions, we will also synthesize new multinary superhydrides to achieve better superconducting properties under much lower pressures.”

The study is available on the arXiv pre-print server.

The post Ternary hydride shows signs of room-temperature superconductivity at high pressures appeared first on Physics World.

Portable source could produce high-energy muon beams

3 November 2025 at 10:00

Due to government shutdown restrictions currently in place in the US, the researchers who headed up this study have not been able to comment on their work

Laser plasma acceleration (LPA) may be used to generate multi-gigaelectronvolt muon beams, according to physicists at the Lawrence Berkeley National Laboratory (LBNL) in the US. Their work might help in the development of ultracompact muon sources for applications such as muon tomography – which images the interior of large objects that are inaccessible to X-ray radiography.

Muons are charged subatomic particles that are produced in large quantities when cosmic rays collide with atoms 15–20 km high up in the atmosphere. Muons have the same properties as electrons but are around 200 times heavier. This means they can travel much further through solid structures than electrons. This property is exploited in muon tomography, which analyses how muons penetrate objects and then exploits this information to produce 3D images.

The technique is similar to X-ray tomography used in medical imaging, with the cosmic-ray radiation taking the place of artificially generated X-rays and muon trackers the place of X-ray detectors. Indeed, depending on their energy, muons can traverse metres of rock or other materials, making them ideal for imaging thick and large structures. As a result, the technique has been used to peer inside nuclear reactors, pyramids and volcanoes.

As many as 10,000 muons from cosmic rays reach each square metre of the Earth’s surface every minute. These naturally produced particles have unpredictable properties, however, and they arrive only from near-vertical directions. This fixed directionality means that it can take months to accumulate enough data for tomography.
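A back-of-envelope sketch, with entirely hypothetical detector and statistics numbers (only the surface flux comes from the text), shows why the fixed cosmic-ray flux forces such long exposures:

```python
# Rough muography exposure estimate. Only FLUX is from the article;
# every other number is a hypothetical, illustrative choice.
FLUX = 10_000              # muons per m^2 per minute at the surface
area_m2 = 1.0              # hypothetical detector area
pixels = 100 * 100         # hypothetical image resolution
counts_per_pixel = 1_000   # hypothetical counts for decent statistics
transmission = 0.01        # hypothetical fraction surviving a thick target

total_counts = pixels * counts_per_pixel
minutes = total_counts / (FLUX * area_m2 * transmission)
days = minutes / (60 * 24)
print(f"~{days:.0f} days of exposure")
```

Even with these optimistic assumptions the exposure lands around the two-month mark, consistent with the months-long campaigns mentioned above; a directed, high-flux artificial source would shrink that dramatically.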

Another option is to use the large numbers of low-energy muons that can be produced in proton accelerator facilities by smashing a proton beam onto a fixed carbon target. However, these accelerators are large and expensive facilities, limiting their use in muon tomography.

A new compact source

Physicists led by Davide Terzani have now developed a new compact muon source based on LPA-generated electron beams. Such a source, if optimized, could be deployed in the field and could even produce muon beams in specific directions.

In LPA, an ultra-intense, ultra-short, and tightly focused laser pulse propagates into an “under-dense” gas. The pulse’s extremely high electric field ionizes the gas atoms, freeing the electrons from the nuclei, so generating a plasma. The ponderomotive force, or radiation pressure, of the intense laser pulse displaces these electrons and creates an electrostatic wave that produces accelerating fields orders of magnitude higher than what is possible in the traditional radio-frequency cavities used in conventional accelerators.
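The scale of those accelerating fields can be estimated from the standard cold wave-breaking expression E0 = c·me·ωp/e, often quoted as E0 [V/m] ≈ 96·√(ne [cm⁻³]). A minimal sketch, using an assumed gas density (the study’s actual operating density is not given here):

```python
# Illustrative estimate (not from the study): cold nonrelativistic
# wave-breaking field of a plasma wave, E0 = c * m_e * omega_p / e.
import math

def wavebreaking_field(n_e_cm3):
    """Cold wave-breaking field in V/m for electron density in cm^-3."""
    e = 1.602e-19      # elementary charge, C
    m_e = 9.109e-31    # electron mass, kg
    c = 2.998e8        # speed of light, m/s
    eps0 = 8.854e-12   # vacuum permittivity, F/m
    n_m3 = n_e_cm3 * 1e6
    omega_p = math.sqrt(n_m3 * e**2 / (eps0 * m_e))  # plasma frequency
    return c * m_e * omega_p / e

E = wavebreaking_field(1e17)  # assumed, typical LPA-scale density
print(f"E0 ~ {E / 1e9:.0f} GV/m")
```

At 10¹⁷ cm⁻³ this gives a field of order 30 GV/m, roughly a thousand times the tens of MV/m sustained by conventional radio-frequency cavities, which is what makes a 10 GeV beam from a 30 cm target plausible.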

LPAs offer all the advantages of an ultra-compact electron accelerator, allowing muons to be produced in a small facility such as BELLA, where Terzani and his colleagues work. Indeed, in their experiment, they succeeded in generating a 10 GeV electron beam in a 30 cm gas target for the first time.

The researchers collided this beam with a dense target, such as tungsten. This slows the beam down so that it emits bremsstrahlung, or braking radiation, which interacts with the material to produce secondary products including lepton–antilepton pairs, such as electron–positron and muon–antimuon pairs. What emerges behind the converter target is a short-lived burst of muons that propagates roughly along the same axis as the incoming electron beam. A thick concrete shield then filters out most of the secondary products while letting the majority of the muons pass through.

Crucially, Terzani and colleagues were able to separate the muon signal from the large radiation background – something that can be difficult to do because of the inherent inefficiency of the muon production process. This allowed them to identify two different muon populations coming from the accelerator: a collimated, forward-directed population generated by pair production, and a low-energy, isotropic population generated by meson decay.

Many applications

Muons can be used in a range of fields, from imaging to fundamental particle physics. As mentioned above, muons from cosmic rays are currently used to inspect large, thick objects that are not accessible to regular X-ray radiography – a recent example being the discovery of a hidden chamber in Khufu’s Pyramid. They can also be used to image the core of a burning blast furnace or nuclear waste storage facilities.

While the new LPA-based technique cannot yet produce muon fluxes suitable for particle physics experiments – to replace a muon injector, for example – it could offer the accelerator community a convenient way to test and develop essential elements towards making a future muon collider.

The experiment in this study, which is detailed in Physical Review Accelerators and Beams, focused on detecting the passage of muons and unequivocally identifying their signature. The researchers conclude that they now have a much better understanding of the source of these muons.

Unfortunately, the original programme that funded this research has ended, so future studies are limited at the moment. Undeterred, the researchers say they strongly believe in the potential of LPA-generated muons and are working on resuming some of their experiments. They aim, for example, to measure the flux and the spectrum of the resulting muon beam using completely different detection techniques based on ultra-fast particle trackers.

The LBNL team also wants to explore different applications, such as imaging deep ore deposits – something that will be quite challenging because it poses strict limitations on the minimum muon energy required to penetrate soil. Therefore, they are looking into how to increase the muon energy of their source.

The post Portable source could produce high-energy muon beams appeared first on Physics World.

Young rogue planet grows like a star

31 October 2025 at 15:00

When a star rapidly accumulates gas and dust during its early growth phase, it’s called an accretion burst. Now, for the first time, astronomers have observed a planet doing the same thing. The discovery, made using the European Southern Observatory’s Very Large Telescope (VLT) and the James Webb Space Telescope (JWST), shows that the infancy of certain planetary-mass objects and that of newborn stars may share similar characteristics.

In their study, which is detailed in The Astrophysical Journal Letters, astronomers led by Víctor Almendros-Abad at Italy’s Palermo Astronomical Observatory; Ray Jayawardhana of Johns Hopkins University in the US; and Belinda Damian and Aleks Scholz of the University of St Andrews, UK, focused on a planet known as Cha1107-7626. Located around 620 light-years from Earth, this planet has a mass approximately five to 10 times that of Jupiter. Unlike Jupiter, though, it does not orbit around a central star. Instead, it floats freely in space as a “rogue” planet, one of many identified in recent years.

An accretion burst in Cha1107-7626

Like other rogue planets, Cha1107-7626 was known to be surrounded by a disk of dust and gas. When material from this disk spirals, or accretes, onto the planet, the planet grows.

What Almendros-Abad and colleagues discovered is that this process is not uniform. Using the VLT’s XSHOOTER and the NIRSpec and MIRI instruments on JWST, they found that Cha1107-7626 experienced a burst of accretion beginning in June 2025. This is the first time anyone has seen an accretion burst in an object with such a low mass, and the peak accretion rate of six billion tonnes per second makes it the strongest accretion episode ever recorded in a planetary-mass object. It may not be over, either. At the end of August, when the observing campaign ended, the burst was still ongoing.

An infancy similar to a star’s

The team identified several parallels between Cha1107-7626’s accretion burst and those that young stars experience. Among them were clear signs that gas is being funnelled onto the planet. “This indicates that magnetic fields structure the flow of gas, which is again something well known from stars,” explains Scholz. “Overall, our discovery is establishing interesting, perhaps surprising parallels between stars and planets, which I’m not sure we fully understand yet.”

The astronomers also found that the chemistry of the disk around the planet changed during accretion, with water being present in this phase even though it hadn’t been before. This effect has previously been spotted in stars, but never in a planet until now.

“We’re struck by quite how much the infancy of free-floating planetary-mass objects resembles that of stars like the Sun,” Jayawardhana says. “Our new findings underscore that similarity and imply that some objects comparable to giant planets form the way stars do, from contracting clouds of gas and dust accompanied by disks of their own, and they go through growth episodes just like newborn stars.”

The researchers have been studying similar objects for many years and earlier this year published results based on JWST observations that featured a small sample of planetary-mass objects. “This particular study is part of that sample,” Scholz tells Physics World, “and we obtained the present results because Victor wanted to look in detail at the accretion flow onto Cha1107-7626, and in the process discovered the burst.”

The researchers say they are “keeping an eye” on Cha1107-7626 and other such objects that are still growing because their environment is dynamic and unstable. “More to the point, we really don’t understand what drives these accretion events, and we need detailed follow-up to figure out the underlying reasons for these processes,” Scholz says.

The post Young rogue planet grows like a star appeared first on Physics World.

Entangled light leads to quantum advantage

28 October 2025 at 09:00
Quantum manipulation: The squeezer – an optical parametric oscillator (OPO) that uses a nonlinear crystal inside an optical cavity to manipulate the quantum fluctuations of light – is responsible for the entanglement. (Courtesy: Jonas Schou Neergaard-Nielsen)

Physicists at the Technical University of Denmark have demonstrated what they describe as a “strong and unconditional” quantum advantage in a photonic platform for the first time. Using entangled light, they were able to reduce the number of measurements required to characterize their system by a factor of 1011, with a correspondingly huge saving in time.

“We reduced the time it would take from 20 million years with a conventional scheme to 15 minutes using entanglement,” says Romain Brunel, who co-led the research together with colleagues Zheng-Hao Liu and Ulrik Lund Andersen.
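The two quoted times are self-consistent with the stated improvement, as a quick order-of-magnitude check shows:

```python
# Consistency check of the quoted speed-up: 20 million years versus
# 15 minutes is indeed a factor of roughly 10^11.
minutes_per_year = 365.25 * 24 * 60   # = 525_960
ratio = 20e6 * minutes_per_year / 15
print(f"ratio ~ {ratio:.1e}")  # ~ 7.0e+11
```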

Although the research, which is described in Science, is still at a preliminary stage, Brunel says it shows that major improvements are achievable with current photonic technologies. In his view, this makes it an important step towards practical quantum-based protocols for metrology and machine learning.

From individual to collective measurement

Quantum devices are hard to isolate from their environment and extremely sensitive to external perturbations. That makes it a challenge to learn about their behaviour.

To get around this problem, researchers have tried various “quantum learning” strategies that replace individual measurements with collective, algorithmic ones. These strategies have already been shown to reduce the number of measurements required to characterize certain quantum systems, such as superconducting electronic platforms containing tens of quantum bits (qubits), by as much as a factor of 105.

A photonic platform

In the new study, Brunel, Liu, Andersen and colleagues obtained a quantum advantage in an alternative “continuous-variable” photonic platform. The researchers note that such platforms are far easier to scale up than superconducting qubits, which they say makes them a more natural architecture for quantum information processing. Indeed, photonic platforms have already been crucial to advances in boson sampling, quantum communication, computation and sensing.

The team’s experiment works with conventional, “imperfect” optical components and consists of a channel containing multiple light pulses that share the same pattern, or signature, of noise. The researchers began by performing a procedure known as quantum squeezing on two beams of light in their system. This caused the beams to become entangled – a quantum phenomenon that creates such a strong linkage that measuring the properties of one instantly affects the properties of the other.

The team then measured the properties of one of the beams (the “probe” beam) in an experiment known as a 100-mode bosonic displacement process. According to Brunel, one can imagine this experiment as being like tweaking the properties of 100 independent light modes, which are packets or beams of light. “A ‘bosonic displacement process’ means you slightly shift the amplitude and phase of each mode, like nudging each one’s brightness and timing,” he explains. “So, you then have 100 separate light modes, and each one is shifted in phase space according to a specific rule or pattern.”

By comparing the probe beam to the second (“reference”) beam in a single joint measurement, Brunel explains that he and his colleagues were able to cancel out much of the uncertainty in these measurements. This meant they could extract more information per trial than they could have by characterizing the probe beam alone. This information boost, in turn, allowed them to significantly reduce the number of measurements – in this case, by a factor of 1011.

While the DTU researchers acknowledge that they have not yet studied a practical, real-world system, they emphasize that their platform is capable of “doing something that no classical system will ever be able to do”, which is the definition of a quantum advantage. “Our next step will therefore be to study a more practical system in which we can demonstrate a quantum advantage,” Brunel tells Physics World.

The post Entangled light leads to quantum advantage appeared first on Physics World.

New adaptive optics technology boosts the power of gravitational wave detectors

27 October 2025 at 09:00

Future versions of the Laser Interferometer Gravitational Wave Observatory (LIGO) will be able to run at much higher laser powers thanks to a sophisticated new system that compensates for temperature changes in optical components. Known as FROSTI (for FROnt Surface Type Irradiator) and developed by physicists at the University of California Riverside, US, the system will enable next-generation machines to detect gravitational waves emitted when the universe was just 0.1% of its current age, before the first stars had even formed.

Gravitational waves are distortions in spacetime that occur when massive astronomical objects accelerate and collide. When these distortions pass through the four-kilometre-long arms of the two LIGO detectors, they create a tiny difference in the (otherwise identical) distance that light travels between the centre of the observatory and the mirrors located at the end of each arm. The problem is that detecting and studying gravitational waves requires these differences in distance to be measured with an accuracy of 10⁻¹⁹ m, which is 1/10,000th the size of a proton.
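As a rough check of that comparison, taking the proton charge radius to be about 0.84 fm:

```python
# Sanity check: a displacement sensitivity of 1e-19 m against a proton
# charge radius of ~0.84e-15 m is about one part in ten thousand.
proton_radius = 0.84e-15   # m, approximate proton charge radius
sensitivity = 1e-19        # m, quoted LIGO displacement accuracy
fraction = sensitivity / proton_radius
print(f"1/{1 / fraction:.0f} of a proton")  # ~ 1/8400
```

The result, roughly one part in 8000 to 10,000 depending on whether radius or diameter is meant, matches the figure quoted in the text to within a factor of order unity.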

Extending the frequency range

LIGO overcame this barrier 10 years ago when it detected the gravitational waves produced when two black holes located roughly 1.3 billion light-years from Earth merged. Since then, it and two smaller facilities, KAGRA and VIRGO, have observed many other gravitational waves at frequencies ranging from 30–2000 Hz.

Observing waves at lower and higher frequencies in the gravitational wave spectrum remains challenging, however. At lower frequencies (around 10–30 Hz), the problem stems from vibrational noise in the mirrors. Although these mirrors are hefty objects – each one measures 34 cm across, is 20 cm thick and has a mass of around 40 kg – the incredible precision required to detect gravitational waves at these frequencies means that even the minute amount of energy they absorb from the laser beam is enough to knock them out of whack.
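Those mirror figures are mutually consistent if one assumes a fused-silica substrate (the standard LIGO test-mass material, density about 2200 kg/m³):

```python
# Consistency check of the mirror figures quoted above, assuming a
# fused-silica cylinder (density ~2200 kg/m^3).
import math

radius = 0.34 / 2   # m, from the quoted 34 cm diameter
thickness = 0.20    # m, from the quoted 20 cm thickness
density = 2200      # kg/m^3, typical fused silica

mass = density * math.pi * radius**2 * thickness
print(f"mass ~ {mass:.0f} kg")  # ~ 40 kg
```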

At higher frequencies (150–2000 Hz), measurements are instead limited by quantum shot noise. This is caused by the random arrival time of photons at LIGO’s output photodetectors and is a fundamental consequence of the fact that the laser field is quantized.
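This is also why higher laser power helps: for a Poissonian photon stream, counting N photons carries fluctuations of √N, so the relative shot noise falls as 1/√N. A minimal sketch of that scaling (the photon counts and the `relative_shot_noise` helper are illustrative, not LIGO's actual noise budget):

```python
import math

# Illustrative shot-noise scaling: relative fluctuation of a Poissonian
# photon count N is sqrt(N)/N = 1/sqrt(N), so raising laser power
# (hence N) suppresses the relative noise by the square root of the gain.
def relative_shot_noise(photon_count: float) -> float:
    return 1.0 / math.sqrt(photon_count)

baseline = 1e20  # hypothetical photons per measurement interval
for power_factor in (1, 5):  # ~5x is the power jump the article describes
    noise = relative_shot_noise(baseline * power_factor)
    print(f"{power_factor}x power: relative shot noise ~ {noise:.2e}")
```

Quintupling the power therefore buys roughly a factor of √5 ≈ 2.2 in shot-noise-limited sensitivity, under these simplified assumptions.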

A novel adaptive optics device

Jonathan Richardson, the physicist who led this latest study, explains that FROSTI is designed to reduce quantum shot noise by allowing the mirrors to cope with much higher levels of laser power. At its heart is a novel adaptive optics device that is designed to precisely reshape the surfaces of LIGO’s main mirrors under laser powers exceeding 1 megawatt (MW), which is nearly five times the power used at LIGO today.

Though its name implies cooling, FROSTI actually uses heat to restore the mirror’s surface to its original shape. It does this by projecting infrared radiation onto test masses in the interferometer to create a custom heat pattern that “smooths out” distortions and so allows for fine-tuned, higher-order corrections.

The single most challenging aspect of FROSTI’s design, and one that Richardson says shaped its entire concept, is the requirement that it cannot introduce even more noise into the LIGO interferometer. “To meet this stringent requirement, we had to use the most intensity-stable radiation source available – that is, an internal blackbody emitter with a long thermal time constant,” he tells Physics World. “Our task, from there, was to develop new non-imaging optics capable of reshaping the blackbody thermal radiation into a complex spatial profile, similar to one that could be created with a laser beam.”

Richardson anticipates that FROSTI will be a critical component for future LIGO upgrades – upgrades that will themselves serve as blueprints for even more sensitive next-generation observatories like the proposed Cosmic Explorer in the US and the Einstein Telescope in Europe. “The current prototype has been tested on a 40-kg LIGO mirror, but the technology is scalable and will eventually be adapted to the 440-kg mirrors envisioned for Cosmic Explorer,” he says.

Jan Harms, a physicist at Italy’s Gran Sasso Science Institute who was not involved in this work, describes FROSTI as “an ingenious concept to apply higher-order corrections to the mirror profile.” Though it still needs to pass the final test of being integrated into the actual LIGO detectors, Harms notes that “the results from the prototype are very promising”.

Richardson and colleagues are continuing to develop extensions to their technology, building on the successful demonstration of their first prototype. “In the future, beyond the next upgrade of LIGO (A+), the FROSTI radiation will need to be shaped into an even more complex spatial profile to enable the highest levels of laser power (1.5 MW) ultimately targeted,” explains Richardson. “We believe this can be achieved by nesting two or more FROSTI actuators together in a single composite, with each targeting a different radial zone of the test mass surfaces. This will allow us to generate extremely finely-matched optical wavefront corrections.”

The present study is detailed in Optica.

The post New adaptive optics technology boosts the power of gravitational wave detectors appeared first on Physics World.
