
Received today — 25 November 2025 Physics World

‘Caustic’ light patterns inspire new glass artwork

25 November 2025 at 18:00

UK artist Alison Stott has created a new glass and light artwork – entitled Naturally Focused – that is inspired by the work of theoretical physicist Michael Berry from the University of Bristol.

Stott, who recently completed an MA in glass at Arts University Plymouth, previously spent over two decades working in visual effects for film and television, where she focused on creating photorealistic imagery.

Her studies touched on how complex phenomena can arise from seemingly simple set-ups, for example in a rotating glass sculpture lit by LEDs.

“My practice inhabits the spaces between art and science, glass and light, craft and experience,” notes Stott. “Working with molten glass lets me embrace chaos, indeterminacy, and materiality, and my work with caustics explores the co-creation of light, matter, and perception.”

The new artwork is based on “caustics” – the curved patterns that form when light is reflected or refracted by curved surfaces or objects.

The focal point of the artwork is a hand-blown glass lens that was waterjet-cut into a circle and polished so that its internal structure and optical behaviour are clearly visible. The lens is suspended within stainless steel gyroscopic rings and held by a brass support and stainless steel backplate.

The rings can be tilted or rotated to “activate shifting field of caustic projections that ripple across” the artwork. Mathematical equations describing the “singularities of light” visible on the glass surface are also engraved onto the brass.

The work is inspired by Berry’s research into the relationship between classical and quantum behaviour and how subtle geometric structures govern how waves and particles behave.

Berry recently won the 2025 Isaac Newton Medal and Prize, which is presented by the Institute of Physics, for his “profound contributions across mathematical and theoretical physics in a career spanning over 60 years”.

Stott says that working with Berry has pushed her understanding of caustics. “The more I learn about how these structures emerge and why they matter across physics, the more compelling they become,” notes Stott. “My aim is to let the phenomena speak for themselves, creating conditions where people can directly encounter physical behaviour and perhaps feel the same awe and wonder I do.”

The artwork will go on display at the University of Bristol following a ceremony to be held on 27 November.

The post ‘Caustic’ light patterns inspire new glass artwork appeared first on Physics World.

Is your WiFi spying on you?

25 November 2025 at 17:00

WiFi networks could pose significant privacy risks even to people who aren’t carrying or using WiFi-enabled devices, say researchers at the Karlsruhe Institute of Technology (KIT) in Germany. According to their analysis, the current version of the technology passively records information that is detailed enough to identify individuals moving through networks, prompting them to call for protective measures in the next iteration of WiFi standards.

Although wireless networks are ubiquitous and highly useful, they come with certain privacy and security risks. One such risk stems from a phenomenon known as WiFi sensing, which the researchers at KIT’s Institute of Information Security and Dependability (KASTEL) define as “the inference of information about the networks’ environment from its signal propagation characteristics”.

“As signals propagate through matter, they interfere with it – they are either transmitted, reflected, absorbed, polarized, diffracted, scattered, or refracted,” they write in their study, which is published in the Proceedings of the 2025 ACM SIGSAC Conference on Computer and Communications Security (CCS ’25). “By comparing an expected signal with a received signal, the interference can be estimated and used for error correction of the received data.”

 An under-appreciated consequence, they continue, is that this estimation contains information about any humans who may have unwittingly been in the signal’s path. By carefully analysing the signal’s interference with the environment, they say, “certain aspects of the environment can be inferred” – including whether humans are present, what they are doing and even who they are.
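The channel-estimation step the researchers describe can be sketched in a toy model: compare known (expected) pilot symbols with what was actually received, and the ratio estimates how the environment distorted the signal. The following is a minimal Python illustration under simplifying assumptions (a flat, single-tap channel); it is not the actual WiFi physical layer or the KASTEL pipeline, and all names and values are invented:

```python
import numpy as np

# Toy least-squares channel estimate: compare a known pilot signal
# with what was received. The ratio captures how the environment --
# walls, furniture, passing people -- distorted the signal.
rng = np.random.default_rng(0)

pilots = np.exp(1j * 2 * np.pi * rng.random(64))   # known transmitted symbols (unit modulus)
true_channel = 0.8 * np.exp(1j * 0.3)              # unknown environment response
noise = 0.01 * (rng.standard_normal(64) + 1j * rng.standard_normal(64))
received = true_channel * pilots + noise

# Per-subcarrier estimate, averaged (least squares for a flat channel)
h_est = np.mean(received / pilots)
print(abs(h_est))  # close to 0.8: the environment's attenuation
```

In a real system this estimate serves error correction, but, as the researchers point out, it also encodes information about the environment, including anyone who happens to be in the signal's path.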

“Identity inference attack” is a threat

The KASTEL team terms this an “identity inference attack” and describes it as a threat that is as widespread as it is serious. “This technology turns every router into a potential means for surveillance,” says Julian Todt, who co-led the study with his KIT colleague Thorsten Strufe. “For example, if you regularly pass by a café that operates a WiFi network, you could be identified there without noticing it and be recognized later – for example by public authorities or companies.”

While Todt acknowledges that security services, cybercriminals and others do have much simpler ways of tracking individuals – for example by accessing data from CCTV cameras or video doorbells – he argues that the ubiquity of wireless networks lends itself to being co-opted as a near-permanent surveillance infrastructure. There is, he adds, “one concerning property” about wireless networks: “They are invisible and raise no suspicion.”

Identity of individuals could be extracted using a machine-learning model

Although the possibility of using WiFi networks in this way is not new, most previous WiFi-based security attacks worked by analysing so-called channel state information (CSI). These data indicate how a radio signal changes when it reflects off walls, furniture, people or animals. However, the KASTEL researchers note that the latest WiFi standard, known as WiFi 5 (802.11ac), changes the picture by enabling a new and potentially easier form of attack based on beamforming feedback information (BFI).

While beamforming uses similar information to CSI, Todt explains that it does so on the sender’s side instead of the receiver’s. This means that a BFI-based surveillance method would require nothing more than standard devices connected to the WiFi network. “The BFI could be used to create images from different perspectives that might then serve to identify persons that find themselves in the WiFi signal range,” Todt says. “The identity of individuals passing through these radio waves could then be extracted using a machine-learning model. Once trained, this model would make an identification in just a few seconds.”

In their experiments, Todt and colleagues studied 197 participants as they walked through a WiFi field while being simultaneously recorded with CSI and BFI from four different angles. The participants had five different “walking styles” (such as walking normally and while carrying a backpack) as well as different gaits. The researchers found that they could identify individuals with nearly 100% accuracy, regardless of the recording angle or the individual’s walking style or gait.
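The identification step can be caricatured as a nearest-centroid classifier over per-person feature vectors. The sketch below assumes, purely for illustration, that each person yields a stable "signature" vector derived from beamforming feedback; the feature extraction, model and data here are all hypothetical and do not reflect the team's actual method:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setup: two people, each with a fixed 16-dimensional
# "signature" extracted from beamforming feedback information.
signatures = {0: rng.standard_normal(16), 1: rng.standard_normal(16)}

def sample(person, n):
    # Noisy observations of a person's BFI-derived feature vector
    return signatures[person] + 0.2 * rng.standard_normal((n, 16))

# "Training": average many observations into one centroid per person
train = {p: sample(p, 50).mean(axis=0) for p in signatures}

def identify(obs):
    # Nearest-centroid classification: pick the closest known signature
    return min(train, key=lambda p: np.linalg.norm(obs - train[p]))

test_obs = sample(1, 1)[0]
print(identify(test_obs))  # 1: the correct person
```

With well-separated signatures and modest noise, such a classifier approaches 100% accuracy, which is the qualitative behaviour the researchers report for their far more sophisticated model.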

“Risks to our fundamental rights”

“The technology is powerful, but at the same time entails risks to our fundamental rights, especially to privacy,” says Strufe. He warns that authoritarian states could use the technology to track demonstrators and members of opposition groups, prompting him and his colleagues to “urgently call” for protective measures and privacy safeguards to be included in the forthcoming IEEE 802.11bf WiFi standard.

“The literature on all novel sensing solutions highlights their utility for various novel applications,” says Todt, “but the privacy risks that are inherent to such sensing are often overlooked, or worse — these sensors are claimed to be privacy-friendly without any rationale for these claims. As such, we feel it necessary to point out the privacy risks that novel solutions such as WiFi sensing bring with them.”

The researchers say they would like to see approaches developed that can mitigate the risk of identity inference attack. However, they are aware that this will be difficult, since this type of attack exploits the physical properties of the actual WiFi signal. “Ideally, we would influence the WiFi standard to contain privacy-protections in future versions,” says Todt, “but even the impact of this would be limited as there are already millions of WiFi devices out there that are vulnerable to such an attack.”


Why quantum metrology is the driving force for best practice in quantum standardization

24 November 2025 at 12:10
Quantum advantage: international standardization efforts will, over time, drive economies of scale and multivendor interoperability across the nascent quantum supply chain. (Courtesy: iStock/Peter Hansen)

How do standards support the translation of quantum science into at-scale commercial opportunities?

The standardization process helps to promote the legitimacy of emerging quantum technologies by distilling technical inputs and requirements from all relevant stakeholders across industry, research and government. Put simply: if you understand a technology well enough to standardize elements of it, that’s when you know it’s moved beyond hype and theory into something of practical use for the economy and society.

What are the upsides of standardization for developers of quantum technologies and, ultimately, for end-users in industry and the public sector?

Standards will, over time, help the quantum technology industry achieve critical mass on the supply side, with those economies of scale driving down prices and increasing demand. As the nascent quantum supply chain evolves – linking component manufacturers, subsystem developers and full-stack quantum computing companies – standards will also ensure interoperability between products from different vendors and different regions.

Those benefits flow downstream as well because standards, when implemented properly, increase trust among end-users by defining a minimum quality of products, processes and services. Equally important, as new innovations are rolled out into the marketplace by manufacturers, standards will ensure compatibility across current and next-generation quantum systems, reducing the likelihood of lock-ins to legacy technologies.

What’s your role in coordinating NPL’s standards effort in quantum science and technology?

I have strategic oversight of our core technical programmes in quantum computing, quantum networking, quantum metrology and quantum-enabled PNT (position, navigation and timing). It’s a broad-scope remit that spans research and training, as well as responsibility for standardization and international collaboration, with the latter two often going hand-in-hand.

Right now, we have over 150 people working within the NPL quantum metrology programme. Their collective focus is on developing the measurement science necessary to build, test and evaluate a wide range of quantum devices and systems. Our research helps innovators, whether in an industry or university setting, to push the limits of quantum technology by providing leading-edge capabilities and benchmarking to measure the performance of new quantum products and services.

Tim Prior: “We believe that quantum metrology and standardization are key enablers of quantum innovation.” (Courtesy: NPL)

It sounds like there are multiple layers of activity.

That’s right. For starters, we have a team focusing on the inter-country strategic relationships, collaborating closely with colleagues at other National Metrology Institutes (like NIST in the US and PTB in Germany). A key role in this regard is our standards specialist who, given his background working in the standards development organizations (SDOs), acts as a “connector” between NPL’s quantum metrology teams and, more widely, the UK’s National Quantum Technology Programme and the international SDOs.

We also have a team of technical experts who sit on specialist working groups within the SDOs. Their inputs to standards development are not about promoting NPL’s interests; rather, they provide expertise and experience gained from cutting-edge metrology, while building a consolidated set of requirements gathered from stakeholders across the quantum community to further the UK’s strategic and technical priorities in quantum.

So NPL’s quantum metrology programme provides a focal point for quantum standardization?

Absolutely. We believe that quantum metrology and standardization are key enablers of quantum innovation, fast-tracking the adoption and commercialization of quantum technologies while building confidence among investors and across the quantum supply chain and early-stage user base. For NPL and its peers, the task right now is to agree on the terminology and best practice as we figure out the performance metrics, benchmarks and standards that will enable quantum to go mainstream.

How does NPL engage the UK quantum community on standards development?

Front-and-centre is the UK Quantum Standards Network Pilot. This initiative – which is being led by NPL – brings together representatives from industry, academia and government to work on all aspects of standards development: commenting on proposals and draft standards; discussing UK standards policy and strategy; and representing the UK in the European and international SDOs. The end-game? To establish the UK as a leading voice in quantum standardization, both strategically and technically, and to ensure that UK quantum technology companies have access to global supply chains and markets.

What about NPL outreach to prospective end-users of quantum technologies?

The Quantum Standards Network Pilot also provides a direct line to prospective end-users of quantum technologies in business sectors like finance, healthcare, pharmaceuticals and energy. What’s notable is that the end-users are often preoccupied with questions that link in one way or another to standardization. For example: how well do quantum technologies stack up against current solutions? Are quantum systems reliable enough yet? What does quantum cost to implement and maintain, including long-term operational costs? Are there other emerging technologies that could do the same job? Is there a solid, trustworthy supply chain?

It’s clear that international collaboration is mandatory for successful standards development. What are the drivers behind the recently announced NMI-Q collaboration?

The quantum landscape is changing fast, with huge scope for disruptive innovation in quantum computing, quantum communications and quantum sensing. Faced with this level of complexity, NMI-Q leverages the combined expertise of the world’s leading National Metrology Institutes – from the G7 countries and Australia – to accelerate the development and adoption of quantum technologies.

No one country can do it all when it comes to performance metrics, benchmarks and standards in quantum science and technology. As such, NMI-Q’s priorities are to conduct collaborative pre-standardization research; develop a set of “best measurement practices” needed by industry to fast-track quantum innovation; and, ultimately, shape the global standardization effort in quantum. NPL’s prominent role within NMI-Q (I am the co-chair along with Barbara Goldstein of NIST) underscores our commitment to evidence-based decision-making in standards development and, ultimately, to the creation of a thriving quantum ecosystem.

What are the attractions of NPL’s quantum programme for early-career physicists?

Every day, our measurement scientists address cutting-edge problems in quantum – as challenging as anything they’ll have encountered previously in an academic setting. What’s especially motivating, however, is that NPL is a mission-driven endeavour, with measurement outcomes linking directly to wider societal and economic benefits – not just in the UK, but internationally as well.

Quantum metrology: at your service

Measurement for Quantum (M4Q) is a flagship NPL programme that provides industry partners with up to 20 days of quantum metrology expertise to address measurement challenges in applied R&D and product development. The service – which is free of charge for projects approved after peer review – helps companies to bridge the gap from technology prototype to full commercialization.

To date, more than two-thirds of the companies that have participated in M4Q report that their commercial opportunity has increased as a direct result of NPL support. In terms of specifics, the M4Q offering includes the following services:

  • Small-current and quantum-noise measurements
  • Measurement of material-induced noise in superconducting quantum circuits
  • Nanoscale imaging of physical properties for applications in quantum devices
  • Characterization of single-photon sources and detectors
  • Characterization of compact lasers and other photonic components
  • Characterization of semiconductor devices at cryogenic temperatures

Apply for M4Q support here.

Further reading

Performance metrics and benchmarks point the way to practical quantum advantage

End note: NPL retains copyright on this article.


Reversible degradation phenomenon in PEMWE cells

25 November 2025 at 15:10


In proton exchange membrane water electrolysis (PEMWE) systems, voltage cycles dropping below a threshold are associated with reversible performance improvements, which remain poorly understood despite being documented in the literature. The distinction between reversible and irreversible performance changes is crucial for accurate degradation assessments. One approach in the literature to explain this behaviour is the oxidation and reduction of iridium. Iridium-based electrocatalyst activity and stability in PEMWE hinge on their oxidation state, influenced by the applied voltage. Yet full-cell PEMWE dynamic performance remains under-explored, with a focus typically on stability rather than activity. This study systematically investigates reversible performance behaviour in PEMWE cells using Ir-black as an anodic catalyst. Results reveal a recovery effect when the low voltage level drops below 1.5 V, with further enhancements observed as the voltage decreases, even with a short holding time of 0.1 s. This reversible recovery is primarily driven by improved anode reaction kinetics, likely due to changing iridium oxidation states, and is supported by alignment between the experimental data and a dynamic model that links iridium oxidation/reduction processes to performance metrics. This model allows distinguishing between reversible and irreversible effects and enables the derivation of optimized operation schemes utilizing the recovery effect.

Tobias Krenz

Tobias Krenz is a simulation and modelling engineer at Siemens Energy in the Transformation of Industry business area, focusing on reducing energy consumption and carbon-dioxide emissions in industrial processes. He completed his PhD at Leibniz University Hannover in February 2025. He earned a degree from Berlin University of Applied Sciences in 2017 and an MSc from Technische Universität Darmstadt in 2020.

Alexander Rex

Alexander Rex is a PhD candidate at the Institute of Electric Power Systems at Leibniz University Hannover. He holds a degree in mechanical engineering from Technische Universität Braunschweig, an MEng from Tongji University, and an MSc from Karlsruhe Institute of Technology (KIT). He was a visiting scholar at Berkeley Lab from 2024 to 2025.


Ramy Shelbaya: the physicist and CEO capitalizing on quantum randomness

25 November 2025 at 14:59

Ramy Shelbaya has been hooked on physics ever since he was a 12-year-old living in Egypt and read about the Joint European Torus (JET) fusion experiment in the UK. Biology and chemistry were interesting to him but never quite as “satisfying”, especially as they often seemed to boil down to physics in the end. “So I thought, maybe that’s where I need to go,” Shelbaya recalls.

His instincts seem to have led him in the right direction. Shelbaya is now chief executive of Quantum Dice, an Oxford-based start-up he co-founded in 2020 to develop quantum hardware for exploiting the inherent randomness in quantum mechanics. It closed its first funding round in 2021 with a seven-figure investment from a consortium of European investors, while also securing grant funding on the same scale.

Now providing cybersecurity hardware systems for clients such as BT, Quantum Dice is launching a piece of hardware for probabilistic computing, based on the same core innovation. Full of joy and zeal for his work, Shelbaya admits that his original decision to pursue physics was “scary”. Back then, he didn’t know anyone who had studied the subject and was not sure where it might lead.

The journey to a start-up

Fortunately, Shelbaya’s parents were onboard from the start and their encouragement proved “incredibly helpful”. His teachers also supported him to explore physics in his extracurricular reading, instilling a confidence in the subject that eventually led Shelbaya to do undergraduate and master’s degrees in physics at École normale supérieure PSL in France.

He then moved to the UK to do a PhD in atomic and laser physics at the University of Oxford. Just as he was wrapping up his PhD, Oxford University Innovation (OUI) – which manages its technology transfer and consulting activities – launched a new initiative that proved pivotal to Shelbaya’s career.

From PhD student to CEO: Ramy Shelbaya transformed a research idea into a commercial product after winning a competition for budding entrepreneurs. (Courtesy: Quantum Dice)

OUI had noted that the university generated a lot of IP and research results that could be commercialized but that the academics producing it often favoured academic work over progressing the technology transfer themselves. On the other hand, lots of students were interested in entering the world of business.

To encourage those who might be business-minded to found their own firms, while also fostering more spin-outs from the university’s patents and research, OUI launched the Student Entrepreneurs’ Programme (StEP). A kind of talent show to match budding entrepreneurs with technology ready for development, StEP invited participants to team up, choose commercially promising research from the university, and pitch for support and mentoring to set up a company.

As part of Oxford’s atomic and laser physics department, Shelbaya was aware that it had been developing a quantum random number generator. So when the competition was launched, he collaborated with other competition participants to pitch the device. “My team won, and this is how Quantum Dice was born.”

Random value

The initial technology was geared towards quantum random number generation, for particular use in cybersecurity. Random numbers are at the heart of all encryption algorithms, but generating truly random numbers has been a stumbling block, with the “pseudorandom” numbers people make do with being prone to prediction and hence security violation.

Quantum mechanics provides a potential solution because there is inherent randomness in the values of certain quantum properties. Although for a long time this randomness was “a bane to quantum physicists”, as Shelbaya puts it, Quantum Dice and other companies producing quantum random number generators are now harnessing it for useful technologies.
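The predictability of pseudorandom numbers is easy to demonstrate: a software generator is a deterministic function of its seed, so anyone who learns the seed can replay the entire sequence. A quantum random number generator has no seed to steal. A minimal Python illustration:

```python
import random

# Pseudorandom generators are deterministic: the same seed always
# reproduces the same "random" sequence, which is what makes them
# predictable to an attacker who recovers the seed.
def stream(seed, n=5):
    rng = random.Random(seed)
    return [rng.randrange(256) for _ in range(n)]

print(stream(42) == stream(42))  # True: fully reproducible from the seed
print(stream(42) == stream(43))  # False: a different seed, a different stream
```

By contrast, the output of a quantum random number generator is not a function of any hidden internal state, so there is nothing equivalent to replay.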

Where Quantum Dice sees itself as having an edge over its competitors is in its real-time quality assurance of the true quantum randomness of the device’s output. This means it should be able to spot any corruption to the output due to environmental noise or someone tampering with the device, which is an issue with current technologies.

Quantum Dice already offers Quantum Random Number Generator (QRNG) products in a range of form factors that integrate directly within servers, PCs and hardware security systems. Clients can also access the company’s cloud-based solution – Quantum Entropy-as-a-Service – which, powered by its QRNG hardware, integrates into the Internet of Things and cloud infrastructure.

Recently Quantum Dice has also launched a probabilistic computing processor based on its QRNG for use in algorithms centred on probabilities. These are often geared towards optimization problems that apply in a number of sectors, including supply chains and logistics, finance, telecommunications and energy, as well as simulating quantum systems, and Boltzmann machines – a type of energy-based machine learning model for which Shelbaya says researchers “have long sought efficient hardware”.

Stress testing

After winning the start-up competition in 2019, things got trickier when Quantum Dice was ready to be incorporated, which occurred just at the start of the first COVID-19 lockdown. Shelbaya moved the prototype device into his living room because it was the only place the team could ensure access to it, but it turned out the real challenges lay elsewhere.

“One of the first things we needed was investments, and really, at that stage of the company, what investors are investing in is you,” explains Shelbaya, highlighting how difficult this is when you cannot meet in person. On the plus side, since all his meetings were remote, he could speak to investors in Asia in the morning, Europe in the afternoon and the US in the evening, all within the same day.

Another challenge was how to present the technology simply enough so that people would understand and trust it, while not making it seem so simple that anyone could be doing it. “There’s that sweet spot in the middle,” says Shelbaya. “That is something that took time, because it’s a muscle that I had never worked.”

Due rewards

The company performed well for its size and sector in terms of securing investments when their first round of funding closed in 2021. Shelbaya is shy of attributing the success to his or even the team’s abilities alone, suggesting this would “underplay a lot of other factors”. These include the rising interest in quantum technologies, and the advantages of securing government grant funding programmes at the same time, which he feels serves as “an additional layer of certification”.

For Shelbaya, every day is different and so are the challenges. Quantum Dice is a small new company, where many of the 17 staff are still fresh from university, so nurturing trust among clients, particularly in the high-stakes world of cybersecurity, is no small feat. Managing a group of ambitious, energetic and driven young people can be complicated too.

But there are many rewards, ranging from seeing a piece of hardware finally work as intended and closing a deal with a client, to simply seeing a team “you have been working to develop, working together without you”.

For others hoping to follow a similar career path, Shelbaya’s advice is to do what you enjoy – not just because you will have fun but because you will be good at it too. “Do what you enjoy,” he says, “because you will likely be great at it.”


‘Patchy’ nanoparticles emerge from new atomic stencilling technique

25 November 2025 at 10:00

Researchers in the US and Korea have created nanoparticles with carefully designed “patches” on their surfaces using a new atomic stencilling technique. These patches can be controlled with incredible precision, and could find use in targeted drug delivery, catalysis, microelectronics and tissue engineering.

The first step in the stencilling process is to create a mask on the surface of gold nanoparticles. This mask prevents a “paint” made from grafted-on polymers from attaching to certain areas of the nanoparticles.

“We then use iodide ions as a stencil,” explains Qian Chen, a materials scientist and engineer at the University of Illinois at Urbana-Champaign, US, who led the new research effort. “These adsorb (stick) to the surface of the nanoparticles in specific patterns that depend on the shape and atomic arrangement of the nanoparticles’ facets. That’s how we create the patches – the areas where the polymers selectively bind.” Chen adds that she and her collaborators can then tailor the surface chemistry of these tiny patchy nanoparticles in a very controlled way.

A gap in the field of microfabrication stencilling

The team decided to develop the technique after realizing there was a gap in the field of microfabrication stencilling. While techniques in this area have advanced considerably in recent years, allowing ever-smaller microdevices to be incorporated into ever-faster computer chips, most of them rely on top-down approaches for precisely controlling nanoparticles. By comparison, Chen says, bottom-up methods have been largely unexplored even though they are low-cost, solution-processable, scalable and compatible with complex, curved and three-dimensional surfaces.

Reporting their work in Nature, the researchers say they were inspired by the way proteins naturally self-assemble. “One of the holy grails in the field of nanomaterials is making complex, functional structures from nanoscale building blocks,” explains Chen. “It’s extremely difficult to control the direction and organization of each nanoparticle. Proteins have different surface domains, and thanks to their interactions with each other, they can make all the intricate machines we see in biology. We therefore adopted that strategy by creating patches or distinct domains on the surface of the nanoparticles.”

“Elegant and impressive”

Philip Moriarty, a physicist at the University of Nottingham, UK, who was not involved in the project, describes it as “elegant and impressive” work. “Chen and colleagues have essentially introduced an entirely new mode of self-assembly that allows for much greater control of nanoparticle interactions,” he says, “and the ‘atomic stencil’ concept is clever and versatile.”

The team, which includes researchers at the University of Michigan, Pennsylvania State University, Cornell, Brookhaven National Laboratory and Korea’s Chonnam National University as well as Urbana-Champaign, agrees that the potential applications are vast. “Since we can now precisely control the surface properties of these nanoparticles, we can design them to interact with their environment in specific ways,” explains Chen. “That opens the door for more effective drug delivery, where nanoparticles can target specific cells. It could also lead to new types of catalysts, more efficient microelectronic components and even advanced materials with unique optical and mechanical properties.”

She and her colleagues say they now want to extend their approach to different types of nanoparticles and different substrates to find out how versatile it truly is. They will also be developing computational models that can predict the outcome of the stencilling process – something that would allow them to design and synthesize patchy nanoparticles for specific applications on demand.


Received yesterday — 24 November 2025 Physics World

Scientists in China celebrate the completion of the underground JUNO neutrino observatory

24 November 2025 at 18:00

The $330m Jiangmen Underground Neutrino Observatory (JUNO) has released its first results following the completion of the huge underground facility in August.

JUNO is located in Kaiping City, Guangdong Province, in the south of the country around 150 km west of Hong Kong.

Construction of the facility began in 2015 and was set to be completed some five years later. Yet the project suffered from serious flooding, which delayed construction.

JUNO, which is expected to run for more than 30 years, aims to study the relationship between the three types of neutrino: electron, muon and tau. Although JUNO will be able to detect neutrinos produced by supernovae as well as those from Earth, the observatory will mainly measure the energy spectrum of electron antineutrinos released by the Yangjiang and Taishan nuclear power plants, which both lie 52.5 km away.

To do this, the facility has an experimental hall, 80 m high and 50 m in diameter, located 700 m underground. Its main feature is a 35 m-diameter spherical neutrino detector containing 20,000 tonnes of liquid scintillator. When an electron antineutrino occasionally bumps into a proton in the liquid, it triggers a reaction that produces two flashes of light, which are detected by the 43,000 photomultiplier tubes that observe the scintillator.

On 18 November, a paper was submitted to the arXiv preprint server concluding that the detector’s key performance indicators fully meet or surpass design expectations.

New measurement 

Neutrinos oscillate from one flavour to another as they travel near the speed of light, rarely interacting with matter. This oscillation is a result of each flavour being a combination of three neutrino mass states.

Scientists do not know the absolute masses of the three neutrinos, but they can measure neutrino oscillation parameters, known as θ12, θ23 and θ13, as well as the mass-squared differences (Δm²) between pairs of neutrino mass states.

A second JUNO paper, also submitted on 18 November, used data collected between 26 August and 2 November to measure the solar neutrino oscillation parameters θ12 and Δm²21 with 1.6 times better precision than previous experiments.

Those earlier results, which used solar neutrinos instead of reactor antineutrinos, showed a 1.5 “sigma” discrepancy with the Standard Model of particle physics. The new JUNO measurements confirmed this difference, dubbed the solar neutrino tension, but further data will be needed to prove or disprove the finding.
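The parameters θ12 and Δm²21 enter a simple two-flavour survival probability for reactor antineutrinos. The sketch below is an illustration only, not the collaboration's full three-flavour analysis; the function name and the default parameter values (θ12 ≈ 33.4° and Δm²21 ≈ 7.5 × 10⁻⁵ eV², typical global-fit numbers, not JUNO's measurements) are assumptions for the example.

```python
import math

def survival_probability(L_km, E_GeV, theta12_deg=33.4, dm2_eV2=7.5e-5):
    """Two-flavour electron-antineutrino survival probability:
    P = 1 - sin^2(2*theta12) * sin^2(1.27 * dm2 * L / E),
    with L in km, E in GeV and dm2 in eV^2 (standard oscillation units).
    """
    theta = math.radians(theta12_deg)
    phase = 1.27 * dm2_eV2 * L_km / E_GeV
    return 1.0 - math.sin(2 * theta) ** 2 * math.sin(phase) ** 2

# JUNO's baseline: reactors 52.5 km away; a typical reactor
# antineutrino energy is a few MeV (here 4 MeV = 0.004 GeV).
p = survival_probability(52.5, 0.004)
```

At this baseline and energy the oscillation phase is of order one radian, so the solar-driven oscillation is strongly developed, which is what makes the 52.5 km distance a sensitive place to measure θ12 and Δm²21.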

“Achieving such precision within only two months of operation shows that JUNO is performing exactly as designed,” says Yifang Wang from the Institute of High Energy Physics of the Chinese Academy of Sciences, who is JUNO project manager and spokesperson. “With this level of accuracy, JUNO will soon determine the neutrino mass ordering, test the three-flavour oscillation framework, and search for new physics beyond it.”

JUNO is an international collaboration of more than 700 scientists from 75 institutions across 17 countries, including China, France, Germany, Italy, Russia, Thailand and the US. It is the second neutrino experiment in China, after the Daya Bay Reactor Neutrino Experiment, which measured the key neutrino oscillation parameter θ13 in 2012 before shutting down in 2020.

JUNO is also one of three next-generation neutrino experiments, alongside Hyper-Kamiokande in Japan and the Deep Underground Neutrino Experiment in the US. Both are expected to become operational later this decade.

The post Scientists in China celebrate the completion of the underground JUNO neutrino observatory appeared first on Physics World.

Accelerator experiment sheds light on missing blazar radiation

24 November 2025 at 16:08

New experiments at CERN by an international team have ruled out a potential source of intergalactic magnetic fields. The existence of such fields is invoked to explain why we do not observe secondary gamma rays originating from blazars.

Led by Charles Arrowsmith at the UK’s University of Oxford, the team suggests the absence of gamma rays could be the result of an unexplained phenomenon that took place in the early universe.

A blazar is an extraordinarily bright object with a supermassive black hole at its core. Some of the matter falling into the black hole is accelerated outwards in a pair of opposing jets, creating intense beams of radiation. If a blazar jet points towards Earth, we observe a bright source of light including high-energy teraelectronvolt gamma rays.

During their journey across intergalactic space, these gamma-ray photons will occasionally collide with the background starlight that permeates the universe. These collisions can create cascades of electrons and positrons that can then scatter off photons to create gamma rays in the gigaelectronvolt energy range. These gamma rays should travel in the direction of the original jet, but this secondary radiation has never been detected.

Deflecting field

Magnetic fields could be the reason for this dearth, as Arrowsmith explains: “The electrons and positrons in the pair cascade would be deflected by an intergalactic magnetic field, so if this is strong enough, we could expect these pairs to be steered away from the line of sight to the blazar, along with the reprocessed gigaelectronvolt gamma rays.” It is not clear, however, that such fields exist – and if they do, what could have created them.

Another explanation for the missing gamma rays involves the extremely sparse plasma that permeates intergalactic space. The beam of electron–positron pairs could interact with this plasma, generating magnetic fields that separate the pairs. Over millions of years of travel, this process could lead to beam–plasma instabilities that reduce the beam’s ability to create gigaelectronvolt gamma rays that are focused on Earth.

Oxford’s Gianluca Gregori explains, “We created an experimental platform at the HiRadMat facility at CERN to create electron–positron pairs and transport them through a metre-long ambient argon plasma, mimicking the interaction of pair cascades from blazars with the intergalactic medium”. Once the pairs had passed through the plasma, the team measured the degree to which they had been separated.

Tightly focused

Called Fireball, the experiment found that the beams remained far more tightly focused than expected. “When these laboratory results are scaled up to the astrophysical system, they confirm that beam–plasma instabilities are not strong enough to explain the absence of the gigaelectronvolt gamma rays from blazars,” Arrowsmith explains. The team found that unless the pair beam is perfectly collimated, or composed of pairs with exactly equal energies, instabilities in the plasma are actively suppressed.

While the experiment suggests that an intergalactic magnetic field remains the best explanation for the lack of gamma rays, the mystery is far from solved. Gregori explains, “The early universe is believed to be extremely uniform – but magnetic fields require electric currents, which in turn need gradients and inhomogeneities in the primordial plasma.” As a result, confirming the existence of such a field could point to new physics beyond the Standard Model, which may have dominated in the early universe.

More information could come with the opening of the Cherenkov Telescope Array Observatory, which will comprise ground-based gamma-ray detectors at sites in Spain and Chile and will vastly improve on the resolution of current-generation detectors.

The research is described in PNAS.

The post Accelerator experiment sheds light on missing blazar radiation appeared first on Physics World.

Ask me anything: Jason Palmer – ‘Putting yourself in someone else’s shoes is a skill I employ every day’

24 November 2025 at 12:00

What skills do you use every day in your job?

One thing I can say for sure that I got from working in academia is the ability to quickly read, summarize and internalize information from a bunch of sources. Journalism requires a lot of that. Being able to skim through papers – reading the abstract, reading the conclusion, picking the right bits from the middle and so on – that is a life skill.

In terms of other skills, I’m always considering who’s consuming what I’m doing rather than just thinking about how I’d like to say something. You have to think about how it’s going to be received – what’s the person on the street going to hear? Is this clear enough? If I were hearing this for the first time, would I understand it? Putting yourself in someone else’s shoes – be it the listener, reader or viewer – is a skill I employ every day.

What do you like best and least about your job?

The best thing is the variety. I ended up in this business and not in scientific research because of a desire for a greater breadth of experience. And boy, does this job have it. I get to talk to people around the world about what they’re up to, what they see, what it’s like, and how to understand it. And I think that makes me a much more informed person than I would be had I chosen to remain a scientist.

When I did research – and even when I was a science journalist – I thought “I don’t need to think about what’s going on in that part of the world so much because that’s not my area of expertise.” Now I have to, because I’m in this chair every day. I need to know about lots of stuff, and I like that feeling of being more informed.

I suppose what I like the least about my job is the relentlessness of it. It is a newsy time. It’s the flip side of being well informed: you’re forced to confront lots of bad things – the horrors that are going on in the world, the fact that in a lot of places the bad guys are winning.

What do you know today that you wish you knew when you were starting out in your career?

When I started in science journalism, I wasn’t a journalist – I was a scientist pretending to be one. So I was always trying to show off what I already knew as a sort of badge of legitimacy. I would call some professor on a topic that I wasn’t an expert in yet just to have a chat to get up to speed, and I would spend a bunch of time showing off, rabbiting on about what papers I’d read and what I knew, just to feel like I belonged in the room or on that call. And it’s a waste of time. You have to swallow your ego and embrace the idea that you may sound like you don’t know stuff even if you do. You might sound dumber, but that’s okay – you’ll learn more and faster, and you’ll probably annoy people less.

In journalism in particular, you don’t want to preload the question with all of the things that you already know because then the person you’re speaking to can fill in those blanks – and they’re probably going to talk about things you didn’t know you didn’t know, and take your conversation in a different direction.

It’s one of the interesting things about science in general. If you go into a situation with experts, and are open and comfortable about not knowing it all, you’re showing that you understand that nobody can know everything and that science is a learning process.

The post Ask me anything: Jason Palmer – ‘Putting yourself in someone else’s shoes is a skill I employ every day’ appeared first on Physics World.

Received before the day before yesterday Physics World

Sympathetic cooling gives antihydrogen experiment a boost

21 November 2025 at 15:20

Physicists working on the Antihydrogen Laser Physics Apparatus (ALPHA) experiment at CERN have trapped and accumulated 15,000 antihydrogen atoms in less than 7 h. This accumulation rate is more than 20 times the previous record. Large ensembles of antihydrogen could be used to search for tiny, unexpected differences between matter and antimatter – which if discovered could point to physics beyond the Standard Model.

According to the Standard Model, every particle has an antimatter counterpart – or antiparticle. It also says that roughly equal amounts of matter and antimatter were created in the Big Bang. But today there is much more matter than antimatter in the visible universe, and the reason for this “baryon asymmetry” is one of the most important mysteries of physics.

The Standard Model predicts the properties of antiparticles. An antiproton, for example, has the same mass as a proton and the opposite charge. The Standard Model also predicts how antiparticles interact with matter and antimatter. If physicists could find discrepancies between the measured and predicted properties of antimatter, it could help explain the baryon asymmetry and point to other new physics beyond the Standard Model.

Powerful probe

Just as a hydrogen atom comprises a proton bound to an electron, an antihydrogen antiatom comprises an antiproton bound to an antielectron (positron). Antihydrogen offers physicists several powerful ways to probe antimatter at a fundamental level. Trapped antiatoms can be released in freefall to determine if they respond to gravity in the same way as atoms. Spectroscopy can be used to make precise measurements of how the electromagnetic force binds the antiproton and positron in antihydrogen with the aim of finding differences compared to hydrogen.

So far, antihydrogen’s gravitational and electromagnetic properties appear to be identical to hydrogen. However, these experiments were done using small numbers of antiatoms, and having access to much larger ensembles would improve the precision of such measurements and could reveal tiny discrepancies. However, creating and storing antihydrogen is very difficult.

Today, antihydrogen can only be made in significant quantities at CERN in Switzerland. There, a beam of protons is fired at a solid target, creating antiprotons that are then cooled and stored using electromagnetic fields. Meanwhile, positrons are gathered from the decay of radioactive nuclei and likewise cooled and stored electromagnetically. These antiprotons and positrons are then combined in a special electromagnetic trap to create antihydrogen.

This process works best when the antiprotons and positrons have very low kinetic energies (temperatures) when combined. If the energy is too high, many antiatoms will escape the trap. So it is crucial that the positrons and antiprotons be as cold as possible.

Sympathetic cooling

Recently, ALPHA physicists have used a technique called sympathetic cooling on positrons, and in a new paper they describe their success. Sympathetic cooling has been used for several decades to cool atoms and ions. It originally involved mixing a hard-to-cool atomic species with atoms that are relatively easy to cool using lasers. Energy is transferred between the two species via the electromagnetic interaction, which chills the hard-to-cool atoms.

The ALPHA team used beryllium ions to sympathetically cool positrons to 10 K, which is five degrees colder than previously achieved using other techniques. These cold positrons boosted the efficiency of the creation and trapping of antihydrogen, allowing the team to accumulate 15,000 antihydrogen atoms in less than 7 h. This is more than a 20-fold improvement over their previous record of accumulating 2000 antiatoms in 24 h.

Science fiction

“These numbers would have been considered science fiction 10 years ago,” says ALPHA spokesperson Jeffrey Hangst, who is at Denmark’s Aarhus University.

Team member Maria Gonçalves, a PhD student at the UK’s Swansea University, says, “This result was the culmination of many years of hard work. The first successful attempt instantly improved the previous method by a factor of two, giving us 36 antihydrogen atoms”.

The effort was led by Niels Madsen of the UK’s Swansea University. He enthuses, “It’s more than a decade since I first realized that this was the way forward, so it’s incredibly gratifying to see the spectacular outcome that will lead to many new exciting measurements on antihydrogen”.

The cooling technique is described in Nature Communications.

The post Sympathetic cooling gives antihydrogen experiment a boost appeared first on Physics World.

Plasma bursts from young stars could shed light on the early life of the Sun

21 November 2025 at 10:00

The Sun frequently ejects high-energy bursts of plasma that then travel through interplanetary space. These so-called coronal mass ejections (CMEs) are accompanied by strong magnetic fields, which, when they interact with the Earth’s atmosphere, can trigger solar storms that can severely damage satellite systems and power grids.

In the early days of the solar system, the Sun was far more active than it is today and ejected much bigger CMEs. These might have been energetic enough to affect our planet’s atmosphere and therefore influence how life emerged and evolved on Earth, according to some researchers.

Since it is impossible to study the early Sun, astronomers use proxies – that is, stars that resemble it. These “exo-suns” are young G-, K- and M-type stars and are far more active than our Sun is today. They frequently produce CMEs with energies far larger than the most energetic solar flares recorded in recent times, which might not only affect their planets’ atmospheres, but may also affect the chemistry on these planets.

Until now, direct observational evidence for eruptive CME-like phenomena on young solar analogues has been limited. This is because clear signatures of stellar eruptions are often masked by the brightness of the host stars and by flares on them. Measurements of Doppler shifts in optical lines have allowed astronomers to detect a few possible stellar eruptions associated with giant superflares on a young solar analogue, but these detections have been limited to single-wavelength data at “low temperatures” of around 10 000 K. Studies at higher temperatures have been few and far between. And although scientists have tried out promising techniques, such as X-ray and UV dimming, to advance their understanding of these “cool” stars, few simultaneous multi-wavelength observations have been made.

A large Carrington-class flare from EK Draconis

On 29 March 2024, astronomers at Kyoto University in Japan detected a large Carrington-class flare – or superflare – in the far-ultraviolet from EK Draconis, a G-type star located approximately 112 light-years away from the Sun. Thanks to simultaneous observations in the ultraviolet and optical ranges of the electromagnetic spectrum, they say they have now been able to obtain the first direct evidence for a multi-temperature CME from this young solar analogue (which is around 50 to 125 million years old and has a radius similar to the Sun’s).

The researchers’ campaign spanned four consecutive nights from 29 March to 1 April 2024. They made their ultraviolet observations with the Hubble Space Telescope, while the Transiting Exoplanet Survey Satellite (TESS) and three ground-based telescopes in Japan, Korea and the US provided optical monitoring.

They found that the far-ultraviolet and optical lines were Doppler shifted during and just before the superflare, with the ultraviolet observations showing blueshifted emission indicative of hot plasma. About 10 minutes later, the optical telescopes observed blueshifted absorption in the hydrogen Hα line, which indicates cooler gases. According to the team’s calculations, the hot plasma had a temperature of 100 000 K and was ejected at speeds of 300–550 km/s, while the “cooler” gas (with a temperature of 10 000 K) was ejected at 70 km/s.
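Ejection speeds like those quoted above are derived from the Doppler shifts of spectral lines. As a rough illustration, the non-relativistic conversion is v ≈ c·Δλ/λ₀; the specific wavelength shift below is a made-up number for the example, not a value from the paper.

```python
C_KM_S = 299_792.458  # speed of light in km/s

def line_of_sight_speed(lambda_obs_nm, lambda_rest_nm):
    """Non-relativistic Doppler shift: a negative result means blueshift,
    i.e. material moving towards the observer."""
    return C_KM_S * (lambda_obs_nm - lambda_rest_nm) / lambda_rest_nm

# Hypothetical example: the H-alpha line (rest wavelength 656.28 nm)
# observed 0.15 nm bluewards of its rest position.
v = line_of_sight_speed(656.28 - 0.15, 656.28)  # about -68.5 km/s, i.e. ~70 km/s towards us
```

A shift of only a fraction of a nanometre in an optical line thus corresponds to the ~70 km/s speed reported for the cool gas, which is why high-resolution spectroscopy is needed for these measurements.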

“These findings imply that it is the hot plasma rather than the cool plasma that carries kinetic energy into planetary space,” explains study leader Kosuke Namekata. “The existence of this plasma suggests that such CMEs from our Sun in the past, if frequent and strong, could have driven shocks and energetic particles capable of eroding or chemically altering the atmosphere of the early Earth and the other planets in our solar system.”

“The discovery,” he tells Physics World, “provides the first observational link between solar and stellar eruptions, bridging stellar astrophysics, solar physics and planetary science.”

Looking forward, the researchers, who report their work in Nature Astronomy, now plan to conduct similar, multiwavelength campaigns on other young solar analogues to determine how frequently such eruptions occur and how they vary from star to star.

“In the near future, next-generation ultraviolet space telescopes such as JAXA’s LAPYUTA and NASA’s ESCAPADE, coordinated with ground-based facilities, will allow us to trace these events more systematically and understand their cumulative impact on planetary atmospheres,” says Namekata.

The post Plasma bursts from young stars could shed light on the early life of the Sun appeared first on Physics World.

Flattened halo of dark matter could explain high-energy ‘glow’ at Milky Way’s heart

20 November 2025 at 18:00

Astronomers have long puzzled over the cause of a mysterious “glow” of very high energy gamma radiation emanating from the centre of our galaxy. One possibility is that dark matter – the unknown substance thought to make up more than 25% of the universe’s mass – might be involved. Now, a team led by researchers at Germany’s Leibniz Institute for Astrophysics Potsdam (AIP) says that a flattened rather than spherical distribution of dark matter could account for the glow’s properties, bringing us a step closer to solving the mystery.

Dark matter is believed to be responsible for holding galaxies together. However, since it does not interact with light or other electromagnetic radiation, it can only be detected through its gravitational effects. Hence, while astrophysical and cosmological evidence has confirmed its presence, its true nature remains one of the greatest mysteries in modern physics.

“It’s extremely consequential and we’re desperately thinking all the time of ideas as to how we could detect it,” says Joseph Silk, an astronomer at Johns Hopkins University in the US and the Institut d’Astrophysique de Paris and Sorbonne University in France who co-led this research together with the AIP’s Moorits Mihkel Muru. “Gamma rays, and specifically the excess light we’re observing at the centre of our galaxy, could be our first clue.”

Models might be too simple

The problem, Muru explains, is that the way scientists have usually modelled dark matter to account for the excess gamma-ray radiation in astronomical observations was highly simplified. “This, of course, made the calculations easier, but simplifications always fuzzy the details,” he says. “We showed that in this case, the details are important: we can’t model dark matter as a perfectly symmetrical cloud and instead have to take into account the asymmetry of the cloud.”

Muru adds that the team’s findings, which are detailed in Phys. Rev. Lett., provide a boost to the “dark matter annihilation” explanation of the excess radiation. According to the standard model of cosmology, all galaxies – including our own Milky Way – are nested inside huge haloes of dark matter. The density of this dark matter is highest at the centre, and while it primarily interacts through gravity, some models suggest that it could be made of massive, neutral elementary particles that are their own antimatter counterparts. In these dense regions, therefore, such dark matter species could be mutually annihilating, producing substantial amounts of radiation.
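The reason the halo's shape matters so much can be seen in the textbook expression (generic to annihilation models, not specific to this paper) for the gamma-ray flux from dark matter annihilation, in which the particle-physics factors multiply an integral of the density squared along each line of sight. Here ⟨σv⟩ is the annihilation cross-section, m_χ the particle mass, dN_γ/dE the photon spectrum per annihilation and ρ the dark matter density; the 8π assumes self-conjugate (Majorana) particles, as described above.

```latex
\frac{\mathrm{d}\Phi_\gamma}{\mathrm{d}E}
  = \frac{\langle\sigma v\rangle}{8\pi m_\chi^{2}}\,
    \frac{\mathrm{d}N_\gamma}{\mathrm{d}E}
    \int_{\mathrm{l.o.s.}} \rho^{2}\big(r(l)\big)\,\mathrm{d}l
```

Because the flux scales with ρ² rather than ρ, even a modest flattening of the halo changes the on-sky shape of the predicted signal, which is why replacing a spherical model with an asymmetric one can alter the fit to the observed excess.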

Pierre Salati, an emeritus professor at the Université Savoie Mont Blanc, France, who was not involved in this work, says that in these models, annihilation plays a crucial role in generating a dark matter component with an abundance that agrees with cosmological observations. “Big Bang nucleosynthesis sets stringent bounds on these models as a result of the overall concordance between the predicted elemental abundances and measurements, although most models do survive,” Salati says. “One of the most exciting aspects of such explanations is that dark matter species might be detected through the rare antimatter particles – antiprotons, positrons and anti-deuterons – that they produce as they currently annihilate inside galactic halos.”

Silvia Manconi of the Laboratoire de Physique Théorique et Hautes Energies (LPTHE), France, who was also not involved in the study, describes it as “interesting and stimulating”. However, she cautions that – as is often the case in science – reality is probably more complex than even advanced simulations can capture. “This is not the first time that galaxy simulations have been used to study the implications of the excess and found non-spherical shapes,” she says, though she adds that the simulations in the new work offer “significant improvements” in terms of their spatial resolution.

Manconi also notes that the study does not demonstrate how the proposed distribution of dark matter would appear in data from the Fermi Gamma-ray Space Telescope’s Large Area Telescope (LAT), or how it would differ quantitatively from observations of a distribution of old stars. Forthcoming observations with radio telescopes such as MeerKAT and FAST, she adds, may soon identify pulsars in this region of the galaxy, shedding further light on other possible contributions to the excess of gamma rays.

New telescopes could help settle the question

Muru acknowledges that better modelling and observations are still needed to rule out other possible hypotheses. “Studying dark matter is very difficult, because it doesn’t emit or block light, and despite decades of searching, no experiment has yet detected dark matter particles directly,” he tells Physics World. “A confirmation that this observed excess radiation is caused by dark matter annihilation through gamma rays would be a big leap forward.”

New gamma-ray telescopes with higher resolution, such as the Cherenkov Telescope Array, could help settle this question, he says. If these telescopes, which are currently under construction, fail to find star-like sources for the glow and only detect diffuse radiation, that would strengthen the alternative dark matter annihilation explanation.

Muru adds that a “smoking gun” for dark matter would be a signal that matches current theoretical predictions precisely. In the meantime, he and his colleagues plan to work on predicting where dark matter should be found in several of the dwarf galaxies that circle the Milky Way.

“It’s possible we will see the new data and confirm one theory over the other,” Silk says. “Or maybe we’ll find nothing, in which case it’ll be an even greater mystery to resolve.”

The post Flattened halo of dark matter could explain high-energy ‘glow’ at Milky Way’s heart appeared first on Physics World.

Talking physics with an alien civilization: what could we learn?

20 November 2025 at 14:55

It is book week here at Physics World and over the course of three days we are presenting conversations with the authors of three fascinating and fun books about physics. Today, my guest is the physicist Daniel Whiteson, who along with the artist Andy Warner has created the delightful book Do Aliens Speak Physics?.

Is physics universal, or is it shaped by human perspective? This will be a very important question if and when we are visited by an advanced alien civilization. Would we recognize our visitors’ alien science – or indeed, could a technologically advanced civilization have no science at all? And would we even be able to communicate about science with our alien guests?

Whiteson, who is a particle physicist at the University of California Irvine, tackles these profound questions and much more in this episode of the Physics World Weekly podcast.

This episode is supported by the APS Global Physics Summit, which takes place on 15–20 March, 2026, in Denver, Colorado, and online.

The post Talking physics with an alien civilization: what could we learn? appeared first on Physics World.

International Quantum Year competition for science journalists begins

20 November 2025 at 11:05

Are you a science writer attending the 2025 World Conference of Science Journalists (WCSJ) in Pretoria, South Africa? To mark the International Year of Quantum Science and Technology, Physics World (published by the Institute of Physics) and Physics Magazine (published by the American Physical Society) are teaming up to host a special Quantum Pitch Competition for WCSJ attendees.

The two publications invite journalists to submit story ideas on any aspect of quantum science and technology. At least two selected pitches will receive paid assignments and be published in one of the magazines.

Interviews with physicists and career profiles – either in academia or industry – are especially encouraged, but the editors will also consider news stories, podcasts, visual media and other creative storytelling formats that illuminate the quantum world for diverse audiences.

Participants should submit a brief pitch (150–300 words recommended), along with a short journalist bio and a few representative clips, if available. Editors from Physics World and Physics Magazine will review all submissions and announce the winning pitches after the conference. Pitches should be submitted to physics@aps.org by 8 December 2025, with the subject line “2025WCSJ Quantum Pitch”.

Whether you’re drawn to quantum materials, computing, sensing or the people shaping the field, this is an opportunity to feature fresh voices and ideas in two leading physics publications.

This article forms part of Physics World‘s contribution to the 2025 International Year of Quantum Science and Technology (IYQ), which aims to raise global awareness of quantum physics and its applications.

Stay tuned to Physics World and our international partners throughout the year for more coverage of the IYQ.

Find out more on our quantum channel.

The post International Quantum Year competition for science journalists begins appeared first on Physics World.

New cylindrical metamaterials could act as shock absorbers for sensitive equipment

20 November 2025 at 10:00

A 3D-printed structure called a kagome tube could form the backbone of a new system for muffling damaging vibrations. The structure is part of a class of materials known as topological mechanical metamaterials, and unlike previous materials in this group, it is simple enough to be deployed in real-world situations. According to lead developer James McInerney of the Wright-Patterson Air Force Base in Ohio, US, it could be used as shock protection for sensitive systems found in civil and aerospace engineering applications.

McInerney and colleagues’ tube-like design is made from a lattice of beams arranged in such a way that low-energy vibrational modes called floppy modes become localized to one side. “This provides good properties for isolating vibrations because energy input into the system on the floppy side does not propagate to the other side,” McInerney says.

The key to this desirable behaviour, he explains, is the arrangement of the beams that form the lattice structure. Using a pattern first proposed by the 19th-century physicist James Clerk Maxwell, the beams are organized into repeating sub-units to form stable, two-dimensional structures known as topological Maxwell lattices.

Self-supporting design

Previous versions of these lattices could not support their own weight. Instead, they were attached to rigid external mounts, making it impractical to integrate them into devices. The new design, in contrast, is made by folding a flat Maxwell lattice into a cylindrical tube that is self-supporting. The tube features a connected inner and outer layer – a kagome bilayer – and its radius can be precisely engineered to give it the desired topological behaviour.

The researchers, who detail their work in Physical Review Applied, first tested their structure numerically by attaching a virtual version to a mechanically sensitive sample and a source of low-energy vibrations. As expected, the tube diverted the vibrations away from the sample and towards the other end of the tube.

Next, they developed a simple spring-and-mass model to understand the tube’s geometry by considering it as a simple monolayer. This modelling indicated that the polarization of the tube should be similar to the polarization of the monolayer. They then added rigid connectors to the tube’s ends and used a finite-element method to calculate the frequency-dependent patterns of vibrations propagating across the structure. They also determined the effective stiffness of the lattice as they applied loads parallel and perpendicular to it.

The researchers are targeting vibration-isolation applications that would benefit from a passive support structure, especially in cases where the performance of alternative passive mechanisms, such as viscoelastomers, is temperature-limited. “Our tubes do not necessarily need to replace other vibration isolation mechanisms,” McInerney explains. “Rather, they can enhance the capabilities of these by having the load-bearing structure assist with isolation.”

The team’s first and most important task, McInerney adds, will be to explore the implications of physically mounting the kagome tube on its vibration isolation structures. “The numerical study in our paper uses idealized mounting conditions so that the input and output are perfectly in phase with the tube vibrations,” he says. “Accounting for the potential impedance mismatch between the mounts and the tube will enable us to experimentally validate our work and provide realistic design scenarios.”

The post New cylindrical metamaterials could act as shock absorbers for sensitive equipment appeared first on Physics World.

Breakfast physics, delving into quantum 2.0, the science of sound, an update to everything: micro reviews of recent books

19 November 2025 at 15:00

Physics Around the Clock: Adventures in the Science of Everyday Living
By Michael Banks

Why do Cheerios tend to stick together while floating in a bowl of milk? Why does a runner’s ponytail swing side to side? These might not be the most pressing questions in physics, but answering them is fun and provides insights into important scientific concepts. These are just two examples of everyday physics that Physics World news editor Michael Banks explores in his book Physics Around the Clock, which begins with the physics (and chemistry) of your morning coffee and ends with a formula for predicting the winner of those cookery competitions that are mainstays of evening television. Hamish Johnston

 

Quantum 2.0: the Past, Present and Future of Quantum Physics
By Paul Davies

You might wonder why the world needs yet another book about quantum mechanics, but for physicists there’s no better guide than Paul Davies. Based for the last two decades at Arizona State University in the US, in Quantum 2.0 Davies tackles the basics of quantum physics – along with its mysteries, applications and philosophical implications – with great clarity and insight. The book ends with truly strange topics such as quantum Cheshire cats and delayed-choice quantum erasers – see if you prefer his descriptions to those we’ve attempted in Physics World this year. Matin Durrani

 

Can You Get Music on the Moon? the Amazing Science of Sound and Space
By Sheila Kanani, illustrated by Liz Kay

Why do dogs bark but wolves howl? How do stars “sing”? Why does thunder rumble? This delightful, fact-filled children’s book answers these questions and many more, taking readers on an adventure through sound and space. Written by planetary scientist Sheila Kanani and illustrated by Liz Kay, Can You Get Music on the Moon? reveals not only how sound is produced but why it can make us feel certain things. Each of the 100 or so pages brims with charming illustrations that illuminate the many ways that sound is all around us. Michael Banks

  • 2025 Puffin Books

 

A Short History of Nearly Everything 2.0
By Bill Bryson

Alongside books such as Stephen Hawking’s A Brief History of Time and Carl Sagan’s Cosmos, British-American author Bill Bryson’s A Short History of Nearly Everything is one of the bestselling popular-science books of the last 50 years. First published in 2003, the book became a fan favourite of readers across the world and across disciplines as Bryson wove together a clear and humorous narrative of our universe. Now, 22 years later, he has released an updated and revised volume – A Short History of Nearly Everything 2.0 – that covers major updates in science from the past two decades. This includes the discovery of the Higgs boson and the latest on dark-matter research. The new edition is still imbued with all the wit and wisdom of the original, making it the perfect Christmas present for scientists and anyone else curious about the world around us. Tushna Commissariat

  • 2025 Doubleday

The post Breakfast physics, delving into quantum 2.0, the science of sound, an update to everything: micro reviews of recent books appeared first on Physics World.

Quantum 2.0: Paul Davies on the next revolution in physics

19 November 2025 at 14:00

In this episode of Physics World Stories, theoretical physicist, cosmologist and author Paul Davies discusses his latest book, Quantum 2.0: the Past, Present and Future of Quantum Physics. A Regents Professor at Arizona State University, Davies reflects on how the first quantum revolution transformed our understanding of nature – and what the next one might bring.

He explores how emerging quantum technologies are beginning to merge with artificial intelligence, raising new ethical and philosophical questions. Could quantum AI help tackle climate change or address issues like hunger? And how far should we go in outsourcing planetary management to machines that may well prioritize their own survival?

Davies also turns his gaze to the arts, imagining a future where quantum ideas inspire music, theatre and performance. From jazz improvised by quantum algorithms to plays whose endings depend on quantum outcomes, creativity itself could enter a new superposition.

Hosted by Andrew Glester, this episode blends cutting-edge science and imagination in trademark Paul Davies style.

This article forms part of Physics World’s contribution to the 2025 International Year of Quantum Science and Technology (IYQ), which aims to raise global awareness of quantum physics and its applications.

Stay tuned to Physics World and our international partners throughout the year for more coverage of the IYQ.

Find out more on our quantum channel.

 

The post Quantum 2.0: Paul Davies on the next revolution in physics appeared first on Physics World.


Flexible electrodes for the future of light detection

19 November 2025 at 09:04

Photodetectors convert light into electrical signals and are essential in technologies ranging from consumer electronics and communications to healthcare. They also play a vital role in scientific research. Researchers are continually working to improve their sensitivity, response speed, spectral range, and design efficiency.

Since the discovery of graphene’s remarkable electrical properties, there has been growing interest in using graphene and other two-dimensional (2D) materials to advance photodetection technologies. When light interacts with these materials, it excites electrons that must travel to a nearby contact electrode to generate an electrical signal. How easily this occurs depends on the work functions of the materials involved – specifically, the difference between them, known as the Schottky barrier height. Selecting an optimal combination of 2D material and electrode can minimize this barrier, enhancing the photodetector’s sensitivity and speed. Unfortunately, traditional electrode materials have fixed work functions, which limits 2D photodetector technology.
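As a rough numerical sketch of this idea (the work-function values below are illustrative placeholders, not measurements from the study), the barrier in this simplified picture is just the difference between the electrode and 2D-material work functions:

```python
# Simplified sketch: Schottky barrier height as a work-function difference,
# following the simplified picture described above. All values are illustrative.

def schottky_barrier_eV(electrode_wf_eV, material_wf_eV):
    """Barrier height (eV) as the magnitude of the work-function difference."""
    return abs(electrode_wf_eV - material_wf_eV)

material_wf = 4.6         # hypothetical 2D semiconductor work function (eV)
fixed_electrode_wf = 5.3  # hypothetical fixed-work-function metal electrode (eV)
tuned_electrode_wf = 4.7  # a tunable electrode matched closely to the material (eV)

print(schottky_barrier_eV(fixed_electrode_wf, material_wf))  # larger barrier
print(schottky_barrier_eV(tuned_electrode_wf, material_wf))  # smaller barrier
```

A tunable electrode such as the study’s PEDOT:PSS (reported range 3.2–5.1 eV) can be matched far more closely to a given 2D material than a fixed metal can, minimizing the barrier.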

PEDOT:PSS is a widely used electrode material in photodetectors thanks to its low cost, flexibility and transparency. In this study, the researchers developed PEDOT:PSS electrodes with work functions tunable from 5.1 to 3.2 eV, making them compatible with a variety of 2D materials and ideal for optimizing device performance in metal–semiconductor–metal architectures. The resulting photodetectors performed excellently, with a high rectification ratio (~10⁵), strong conversion of light to electrical output (responsivity up to 1.8 A/W) and an exceptionally high light-to-dark current ratio of 10⁸. The detectors were also highly sensitive with low noise, had very fast response times (as fast as 3.2 μs) and, thanks to the transparency of PEDOT:PSS, showed extended sensitivity into the near-infrared region.
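To put the responsivity figure in context, here is a quick back-of-the-envelope conversion (the incident power is an arbitrary example, not a value from the paper):

```python
# Photocurrent from responsivity: I = R * P.
responsivity = 1.8    # reported peak responsivity, in A/W
optical_power = 1e-6  # example incident optical power: 1 microwatt

photocurrent = responsivity * optical_power  # amps
print(f"{photocurrent * 1e6:.1f} uA")        # 1.8 uA of photocurrent
```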

This study demonstrates a tunable, transparent polymer electrode that enhances the performance and versatility of 2D photodetectors, offering a promising path toward flexible, self-powered, and wearable optoelectronic systems, and paving the way for next-generation intelligent interactive technologies.

Read the full article

A homogenous polymer design with widely tunable work functions for high-performance two-dimensional photodetectors

Youchen Chen et al 2025 Rep. Prog. Phys. 88 068003

Do you want to learn more about this topic?

Two-dimensional material/group-III nitride hetero-structures and devices by Tingting Lin, Yi Zeng, Xinyu Liao, Jing Li, Changjian Zhou and Wenliang Wang (2025)

The post Flexible electrodes for the future of light detection appeared first on Physics World.

Quantum cryptography in practice

19 November 2025 at 09:02

Quantum Conference Key Agreement (QCKA) is a cryptographic method that allows multiple parties to establish a shared secret key using quantum technology. This key can then be used for secure communication among the parties.

Unlike traditional methods that rely on classical cryptographic techniques, QCKA leverages the principles of quantum mechanics, particularly multipartite entanglement, to ensure security.

A key aspect of QCKA is creating and distributing entangled quantum states among the parties. These entangled states have unique properties that make it impossible for an eavesdropper to intercept the key without being detected.

Researchers measure the efficiency and performance of the key agreement protocol using a metric known as the key rate.

One problem with state-of-the-art QCKA schemes is that this key rate decreases exponentially with the number of users.

Previous solutions to this problem, based on single-photon interference, have come at the cost of requiring global phase locking, which makes them impractical to implement experimentally.

However, the authors of this new study have been able to circumvent this requirement by adopting an asynchronous pairing strategy. Put simply, this means that measurements taken by different parties in different places do not need to happen at exactly the same time.

Their solution effectively removes the need for global phase locking while still maintaining the favourable scaling of the key rate as in other protocols based on single-photon interference.
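The scaling problem, and why circumventing it matters, can be sketched with a deliberately crude toy model (this is a generic illustration, not the rate formula from the paper): in a conventional scheme, all N users’ photons must survive their lossy channels simultaneously, so if each channel has transmittance η the rate collapses like η^N.

```python
# Toy model of QCKA key-rate scaling with the number of users N.
# Purely illustrative: real security analyses give far more involved rates.

def conventional_rate(eta, n_users):
    """All N photons must arrive together: rate ~ eta**N, exponential in N."""
    return eta ** n_users

eta = 0.1  # example per-user channel transmittance (10%)
for n in (2, 4, 8):
    print(n, conventional_rate(eta, n))
# Each extra user costs another factor of eta. Single-photon-interference
# protocols avoid this exponential collapse, which is what the asynchronous
# pairing strategy preserves without needing global phase locking.
```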

The new scheme represents an important step towards realising QCKA at long distances by allowing for much more practical experimental configurations.

Quantum conference key agreement: schematic representation of a quantum group network via circular asynchronous interference. (Courtesy: Hua-Lei Yin)

Read the full article

Repeater-like asynchronous measurement-device-independent quantum conference key agreement

Yu-Shuo Lu et al., 2025 Rep. Prog. Phys. 88 067901

The post Quantum cryptography in practice appeared first on Physics World.

Scientists realize superconductivity in traditional semiconducting material

18 November 2025 at 17:00
Coherent crystalline interfaces: atomic-resolution image of a superconducting germanium:gallium (Ge:Ga) trilayer, with alternating Ge:Ga and silicon layers demonstrating precise control of atomic interfaces. (Courtesy: Salva Salmani-Rezaie)

The ability to induce superconductivity in materials that are inherently semiconducting has been a longstanding research goal. Improving the conductivity of semiconductor materials could help develop quantum technologies with high speed and energy efficiency, including superconducting quantum bits (qubits) and cryogenic CMOS control circuitry. However, this task has proved challenging in traditional semiconductors – such as silicon or germanium – as it is difficult to maintain the optimal superconductive atomic structure.

In a new study, published in Nature Nanotechnology, researchers have used molecular beam epitaxy (MBE) to grow gallium-hyperdoped germanium films that retain their superconductivity. When asked about the motivation for this latest work, Peter Jacobson from the University of Queensland tells Physics World about his collaboration with Javad Shabani from New York University.

“I had been working on superconducting circuits when I met Javad and discovered the new materials their team was making,” he explains. “We are all trying to understand how to control materials and tune interfaces in ways that could improve quantum devices.”

Germanium: from semiconductor to superconductor

Germanium is a group IV element, so its properties bridge those of both metals and insulators. Superconductivity can be induced in germanium by manipulating its atomic structure to introduce more electrons into the atomic lattice. These extra electrons interact with the germanium lattice to create electron pairs that move without resistance, or in other words, they become superconducting.

Hyperdoping germanium (at concentrations well above the solid solubility limit) with gallium induces a superconducting state. However, this material is traditionally unstable due to the presence of structural defects, dopant clustering and poor thickness control. There have also been many questions raised as to whether these materials are intrinsically superconducting, or whether it is actually gallium clusters and unintended phases that are solely responsible for the superconductivity of gallium-doped germanium.

Considering these issues and looking for a potential new approach, Jacobson notes that X-ray absorption measurements at the Australian Synchrotron were “the first real sign” that Shabani’s team had grown something special. “The gallium signal was exceptionally clean, and early modelling showed that the data lined up almost perfectly with a purely substitutional picture,” he explains. “That was a genuine surprise. Once we confirmed and extended those results, it became clear that we could probe the mechanism of superconductivity in these films without the usual complications from disorder or spurious phases.”

Epitaxial growth improves superconductivity control

In a new approach, Jacobson, Shabani and colleagues used MBE to grow the crystals instead of relying on ion-implantation techniques, allowing the germanium to be hyperdoped with gallium. Using MBE forces the gallium atoms to replace germanium atoms within the crystal lattice at levels much higher than previously seen. The process also provided better control over parasitic heating during film growth, allowing the researchers to achieve the structural precision required to understand and control the superconductivity of these germanium:gallium (Ge:Ga) materials, which were found to become superconducting at 3.5 K with a carrier concentration of 4.15 × 10²¹ holes/cm³. The critical gallium dopant threshold to achieve this was 17.9%.

Using synchrotron-based X-ray absorption, the team found that the gallium dopants were substitutionally incorporated into the germanium lattice and induced a tetragonal distortion to the unit cell. Density functional theory calculations showed that this causes a shift in the Fermi level into the valence band and flattens electronic bands. This suggests that the structural order of gallium in the germanium lattice creates a narrow band that facilitates superconductivity in germanium, and that this superconductivity arises intrinsically in the germanium, rather than being governed by defects and gallium clusters.

The researchers tested trilayer heterostructures – Ge:Ga/Si/Ge:Ga and Ge:Ga/Ge/Ge:Ga – as proof-of-principle designs for vertical Josephson junction device architectures. In the future, they hope to develop these into fully fledged Josephson junction devices.

Commenting on the team’s future plans for this research, Jacobson concludes: “I’m very keen to examine this material with low-temperature scanning tunnelling microscopy (STM) to directly measure the superconducting gap, because STM adds atomic-scale insights that complement our other measurements and will help clarify what sets hyperdoped germanium apart”.

The post Scientists realize superconductivity in traditional semiconducting material appeared first on Physics World.

Better coffee, easier parking and more: the fascinating physics of daily life

18 November 2025 at 15:20

It is book week here at Physics World and over the course of three days we are presenting conversations with the authors of three fascinating and fun books about physics. First up is my Physics World colleague Michael Banks, whose book Physics Around the Clock: Adventures in the Science of Everyday Living starts with your morning coffee and ends with a formula for making your evening television viewing more satisfying.

As well as the rich physics of coffee, we chat about strategies for finding the best parking spot and the efficient boarding of aeroplanes. If you have ever wondered why a runner’s ponytail swings from side to side when they reach a certain speed – we have the answer for you.

Other daily mysteries that we explore include how a hard steel razor blade can be dulled by cutting relatively soft hairs and why quasiparticles called “jamitons” are helping physicists understand the spontaneous appearance of traffic jams. And a warning for squeamish listeners, we do talk about the amazing virus-spreading capabilities of a flushing toilet.


 

This episode is supported by the APS Global Physics Summit, which takes place on 15–20 March, 2026, in Denver, Colorado, and online.

The post Better coffee, easier parking and more: the fascinating physics of daily life appeared first on Physics World.

Cosmic dawn: the search for the primordial hydrogen signal

18 November 2025 at 12:00

“This is one of the big remaining frontiers in astronomy,” says Phil Bull, a cosmologist at the Jodrell Bank Centre for Astrophysics at the University of Manchester. “It’s quite a pivotal era of cosmic history that, it turns out, we don’t actually understand.”

Bull is referring to the vital but baffling period in the early universe – from 380,000 years to one billion years after the Big Bang – when its structure went from simple to complex. To lift the veil on this epoch, experiments around the world – from Australia to the Arctic – are racing to find a specific but elusive signal from the earliest hydrogen atoms. This signal could confirm or disprove scientists’ theories of how the universe evolved and the physics that governs it.

Hydrogen is the most abundant element in the universe. As neutral hydrogen atoms flip between their two hyperfine states, they can emit or absorb photons. This spectral transition, which can be stimulated by radiation, produces an emission or absorption signal at a radio wavelength of 21 cm. To find out what happened during that early era, astronomers are searching for these 21 cm photons emitted by primordial hydrogen atoms.

But despite more teams joining the hunt every year, no-one has yet had a confirmed detection of this radiation. So who will win the race to find this signal and how is the hunt being carried out?

A blank spot

Let’s first return to about 380,000 years after the Big Bang, when the universe had expanded and cooled to below 3000 K. At this stage, neutral atoms, including atomic hydrogen, could form. Thanks to the absence of free electrons, ordinary matter particles could decouple from light, allowing it to travel freely across the universe. This ancient radiation that permeates the sky is known as the cosmic microwave background (CMB).

But after that we don’t know much about what happened for the next few hundred million years. Meanwhile, the oldest known galaxy MoM-z14 – which existed about 280 million years after the Big Bang – was observed in April 2025 by the James Webb Space Telescope. So there is currently a gap of just under 280 million years in our observations of the early universe. “It’s one of the last blank spots,” says Anastasia Fialkov, an astrophysicist at the Institute of Astronomy of the University of Cambridge.

This “blank spot” is a bridge between the early, simple universe and today’s complex structured cosmos. During this early epoch, the universe went from being filled with a thick cloud of neutral hydrogen, to being diversely populated with stars, black holes and everything in between. It covers the end of the cosmic dark ages, the cosmic dawn, and the epoch of reionization – and is arguably one of the most exciting periods in our universe’s evolution.

During the cosmic dark ages, after the CMB flooded the universe, the only “ordinary” matter (made up of protons, neutrons and electrons) was neutral hydrogen (75% by mass) and neutral helium (25%), and there were no stellar structures to provide light. It is thought that gravity then magnified any slight fluctuations in density, causing some of this primordial gas to clump and eventually form the first stars and galaxies – a time called the cosmic dawn. Next came the epoch of reionization, when ultraviolet and X-ray emissions from those first celestial objects heated and ionized the hydrogen atoms, turning the neutral gas into a charged plasma of electrons and protons.

Stellar imprint

The 21 cm signal astronomers are searching for was produced when the spectral transition was excited by collisions in the hydrogen gas during the dark ages and then by the first photons from the first stars during the cosmic dawn. However, the intensity of the 21 cm signal can only be measured against the CMB, which acts as a steady background source of 21 cm photons.

When the hydrogen was colder than the background radiation, there were few collisions, and the atoms would have absorbed slightly more 21 cm photons from the CMB than they emitted themselves. The 21 cm signal would appear as a deficit, or absorption signal, against the CMB. But when the neutral gas was hotter than the CMB, the atoms would emit more photons than they absorbed, causing the 21 cm signal to be seen as a brighter emission against the CMB. These absorption and emission rates depend on the density and temperature of the gas, and the timing and intensity of radiation from the first cosmic sources. Essentially, the 21 cm signal became imprinted with how those early sources transformed the young universe.

One way scientists are trying to observe this imprint is to measure the average – or “global” – signal across the sky, looking at how it shifts from absorption to emission compared to the CMB. Normally, a 21 cm radio wave signal has a frequency of about 1420 MHz. But this ancient signal, according to theory, has been emitted and absorbed at different intensities throughout this cosmic “blank spot”, depending on the universe’s evolutionary processes at the time. The expanding universe has also stretched and distorted the signal as it travelled to Earth. Theories predict that it would now be in the 1 to 200 MHz frequency range – with lower frequencies corresponding to older eras – and would have a wavelength of metres rather than centimetres.
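The stretching follows the standard redshift relation, f_obs = f_rest/(1 + z) – a textbook formula rather than anything specific to these experiments – which maps the quoted frequency range onto cosmic epochs:

```python
# Redshifted 21 cm line: observed frequency f_obs = f_rest / (1 + z).
F_REST_MHZ = 1420.4  # rest-frame frequency of the hydrogen hyperfine line

def observed_frequency_mhz(z):
    """Frequency at which a signal emitted at redshift z arrives today."""
    return F_REST_MHZ / (1.0 + z)

def redshift_for(f_obs_mhz):
    """Redshift corresponding to a given observed frequency."""
    return F_REST_MHZ / f_obs_mhz - 1.0

# A signal emitted at redshift z ~ 17 (during the cosmic dawn) arrives near
# 79 MHz, while the top of the search band, 200 MHz, corresponds to z ~ 6:
print(round(observed_frequency_mhz(17), 1))  # ~78.9 MHz
print(round(redshift_for(200.0), 1))         # ~6.1
```

Lower observed frequencies thus probe higher redshifts, i.e. older eras, which is why the dark ages and cosmic dawn must be hunted at tens of MHz.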

Importantly, the shape of the global 21 cm signal over time could confirm the lambda-cold dark matter (ΛCDM) model, which is the most widely accepted theory of the cosmos; or it could upend it. Many astronomers have dedicated their careers to finding this radiation, but it is challenging for a number of reasons.

Unfortunately, the signal is incredibly faint. Its brightness temperature, measured as a deviation from the CMB’s black-body temperature of 2.7 K, will only be in the region of 0.1 K.

1 The 21 cm signal across cosmic time

(a CC BY 4.0 The Royal Society/A Fialkov et al. 2024 Philos. Trans. A Math. Phys. Eng. Sci. 382 20230068; b Copyright Springer Nature. Reused with permission from E de Lera Acedo et al. 2022 Nature Astronomy 6 984)

a A simulation of the sky-averaged (global) signal as a function of time (horizontal) and space (vertical). b A typical model of the global 21 cm line with the main cosmic events highlighted. Each experiment searching for the global 21 cm signal focuses on a particular frequency band. For example, the Radio Experiment for the Analysis of Cosmic Hydrogen (REACH) is looking at the 50–170 MHz range (blue).

There is also no single source of this emission, so, like the CMB, it permeates the universe. “If it was the only signal in the sky, we would have found it by now,” says Eloy de Lera Acedo, head of Cavendish Radio Astronomy and Cosmology at the University of Cambridge. But the universe is full of contamination, with the Milky Way being a major culprit. Scientists are searching for 0.1 K in an environment “that’s a million times brighter”, he explains.

And even before this signal reaches the radio-noisy Earth, it has to travel through the atmosphere, which further distorts and contaminates it. “It’s a very difficult measurement,” says Rigel Cappallo, a research scientist at the MIT Haystack Observatory. “It takes a really, really well calibrated instrument that you understand really well, plus really good modelling.”

Seen but not confirmed

In 2018 the Experiment to Detect the Global EoR Signature (EDGES) – a collaboration between Arizona State University and MIT Haystack Observatory – hit the headlines when it claimed to have detected the global 21 cm signal (Nature 555 67).

The EDGES instrument is a dipole antenna, which resembles a ping-pong table with a gap in the middle (see photo at top of article for the 2024 set-up). It is mounted on a large metal groundsheet, which is about 30 × 30 m. Its ground-breaking observation was made at a remote site in western Australia, far from radio frequency interference.

But in the intervening seven years, no-one else has been able to replicate the EDGES results.

The spectrum dip that EDGES detected was very different from what theorists had expected. “There is a whole family of models that are predicted by the different cosmological scenarios,” explains Ravi Subrahmanyan, a research scientist at Australia’s national science agency CSIRO. “When we take measurements, we compare them with the models, so that we can rule those models in or out.”

In general, the current models predict a very specific envelope of signal possibilities (see figure 1). First, they anticipate an absorption dip in brightness temperature of around 0.1 to 0.2 K, caused by the temperature difference between the cold hydrogen gas (in an expanding universe) and the warmer CMB. Then, a speedy rise and photon emission is predicted as the gas starts to warm when the first stars form, and the signal should spike dramatically when the first X-ray binary stars fire up and heat up the surrounding gas. The signal is then expected to fade as the epoch of reionization begins, because ionized particles cannot undergo the spectral transition. With models, scientists theorize when this happened, how many stars there were, and how the cosmos unfurled.

2 Weird signal

(Courtesy: SARAS Team)

The 21 cm signals predicted by current cosmology models (coloured lines) and the detection by the EDGES experiment (dashed black line).

“It’s just one line, but it packs in so many physical phenomena,” says Fialkov, referring to the shape of the 21 cm signal’s brightness temperature over time. The timing of the dip, its gradient and magnitude all represent different milestones in cosmic history, which affect how it evolved.

The EDGES team, however, reported a dip of more than double the predicted size, at about 78 MHz (see figure 2). While the frequency was consistent with predictions, the very wide and deep dip of the signal took the community by surprise.

“It would be a revolution in physics, because that signal will call for very, very exotic physics to explain it,” says de Lera Acedo. “Of course, the first thing we need to do is to make sure that that is actually the signal.”

A spanner in the works

The EDGES claim has galvanized the cosmology community. “It set a cat among the pigeons,” says Bull. “People realized that, actually, there’s some very exciting science to be done here.” Some groups are trying to replicate the EDGES observation, while others are trying new approaches to detect the signal that the models promise.

The Radio Experiment for the Analysis of Cosmic Hydrogen (REACH) – a collaboration between the University of Cambridge and Stellenbosch University in South Africa – focuses on the 50–170 MHz frequency range. Sitting on the dry and empty plains of South Africa’s Northern Cape, it is targeting the EDGES observation (Nature Astronomy 6 984).

The race to replicate: REACH went online in the Karoo region of South Africa in December 2023. (Courtesy: Saurabh Pegwal, REACH collaboration)

In this radio-quiet environment, REACH has set up two antennas: one looks like EDGES’ dipole ping-pong table, while the other is a spiral cone. They sit on top of a giant metallic mesh – the ground plate – in the shape of a many-pointed star, which aims to minimize reflections from the ground.

Hunting for this signal “requires precision cosmology and engineering”, says de Lera Acedo, the principal investigator on REACH. Reflections from the ground or mesh, calibration errors and signals from the soil are the kryptonite of cosmic dawn measurements. “You need to reduce your systemic noise, do better analysis, better calibration, better cleaning [to remove other sources from observations],” he says.

Desert, water, snow

Another radio telescope, dubbed the Shaped Antenna measurement of the background Radio Spectrum (SARAS) – which was established in the late 2000s by the Raman Research Institute (RRI) in Bengaluru, India – has undergone a number of transformations to reduce noise and limit other sources of radiation. Over time, it has morphed from a dipole on the ground to a metallic cone floating on a raft. It is looking at 40 to 200 MHz (Exp. Astron. 51 193).

After the EDGES claim, SARAS pivoted its attention to verifying the detection, explains Saurabh Singh, a research scientist at the RRI. “Initially, we were not able to get down to the required sensitivity to be able to say anything about their detection,” he explains. “That’s why we started floating our radiometer on water.” Buoying the experiment reduces ground contamination and creates a more predictable surface to include in calculations.

Floating telescope: evolution of the SARAS experiment and sites up to 2020. The third edition of the telescope, SARAS 3, was deployed on lakes to further reduce radio interference. (Courtesy: SARAS Team)

Using data from their floating radiometer, in 2022 Singh and colleagues disfavoured EDGES’ claim (Nature Astronomy 6 607), but for many groups the detection still remains a target for observations.

While SARAS has yet to detect a cosmic-dawn signal of its own, Singh says that non-detection is also an important element of finding the global 21 cm signal. “Non-detection gives us an opportunity to rule out a lot of these models, and that has helped us to reject a lot of properties of these stars and galaxies,” he says.

Raul Monsalve Jara – a cosmologist at the University of California, Berkeley – has been part of the EDGES collaboration since 2012, but decided to also explore other ways to detect the signal. “My view is that we need several experiments doing different things and taking different approaches,” he says.

The Mapper of the IGM Spin Temperature (MIST) experiment, of which Monsalve is co-principal investigator, is a collaboration between Chilean, Canadian, Australian and American researchers. These instruments are looking at 25 to 105 MHz (MNRAS 530 4125). “Our approach was to simplify the instrument, get rid of the metal ground plate, and to take small, portable instruments to remote locations,” he explains. These locations have to fulfil very specific requirements – everything around the instrument, from mountains to the soil, can impact the instrument’s performance. “If the soil itself is irregular, that will be very difficult to characterize and its impact will be difficult to remove [from observations],” Monsalve says.

Two photos of a small portable radio telescope – in a snowy Arctic region and in a hot desert
Physics on the move MIST conducts measurements of the sky-averaged radio spectrum at frequencies below 200 MHz. Its monopole and dipole variants are highly portable and have been deployed in some of the most remote sites on Earth, including the Arctic (top) and the Nevada desert (bottom). (Courtesy: Raul Monsalve)

So far, the MIST instrument – a dipole about the size of a ping-pong table – has visited a desert in California, another in Nevada, and even the Arctic. The instrument is portable and easy to set up, and each time the researchers spend a few weeks at the site collecting data, Monsalve explains. The team is planning more observations in Chile. “If you suspect that your environment could be doing something to your measurements, then you need to be able to move around,” continues Monsalve. “And we are contributing to the field by doing that.”

Aaron Parsons, also from the University of California, Berkeley, decided that the best way to detect this elusive signal would be to try and eliminate the ground entirely – by suspending a rotating antenna over a giant canyon, with 100 m of empty space in every direction.

His Electromagnetically Isolated Global Signal Estimation Platform (EIGSEP) includes an antenna hanging four storeys above the ground, attached to a Kevlar cable strung across a canyon in Utah, and observing at 50 to 250 MHz. “It continuously rotates around and twists every which way,” Parsons explains. The team hopes this will allow them to calibrate the instrument very accurately, while two antennas on the ground cross-correlate observations. EIGSEP began making observations last year.
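The point of cross-correlating two antennas is that the sky signal they see is common while each receiver’s noise is independent, so the noise averages away in the cross product. A minimal numpy sketch (not the EIGSEP pipeline – signal and noise levels here are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
sky = rng.normal(size=n)             # common sky signal seen by both antennas
ant1 = sky + 5 * rng.normal(size=n)  # each antenna adds its own receiver noise
ant2 = sky + 5 * rng.normal(size=n)

auto = np.mean(ant1 * ant1)   # autocorrelation: sky power + noise power (~ 1 + 25)
cross = np.mean(ant1 * ant2)  # cross-correlation: only the common sky power survives (~ 1)

print(f"auto  ~ {auto:.2f}")
print(f"cross ~ {cross:.2f}")
```

The autocorrelation of a single antenna is dominated by its own receiver noise, whereas the cross-correlation converges to the sky power alone as the integration lengthens.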

More experiments are expected to come online in the next year. The Remote HI eNvironment Observer (RHINO), an initiative of the University of Manchester, will have a horn-shaped receiver made of a metal mesh of the kind usually used to construct skyscrapers. Horn shapes are particularly good for calibration, allowing for very precise measurements. The most famous horn-shaped antenna is Bell Laboratories’ Holmdel Horn Antenna in the US, with which Arno Penzias and Robert Wilson accidentally discovered the CMB in 1965.

Initially, RHINO will be based at Jodrell Bank Observatory in the UK, but like other experiments, it could travel to other remote locations to hunt for the 21 cm signal.

Similarly, Subrahmanyan – who established the SARAS experiment in India and is now with CSIRO in Australia – is working to design a new radiometer from scratch. The instrument, which will focus on 40–160 MHz, is called Global Imprints from Nascent Atoms to Now (GINAN). He says that it will feature a recently patented self-calibrating antenna. “It gives a much more authentic measurement of the sky signal as measured by the antenna,” he explains.
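The frequency bands these experiments quote map directly onto redshift windows for the 21 cm line, via the standard relation ν_obs = ν_21/(1 + z) with ν_21 ≈ 1420.4 MHz. A quick sketch converting the bands mentioned above:

```python
# Redshifted 21 cm hydrogen line: nu_obs = NU_21 / (1 + z)
NU_21 = 1420.4  # MHz, rest frequency of the hydrogen hyperfine transition

def redshift(nu_obs_mhz: float) -> float:
    """Redshift at which the 21 cm line is observed at nu_obs_mhz."""
    return NU_21 / nu_obs_mhz - 1.0

bands = {
    "MIST (25-105 MHz)": (25, 105),
    "EIGSEP (50-250 MHz)": (50, 250),
    "GINAN (40-160 MHz)": (40, 160),
}
for name, (lo, hi) in bands.items():
    print(f"{name}: z ~ {redshift(hi):.1f} to {redshift(lo):.1f}")
```

GINAN’s 40–160 MHz band, for example, corresponds to redshifts of roughly 8 to 35, spanning the cosmic dawn era.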

In the meantime, the EDGES collaboration has not been idle. Cappallo, at MIT Haystack Observatory, is the project manager of EDGES, which is currently in its third iteration. It is still the size of a desk, but its top now looks like a box, with closed sides and its electronics tucked inside, and it has an even larger metal ground plate. The team has now made observations from islands in the Canadian archipelago and in Alaska’s Aleutian island chain (see photo at top of article).

“The 2018 EDGES result is not going to be accepted by the community until somebody completely independently verifies it,” Cappallo explains. “But just for our own sanity and also to try to improve on what we can do, we want to see it from as many places as possible and as many conditions as possible.” The EDGES team has replicated its results using the same data analysis pipeline, but no-one else has been able to reproduce the unusual signal.

All the astronomers interviewed welcomed the introduction of new experiments. “I think it’s good to have a rich field of people trying to do this experiment because nobody is going to trust any one measurement,” says Parsons. “We need to build consensus here.”

Taking off

Some astronomers have decided to avoid the struggles of trying to detect the global 21 cm signal from Earth – instead, they have their sights set on the Moon. Earth’s atmosphere is one of the reasons why the 21 cm signal is so difficult to measure. The ionosphere, a charged region of the atmosphere, distorts and contaminates this incredibly faint signal. On the far side of the Moon, any antenna would also be shielded from the cacophony of radio-frequency interference from Earth.

“This is why some experiments are going to the Moon,” says Parsons, adding that he is involved in NASA’s LuSEE-Night experiment. LuSEE-Night, or the Lunar Surface Electromagnetics Experiment, aims to land a low-frequency experiment on the Moon next year.

In July, at the National Astronomical Meeting in Durham, the University of Cambridge’s de Lera Acedo presented a proposal to put a miniature radiometer into lunar orbit. Dubbed “CosmoCube”, it will be a nanosatellite that will orbit the Moon searching for the 21 cm signal.

Illustration of a satellite with sails
Taking the hunt to space Provisional illustration of the CosmoCube with its antenna deployed for the 21 cm signal detection, i.e. in operational mode in space. This nanosatellite would travel to the far side of the Moon to get away from the Earth’s ionosphere, which introduces substantial distortions and absorption effects to any radio signal detection. (CC BY 4.0 Artuc and de Lera Acedo 2024 RAS Techniques and Instruments 4 rzae061)

“It is just in the making,” says de Lera Acedo, adding that it will not be in operation for at least a decade. “But it is the next step.”

In the meantime, groups here on Earth are in a race to detect this elusive signal. The instruments are getting more sensitive, the modelling is improving, and the unknowns are reducing. “If we do the experiments right, we will find the signal,” Monsalve believes. The big question is which of the many groups with their hats in the ring is doing the experiment “right”.

The post Cosmic dawn: the search for the primordial hydrogen signal appeared first on Physics World.

Ten-ion system brings us a step closer to large-scale qubit registers

17 November 2025 at 17:15
Photo of the members of Ben Lanyon's research group
Team effort Based at the University of Innsbruck, Ben Lanyon’s group has created a novel qubit register by trapping ten ions. (Courtesy: Victor Krutyanskiy/University of Innsbruck)

Researchers in Austria have entangled matter-based qubits with photonic qubits in a ten-ion system. The technique is scalable to larger ion-qubit registers, paving the way for the creation of larger and more complex quantum networks.

Visualization of the ten-ion quantum system
Ions in motion Each ion (large object) is moved one at a time into the “sweet spot” of the optical cavity. Once there, a laser beam drives the emission of a single photon (small object), entangled with the ion. The colours indicate ion–photon entanglement. (Courtesy: Universität Innsbruck/Harald Ritsch)

Quantum networks consist of matter-based nodes that store and process quantum information and are linked through photons (quanta of light). Already, Ben Lanyon’s group at the University of Innsbruck has made advances in this direction by entangling two ions in different systems. Now, in a new paper published in Physical Review Letters, they describe how they have developed and demonstrated a new method to entangle a string of ten ions with photons. In the future, this approach could enable the entanglement of sets of ions in different locations through light, rather than one ion at a time.

To achieve this, Lanyon and colleagues trapped a chain of ten calcium ions in a linear trap inside an optical cavity. By changing the trapping voltages, each ion was moved, one by one, into the cavity. Once inside, the ion was placed in the “sweet spot”, where the ion’s interaction with the cavity is strongest. There, the ion emitted a single photon when exposed to a 393 nm Raman laser beam. This beam was tightly focused on one ion, guaranteeing that the emitted photon – collected in a single-mode optical fibre – came from one ion at a time. This process was carried out ten times, once per ion, to obtain a train of ten photons.

Using quantum state tomography, the researchers reconstructed the density matrix, which describes the correlations between the states of ion (i) and photon (j). To do so, they measured every ion and photon state in three different bases, resulting in nine Pauli-basis measurement configurations. From the density matrix, the concurrence (a measure of entanglement) between ion (i) and photon (j) was found to be positive only when i = j, and zero otherwise. This implies that each ion is uniquely entangled with the photon it produced, and unentangled with the photons produced by the other ions.

From the density matrix, they also calculated the fidelity with the Bell state (a state of maximum entanglement), yielding an average of 92%. As Marco Canteri points out, “this fidelity characterizes the quality of entanglement between the ion-photon pair for i=j”.
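Both figures of merit – Bell-state fidelity and concurrence – follow directly from a reconstructed density matrix. A toy two-qubit calculation (not the team’s data: the noise model here is an illustrative Bell state mixed with white noise) shows how each is computed with numpy:

```python
import numpy as np

# Ideal two-qubit Bell state |Phi+> = (|00> + |11>)/sqrt(2)
phi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
bell = np.outer(phi, phi.conj())

# Toy "measured" density matrix: Bell state mixed with white noise
p = 0.9
rho = p * bell + (1 - p) * np.eye(4) / 4

# Fidelity with the Bell state: F = <Phi+| rho |Phi+>
fidelity = np.real(phi.conj() @ rho @ phi)

# Wootters concurrence: C = max(0, l1 - l2 - l3 - l4), where the l_i are the
# square roots of the eigenvalues of rho (sy x sy) rho* (sy x sy), sorted
sy = np.array([[0, -1j], [1j, 0]])
yy = np.kron(sy, sy)
rho_tilde = yy @ rho.conj() @ yy
lam = np.sort(np.sqrt(np.abs(np.linalg.eigvals(rho @ rho_tilde))))[::-1]
concurrence = max(0.0, lam[0] - lam[1] - lam[2] - lam[3])

print(f"F = {fidelity:.3f}, C = {concurrence:.3f}")  # F = 0.925, C = 0.850
```

For this noise level the fidelity is 92.5% – close to the 92% average reported – while the concurrence stays well above zero, confirming entanglement.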

This work developed and demonstrated a technique whereby matter-based qubits and photonic qubits can be entangled, one at a time, in ion strings. Now, the group aims to “demonstrate universal quantum logic within the photon-interfaced 10-ion register and, building up towards entangling two remote 10-ion processors through the exchange of photons between them,” explains team member Victor Krutyanskiy. If this method scales effectively to larger systems, more complex quantum networks could be built, with applications in quantum communication and quantum sensing.


Non-invasive wearable device measures blood flow to the brain

17 November 2025 at 10:45

Measuring blood flow to the brain is essential for diagnosing and developing treatments for neurological disorders such as stroke, vascular dementia or traumatic brain injury. Performing this measurement non-invasively is challenging, however, and achieved predominantly using costly MRI and nuclear medicine imaging techniques.

Emerging as an alternative, modalities based on optical transcranial measurement are cost-effective and easy to use. In particular, speckle contrast optical spectroscopy (SCOS) – an offshoot of laser speckle contrast imaging, which uses laser light speckles to visualize blood vessels – can measure cerebral blood flow (CBF) with high temporal resolution, typically above 30 Hz, and cerebral blood volume (CBV) through optical signal attenuation.

Researchers at the California Institute of Technology (Caltech) and the Keck School of Medicine’s USC Neurorestoration Center have designed a lightweight SCOS system that accurately measures blood flow to the brain, distinguishing it from blood flow to the scalp. Co-senior author Charles Liu of the Keck School of Medicine and team describe the system and their initial experimentation with it in APL Bioengineering.

Detection channels in a speckle contrast optical spectroscopy system
Seven simultaneous measurements Detection channels with differing source-to-detector distances monitor blood dynamics in the scalp, skull and brain layers. (Courtesy: CC BY 4.0/APL Bioeng. 10.1063/5.0263953)

The SCOS system consists of a 3D-printed head mount designed for secure placement over the temple region. It holds a single 830 nm laser illumination fibre and seven detector fibres positioned at seven different source-to-detector (S–D) distances (between 0.6 and 2.6 cm) to simultaneously capture blood flow dynamics across layers of the scalp, skull and brain. Fibres with shorter S–D distances acquire shallower optical data from the scalp, while those with greater distances obtain deeper and broader data. The seven channels are synchronized to exhibit identical oscillation frequencies corresponding to the heart rate and cardiac cycle.

When the SCOS system directs the laser light onto a sample, multiple random scattering events occur before the light exits the sample, creating speckles. These speckles, which materialize on rapid timescales, are the result of interference of light travelling along different trajectories. Movement within the sample (of red blood cells, for instance) causes dynamic changes in the speckle field. These changes are captured by a multi-million-pixel camera with a frame rate above 30 frames/s and quantified by calculating the speckle contrast value for each image.
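The speckle contrast itself is simply the standard deviation of the pixel intensities divided by their mean: motion blurs the speckle over the exposure and lowers the contrast. A minimal numpy sketch with synthetic data (assuming the textbook negative-exponential intensity statistics of fully developed speckle; the blurring model, averaging N independent patterns, is illustrative, not the Caltech/USC pipeline):

```python
import numpy as np

rng = np.random.default_rng(1)

# Static tissue: fully developed speckle has negative-exponential
# intensity statistics, for which K = sigma/mean -> 1
static = rng.exponential(scale=1.0, size=(512, 512))

# Moving scatterers (e.g. red blood cells) decorrelate the speckle within
# the exposure: model as the average of N independent speckle patterns
N = 16
moving = np.mean(rng.exponential(scale=1.0, size=(N, 512, 512)), axis=0)

def speckle_contrast(img: np.ndarray) -> float:
    """Speckle contrast K = std(I) / mean(I) over a frame."""
    return img.std() / img.mean()

print(f"static K ~ {speckle_contrast(static):.2f}")  # near 1
print(f"moving K ~ {speckle_contrast(moving):.2f}")  # near 1/sqrt(N) = 0.25
```

Faster flow averages more independent speckle realizations per exposure, so lower contrast maps onto higher blood flow.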

Human testing

The researchers used the SCOS system to perform CBF and CBV measurements in 20 healthy volunteers. To isolate and obtain surface blood dynamics from brain signals, the researchers gently pressed on the superficial temporal artery (a terminal branch of the external carotid artery that supplies blood to the face and scalp) to block blood flow to the scalp.

In tests on the volunteers, when temporal artery blood flow was occluded for 8 s, scalp-sensitive channels exhibited significant decreases in blood flow while brain-sensitive channels showed minimal change, enabling signals from the internal carotid artery that supplies blood to the brain to be clearly distinguished. Additionally, the team found that positioning the detector 2.3 cm or more away from the source allowed for optimal brain blood flow measurement while minimizing interference from the scalp.

“Combined with the simultaneous measurements at seven S–D separations, this approach enables the first quantitative experimental assessment of how scalp and brain signal contributions vary with depth in SCOS-based CBF measurements and, more broadly, in optical measurements,” they write. “This work also provides crucial insights into the optimal device S–D distance configuration for preferentially probing brain signal over scalp signal, with a practical and subject-friendly alternative for evaluating depth sensitivity, and complements more advanced, hardware-intensive strategies such as time-domain gating.”

The researchers are now working to improve the signal-to-noise ratio of the system. They plan to introduce a compact, portable laser and develop a custom-designed extended camera that spans over 3 cm in one dimension, enabling simultaneous and continuous measurement of blood dynamics across S–D distances from 0.5 to 3.5 cm. These design advancements will enhance spatial resolution and enable deeper brain measurements.

“This crucial step will help transition the system into a compact, wearable form suitable for clinical use,” comments Liu. “Importantly, the measurements described in this publication were achieved in human subjects in a very similar manner to how the final device will be used, greatly reducing barriers to clinical application.”

“I believe this study will advance the engineering of SCOS systems and bring us closer to a wearable, clinically practical device for monitoring brain blood flow,” adds co-author Simon Mahler, now at Stevens Institute of Technology. “I am particularly excited about the next stage of this project: developing a wearable SCOS system that can simultaneously measure both scalp and brain blood flow, which will unlock many fascinating new experiments.”

