Achieving a profound understanding of any subject is hard. When that subject is quantum mechanics, it’s even harder. And when one departs from ideal theoretical scenarios and enters the real world of experimental limitations, it becomes more challenging still – yet that is what physicists at the Freie Universität Berlin (FU-Berlin), Germany, recently did by exploring what happens to entanglement theory in real quantum computers. In doing so, they created a bridge between two fields that have so far largely developed in parallel: entanglement theory (rooted in physics) and computational complexity (rooted in computer science).
Ebits, the standard currency of entanglement
In quantum mechanics, a composite system is said to be entangled when its total wavefunction cannot be written as a product of the states of its individual subsystems. This leads to correlations between subsystems that arise from the structure of the quantum state, not from any shared classical information. Many speed-ups achieved in quantum computing, quantum cryptography and quantum metrology rely heavily on entanglement, but not every form of entanglement is equally useful. Only specific kinds of entanglement will enable a given computational or communication task.
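For the simplest case, this definition can even be checked numerically. The short Python sketch below (my own illustration, not taken from the study) uses the Schmidt rank – the number of non-zero singular values of the amplitude matrix – to distinguish a product state of two qubits from a maximally entangled Bell state:

```python
# Schmidt-rank check for a two-qubit pure state (illustrative sketch).
import numpy as np

def schmidt_rank(state):
    # Reshape the 4 amplitudes into a 2x2 matrix and count its
    # non-zero singular values: 1 means product state, 2 means entangled.
    s = np.linalg.svd(state.reshape(2, 2), compute_uv=False)
    return int(np.sum(s > 1e-12))

product = np.kron([1, 0], [1, 1]) / np.sqrt(2)   # |0>(|0>+|1>)/sqrt(2)
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)       # (|00>+|11>)/sqrt(2)
print(schmidt_rank(product))  # 1 -> separable
print(schmidt_rank(bell))     # 2 -> entangled (one ebit)
```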
To make quantum technologies practical, the available entangled resources must therefore often be converted into forms suitable for specific applications. One major conversion process involves transforming partially entangled states into, or extracting them from, the maximally entangled bit (ebit) that acts as the standard unit of entanglement. High-fidelity ebits – entangled pairs that are extremely close to the ideal perfectly entangled state – can be distilled from noisy or imperfect entangled states through entanglement distillation, while entanglement dilution allows one to reconstruct the desired entangled states from purified ebits.
In an idealized setting, with an infinite number of copies of entangled states and unlimited computational power, a single quantity called the von Neumann entropy fully determines how many ebits can be extracted or are required. But reality is far less forgiving: just as there is no infinite supply of gold on Earth, we never have infinite copies of a state, and computational power is always limited.
Entanglement under finite resources
In the present work, which is published in Nature Physics, the FU-Berlin team of Lorenzo Leone, Jacopo Rizzo, Jens Eisert and Sofiene Jerbi asked what happens when these ideal assumptions break down. They studied the case where only a finite number of entangled states is available – scaling at most polynomially with the number of quantum bits (qubits) in the system – and all local operations and classical communication (LOCC) must be performed in polynomial time.
They found that the simple correspondence between von Neumann entropy and extractable or required ebits no longer holds: even when a state has a large von Neumann entropy, the number of ebits that can be efficiently extracted may be much lower. In these cases, the number is bounded instead by the min-entropy of the reduced state – an operational measure determined solely by the state’s largest eigenvalue, which captures how much entanglement can be reliably distilled from a single copy of the state, without averaging over many uses. Conversely, even a state with negligible von Neumann entropy may require a maximal ebit budget for efficient dilution.
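The gap between the two measures is easy to see numerically. In this toy Python example (my own sketch, not code from the paper), a reduced state with one dominant eigenvalue and a long, flat tail of tiny eigenvalues has a large von Neumann entropy but a min-entropy of just one ebit:

```python
import numpy as np

def von_neumann_entropy(p):
    # S = -sum_i p_i log2 p_i over the reduced state's eigenvalues
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def min_entropy(p):
    # S_min = -log2(max_i p_i), set by the largest eigenvalue alone
    return -np.log2(np.max(p))

n = 12                                  # qubits on one side of the bipartition
p = np.full(2**n, 0.5 / (2**n - 1))     # flat tail of tiny eigenvalues
p[0] = 0.5                              # one dominant eigenvalue
print(f"von Neumann entropy: {von_neumann_entropy(p):5.2f} ebits (ideal asymptotic rate)")
print(f"min-entropy:         {min_entropy(p):5.2f} ebits (one-shot bound)")
```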
Leone and Eisert say they were inspired to perform this study by recent work on so-called pseudo-entangled states – states that look a lot more entangled than they are to computationally bounded observers. The construction of pseudo-entangled states highlights a dramatic worst-case scenario: a state that appears almost unentangled by conventional measures may still require a large number of ebits to create efficiently. The takeaway is that computational efficiency matters: quantum resources you might have thought were available may be, in effect, locked away simply because they cannot be processed efficiently. In other words, practical limitations make the line between a “resource” and a “usable resource” even sharper.
Quantum resources in a limited world
The researchers say that their study raises multiple questions for future exploration. One such question concerns whether a similar computational‐efficiency gap exists for other quantum resources such as magic and coherence. Another is whether one can build a full resource theory with complexity constraints, where quantities reflect not just what can be converted, but how efficient that conversion is.
Regardless of the answers, the era of entanglement under infinite book‐keeping is giving way to an era of entanglement under limited books, limited clocks and limited gates. And in this more realistic space, quantum technologies may still shine, but the calculus of what can be done and what can be harnessed needs a serious retooling.
CTV segmentation test Target contouring in two example slices of a patient’s CT scan, using BCTVNet and 12 comparison models. Red and green contours represent the ground truth and the model predictions, respectively. Each image is annotated with the corresponding Dice similarity coefficient. (Courtesy: CC BY 4.0/Mach. Learn.: Sci. Technol. 10.1088/2632-2153/ae2233)
Brachytherapy – a cancer treatment that destroys tumours using small radioactive sources implanted inside the body – plays a critical role in treating cervical cancer, offering an important option for patients with inoperable locally advanced disease. Brachytherapy can deliver high radiation doses directly to the tumour while ensuring nearby healthy tissues receive minimal dose; but its effectiveness relies on accurate delineation of the treatment target. A research team in China is using a hybrid deep-learning model to help with this task.
Planning brachytherapy treatments requires accurate contouring of the clinical target volume (CTV) on a CT scan, a task that’s traditionally performed manually. The limited soft-tissue contrast of CT, however, can result in unclear target boundaries, while applicator or needle insertion (used to deliver the radioactive sources) can deform and displace nearby organs. This makes manual contouring a time-consuming and subjective task that requires a high level of operator expertise.
Automating this process could reduce reliance on operator experience, increase workflow efficiency and improve contouring consistency. With this aim, the research team – headed up by He Ma from Northeastern University and Lin Zhang from Shanghai University of International Business and Economics – developed a 3D hybrid neural network called BCTVNet.
Currently, most brachytherapy segmentation models are based on convolutional neural networks (CNNs). CNNs effectively capture local structural features and can model fine anatomical details but struggle with long-range dependencies, which can cause problems if the target extends across multiple CT slices. Another option is to use transformer-based models that can integrate spatial information across distant regions and slices; but these are less effective at capturing fine-grained local detail.
To combine the strengths of both, BCTVNet integrates CNN and transformer branches, providing strong local detail extraction alongside global information integration. BCTVNet performs 3D segmentation directly on post-insertion CT images, enabling the CTV to be defined based on the actual treatment geometry.
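The paper’s architecture is considerably more elaborate, but the dual-branch idea can be sketched in a few lines of PyTorch. The toy model below (a hypothetical simplification – all layer sizes are illustrative, not BCTVNet’s actual design) runs a CNN branch and a transformer branch over the same 3D volume and fuses their feature maps into a per-voxel segmentation mask:

```python
# Toy dual-branch 3D segmentation model: CNN for local detail,
# transformer for long-range context (all sizes illustrative).
import torch
import torch.nn as nn

class DualBranchSeg3D(nn.Module):
    def __init__(self, in_ch=1, feat=16, patch=4):
        super().__init__()
        # CNN branch: preserves full resolution, captures fine local features
        self.cnn = nn.Sequential(
            nn.Conv3d(in_ch, feat, 3, padding=1), nn.ReLU(),
            nn.Conv3d(feat, feat, 3, padding=1), nn.ReLU(),
        )
        # Transformer branch: coarse patch tokens carry global context
        self.embed = nn.Conv3d(in_ch, feat, kernel_size=patch, stride=patch)
        layer = nn.TransformerEncoderLayer(d_model=feat, nhead=4, batch_first=True)
        self.transformer = nn.TransformerEncoder(layer, num_layers=2)
        self.up = nn.Upsample(scale_factor=patch, mode="trilinear", align_corners=False)
        # Fusion head: concatenated features -> per-voxel CTV probability
        self.head = nn.Conv3d(2 * feat, 1, 1)

    def forward(self, x):
        local = self.cnn(x)                     # (B, F, D, H, W)
        tok = self.embed(x)                     # (B, F, D/p, H/p, W/p)
        b, c, d, h, w = tok.shape
        seq = self.transformer(tok.flatten(2).transpose(1, 2))  # (B, N, F) tokens
        glob = self.up(seq.transpose(1, 2).reshape(b, c, d, h, w))
        return torch.sigmoid(self.head(torch.cat([local, glob], dim=1)))

model = DualBranchSeg3D()
print(model(torch.randn(1, 1, 32, 64, 64)).shape)  # -> (1, 1, 32, 64, 64)
```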
Model comparisons
Zhang, Ma and colleagues assessed the performance of BCTVNet using a private CT dataset from 95 patients diagnosed with locally advanced cervical cancer and treated with CT-guided 3D brachytherapy (76 in the training set, 19 in the test set). The scans had an average of 96 slices per patient and a slice thickness of 3 mm.
CT scans used to plan cervical cancer brachytherapy often exhibit unclear target boundaries. To enhance the local soft-tissue contrast and improve boundary recognition, the researchers pre-processed the CT volumes with a 3D version of the CLAHE (contrast-limited adaptive histogram equalization) algorithm, which processes the entire CT scan as a volumetric input. They then normalized the intensity values to standardize the input for the segmentation models.
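As a rough illustration of this pipeline (an assumed implementation, not the authors’ code), recent versions of scikit-image accept volumetric input to their CLAHE routine, so the pre-processing might look something like this, with the Hounsfield-unit window chosen arbitrarily here:

```python
# Assumed pre-processing sketch: soft-tissue windowing, volumetric CLAHE,
# then intensity standardization (hu_window is an arbitrary choice here).
import numpy as np
from skimage import exposure

def preprocess_ct(volume_hu, hu_window=(-200.0, 300.0)):
    lo, hi = hu_window
    vol = np.clip(volume_hu, lo, hi)
    vol = (vol - lo) / (hi - lo)                   # rescale to [0, 1] for CLAHE
    vol = exposure.equalize_adapthist(vol, clip_limit=0.01)  # volumetric CLAHE
    return (vol - vol.mean()) / (vol.std() + 1e-8) # standardize for the model

toy_scan = np.random.uniform(-1000, 1000, size=(16, 64, 64))  # toy CT volume
print(preprocess_ct(toy_scan).shape)               # (16, 64, 64)
```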
The researchers compared BCTVNet with 12 popular CNN- and transformer-based segmentation models, evaluating segmentation performance via a series of metrics, including Dice similarity coefficient (DSC), Jaccard index, Hausdorff distance 95th percentile (HD95) and average surface distance.
Contours generated by BCTVNet were closest to the ground truth, reaching a DSC of 83.24% and an HD95 (the maximum distance from the ground truth after excluding the worst 5%) of 3.53 mm. BCTVNet consistently outperformed the other models across all evaluation metrics. It also demonstrated strong classification accuracy, with a precision of 82.10% and a recall of 85.84%, implying fewer false detections and successful capture of target regions.
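For readers who want to reproduce such comparisons, the two headline metrics can be computed from binary masks as follows (a minimal sketch of the standard definitions, not the team’s own evaluation code):

```python
# Standard definitions of DSC and HD95 for binary 3D masks (sketch).
import numpy as np
from scipy import ndimage

def dice(pred, gt):
    # DSC = 2|P ∩ G| / (|P| + |G|)
    return 2.0 * np.logical_and(pred, gt).sum() / (pred.sum() + gt.sum())

def hd95(pred, gt, spacing=(3.0, 1.0, 1.0)):
    # 95th percentile of symmetric surface distances, in mm
    def surface(mask):
        return np.logical_and(mask, ~ndimage.binary_erosion(mask))
    # Distance from every voxel to the other mask's surface (surface = zeros)
    d_to_gt = ndimage.distance_transform_edt(~surface(gt), sampling=spacing)
    d_to_pred = ndimage.distance_transform_edt(~surface(pred), sampling=spacing)
    dists = np.concatenate([d_to_gt[surface(pred)], d_to_pred[surface(gt)]])
    return np.percentile(dists, 95)

gt = np.zeros((16, 64, 64), bool); gt[4:12, 20:44, 20:44] = True
pred = np.roll(gt, 2, axis=1)       # a slightly shifted toy prediction
print(f"DSC = {dice(pred, gt):.3f}, HD95 = {hd95(pred, gt):.2f} mm")
```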
To evaluate the model’s generalizability, the team conducted additional experiments on the public dataset SegTHOR, which contains 60 thoracic 3D CT scans (40 for training, 20 for testing) from patients with oesophageal cancer. Here again, BCTVNet achieved the best scores among all the segmentation models, with the highest average DSC of 87.09% and the lowest average HD95 of 7.39 mm.
“BCTVNet effectively overcomes key challenges in CTV segmentation and achieves superior performance compared to existing methods,” the team concludes. “The proposed approach provides an effective and reliable solution for automatic CTV delineation and can serve as a supportive tool in clinical workflows.”
Under pressure A researcher in Beijing operates an apparatus used to make 2D metals. (Courtesy: CAS IOP/Handout via Xinhua)
The Physics World 2025 Breakthrough of the Year is awarded to Guangyu Zhang, Luojun Du and colleagues at the Institute of Physics of the Chinese Academy of Sciences for producing the first 2D sheets of metal. The team produced five atomically thin 2D metals – bismuth, tin, lead, indium and gallium – with the thinnest being around 6.3 Å. The researchers say their work is just the “tip of the iceberg” and now aim to use their new materials to probe the fundamentals of physics. Their breakthrough could also lead to the development of new technologies.
Since the discovery of graphene – a sheet of carbon just one atom thick – in 2004, hundreds of other 2D materials have been fabricated and studied. In most of these, layers of covalently bonded atoms are separated by gaps where neighbouring layers are held together only by weak van der Waals (vdW) interactions, making it relatively easy to “shave off” single layers to make 2D sheets. Many thought that making atomically thin metals would be impossible given that each atom in a metal is strongly bonded to surrounding atoms in all directions.
The technique developed by Zhang, Du and colleagues involves heating powders of pure metals between two monolayer-MoS2/sapphire vdW anvils. Once the metal powder had melted into a droplet, the researchers applied a pressure of 200 MPa and continued this “vdW squeezing” until the anvils cooled to room temperature and 2D sheets of metal were formed.
“Right now, we have reported five single element metals, but actually we can do more because of the 88 metals in the periodic table,” Zhang explains in today’s episode of the Physics World Weekly podcast. In the podcast, he also talks about the team’s motivation for creating 2D metals and some of the possible technological applications of the materials.
The Breakthrough of the Year was chosen by the Physics World editorial team. We looked back at all the scientific discoveries we have reported on since 1 January and picked the most important. In addition to being reported in Physics World in 2025, the breakthrough must meet the following criteria:
Significant advance in knowledge or understanding
Importance of work for scientific progress and/or development of real-world applications
Of general interest to Physics World readers
Before we picked our winners, we released the Physics World Top 10 Breakthroughs for 2025, which served as our shortlist. The other nine breakthroughs are listed below in no particular order.
Analysing returned samples Tim McCoy (right), curator of meteorites at the Smithsonian’s National Museum of Natural History, and research geologist Cari Corrigan examine scanning electron microscope (SEM) images of a Bennu sample. (Courtesy: James Di Loreto, Smithsonian)
To Tim McCoy, Sara Russell, Danny Glavin, Jason Dworkin, Yoshihiro Furukawa, Ann Nguyen, Scott Sandford, Zack Gainsforth and an international team of collaborators for identifying salt, ammonia, sugar, nitrogen- and oxygen-rich organic materials, and traces of metal-rich supernova dust, in samples returned from the near-Earth asteroid 101955 Bennu. The incredible chemical richness of this asteroid, which NASA’s OSIRIS-REx spacecraft sampled in 2020, lends support to the longstanding hypothesis that asteroid impacts could have “seeded” the early Earth with the raw ingredients needed for life to form. The discoveries also enhance our understanding of how Bennu and other objects in the solar system formed out of the disc of material that coalesced around the young Sun.
To Takamasa Momose of the University of British Columbia, Canada, and Susumu Kuma of the RIKEN Atomic, Molecular and Optical Physics Laboratory, Japan, for observing superfluidity in a molecule for the first time. Molecular hydrogen is the simplest and lightest of all molecules, and theorists predicted that it would enter a superfluid state at a temperature between 1 and 2 K. But this is well below the molecule’s freezing point of 13.8 K, so Momose, Kuma and colleagues first had to develop a way to keep the hydrogen in a liquid state. Once they did that, they then had to work out how to detect the onset of superfluidity. It took them nearly 20 years, but by confining clusters of hydrogen molecules inside helium nanodroplets, embedding a methane molecule within the clusters, and monitoring the methane’s rotation, they were finally able to do it. They now plan to study larger clusters of hydrogen, with the aim of exploring the boundary between classical and quantum behaviour in this system.
To researchers at the University of Southampton and Microsoft Azure Fiber in the UK, for developing a new type of optical fibre that reduces signal loss, boosts bandwidth and promises faster, greener communications. The team, led by Francesco Poletti, achieved this feat by replacing the glass core of a conventional fibre with air and using glass membranes that reflect light at certain frequencies back into the core to trap the light and keep it moving through the fibre’s hollow centre. Their results show that the hollow-core fibres exhibit 35% less attenuation than standard glass fibres – implying that fewer amplifiers would be needed in long cables – and increase transmission speeds by 45%. Microsoft has begun testing the new fibres in real systems, installing segments in its network and sending live traffic through them. These trials open the door to gradual rollout and Poletti suggests that the hollow-core fibres could one day replace existing undersea cables.
PAT pioneers The research team in the proton therapy gantry room. (Courtesy: UO Fisica Sanitaria and UO Protonterapia, APSS, Trento)
To Francesco Fracchiolla and colleagues at the Trento Proton Therapy Centre in Italy for delivering the first clinical treatments using proton arc therapy (PAT). Proton therapy – a precision cancer treatment – is usually performed using pencil-beam scanning to paint the dose onto the tumour. But this approach can be limited by the small number of beam directions deliverable in an acceptable treatment time. PAT overcomes this by moving to an arc trajectory, with protons delivered over a large number of beam angles and the potential to optimize the number of energies used for each beam direction. Working with researchers at RaySearch Laboratories in Sweden, the team performed successful dosimetric comparisons with clinical proton therapy plans. Following a feasibility test that confirmed the viability of clinical PAT delivery, the researchers used PAT to treat nine cancer patients. Importantly, all treatments were performed using the centre’s existing proton therapy system and clinical workflow.
To Peter Maurer and David Awschalom at the University of Chicago Pritzker School of Molecular Engineering and colleagues for designing a protein quantum bit (qubit) that can be produced directly inside living cells and used as a magnetic field sensor. While many of today’s quantum sensors are based on nitrogen–vacancy (NV) centres in diamond, they are large and hard to position inside living cells. Instead, the team used fluorescent proteins, which are just 3 nm in diameter and can be produced by cells at a desired location with atomic precision. These proteins possess similar optical and spin properties to those of NV centre-based qubits – namely that they have a metastable triplet state. The researchers used a near-infrared laser pulse to optically address a yellow fluorescent protein and read out its triplet spin state with up to 20% spin contrast. They then genetically modified the protein to be expressed in bacterial cells and measured signals with a contrast of up to 8%. They note that although this performance does not match that of NV quantum sensors, it could enable magnetic resonance measurements directly inside living cells, which NV centres cannot do.
To the team led by Yichao Zhang at the University of Maryland and Pinshane Huang of the University of Illinois at Urbana-Champaign for capturing the highest-resolution images ever taken of individual atoms in a material. The team used an electron-microscopy technique called electron ptychography to achieve a resolution of 15 pm, which is about 10 times smaller than the size of an atom. They studied a stack of two atomically-thin layers of tungsten diselenide, which were rotated relative to each other to create a moiré superlattice. These twisted 2D materials are of great interest to physicists because their electronic properties can change dramatically with small changes in rotation angle. The extraordinary resolution of their microscope allowed them to visualize collective vibrations in the material called moiré phasons. These are similar to phonons, but had never been observed directly until now. The team’s observations align with theoretical predictions for moiré phasons. Their microscopy technique should boost our understanding of the role that moiré phasons and other lattice vibrations play in the physics of solids. This could lead to the engineering of new and useful materials.
Exquisite control Physicist Barbara Latacz at the BASE experiment at CERN. (Courtesy: CERN)
To CERN’s BASE collaboration for being the first to perform coherent spin spectroscopy on a single antiproton – the antimatter counterpart of the proton. Their breakthrough is the most precise measurement yet of the antiproton’s magnetic properties, and could be used to test the Standard Model of particle physics. The experiment begins with the creation of high-energy antiprotons in an accelerator. These must be cooled (slowed down) to cryogenic temperatures without being lost to annihilation. Then, a single antiproton is held in an ultracold electromagnetic trap, where microwave pulses manipulate its spin state. The resulting resonance peak was 16 times narrower than previous measurements, enabling a significant leap in precision. This level of quantum control opens the door to highly sensitive comparisons of the properties of matter (protons) and antimatter (antiprotons). Unexpected differences could point to new physics beyond the Standard Model and may also reveal why there is much more matter than antimatter in the visible universe.
To Richard Allen, director of the Berkeley Seismological Laboratory at the University of California, Berkeley, and Google’s Marc Stogaitis and colleagues for creating a global network of Android smartphones that acts as an earthquake early-warning system. Traditional early-warning systems use networks of seismic sensors that rapidly detect earthquakes in areas close to the epicentre and issue warnings across the affected region. Building such seismic networks, however, is expensive, and many earthquake-prone regions do not have them. The researchers utilized the accelerometers in millions of phones across 98 countries to create the Android Earthquake Alert (AEA) system. Testing the system between 2021 and 2024 led to the detection of an average of 312 earthquakes a month, with magnitudes ranging from 1.9 to 7.8. For earthquakes of magnitude 4.5 or higher, the system sent “TakeAction” alerts to users – on average 60 times per month, reaching an average of 18 million individual alerts per month. It also delivered lesser “BeAware” alerts to regions expected to experience a shaking intensity of 3 or 4. The team now aims to produce maps of ground shaking, which could assist the emergency response services following an earthquake.
To Lisa Nortmann at Germany’s University of Göttingen and colleagues for creating the first detailed “weather map” of an exoplanet. The forecast for exoplanet WASP-127b is brutal, with winds reaching 33,000 km/h – much faster than winds found anywhere in the Solar System. WASP-127b is a gas giant located about 520 light-years from Earth, and the team used the CRIRES+ instrument on the European Southern Observatory’s Very Large Telescope to observe the exoplanet as it transited across its star in less than 7 h. Spectral analysis of the starlight that filtered through WASP-127b’s atmosphere revealed Doppler shifts caused by supersonic equatorial winds. By analysing the range of Doppler shifts, the team created a rough weather map of WASP-127b, even though they could not resolve light coming from specific locations on the exoplanet. Nortmann and colleagues concluded that the exoplanet’s poles are cooler than the rest of WASP-127b, where temperatures can exceed 1000 °C. Water vapour was detected in the atmosphere, raising the possibility of exotic forms of rain.
Physics World‘s coverage of the Breakthrough of the Year is supported by Reports on Progress in Physics, which offers unparalleled visibility for your ground-breaking research.
This episode of the Physics World Weekly podcast features Guangyu Zhang. Along with his colleagues at the Institute of Physics of the Chinese Academy of Sciences, Zhang has bagged the 2025 Physics World Breakthrough of the Year award for creating the first 2D metals.
In a wide-ranging conversation, we chat about the motivation behind the team’s research; the challenges in making 2D metals and how these were overcome; and how 2D metals could be used to boost our understanding of condensed matter physics and create new technologies.
I am also joined by my Physics World colleague Matin Durrani to talk about some of the exciting physics that we will be showcasing in 2026.
Julia Sutcliffe (second from the left), Chief Scientific Advisor for the UK’s Department for Business and Trade, visits the NQCC’s experimental facilities on the Harwell Cluster (Courtesy: NQCC)
Ever since the National Quantum Computing Centre (NQCC) was launched five years ago, its core mission has been to accelerate the pathway towards practical adoption of the technology. That has required technical innovation to scale up hardware platforms and create the software tools and algorithms needed to tackle real-world applications, but there has also been a strong focus on engaging with companies to build connections, provide access to quantum resources, and identify opportunities for deriving near-term value from quantum computing.
It makes sense, then, that the NQCC should form the cornerstone of a new Quantum Cluster at the Harwell Campus of Science and Innovation in Oxfordshire. The hope is that the NQCC’s technical expertise and infrastructure, combined with the services and facilities available on the wider Harwell Campus, will provide a magnet for new quantum start-ups as well as overseas companies that are seeking to establish a presence within the UK’s quantum ecosystem.
“We want to leverage the public investment that has been made into the NQCC to catalyse business growth and attract more investment into the UK’s quantum sector,” said Najwa Sidqi, manager of the Harwell Quantum Cluster, at the official launch event in November. “By accelerating collaboration across government, industry and academia, we will turn research excellence into industrial strength.”
The cluster, which has been ramping up its activities over the last year, is working to ambitious targets. Over the next decade the aim is to incubate at least 100 quantum companies on the Harwell site, create more than 1000 skilled jobs, and generate more than £1 billion of private and public investment. “Our aim is to build the foundations of a globally competitive quantum economy that delivers impact far beyond science and research,” added Sidqi.
Tangible evidence that the approach works is offered by the previous clustering activities on the Harwell Campus, notably the Space Cluster that has expanded rapidly since its launch in 2010. Anchored by the RAL Space national laboratory and bolstered by the presence of ESA and the UK Space Agency, the Space Cluster now comprises more than 100 organizations that range from small start-ups to the UK technology hubs of global heavyweights such as Airbus and Lockheed Martin.
More generally, the survival rate of start-up companies operating on the Harwell site is around 95%, compared with an average of around 50%. “At Harwell there is a high density of innovators sharing the same space, which generates more connections and more ideas,” said Julia Sutcliffe, Chief Scientific Advisor for the UK’s Department for Business and Trade. “It provides an incredible combination of world-class infrastructure and expertise, accelerating the innovation pathway and helping to create a low-risk environment for early-stage businesses and investors.”
The NQCC has already seeded that innovation activity through its early engagement with both quantum companies and end users of the technology. One major initiative has been the testbed programme, which has enabled the NQCC to invest £30m in seven hardware companies to deploy prototype quantum computers on the Harwell Campus. As well as providing access to operational systems based on all of the leading qubit modalities, the testbed programme has also provided an impetus for inward investment and job creation.
One clear example is provided by QuEra Computing, a US-based spin-off from Harvard University and the Massachusetts Institute of Technology that is developing a hardware platform based on neutral atoms. QuEra was one of the companies to win funding through the testbed programme, with the firm setting up a UK-based team to deploy its prototype system on the Harwell Campus. But the company could soon see the benefits of establishing a UK centre for technology development on the site. “Harwell is immensely helpful to us,” said Ed Durking, Corporate Director of QuEra Computing UK. “It’s a nucleus where we enjoy access to world-class talent, vendors, customers, and suppliers.”
On a practical level, establishing its UK headquarters on the Harwell Campus has provided QuEra with easy access to specialist contractors and services for fitting out its laboratories. In June the company moved into a building that is fully equipped with flexible lab space for R&D and manufacturing, and since then the UK-based team has started to build the company’s most powerful quantum computer at the facility. Longer term, establishing a base within the UK could open the door to new collaborations and funding opportunities for QuEra to further develop its technology, with the company now focused on integrating full error correction into its neutral-atom platform by 2026.
Access to the world-class infrastructure on the Harwell Campus has benefitted the other testbed providers in different ways. For ORCA Computing, a UK company developing and manufacturing photonic quantum computers, the goal was to install a testbed system within Harwell’s high-performance computing centre rather than the NQCC’s experimental labs. “Our focus is to build commercial photonic quantum systems that can be integrated into conventional datacentres, enabling hybrid quantum-classical workflows for real-world applications,” explained Geoff Barnes, Head of Customer Success at ORCA. “Having the NQCC as an expert customer enabled us to demonstrate and validate our capabilities, building the system in our own facility and then deploying it within an operational environment.”
This process provided a valuable learning experience for the ORCA engineers. The experts at Harwell helped them to navigate the constraints of installing equipment within a live datacentre, while also providing practical assistance with the networking infrastructure. Now that the system is up and running, the Harwell site also provides ORCA with an open environment for showcasing its technology to prospective customers. “As well as delivering a testbed system to the NQCC, we can now demonstrate our platform to clients within a real-world setting,” added Barnes. “It has also been a critical step toward commercial deployment on our roadmap, enabling our partners to access our systems remotely for applications development.”
Michael Cuthbert (left), Director of the NQCC, takes Sutcliffe and other visitors on a tour of the national lab (Courtesy: NQCC)
While the NQCC has already played a vital role in supporting companies as they make the transition towards commercialization, the Quantum Cluster has a wider remit to extend those efforts into other quantum technologies, such as sensing and communications, that are already finding real-world applications. It will also have a more specific focus on attracting new investment into the UK, and supporting the growth of companies that are transitioning from the start-up phase to establish larger scale commercial operations.
“In the UK we have traditionally been successful in creating spin-off activities from our strong research base, but it has been more challenging to generate the large capital investments needed to scale businesses in the technology sector,” commented Sidqi. “We want to strengthen that pipeline to ensure that the UK can translate its leadership in quantum research and early-stage innovation into long-term prosperity.”
To accelerate that process the Quantum Cluster announced a strategic partnership with Quantum Exponential, the first UK-based venture capital fund to be entirely focused on quantum technologies. Ian Pearson, the non-executive chairman of Quantum Exponential, explained that the company is working to generate an investment fund of £100m by the end of 2027 that will support quantum companies as they commercialize their technologies and scale up their businesses. “Now is the time for investment into the quantum sector,” said Pearson. “A specialist quantum fund with the expertise needed to analyse and price deals, and to do all the necessary due diligence, will attract more private investment that will help UK companies to grow and scale.”
Around two-thirds of the investments will be directed towards UK-based companies, and as part of the partnership Quantum Exponential will work with the Quantum Cluster to identify and support high-potential quantum businesses within the Harwell Campus. The Quantum Cluster will also play a crucial role in boosting investor confidence – particularly in the unique ability of the Harwell Campus to nurture successful technology businesses – and making connections with international innovation networks to provide UK-based companies with improved access to global markets.
“This new cluster strengthens our national capability and sends a clear signal to global investors that the UK is the place to develop and scale quantum technologies,” commented Michael Cuthbert, Director of the NQCC. “It will help to ensure that quantum innovation delivers benefits not just for science and industry, but for the economy and society as a whole.”
An aerogel material that is more than 99% transparent to light and is an excellent thermal insulator has been developed by Ivan Smalyukh and colleagues at the University of Colorado Boulder in the US. Called MOCHI, the material can be manufactured in large slabs and could herald a major advance in energy-efficient windows.
While the insulating properties of building materials have steadily improved over the past decades, windows have consistently lagged behind. The problem is that current materials used in windows – mostly glass – have an inherent trade-off between insulating ability and optical transparency. This is addressed to some extent by using two or three layers of glass in double- and triple-glazed windows. However, windows remain the largest source of heat loss from most buildings.
A solution to the window problem could lie with aerogels in which the liquid component of a regular gel is replaced with air. This creates solid materials with networks of pores that make aerogels the lightest solid materials ever produced. If the solid component is a poor conductor of heat, then the aerogel will be an extremely good thermal insulator.
“Conventional aerogels, like the silica- and cellulose-based ones, are common candidates for transparent, thermally insulating materials,” Smalyukh explains. “However, their visible-range optical transparency is intrinsically limited by the scattering induced by their polydisperse pores – which can range from nanometres to micrometres in scale.”
Hazy appearance
While this problem can be overcome fairly easily in thin aerogel films, creating appropriately sized pores on the scale of practical windows has so far proven much more difficult, leading to a hazy, translucent appearance.
Now, Smalyukh’s team has developed a new fabrication technique involving a removable template. Their approach hinges on the tendency of surfactant molecules called CPCL to self-assemble in water. Under carefully controlled conditions, the molecules spontaneously form networks of cylindrical tubes, called micelles. Once assembled, the aerogel precursor – a silicone material called polysiloxane – condenses around the micelles, freezing their structure in place.
“The ensuing networks of micelle-templated polysiloxane tubes could be then preserved upon the removal of surfactant, and replacing the fluid solvent with air,” Smalyukh describes. The end result was a consistent mesoporous structure, with pores ranging from 2 to 50 nm in diameter. This is too small to scatter visible light, but large enough to interfere with heat transport.
As a result, the mesoporous, optically clear heat insulator (MOCHI) maintains its transparency even when fabricated in slabs over 3 cm thick and a square metre in area. This suggests that it could be used to create practical windows.
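A back-of-envelope estimate shows why pores this small stay invisible. In the Rayleigh regime the scattering efficiency grows as the fourth power of the size parameter x = πd/λ, so the sketch below (my own illustration, not the team’s analysis) compares MOCHI’s largest and smallest pores at a wavelength of 550 nm. Micrometre-scale pores, by contrast, have x ≳ 1, leave the Rayleigh regime and scatter strongly, producing the haze described above:

```python
# Rayleigh-regime scattering comparison for MOCHI's pore sizes (rough sketch).
import math

wavelength = 550e-9              # green light, in metres
for d in (50e-9, 2e-9):          # largest and smallest pore diameters
    x = math.pi * d / wavelength # size parameter
    print(f"d = {d*1e9:5.1f} nm   x = {x:.3f}   relative scattering ~ x^4 = {x**4:.1e}")
```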
High thermal performance
“We demonstrated thermal conductivity lower than that of still air, as well as an average light transmission above 99%,” Smalyukh says. “Therefore, MOCHI glass units can provide a similar rate of heat transfer to high-performing building roofs and walls, with thicknesses comparable to double pane windows.”
If rolled out on commercial scales, this could lead to entirely new ways to manage interior heating and cooling. According to the team’s calculations, a building retrofitted with MOCHI windows could boost its energy efficiency from around 6% (a typical value in current buildings) to over 30%, while reducing the heat energy passing through by around 50%.
With its ability to admit light while blocking heat transport, the researchers suggest that MOCHI could unlock entirely new functionalities for conventional windows. “Such transparent insulation also allows for efficient harnessing of thermal energy from unconcentrated solar radiation in different climate zones, promising the use of parts of opaque building envelopes as solar thermal energy generating panels,” Smalyukh adds.
Errors are the bugbear of quantum computing, and they’re hard to avoid. While quantum computers derive their computational clout from the fact that their qubits can simultaneously take on multiple values, the fragility of qubit states ramps up their error rates. Many research groups are therefore seeking to reduce or manage errors so they can increase the number of qubits without reducing the whole enterprise to gibberish.
A team at the US-based firm Atom Computing is now reporting substantial success in this area thanks to a multi-part strategy for keeping large numbers of qubits operational in quantum processors based on neutral atoms. “These capabilities allow for the execution of more complex, longer circuits that are not possible without them,” says Matt Norcia, one of the Atom Computing researchers behind this work.
While neutral atoms offer several advantages over other qubit types, they traditionally have significant drawbacks for one of the most common approaches to error correction. In this approach, some of the entangled qubits are set aside as so-called “ancillaries”, used for mid-circuit measurements that can indicate how a computation is going and what error correction interventions may be necessary.
In neutral-atom quantum computing, however, such interventions are generally destructive. Atoms that are not in their designated state are simply discarded – a profligate approach that makes it challenging to scale up atom-based computers. The tendency to discard atoms is particularly awkward because the traps that confine them are already prone to losing atoms, which introduces additional errors while reducing the number of atoms available for computations.
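The ancilla pattern itself is straightforward to illustrate on a gate-based simulator. The sketch below (a generic example using Qiskit and qiskit-aer, not Atom Computing’s actual protocol or hardware) uses a mid-circuit measurement on an ancilla to flag a deliberately injected bit-flip on two data qubits, then resets the ancilla for re-use – exactly the kind of non-destructive intervention that is hard to achieve with neutral atoms:

```python
# Parity-check ancilla with mid-circuit measurement and reset (Qiskit sketch).
from qiskit import QuantumCircuit
from qiskit_aer import AerSimulator

qc = QuantumCircuit(3, 1)   # qubits 0,1: data; qubit 2: ancilla
qc.h(0)
qc.cx(0, 1)                 # prepare an entangled data pair
qc.x(1)                     # deliberately inject a bit-flip error
qc.cx(0, 2)
qc.cx(1, 2)                 # ancilla now holds the data qubits' parity
qc.measure(2, 0)            # mid-circuit measurement flags the error...
qc.reset(2)                 # ...and the ancilla is recycled, not discarded
counts = AerSimulator().run(qc, shots=1000).result().get_counts()
print(counts)               # {'1': 1000} -> the syndrome caught the flip
```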
Reduce, re-use, replenish
As well as demonstrating protocols for performing measurements to detect errors in quantum circuits with little atom loss, the researchers at Atom Computing also showed they could re-use ancillary atoms – a double-pronged way of retaining more atoms for calculations. In addition, they demonstrated that they could replenish the register of atoms for the computation from a spatially separated stash in a magneto-optic trap without compromising the quantum state of the atoms already in the register.
Norcia says that these achievements — replacing atoms from a continuous source, while reducing the number of atoms needing replacement to begin with — are key to running computations without running out of atoms. “To our knowledge, any useful quantum computations will require the execution of many layers of gates, which will not be possible unless the atom number can be maintained at a steady-state level throughout the computation,” he tells Physics World.
Cool and spaced out
Norcia and his collaborators at Microsoft Quantum, the Colorado School of Mines and Stanford University worked with ytterbium (Yb) atoms, which he describes as “natural qubits” since they have two ground states. A further advantage is that the transitions between these qubit states and other states used for imaging and cooling are weak, meaning the researchers could couple just one qubit state to these other states at a time. The team also leveraged a previously-developed approach for mid-circuit measurement that scatters light from only one qubit state and does not disturb the other, making it less destructive.
Still, Norcia tells Physics World, “the challenge was to re-use atoms, and key to this was cooling and performance.” To this end, they first had to shift the atoms undergoing mid-circuit measurements away from the atoms in the computational register, to avoid scattering laser light off the latter. They further avoided laser-related collateral damage by designing the register such that the measurement and cooling light was not at the resonant wavelength of the register atoms. Next, they demonstrated they could cool already-measured atoms for re-use in the calculation. Finally, they showed they could non-disruptively replenish these atoms with others from a magneto-optical trap positioned 300 μm below the tweezer arrays that held atoms for the computational register.
Mikhail Lukin, a physicist at Harvard University in the US, has also worked on the challenges of atom loss and re-use in scalable, fault-tolerant neutral-atom computing, and has likewise recently reported successful atom re-use and diminished atom loss. Although Lukin’s work differs from that of the Atom Computing team in various ways – using rubidium instead of ytterbium atoms and a different approach for low-atom-loss mid-circuit measurements, for starters – he says that the work by Norcia and his team “represents an important technical advance for the Yb quantum computing platform, complementing major progress in the neutral atom quantum computing community in 2025”.
The latest episode of Physics World Stories takes you inside CUWiP+, the Conference for Undergraduate Women and Non-Binary Physicists, and the role the annual event plays in shaping early experiences of studying physics.
The episode features June McCombie from the University of Nottingham, who discusses what happens at CUWiP+ events and why they are so important for improving the retention of women and non-binary students in STEM. She reflects on how the conferences create space for students to explore career paths, build confidence and see themselves as part of the physics community.
Reflections and tips from CUWiP+ 2025
University of Birmingham students Tanshpreet Kaur and Harriett McCormick share their experiences of attending the 2025 CUWiP+ event at the University of Warwick and explain why they are excited for the next event, set for Birmingham, 19–22 March 2026. They describe standout moments from 2025, including being starstruck at meeting Dame Jocelyn Bell Burnell, who discovered radio pulsars in 1967.
The episode provides practical advice to get the most out of the event. Organizers design the programme to cater for all personalities – whether you thrive in lively, social situations, or prefer time to step back and reflect. Either way, CUWiP+ offers opportunities to be inspired and to make meaningful connections.
Hosted by Andrew Glester, the episode highlights how shared experiences and supportive networks can balance the often-solitary nature of studying physics, especially when you feel excluded from the majority group.
Quantum physics, kung-fu, LEGO and singing are probably not things you would normally put together. But that’s exactly what happened at this year’s Quantum Carousel.
The event is a free variety show where incredible performers from across academia and industry converge for an evening of science communication. Held in Bristol, UK, on 14 November 2025, this was the second year the event was run – and once again it was entirely sold out.
As organizers, our goal was to bring together those involved in quantum and adjacent fields for an evening of learning and laughter. Each act was only seven minutes long and audience participation was encouraged, with questions saved for the dinner and drinks intervals.
All together now Speakers at Quantum Carousel 2025, which was organized by Zulekha Samiullah (second from right) and Hugh Barrett (far right). (Courtesy: Yolan Ankaine)
The evening kicked off with a rousing speech and song from Chris Stewart championing science communication and public understanding. Felix Flicker related electron spin rotations to armlocks, with a terrific demonstration on volunteer Tony Short, while Michael Berry entertained us all with his eye-opening talk on how quantum physics has democratized music.
PhD student double act Eesa Ali and Sebastien Bisdee then welcomed volunteers to the stage to see who could align a laser fastest. Maria Violaris expertly taught us the fundamentals of quantum error correction using LEGO.
Mike Shubrook explained the quantum thermodynamics of beer through stand-up comedy. And finally, John Rarity and his assistant Hugh Barrett (event co-organizer and co-author of this article) rounded off the night by demonstrating the magic of entanglement.
Our event sponsors introduced the food and drinks portions of the evening, with Antonia Seymour (chief executive of IOP Publishing) and Matin Durrani (editor-in-chief of Physics World) opening the dinner interval, while Josh Silverstone (founder and chief executive of Hartley Ultrafast) kickstarted the networking drinks reception.
Singing praises
Whether it was singing along to an acoustic guitar or rotating hands to emulate electron spin, everyone got involved, and feedback cited audience participation as a highlight.
“The event ran very smoothly, it was lots of fun and a great chance to network in a relaxed atmosphere,” said one attendee. Another added: “The atmosphere was really fun, and it was a really nice event to get loads of the quantum community together in an enjoyable setting.”
Appreciation of the atmosphere went both ways, with one speaker saying that their favourite part of the night was that “the audience was very inviting and easy to perform to”.
Audience members also enjoyed developing a better understanding of the science that drives their industry. “I understood it and I don’t have any background in physics,” said one attendee. “I feel a marker of being a good scientist is being able to explain it in layperson’s terms.”
Reaching out
With the quantum community rapidly expanding, it needs people from a wide range of backgrounds, such as computer science, engineering and business. Quantum Carousel was designed to strike a balance between high-level academic discussion and entertainment through entry-level talks, such as explaining error correction with props or tracing research through to real-world impact, from stimulated emission to CDs.
By focusing on real-world analogies, these talks can help newcomers to develop an intuitive and memorable understanding. Meanwhile, those already in the field can equip themselves with new ways of communicating elements of their research.
We look forward to hosting Quantum Carousel again in the future. We want to make it bigger and better, with an even greater range of diverse acts.
But if you’re interested in organizing a similar outreach event of your own, it helps to consider how you can create an environment that best sparks connections between speakers and attendees. Consider your audience and how your event can attract different people for different reasons. In our case, this included the chance to network, engage with the performances, and enjoy the food and drink.
Quantum Carousel was founded by Zulekha Samiullah in 2024, and she and Hugh Barrett now co-lead the event. Quantum Carousel 2025 was sponsored by the QE-CDT, IOP Publishing and Hartley Ultrafast.