
Received today — 15 October 2025

Beyond Gravity plans US push after fivefold boost in European solar mechanism output

15 October 2025 at 14:00

Beyond Gravity is weighing expanding solar array drive mechanism production in Florida to support Golden Dome and other U.S. space projects, after doubling manufacturing space in Europe for hardware that keeps satellites pointed toward the sun.

The post Beyond Gravity plans US push after fivefold boost in European solar mechanism output appeared first on SpaceNews.

Polish space company Scanway Space secures U.S., European deals amid international expansion drive

15 October 2025 at 12:00
Optical systems built by Scanway deployed on YPSat. Credit: ESA

WARSAW — The Polish optical systems manufacturer Scanway Space has secured its first order from an American company, in this case from Intuitive Machines for a multispectral telescope instrument to map the moon’s surface. Scanway CEO Jędrzej Kowalewski told SpaceNews the optical instrument, set to be launched in 2026, will allow Intuitive Machines to search […]

The post Polish space company Scanway Space secures U.S., European deals amid international expansion drive appeared first on SpaceNews.

Evo CT-Linac eases access to online adaptive radiation therapy

15 October 2025 at 14:15

Adaptive radiation therapy (ART) is a personalized cancer treatment in which a patient’s treatment plan can be updated throughout their radiotherapy course to account for any anatomical variations – either between fractions (offline ART) or immediately prior to dose delivery (online ART). Using high-fidelity images to enable precision tumour targeting, ART improves outcomes while reducing side effects by minimizing healthy tissue dose.

Elekta, the company behind the Unity MR-Linac, believes that in time, all radiation treatments will incorporate ART as standard. Towards this goal, it brings its broad knowledge base from the MR-Linac to the new Elekta Evo, a next-generation CT-Linac designed to improve access to ART. Evo incorporates AI-enhanced cone-beam CT (CBCT), known as Iris, to provide high-definition imaging, while its Elekta ONE Online software automates the entire workflow, including auto-contouring, plan adaptation and end-to-end quality assurance.

A world first

In February of this year, Matthias Lampe and his team at the private centre DTZ Radiotherapy in Berlin, Germany, became the first in the world to treat patients with online ART (delivering daily plan updates while the patient is on the treatment couch) using Evo. “To provide proper tumour control you must be sure to hit the target – for that, you need online ART,” Lampe tells Physics World.

Initiating online ART: the team at DTZ Radiotherapy in Berlin treated the first patient in the world using Evo. (Courtesy: Elekta)

The ability to visualize and adapt to daily anatomy enables reduction of the planning target volume, increasing safety for nearby organs-at-risk (OARs). “It is highly beneficial for all treatments in the abdomen and pelvis,” says Lampe. “My patients with prostate cancer report hardly any side effects.”

Lampe selected Evo to exploit the full flexibility of its C-arm design. He notes that for the increasingly prevalent hypofractionated treatments, a C-arm configuration is essential. “CT-based treatment planning and AI contouring opened up a new world for radiation oncologists,” he explains. “When Elekta designed Evo, they enabled this in an achievable way with an extremely reliable machine. The C-arm linac is the primary workhorse in radiotherapy, so you have the best of everything.”

Time considerations

While online ART can take longer than conventional treatments, Evo’s use of automation and AI limits the additional time requirement to just five minutes – increasing the overall workflow from 12 to 17 minutes and remaining within the clinic’s standard time slots.

Elekta Evo: a next-generation CT-Linac designed to improve access to adaptive radiotherapy. (Courtesy: Elekta)

The workflow begins with patient positioning and CBCT imaging, with Evo’s AI-enhanced Iris imaging significantly improving image quality, crucial when performing ART. The radiation therapist then matches the cone-beam and planning CTs and performs any necessary couch shift.

Simultaneously, Elekta ONE Online performs AI auto-contouring of OARs, which are reviewed by the physician, and the target volume is copied in. The physicist then simulates the dose distribution on the new contours, followed by a plan review. “Then you can decide whether to adapt or not,” says Lampe. “This is an outstanding feature.” The final stage is treatment delivery and online dosimetry.

When DTZ Berlin first began clinical treatments with Evo, some of Lampe’s colleagues were apprehensive as they were attached to the conventional workflow. “But now, with CBCT providing the chance to see what will be treated, every doctor on my team has embraced the shift and wouldn’t go back,” he says.

The first treatments were for prostate cancer, a common indication that’s relatively easy to treat. “I also thought that if the Elekta ONE workflow struggled, I could contour this on my own in a minute,” says Lampe. “But this was never necessary, the process is very solid. Now we also treat prostate cancer patients with lymph node metastases and those with relapse after radiotherapy. It’s a real success story.”

Lampe says that older and frailer patients may benefit the most from online ART, pointing out that while published studies often include relatively young, healthy patients, “our patients are old, they have chronic heart disease, they’re short of breath”.

For prostate cancer, for example, patients are instructed to arrive with a full bladder and an empty rectum. “But if a patient is in his eighties, he may not be able to do this and the volumes will be different every day,” Lampe explains. “With online adaptive, you can tell patients: ‘if this is not possible, we will handle it, don’t stress yourself’. They are very thankful.”

Making ART available to all

At UMC Utrecht in the Netherlands, the radiotherapy team has also added CT-Linac online adaptive to its clinical toolkit.

UMC Utrecht is renowned for its development of MR-guided radiotherapy, with physicists Bas Raaymakers and Jan Lagendijk pioneering the development of a hybrid MR-Linac. “We come from the world of MR-guidance, so we know that ART makes sense,” says Raaymakers. “But if we only offer MR-guided radiotherapy, we miss out on a lot of patients. We wanted to bring it to the wider community.”

ART for all: the radiotherapy team at UMC Utrecht in the Netherlands has added CT-Linac online adaptive radiotherapy to its clinical toolkit. (Courtesy: UMC Utrecht)

At the time of speaking to Physics World, the team was treating its second patient with CBCT-guided ART, and had delivered about 30 fractions. Both patients were treated for bladder cancer, with future indications to explore including prostate, lung and breast cancers and bone metastases.

“We believe in ART for all patients,” says medical physicist Anette Houweling. “If you have MR and CT, you should be able to choose the optimal treatment modality based on image quality. For below the diaphragm, this is probably MR, while for the thorax, CT might be better.”

Ten-minute target for online ART

Houweling says that ART delivery has taken 19 minutes on average. “We record the CBCT, perform image fusion and then the table is moved, that’s all standard,” she explains. “Then the adaptive part comes in: delineation on the CBCT and creating a new plan with Elekta ONE Planning as part of Elekta ONE Online.”

When adaptation is selected, it takes roughly four minutes to create a clinical-grade volumetric-modulated arc therapy (VMAT) plan. With the next-generation optimizer that will soon be installed, generating a VMAT plan is expected to take less than one minute.

“As you start with the regular workflow, you can still decide not to choose adaptive treatment, and do a simple couch shift, up until the last second,” says Raaymakers. “It’s very close to the existing workflow, which makes adoption easier. Also, the treatment slots are comparable to standard slots. Now with CBCT it takes 19 minutes and we believe we can get towards 10. That’s one of the drivers for cone-beam adaptive.”

Shorter treatment times will impact the decision as to which patients receive ART. If fully automated adaptive treatment is deliverable in a 10-minute time slot, it could be available to all patients. “From the physics side, our goal is to have no technological limitations to delivering ART. Then it’s up to the radiation oncologists to decide which patients might benefit,” Raaymakers explains.

Future gazing

Looking to the future, Raaymakers predicts that simulation-free radiotherapy will be adopted for certain standard treatments. “Why do you need days of preparation if you can condense the whole process to the moment when the patient is on the table?” he says. “That would be very much helped by online ART.”

“Scroll forward a few years and I expect that ART will be automated and fast such that the user will just sign off the autocontours and plan in one, maybe tune a little, and then go ahead,” adds Houweling. “That will be the ultimate goal of ART. Then there’s no reason to perform radiotherapy the traditional way.”

The post Evo CT-Linac eases access to online adaptive radiation therapy appeared first on Physics World.

Jesper Grimstrup’s The Ant Mill: could his anti-string-theory rant do string theorists a favour?

15 October 2025 at 12:00

Imagine you had a bad breakup in college. Your ex-partner is furious and self-publishes a book that names you in its title. You’re so humiliated that you only dimly remember this ex, though the book’s details and anecdotes ring true.

According to the book, you used to be inventive, perceptive and dashing. Then you started hanging out with the wrong crowd, and became competitive, self-involved and incapable of true friendship. Your ex struggles to turn you around; failing, they leave. The book, though, is so over-the-top that by the end you stop cringing and find it a hoot.

That’s how I think most Physics World readers will react to The Ant Mill: How Theoretical High-energy Physics Descended into Groupthink, Tribalism and Mass Production of Research. Its author and self-publisher is the Danish mathematician-physicist Jesper Grimstrup, whose previous book was Shell Beach: the Search for the Final Theory.

After receiving his PhD in theoretical physics at the Technical University of Vienna in 2002, Grimstrup writes, he was “one of the young rebels” embarking on “a completely unexplored area” of theoretical physics, combining elements of loop quantum gravity and noncommutative geometry. But there followed a decade of rejected articles and lack of opportunities.

Grimstrup became “disillusioned, disheartened, and indignant” and in 2012 left the field, selling his flat in Copenhagen to finance his work. Grimstrup says he is now a “self-employed researcher and writer” who lives somewhere near the Danish capital. You can support him either through Ko-fi or Paypal.

Fomenting fear

The Ant Mill opens with a copy of the first page of the letter that Grimstrup’s fellow Dane Niels Bohr sent in 1917 to the University of Copenhagen successfully requesting a four-storey building for his physics institute. Grimstrup juxtaposes this incident with the rejection of his funding request, almost a century later, by the Danish Council for Independent Research.

Today, he writes, theoretical physics faces a situation “like the one it faced at the time of Niels Bohr”, but structural and cultural factors have severely hampered it, making it impossible to pursue promising new ideas. These include Grimstrup’s own “quantum holonomy theory, which is a candidate for a fundamental theory”. The Ant Mill is his diagnosis of how this came about.

A major culprit, in Grimstrup’s eyes, was the Standard Model of particle physics. That completed a structure for which theorists were trained to be architects and should have led to the flourishing of a new crop of theoretical ideas. But it had the opposite effect. The field, according to Grimstrup, is now dominated by influential groups that squeeze out other approaches.

The biggest and most powerful is string theory, with loop quantum gravity its chief rival. Neither member of the coterie can make testable predictions, yet because they control jobs, publications and grants they intimidate young researchers and create what Grimstrup calls an “undercurrent of fear”. (I leave assessment of this claim to young theorists.)

Half the chapters begin with an anecdote in which Grimstrup describes an instance of rejection by a colleague, editor or funding agency. In the book’s longest chapter Grimstrup talks about his various rejections – by the Carlsberg Foundation, The European Physical Journal C, International Journal of Modern Physics A, Classical and Quantum Gravity, Reports on Mathematical Physics, Journal of Geometry and Physics, and the Journal of Noncommutative Geometry.

Grimstrup says that the reviewers and editors of these journals told him that his papers variously lacked concrete physical results, were exercises in mathematics, seemed the same as other papers, or lacked “relevance and significance”. Grimstrup sees this as the coterie’s handiwork, for such journals are full of string theory papers open to the same criticism.

“Science is many things,” Grimstrup writes at the end. “[S]imultaneously boring and scary, it is both Indiana Jones and anonymous bureaucrats, and it is precisely this diversity that is missing in the modern version of science”. What the field needs is “courage…hunger…ambition…unwillingness to compromise…anarchy”.

Grimstrup hopes that his book will have an impact, helping to inspire young researchers to revolt, and to make all the scientific bureaucrats and apparatchiks and bookkeepers and accountants “wake up and remember who they truly are”.

The critical point

The Ant Mill is an example of what I have called “rant literature” or rant-lit. Evangelical, convinced that exposing truth will make sinners come to their senses and change their evil ways, rant-lit can be fun to read, for it is passionate and full of florid metaphors.

Theoretical physicists, Grimstrup writes, have become “obedient idiots” and “technicians”. He slams theoretical physics for becoming a “kingdom”, a “cult”, a “hamster wheel”, and an “ant mill”, in which the ants march around in a pre-programmed “death spiral”.

An attentive reader, however, may come away with a different lesson. Grimstrup calls falsifiability the “crown jewel of the natural sciences” and hammers away at theories lacking it. But his vehemence invites you to ask: “Is falsifiability really the sole criterion for deciding whether to accept or fail to pursue a theory?”

In his 2013 book String Theory and the Scientific Method, for instance, the Stockholm University philosopher of science Richard Dawid suggested rescuing the scientific status of string theory by adding non-empirical criteria such as clarity, coherence and lack of alternatives to the evaluation of theories. It’s an approach that both rescues the formalistic approach to the scientific method and undermines it.

Dawid, you see, is making the formalism follow the practice rather than the other way around. In other words, he is able to reformulate how we make theories because he already knows how theorizing works – not because he only truly knows what it is to theorize after he gets the formalism right.

Grimstrup’s rant, too, might remind you of the birth of the Yang–Mills theory in 1954. Developed by Chen Ning Yang and Robert Mills, it was a theory of nuclear binding that integrated much of what was known about elementary particle theory but implied the existence of massless force-carrying particles that were then known not to exist. In fact, at one seminar Wolfgang Pauli unleashed a tirade against Yang for proposing so obviously flawed a theory.

The theory, however, became central to theoretical physics two decades later, after theorists learned more about the structure of the world. The Yang-Mills story, in other words, reveals that theory-making does not always conform to formal strictures and does not always require a testable prediction. Sometimes it just articulates the best way to make sense of the world apart from proof or evidence.

The lesson I draw is that becoming the target of a rant might not always make you feel repentant and ashamed. It might inspire you into deep reflection on who you are in a way that is insightful and vindicating. It might even make you more rather than less confident about why you’re doing what you’re doing.

Your ex, of course, would be horrified.

The post Jesper Grimstrup’s The Ant Mill: could his anti-string-theory rant do string theorists a favour? appeared first on Physics World.

Further evidence for evolving dark energy?

15 October 2025 at 11:34

Dark energy, a term first used in 1998, is a proposed form of energy that affects the universe on the largest scales. Its primary effect is to drive the accelerating expansion of the universe – a discovery that was awarded the 2011 Nobel Prize in Physics.

Dark energy is now a well established concept and forms a key part of the standard model of Big Bang cosmology, the Lambda-CDM model.

The trouble is, we’ve never really been able to explain exactly what dark energy is, or why it has the value that it does.

Even worse, new data acquired by cutting-edge telescopes have suggested that dark energy might not even exist as we had imagined it.

This is where the new work by Mukherjee and Sen comes in. They combined two of these datasets, while making as few assumptions as possible, to understand what’s going on.

The first of these datasets came from baryon acoustic oscillations. These are patterns in the distribution of matter in the universe, created by sound waves in the early universe.

The second dataset is based on a survey of supernova data from the last five years. Both sets of data can be used to track the expansion history of the universe by measuring distances at different snapshots in time.

The team’s results are in tension with the Lambda-CDM model at low redshifts. Put simply, the results disagree with the current model at recent times. This provides further evidence for the idea that dark energy, previously considered to have a constant value, is evolving over time.
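To put the idea of “evolving” dark energy in more concrete terms, cosmologists often describe it through the equation-of-state parameter w. The sketch below uses the widely adopted CPL parametrization purely as an illustration – it is not necessarily the form used by Mukherjee and Sen – with Lambda-CDM recovered when w_0 = −1 and w_a = 0.

```latex
% Illustrative only: the widely used CPL parametrization of an evolving
% dark-energy equation of state (not necessarily the form adopted by
% Mukherjee and Sen). Here a is the scale factor and z the redshift.
\[
  w(a) = w_0 + w_a\,(1 - a), \qquad a = \frac{1}{1 + z}
\]
% Expansion history of a flat universe with matter density \Omega_m and
% dark-energy density \Omega_{\mathrm{DE}} = 1 - \Omega_m:
\[
  \frac{H^2(a)}{H_0^2}
    = \Omega_m\,a^{-3}
    + \Omega_{\mathrm{DE}}\,a^{-3(1 + w_0 + w_a)}\,e^{-3 w_a (1 - a)}
\]
% Lambda-CDM is recovered for w_0 = -1 and w_a = 0, when the dark-energy
% term reduces to the constant \Omega_\Lambda.
```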

Evolving dark energy: the tension in the expansion rate is most evident at low redshifts. (Courtesy: P. Mukherjee)

This is far from the end of the story for dark energy. New observational data and new analyses such as this one are urgently needed to provide a clearer picture.

However, where there’s uncertainty, there’s opportunity. Understanding dark energy could hold the key to understanding quantum gravity, the Big Bang and the ultimate fate of the universe.

The post Further evidence for evolving dark energy? appeared first on Physics World.

Searching for dark matter particles

15 October 2025 at 11:34

Dark matter is a hypothesised form of matter that does not emit, absorb, or reflect light, making it invisible to electromagnetic observations. Although it has never been detected directly, its existence is inferred from its gravitational effects on visible matter and the large-scale structure of the universe.

The Standard Model of particle physics does not contain any dark matter particles, but several extensions have been proposed that could include them. Several of these candidates are very low-mass particles, such as the axion or the sterile neutrino.

Detecting these hypothesised particles is very challenging, however, due to the extreme sensitivity required.

Electromagnetic resonant systems, such as cavities and LC circuits, are widely used for this purpose, as well as to detect high-frequency gravitational waves.

When an external signal matches one of these systems’ resonant frequencies, the system responds with a large amplitude, making the signal possible to detect. However, there is always a trade-off between the sensitivity of the detector and the range of frequencies it is able to detect (its bandwidth).
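To see where this trade-off comes from, consider the textbook response of a single driven mode (an illustrative sketch, not a description of any particular detector in the study): the peak response grows with the quality factor Q, while the band of frequencies over which the resonator responds shrinks as ω0/Q.

```latex
% Textbook response of a single driven, damped mode with resonance
% frequency \omega_0 and quality factor Q (illustrative, not a model of
% any specific detector in the study):
\[
  |x(\omega)|^2 \;\propto\;
    \frac{1}{\bigl(\omega_0^2 - \omega^2\bigr)^2 + \bigl(\omega_0\,\omega / Q\bigr)^2},
  \qquad
  \Delta\omega \approx \frac{\omega_0}{Q}
\]
% On resonance the response scales as Q^2, but the bandwidth narrows as
% 1/Q, which is the sensitivity-bandwidth compromise described above.
```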

A natural way to overcome this compromise is to consider multi-mode resonators, which can be viewed as coupled networks of harmonic oscillators. Their scan efficiency can be significantly enhanced beyond the standard quantum limit of simple single-mode resonators.

In a recent paper, the researchers demonstrated how multi-mode resonators can combine the advantages of sensitive and broadband detection. By coupling adjacent modes inside the resonant cavity and tuning these couplings to comparable magnitudes, off-resonant (i.e. unwanted) frequency shifts are effectively cancelled, increasing the overall response of the system.
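To make the coupled-oscillator picture concrete, the short numerical sketch below models a chain of identical modes driven through the first mode and read out from the last. All frequencies, damping rates and coupling strengths are invented illustrative values, not parameters from the paper.

```python
import numpy as np

# Minimal sketch of a multi-mode resonant detector: N identical modes in a
# chain with nearest-neighbour coupling g. A weak drive (standing in for the
# hypothetical dark-matter signal) enters through mode 0 and the response is
# read out on mode N-1. All numbers are illustrative, not from the paper.
N = 5
omega0 = 2 * np.pi * 1.0e6    # resonance frequency of each mode (rad/s)
gamma = 2 * np.pi * 1.0e3     # damping rate of each mode (rad/s)
g = (2 * np.pi * 5.0e4) ** 2  # mode-mode coupling, in units of frequency^2

def readout_amplitude(omega):
    """Steady-state amplitude of the last mode for a unit drive on mode 0.

    Solves (omega0^2 - omega^2 + i*gamma*omega) x_k + g (x_{k-1} + x_{k+1}) = F_k.
    """
    A = np.zeros((N, N), dtype=complex)
    for k in range(N):
        A[k, k] = omega0**2 - omega**2 + 1j * gamma * omega
        if k > 0:
            A[k, k - 1] = g
        if k < N - 1:
            A[k, k + 1] = g
    F = np.zeros(N, dtype=complex)
    F[0] = 1.0  # drive enters through the first mode only
    return abs(np.linalg.solve(A, F)[-1])

# Sweep the drive frequency around resonance: the coupled chain responds over
# a band set by the mode splittings rather than a single narrow Lorentzian.
detunings = 2 * np.pi * np.linspace(-20e3, 20e3, 2001)  # offsets of +/-20 kHz, in rad/s
response = np.array([readout_amplitude(omega0 + d) for d in detunings])
half_power = response > response.max() / np.sqrt(2)
print(f"Peak readout response: {response.max():.3e}")
print(f"Fraction of the 40 kHz sweep above half power: {half_power.mean():.3f}")
```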

Their method allows us to search for these elusive dark matter particles in a faster, more efficient way.

Dark matter detection circuit: a multi-mode detector design, where the first mode couples to dark matter and the last mode is read out. (Courtesy: Y. Chen)

The post Searching for dark matter particles appeared first on Physics World.

Physicists explain why some fast-moving droplets stick to hydrophobic surfaces

15 October 2025 at 10:00

What happens when a microscopic drop of water lands on a water-repelling surface? The answer is important for many everyday situations, including pesticides being sprayed on crops and the spread of disease-causing aerosols. Naively, one might expect it to depend on the droplet’s speed, with faster-moving droplets bouncing off the surface and slower ones sticking to it. However, according to new experiments, theoretical work and simulations by researchers in the UK and the Netherlands, it’s more complicated than that.

“If the droplet moves too slowly, it sticks,” explains Jamie McLauchlan, a PhD student at the University of Bath, UK who led the new research effort with Bath’s Adam Squires and Anton Souslov of the University of Cambridge. “Too fast, and it sticks again. Only in between is bouncing possible, where there is enough momentum to detach from the surface but not so much that it collapses back onto it.”

As well as this new velocity-dependent condition, the researchers also discovered a size effect in which droplets that are too small cannot bounce, no matter what their speed. This size limit, they say, is set by the droplets’ viscosity, which prevents the tiniest droplets from leaving the surface once they land on it.

Smaller-sized, faster-moving droplets

While academic researchers and industrialists have long studied single-droplet impacts, McLauchlan says that much of this earlier work focused on millimetre-sized drops that took place on millisecond timescales. “We wanted to push this knowledge to smaller sizes of micrometre droplets and faster speeds, where higher surface-to-volume ratios make interfacial effects critical,” he says. “We were motivated even further during the COVID-19 pandemic, when studying how small airborne respiratory droplets interact with surfaces became a significant concern.”

Working at such small sizes and fast timescales is no easy task, however. To record the outcome of each droplet landing, McLauchlan and colleagues needed a high-speed camera that effectively slowed down motion by a factor of 100 000. To produce the droplets, they needed piezoelectric droplet generators capable of dispensing fluid via tiny 30-micron nozzles. “These dispensers are highly temperamental,” McLauchlan notes. “They can become blocked easily by dust and fibres and fail to work if the fluid viscosity is too high, making experiments delicate to plan and run. The generators are also easy to break and expensive.”

Droplet modelled as a tiny spring

The researchers used this experimental set-up to create and image droplets between 30 and 50 µm in diameter as they struck water-repelling surfaces at speeds of 1‒10 m/s. They then compared their findings with calculations based on a simple mathematical model that treats a droplet like a tiny spring, taking into account three main parameters in addition to its speed: the stickiness of the surface; the viscosity of the droplet liquid; and the droplet’s surface tension.
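The paper’s exact dimensionless groups are not spelled out here, but droplet-impact studies conventionally fold these same quantities – speed, size, surface tension and viscosity – into the Weber and Ohnesorge numbers, shown below purely for orientation.

```latex
% Standard dimensionless groups for droplet impact, given for orientation
% only (not necessarily the exact quantities used in the PNAS paper):
\[
  \mathrm{We} = \frac{\rho\,v^2 D}{\sigma}
  \quad\text{(inertia vs surface tension)},
  \qquad
  \mathrm{Oh} = \frac{\mu}{\sqrt{\rho\,\sigma\,D}}
  \quad\text{(viscosity vs inertia and capillarity)}
\]
% \rho: liquid density, v: impact speed, D: droplet diameter,
% \sigma: surface tension, \mu: dynamic viscosity.
% For a 40-micron water droplet at 5 m/s, We is roughly 14 and Oh roughly 0.02.
```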

Previous research had shown that on perfectly non-wetting surfaces, bouncing does not depend on velocity. Other studies showed that on very smooth surfaces, droplets can bounce on a thin air layer. “Our work has explored a broader range of hydrophobic surfaces, showing that bouncing occurs due to a delicate balance of kinetic energy, viscous dissipation and interfacial energies,” McLauchlan tells Physics World.

This is exciting, he adds, because it reveals a previously unexplored regime for bounce behaviour: droplets that are too small, or too slow, will always stick, while sufficiently fast droplets can rebound. “This finding provides a general framework that explains bouncing at the micron scale, which is directly relevant for aerosol science,” he says.

A novel framework for engineering microdroplet processes

McLauchlan thinks that by linking bouncing to droplet velocity, size and surface properties, the new framework could make it easier to engineer microdroplets for specific purposes. “In agriculture, for example, understanding how spray velocities interact with plant surfaces with different hydrophobicity could help determine when droplets deposit fully versus when they bounce away, improving the efficiency of crop spraying,” he says.

Such a framework could also be beneficial in the study of airborne diseases, since exhaled droplets frequently bump into surfaces while floating around indoors. While droplets that stick are removed from the air, and can no longer transmit disease via that route, those that bounce are not. Quantifying these processes in typical indoor environments will provide better models of airborne pathogen concentrations and therefore disease spread, McLauchlan says. For example, in healthcare settings, coatings could be designed to inhibit or promote bouncing, ensuring that high-velocity respiratory droplets from sneezes either stick to hospital surfaces or recoil from them, depending on which mode of potential transmission (airborne or contact-based) is being targeted.

The researchers now plan to expand their work on aqueous droplets to droplets with more complex soft-matter properties. “This will include adding surfactants, which introduce time-dependent surface tensions, and polymers, which give droplets viscoelastic properties similar to those found in biological fluids,” McLauchlan reveals. “These studies will present significant experimental challenges, but we hope they broaden the relevance of our findings to an even wider range of fields.”

The present work is detailed in PNAS.

The post Physicists explain why some fast-moving droplets stick to hydrophobic surfaces appeared first on Physics World.

A Quarter of the CDC Is Gone

14 October 2025 at 23:51
Another round of terminations, combined with previous layoffs and departures, has reduced the Centers for Disease Control and Prevention workforce by about 3,000 people since January.

Received yesterday — 14 October 2025

Viasat and Space42’s D2D joint venture finds first mobile partner in UAE

14 October 2025 at 19:22

Equatys, the U.S.-based Viasat and Emirati satellite operator Space42’s shared space infrastructure joint venture for direct-to-device services, has gained its first mobile network partner as it seeks to challenge SpaceX’s growing lead in the emerging market.

The post Viasat and Space42’s D2D joint venture finds first mobile partner in UAE appeared first on SpaceNews.

Quantum computing on the verge: a look at the quantum marketplace of today

14 October 2025 at 17:40

“I’d be amazed if quantum computing produces anything technologically useful in ten years, twenty years, even longer.” So wrote University of Oxford physicist David Deutsch – often considered the father of the theory of quantum computing – in 2004. But, as he added in a caveat, “I’ve been amazed before.”

We don’t know how amazed Deutsch, a pioneer of quantum computing, would have been had he attended a meeting at the Royal Society in London in February on “the future of quantum information”. But it was tempting to conclude from the event that quantum computing has now well and truly arrived, with working machines that harness quantum mechanics to perform computations being commercially produced and shipped to clients. Serving as the UK launch of the International Year of Quantum Science and Technology (IYQ) 2025, it brought together some of the key figures of the field to spend two days discussing quantum computing as something like a mature industry, even if one in its early days.

Werner Heisenberg – who worked out the first proper theory of quantum mechanics 100 years ago – would surely have been amazed to find that the formalism he and his peers developed to understand the fundamental behaviour of tiny particles had generated new ways of manipulating information to solve real-world problems in computation. So far, quantum computing – which exploits phenomena such as superposition and entanglement to potentially achieve greater computational power than the best classical computers can muster – hasn’t tackled any practical problems that can’t be solved classically.

Although the fundamental quantum principles are well-established and proven to work, there remain many hurdles that quantum information technologies have to clear before this industry can routinely deliver resources with transformative capabilities. But many researchers think that moment of “practical quantum advantage” is fast approaching, and an entire industry is readying itself for that day.

Entangled marketplace

So what are the current capabilities and near-term prospects for quantum computing?

The first thing to acknowledge is that a booming quantum-computing market exists. Devices are being produced for commercial use by a number of tech firms: from the likes of IBM, Google, Canada-based D-Wave and Rigetti, which have been in the field for a decade or more, to relative newcomers such as Nord Quantique (Canada), IQM (Finland), Quantinuum (UK and US), Orca (UK), PsiQuantum (US) and Silicon Quantum Computing (Australia).

The global quantum ecosystem

Map showing global investments in quantum computing. (Courtesy: QURECA)

We are on the cusp of a second quantum revolution, with quantum science and technologies growing rapidly across the globe. This includes quantum computers; quantum sensing (ultra-high precision clocks, sensors for medical diagnostics); as well as quantum communications (a quantum internet). Indeed, according to the State of Quantum 2024 report, a total of 33 countries around the world currently have government initiatives in quantum technology, of which more than 20 have national strategies with large-scale funding. As of this year, worldwide investments in quantum tech – by governments and industry – exceed $55.7 billion, and the market is projected to reach $106 billion by 2040. With the multitude of ground-breaking capabilities that quantum technologies bring globally, it’s unsurprising that governments all over the world are eager to invest in the industry.

With data from a number of international reports and studies, quantum education and skills firm QURECA has summarized key programmes and efforts around the world. These include total government funding spent through 2025, as well as future commitments spanning 2–10 year programmes, varying by country. These initiatives generally represent government agencies’ funding announcements, related to their countries’ advancements in quantum technologies, excluding any private investments and revenues.

A supply chain is also organically developing, which includes manufacturers of specific hardware components, such as Oxford Instruments and Quantum Machines, and software developers like Riverlane, based in Cambridge, UK, and QC Ware in Palo Alto, California. Supplying the last link in this chain are a range of eager end-users, from finance companies such as J P Morgan and Goldman Sachs to pharmaceutical companies such as AstraZeneca and engineering firms like Airbus. Quantum computing is already big business, with around 400 active companies and current global investment estimated at around $2 billion.

But the immediate future of all this buzz is hard to assess. When the chief executive of computer giant Nvidia announced at the start of 2025 that “truly useful” quantum computers were still two decades away, the previously burgeoning share prices of some leading quantum-computing companies plummeted. They have since recovered somewhat, but such volatility reflects the fact that quantum computing has yet to prove its commercial worth.

The field is still new and firms need to manage expectations and avoid hype while also promoting an optimistic enough picture to keep investment flowing in. “Really amazing breakthroughs are being made,” says physicist Winfried Hensinger of the University of Sussex, “but we need to get away from the expectancy that [truly useful] quantum computers will be available tomorrow.”

The current state of play is often called the “noisy intermediate-scale quantum” (NISQ) era. That’s because the “noisy” quantum bits (qubits) in today’s devices are prone to errors for which no general and simple correction process exists. Current quantum computers can’t therefore carry out practically useful computations that could not be done on classical high-performance computing (HPC) machines. It’s not just a matter of better engineering either; the basic science is far from done.

Building up: quantum computing behemoth IBM says that by 2029, its fault-tolerant system should accurately run 100 million gates on 200 logical qubits, thereby truly achieving quantum advantage. (Courtesy: IBM)

“We are right on the cusp of scientific quantum advantage – solving certain scientific problems better than the world’s best classical methods can,” says Ashley Montanaro, a physicist at the University of Bristol who co-founded the quantum software company Phasecraft. “But we haven’t yet got to the stage of practical quantum advantage, where quantum computers solve commercially important and practically relevant problems such as discovering the next lithium-ion battery.” It’s no longer if or how, but when that will happen.

Pick your platform

As the quantum-computing business is such an emerging area, today’s devices use wildly different types of physical systems for their qubits. There is still no clear sign as to which of these platforms, if any, will emerge as the winner. Indeed many researchers believe that no single qubit type will ever dominate.

The top-performing quantum computers, like those made by Google (with its 105-qubit Willow chip) and IBM (which has made the 1,121-qubit Condor), use qubits in which information is encoded in the wavefunction of a superconducting material. Until recently, the strongest competing platform seemed to be trapped ions, where the qubits are individual ions held in electromagnetic traps – a technology being developed into working devices by the US company IonQ, spun out from the University of Maryland, among others.

But over the past few years, neutral trapped atoms have emerged as a major contender, thanks to advances in controlling the positions and states of these qubits. Here the atoms are prepared in highly excited electronic states – so-called Rydberg atoms – which can be entangled with one another over a few microns. A Harvard startup called QuEra is developing this technology, as is the French start-up Pasqal. In September a team from the California Institute of Technology announced a 6100-qubit array made from neutral atoms. “Ten years ago I would not have included [neutral-atom] methods if I were hedging bets on the future of quantum computing,” says Deutsch’s Oxford colleague, the quantum information theorist Andrew Steane. But like many, he thinks differently now.

Some researchers believe that optical quantum computing, using photons as qubits, will also be an important platform. One advantage here is that photonic signals travelling to or from the processing units over existing telecommunications networks need no complex conversion, which is also handy for photonic interconnections between chips. What’s more, photonic circuits can work at room temperature, whereas trapped ions and superconducting qubits need to be cooled. Photonic quantum computing is being developed by firms like PsiQuantum, Orca, and Xanadu.

Other efforts, for example at Intel and Silicon Quantum Computing in Australia, make qubits from either quantum dots (Intel) or precision-placed phosphorus atoms (SQC), both in good old silicon, which benefits from a very mature manufacturing base. “Small qubits based on ions and atoms yield the highest quality processors”, says Michelle Simmons of the University of New South Wales, who is the founder and CEO of SQC. “But only atom-based systems in silicon combine this quality with manufacturability.”

Spinning around: Intel’s silicon spin qubits are now being manufactured on an industrial scale. (Courtesy: Intel Corporation)

And it’s not impossible that entirely new quantum computing platforms might yet arrive. At the start of 2025, researchers at Microsoft’s laboratories in Washington State caused a stir when they announced that they had made topological qubits from semiconducting and superconducting devices, which are less error-prone than those currently in use. The announcement left some scientists disgruntled because it was not accompanied by a peer-reviewed paper providing the evidence for these long-sought entities. But in any event, most researchers think it would take a decade or more for topological quantum computing to catch up with the platforms already out there.

Each of these quantum technologies has its own strengths and weaknesses. “My personal view is that there will not be a single architecture that ‘wins’, certainly not in the foreseeable future,” says Michael Cuthbert, founding director of the UK’s National Quantum Computing Centre (NQCC), which aims to facilitate the transition of quantum computing from basic research to an industrial concern. Cuthbert thinks the best platform will differ for different types of computation: cold neutral atoms might be good for quantum simulations of molecules, materials and exotic quantum states, say, while superconducting and trapped-ion qubits might be best for problems involving machine learning or optimization.

Measures and metrics

Given these pros and cons of different hardware platforms, one difficulty in assessing their merits is finding meaningful metrics for making comparisons. Should we be comparing error rates, coherence times (basically how long qubits remain entangled), gate speeds (how fast a single computational step can be conducted), circuit depth (how many steps a single computation can sustain), number of qubits in a processor, or what? “The metrics and measures that have been put forward so far tend to suit one or other platform more than others,” says Cuthbert, “such that it becomes almost a marketing exercise rather than a scientific benchmarking exercise as to which quantum computer is better.”

The NQCC evaluates the performance of devices using a factor known as the “quantum operation” (QuOp). This is simply the number of quantum operations that can be carried out in a single computation, before the qubits lose their coherence and the computation dissolves into noise. “If you want to run a computation, the number of coherent operations you can run consecutively is an objective measure,” Cuthbert says. If we want to get beyond the NISQ era, he adds, “we need to progress to the point where we can do about a million coherent operations in a single computation. We’re now at the level of maybe a few thousand. So we’ve got a long way to go before we can run large-scale computations.”
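Those figures invite a back-of-the-envelope check. The sketch below uses rough, order-of-magnitude numbers commonly quoted for today’s hardware – they are illustrative assumptions, not NQCC benchmarks – and caps the useful number of coherent operations by both the decoherence limit (coherence time divided by gate time) and the error limit (roughly one over the gate error rate), whichever is smaller.

```python
# Back-of-the-envelope estimate of coherent operations per computation.
# The figures below are rough, order-of-magnitude values often quoted for
# today's hardware; they are assumptions for illustration, not NQCC
# benchmarks or vendor specifications.
platforms = {
    # name: (coherence time [s], two-qubit gate time [s], two-qubit error rate)
    "superconducting": (100e-6, 100e-9, 1e-3),
    "trapped ion":     (1.0,    100e-6, 1e-3),
    "neutral atom":    (1.0,    500e-9, 5e-3),
}

TARGET_OPS = 1_000_000  # the ~1 million coherent operations Cuthbert mentions

for name, (t_coh, t_gate, err) in platforms.items():
    decoherence_limit = t_coh / t_gate  # gates that fit inside the coherence time
    error_limit = 1.0 / err             # gates before errors typically dominate
    ops = min(decoherence_limit, error_limit)
    status = "meets" if ops >= TARGET_OPS else "well below"
    print(f"{name:>15}: ~{ops:,.0f} coherent operations ({status} the ~1M target)")
```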

One important issue is how amenable the platforms are to making larger quantum circuits. Cuthbert contrasts the issue of scaling up – putting more qubits on a chip – with “scaling out”, whereby chips of a given size are linked in modular fashion. Many researchers think it unlikely that individual quantum chips will have millions of qubits like the silicon chips of today’s machines. Rather, they will be modular arrays of relatively small chips linked at their edges by quantum interconnects.

Having made the Condor, IBM now plans to focus on modular architectures (scaling out) – a necessity anyway, since superconducting qubits are micron-sized, so a chip with millions of them would be “bigger than your dining room table”, says Cuthbert. But superconducting qubits are not easy to scale out because microwave frequencies that control and read out the qubits have to be converted into optical frequencies for photonic interconnects. Cold atoms are easier to scale up, as the qubits are small, while photonic quantum computing is easiest to scale out because it already speaks the same language as the interconnects.

To be able to build so-called “fault-tolerant” quantum computers, quantum platforms must solve the issue of error correction, which will enable more extensive computations without the results becoming degraded into mere noise.

In part two of this feature, we will explore how this is being achieved and meet the various firms developing quantum software. We will also look into the potential high-value commercial uses for robust quantum computers – once such devices exist.

This article forms part of Physics World‘s contribution to the 2025 International Year of Quantum Science and Technology (IYQ), which aims to raise global awareness of quantum physics and its applications.

Stay tuned to Physics World and our international partners throughout the year for more coverage of the IYQ.

Find out more on our quantum channel.

The post Quantum computing on the verge: a look at the quantum marketplace of today appeared first on Physics World.

The Mars moment: Why now is the time to build the future 

14 October 2025 at 15:00

We’re entering a new era of space. One defined not by exploration alone, but by the infrastructure that makes a sustained presence possible.  For decades, our presence in space has been limited to short-term missions: land, explore and return. But now that’s changing. Artemis is preparing the moon as a steppingstone to Mars, shifting the […]

The post The Mars moment: Why now is the time to build the future  appeared first on SpaceNews.
