“I’d be amazed if quantum computing produces anything technologically useful in ten years, twenty years, even longer.” So wrote University of Oxford physicist David Deutsch – often considered the father of the theory of quantum computing – in 2004. But, as he added in a caveat, “I’ve been amazed before.”
We don’t know how amazed Deutsch would have been had he attended a meeting at the Royal Society in London in February on “the future of quantum information.” But it was tempting to conclude from the event that quantum computing has now well and truly arrived, with working machines that harness quantum mechanics to perform computations being commercially produced and shipped to clients. Serving as the UK launch of the International Year of Quantum Science and Technology (IYQ) 2025, it brought together some of the key figures of the field to spend two days discussing quantum computing as something like a mature industry, albeit one in its early days.
Werner Heisenberg – who worked out the first proper theory of quantum mechanics 100 years ago – would surely have been amazed to find that the formalism he and his peers developed to understand the fundamental behaviour of tiny particles had generated new ways of manipulating information to solve real-world problems in computation. So far, however, quantum computing – which exploits phenomena such as superposition and entanglement to potentially achieve greater computational power than the best classical computers can muster – hasn’t tackled any practical problem that can’t be solved classically.
Although the fundamental quantum principles are well-established and proven to work, there remain many hurdles that quantum information technologies have to clear before this industry can routinely deliver resources with transformative capabilities. But many researchers think that moment of “practical quantum advantage” is fast approaching, and an entire industry is readying itself for that day.
Entangled marketplace
So what are the current capabilities and near-term prospects for quantum computing?
The first thing to acknowledge is that a booming quantum-computing market exists. Devices are being produced for commercial use by a number of tech firms, from the likes of IBM, Google, Canada-based D-Wave and Rigetti, which have been in the field for a decade or more, to relative newcomers like Nord Quantique (Canada), IQM (Finland), Quantinuum (UK and US), Orca (UK), PsiQuantum (US) and Silicon Quantum Computing (Australia).
The global quantum ecosystem
(Courtesy: QURECA)
We are on the cusp of a second quantum revolution, with quantum science and technologies growing rapidly across the globe. This includes quantum computers; quantum sensing (ultra-high precision clocks, sensors for medical diagnostics); as well as quantum communications (a quantum internet). Indeed, according to the State of Quantum 2024 report, a total of 33 countries around the world currently have government initiatives in quantum technology, of which more than 20 have national strategies with large-scale funding. As of this year, worldwide investments in quantum tech – by governments and industry – exceed $55.7 billion, and the market is projected to reach $106 billion by 2040. With the multitude of ground-breaking capabilities that quantum technologies bring globally, it’s unsurprising that governments all over the world are eager to invest in the industry.
With data from a number of international reports and studies, quantum education and skills firm QURECA has summarized key programmes and efforts around the world. These include total government funding spent through 2025, as well as future commitments spanning 2–10 year programmes, varying by country. These initiatives generally represent government agencies’ funding announcements, related to their countries’ advancements in quantum technologies, excluding any private investments and revenues.
A supply chain is also organically developing, which includes manufacturers of specific hardware components, such as Oxford Instruments and Quantum Machines, and software developers like Riverlane, based in Cambridge, UK, and QC Ware in Palo Alto, California. Supplying the last link in this chain are a range of eager end-users, from finance companies such as JP Morgan and Goldman Sachs to pharmaceutical companies such as AstraZeneca and engineering firms like Airbus. Quantum computing is already big business, with around 400 active companies and current global investment estimated at around $2 billion.
But the immediate future of all this buzz is hard to assess. When the chief executive of computer giant Nvidia announced at the start of 2025 that “truly useful” quantum computers were still two decades away, the previously burgeoning share prices of some leading quantum-computing companies plummeted. They have since recovered somewhat, but such volatility reflects the fact that quantum computing has yet to prove its commercial worth.
The field is still new and firms need to manage expectations and avoid hype while also promoting an optimistic enough picture to keep investment flowing in. “Really amazing breakthroughs are being made,” says physicist Winfried Hensinger of the University of Sussex, “but we need to get away from the expectancy that [truly useful] quantum computers will be available tomorrow.”
The current state of play is often called the “noisy intermediate-scale quantum” (NISQ) era. That’s because the “noisy” quantum bits (qubits) in today’s devices are prone to errors for which no general and simple correction process exists. Current quantum computers can’t therefore carry out practically useful computations that could not be done on classical high-performance computing (HPC) machines. It’s not just a matter of better engineering either; the basic science is far from done.
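To see why noise is so limiting, consider a rough back-of-the-envelope sketch (the error rates and circuit depths below are illustrative choices of our own, not any vendor’s specifications): if every gate fails independently with some small probability, the chance that an entire circuit runs without a single error shrinks exponentially with the number of gates.

```python
# Back-of-the-envelope sketch: probability that a circuit of N gates runs
# cleanly when each gate fails independently with probability p.
# The error rates and depths below are illustrative, not measured values.

def clean_run_probability(gate_error: float, n_gates: int) -> float:
    """Probability that n_gates consecutive gates all succeed."""
    return (1.0 - gate_error) ** n_gates

for p in (1e-2, 1e-3, 1e-4):            # plausible per-gate error rates
    for n in (100, 1_000, 1_000_000):   # shallow, NISQ-scale and "useful" depths
        print(f"p={p:.0e}, N={n:>9,}: {clean_run_probability(p, n):.3g}")
```

Even at a one-in-ten-thousand error rate, a million-gate computation almost never finishes cleanly, which is why the field is so focused on error correction.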
Building up: Quantum-computing behemoth IBM says that by 2029 its fault-tolerant system should accurately run 100 million gates on 200 logical qubits, thereby truly achieving quantum advantage. (Courtesy: IBM)
“We are right on the cusp of scientific quantum advantage – solving certain scientific problems better than the world’s best classical methods can,” says Ashley Montanaro, a physicist at the University of Bristol who co-founded the quantum software company Phasecraft. “But we haven’t yet got to the stage of practical quantum advantage, where quantum computers solve commercially important and practically relevant problems such as discovering the next lithium-ion battery.” It’s no longer if or how, but when that will happen.
Pick your platform
As the quantum-computing business is such an emerging area, today’s devices use wildly different types of physical systems for their qubits. There is still no clear sign as to which of these platforms, if any, will emerge as the winner. Indeed many researchers believe that no single qubit type will ever dominate.
The top-performing quantum computers, like those made by Google (with its 105-qubit Willow chip) and IBM (which has made the 1,121-qubit Condor), use qubits in which information is encoded in the wavefunction of a superconducting material. Until recently, the strongest competing platform seemed to be trapped ions, where the qubits are individual ions held in electromagnetic traps – a technology being developed into working devices by the US company IonQ, spun out from the University of Maryland, among others.
But over the past few years, neutral trapped atoms have emerged as a major contender, thanks to advances in controlling the positions and states of these qubits. Here the atoms are prepared in highly excited electronic states called Rydberg atoms, which can be entangled with one another over a few micrometres. A Harvard startup called QuEra is developing this technology, as is the French start-up Pasqal. In September a team from the California Institute of Technology announced a 6,100-qubit array made from neutral atoms. “Ten years ago I would not have included [neutral-atom] methods if I were hedging bets on the future of quantum computing,” says Deutsch’s Oxford colleague, the quantum information theorist Andrew Steane. But like many, he thinks differently now.
Some researchers believe that optical quantum computing, using photons as qubits, will also be an important platform. One advantage here is that photonic signals travelling to or from the processing units need no complex conversion to pass through existing telecommunications networks, which is also handy for photonic interconnections between chips. What’s more, photonic circuits can work at room temperature, whereas trapped ions and superconducting qubits need to be cooled. Photonic quantum computing is being developed by firms like PsiQuantum, Orca and Xanadu.
Other efforts, for example at Intel and Silicon Quantum Computing in Australia, make qubits from either quantum dots (Intel) or precision-placed phosphorus atoms (SQC), both in good old silicon, which benefits from a very mature manufacturing base. “Small qubits based on ions and atoms yield the highest quality processors”, says Michelle Simmons of the University of New South Wales, who is the founder and CEO of SQC. “But only atom-based systems in silicon combine this quality with manufacturability.”
Spinning around: Intel’s silicon spin qubits are now being manufactured on an industrial scale. (Courtesy: Intel Corporation)
And it’s not impossible that entirely new quantum computing platforms might yet arrive. At the start of 2025, researchers at Microsoft’s laboratories in Washington State caused a stir when they announced that they had made topological qubits from semiconducting and superconducting devices, which are less error-prone than those currently in use. The announcement left some scientists disgruntled because it was not accompanied by a peer-reviewed paper providing the evidence for these long-sought entities. But in any event, most researchers think it would take a decade or more for topological quantum computing to catch up with the platforms already out there.
Each of these quantum technologies has its own strengths and weaknesses. “My personal view is that there will not be a single architecture that ‘wins’, certainly not in the foreseeable future,” says Michael Cuthbert, founding director of the UK’s National Quantum Computing Centre (NQCC), which aims to facilitate the transition of quantum computing from basic research to an industrial concern. Cuthbert thinks the best platform will differ for different types of computation: cold neutral atoms might be good for quantum simulations of molecules, materials and exotic quantum states, say, while superconducting and trapped-ion qubits might be best for problems involving machine learning or optimization.
Measures and metrics
Given these pros and cons of different hardware platforms, one difficulty in assessing their merits is finding meaningful metrics for making comparisons. Should we be comparing error rates, coherence times (basically how long qubits remain entangled), gate speeds (how fast a single computational step can be conducted), circuit depth (how many steps a single computation can sustain), number of qubits in a processor, or what? “The metrics and measures that have been put forward so far tend to suit one or other platform more than others,” says Cuthbert, “such that it becomes almost a marketing exercise rather than a scientific benchmarking exercise as to which quantum computer is better.”
The NQCC evaluates the performance of devices using a factor known as the “quantum operation” (QuOp). This is simply the number of quantum operations that can be carried out in a single computation, before the qubits lose their coherence and the computation dissolves into noise. “If you want to run a computation, the number of coherent operations you can run consecutively is an objective measure,” Cuthbert says. If we want to get beyond the NISQ era, he adds, “we need to progress to the point where we can do about a million coherent operations in a single computation. We’re now at the level of maybe a few thousand. So we’ve got a long way to go before we can run large-scale computations.”
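As a rough illustration of that idea – our own simplified proxy with made-up round numbers, not the NQCC’s formal definition – a device’s operation budget is capped both by how many gates fit inside the qubits’ coherence time and by how many gates can run before errors accumulate to order one.

```python
# Crude proxy for a "coherent operation budget" (illustrative only; not the
# NQCC's QuOp metric). The parameters below are hypothetical round numbers.

def rough_operation_budget(coherence_time_s: float,
                           gate_time_s: float,
                           gate_error: float) -> float:
    by_coherence = coherence_time_s / gate_time_s  # gates that fit before decoherence
    by_errors = 1.0 / gate_error                   # gates before ~1 expected error
    return min(by_coherence, by_errors)

# Two imaginary platforms: fast gates with short coherence vs slow gates with long coherence.
print(rough_operation_budget(coherence_time_s=100e-6, gate_time_s=50e-9, gate_error=1e-3))
print(rough_operation_budget(coherence_time_s=1.0,    gate_time_s=10e-6, gate_error=1e-3))
```

On numbers like these, both imaginary machines top out at around a thousand operations – consistent with Cuthbert’s “few thousand” today, and a long way from the million he says is needed.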
One important issue is how amenable the platforms are to making larger quantum circuits. Cuthbert contrasts the issue of scaling up – putting more qubits on a chip – with “scaling out”, whereby chips of a given size are linked in modular fashion. Many researchers think it unlikely that individual quantum chips will have millions of qubits like the silicon chips of today’s machines. Rather, they will be modular arrays of relatively small chips linked at their edges by quantum interconnects.
Having made the Condor, IBM now plans to focus on modular architectures (scaling out) – a necessity anyway, since superconducting qubits are micron-sized, so a chip with millions of them would be “bigger than your dining room table”, says Cuthbert. But superconducting qubits are not easy to scale out, because the microwave signals that control and read out the qubits have to be converted into optical signals for photonic interconnects. Cold atoms are easier to scale up, as the qubits are small, while photonic quantum computing is easiest to scale out because it already speaks the same language as the interconnects.
To build so-called “fault-tolerant” quantum computers, quantum platforms must solve the issue of error correction, which will enable more extensive computations without the results degrading into mere noise.
In part two of this feature, we will explore how this is being achieved and meet the various firms developing quantum software. We will also look into the potential high-value commercial uses for robust quantum computers – once such devices exist.
Imagine two particles so interconnected that measuring one immediately reveals information about the other, even if the particles are light-years apart. This phenomenon, known as quantum entanglement, is the foundation of a variety of technologies such as quantum cryptography and quantum computing. However, entangled states are notoriously difficult to control. Now, for the first time, a team of physicists in Japan has performed a collective quantum measurement on a W state comprising three entangled photons. This allowed them to analyse the three entangled photons at once rather than one at a time. This achievement, reported in Science Advances, marks a significant step towards the practical development of quantum technologies.
Physicists usually measure entangled particles using a technique known as quantum tomography. In this method, many identical copies of a particle are prepared, and each copy is measured at a different angle. The results of these measurements are then combined to reconstruct the particle’s full quantum state. To visualize this, imagine being asked to take a family photo. Instead of taking one group picture, you have to photograph each family member individually and then combine all the photos into a single portrait. Now imagine taking the photo properly: a single photograph of the entire family. This is essentially what happens in an entangled measurement, in which all the particles are measured simultaneously rather than separately. This approach allows for significantly faster and more efficient measurements.
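For readers who want to see the “one family member at a time” idea in symbols, here is a minimal single-qubit tomography sketch (a toy example of our own, using exact expectation values; a real experiment would estimate them from many repeated measurements on identical copies).

```python
import numpy as np

# Toy single-qubit tomography: reconstruct the density matrix from the Pauli
# expectation values, rho = (I + <X>X + <Y>Y + <Z>Z) / 2.

I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

psi = np.array([1, 1], dtype=complex) / np.sqrt(2)   # the |+> state as a test case
rho_true = np.outer(psi, psi.conj())

def expect(op):
    return np.real(np.trace(rho_true @ op))

rho_reconstructed = (I + expect(X) * X + expect(Y) * Y + expect(Z) * Z) / 2
print(np.allclose(rho_true, rho_reconstructed))      # True
```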
So far, for three-particle systems, entangled measurements have only been performed on Greenberger–Horne–Zeilinger (GHZ) states, where the qubits (the quantum bits of the system) are found either all in one state or all in the other. Until now, no one had carried out an entangled measurement for a more complicated set of states known as W states, which do not share this all-or-nothing property. In their experiment, the researchers at Kyoto University and Hiroshima University specifically used the simplest type of W state, made up of three photons, where each photon’s polarization (horizontal or vertical) is represented by one qubit.
“In a GHZ state, if you measure one qubit, the whole superposition collapses. But in a W state, even if you measure one particle, entanglement still remains,” explains Shigeki Takeuchi, corresponding author of the paper describing the study. This robustness makes the W state particularly appealing for quantum technologies.
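That robustness can be checked in a few lines of linear algebra. The sketch below – our own illustration, not the team’s calculation – builds the three-qubit GHZ and W states, discards one qubit and tests whether the remaining pair is still entangled, using the partial-transpose criterion (which is conclusive for two qubits).

```python
import numpy as np

# Compare how GHZ and W states cope with losing one qubit.

def ket(bits: str) -> np.ndarray:
    v = np.zeros(2 ** len(bits), dtype=complex)
    v[int(bits, 2)] = 1.0
    return v

ghz = (ket("000") + ket("111")) / np.sqrt(2)
w   = (ket("001") + ket("010") + ket("100")) / np.sqrt(3)

def trace_out_first_qubit(psi: np.ndarray) -> np.ndarray:
    rho = np.outer(psi, psi.conj()).reshape(2, 4, 2, 4)
    return np.einsum("iaib->ab", rho)        # 4x4 state of the remaining two qubits

def is_entangled_pair(rho: np.ndarray) -> bool:
    # Partial transpose on the second qubit; a negative eigenvalue means entangled.
    pt = rho.reshape(2, 2, 2, 2).transpose(0, 3, 2, 1).reshape(4, 4)
    return bool(np.min(np.linalg.eigvalsh(pt)) < -1e-12)

print(is_entangled_pair(trace_out_first_qubit(ghz)))  # False: entanglement destroyed
print(is_entangled_pair(trace_out_first_qubit(w)))    # True: entanglement survives
```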
Fourier transformations
The team took advantage of the fact that different W states look almost identical but differ by a tiny phase shift, which acts as a hidden label distinguishing one state from another. Using a tool called a discrete Fourier transform (DFT) circuit, the researchers were able to “decode” this phase and tell the states apart.
The DFT exploits a special type of symmetry inherent to W states. Since the method relies on symmetry, in principle it can be extended to systems containing any number of photons. The researchers prepared photons in controlled polarization states and ran them through the DFT, which provided each state’s phase label. Afterwards, the photons were sent through polarizing beam splitters that separated them into vertically and horizontally polarized groups. By counting both sets of photons, and combining this with information from the DFT, the team could identify the W state.
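The following toy calculation – a deliberate simplification of the optics, not a model of the experiment itself – shows the essential trick: describe each phase-labelled W state by the amplitudes it assigns to “which photon carries the flipped polarization”, and a DFT routes each label to a different output port.

```python
import numpy as np

# Three W-like states differ only by phases w**k on the "which photon is
# flipped" amplitudes, with w = exp(2*pi*i/3). A discrete Fourier transform
# maps the phase label k onto a single output port.

N = 3
w = np.exp(2j * np.pi / N)
dft = np.array([[w ** (j * k) for k in range(N)] for j in range(N)]) / np.sqrt(N)

for k in range(N):                                    # the three phase-labelled states
    amplitudes = np.array([w ** (k * m) for m in range(N)]) / np.sqrt(N)
    probs = np.abs(dft.conj() @ amplitudes) ** 2      # outcome probabilities per port
    print(k, np.round(probs, 3))                      # all the weight lands in port k
```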
The experiment identified the correct W state about 87% of the time, well above the 15% success rate typically achieved using tomography-based measurements. Maintaining this level of performance was a challenge, as tiny fluctuations in optical paths or photon loss can easily destroy the fragile interference pattern. The fact that the team could maintain stable performance long enough to collect statistically reliable data marks an important technical milestone.
Scalable to larger systems
“Our device is not just a single-shot measurement: it works with 100% efficiency,” Takeuchi adds. “Most linear optical protocols are probabilistic, but here the success probability is unity.” Although demonstrated with three photons, this procedure is directly scalable to larger systems, as the key insight is the symmetry that the DFT can detect.
“In terms of applications, quantum communication seems the most promising,” says Takeuchi. “Because our device is highly efficient, our protocol could be used for robust communication between quantum computer chips. The next step is to build all of this on a tiny photonic chip, which would reduce errors and photon loss and help make this technology practical for real quantum computers and communication networks.”
Physicists at the University of Tokyo in Japan have performed quantum mechanical squeezing on a nanoparticle for the first time. The feat, which they achieved by levitating the particle and rapidly varying the frequency at which it oscillates, could allow us to better understand how very small particles transition between classical and quantum behaviours. It could also lead to improvements in quantum sensors.
Oscillating objects that are smaller than a few microns in diameter have applications in many areas of quantum technology. These include optical clocks and superconducting devices as well as quantum sensors. Such objects are small enough to be affected by Heisenberg’s uncertainty principle, which places a limit on how precisely we can simultaneously measure the position and momentum of a quantum object. More specifically, the product of the measurement uncertainties in the position and momentum of such an object must be greater than or equal to ħ/2, where ħ is the reduced Planck constant.
In these circumstances, the only way to decrease the uncertainty in one variable – for example, the position – is to boost the uncertainty in the other. This process has no classical equivalent and is called squeezing because reducing uncertainty along one axis of position-momentum space creates a “bulge” in the other, like squeezing a balloon.
A charge-neutral nanoparticle levitated in an optical lattice
In the new work, which is detailed in Science, a team led by Kiyotaka Aikawa studied a single, charge-neutral nanoparticle levitating in a periodic intensity pattern formed by the interference of criss-crossed laser beams. Such patterns are known as optical lattices, and they are ideal for testing the quantum mechanical behaviour of small-scale objects because they can levitate the object. This keeps it isolated from other particles and allows it to sustain its fragile quantum state.
After levitating the particle and cooling it to its motional ground state, the team rapidly varied the intensity of the lattice laser. This had the effect of changing the particle’s oscillation frequency, which in turn changed the uncertainty in its momentum. To measure this change (and prove they had demonstrated quantum squeezing), the researchers then released the nanoparticle from the trap and let it propagate for a short time before measuring its velocity. By repeating these time-of-flight measurements many times, they were able to obtain the particle’s velocity distribution.
The telltale sign of quantum squeezing, the physicists say, is that the velocity distribution they measured for the nanoparticle was narrower than the uncertainty in velocity for the nanoparticle at its lowest energy level. Indeed, the measured velocity variance was smaller than that of the ground state by 4.9 dB, which they say is comparable to the largest mechanical quantum squeezing obtained thus far.
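To put that figure in everyday terms (a quick arithmetic check of our own): a reduction of 4.9 dB means the measured variance is 10^(−4.9/10) ≈ 0.32 times the zero-point value, i.e. roughly a third of it.

```python
# Convert the quoted 4.9 dB of squeezing into a variance ratio.
squeezing_db = 4.9
ratio = 10 ** (-squeezing_db / 10)
print(f"{ratio:.2f}")   # ~0.32
```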
“Our system will enable us to realize further exotic quantum states of motions and to elucidate how quantum mechanics should behave at macroscopic scales and become classical,” Aikawa tells Physics World. “This could allow us to develop new kinds of quantum devices in the future.”