This episode of the Physics World Weekly podcast features George Efstathiou and Richard Bond, who share the 2025 Shaw Prize in Astronomy, “for their pioneering research in cosmology, in particular for their studies of fluctuations in the cosmic microwave background (CMB). Their predictions have been verified by an armada of ground-, balloon- and space-based instruments, leading to precise determinations of the age, geometry, and mass-energy content of the universe.”
Bond and Efstathiou talk about how the CMB emerged when the universe was just 380,000 years old and explain how the CMB is observed today. They explain why studying fluctuations in today’s CMB provides a window into the nature of the universe as it existed long ago, and how future studies could help physicists understand the nature of dark matter – which is one of the greatest mysteries in physics.
Efstathiou is emeritus professor of astrophysics at the University of Cambridge in the UK – and Richard Bond is a professor at the Canadian Institute for Theoretical Astrophysics (CITA) and university professor at the University of Toronto in Canada. Bond and Efstathiou share the 2025 Shaw Prize in Astronomy and its $1.2m prize money equally.
This podcast is sponsored by The Shaw Prize Foundation.
The SQMS approach involves placing a superconducting qubit chip (held at temperatures as low as 10–20 mK) inside a three-dimensional superconducting radiofrequency (3D SRF) cavity – a workhorse technology for particle accelerators employed in high-energy physics (HEP), nuclear physics and materials science. In this set-up, it becomes possible to preserve and manipulate quantum states by encoding them in microwave photons (modes) stored within the SRF cavity (which is also cooled to the millikelvin regime).
Put another way: by pairing superconducting circuits with SRF cavities at cryogenic temperatures, SQMS researchers create environments in which microwave photons have long lifetimes and are protected from external perturbations – conditions that, in turn, make it possible to generate, manipulate and read out quantum states. The endgame is clear: the reproducible, scalable realization of such highly coherent superconducting qubits opens the way to more complex quantum computing operations – capabilities that, over time, will be used within Fermilab’s core research programme in particle physics and fundamental physics more generally.
Fermilab is in a unique position to turn this quantum technology vision into reality, given its decades of expertise in developing high-coherence SRF cavities. In 2020, for example, Fermilab researchers demonstrated record coherence lifetimes (of up to two seconds) for quantum states stored in an SRF cavity.
“It’s no accident that Fermilab is a pioneer of SRF cavity technology for accelerator science,” explains Sir Peter Knight, senior research investigator in physics at Imperial College London and an SQMS advisory board member. “The laboratory is home to a world-leading team of RF engineers whose niobium superconducting cavities routinely achieve very high quality factors (Q) of 10¹⁰ to above 10¹¹ – figures of merit that can lead to dramatic increases in coherence time.”
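The link between quality factor and coherence can be checked with back-of-the-envelope arithmetic: a resonator's photon lifetime scales as τ = Q/ω = Q/(2πf). A minimal sketch, assuming a 1.3 GHz TESLA-style cavity frequency (an assumption for illustration; the article does not quote a frequency):

```python
import math

def photon_lifetime(Q, f_hz):
    """Cavity photon lifetime tau = Q / omega for a resonator
    with quality factor Q at resonant frequency f_hz (in Hz)."""
    return Q / (2 * math.pi * f_hz)

f = 1.3e9  # Hz, assumed TESLA-style SRF cavity frequency
for Q in (1e10, 1e11):
    print(f"Q = {Q:.0e}: tau ≈ {photon_lifetime(Q, f):.2f} s")
```

At Q = 10¹⁰ this gives a lifetime of order a second, consistent with the two-second coherence times quoted above; at Q = 10¹¹ the lifetime stretches to roughly ten seconds.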
Moreover, Fermilab offers plenty of intriguing HEP use-cases where quantum computing platforms could yield significant research dividends. In theoretical studies, for example, the main opportunities relate to the evolution of quantum states, lattice-gauge theory, neutrino oscillations and quantum field theories in general. On the experimental side, quantum computing efforts are being lined up for jet and track reconstruction during high-energy particle collisions, as well as for the extraction of rare signals and for exploring exotic physics beyond the Standard Model.
Collaborate to accumulate SQMS associate scientists Yao Lu (left) and Tanay Roy (right) worked with PhD student Taeyoon Kim (centre) to develop a two-qudit superconducting QPU with a record coherence lifetime (>20 ms). (Courtesy: Hannah Brumbaugh, Fermilab)
Key to success here is an extensive collaborative effort in materials science and the development of novel chip fabrication processes, with the resulting transmon qubit ancillas shaping up as the “nerve centre” of the 3D SRF cavity-based quantum computing platform championed by SQMS. What’s in the works is essentially a unique quantum analogue of a classical computing architecture: the transmon chip providing a central logic-capable quantum information processor and microwave photons (modes) in the 3D SRF cavity acting as the random-access quantum memory.
As for the underlying physics, the coupling between the transmon qubit and discrete photon modes in the SRF cavity allows for the exchange of coherent quantum information, as well as enabling quantum entanglement between the two. “The pay-off is scalability,” says Alexander Romanenko, a senior scientist at Fermilab who leads the SQMS quantum technology thrust. “A single logic-capable processor qubit, such as the transmon, can couple to many cavity modes acting as memory qubits.”
In principle, a single transmon chip could manipulate more than 10 qubits encoded inside a single-cell SRF cavity, substantially streamlining the number of microwave channels required for system control and manipulation as the number of qubits increases. “What’s more,” adds Romanenko, “instead of using quantum states in the transmon [coherence times just crossed into milliseconds], we can use quantum states in the SRF cavities, which have higher quality factors and longer coherence times [up to two seconds].”
In terms of next steps, continuous improvement of the ancilla transmon coherence times will be critical to ensure high-fidelity operation of the combined system – with materials breakthroughs likely to be a key rate-determining step. “One of the unique differentiators of the SQMS programme is this ‘all-in’ effort to understand and get to grips with the fundamental materials properties that lead to losses and noise in superconducting qubits,” notes Knight. “There are no short-cuts: wide-ranging experimental and theoretical investigations of materials physics – per the programme implemented by SQMS – are mandatory for scaling superconducting qubits into industrial and scientifically useful quantum computing architectures.”
Laying down a marker, SQMS researchers recently achieved a major milestone in superconducting quantum technology by developing the longest-lived multimode superconducting quantum processing unit (QPU) ever built (coherence lifetime >20 ms). Their processor is based on a two-cell SRF cavity and leverages its exceptionally high quality factor (~10¹⁰) to preserve quantum information far longer than conventional superconducting platforms (typically 1–2 ms for rival best-in-class implementations).
Coupled with a superconducting transmon, the two-cell SRF module enables precise manipulation of cavity quantum states (photons) using ultrafast control/readout schemes (allowing for approximately 10⁴ high-fidelity operations within the qubit lifetime). “This represents a significant achievement for SQMS,” claims Yao Lu, an associate scientist at Fermilab and co-lead for QPU connectivity and transduction in SQMS. “We have demonstrated the creation of high-fidelity [>95%] quantum states with large photon numbers [20 photons] and achieved ultra-high-fidelity single-photon entangling operations between modes [>99.9%]. It’s work that will ultimately pave the way to scalable, error-resilient quantum computing.”
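The quoted operation count follows from simple arithmetic: the number of sequential gates that fit in a coherence window is the lifetime divided by the per-operation time. A hedged sketch, assuming an illustrative 2 µs gate time (the article does not quote a gate duration):

```python
t_coherence = 20e-3   # s, demonstrated multimode QPU coherence lifetime
t_gate = 2e-6         # s, assumed per-operation time (illustrative only)

# Number of sequential operations that fit within the coherence window
n_ops = int(t_coherence / t_gate)
print(n_ops)  # → 10000, i.e. the ~10^4 operations quoted above
```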
Scalable thinking The SQMS multiqudit QPU prototype (above) exploits 3D SRF cavities held at millikelvin temperatures. (Courtesy: Ryan Postel, Fermilab)
Fast scaling with qudits
There’s no shortage of momentum either, with these latest breakthroughs laying the foundations for SQMS “qudit-based” quantum computing and communication architectures. A qudit is a multilevel quantum unit that can occupy more than two states and can therefore hold a greater information density – i.e. instead of working with a large number of qubits to scale information-processing capability, it may be more efficient to maintain a smaller number of qudits (each holding a greater range of values for optimized computations).
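The information-density argument can be made concrete: a d-level qudit spans the same Hilbert-space dimension as log₂(d) qubits, so matching n qubits requires ⌈n/log₂(d)⌉ qudits. A minimal sketch:

```python
import math

def qudits_needed(n_qubits, d):
    """Number of d-level qudits whose Hilbert space matches that of
    n two-level qubits: d**k >= 2**n  =>  k = ceil(n / log2(d))."""
    return math.ceil(n_qubits / math.log2(d))

print(qudits_needed(20, 2))   # 20 ordinary qubits
print(qudits_needed(20, 16))  # only 5 qudits of dimension 16
```

The same Hilbert space covered by 20 qubits fits in five 16-level qudits, which is the scaling advantage the SQMS architecture targets.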
Scale-up to a multiqudit QPU system is already underway at SQMS via several parallel routes (and all with a modular computing architecture in mind). In one approach, coupler elements and low-loss interconnects integrate a nine-cell multimode SRF cavity (the memory) to a two-cell SRF cavity quantum processor. Another iteration uses only two-cell modules, while yet another option exploits custom-designed multimodal cavities (10+ modes) as building blocks.
One thing is clear: with the first QPU prototypes now being tested, verified and optimized, SQMS will soon move to a phase in which many of these modules are assembled and operated together. By extension, the SQMS effort also encompasses crucial developments in control systems and microwave equipment, where many devices must be synchronized optimally to encode and analyse quantum information in the QPUs.
Along a related coordinate, qudit-based encodings allow complex algorithms to benefit from fewer required gates and reduced circuit depth. What’s more, for many simulation problems in HEP and other fields, it’s evident that multilevel systems (qudits) – rather than qubits – provide a more natural representation of the physics in play, making simulation tasks significantly more accessible. The work of encoding several such problems into qudits – including lattice-gauge-theory calculations – is similarly ongoing within SQMS.
Taken together, this massive R&D undertaking – spanning quantum hardware and quantum algorithms – can only succeed with a “co-design” approach across strategy and implementation: from identifying applications of interest to the wider HEP community to full deployment of QPU prototypes. Co-design is especially suited to these efforts as it demands sustained alignment of scientific goals with technological implementation to drive innovation and societal impact.
In addition to their quantum computing promise, these cavity-based quantum systems will play a central role as both “adapters” and low-loss channels operating at elevated temperatures for interconnecting chip- or cavity-based QPUs hosted in different refrigerators. These interconnects will provide an essential building block for the efficient scale-up of superconducting quantum processors into larger quantum data centres.
Quantum insights Researchers in the control room of the SQMS Quantum Garage facility, developing architectures and gates for SQMS hardware tailored toward HEP quantum simulations. From left to right: Nick Bornman, Hank Lamm, Doga Kurkcuoglu, Silvia Zorzetti, Julian Delgado, Hans Johnson (Courtesy: Hannah Brumbaugh)
“The SQMS collaboration is ploughing its own furrow – in a way that nobody else in the quantum sector really is,” says Knight. “Crucially, the SQMS partners can build stuff at scale by tapping into the phenomenal engineering strengths of the National Laboratory system. Designing, commissioning and implementing big machines has been part of the ‘day job’ at Fermilab for decades. In contrast, many quantum computing start-ups must scale their R&D infrastructure and engineering capability from a far-less-developed baseline.”
The last word, however, goes to Romanenko. “Watch this space,” he concludes, “because SQMS is on a roll. We don’t know which quantum computing architecture will ultimately win out, but we will ensure that our cavity-based quantum systems will play an enabling role.”
Scaling up: from qubits to qudits
Left: conceptual illustration of the SQMS Center’s superconducting TESLA cavity coupled to a transmon ancilla qubit (AI-generated). Right: an ancilla qubit with two energy levels – ground ∣g⟩ and excited ∣e⟩ – is used to control a high-coherence (d+1)-dimensional qudit encoded in a cavity resonator. The ancilla enables state preparation, control and measurement of the qudit. (Courtesy: Fermilab)
By adapting mathematical techniques used in particle physics, researchers in Germany have developed an approach that could boost our understanding of the gravitational waves emitted when black holes collide. Led by Jan Plefka at Humboldt University of Berlin, the team’s results could prove vital to the success of future gravitational-wave detectors.
Nearly a decade on from the first direct observations of gravitational waves, physicists are hopeful that the next generation of ground- and space-based observatories will soon allow us to study these ripples in space–time with unprecedented precision. But to ensure the success of upcoming projects like the LISA space mission, the increased sensitivity offered by these detectors will need to be accompanied by a deeper theoretical understanding of how gravitational waves are generated through the merging of two black holes.
In particular, they will need to predict more accurately the physical properties of gravitational waves produced by any given colliding pair and account for factors including their respective masses and orbital velocities. For this to happen, physicists will need to develop more precise solutions to the relativistic two-body problem. This problem is a key application of the Einstein field equations, which relate the geometry of space–time to the distribution of matter within it.
No exact solution
“Unlike its Newtonian counterpart, which is solved by Kepler’s Laws, the relativistic two-body problem cannot be solved exactly,” Plefka explains. “There is an ongoing international effort to apply quantum field theory (QFT) – the mathematical language of particle physics – to describe the classical two-body problem.”
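For contrast, the exactly solvable Newtonian case reduces to closed-form results such as Kepler's third law, T = 2π√(a³/G(m₁+m₂)). A quick sketch using standard constants and illustrative Earth–Sun values:

```python
import math

G = 6.674e-11      # m^3 kg^-1 s^-2, gravitational constant
M_SUN = 1.989e30   # kg
M_EARTH = 5.97e24  # kg

def orbital_period(a_m, m1, m2):
    """Kepler's third law for the Newtonian two-body problem:
    T = 2*pi*sqrt(a^3 / (G*(m1 + m2)))."""
    return 2 * math.pi * math.sqrt(a_m**3 / (G * (m1 + m2)))

# Semi-major axis of 1 au around a solar-mass body: ~365 days
T = orbital_period(1.496e11, M_SUN, M_EARTH)
print(T / 86400)  # ≈ 365 days
```

No comparably closed-form solution exists once general relativity is included, which is why the QFT-inspired machinery described below is needed at all.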
In their study, Plefka’s team started from state-of-the-art techniques used in particle physics for modelling the scattering of colliding elementary particles, while accounting for their relativistic properties. When viewed from far away, each black hole can be approximated as a single point which, much like an elementary particle, carries a single mass, charge, and spin.
Taking advantage of this approximation, the researchers modified existing techniques in particle physics to create a framework called worldline quantum field theory (WQFT). “The advantage of WQFT is a clean separation between classical and quantum physics effects, allowing us to precisely target the classical physics effects relevant for the vast distances involved in astrophysical observables,” Plefka says.
Ordinarily, doing calculations with such an approach would involve solving millions of integrals that sum up every single contribution to the black hole pair’s properties across all possible ways that the interaction between them could occur. To simplify the problem, Plefka’s team used a new algorithm that identified relationships between the integrals. This reduced the problem to just 250 “master integrals”, making the calculation vastly more manageable.
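The reduction step can be illustrated in miniature: if the integrals satisfy known linear relations, Gaussian elimination over the relation matrix reveals how many independent "master" integrals remain. This toy sketch is not the team's actual algorithm (which handles millions of integration-by-parts-style identities); it only shows the underlying linear-algebra idea:

```python
from fractions import Fraction

def count_masters(relations, n_integrals):
    """Given linear relations sum_j c_ij * I_j = 0 among n_integrals
    integrals, row-reduce the coefficient matrix; the number of
    'master' integrals is n_integrals minus the rank."""
    rows = [[Fraction(c) for c in r] for r in relations]
    rank, col = 0, 0
    while rank < len(rows) and col < n_integrals:
        pivot = next((i for i in range(rank, len(rows)) if rows[i][col]), None)
        if pivot is None:
            col += 1
            continue
        rows[rank], rows[pivot] = rows[pivot], rows[rank]
        p = rows[rank][col]
        rows[rank] = [x / p for x in rows[rank]]
        for i in range(len(rows)):
            if i != rank and rows[i][col]:
                f = rows[i][col]
                rows[i] = [a - f * b for a, b in zip(rows[i], rows[rank])]
        rank += 1
        col += 1
    return n_integrals - rank

# Toy example: 4 integrals related by I1 = I2 and I2 = I3 -> 2 masters
rels = [[1, -1, 0, 0], [0, 1, -1, 0]]
print(count_masters(rels, 4))  # → 2
```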
With these master integrals, the team could finally produce expressions for three key physical properties of black hole binaries within WQFT. These include the changes in momentum during the gravity-mediated scattering of two black holes, and the total energy radiated by both bodies over the course of the scattering.
Genuine physical process
Altogether, the team’s WQFT framework produced the most accurate solution to the Einstein field equations to date. “In particular, the radiated energy we found contains a new class of mathematical functions known as ‘Calabi–Yau periods’,” Plefka explains. “While these functions are well-known in algebraic geometry and string theory, this marks the first time they have been shown to describe a genuine physical process.”
With its unprecedented insights into the structure of the relativistic two-body problem, the team’s approach could now be used to build more precise models of gravitational-wave formation, which could prove invaluable for the next generation of gravitational-wave detectors.
More broadly, however, Plefka predicts that the appearance of Calabi–Yau periods in their calculations could lead to an entirely new class of mathematical functions applicable to many areas beyond gravitational waves.
“We expect these periods to show up in other branches of physics, including collider physics, and the mathematical techniques we employed to calculate the relevant integrals will no doubt also apply there,” he says.
CP Snow’s classic The Two Cultures lecture, published in book form in 1959, is the usual go-to reference when exploring the divide between the sciences and humanities. It is a culture war that was raging long before the term became social-media shorthand for today’s tribal battles over identity, values and truth.
While Snow eloquently lamented the lack of mutual understanding between scientific and literary elites, the 21st-century version of the two-cultures debate often plays out with a little less decorum and a lot more profanity. Hip hop duo Insane Clown Posse certainly didn’t hold back in their widely memed 2010 track “Miracles”, which included the lyric “And I don’t wanna talk to a scientist / Y’all motherfuckers lying and getting me pissed”. An extreme example to be sure, but it hammers home the point: Snow’s two-culture concerns continue to resonate strongly almost 70 years after his influential lecture and writings.
I’ve also contributed, in a nanoscopically small way, to this music-meets-science corpus with an analysis of the deep and fundamental links between quantum physics and heavy metal (When The Uncertainty Principle Goes To 11), and have a long-standing interest in music composed from maths and physics principles and constants (see my Lateral Thoughts articles from September 2023 and July 2024). Darling’s book, therefore, struck a chord with me.
Darling is not only a talented science writer with an expansive back-catalogue to his name but also an accomplished musician (check out his album Songs Of The Cosmos), and his enthusiasm for all things musical spills off the page. Furthermore, he is a physicist, with a PhD in astronomy from the University of Manchester. So if there’s a writer who can genuinely and credibly inhabit both sides of the arts–science cultural divide, it’s Darling.
But is A Perfect Harmony in tune with the rest of the literary ensemble, or marching to a different beat? In other words, is this a fresh new take on the music-meets-maths (meets pop sci) genre or, like too many bands I won’t mention, does it sound suspiciously like something you’ve heard many times before? Well, much like an old-school vinyl album, Darling’s work has the feel of two distinct sides. (And I’ll try to make that my final spin on groan-worthy musical metaphors. Promise.)
Not quite perfect pitch
Although the subtitle for A Perfect Harmony is “Music, Mathematics and Science”, the first half of the book is more of a history of the development and evolution of music and musical instruments in various cultures, rather than a new exploration of the underpinning mathematical and scientific principles. Engaging and entertaining though this is – and all credit to Darling for working in a reference to Van Halen in the opening lines of chapter 1 – it’s well-worn ground: Pythagorean tuning, the circle of fifths, equal temperament, Music of the Spheres (not the Coldplay album, mercifully), resonance, harmonics, etc. I found myself wishing, at times, for a take that felt a little more off the beaten track.
One case in point is Darling’s brief discussion of the theremin. If anything earns the title of “The Physicist’s Instrument”, it’s the theremin – a remarkable device that exploits the innate electrical capacitance of the human body to load a resonant circuit, producing an ethereal, haunting tone whose pitch can be varied without any physical contact.
While I give kudos to Darling for highlighting the theremin, the brevity of the description is arguably a lost opportunity when put in the broader context of the book’s aim to explain the deeper connections between music, maths and science. This could have been a novel and fascinating take on the links between electrical and musical resonance that went well beyond the familiar territory mapped out in standard physics-of-music texts.
As the book progresses, however, Darling moves into more distinctive territory, choosing a variety of inventive examples that are often fascinating and never short of thought-provoking. I particularly enjoyed his description of orbital resonance in the system of seven planets orbiting the red dwarf TRAPPIST-1, 41 light-years from Earth. The orbital periods form ratios that, when mapped to musical intervals, correspond to a minor sixth, a major sixth, two perfect fifths, a perfect fourth and another perfect fifth. And it’s got to be said that using the music of the eclectic Australian band King Gizzard and the Lizard Wizard to explain microtonality is nothing short of inspired.
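Those interval mappings can be reproduced by snapping each adjacent pair's orbital-period ratio to the nearest just-intonation interval. A sketch, using approximate published TRAPPIST-1 periods (the small interval table is a simplification for illustration):

```python
# Approximate orbital periods (days) of TRAPPIST-1 planets b through h
periods = [1.511, 2.422, 4.049, 6.099, 9.206, 12.353, 18.767]

# Just-intonation frequency ratios for the intervals mentioned above
intervals = {3 / 2: "perfect fifth", 4 / 3: "perfect fourth",
             5 / 3: "major sixth", 8 / 5: "minor sixth"}

for p_in, p_out in zip(periods, periods[1:]):
    ratio = p_out / p_in
    key = min(intervals, key=lambda r: abs(r - ratio))  # nearest interval
    print(f"{ratio:.3f} ≈ {intervals[key]}")
```

Running this recovers exactly the sequence quoted in the review: minor sixth, major sixth, two perfect fifths, a perfect fourth and another perfect fifth.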
A Perfect Harmony doesn’t entirely close the cultural gap highlighted by Snow all those years ago, but it does hum along pleasantly in the space between. Though the subject matter occasionally echoes well-trodden themes, Darling’s perspective and enthusiasm lend it freshness. There’s plenty here to enjoy, especially for physicists inclined to tune into the harmonies of the universe.