Physicists searching for signs of quantum gravity have long faced a frustrating problem. Even if gravity does have a quantum nature, its effects are expected to show up only at extremely small distances, far beyond the reach of experiments. A new theoretical study by Benjamin Koch and colleagues at the Technical University of Vienna in Austria suggests a different strategy. Instead of looking for quantum gravity where space–time is tiny, the researchers argue that subtle quantum effects could influence how particles and light move across huge cosmic distances.
Their work introduces a new concept called q-desics: quantum-corrected paths through space–time. These paths generalize the familiar trajectories predicted by Einstein’s general theory of relativity and could, in principle, leave observable fingerprints in cosmology and astrophysics.
General relativity and quantum mechanics are two of the most successful theories in physics, yet they describe nature in radically different ways. General relativity treats gravity as the smooth curvature of space–time, while quantum mechanics governs the probabilistic behavior of particles and fields. Reconciling the two has been one of the central challenges of theoretical physics for decades.
“One side of the problem is that one has to come up with a mathematical framework that unifies quantum mechanics and general relativity in a single consistent theory,” Koch explains. “Over many decades, numerous attempts have been made by some of the most brilliant minds humanity has to offer.” Despite this effort, no approach has yet gained universal acceptance.
Deeper difficulty
There is another, perhaps deeper difficulty. “We have little to no guidance, neither from experiments nor from observations that could tell us whether we actually are heading in the right direction or not,” Koch says. Without experimental clues, many ideas about quantum gravity remain largely speculative.
That does not mean the quest lacks value. Fundamental research often pays off in unexpected ways. “We rarely know what to expect behind the next tree in the jungle of knowledge,” Koch says. “We only can look back and realize that some of the previously explored trees provided treasures of great use and others just helped us to understand things a little better.”
Almost every test of general relativity relies on a simple assumption. Light rays and freely falling particles follow specific paths, known as geodesics, determined entirely by the geometry of space–time. From gravitational lensing to planetary motion, this idea underpins how physicists interpret astronomical data.
Koch and his collaborators asked what happens to this assumption when space–time itself is treated as a quantum object. “Almost all interpretations of observational astrophysical and astronomical data rest on the assumption that in empty space light and particles travel on a path which is described by the geodesic equation,” Koch says. “We have shown that in the context of quantum gravity this equation has to be generalized.”
Generalized q-desic
The result is the q-desic equation. Instead of relying only on an averaged, classical picture of space–time, q-desics account for the underlying quantum structure more directly. In practical terms, this means that particles may follow paths that deviate slightly from those predicted by classical general relativity, even when space–time looks smooth on average.
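For context, the classical trajectories being generalized here are solutions of the geodesic equation of general relativity, which in standard notation reads

\[ \frac{\mathrm{d}^{2}x^{\mu}}{\mathrm{d}\tau^{2}} + \Gamma^{\mu}_{\;\alpha\beta}\,\frac{\mathrm{d}x^{\alpha}}{\mathrm{d}\tau}\,\frac{\mathrm{d}x^{\beta}}{\mathrm{d}\tau} = 0 \]

where x^μ(τ) is the path through space–time, τ is the proper time along it (or an affine parameter, for light) and the Christoffel symbols Γ^μ_αβ encode the curvature of space–time. The q-desic equation supplements this expression with quantum-correction terms, producing the small deviations described above; the exact form of those terms depends on the model of quantum space–time and is not reproduced here.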
Crucially, the team found that these deviations are not confined to tiny distances. “What makes our first results on the q-desics so interesting is that apart from these short distance effects, there are also long range effects possible, if one takes into account the existence of the cosmological constant,” Koch says.
This opens the door to possible tests using existing astronomical data. According to the study, q-desics could differ from ordinary geodesics over cosmological distances, affecting how matter and light propagate across the universe.
“The q-desics might be distinguished from geodesics at cosmologically large distances,” Koch says, “which would be an observable manifestation of quantum gravity effects.”
Cosmological tensions
The researchers propose revisiting cosmological observations. “Currently, there are many tensions popping up between the Standard Model of cosmology and observed data,” Koch notes. “All these tensions are linked, one way or another, to the use of geodesics at vastly different distance scales.” The q-desic framework offers a new lens through which to examine such discrepancies.
So far, the team has explored simplified scenarios and idealized models of quantum space–time. Extending the framework to more realistic situations will require substantial effort.
“The initial work was done with one PhD student (Ali Riahina) and one colleague (Ángel Rincón),” Koch says. “There are so many things to be revisited and explored that our to-do list is growing far too long for just a few people.” One immediate goal is to encourage other researchers to engage with the idea and test it in different theoretical settings.
Whether q-desics will provide an observational window into quantum gravity remains to be seen. But by shifting attention from the smallest scales to the largest structures in the cosmos, the work offers a fresh perspective on an enduring problem.
Academics who switch to hybrid working and remote collaboration do less impactful research. That’s according to an analysis of how scientists’ collaboration networks and academic outputs evolved before, during and after the COVID-19 pandemic (arXiv: 2511.18481). It involved studying author data from the arXiv preprint repository and the online bibliographic catalogue OpenAlex.
To explore the geographic spread of collaboration networks, Sara Venturini from the Massachusetts Institute of Technology and colleagues looked at the average distance between the institutions of co-authors. They found that while the average distance between team members on publications increased from 2000 to 2021, there was a particularly sharp rise after 2022.
This pattern, the researchers claim, suggests that the pandemic led to scientists collaborating more often with geographically distant colleagues. They found consistent patterns when they separated papers related to COVID-19 from those in unrelated areas, suggesting the trend was not solely driven by research on COVID-19.
The researchers also examined how the number of citations a paper received within a year of publication changed with distance between the co-authors’ institutions. In general, as the average distance between collaborators increases, citations fall, the authors found.
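Neither the paper’s code nor its full methodology is reproduced here, but the distance metric at the heart of these findings is simple to state. The short Python sketch below shows one way an average co-author separation could be computed, assuming each institution comes with latitude–longitude coordinates; the function names and example coordinates are illustrative rather than taken from the study.

import math
from itertools import combinations

def haversine_km(p, q):
    # Great-circle distance in km between two (lat, lon) points given in degrees.
    earth_radius_km = 6371.0
    lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * earth_radius_km * math.asin(math.sqrt(a))

def mean_coauthor_distance(institution_coords):
    # Average pairwise distance between the institutions of a paper's co-authors.
    pairs = list(combinations(institution_coords, 2))
    if not pairs:
        return 0.0  # single-institution papers contribute zero separation
    return sum(haversine_km(p, q) for p, q in pairs) / len(pairs)

# Illustrative three-institution team: Boston, Zurich and Tokyo
print(mean_coauthor_distance([(42.36, -71.09), (47.38, 8.55), (35.69, 139.69)]))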
They suggest that remote and hybrid working hampers research quality by reducing spontaneous, serendipitous in-person interactions that can lead to deep discussions and idea exchange.
Despite what the authors say is a “concerning decline” in citation impact, there are, however, benefits to increasing remote interactions. In particular, as the geographic spread of collaboration networks increases, so too do international partnerships and authorship diversity.
Remote tools
Lingfei Wu, a computational social scientist at the University of Pittsburgh, who was not involved in the study, told Physics World that he was surprised by the finding that remote teams produce less impactful work.
“In our earlier research, we found that historically, remote collaborations tended to produce more impactful but less innovative work,” notes Wu. “For example, the Human Genome Project published in 2001 shows how large, geographically distributed teams can also deliver highly impactful science. One would expect the pandemic-era shift toward remote collaboration to increase impact rather than diminish it.”
Wu says his work suggests that remote work is effective for implementing ideas but less effective for generating them, indicating that scientists need a balance between remote and in-person interactions. “Use remote tools for efficient execution, but reserve in-person time for discussion, brainstorming, and informal exchange,” he adds.
More molecules and compounds vital to the origin of life have been detected in asteroid samples delivered to Earth by NASA’s OSIRIS-REx mission. The discovery strengthens the case that not only did life’s building blocks originate in space, but that the ingredients of RNA, and perhaps RNA itself, were brought to our planet by asteroids.
Two new papers in Nature Geoscience and Nature Astronomy describe the discovery of the sugars ribose and glucose in the 120 g of samples returned from the near-Earth asteroid 101955 Bennu, as well as an unusual carbonaceous “gum” that holds important compounds for life. The findings complement the earlier discovery of amino acids and the nucleobases of RNA and DNA in the Bennu samples.
A third new paper, in Nature Astronomy, addresses the abundance of pre-solar grains – dust that originated before the birth of our Solar System, such as dust from supernovae. Scientists led by Ann Nguyen of NASA’s Johnson Space Center found six times more dust directly from supernova explosions than is found, on average, in meteorites and other sampled asteroids. This could suggest differences in the concentration of different pre-solar dust grains in the disc of gas and dust that formed the Solar System.
Space gum
It’s the discovery of organic materials useful for life that steals the headlines, though. For example, the discovery of the space gum, which is essentially a hodgepodge chain of polymers, represents something never found in space before.
Scott Sandford of NASA’s Ames Research Center, co-lead author of the Nature Astronomy paper describing the gum discovery, tells Physics World: “The material we see in our samples is a bit of a molecular jumble. It’s carbonaceous, but much richer in nitrogen and, to a lesser extent, oxygen, than most of the organic compounds found in extraterrestrial materials.”
Sandford refers to the material as gum because of its pliability, bending and dimpling when pressure is applied, rather like chewing gum. And while much of its chemical functionality is replicated in similar materials on our planet, “I doubt it matches exactly with anything seen on Earth,” he says.
Initially, Sandford found the gum using an infrared microscope, nicknaming the dust grains containing the gum “Lasagna” and “Neapolitan” because the grains are layered. To extract them from the rock in the sample, Sandford went to Zack Gainsforth of the University of California, Berkeley, who specializes in analysing and extracting materials from samples like this.
Platinum scaffolding
Having welded a tungsten needle to the Neapolitan sample in order to lift it, the pair quickly realised that the grain was very delicate.
“When we tried to lift the sample it began to deform,” Gainsforth says. “Scott and I practically jumped out of our chairs and brainstormed what to do. After some discussion, we decided that we should add straps to give it enough mechanical rigidity to survive the lift.”
Fragile sample: A microscopic particle of asteroid Bennu is manipulated under a transmission electron microscope. To move the 30 µm fragment for further analysis, the researchers reinforced it with thin platinum strips (the L shape on the surface). (Courtesy: NASA/University of California, Berkeley)
By straps, Gainsforth is referring to micro-scale platinum scaffolding applied to the grain to reinforce its structure while they cut it away with an ion beam. Platinum is often used as a radiation shield to protect samples from an ion beam, “but how we used it was anything but standard,” says Gainsforth. “Scott and I made an on-the-fly decision to reinforce the samples based on how they were reacting to our machinations.”
With the sample extracted and reinforced, they used the ion beam cutter to shave it down until it was a thousand times thinner than a human hair, at which point it could be studied by electron microscopy and X-ray spectrometry. “It was a joy to watch Zack ‘micro-manipulate’ [the sample],” says Sandford.
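For scale, taking a typical human hair to be roughly 70 µm across (a representative value, not a figure from the paper), that final thickness works out at

\[ \frac{70\ \mu\mathrm{m}}{1000} = 70\ \mathrm{nm} \]

which is the kind of ultra-thin lamella that electron beams and X-rays can pass through for imaging and spectrometry.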
The nitrogen in the gum was found to be in nitrogen heterocycles, which are the building blocks of nucleobases in DNA and RNA. This brings us to the other new discovery, reported in Nature Geoscience, of the sugars ribose and glucose in the Bennu samples, by a team led by Yoshihiro Furukawa of Tohoku University in Japan.
The ingredients of RNA
Glucose is the primary source of energy for life, while ribose is a key component of the sugar-phosphate backbone that connects the information-carrying nucleobases in RNA molecules. Furthermore, the discovery of ribose now means that everything required to assemble RNA molecules is present in the Bennu sample.
Notable by its absence, however, was deoxyribose, which is ribose minus one oxygen atom. Deoxyribose in DNA performs the same job as ribose in RNA, and Furukawa believes that its absence supports a popular hypothesis about the origin of life on Earth called RNA world. This describes how the first life could have used RNA instead of DNA to carry genetic information, catalyse biochemical reactions and self-replicate.
Intriguingly, the presence of all RNA’s ingredients on Bennu raises the possibility that RNA could have formed in space before being brought to Earth.
“Formation of RNA from its building blocks requires a dehydration reaction, which we can expect to have occurred both in ancient Bennu and on primordial Earth,” Furukawa tells Physics World.
However, RNA itself would be very hard to detect because of its expected low abundance in the samples. So until there’s information to the contrary, “the present finding means that the ingredients of RNA were delivered from space to the Earth,” says Furukawa.
Nevertheless, these discoveries are major milestones in the quest of astrobiologists and space chemists to understand the origin of life on Earth. Thanks to Bennu and the asteroid 162173 Ryugu, from which a sample was returned by the Japanese Aerospace Exploration Agency (JAXA) mission Hayabusa2, scientists are increasingly confident that the building blocks of life on Earth came from space.
Using a novel spectroscopy technique, physicists in Japan have revealed how organic materials accumulate electrical charge through long-term illumination by sunlight – leading to material degradation. Ryota Kabe and colleagues at the Okinawa Institute of Science and Technology have shown how charge separation occurs gradually via a rare multi-photon ionization process, offering new insights into how plastics and organic semiconductors degrade in sunlight.
In a typical organic solar cell, an electron-donating material is interfaced with an electron acceptor. When the donor absorbs a photon, one of its electrons may jump across the interface, creating a bound electron-hole pair which may eventually dissociate – creating two free charges from which useful electrical work can be extracted.
Although such an interface vastly boosts the efficiency of this process, it is not necessary for charge separation to occur when an electron donor is illuminated. “Even single-component materials can generate tiny amounts of charge via multiphoton ionization,” Kabe explains. “However, experimental evidence has been scarce due to the extremely low probability of this process.”
To trigger charge separation in this way, an electron needs to absorb one or more additional photons while in its excited state. Since the vast majority of electrons fall back into their ground states before this can happen, the spectroscopic signature of this charge separation is very weak. This makes it incredibly difficult to detect using conventional spectroscopy techniques, which can generally only make observations over timescales of up to a few milliseconds.
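A rough scaling argument shows why, and why a long-lived intermediate state changes the odds. For a sequential two-step ionization, the rate is the product of the two absorption steps, weighted by how long the intermediate state survives – a schematic estimate, not the formalism used in the study:

\[ R_{1+1} \propto (\sigma_{1} I)\,\tau\,(\sigma_{2} I) \propto I^{2}\tau \]

where σ₁ and σ₂ are the absorption cross-sections of the two steps, I is the light intensity and τ is the lifetime of the intermediate excited state. At ordinary illumination intensities the I² dependence makes the rate vanishingly small, but a long-lived intermediate state (large τ) proportionally raises the chance that a second photon arrives before the excitation decays – the loophole the team exploits below.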
The opposite approach
“While weak multiphoton pathways are easily buried under much stronger excited-state signals, we took the opposite approach in our work,” Kabe describes. “We excited samples for long durations and searched for traces of accumulated charges in the slow emission decay.”
Key to this approach was an electron donor called NPD. This organic material has a relatively long triplet lifetime – a spin configuration in which an excited electron is prevented from quickly transitioning back to its ground state. As a result, these molecules emit phosphorescence over relatively long timescales.
In addition, Kabe’s team dispersed their NPD samples into different host materials with carefully selected energy levels. In one medium, the energies of both the highest-occupied and lowest-unoccupied molecular orbitals lay below NPD’s corresponding levels, so that the host material acted as an electron acceptor. As a result, charge transfer occurred in the same way as it would across a typical donor-acceptor interface.
Yet in another medium, the host’s lowest-unoccupied orbital lay above NPD’s – blocking charge transfer, and allowing triplet states to accumulate instead. In this case, the only way for charge separation to occur was through multi-photon ionization.
Slow emission decay analysis
Since NPD’s long triplet lifetime allowed its electrons to be excited gradually over an extended period of illumination, its weak charge accumulation became detectable through slow emission decay analysis. In contrast, more conventional methods involve multiple, ultra-fast laser pulses, severely restricting the timescale over which measurements can be made. Altogether, this approach enabled the team to clearly distinguish between the two charge generation pathways.
“Using this method, we confirmed that charge generation occurred via resonance-enhanced multiphoton ionization mediated by long-lived triplet states, even in single-component organic materials,” Kabe describes.
This result offers insights into how plastics and organic semiconductors are degraded by sunlight over years or decades. The conventional explanation is that sunlight generates free radicals: molecules that have lost an electron through ionization, leaving behind an unpaired electron that readily reacts with other molecules in the surrounding environment. Since photodegradation unfolds over such a long timescale, researchers could not observe this charge generation in single-component organic materials – until now.
“The method will be useful for analysing charge behaviour in organic semiconductor devices and for understanding long-term processes such as photodegradation that occur gradually under continuous light exposure,” Kabe says.
Travis Humble is a research leader who’s thinking big, dreaming bold, yet laser-focused on operational delivery. The long game? To translate advances in fundamental quantum science into a portfolio of enabling technologies that will fast-track the practical deployment of quantum computers for at-scale scientific, industrial and commercial applications.
Validation came in spades last month when, despite the current turbulence around US science funding, the Quantum Science Center (QSC) – headquartered at Oak Ridge National Laboratory (ORNL) and directed by Humble – was given follow-on backing of $125 million over five years (2025–30) from the US Department of Energy (DOE) to create “a new scientific ecosystem” for fault-tolerant, quantum-accelerated high-performance computing (QHPC). In short, QSC will target the critical research needed to amplify the impact of quantum computing through its convergence with leadership-class exascale HPC systems.
“Our priority in Phase II QSC is the creation of a common software ecosystem to host the compilers, programming libraries, simulators and debuggers needed to develop hybrid-aware algorithms and applications for QHPC,” explains Humble. Equally important, QSC researchers will develop and integrate new techniques in quantum error correction, fault-tolerant computing protocols and hybrid algorithms that combine leading-edge computing capabilities for pre- and post-processing of quantum programs. “These advances will optimize quantum circuit constructions and accelerate the most challenging computational tasks within scientific simulations,” Humble adds.
Classical computing, quantum opportunity
At the heart of the QSC programme sits ORNL’s leading-edge research infrastructure for classical HPC, a capability that includes Frontier, the first supercomputer to break the exascale barrier and still one of the world’s most powerful. On that foundation, QSC is committed to building QHPC architectures that take advantage of both quantum computers and exascale supercomputing to tackle all manner of scientific and industrial problems beyond the reach of today’s HPC systems alone.
“Hybrid classical-quantum computing systems are the future,” says Humble. “With quantum computers connecting both physically and logically to existing HPC systems, we can forge a scalable path to integrate quantum technologies into our scientific infrastructure.”
Quantum acceleration: ORNL’s current supercomputer, Frontier, was the first high-performance machine to break the exascale barrier. Plans are in motion for a next-generation supercomputer, Discovery, to come online at ORNL by 2028. (Courtesy: Carlos Jones/ORNL, US DOE)
Industry partnerships are especially important in this regard. Working in collaboration with the likes of IonQ, Infleqtion and QuEra, QSC scientists are translating a range of computationally intensive scientific problems – quantum simulations of exotic matter, for example – onto the vendors’ quantum computing platforms, generating excellent results out the other side.
“With our broad representation of industry partners,” notes Humble, “we will establish a common framework by which scientific end-users, software developers and hardware architects can collaboratively advance these tightly coupled, scalable hybrid computing systems.”
It’s a co-development model that industry values greatly. “Reciprocity is key,” Humble adds. “At QSC, we get to validate that QHPC can address real-world research problems, while our industry partners gather user feedback to inform the ongoing design and optimization of their quantum hardware and software.”
Quantum impact
Innovation being what it is, quantum computing systems will continue to advance at pace, with more qubits, enhanced fidelity, error correction and fault tolerance the key reference points on the development roadmap. Phase II QSC, for its part, will integrate five parallel research thrusts to advance the viability and uptake of QHPC technologies.
The collaborative software effort, led by ORNL’s Vicente Leyton, will develop openQSE, an adaptive, end-to-end software ecosystem for QHPC systems and applications. Yigit Subasi from Los Alamos National Laboratory (LANL) will lead the hybrid algorithms thrust, which will design algorithms that combine conventional and quantum methods to solve challenging problems in the simulation of model materials.
Meanwhile, the QHPC architectures thrust, under the guidance of ORNL’s Chris Zimmer, will co-design hybrid computing systems that integrate quantum computers with leading-edge HPC systems. The scientific applications thrust, led by LANL’s Andrew Sornberger, will develop and validate applications of quantum simulation to be implemented on prototype QHPC systems. Finally, ORNL’s Michael McGuire will lead the thrust to establish experimental baselines for quantum materials that ultimately validate QHPC simulations against real-world measurements.
Longer term, ORNL is well placed to scale up the QHPC model. After all, the laboratory is credited with pioneering the hybrid supercomputing model that uses graphics processing units in addition to conventional central processing units (including the launch in 2012 of Titan, the first supercomputer of this type operating at over 10 petaFLOPS).
“The priority for all the QSC partners,” notes Humble, “is to transition from this still-speculative research phase in quantum computing, while orchestrating the inevitable convergence between quantum technology, existing HPC capabilities and evolving scientific workflows.”
Collaborate, coordinate, communicate
Much like its counterparts among the DOE’s National Quantum Information Science Research Centers (which have also been allocated further DOE funding through 2030), QSC provides the “operational umbrella” for a broad-scope collaboration of more than 300 scientists and engineers from 20 partner institutions. With its own distinct set of research priorities, that collective activity cuts across other National Laboratories (Los Alamos and Pacific Northwest), universities (among them Berkeley, Cornell and Purdue) and businesses (including IBM and IQM) to chart an ambitious R&D pathway addressing quantum-state (qubit) resilience, controllability and, ultimately, the scalability of quantum technologies.
“QSC is a multidisciplinary melting pot,” explains Humble, “and I would say, alongside all our scientific and engineering talent, it’s the pooled user facilities that we are able to exploit here at Oak Ridge and across our network of partners that give us our ‘grand capability’ in quantum science [see box, “Unique user facilities unlock QSC opportunities”]. Certainly, when you have a common research infrastructure, orchestrated as part of a unified initiative like QSC, then you can deliver powerful science that translates into real-world impacts.”
Unique user facilities unlock QSC opportunities
Neutron insights: ORNL director Stephen Streiffer tours the linear accelerator tunnel at the Spallation Neutron Source (SNS). QSC scientists are using the SNS to investigate entirely new classes of strongly correlated materials that demonstrate topological order and quantum entanglement. (Courtesy: Alonda Hines/ORNL, US DOE)
Deconstructed, QSC’s Phase I remit (2020–25) spanned three dovetailing and cross-disciplinary research pathways: discovery and development of advanced materials for topological quantum computing (in which quantum information is stored in a stable topological state – or phase – of a physical system rather than the properties of individual particles or atoms); development of next-generation quantum sensors (to characterize topological states and support the search for dark matter); as well as quantum algorithms and simulations (for studies in fundamental physics and quantum chemistry).
Underpinning that collective effort: ORNL’s unique array of scientific user facilities. A case in point is the Spallation Neutron Source (SNS), an accelerator-based neutron-scattering facility that enables a diverse programme of pure and applied research in the physical sciences, life sciences and engineering. QSC scientists, for example, are using SNS to investigate entirely new classes of strongly correlated materials that demonstrate topological order and quantum entanglement – properties that show great promise for quantum computing and quantum metrology applications.
“The high-brightness neutrons at SNS give us access to this remarkable capability for materials characterization,” says Humble. “Using the SNS neutron beams, we can probe exotic materials, recover the neutrons that scatter off of them and, from the resultant signals, infer whether or not the materials exhibit quantum properties such as entanglement.”
While SNS may be ORNL’s “big-ticket” user facility, the laboratory is also home to another high-end resource for quantum studies: the Center for Nanophase Material Science (CNMS), one of the DOE’s five national Nanoscience Research Centers, which offers QSC scientists access to specialist expertise and equipment for nanomaterials synthesis; materials and device characterization; as well as theory, modelling and simulation in nanoscale science and technology.
Thanks to these co-located capabilities, QSC scientists pioneered another intriguing line of enquiry – one that will now be taken forward elsewhere within ORNL – by harnessing so-called quantum spin liquids, in which electron spins can become entangled with each other to demonstrate correlations over very large distances (relative to the size of individual atoms).
In this way, it is possible to take materials that have been certified as quantum-entangled and use them to design new types of quantum devices with unique geometries – as well as connections to electrodes and other types of control systems – to unlock novel physics and exotic quantum behaviours. The long-term goal? Translation of quantum spin liquids into a novel qubit technology to store and process quantum information.
SNS, CNMS and Oak Ridge Leadership Computing Facility (OLCF) are DOE Office of Science user facilities.
When he’s not overseeing the technical direction of QSC, Humble is acutely attuned to the need for sustained and accessible messaging. The priority? To connect researchers across the collaboration – physicists, chemists, material scientists, quantum information scientists and engineers – as well as key external stakeholders within the DOE, government and industry.
“In my experience,” he concludes, “the ability of the QSC teams to communicate efficiently – to understand each other’s concepts and reasoning and to translate back and forth across disciplinary boundaries – remains fundamental to the success of our scientific endeavours.”
The next generation: Quantum science graduate students and postdoctoral researchers present and discuss their work during a poster session at the fifth annual QSC Summer School. Hosted at Purdue University in April this year, the school is one of several workforce development efforts supported by QSC. (Courtesy: Dave Mason/Purdue University)
With an acknowledged shortage of skilled workers across the quantum supply chain, QSC is doing its bit to bolster the scientific and industrial workforce. Front-and-centre: the fifth annual QSC Summer School, held at Purdue University in April this year, which took 130 graduate students (the largest cohort to date) through an intensive four-day training programme.
The Summer School sits as part of a long-term QSC initiative to equip ambitious individuals with the specialist domain knowledge and skills needed to thrive in a quantum sector brimming with opportunity – whether that’s in scientific research or out in industry with hardware companies, software companies or, ultimately, the end-users of quantum technologies in key verticals like pharmaceuticals, finance and healthcare.
“While PhD students and postdocs are integral to the QSC research effort, the Summer School exposes them to the fundamental ideas of quantum science elaborated by leading experts in the field,” notes Vivien Zapf, a condensed-matter physicist at Los Alamos National Laboratory who heads up QSC’s advanced characterization efforts.
“It’s all about encouraging the collective conversation,” she adds, “with lots of opportunities for questions and knowledge exchange. Overall, our emphasis is very much on training up scientists and engineers to work across the diversity of disciplines needed to translate quantum technologies out of the lab into practical applications.”
The programme isn’t for the faint-hearted, though. Student delegates kicked off this year’s proceedings with a half-day of introductory presentations on quantum materials, devices and algorithms. Next up: three and a half days of intensive lectures, panel discussions and poster sessions covering everything from entangled quantum networks to quantum simulations of superconducting qubits.
Many of the Summer School’s sessions were also made available virtually on Purdue’s Quantum Coffeehouse Live Stream on YouTube – the streamed content reaching quantum learners across the US and further afield. Lecturers were drawn from the US National Laboratories, leading universities (such as Harvard and Northwestern) and the quantum technology sector (including experts from IBM, PsiQuantum, NVIDIA and JPMorganChase).
As a physicist in industry, I spend my days developing new types of photovoltaic (PV) panels. But I’m also keen to do something for the transition to green energy outside work, which is why I recently installed two PV panels on the balcony of my flat in Munich. Fitting them was great fun – and I can now enjoy sunny days even more knowing that each panel is generating electricity.
However, the panels, which each have a peak power of 440 W, don’t cover all my electricity needs, which prompted me to take an interest in a plan to build six wind turbines in a forest near me on the outskirts of Munich. Curious about the project, I particularly wanted to find out when the turbines will start generating electricity for the grid. So when I heard that a weekend cycle tour of the site was being organized to showcase it to local residents, I grabbed my bike and joined in.
As we cycle, I discover that the project – located in Forstenrieder Park – is the joint effort of four local councils and two “citizen-energy” groups, who’ve worked together for the last five years to plan and start building the six turbines. Each tower will be 166 m high and the rotor blades will be 80 m long, with the plan being for them to start operating in 2027.
I’ve never thought of Munich as a particularly windy city. But tour leader Dieter Maier, who’s a climate adviser to Neuried council, explains that at the height at which the blades operate, there’s always a steady, reliable flow of wind. In fact, each turbine has a designed power output of 6.5 MW and will deliver a total of 10 GWh of energy over the course of a year.
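Those two numbers let me do a quick back-of-the-envelope check of my own (this is my arithmetic, not a figure quoted on the tour). Running flat out for a year, a 6.5 MW turbine would generate 6.5 MW × 8760 h ≈ 57 GWh, so delivering 10 GWh implies a capacity factor of

\[ \frac{10\ \mathrm{GWh}}{6.5\ \mathrm{MW} \times 8760\ \mathrm{h}} \approx 0.18 \]

or about 18% – a plausible figure for an onshore site.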
Practical questions
Cycling around, I’m excited to think that a single turbine could end up meeting the entire electricity demand of Neuried. But installing wind turbines involves much more than just the technicalities of generating electricity. How do you connect the turbines to the grid? How do you ensure planes don’t fly into the turbines? What about wildlife conservation and biodiversity?
At one point on our tour, we cycle round a 90-degree bend in the forest and I wonder how a huge, 80 m-long blade will be transported round that kind of tight angle. Trees will almost certainly have to be felled to get the blade in place, which sounds questionable for a supposedly green project. Fortunately, project leaders have been working with the local forest manager and conservationists, finding ways to help improve the local biodiversity despite the loss of trees.
As a representative of BUND (one of Germany’s biggest conservation charities) explains on the tour, a natural, or “unmanaged”, forest consists of a mix of areas with a higher or lower density of trees. But Forstenrieder Park has been a managed forest for well over a century and is mostly thick with trees. Clearing trees for the turbines will therefore allow conservationists to grow more of the bushes and plants that currently struggle to find space to flourish.
Cut and cover: Trees in Forstenrieder Park have had to be chopped down to provide room for new wind turbines to be installed, but the open space will let conservationists grow plants and bushes to boost biodiversity. (Courtesy: Janina Moereke)
To avoid endangering birds and bats native to this forest, meanwhile, the turbines will be turned off when the animals are most active, which coincidentally corresponds to low-wind periods in Munich. Insurance costs have to be factored in too. Thankfully, it’s quite unlikely that a turbine will burn down or get ice all over its blades, which means liability insurance costs are low. But vandalism is an ever-present worry.
In fact, at the end of our bike tour, we’re taken to a local wind turbine that is already up and running about 13 km further south of Forstenrieder Park. This turbine, I’m disappointed to discover, was vandalized back in 2024, which led to it being fenced off and video surveillance cameras being installed.
But for all the difficulties, I’m excited by the prospect of the wind turbines supporting the local energy needs. I can’t wait for the day when I’m on my balcony, solar panels at my side, sipping a cup of tea made with water boiled by electricity generated by the rotor blades I can see turning round and round on the horizon.