Global Thaw 10,000 Years Ago May Have Fueled Volcanoes and Sped Up Continental Drift
Cancer Mortality Rates Declined Over the Past 20 Years in the United States
In China, trade war with U.S. taking a toll on research labs
Tardigrade Tattoos Could Pave the Way for Microscopic Medical Devices
Rattlesnake Venom Evolves and Adapts to Climate Change
Giant Kangaroos' Weight at 375 Pounds and Limited Roaming Likely Led to Their Extinction
Ancient Bite Wounds Confirm Roman Gladiators Did Fight Lions in Combat
Superconducting device delivers ultrafast changes in magnetic field
Precise control over the generation of intense, ultrafast changes in magnetic fields called “magnetic steps” has been achieved by researchers in Hamburg, Germany. Using ultrashort laser pulses, Andrea Cavalleri and colleagues at the Max Planck Institute for the Structure and Dynamics of Matter disrupted the currents flowing through a superconducting disc. This alters the superconductor’s local magnetic environment on very short timescales – creating a magnetic step.
Magnetic steps rise to their peak intensity in just a few picoseconds, before decaying more slowly in several nanoseconds. They are useful to scientists because they rise and fall on timescales far shorter than the time it takes for materials to respond to external magnetic fields. As a result, magnetic steps could provide fundamental insights into the non-equilibrium properties of magnetic materials, and could also have practical applications in areas such as magnetic memory storage.
So far, however, progress in this field has been held back by technical difficulties in generating and controlling magnetic steps on ultrashort timescales. Previous strategies have employed technologies including microcoils, specialized antennas, and circularly polarized light pulses. However, each of these schemes offers only a limited degree of control over the properties of the magnetic steps it generates.
Quenching supercurrents
Now, Cavalleri’s team has developed a new technique that involves the quenching of currents in a superconductor. Normally, these “supercurrents” will flow indefinitely without losing energy, and will act to expel any external magnetic fields from the superconductor’s interior. However, if these currents are temporarily disrupted on ultrashort timescales, a sudden change will be triggered in the magnetic field close to the superconductor – which could be used to create a magnetic step.
To achieve this, Cavalleri and colleagues applied ultrashort laser pulses to a thin superconducting disc of yttrium barium copper oxide (YBCO), while also exposing the disc to an external magnetic field.
To detect whether magnetic steps had been generated, they placed a crystal of the semiconductor gallium phosphide in the superconductor’s vicinity. This material exhibits an extremely rapid Faraday response, in which the polarization of light passing through the semiconductor rotates in response to changes in the local magnetic field. Crucially, this rotation can occur on sub-picosecond timescales.
In their experiments, the researchers monitored changes to the polarization of an ultrashort “probe” laser pulse passing through the semiconductor shortly after they quenched supercurrents in their YBCO disc using a separate ultrashort “pump” laser pulse.
“By abruptly disrupting the material’s supercurrents using ultrashort laser pulses, we could generate ultrafast magnetic field steps with rise times of approximately one picosecond – or one trillionth of a second,” explains team member Gregor Jotzu.
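To get an intuitive feel for the pump–probe scheme described above, the short sketch below models an idealized magnetic step – a roughly 1 ps rise followed by a slower, nanosecond-scale decay – and converts it into a Faraday rotation angle via θ = V·B·d. This is purely illustrative: the Verdet constant, crystal thickness and decay time are assumed placeholder values, not figures from the Nature Photonics paper.

```python
import numpy as np

# Illustrative model only – not taken from the Nature Photonics paper.
# Magnetic step: ~1 ps rise followed by a slower exponential decay (assumed 2 ns here).
def magnetic_step(t_ps, rise_ps=1.0, decay_ps=2000.0, b_peak=1.0):
    """Peak-normalized field B(t) in arbitrary units; t_ps is the pump-probe delay in ps."""
    t = np.asarray(t_ps, dtype=float)
    shape = (1.0 - np.exp(-np.clip(t, 0, None) / rise_ps)) * np.exp(-np.clip(t, 0, None) / decay_ps)
    return np.where(t < 0, 0.0, b_peak * shape)

# Faraday rotation of the probe polarization: theta = V * B * d
V = 100.0    # assumed Verdet constant of the GaP crystal, rad T^-1 m^-1
d = 100e-6   # assumed crystal thickness, m

delays_ps = np.linspace(-2, 20, 12)
for t, theta in zip(delays_ps, V * magnetic_step(delays_ps) * d):
    print(f"delay {t:6.1f} ps -> polarization rotation {theta:.3e} rad")
```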
Broadband step
The technique was used to generate an extremely broadband magnetic step, which contains frequencies ranging from sub-gigahertz to terahertz. In principle, this should make it suitable for studying magnetization in a diverse variety of materials.
To demonstrate practical applications, the team used these magnetic steps to control the magnetization of a ferrimagnet. Such a magnet has opposing magnetic moments, but has a non-zero spontaneous magnetization in zero magnetic field.
When they placed a ferrimagnet on top of their superconductor and created a magnetic step, the step field caused the ferrimagnet’s magnetization to rotate.
For now, the magnetic steps generated through this approach do not have the speed or amplitude needed to switch materials like a ferrimagnet between stable states. Yet the researchers are confident that, with further tweaks to the geometry of their setup, this ability may not be far out of reach.
“Our goal is to create a universal, ultrafast stimulus that can switch any magnetic sample between stable magnetic states,” Cavalleri says. “With suitable improvements, we envision applications ranging from phase transition control to complete switching of magnetic order parameters.”
The research is described in Nature Photonics.
Just Like Us, Chimpanzees Love To Drink and Share Alcohol
How much climate damage do polluters actually cause? New method comes up with price tag
Most Phoenicians did not come from the land of Canaan, challenging historical assumptions
Massive pea study solves last genetic riddles of famed friar
How to support colleagues who are dealing with personal issues
Over the years, first as a PhD student and now as a postdoc, I have been approached by many students and early-career academics who have confided their problems to me. Their issues, which they struggled to deal with alone, ranged from anxiety and burnout to personal and professional relationships as well as mental-health concerns. Sadly, such discussions were not one-off incidents but seemed worryingly common in academia, where people are often under pressure to perform, face uncertainty over their careers and need to juggle lots of different tasks simultaneously.
But it can be challenging to even begin to approach someone else with a problem. That first step can take days or weeks of mental preparation, so for those of us who are approached for help, it is our responsibility to listen and act appropriately when someone does finally open up. This is especially so given that a supervisor, mentor, teaching assistant or anybody else in a position of seniority may be the first point of contact when a difficulty becomes debilitating.
I am fortunate to have had excellent relationships with my PhD and postdoc supervisors – providing great examples to follow. Even then, however, it was difficult to subdue the feeling of nausea when I knocked on their office doors to have a difficult conversation. I was worried about their response and reaction and how they would judge me. While that first conversation is challenging for both parties, fortunately it does get easier from there.
Yet it can also be hard for the person who is trying to offer help, especially if they haven’t done so before. In fact, when colleagues began to confide in me, I’d had no formal preparation or training to support them. But through experience and some research, I found a few things that worked well in such complex situations. The first is to set and maintain boundaries: knowing where your personal limits lie. This includes which topics are off limits and to what extent you will engage with somebody. Someone who has recently experienced bereavement, for example, may not want to engage deeply with a student who is enduring the same and so should make it clear they can’t offer help. Yet at the same time, that person may feel confident providing support for someone struggling with imposter syndrome – a feeling that you don’t deserve to be there and aren’t good at your work.
Time restrictions can also be used as boundaries. If you are working on a critical experiment, have an article deadline or are about to go on holiday, explain that you can only help them until a certain point, after which you will explore alternative solutions together. Setting boundaries can also help mentors prepare to support someone who is struggling. This could involve taking a mental-health first-aid course to support a person who experiences panic attacks or is relapsing into depression. It could also mean finding contact details for professionals, either on campus or beyond, who could help. While providing such information might sound trivial and unimportant, remember that for a person who is feeling overwhelmed, it can be hugely appreciated.
Following up
Sharing problems takes courage. It also requires trust because if information leaks out, rumours and accusations can spread quickly and worsen situations. It is, however, possible to ask more senior colleagues for advice without identifying anyone or their exact circumstances, perhaps in cases when dealing with less than amicable relationships with collaborators. It is also possible to let colleagues know that a particular person needs more support without explicitly saying why.
There are times, however, when that confidentiality must be broken. In my experience, this should always be discussed first with the person concerned, and confidentiality should only be broken to somebody who is sure to have a concrete solution. For a student who is struggling with a particular subject, it could, for example, be the lecturer responsible for that course. For a colleague who is not coping with a divorce, say, it could be someone from HR or their supervisor. It could even be a university’s support team or the police for a student who has experienced sexual assault.
I have broken confidentiality at times and it can be nerve-wracking, but it is essential for providing the best possible support and for handing over a situation that you cannot manage yourself. Even if the issue has been handed over to someone else, it’s important to follow up with the person struggling, which helps them know they’re being heard and respected. Following up is not always a comfortable conversation, as it can touch on past trauma or broach sensitive topics. But it also allows them to admit that they are still looking for more support or that their situation has worsened.
A follow-up conversation could also be held in a discreet environment, with reassurance that nobody is obliged to go into detail. It may be as simple as asking “How are you feeling today?”. Letting someone express themselves without judgement can help them come to terms with their situation and give them the confidence to approach you again.
Regularly reflecting on your boundaries and limits as well as having a good knowledge of possible resources can help you prepare for unexpected circumstances. It gives students and colleagues immediate care and relief at what might be their lowest point. But perhaps the most important aspect when approached by someone is to ask yourself this: “What kind of person would I want to speak to if I were struggling?”. That is the person you want to be.
FLIR MIX – a breakthrough in infrared and visible imaging
Until now, researchers have had to choose between thermal and visible imaging: One reveals heat signatures while the other provides structural detail. Recording both and trying to align them manually — or harder still, synchronizing them temporally — can be inconsistent and time-consuming. The result is data that is close but never quite complete. The new FLIR MIX is a game changer, capturing and synchronizing high-speed thermal and visible imagery at up to 1000 fps. Visible and high-performance infrared cameras with FLIR Research Studio software work together to deliver one data set with perfect spatial and temporal alignment — no missed details or second guessing, just a complete picture of fast-moving events.

Jerry Beeney is a seasoned global business development leader with a proven track record of driving product growth and sales performance in the Teledyne FLIR Science and Automation verticals. With more than 20 years at Teledyne FLIR, he has played a pivotal role in launching new thermal imaging solutions, working closely with technical experts, product managers, and customers to align products with market demands and customer needs. Before assuming his current role, Beeney held a variety of technical and sales positions, including senior scientific segment engineer. In these roles, he managed strategic accounts and delivered training and product demonstrations for clients across diverse R&D and scientific research fields. Beeney’s dedication to achieving meaningful results and cultivating lasting client relationships remains a cornerstone of his professional approach.
We May Value Our Dogs More than Our Human Relationships
The guardian’s rifle: why mission-essential space support cannot be outsourced

In matters of national defense and credible deterrence, some capabilities are simply too vital to outsource. If they falter, armies lose battles and nations can lose wars. Today, commercial space […]
Muscle Memory Isn’t What You Think It Is
Dual-robot radiotherapy system designed to reduce the cost of cancer treatment
Researchers at the University of Victoria in Canada are developing a low-cost radiotherapy system for use in low- and middle-income countries and geographically remote rural regions. Initial performance characterization of the proof-of-concept device produced encouraging results, and the design team is now refining the system with the goal of clinical commercialization.
This could be good news for people living in low-resource settings, where access to cancer treatment is an urgent global health concern. The WHO’s International Agency for Research on Cancer estimates that there are at least 20 million new cases of cancer diagnosed annually and 9.7 million annual cancer-related deaths, based on 2022 data. By 2030, approximately 75% of cancer deaths are expected to occur in low- and middle-income countries, due to rising populations, healthcare and financial disparities, and a general lack of personnel and equipment resources compared with high-income countries.
The team’s orthovoltage radiotherapy system, known as KOALA (kilovoltage optimized alternative for adaptive therapy), is designed to create, optimize and deliver radiation treatments in a single session. The device, described in Biomedical Physics & Engineering Express, consists of a dual-robot system with a 225 kVp X-ray tube mounted onto one robotic arm and a flat-panel detector mounted on the other.
The same X-ray tube can be used to acquire cone-beam CT (CBCT) images, as well as to deliver treatment, with a peak tube voltage of 225 kVp and a maximum tube current of 2.65 mA for a 1.2 mm focal spot. Due to its maximum reach of 2.05 m and collision restrictions, the KOALA system has a limited range of motion, achieving 190° arcs for both CBCT acquisition and treatments.
Device testing
To characterize the KOALA system, lead author Olivia Masella and colleagues measured X-ray spectra for tube voltages of 120, 180 and 225 kVp. At 120 and 180 kVp, they observed good agreement with spectra from SpekPy (a Python software toolkit for modelling X-ray tube spectra). For the 225 kVp spectrum, they found a notable overestimation in the higher energies.
The researchers performed dosimetric tests by measuring percent depth dose (PDD) curves for a 120 kVp imaging beam and a 225 kVp therapy beam, using solid water phantom blocks with a Farmer ionization chamber at various depths. They used an open beam with 40° divergence and a source-to-surface distance of 30 cm. They also measured 2D dose profiles with radiochromic film at various depths in the phantom for a collimated 225 kVp therapy beam and a dose of approximately 175 mGy at the surface.
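For readers less familiar with the dosimetry jargon: a percent depth dose curve simply normalizes the dose measured at each depth to the maximum dose along the beam axis (which, for a kilovoltage beam like this one, sits at or very near the surface). A minimal sketch of that normalization, using made-up readings rather than the team’s data:

```python
# Percent depth dose (PDD): dose at each depth as a percentage of the maximum dose.
# The depth/dose pairs below are invented for illustration – they are not KOALA measurements.
depths_mm = [0, 5, 10, 20, 40, 60]
dose_mgy = [175.0, 158.0, 139.0, 104.0, 58.0, 33.0]   # e.g. Farmer-chamber readings

d_max = max(dose_mgy)
for depth, dose in zip(depths_mm, dose_mgy):
    print(f"depth {depth:3d} mm : PDD = {100.0 * dose / d_max:5.1f} %")
```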
The PDD curves showed excellent agreement between experiment and simulations at both 120 and 225 kVp, with dose errors of less than 2%. The 2D profile results were less than optimal. The team aims to correct this by using a more suitable source-to-collimator distance (100 mm) and a custom-built motorized collimator.

Geometrical evaluation conducted using a coplanar star-shot test showed that the system demonstrated excellent geometrical accuracy, generating a wobble circle with a diameter of just 0.3 mm.
Low costs and clinical practicality
Principal investigator Magdalena Bazalova-Carter describes the rationale behind the KOALA’s development. “I began the computer simulations of this project about 15 years ago, but the idea originated from Michael Weil, a radiation oncologist in Northern California,” she tells Physics World. “He and our industrial partner, Tai-Nang Huang, the president of Linden Technologies, are overseeing the progress of the project. Our university team is diversified, working in medical physics, computer science, and electrical and mechanical engineering. Orimtech, a medical device manufacturer and collaborator, developed the CBCT acquisition and reconstruction software and built the imaging prototype.”
Masella says that the team is keeping costs low in various ways. “Megavoltage X-rays are most commonly used in conventional radiotherapy, but KOALA’s design utilizes low-energy kilovoltage X-rays for treatment. By using a 225 kVp X-ray tube, the X-ray generation alone is significantly cheaper compared to a conventional linac, at a cost of USD $150,000 compared to $3 million,” she explains. “By operating in the kilovoltage instead of megavoltage range, only about 4 mm of lead shielding is required, instead of 6 to 7 feet of high-density concrete, bringing the shielding cost down from $2 million to $50,000. We also have incorporated components that are much lower cost than [those in] a conventional radiotherapy system.”
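Taking the figures quoted above at face value, a quick back-of-envelope tally of just those two components shows where most of the claimed savings come from (all values in US dollars; the robots, detector, couch and software are not included because the article does not price them):

```python
# Back-of-envelope comparison using only the component costs quoted in the article (USD).
conventional = {"X-ray generation (linac)": 3_000_000, "shielding (concrete vault)": 2_000_000}
koala = {"X-ray generation (225 kVp tube)": 150_000, "shielding (~4 mm lead)": 50_000}

total_conventional = sum(conventional.values())
total_koala = sum(koala.values())
print(f"Conventional: ${total_conventional:,}")
print(f"KOALA:        ${total_koala:,}")
print(f"Quoted components alone: ~{total_conventional / total_koala:.0f}x cheaper")
```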
“Our novel iris collimator leaves are only 1-mm thick due to the lower treatment X-ray beam energy, and its 12 leaves are driven by a single motor,” adds Bazalova-Carter. “Although multileaf collimators with 120 leaves utilized with megavoltage X-ray radiotherapy are able to create complex fields, they are about 8-cm thick and are controlled by 120 separate motors. Given the high cost and mechanical vulnerability of multileaf collimators, our single motor design offers a more robust and reliable alternative.”
The team is currently developing a new motorized collimator, an improved treatment couch and a treatment planning system. They plan to improve CBCT imaging quality with hardware modifications, develop a CBCT-to-synthetic CT machine learning algorithm, refine the auto-contouring tool and integrate all of the software to smooth the workflow.
The researchers are planning to work with veterinarians to test the KOALA system with dogs diagnosed with cancer. They will also develop quality assurance protocols specific to the KOALA device using a dog-head phantom.
“We hope to demonstrate the capabilities of our system by treating beloved pets for whom available cancer treatment might be cost-prohibitive. And while our system could become clinically adopted in veterinary medicine, our hope is that it will be used to treat people in regions where conventional radiotherapy treatment is insufficient to meet demand,” they say.
‘Terrible crocodile’ was not related to modern alligators
Finland Could Be the First Country in the World to Bury Nuclear Waste Permanently
Eli Lilly Sues 4 GLP-1 Telehealth Startups, Escalating War on Knockoff Drugs
This Artificial Wetland Is Reusing Wastewater to Revive a Lost Ecosystem
Supercritical water reveals its secrets
Contrary to some theorists’ expectations, water does not form hydrogen bonds in its supercritical phase. This finding, which is based on terahertz spectroscopy measurements and simulations by researchers at Ruhr University Bochum, Germany, puts to rest a long-standing controversy and could help us better understand the chemical processes that occur near deep-sea vents.
Water is unusual. Unlike most other materials, it is denser as a liquid than it is as the ice that forms when it freezes. It also expands rather than contracts as it cools, becomes less viscous when compressed, and exists in no fewer than 17 different crystalline phases.
Another unusual property is that at high temperatures and pressures – above 374 °C and 221 bars – water mostly exists as a supercritical fluid, meaning it shares some properties with both gases and liquids. Though such extreme conditions are rare on the Earth’s surface (at least outside a laboratory), they are typical for the planet’s crust and mantle. They are also present in so-called black smokers, which are hydrothermal vents that exist on the seabed in certain geologically active locations. Understanding supercritical water is therefore important for understanding the geochemical processes that occur in such conditions, including the genesis of gold ore.
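As a rough illustration of the criterion quoted above, the check below flags whether a given temperature and pressure lie beyond water’s critical point. It is a minimal sketch using the thresholds from the article; real behaviour near the critical point is far subtler than a simple threshold test.

```python
# Minimal sketch: is water beyond its critical point at the given conditions?
T_CRITICAL_C = 374.0    # critical temperature quoted in the article, degrees Celsius
P_CRITICAL_BAR = 221.0  # critical pressure quoted in the article, bar

def is_supercritical_water(temperature_c: float, pressure_bar: float) -> bool:
    """True if both temperature and pressure exceed the critical values."""
    return temperature_c > T_CRITICAL_C and pressure_bar > P_CRITICAL_BAR

# Extremes of the conditions spanned in the Bochum experiments (20-400 degC, 1-240 bar)
print(is_supercritical_water(400, 240))  # True: supercritical region
print(is_supercritical_water(20, 1))     # False: ordinary liquid water
```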
Supercritical water also shows promise as an environmentally friendly solvent for industrial processes such as catalysis, and even as a mediator in nuclear power plants. Before any such applications see the light of day, however, researchers need to better understand the structure of water’s supercritical phase.
Probing the hydrogen bonding between molecules
At ambient conditions, the tetrahedrally-arranged hydrogen bonds (H-bonds) in liquid water produce a three-dimensional H-bonded network. Many of water’s unusual properties stem from this network, but as it approaches its supercritical point, its structure changes.
Previous studies of this change have produced results that were contradictory or unclear at best. While some pointed to the existence of distorted H-bonds, others identified heterogeneous structures involving rigid H-bonded dimers or, more generally, small clusters of tetrahedrally-bonded water surrounded by nonbonded gas-like molecules.
To resolve this mystery, an experimental team led by Gerhard Schwaab and Martina Havenith, together with Philipp Schienbein and Dominik Marx, investigated how water absorbs light in the far infrared/terahertz (THz) range of the spectrum. They performed their experiments and simulations at temperatures from 20 to 400 °C and pressures from 1 bar up to 240 bar. In this way, they were able to investigate the hydrogen bonding between molecules in samples of water that were entering the supercritical state and samples that were already in it.
Diamond and gold cell
Because supercritical water is highly corrosive, the researchers carried out their experiments in a specially-designed cell made from diamond and gold. By comparing their experimental data with the results of extensive ab initio simulations that probed different parts of water’s high-temperature phase diagram, they obtained a molecular picture of what was happening.
The researchers found that the terahertz spectrum of water in its supercritical phase was practically identical to that of hot gaseous water vapour. This, they say, proves that supercritical water is different from both liquid water at ambient conditions and water in a low-temperature gas phase where clusters of molecules form directional hydrogen bonds. No such molecular clusters appear in supercritical water, they note.
The team’s ab initio molecular dynamics simulations also revealed that two water molecules in the supercritical phase remain close to each other for a very limited time – much shorter than the typical lifetime of hydrogen bonds in liquid water – before distancing themselves. What is more, the bonds between hydrogen and oxygen atoms in supercritical water do not have a preferred orientation. Instead, they are permanently and randomly rotating. “This is completely different to the hydrogen bonds that connect the water molecules in liquid water at ambient conditions, which do have a persisting preferred orientation,” Havenith says.
Now that they have identified a clear spectroscopic fingerprint for supercritical water, the researchers want to study how solutes affect the solvation properties of this substance. They anticipate that the results from this work, which is published in Science Advances, will enable them to characterize the properties of supercritical water for use as a “green” solvent.
NIH guts its first and largest study centered on women
Atmos Space Cargo declares first test flight a success despite reentry uncertainty

German startup Atmos Space Cargo says it considers the first flight of its reentry vehicle a success despite limited data on how it performed during reentry.
Astra targets cargo delivery with Rocket 4 in Pentagon-backed plan

Former public rocket startup focusing on defense applications after going private
Iridium shields supply chain as higher tariffs loom

TAMPA, Fla. — Iridium is ramping up tariff countermeasures to shield the U.S.-based satellite operator from import tax hikes as global trade tensions escalate. The operator has historically imported satellite […]
Northwood raises $30 million to establish ground station network

SAN FRANCISCO — Northwood Space raised $30 million in a Series A round to establish a global network of phased array ground stations. Alpine Space Ventures and Andreessen Horowitz led […]
SAIC wins $55 million Space Development Agency contract for satellite network integration

By introducing a program integrator role, SDA aims to ensure better compatibility among satellites and cohesion across the network
In killing grants, NSF appears to follow Ted Cruz’s blueprint
Molecules From Space May Have Sparked Life on Earth Billions of Years Ago
Laser Tech May Have Discovered a New Color Never Before Seen by Human Eye
Microdosing Psychedelics as Treatment Could Increase Flexible Thinking
A Disintegrating Planet Is Shedding Matter and Has a Comet-Like Tail
Unraveling the Power of Silk
Top-quark pairs at ATLAS could shed light on the early universe
Physicists working on the ATLAS experiment on the Large Hadron Collider (LHC) are the first to report the production of top quark–antiquark pairs in collisions involving heavy nuclei. By colliding lead ions, CERN’s LHC creates a fleeting state of matter called the quark–gluon plasma. This is an extremely hot and dense soup of subatomic particles that includes deconfined quarks and gluons. This plasma is believed to have filled the early universe microseconds after the Big Bang.
“Heavy-ion collisions at the LHC recreate the quark–gluon plasma in a laboratory setting,” says Anthony Badea, a postdoctoral researcher at the University of Chicago and one of the lead authors of a paper describing the research. As well as boosting our understanding of the early universe, studying the quark–gluon plasma at the LHC could also provide insights into quantum chromodynamics (QCD), which is the theory of how quarks and gluons interact.
Although the quark–gluon plasma at the LHC vanishes after about 10⁻²³ s, scientists can study it by analysing how other particles produced in collisions move through it. The top quark is the heaviest known elementary particle, and its short lifetime and distinct decay pattern offer a unique way to explore the quark–gluon plasma. This is because the top quark decays before the quark–gluon plasma dissipates.
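The timescale argument can be made concrete with rough, generic numbers that are not taken from the ATLAS paper: with a decay width of roughly 1.4 GeV, the top quark’s lifetime τ = ħ/Γ works out to about 5 × 10⁻²⁵ s, comfortably shorter than the ~10⁻²³ s quoted for the plasma.

```python
# Rough numbers for the timescale argument above (not taken from the ATLAS paper).
hbar_gev_s = 6.582e-25    # reduced Planck constant, GeV*s
gamma_top_gev = 1.4       # approximate top-quark decay width, GeV

tau_top = hbar_gev_s / gamma_top_gev   # lifetime from tau = hbar / Gamma
tau_qgp = 1e-23                        # quark-gluon plasma lifetime quoted in the article, s

print(f"top-quark lifetime ~ {tau_top:.1e} s, plasma lifetime ~ {tau_qgp:.0e} s")
print("top quark decays inside the plasma:", tau_top < tau_qgp)
```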
“The top quark decays into lighter particles that subsequently further decay,” explains Stefano Forte at the University of Milan, who was not involved in the research. “The time lag between these subsequent decays is modified if they happen within the quark–gluon plasma, and thus studying them has been suggested as a way to probe [quark–gluon plasma’s] structure. In order for this to be possible, the very first step is to know how many top quarks are produced in the first place, and determining this experimentally is what is done in this [ATLAS] study.”
First observations
The ATLAS team analysed data from lead–lead collisions and searched for events in which a top quark and its antimatter counterpart were produced. These particles can then decay in several different ways and the researchers focused on a less frequent but more easily identifiable mode known as the di-lepton channel. In this scenario, each top quark decays into a bottom quark and a W boson, which is a weak force-carrying particle that then transforms into a detectable lepton and an invisible neutrino.
The results not only confirmed that top quarks are created in this complex environment but also showed that their production rate matches predictions based on our current understanding of the strong nuclear force.
“This is a very important study,” says Juan Rojo, a theoretical physicist at the Free University of Amsterdam who did not take part in the research. “We have studied the production of top quarks, the heaviest known elementary particle, in the relatively simple proton–proton collisions for decades. This work represents the first time that we observe the production of these very heavy particles in a much more complex environment, with two lead nuclei colliding among them.”
As well as confirming QCD’s prediction of heavy-quark production in heavy-nuclei collisions, Rojo explains that “we have a novel probe to resolve the structure of the quark–gluon plasma”. He also says that future studies will enable us “to understand novel phenomena in the strong interactions such as how much gluons in a heavy nucleus differ from gluons within the proton”.
Crucial first step
“This is a first step – a crucial one – but further studies will require larger samples of top quark events to explore more subtle effects,” adds Rojo.
The number of top quarks created in the ATLAS lead–lead collisions agrees with theoretical expectations. In the future, more detailed measurements could help refine our understanding of how quarks and gluons behave inside nuclei. Eventually, physicists hope to use top quarks not just to confirm existing models, but to reveal entirely new features of the quark–gluon plasma.
Rojo says we could “learn about the time structure of the quark–gluon plasma, measurements which are ‘finer’ would be better, but for this we need to wait until more data is collected, in particular during the upcoming high-luminosity run of the LHC”.
Badea agrees that ATLAS’s observation opens the door to deeper explorations. “As we collect more nuclei collision data and improve our understanding of top-quark processes in proton collisions, the future will open up exciting prospects”.
The research is described in Physical Review Letters.
"Unjammable" Quantum Sensors Navigate by Earth's Magnetic Field
Grete Hermann: the quantum physicist who challenged Werner Heisenberg and John von Neumann

In the early days of quantum mechanics, physicists found its radical nature difficult to accept – even though the theory had successes. In particular Werner Heisenberg developed the first comprehensive formulation of quantum mechanics in 1925, while the following year Erwin Schrödinger was able to predict the spectrum of light emitted by hydrogen using his eponymous equation. Satisfying though these achievements were, there was trouble in store.
Long accustomed to Isaac Newton’s mechanical view of the universe, physicists had assumed that identical systems always evolve with time in exactly the same way, that is to say “deterministically”. But Heisenberg’s uncertainty principle and the probabilistic nature of Schrödinger’s wave function suggested worrying flaws in this notion. Those doubts were famously expressed by Albert Einstein, Boris Podolsky and Nathan Rosen in their “EPR” paper of 1935 (Phys. Rev. 47 777) and in debates between Einstein and Niels Bohr.
But the issues at stake went deeper than just a disagreement among physicists. They also touched on long-standing philosophical questions about whether we inhabit a deterministic universe, the related question of human free will, and the centrality of cause and effect. One person who rigorously addressed the questions raised by quantum theory was the German mathematician and philosopher Grete Hermann (1901–1984).
Hermann stands out in an era when it was rare for women to contribute to physics or philosophy, let alone to both. Writing in The Oxford Handbook of the History of Quantum Interpretations, published in 2022, the City University of New York philosopher of science Elise Crull has called Hermann’s work “one of the first, and finest, philosophical treatments of quantum mechanics”.
What’s more, Hermann upended the famous “proof”, developed by the Hungarian-American mathematician and physicist John von Neumann, that “hidden variables” are impossible in quantum mechanics. But why have Hermann’s successes in studying the roots and meanings of quantum physics been so often overlooked? With 2025 being the International Year of Quantum Science and Technology, it’s time to find out.
Free thinker
Hermann was born on 2 March 1901 in the north German port city of Bremen. She was one of seven children; her mother was deeply religious, while her father was a merchant, a sailor and later an itinerant preacher. According to the 2016 book Grete Hermann: Between Physics and Philosophy by Crull and Guido Bacciagaluppi, she was raised according to her father’s maxim: “I train my children in freedom!” Essentially, he enabled Hermann to develop a wide range of interests and benefit from the best that the educational system could offer a woman at the time.
She was eventually admitted as one of a handful of girls at the Neue Gymnasium – a grammar school in Bremen – where she took a rigorous and broad programme of subjects. In 1921 Hermann earned a certificate to teach high-school pupils – an interest in education that reappeared in her later life – and began studying mathematics, physics and philosophy at the University of Göttingen.
In just four years, Hermann earned a PhD under the exceptional Göttingen mathematician Emmy Noether (1882–1935), famous for her groundbreaking theorem linking symmetry to physical conservation laws. Hermann’s final oral exam in 1925 featured not just mathematics, which was the subject of her PhD, but physics and philosophy too. She had specifically requested to be examined in the latter by the Göttingen philosopher Leonard Nelson, whose “logical sharpness” in lectures had impressed her.

By this time, Hermann’s interest in philosophy was starting to dominate her commitment to mathematics. Although Noether had found a mathematics position for her at the University of Freiburg, Hermann instead decided to become Nelson’s assistant, editing his books on philosophy. “She studies mathematics for four years,” Noether declared, “and suddenly she discovers her philosophical heart!”
Hermann found Nelson to be demanding and sometimes overbearing but benefitted from the challenges he set. “I gradually learnt to eke out, step by step,” she later declared, “the courage for truth that is necessary if one is to utterly place one’s trust, also within one’s own thinking, in a method of thought recognized as cogent.” Hermann, it appeared, was searching for a path to the internal discovery of truth, rather like Einstein’s Gedankenexperimente.
After Nelson died in 1927 aged just 45, Hermann stayed in Göttingen, where she continued editing and expanding his philosophical work and related political ideas. Espousing a form of socialism based on ethical reasoning to produce a just society, Nelson had co-founded a political action group and set up the associated Philosophical-Political Academy (PPA) to teach his ideas. Hermann contributed to both and also wrote for the PPA’s anti-Nazi newspaper.
Hermann’s involvement in the organizations Nelson had founded later saw her move to other locations in Germany, including Berlin. But after Hitler came to power in 1933, the Nazis banned the PPA, and Hermann and her socialist associates drew up plans to leave Germany. Initially, she lived at a PPA “school-in-exile” in neighbouring Denmark. As the Nazis began to arrest socialists, Hermann feared that Germany might occupy Denmark (as it indeed later did) and so moved again, first to Paris and then London.
Amid all these disruptions, Hermann continued to bring her dual philosophical and mathematical perspectives to physics, and especially to quantum mechanics
Arriving in Britain in early 1938, Hermann became acquainted with Edward Henry, another socialist, whom she later married. It was, however, merely a marriage of convenience that gave Hermann British citizenship and – when the Second World War started in 1939 – stopped her from being interned as an enemy alien. (The couple divorced after the war.) Amid all these disruptions, Hermann continued to bring her dual philosophical and mathematical perspectives to physics, and especially to quantum mechanics.
Mixing philosophy and physics
A major stimulus for Hermann’s work came from discussions she had in 1934 with Heisenberg and Carl Friedrich von Weizsäcker, who was then his research assistant at the Institute for Theoretical Physics in Leipzig. The previous year Hermann had written an essay entitled “Determinism and quantum mechanics”, which analysed whether the indeterminate nature of quantum mechanics – central to the “Copenhagen interpretation” of quantum behaviour – challenged the concept of causality.
Much cherished by physicists, causality says that every event has a cause, and that a given cause always produces a single specific event. Causality was also a tenet of the 18th-century German philosopher Immanuel Kant, best known for his famous 1781 treatise Critique of Pure Reason. He believed that causality is fundamental for how humans organize their experiences and make sense of the world.
Hermann, like Nelson, was a “neo-Kantian” who believed that Kant’s ideas should be treated with scientific rigour. In her 1933 essay, Hermann examined how the Copenhagen interpretation undermines Kant’s principle of causality. Although the article was not published at the time, she sent copies to Heisenberg, von Weizsäcker, Bohr and also Paul Dirac, who was then at the University of Cambridge in the UK.
In fact, we only know of the essay’s existence because Crull and Bacciagaluppi discovered a copy in Dirac’s archives at Churchill College, Cambridge. They also found a 1933 letter to Hermann from Gustav Heckmann, a physicist who said that Heisenberg, von Weizsäcker and Bohr had all read her essay and took it “absolutely and completely seriously”. Heisenberg added that Hermann was a “fabulously clever woman”.
Heckmann then advised Hermann to discuss her ideas more fully with Heisenberg, who he felt would be more open than Bohr to new ideas from an unexpected source. In 1934 Hermann visited Heisenberg and von Weizsäcker in Leipzig, with Heisenberg later describing his interaction in his 1971 memoir Physics and Beyond: Encounters and Conversations.
In that book, Heisenberg relates how rigorously Hermann wanted to treat philosophical questions. “[She] believed she could prove that the causal law – in the form Kant had given it – was unshakable,” Heisenberg recalled. “Now the new quantum mechanics seemed to be challenging the Kantian conception, and she had accordingly decided to fight the matter out with us.”
Their interaction was no fight, but a spirited discussion, with some sharp questioning from Hermann. When Heisenberg suggested, for instance, that a particular radium atom emitting an electron is an example of an unpredictable random event that has no cause, Hermann countered by saying that just because no cause has been found, it didn’t mean no such cause exists.
Significantly, this was a reference to what we now call “hidden variables” – the idea that quantum mechanics is being steered by additional parameters that we possibly don’t know anything about. Heisenberg then argued that even with such causes, knowing them would lead to complications in other experiments because of the wave nature of electrons.

Suppose, using a hidden variable, we could predict exactly which direction an electron would move. The electron wave wouldn’t then be able to split and interfere with itself, resulting in an extinction of the electron. But such electron interference effects are experimentally observed, which Heisenberg took as evidence that no additional hidden variables are needed to make quantum mechanics complete. Once again, Hermann pointed out a discrepancy in Heisenberg’s argument.
In the end, neither side fully convinced the other, but inroads were made, with Heisenberg concluding in his 1971 book that “we had all learned a good deal about the relationship between Kant’s philosophy and modern science”. Hermann herself paid tribute to Heisenberg in a 1935 paper “Natural-philosophical foundations of quantum mechanics”, which appeared in a relatively obscure philosophy journal called Abhandlungen der Fries’schen Schule (6 69). In it, she thanked Heisenberg “above all for his willingness to discuss the foundations of quantum mechanics, which was crucial in helping the present investigations”.
Quantum indeterminacy versus causality
In her 1933 paper, Hermann aimed to understand if the indeterminacy of quantum mechanics threatens causality. Her overall finding was that wherever indeterminacy is invoked in quantum mechanics, it is not logically essential to the theory. So without claiming that quantum theory actually supports causality, she left the possibility open that it might.
To illustrate her point, Hermann considered Heisenberg’s uncertainty principle, which says that there’s a limit to the accuracy with which complementary variables, such as position, q, and momentum, p, can be measured, namely ΔqΔp ≥ h where h is Planck’s constant. Does this principle, she wondered, truly indicate quantum indeterminism?
Hermann asserted that this relation can mean only one of two possible things. One is that measuring one variable leaves the value of the other undetermined. Alternatively, the result of measuring the other variable can’t be precisely predicted. Hermann dismissed the first option because its very statement implies that exact values exist, and so it cannot be logically used to argue against determinism. The second choice could be valid, but that does not exclude the possibility of finding new properties – hidden variables – that give an exact prediction.
In making her argument about hidden variables, Hermann used her mathematical training to point out a flaw in von Neumann’s famous 1932 proof, which said that no hidden-variable theory can ever reproduce the features of quantum mechanics. Quantum mechanics, according to von Neumann, is complete and no extra deterministic features need to be added.
For decades, his result was cited as “proof” that any deterministic addition to quantum mechanics must be wrong. Indeed, von Neumann had such a well-deserved reputation as a brilliant mathematician that few people had ever bothered to scrutinize his analysis. But in 1964 the Northern Irish theorist John Bell famously showed that a valid hidden-variable theory could indeed exist, though only if it’s “non-local” (Physics 1 195).
Non-locality means that measurements made on widely separated particles can be correlated in ways that no local mechanism can explain, yet without allowing faster-than-light communication. Despite being a notion that Einstein never liked, non-locality has been widely confirmed experimentally. In fact, non-locality is a defining feature of quantum physics and one that’s eminently useful in quantum technology.
Then, in 1966 Bell examined von Neumann’s reasoning and found an error that decisively refuted the proof (Rev. Mod. Phys. 38 447). Bell, in other words, showed that quantum mechanics could permit hidden variables after all – a finding that opened the door to alternative interpretations of quantum mechanics. However, Hermann had reported the very same error in her 1933 paper, and again in her 1935 essay, with an especially lucid exposition that almost exactly foresees Bell’s objection.
She had got there first, more than three decades earlier (see box).
Grete Hermann: 30 years ahead of John Bell
According to Grete Hermann, John von Neumann’s 1932 proof that quantum mechanics doesn’t need hidden variables “stands or falls” on his assumption concerning “expectation values” – an expectation value being the sum of all possible outcomes weighted by their respective probabilities. In the case of two quantities, say, r and s, von Neumann supposed that the expectation value of (r + s) is the same as the expectation value of r plus the expectation value of s. In other words, <(r + s)> = <r> + <s>.
This is clearly true in classical physics, Hermann writes, but the truth is more complicated in quantum mechanics. Suppose r and s are conjugate variables in an uncertainty relationship, such as position q and momentum p, given by ΔqΔp ≥ h. By definition, measuring q precisely eliminates the possibility of making a precise measurement of p, so it is impossible to measure them simultaneously and satisfy the relation <q + p> = <q> + <p>.
Further analysis, which Hermann supplied and Bell presented more fully, shows exactly why this invalidates or at least strongly limits the applicability of von Neumann’s proof; but Hermann caught the essence of the error first. Bell did not recognize or cite Hermann’s work, most probably because it was hardly known to the physics community until years after his 1966 paper.
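The same point can be seen numerically with spin observables standing in for position and momentum (an illustration of the Bell–Hermann objection in the simplest possible setting, not an example from either of their papers): quantum expectation values of non-commuting observables do add, but the individual measurement outcomes – the eigenvalues – do not, which is what makes additivity such a strong assumption for hypothetical definite values.

```python
import numpy as np

# Spin-1/2 illustration: expectation values add, individual outcomes (eigenvalues) do not.
sx = np.array([[0.0, 1.0], [1.0, 0.0]])   # Pauli sigma_x, possible outcomes +1 or -1
sz = np.array([[1.0, 0.0], [0.0, -1.0]])  # Pauli sigma_z, possible outcomes +1 or -1

psi = np.array([1.0, 1.0]) / np.sqrt(2)   # an arbitrary test state

def expectation(op):
    return psi @ op @ psi

# Additivity holds for quantum expectation values: <r + s> = <r> + <s>
print(expectation(sx + sz), expectation(sx) + expectation(sz))   # 1.0  1.0

# ...but the outcomes of measuring (sigma_x + sigma_z) are +/- sqrt(2),
# which are never sums of the +/-1 outcomes of sigma_x and sigma_z separately.
print(np.linalg.eigvalsh(sx + sz))   # [-1.414...,  1.414...]
```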
A new view of causality
After rebutting von Neumann’s proof in her 1935 essay, Hermann didn’t actually turn to hidden variables. Instead, Hermann went in a different and surprising direction, probably as a result of her discussions with Heisenberg. She accepted that quantum mechanics is a complete theory that makes only statistical predictions, but proposed an alternative view of causality within this interpretation.
We cannot foresee precise causal links in a quantum mechanics that is statistical, she wrote. But once a measurement has been made with a known result, we can work backwards to get a cause that led to that result. In fact, Hermann showed exactly how to do this with various examples. In this way, she maintains, quantum mechanics does not refute the general Kantian category of causality.
Not all philosophers have been satisfied by the idea of retroactive causality. But writing in The Oxford Handbook of the History of Quantum Interpretations, Crull says that Hermann “provides the contours of a neo-Kantian interpretation of quantum mechanics”. “With one foot squarely on Kant’s turf and the other squarely on Bohr’s and Heisenberg’s,” Crull concludes, “[Hermann’s] interpretation truly stands on unique ground.”
But Hermann’s 1935 paper did more than just upset von Neumann’s proof. In the article, she shows a deep and subtle grasp of elements of the Copenhagen interpretation such as its correspondence principle, which says that – in the limit of large quantum numbers – answers derived from quantum physics must approach those from classical physics.
The paper also shows that Hermann was fully aware – and indeed extended the meaning – of the implications of Heisenberg’s thought experiment that he used to illustrate the uncertainty principle. Heisenberg envisaged a photon colliding with an electron, but after that contact, she writes, the wave function of the physical system is a linear combination of terms, each being “the product of one wave function describing the electron and one describing the light quantum”.
As she went on to say, “The light quantum and the electron are thus not described each by itself, but only in their relation to each other. Each state of the one is associated with one of the other.” Remarkably, this amounts to an early perception of quantum entanglement, which Schrödinger described and named later in 1935. There is no evidence, however, that Schrödinger knew of Hermann’s insights.
Hermann’s legacy
On the centenary of the birth of a full theory of quantum mechanics, how should we remember Hermann? According to Crull, the early founders of quantum mechanics were “asking philosophical questions about the implications of their theory [but] none of these men were trained in both physics and philosophy”. Hermann, however, was an expert in the two. “[She] composed a brilliant philosophical analysis of quantum mechanics, as only one with her training and insight could have done,” Crull says.
Sadly for Hermann, few physicists at the time were aware of her 1935 paper even though she had sent copies to some of them. Had it been more widely known, her paper could have altered the early development of quantum mechanics. Reading it today shows how Hermann’s style of incisive logical examination can bring new understanding.
Hermann leaves other legacies too. As the Second World War drew to a close, she started writing about the ethics of science, especially the way in which it was carried out under the Nazis. After the war, she returned to Germany, where she devoted herself to pedagogy and teacher training. She disseminated Nelson’s views as well as her own through the reconstituted PPA, and took on governmental positions where she worked to rebuild the German educational system, apparently to good effect according to contemporary testimony.
Hermann also became active in politics as an adviser to the Social Democratic Party. She continued to have an interest in quantum mechanics, but it is not clear how seriously she pursued it in later life, which saw her move back to Bremen to care for an ill comrade from her early socialist days.
Hermann’s achievements first came to light in 1974 when the physicist and historian Max Jammer revealed her 1935 critique of von Neumann’s proof in his book The Philosophy of Quantum Mechanics. Following Hermann’s death in Bremen on 15 April 1984, interest slowly grew, culminating in Crull and Bacciagaluppi’s 2016 landmark study Grete Hermann: Between Physics and Philosophy.
The life of this deep thinker, who also worked to educate others and to achieve worthy societal goals, remains an inspiration for any scientist or philosopher today.
This article forms part of Physics World’s contribution to the 2025 International Year of Quantum Science and Technology (IYQ), which aims to raise global awareness of quantum physics and its applications.
Stay tuned to Physics World and our international partners throughout the next 12 months for more coverage of the IYQ.
Find out more on our quantum channel.