Swissto12 announced Feb. 27 its first major contract for electronically steered antennas, securing a deal to supply SES with ground terminals for the Luxembourg-based operator’s O3b mPower medium Earth orbit network.
A new technique for using frequency combs to measure trace concentrations of gas molecules has been developed by researchers in the US. The team reports single-digit parts-per-trillion detection sensitivity and broadband spectral coverage spanning more than 1000 cm⁻¹. This record-level sensing performance could open up a variety of hitherto inaccessible applications in fields such as medicine, environmental chemistry and chemical kinetics.
Each molecular species absorbs light at a characteristic set of frequencies, so shining light through a sample of gas and measuring this absorption can reveal the molecular composition of the gas.
Cavity ringdown spectroscopy is an established way to increase the sensitivity of absorption spectroscopy, and it needs no calibration. Laser light is injected into a cavity formed by two highly reflective mirrors, creating an optical standing wave. A sample of gas introduced into the cavity is therefore traversed by the light, typically many thousands of times. The absorption of light by the gas is then determined from the rate at which the intracavity light intensity “rings down” – in other words, the rate at which the standing wave decays away.
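In the standard ring-down analysis (a textbook relation, not spelled out in the article), the absorption coefficient follows from comparing the decay times measured with and without the sample:

\alpha(\nu) = \frac{1}{c}\left(\frac{1}{\tau(\nu)} - \frac{1}{\tau_0}\right)

where \tau_0 is the ring-down time of the empty cavity, \tau(\nu) is the ring-down time with the gas loaded, \nu is the optical frequency and c is the speed of light. Because both decay times are absolute measurements, no calibration against a reference sample is required.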
Researchers have used this method with frequency comb lasers to probe the absorption of gas samples at a range of different light frequencies. A frequency comb produces light at a series of very sharp intensity peaks that are equidistant in frequency – resembling the teeth of a comb.
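In the usual notation (again a standard result rather than one quoted in the article), the frequency of the nth comb tooth is

f_n = f_{\mathrm{ceo}} + n f_{\mathrm{rep}}

where f_{\mathrm{rep}} is the repetition rate, which sets the spacing of the teeth, and f_{\mathrm{ceo}} is the carrier-envelope offset frequency.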
Shifting resonances
However, the more reflective the mirrors (that is, the higher the cavity finesse), the narrower each cavity resonance becomes. Because the resonance frequencies are not evenly spaced, and can be shifted substantially by the gas loaded into the cavity, the usual approach is to oscillate the cavity length so that every resonance sweeps back and forth across its corresponding comb line. Multiple resonances are sequentially excited, and the transient comb intensity dynamics are captured by a camera after the light has been spatially separated by an optical grating.
“That experimental scheme works in the near-infrared, but not in the mid-infrared,” says Qizhong Liang. “Mid-infrared cameras are not fast enough to capture those dynamics yet.” This is a problem because the mid-infrared is where many molecules can be identified by their unique absorption spectra.
Liang is a member of Jun Ye’s group at JILA in Colorado, which has shown that it is possible to measure transient comb dynamics simply with a Michelson interferometer. The spectrometer requires only beam splitters, a delay stage and photodetectors. The researchers worked out that the periodically generated intensity dynamics arising from each tooth of the frequency comb can be detected as a set of Fourier components offset by Doppler frequency shifts. The absorption by the loaded gas can thus be determined.
Dithering the cavity
This process of reading out the transient dynamics generated by “dithering” the cavity, using a passive Michelson interferometer, is much simpler than previous set-ups and can therefore be used by people with little experience with combs, says Liang. It also places no restrictions on cavity finesse, spectral resolution or spectral coverage. “If you’re dithering the cavity resonances, then no matter how narrow the cavity resonance is, it’s guaranteed that the comb lines can be deterministically coupled to the cavity resonance twice per cavity round trip modulation,” he explains.
The researchers reported detecting various molecules at concentrations as low as parts-per-billion, with parts-per-trillion uncertainty, in exhaled air from volunteers. These included biomedically relevant molecules such as acetone, which is a sign of diabetes, and formaldehyde, which is diagnostic of lung cancer. “Detection of molecules in exhaled breath in medicine has been done in the past,” explains Liang. “The more important point here is that, even if you have no prior knowledge about what the gas sample composition is, be it in industrial applications, environmental science applications or whatever, you can still use it.”
Konstantin Vodopyanov of the University of Central Florida in Orlando comments: “This achievement is remarkable, as it integrates two cutting-edge techniques: cavity ringdown spectroscopy, where a high-finesse optical cavity dramatically extends the laser beam’s path to enhance sensitivity in detecting weak molecular resonances, and frequency combs, which serve as a precise frequency ruler composed of ultra-sharp spectral lines. By further refining the spectral resolution to the Doppler broadening limit of less than 100 MHz and referencing the absolute frequency scale to a reliable frequency standard, this technology holds great promise for applications such as trace gas detection and medical breath analysis.”
As the 119th Congress convenes and the Trump administration returns to the helm, the advanced air mobility (AAM) industry stands at a pivotal juncture. Air taxis and other next generation […]
In this week's episode of Space Minds, Robert Zubrin, president of the Mars Society, sits down with host David Ariosto. With the debate heating up over exploration priorities, Zubrin lays out how — and why — humanity could become a multiplanetary species by heading to Mars. Watch — or listen — to learn more about Zubrin's vision for life on Mars and how it will be molded — and help mold — society back on Earth.
In this episode of the Physics World Weekly podcast, online editor Margaret Harris chats about her recent trip to CERN. There, she caught up with physicists working on some of the lab’s most exciting experiments and heard from CERN’s current and future leaders.
Founded in Geneva in 1954, today CERN is most famous for the Large Hadron Collider (LHC), which is currently in its winter shutdown. Harris describes her descent 100 m below ground level to visit the huge ATLAS detector and explains why some of its components will soon be updated as part of the LHC’s upcoming high luminosity upgrade.
She explains why new “crab cavities” will boost the number of particle collisions at the LHC. Among other things, this will allow physicists to better study how Higgs bosons interact with each other, which could provide important insights into the early universe.
Harris describes her visit to CERN’s Antimatter Factory, which hosts several experiments that are benefitting from a 2021 upgrade to the lab’s source of antiprotons. These experiments measure properties of antimatter – such as its response to gravity – to see if its behaviour differs from that of normal matter.
Harris also heard about the future of the lab from CERN’s director general Fabiola Gianotti and her successor Mark Thomson, who will take over next year.
HELSINKI — China added to a commercial high-resolution remote sensing constellation early Thursday with the launch of a pair of SuperView Neo-1 satellites. A Long March 2C rocket lifted off […]
Daily life at US-run Antarctic stations has already been disrupted. Scientists worry that the long-term impacts could upend not only important research but the continent’s delicate geopolitics.
Something extraordinary happened on Earth around 10 million years ago, and whatever it was, it left behind a “signature” of radioactive beryllium-10. This finding, which is based on studies of rocks located deep beneath the ocean, could be evidence for a previously-unknown cosmic event or major changes in ocean circulation. With further study, the newly-discovered beryllium anomaly could also become an independent time marker for the geological record.
Most of the beryllium-10 found on Earth originates in the upper atmosphere, where it forms when cosmic rays interact with oxygen and nitrogen molecules. Afterwards, it attaches to aerosols, falls to the ground and is transported into the oceans. Eventually, it reaches the seabed and accumulates, becoming part of what scientists call one of the most pristine geological archives on Earth.
Because beryllium-10 has a half-life of 1.4 million years, it is possible to use its abundance to pin down the dates of geological samples that are more than 10 million years old. This is far beyond the limits of radiocarbon dating, which relies on an isotope (carbon-14) with a half-life of just 5730 years, and can only date samples less than 50 000 years old.
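As a rough illustration of how a half-life converts a measured surviving fraction into an age, here is a minimal sketch; the numbers are purely illustrative and are not taken from the study.

```python
import math

def age_from_fraction(fraction_remaining, half_life_years):
    """Radiometric age from the surviving fraction of a decaying isotope."""
    # N(t) = N0 * (1/2)**(t / t_half)  =>  t = t_half * log2(N0 / N)
    return half_life_years * math.log2(1.0 / fraction_remaining)

# About 0.7% of an initial beryllium-10 inventory survives after roughly
# seven half-lives of 1.4 million years, i.e. on the order of 10 million years.
print(age_from_fraction(0.007, 1.4e6))   # ~1.0e7 years

# By contrast, carbon-14 (half-life 5730 years) is essentially gone long
# before such timescales, which is why radiocarbon dating tops out at
# around 50,000 years.
print(age_from_fraction(0.003, 5730))    # ~48,000 years
```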
Almost twice as much ¹⁰Be as expected
In the new work, which is detailed in Nature Communications, physicists in Germany and Australia measured the amount of beryllium-10 in geological samples taken from the Pacific Ocean. The samples are primarily made up of iron and manganese and formed slowly over millions of years. To date them, the team used a technique called accelerator mass spectrometry (AMS) at the Helmholtz-Zentrum Dresden-Rossendorf (HZDR). This method can distinguish beryllium-10 from its decay product, boron-10, which has the same mass, and from other beryllium isotopes.
The researchers found that samples dated to around 10 million years ago, a period known as the late Miocene, contained almost twice as much beryllium-10 as they expected to see. The source of this overabundance is a mystery, says team member Dominik Koll, but he offers three possible explanations. The first is that changes to the ocean circulation near the Antarctic, which scientists recently identified as occurring between 10 and 12 million years ago, could have distributed beryllium-10 unevenly across the Earth. “Beryllium-10 might thus have become particularly concentrated in the Pacific Ocean,” says Koll, a postdoctoral researcher at TU Dresden and an honorary lecturer at the Australian National University.
Another possibility is that a supernova exploded in our galactic neighbourhood 10 million years ago, producing a temporary increase in cosmic radiation. The third option is that the Sun’s magnetic shield, which deflects cosmic rays away from the Earth, became weaker through a collision with an interstellar cloud, making our planet more vulnerable to cosmic rays. Unlike the first explanation, either of these scenarios would have increased the amount of beryllium-10 that fell to Earth without affecting its geographic distribution.
To distinguish between these competing hypotheses, the researchers now plan to analyse additional samples from different locations on Earth. “If the anomaly were found everywhere, then the astrophysics hypothesis would be supported,” Koll says. “But if it were detected only in specific regions, the explanation involving altered ocean currents would be more plausible.”
Whatever the reason for the anomaly, Koll suggests it could serve as a cosmogenic time marker for periods spanning millions of years, the likes of which do not yet exist. “We hope that other research groups will also investigate their deep-ocean samples in the relevant period to eventually come to a definitive answer on the origin of the anomaly,” he tells Physics World.
The private firm Intuitive Machines has launched a lunar lander to test extraction methods for water and volatile gases. The six-legged Moon lander, dubbed Athena, took off yesterday aboard a SpaceX Falcon 9 rocket from NASA’s Kennedy Space Center in Florida. Also aboard the rocket was NASA’s Lunar Trailblazer – a lunar orbiter that will investigate the Moon’s water and geology.
In February 2024, Intuitive Machines’ Odysseus mission became the first US mission to make a soft landing on the Moon since Apollo 17 and the first private craft to do so. After a few hiccups during landing, the mission carried out measurements with an optical and radio telescope before it ended seven days later.
Athena is Intuitive Machines’ second lunar lander, part of the company’s quest to build the infrastructure on the Moon that will be required for long-term lunar exploration.
The lander, which stands almost five metres tall, aims to touch down in the Mons Mouton region, about 160 km from the lunar south pole.
It will use a drill to bore one metre into the surface and test the extraction of substances – including volatiles such as carbon dioxide as well as water – that it will then analyse with a mass spectrometer.
Athena also contains a “hopper” dubbed Grace that can travel up to 25 kilometres on the lunar surface. Carrying about 10 kg of payloads, the rocket-propelled drone will aim to take images of the lunar surface and explore nearby craters.
As well as Grace, Athena carries two rovers. MAPP, built by Lunar Outpost, will autonomously navigate the lunar surface, while a small, lightweight rover dubbed Yaoki, built by the Japanese firm Dymon, will explore the Moon within 50 metres of the lander.
Athena is part of NASA’s $2.6bn Commercial Lunar Payload Services initiative, which contracts the private sector to develop missions with the aim of reducing costs.
Taking the Moon’s temperature
Lunar Trailblazer, meanwhile, will spend two years orbiting the Moon in a 100 km altitude polar orbit. Weighing 200 kg and about the size of a washing machine, it will map the distribution of water on the Moon’s surface about 12 times a day with a resolution of about 50 metres.
While it is known that water exists on the lunar surface, little is known about its form, abundance, distribution or how it arrived. Hypotheses range from “wet” asteroids crashing into the Moon to volcanic eruptions producing water vapour from the Moon’s interior.
Water hunter: NASA’s Lunar Trailblazer will spend two years mapping the distribution of water on the surface of the Moon (courtesy: Lockheed Martin Space for Lunar Trailblazer)
To help answer that question, the craft will examine water deposits via an imaging spectrometer dubbed the High-resolution Volatiles and Minerals Moon Mapper that has been built by NASA’s Jet Propulsion Laboratory.
A thermal mapper developed by the University of Oxford, meanwhile, will plot the temperature of the Moon’s surface and help to confirm the presence and location of water.
Lunar Trailblazer was selected in 2019 as part of NASA’s Small Innovative Missions for Planetary Exploration programme.
Faced with rising costs and supply chain disruptions, the space industry is scrambling to mitigate the financial strain of the Trump administration’s “America First” trade and tariff policies.
More than 50 people have died in the Democratic Republic of the Congo, most within 48 hours of the onset of symptoms. Initial analysis suggests neither Ebola nor Marburg is the cause.
Learn about NASA's Lunar Trailblazer and IM-2, two missions that will soon launch into space to find clues on the presence of water underneath the Moon's surface.
Learn how a West African excavation shows that early humans lived in multiple ecosystems simultaneously, challenging our understanding of human history.
The Government Accountability Office specifically raises concerns about the technical maturity of the Proliferated Warfighter Space Architecture’s laser communications technology.
The Space Force Association issued a strong rebuttal to a Mitchell Institute report that argued the Space Force is more focused on competitive endurance than on defeating adversaries
While the biology of how an entire organism develops from a single cell has long been a source of fascination, recent research has increasingly highlighted the role of mechanical forces. “If we want to have rigorous predictive models of morphogenesis, of tissues and cells forming organs of an animal,” says Konstantin Doubrovinski at the University of Texas Southwestern Medical Center, “it is absolutely critical that we have a clear understanding of material properties of these tissues.”
Now Doubrovinski and his colleagues report a rheological study explaining why the developing fruit fly (Drosophila melanogaster) epithelial tissue stretches as it does over time to allow the embryo to change shape.
Previous studies had shown that under a constant force, tissue extension was proportional to the time the force had been applied to the power of one half. This had puzzled the researchers, since it did not fit a simple model in which epithelial tissues behave like linear springs. In such a model, the extension obeys Hooke’s law and is proportional to the force applied alone, such that the exponent of time in the relation would be zero.
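Written out explicitly (a restatement of the scaling described above, not a formula quoted from the paper), the two behaviours differ only in how the extension x depends on time t at constant applied force F:

x(t) \propto t^{1/2} \quad \text{(observed)}, \qquad x \propto F,\ \text{independent of } t \quad \text{(Hookean spring, exponent zero)}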
They and other groups had tried to explain this observation of an exponent equal to 0.5 as due to the viscosity of the medium surrounding the cells, which would lead to deformation near the point of pulling that then gradually spreads. However, their subsequent experiments ruled out viscosity as a cause of the non-zero exponent.
Tissue pulling experiments: schematic showing how a ferrofluid droplet positioned inside one cell is used to stretch the epithelium via an external magnetic field. The lower images are snapshots from an in vivo measurement. (Courtesy: Konstantin Doubrovinski/bioRxiv 10.1101/2023.09.12.557407)
For their measurements, the researchers had exploited a convenient feature of Drosophila epithelial cells – a small hole through which they could draw a droplet of ferrofluid into a cell using a permanent magnet. Once the droplet was inside, a magnet acting on it could exert forces on the cell to stretch the surrounding tissue.
For the current study, the researchers first tested the observed scaling law over longer periods of time. A power law gives a straight line on a log–log plot, but, as Doubrovinski points out, many curves look like straight lines over short sections. However, even when they extended the time scales probed in their experiments to cover three orders of magnitude – from fractions of a second to several minutes – the observed power law still held.
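As a simple illustration of how such an exponent is extracted (using synthetic stand-in data, not the team’s measurements), a power law appears as a straight line in log–log coordinates and its slope gives the exponent:

```python
import numpy as np

# Synthetic stand-in data: extension ~ t**0.5 sampled over three decades in
# time, with a little multiplicative noise.
rng = np.random.default_rng(0)
t = np.logspace(-1, 2, 30)                    # 0.1 s to 100 s
x = t**0.5 * rng.lognormal(sigma=0.05, size=t.size)

# A power law x = A * t**p is a straight line in log-log coordinates:
# log x = p * log t + log A, so a linear fit recovers the exponent p.
p, logA = np.polyfit(np.log(t), np.log(x), 1)
print(f"fitted exponent: {p:.2f}")            # close to 0.5
```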
Understanding the results
One of the postdocs on the team – Mohamad Ibrahim Cheikh – stumbled upon the relation behind the 0.5 exponent while working on a largely unrelated problem. He had been modelling ellipsoids in a hexagonal meshwork on a surface, in what Doubrovinski describes as a “large” and “relatively complex” simulation. He decided to examine what would happen if he allowed the mesh to relax in its stretched position, which would model the process of actin turnover in cells.
Cheikh’s simulation gave the power law observed in the epithelial cells. “We totally didn’t expect it,” says Doubrovinski. “We pursued it and thought, why are we getting it? What’s going on here?”
Although this simulation yielded the power law with an exponent of 0.5, it was so complex that it was hard to get a handle on why. “There are all these different physical effects that we took into account that we thought were relevant,” he tells Physics World.
To get a more intuitive understanding of the system, the researchers attempted to simplify the model into a lattice of springs in one dimension, keeping only some of the physical effects from the simulations, until they identified the effects required to give the exponent value of 0.5. They could then scale this simplified one-dimensional model back up to three dimensions and test how it behaved.
According to their model, if they changed the magnitude of various parameters, they should be able to rescale the curves so that they essentially collapse onto a single curve. “This makes our prediction falsifiable,” says Doubrovinski, and in fact the experimental curves could be rescaled in this way.
When the researchers used measured values for the relaxation constant based on the actin turnover rate, along with other known parameters such as the size of the force and the size of the extension, they were able to calculate the force constant of the epithelial cell. This value also agreed with their previous estimates.
Doubrovinski explains how the ferrofluid droplet engages with individual “springs” of the lattice as it moves through the mesh. “The further it moves, the more springs it catches on,” he says. “So the rapid increase of one turns into a slow increase with an exponent of 0.5.” With this model, all the pieces fall into place.
“I find it inspiring that the authors, first motivated by in vivo mechanical measurements, could develop a simple theory capturing a new phenomenological law of tissue rheology,” says Pierre-François Lenne, group leader at the Institut de Biologie du Développement de Marseille at Aix-Marseille University. Lenne specializes in the morphogenesis of multicellular systems but was not involved in the current research.
Next, Doubrovinski and his team are keen to see where else their results might apply, such as other developmental stages and other organisms, including mammals.
Quantum-inspired “tensor networks” can simulate the behaviour of turbulent fluids in just a few hours rather than the several days required for a classical algorithm. The new technique, developed by physicists in the UK, Germany and the US, could advance our understanding of turbulence, which has been called one of the greatest unsolved problems of classical physics.
Turbulence is all around us, found in weather patterns, water flowing from a tap or a river and in many astrophysical phenomena. It is also important for many industrial processes. However, the way in which turbulence arises and then sustains itself is still not understood, despite the seemingly simple and deterministic physical laws governing it.
The reason for this is that turbulence is characterized by large numbers of eddies and swirls of differing shapes and sizes that interact in chaotic and unpredictable ways across a wide range of spatial and temporal scales. Such fluctuations are difficult to simulate accurately, even using powerful supercomputers, because doing so requires solving sets of coupled partial differential equations on very fine grids.
An alternative is to treat turbulence in a probabilistic way. In this case, the properties of the flow are defined as random variables that are distributed according to mathematical relationships called joint Fokker-Planck probability density functions. These functions are neither chaotic nor multiscale, so they are straightforward to derive. They are nevertheless challenging to solve because of the high number of dimensions contained in turbulent flows.
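For orientation, a generic one-dimensional Fokker-Planck equation for a probability density f(v, t) has the form (the joint densities used in turbulence modelling are far higher-dimensional, but the structure is the same):

\frac{\partial f}{\partial t} = -\frac{\partial}{\partial v}\big[A(v)\,f\big] + \frac{1}{2}\frac{\partial^2}{\partial v^2}\big[B(v)\,f\big]

where A plays the role of a drift term and B of a diffusion term.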
For this reason, the probability density function approach was widely considered to be computationally infeasible. In response, researchers turned to indirect Monte Carlo algorithms to perform probabilistic turbulence simulations. However, while this approach has chalked up some notable successes, it can be slow to yield results.
Highly compressed “tensor networks”
To overcome this problem, a team led by Nikita Gourianov of the University of Oxford, UK, decided to encode turbulence probability density functions as highly compressed “tensor networks” rather than simulating the fluctuations themselves. Such networks have already been used to simulate otherwise intractable quantum systems like superconductors, ferromagnets and quantum computers, they say.
These quantum-inspired tensor networks represent the turbulence probability distributions in a hyper-compressed format, which then allows them to be simulated. By simulating the probability distributions directly, the researchers can then extract important parameters, such as lift and drag, that describe turbulent flow.
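The core idea can be illustrated with a toy example (a generic tensor-train compression via truncated SVDs, not the authors’ code): a dense, discretized multivariate distribution is broken into a chain of small three-index cores – the building blocks of matrix-product-state tensor networks – whose total size is a tiny fraction of the original grid.

```python
import numpy as np

def tensor_train(tensor, max_rank):
    """Compress a dense N-dimensional array into tensor-train (MPS) cores
    using successive truncated SVDs."""
    cores, rank = [], 1
    dims = tensor.shape
    mat = tensor.reshape(1, -1)
    for d in dims[:-1]:
        mat = mat.reshape(rank * d, -1)
        u, s, vt = np.linalg.svd(mat, full_matrices=False)
        new_rank = min(max_rank, s.size)
        cores.append(u[:, :new_rank].reshape(rank, d, new_rank))
        mat = np.diag(s[:new_rank]) @ vt[:new_rank]   # carry the remainder forward
        rank = new_rank
    cores.append(mat.reshape(rank, dims[-1], 1))
    return cores

# Toy 4D "probability density" on a 16^4 grid (illustrative, not a real turbulence PDF).
grid = np.linspace(-1.0, 1.0, 16)
x, y, z, w = np.meshgrid(grid, grid, grid, grid, indexing="ij")
pdf = np.exp(-(x**2 + y**2 + z**2 + w**2) - 0.3 * x * y)
pdf /= pdf.sum()

cores = tensor_train(pdf, max_rank=4)
print(pdf.size, "grid values ->", sum(c.size for c in cores), "tensor-train entries")
```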
Importantly, the new technique allows an ordinary single CPU (central processing unit) core to compute a turbulent flow in just a few hours, compared to several days using a classical algorithm on a supercomputer.
This significantly improved way of simulating turbulence could be particularly useful for chemically reactive flows in areas such as combustion, says Gourianov. “Our work also opens up the possibility of probabilistic simulations for all kinds of chaotic systems, including weather or perhaps even the stock markets,” he adds.
The researchers now plan to apply tensor networks to deep learning, a form of machine learning that uses artificial neural networks. “Neural networks are famously over-parameterized and there are several publications showing that they can be compressed by orders of magnitude in size simply by representing their layers as tensor networks,” Gourianov tells Physics World.
In every vibrant technology company, the efforts of talented engineers and technicians are divided between two equally important areas: research and development (R&D) and operations. Both areas require ingenuity, attention […]
The second lunar lander mission by Intuitive Machines is set to launch, taking to the moon NASA and commercial payloads as well as several rideshare spacecraft.
Vacuum technology is routinely used in both scientific research and industrial processes. In physics, high-quality vacuum systems make it possible to study materials under extremely clean and stable conditions. In industry, vacuum is used to lift, position and move objects precisely and reliably. Without these technologies, a great deal of research and development would simply not happen. But for all its advantages, working under vacuum does come with certain challenges. For example, once something is inside a vacuum system, how do you manipulate it without opening the system up?
Heavy duty: The new transfer arm. (Courtesy: UHV Design)
The UK-based firm UHV Design has been working on this problem for over a quarter of a century, developing and manufacturing vacuum manipulation solutions for new research disciplines as well as emerging industrial applications. Its products, which are based on magnetically coupled linear and rotary probes, are widely used at laboratories around the world, in areas ranging from nanoscience to synchrotron and beamline applications. According to engineering director Jonty Eyres, the firm’s latest innovation – a new sample transfer arm released at the beginning of this year – extends this well-established range into new territory.
“The new product is a magnetically coupled probe that allows you to move a sample from point A to point B in a vacuum system,” Eyres explains. “It was designed to have an order of magnitude improvement in terms of both linear and rotary motion thanks to the magnets in it being arranged in a particular way. It is thus able to move and position objects that are much heavier than was previously possible.”
The new sample arm, Eyres explains, is made up of a vacuum “envelope” comprising a welded flange and tube assembly. This assembly has an outer magnet array that magnetically couples to an inner magnet array attached to an output shaft. The output shaft extends beyond the mounting flange and incorporates a support bearing assembly. “Depending on the model, the shafts can either be in one or more axes: they move samples around either linearly, linear/rotary or incorporating a dual axis to actuate a gripper or equivalent elevating plate,” Eyres says.
Continual development, review and improvement
While similar devices are already on the market, Eyres says that the new product has a significantly larger magnetic coupling strength in terms of its linear thrust and rotary torque. These features were developed in close collaboration with customers who expressed a need for arms that could carry heavier payloads and move them with more precision. In particular, Eyres notes that in the original product, the maximum weight that could be placed on the end of the shaft – a parameter that depends on the stiffness of the shaft as well as the magnetic coupling strength – was too small for these customers’ applications.
“From our point of view, it was not so much the magnetic coupling that needed to be reviewed, but the stiffness of the device in terms of the size of the shaft that extends out to the vacuum system,” Eyres explains. “The new arm deflects much less from its original position even with a heavier load and when moving objects over longer distances.”
The new product – a scaled-up version of the original – can support a load of up to 50 N (equivalent to a mass of about 5 kg) over an axial stroke of up to 1.5 m. Eyres notes that it also requires minimal maintenance, which is important when moving higher loads. “It is thus targeted to customers who wish to move larger objects around over longer periods of time without having to worry about intervening too often,” he says.
Moving multiple objects
As well as moving larger, single objects, the new arm’s capabilities make it suitable for moving multiple objects at once. “Rather than having one sample go through at a time, we might want to nest three or four samples onto a large plate, which inevitably increases the size of the overall object,” Eyres explains.
Before they created this product, he continues, he and his UHV Design colleagues were not aware of any magnetic coupled solution on the marketplace that enabled users to do this. “As well as being capable of moving heavy samples, our product can also move lighter samples, but with a lot less shaft deflection over the stroke of the product,” he says. “This could be important for researchers, particularly if they are limited in space or if they wish to avoid adding costly supports in their vacuum system.”
Blue Origin conducted the tenth crewed flight of its New Shepard suborbital vehicle Feb. 25, carrying six people, one of whom was at least semi-anonymous.
Learn about the tracks and footprints at White Sands National Park in New Mexico, possibly representing the earliest evidence of human-made transportation technology.