
Magnetically controlled prosthetic hand restores fine motion control

By Tami Freeman
16 September 2024 at 17:30

A magnetically controlled prosthetic hand, tested for the first time in a participant with an amputated lower arm, provided fine control of hand motion and enabled the user to perform everyday actions and grasp fragile objects. The robotic prosthetic, developed by a team at Scuola Superiore Sant’Anna in Pisa, uses tiny implanted magnets to predict and carry out intended movements.

Losing a hand can severely affect a person’s ability to perform everyday work and social activities, and many researchers are investigating ways to restore lost motor function via prosthetics. Most available or proposed strategies rely on deciphering electrical signals from residual nerves and muscles to control bionic limbs. But this myoelectric approach cannot reproduce the dexterous movements of a human hand.

Instead, Christian Cipriani and colleagues developed an alternative technique that exploits the physical displacement of skeletal muscles to decode the user’s motor intentions. The new myokinetic interface uses permanent magnets implanted into the residual muscles of the user’s amputated arm to accurately control finger movements of a robotic hand.

“Standard myoelectric prostheses collect non-selective signals from the muscle surface and, due to that low selectivity, typically support only two movements,” explains first author Marta Gherardini. “In contrast, myokinetic control enables simultaneous and selective targeting of multiple muscles, significantly increasing the number of control sources and, consequently, the number of recognizable movements.”

First-in-human test

The first patient to test the new prosthesis was a 34-year-old named Daniel, who had recently lost his left hand and had started to use a myoelectric prosthesis. The team selected him as a suitable candidate because his amputation was recent and blunt, he could still feel the lost hand, and the residual muscles in his arm moved in response to his intentions.

For the study, the team implanted six cylindrical (2 mm radius and height) neodymium magnets coated with a biocompatible shell into three muscles in Daniel’s residual forearm. In a minimally invasive procedure, the surgeon used plastic instruments to manipulate the magnets into the tip of the target muscles and align their magnetic fields, verifying their placement using ultrasound.

Daniel also wore a customized carbon fibre prosthetic arm containing all of the electronics needed to track the magnets’ locations in space. When he activates the residual muscles in his arm, the implanted magnets move in response to the muscle contractions. A grid of 140 magnetic field sensors in the prosthesis detects the position and orientation of these magnets and transmits the data to an embedded computing unit. Finally, a pattern recognition algorithm translates the movements into control signals for a Mia-Hand robotic hand.
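The article does not spell out the team’s pattern-recognition algorithm, so the sketch below only illustrates the general idea: a many-channel magnetic snapshot is reduced to a discrete hand command by a simple nearest-centroid classifier. The gesture set, the read_sensors and send_hand_command interfaces and the training-data format are all hypothetical stand-ins, not the actual Mia-Hand control stack.

```python
# Minimal sketch of a myokinetic-style control loop (illustrative only; the
# real pipeline, sensor layout and classifier are not described in the article).
import numpy as np

N_SENSORS = 140  # number of magnetic field sensors quoted in the article
GESTURES = ["rest", "open", "power_grasp", "pinch"]  # hypothetical gesture set


def train_centroids(samples: dict) -> dict:
    """samples maps gesture name -> (n_trials, N_SENSORS) array of recorded snapshots."""
    return {gesture: trials.mean(axis=0) for gesture, trials in samples.items()}


def classify(reading: np.ndarray, centroids: dict) -> str:
    """Nearest-centroid pattern recognition: choose the gesture whose training
    centroid is closest to the current 140-channel sensor snapshot."""
    return min(centroids, key=lambda g: np.linalg.norm(reading - centroids[g]))


def control_loop(read_sensors, send_hand_command, centroids):
    """read_sensors() returns a length-140 array; send_hand_command(gesture)
    drives the robotic hand. Both are hypothetical hardware interfaces."""
    while True:
        reading = np.asarray(read_sensors(), dtype=float)
        send_hand_command(classify(reading, centroids))
```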

Gherardini notes that the pattern recognition algorithm rapidly learnt to control the hand based on Daniel’s intended movements. “Training the algorithm took a few minutes, and it was immediately able to correctly recognize the movements,” she says.

In addition to the controlled hand motion arising from intended grasping, the team found that elbow movement activated other forearm muscles. Tissue near the elbow was also compressed by the prosthetic socket during elbow flexion, which caused unintended movement of nearby magnets. “We addressed this issue by estimating the elbow movement through the displacement of these magnets, and adjusting the position of the other magnets accordingly,” says Gherardini.

Test success: The robotic prosthesis allowed Daniel to grasp fragile objects such as a plastic cup. (Courtesy: © 2024 Scuola Superiore Sant’Anna)

During the six-week study, the team performed a series of functional tests commonly used to assess the dexterity of upper limb prostheses. Daniel successfully completed these tests, with comparable performance to that achieved using a traditional myoelectric prosthetic (in tests performed before the implantation surgery).

Importantly, he was able to control finger movements well enough to perform a wide range of everyday activities – such as unscrewing a water bottle cap, cutting with a knife, closing a zip, tying shoelaces and removing pills from a blister pack. He could also control the grasp force to manipulate fragile objects such as an egg and a plastic cup.

The researchers report that the myokinetic interface worked even better than they expected, with the results highlighting its potential to restore natural motor control in people who have lost limbs. “This system allowed me to recover lost sensations and emotions: it feels like I’m moving my own hand,” says Daniel in a press statement.

At the end of the six weeks, the team removed the magnets. Aside from low-grade inflammation around one magnet that had lost its protective shell, all of the surrounding tissue was healthy. “We are currently working towards a long-term solution by developing a magnet coating that ensures long-term biocompatibility, allowing users to eventually use this system at home,” Gherardini tells Physics World.

She adds that the team is planning to perform another test of the myokinetic prosthesis within the next two years.

The myokinetic prosthesis is described in Science Robotics.


NASA suffering from ageing infrastructure and inefficient management practices, finds report

16 September 2024 at 15:34

NASA has been warned that it may need to sacrifice new missions in order to rebalance the space agency’s priorities and achieve its long-term objectives. That is the conclusion of a new report – NASA at a Crossroads: Maintaining Workforce, Infrastructure, and Technology Preeminence in the Coming Decades – that finds a space agency battling on many fronts including ageing infrastructure, China’s growing presence in space, and issues recruiting staff.

The report was requested by Congress and published by the National Academies of Sciences, Engineering, and Medicine. It was written by a 13-member committee, which included representatives from industry, academia and government, and was chaired by Norman Augustine, former chief executive of Lockheed Martin. Members visited all nine NASA centres and talked to about 400 employees to compile the report.

While the panel say that NASA had “motivat[ed] many of the nation’s youth to pursue careers in science and technology” and “been a source of inspiration and pride to all Americans”, they highlight a variety of problems at the agency. Those include out-of-date infrastructure, a pressure to prioritize short-term objectives, budget mismatches, inefficient management practices, and an unbalanced reliance on commercial partners. Yet according to Augustine, the agency’s main problem is “the more mundane tendency to focus on near-term accomplishments at the expense of long-term viability”.

As well as external challenges such as China’s growing role in space, the committee discovered that many of the agency’s problems are homegrown. They found that 83% of NASA’s facilities are past their design lifetimes. For example, the capacity of the Deep Space Network, which provides critical communications support for uncrewed missions, “is inadequate” to support future craft and even current missions such as the Artemis Moon programme “without disrupting other projects”.

There is also competition from private space firms in both technology development and recruitment. According to the report, NASA is constrained by strict hiring rules and the salaries it can offer. It takes 81 days, on average, from the initial interview to an offer of employment. During that period, a candidate will probably receive offers from private firms, not only in the space industry but also in the “digital world”, which offer higher salaries.

In addition, Augustine notes, the agency is giving its engineers less opportunity “to get their hands dirty” by carrying out their own research. Instead, they are increasingly managing outside contractors who are doing the development work. At the same time, the report identifies a “major reduction” over the past few decades in basic research that is financed by industry – a trend that the report says is “largely attributable to shareholders seeking near-term returns as opposed to laying groundwork for the future”.

Yet the committee also finds that NASA faces “internal and external pressure to prioritize short-term measures” without considering longer-term needs and implications. “If left unchecked these pressures are likely to result in a NASA that is incapable of satisfying national objectives in the longer term,” the report states. “The inevitable consequence of such a strategy is to erode those essential capabilities that led to the organization’s greatness in the first place and that underpin its future potential.”

Cash woes

Another concern is the US government budget process that operates year by year and is slowly reducing NASA’s proportional share of funding. The report finds that the budget is “often incompatible with the scope, complexity, and difficulty of [NASA’s] work” and the funding allocation “has degraded NASA’s capabilities to the point where agency sustainability is in question”. Indeed, during the agency’s lifetime, US government spending on R&D has declined from 1.9% of gross domestic product to 0.7%. The panel also notes a trend of reducing investment in research and technology as a fraction of funds devoted to missions. “NASA is likely to face budgetary problems in the future that greatly exceed those we’ve seen in recent years,” Augustine told a briefing.

The panel now calls on NASA to work with Congress to establish “an annually replenished revolving fund – such as a working capital fund” to maintain and improve the agency’s infrastructure. It would be financed by the US government as well as users of NASA’s facilities and be “sufficiently capitalized to eliminate NASA’s current maintenance backlog over the next decade”. While it is unclear how the government and the agency will react to that proposal, as Augustine warned, for NASA, “this is not business as usual”.


Stop this historic science site in St Petersburg from being sold

16 September 2024 at 09:00

In the middle of one of the most expensive neighbourhoods in St Petersburg, Russia, is a vacant and poorly kept lot about half an acre in size. It’s been empty for years for a reason: on it stood the first scientific research laboratory in Russia – maybe even the world – and for over two and a half centuries generations of Russian scientists hoped to restore it. But its days as an empty lot may be over, for the land could soon be sold to the highest bidder.

The laboratory was the idea of Mikhail Lomonosov (1711–1765), Russia’s first scientist in the modern sense. Born in 1711 into a shipping family on an island in the far north of Russia, Lomonosov developed a passion for science that saw him study in Moscow, Kyiv and St Petersburg. He then moved to Germany, where he got involved in the then revolutionary, mathematically informed notion that matter is made up of smaller elements called “corpuscles”.

In 1741, at the age of 30, Lomonosov returned to Russia, where he joined the St Petersburg Academy of Science. There he began agitating for the academy to set up a physico-chemistry laboratory of its own. Until then, experimental labs in Russia and elsewhere had been mainly applied institutions for testing and developing paints, dyes and glasses, and for producing medicines and chemicals for gunpowder. But Lomonosov wanted something very different.

His idea was for a lab devoted entirely to basic research and development that could engage and train students to do empirical research on materials. Most importantly, he wanted the academy to run the lab, but the state to pay for it. After years of agitating, Lomonosov’s plan was approved, and the St Petersburg laboratory opened in 1748 on a handy site in the centre of St Petersburg, just a 20-minute walk from the academy, near the university, museums and the city’s famous bridges.

The laboratory was a remarkable place, equipped with furnaces, ovens, scales, thermometers, microscopes, grindstones and other instruments for studying materials

The laboratory was a remarkable place, equipped with furnaces, ovens, scales, thermometers, microscopes, grindstones and various other instruments for studying materials and their properties. Lomonosov and his students used these to analyse ores, minerals, silicates, porcelain, glasses and mosaics. He also carried out experiments with implications for fundamental theory.

In 1756, for instance, Lomonosov found that certain experiments involving the oxidation of lead carried out by the British chemist Robert Boyle were in error. Indirectly, Lomonosov also suggested a general law of conservation covering the total weight of chemically reacting substances. The law is, these days, usually attributed to the French chemist Antoine Lavoisier, who also came up with the notion three decades later. But Lomonosov’s work had suggested it.

A symbol for science

Lomonosov left the formal leadership of the laboratory in 1757, after which it was headed by several other academy professors. The lab continued to serve the academy’s research until 1793 when several misfortunes, including a flood and a robbery, led to it running down. Still, the lab has had huge significance as a symbol that Russian scientists have appealed to ever since as a model for more state support. It also inspired the setting-up of other chemical laboratories, including a similar facility built at Moscow University in 1755.

For the last two and a half centuries, however, the laboratory’s allies have struggled to keep the site from becoming just real estate in a pricey St Petersburg neighbourhood. In 1793 an academician bought the land from the Academy of Sciences and rebuilt the lab as housing, although preserving its foundations and the old walls. Over the next century the plot passed through a series of private owners, who again rebuilt the laboratory and the associated house.

The area was levelled again during the Siege of Leningrad in the Second World War, though the lab’s foundations remained intact. After the war, the Soviet Union tried to reconstruct the lab, as did the Russian Academy of Sciences. More recently, advocates have tried to rebuild the lab in time for the 300th anniversary of the Russian Academy of Science, which takes place in 2024–2025.

A piece of history: The current state of the Lomonosov lab plot in St Petersburg. Fermilab accelerator physicist Vladimir Shiltsev (with umbrella, left) visited the site with lab restoration enthusiasts in 2021. A limited view can be seen from outside the plot on 2nd Linia Street (top right). Locked gates prevent general access to the land. (Courtesy: Vladimir Shiltsev (left image); Victor Borisov (right-hand images))

All these attempts have failed. Meanwhile, ownership of the site was passed around several Russian administrative agencies, most recently to the Russian State Pedagogical University. Last March, the university put the land in the hands of a private real estate agent who advertised the site in a public notice with the statement that the land was “intended for scientific facilities”, without reference to the lab. The plot is supposed to open for bids this fall.

But scientists and historians worry about the vagueness of that phrase and are distrustful of its source. There is nothing to stop the university from succumbing to the extremely high market prices that developers would pay for its enticing location in the centre of St Petersburg.

The critical point

Money, wrote Karl Marx in his famous article on the subject, is “the confounding and confusing of all natural and human qualities”. As he saw it, money strips what it is used for of ties to human life and meaning. Monetizing Lomonosov’s lab makes us speak of it quantitatively in real-estate terms. In such language, the site is simply a flat, featureless half-acre plot of land that, one metre down, has pieces of stone that were once part of an earlier building.

It also encourages us to speak of the history of this plot as just a series of owners, buildings and events. Some might even say that we have already preserved the history of Lomonosov’s lab because much of its surviving contents are on display in a nearby museum called the Kunstkamera (or art chamber). What, therefore, could be the harm of selling the land?

The land is where Lomonosov, his spirited colleagues and students, shared experiences and techniques, made friendships and established networks

Turning the history of science into nothing more than a tale of instruments promotes the view that science is all about clever individuals who use tools to probe the world for knowledge. But the places where scientists work are integral to science too. The plot of land on the 2nd avenue of Vasilevsky Island is where Lomonosov, his spirited colleagues and students, shared experiences and techniques, made friendships and established networks.

It’s where humans, instruments, materials and funding came together in dynamic events that revealed new knowledge of how materials behave in different conditions. The lab is also historically important because it impressed academy and state authorities enough that they continued to support scientific research as essential to Russia’s future.

Sure, appreciating this dimension of science history requires more than restoring buildings. But preserving the places where science happens keeps alive important symbols of what makes science possible, then and now, in a world that needs more of it. Selling the site of Lomonosov’s lab for money amounts to repudiating the cultural value of science.


What happens when a warp drive collapses?

14 September 2024 at 15:02

Simulations of space–times that contain negative energies can help us to better understand wormholes or the interior of black holes. For now, however, the physicists who performed the new study, who admit to being big fans of Star Trek, have used their result to model the gravitational waves that would be emitted by a hypothetical failing warp drive.

Gravitational waves, which are ripples in the fabric of space–time, are emitted by cataclysmic events in the universe, like binary black hole and neutron star mergers. They might also be emitted by more exotic space–times such as wormholes or warp drives, which, unlike black hole and neutron star mergers, are still the stuff of science fiction.

First predicted by Albert Einstein in his general theory of relativity, gravitational waves were observed directly in 2015 by the Advanced LIGO detectors, which are laser interferometers comprising pairs of several-kilometre-long arms positioned at right angles to each other. As a gravitational wave passes through the detector, it slightly expands one arm while contracting the other. This creates a series of oscillations in the lengths of the arms that can be recorded as interference pattern variations.
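For a sense of the scales involved (the 4 km arm length and a typical strain of order 10⁻²¹ are standard Advanced LIGO figures rather than values quoted in the article), the dimensionless strain h is simply the fractional change in arm length:

\[
h = \frac{\Delta L}{L}, \qquad \Delta L \sim h\,L \approx 10^{-21} \times 4\times10^{3}\,\mathrm{m} = 4\times10^{-18}\,\mathrm{m},
\]

a displacement far smaller than the diameter of a proton.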

The first detection by LIGO arose from the collision and merging of two black holes. These observations heralded the start of the era of gravitational-wave astronomy and viewing extreme gravitational events across the entire visible universe. Since then, astrophysicists have been asking themselves if signals from other strongly distorted regions of space–time could be seen in the future, beyond the compact binary mergers already detected.

Warp drives or bubbles

A “warp drive” (or “warp bubble”) is a hypothetical device that could allow space travellers to traverse space at faster-than-light speeds – as measured by some distant observer. Such a bubble contracts spacetime in front of it and expands spacetime behind it. It can do this, in theory, because unlike objects within space–time, space–time itself can bend, expand or contract at any speed. A spacecraft contained in such a drive could therefore arrive at its destination faster than light would in normal space without breaking Einstein’s cosmic speed limit.

The idea of warp drives is not new. They were first proposed in 1994 by the Mexican physicist Miguel Alcubierre who named them after the mode of travel used in the sci-fi series Star Trek. We are not likely to see such drives anytime soon, however, since the only way to produce them is by generating vast amounts of negative energy – perhaps by using some sort of undiscovered exotic matter.
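For reference, the line element Alcubierre wrote down has the standard textbook form (in units where c = 1; the expression is not reproduced in the article):

\[
ds^{2} = -dt^{2} + \bigl[dx - v_{s}(t)\,f(r_{s})\,dt\bigr]^{2} + dy^{2} + dz^{2},
\]

where v_s(t) is the bubble’s velocity, r_s is the distance from the bubble centre and f(r_s) is a smooth function equal to 1 inside the bubble wall and 0 far outside, so that space–time is flat both well inside and well outside the bubble.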

A warp drive that is functioning normally, and travelling at a constant velocity, does not emit any gravitational waves. When it collapses, accelerates or decelerates, however, this should generate gravitational waves.

A team of physicists from Queen Mary University of London (QMUL), the University of Potsdam, the Max Planck Institute (MPI) for Gravitational Physics in Potsdam and Cardiff University decided to study the case of a collapsing warp drive. The warp drive is interesting, say the researchers, since it uses gravitational distortion of spacetime to propel a spaceship forward, rather than a usual kind of fuel/reaction system.

Decomposing spacetime

The team, led by Katy Clough of QMUL, Tim Dietrich from Potsdam and Sebastian Khan at Cardiff, began by describing the initial bubble using the original Alcubierre definition, giving it a fixed wall thickness. They then developed a formalism to describe the warp fluid and how it evolved. They varied its initial velocity at the point of collapse (which is related to the amplitude of the warp bubble). Finally, they analysed the resulting gravitational-wave signatures and quantified the radiation of energy from the space–time region.

While Einstein’s equations of general relativity treat space and time on an equal footing, we have to split the time and space dimensions to do a proper simulation of how the system evolves, explains Dietrich. This approach is normally referred to as the 3+1 decomposition of spacetime. “We followed this very common approach, which is routinely used to study binary black hole or binary neutron star mergers.”
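In the standard ADM (3+1) notation that this decomposition refers to (not spelled out in the article), the metric is split as

\[
ds^{2} = -\alpha^{2}\,dt^{2} + \gamma_{ij}\bigl(dx^{i} + \beta^{i}\,dt\bigr)\bigl(dx^{j} + \beta^{j}\,dt\bigr),
\]

where α is the lapse function, β^i the shift vector and γ_ij the metric on each spatial slice; Einstein’s equations then become evolution equations for γ_ij (plus constraints) that a numerical-relativity code can integrate step by step.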

It was not that simple, however: “given the particular spacetime that we were investigating, we also had to determine additional equations for the simulation of the material that is sustaining the warp bubble from collapse,” says Dietrich. “We also had to find a way to introduce the collapse that then triggers the emission of gravitational waves.”

Since they were solving Einstein’s field equation directly, the researchers say they could read off how spacetime evolves and the gravitational waves emitted from their simulation.

Very speculative work

Dietrich says that he and his colleagues are big Star Trek fans and that the idea for the project, which they detail in The Open Journal of Astrophysics, came to them a few years ago in Göttingen in Germany, where Clough was doing her postdoc. “Sebastian then had the idea of using the simulations that we normally use to help detect black holes to look for signatures of the Alcubierre warp drive metric,” recalls Dietrich. “We thought it would be a quick project, but it turned out to be much harder than we expected.”

The researchers found that, for warp ships around a kilometre in size, the gravitational waves emitted are of a high frequency and, therefore, not detectable with current gravitational-wave detectors. “While there are proposals for new gravitational-wave detectors at higher frequencies, our work is very speculative, and so it probably wouldn’t be sufficient to motivate anyone to build anything,” says Dietrich. “It does have a number of theoretical implications for our understanding of exotic spacetimes though,” he adds. “Since this is one of the few cases in which consistent simulations have been performed for spacetimes containing exotic forms of matter, namely negative energy, our work could be extended to also study wormholes, the inside of black holes, or the very early stages of the universe, where negative energy might prevent the formation of singularities.”

Even though they “had a lot of fun” during this proof-of-principle project, the researchers say that they will now probably go back to their “normal” work, namely the study of compact binary systems.


UK reveals next STEPs toward prototype fusion power plant

13 September 2024 at 10:30

“Fiendish”, “technically tough”, “difficult”, “complicated”. Those were just a few of the choice words used at an event last week in Oxfordshire, UK, to describe ambitious plans to build a prototype fusion power plant. Held at the UK Atomic Energy Authority (UKAEA) Culham campus, the half-day meeting on 5 September saw engineers and physicists discuss the challenges that lie ahead as well the opportunities that this fusion “moonshot” represents.

The prototype fusion plant in question is known as the Spherical Tokamak for Energy Production (STEP), which was first announced by the UK government in 2019 when it unveiled a £220m package of funding for the project. STEP will be based on “spherical” tokamak technology currently being pioneered at the UK’s Culham Centre for Fusion Energy (CCFE). In 2022 a site for STEP was chosen at the former coal-fired power station at West Burton in Nottinghamshire. Operations are expected to begin in the 2040s with STEP aiming to prove the commercial viability of fusion by demonstrating net energy, fuel self-sufficiency and a viable route to plant maintenance.

A spherical tokamak is more compact than a traditional tokamak, such as the ITER experimental fusion reactor currently being built in Cadarache, France, which has been hit with cost hikes and delays in recent years. The compact nature of the spherical tokamak, which was first pioneered in the UK in the 1980s, is expected to minimize costs, maximize energy output and possibly make it easier to maintain when scaled up to a fully-fledged fusion power plant.

The current leading spherical tokamaks worldwide are the Mega Amp Spherical Tokamak (MAST-U) at the CCFE and the National Spherical Torus Experiment at the Princeton Plasma Physics Laboratory (PPPL) in the US, which is nearing the completion of an upgrade. Despite much progress, however, those tokamaks are yet to demonstrate fusion conditions through the use of the hydrogen isotope tritium in the fuel, which is necessary to achieve a “burning” plasma. This goal has, though, already been achieved in traditional tokamaks such as the Joint European Torus, which was switched off in 2023.

“STEP is a big extrapolation from today’s machines,” admitted STEP chief engineer Chris Waldon at the event. “It is complex and complicated but we are now beginning to converge on a single design [for STEP]”.

A fusion ‘moonshot’

The meeting at Culham was held to mark the publication of 15 papers on the technical progress made on STEP over the past four years. They cover STEP’s plasma, its maintenance, magnets, tritium-breeding programme as well as pathways for fuel self-sufficiency (Philosophical Transactions A 382 20230416). Officials were keen to stress, however, that the papers were a snapshot of progress to date and that since then some aspects of the design have progressed.

One issue that cropped up during the talks was the challenge of extrapolating every element of tokamak technology to STEP – a feat described by one panellist as being “so far off our graphs”. While theory and modelling have come a long way in the last decade, even the best models will not be a substitute for the real thing. “Until we do STEP we won’t know everything,” says physicist Steve Cowley, director of the PPPL. Those challenges involve managing potential instabilities and disruptions in the plasma – which at worst could obliterate the wall of a reactor – as well as operating high-temperature superconducting magnets to confine the plasma that have yet to be tested under the intensity of fusion conditions.

We need to produce a project that will deliver energy someone will buy

Ian Chapman

Another significant challenge is self-breeding tritium via neutron capture in lithium, which would be done in a roughly one-metre thick “blanket” surrounding the reactor. This is far from straightforward and the STEP team are still researching what technology might prevail – whether to use a solid pebble-bed or liquid lithium. While liquid lithium is good at producing tritium, for example, extracting the isotope to put back into the reactor is complex.

Howard Wilson, fusion pilot plant R&D lead at the Oak Ridge National Laboratory in the US, was keen to stress that STEP will not be a commercial power plant. Instead, its job is to demonstrate “a pathway towards commercialisation”. That is likely to come in several stages, the first being to generate 1 GW of power, of which 100 MW would go to the “grid” (the other 900 MW being needed to power the plant’s own systems). The second stage will be to test whether that power production is sustainable via the self-breeding of tritium back into the reactor, in what is known as a “closed fuel cycle”.

Ian Chapman, chief executive of the UKAEA, outlined what he called the “fiendish” challenges that lie ahead for fusion, even if STEP demonstrates that it is possible to deliver energy to the grid in a sustainable way. “We need to produce a project that will deliver energy someone will buy,” he said. That will be achieved in part via STEP’s third objective, which is to get a better understanding of the maintenance requirements of a fusion power plant and the impact that would have on reactor downtime. “We fail if there is not a cost-effective solution,” added STEP engineering director Debbie Kempton.

STEP officials are now selecting industry partners in engineering and construction to work alongside the UKAEA on the design. Indeed, STEP is as much about physically building a plant as it is about creating a whole fusion industry. A breathless two-minute pre-event promotional film, which loftily compared the development of fusion to the advent of the steam train and vaccines, was certainly given a much-needed reality check.


Annular eclipse photograph bags Royal Observatory Greenwich prize

12 September 2024 at 20:30

US photographer Ryan Imperio has beaten thousands of amateur and professional photographers from around the world to win the 2024 Astronomy Photographer of the Year.

The image – Distorted Shadows of the Moon’s Surface Created by an Annular Eclipse – was taken during the 2023 annular eclipse.

It captures the progression of Baily’s beads, which are only visible when the Moon either enters or exits an eclipse. They are formed when sunlight shines through the valleys and craters of the Moon’s surface, breaking the eclipse’s well-known ring pattern.

“This is an impressive dissection of the fleeting few seconds during the visibility of the Baily’s beads,” noted meteorologist and competition judge Kerry-Ann Lecky Hepburn. “This image left me captivated and amazed. It’s exceptional work deserving of high recognition.”

As well as earning Imperio the £10,000 top prize, the image will go on display along with other selected pictures from the competition at an exhibition at the National Maritime Museum that opens on 13 September.

The award – now in its 16th year – is run by the Royal Observatory Greenwich in association with insurer Liberty Specialty Markets and BBC Sky at Night Magazine.

The competition received over 3500 entries from 58 countries.


Looking to the future of statistical physics, how intense storms can affect your cup of tea

12 September 2024 at 16:47

In this episode of the Physics World Weekly podcast we explore two related areas of physics, statistical physics and thermodynamics.

First up we have two leading lights in statistical physics who explain how researchers in the field are studying phenomena as diverse as active matter and artificial intelligence.

They are Leticia Cugliandolo, who is at Sorbonne University in Paris, and Marc Mézard of Bocconi University in Italy.

Cugliandolo is also chief scientific director of the Journal of Statistical Mechanics: Theory and Experiment (JSTAT) and Mézard has just stepped down from that role. They both talk about how the journal and statistical physics have evolved over the past two decades and what the future could bring.

The second segment of this episode explores how intense storms can affect your cup of tea. Our guests are the meteorologists Caleb Miller and Giles Harrison, who measured the boiling point of water as storm Ciarán passed through the University of Reading in 2023. They explain the thermodynamics of what they found, and how the storm could have affected the quality of the millions of cups of tea brewed that day.


Carbon defect in boron nitride creates first omnidirectional magnetometer

12 September 2024 at 11:43

A newly discovered carbon-based defect in the two-dimensional material hexagonal boron nitride (hBN) could be used as a quantum sensor to detect magnetic fields in any direction – a feat that is not possible with existing quantum sensing devices. Developed by a research team in Australia, the sensor can also detect temperature changes in a sample using the boron vacancy defect present in hBN. And thanks to its atomically thin structure, the sensor can conform to the shape of a sample, making it useful for probing structures that aren’t perfectly smooth.

The most sensitive magnetic field detectors available today exploit quantum effects to map the presence of extremely weak fields. To date, most of these have been made out of diamond and rely on the nitrogen vacancy (NV) centres contained within. NV centres are naturally occurring defects in the diamond lattice in which two carbon atoms are replaced with a single nitrogen atom, leaving one lattice site vacant. Together, the nitrogen atom and the vacancy can behave as a negatively charged entity with an intrinsic spin. NV centres are isolated from their surroundings, which means that their quantum behaviour is robust and stable.

When a photon hits an NV– centre, it can excite an electron to a higher-energy state. As it then decays back to the ground state, it may emit a photon of a different wavelength. The NV– centre has three spin sublevels, and the excited state of each sublevel has a different probability of emitting a photon when it decays.

By exciting an individual NV– centre repeatedly and collecting the emitted photons, researchers can detect its spin state. And since the spin state can be influenced by external variables such as magnetic field, electric field, temperature, force and pressure, NV– centres can therefore be used as atomic-scale sensors. Indeed, they are routinely employed today to study a wide variety of biological and physical systems.
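As a rough illustration of this repeat-and-count readout, the sketch below models the photon statistics of a spin-dependent fluorescence measurement. The collection rates and pulse number are hypothetical values chosen only to show the principle; they are not taken from any specific experiment.

```python
# Illustrative sketch of spin-dependent fluorescence readout (numbers are
# hypothetical, chosen only to demonstrate the principle described above).
import numpy as np

rng = np.random.default_rng(0)

BRIGHT_RATE = 0.03   # mean photons per readout pulse if the spin is in m_s = 0
DIM_RATE    = 0.02   # mean photons per readout pulse if the spin is in m_s = +/-1
N_PULSES    = 10_000 # repeated excitation/collection cycles


def readout(spin_is_zero: bool) -> int:
    """Total photons collected over N_PULSES excitation cycles (Poisson model)."""
    rate = BRIGHT_RATE if spin_is_zero else DIM_RATE
    return rng.poisson(rate * N_PULSES)


# Distinguish the spin state by comparing the photon count with a threshold
threshold = 0.5 * (BRIGHT_RATE + DIM_RATE) * N_PULSES
counts = readout(spin_is_zero=True)
print("inferred spin:", "m_s = 0" if counts > threshold else "m_s = +/-1")
```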

There is a problem though – NV centres can only detect magnetic fields that are aligned in the same direction as the sensors. Devices must therefore contain many sensors placed at different alignment angles, which makes them difficult to use and limited to specific applications. What’s more, the fact that they are rigid (diamond being the hardest material known) means they cannot conform to the sample being studied.

A new carbon-based defect

Researchers recently discovered a new carbon-based defect in hBN, in addition to the boron vacancy that it is already known to contain. In this latest work, and thanks to a carefully calibrated Rabi experiment (a measurement of coherent spin oscillations), a team led by Jean-Philippe Tetienne of RMIT University and Igor Aharonovich of the University of Technology Sydney found that the carbon-based defect behaves as a spin-half system (S=1/2). In comparison, the spin in the boron defect is equal to one. And it’s this spin-half nature of the former that enables it to detect magnetic fields in any direction, say the researchers.

Research team: Sam Scholten and Priya Singh working on their hBN quantum sensing system. (Courtesy: RMIT University)

“Having two different independently addressable spin species within the same material at room temperature is unique, not even diamond has this capability,” explains Priya Singh from RMIT University, one of the lead authors of this study. “This is exciting because each spin species has its advantages and limitations, and so with hBN we can combine the best of both worlds. This is important especially for quantum sensing, where the spin half enables omnidirectional magnetometry, with no blind spot, while the spin one provides directional information when needed and is also a good temperature sensor.”

Until now, the spin multiplicity of the carbon defect was under debate in the hBN community, adds co-first author Sam Scholten from the University of Melbourne. “We have been able to unambiguously prove its spin-half nature, or more likely a pair of weakly coupled spin-half electrons.”

The new S=1/2 sensor can be controlled using light in the same way as the boron vacancy-based sensor. What’s more, the two defects can be tuned to interact with each other and thus used together to detect both magnetic fields and temperature at the same time. Singh points out that the carbon-based defects were also naturally present in pretty much every hBN sample the team studied, from commercially sourced bulk crystals and powders to lab-made epitaxial films. “To create the boron vacancy defects in the same sample, we had to perform just one extra step, namely irradiating the samples with high-energy electrons, and that’s it,” she explains.

To create the hBN sensor, the researchers simply drop-cast an hBN powder suspension onto the target object or transferred an epitaxial film or an exfoliated flake. “hBN is very versatile and easy to work with,” says Singh. “It is also low cost and easy to integrate with various other materials so we expect lots of applications will emerge in nanoscale sensing – especially thanks to the omnidirectional magnetometry capability, unique for solid-state quantum sensors.”

The researchers are now trying to determine the exact crystallographic structure of the S=1/2 carbon defects and how they can engineer them on-demand in a few layers of hBN. “We are also planning sensing experiments that leverage the omnidirectional magnetometry capability,” says Scholten. “For instance, we can now image the stray field from a van der Waals ferromagnet as a function of the azimuthal angle of the applied field. In this way, we can precisely determine the magnetic anisotropy, something that has been a challenge with other methods in the case of ultrathin materials.”

The study is detailed in Nature Communications.


Dancing humans embody topological properties

By Anna Demming
11 September 2024 at 18:08

High school students and scientists in the US have used dance to illustrate the physics of topological insulators. The students followed carefully choreographed instructions developed by scientists in what was a fun outreach activity that explained topological phenomena. The exercise demonstrates an alternative analogue for topologically nontrivial systems, which could be potentially useful for research.

“We thought that the way all of these phenomena are explained is rather contrived, and we wanted to, in some sense, democratize the notions of topological phases of matter to a broader audience,” says Joel Yuen-Zhou, a theoretical chemist at the University of California, San Diego (UCSD). Yuen-Zhou led the research, which was done in collaboration with students and staff at Orange Glen High School near San Diego.

Topological insulators are a type of topological material where the bulk is an electrical insulator but the surface or edges (depending on whether the system is 3D or 2D) conduct electricity. The conducting states arise due to a characteristic of the electronic band structure associated with the system as a whole, which means they persist despite defects or distortions in the system so long as the fundamental topology of the system is undisturbed. Topology can be understood in terms of a coffee mug being topologically equivalent to a ring doughnut, because they both have a hole all the way through. This is unlike a jam doughnut, which does not have a hole and is therefore not topologically equivalent to a coffee mug.

Insulators without the conducting edge or surface states are “topologically trivial” and have insulating properties throughout. Yuen-Zhou explains that for topologically nontrivial properties to emerge, the system must be able to support wave phenomena and have something that fulfils the role of a magnetic field in condensed matter topological insulators. As such, analogues of topological insulators have been reported in systems ranging from oceanic and atmospheric fluids to enantiomeric molecules and active matter. Nonetheless, and despite the interest in topological properties for potential applications, they can still seem abstract and arcane.

Human analogue

Yuen-Zhou set about devising a human analogue of a topological insulator with then PhD student Matthew Du, who is now at the University of Chicago. The first step was to establish a Hamiltonian that defines how each site in a 2D lattice interacts with its neighbours and a magnetic field. They then formulated the Schrödinger equation of the system as an algorithm that updates after discrete steps in time and reproduces essential features of topological insulator behaviour. These are chiral propagation around the edges when initially excited at an edge; robustness to defects; propagation around the inside edge when the lattice has a hole in it; and an insulating bulk.

The UCSD researchers then explored how this quantum behaviour could be translated into human behaviour. This was a challenge because quantum mechanics operates in the realm of complex numbers, which have real and imaginary components. Fortunately, they were able to identify initial conditions that lead to only real number values for the interactions at each time step of the algorithm. That way the humans, for whom imaginary interactions might be hard to simulate, could legitimately manifest only real numbers as they step through the algorithm. These real values were either one (choreographed as waving flags up), minus one (waving flags down) or zero (standing still).
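The specific lattice Hamiltonian used in the study is not reproduced here, but the structure of such a scheme (a 2D lattice with an effective magnetic field, evolved in discrete time steps from an excitation on the edge) can be sketched with a generic Hofstadter-type model. All parameters below are illustrative, and the model is a stand-in rather than the one the students danced.

```python
# Minimal sketch of discrete-time evolution on a 2D lattice with an effective
# magnetic field (a generic Hofstadter-type model, not the study's own Hamiltonian).
import numpy as np
from scipy.linalg import expm

LX, LY = 8, 8
ALPHA = 0.25                        # flux per plaquette in units of the flux quantum
N = LX * LY
idx = lambda x, y: x * LY + y       # map lattice coordinates to a site index

# Tight-binding Hamiltonian with Peierls phases on the y-bonds (open boundaries)
H = np.zeros((N, N), dtype=complex)
for x in range(LX):
    for y in range(LY):
        if x + 1 < LX:                       # hop in x
            H[idx(x, y), idx(x + 1, y)] = -1.0
        if y + 1 < LY:                       # hop in y, with field-dependent phase
            H[idx(x, y), idx(x, y + 1)] = -np.exp(2j * np.pi * ALPHA * x)
H = H + H.conj().T                           # make the Hamiltonian Hermitian

# Start from an excitation on an edge site and evolve in discrete time steps
psi = np.zeros(N, dtype=complex)
psi[idx(0, 0)] = 1.0
dt, steps = 0.5, 40
U = expm(-1j * H * dt)                       # one-step propagator
for _ in range(steps):
    psi = U @ psi

prob = np.abs(psi) ** 2
edge = [idx(x, y) for x in range(LX) for y in range(LY)
        if x in (0, LX - 1) or y in (0, LY - 1)]
print(f"probability on edge sites after {steps} steps: {prob[edge].sum():.2f}")
```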

“The structure isn’t actually specific just to the model that we focus on,” explains Du. “There’s actually a whole class of these kinds of models, and we demonstrate this for another example – actually a more famous model – the Haldane model, which has a honeycomb lattice.”

The researchers then created a grid on a floor at Orange Glen High School, with lines in blue or red joining neighbouring squares. The colour of each line defined whether the interaction between those sites was parallel or antiparallel (that is, whether the occupants of the squares should wave the flags in the same or opposite direction to each other when prompted).

Commander and caller

A “commander” acts as the initial excitation that starts things off. This is prompted by someone who is not part of the 2D lattice, whom the researchers liken to a caller in line, square or contra dancing. The caller then prompts the commander to come to a standstill, at which point all those who have their flags waving determine if they have a “match”, that is, if they are dancing in kind or opposite to their neighbours as designated by the blue and red lines. Those with a match then stop moving, after which the “commander” or excitation moves to the one site where there is no such match.

Yuen-Zhou and Du taught the choreography to second and third year high school students. The result was that excitations propagated around the edge of the lattice, but bulk excitations fizzled out. There was also a resistance to “defects”.

“The main point about topological properties is that they are characterized by mathematics that are insensitive to many details,” says Yuen-Zhou. “While we choreograph the dance, even if there are imperfections and the students mess up, the dance remains and there is the flow of the dance along the edges of the group of people.”

The researchers were excited about showing that even a system as familiar as a group of people could provide an analogue of a topological material, since so far these properties have been “restricted to very highly engineered systems or very exotic materials,” as Yuen-Zhou points out.

“The mapping of a wave function to real numbers then to human movements clearly indicates the thought process of the researchers to make it more meaningful to students as an outreach activity,” says Shanti Pise, a principal technical officer at the Indian Institute of Science Education and Research in Pune. She was not involved in this research project but specializes in using dance to teach mathematical ideas. “I think this unique integration of wave physics and dance would also give a direction to many researchers, teachers and the general audience to think, experiment and share their ideas!”

The research is described in Science Advances.


Almost 70% of US students with an interest in physics leave the subject, finds survey

11 September 2024 at 13:30

More than two-thirds of college students in the US who initially express an interest in studying physics drop out to pursue another degree. That is according to a five-year-long survey by the American Institute of Physics, which found that students often quit due to a lack of confidence in mathematics or poor experiences with physics departments and instructors. Most students, however, ended up in another science, technology, engineering and mathematics (STEM) field.

Carried out by AIP Statistical Research, the survey initially followed almost 4000 students in their first year of high school or college who were doing an introductory physics course at four large, predominantly white universities.

Students highlighted “learning about the universe”, “applying their problem-solving and maths skills”, “succeeding in a challenging subject” and “pursuing a satisfying career” as reasons why they choose to study physics.

Anne Marie Porter and her colleagues Raymond Chu and Rachel Ivie concentrated on the 745 students who had expressed interest in pursuing physics, following them for five academic years.

Over that period, only 31% graduated with a physics degree, with most of those switching to another degree during their first or second year. Under-represented groups, including women, African-American and Hispanic students, were the most likely to avoid physics degree courses.

Pull and push

While many who quit physics enjoyed their experience, they left due to “issues with poor teaching quality and large class sizes” as well as “negative perceptions that physics employment consists only of academic positions and desk jobs”. Self-appraisal played a role in the decision to leave too. “They may feel unable to succeed because they lack the necessary skills in physics,” Porter says. “That’s a reason for concern.”

Porter adds that intervention early in college is essential to retain physicists with introductory physics courses being “incredibly important”. Indeed, the survey comes at a time when the number of bachelor’s degrees in physics offered by US universities is growing more slowly than in other STEM fields.

Meanwhile, a separate report published by the National Academies of Science, Engineering, and Medicine has called on the US government to adopt a new strategy to recruit and retain talent in STEM subjects. In particular, the report urges Congress to smooth the path to permanent residency and US citizenship for foreign-born individuals working in STEM fields.


Improved antiproton trap could shed more light on antimatter-matter asymmetry

11 September 2024 at 10:30
The "Maxwell’s demon cooling double trap" developed by the BASE collaboration can cool antiprotons very quickly
Fast chill The “Maxwell’s demon cooling double trap” developed by the BASE collaboration can cool antiprotons very quickly to the extremely cold temperatures necessary for high-precision measurements. (Courtesy: BASE-Collaboration/Stefan Ulmer)

A novel particle trap invented at CERN could allow physicists to measure the magnetic moments of antiprotons with higher precision than ever before. The experiment, carried out by the international BASE collaboration, revealed that the magnetic moments of the antiparticles differ by no more than one part in 10⁹ from those of their matter counterparts.

One of the biggest mysteries in physics today is why the universe appears to be made up almost entirely of matter and contains only tiny amounts of antimatter. According to the Standard Model, our universe should be largely matter-less. This is because when the universe formed nearly 14 billion years ago, equal amounts of antimatter and matter were generated. When pairs of these antimatter and matter particles collided, they annihilated and produced a burst of energy. This energy created new antimatter and matter particles, which annihilated each other again, and so on.

Physicists have been trying to solve this enigma by looking for tiny differences between a particle (such as a proton) and its antiparticle. If successful, such differences (even if extremely small) would shed more light on antimatter–matter asymmetry and perhaps even reveal physics beyond the Standard Model.

The aim of the BASE (Baryon Antibaryon Symmetry Experiment) collaboration is to measure the magnetic moment of the antiproton to extremely high precision and compare it with the magnetic moment of the proton. To do this, the researchers are using Penning traps, which employ magnetic and electric fields to hold a negatively charged antiproton, and can store antiprotons for years.

Quicker cooling

Preparing individual antiprotons so that their spin quantum states can be measured, however, involves cooling them down to extremely cold temperatures of 200 mK. Previous techniques took 15 h to achieve this, but BASE has now shortened this cooling time to just eight minutes.

The BASE team achieved this feat by joining two Penning traps to make a so-called “Maxwell’s demon cooling double trap”. The first trap cools the antiprotons. The second (referred to as the analysis trap in this study) has the highest magnetic field gradient for a device of its kind, as well as improved noise-protection electronics, a cryogenic cyclotron motion detector and ultrafast transport between the two traps.

The new instrument allowed the researchers to prepare only the coldest antiprotons for measurement, while at the same time rejecting any that had a higher temperature. This means that they did not have to waste time cooling down these warmer particles.

“With our new trap we need a measurement time of around one month, compared with almost 10 years using the old technique, which would be impossible to realize experimentally,” explains BASE spokesperson Stefan Ulmer, an experimental physicist at Heinrich Heine University Düsseldorf and a researcher at CERN and RIKEN.

Ulmer says that he and his colleagues have already been able to measure that the magnetic moments of protons and antiprotons differ by a maximum of one billionth (10⁻⁹). They have also improved the error rate in identifying the antiproton’s spin by more than a factor of 1000. Reducing this error rate was one of the team’s main motivations for this project.

The new cooling device could be of benefit to the Penning trap community at large, since colder particles generally result in more precise measurements. For example, it could be used for phase sensitive detection methods or spin state analysis, says Barbara Maria Latacz, CERN team member and lead author of this study. “Our trap is particularly interesting because it is relatively simple and robust compared to laser cooling systems,” she tells Physics World. “Specifically, it allows us to cool a single proton or antiproton to temperatures below 200 mK in less than eight minutes, which is not achievable with other cooling methods.”

The new device will now be a key element of the BASE experimental set-up, she says.

Looking forward, the researchers hope to improve the detection accuracy of the antiproton magnetic moment to 10⁻¹⁰ in their next measurement campaign. They report their current work in Physical Review Letters.


Vacuum for physics research

11 September 2024 at 09:28

Your research can’t happen without vacuum! If you’re pushing the boundaries of science or technology, you know that creating a near-perfect empty space is crucial. Whether you’re exploring the mysteries of subatomic particles, simulating the harsh conditions of outer space, or developing advanced materials, mastering ultra-high vacuum (UHV) and extreme-high vacuum (XHV) is necessary.

In this live webinar:

  • You will learn how vacuum enables physics research, from quantum computing, to fusion, to the fundamental nature of the universe.
  • You will discover why ultra-low-pressure environments directly impact the success of your experiments.
  • We will dive into the latest techniques and technologies for creating and maintaining UHV and XHV.

Join us to gain practical insights and stay ahead in your field – because in your research, vacuum isn’t just important; it’s critical.

John Screech

John Screech graduated in 1986 with a BA in physics and has worked in analytical instrumentation ever since. His career has spanned general mass spectrometry, vacuum system development, and contraband detection. John joined Agilent in 2011 and currently leads training and education programmes for the Vacuum Products division. He also assists Agilent’s sales force and end-users with pre- and post-sales applications support. He is based near Toronto, Canada.


Flagship journal Reports on Progress in Physics marks 90th anniversary with two-day celebration

10 September 2024 at 18:05

When the British physicist Edward Andrade wrote a review paper on the structure of the atom in the first volume of the journal Reports on Progress in Physics (ROPP) in 1934, he faced a problem familiar to anyone seeking to summarize the latest developments in a field. So much exciting research had happened in atomic physics that Andrade was finding it hard to cram everything in. “It is obvious, in view of the appalling number of papers that have appeared,” he wrote, “that only a small fraction can receive reference.”

Review articles are the ideal way to get up to speed with developments and offer a gateway into the scientific literature

Apologizing that “many elegant pieces of work have been deliberately omitted” due to a lack of space, Andrade pleaded that he had “honestly tried to maintain a just balance between the different schools [of thought]”. Nine decades on, Andrade’s struggles will be familiar to anyone who has ever tried to write a review paper, especially of a fast-moving area of physics. Readers, however, appreciate the efforts authors put in because review articles are the ideal way to get up to speed with developments and offer a gateway into the scientific literature.

Writing review papers also benefits authors because such articles are usually widely read and cited by other scientists – much more, in fact, than a typical paper containing new research findings. As a result, most review journals have an extraordinarily high “impact factor” – the mean number of citations received in a given year by articles the journal published in the previous two years. ROPP, for example, has an impact factor of 19.0. While there are flaws with using impact factor to judge the quality of a journal, it’s still a well-respected metric in many parts of the world. And who wouldn’t want to appear in a journal with that much influence?
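In symbols, and taking 2024 as an illustrative year, the calculation behind that figure is simply

\mathrm{IF}_{2024} = \frac{\text{citations received in 2024 by articles published in 2022 and 2023}}{\text{number of articles published in 2022 and 2023}},

so a value of 19.0 means that the average recent ROPP article picked up about 19 citations in a single year.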

New dawn for ROPP

Celebrating its 90th anniversary this year, ROPP is the flagship journal of IOP Publishing, which also publishes Physics World. As a learned-society publisher, IOP Publishing does not have shareholders, with any financial surplus ploughed back into the Institute of Physics (IOP) to support everyone from physics students to physics teachers. In contrast to journals owned by commercial publishers, therefore, ROPP has the international physics community at its heart.

Over the last nine decades, ROPP has published over 2500 review papers. There have been more than 20 articles by Nobel-prize-winning physicists, including famous figures from the past such as Hans Bethe (stellar evolution), Lawrence Bragg (protein crystallography) and Abdus Salam (field theory). More recently, ROPP has published papers by still-active Nobel laureates including Konstantin Novoselov (2D materials), Ferenc Krausz (attosecond physics) and Isamu Akasaki (blue LEDs) – see the box below for a full list.

Subir Sachdev
New directions Subir Sachdev from Harvard University in the US is the current editor-in-chief of Reports on Progress in Physics. (Courtesy: Subir Sachdev)

But the journal isn’t resting on its laurels. ROPP has recently started accepting articles containing new scientific findings for the first time, with the plan being to publish 150–200 very-high-quality primary-research papers each year. They will be in addition to the usual output of 50 or so review papers, most of which will still be commissioned by ROPP’s active editorial board. IOP Publishing hopes the move will cement the journal’s place at the pinnacle of its publishing portfolio.

“ROPP will continue as before,” says Subir Sachdev, a condensed-matter physicist from Harvard University, who has been editor-in-chief of the journal since 2022. “There’s no change to the review format, but what we’re doing is really more of an expansion. We’re adding a new section containing original research articles.” The journal is also offering an open-access option for the first time, thereby increasing the impact of the work. In addition, authors have the option to submit their papers for “double anonymous” and transparent peer review.

Maintaining high standards

Those two new initiatives – publishing primary research and offering an open-access option – are probably the biggest changes in the journal’s 90-year history. But Sachdev is confident the journal can cope. “Of course, we want to maintain our high standards,” he says. “ROPP has over the years acquired a strong reputation for very-high-quality articles. With the strong editorial board and the support we have from referees, we hope we will be able to maintain that.”

Early signs are promising. Among the first primary-research papers in ROPP are CERN’s measurement of the speed of sound in a quark–gluon plasma (87 077801), a study into flaws in the Earth’s gravitational field (87 078301), and an investigation into whether supersymmetry could be seen in 2D materials (10.1088/1361-6633/ad77f0). A further paper looks into creating an overarching equation of state for liquids based on phonon theory (87 098001).

The idea is to publish a relatively small number of papers but ensure they’re the best of what’s going on in physics and provide a really good cross section of what the physics community is doing

David Gevaux

David Gevaux, ROPP’s chief editor, who is in charge of the day-to-day running of the journal, is pleased with the quality and variety of primary research published so far. “The idea is to publish a relatively small number of papers – no more than 200 max – but ensure they’re the best of what’s going on in physics and provide a really good cross section of what the physics community is doing,” he says. “Our first papers have covered a broad range of physics, from condensed matter to astronomy.”

Another benefit of ROPP only publishing a select number of papers is that each article can have, as Gevaux explains, “a little bit more love” put into it. “Traditionally, publishers were all about printing journals and sending them around the world – it was all about distribution,” he says. “But with the Internet, everything’s immediately available and researchers almost have too many papers to trawl through. As a flagship journal, ROPP gives its published authors extra visibility, potentially through a press release or coverage in Physics World.”

Nobel laureates who have published in ROPP

Change of focus Home for the last 90 years to top-quality review articles, Reports on Progress in Physics now also accepts primary research papers for the first time. (Courtesy: IOP Publishing)

Since its launch in 1934, Reports on Progress in Physics has published papers by numerous top scientists, including more than 20 current or future Nobel-prize-winning physicists. A selection of those papers written or co-authored by Nobel laureates over the journal’s first 90 years is given chronologically below. For brevity, papers by multiple authors list only the contributing Nobel winner.

Nevill Mott 1938 “Recent theories of the liquid state” (5 46) and 1939 “Reactions in solids” (6 186)
Hans Bethe 1939 “The physics of stellar interiors and stellar evolution” (6 1)
Max Born 1942 “Theoretical investigations on the relation between crystal dynamics and x-ray scattering” (9 294)
Martin Ryle 1950 “Radio astronomy” (13 184)
Willis Lamb 1951 “Anomalous fine structure of hydrogen and singly ionized helium” (14 19)
Abdus Salam 1955 “A survey of field theory” (18 423)
Alexei Abrikosov 1959 “The theory of a fermi liquid” (22 329)
David Thouless 1964 “Green functions in low-energy nuclear physics” (27 53)
Lawrence Bragg 1965 “First stages in the x-ray analysis of proteins” (28 1)
Melvin Schwartz 1965 “Neutrino physics” (28 61)
Pierre-Gilles de Gennes 1969 “Some conformation problems for long macromolecules” (32 187)
Dennis Gabor 1969 “Progress in holography” (32 395)
John Clauser 1978 “Bell’s theorem. Experimental tests and implications” (41 1881)
Norman Ramsey 1982 “Electric-dipole moments of elementary particles” (45 95)
Martin Perl 1992 “The tau lepton” (55 653)
Charles Townes 1994 “The nucleus of our galaxy” (57 417)
Pierre Agostini 2004 “The physics of attosecond light pulses” (67 813)
Takaaki Kajita 2006 “Discovery of neutrino oscillations” (69 1607)
Konstantin Novoselov 2011 “New directions in science and technology: two-dimensional crystals” (74 082501)
John Michael Kosterlitz 2016 “Kosterlitz–Thouless physics: a review of key issues” (79 026001)
Anthony Leggett 2016 “Liquid helium-3: a strongly correlated but well understood Fermi liquid” (79 054501)
Ferenc Krausz 2017 “Attosecond physics at the nanoscale” (80 054401)
Isamu Akasaki 2018 “GaN-based vertical-cavity surface-emitting lasers with AlInN/GaN distributed Bragg reflectors” (82 012502)

An event for the community

As another reminder of its place in the physics community, ROPP is hosting a two-day event at the IOP’s headquarters in London and online. Taking place on 9–10 October 2024, the hybrid event will present the latest cutting-edge condensed-matter research, from fundamental work to applications in superconductivity, topological insulators, superfluids, spintronics and beyond. Confirmed speakers at Progress in Physics 2024 include Piers Coleman (Rutgers University), Susannah Speller (University of Oxford), Nandini Trivedi (Ohio State University) and many more.

artist's impression of a superconducting cube levitating
Keep up with the action The latest advances in superconductivity are among the hot topics to be discussed at a hybrid meeting on 9–10 October 2024 online and in London to mark the 90th anniversary of IOP Publishing’s flagship journal Reports on Progress in Physics. (Courtesy: iStock/koto_feja)

“We’re taking the journal out into the community,” says Gevaux. “IOP Publishing is very heavily associated with the IOP and of course the IOP has a large membership of physicists in the UK, Ireland and beyond. With the meeting, the idea is to bring that community and the journal together. This first meeting will focus on condensed-matter physics, with some of the ROPP board members giving plenary talks along with lectures from invited, external scientists and a poster session too.”

Longer-term, IOP Publishing plans to put ROPP at the top of a wider series of journals under the “Progress in” brand. The first of those journals is Progress in Energy, which was launched in 2019 and – like ROPP – has now also expanded its remit to include primary-research papers. Other, similar spin-off journals in different topic areas will be launched over the next few years, giving IOP Publishing what it hopes is a series of journals to match the best in the world.

For Sachdev, publishing with ROPP is all about having “the stamp of approval” from the academic community. “So if you think your field has now reached a point where a scholarly assessment of recent advances is called for, then please consider ROPP,” he says. “We have a very strong editorial board to help you produce a high-quality, impactful article, now with the option of open access and publishing really high-quality primary research papers too.”

The post Flagship journal Reports on Progress in Physics marks 90th anniversary with two-day celebration appeared first on Physics World.


Quantum growth drives investment in diverse skillsets

By: No Author
10 September 2024 at 15:20

The meteoric rise of quantum technologies from research curiosity to commercial reality is creating all the right conditions for a future skills shortage, while the ongoing pursuit of novel materials continues to drive demand for specialist scientists and engineers. Within the quantum sector alone, headline figures from McKinsey & Company suggest that less than half of available quantum jobs will be filled by 2025, with global demand being driven by the burgeoning start-up sector as well as enterprise firms that are assembling their own teams to explore the potential of quantum technologies for transforming their businesses.

While such topline numbers focus on the expertise that will be needed to design, build and operate quantum systems, a myriad of other skilled professionals will be needed to enable the quantum sector to grow and thrive. One case in point is the diverse workforce of systems engineers, measurement scientists, service engineers and maintenance technicians who will be tasked with building and installing the highly specialized equipment and instrumentation that is needed to operate and monitor quantum systems.

“Quantum is an incredibly exciting space right now, and we need to prepare for the time when it really takes off and explodes,” says Matt Martin, Managing Director of Oxford Instruments NanoScience, a UK-based company that manufactures high-performance cryogenics systems and superconducting magnets. “But for equipment makers like us the challenge is not just about quantum, since we are also seeing increased demand from both academia and industry for emerging applications in scientific measurement and condensed-matter physics.”

Martin points out that Oxford Instruments already works hard to identify and nurture new talent. Within the UK the company has for many years sponsored doctoral students to foster a deeper understanding of physics in the ultracold regime, and it also offers placements to undergraduates to spark an early interest in the technology space. The firm is also dialled into the country’s apprenticeship scheme, which offers an effective way to train young people in the engineering skills needed to manufacture and maintain complex scientific instruments.

Despite these initiatives, Martin acknowledges that NanoScience faces the same challenges as other organizations when it comes to recruiting high-calibre technical talent. In the past, he says, a skilled scientist would have been involved in all stages of the development process, but now the complexity of the systems and depth of focus required to drive innovation across multiple areas of science and engineering has led to the need for greater specialization. While collaboration with partners and sister companies can help, the onus remains on each business to develop a core multidisciplinary team.

Building ultracold and measurement expertise

The key challenge for companies like Oxford Instruments NanoScience is finding physicists and engineers who can create the ultracold environments that are needed to study both quantum behaviour and the properties of novel materials. Compounding that issue is the growing trend towards providing the scientific community with more automated solutions, which has made it much easier for researchers to configure and conduct experiments at ultralow temperatures.

Harriet van der Vliet
Quantum focus Harriet van der Vliet, the product manager for quantum technologies at Oxford Instruments NanoScience, with one of the company’s dilution refrigerators. (Courtesy: Oxford Instruments NanoScience)

“In the past PhD students might have spent a significant amount of time building their experiments and the hardware needed for their measurements,” explains Martin. “With today’s push-button solutions they can focus more on the science, but that changes their knowledge because there’s no need for them to understand what’s inside the box. Today’s measurement scientists are increasingly skilled in Python and integration, but perhaps less so in hardware.”

Developing such comprehensive solutions demands a broader range of technical specialists, such as software programmers and systems engineers, that are in short supply across all technology-focused industries. With many other enticing sectors vying for their attention, such as the green economy, energy and life sciences, and the rise of AI-enabled robotics, Martin understands the importance of inspiring young people to devote their energies to the technologies that underpin the quantum ecosystem. “We’ve got to be able to tell our story, to show why this new and emerging market is so exciting,” he says. “We want them to know that they could be part of something that will transform the future.”

To raise that awareness Oxford Instruments has been working to establish a series of applications centres in Japan, the US and the UK. One focus for the centres will be to provide training that helps users to get the most out of the company’s instruments, particularly for those without direct experience of building and configuring an ultracold system. But another key objective is to expose university-level students to research-grade technology, which in turn should help to highlight future career options within the instrumentation sector.

To build on this initiative Oxford Instruments is now actively discussing opportunities to collaborate with other companies on skills development and training in the US. “We all want to provide some hands-on learning for students as they progress through their university education, and we all want to find ways to work with government programmes to stimulate this training,” says Martin. “It’s better for us to work together to deliver something more substantial rather than doing things in a piecemeal way.”

That collaboration is likely to centre around an initiative launched by US firm Quantum Design back in 2015. Under the scheme, now badged Discovery Teaching Labs, the company has donated one of its commercial systems for low-temperature material analysis, the PPMS VersaLab, to several university departments in the US. As part of the initiative the course professors are also asked to create experimental modules that enable undergraduate students to use this state-of-the-art technology to explore key concepts in condensed-matter physics.

“Our initial goal was to partner with universities to develop a teaching curriculum that uses hands-on learning to inspire students to become more interested in physics,” says Quantum Design’s Barak Green, who has been a passionate advocate for the scheme. “By enabling students to become confident with using these advanced scientific instruments, we have also found that we have equipped them with vital technical skills that can open up new career paths for them.”

One of the most successful partnerships has been with California State University San Marcos (CSUSM), a small college that mainly attracts students from communities with no prior tradition of pursuing a university education. “There is no way that the students at CSUSM would have been able to access this type of equipment in their undergraduate training, but now they have a year-long experimental programme that enhances their scientific learning and makes them much more comfortable with using such an advanced system,” says Green. “Many of these students can’t afford to stay in school to study for a PhD, and this programme has given them the knowledge and experience they need to get a good job.”

California State University San Marcos (CSUSM)
Teaching and discovery To build knowledge and skills among physics students, Quantum Design has developed an initiative for donating research-grade equipment to undergraduate teaching labs. (Courtesy: Quantum Design)

Indeed, Quantum Design has already hired around 20 students from CSUSM and other local programmes. “We didn’t start the initiative with that in mind, but over the years we discovered that we had all these highly skilled people who could come and work for us,” Green continues. “Students who only do theory are often very nervous around these machines, but the CSUSM graduates bring a whole extra layer of experience and know-how. Not everyone needs to have a PhD in quantum physics; we also need people who can go into the workforce and build the systems that the scientists rely on.”

This overwhelming success has given greater impetus to the programme, with Quantum Design now seeking to bring in other partners to extend its reach and impact. LakeShore Cryotronics, a long-time collaborator that designs and builds low-temperature measurement systems that can be integrated into the VersaLab, was the first company to make the commitment. In 2023 the US-based firm donated one of its M91 FastHall measurement platforms to join the VersaLab already installed at CSUSM, and the two partners are now working together to establish an undergraduate teaching lab at Stony Brook University in New York.

“It’s an opportunity for like-minded scientific companies to give something back to the community, since most of our products are not affordable for undergraduate programmes,” says LakeShore’s Chuck Cimino, who has now joined the board of advisors for the Discovery Teaching Labs programme. “Putting world-class equipment into the hands of students can influence their decisions to continue in the field, and in the long term will help to build a future workforce of skilled scientists and engineers.”

Conversations with other equipment makers at the 2024 APS March Meeting also generated significant interest, potentially paving the way for Oxford Instruments to join the scheme. “It’s a great model to build on, and we are now working to see how we might be able to commit some of our instruments to those training centres,” says Martin, who points out that the company’s Proteox S platform offers the ideal entry-level system for teaching students how to manage a cold space for experiments with qubits and condensed-matter systems. “We’ve developed a lot of training on the hardware and the physicality of how the systems work, and in that spirit of sharing there’s lots of useful things we could do.”

While those discussions continue, Martin is also looking to a future when quantum-powered processors become a practical reality in compute-intensive settings such as data centres. “At that point there will be huge demand for ultracold systems that are capable of hosting and operating large-scale quantum computers, and we will suddenly need lots of people who can install and service those sorts of systems,” he says. “We are already thinking about ways to set up training centres to develop that future workforce, which will primarily be focused around service engineers and maintenance technicians.”

Martin believes that partnering with government labs could offer a solution, particularly in the US where various initiatives are already in place to teach technical skills to college-level students. “It’s about taking that forward view,” he says. “We have already built a product that can be used for training purposes, and we have started discussions with US government agencies to explore how we could work together to build the workforce that will be needed to support the big industrial players.”

The post Quantum growth drives investment in diverse skillsets appeared first on Physics World.


Quantum brainwave: using wearable quantum technology to study cognitive development

10 September 2024 at 12:00

Though she isn’t a physicist or an engineer, Margot Taylor has spent much of her career studying electrical circuits. As the director of functional neuroimaging at the Hospital for Sick Children in Toronto, Canada, Taylor has dedicated her research to the most complex electrochemical device on the planet – the human brain.

Taylor uses various brain imaging techniques including MRI to understand cognitive development in children. One of her current projects uses a novel quantum sensing technology to map electrical brain activity. Magnetoencephalography with optically pumped magnetometry (OPM-MEG) is a wearable technology that uses quantum spins to localize electrical impulses coming from different regions of the brain.

Physics World’s Hamish Johnston caught up with Taylor to discover why OPM-MEG could be a breakthrough for studying children, and how she’s using it to understand the differences between autistic and non-autistic people.

The OPM-MEG helmets Taylor uses in this research were developed by Cerca Magnetics, a company founded in 2020 as a spin-out from the University of Nottingham‘s Sir Peter Mansfield Imaging Centre in the UK. Johnston also spoke to Cerca’s chief executive David Woolger, who explained how the technology works and what other applications they are developing.

Margot Taylor: understanding the brain

What is magnetoencephalography, and how is it used in medicine?

Magnetoencephalography (MEG) is the most sensitive non-invasive means we have of assessing brain function. Specifically, the technique gives us information about electrical activity in the brain. It doesn’t give us any information about the structure of the brain, but the disorders that I’m interested in are disorders of brain function, rather than disorders of brain structure. There are some other techniques, but MEG gives us amazing temporal and spatial resolution, which makes it very valuable.

So you’re measuring electrical signals. Does that mean that the brain is essentially an electrical device?

Indeed, they are hugely complex electrical devices. Technically it’s electrochemical, but we are measuring the electrical signals that are the product of the electrochemical reactions in the brain.

When you perform MEG, how do you know where that signal’s coming from?

We usually get a structural MRI as well, and then we have very good source localization approaches so that we can tell exactly where in the brain different signals are coming from. We can also get information about how the signals are connecting with each other, the interactions among different brain regions, and the timing of those interactions.

Three complex-looking helmets on shelves next to a fun child-friendly mural
Good fit Margot Taylor and her team are working with quantum MEG helmets in various sizes, from large adult (purple) down to one-year-old (green). (Courtesy: Hospital for Sick Children)

Why does quantum MEG make it easier to do brain scans on children?

The quantum technology is called optically pumped magnetometry (OPM) and it’s a wearable system, where the sensors are placed in a helmet. This means movement is allowed, because the helmet moves with the child. We’re able to record brain signals in very young children because they can move or sit on their parents’ laps; they don’t have to be lying perfectly still.

Conventional MEG uses cryogenic technology and is typically one size fits all. It’s designed for an adult male head and if you put in a small child, their head is a long way from the sensors. With OPM, however, the helmet can be adapted for different sized heads. We have little tiny helmets up to bigger helmets. This is a game changer in terms of recording signals in little children.

Can you tell us more about the study you’re leading at the Hospital for Sick Children in Toronto using a quantum MEG system from the UK’s Cerca Magnetics?

We are looking at early brain function in autistic and non-autistic children. Autism is usually diagnosed by about three years of age, although sometimes it’s not diagnosed until they’re older. But if a child could be diagnosed with autism earlier, then interventions could start earlier. And so we’re looking at autistic and non-autistic children as well as children that have a high likelihood of being autistic to see if we can get brain signals that will predict whether they will go on to get a diagnosis or not.

How do the responses you measure using quantum MEG differ between autistic and non-autistic people, or those with a high likelihood of developing autism?

We don’t have that data yet because we’re looking at the children who have a high likelihood of being autistic, so we have to wait until they grow up and for another year or so to see if they get a diagnosis. For the children who do have a diagnosis of autism already, it seems like the responses are atypical, but we haven’t fully analysed that data. We think that there is a signal there that we’ll be able to report in the foreseeable future, but we have only tested 32 autistic children so far, and we’d like to get more data before we publish.

A woman sits with her back to the camera wearing a helmet covered with electronics. Two more women stand either side
Testing times Margot Taylor (right) and her postdoctoral fellow Julie Sato (left) place a quantum MEG helmet on a research participant (postdoc Kristina Safar). (Courtesy: Hospital for Sick Children)

Do you have any preliminary results or published papers based on this data yet?

We’re still analysing data. We’re seeing beautiful, age-related changes in our cohort of non-autistic children. Because nobody has been able to do these studies before, we have to establish the foundational datasets with non-autistic children before we can compare it to autistic children or children who have a high likelihood of being autistic. And those will be published very shortly.

Are you using the quantum MEG system for anything else at the moment?

With the OPM system, we’re also setting up studies looking at children with epilepsy. We want to compare the OPM technology with the cryogenic MEG and other imaging technologies and we’re working with our colleagues to do that. We’re also looking at children who have a known genetic disorder to see if they have brain signals that predict whether they will also go on to experience a neurodevelopmental disorder. We’re also looking at children who are born to mothers with HIV to see if we can get an indication of what is happening in their brains that may affect their later development.

David Woolger: expanding applications

Can you give us a brief description of Cerca Magnetics’ technology and how it works?

When a neuron fires, you get an electrical current and a corresponding magnetic field. Our technology uses optically pumped magnetometers (OPMs), which are very sensitive to magnetic fields. Effectively, we’re sensing magnetic fields 500 million times lower than the Earth’s magnetic field.

To enable us to do that, as well as the quantum sensors, we need to shield against the Earth’s magnetic field, so we do this in a shielded environment with both active and passive shielding. We are then able to measure the magnetic fields from the brain, which we can use to understand functionally what’s going on in that area.
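To put Woolger’s figure into absolute terms – an editorial back-of-the-envelope estimate, taking the Earth’s field to be roughly 50 µT – the sensitivity he describes corresponds to

\frac{50\,\mu\mathrm{T}}{5\times10^{8}} = 1\times10^{-13}\,\mathrm{T} = 100\,\mathrm{fT},

which is the order of magnitude of the magnetic fields that neuronal currents produce outside the head.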

Are there any other applications for this technology beyond your work with Margot Taylor?

There’s a vast number of applications within the field of brain health. For example, we’re working with a team in Oxford at the moment, looking at dementia. So that’s at the other end of the life cycle, studying ways to identify the disease much earlier. If you can do that you can potentially start treatment with drugs or other interventions earlier.

Outside brain health, there are a number of groups who are using this quantum technology in other areas of medical science. One group in Arkansas is looking at foetal imaging during pregnancy, using it to see much more clearly than has previously been possible.

There’s another group in London looking at spinal imaging using OPM. Concussion is another potential application of these sensors, for sports or military injuries. There’s a vast range of medical-imaging applications that can be done with these sensors.

Have you looked at non-medical applications?

Cerca is very much a medical-imaging company, but I am aware of other applications of the technology. For example, applications with car batteries have potential to be a big market. When they make car batteries, there’s a lot of electrochemistry that goes into the cells. If you can image those processes during production, you can effectively optimize that production cycle, and therefore reduce the costs of the batteries. This has a real potential benefit for use in electric cars.

What’s next for Cerca Magnetics’ technology?

We are in a good position in that we’ve been able to deliver our initial systems to the research market and actually earn revenue. We have made a profit every year since we started trading. We have then reinvested that profit back into further development. For example, we are looking at scanning two people at once, looking at other techniques that will continue to develop the product, and most importantly, working on medical device approval. At the moment, our system is only sold to research institutes, but we believe that if the product were available in every hospital and every doctor’s surgery, it could have an incredible societal impact across the human lifespan.

Magnetoencephalography with optically pumped magnetometers

Schematic showing the working principle behind optically pumped magnetometry
Seeing the light A schematic showing the working principle behind optically pumped magnetometry (OPM). (CC BY 4.0 Trends in Neurosciences 45 621)

Like any electrical current, signals transmitted by neurons in the brain generate magnetic fields. Magnetoencephalography (MEG) is an imaging technique that detects these signals and locates them in the brain. MEG has been used to plan brain surgery to treat epilepsy. It is also being developed as a diagnostic tool for disorders including schizophrenia and Alzheimer’s disease.

MEG traditionally uses superconducting quantum interference devices (SQUIDs), which are sensitive to very small magnetic fields. However, SQUIDs must be cryogenically cooled, which makes the technology bulky and immobile. Magnetoencephalography with optically pumped magnetometers (OPM-MEG) is an alternative technology that operates at room temperature. Optically pumped magnetometers (OPMs) are small quantum devices that can be integrated into a helmet, which is an advantage for imaging children’s brains.

The key components of an OPM device are a cloud of alkali atoms (generally rubidium), a laser and a photodetector. Initially, the spins of the atoms point in random directions (top row in figure), but applying a polarized laser of the correct frequency aligns the spins along the direction of the light (middle row in figure). When the atoms are in this state, they are transparent to the laser so the signal reaching the photodetector is at a maximum.

However, in the presence of a magnetic field, such as that from a brain wave, the spins of the atoms are perturbed and they are no longer aligned with the laser (bottom row in figure). The atoms can now absorb some of the laser light, which reduces the signal reaching the photodetector.
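As a rough illustration of the mechanism just described – a minimal toy model with invented parameters, not Cerca’s actual signal chain – the photodetector signal can be pictured as a Lorentzian that peaks at zero field and drops as an external field tips the atomic spins:

    # Toy model of an OPM's optical signal (illustrative parameters only):
    # transmission is maximal at zero field, where the pumped atoms are
    # transparent, and falls off as a Lorentzian once an applied field
    # tips the spins and the vapour starts absorbing the laser light.
    def transmission(field, linewidth=1.0, contrast=0.5):
        """Relative laser power at the photodetector; field and linewidth share the same arbitrary units."""
        return (1.0 - contrast) + contrast / (1.0 + (field / linewidth) ** 2)

    for b in [0.0, 0.1, 0.5, 1.0, 5.0]:
        print(f"B = {b:4.1f} x linewidth -> transmission {transmission(b):.3f}")

Practical OPM-MEG sensors typically add field modulation and lock-in detection so that the output becomes a signed, linear measure of the field, but the peaked zero-field resonance sketched here is the underlying physics.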

In OPM-MEG, these devices are placed around the patient’s head and integrated into a helmet. By measuring the signal from the devices and combining this with structural images and computer modelling, it’s possible to work out where in the brain the signal came from. This can be used to understand how electrical activity in different brain regions is linked to development, brain disorders and neurodivergence.

Katherine Skipper

The post Quantum brainwave: using wearable quantum technology to study cognitive development appeared first on Physics World.


Electro-active material ‘learns’ to play Pong

10 September 2024 at 10:45

An electro-active polymer hydrogel can be made to “memorize” experiences in the same way as biological neurons do, say researchers at the University of Reading, UK. The team demonstrated this finding by showing that when the hydrogel is configured to play the classic video game Pong, it improves its performance over time. While it would be simplistic to say that the hydrogel truly learns like humans and other sentient beings, the researchers say their study has implications for studies of artificial neural networks. It also raises questions about how “simple” such a system can actually be, if it is capable of such complex behaviour.

Artificial neural networks are machine-learning algorithms that are configured to mimic structures found in biological neural networks (BNNs) such as human brains. While these forms of artificial intelligence (AI) can solve problems through trial and error without being explicitly programmed with pre-defined rules, they are not generally regarded as being adaptive, as BNNs are.

Playing Pong with neurons

In a previous study, researchers led by neuroscientist Karl Friston of University College London, UK and Brett Kagan of Cortical Labs in Melbourne, Australia, integrated a BNN with computing hardware by growing a large cluster of human neurons on a silicon chip. They then connected this chip to a computer programmed to play a version of Pong, a table-tennis-like game that originally involved a player and the computer bouncing an electronic ball between two computerized paddles. In this case, however, the researchers simplified the game so that there was only a single paddle on one side of the virtual table.

To find out whether this paddle had contacted the ball, Friston, Kagan and colleagues transmitted electrical signals to the neuronal network via the chip. At first, the neurons did not play Pong very well, but over time, they hit the ball more frequently and made more consecutive hits, allowing for longer rallies.

In this earlier work, the researchers described the neurons as being able to “learn” the game thanks to the concept of free energy as defined by Friston in 2010. He argued that neurons endeavour to minimize free energy, and therefore “choose” the option that allows them to do this most efficiently.

An even simpler version

Inspired by this feat and by the technique employed, the Reading researchers wondered whether such an emergent memory function could be generated in media that were even simpler than neurons. For their experiments, they chose to study a hydrogel (a complex polymer that jellifies when hydrated) that contains free-floating ions. These ions make the polymer electroactive, meaning that its behaviour is influenced by an applied electric field. As the ions move, they draw water with them, causing the gel to swell in the area where the electric field is applied.

The time it takes for the hydrogel to swell is much greater than the time it takes to de-swell, explains team member Vincent Strong. “This means there is a form of hysteresis in the ion motion because each consecutive stimulation moves the ions less and less as they gather,” says Strong, a robotics engineer at Reading and the first author of a paper in Cell Reports Physical Science on the new research. “This acts as a form of memory since the result of each stimulation on the ion’s motion is directly influenced by previous stimulations and ion motion.”

This form of memory allows the hydrogel to build up experience about how the ball moves in Pong, and thus to move its paddle with greater accuracy, he tells Physics World. “The ions within the gel move in a way that maps a memory of the ball’s motion not just at any given point in time but over the course of the entire game.”
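A purely illustrative sketch of that idea – not the Reading team’s model, and with all parameters invented – is a single state variable whose response to each new stimulus shrinks as previous stimuli accumulate, and which relaxes back only slowly in between, so that the state encodes the stimulation history:

    # Toy "memory" variable x: each stimulus nudges x towards saturation by an
    # amount that shrinks as x grows (mimicking ions bunching up), while x
    # decays back towards zero only slowly between stimuli.
    def stimulate(x, gain=0.3, x_max=1.0):
        return x + gain * (x_max - x)   # diminishing response as x approaches x_max

    def relax(x, leak=0.02):
        return x * (1.0 - leak)         # slow decay, so past stimuli leave a trace

    x = 0.0
    for step in range(1, 11):
        x = relax(stimulate(x))
        print(f"after stimulus {step:2d}: x = {x:.3f}")

Run it and the increments get smaller with every pulse – the kind of saturating, history-dependent response Strong describes. Mapping such a variable onto paddle position is, in spirit, how the hydrogel’s “experience” of the ball accumulates.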

The researchers argue that their hydrogel represents a different type of “intelligence”, and one that could be used to develop algorithms that are simpler than existing AI algorithms, most of which are derived from neural networks.

“We see this work as an example of how a much simpler system, in the form of an electro-active polymer hydrogel, can perform similar complex tasks to biological neural networks,” Strong says. “We hope to apply this as a stepping stone to finding the minimum system required for such tasks that require memory and improvement over time, looking both into other active materials and tasks that could provide further insight.

“We’ve shown that memory is emergent within the hydrogels, but the next step is to see whether we can also show specifically that learning is occurring.”

The post Electro-active material ‘learns’ to play Pong appeared first on Physics World.


Fusion’s public-relations drive is obscuring the challenges that lie ahead

By: No Author
9 September 2024 at 12:00

“For a successful technology, reality must take precedence over public relations, for nature cannot be fooled.” So stated the Nobel laureate Richard Feynman during a commission hearing into NASA’s Challenger space shuttle disaster in 1986, which killed all seven astronauts onboard.

Those famous words have since been applied to many technologies, but they are becoming especially apt to nuclear fusion where public relations currently appears to have the upper hand. Fusion has recently been successful in attracting public and private investment and, with help from the private sector, it is claimed that fusion power can be delivered in time to tackle climate change in the coming decades.

Yet this rosy picture hides the complexity of the novel nuclear technology and plasma physics involved. As John Evans – a physicist who has worked at the Atomic Energy Research Establishment in Harwell, UK – recently highlighted in Physics World, there is a lack of proven solutions for the fusion fuel cycle, which involves breeding and reprocessing unprecedented quantities of radioactive tritium with extremely low emissions.

Unfortunately, this is just the tip of the iceberg. Another stubborn roadblock lies in instabilities in the plasma itself – for example, so-called Edge Localised Modes (ELMs), which originate in the outer regions of tokamak plasmas and are akin to solar flares. If not strongly suppressed they could vaporize areas of the tokamak wall, causing fusion reactions to fizzle out. ELMs can also trigger larger plasma instabilities, known as disruptions, that can rapidly dump the entire plasma energy and apply huge electromagnetic forces that could be catastrophic for the walls of a fusion power plant.

In a fusion power plant, the total thermal energy stored in the plasma needs to be about 50 times greater than that achieved in the world’s largest machine, the Joint European Torus (JET). JET operated at the Culham Centre for Fusion Energy in Oxfordshire, UK, until it was shut down in late 2023. I was responsible for upgrading JET’s wall to tungsten/beryllium and subsequently chaired the wall protection expert group.

JET was an extremely impressive device, and just before it ceased operation it set a new world record for controlled fusion energy production of 69 MJ. While this was a scientific and technical tour de force, in absolute terms the fusion energy created and plasma duration achieved at JET were minuscule. A power plant with a sustained fusion power of 1 GW would produce 86 million MJ of fusion energy every day. Furthermore, large ELMs and disruptions were a routine feature of JET’s operation and occasionally caused local melting. Such behaviour would render a power plant inoperable, yet these instabilities remain to be reliably tamed.
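That daily figure is simple arithmetic (our check, not a new claim): a watt is a joule per second, so

1\,\mathrm{GW} \times 86\,400\,\mathrm{s} = 8.64\times10^{13}\,\mathrm{J} \approx 86\ \text{million MJ},

more than a million times JET’s record 69 MJ shot, every single day.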

Complex issues

Fusion is complex – solutions to one problem often exacerbate other problems. Furthermore, many of the physics and technology features that are essential for fusion power plants, and that require substantial development and testing in a fusion environment, were not present in JET. One example is the technology to drive the plasma current sustainably using microwaves. The purpose of the international ITER project, which is currently being built in Cadarache, France, is to address such issues.

ITER, which is modelled on JET, is a “low duty cycle” physics and engineering experiment. Delays and cost increases are the norm for large nuclear projects and ITER is no exception. It is now expected to start scientific operation in 2034, but the first experiments using “burning” fusion fuel – a mixture of deuterium and tritium (D–T) – are only set to begin in 2039. ITER, which is equipped with many plasma diagnostics that would not be feasible in a power plant, will carry out an extensive research programme that includes testing tritium-breeding technologies on a small scale, ELM suppression using resonant magnetic perturbation coils and plasma-disruption mitigation systems.

The challenges ahead cannot be overstated. For fusion to become commercially viable with an acceptably low output of nuclear waste, several generations of power-plant-sized devices could be needed

Yet the challenges ahead cannot be overstated. For fusion to become commercially viable with an acceptably low output of nuclear waste, several generations of power-plant-sized devices could be needed following any successful first demonstration of substantial fusion-energy production. Indeed, EUROfusion’s Research Roadmap, which the UK co-authored when it was still part of ITER, sees fusion as only making a significant contribution to global energy production in the course of the 22nd century. This may be politically unpalatable, but it is a realistic conclusion.

The current UK strategy is to construct a fusion power plant – the Spherical Tokamak for Energy Production (STEP) – at West Burton, Nottinghamshire, by 2040 without awaiting results from intermediate experiments such as ITER. This strategy would appear to be a consequence of post-Brexit politics. However, it looks unrealistic scientifically, technically and economically. The total thermal energy of the STEP plasma needs to be about 5000 times greater than has so far been achieved in the UK’s MAST-U spherical tokamak experiment. This will entail an extreme, and unprecedented, extrapolation in physics and technology. Furthermore, the compact STEP geometry means that during plasma disruptions its walls would be exposed to far higher energy loads than ITER, where the wall protection systems are already approaching physical limits.

I expect that the complexity inherent in fusion will continue to provide its advocates, both in the public and private sphere, with ample means to obscure both the severity of the many issues that lie ahead and the timescales required. Returning to Feynman’s remarks, sooner or later reality will catch up with the public relations narrative that currently surrounds fusion. Nature cannot be fooled.

The post Fusion’s public-relations drive is obscuring the challenges that lie ahead appeared first on Physics World.


To make Mars warmer, just add nanorods

9 September 2024 at 10:00

If humans released enough engineered nanoparticles into the atmosphere of Mars, the planet could become more than 30 K warmer – enough to support some forms of microbial life. This finding is based on theoretical calculations by researchers in the US, and it suggests that “terraforming” Mars to support temperatures that allow for liquid water may not be as difficult as previously thought.

“Our finding represents a significant leap forward in our ability to modify the Martian environment,” says team member Edwin Kite, a planetary scientist at the University of Chicago.

Today, Mars is far too cold for life as we know it to thrive there. But it may not have always been this way. Indeed, streams may have flowed on the red planet as recently as 600 000 years ago. The idea of returning Mars to this former, warmer state – terraforming – has long kindled imaginations, and scientists have proposed several ways of doing it.

One possibility would be to increase the levels of artificial greenhouse gases, such as chlorofluorocarbons, in Mars’ currently thin atmosphere. However, this would require volatilizing roughly 100 000 megatons of fluorine, an element that is scarce on the red planet’s surface. This means that essentially all the fluorine required would need to be transported to Mars from somewhere else – something that is not really feasible.

An alternative would be to use materials already present on Mars’ surface, such as those in aerosolized dust. Natural Martian dust is mainly made of iron-rich minerals distributed in particles roughly 1.5 microns in radius, which are easily lofted to altitudes of 60 km and more. In its current form, this dust actually lowers daytime surface temperatures by attenuating infrared solar radiation. A modified form of dust might, however, experience different interactions. Could this modified dust make the planet warmer?

Nanoparticles designed to trap escaping heat and scatter sunlight

In a proof-of-concept study, Kite and colleagues at the University of Chicago, the University of Central Florida and Northwestern University analysed the atmospheric effects of nanoparticles shaped like short rods about nine microns long, which is about the same size as commercially available glitter. These particles have an aspect ratio of around 60:1, and Kite says they could be made from readily-available Martian materials such as iron or aluminium.

Calculations using the finite-difference time-domain method showed that such nanorods, which are randomly oriented due to Brownian motion, would strongly scatter and absorb upwelling thermal infrared radiation in certain spectral windows. The nanorods would also scatter sunlight down towards the surface, adding to the warming, and would settle out of the atmosphere and onto the Martian surface more than 10 times more slowly than natural dust. This implies that, once airborne, the nanorods would be lofted to high altitudes and remain in the atmosphere for long periods.

More efficient than previous Martian warming proposals

These factors give the nanorod idea several advantages over comparable schemes, Kite says. “Our approach is over 5000 times more efficient than previous global warming proposals (on a per-unit-mass-in-the-atmosphere basis) because it uses much less mass of material to achieve significant warming,” he tells Physics World. “Previous schemes required importing large amounts of gases from Earth or mining rare Martian resources, [but] we find that nanoparticles can achieve similar warming with a much smaller total mass.”

However, Kite stresses that the comparison only applies to approaches that aim to warm Mars’ atmosphere on a global scale. Other approaches, including one developed by researchers at Harvard University and NASA’s Jet Propulsion Laboratory (JPL) that uses silica aerogels, would be better suited for warming the atmosphere locally, he says, adding that a recent workshop on Mars terraforming provides additional context.

While the team’s research is theoretical, Kite believes it opens new avenues for exploring planetary climate modification. It could inform future Mars exploration or even long-term plans for making Mars more habitable for microbes and plants. Extensive further research would be required, however, before any practical efforts in this direction could see the light of day. In particular, more work is needed to assess the very long-term sustainability of a warmed Mars. “Atmospheric escape to space would take at least 300 million years to deplete the atmosphere at the present-day rate,” he observes. “And nanoparticle warming, by itself, is not sufficient to make the planet’s surface habitable again either.”

Kite and colleagues are now studying the effects of particles of different shapes and compositions, including very small carbon nanoparticles such as graphene nanodisks. They report their present work in Science Advances.

The post To make Mars warmer, just add nanorods appeared first on Physics World.


Taking the leap – how to prepare for your future in the quantum workforce

By: No Author
6 September 2024 at 17:16

It’s official: after endorsement from 57 countries and the support of international physics societies, the United Nations has officially declared that 2025 is the International Year of Quantum Science and Technology (IYQ).

The year has been chosen as it marks the centenary of Werner Heisenberg laying out the foundations of quantum mechanics – a discovery that would earn him the Nobel Prize for Physics in 1932. As well as marking one of the most significant breakthroughs in modern science, the IYQ also reflects the recent quantum renaissance. Applications that use the quantum properties of matter are transforming the way we obtain, process and transmit information, and physics graduates are uniquely positioned to make their mark on the industry.

It’s certainly big business these days. According to estimates from McKinsey, in 2023 global quantum investments were valued at $42bn. Whether you want to build a quantum computer, an unbreakable encryption algorithm or a high-precision microscope, the sector is full of exciting opportunities. With so much going on, however, it can be hard to make the right choices for your career.

To make the quantum landscape easier to navigate as a jobseeker, Physics World has spoken to Abbie Bray, Araceli Venegas-Gomez and Mark Elo – three experts in the quantum sector, from academia and industry. They give us their exclusive perspectives and advice on the future of the quantum marketplace; job interviews; choosing the right PhD programme; and managing risk and reward in this emerging industry.

Quantum going mainstream: Abbie Bray

According to Abbie Bray, lecturer in quantum technologies at University College London (UCL) in the UK, the second quantum revolution has broadened opportunities for graduates. Until recently, there was only one way to work in the quantum sector – by completing a PhD followed by a job in academia. Now, however, more and more graduates are pursuing research in industry, where established companies such as Google, Microsoft and BT – as well as numerous start-ups like Rigetti and Universal Quantum – are racing to commercialize the technology.

Abbie Bray
Abbie Bray “Theorists and experimentalists need to move at the same time.” (Courtesy: Henry Bennie)

While a PhD is generally needed for research, Bray is seeing more jobs for bachelor’s and master’s graduates as quantum goes mainstream. “If you’re an undergrad who’s loving quantum but maybe not loving the research or some of the really high technical skills, there’s other ways to still participate within the quantum sphere,” says Bray. With so many career options in industry, government, consulting or teaching, Bray is keen to encourage physics graduates to consider these as well as a more traditional academic route.

She adds that it’s important to have physicists involved in all parts of the industry. “If you’re having people create policies who maybe haven’t quite understood the principles or impact or the effort and time that goes into research collaboration, then you’re lacking that real understanding of the fundamentals. You can’t have that right now because it’s a complex science, but it’s a complex science that is impacting society.”

So whether you’re a PhD student or an undergraduate, there are pathways into the quantum sector, but how can you make yourself stand out from the crowd? Bray has noticed that quantum physics is not taught in the same way across universities, with some students getting more exposure to the practical applications of the field than others. If you find yourself in an environment that isn’t saturated with quantum technology, don’t panic – but do consider getting additional experience outside your course. Bray highlights PennyLane, a Python library for programming quantum computers whose developers also produce learning resources.
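To give a flavour of what that self-study looks like in practice, here is a minimal PennyLane sketch – our illustration, not course material from Bray or UCL – that prepares and measures a two-qubit Bell state on a simulator:

    import pennylane as qml

    # Two-qubit statevector simulator
    dev = qml.device("default.qubit", wires=2)

    @qml.qnode(dev)
    def bell_state():
        qml.Hadamard(wires=0)           # put qubit 0 into an equal superposition
        qml.CNOT(wires=[0, 1])          # entangle qubit 1 with qubit 0
        return qml.probs(wires=[0, 1])  # probabilities of |00>, |01>, |10>, |11>

    print(bell_state())  # expect roughly [0.5, 0.0, 0.0, 0.5]

Working through short examples like this is one low-cost way to get the practical exposure Bray describes.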

Consider your options

Something else to be aware of, particularly for those contemplating a PhD, is that “quantum technologies” is a broad umbrella term, and while there is some crossover between, say, sensing and computing, switching between disciplines can be a challenge. It’s therefore important to consider all your options before committing to a project and Bray thinks that Centres for Doctoral Training (CDTs) are a step in the right direction. UCL has recently launched a quantum computing and quantum communications CDT where students will undergo a six-month training period before writing their project proposal. She thinks this enables them to get the most out of their research, particularly if they haven’t covered some topics in their undergraduate degree. “It’s very important that during a PhD you do the research that you want to do,” Bray says.

When it comes to securing a job, PhD position or postdoc, non-technical skills can be just as valuable as quantum know-how. Bray says it’s important to demonstrate that you’re passionate and deeply knowledgeable about your favourite quantum topic, but graduates also need to be flexible and able to work in an interdisciplinary team. “If you think you’re a theorist, understand that it also does sometimes mean looking at and working with experimental data and computation. And if you’re an experimentalist, you’ve got to understand that you need to have a rigorous understanding of the theory before you can make any judgements on your experimentation.” As Bray summarises: “theorists and experimentalists need to move at the same time”.

The ability to communicate technical concepts effectively is also vital. You might need to pitch to potential investors, apply for grants or even communicate with the HR department so that they shortlist the best candidates. Bray adds that in her experience, physicists are conditioned to communicate their research very directly, which can be detrimental in interviews where panels want to hear narratives about how certain skills were demonstrated. “They want to know how you identified a situation, then you identified the action, then the resolution. I think that’s something that every single student, every single person right now should focus on developing.”

The quantum industry is still finding its feet and earlier this year it was reported that investment has fallen by 50% since a high in 2022. However, Bray argues that “if there has been a de-investment, there’s still plenty of money to go around” and she thinks that even if some quantum technologies don’t pan out, the sector will continue to provide valuable skills for graduates. “No matter what you do in quantum, there are certain skills and experiences that can cross over into other parts of tech, other parts of science, other parts of business.”

In addition, quantum research is advancing everything from software to materials science and Bray thinks this could kick-start completely new fields of research and technology. “In any race, there are horses that will not cross the finish line, but they might run off and cross some other finish line that we didn’t know existed,” she says.

Building the quantum workforce: Araceli Venegas-Gomez

While working in industry as an aerospace engineer, Araceli Venegas-Gomez was looking for a new challenge and decided to pursue her passion for physics, getting her master’s degree in medical physics alongside her other duties. Upon completing that degree in 2016, she decided to take on a second master’s followed by a PhD in quantum optics and simulation at the University of Strathclyde, UK. By the time the COVID-19 pandemic hit in 2020, she had defended her thesis, registered her company, and joined the University of Bristol Quantum Technology Enterprise Centre as an executive fellow.

Araceli Venegas-Gomez
Araceli Venegas-Gomez “If you have a background in physics and business, everyone is looking for you.” (Courtesy: Qureca)

It was during her studies at Strathclyde that Venegas-Gomez decided to use her vast experience across industry and academia, as well as her quantum knowledge. Thanks to a fellowship from the Optica Foundation, she was able to launch QURECA (Quantum Resources and Careers). Today, it’s a global company that helps to train and recruit individuals, while also providing business development advice for both individuals and companies in the quantum sphere. As founder and chief executive of the firm, she set out to link the different stakeholders in the quantum ecosystem and to raise the quantum awareness of the general public. Crucially, she also wanted to ease the skills bottleneck in the quantum workforce and to bring newcomers into the field.

As Venegas-Gomez points out, there is a significant scarcity of skilled quantum professionals for the many roles that need filling. This shortage is exacerbated by the competition between academia and industry for the same pool of talent. “Five or ten years ago, it was difficult enough to find graduate students who would like to pursue a career in quantum science, and that was just in academia,” explains Venegas-Gomez. “With the quantum market booming, industry is also looking to hire from the same pool of candidates, so you have more competition, for pretty much the same number of people.”

Slow progress

Venegas-Gomez highlights that the quantum arena is very broad. “You can have a career in research, or work in industry, but there are so many different quantum technologies that are coming onto the market, at different stages of development. You can work on software or hardware or engineering; you can do communications; you can work on developing the business side; or perhaps even in patent law.” While some of these jobs are highly technical and would require a master’s or a PhD in that specific area of quantum tech, there are plenty of roles that would accept graduates with only an MSc in physics or even a more interdisciplinary background. “If you have a background in physics and business, everyone is looking for you,” she adds.

From what she sees in the quantum recruitment market today, there is no job shortage for physicists – instead there is a dearth of physicists with the right skills for a specific role. Venegas-Gomez explains that a physics degree equips graduates with transferable skills that allow them to work in “absolutely any sector that you could imagine”. But depending on the specific area of academia or industry within the quantum marketplace that you might be interested in, you will likely require some specific competences.

Echoing Bray, Venegas-Gomez acknowledges that the skills and knowledge that physicists pick up can vary significantly between universities – making it challenging for employers to find the right candidates. To avoid picking the wrong course, Venegas-Gomez recommends that potential master’s and PhD students speak to a number of alumni from any given institute to find out more about the course and see what areas they work in today. This can also be a great networking strategy, especially as some cohorts can have as few as 10–15 students, all keen to work with these companies or university departments in the future.

Despite the interest and investment in the quantum industry, new recruits should note that it is still in its early stages. This slow progress can mean that high expectations go unmet, causing frustration for both employers and potential employees. “Only today, we had an employer approach us (QURECA) saying that they wanted someone with three to four years’ experience in Python, and a bachelor’s or master’s degree – it didn’t have to be quantum or even physics specifically,” reveals Venegas-Gomez. “This means that [to get this particular job] you could have a background in computer science or software engineering. Having an MSc in quantum per se is not going to guarantee that you get a job in quantum technologies, unless that is something very specific that employer is looking for.”

So what specific competencies are employers across the board looking for? If a company isn’t looking for a specific technical qualification, what happens if they get two similar CVs for the same role? Do they look at an applicant’s research output and publications, or are they looking for something different? “What I find is that employers are looking for candidates who can show that, alongside their academic achievements, they have been doing outreach and communication activities,” says Venegas-Gomez. “Maybe you took on a business internship and have a good idea of how the industry works beyond university – this is what will really stand out.”

She adds that so-called soft skills – such as good leadership, teamwork and excellent communication – are highly valued. “This is an industry where highly skilled technical people need to be able to work with people vastly beyond their area of expertise. You need to be able to explain Hamiltonians or error corrections to someone who is not quantum-literate and explain the value of what you are working on.”

Venegas-Gomez is also keen that job-seekers realize that the chances of finding a role at a large firm such as Google, IBM or Microsoft are still slim-to-none for most quantum graduates. “I have seen a lot of people complete their master’s in a quantum field and think that they will immediately find the perfect job. The reality is that they likely need to be patient and get some more experience in the field before they get that dream job.” Her main advice to students is to clearly define their career goals, within the context of the booming and ever-growing quantum market, before pursuing a specific degree. The skills you acquire with a quantum degree are also highly transferable to other fields, meaning there are lots of alternatives out there even if you can’t find the right job in the quantum sphere. For example, experience in data science or software development can complement quantum expertise, making you a versatile and coveted contender in today’s job market.

Approaching “quantum advantage”: Mark Elo

Last year, IBM broke records by building the first quantum chip with more than 1000 qubits. The project represents millions of dollars of investment and the company is competing with the likes of Intel and Google to achieve “quantum advantage”, which refers to a quantum computer that can solve problems that are out of reach for classical machines.

Despite the hype, there is work to be done before the technology becomes widespread – a commercial quantum computer needs millions of qubits, and challenges in error correction and algorithm efficiency must be addressed.

Mark Elo
Mark Elo “There are some geniuses in the world, but if they can’t communicate it’s no good in an industrial environment.”

“We’re trying to move it away from a science experiment to something that’s more an industrial product,” says Mark Elo, chief marketing officer at Tabor Electronics. Tabor has been building electronic signal equipment for over 50 years and recently started applying this technology to quantum computing. The company’s focus is on control systems – classical electronic signals that interact with quantum states. At the 2024 APS March Meeting, Tabor, alongside its partners FormFactor and QuantWare, unveiled the first stage of the Echo-5Q project, a five-qubit quantum computer.

Elo describes the five years he’s worked on quantum computing as a period of significant change. Whereas researchers once relied on “disparate pieces of equipment” to build experiments, he says that the industry has changed such that “there are [now] products designed specifically for quantum computing”.

The ultimate goal of companies like Tabor is a “full-stack” solution where software and hardware are integrated into a single platform. However, the practicalities of commercializing quantum computing require a workforce with the right skills. Two years ago the consultancy McKinsey reported that companies were already struggling to recruit, and predicted that by 2025 half of the jobs in quantum computing would not be filled. Like many in the industry, Elo sees skills gaps in the sector that must be addressed to realize the potential of quantum technology.

Elo’s background is in solid-state electronics, and he worked for nearly three decades on radio-frequency engineering for companies including HP and Keithley. Most quantum-computing control systems use radio waves to interface with the qubits, so when he moved to Tabor in 2019, Elo saw his career come “full circle”, combining the knowledge from his degree with his industry experience. “It’s been like a fusion of two technologies,” he says.

It’s at this interface between physics and electronic engineering where Elo sees a skills shortage developing. “You need some level of electrical engineering and radio-frequency knowledge to lay out a quantum chip,” he explains. “The most common qubit is a transmon, and that is all driven by radio waves. Deep knowledge of how radio waves propagate through cables, through connectors, through the sub-assemblies and the amplifiers in the refrigeration unit is very important.” Elo encourages physics students interested in quantum computing to consider adding engineering – specifically radio-frequency electronics – courses to their curricula.

Transferable skills

The Tabor team brings together engineers and physicists, but there are some universal skills it looks for when recruiting. People skills, for example, are a must. “There are some geniuses in the world, but if they can’t communicate it’s no good in an industrial environment,” says Elo.

Elo describes his work as “super exciting” and says “I feel lucky in the career and the technology I’ve been involved in because I got to ride the wave of the cellular revolution all the way up to 5G and now I’m on to the next new technology.” However, because quantum is an emerging field, he thinks that graduates need to be comfortable with some risk before embarking on a career. He explains that companies in the quantum sector don’t always make money right now – “you spend a lot to make a very small amount”. But, as Elo’s own career shows, the right technical skills will always allow you to switch industries if needed.

Like many others, Elo is motivated by the excitement of competing to commercialize this new technology. “It’s still a market that’s full of ideas and people marketing their ideas to raise money,” he says. “The real measure of success is to be able to look at when those ideas become profitable. And that’s when we know we’ve crossed a threshold.”

The post Taking the leap – how to prepare for your future in the quantum workforce appeared first on Physics World.


BepiColombo takes its best images yet of Mercury’s peppered landscape

6 septembre 2024 à 12:18

The BepiColombo mission to Mercury – Europe’s first craft to the planet – has successfully completed its fourth gravity-assist flyby, using the planet’s gravity to adjust its trajectory ahead of entering orbit around Mercury in November 2026. As it did so, the craft captured its best images yet of some of Mercury’s largest impact craters.

BepiColombo, which launched in 2018, comprises two science orbiters that will circle Mercury – the European Space Agency’s Mercury Planetary Orbiter (MPO) and the Japan Aerospace Exploration Agency’s Mercury Magnetospheric Orbiter (MMO).

The two spacecraft are travelling to Mercury as part of a coupled system. When they reach the planet, the MMO will study Mercury’s magnetosphere while the MPO will survey the planet’s surface and internal composition.

The aim of the BepiColombo mission is to provide information on the composition, geophysics, atmosphere, magnetosphere and history of Mercury.

The closest approach so far for the mission – about 165 km above the planet’s surface – took place on 4 September. For the first time, the spacecraft had a clear view of Mercury’s south pole.

Mercury by BepiColombo
The winged messenger: BepiColombo will carry out its next flyby on 1 December (Courtesy: ESA/BepiColombo/MTM)

One image (top), taken by the craft’s M-CAM2 camera, features a large “peak ring basin” inside a crater measuring 210 km across, which is named after the famous Italian composer Antonio Vivaldi. The visible gap in the peak ring is thought to be where more recent lava flows have entered and flooded the crater.

BepiColombo will now conduct a fifth and sixth flyby of the planet on 1 December 2024 and 8 January 2025, respectively, before arriving in November 2026. The mission is planned to operate until 2029.

The post BepiColombo takes its best images yet of Mercury’s peppered landscape appeared first on Physics World.


Hybrid quantum–classical computing chips and neutral-atom qubits both show promise

5 septembre 2024 à 17:33

This episode of the Physics World Weekly podcast looks at quantum computing from two different perspectives.

Our first guest is Elena Blokhina, who is chief scientific officer at Equal1 – an award-winning company that is developing hybrid quantum–classical computing chips. She explains why Equal1 is using quantum dots as qubits in its silicon-based quantum processor unit.

Next up is Brandon Grinkemeyer, who is a PhD student at Harvard University working in several cutting-edge areas of quantum research. He is a member of Misha Lukin’s research group, which is active in the fields of quantum optics and atomic physics and is at the forefront of developing quantum processors that use arrays of trapped atoms as qubits.

The post Hybrid quantum–classical computing chips and neutral-atom qubits both show promise appeared first on Physics World.


Researchers with a large network of unique collaborators have longer careers, finds study

Par : No Author
5 septembre 2024 à 17:00

Are you keen to advance your scientific career? If so, it helps to have a big network of colleagues and a broad range of unique collaborators, according to a new analysis of physicists’ publication data. The study also finds that female scientists tend to work in more tightly connected groups than men, which can hamper their career progression.

The study was carried out by a team led by Mingrong She, a data analyst at Maastricht University in the Netherlands. It examined the article history of more than 23,000 researchers who had published at least three papers in American Physical Society (APS) journals. Each scientist’s last paper had been published before 2015, suggesting their research career had ended (arXiv:2408.02482).

To measure “collaboration behaviour”, the study noted the size of each scientist’s collaborative network, the recurrence of collaborations, the “interconnectivity” of the co-authors and the average number of co-authors per publication. Physicists with larger networks and a greater number of unique collaborators were found to have had longer careers and been more likely to become principal investigators, as indicated by their position in the author list.

On the other hand, publishing repeatedly with the same highly interconnected co-authors is associated with shorter careers and a lower chance of achieving principal investigator status, as is having a larger average number of co-authors.

The team also found that the more that physicists publish with the same co-authors, the more interconnected their networks become. Conversely, as network size increases, networks tend to be less dense and repeat collaborations less frequent.
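
For readers curious how such collaboration metrics might be computed in practice, the sketch below is an illustrative example only – it is not the authors’ analysis code. It assumes a researcher’s publication history is available as simple lists of co-author names and uses the open-source networkx library; the function name and sample data are hypothetical.

```python
# Illustrative sketch only (not the study's pipeline): simple collaboration
# metrics for a single researcher, given their papers as lists of author names.
# Requires the networkx package (pip install networkx).
import itertools
import networkx as nx

def collaboration_metrics(papers, author):
    """papers: list of author-name lists, each including `author`."""
    coauthor_lists = [[a for a in p if a != author] for p in papers]

    # Network size: number of unique collaborators across all papers
    collaborators = set(itertools.chain.from_iterable(coauthor_lists))

    # Recurrence of collaborations: mean number of shared papers per collaborator
    shared = {c: sum(c in p for p in coauthor_lists) for c in collaborators}
    recurrence = sum(shared.values()) / len(collaborators) if collaborators else 0.0

    # Interconnectivity: density of the graph linking collaborators who
    # appear together on at least one of the researcher's papers
    g = nx.Graph()
    g.add_nodes_from(collaborators)
    for p in coauthor_lists:
        g.add_edges_from(itertools.combinations(p, 2))
    interconnectivity = nx.density(g) if len(g) > 1 else 0.0

    # Average number of co-authors per publication
    avg_coauthors = sum(len(p) for p in coauthor_lists) / len(papers)

    return {
        "network_size": len(collaborators),
        "recurrence": recurrence,
        "interconnectivity": interconnectivity,
        "avg_coauthors_per_paper": avg_coauthors,
    }

# Hypothetical example: three papers by the same researcher
papers = [
    ["A Researcher", "B Colleague", "C Colleague"],
    ["A Researcher", "B Colleague", "D Colleague"],
    ["A Researcher", "E Colleague"],
]
print(collaboration_metrics(papers, "A Researcher"))
```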

Close-knit collaboration

In terms of gender, the study finds that women have more interconnected networks and a higher average number of co-authors than men. Female physicists are also more likely to publish repeatedly with the same co-authors, with women therefore being less likely than men to become principal investigators. Male scientists also have longer overall careers and stay in science longer after achieving principal investigator status than women, the study finds.

Collaborating with experts from diverse backgrounds introduces novel perspectives and opportunities

Mingrong She

“Collaborating with experts from diverse backgrounds introduces novel perspectives and opportunities [and] increases the probability of establishing connections with prominent researchers and institutions,” She told Physics World. Diverse collaboration also “mitigates the risk of being confined to a narrow niche and enhances adaptability”, she adds, “both of which are indispensable for long-term career growth”.

Close-knit collaboration networks can be good for fostering professional support, the study authors state, but they reduce opportunities for female researchers to form new professional connections and lower their visibility within the broader scientific community. Similarly, larger numbers of co-authors dilute individual contributions, making it harder for female researchers to stand out.

She says the study “highlights how the structure of collaboration networks can reinforce existing inequalities, potentially limiting opportunities for women to achieve career longevity and progression”. Such issues could be addressed with policies that help scientists to engage a wider array of collaborators, rewarding and encouraging small-team publications and diverse collaboration. Policies could include adjustments to performance evaluations and grant applications, and targeted training programmes.

The study also highlights lower mobility as a major obstacle for female scientists, suggesting that better childcare support, hybrid working and financial incentives could help improve the mobility and network size of female scientists.

The post Researchers with a large network of unique collaborators have longer careers, finds study appeared first on Physics World.
