
Ancient lull in Earth’s magnetic field may have allowed large animals to evolve

The list of conditions required for complex life to emerge on Earth is contentious, poorly understood, and one item longer than it used to be. According to an international team of geoscientists, an unusual lull in the Earth’s magnetic field nearly 600 million years ago may have triggered a rise in the planet’s oxygen levels, thereby allowing large, complex animals to evolve and thrive for the first time. Though evidence for the link is circumstantial, the team say that new measurements of magnetization in 591-million-year-old rocks suggest an important connection between processes occurring deep within the Earth and the appearance of life on its surface.

Scientists believe that the Earth’s magnetic field – including the bubble-like magnetosphere that shields us from the solar wind – stems from a dynamo effect created by molten metal moving inside the planet’s outer core. The strength of this field varies, and between 591 and 565 million years ago, it was almost 30 times weaker than it is today. Indeed, researchers think that in these years, which lie within the geological Ediacaran Period, the field dropped to its lowest-ever point.

Early in the Ediacaran, life on Earth was limited to small, soft-bodied organisms. Between 575 and 565 million years ago, however, fossil records show that lifeforms became significantly larger, more complex and mobile. While previous studies linked this change to an increase in atmospheric and oceanic oxygen levels that occurred around the same time, the cause of this increase was not known.

Weak magnetic field allowed more hydrogen gas to escape

In the latest study, which is published in Communications Earth & Environment, researchers led by geophysicist John Tarduno of the University of Rochester, US, present a hypothesis that links these two phenomena. The weak magnetic field, they argue, could have allowed more hydrogen gas to escape from the atmosphere into space, resulting in a net increase in the percentage of oxygen in the atmosphere and oceans. This, in turn, allowed larger, more oxygen-hungry animals to emerge.

To quantify the drop in Earth’s magnetic field, the team used a technique known as single-crystal paleointensity that Tarduno and colleagues invented 25 years ago and later refined. This technique enabled them to measure the magnetization of plagioclase feldspar crystals from a 591-million-year-old rock formation in Passo da Fabiana Gabbros, Brazil. These crystals are common in the Earth’s crust and contain minute magnetic inclusions that record, with high fidelity, the intensity of the Earth’s magnetic field at the time they crystallized. They also preserve accurate values of magnetic field strength for billions of years.

Ancient organism: Photograph of a cast of Dickinsonia costata, an organism that lived ~560 million years ago, during the Ediacaran Period. The fossil was found at Flinders Ranges in South Australia. (Courtesy: Shuhai Xiao, Virginia Tech)

Tarduno says it was challenging to find rock formations that would have cooled slowly enough to yield a representative average for the magnetic field, and that also contained crystals suitable for paleointensity analyses. “We were able to find these in Brazil, aided by our colleagues at Universidade Federal do Rio Grande do Sul,” he says.

The team mounted the samples in quartz holders (which had negligible magnetic moments) and measured their magnetism using a DC SQUID magnetometer at Rochester. This instrument is composed of high-resolution sensing coils housed in a magnetically shielded room with an ambient field of less than 200 nT.

Geodynamo remained ultra-weak for at least 26 million years

The team compared its results to those from previous studies of 565-million-year-old anorthosite rocks from the Sept Îles Mafic Intrusive Suite in Quebec, Canada, which also contain plagioclase feldspar crystals. Together, these measurements suggest that the Earth’s geodynamo remained ultra-weak, with a time-averaged dipole moment of less than 0.8 × 10²² A m², for at least 26 million years. “This timespan allowed the oxygenation of the atmosphere and oceans to cross a threshold that in turn helped the Ediacaran radiation of animal life,” Tarduno says. “If this is true, it would represent a profound connection between the deep Earth and life. This history could also have implications for the search for life elsewhere.”

The researchers say they need additional records of geomagnetic field strength throughout the Ediacaran, plus the Cryogenian (720 to 635 million years ago) and Cambrian (538.8 to 485.4 million years ago) Periods that bookend it, to better constrain the duration of the ultra-low magnetic fields. “This is crucial for understanding how much hydrogen could be lost from the atmosphere to space, ultimately resulting in the increased oxygenation,” Tarduno tells Physics World.

The post Ancient lull in Earth’s magnetic field may have allowed large animals to evolve appeared first on Physics World.

Physics in Ukraine: scientific endeavour lives on despite the Russian invasion

Photographs of researchers and scientific facilities in Kharkiv, Ukraine, taken in November 2023. Read the article below to hear from photographer Eric Lusito about his experiences documenting the effects of the Russian invasion on Ukraine’s physics community. (All images courtesy: Eric Lusito)

Kharkiv, Ukraine’s second-largest city, has a long history as a world cradle of physics. The first experiments in the Soviet Union on nuclear fission were conducted there in the 1930s, and in 1932 the future Nobel-prize-winning physicist Lev Landau founded an influential school of theoretical physics in the city. Over the years, Kharkiv has been visited by influential scientists including Niels Bohr, Paul Dirac and Paul Ehrenfest.

Kharkiv is still home to many research institutes but, located just 30 km from the Russian border, the city has been heavily damaged and suffered hundreds of casualties since the Russian invasion began in February 2022. In the past week, advances by Russian forces have put the region under increased pressure, with thousands of civilians fleeing the border area towards Kharkiv.

I first travelled to Kharkiv in November 2021 as part of a photography project documenting Soviet-era scientific facilities. The Russian army was massing at the border, but I was nevertheless stunned when, having returned home to France, I heard the news of the invasion.

The day after I arrived, explosions rang out in the city centre

I considered abandoning my project, but I felt compelled to document what was happening and in November 2023, I returned to Kharkiv. Though the Ukrainian counter-offensive had pushed back the Russian forces, the city was nevertheless plunged into darkness, the streets deserted. The day after I arrived, explosions rang out in the city centre.

For two weeks, I photographed Kharkiv’s scientific facilities, many of which had been destroyed or badly damaged since my previous visit. I also interviewed and photographed scientists who were continuing to perform their research despite the ongoing war. As the situation in Kharkiv becomes increasingly critical, the pictures I took stand as a record of the effects of the conflict on the people of Ukraine.

Department of Solid State Physics and Condensed Matter, Kharkiv Institute of Physics and Technology, November 2021 and November 2023

Even when I was there in 2021, visiting the old cryogenics laboratory of the Kharkiv Institute of Physics and Technology (KIPT) was like stepping back in time (photo 1). Founded in 1928 as the Ukrainian Physics and Technology Institute, KIPT is one of the oldest and largest physical science research institutes in Ukraine. New facilities have been built on the outskirts of the city, but the original buildings are still standing and, because of the historical value of the site, a project is underway to convert them into a museum.

Photo 1 Cryogenics laboratory at the Department of Solid State Physics and Condensed Matter, Kharkiv Institute of Physics and Technology, November 2021. (Courtesy: Eric Lusito)
Photo 2 Alexander Mats, pictured next to an electron microscope, at the Department of Solid State Physics and Condensed Matter, Kharkiv Institute of Physics and Technology, November 2023. (Courtesy: Eric Lusito)

When I returned in 2023, the corridors were silent, the heating was shut off and all the doors were closed. However, a few of the offices were still occupied by researchers in KIPT’s Department of Solid State Physics and Condensed Matter (photos 2, 3). During my visit, I also met students from the Kharkiv National University who were visiting with their teacher, Alexander Gapon (photo 4). Their faculty had been damaged in the early days of the fighting, and they had moved lab classes to KIPT so that teaching could continue.

Photo 3 Vladimir Kloshko, ultrasound specialist at the Department of Solid State Physics and Condensed Matter, Kharkiv Institute of Physics and Technology, November 2023. (Courtesy: Eric Lusito)
Photo 4 Alexander Gapon, Kharkiv National University, November 2023. (Courtesy: Eric Lusito)

Plasma Physics Institute, Kharkiv Institute of Physics and Technology, November 2023

The Plasma Physics Institute houses two huge stellarators for studying nuclear fusion. These machines confine plasma under high magnetic fields (photo 5), creating the hot, dense conditions necessary for nuclei to fuse.

Photo 5 The stellarator Uragan-2M at the Plasma Physics Institute, Kharkiv Institute of Physics and Technology. (Courtesy: Eric Lusito)

I met Igor Garkusha, the director of the institute, who showed me where the roof of the facility had been pierced by projectiles during the early days of the Russian invasion. Debris was still scattered everywhere and, in the room housing the stellarators, he held up a piece of shrapnel (photo 6). The roof played its protective role and the stellarators were undamaged, but with research temporarily halted, Garkusha worried that the institute was at risk of losing skills.

Photo 6 Igor Garkusha holding a piece of shrapnel in the stellarator room of the Plasma Physics Institute, Kharkiv Institute of Physics and Technology. (Courtesy: Eric Lusito)

“I never thought the Russians would wage war on us because I have a lot of family and friends on the other side,” Garkusha said. “I attended a nuclear fusion conference in London organized by the IAEA (International Atomic Energy Agency) in October. There were Russian scientists whom I’ve known for 20 or 30 years, and not one of them spoke to me.”

Photo 7 Dormitories at the Kharkiv Institute of Physics and Technology, in the same neighbourhood as the Plasma Physics Institute. (Courtesy: Eric Lusito)

The Institute for Scintillation Materials, Kharkiv, November 2023

“I can’t complain: we get a lot of orders,” says Borys Grynyov (photo 8), director of the Institute for Scintillation Materials. Scintillation crystals, which emit light when exposed to radiation, are a vital component of particle detectors, and even though some of its buildings have been destroyed, the institute continues to participate in CERN research programmes.

Photo 8 Borys Grynyov, director of the Institute for Scintillation Materials, Kharkiv. (Courtesy: Eric Lusito)

For several months at the beginning of the war, around 50 people – members of staff and their families – lived in the basement of the facility. The teams were eventually able to move the equipment to better-protected rooms, which allowed the production of the scintillation crystals to resume (photo 9).

Photo 9 Andriy Shille, technician, processing alkali-halide scintillation crystals at the Institute for Scintillation Materials, Kharkiv. (Courtesy: Eric Lusito)

Kharkiv Polytechnic Institute, November 2023

At around 5.30 a.m. on 19 August 2022, a missile struck a university building of the Kharkiv Polytechnic Institute (photo 10). From 8 a.m., Kseniia Minakova, head of the optics and photonics laboratory, searched through the rubble for what could be salvaged. Much of the equipment had been destroyed, but she and her colleagues found some microscopes, welding equipment and computers. After a few days, heavier equipment such as vacuum pumps could be evacuated.

Photo 10 The university building of the Kharkiv Polytechnic Institute that was destroyed on 19 August 2022. (Courtesy: Eric Lusito)

I met Minakova at the institute, where she showed me her small temporary laboratory (photo 11). Her research is dedicated to solar energy and improving heat dissipation in photovoltaic systems. To enable Minakova’s team to continue working, Tulane University in the US, where they have collaborators, donated photovoltaic cells as well as a new 3D printer, screens, oscilloscopes and other equipment.

Photo 11 Kseniia Minakova, from the Kharkiv Polytechnic Institute, pictured with equipment recovered from the rubble of the laboratory. (Courtesy: Eric Lusito)

But what she was most grateful for was at the back of the room, where her team was huddled around a solar simulator donated by a Canadian company (photo 12). “I explained to them what I needed and they came up with a technical solution for our laboratory and completely assembled a new installation for us from scratch. This equipment is worth $60,000!” exclaimed Minakova.

Photo 12 Kseniia Minakova and her colleagues around the solar simulator. (Courtesy: Eric Lusito)

In 2023, she was appointed an ambassador of the Optica Foundation and was a finalist for the L’Oréal-UNESCO “For Women in Science” prize. She has received several offers to work abroad. “I turned them down. My place is here, in Ukraine. If everyone leaves, who will take care of rebuilding our country?”


Zurich Instruments launch their SHF+ series platform for quantum computing technologies

In this short video, filmed at the 2024 March Meeting of the American Physical Society earlier this year, Vikrant Mahajan, president of Zurich Instruments USA, outlines the company’s new products to support scientists in quantum computing.

Mahajan explains how Zurich Instruments, which has more than 10 locations around the globe, wants to move quantum computing forward with its products, such as its newly released SHF+ series platform. “Our mission is to help build a practical quantum computer,” he says.

Moritz Kirste, head of business development quantum technologies, then describes the main features of the SHF+ product line, whose instruments are the building blocks of the company’s quantum computing control systems. “It provides qubit control and read-out functionality of any chip size from one to hundreds of qubits,” he says.

Next up is Linsey Rodenbach, application scientist quantum technologies, who explains that the SHF+ offers better qubit performance metrics – and therefore higher algorithm fidelities. The new platform has lower noise, which means less measurement-induced dephasing, thereby boosting phase coherence between control pulses, especially for long-lived qubits.

As Rodenbach adds, the SHF+ provides a route both for developing high-quality qubits and for operating large quantum processing units, with Zurich Instruments partnering with some of the world’s leading labs to ensure that the technical specifications of the SHF+ provide the desired performance benefits.

Get in touch with their team to discuss your specific requirements and collaborate on advancing quantum technology.


Institute of Physics launches new inclusion programme for universities

Open to all: the new Physics Inclusion Award will be open to universities in the UK and Ireland. (Courtesy: iStock/Angelina Bambina)

The Institute of Physics (IOP) has launched a new award to help universities attract, support and retain a diverse physics community. The Physics Inclusion Award will encompass several aspects of diversity such as race and ethnicity, neurodiversity and sexual orientation.

It replaces the long-established Project Juno, which rewarded university physics departments and organizations that showed they had taken action to address gender equality.

Project Juno was originally set up after the IOP examined the challenges facing UK university departments in the mid-2000s. Over the last 15 years, the number of women physics professors at UK universities has doubled, with women now making up a quarter of academic staff. But there remain many parts of the population that are under-represented in physics. Less than 1% of university physics staff, for example, are Black.

A steering group, chaired by University of Birmingham theoretical physicist Nicola Wilkin, began work on the new award in 2021. A pilot scheme ran from September 2023 to January 2024 involving staff from 11 physics departments in the UK and Ireland. They worked through a Physics Inclusion Award self-assessment tool, reviewed the effectiveness of the award criteria and took part in feedback sessions with the IOP.

“Building upon the success of Project Juno, [the new award] widens our offer to anyone who faces barriers because of who they are or where they come from – so that everyone is welcomed and included in physics”, says the IOP president, atomic physicist Keith Burnett. “To realize the incredible potential physics offers society, we need a growing, diverse, sustainable physics community which drives the physics of today and attracts the generation of tomorrow.”

Applications for the new award will open in late 2024.


Venus is losing water much faster than previously thought, study suggests

Venus could be shedding water to space at a much faster rate than previously thought. That is the conclusion of researchers in the US, who have identified a mechanism in the Venusian ionosphere that could be involved in water loss.

How much water Venus had in the past is uncertain, with some planetary scientists suggesting that the planet may have once had oceans that eventually evaporated as the Venusian greenhouse effect began to run away with itself. Today, only 0.002% of the planet’s atmosphere is composed of water vapour. If condensed on Venus’ surface, this water would form a global equivalent layer (GEL) just 3 cm deep – compared to Earth’s GEL of 3 km.
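As a rough, order-of-magnitude sanity check (not a calculation from the study), the quoted 3 cm GEL can be recovered from Venus’ roughly 92 bar atmosphere, treating the 0.002% water-vapour abundance as a mass fraction for simplicity (it is in fact a volume mixing ratio, which would lower the result by a further factor of a few):

```python
# Order-of-magnitude estimate of Venus' water global equivalent layer (GEL).
# Assumed (illustrative) values: 92 bar surface pressure, g = 8.87 m/s^2,
# and the 0.002% water-vapour abundance treated as a mass fraction.
P = 9.2e6            # surface pressure in Pa
g = 8.87             # surface gravity in m/s^2
f_h2o = 2e-5         # water-vapour fraction (0.002%)
rho_water = 1000.0   # density of liquid water in kg/m^3

column_mass = P / g                    # atmospheric column mass in kg/m^2
water_column = f_h2o * column_mass     # water column mass in kg/m^2
gel_cm = 100 * water_column / rho_water

print(f"Estimated GEL: {gel_cm:.1f} cm")  # same order of magnitude as the 3 cm quoted above
```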

If Venus began life with a large amount of water, then it has lost almost all of it. Current thinking is that non-thermal hydrogen escape is responsible. This process involves solar radiation splitting water molecules into oxygen and hydrogen. Being lightweight, some of this hydrogen will then escape into space and be swept away by the solar wind.

Lost forever

“Once a hydrogen atom has gone, Venus has lost, in some sense, a water molecule forever,” says Mike Chaffin of the University of Colorado Boulder.

In recent years, Chaffin’s team have explored water loss from planetary atmospheres via a different mechanism involving the formyl cation (HCO+). This ion is a product of molecular recombination in a planetary ionosphere after molecules such as water and carbon dioxide are broken apart by solar radiation. In their model, Chaffin and his colleagues describe the dissociative recombination of HCO+, whereby an electron that has been liberated when an atom or molecule is ionized then collides with the ion, splitting HCO+ apart into carbon monoxide (CO) and an energetic hydrogen atom. This “hot hydrogen” has enough energy to escape the planet’s gravity.
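Schematically, the dissociative recombination step can be written as follows (the escape-energy figure is an illustrative estimate, not a number from the paper, taking Venus’ escape velocity to be about 10.4 km/s):

```latex
% Dissociative recombination producing a translationally "hot" hydrogen atom H*
\mathrm{HCO^{+}} + e^{-} \;\longrightarrow\; \mathrm{CO} + \mathrm{H}^{*}

% Kinetic energy needed for a hydrogen atom to reach escape velocity:
E_{\mathrm{esc}} = \tfrac{1}{2}\, m_{\mathrm{H}}\, v_{\mathrm{esc}}^{2}
\approx \tfrac{1}{2}\,(1.67\times10^{-27}\,\mathrm{kg})\,(1.04\times10^{4}\,\mathrm{m\,s^{-1}})^{2}
\approx 0.6\ \mathrm{eV}
```

an energy scale readily supplied by the recombination, which is why the liberated hydrogen can leave the planet.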

In 2023, Chaffin and colleague Bethan Gregory found that HCO+ dissociative recombination is responsible for 5–50% of the water loss from Mars’ ionosphere.

However, when a team led by Chaffin, Gregory and Eryn Cangi applied the same mechanism to Venus’ atmosphere, the models showed that HCO+ dissociative recombination must be the dominant form of water loss on the planet, doubling the previously calculated rate of water loss.

Lack of data

The problem is, HCO+ has yet to be detected on Venus.

“I worried about this a lot when we were preparing our paper [describing our results],” Chaffin told Physics World. “Based on our modelling work, there should be a lot more HCO+ on Venus than we thought previously, but how much can we trust our model?”

Chaffin says that the uncertainty is related to NASA’s Pioneer Venus Orbiter, which has been the only space mission so far with an instrument capable of probing Venus’ ionosphere. Launched in 1978, Pioneer was not specifically designed to detect HCO+.

“This has been a gap in measurements of Venus,” says Chaffin, although he does say that by extrapolating from the Pioneer results, one can infer a water-loss rate that is “in the same ballpark” as that predicted by the HCO+ mechanism. Chaffin takes this as indirect confirmation of the model.

Chaffin’s confidence is backed up by Janet Luhmann at the Space Sciences Laboratory at the University of California, Berkeley. “So long as [Chaffin and colleagues’] assumptions are accurate, there is no reason I can see to dismiss this concept,” she says.

Water sources

If true, then the HCO+ dissociative recombination model somewhat changes the history of water on Venus. If Venus did have oceans, then some of its surviving water vapour will originate from those oceans, some will come from outgassing via volcanoes, and the remainder will have arrived through comet and asteroid impacts.

Because the HCO+ mechanism is so efficient, it compresses the time needed to lose a given amount of water to space. If Venus did once have oceans, this means they could have survived on the surface for longer than previously thought. If Venus never had oceans, then the rate of outgassing, the rate of impacts, or both must be higher than assumed in order to at least keep pace with the speed of water loss.

Unfortunately, forthcoming missions to Venus may not be able to confirm the presence of HCO+. Neither Europe’s Envision mission, planned to launch in 2031, nor NASA’s DAVINCI spacecraft that will blast off in the late 2020s, will study the ionosphere.

However, Sanjay Limaye of the University of Wisconsin, Madison points out that Russia has proposed an instrument for India’s planned Venus orbiter that may be able to detect HCO+ around 2031.

Luhmann, who specializes in studying the interaction between the solar wind and planetary atmospheres, thinks there might nevertheless be another way. The hot hydrogen escaping from the HCO+ dissociative recombination process is moving fast: Chaffin believes it will be too fast to detect, but Luhmann is not so ready to dismiss the possibility.

“In-situ measurements of hydrogen pickup ions may be useful if they are sensitive enough and can distinguish the characteristic initial energies of the escaping neutrals before they are ionized,” she says, pointing to previous work on Mars that has accomplished the same thing.

The research is described in Nature.


Next-generation quantum sensors detect human biomagnetism

Anna Kowalczyk, an assistant professor in the University of Birmingham’s Centre for Human Brain Health, has set out to develop new tools that could help neuroscientists do their work better.

“Our motivation is to use transcranial magnetic stimulation with a brain imaging technique called magnetoencephalography, and this is a very challenging task, because commercial sensors may not work well in this setup,” Kowalczyk says. “I wanted to start developing quantum sensors for brain imaging, and I basically started from scratch.”

Transcranial magnetic stimulation uses magnetic fields to stimulate neurons in the brain by placing a magnetic coil against the scalp and generating brief magnetic pulses that induce electrical currents in the brain. Magnetoencephalography measures the magnetic fields produced by this resultant activity and can be used to map brain activity and understand the relationships between brain regions and their functions.

Optically pumped magnetometers (OPMs) are emerging as preferred sensors for biomagnetism applications. Commercial OPMs used for magnetoencephalography are “zero-field sensors”, meaning that they operate at near-zero background field, where they can detect very small magnetic fields, often with femtotesla-level sensitivity. Many zero-field sensors are easily disturbed by environmental magnetic fields, however, and so require extensive shielding systems and noise-cancellation techniques.

Kowalczyk’s team, led by Harry Cook, a graduate student studying physics at the University of Birmingham, decided to take a different approach. “We decided to make use of slightly different physics to develop a sensor that can work in higher magnetic fields with great precision,” says Kowalczyk.

Their prototype sensor is an optically pumped magnetic gradiometer (OPMG) that operates by nonlinear magneto-optical rotation: linearly polarized light passes through a rubidium vapour, preparing the atoms in the vapour to be magnetically sensitive. When an external magnetic field changes, the frequency of atoms’ precession around the magnetic field changes, which is in turn detected with laser light. In other words, changes in the magnetic field are tracked by measuring changes in light properties.
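The precession being tracked is Larmor precession, whose frequency scales linearly with the field. As an illustrative sketch (the rubidium figure below is a standard textbook value, not a detail reported by the team):

```latex
% Larmor precession frequency of the atomic spins in field B
\nu_{L} = \frac{\gamma}{2\pi}\, B,
\qquad
\frac{\gamma}{2\pi} \approx 7\ \mathrm{Hz\,nT^{-1}}
\quad \text{for ground-state } {}^{87}\mathrm{Rb}\ (F = 2)
```

so measuring the precession frequency optically gives the magnetic field directly.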

With one “cell” of rubidium vapour, the researchers can measure an external magnetic field locally. With two cells, they can directly measure the difference, or gradient, in the magnetic field (i.e., their sensor becomes a gradiometer).
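Why the two-cell difference helps can be shown with a toy simulation (all numbers below are invented for illustration and do not come from the Birmingham experiment): a large background field common to both cells cancels in the difference, while a local signal at one cell survives.

```python
import numpy as np

# Toy model of a two-cell gradiometer rejecting common-mode background.
rng = np.random.default_rng(0)
t = np.linspace(0, 10, 5000)  # 10 s sampled at 500 Hz

# Background seen by BOTH cells: slow drift plus 50 Hz mains pickup (nT scale)
background = 5.0 * np.sin(2 * np.pi * 0.2 * t) + 0.5 * np.sin(2 * np.pi * 50 * t)

# Local signal at cell 1 only: ~1 Hz heartbeat-like pulses, 10 pT amplitude
heartbeat = 0.01 * np.exp(-((t % 1.0) - 0.5) ** 2 / 0.002)

# Independent sensor noise in each cell
noise1 = 0.0005 * rng.standard_normal(t.size)
noise2 = 0.0005 * rng.standard_normal(t.size)

b1 = background + heartbeat + noise1  # cell nearer the source
b2 = background + noise2              # reference cell

gradiometer = b1 - b2  # the common-mode background cancels in the difference

print(f"single-cell std: {np.std(b1):.2f} nT")
print(f"gradiometer residual std: {1e3 * np.std(gradiometer - heartbeat):.2f} pT")
```

The single-cell reading is dominated by the nanotesla-scale background, while the gradiometer output retains the picotesla-scale local signal.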

After designing and prototyping the sensor, the researchers tested its performance by characterizing an auditory evoked response in the human brain, a common benchmark for OPMs. Next, they set out to measure the field from a human heartbeat. During recording, the building lifts were running and the team recorded the resulting magnetic disturbance using commercial sensors.

“[In our experimental setup] we’re inside this metal box, basically, but the [external field] is still many times larger than a human heartbeat, let alone the human brain. So we wanted to see how well our sensor could measure the gradient from the heart while rejecting the unwanted magnetic field,” Cook explains. “It wasn’t a perfect setup, but we could clearly see that there’s a human heartbeat that’s well-characterized from a magnetic perspective…Our gradient method remained almost flat during this measurement, while the reference sensor showed all the variations.”

The researchers concluded that the OPMG could be explored further for biomagnetism applications and are pursuing two directions for research going forward: improving the current sensor; and developing sensors that could measure multiple types of neural activity simultaneously.

“We need to make the sensor more robust…and we need to make more sensors. The main motivation is to make it work with transcranial magnetic stimulation, and we also want to make hybridized quantum sensors that can measure two kinds of neural activities at the same time,” says Kowalczyk, referencing functional near infrared spectroscopy, another optical method for measuring brain activity. “Each method would bring different kinds of information. We’re not saying that our sensor is better than commercial sensors, it’s just different. Our aim is to develop technology that enables new approaches and new capabilities in neuroscience and beyond.”

Initial results from the OPMG prototype are published in Quantum Science and Technology.


Celebrating attosecond science, physics tournament focuses on fun

The 2023 Nobel Prize for Physics was shared by three scientists who pioneered the use of ultrashort, attosecond laser pulses for studying the behaviour of electrons in matter.

In this episode of the Physics World Weekly podcast, I chat with three people involved with the IOPP-ZJU International Symposium on Progress in Attosecond Science. The event will be held on 23 May at China’s Zhejiang University and can also be attended online via Zoom. It is organized by IOP Publishing (which brings you Physics World) and Zhejiang University.

Joining me in a lively discussion of attosecond science are Haiqing Lin of Zhejiang University, Caterina Vozzi of Italy’s Institute for Photonics and Nanotechnologies and David Gevaux of the IOPP journal Reports on Progress in Physics, which is supporting the symposium.

This week’s episode also features an interview with Anthony Quinlan, who was a two-time contestant in the PLANCKS international theoretical physics competition for students. He now helps organize the event, the finals of which will be held in Dublin next week.

Quinlan chats with Physics World’s Katherine Skipper about the competition, which involves teams of undergraduate and master’s students solving “fun” physics problems. Quinlan explains that contestants are encouraged to come up with creative solutions – which sometimes leads to unexpected paths to the correct answer.


Domain walls in twisted graphene make 1D superconductors

Domain walls in graphene form strictly one-dimensional (1D) systems that can become superconducting via the so-called proximity effect. This is the finding of a team led by scientists at the University of Manchester, UK, who uncovered the behaviour by isolating individual domain walls in graphene and studying the transport of electrons within them – something that had never been done before. The discovery has applications in metrology and in some types of quantum bits (qubits), though team member Julien Barrier, who is now a postdoctoral researcher at the Institute of Photonic Sciences (ICFO) in Barcelona, Spain, suggests it might also impact other fields.

“Such strict 1D systems are extremely rare,” Barrier says, “and could serve a number of potential applications.”

The researchers made their 1D system by stacking two layers of graphene (a sheet of carbon just one atom thick) atop each other. When they misalign the layers ever so slightly (less than 0.1°) with respect to each other, the material experiences a strain that makes the atoms in its lattice rearrange themselves into micrometre-scale domains of aligned bilayer graphene.
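For context, a standard small-angle estimate (not a figure from the paper) shows why such tiny twists give micrometre-scale structure: the moiré period λ of two identical lattices with lattice constant a, twisted by a small angle θ, is

```latex
% Small-angle moire period for twist angle \theta (in radians)
\lambda \;\approx\; \frac{a}{\theta}
% e.g. a = 0.246\,\mathrm{nm},\ \theta = 0.01^{\circ} \approx 1.7\times10^{-4}\,\mathrm{rad}
%      \;\Rightarrow\; \lambda \approx 1.4\,\mu\mathrm{m}
```

so for twists well below 0.1° the moiré period reaches the micrometre scale, consistent with the domain sizes described above.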

The narrow regions at the intersection between these domains are known as domain walls, and previous work by members of the same team showed that these walls are very good at conducting electricity. The boundaries between domains were also known (thanks to work by Vladimir Fal’ko and colleagues at Manchester’s National Graphene Institute) to contain special counterpropagating electronic channels that form by hybridizing “edge states” within the conducting domain walls.

These edge states are a consequence of the quantum Hall effect, which occurs when a current passing along the length of a thin conducting sheet gives rise to an extremely precise voltage across opposite surfaces of the sheet. This voltage only occurs when a strong magnetic field is applied perpendicular to the sheet, and it is quantized – that is, it can only change in discrete steps.
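For readers who want the standard result behind this quantization (a textbook relation, not spelled out in the article), the Hall resistance can only take the discrete values

```latex
R_{xy} = \frac{h}{\nu e^{2}}, \qquad \nu = 1, 2, 3, \dots
```

where $h$ is Planck’s constant, $e$ is the electron charge and $\nu$ is the integer filling factor – which is why the voltage changes only in discrete steps.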

One important property of edge states is that electrons in them are said to be “topologically protected” because they can only travel in one direction. They also steer around imperfections or defects in the material without backscattering. Since backscattering is the main energy-dissipating process in electronic devices, such protected states could be useful components in next-generation energy-efficient devices.

Achieving superconductivity in the quantum Hall regime

Over the years, much effort has therefore gone into trying to achieve superconductivity in the quantum Hall regime, as this would mean that the transport of Cooper pairs of electrons (that is, those that travel unhindered through a material to form supercurrents) is mediated by these 1D edge states.

To identify the domain walls in their bilayer graphene sample, the Manchester team turned to a technique called near-field photocurrent imaging developed by Krishna Kumar’s group at the ICFO. This technique provided the information the scientists needed to isolate the domain walls and place them between two superconductors.

They found that doing so not only induces robust superconductivity inside the walls, it also allows the walls to carry individual electronic modes. “This is proof of the 1D nature of this system,” Barrier says.

One unexpected finding is that the superconductivity stems from the proximity effect acting on the strictly 1D states existing within the domain walls. This means it does not arise from quantum Hall edge states on each side of the domain walls, as the researchers previously thought.

New approach overcomes previous limitations

Barrier and colleagues began their study using a conventional approach in which counterpropagating quantum Hall edge states were brought close together, but these experiments did not produce the results they expected. In previous studies, experimental progress in this direction was limited to observing oscillatory behaviour in the normal (non-superconducting) state, or more recently, to very small supercurrents (of less than 1 nA) at ultralow temperatures of less than 10 mK, Barrier explains.

The new approach, which the team describes in Nature, overcomes these limitations. Thanks to support on the theory side from Fal’ko’s group, the researchers realized that the strictly 1D electronic states they observed in the graphene domain walls hybridize much better with superconductivity than quantum Hall edge states. This realization enabled the researchers to measure supercurrents of a few tens of nA at temperatures of ~1 K.

Barrier says the new system could develop along numerous research directions. There is currently an intense interest in quasi-1D (multimode) proximity superconductivity using nanowires, quantum point contacts, quantum dots and other such structures. Indeed, electronic devices containing these structures are already on the market. The new system provides superconductivity via single-mode 1D states and could make such research redundant, he says.

“Our estimation is that electrons propagate in two opposite directions, less than a nanometre apart, without scattering,” he tells Physics World. “Such ballistic 1D systems are exceptionally rare and can be controlled by a gate voltage (like quantum point contacts) and exhibit standing waves (like superconducting quantum dots).”

The post Domain walls in twisted graphene make 1D superconductors appeared first on Physics World.

Researchers split on merits and pitfalls of AI in peer review, IOP Publishing survey finds

Researchers are divided over whether artificial intelligence (AI) is having a positive or negative impact on peer review. That is according to a new report from IOP Publishing, which looks at scientists’ perception and experiences of peer review. The study also finds that interest in a paper and the reputation of the journal remain the most important factors for researchers when considering whether to peer review an article.

Entitled State of Peer Review 2024, the report is based on a survey of more than 3000 researchers from over 100 countries. IOP Publishing carried out a similar survey in 2020 but researchers’ growing use of AI tools since then to write or augment peer-review reports has raised various ethical issues. In particular, there are questions over data protection, confidentiality and the accuracy of reviewer reports.

IOP Publishing, which publishes Physics World, currently does not allow the use of generative AI to “write or augment” peer-review reports or for AI tools to be named as authors on manuscripts. Instead, it encourages authors to be “open and transparent” about their use of such tools in their work. However, publishers do not yet have a way to accurately detect whether text has been generated by AI.

About 35% of respondents to the survey think that generative AI tools such as ChatGPT will harm peer review, while 36% say they will have no impact. Just 29% believe AI can benefit scholarly communication. When asked to expand on their responses, researchers admit that such tools can provide some “useful outputs”. However, they warn that expert human verification and editing is vital before AI-generated text can be used in peer review.

The study also looks at how much peer review researchers carry out, finding that 30% of reviewers from high-income countries say they receive too many peer-review requests, compared with just 10% from low- and middle-income countries. Moreover, 28% of senior researchers also say they get too many requests to peer review, compared with just 7% of PhD students and 9% of postdocs.

“Quality peer review is essential to the integrity and validity of science and relies on reviewers who are engaged, motivated and competent at providing constructive feedback,” says Laura Feetham-Walker, peer review engagement manager at IOP Publishing. “The insights we gain from this survey helps us to ensure we can continue to evolve the support we provide to the global reviewer community to help with their important work.”

The post Researchers split on merits and pitfalls of AI in peer review, IOP Publishing survey finds appeared first on Physics World.

Decimal time: life in a world where our days are divided differently

In an era where conspiracy theories run amok, this fictional thriller invites more than just the curiosity of its adolescent target audience. Written by the London-based author Sam Sedgman – a self-styled “nerd and enthusiastic ferroequinologist” – The Clockwork Conspiracy is a story about the potential impact of a new law to decimalize time in the UK. According to Miriam – one of the MPs in the novel – the law will “simplify the way we measure things [and] make the UK a leader in scientific research”.

The plan sounds innocuous enough: a day will be divided into 10 hours, each of 100 minutes, with each minute being 100 seconds. The proposed new law is not entirely in the realms of fiction – after all, decimal time was the legal standard in France during the French Revolution.
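To see what the scheme implies in practice, here is a short illustrative Python sketch (not from the book) that converts a conventional clock time into decimal time, assuming the day keeps its length and is simply re-divided into 10 × 100 × 100 = 100,000 units:

```python
def to_decimal_time(hours, minutes, seconds):
    """Convert a standard 24-hour clock time to decimal time
    (10 hours x 100 minutes x 100 seconds per day)."""
    # Fraction of the day elapsed (86 400 standard seconds per day)
    day_fraction = (hours * 3600 + minutes * 60 + seconds) / 86_400
    # A decimal day contains 100 000 decimal seconds
    total = day_fraction * 100_000
    d_hours, rem = divmod(total, 10_000)  # 10 000 decimal seconds per decimal hour
    d_minutes, d_seconds = divmod(rem, 100)
    return int(d_hours), int(d_minutes), int(d_seconds)

# Midday (12:00:00) falls exactly at 5:00:00 decimal time
print(to_decimal_time(12, 0, 0))  # → (5, 0, 0)
```

One consequence the sketch makes visible: a decimal second lasts 86,400/100,000 = 0.864 standard seconds, so every duration-based quantity would need rescaling too.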

Before the bill is debated in parliament, however, renowned horologist Diggory disappears from the belfry at Big Ben. His son Isaac begins a quest to find his father, assisted by newly acquainted partner in crime, the rebellious Hattie.

As well as learning about the personalities of the protagonists, readers gain knowledge of topics they might otherwise know little about. Diggory, for example, elucidates the mechanics of clocks, Isaac provides insight into the sciences, and Hattie sheds light on how parliament works as she helps them get to the bottom of Diggory’s disappearance.

Thanks to her penchant for exploration, we are privy to the top-secret meeting of the Timekeepers – the group responsible for monitoring and protecting time – held at none other than the Royal Observatory, home of Greenwich Mean Time and the Prime Meridian. It’s an exciting adventure as our protagonists navigate London to overcome the plot’s obstacles.

To bolster the plausibility of decimal time as a threat to our everyday life, Sedgman refers to various significant historical events, including the suffragette movement, the introduction of the decimal pound in 1971 and the Millennium Bug. Timekeeper Penny explains the threat of the new law to the children: “Time is like water…it can be misused”.

The novel portrays different perspectives surrounding the bill in a nicely balanced way. These include the views of the Timekeepers (who disapprove of the bill), the Machinists (who rebel against it with vandalism) and the parliamentarians (who mysteriously support it). It’s an approach that lets readers appreciate the different views and rationale behind them.

The Clockwork Conspiracy uses fun facts to propel the plot and teach readers about time and the history of timekeeping, without potentially dry information being clumsily shoehorned in. Amid the suspense, the novel highlights the universal importance of friendship, family and personal integrity. Despite being aimed at children, it offers insight into the mechanics of society at large. It’s therefore a book that can be enjoyed by readers of any age.

The post Decimal time: life in a world where our days are divided differently appeared first on Physics World.

Hope or hype: can upright treatment increase access to advanced radiotherapy?

Upright radiotherapy, in which a patient is treated in an upright position rather than lying down (supine), is an old technique that’s seen a resurgence of interest recently. At the ESTRO 2024 meeting, experts in the field discussed the potential benefits and ongoing challenges of upright radiotherapy – in a quest to discover whether the approach offers hope or is merely hype.

Up first, Ye Zhang from PSI took a look at some of the key driving forces for moving to upright radiotherapy. The ability to rotate and reposition the patient during upright treatments removes the need for a costly and bulky gantry. And eliminating the gantry reduces the space requirements for a particle therapy system by an estimated 80%, greatly lowering the costs of installing a treatment facility.

“Reducing the cost increases the accessibility of particle therapy to the population worldwide,” Zhang explained, citing a recent review paper that concluded “only 1% of radiotherapy patients currently receive particle therapy, however, at least 15% of patients could benefit”.

As well as offering economic advantages, removing the gantry provides more space at the isocentre, enabling the integration of advanced treatment methods such as proton arc therapy, combined proton–photon treatments and the use of MR guidance.

Upright positioning may also provide clinical advantages, by utilizing geometric variations between the target and the organs-at-risk to distribute dose. The idea here is that different organ positions, morphology and motion directly affect the dose distribution and may provide a means to optimize radiation delivery.

“The potential for upright treatment is significant,” Zhang concluded. “Not only to reduce the cost, which is important to push particle therapy worldwide to benefit more patients, but it also has potential clinical benefits which are under investigation, and there’s definitely a lot of potential to integrate future advanced radiotherapy concepts.”

Work in progress

While upright radiotherapy offers many potential benefits, there are still ongoing challenges to solve. “Upright radiotherapy is a true paradigm shift because it affects every step in the workflow: imaging, segmentation, treatment planning, quality assurance, treatment delivery and even follow up,” explained Lennart Volz from GSI. “All of these steps have to be translated into the upright position.”

Volz pointed out that upright treatment is not a new concept, with investigations into chair-based solutions dating back to the 1960s. “Yet, to the best of my knowledge, there are currently only six centres treating in an upright position, for indications other than ocular tumours,” he said. “On the other side, there are 110 gantries.”

One reason behind this slow uptake may be a previous lack of commercial upright imaging systems, as well as the absence of modern tools such as adaptive workflows and image guidance. Nearly all body sites will undergo anatomic changes between supine and upright postures, making upright planning-quality CT essential for clinical transfer. Upright MRI and PET systems may also be needed. Luckily, said Volz, several industry vendors are now developing cost-efficient upright imaging systems.

For image segmentation, one unknown is whether existing auto-contouring tools trained for supine treatments still work for upright geometries. Treatment planning and quality assurance (QA) procedures may also require modification, with new margin recipes and bespoke QA equipment, for example. It’s also important to define which upright posture (straight, leaning forward, half standing, for instance) is best suited for each treatment site. Finally, Volz emphasized that an adaptive workflow is mandatory.

“Upright radiotherapy has a lot of potential, but many challenges remain,” he concluded. “The most crucial challenge is that we need more data, more evidence to support the different decisions that have to be taken for upright radiotherapy. Luckily, there are many ongoing projects in the field to address these challenges.”

Early experience

Sophie Boisbouvier of Centre Léon Bérard examined upright treatment from the patient’s perspective, sharing her experience of using the upright positioner developed by Leo Cancer Care. In one example – a study in which nine breast cancer patients spent 40 min in the positioner – seven of the participants preferred the upright to the supine position.

In another study, 16 pelvic cancer patients were set up three times in the upright positioner for comparison with supine treatment. In the supine position, some patients reported pain and found it difficult to get set up or to get out; this was not the case for the upright position. “Globally, patients seem more comfortable and more satisfied in the upright position than the supine position, or at least as comfortable and as satisfied,” she said.

Patient setup for upright radiotherapy
Ease of use A study of 16 pelvic cancer patients demonstrated the rapid setup of patients in the upright positioner. (Courtesy: Sophie Boisbouvier)

For these 16 patients, initial positioning took 5 min (with two radiation therapy technologists), repositioning less than 3 min and uninstallation less than 1 min. The mean inter-fraction positioning shift was below 1 mm in all directions. And after 20 min on the chair and several rotations, the mean position shift was close to 0 mm.

“It seems feasible to set up a patient in a reasonable time frame, but we still need additional data and comparison with data on the supine position,” she said. “For reproducibility and stability, the first data are promising, however, we also need more data on various localizations and 3D images to know whether the organ position is reproducible and stable.”

At the Northwestern Medicine Proton Center in Chicago, there’s a fully clinical upright system that has already treated over 675 patients. Mark Pankuch explained that the four-room proton centre has one gantry, one fixed beam and two inclined beams that offer two treatment positions. When the gantry reached full capacity, “we wanted to go to upright treatments, because we needed more gantry-type treatments and only had one gantry system,” he said.

To achieve the move to upright, the group first needed an upright imaging system. They collaborated with system vendor P-Cure to install a wall-mounted CT scanner that lowers itself vertically over the patient. The next task was to design a positioning chair that was compatible with the treatment room’s existing robotic systems.

Due to limitations in the robot’s vertical movement, they ended up designing two chairs: one optimized for the thorax and the other for cranial treatment. The centre now treats about 80 patients per year, the majority in the cranial chair. To date, 57% of these have been ophthalmic treatments and 29% brain or spinal cord treatments.

“Lots of lessons were learned, but the one big takeaway is that we designed a scanner and a chair for an existing room for a particular need,” Pankuch explained. “If a treatment room was designed around an ideal scanner or chair, there’s much more utility, functionality and efficiency that can be gained for sure.”

Strategy and enthusiasm

Rounding off the symposium, Thomas Bortfeld from Massachusetts General Hospital described some advanced treatment concepts facilitated by upright radiotherapy. Advances in beam geometry could enable particle therapy to fit into a conventional treatment room of about 50 m², giving more patients access to this treatment.

Increasing delivery speed, meanwhile, allows more patients to be treated in a single clinical centre. Currently, the main speed-limiting factor is the switching between energy layers, which can account for over 70% of the delivery time. “Once we simplify our beamlines in the upright scenario, we can maybe get away with much simpler beamlines that will reduce the energy switching time to essentially zero,” Bortfeld said, adding that his colleague Konrad Nesteruk and international collaborators are currently investigating fixed-field alternating gradient techniques that remove the need to switch the magnet for different energy settings.

Image guidance remains a key challenge for upright radiotherapy. Bortfeld’s group is currently studying ultralow-field MR guidance using a mobile 64 mT scanner, which they are about to test in the proton environment. They are also testing 6.5 mT MRI for breast imaging, finding that even this extremely low field provides decent image quality.

As for the symposium’s theme of hope versus hype, Bortfeld thinks there is a little bit of both. “Hype is not what we want, but it’s good to see a lot of interest. There certainly is also hope, no question of that,” he concluded. “What we really need is not hype but enthusiasm, and many of us have enthusiasm in this field. We also need, maybe not only hope, but to define a strategy. Strategy and enthusiasm is where we want to be.”

The post Hope or hype: can upright treatment increase access to advanced radiotherapy? appeared first on Physics World.

Pump–probe microscopy reveals how historical paintings fade

Munch's Despair
Mellowed yellows Despair is an example of how Edvard Munch used yellows in his paintings. (Edvard Munch 1894. File courtesy: Munchmuseet)

New insights into how a yellow pigment widely used in historical artwork fades over time have been gained by researchers in the US. Using an imaging technique that they had developed to detect skin cancer, Martin Fischer and colleagues at Duke University showed how signs of paint degradation can appear on a microscopic scale, before they are visible to the eye.

As they painted their masterpieces, artists throughout history knew that the colours they used would fade over time. Recently, however, analytical techniques have begun to provide insights into the properties of microscopic grains of pigment and why they fade. This allows us to imagine artworks as they looked when they were first painted, and informs how to conserve and restore paintings.

“Understanding the reason for pigment degradation is extremely important to halt damage that has occurred, prevent damage that has not yet happened, and to get an idea how a degraded masterpiece might have looked originally,” Fischer explains.

One of the most challenging aspects of this task is the deep complexity hidden beneath the surface of a painting. In many paintings, numerous pigments have been mixed together and layered on top of each other, making them difficult to analyse without damaging the artwork.

In their study, Fischer and colleagues overcame this challenge using a method called pump–probe microscopy, which uses pairs of synchronized femtosecond laser pulses. The two different pulsed laser beams are superimposed and then focused onto the sample being imaged, with a controlled delay between the arrival of the pump and probe pulses. The pump pulse comes first and creates excitations within the sample. Then the probe pulse interacts with the sample such that the reflected light contains information about specific excitations and therefore the chemical composition of the sample.

Powerful technique

Pump–probe microscopy has become a powerful technique for generating high-contrast images of non-fluorescent samples, especially in living tissues. Indeed, Fischer’s team had already adapted it to examine moles for signs of skin cancer. Now, the group has used the technique to examine the degradation of pigments hidden within complex layers of paint. They focused on pigments containing cadmium sulphide (CdS), which are bright yellow and have played an important role in the history of art.

“CdS was popular with artists like Edvard Munch, Henri Matisse, and Pablo Picasso, but is also very prone to degradation,” Fischer explains. “Despite the importance of CdS, the influence of the environmental conditions and manufacturing methods on the degradation process is not very well understood.”

To investigate the effect, the team started by synthesizing a CdS pigment using a historical method often employed by artists of the past. They then accelerated the aging process by exposing their pigment to high levels of light and humidity. This quickly degraded the CdS grains into hydrated cadmium sulphate, causing the yellow colour to fade.

Variable breakdown

During the degradation process, the researchers used pump–probe microscopy to monitor changes to individual CdS grains. Their experiment revealed that the breakdown process can vary widely, depending on the size and shape of the grains.

“We discovered that degradation tends to happen more strongly for small, rough CdS crystals that are closer to the surface,” Fischer explains. In contrast, “degradation in larger crystals tends to start from the outside in, and from top to bottom.”

The experiment also showed that signs of degradation can start to appear well before they are visible even to the sharp eyes of art conservators. “Having such an early warning signal could be very helpful to adjust storage or display conditions for the artwork or to indicate the need for early intervention,” Fischer adds.

Based on their success, Fischer and colleagues now hope that pump–probe microscopy will help them to gain a better understanding of the degradation processes that cause fading in other types of pigment. In turn, their work could enable conservators to develop new and improved techniques, helping them to protect priceless historical artwork for years to come.

The research is described in JPhys Photonics.

The post Pump–probe microscopy reveals how historical paintings fade appeared first on Physics World.

Effective Science Communication (3rd edition) with Sam Illingworth

Whatever career stage you are at, join us to learn more about communicating science to your colleagues and external partners.

We will seek to empower scientists to effectively share their work, enhancing the impact of their research and facilitating a deeper public understanding of science. Our goal is to bridge the gap between the scientific community and society at large, promoting science as a cornerstone of informed citizenship and social progress.

Sam Illingworth

Sam Illingworth is an associate professor at Edinburgh Napier University, where his work and research focus on using poetry and games as a way of developing dialogues between scientists and other publics. He is also an award-winning science communicator, poet, games designer, Principal Fellow of Advance HE (PFHEA), chief executive editor of Geoscience Communication and the founder of Consilience, the world’s first peer-reviewed science and poetry journal.


About this ebook

Effective Science Communication: A practical guide to surviving as a scientist (3rd edition) is an essential handbook tailored for scientists at any stage of their career, aiming to enhance both their inward-facing and outward-facing communication skills. The book is structured into detailed chapters, each focusing on different aspects of science communication, from publishing in peer-reviewed journals to engaging with the media. It offers a blend of theoretical insights and practical exercises designed to build confidence and competence in communicating scientific research. This guide seeks to empower scientists to effectively share their work, enhancing the impact of their research and facilitating a deeper public understanding of science. Through this, it aims to bridge the gap between the scientific community and society at large, promoting science as a cornerstone of informed citizenship and social progress.

Authors Sam Illingworth and Grant Allen

The post Effective Science Communication (3rd edition) with Sam Illingworth appeared first on Physics World.

Sucking up crude oil with laser-treated cork

New research suggests that laser-treated cork could be used to tackle crude oil spills. In a study published in Applied Physics Letters, researchers from China and Israel found that femtosecond laser processing alters the surface structure of cork so that it heats rapidly in sunlight and absorbs oil.

Oil spills are ecological disasters that wreak havoc on marine ecosystems, with devastating, long-lasting effects on marine animals and their habitats. Oil spill cleanup also presents a major technical challenge.

There’s a lack of effective strategies for clearing water contaminated with high-viscosity oil. Various techniques are employed – physically skimming the crude oil off the surface, applying chemical dispersants, even setting the oil alight – but these are often expensive, have low efficacy and can cause secondary pollution. Alternative, environmentally friendly solutions are needed.

The authors of the latest research stumbled upon cork as a possible material for cleaning up oil spills by accident.

“In a different laser experiment, we accidentally found that the wettability of the cork processed using a laser changed significantly, gaining superhydrophobic (water-repelling) and superoleophilic (oil-attracting) properties,” says first author Yuchun He, a physicist at Central South University, China, in a press statement.

Following this discovery, the researchers wondered whether, as a porous material, cork could be used to suck up oil. They also found that after femtosecond laser processing, the surface of the cork became very dark, “which made us realize that it might be an excellent material for photothermal conversion,” He explains.

By absorbing energy from sunlight and heating the crude oil, the scientists hypothesized that the black cork would lower the oil’s viscosity, making it easier to absorb.

To characterize the properties of the femtosecond laser-processed cork, the team used scanning electron and confocal laser scanning microscopy. Their analysis showed that while untreated cork had an uneven honeycomb-like surface structure, the laser-processed cork was covered in a regular, striped arrangement of micro-scale grooves, scattered with nanoparticles.

The microscopy studies also showed that the laser-processed cork contained a higher proportion of carbon than untreated cork. This carbonization process occurs when heat from the laser breaks down long-chain cellulose molecules and releases carbon oxides as gas, reducing the cork’s relative oxygen content.

Laser-processed cork
Cork modification Femtosecond laser (FSLA) treatment alters the structure of the cork, which changes the behaviour of light incident upon its surface. (Courtesy: Yuchun He)

Tests showed that the laser-treated cork had an extremely high solar absorption rate, absorbing more than 90% of light and reflecting less than 10%. Under simulated sunlight, the processed cork hit temperatures of 63°C after 120 s, while untreated cork rose to only 50°C. The laser-treated cork was also found to heat rapidly, reaching 50°C within 15–30 s.

When the researchers tested how well cork absorbed oil, they found that in the dark, the performance of laser-processed and untreated cork was similar, with crude oil gradually infiltrating over 45 min. Under light exposure, however, laser-treated cork became saturated with oil within 2 min, while complete infiltration of unprocessed cork took 10 min.

The team says that the nanoscale grooves and increased surface area of the carbonized, laser-treated cork trap light. This causes the cork to heat rapidly in sunlight, warming nearby oil and reducing its high viscosity. The processed cork’s superoleophilic properties, linked to the changed surface structure and carbonization, then come into play, enabling the cork to suck up the more fluid oil while repelling water.

Cork comes from the bark of cork oak trees. As it is a renewable material that the trees can replace after harvesting, the researchers say it would be a sustainable, inexpensive and environmentally friendly material to use for cleaning up oil spills.

They propose that a pump connected to an oil tanker could be used to suck the crude oil out of cork as it is absorbed from the seawater. In laboratory experiments, the team demonstrated that a small-scale version of this setup could successfully extract oil from seawater in a Petri dish.

“Oil recovery is a complex and systematic task, and participating in oil recovery throughout its entire life cycle is our goal,” He says. “The next step is to prepare electrothermal materials using polyurethane foam as the skeleton for oil adsorption, combining photothermal and electrothermal techniques to form an all-weather oil recovery system.”

The post Sucking up crude oil with laser-treated cork appeared first on Physics World.

Implantable and biocompatible battery powered by the body’s own oxygen

When a medical device such as a pacemaker or neurostimulator is implanted within a person’s body, the immediate question is how long its battery will function before requiring surgical removal and replacement.

Researchers in China have developed an implantable Na–O2 battery with an open cathode structure that runs on oxygen circulating in the body, potentially removing the limit on battery life. In tests on laboratory rats, the team showed that the proof-of-concept design delivers stable power and has excellent biocompatibility.

Metal–O2 batteries have been previously tested for potential implantable use, but have encountered challenges. Their design requires an open cathode architecture to absorb oxygen from body fluids, and any discharge products created must be easily metabolized by the body during battery cycling. In addition, all components must be biocompatible and the battery must be flexible enough to enable stable contact with soft tissues.

The novel battery consists of a nanoporous gold catalytic cathode, an ion-selective membrane that acts as the separator and an anode made from a sodium-based alloy (NaGaSn). Nanoporous gold, which demonstrates excellent biocompatibility, has previously been used as a cathode in metal–air batteries to provide the oxygen reduction reaction. In the Na–O2 battery, oxygen continuously supplied from body fluids is reduced through the catalysis of the nanoporous gold during discharging.
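For reference, the discharge chemistry implied here is consistent with the textbook half-reactions for oxygen reduction in an aqueous sodium cell (a standard sketch, not equations taken from the paper):

```latex
\text{Anode:}\quad \mathrm{Na \rightarrow Na^{+} + e^{-}}
\qquad
\text{Cathode:}\quad \mathrm{O_{2} + 2H_{2}O + 4e^{-} \rightarrow 4OH^{-}}
```

with partial two-electron reduction, $\mathrm{O_{2} + 2H_{2}O + 2e^{-} \rightarrow H_{2}O_{2} + 2OH^{-}}$, accounting for trace hydrogen peroxide. These products – sodium ions, hydroxide ions and small amounts of hydrogen peroxide – match the metabolizable byproducts the researchers describe.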

The ion-selective membrane prevents body fluids from reaching the anode, though the team note that the NaGaSn alloy electrode possesses high safety and stability in water. The entire battery is encased within a soft and flexible porous polymer film.

Principal co-designers Yang Lv, Xizheng Liu and Jiucong Liu, of Tianjin University of Technology, initially conducted in vitro experiments, after which they implanted the battery under the skin on the backs of laboratory rats. After 24 h, they observed an unstable discharge voltage plateau. However, after two weeks of implantation, the battery was able to produce stable voltages of between 1.3 and 1.4 V, with a maximum power density of 2.6 µW/cm².
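As a rough, back-of-the-envelope cross-check of the reported figures (an illustration only, not the authors' analysis), dividing the peak power density by the discharge voltage gives the implied current density:

```python
# Back-of-the-envelope check on the reported Na-O2 battery figures.
# Values are taken from the text; this is illustrative arithmetic only.

def current_density(power_density_uW_cm2: float, voltage_V: float) -> float:
    """Return current density in microamps per cm^2, using I = P / V."""
    return power_density_uW_cm2 / voltage_V

# Peak power density of 2.6 uW/cm^2 at a ~1.3 V plateau implies ~2 uA/cm^2
print(current_density(2.6, 1.3))
```

A current density of roughly 2 µA/cm² puts in context why the authors say the proof-of-concept cannot yet power medical devices for human use.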

“We were puzzled by the unstable electricity output right after implantation,” explains Xizheng Liu in a press statement. “It turned out that we had to give the wound time to heal, for blood vessels to regenerate around the battery and supply oxygen, before the battery could provide stable electricity. This is a surprising and interesting finding because it means that the battery can help monitor wound healing.”

The rats healed well after battery implantation, with the hair on their backs completely regrown after four weeks. Importantly, blood vessels regenerated well around the cathode, providing a continuous source of oxygen. Quantitative analysis confirmed that the capillary density around the battery matched that seen in control animals without implants. In fact, the number of capillaries gradually increased with prolonged implantation time.

The researchers assessed the biocompatibility of the implanted battery through biochemical and immunohistochemical analyses. None of the rats developed inflammation around the batteries. Byproducts created by the chemical reactions of the battery, including sodium ions, hydroxide ions and low levels of hydrogen peroxide, were easily metabolized in the kidneys and liver. At the end of the four-week study, the rats showed no negative physiological effects, suggesting that the implanted battery has potential for practical applications.

While the energy generated by the proof-of-concept battery is not sufficient to power medical devices for human use, the results demonstrate that harnessing oxygen in the body for energy is possible. Xizheng Liu advises that the team’s next plan is to improve the battery’s energy delivery by exploring more efficient electrode materials and optimizing the battery structure and design. “We think that the battery will be easy to scale up in production, and choosing cost-effective materials will further lower the cost to produce.”

The researchers note that, in addition to having a novel architecture with the ability to generate extremely high energy densities, the battery’s oxygen concentration can be controlled precisely. This capability may expand its use to therapeutic applications, such as starving cancerous tumours of oxygen or converting the battery energy to heat to destroy cancer cells. Future research initiatives will include evaluating other uses for this promising implantable battery.

The research is reported in Chem.

The post Implantable and biocompatible battery powered by the body’s own oxygen appeared first on Physics World.

A career in physics: a universe of possibilities

Demand for physics skills and knowledge is growing. New opportunities can be found in emerging fields such as data science and AI. At the same time, physics graduates are increasingly sought after in established sectors like construction, business and innovation. Learn how to navigate the current jobs market with Physics World Careers 2024, a free-to-read guide packed with tips, interviews and case studies.

The post A career in physics: a universe of possibilities appeared first on Physics World.

Synthetic diamonds grow in liquid metal at ambient pressure

The usual way of manufacturing synthetic diamonds involves applying huge pressures to carbon at high temperatures. Now, however, researchers at the Institute for Basic Science (IBS) in Korea have shown that while high temperatures are still a prerequisite, it is possible to make polycrystalline diamond film at standard pressures. The new technique could revolutionize diamond manufacturing, they say.

Natural diamonds form over billions of years in the Earth’s upper mantle at temperatures of between 900 and 1400 °C and pressures of 5–6 gigapascals (GPa). The manufacturing processes used to make most synthetic diamonds mimic these conditions. In the 1950s, for example, scientists at General Electric in the US developed a way to synthesize diamonds in the laboratory using molten iron sulphide at around 7 GPa and 1600 °C. Although other researchers have since refined this technique (and developed an alternative known as chemical vapour deposition for making high-quality diamonds), diamond manufacturing largely still depends on liquid metals at high pressures and temperatures (HPHT).

A team led by Rodney Ruoff has now turned this convention on its head by making a polycrystalline diamond film using liquid metal at just 1 atmosphere of pressure and 1025 °C. When Ruoff and colleagues exposed a liquid alloy of gallium, iron, silicon and nickel to a mix of methane and hydrogen, they observed diamond growing in the subsurface of the liquid metal. The team attribute this effect to the catalytic activation of methane and the diffusion of carbon atoms in the subsurface region.
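To get a feel for how large the gap between the two regimes is, a quick (purely illustrative) calculation compares a mid-range HPHT pressure with the 1 atmosphere used here:

```python
# Illustrative comparison of diamond-growth pressures (values from the text).
ATM_PA = 101_325          # 1 standard atmosphere in pascals
HPHT_PA = 5.5e9           # mid-range HPHT pressure, ~5.5 GPa

ratio = HPHT_PA / ATM_PA
print(f"Conventional HPHT growth uses roughly {ratio:,.0f}x ambient pressure")
```

The factor of several tens of thousands underlines why ambient-pressure growth in liquid metal is such a departure from convention.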

No seed particles

Unusually, the first diamond crystals in the IBS experiment began to form (or nucleate) without seed particles, which are prerequisites for conventional HPHT and chemical vapour deposition techniques. The individual crystals later merged into a film that is easy to detach and transfer to other substrates.

To investigate the nucleation process further, the team used high-resolution transmission electron microscopy (TEM) to capture cross-sections of the diamond film. These TEM images showed that carbon builds up in the liquid metal subsurface until it reaches supersaturated levels. This, according to the researchers, is likely what leads to the nucleation and growth of the diamonds.

Separately, synchrotron X-ray diffraction measurements revealed that although the diamond formed via this method was highly pure, it contained some silicon atoms situated between two unoccupied sites in the diamond lattice of carbon atoms. The researchers say that these silicon-vacancy colour centres, as they are known, could have applications in magnetic sensing and quantum computing, where similar defects known as nitrogen-vacancy centres are already an active topic of research. The presence of silicon also appears to play an important role in stabilizing the tetravalently-bonded carbon clusters responsible for nucleation, they add.

Scaling up

The researchers, who report their work in Nature, are now trying to pin down when the nucleation of the diamond begins. They also plan “temperature drop” experiments in which they will first supersaturate the liquid metal with carbon and then rapidly lower the temperature in the experimental chamber to trigger diamond nucleation.

Another future research direction might involve studying alternative metal liquid alloy compositions. “Our optimized growth was achieved using the gallium/nickel/iron/silicon liquid alloy,” explains team member Da Luo. “However, we also found that high-quality diamond can be grown by substituting nickel with cobalt or by replacing gallium with a gallium-indium mixture.”

Ruoff adds that the team might also try carbon precursors other than methane, noting that various gases and solid carbons could yield different results. Overall, the discovery of diamond nucleation and growth in this liquid is “fascinating”, he says, and it offers many exciting opportunities for basic science and for scaling up the growth of synthetic diamonds in new ways. “New designs and methods for introducing carbon atoms and/or small carbon clusters into liquid metals for diamond growth will surely be important,” he concludes.

The post Synthetic diamonds grow in liquid metal at ambient pressure appeared first on Physics World.

Grounds for celebration as ‘hub of all things coffee’ opens at University of California, Davis

Physicists are well-known for their interest in coffee, not only drinking it but also studying the fascinating science behind an espresso.

Now researchers at the University of California, Davis (UC Davis), have taken it to a whole new level by forming a research institute dedicated to the science of the perfect brew.

The Coffee Center will be used by more than 50 researchers and includes labs dedicated to brewing, “sensory and cupping” and the chemical analysis of coffee.

The centre has its origins in a 2013 course on “the design of coffee” by UC Davis chemical engineers William Ristenpart and Tonya Kuhl.

Two years later a coffee lab was established at the university, and in 2022 construction began on the Coffee Center, which was funded with $6m from private donors.

The official opening on 3 May was attended by over 200 people, who were treated to bean roasting and espresso brewing demonstrations.

“Think of this center as a hub of all things coffee,” noted UC Davis chancellor Gary May at the opening. “Together, we bring rigorous coffee science and cutting-edge technology to the world stage.”

Better latte than never.

The post Grounds for celebration as ‘hub of all things coffee’ opens at University of California, Davis appeared first on Physics World.

The future of 2D materials: grand challenges and opportunities


Graphene, the first 2D material, was isolated by Prof. Andre Geim and Prof. Konstantin Novoselov in 2004. Since then, a variety of 2D materials have been discovered, including transition metal dichalcogenides, phosphorene and MXenes. 2D materials have remarkable characteristics and are making significant contributions towards quantum technologies, electronics, medicine, and renewable energy generation and storage, to name but a few fields. However, we are still exploring the full potential of 2D materials, and many challenges must be overcome.

Join us for this panel discussion, hosted by 2D Materials, where leading experts will share their insights and perspectives on the current status, challenges and future directions of 2D materials research. You will have the opportunity to ask questions during the Q&A session.

Have a question for the panel?

We welcome questions in advance of the webinar, so please fill in this form.

Left to right: Stephan Roche, Konstantin Novoselov, Joan Redwing, Yury Gogotsi and Cecilia Mattevi

Chair
Prof. Stephan Roche is an ICREA Research Professor and head of the Theoretical & Computational Nanoscience Group at the Catalan Institute of Nanoscience and Nanotechnology (ICN2). He is a theoretician specializing in quantum transport theory in condensed matter, spin transport physics and device simulation.

Speakers
Prof. Konstantin Novoselov is the Langworthy Professor of Physics and Royal Society Research Professor at The University of Manchester. In 2004, he isolated graphene alongside Andre Geim and was awarded the Nobel Prize in Physics in 2010 for his achievements.

Prof. Joan Redwing is a Distinguished Professor of Materials Science and Engineering at Penn State University where she holds an adjunct appointment in the Department of Electrical and Computer Engineering. Her research focuses on crystal growth and epitaxy of electronic materials, with an emphasis on thin film and nanomaterial synthesis by metalorganic chemical vapour deposition.

Prof. Yury Gogotsi is a Distinguished University Professor and Charles T and Ruth M Bach Endowed Chair in the Department of Materials Science and Engineering at Drexel University. He is the founding director of the A.J. Drexel Nanomaterials Institute.

Prof. Cecilia Mattevi is a Professor of Materials Science in the Department of Materials at Imperial College London. Cecilia’s expertise centres on the science and engineering of novel 2D atomically thin materials to enable applications in energy conversion and energy storage.

About this journal

2D Materials is a multidisciplinary, electronic-only journal devoted to publishing fundamental and applied research of the highest quality and impact covering all aspects of graphene and related two-dimensional materials.

Editor-in-chief: Wencai Ren Shenyang National Laboratory for Materials Science, Chinese Academy of Sciences, China.

The post The future of 2D materials: grand challenges and opportunities appeared first on Physics World.

Data science CDT puts industry collaboration at its heart

Physics is a constantly evolving field – how do we make sure the next generation of physicists receive training that keeps pace with new developments and continues to support the cutting edge of research?

According to Carsten P Welsch, a distinguished accelerator scientist at the University of Liverpool, in the age of machine learning and AI, PhD students in different physics disciplines have more in common than they might think.

“Research is increasingly data-intensive, so while a particle physicist and a medical physicist might spend their days thinking about very different concepts, the approaches, the algorithms, even the tools that people use, are often either the same or very similar,” says Professor Welsch.


Welsch is the director of the Liverpool Centre for Doctoral Training (CDT) for Innovation in Data Intensive Science (LIV.INNO). Founded in 2022, the CDT is currently recruiting its third cohort of PhD students. Current students are undertaking research that spans medical, environmental, particle and nuclear physics, but their projects are all underpinned by data science. According to Professor Welsch, “Data science is extremely important for any type of research and will probably outlive any particular research field.”

Next-generation PhD training

Carsten Welsch has a keen interest in improving postgraduate education: he was chair of STFC’s Education, Training and Careers Committee and a member of the UKRI Skills Advisory Group. When it comes to the future of doctoral training, he says: “The big question is ‘where do we want UK researchers to be in a few years, across all of the different research areas?’”

He believes that LIV.INNO holds the solution. The CDT aims to give students with data-intensive PhD projects the skills that will enable them to succeed not only in their research but throughout their careers.

Lauryn Eley is a PhD student in the first LIV.INNO cohort who is researching medical imaging. She became interested in this topic during her undergraduate studies because it applied what she had learned in university to real-world situations. “It’s important that I can see the benefits of my work translated into everyday experiences, which I think medical imaging does quite nicely,” she says.

Miss Eley’s project is partnered with medical technology company Adaptix. The company has developed a mobile X-ray device which, it hopes, will enable doctors to produce a high-quality 3D X-ray image more cheaply and easily than with a traditional CT scanner.

Her task is to build a computational model of the X-ray device and investigate how to optimize the images it produces. To generate high-quality results she must simulate millions of X-rays. She says that the data science training she received at the start of the PhD has been invaluable.
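The kind of photon-transport Monte Carlo simulation described here can be illustrated with a minimal sketch. This is a toy model under assumed values (a made-up attenuation coefficient and slab thickness), not the actual Adaptix device model:

```python
# Toy Monte Carlo: fraction of X-ray photons passing through a uniform slab
# without interacting. Real medical-imaging simulations track scattering,
# energy spectra and detector response; this only illustrates the principle.
import math
import random

def transmitted_fraction(mu_per_cm: float, thickness_cm: float,
                         n_photons: int = 100_000, seed: int = 0) -> float:
    """Estimate the uncollided transmission through a slab.

    Each photon's free path is sampled from an exponential distribution
    with rate mu (the linear attenuation coefficient); photons whose path
    exceeds the slab thickness emerge without interacting.
    """
    rng = random.Random(seed)
    survived = 0
    for _ in range(n_photons):
        path = rng.expovariate(mu_per_cm)
        if path > thickness_cm:
            survived += 1
    return survived / n_photons

# Hypothetical values: mu = 0.5 /cm through 2 cm of material.
mc = transmitted_fraction(0.5, 2.0)
analytic = math.exp(-0.5 * 2.0)  # Beer-Lambert prediction
print(f"Monte Carlo: {mc:.3f}, analytic: {analytic:.3f}")
```

With enough photons the Monte Carlo estimate converges on the Beer–Lambert value, which is why such simulations need the millions of sampled X-rays, and the high-performance computing, mentioned above.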

From their first year, students attend lectures on data science topics which cover Monte Carlo simulation, high-performance computing, machine learning and AI, and data analysis. Lauryn Eley has an experimental background, and she says that the lectures enabled her to get to grips with the C++ she needed for her research.

Boosting careers with industry placements

Professor Welsch says that from the start, industry partnership has been at the centre of the LIV.INNO CDT. Students spend six months of their PhD on an industrial placement, and Lauryn Eley says that her work with Adaptix has been eye-opening, enabling her to experience first-hand the fast-paced, goal-driven world of industry, which she found very different to academic research.

While the CDT may particularly appeal to those keen on pursuing a career in industry, Professor Welsch emphasizes the importance of students delivering high-quality research. Indeed, he believes that LIV.INNO’s approach provides students with the best chance of success in their academic endeavours. Students are taught to use project management skills to plan and deliver their projects, which he says puts them “in the driving seat” as researchers. They are also empowered to take initiative, working in partnership with their supervisors rather than waiting for external guidance.

LIV.INNO builds on a previous programme called the Liverpool Big Data Science Centre for Doctoral Training, which ran between 2017 and 2024. Professor Welsch was also the director of that CDT, and he has noticed that when it comes to partnering with student projects, industry attitudes have undergone a shift.

“When we approached the companies for the first time, you could definitely see that there was a lot of scepticism,” he says. With case studies from that first CDT in hand, however, it became much easier to attract industry partners to LIV.INNO. Professor Welsch thinks that this demonstrates the benefits that industry-academia partnerships bring to both students and companies.

The first cohort from LIV.INNO are only in their second year, but many of the students from the previous CDT secured full-time jobs with the companies where they did their placements. Whatever career path students eventually go down, Carsten Welsch is convinced that the cross-sector experience they get with LIV.INNO sets them up for success, saying: “They can make a much better informed decision about where they would like to continue their careers.”


The post Data science CDT puts industry collaboration at its heart appeared first on Physics World.

GMT or TMT? Fate of next-generation telescope falls to expert panel set up by US National Science Foundation

The US National Science Foundation (NSF) is to assemble a panel to help it decide whether to fund the Giant Magellan Telescope (GMT) or the Thirty Meter Telescope (TMT). The agency expects the panel, whose membership has yet to be determined, to report by 30 September, the end of the US government’s financial year.

The NSF first announced in February that it would support the construction of only one of the two next-generation ground-based telescopes due to rising costs. The GMT, priced at $2.54bn, will be located in Chile, while the TMT, which is expected to cost at least $3bn, is set to be built in Hawaii.

A decision on which telescope to fund was initially slated for May. But at a meeting of the National Science Board (NSB) last week, NSF boss Sethuraman Panchanathan revealed the panel would provide further advice to the agency. The decision to look to outsiders followed discussions with the US government and the NSB, which oversees the NSF.

The panel, which will include scientists and engineers, will assess “the readiness of the project from all perspectives” and consider how supporting each telescope would affect the NSF’s overall budget.

It will examine progress made to date, the level of partnerships and resources, and risk management. Complementarity to the European Extremely Large Telescope, opportunities for early-career scientists, and public engagement will be looked at too.

“I want to be very clear that this is not a decision to construct any telescopes,” Panchanathan, who originally trained as a physicist, told the NSB. “This is simply part of a process of gathering critical information to inform my decision-making on advancing either project to the final design stage.”

The post GMT or TMT? Fate of next-generation telescope falls to expert panel set up by US National Science Foundation appeared first on Physics World.
