
The future of 2D materials: grand challenges and opportunities

By: No Author
10 May 2024 at 16:41
Source: Shutterstock, Marco de Benedictis

Graphene, the first 2D material, was isolated by Prof. Andre Geim and Prof. Konstantin Novoselov in 2004. Since then, a variety of 2D materials have been discovered, including transition metal dichalcogenides, phosphorene and MXenes. 2D materials have remarkable characteristics and are making significant contributions towards quantum technologies, electronics, medicine, and renewable energy generation and storage, to name but a few fields. However, we are still exploring the full potential of 2D materials, and many challenges must be overcome.

Join us for this panel discussion, hosted by 2D Materials, where leading experts will share their insights and perspectives on the current status, challenges and future directions of 2D materials research. You will have the opportunity to ask questions during the Q&A session.

Have a question for the panel?

We welcome questions in advance of the webinar, so please fill in this form.

Left to right: Stephan Roche, Konstantin Novoselov, Joan Redwing, Yury Gogotsi and Cecilia Mattevi

Chair
Prof. Stephan Roche is an ICREA Research Professor and head of the Theoretical & Computational Nanoscience Group at the Catalan Institute of Nanoscience and Nanotechnology (ICN2). He is a theorist specializing in quantum transport theory in condensed matter, spin transport physics and device simulation.

Speakers
Prof. Konstantin Novoselov is the Langworthy Professor of Physics and Royal Society Research Professor at The University of Manchester. In 2004, he isolated graphene alongside Andre Geim and was awarded the Nobel Prize in Physics in 2010 for his achievements.

Prof. Joan Redwing is a Distinguished Professor of Materials Science and Engineering at Penn State University where she holds an adjunct appointment in the Department of Electrical and Computer Engineering. Her research focuses on crystal growth and epitaxy of electronic materials, with an emphasis on thin film and nanomaterial synthesis by metalorganic chemical vapour deposition.

Prof. Yury Gogotsi is a Distinguished University Professor and Charles T and Ruth M Bach Endowed Chair in the Department of Materials Science and Engineering at Drexel University. He is the founding director of the A.J. Drexel Nanomaterials Institute.

Prof. Cecilia Mattevi is a Professor of Materials Science in the Department of Materials at Imperial College London. Cecilia’s expertise centres on the science and engineering of novel 2D atomically thin materials to enable applications in energy conversion and energy storage.

About this journal

2D Materials is a multidisciplinary, electronic-only journal devoted to publishing fundamental and applied research of the highest quality and impact covering all aspects of graphene and related two-dimensional materials.

Editor-in-chief: Wencai Ren, Shenyang National Laboratory for Materials Science, Chinese Academy of Sciences, China.


Data science CDT puts industry collaboration at its heart

By: No Author
10 May 2024 at 15:01

Physics is a constantly evolving field – how do we make sure the next generation of physicists receive training that keeps pace with new developments and continues to support the cutting edge of research?

According to Carsten P Welsch, a distinguished accelerator scientist at the University of Liverpool, in the age of machine learning and AI, PhD students in different physics disciplines have more in common than they might think.

“Research is increasingly data-intensive, so while a particle physicist and a medical physicist might spend their days thinking about very different concepts, the approaches, the algorithms, even the tools that people use, are often either the same or very similar,” says Professor Welsch.

Welsch is the director of the Liverpool Centre for Doctoral Training (CDT) for Innovation in Data Intensive Science (LIV.INNO). Founded in 2022, the CDT is currently recruiting its third cohort of PhD students. Current students are undertaking research that spans medical, environmental, particle and nuclear physics, but their projects are all underpinned by data science. According to Professor Welsch, “Data science is extremely important for any type of research and will probably outlive any particular research field.”

Next-generation PhD training

Carsten Welsch has a keen interest in improving postgraduate education: he was chair of STFC's Education, Training and Careers Committee and a member of the UKRI Skills Advisory Group. When it comes to the future of doctoral training, he says: "The big question is 'where do we want UK researchers to be in a few years, across all of the different research areas?'"

He believes that LIV.INNO holds the solution. The CDT aims to give students with data-intensive PhD projects the skills that will enable them to succeed not only in their research but throughout their careers.

Lauryn Eley is a PhD student in the first LIV.INNO cohort who is researching medical imaging. She became interested in this topic during her undergraduate studies because it applied what she had learned in university to real-world situations. “It’s important that I can see the benefits of my work translated into everyday experiences, which I think medical imaging does quite nicely,” she says.

Miss Eley’s project is partnered with medical technology company Adaptix. The company has developed a mobile X-ray device which, it hopes, will enable doctors to produce a high-quality 3D X-ray image more cheaply and easily than with a traditional CT scanner.

Her task is to build a computational model of the X-ray device and investigate how to optimize the images it produces. To generate high-quality results she must simulate millions of X-rays. She says that the data science training she received at the start of the PhD has been invaluable.

From their first year, students attend lectures on data science topics which cover Monte Carlo simulation, high-performance computing, machine learning and AI, and data analysis. Lauryn Eley has an experimental background, and she says that the lectures enabled her to get to grips with the C++ she needed for her research.
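
As a flavour of the Monte Carlo techniques taught in those lectures, the sketch below simulates X-ray transmission through a uniform slab; the attenuation coefficient, slab thickness and photon count are illustrative assumptions, not values from Eley's project.

```python
# Toy Monte Carlo sketch of X-ray transmission through a uniform slab.
# Each photon's free path is drawn from an exponential distribution, and the
# fraction crossing the slab without interacting is compared with the
# analytic Beer-Lambert result. All parameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(seed=1)

mu = 0.2         # assumed linear attenuation coefficient (1/cm)
thickness = 5.0  # assumed slab thickness (cm)
n_photons = 1_000_000

free_paths = rng.exponential(scale=1.0 / mu, size=n_photons)
transmitted = np.count_nonzero(free_paths > thickness)

print(f"Monte Carlo transmission: {transmitted / n_photons:.4f}")
print(f"Beer-Lambert prediction:  {np.exp(-mu * thickness):.4f}")
```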

Boosting careers with industry placements

Professor Welsch says that from the start, industry partnership has been at the centre of the LIV.INNO CDT. Students spend six months of their PhD on an industrial placement, and Lauryn Eley says that her work with Adaptix has been eye-opening, enabling her to experience first-hand the fast-paced, goal-driven world of industry, which she found very different to academic research.

While the CDT may particularly appeal to those keen on pursuing a career in industry, Professor Welsch emphasizes the importance of students delivering high-quality research. Indeed, he believes that LIV.INNO's approach provides students with the best chance of success in their academic endeavours. Students are taught to use project management skills to plan and deliver their projects, which he says puts them "in the driving seat" as researchers. They are also empowered to take initiative, working in partnership with their supervisors rather than waiting for external guidance.

LIV.INNO builds on a previous programme called the Liverpool Big Data Science Centre for Doctoral Training, which ran between 2017 and 2024. Professor Welsch was also the director of that CDT, and he has noticed that when it comes to partnering with student projects, industry attitudes have undergone a shift.

“When we approached the companies for the first time, you could definitely see that there was a lot of scepticism,” he says. “However, with the case studies from the first CDT, they found it much easier to attract industry partners to LIV.INNO.” Professor Welsch thinks that this demonstrates the benefits that industry-academia partnerships bring to both students and companies.

The first cohort from LIV.INNO are only in their second year, but many of the students from the previous CDT secured full-time jobs with the company where they did their placement. Whatever career path students eventually go down, Carsten Welsch is convinced that the cross-sector experience they get with LIV.INNO sets them up for success, saying "They can make a much better informed decision about where they would like to continue their careers."

GMT or TMT? Fate of next-generation telescope falls to expert panel set up by US National Science Foundation

By: No Author
10 May 2024 at 14:01

The US National Science Foundation (NSF) is to assemble a panel to help it decide whether to fund the Giant Magellan Telescope (GMT) or the Thirty Meter Telescope (TMT). The agency expects the panel, whose membership has yet to be determined, to report by 30 September, the end of the US government’s financial year.

The NSF first announced in February that it would support the construction of only one of the two next-generation ground-based telescopes due to rising costs. The GMT, priced at $2.54bn, will be located in Chile, while the TMT, which is expected to cost at least $3bn, is set to be built in Hawaii.

A decision on which telescope to fund was initially slated for May. But at a meeting of the National Science Board (NSB) last week, NSF boss Sethuraman Panchanathan revealed the panel would provide further advice to the agency. The decision to look to outsiders followed discussions with the US government and the NSB, which oversees the NSF.

The panel, which will include scientists and engineers, will assess “the readiness of the project from all perspectives” and consider how supporting each telescope would affect the NSF’s overall budget.

It will examine progress made to date, the level of partnerships and resources, and risk management. Complementarity to the European Extremely Large Telescope, opportunities for early-career scientists, and public engagement will be looked at too.

“I want to be very clear that this is not a decision to construct any telescopes,” Panchanathan, who originally trained as a physicist, told the NSB. “This is simply part of a process of gathering critical information to inform my decision-making on advancing either project to the final design stage.”


Magnetic islands stabilize fusion plasma, simulations suggest

By: No Author
9 May 2024 at 17:23

By combining two different approaches to plasma stabilization, physicists in the US and Germany have developed a new technique for suppressing instabilities in tokamak fusion reactors. The team, led by Qiming Hu at Princeton Plasma Physics Laboratory, hopes its computer-modelling results could be an important step towards making nuclear fusion a viable source of energy.

Tokamak fusion reactors use intense magnetic fields to confine and heat hydrogen plasma within their doughnut-shaped interiors. At suitably high temperatures, the hydrogen nuclei will gain enough energy to overcome their mutual repulsion and fuse together to form helium nuclei, releasing energy in the process.

If more energy is released in the reaction than is fed into the tokamak, it would provide an abundant source of clean energy. This has been a goal of researchers since fusion was first created in the laboratory in the 1930s.

Stubborn roadblock

One of the most stubborn roadblocks to achieving sustained fusion is the emergence of periodic plasma instabilities called edge-localized modes (ELMs). These originate in the outer regions of the plasma and result in energy leaking into the tokamak’s walls. If left unchecked, this will cause the fusion reaction to fizzle out, and it can even damage the tokamak.

One of the most promising approaches for suppressing ELMs is the use of resonant magnetic perturbations (RMPs). These are controlled ripples in the confining magnetic field that cause closed loops of magnetic field to form inside the plasma.

Dubbed magnetic islands, these loops do not always have a desirable influence. If they are too large, they risk destabilizing the plasma even further. But by carefully engineering RMPs to generate islands with just the right size, it should be possible to redistribute the pressure inside the plasma, suppressing the growth of ELMs.

In their study, Hu’s team introduced an extra step to this process, which would enable them to better control the parameters of RMPs to generate magnetic islands of just the right size.

Spiralling electrons

This involved injecting the plasma with high-frequency microwaves in a method called edge-localized electron cyclotron current drive (ECCD). Inside the plasma, these waves cause energetic electrons to spiral along the direction of the confining magnetic field lines, generating local currents which run parallel to the field lines.

In previous experiments, ECCD microwaves were most often injected into the core of the plasma. But in their simulations, Hu and colleagues instead directed them to the edge.

“Usually, people think applying localized ECCD at the plasma edge is risky because the microwaves may damage in-vessel components,” Hu explains. “We’ve shown that it’s doable, and we’ve demonstrated the flexibility of the approach.”

Tight control

In simulated tokamak reactors, the team found that their new approach can lower the amount of current necessary to generate RMPs, while also providing tight control over the sizes of magnetic islands as they form in the plasma.

“Our simulation refines our understanding of the interactions in play,” Hu continues. “When the ECCD was added in the same direction as the current in the plasma, the width of the island decreased, and the pedestal pressure increased.”

The pedestal pressure refers to the region close to the edge of the plasma where the pressure peaks, before dropping off steeply towards the plasma boundary. “Applying the ECCD in the opposite direction produced opposite results, with island width increasing and pedestal pressure dropping or facilitating island opening,” explains Hu.

These simulation results could provide important guidance for physicists running tokamaks – including the ITER experiment, which should begin operation in late 2025. If the same results can be replicated in a real plasma, it could bring the long-awaited goal of sustained nuclear fusion a step closer.

The research is described in Nuclear Fusion.


Astronomy conference travel is on par with Africa’s per-capita carbon footprint

By: No Author
9 May 2024 at 14:15

Travel to more than 350 astronomy meetings in 2019 resulted in the emission of 42 500 tonnes of carbon dioxide. That’s the conclusion of the first-ever study to examine the carbon emissions from travel to meetings by an entire field. The carbon cost amounts to about one tonne of carbon dioxide equivalent (tCO2e) per participant per meeting – roughly Africa’s average per capita carbon footprint in 2019 (1.2 tCO2e) (PNAS Nexus 3 pgae143).

Carried out by a team led by Andrea Gokus at Washington University in St. Louis in the US, the study examined 362 meetings in 2019 that were open to anyone in the astronomical community. These included conferences disseminating scientific findings as well as schools providing lectures and training to students and early-career scientists.

Using data on participants' home institutions, which were available for 300 of the meetings, the researchers estimated travel-related emissions for each event, assuming delegates went by train or plane. For these meetings, the emissions totalled 38 000 tCO2e and the distance travelled was equivalent to going to the Sun and halfway back.

For the other 62 meetings that did not have details of the participants’ home institutions, the team estimated the emissions using average data from other conferences. Emissions from those events were put at 4500 tCO2e, bringing the total to 42 500 tCO2e.
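
To make that methodology concrete, here is a rough Python sketch of a travel-emissions estimate based on great-circle distances; the emission factors, train cutoff and example cities are illustrative assumptions, not the values used by Gokus and colleagues.

```python
# Rough sketch of a conference travel-emissions estimate. Distances are
# great-circle (haversine); short trips are assumed to go by train, longer
# ones by plane. Emission factors and coordinates are illustrative only.
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points in km."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * 6371.0 * math.asin(math.sqrt(a))

FLIGHT_KG_PER_KM = 0.15  # assumed kg CO2e per passenger-km by air
TRAIN_KG_PER_KM = 0.03   # assumed kg CO2e per passenger-km by rail
TRAIN_CUTOFF_KM = 400    # assume trips shorter than this go by train

venue = (47.6, -122.3)                                # Seattle (example)
homes = [(51.5, -0.1), (40.7, -74.0), (35.7, 139.7)]  # London, New York, Tokyo

total_kg = 0.0
for lat, lon in homes:
    d = haversine_km(lat, lon, *venue)
    factor = TRAIN_KG_PER_KM if d < TRAIN_CUTOFF_KM else FLIGHT_KG_PER_KM
    total_kg += 2 * d * factor  # round trip

print(f"total: {total_kg / 1000:.2f} tCO2e, "
      f"per participant: {total_kg / 1000 / len(homes):.2f} tCO2e")
```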

The meeting with the highest emissions per participant was Great Barriers in Planet Formation, held in Palm Cove, Queensland in Australia, with almost all attendees travelling from outside the country. Travel by the 115 participants resulted in 461 tCO2e, or 4 tCO2e per person on average. The team found that emissions could have been more than halved if the meeting had been held in Europe or the northeastern US.

Hub model

Gokus says that while meetings are important for researchers, “adjustments can be made to reduce their hefty carbon cost”, for example by knowing where participants are based. The researchers found, for example, that emissions from 2019’s biggest astronomical conference – the 223rd American Astronomical Society (AAS) meeting in Seattle – could have been cut by a quarter if it had been held in a more central US location.

The team also explored the impact of switching the 223rd AAS meeting from a single-venue meeting to a hub model, in which simultaneous satellite events are held at different locations. A two-hub model for that conference, with an eastern and western US hub, would have reduced emissions by around 60%, the study finds. Adding a third European hub could have saved 65% of emissions, while a fourth hub in Asia, for instance in Tokyo, would have cut emissions by about 70%.

The researchers claim that such alternative meeting setups, as well as virtual attendance, could have benefits beyond the environment. They point out that finances, complex visa processes, parenting and other caring responsibilities, as well as disabilities, can make travelling to meetings challenging for some.

“By making use of technology to connect virtually, we can foster a more inclusive collaborative approach, which can help us advance our understanding of the Universe further,” says Gokus. “It is important that we work together as a community to achieve this goal, because there is no Planet B.”


Tetris-inspired radiation detector uses machine learning

By: No Author
8 May 2024 at 16:19

Inspired by the tetromino shapes in the classic video game Tetris, researchers in the US have designed a simple radiation detector that can monitor radioactive sources both safely and efficiently. Created by Mingda Li and colleagues at the Massachusetts Institute of Technology, the device employs a machine learning algorithm to process data, allowing it to build up accurate maps of sources using just four detector pixels.

Wherever there is a risk of radioactive materials leaking into the environment, it is critical for site managers to map out radiation sources as accurately as possible.

At first glance, there is an obvious solution to maximizing precision, while keeping costs as low as possible, explains Li. “When detecting radiation, the inclination might be to draw nearer to the source to enhance clarity. However, this contradicts the fundamental principles of radiation protection.”

For the people tasked with monitoring radiation, these principles advise that the radiation levels they expose themselves to should be kept as low as reasonably achievable.

Complex and expensive

However, since radiation can interact with intervening objects via a wide array of mechanisms, it is often both complex and expensive to map out radiation sources from reasonably safe distances.

“Thus, the crux of the matter lies in simplifying detector setups without compromising safety by minimizing proximity to radiation sources,” Li explains.

In a typical detector, radiation maps are created by monitoring intensity distribution patterns across a 10×10 array of detector pixels. The main drawback here is that radiation can approach the detector from a variety of directions and distances, making it difficult to extract useful information about the source of that radiation. This is usually done by placing an absorbing mask over the pixels, which provides some directional information, and by doing lots of data processing.

For Li’s team, the first step to reducing the complexity of this process was to minimize redundant information collected by multiple pixels within the array. “By strategically incorporating small [lead] paddings between pixels, we enhance contrast to ensure that each detector receives distinct information, even when the radioactive source is distant,” Li explains.

Machine learning

Next, the team developed machine learning algorithms to extract more accurate information regarding the direction of incoming radiation and the detector’s distance to the source.

Inspiration for the final step of the design would come from an unlikely source. In Tetris, players encounter seven unique tetrominoes, which represent every possible way that four squares can be arranged contiguously to create shapes.

By using these shapes to create detector pixel arrays, the researchers predicted they could achieve similar levels of accuracy as detectors with far larger square arrays. As Li explains, "these shapes offer superior efficiency in utilizing pixels, thereby enhancing accuracy."

To demonstrate this, the team designed a series of four-pixel radiation detectors, with the pixels arranged in Tetris-inspired tetromino shapes. To build up radiation maps, these arrays were moved in circular paths around the radioactive sources being studied. This allowed the detector's algorithms to discern accurate information about source positions and directions, based on the counts received by the four pixels.
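
The principle can be illustrated with a toy simulation (this is not the team's reconstruction code): four pixels arranged as an S-tetromino register slightly different count rates depending on the source direction, and a maximum-likelihood search over candidate angles recovers it. The geometry and source strength are assumptions, and the lead padding and detector physics are deliberately ignored.

```python
# Toy direction-finding with a four-pixel tetromino array. Counts follow an
# inverse-square law plus Poisson noise; the source angle is recovered by a
# maximum-likelihood grid search. Shielding and scattering are ignored, so
# this only illustrates the idea, not the real detector's performance.
import numpy as np

rng = np.random.default_rng(seed=7)

# Pixel centres (cm) of an S-shaped tetromino with 1 cm pixels
pixels = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [2.0, 1.0]])

def expected_counts(theta, distance=10.0, strength=1e5):
    """Expected counts in each pixel for a source at angle theta (rad)."""
    src = distance * np.array([np.cos(theta), np.sin(theta)])
    r2 = np.sum((pixels - src) ** 2, axis=1)
    return strength / r2

true_theta = np.radians(40.0)
counts = rng.poisson(expected_counts(true_theta))  # one noisy measurement

thetas = np.radians(np.linspace(0.0, 90.0, 901))
mus = np.stack([expected_counts(t) for t in thetas])
loglike = np.sum(counts * np.log(mus) - mus, axis=1)  # Poisson log-likelihood
best = thetas[np.argmax(loglike)]

print(f"true: {np.degrees(true_theta):.1f} deg, "
      f"estimated: {np.degrees(best):.1f} deg")
```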

Successful field test

“Particularly noteworthy was our successful execution of a field-test at Lawrence Berkeley National Laboratory,” Li recalls. “Even when we withheld the precise source location, the machine learning algorithm could effectively localize it within real experimental data.”

Li’s team is now confident that its novel approach to detector design and data processing could be useful for radiation detection. “The adoption of Tetris-like configurations not only enhances accuracy but also minimizes complexity in detector setups,” Li says. “Moreover, our successful field-test underscores the real-world applicability of our approach, paving the way for enhanced safety and efficacy in radiation monitoring.”

Based on their success, the team hopes the detector design could soon be implemented for applications including the routine monitoring of nuclear reactors, the processing of radioactive material, and the safe storage of harmful radioactive waste.

The detector is described in Nature Communications.


From pulsars and fast radio bursts to gravitational waves and beyond: a family quest for Maura McLaughlin and Duncan Lorimer

By: No Author
7 May 2024 at 18:41

Most physicists dream of making new discoveries that expand what we know about the universe, but they know that such breakthroughs are extremely rare. It’s even more surprising for a scientist to make a great discovery with someone who is not just a colleague, but also their life partner. The best-known husband-and-wife couples in physics are the Curies: Marie and Pierre, and their daughter Irène Joliot-Curie with her husband Frédéric Joliot-Curie. Each couple won a Nobel prize, in 1903 and 1935 respectively, for early work on radioactivity.

Joining the ranks of these pioneering physicists are contemporary married couple Maura McLaughlin and Duncan Lorimer, who last year were two of three laureates awarded the $1.2m Shaw Prize in Astronomy (see box below) for their breakthroughs in radio astronomy. Together with astrophysicist Matthew Bailes, director of the Australian Research Council Centre of Excellence for Gravitational Wave Discovery, McLaughlin and Lorimer won the prize for their 2007 discovery of fast radio bursts (FRBs) – powerful but short-lived pulses of radio waves from distant cosmological sources. Since their discovery, several thousand of these mysterious cosmic flashes, which last for milliseconds, have been spotted.

Over the years, McLaughlin and Lorimer’s journeys – through academia and their personal life – have been inherently entwined and yet distinctly discrete, as the duo developed careers in radio astronomy and astrophysics that began with pulsars, then included FRBs and now envelop gravitational waves. The couple have also advanced science education and grown astronomical research and teaching at their home base, West Virginia University (WVU) in the US. There, McLaughlin is Eberly Family distinguished professor of physics and astronomy, and chair of the Department of Physics and Astronomy, while Lorimer currently serves as associate dean for research in WVU’s Eberly College of Arts and Sciences.

The Shaw Prize

Shaw laureates Astrophysicists Duncan Lorimer and Maura McLaughlin received the Shaw Prize in 2023 for their discovery of fast radio bursts. (Courtesy: WVU Photo/Raymond Thompson Jr)

The 2023 Shaw Prize in Astronomy, awarded jointly to Duncan Lorimer and Maura McLaughlin, and to their colleague Matthew Bailes, is part of the legacy of Sir Run Run Shaw (1907–2014), a successful Hong Kong-based film and television mogul. Known for his philanthropy, he gave away billions in Hong Kong dollars to support schools and universities, hospitals and charities in Hong Kong, China and elsewhere.

In 2002 he established the Shaw Prize to recognize “those persons who have achieved distinguished contributions in academic and scientific research or applications or have conferred the greatest benefit to mankind”. A gold medal and a certificate for each Shaw laureate, and a monetary award of $1.2m shared among the laureates, is given yearly in astronomy, life science and medicine, and mathematical sciences. Previous winners of the Shaw Prize in Astronomy include Ronald Drever, Kip Thorne and Rainer Weiss, for the first observation of gravitational waves with LIGO. They are among the 16 of the 106 Shaw laureates since 2004 who have also been awarded Nobel prizes.

Accidental cosmic probe

Radio astronomy, which led to much of McLaughlin and Lorimer’s work, was not initially a formal area of research. Instead, it began rather serendipitously in 1928, when Bell Labs radio engineer Karl Jansky was trying to find the possible sources of static at 20.5 MHz that were disrupting the new transatlantic radio telephone service. Among the types of static that he detected was a constant “hiss” from an unknown source that he finally tracked down to the centre of the Milky Way galaxy, using a steerable antenna 30 m in length. His 1933 paper “Electrical disturbances apparently of extraterrestrial origin” received considerable media attention but little notice from the astronomy establishment of the time (see “Radio astronomy: from amateur roots to worldwide groups” by Emma Chapman).

Radio astronomy truly flourished after the Second World War, with new purpose-built facilities. An early example from 1957 was the steerable 76 m dish antenna built by Bernard Lovell and colleagues at Jodrell Bank in the UK – where McLaughlin and Lorimer would later work. Other researchers who led the way include the Nobel-prize-winning astronomer Sir Martin Ryle, who pioneered radio interferometry and developed aperture synthesis; as well as Australian electrical engineer Bernard Mills, who designed and built radio interferometers.

Extraterrestrial radio signals soon yielded important science. In 1951 researchers detected a predicted emission from neutral hydrogen at 1.4 GHz – a fingerprint of this fundamental atom. In 1964 Arno Penzias and Robert Wilson (also based at Bell Labs) inadvertently found a 4.2 GHz signal across the whole sky, while testing orbiting telecom satellites – thereby discovering the cosmic background radiation. And in 1968 another spectacular discovery shaped McLaughlin and Lorimer’s careers, when University of Cambridge graduate student Jocelyn Bell Burnell and her PhD supervisor Antony Hewish announced the observation of an unusual radio signal from space – a pulse that arrived every 1.3 seconds. That signal was the first to come from what were soon called “pulsars”. Hewish would go on to share the 1974 Nobel Prize for Physics for the discovery – while Bell Burnell was infamously left out, supposedly due to her then student status.

As more pulsars were found with varied periods and in different directions of the sky, it became clear that the signals were not being sent by an alien civilization, as some researchers had speculated – after all, the chances of an extraterrestrial civilization sending many signals of varying periods, or of different civilizations sending out different periodic signals, were slim. One clue was that the pulses were short and coherent, so they had to come from sources smaller than the distance light could travel during the pulse’s lifetime – for instance, the source of a 5 ms pulse could be at most 1500 km across.
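
The arithmetic behind that size bound is a one-liner: light travels about 300 000 km/s, so a coherent pulse lasting a time dt must come from a region no larger than roughly c times dt. A quick check, using the 1.3 s and 5 ms figures quoted above:

```python
# Light-travel-time bound: a coherent pulse of duration dt must come from a
# region no larger than roughly c * dt.
C_KM_S = 299_792.458  # speed of light in km/s

for label, dt_s in [("1.3 s pulse", 1.3), ("5 ms pulse", 0.005)]:
    print(f"{label}: source no larger than ~{C_KM_S * dt_s:,.0f} km")
```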

As it happened, the signals were our first look at neutron stars – small, extremely dense and rapidly rotating remnants of massive stars after they have gone supernova and had their protons and electrons squeezed into neutrons by gravity’s implacable power. As the star rotates, its strong off-axis magnetic field produces beams of electromagnetic radiation from the magnetic poles. These beams create regular pulses as they sweep past a detector on a direct line of sight. Pulsars are mostly studied at radio frequencies, but they also radiate at other, higher frequencies.

Pulsars to fast bursts

Lorimer and McLaughlin began their careers by studying these exotic stellar objects, but each of them had already been captivated by astronomy and astrophysics as teenagers. Lorimer was born in Darlington, UK. After studying astrophysics as an undergraduate at the University of Wales in Cardiff, he moved to the University of Manchester in 1994, where his PhD research focused on analysing classes of radio pulsars with different periods.

McLaughlin was born in Philadelphia, Pennsylvania, and first studied pulsars as an undergraduate student at Penn State. Her PhD dissertation at Cornell University in 2001 covered pulsars that variously emitted radio waves, X-rays or gamma rays. By 1995 Lorimer was working as a researcher at the Max Planck Institute for Radio Astronomy in Bonn, Germany, and the pair’s paths first crossed in the late 1990s at the Arecibo Observatory in Puerto Rico. Lorimer moved to the UK in 2001 to work at the Jodrell Bank Observatory, where McLaughlin joined him in 2003.

It was an interesting and exciting time in the pulsar research community, with new pulsars found by computerized Fourier transform analysis that detected the telltale periodicities in vast amounts of observational data. But radio astronomers also sometimes saw transient signals, and McLaughlin had written computer code designed to find single bright pulses. This led to the 2006 discovery of a new class of pulsars dubbed rotating radio transients (RRATS, an acronym recalling a pet rat McLaughlin once had). These stars could be detected only through their sporadic millisecond-long bursts, unlike most pulsars, which were found through their periodic emissions. The discovery in turn initiated further searches for transient pulses (Nature 439 817).

Soon after, Lorimer and McLaughlin, now a married couple, joined WVU’s department of physics and astronomy as assistant professors. To uncover more distant and bright pulsars, Lorimer gave his graduate student Ash Narkevic the task of looking through archival observational data that the Parkes radio telescope in Australia had taken of the Large and Small Magellanic Clouds – two small galaxies that are satellites of our very own Milky Way, roughly 200,000 light-years from Earth – of which the Large was already known to host 14 pulsars.

Narkevic examined the data and found a single strong burst – nearly 100 times stronger than the background – at 1.4 GHz with a 5 ms duration. But the burst seemed to come from the Small Magellanic Cloud, where there were five known pulsars at that time. Even more surprising was the fact that this extremely bright burst did not arrive all at the same time. Known as pulse or frequency dispersion, this occurs when radio waves travelling through interstellar space interact with free electrons: higher-frequency waves travel through the free-electron plasma more quickly than lower-frequency ones, and so arrive at our telescopes earlier.

This dispersion depends on the total number of electrons (or the column density) along the path. The further away the source of the burst, the more likely it is that the waves will encounter even more electrons on their path to Earth, and so the lag between the high- and low-frequency waves is greater. The pulse Narkevic spotted was so dispersed by the time it reached Earth that it suggested the source was almost three billion light-years away – well beyond our local galactic neighbourhood. This also meant that the source must be significantly smaller than the Sun, and more on par with the proposed size of pulsars, while also somehow being 10¹² times more luminous than a typical pulsar.
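
The standard relation puts numbers on this: the arrival-time lag between two observing frequencies is proportional to the dispersion measure (DM), the column density of free electrons along the line of sight. Below is a minimal Python version, using a DM of about 375 pc cm⁻³, roughly the value reported for the Lorimer burst; the observing band chosen is only a worked example.

```python
# Cold-plasma dispersion delay: dt = 4.149 ms * DM * (f_lo^-2 - f_hi^-2),
# with DM in pc cm^-3 and frequencies in GHz (standard radio-astronomy form).
K_DM_MS = 4.149  # dispersion constant in ms GHz^2 (pc cm^-3)^-1

def dispersion_delay_ms(dm, f_lo_ghz, f_hi_ghz):
    """Lag of the lower frequency behind the higher one, in ms."""
    return K_DM_MS * dm * (f_lo_ghz ** -2 - f_hi_ghz ** -2)

dm = 375.0  # pc cm^-3, roughly the Lorimer burst's dispersion measure
print(f"{dispersion_delay_ms(dm, 1.2, 1.5):.0f} ms lag "
      "across a 1.2-1.5 GHz observing band")
```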

1 The first burst

Courtesy: Duncan Lorimer; Lorimer et al., NRAO/AUI/NSF

(Top) Duncan Lorimer (left) and Ash Narkevic in 2008 with the paper they published in Science about their observation of a fast radio burst (bottom).

The report of this seemingly new phenomenon – a single extremely energetic event at an enormous cosmological distance – was published in Science later that year, after being initially rejected (Science 318 777). This first detected fast radio burst came to be known as the “Lorimer burst” (figure 1). After several years and significant further work by Lorimer, McLaughlin, Bailes and others, they found first four and then tens of similar bursts. This launched a new class of cosmological phenomena that now includes more than 1000 FRBs, which have fulfilled the prediction in 2007 that they would serve as cosmological probes.

Thanks to FRBs having been found in different galaxies beyond our own, all across the sky, they serve as a probe of the intergalactic medium, allowing astrophysicists to measure the density of the material that lies between Earth and the host galaxy (Nature 581 391). By measuring the distance to the source of the FRB, and then looking at the dispersion of the pulses as a function of wavelength, astronomers can determine the density of the matter the pulse passed through, thereby yielding a value for the baryonic density of our universe. This is otherwise extremely difficult to measure, thanks to how diffuse this matter is in our observable universe. FRBs have also provided an independent measurement of the Hubble constant, the exact value of which has lately come under new scrutiny (MNRAS 511 662).

Detecting a gravitational-wave background

While Lorimer is still working on pulsars and FRBs, McLaughlin has now moved into another area of pulsar astronomy. That’s because for almost two decades, she has been a researcher in and co-director of the North American Nanohertz Observatory for Gravitational Waves (NANOGrav) Physics Frontier Center, which uses pulsars to detect low-frequency gravitational waves with periods of years to decades. One of its facilities is the steerable 100 m Green Bank Telescope about 150 km south of WVU.

“We are observing an array of pulsars distributed across the sky,” says McLaughlin. “These are 70 millisecond pulsars, so very rapidly rotating. We search for very small deviations in the arrival times of the pulsars that we can’t explain with a timing model that accounts for all the known astrophysical delays.” General relativity predicts that certain deviations in the timing would depend on the relative orientation of pairs of pulsars, so seeing this special angular correlation in the timing would be a clear sign of gravitational waves.
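
That angular dependence is the Hellings–Downs curve, a function only of the angle between a pair of pulsars on the sky; the short sketch below evaluates the standard expression.

```python
# Hellings-Downs curve: the correlation in timing deviations that general
# relativity predicts for a pair of pulsars separated by angle gamma.
import numpy as np

def hellings_downs(gamma_rad):
    """Expected timing correlation for angular separation gamma (radians)."""
    x = (1.0 - np.cos(gamma_rad)) / 2.0
    x = np.clip(x, 1e-12, None)  # avoid log(0) at zero separation
    return 0.5 - x / 4.0 + 1.5 * x * np.log(x)

for deg in (10, 50, 90, 130, 170):
    print(f"{deg:3d} deg: {hellings_downs(np.radians(deg)):+.3f}")
```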

2 Gravitational-wave spectrum

Courtesy: NANOGrav Collaboration

(a) The NANOGrav 15-year data set contains timing observations from 68 pulsars using the Arecibo Observatory, the Green Bank Telescope and the Very Large Array. The map shows pulsar locations in equatorial co-ordinates. (b) The background comes from correlating changes in pulsar arrival times between all possible pairs of the 67 pulsars (2211 distinct pairs in total), and is based on three or more years of timing data. The black line is the expected correlation predicted by general relativity. These calculations assume the gravitational-wave background is from inspiralling supermassive black-hole binaries.

In June 2023 the NANOGrav collaboration published an analysis of 15 years of its data (figure 2), looking at 68 pulsars with millisecond periods, which showed this signature for the first time (ApJL 951 L8). McLaughlin says that it represents not just one source of gravitational waves, but a background arising from all gravitational events such as merging supermassive black holes at the hearts of galaxies. This background may contain information about how galaxies interact and perhaps also the early universe. Five years from now, she predicts, NANOGrav will be detecting individual supermassive black-hole binaries and will tag their locations in specific galaxies, to form a black hole atlas.

Star-crossed astronomers

The connections between McLaughlin and Lorimer that played a role in their academic achievements began rather fittingly with an interaction in 1999, at the Arecibo radio telescope in Puerto Rico (now sadly decommissioned). Lorimer was based there at the time, while McLaughlin was a visiting graduate student, and their contact, though not in person, was definitely not cordial. Lorimer sent what he calls a “little snippy e-mail” to McLaughlin about her use of the computer that blocked his own access, which she also recalls as “pretty grumpy”.

Near miss Maura McLaughlin and Duncan Lorimer both worked at Arecibo Observatory in Puerto Rico in 1999, but they didn’t meet in person during that time. (Courtesy: Maura McLaughlin and Duncan Lorimer)

But things improved after they later met in person, and they joined the Jodrell Bank Observatory in the UK. The pair married in 2003 and now have three sons. Over the years, they moved together to the US, set up their own astronomy group at WVU by 2006, and proceeded to work together and alongside each other, publishing many research papers, both joint and separate.

Given all these successes, how do the two researchers balance science and family, especially when they first arrived at WVU with a five-month-old baby to join a department with just one astronomer and no graduate astronomy programme? McLaughlin says it was “Really hard work. Lots of grant writing, developing courses,” but adds that it was also “really fun because we were both building a programme and building a family and moving to a new place”.

Life got even busier in 2007, when another child and the FRB discovery both arrived. The couple says that it was all doable because they fully understood the need to shift scientific or family responsibilities to each other as necessary. According to McLaughlin, this includes equal parenting from her husband, for which she feels “very lucky”. As Lorimer puts it, “We get each other’s mindset.”

However, the fact that they are married may have coloured perceptions of their work and status. “When we first started here at WVU,” Lorimer explains, “a lot of people assumed we were sharing a single position. But the university’s been great. It’s always made it clear from the get-go that we’re obviously on different career trajectories.” And they agree that as they’ve progressed in their individual careers and are known for different things, they’re now unmistakably seen as two distinct scientists.

Shared wavelength Maura McLaughlin and Duncan Lorimer married in 2003 (top). They credit their ability to both have successful careers to sharing and shifting family responsibilities as needed, as well as taking their initially similar career paths on different trajectories. (Courtesy: Maura McLaughlin and Duncan Lorimer)

Beyond the Shaw Prize

The Shaw Prize came as a total surprise to the couple. The pair both received e-mails simultaneously one evening, but Lorimer spotted his first. “We almost missed it as it was just about time to go to bed and the announcement was being made in Hong Kong a few hours after that,” says Lorimer. McLaughlin recalls her husband screaming and excitedly running up the stairs to give her the news. “He doesn’t scream much to begin with, maybe only when the dogs do something bad, and I’m wondering ‘Why is he screaming late on a Sunday night?’ He told me to pull up the e-mail and I thought it was a prank. I read it again and realized it was real. That was quite a Sunday night.” Amusingly, the e-mail for their co-winner Matthew Bailes initially went into his spam folder. The trio would later describe their work in a Shaw Prize Lecture in Hong Kong in November 2023.

So what comes next for the stellar pair? Further research into the different types of FRBs that are still being found, using new telescopes and detection schemes. One new project, an extension of Lorimer’s earlier work in pulsar populations, is to locate FRBs in specific galaxies and among groups of both younger and older stars using the Green Bank telescope in West Virginia, along with others, to help uncover what causes them. FRBs may come from neutron stars with especially huge magnetic fields – dubbed magnetars – but this remains to be seen.

Data from Green Bank is also used in the Pulsar Science Collaboratory, co-founded by McLaughlin and Lorimer (see box below). Meanwhile, the NANOGrav pulsar observation of the gravitational wave background, where McLaughlin continues her long-time involvement, has been hailed by the LIGO Collaboration for opening up the spectrum in the exciting new era of gravitational-wave astronomy and cosmology.

The Pulsar Science Collaboratory

Engaging science Participants in the Pulsar Science Collaboratory, at the Green Bank Telescope control room. (Courtesy: NSF/AUI/GBO)

The Pulsar Science Collaboratory (PSC) was founded in 2007 by Maura McLaughlin, Duncan Lorimer and Sue Ann Heatherly at the Green Bank Observatory, with support from the US National Science Foundation. It is an educational project in which, to date, more than 2000 high-school students have been involved in the search for new pulsars.

Students are trained via a six-week online course and then must pass a certification test to use an online interface to access terabytes of pulsar data from the Green Bank Observatory. They are also invited to a summer workshop at the observatory. McLaughlin and Lorimer proudly note the seven new pulsars that high-school students have so far discovered. Many of these students have continued as college undergraduates or even graduate students working on pulsar and fast-radio-burst science.

At the end of the Shaw Prize Lecture, Lorimer pointed out that there is “still much left to explore”. In an interview for the press, McLaughlin said “We’ve really just started.” Both statements seem fair predictions for anything each one does in their areas of interest in the future – surely with hard work but also with the continuing sense that it’s “really fun”.


Australia raises eyebrows by splashing A$1bn into US quantum-computing start-up PsiQuantum

By: No Author
7 May 2024 at 17:16

The Australian government has controversially announced it will provide A$940m (£500m) for the US-based quantum-computing start-up PsiQuantum. The investment, which comes from the country’s National Quantum Strategy budget, makes PsiQuantum the world’s most funded independent quantum company.

Founded in 2015 by five physicists who were based in the UK, PsiQuantum aims to build a large-scale quantum computer by 2029 using photons as quantum bits (or qubits). As photonic technology is silicon-based, it benefits from advances in large-scale chip-making fabrication and does not need as much cryogenic cooling as other qubit platforms require.

The company has already reported successful on-chip generation and the detection of single-photon qubits, but the technique is not plain sailing. In particular, optical losses still need to be reduced to sufficient levels, while detection needs to be more efficient to improve the quality (or fidelity) of the qubits.

Despite these challenges, PsiQuantum has already attracted several supporters. In 2021 private investors gave the firm $665m and in 2022 the US government provided $25m to both GlobalFoundries and PsiQuantum to develop and build photonic components.

The money from the Australian government comes mostly via equity-based investment as well as grants and loans. The amount represents half of the budget that was allocated by the government last year to boost Australia’s quantum industry over a seven-year period until 2030.

The cash comes with some conditions, notably that PsiQuantum should build its regional headquarters in the Queensland capital Brisbane and operate the to-be-developed quantum computer from there. Anthony Albanese, Australia’s prime minister, claims the move will create up to 400 highly skilled jobs, boosting Australia’s tech sector.

A bold declaration

Stephen Bartlett, a quantum physicist from the University of Sydney, welcomes the news. He adds that the scale of the investment “is required to be on par” with companies such as Google, Microsoft, AWS, and IBM that are investing similar amounts into their quantum computer programmes.

Ekaterina Almasque, general partner at the venture capital firm OpenOcean, says that the investment may bring further benefits to Australia. “The [move] is a bold declaration that quantum will be at the heart of Australia’s national tech strategy, firing the starting gun in the next leg of the race for quantum [advantage],” she says. “This will ripple across the venture capital landscape, as government funding provides a major validation of the sector and reduces the risk profile for other investors.”

Open questions

The news, however, did not please everyone. Paul Fletcher, science spokesperson for Australia’s opposition Liberal/National party coalition, criticises the selection process. He says it was “highly questionable” and failed to meet normal standards of transparency and contestability.

“There was no public transparent expression of interest process to call for applications. A small number of companies were invited to participate, but they were required to sign non-disclosure agreements,” says Fletcher. “And the terms made it look like this had all been written so that PsiQuantum was going to be the winner.”

Fletcher adds that it is “particularly troubling” that the Australian government “has chosen to allocate a large amount of funding to a foreign based quantum-computing company” rather than home-grown firms. “It would be a tragedy if this decision ends up making it more difficult for Australian-based quantum companies to compete for global investment because of a perception that their own government doesn’t believe in them,” he states.

Kees Eijkel, director of business development at the quantum institute QuTech in the Netherlands, adds that it is still an open question what “winning technology” will result in a full-scale quantum computer due to the “huge potential” in the scalability of other qubit platforms.

Indeed, quantum physicist Chao-Yang Lu from the University of Science and Technology of China took to X to note that there is “no technologically feasible pathway to the fault-tolerant quantum computers PsiQuantum promised”, adding that there are many “formidable” challenges.

Lu points out that PsiQuantum had originally claimed it would have a working quantum computer by 2020, a date that was then updated to 2025. He says that the date now slipping to 2029 “is [in] itself worrying”.


Sound and light waves combine to create advanced optical neural networks

By: No Author
6 May 2024 at 14:00

One of the things that sets humans apart from machines is our ability to process the context of a situation and make intelligent decisions based on internal analysis and learned experiences.

Recent years have seen the development of new “smart” and artificially “intelligent” machine systems. While these do have intelligence based on analysing data and predicting outcomes, many intelligent machine networks struggle to contextualize information and tend to just create a general output that may or may not have situational context.

Whether we want to build machines that can make informed contextual decisions like humans can is an ethical debate for another day, but it turns out that neural networks can be equipped with recurrent feedback that allows them to process current inputs based on information from previous inputs. These so-called recurrent neural networks (RNNs) can contextualize, recognise and predict sequences of information (such as time signals and language) and have been used for numerous tasks including language, video and image processing.

There’s now a lot of interest in transferring electronic neural networks into the optical domain, creating optical neural networks that can process large data volumes at high speeds with high energy efficiency. But while there’s been much progress in general optical neural networks, work on recurrent optical neural networks is still limited.

New optoelectronics required

Development of recurrent optical neural networks will require new optoelectronic devices with a short-term memory that’s programmable, computes optical inputs, minimizes noise and is scalable. In a recent study led by Birgit Stiller at the Max Planck Institute for the Science of Light, researchers demonstrated an optoacoustic recurrent operator (OREO) that meets these demands.

OREO concept Information in an optical pulse is partially converted into an initial acoustic wave, which affects the second and third light–sound processing steps. (Courtesy: Stiller Research Group, MPL)

The acoustic waves in the OREO link subsequent optical pulses and capture the information within them, using it to manipulate the next operations. The OREO is based on stimulated Brillouin-Mandelstam scattering, an interaction between optical waves and travelling sound waves whose much slower propagation provides the latency the scheme needs. This process enables the OREO to contextualize a time-encoded stream of information using sound waves as a form of memory, which can be used not only to remember previous operations but also as a basis to manipulate the output of the current operation – much like in electronic RNNs.
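
As a caricature of that behaviour, the discrete-time sketch below gives a pulse train an exponentially decaying “acoustic” memory that shapes each subsequent output; the coupling and decay constants are arbitrary illustrative choices, not device parameters.

```python
# Deliberately simplified, discrete-time caricature of the OREO's recurrence:
# each optical pulse deposits part of its amplitude into a decaying acoustic
# "memory", and that memory in turn shapes how the next pulse is processed.
# Coupling and decay values are arbitrary illustrative choices.
def oreo_toy(pulses, coupling=0.3, decay=0.9):
    """Process a train of pulse amplitudes with an acoustic-like memory."""
    acoustic = 0.0
    outputs = []
    for p in pulses:
        out = (1.0 - coupling) * p + coupling * acoustic  # memory shapes output
        acoustic = decay * acoustic + coupling * p        # pulse feeds memory
        outputs.append(round(out, 4))
    return outputs

# Identical pulses give different outputs depending on what came before
print(oreo_toy([1.0, 0.0, 1.0, 1.0, 0.0]))
```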

“I am very enthusiastic about the generation of sound waves by light waves and the manipulation of light by the means of acoustic waves,” says Stiller. “The fact that sound waves can create fabrication-less temporary structures that can be seen by light and can manipulate light in a hair-thin optical fibre is fascinating to me. Building a smart neural network based on this interaction of optical and acoustic waves motivated me to embark on this new research direction.”

Designed to function in any optical waveguide, including on-chip devices, the OREO controls the recurrent operation entirely optically. In contrast to previous approaches, it does not need an artificial reservoir that requires complex manufacturing processes. The all-optical control is performed on a pulse-by-pulse basis and offers a high degree of reconfigurability that can be used to implement a recurrent dropout (a technique used to prevent overfitting in neural networks) and perform pattern recognition of up to 27 different optical pulse patterns.

“We demonstrated for the first time that we can create sound waves via light for the purposes of optical neural networks,” Stiller tells Physics World. “It is a proof of concept of a new physical computation architecture based on the interaction and reciprocal creation of optical and acoustic waves in optical fibres. These sound waves are, for example, able to connect several subsequent photonic computation steps with each other, so they give a current calculation access to past knowledge.”

Looking to the future

The researchers conclude that they have, for the first time, combined the field of travelling acoustic waves with artificial neural networks, creating the first optoacoustic recurrent operator that connects information carried by subsequent optical data pulses.

These developments pave the way towards more intelligent optical neural networks that could be used to build a new range of computing architectures. While this research has brought an intelligent context to the optical neural networks, it could be further developed to create fundamental building blocks such as nonlinear activation functions and other optoacoustic operators.

“This demonstration is only the first step into a novel type of physical computation architecture based on combining light with travelling sound waves,” says Stiller. “We are looking into upscaling our proof of concepts, working on other light–sound building blocks and aiming to realise a larger optical processing structure mastered by acoustic waves.”

The research is published in Nature Communications.


Bilayer of ultracold atoms has just a 50 nm gap

By: No Author
2 May 2024 at 20:00

Two Bose-Einstein condensates (BECs) of magnetic atoms have been created just 50 nm apart from each other – giving physicists the first opportunity to study atomic interactions on this length scale. The work by physicists in the US could lead to studies of several interesting collective phenomena in quantum physics, and could even be useful in quantum computing.

First created in 1995, BECs have become invaluable tools for studying quantum physics. A BEC is a macroscopic entity comprising thousands of atoms that are described by a single quantum wavefunction.  They are created by cooling a trapped cloud of bosonic atoms to a temperature so low that a large fraction of the atoms are in the lowest energy (ground) state of the system.

BECs should be ideal for studying the quantum physics of exotic, strongly interacting systems. However, to prolong the lifetime of a BEC, physicists need to keep it isolated from the outside world to prevent decoherence. This need for isolation makes it difficult to manoeuvre BECs close enough together for the interactions to be studied.

Pancake layers

In the new work, researchers at the Massachusetts Institute of Technology in the group of Wolfgang Ketterle (who shared the 2001 Nobel Prize for Physics for creating BECs) tackled this problem by creating a double-layer BEC of dysprosium atoms, with the two layers just 50 nm apart. To achieve this, the researchers had to keep two pancake-like condensate layers a constant distance apart using lasers with wavelengths more than ten times larger than their separation. This would have been almost impossible using separate optical traps.

Instead, the researchers utilized the fact that dysprosium has a very large spin magnetic moment. They lifted the degeneracy of two electronic spin states using an applied magnetic field. Atoms with opposite spins coupled to light with slightly different frequencies and opposite polarizations. The researchers sent light at both frequencies down the same optical fibre onto the same mirror. Both beams formed standing waves in the cavity. “If the frequency of these two standing waves is slightly different, then at the position where we load this bilayer array, these two standing waves are going to slightly walk off,” says Li Du, who is lead author on the paper describing the research. “Therefore by tuning the frequency difference we’re able to tune the interlayer separation,” he adds.
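
A back-of-the-envelope version of that tuning argument: at a distance z from the shared mirror, the nodes of two standing waves whose frequencies differ by dnu walk off by roughly z times dnu/nu. The wavelength and mirror distance below are assumptions for illustration, not the experiment's actual parameters.

```python
# Walk-off between two standing waves sharing a mirror: at distance z from
# the mirror the node positions differ by about z * (dnu / nu), so a small
# frequency difference sets the bilayer separation. Numbers are illustrative.
C = 2.99792458e8     # speed of light (m/s)
wavelength = 741e-9  # assumed wavelength near a dysprosium transition (m)
z = 0.05             # assumed distance from the mirror to the atoms (m)

nu = C / wavelength
target_sep = 50e-9   # desired interlayer separation (m)
dnu = nu * target_sep / z

print(f"frequency difference for a 50 nm gap: {dnu / 1e6:.0f} MHz")
```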

As both beams utilize the same optical fibre and the same mirror, they are robust to physical disturbance of these components. “Our scheme guarantees that you have two standing waves that can shake a little – or maybe a lot – but the shaking is a common mode, so the difference between the two layers is always fixed,” says Du.
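To make the tuning mechanism concrete, here is a back-of-the-envelope sketch in Python. All of the numbers – the trapping wavelength and the mirror-to-atoms distance – are assumptions for illustration, not values from the paper; the scaling d ≈ zΔf/f follows from the antinode positions of two standing waves that share a node at the mirror.

```python
# Back-of-the-envelope estimate of the standing-wave "walk-off" (all numbers
# are illustrative assumptions, not taken from the paper). Both standing waves
# share a node at the mirror, so the n-th antinode sits at z_n = n*lambda/2;
# a small frequency offset df therefore shifts the antinodes of one wave
# relative to the other by roughly d = z * df / f at distance z from the mirror.

c = 3.0e8            # speed of light, m/s
wavelength = 741e-9  # assumed trapping wavelength, m
z = 0.10             # assumed mirror-to-atoms distance, m

f = c / wavelength   # optical frequency, Hz
target_gap = 50e-9   # desired interlayer separation, m

df = target_gap * f / z  # frequency offset needed to open a 50 nm gap, Hz
print(f"optical frequency: {f:.2e} Hz")
print(f"frequency offset for a 50 nm gap: {df / 1e6:.0f} MHz")
```

On these assumed numbers the required offset is a few hundred megahertz, the sort of shift routinely produced by standard acousto-optic or electro-optic modulators.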

Ringing atoms

The researchers heated one of the layers by about 2 μK and showed how the heat flowed across the vacuum gap to the other layer through the magnetic coupling of the atomic dipoles. Next they induced oscillations in the position of one layer and showed how these affected the position of the other layer: “We hit one layer with a hammer and we see that the other [layer] also starts to ring,” says Du.

The researchers now hope to use the platform to study how atoms closer together than one photon wavelength interact with light. “If the separation is much smaller than the wavelength of light, then the light can no longer tell [the atoms] apart,” says Du. “That potentially allows us to study a special effect called super-radiance.”

Beyond this, the researchers would like to investigate the work’s potential in quantum computing: “We would really like to implement a magnetic quantum gate purely driven by the magnetic dipole-dipole interaction,” he says. The same platform could also be used with BECs of molecules, which would open up the study of electric dipole–dipole interactions. Indeed, in late 2023, researchers at Columbia University in the US published a preprint that describes how they created a BEC of dipolar molecules. This preprint has yet to be peer reviewed.

Twisted graphene

Experimental atomic physicist Cheng Chin of the University of Chicago in Illinois, who last year collaborated with researchers at Shanxi University in China to produce a double layer of rubidium atoms to model twisted bilayer graphene, says that Ketterle and colleagues’ research is “very, very interesting”.

He adds, “This is the first time we’re able to prepare cold atom systems in two layers with such a small spacing…To control such a 2D system is hard but necessary in order to induce the interaction that’s required in two planes. It’s a very smart choice of atom because dysprosium has a very large dipole-dipole interaction. At a conventional spacing of half a micron, you wouldn’t be able to see any kind of coupling between the two layers, but 50 nm is just enough to show that the atoms in the two planes can really talk to each other.”

He suggests that follow-up work from both teams could focus on simulating new phases of matter, as well as emergent systems such as superconducting bilayer graphene.

The research is described in Science.

The post Bilayer of ultracold atoms has just a 50 nm gap appeared first on Physics World.

  •  

Quantum Machines’ processor-based approach for quantum control

By: No Author
2 May 2024 at 15:13

This short video – filmed at the March Meeting of the American Physical Society in Minneapolis earlier in the year – features Itamar Sivan, chief executive and co-founder of Quantum Machines (QM). In the video, he explains how QM makes the control electronics of quantum computers – that is, the classical hardware that drives quantum processors.

Yonatan Cohen, chief technology officer and fellow co-founder, then outlines the firm’s evolution towards a processor-based approach for quantum control. The result is a dedicated processor that generates all the signals used to communicate with the quantum processor, allowing the firm to build more scalable architectures while maintaining high performance.

Cohen explains that the key for QM’s technology is to implement quantum error correction at scale – which is where the firm’s OPX1000 platform comes in. This scaled-up system has very high channel density, meaning it can control many qubits with relatively compact hardware – making it, the firm says, the most scalable control system on the market.

Cohen also discusses the importance to QM of hiring staff who combine expert knowledge with a passion for the technology, and explains how partnerships help QM maintain a competitive edge in the market. One such tie-up, with NVIDIA, allowed QM to create a link between its control system and the NVIDIA GPU-CPU platform – bringing more computing power to the heart of the quantum computer.

Sivan believes that within a couple of days of installing QM’s technology, customers can realize all the experiments they had conceived.

The post Quantum Machines’ processor-based approach for quantum control appeared first on Physics World.

  •  

Cryptic quantum-physics word search: the solution

By: No Author
1 May 2024 at 18:08

Word search answers

Wave range? A plum tide at sea (9) [AMPLITUDE]

Sequence of hobo sonata carries force (5) [BOSON]

Yell out “circle” immediately? Overpowered freezer (8) [CRYOSTAT]

Nineties served up “greatest physicist” candidate (8) [EINSTEIN]

Devotion to closeness (8) [FIDELITY]                                                                       

Hubbub after unwanted disturbances (5) [NOISE]                                                               

Line, we heard, for bishop with computers? A quantum of quantum (5) [QUBIT]

Inky creature is sensitive about fields (5) [SQUID]

Ill gent, nun in a bad way? It’s barrier breaking (10)  [TUNNELLING]

Single cat is smallest matter (4) [ATOM]

Odd chart reveals quantum victim, potentially (3) [CAT]                                                     

Policeman finds right account for electron equation chap (5) [DIRAC]

Power to get-up-and-go (6) [ENERGY]

Physicist namesake for fictional meth lord? Cooking begins here (10) [HEISENBERG]

Nobel winner recited isometric exercise (6) [PLANCK]

Australian quantum physicist brings short model to space mountain (7) [SIMMONS]

Situation report, clearly (5) [STATE]

Danish physicist sounds like a pig (4) [BOHR]

Run out after firm hand? Result of quantum measurement (8) [COLLAPSE]

Gee, it’s neat! Uncertain, when some things are predictable (10) [EIGENSTATE]

Iron soldier charged? Obeys exclusion principle (7) [FERMION]

Mr Munster gets a bearing on often overlooked German quantum pioneer (7) [HERMANN]

Take his head! Queen killed dispatcher for sending secret messages (3) [QKD]

Rushes backwards in pirouette (4) [SPIN]

Creamy pus? Doctor to have the upper hand (9) [SUPREMACY]

And the hidden phrase: “This year is the centenary of the first prediction of Bose-Einstein condensates”

The post Cryptic quantum-physics word search: the solution appeared first on Physics World.

  •  

What lies beneath: unearthing the secret interior lives of planets

By: No Author
1 May 2024 at 12:00

Humanity has a remarkable drive for exploration. We have sent astronauts 384,400 kilometres out into space to walk on the Moon; delivered rovers and helicopters roughly 225 million kilometres away to survey Mars; and sent probes a whopping 24.3 billion kilometres out to the furthest reaches of our solar system. It is remarkable, then, that when it comes to our own home, we have literally only scratched the surface – the deepest hole ever dug reached less than 1% of the distance to the centre of the Earth.

The question of how we get to grips with the other 99% of what lies under our feet – not to mention beneath the surface of other worlds – is the subject of this sparkling new book, “What’s Hidden Inside Planets?” by Sabine Stanley, a physicist at Johns Hopkins University.

Starting with an imagined journey down to the centre of the Earth in a hi-tech travel capsule, Stanley explains how, even though we have only ever drilled about a third of the way through the crust, phenomena on the surface can be used to infer the structure of the rest of the planet. The seismic waves that follow an earthquake change speed and direction as they pass through the Earth, which tells us that the interior has distinct layers – the mantle, the liquid outer core and the solid inner core. In addition, diamonds found on the Earth’s surface can tell us about the hot, high-pressure conditions below the surface where they were formed.

Stanley’s focus soon sweeps out to explore the rest of the solar system. Though we can’t send probes to the centres of other planets, clues to their interior composition sometimes fall at our feet in the form of meteorites. These are remnants of the early solar system that tell us about the conditions in which the planets formed.

The book also explains why Venus is at least one planetary scientist’s bête noire given that it resists all the techniques used to investigate planetary interiors. The planet has an atmosphere that is opaque to remote optical observations and the extreme conditions on the surface make it incredibly challenging to operate seismometers.

Stanley also includes a spin through upcoming planetary science missions and what they might tell us – from the Mars Sample Return Mission, which could shine more light on the red planet’s geology, to the Jupiter Icy Moons Explorer, to various missions to study the surface and interior of Venus. She finishes with a reflection on the importance of looking after the Earth as our home.

The chapter I most enjoyed was “Curious planetary elements”, which explores the weird-and-wacky phenomena believed to occur on and within other worlds, from helium rain and metal volcanoes to exotic phases of water and diamond icebergs.

I was intrigued to encounter for the first time the term “precovery”, which is when fresh information on astronomical objects is found in archive data and images that predate the actual discovery. As Stanley notes, for example, “Pluto was officially discovered in 1930, but astronomers digging through archives since then have found evidence of its discovery going farther back, at least to 1914, and possibly to 1909.”

Stanley also takes the reader through one of my favourite episodes in the history of science, and the reason we have reached that aforementioned 1% down into the Earth. This was the space race’s geological counterpart, the contest to drill the deepest possible hole into the Earth. The US broke ground (both literally and metaphorically) in 1961 with “Project Mohole”, which aimed to collect samples from the Mohorovičić (Moho) discontinuity, the boundary between the crust and mantle identified some 50 years previously via its impact on the velocity of seismic waves. Beset by mismanagement, the endeavour was abandoned after its first phase, reaching just 183 metres beneath the ocean floor. In 1979 the Soviet Union picked up the gauntlet to bore, within a decade, to a depth of more than 12.2 kilometres; this is about a third of the way through the crust at the site on north-west Russia’s Kola Peninsula.

The strength of Stanley’s work lies in her engaging, conversational, almost conspiratorial writing style, which – amid a slew of running jokes, anecdotes and charming food-based metaphors – makes light work of considerable scientific ground that, in less deft hands, could easily have become a painful slog.

However, I feel the preface has far too much of the author’s personality and life history. Some of the introduction sets up later preoccupations – a family background in the restaurant business, for example, fits the conceit of comparing planets to soup, cake, pudding and fruit. Other details, though, venture too far into “Dear Diary” territory. Accounts of childhood friends, teachers, fictional idols and university mentors do little to advance the book’s theme and might have been better gently edited into the acknowledgements section instead.

My only other real criticism is that while the journey is engaging, the destination of the book isn’t entirely clear. The final chapter touches on how our home is unique, how there is no Earth 2.0 to retreat to amid the growing chaos of anthropogenic climate change. This is an important take-home message, but not one that the rest of the book feels like it was working towards. I cannot help but feel that a stronger through-line could have set up this conclusion to a more satisfying effect.

  • 2023 Johns Hopkins University Press 272pp £14/$16.95 pb

The post What lies beneath: unearthing the secret interior lives of planets appeared first on Physics World.

  •  

Missing gamma rays cast doubt on cosmic-ray origins

By: No Author
30 April 2024 at 17:26

The lack of observed gamma rays from a recent supernova has cast doubt on the generally accepted idea that exploding stars are a major source of cosmic rays. The observation was made using NASA’s Fermi Gamma-ray Space Telescope.

Cosmic rays are high-energy charged particles (mostly protons) that arrive at Earth from beyond the solar system. Their exact origins are a long-standing mystery because their trajectories are deflected by the magnetic fields that they encounter along the way.

The particles’ high energies suggest that they are born in violent astrophysical events such as supernovae (exploding stars) – a theory first proposed 90 years ago.

“This idea has been very successful in explaining cosmic rays within the Milky Way,” explains Guillem Martí-Devesa at the University of Trieste, Italy, who led the research. “However, it has been seriously challenged by observational data in the last two decades.”

Gamma-ray clues

High-energy protons from supernovae are expected to create gamma rays, which do travel in straight lines and therefore offer ways of determining the origins of cosmic rays. In 2013, the Fermi telescope observed two remnants of nearby supernovae that exploded more than 10,000 years ago.

These objects have blast waves and expanding clouds of debris that are expected to create cosmic rays. Indeed, the observed gamma-ray spectra of the objects matched that expected if high-energy protons were colliding with debris to produce pions – which then decay to produce distinctive gamma rays.

While this provides evidence that cosmic rays are produced by supernovae, the gamma-ray observations suggest a much lower rate of production than is needed to explain the cosmic-ray flux impinging on Earth. In particular, the 2013 study could not explain the observed abundance of particles with petaelectronvolt energies, which are close to the middle of the cosmic-ray energy spectrum.

One explanation of this shortcoming is that more cosmic rays are created in the days and weeks following the initial supernova explosion. Confirming this, however, would require a nearby star to explode.

“Unfortunately, supernova events are quite rare, and those detected with ease by optical telescopes in other galaxies are too far away for our most sensitive gamma-ray detectors,” Martí-Devesa explains.

Fortuitous explosion

But astronomers got lucky on 18 May 2023, when a star exploded in the nearby Pinwheel galaxy, which is about 21 million light-years away. Dubbed SN 2023ixf, the explosion was the brightest supernova ever observed by the latest generation of gamma-ray space telescopes – including Fermi.

This was the ideal opportunity to search for evidence of cosmic rays produced in the immediate aftermath of the explosion. However, what Martí-Devesa and colleagues observed was not what they had expected. They saw no relevant gamma rays.

“When we attempted to model the underlying cosmic ray population, we found that no more than 1% of the supernova’s energy was used to accelerate cosmic rays,” Martí-Devesa recalls. “This was a surprise, as we expected it to be close to 10%.”
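To put those percentages on an absolute scale – using the canonical core-collapse kinetic energy of roughly $10^{51}$ erg ($10^{44}$ J), a reference figure that is not quoted in the article – the expectation and the new limit translate to

$$E_{\mathrm{CR}}^{\mathrm{expected}} \approx 0.1\,E_{\mathrm{kin}} \approx 10^{43}\ \mathrm{J}, \qquad E_{\mathrm{CR}}^{\mathrm{SN\,2023ixf}} \lesssim 0.01\,E_{\mathrm{kin}} \approx 10^{42}\ \mathrm{J}.$$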

The result suggests that contrary to decades of predictions, early-stage supernovae may not be a primary source of cosmic rays – leaving a glaring gap in the cosmic ray spectrum. For Martí-Devesa and colleagues, there are several possible reasons for this negative result, which will all require further investigation.

“Perhaps our modelling approach was too unrealistic? Was this supernova peculiar in some way? Or are we looking at the wrong sources?” Martí-Devesa speculates. “We need to explore further the physics of supernova shocks, to see whether supernovae play or not the central role we thought for the origin of cosmic rays.”

The observations are described in Astronomy & Astrophysics.

The post Missing gamma rays cast doubt on cosmic-ray origins appeared first on Physics World.

  •  

Multiphysics modelling of photonic devices with COMSOL

By: No Author
30 April 2024 at 15:30


Optics and photonics serve as enabling technologies in various industries, including communication, medical technology, sensor development, quantum computing, and manufacturing. In these fields, simulation helps accelerate and reduce the cost of R&D of optical components, which can range in size from the sub-wavelength scale to optically large. Utilizing multiphysics analysis is an important aspect of R&D in this area, as it involves accounting for electro-optical, stress-optical, and plasmonic effects, in addition to ubiquitous thermal effects in optical systems.

In this webinar we will discuss recent applications of COMSOL Multiphysics in the design of photonic devices and components in both industry and academic research. The webinar will include a live demonstration showing how to model surface plasmonic effects, and will conclude with a Q&A session during which you are welcome to ask questions.


Nathaniel Davies

Nathaniel Davies joined COMSOL in early 2020 as an applications engineer specialising in electromagnetism. He studied at Oxford University, completing an undergraduate degree and PhD in condensed matter physics with a research specialism in novel magnetic and superconducting materials.

The post Multiphysics modelling of photonic devices with COMSOL appeared first on Physics World.

  •  

Degradation of commercial lithium-ion cells beyond 80% capacity

By: No Author
30 April 2024 at 10:31

The true useful life of Li-ion batteries is not well defined. Current operational cutoffs are often at 80% capacity retention, a holdover from the early electric vehicle industry that may not be applicable to other applications such as energy storage for the grid. Thus, there is little data in the open literature about systematic cycling of Li-ion batteries beyond the traditional 80% cutoff.

In this webinar, we detail our ongoing study of battery-cycle aging at varied ambient temperature, discharge rate, and state-of-charge range for three different positive electrode chemistries: lithium iron phosphate, nickel cobalt aluminum oxide, and nickel manganese oxide. These commercial cells have been cycled for more than seven years and their capacity retention spans 80% to 40%. We cover trends that occur before and after 80% capacity, initial materials characterization, knee point occurrence, and sudden cell failure. This work represents the broadest assessment of commercial Li-ion battery aging in the open literature.
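As an aside for readers unfamiliar with the term, a “knee point” is the cycle at which capacity fade abruptly steepens. The sketch below shows one generic, illustrative way to locate a knee – the farthest-point-from-chord heuristic, applied to synthetic data – and is not the method used in the Sandia study.

```python
import numpy as np

# Toy illustration (not Sandia's method): locate the "knee point" in a
# capacity-fade curve as the cycle farthest from the straight line joining
# the first and last points -- a common distance-based heuristic.

cycles = np.arange(0, 3000)
# Synthetic fade: slow linear loss plus a late, accelerating drop.
capacity = 100 - 0.004 * cycles - 12 / (1 + np.exp(-(cycles - 2200) / 120))

p0 = np.array([cycles[0], capacity[0]])
p1 = np.array([cycles[-1], capacity[-1]])
line = (p1 - p0) / np.linalg.norm(p1 - p0)  # unit vector along the chord

pts = np.column_stack([cycles, capacity]) - p0
# Perpendicular distance of every point from the chord (2D cross product).
dist = np.abs(pts[:, 0] * line[1] - pts[:, 1] * line[0])
knee = cycles[np.argmax(dist)]

print(f"knee point near cycle {knee}, capacity {capacity[np.argmax(dist)]:.1f}%")
```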

An interactive Q&A session follows the presentation.

Reed Wittman

Reed Wittman is a senior member of technical staff at Sandia National Laboratories. His work focuses on fundamentals of battery reliability and safety. This includes materials and electrochemical characterization of Li-ion battery degradation mechanisms during long-term cycling and fundamental origins of gas evolution in aqueous flow batteries. He earned his BSE in material science engineering at Arizona State University in 2013. He then went on to complete a PhD in energy science and engineering through the Bredesen Center, an interdisciplinary degree programme centred on all aspects of energy, at the University of Tennessee and Oak Ridge National Laboratory. His PhD dissertation focused on using materials and electrochemical methods to understand the fundamental processes at the zinc (Zn) electrode of alkaline Zn batteries.

Yuliya Preger

Yuliya Preger is a principal member of technical staff in the Energy Storage Technology and Systems Group at Sandia National Laboratories. She earned her PhD and BS in chemical engineering from the University of Wisconsin-Madison and the Massachusetts Institute of Technology, respectively. Her current work is centred on the safety and reliability of batteries for grid-level energy-storage applications. Her publications span battery degradation and abuse response, application of power electronics to energy-storage safety, and system-level energy-storage safety analysis. Yuliya is co-founder of batteryarchive.org, the first public repository for visualization and comparison of battery degradation data across institutions, which has been used by thousands of individuals in academia and industry in more than 60 countries. She is currently leading the revision of the Department of Energy (DOE) Office of Electricity Energy Storage Safety Strategic Plan and the development of data-collection requirements for DOE-funded energy-storage projects across the US.

The post Degradation of commercial lithium-ion cells beyond 80% capacity appeared first on Physics World.

  •  

Bolometer measures state of superconducting qubit

By: No Author
29 April 2024 at 18:08

Researchers in Finland say they are the first to determine the state of a superconducting qubit using a bolometer – a device that measures radiant heat. While the fidelity and speed of the readout fall far short of state-of-the-art conventional methods for quantum computing, the technique has the potential to be more scalable than current methods and could be immune to some types of noise.

Quantum computers use quantum bits (qubits) to store and process information. At the end of a calculation the quantum states of qubits must be read to extract the result. Some of the most advanced quantum computers to date – including those developed by Google and IBM – use qubits made from superconducting electronic circuits that are operated at very low temperatures.

Reading these qubits is currently a complex and difficult process, explains Mikko Möttönen of Aalto University and Finland’s VTT Technical Research Centre: “If you tried to measure the voltage directly it would be very challenging, because the voltage is tiny. Instead a microwave resonator is coupled to the qubit, and depending on the state of the qubit the frequency of the resonator gets shifted slightly.”

This process involves injecting microwaves into the measurement circuit, and then reading the state as a change in the phase of the current–voltage oscillations of the microwaves caused by their interaction with the qubit.

This is not ideal, as Möttönen explains: “The signal that you can put into this measurement circuit is very, very weak, so if you were to take it to room temperature without amplifying you would measure nothing,” he says. This is important, because the results of a quantum calculation must ultimately be relayed to electronics operating at room temperature.

Uncertainty principle

“So on the way up to room temperature there will be several stages of amplifiers, and each of these amplifiers must add some noise. It can be done quite well, but it’s not at the same level of accuracy as quantum logic at the moment.” Another problem is more fundamental: measuring the state of the qubit involves measuring voltage and current, and Heisenberg’s uncertainty principle limits the precision at which these can both be known simultaneously.

Measuring the power emitted by the qubit with a bolometer circumvents both these problems. The measurement can be made in the refrigerator, so repeated amplification is unnecessary. Moreover, as there is no need for complete knowledge of the phase of a microwave in order to read off the energy level of the qubit, Heisenberg’s uncertainty principle does not limit the measurement’s accuracy in this way.

The researchers connected their bolometer to a standard superconducting qubit chip with a coupled microwave resonator. However, they made a slight tweak to the input. In a standard measurement scheme, the input microwave frequency is intermediate between the two resonant frequencies. “In that case, there’s no information in the amplitude of the oscillation because you’re in the middle of the resonances, so you always excite the same amplitude,” says Möttönen.

Slightly higher power

Instead, the researchers drove the resonator at the frequency it takes when the qubit is in its ground state. “Now we just channel the signal to the absorber of the bolometer,” says Möttönen. If the qubit is in the ground state, the bolometer detects slightly higher power because the drive is on resonance. If the qubit is in the excited state, the signal in the bolometer is lower – and this difference is used to determine the state of the qubit.
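A toy model of the resulting readout contrast is sketched below. The resonator is treated as a Lorentzian whose frequency is shifted by the qubit state; the dispersive shift, linewidth and frequencies are invented for illustration and are not taken from the paper.

```python
# Toy model of the readout contrast (illustrative numbers, not from the paper).
# The resonator frequency is dispersively shifted by the qubit state; driving
# at the ground-state resonance means the bolometer absorbs more power when
# the qubit is in |g> than in |e>.

def lorentzian(f, f0, kappa):
    """Relative power response of a resonator at f0 with linewidth kappa."""
    return (kappa / 2) ** 2 / ((f - f0) ** 2 + (kappa / 2) ** 2)

f_g = 5.000e9   # resonator frequency with qubit in ground state (assumed), Hz
chi = 1.0e6     # assumed dispersive shift, Hz
kappa = 2.0e6   # assumed resonator linewidth, Hz
f_e = f_g + 2 * chi  # resonator frequency with qubit excited

f_drive = f_g   # drive on the ground-state resonance
print(f"relative power, qubit in |g>: {lorentzian(f_drive, f_g, kappa):.2f}")
print(f"relative power, qubit in |e>: {lorentzian(f_drive, f_e, kappa):.2f}")
```

On these assumed numbers the bolometer sees five times more power for the ground state than for the excited state, which is the contrast that encodes the qubit state.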

For this technique to work at very high fidelity, a very fast and very sensitive bolometer is needed to measure the quantum state before it decays. In 2020, the Finnish researchers unveiled a bolometer that used graphene as its absorber – a fast and sensitive design that was intended for use in quantum computing. Unfortunately, this bolometer degraded over time and the team instead used an older bolometer design involving interfaces between superconductors and normal metals.

Möttönen says that the researchers had initially not expected the older design to be effective for reading out the states of individual qubits. He also expects that the read-out fidelity could be boosted using improved graphene bolometers. “I’m hoping to get the new graphene bolometers out of the oven soon,” he says.

David Pahl at the Massachusetts Institute of Technology believes that the work is very preliminary, but potentially very important. He says that the two most important performance metrics for a scheme to read out quantum states are the fidelity and the speed: “The state of the art speed that we’ve seen in the past year is 0.1 μs and 99.5% fidelity…[Möttönen and colleagues] showed 14 μs and 61.7%,” he says.

Pahl points out that the bolometer is much more compact than amplifier-based systems.  Amplifiers require bulky isolators, whereas bolometers could potentially be integrated on a chip, he says. He also points out that Heisenberg’s uncertainty principle does impose some theoretical limits on the sensitivity of a bolometer, but says that today’s devices are far from those limits.  He looks forward to seeing the results with graphene bolometers.

The research is published in Nature Electronics.  

The post Bolometer measures state of superconducting qubit appeared first on Physics World.

  •  

Search for tiny black holes puts tighter constraints on quantum gravity

By: No Author
26 April 2024 at 14:06

New observations of the flavour composition of atmospheric neutrinos have revealed no conclusive evidence for the minuscule, short-lived black holes that have been predicted by some theories of quantum gravity. The study was done by researchers using the IceCube Neutrino Observatory at the South Pole and the result places some of the tightest constraints ever on the nature of quantum gravity.

Developing a viable theory of quantum gravity is one of the greatest challenges in physics. Today, gravity is described very well by Albert Einstein’s general theory of relativity, which is incompatible with quantum theory. One important difference is that general relativity invokes space–time curvature to explain gravitational attraction while quantum theory is based on flat space–time.

Finding a way forward is challenging because the two theories work at very different energy scales, which makes doing experiments that test theories of quantum gravity very difficult.

“Creative measurements”

“In recent years, creative measurements have been devised to search for the tiny influence of quantum gravity: either via the use of extreme precision in laboratory experiments, or by exploiting the highly energetic particles produced in the distant universe,” explains Thomas Stuttard at the University of Copenhagen, who is a member of the IceCube collaboration.

Among these new theories is the idea that the quantum effects of uncertainty, combined with energy fluctuations in the vacuum of space, could have a tangible effect on the curvature of space–time, as described by general relativity. This could result in the creation of “virtual black holes”. If they exist, these microscopic objects would decay on the order of the Planck time. This is about 10⁻⁴⁴ s and is the smallest interval of time that can be described by current physical theories.
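For reference, the Planck time is built from the fundamental constants as

$$t_P = \sqrt{\frac{\hbar G}{c^5}} \approx 5.4 \times 10^{-44}\ \mathrm{s}.$$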

As a result, virtual black holes would be impossible to detect in the lab. But, if they really exist, researchers predict that they should interact with neutrinos, altering how the particles change flavour states via the phenomenon of neutrino oscillation.

Cubic kilometre of ice

The team searched for evidence of these interactions in data collected by the IceCube Neutrino Observatory, located at the South Pole. As the world’s largest neutrino observatory, IceCube consists of thousands of sensors positioned throughout a cubic kilometre of Antarctic ice.

These sensors detect distinctive flashes of light created by charged leptons that are produced when neutrinos interact with the ice. In this latest study, the team focussed on IceCube detections of high-energy neutrinos produced when cosmic rays interact with Earth’s atmosphere.

Stuttard explains that their search is not the first of its type. “This time, however, we were able to exploit the naturally high energy and large propagation distance of these ‘atmospheric’ neutrinos (rather than earthbound neutrino sources such as particle accelerators or nuclear reactors), as well as the high statistics afforded by the vast detector size. This enabled us to search for effects far weaker than can be probed by any previous study.”

Flavour composition

In their study, the team examined the flavour composition of over 300,000 neutrinos, observed by IceCube over an 8-year period. They then compared this result with the composition they expected to find if the neutrinos had indeed interacted with virtual black holes on their journey through the atmosphere.

Even with the extreme sensitivity offered by IceCube, the results were not any different from the flavour compositions predicted by the current model of neutrino oscillation. For now, this means that the theory of virtual black holes remains without any conclusive evidence.
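To illustrate what such a deviation would have looked like, here is a toy two-flavour sketch – our simplification, with the hypothetical decoherence modelled as an exponential damping of the oscillatory term; the real IceCube analysis is far more sophisticated.

```python
import numpy as np

def survival_prob(L_km, E_GeV, damping_km=None,
                  sin2_2theta=0.99, dm2_eV2=2.4e-3):
    """Two-flavour muon-neutrino survival probability (toy model).

    With damping_km set, the oscillatory term is damped towards its
    averaged value of 1/2, mimicking decoherence-type new physics.
    """
    osc = np.sin(1.27 * dm2_eV2 * L_km / E_GeV) ** 2
    if damping_km is None:
        return 1 - sin2_2theta * osc
    d = np.exp(-L_km / damping_km)
    return 1 - sin2_2theta * (0.5 * (1 - d) + d * osc)

L = 12700.0  # Earth-diameter baseline, km
E = 1000.0   # a 1 TeV atmospheric neutrino

print(f"standard oscillation: {survival_prob(L, E):.3f}")
print(f"with decoherence:     {survival_prob(L, E, damping_km=5000.0):.3f}")
```

A measurable pull of the flavour probabilities towards their averaged values would have been the signature; since IceCube saw no such pull, any damping of this kind must be extremely weak.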

However, this null result did allow the team to place new limits on the maximum possible strength of black hole–neutrino interactions, which are orders of magnitude more stringent than the limits set in previous studies.

“Aside from quantum gravity, the result also serves to demonstrate that the neutrino does appear to remain truly unperturbed by its environment even after travelling thousands of kilometres, even for neutrino energies exceeding any man-made collider,” says Stuttard. “This was a remarkable demonstration of quantum mechanics over truly macroscopic distances.”

More broadly, the team’s findings place new constraints on the theory of quantum gravity as a whole, constraints that are currently few and far between. “Whilst this work rejects certain scenarios, quantum gravity as a concept is certainly not excluded,” Stuttard adds. “The true nature of quantum gravity may differ from the assumptions made in this study, or the effects may be weaker or more strongly suppressed with energy than previously thought.”

The research is described in Nature Physics.

The post Search for tiny black holes puts tighter constraints on quantum gravity appeared first on Physics World.

  •  

NIST researchers develop magnetics-based analyte sensor

By: No Author
25 April 2024 at 17:45

In a proof-of-concept study, researchers at the National Institute of Standards and Technology (NIST) used a smartphone’s built-in magnetometer, combined with hydrogels that change their shape in response to specific cues, to measure sugar concentrations in beverages. The platform, they say, could potentially be used to measure glucose in biological samples, detect environmental toxins, or even test the pH of liquids in an at-home brewery.

That a smartphone can be used as a compass is thanks to its magnetometer, which measures the Earth’s magnetic field (or a local source of magnetism) in three directions. Postdoctoral researcher Mark Ferris and project leader Gary Zabow, both of whom work in the Applied Physics Division at NIST, decided to employ smartphone magnetometers to measure chemical constituents in samples.

“We’re trying to make a new sensing platform, and in particular, trying to make something that is very accessible to a lot of people. And so we have been using a cell phone, which most people have already, as the basis of the sensor platform,” says Zabow.

“We think [the sensing platform is] a good complement to optical smartphone devices that are already out there that may have a little more issue getting around autofluorescence, scattering and so on in murky samples…That’s where, in general, magnetics does better,” Ferris adds.

The magnetics-based sensing platform hinges on hydrogels – materials that swell when immersed in water – that are embedded with tiny magnetic particles. The hydrogels react to the presence of different chemical constituents of a sample, such as glucose, or to changing pH levels.

The platform works by clamping to a smartphone a small well containing a few millilitres of test solution and a strip of hydrogel. As the hydrogels enlarge or shrink, they move the magnetic particles closer to or farther from the magnetometer, which detects and measures corresponding changes in the strength of the magnetic field.

The researchers opted to use a stack of hydrogels to amplify particle motion, making it easier to measure changes in magnetic field strength.
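The reason a millimetre-scale motion is detectable is the steep distance dependence of a dipole field. The sketch below uses invented numbers – the magnetic moment and the distances are our assumptions, not NIST’s – to show the scale of the effect.

```python
import math

mu0 = 4 * math.pi * 1e-7  # vacuum permeability, T m/A
m = 1e-6                  # assumed total magnetic moment of the particles, A m^2

def dipole_field(r):
    """On-axis field of a point dipole of moment m at distance r, in tesla."""
    return mu0 * 2 * m / (4 * math.pi * r ** 3)

# Hydrogel swelling moves the particles 0.5 mm farther from the magnetometer.
for r_mm in (2.0, 2.5):
    b = dipole_field(r_mm * 1e-3)
    print(f"r = {r_mm} mm: B = {b * 1e6:.1f} microtesla")
```

Because B falls off as 1/r³, that half-millimetre of swelling roughly halves the field – an easily resolved change for a smartphone magnetometer, which typically resolves fractions of a microtesla.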

“Magnetics lends itself to being directly quantitative,” Zabow says. “It’s a number that you measure, the strength of the magnetic field. It’s not a picture that you need to convert into something that’s quantitative.”

So far, the researchers have demonstrated proof-of-concept in the sensing platform using control test samples, including wine and champagne. They observed that a high-sugar wine (sangria) induced a bigger change in magnetic field than low-sugar options (pinot grigio and champagne brut). Glucose concentrations were measured at high sensitivities, as small as a few millionths of a mole per litre.

The sensing platform is inexpensive and relatively easy to build and could be used in locations with relatively few resources.

The researchers’ next steps will be to improve platform sensitivity and specificity, which can be addressed by altering the hydrogel chemistry or by incorporating hydrogels that are sensitive to different analytes.

“There are some steps that need to be taken first, in terms of specificity of the tests to ensure that when you’re measuring glucose or pH – those are the two examples in the paper – there’s no interference, you’re not getting some other inadvertent contribution from something else in the solution. That’s a question of specificity of the hydrogel test strips, and that’s something that we still have to work on,” Zabow says.

This study is published in Nature Communications.

The post NIST researchers develop magnetics-based analyte sensor appeared first on Physics World.

  •  

Quantum mechanical wormholes fill gaps in black hole entropy

By: No Author
25 April 2024 at 10:30

A new theoretical model could solve a 50-year-old puzzle on the entropy of black holes. Developed by physicists in the US, Belgium and Argentina, the model uses the concept of quantum-mechanical wormholes to count the number of quantum microstates within a black hole. The resulting counts agree with predictions made by the so-called Bekenstein-Hawking entropy formula and may lead to a deeper understanding of these extreme astrophysical objects.

Black hole thermodynamics

Black holes get their name because their intense gravity warps space-time so much that not even light can escape after entering them. This makes it impossible to observe what goes on inside them directly. However, thanks to theoretical work done by Jacob Bekenstein and Stephen Hawking in the 1970s, we know that black holes have entropy, and the amount of entropy is given by a formula that bears their names.
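The formula in question relates the entropy to the area $A$ of the event horizon – a standard result, quoted here for reference:

$$S_{\mathrm{BH}} = \frac{k_B c^3 A}{4 G \hbar},$$

where $k_B$ is Boltzmann’s constant, $G$ is Newton’s constant and $\hbar$ is the reduced Planck constant.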

In classical thermodynamics, entropy arises from microscopic chaos and disorder, and the amount of entropy in a system is related to the number of microstates consistent with a macroscopic description of that system. For quantum objects, a quantum superposition of microstates also counts as a microstate, and entropy is related to the number of ways in which all quantum microstates can be built out of such superpositions.
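In symbols, this is the standard statistical-mechanics relation

$$S = k_B \ln \Omega,$$

with $\Omega$ the number of microstates consistent with the macroscopic description.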

The causes of black hole entropy are an open question, and a purely quantum mechanical description has so far eluded scientists. In the mid-1990s, string theorists derived a way of counting a black hole’s quantum microstates that agrees with the Bekenstein-Hawking formula for certain black holes. However, their methods only apply to a special class of supersymmetric black holes with finely tuned charges and masses. Most black holes, including those produced when stars collapse, are not covered.

Beyond the horizon

In the new work, researchers from the University of Pennsylvania, Brandeis University and the Santa Fe Institute, all in the US, together with colleagues at Belgium’s Vrije Universiteit Brussel and Argentina’s Instituto Balseiro, developed an approach that allows us to peek inside a black hole’s interior. Writing in Physical Review Letters, they note that an infinite number of possible microstates exists behind a black hole’s event horizon – the boundary surface from which no light can escape. Due to quantum effects, these microstates can slightly overlap via tunnels in space-time known as wormholes. These overlaps make it possible to describe the infinite microstates in terms of a finite set of representative quantum superpositions. These representative quantum superpositions can, in turn, be counted and related to the Bekenstein-Hawking entropy.

According to Vijay Balasubramanian, a physicist at the University of Pennsylvania who led the research, the team’s approach applies to black holes of any mass, electric charge and rotational speed. It could therefore offer a complete explanation of the microscopic origin of black hole thermodynamics. In his view, black hole microstates are “paradigmatic examples of complex quantum states with chaotic dynamics”, and the team’s results may even hold lessons for how we think about such systems in general. One possible extension would be to search for a way to use subtle quantum effects to detect black hole microstates from outside the horizon.

Juan Maldacena, a theorist at the Institute for Advanced Study in Princeton, US, who was not involved in this study, calls the research an interesting perspective on black hole microstates. He notes that it is based on computing statistical properties of the overlap of black hole pure states that are prepared via different processes; while one cannot compute the inner product between these different states, gravity theory, through wormhole contributions, makes it possible to compute statistical properties of their overlap. The answer, he says, is statistical in nature and in the same spirit as another computation of black hole entropy performed by Hawking and Gary Gibbons in 1977, but it provides a more vivid picture of the possible microstates.

The post Quantum mechanical wormholes fill gaps in black hole entropy appeared first on Physics World.

  •  

Individual polyatomic molecules are trapped in optical-tweezer arrays

By: No Author
24 April 2024 at 17:28

Individual polyatomic molecules have been trapped in arrays of optical tweezers for the first time. Researchers in the US were able to control individual quantum states of the three-atom molecules and the technique could find applications in quantum computing and searches for physics beyond the Standard Model.

Cooling molecules to temperatures near absolute zero is an exciting frontier in ultracold physics because it provides a window into how chemical processes are driven by quantum mechanics. For decades, physicists have been cooling atoms to ultracold temperatures. However, molecules are much more challenging to cool because they can hold energy in many more degrees of freedom (rotation and vibration) – and cooling a molecule requires removing the energy from all of these. Considerable success has been achieved with diatomic molecules, but the number of degrees of freedom grows steeply with every additional atom, so progress with larger molecules has been more limited.
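To make that scaling concrete – a standard counting argument, not specific to this paper – an $N$-atom molecule has $3N$ degrees of freedom, and removing the translations and rotations leaves

$$N_{\mathrm{vib}} = 3N - 6 \ \ (\text{nonlinear}), \qquad N_{\mathrm{vib}} = 3N - 5 \ \ (\text{linear}),$$

so a diatomic has a single vibrational mode while a linear triatomic such as CaOH already has four.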

Now, John Doyle, Nathaniel Vilas and colleagues at Harvard University have cooled individual triatomic molecules to their quantum ground states. Each molecule comprises a calcium, an oxygen and a hydrogen atom.

Linear geometry

“The main thing that we like about this molecule is that, in the ground state, it has a linear geometry,” explains Vilas, “but it has a low-lying excited state with a bent geometry…and that gives you an additional rotational degree of freedom.”

In 2022, a team including Vilas and Doyle laser cooled a cloud of these molecules to 110 μK in a magneto-optical trap. No one, however, had previously cooled individual molecules containing more than two atoms to their quantum ground states.

In the new work, Vilas and colleagues loaded their molecules from a magneto-optical trap into an array of six adjacent optical tweezer traps. They used a laser pulse to promote some of the molecules to an excited state: “Because this excited molecule is there, there’s a much larger cross section for the molecules to interact,” says Vilas. “So there’s some dipole-dipole interaction between the ground state and excited state, that leads to inelastic collisions and they get lost from the trap.” Using this method, the researchers reduced the number of molecules in almost all the tweezer traps to just one.

Before they could proceed with imaging the molecules, the researchers had to decide what wavelength of light they should use for the optical tweezer. The central requirement is that the tweezer must not cause unintended excitation into dark states. These are quantum states of the molecule that are invisible to the probe laser. The energy structure of the molecule is so complex that many of the high-lying states have not been assigned to any motion of the molecule, but the researchers found empirically that light at a wavelength of 784.5 nm led to minimal loss.

Population accumulation

The researchers then used a 609 nm laser to drive a transition from a linear configuration of the molecule, in which the three atoms are in a line, to a vibrational mode in which the line bends. The molecules were left in a combination of three near-degenerate spin sublevels. By subsequently pumping the molecules with a 623 nm laser, they excited the molecules to a state that either decayed back to one of the original sublevels or to a fourth, lower-energy sublevel that did not absorb the laser. With repeated excitation and decay, therefore, the population accumulated in the lower sublevel.
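A minimal rate-equation sketch of this accumulation is given below. The branching ratio and pumping rate are placeholders chosen for illustration – they are not the CaOH values – but the exponential build-up of dark-state population is generic to optical pumping.

```python
# Minimal optical-pumping sketch (illustrative rates, not the CaOH values).
# Bright-sublevel atoms are repeatedly excited; each excitation returns the
# atom to a bright sublevel with probability (1 - b) or drops it into the
# dark sublevel with probability b, where population then accumulates.

b = 0.25          # assumed branching ratio into the dark sublevel
pump_rate = 1.0   # excitation rate per bright-state atom, arbitrary units
dt, steps = 0.01, 2000

bright, dark = 1.0, 0.0
for _ in range(steps):
    leaked = pump_rate * bright * dt * b  # fraction lost to the dark state
    bright -= leaked
    dark += leaked

print(f"dark-state population after pumping: {dark:.3f}")  # -> ~0.993
```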

Finally, the researchers showed that a small radio-frequency magnetic field could drive Rabi oscillations between two energy levels of the system. This could be hugely important for future research in quantum computing: “The geometry doesn’t have any bearing on this current work…We have these six traps and each one is behaving completely independently,” says Vilas. “But you can think of each one as an independent molecular qubit, so our goal would be to start implementing gates on these qubits.” It could even be possible to encode information in multiple orthogonal degrees of freedom, creating “qudits” that carry more information than qubits.
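For reference, a resonant drive of Rabi frequency $\Omega$ cycles the population of the upper level as

$$P_e(t) = \sin^2\!\left(\frac{\Omega t}{2}\right),$$

which is the textbook signature of the coherent control the team demonstrated.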

Other possibilities include searches for new physics.  “Because of the diverse structure of these molecules there’s coupling between the structure and different types of new physics – either dark matter or high-energy particles beyond the Standard Model, and having them controlled at the level we have now will make the spectroscopic methods way more sensitive,” says Vilas.

“It’s sort of a milestone in the field because it says we can control even single molecules that have more than two atoms,” says Lawrence Cheuk of Princeton University in New Jersey; “If you add a third atom, you get a bending mode, and this is very useful in certain applications. So in the same work, the Doyle group not only showed they can trap and detect single triatomics: they also showed that they can manipulate in a coherent manner the bending mode inside these triatomics.” He is intrigued about whether still larger molecules can be manipulated, opening up the study of features such as chirality.

The research is described in Nature.   

The post Individual polyatomic molecules are trapped in optical-tweezer arrays appeared first on Physics World.

  •  